WorldWideScience

Sample records for large-scale gene-trap screen

  1. Large-scale screens of metagenomic libraries.

    Science.gov (United States)

    Pham, Vinh D; Palden, Tsultrim; DeLong, Edward F

    2007-01-01

    Metagenomic libraries archive large fragments of contiguous genomic sequences from microorganisms without requiring prior cultivation. Generating a streamlined procedure for creating and screening metagenomic libraries is therefore useful for efficient high-throughput investigations into the genetic and metabolic properties of uncultured microbial assemblages. Here, key protocols are presented on video, which we propose is the most useful format for accurately describing a long process that alternately depends on robotic instrumentation and manual (human) interventions. First, we employed robotics to spot library clones onto high-density macroarray membranes, each of which can contain duplicate colonies from twenty-four 384-well library plates. Automation is essential for this procedure, not only for accuracy and speed, but also because of the miniaturization required to fit the large number of library clones into highly dense spatial arrangements. Once the membranes are generated, we demonstrate how they can be screened for genes of interest using modified versions of standard protocols for probe labeling, membrane hybridization, and signal detection. We complement the visual demonstration of these procedures with detailed written descriptions of the steps involved and the materials required, all of which are available online alongside the video.
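
    As a quick check on the colony density described above, the per-membrane arithmetic works out as follows (the plate, well, and replicate counts are taken directly from the abstract):

```python
# Spot density on one high-density macroarray membrane, using the numbers
# given above: 24 source plates of 384 wells each, spotted in duplicate.
plates_per_membrane = 24
wells_per_plate = 384
replicates = 2  # duplicate colonies per library clone

clones_per_membrane = plates_per_membrane * wells_per_plate   # 9,216 clones
spots_per_membrane = clones_per_membrane * replicates         # 18,432 colonies
```

    At over eighteen thousand colonies per membrane, manual spotting is impractical, which is why the protocol relies on robotics for this step.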

  2. Large Scale Bacterial Colony Screening of Diversified FRET Biosensors.

    Directory of Open Access Journals (Sweden)

    Julia Litzlbauer

    Biosensors based on Förster Resonance Energy Transfer (FRET) between fluorescent protein mutants have started to revolutionize physiology and biochemistry. However, many types of FRET biosensors show relatively small FRET changes, making measurements with these probes challenging when used under sub-optimal experimental conditions. Thus, a major effort in the field currently lies in designing new optimization strategies for these types of sensors. Here we describe procedures for optimizing FRET changes by large-scale screening of mutant biosensor libraries in bacterial colonies. We describe optimization of biosensor expression, permeabilization of bacteria, software tools for analysis, and screening conditions. The procedures reported here may help in improving FRET changes in multiple suitable classes of biosensors.

  3. Large-scale screening of hypothetical metal-organic frameworks

    Science.gov (United States)

    Wilmer, Christopher E.; Leaf, Michael; Lee, Chang Yeon; Farha, Omar K.; Hauser, Brad G.; Hupp, Joseph T.; Snurr, Randall Q.

    2012-02-01

    Metal-organic frameworks (MOFs) are porous materials constructed from modular molecular building blocks, typically metal clusters and organic linkers. These can, in principle, be assembled to form an almost unlimited number of MOFs, yet materials reported to date represent only a tiny fraction of the possible combinations. Here, we demonstrate a computational approach to generate all conceivable MOFs from a given chemical library of building blocks (based on the structures of known MOFs) and rapidly screen them to find the best candidates for a specific application. From a library of 102 building blocks we generated 137,953 hypothetical MOFs and for each one calculated the pore-size distribution, surface area and methane-storage capacity. We identified over 300 MOFs with a predicted methane-storage capacity better than that of any known material, and this approach also revealed structure-property relationships. Methyl-functionalized MOFs were frequently top performers, so we selected one such promising MOF and experimentally confirmed its predicted capacity.
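
    The generate-then-screen workflow can be sketched in a few lines of code. The block below is illustrative only: the building blocks and the capacity model are invented placeholders standing in for the paper's actual library of 102 blocks and its molecular simulations of methane uptake.

```python
import itertools

# Hypothetical building-block library (invented placeholders).
metal_clusters = ["Zn4O", "Cu2-paddlewheel", "Zr6O4"]
linkers = ["BDC", "BTC", "NDC", "BPDC"]
functional_groups = [None, "CH3", "NH2", "Br"]

def predicted_capacity(metal, linker, group):
    """Toy surrogate for a methane-storage simulation (arbitrary units)."""
    score = {"Zn4O": 160, "Cu2-paddlewheel": 180, "Zr6O4": 150}[metal]
    score += {"BDC": 10, "BTC": 20, "NDC": 15, "BPDC": 5}[linker]
    if group == "CH3":  # mimics the observation that methyl groups often help
        score += 25
    return score

# Enumerate every combination, evaluate each, and keep the best candidates.
candidates = [
    (predicted_capacity(m, l, g), m, l, g)
    for m, l, g in itertools.product(metal_clusters, linkers, functional_groups)
]
top = sorted(candidates, key=lambda t: t[0], reverse=True)[:3]
```

    The real screen works the same way at scale: exhaustive enumeration of a modest block library yields a combinatorially large candidate set (137,953 MOFs from 102 blocks), each of which is then scored by simulation rather than by a toy function.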

  5. A Rapid DNA Mini-prep Method for Large-Scale Rice Mutant Screening

    Institute of Scientific and Technical Information of China (English)

    QIU Fu-lin; WANG He-he; CHEN Jie; ZHUANG Jie-yun; Hei LEUNG; CHENG Shi-hua; Wu Jian-li

    2006-01-01

    A high-throughput rice DNA mini-preparation method was developed. The method is suitable for large-scale mutant bank screening, as well as for large mapping populations, while maintaining relatively high DNA purity and concentration. The extracted DNA was tested and found suitable for regular PCR amplification (SSR) and for Targeting Induced Local Lesions in Genomes (TILLING) analysis.

  6. Large-scale plasmonic microarrays for label-free high-throughput screening.

    Science.gov (United States)

    Chang, Tsung-Yao; Huang, Min; Yanik, Ahmet Ali; Tsai, Hsin-Yu; Shi, Peng; Aksu, Serap; Yanik, Mehmet Fatih; Altug, Hatice

    2011-11-07

    Microarrays allowing simultaneous analysis of thousands of parameters can significantly accelerate screening of large libraries of pharmaceutical compounds and biomolecular interactions. For large-scale studies on diverse biomedical samples, reliable, label-free, and high-content microarrays are needed. In this work, using large-area plasmonic nanohole arrays, we demonstrate for the first time a large-scale label-free microarray technology with over one million sensors on a single microscope slide. A dual-color filter imaging method is introduced to dramatically increase the accuracy, reliability, and signal-to-noise ratio of the sensors in a highly multiplexed manner. We used our technology to quantitatively measure protein-protein interactions. Our platform, which is highly compatible with current microarray scanning systems, can enable a powerful screening technology and facilitate diagnosis and treatment of diseases.

  7. Simulation applied to working frequency selection in large-scale vibrating screen's design

    Institute of Scientific and Technical Information of China (English)

    PENG Chen-yu; SU Rong-hua

    2011-01-01

    The working frequency selection of the ZK30525 vibrating screen was studied using ANSYS. Integrating dynamic performance simulation analysis of the vibrating screen structure, the variation of the beams' vibration displacements with different exciting frequencies was investigated. The six beams studied include one discharging beam and one in-material beam. Results indicate that vibration displacements in the middle of these beams increase as the exciting frequency rises. When the exciting frequency exceeds a certain value, there is a flat region in which the vibration displacement changes little. According to the vibrator characteristics, the vibrating screen's working frequency should be selected within this flat region, far away from the modal frequencies. The study provides theoretical guidance for reasonable working frequency selection in large-scale vibrating screen design.

  8. A large scale screen for neural stem cell markers in Xenopus retina.

    Science.gov (United States)

    Parain, Karine; Mazurier, Nicolas; Bronchain, Odile; Borday, Caroline; Cabochette, Pauline; Chesneau, Albert; Colozza, Gabriele; El Yakoubi, Warif; Hamdache, Johanna; Locker, Morgane; Gilchrist, Michael J; Pollet, Nicolas; Perron, Muriel

    2012-04-01

    Neural stem cell research suffers from a lack of molecular markers to specifically assess stem or progenitor cell properties. The organization of the Xenopus ciliary marginal zone (CMZ) in the retina allows the spatial distinction of these two cell types: stem cells are confined to the most peripheral region, while progenitors are more central. Despite this clear advantage, very few genes specifically expressed in retinal stem cells have been discovered so far in this model. To gain insight into the molecular signature of these cells, we performed a large-scale expression screen in the Xenopus CMZ, establishing it as a model system for stem cell gene profiling. Eighteen genes expressed specifically in the CMZ stem cell compartment were retrieved and are discussed here. These encode various types of proteins, including factors associated with proliferation, mitotic spindle organization, DNA/RNA processing, and cell adhesion. In addition, the publication of this work in a special issue on Xenopus prompted us to give a more general illustration of the value of large-scale screens in this model species. Thus, beyond neural stem cell specific genes, we give a broader highlight of our screen outcome, describing in particular other retinal cell markers that we found. Finally, we present how these can all be easily retrieved through a novel module we developed in the web-based annotation tool XenMARK, and illustrate the potential of this powerful searchable database in the context of the retina.

  9. Large Scale Nanoparticle Screening for Small Molecule Analysis in Laser Desorption Ionization Mass Spectrometry.

    Science.gov (United States)

    Yagnik, Gargey B; Hansen, Rebecca L; Korte, Andrew R; Reichert, Malinda D; Vela, Javier; Lee, Young Jin

    2016-09-20

    Nanoparticles (NPs) have been suggested as efficient matrixes for small molecule profiling and imaging by laser-desorption ionization mass spectrometry (LDI-MS), but so far there has been no systematic study comparing different NPs in the analysis of various classes of small molecules. Here, we present a large scale screening of 13 NPs for the analysis of two dozen small metabolite molecules. Many NPs showed much higher LDI efficiency than organic matrixes in positive mode and some NPs showed comparable efficiencies for selected analytes in negative mode. Our results suggest that a thermally driven desorption process is a key factor for metal oxide NPs, but chemical interactions are also very important, especially for other NPs. The screening results provide a useful guideline for the selection of NPs in the LDI-MS analysis of small molecules.

  10. Large-Scale Forward Genetic Screening Analysis of Development of Hematopoiesis in Zebrafish

    Institute of Scientific and Technical Information of China (English)

    Kun Wang; Ning Ma; Yiyue Zhang; Wenqing Zhang; Zhibin Huang; Lingfeng Zhao; Wei Liu; Xiaohui Chen; Ping Meng; Qing Lin; Yali Chi; Mengchang Xu

    2012-01-01

    Zebrafish is a powerful model for the investigation of hematopoiesis. In order to isolate novel mutants with hematopoietic defects, large-scale mutagenesis screening of zebrafish was performed. By scoring specific hematopoietic markers, 52 mutants were identified and then classified into four types based on specific phenotypic traits. Each mutant represented a putative mutation of a gene regulating the relevant aspect of hematopoiesis, including early macrophage development, early granulopoiesis, embryonic myelopoiesis, and definitive erythropoiesis/lymphopoiesis. Our method should be applicable to other types of genetic screening in zebrafish. In addition, further study of the mutants we identified may help to unveil the molecular basis of hematopoiesis.

  11. Large-scale virtual screening on public cloud resources with Apache Spark.

    Science.gov (United States)

    Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola

    2017-01-01

    Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, docking a publicly available target receptor against ∼2.2 million compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and it is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
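
    Spark itself is beyond a short excerpt, but the embarrassingly parallel map-then-reduce shape that Spark-VS exploits can be sketched with the Python standard library. Here `mock_dock` is a placeholder for a real docking call, and lower scores are treated as better binding, as with binding free energies:

```python
from concurrent.futures import ThreadPoolExecutor
import hashlib

def mock_dock(smiles):
    """Placeholder for docking one ligand against a fixed receptor; a real
    pipeline would invoke a docking engine here. Returns (ligand, score),
    where lower means better predicted binding."""
    digest = hashlib.md5(smiles.encode("utf-8")).hexdigest()
    return smiles, int(digest[:4], 16) / 65535.0

# Toy molecular "library" of 200 alcohol-like SMILES strings.
library = ["C" * n + "O" for n in range(1, 201)]

# Map the scoring function over the library in parallel, then reduce to a
# ranked hit list -- the same map/reduce pattern Spark distributes over a cluster.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = list(pool.map(mock_dock, library))

top_hits = sorted(scores, key=lambda pair: pair[1])[:10]
```

    Because each docking call is independent, the map step scales out trivially; fault tolerance and cluster scheduling are what the Spark layer adds on top of this basic shape.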

  12. Screening and large-scale expression of membrane proteins in mammalian cells for structural studies

    Science.gov (United States)

    Goehring, April; Lee, Chia-Hsueh; Wang, Kevin H.; Michel, Jennifer Carlisle; Claxton, Derek P.; Baconguis, Isabelle; Althoff, Thorsten; Fischer, Suzanne; Garcia, K. Christopher; Gouaux, Eric

    2014-01-01

    Structural, biochemical and biophysical studies of eukaryotic membrane proteins are often hampered by difficulties in over-expression of the candidate molecule. Baculovirus transduction of mammalian cells (BacMam), although a powerful method to heterologously express membrane proteins, can be cumbersome for screening and expression of multiple constructs. We therefore developed plasmid Eric Gouaux (pEG) BacMam, a vector optimized for use in screening assays, as well as for efficient production of baculovirus and robust expression of the target protein. In this protocol we show how to use small-scale transient transfection and fluorescence-detection, size-exclusion chromatography (FSEC) experiments using a GFP-His8 tagged candidate protein to screen for monodispersity and expression level. Once promising candidates are identified, we describe how to generate baculovirus, transduce HEK293S GnTI− (N-acetylglucosaminyltransferase I-negative) cells in suspension culture, and over-express the candidate protein. We have used these methods to prepare pure samples of chicken acid-sensing ion channel 1a (cASIC1) and Caenorhabditis elegans glutamate-gated chloride channel (GluCl), for X-ray crystallography, demonstrating how to rapidly and efficiently screen hundreds of constructs and accomplish large-scale expression in 4-6 weeks. PMID:25299155

  13. Screening for lung cancer: time for large-scale screening by chest computed tomography.

    Science.gov (United States)

    Shlomi, Dekel; Ben-Avi, Ronny; Balmor, Gingy Ronen; Onn, Amir; Peled, Nir

    2014-07-01

    Lung cancer is the leading cause of cancer death worldwide. Age and smoking are the primary risk factors for lung cancer. Treatment based on surgical removal in the early stages of the disease results in better survival. Screening programmes for early detection that used chest radiography and sputum cytology failed to attain reduction of lung cancer mortality. Screening by low-dose computed tomography (CT) demonstrated high rates of early-stage lung cancer detection in a high-risk population. Nevertheless, no mortality advantage was manifested in small randomised control trials. A large randomised control trial in the U.S.A., the National Lung Screening Trial (NLST), showed a significant relative reduction of 20% in lung cancer mortality and 6.7% reduction in total mortality, yet no reduction was evidenced in the late-stage prevalence. Screening for lung cancer by low-dose CT reveals a high level of false-positive lesions, which necessitates further noninvasive and invasive evaluations. Based primarily on the NLST eligible criteria, new guidelines have recently been developed by major relevant organisations. The overall recommendation coming out of this collective work calls for lung cancer screening by low-dose CT to be performed in medical centres manned by specialised multidisciplinary teams, as well as for a mandatory, pre-screening, comprehensive discussion with the patient about the risks and advantages involved in the process. Lung cancer screening is on the threshold of a new era, with ever more questions still left open to challenge future studies.

  14. Large scale generation of micro-droplet array by vapor condensation on mesh screen piece

    Science.gov (United States)

    Xie, Jian; Xu, Jinliang; He, Xiaotian; Liu, Qi

    2017-01-01

    We developed a novel micro-droplet array system, which is based on a distinct three-dimensional mesh screen structure and on the thermal-fluid performance induced by sintering and oxidation. A mesh screen was sintered onto a copper substrate by bonding the two components. Non-uniform residual stress is generated along the weft wires, with larger stress at the weft wire top location than elsewhere. Oxidation of the sintered package forms micro pits with a few nanograsses at the weft wire top location, due to the stress corrosion mechanism. Nanograsses grow elsewhere, showing hydrophobic behavior. Thus, surface-energy-gradient weft wires are formed. Cooling the structure in a wet air environment nucleates water droplets at the weft wire top location, which is more “hydrophilic” than elsewhere. Droplet size is well controlled by substrate temperature, air humidity and cooling time. Because the warp wires do not contact the copper substrate and there is a larger conductive thermal resistance between warp and weft wires, the warp wires contribute less to condensation and instead function as a supporting structure. Surface energy analysis of drops along the weft wires explains why a droplet array can be generated on the mesh screen piece. Because commercially available material is used, the droplet system is cost-effective and suitable for large-scale use.

  15. Large-scale RNA interference screening in mammalian cells identifies novel regulators of mutant huntingtin aggregation.

    Directory of Open Access Journals (Sweden)

    Tomoyuki Yamanaka

    In polyglutamine (polyQ) diseases, including Huntington's disease (HD), mutant proteins containing an expanded polyQ stretch form aggregates in neurons. Genetic or RNAi screenings in yeast, C. elegans or Drosophila have identified multiple genes modifying polyQ aggregation, a few of which have been confirmed effective in mammals. However, the overall molecular mechanism underlying polyQ protein aggregation in mammalian cells still remains obscure. We here perform RNAi screening in mouse neuro2a cells to identify mammalian modifiers for aggregation of mutant huntingtin, a causative protein of HD. By systematic cell transfection and automated cell image analysis, we screen ∼12,000 shRNA clones and identify 111 shRNAs that either suppress or enhance mutant huntingtin aggregation, without altering its gene expression. Classification of the shRNA targets suggests that genes with various cellular functions, such as gene transcription and protein phosphorylation, are involved in modifying the aggregation. Subsequent analysis suggests that, in addition to the aggregation modifiers sensitive to proteasome inhibition, some of them, such as the transcription factor Tcf20 and the kinases Csnk1d and Pik3c2a, are insensitive to it. As for Tcf20, which contains polyQ stretches at its N-terminus, its binding to mutant huntingtin aggregates is observed in neuro2a cells and in HD model mouse neurons. Notably, except for Pik3c2a, the modifiers identified here are novel. Thus, our first large-scale RNAi screening in a mammalian system identifies previously undescribed genetic players that regulate mutant huntingtin aggregation by several, possibly mammalian-specific, mechanisms.

  16. Chloroplast 2010: a database for large-scale phenotypic screening of Arabidopsis mutants.

    Science.gov (United States)

    Lu, Yan; Savage, Linda J; Larson, Matthew D; Wilkerson, Curtis G; Last, Robert L

    2011-04-01

    Large-scale phenotypic screening presents challenges and opportunities not encountered in typical forward or reverse genetics projects. We describe a modular database and laboratory information management system that was implemented in support of the Chloroplast 2010 Project, an Arabidopsis (Arabidopsis thaliana) reverse genetics phenotypic screen of more than 5,000 mutants (http://bioinfo.bch.msu.edu/2010_LIMS; www.plastid.msu.edu). The software and laboratory work environment were designed to minimize operator error and detect systematic process errors. The database uses Ruby on Rails and Flash technologies to present complex quantitative and qualitative data and pedigree information in a flexible user interface. Examples are presented where the database was used to find opportunities for process changes that improved data quality. We also describe the use of the data-analysis tools to discover mutants defective in enzymes of leucine catabolism (heteromeric mitochondrial 3-methylcrotonyl-coenzyme A carboxylase [At1g03090 and At4g34030] and putative hydroxymethylglutaryl-coenzyme A lyase [At2g26800]) based upon a syndrome of pleiotropic seed amino acid phenotypes that resembles previously described isovaleryl coenzyme A dehydrogenase (At3g45300) mutants. In vitro assay results support the computational annotation of At2g26800 as hydroxymethylglutaryl-coenzyme A lyase.

  17. Substantial improvements in large-scale redocking and screening using the novel HYDE scoring function

    Science.gov (United States)

    Schneider, Nadine; Hindle, Sally; Lange, Gudrun; Klein, Robert; Albrecht, Jürgen; Briem, Hans; Beyer, Kristin; Claußen, Holger; Gastreich, Marcus; Lemmen, Christian; Rarey, Matthias

    2012-06-01

    The HYDE scoring function consistently describes hydrogen bonding, the hydrophobic effect and desolvation. It relies on HYdration and DEsolvation terms which are calibrated using octanol/water partition coefficients of small molecules. We do not use affinity data for calibration, therefore HYDE is generally applicable to all protein targets. HYDE reflects the Gibbs free energy of binding while only considering the essential interactions of protein-ligand complexes. The greatest benefit of HYDE is that it yields a very intuitive atom-based score, which can be mapped onto the ligand and protein atoms. This allows the direct visualization of the score and consequently facilitates analysis of protein-ligand complexes during the lead optimization process. In this study, we validated our new scoring function by applying it in large-scale docking experiments. We could successfully predict the correct binding mode in 93% of complexes in redocking calculations on the Astex diverse set, while our performance in virtual screening experiments using the DUD dataset showed significant enrichment values with a mean AUC of 0.77 across all protein targets with little or no structural defects. As part of these studies, we also carried out a very detailed analysis of the data that revealed interesting pitfalls, which we highlight here and which should be addressed in future benchmark datasets.
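
    The enrichment metric quoted above (mean AUC of 0.77 on DUD) is the standard rank-based ROC AUC. A minimal reference implementation, assuming lower (more negative) scores are better, as with binding free energies:

```python
def roc_auc(active_scores, decoy_scores):
    """Rank-based ROC AUC via the Mann-Whitney statistic: the probability
    that a randomly chosen active outscores (here: scores lower than, as
    with docking energies) a randomly chosen decoy; ties count half."""
    wins = sum(
        1.0 if a < d else 0.5 if a == d else 0.0
        for a in active_scores
        for d in decoy_scores
    )
    return wins / (len(active_scores) * len(decoy_scores))
```

    For example, `roc_auc([-9.1, -8.4], [-7.0, -6.2])` returns 1.0 (perfect separation of actives from decoys), while 0.5 corresponds to random ranking; the 0.77 reported above therefore indicates substantial enrichment.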

  18. ToxDBScan: Large-Scale Similarity Screening of Toxicological Databases for Drug Candidates

    Directory of Open Access Journals (Sweden)

    Michael Römer

    2014-10-01

    We present a new tool for hepatocarcinogenicity evaluation of drug candidates in rodents. ToxDBScan is a web tool offering quick and easy similarity screening of new drug candidates against two large-scale public databases, which contain expression profiles for substances with known carcinogenic profiles: TG-GATEs and DrugMatrix. ToxDBScan uses a set similarity score that computes putative similarity based on similar expression of genes, to identify chemicals with similar genotoxic and hepatocarcinogenic potential. We propose using a discretized representation of expression profiles, which uses only information on up- or down-regulation of genes as relevant features. Therefore, only the deregulated genes are required as input. ToxDBScan provides an extensive report on similar compounds, which includes additional information on compounds, differential genes and pathway enrichments. We evaluated ToxDBScan with expression data from 15 chemicals with known hepatocarcinogenic potential and observed a sensitivity of 88%. Based on the identified chemicals, we achieved perfect classification of the independent test set. ToxDBScan is publicly available from the ZBIT Bioinformatics Toolbox.
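
    The discretized set-similarity idea can be illustrated as follows. The Jaccard index, the fold-change threshold, and the gene names here are assumptions for illustration, not ToxDBScan's exact score or data:

```python
def discretize(profile, threshold=1.0):
    """Collapse a log2 fold-change profile into signed (gene, direction)
    features, keeping only clearly deregulated genes."""
    features = set()
    for gene, lfc in profile.items():
        if lfc >= threshold:
            features.add((gene, "up"))
        elif lfc <= -threshold:
            features.add((gene, "down"))
    return features

def set_similarity(a, b):
    """Jaccard index on deregulated-gene feature sets (an assumed stand-in
    for ToxDBScan's actual set similarity score)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Hypothetical query compound and two hypothetical reference profiles.
query = discretize({"Myc": 2.1, "Tp53": -1.4, "Ccnd1": 1.8, "Gstp1": 0.2})
reference = {
    "compound A": {("Myc", "up"), ("Ccnd1", "up"), ("Tp53", "down")},
    "compound B": {("Cyp2b1", "up"), ("Gstp1", "up")},
}
ranked = sorted(reference, key=lambda c: set_similarity(query, reference[c]),
                reverse=True)
```

    Note how the discretization discards fold-change magnitudes: only the identity and direction of deregulated genes enter the comparison, which is why ToxDBScan needs nothing more than a deregulated-gene list as input.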

  19. Celiac disease markers in patients with liver diseases: A single center large scale screening study

    Institute of Scientific and Technical Information of China (English)

    Pavel Drastich; Eva Honsová; Alena Lodererová; Marcela Jarešová; Aneta Pekáriková; Iva Hoffmanová; Ludmila Tučková

    2012-01-01

    AIM: To study the coincidence of celiac disease, we tested its serological markers in patients with various liver diseases. METHODS: Large-scale screening of serum antibodies against tissue transglutaminase (tTG) and deamidated gliadin using enzyme-linked immunosorbent assay, and of serum antibodies against endomysium using immunohistochemistry, was performed in patients with various liver diseases (n = 962) and patients who underwent liver transplantation (OLTx, n = 523). The expression of tTG in liver tissue samples of patients simultaneously suffering from celiac disease and various liver diseases was examined using immunohistochemistry. The final diagnosis of celiac disease was confirmed by histological analysis of small-intestinal biopsy. RESULTS: We found that 29 of 962 patients (3%) with liver diseases and 5 of 523 patients (0.8%) who underwent OLTx were seropositive for IgA and IgG anti-tTG antibodies. However, celiac disease was biopsy-diagnosed in 16 patients: 4 with autoimmune hepatitis type I, 3 with Wilson's disease, 3 with celiac hepatitis, 2 with primary sclerosing cholangitis, 1 with primary biliary cirrhosis, 1 with Budd-Chiari syndrome, 1 with toxic hepatitis, and 1 with non-alcoholic steatohepatitis. Unexpectedly, the highest prevalence of celiac disease was found in patients with Wilson's disease (9.7%), with which it is only rarely associated. On the other hand, no OLTx patients were diagnosed with celiac disease in our study. A pilot study of the expression of tTG in liver tissue using immunohistochemistry documented the overexpression of this molecule in endothelial cells and periportal hepatocytes of patients simultaneously suffering from celiac disease and toxic hepatitis, primary sclerosing cholangitis or autoimmune hepatitis type I. CONCLUSION: We suggest that screening for celiac disease may be beneficial not only in patients with associated liver diseases, but also in patients with Wilson's disease.

  20. Large-scale screening of a targeted Enterococcus faecalis mutant library identifies envelope fitness factors.

    Directory of Open Access Journals (Sweden)

    Lionel Rigottier-Gois

    Spread of antibiotic resistance among bacteria responsible for nosocomial and community-acquired infections urges for novel therapeutic or prophylactic targets and for innovative pathogen-specific antibacterial compounds. Major challenges are posed by opportunistic pathogens belonging to the low-GC% gram-positive bacteria. Among those, Enterococcus faecalis is a leading cause of hospital-acquired infections associated with life-threatening issues and increased hospital costs. To better understand the molecular properties of enterococci that may be required for virulence, and that may explain the emergence of these bacteria in nosocomial infections, we performed the first large-scale functional analysis of E. faecalis V583, the first vancomycin-resistant isolate from a human bloodstream infection. E. faecalis V583 is within the high-risk clonal complex 2 group, which comprises mostly isolates derived from hospital infections worldwide. We conducted broad-range screenings of candidate genes likely involved in host adaptation (e.g., colonization and/or virulence). For this purpose, a library was constructed of targeted insertion mutations in 177 genes encoding putative surface or stress-response factors. Individual mutants were subsequently tested for their (i) resistance to oxidative stress, (ii) antibiotic resistance, (iii) resistance to opsonophagocytosis, (iv) adherence to the human colon carcinoma Caco-2 epithelial cells, and (v) virulence in a surrogate insect model. Our results identified a number of factors that are involved in the interaction between enterococci and their host environments. Their predicted functions highlight the importance of cell envelope glycopolymers in E. faecalis host adaptation. This study provides a valuable genetic database for understanding the steps leading E. faecalis to opportunistic virulence.

  1. A systematic, large-scale resequencing screen of X-chromosome coding exons in mental retardation.

    NARCIS (Netherlands)

    Tarpey, P.S.; Smith, R.; Pleasance, E.; Whibley, A.; Edkins, S.; Hardy, C.; O'Meara, S.; Latimer, C.; Dicks, E.; Menzies, A.; Stephens, P.; Blow, M.; Greenman, C.; Xue, Y.; Tyler-Smith, C.; Thompson, D.; Gray, K.; Andrews, J.; Barthorpe, S.; Buck, G.; Cole, J.; Dunmore, R.; Jones, D.; Maddison, M.; Mironenko, T.; Turner, R.; Turrell, K.; Varian, J.; West, S.; Widaa, S.; Wray, P.; Teague, J.; Butler, A.; Jenkinson, A.; Jia, M.; Richardson, D.; Shepherd, R.; Wooster, R.; Tejada, M.I.; Martinez, F.; Carvill, G.; Goliath, R.; Brouwer, A.P.M. de; Bokhoven, H. van; Esch, H. van; Chelly, J.; Raynaud, M.; Ropers, H.H.; Abidi, F.E.; Srivastava, A.K.; Cox, J.; Luo, Y.; Mallya, U.; Moon, J.; Parnau, J.; Mohammed, S.; Tolmie, J.L.; Shoubridge, C.; Corbett, M.; Gardner, A.; Haan, E.; Rujirabanjerd, S.; Shaw, M.A.; Vandeleur, L.; Fullston, T.; Easton, D.F.; Boyle, J.; Partington, M.; Hackett, A.; Field, M.; Skinner, C.; Stevenson, R.E.; Bobrow, M.; Turner, G.; Schwartz, C.E.; Gecz, J.; Raymond, F.L.; Futreal, P.A.; Stratton, M.R.

    2009-01-01

    Large-scale systematic resequencing has been proposed as the key future strategy for the discovery of rare, disease-causing sequence variants across the spectrum of human complex disease. We have sequenced the coding exons of the X chromosome in 208 families with X-linked mental retardation (XLMR),

  2. The iBeetle large-scale RNAi screen reveals gene functions for insect development and physiology.

    Science.gov (United States)

    Schmitt-Engel, Christian; Schultheis, Dorothea; Schwirz, Jonas; Ströhlein, Nadi; Troelenberg, Nicole; Majumdar, Upalparna; Dao, Van Anh; Grossmann, Daniela; Richter, Tobias; Tech, Maike; Dönitz, Jürgen; Gerischer, Lizzy; Theis, Mirko; Schild, Inga; Trauner, Jochen; Koniszewski, Nikolaus D B; Küster, Elke; Kittelmann, Sebastian; Hu, Yonggang; Lehmann, Sabrina; Siemanowski, Janna; Ulrich, Julia; Panfilio, Kristen A; Schröder, Reinhard; Morgenstern, Burkhard; Stanke, Mario; Buchholz, Frank; Frasch, Manfred; Roth, Siegfried; Wimmer, Ernst A; Schoppmeier, Michael; Klingler, Martin; Bucher, Gregor

    2015-07-28

    Genetic screens are powerful tools to identify the genes required for a given biological process. However, for technical reasons, comprehensive screens have been restricted to very few model organisms. Therefore, although deep sequencing is revealing the genes of ever more insect species, the functional studies predominantly focus on candidate genes previously identified in Drosophila, which is biasing research towards conserved gene functions. RNAi screens in other organisms promise to reduce this bias. Here we present the results of the iBeetle screen, a large-scale, unbiased RNAi screen in the red flour beetle, Tribolium castaneum, which identifies gene functions in embryonic and postembryonic development, physiology and cell biology. The utility of Tribolium as a screening platform is demonstrated by the identification of genes involved in insect epithelial adhesion. This work transcends the restrictions of the candidate gene approach and opens fields of research not accessible in Drosophila.

  3. ROS-activated ATM-dependent phosphorylation of cytoplasmic substrates identified by large scale phosphoproteomics screen

    DEFF Research Database (Denmark)

    Kozlov, Sergei V; Waardenberg, Ashley J; Engholm-Keller, Kasper

    2016-01-01

    substrates (HMGA1 and UIMC1/RAP80), another five were identified in whole-cell extract phosphoproteomic screens, and the remaining four proteins had not been identified previously in DNA damage response screens. We validated the phosphorylation of three of these proteins (OSR1, HDGF and CCDC82) as ATM...

  4. Data and animal management software for large-scale phenotype screening.

    Science.gov (United States)

    Ching, Keith A; Cooke, Michael P; Tarantino, Lisa M; Lapp, Hilmar

    2006-04-01

    The mouse N-ethyl-N-nitrosourea (ENU) mutagenesis program at the Genomics Institute of the Novartis Research Foundation (GNF) uses MouseTRACS to analyze phenotype screens and manage animal husbandry. MouseTRACS is a Web-based laboratory informatics system that electronically records and organizes mouse colony operations, prints cage cards, tracks inventory, manages requests, and reports Institutional Animal Care and Use Committee (IACUC) protocol usage. For efficient phenotype screening, MouseTRACS identifies mutants, visualizes data, and maps mutations. It displays and integrates phenotype and genotype data using likelihood odds ratio (LOD) plots of genetic linkage between genotype and phenotype. More detailed mapping intervals show individual single nucleotide polymorphism (SNP) markers in the context of phenotype. In addition, dynamically generated pedigree diagrams and inventory reports linked to screening results summarize the inheritance pattern and the degree of penetrance. MouseTRACS displays screening data in tables and uses standard charts such as box plots, histograms, scatter plots, and customized charts looking at clustered mice or cross pedigree comparisons. In summary, MouseTRACS enables the efficient screening, analysis, and management of thousands of animals to find mutant mice and identify novel gene functions. MouseTRACS is available under an open source license at http://www.mousetracs.sourceforge.net.
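    The linkage displays described above rest on LOD scores. As a hedged illustration (generic linkage arithmetic, not MouseTRACS code), the standard two-point LOD for a backcross compares the likelihood of linkage at a given recombination fraction against free recombination:

    ```python
    import math

    def lod_score(recombinants: int, meioses: int, theta: float) -> float:
        """Two-point LOD: log10 of the likelihood of linkage at
        recombination fraction theta versus free recombination (theta = 0.5)."""
        r, n = recombinants, meioses
        linked = (theta ** r) * ((1 - theta) ** (n - r))
        unlinked = 0.5 ** n
        return math.log10(linked / unlinked)

    # With no recombinants among 12 informative meioses at theta = 0.05,
    # the LOD is about 3.35, above the conventional threshold of 3.
    print(round(lod_score(0, 12, 0.05), 2))
    ```

    Plotting such scores across SNP markers, as the system does, localizes the mutation to the interval where the LOD curve peaks.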

  5. Fabrication of Large-Scale Microlens Arrays Based on Screen Printing for Integral Imaging 3D Display.

    Science.gov (United States)

    Zhou, Xiongtu; Peng, Yuyan; Peng, Rong; Zeng, Xiangyao; Zhang, Yong-Ai; Guo, Tailiang

    2016-09-14

    The low-cost large-scale fabrication of microlens arrays (MLAs) with precise alignment, great uniformity of focusing, and good converging performance is of great importance for integral imaging 3D display. In this work, a simple and effective method for large-scale polymer microlens arrays using screen printing has been successfully presented. The results show that the MLAs possess high-quality surface morphology and excellent optical performances. Furthermore, the shape and size of the microlenses, i.e., the diameter, the height, and the distance between adjacent microlenses of the MLAs, can be easily controlled by modifying the reflowing time and the size of open apertures of the screen. MLAs with the neighboring microlenses almost tangent can be achieved under suitable size of open apertures of the screen and reflowing time, which can remarkably reduce the color moiré patterns caused by the stray light between the blank areas of the MLAs in the integral imaging 3D display system, exhibiting much better reconstruction performance.

  6. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites.

    Science.gov (United States)

    Rizk, Mohamed Abdo; El-Sayed, Shimaa Abd El-Salam; Terkawi, Mohamed Alaa; Youssef, Mohamed Ahmed; El Said, El Said El Shirbini; Elsayed, Gehad; El-Khodery, Sabry; El-Ashker, Maged; Elsify, Ahmed; Omar, Mosaab; Salama, Akram; Yokoyama, Naoaki; Igarashi, Ikuo

    2015-01-01

    A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses.

  7. Optimization of a Fluorescence-Based Assay for Large-Scale Drug Screening against Babesia and Theileria Parasites.

    Directory of Open Access Journals (Sweden)

    Mohamed Abdo Rizk

    Full Text Available A rapid and accurate assay for evaluating antibabesial drugs on a large scale is required for the discovery of novel chemotherapeutic agents against Babesia parasites. In the current study, we evaluated the usefulness of a fluorescence-based assay for determining the efficacies of antibabesial compounds against bovine and equine hemoparasites in in vitro cultures. Three different hematocrits (HCTs; 2.5%, 5%, and 10%) were used without daily replacement of the medium. The results of a high-throughput screening assay revealed that the best HCT was 2.5% for bovine Babesia parasites and 5% for equine Babesia and Theileria parasites. The IC50 values of diminazene aceturate obtained by fluorescence and microscopy did not differ significantly. Likewise, the IC50 values of luteolin, pyronaridine tetraphosphate, nimbolide, gedunin, and enoxacin did not differ between the two methods. In conclusion, our fluorescence-based assay uses low HCT and does not require daily replacement of culture medium, making it highly suitable for in vitro large-scale drug screening against Babesia and Theileria parasites that infect cattle and horses.

  8. A large scale screen for genes (3rd chromosome) related to Wingless signaling pathway

    Institute of Scientific and Technical Information of China (English)

    LIN Xin-da (林欣大); LIN Xin-hua; CHENG Jia-an (程家安)

    2004-01-01

    A wing-specific F1 genetic screen was carried out using the powerful Drosophila genetic system, combined with the yeast FRT/FLP and GAL4/UAS systems. From the wing phenotypes and germline clone embryonic cuticle phenotypes observed in these mutant alleles, a number of mutant alleles of known or unknown genes were isolated. Among them, fifteen mutant alleles related to Wingless signal transduction were further isolated; the chromosome arms on which these mutations are located were determined, and their positions on the chromosome were roughly mapped.

  9. Large-scale virtual screening for the identification of new Helicobacter pylori urease inhibitor scaffolds.

    Science.gov (United States)

    Azizian, Homa; Nabati, Farzaneh; Sharifi, Amirhossein; Siavoshi, Farideh; Mahdavi, Mohammad; Amanlou, Massoud

    2012-07-01

    Here, we report a structure-based virtual screening of the ZINC database (containing about five million compounds) by computational docking and the analysis of docking energy calculations followed by in vitro screening against H. pylori urease enzyme. One of the compounds selected showed urease inhibition in the low micromolar range. Barbituric acid and compounds 1a, 1d, 1e, 1f, 1g, 1h were found to be more potent urease inhibitors than the standard inhibitor hydroxyurea, yielding IC50 values of 41.6, 83.3, 66.6, 50, 58.8, and 60 μM, respectively (IC50 of hydroxyurea = 100 μM). 5-Benzylidene barbituric acid has enhanced biological activities compared to barbituric acid. Furthermore, the results indicated that among the substituted 5-benzylidene barbiturates, those with para substitution have higher urease inhibitor activities. This may be because the barbituric acid moiety is closer to the bimetallic nickel center in unsubstituted or para-substituted than in ortho- or meta-substituted analogs, so it has greater chelating ability.

  10. Wireless Smartphone ECG Enables Large-Scale Screening in Diverse Populations.

    Science.gov (United States)

    Haberman, Zachary C; Jahn, Ryan T; Bose, Rupan; Tun, Han; Shinbane, Jerold S; Doshi, Rahul N; Chang, Philip M; Saxon, Leslie A

    2015-05-01

    The ubiquitous presence of internet-connected phones and tablets presents a new opportunity for cost-effective and efficient electrocardiogram (ECG) screening and on-demand diagnosis. Wireless, single-lead real-time ECG monitoring supported by iOS and Android devices can be obtained quickly and on-demand. ECGs can be immediately downloaded and reviewed using any internet browser. We compared the standard 12-lead ECG to the smartphone ECG in healthy young adults, elite athletes, and cardiology clinic patients. Accuracy for determining baseline ECG intervals and rate and rhythm was assessed. In 381 participants, 30-second lead I ECG waveforms were obtained using an iPhone case or iPad. Standard 12-lead ECGs were acquired immediately after the smartphone tracing was obtained. De-identified ECGs were interpreted by automated algorithms and adjudicated by two board-certified electrophysiologists. Both smartphone and standard ECGs detected atrial rate and rhythm, AV block, and QRS delay with equal accuracy. Sensitivities ranged from 72% (QRS delay) to 94% (atrial fibrillation). Specificities were all above 94% for both modalities. Smartphone ECG accurately detects baseline intervals, atrial rate, and rhythm and enables screening in diverse populations. Efficient ECG analysis using automated discrimination and an enhanced smartphone application with notification capabilities are features that can be easily incorporated into the acquisition process. © 2015 Wiley Periodicals, Inc.
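    The sensitivities and specificities reported above are the usual confusion-matrix ratios over the adjudicated readings. A minimal sketch with hypothetical counts (not the study's raw data, which the abstract does not give):

    ```python
    def sensitivity(tp: int, fn: int) -> float:
        """Fraction of true cases the smartphone ECG flagged."""
        return tp / (tp + fn)

    def specificity(tn: int, fp: int) -> float:
        """Fraction of non-cases the smartphone ECG correctly cleared."""
        return tn / (tn + fp)

    # Hypothetical adjudicated counts for one rhythm class:
    # 47 true positives, 3 false negatives, 320 true negatives, 11 false positives.
    sens = sensitivity(47, 3)    # 0.94, i.e. 94%
    spec = specificity(320, 11)  # ~0.967, above the 94% floor reported
    print(f"sensitivity {sens:.0%}, specificity {spec:.1%}")
    ```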

  11. Validation of the Rasch-based Depression Screening in a large scale German general population sample

    Directory of Open Access Journals (Sweden)

    Norra Christine

    2010-09-01

    Full Text Available Abstract Background The study aimed at presenting normative data for both parallel forms of the "Rasch-based Depression Screening (DESC)", to examine its Rasch model conformity and convergent and divergent validity based on a representative sample of the German general population. Methods The sample was selected with the assistance of a demographic consulting company applying a face-to-face interview (N = 2509; mean age = 49.4, SD = 18.2; 55.8% women). Adherence to Rasch model assumptions was determined with analysis of Rasch model fit (infit and outfit), unidimensionality, local independence (principal component factor analysis of the residuals, PCFAR) and differential item functioning (DIF) with regard to participants' age and gender. Norm values were calculated. Convergent and divergent validity was determined through intercorrelations with the depression and anxiety subscales of the Hospital Anxiety and Depression Scale (HADS-D and HADS-A). Results Fit statistics were below critical values. Correlations with HADS-D were rDESC-I = .61 and rDESC-II = .60, whereas correlations with HADS-A were rDESC-I = .62 and rDESC-II = .60. Conclusions This study provided further support for the psychometric quality of the DESC. Both forms of the DESC adhered to Rasch model assumptions and showed intercorrelations with HADS subscales that are in line with the literature. The presented normative data offer important advancements for the interpretation of the questionnaire scores and enhance its usefulness for clinical and research applications.

  12. Large-Scale Screening and Identification of Novel Ebola Virus and Marburg Virus Entry Inhibitors.

    Science.gov (United States)

    Anantpadma, Manu; Kouznetsova, Jennifer; Wang, Hang; Huang, Ruili; Kolokoltsov, Andrey; Guha, Rajarshi; Lindstrom, Aaron R; Shtanko, Olena; Simeonov, Anton; Maloney, David J; Maury, Wendy; LaCount, Douglas J; Jadhav, Ajit; Davey, Robert A

    2016-08-01

    Filoviruses are highly infectious, and no FDA-approved drug therapy for filovirus infection is available. Most work to find a treatment has involved only a few strains of Ebola virus and testing of relatively small drug libraries or compounds that have shown efficacy against other virus types. Here we report the findings of a high-throughput screening of 319,855 small molecules from the Molecular Libraries Small Molecule Repository library for their activities against Marburg virus and Ebola virus. Nine of the most potent, novel compounds that blocked infection by both viruses were analyzed in detail for their mechanisms of action. The compounds inhibited known key steps in the Ebola virus infection mechanism by blocking either cell surface attachment, macropinocytosis-mediated uptake, or endosomal trafficking. To date, very few specific inhibitors of macropinocytosis have been reported. The 2 novel macropinocytosis inhibitors are more potent inhibitors of Ebola virus infection and less toxic than ethylisopropylamiloride, one commonly accepted macropinocytosis inhibitor. Each compound blocked infection of primary human macrophages, indicating their potential to be developed as new antifiloviral therapies.

  13. Large-scale screening of disease model through ENU mutagenesis in mice

    Institute of Scientific and Technical Information of China (English)

    HE Fang; WANG Zixing; ZHAO Jing; BAO Jie; DING Jun; RUAN Haibin; XIE Qing; ZHANG Zuoming; GAO Xiang

    2003-01-01

    Manipulation of the mouse genome has emerged as one of the most important approaches for studying gene function and establishing disease models because of the high homology between the human and mouse genomes. In this study, the chemical mutagen ethylnitrosourea (ENU) was employed for inducing germ cell mutations in male C57BL/6J mice. The first generation (G1) of the backcross of these mutated mice, 3172 in total, was screened for abnormal phenotypes on gross morphology, behavior, learning and memory, auditory brainstem response (ABR), electrocardiogram (ECG), electroretinogram (ERG), flash-visual evoked potential (F-VEP), bone mineral density, and blood sugar level. 595 mice have been identified with specific dominant abnormalities. Fur color changes, eye defects and hearing loss occurred at the highest frequency. Abnormalities related to metabolism alteration are least frequent. Interestingly, eye defects displayed significant left-right asymmetry and sex preference. Sex preference is also observed in mice with abnormal bone mineral density. Among 104 G1 generation mutant mice examined for inheritability, 14 of them have been confirmed for passing abnormal phenotypes to their progenies. However, we did not observe behavior abnormalities of G1 mice to be inheritable, suggesting multi-gene control for these complicated functions in mice. In conclusion, the generation of these mutants paves the way for understanding molecular and cellular mechanisms of these abnormal phenotypes, and accelerates the cloning of disease-related genes.

  14. Project DRIVE: A Compendium of Cancer Dependencies and Synthetic Lethal Relationships Uncovered by Large-Scale, Deep RNAi Screening.

    Science.gov (United States)

    McDonald, E Robert; de Weck, Antoine; Schlabach, Michael R; Billy, Eric; Mavrakis, Konstantinos J; Hoffman, Gregory R; Belur, Dhiren; Castelletti, Deborah; Frias, Elizabeth; Gampa, Kalyani; Golji, Javad; Kao, Iris; Li, Li; Megel, Philippe; Perkins, Thomas A; Ramadan, Nadire; Ruddy, David A; Silver, Serena J; Sovath, Sosathya; Stump, Mark; Weber, Odile; Widmer, Roland; Yu, Jianjun; Yu, Kristine; Yue, Yingzi; Abramowski, Dorothee; Ackley, Elizabeth; Barrett, Rosemary; Berger, Joel; Bernard, Julie L; Billig, Rebecca; Brachmann, Saskia M; Buxton, Frank; Caothien, Roger; Caushi, Justina X; Chung, Franklin S; Cortés-Cros, Marta; deBeaumont, Rosalie S; Delaunay, Clara; Desplat, Aurore; Duong, William; Dwoske, Donald A; Eldridge, Richard S; Farsidjani, Ali; Feng, Fei; Feng, JiaJia; Flemming, Daisy; Forrester, William; Galli, Giorgio G; Gao, Zhenhai; Gauter, François; Gibaja, Veronica; Haas, Kristy; Hattenberger, Marc; Hood, Tami; Hurov, Kristen E; Jagani, Zainab; Jenal, Mathias; Johnson, Jennifer A; Jones, Michael D; Kapoor, Avnish; Korn, Joshua; Liu, Jilin; Liu, Qiumei; Liu, Shumei; Liu, Yue; Loo, Alice T; Macchi, Kaitlin J; Martin, Typhaine; McAllister, Gregory; Meyer, Amandine; Mollé, Sandra; Pagliarini, Raymond A; Phadke, Tanushree; Repko, Brian; Schouwey, Tanja; Shanahan, Frances; Shen, Qiong; Stamm, Christelle; Stephan, Christine; Stucke, Volker M; Tiedt, Ralph; Varadarajan, Malini; Venkatesan, Kavitha; Vitari, Alberto C; Wallroth, Marco; Weiler, Jan; Zhang, Jing; Mickanin, Craig; Myer, Vic E; Porter, Jeffery A; Lai, Albert; Bitter, Hans; Lees, Emma; Keen, Nicholas; Kauffmann, Audrey; Stegmeier, Frank; Hofmann, Francesco; Schmelzle, Tobias; Sellers, William R

    2017-07-27

    Elucidation of the mutational landscape of human cancer has progressed rapidly and been accompanied by the development of therapeutics targeting mutant oncogenes. However, a comprehensive mapping of cancer dependencies has lagged behind and the discovery of therapeutic targets for counteracting tumor suppressor gene loss is needed. To identify vulnerabilities relevant to specific cancer subtypes, we conducted a large-scale RNAi screen in which viability effects of mRNA knockdown were assessed for 7,837 genes using an average of 20 shRNAs per gene in 398 cancer cell lines. We describe findings of this screen, outlining the classes of cancer dependency genes and their relationships to genetic, expression, and lineage features. In addition, we describe robust gene-interaction networks recapitulating both protein complexes and functional cooperation among complexes and pathways. This dataset along with a web portal is provided to the community to assist in the discovery and translation of new therapeutic approaches for cancer. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. The health system and population health implications of large-scale diabetes screening in India: a microsimulation model of alternative approaches.

    Directory of Open Access Journals (Sweden)

    Sanjay Basu

    2015-05-01

    Full Text Available Like a growing number of rapidly developing countries, India has begun to develop a system for large-scale community-based screening for diabetes. We sought to identify the implications of using alternative screening instruments to detect people with undiagnosed type 2 diabetes among diverse populations across India. We developed and validated a microsimulation model that incorporated data from 58 studies from across the country into a nationally representative sample of Indians aged 25-65 y. We estimated the diagnostic and health system implications of three major survey-based screening instruments and random glucometer-based screening. Of the 567 million Indians eligible for screening, depending on which of four screening approaches is utilized, between 158 and 306 million would be expected to screen as "high risk" for type 2 diabetes, and be referred for confirmatory testing. Between 26 million and 37 million of these people would be expected to meet international diagnostic criteria for diabetes, but between 126 million and 273 million would be "false positives." The ratio of false positives to true positives varied from 3.9 (when using random glucose screening) to 8.2 (when using a survey-based screening instrument) in our model. The cost per case found would be expected to be from US$5.28 (when using random glucose screening) to US$17.06 (when using a survey-based screening instrument), presenting a total cost of between US$169 and US$567 million. The major limitation of our analysis is its dependence on published cohort studies that are unlikely fully to capture the poorest and most rural areas of the country. Because these areas are thought to have the lowest diabetes prevalence, this may result in overestimation of the efficacy and health benefits of screening. Large-scale community-based screening is anticipated to produce a large number of false-positive results, particularly if using currently available survey-based screening
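    The headline ratios reduce to simple arithmetic over the simulated counts. A sketch of that arithmetic, with hypothetical per-instrument counts chosen only to illustrate the reported low-end figures (the abstract does not break counts down by instrument):

    ```python
    def fp_tp_ratio(false_pos: float, true_pos: float) -> float:
        """False positives referred for confirmatory testing per true case found."""
        return false_pos / true_pos

    def cost_per_case(total_cost_usd: float, cases_found: float) -> float:
        """Screening spend divided by number of true diabetes cases detected."""
        return total_cost_usd / cases_found

    # Illustrative figures consistent with the ranges above: 126 million false
    # positives against 32.3 million true positives gives the low-end ratio of
    # ~3.9 reported for random glucose screening.
    print(round(fp_tp_ratio(126e6, 32.3e6), 1))  # 3.9
    print(round(cost_per_case(169e6, 32e6), 2))  # 5.28 (US$ per case found)
    ```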

  16. Expanding the mammalian phenotype ontology to support automated exchange of high throughput mouse phenotyping data generated by large-scale mouse knockout screens.

    Science.gov (United States)

    Smith, Cynthia L; Eppig, Janan T

    2015-01-01

    A vast array of data is about to emerge from the large scale high-throughput mouse knockout phenotyping projects worldwide. It is critical that this information is captured in a standardized manner, made accessible, and is fully integrated with other phenotype data sets for comprehensive querying and analysis across all phenotype data types. The volume of data generated by the high-throughput phenotyping screens is expected to grow exponentially, thus, automated methods and standards to exchange phenotype data are required. The IMPC (International Mouse Phenotyping Consortium) is using the Mammalian Phenotype (MP) ontology in the automated annotation of phenodeviant data from high throughput phenotyping screens. 287 new term additions with additional hierarchy revisions were made in multiple branches of the MP ontology to accurately describe the results generated by these high throughput screens. Because these large scale phenotyping data sets will be reported using the MP as the common data standard for annotation and data exchange, automated importation of these data to MGI (Mouse Genome Informatics) and other resources is possible without curatorial effort. Maximum biomedical value of these mutant mice will come from integrating primary high-throughput phenotyping data with secondary, comprehensive phenotypic analyses combined with published phenotype details on these and related mutants at MGI and other resources.

  17. [Large-scale population-based genetic screening and prenatal diagnosis for thalassemias in Zhuhai City of Guangdong Province].

    Science.gov (United States)

    Zhou, Yu-qiu; Shang, Xuan; Yin, Bao-min; Xiong, Fu; Xiao, Qi-zhi; Zhou, Wan-jun; Zhang, Yong-liang; Xu, Xiang-min

    2012-02-01

    To report the results of a preventive control program for severe thalassemias in Zhuhai City of Guangdong Province from 1998 to 2010. As the guidance center for marriage and childbearing and the largest maternity hospital in Zhuhai City of Guangdong Province, Zhuhai Municipal Maternity and Child Healthcare Hospital constructed a genetic screening network for thalassemia testing, follow-up referral and genetic counseling. Couples presenting for premarital medical examination or for regular healthcare examination in pregnancy were enrolled in this preventive control program. A conventional heterozygote screening strategy was used to identify α- and β-thalassemia traits in women and their spouses, according to the standard procedures of hematological phenotype analysis recommended by the Thalassemia International Federation (TIF). Suspected at-risk couples were then diagnosed for α- and β-thalassemia by PCR-based DNA assays. Couples at risk for severe thalassemias were counseled and, with informed consent and on a voluntary basis, offered prenatal diagnosis and termination of pregnancy in case of an affected fetus. From January 1998 to December 2010, 85 522 brides and grooms-to-be for premarital screening and 41 503 pregnant women plus 14 141 partners for prenatal screening were recorded; the coverage rates of premarital screening and prenatal screening in the city were 92.698% (from 1998 to 2003) and 27.667% (from 2004 to 2010), respectively. In total, 10 726 carriers of thalassemias were found, 7393 with α-thalassemia (5.237%, 7393/141 166) and 3333 with β-thalassemia (2.361%, 3333/141 166). A total of 257 couples at risk for severe thalassemias were detected, including 190 for α-thalassemia and 67 for β-thalassemia. Among them, 251 couples (97.7%, 251/257) underwent prenatal diagnosis. During the preventive control program, a total of 72 fetuses with severe thalassemias including hemoglobin H disease
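    The carrier rates quoted above can be reproduced directly from the screening counts; a quick consistency check (assuming, as the fractions in the abstract suggest, that the denominator is the sum of the three screened groups):

    ```python
    # Screened groups reported in the abstract
    premarital = 85_522   # brides and grooms-to-be
    pregnant = 41_503     # pregnant women
    partners = 14_141     # their partners

    screened = premarital + pregnant + partners
    assert screened == 141_166  # matches the denominator used in the abstract

    alpha_carriers, beta_carriers = 7_393, 3_333
    print(f"alpha-thalassemia carrier rate: {alpha_carriers / screened:.3%}")  # 5.237%
    print(f"beta-thalassemia carrier rate:  {beta_carriers / screened:.3%}")   # 2.361%
    ```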

  18. Identification of genes important for cutaneous function revealed by a large scale reverse genetic screen in the mouse.

    Science.gov (United States)

    DiTommaso, Tia; Jones, Lynelle K; Cottle, Denny L; Gerdin, Anna-Karin; Vancollie, Valerie E; Watt, Fiona M; Ramirez-Solis, Ramiro; Bradley, Allan; Steel, Karen P; Sundberg, John P; White, Jacqueline K; Smyth, Ian M

    2014-10-01

    The skin is a highly regenerative organ which plays critical roles in protecting the body and sensing its environment. Consequently, morbidity and mortality associated with skin defects represent a significant health issue. To identify genes important in skin development and homeostasis, we have applied a high throughput, multi-parameter phenotype screen to the conditional targeted mutant mice generated by the Wellcome Trust Sanger Institute's Mouse Genetics Project (Sanger-MGP). A total of 562 different mouse lines were subjected to a variety of tests assessing cutaneous expression, macroscopic clinical disease, histological change, hair follicle cycling, and aberrant marker expression. Cutaneous lesions were associated with mutations in 23 different genes. Many of these were not previously associated with skin disease in the organ (Mysm1, Vangl1, Trpc4ap, Nom1, Sparc, Farp2, and Prkab1), while others were ascribed new cutaneous functions on the basis of the screening approach (Krt76, Lrig1, Myo5a, Nsun2, and Nf1). The integration of these skin specific screening protocols into the Sanger-MGP primary phenotyping pipelines marks the largest reported reverse genetic screen undertaken in any organ and defines approaches to maximise the productivity of future projects of this nature, while flagging genes for further characterisation.

  19. Identification of genes important for cutaneous function revealed by a large scale reverse genetic screen in the mouse.

    Directory of Open Access Journals (Sweden)

    Tia DiTommaso

    2014-10-01

    Full Text Available The skin is a highly regenerative organ which plays critical roles in protecting the body and sensing its environment. Consequently, morbidity and mortality associated with skin defects represent a significant health issue. To identify genes important in skin development and homeostasis, we have applied a high throughput, multi-parameter phenotype screen to the conditional targeted mutant mice generated by the Wellcome Trust Sanger Institute's Mouse Genetics Project (Sanger-MGP). A total of 562 different mouse lines were subjected to a variety of tests assessing cutaneous expression, macroscopic clinical disease, histological change, hair follicle cycling, and aberrant marker expression. Cutaneous lesions were associated with mutations in 23 different genes. Many of these were not previously associated with skin disease in the organ (Mysm1, Vangl1, Trpc4ap, Nom1, Sparc, Farp2, and Prkab1), while others were ascribed new cutaneous functions on the basis of the screening approach (Krt76, Lrig1, Myo5a, Nsun2, and Nf1). The integration of these skin specific screening protocols into the Sanger-MGP primary phenotyping pipelines marks the largest reported reverse genetic screen undertaken in any organ and defines approaches to maximise the productivity of future projects of this nature, while flagging genes for further characterisation.

  20. A vascularized and perfused organ-on-a-chip platform for large-scale drug screening applications.

    Science.gov (United States)

    Phan, Duc T T; Wang, Xiaolin; Craver, Brianna M; Sobrino, Agua; Zhao, Da; Chen, Jerry C; Lee, Lilian Y N; George, Steven C; Lee, Abraham P; Hughes, Christopher C W

    2017-01-31

    There is a growing awareness that complex 3-dimensional (3D) organs are not well represented by monolayers of a single cell type - the standard format for many drug screens. To address this deficiency, and with the goal of improving screens so that drugs with good efficacy and low toxicity can be identified, microphysiological systems (MPS) are being developed that better capture the complexity of in vivo physiology. We have previously described an organ-on-a-chip platform that incorporates perfused microvessels, such that survival of the surrounding tissue is entirely dependent on delivery of nutrients through the vessels. Here we describe an arrayed version of the platform that incorporates multiple vascularized micro-organs (VMOs) on a 96-well plate. Each VMO is independently-addressable and flow through the micro-organ is driven by hydrostatic pressure. The platform is easy to use, requires no external pumps or valves, and is highly reproducible. As a proof-of-concept we have created arrayed vascularized micro tumors (VMTs) and used these in a blinded screen to assay a small library of compounds, including FDA-approved anti-cancer drugs, and successfully identified both anti-angiogenic and anti-tumor drugs. This 3D platform is suitable for efficacy/toxicity screening against multiple tissues in a more physiological environment than previously possible.

  1. Large-scale virtual high-throughput screening for the identification of new battery electrolyte solvents: computing infrastructure and collective properties.

    Science.gov (United States)

    Husch, Tamara; Yilmazer, Nusret Duygu; Balducci, Andrea; Korth, Martin

    2015-02-07

    A volunteer computing approach is presented for the purpose of screening a large number of molecular structures with respect to their suitability as new battery electrolyte solvents. Collective properties like melting, boiling and flash points are evaluated using COSMOtherm and quantitative structure-property relationship (QSPR) based methods, while electronic structure theory methods are used for the computation of electrochemical stability window estimators. Two application examples are presented: first, the results of a previous large-scale screening test (PCCP, 2014, 16, 7919) are re-evaluated with respect to the mentioned collective properties. As a second application example, all reasonable nitrile solvents up to 12 heavy atoms are generated and used to illustrate a suitable filter protocol for picking Pareto-optimal candidates.
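    The "filter protocol for picking Pareto-optimal candidates" mentioned above can be sketched as a standard non-dominated filter. The property pairs below are hypothetical placeholders, assuming one objective to maximize (flash point) and one to minimize (melting point); the paper's actual objectives and values are not reproduced here:

    ```python
    from typing import List, Tuple

    Candidate = Tuple[float, float]  # (flash point degC, melting point degC)

    def dominates(a: Candidate, b: Candidate) -> bool:
        """a dominates b when a's flash point is >= and its melting point is <=,
        with at least one strict inequality."""
        return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

    def pareto_front(candidates: List[Candidate]) -> List[Candidate]:
        """Keep only candidates no other candidate dominates."""
        return [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other != c)]

    # Four hypothetical nitrile solvents; the second is dominated by the first.
    solvents = [(75, -45), (60, -30), (80, -20), (70, -50)]
    print(pareto_front(solvents))  # [(75, -45), (80, -20), (70, -50)]
    ```

    Real protocols would add more objectives (boiling point, electrochemical window) to the tuple; the dominance test generalizes directly.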

  2. A systematic methodology for large scale compound screening: A case study on the discovery of novel S1PL inhibitors.

    Science.gov (United States)

    Deniz, Utku; Ozkirimli, Elif; Ulgen, Kutlu O

    2016-01-01

    A decrease in sphingosine 1-phosphate (S1P) concentration induces migration of pathogenic T cells into the bloodstream, disrupts the central nervous system (CNS), and is implicated in multiple sclerosis (MS), a progressive inflammatory disorder of the CNS, as well as in Alzheimer's disease (AD). A promising treatment alternative for MS and AD is inhibition of the microsomal enzyme sphingosine 1-phosphate lyase (S1PL), which degrades intracellular S1P. This report describes an integrated systematic approach comprising virtual screening, molecular docking, substructure search and molecular dynamics simulation to discover novel S1PL inhibitors. Virtual screening of the ZINC database via ligand-based and structure-based pharmacophore models yielded 10,000 hits. After molecular docking, common substructures of the top-ranking hits were identified. The ligand binding poses were optimized by induced fit docking. MD simulations were performed on the complex structures to determine the stability of the S1PL-ligand complex and to calculate the binding free energy. Selectivity of the selected molecules was examined by docking them to hERG and cytochrome P450 receptors. As a final outcome, 15 compounds from different chemotypes were proposed as potential S1PL inhibitors. These molecules may guide future medicinal chemistry efforts in the discovery of new compounds against the destructive action of pathogenic T cells.

  3. [A large-scale survey of rare blood group screening among blood donors in the Nanjing area of China].

    Science.gov (United States)

    Ma, Ling; Liu, Yan-Chun; Xue, Min; Wei, Peng; Tang, Rong-Cai

    2011-02-01

    The purpose of this study was to investigate the distribution of 10 rare red blood cell groups in the Nanjing population of China, so as to provide compatible rare blood to patients and to create a donor data bank. Jk(a-b-) (Kidd) phenotypes were detected by the urea lysis test, while H- (H), GPA- (MNS), GPC- (Gerbich), i+ (Ii) and Lub- (Lutheran) phenotypes were detected with monoclonal and polyclonal antibodies using U-type 96-well microplate technology. Screening for the Jsb- and k- (Kell), Fya- (Duffy), Ok- (Ok), s- (MNS) and Dib- (Diego) phenotypes was performed by polymerase chain reaction. The results showed that 2 Jk(a-b-) samples out of 40,337 donations and 3 Fy(a-b+) samples out of 1,782 donations were found, while no other rare blood phenotypes (H-, GPA-, GPC-, Lub-, Ok-, s-, Jsb-, k-, Dib- and i+) were detected. It is concluded that the frequencies of Jk(a-b-) and Fy(a-b+) are 0.0049% and 0.168%, respectively. No other rare blood phenotype was found in this screening.
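The reported frequencies follow directly from the screening counts, as a quick arithmetic check confirms:

```python
# Phenotype frequencies (in percent) implied by the reported counts.
jk_null_freq = 2 / 40337 * 100   # Jk(a-b-): 2 positives among 40,337 donations
fy_freq = 3 / 1782 * 100         # Fy(a-b+): 3 positives among 1,782 donations

# Both agree with the frequencies quoted in the abstract.
assert abs(jk_null_freq - 0.0049) < 0.0002   # reported as 0.0049%
assert abs(fy_freq - 0.168) < 0.001          # reported as 0.168%
```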

  4. Reactive Oxygen Species (ROS)-Activated ATM-Dependent Phosphorylation of Cytoplasmic Substrates Identified by Large-Scale Phosphoproteomics Screen.

    Science.gov (United States)

    Kozlov, Sergei V; Waardenberg, Ashley J; Engholm-Keller, Kasper; Arthur, Jonathan W; Graham, Mark E; Lavin, Martin

    2016-03-01

    Ataxia-telangiectasia, mutated (ATM) protein plays a central role in phosphorylating a network of proteins in response to DNA damage. These proteins function in signaling pathways designed to maintain the stability of the genome and minimize the risk of disease by controlling cell cycle checkpoints, initiating DNA repair, and regulating gene expression. ATM kinase can be activated by a variety of stimuli, including oxidative stress. Here, we confirmed activation of cytoplasmic ATM by autophosphorylation at multiple sites. We then employed a global quantitative phosphoproteomics approach to identify cytoplasmic proteins altered in their phosphorylation state in control and ataxia-telangiectasia (A-T) cells in response to oxidative damage. We demonstrated that ATM was activated by oxidative damage in the cytoplasm as well as in the nucleus and identified a total of 9,833 phosphorylation sites, including 6,686 high-confidence sites mapping to 2,536 unique proteins. A total of 62 differentially phosphorylated peptides were identified; of these, 43 were phosphorylated in control but not in A-T cells, and 19 varied in their level of phosphorylation. Motif enrichment analysis of the phosphopeptides revealed that consensus ATM serine-glutamine (SQ) sites were overrepresented. When considering phosphorylation events observed only in control cells (not in A-T cells) at predicted ATM phospho-Ser/Thr-glutamine sites, we narrowed this list to 11 candidate ATM-dependent cytoplasmic proteins. Two of these 11 were previously described as ATM substrates (HMGA1 and UIMC1/RAP80), another five were identified in whole cell extract phosphoproteomic screens, and the remaining four proteins had not been identified previously in DNA damage response screens. We validated the phosphorylation of three of these proteins (oxidative stress responsive 1 (OSR1), HDGF, and CCDC82) as ATM dependent after H2O2 exposure, and another protein (S100A11) demonstrated ATM
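The motif enrichment result rests on the ATM consensus of a serine or threonine followed by glutamine (S/T-Q). A minimal sketch of counting such consensus sites in peptide sequences is shown below; the example peptides are hypothetical, not from the study's data.

```python
import re

def count_stq_sites(peptide):
    """Count S/T-Q (serine or threonine followed by glutamine) consensus sites."""
    return len(re.findall(r"[ST]Q", peptide))

# Hypothetical phosphopeptide sequences.
peptides = ["LSQEAVSQR", "AKTQPLSTA", "GGSAGG"]
counts = [count_stq_sites(p) for p in peptides]  # [2, 1, 0]
```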

  5. Large-scale screening of transcription factor-promoter interactions in spruce reveals a transcriptional network involved in vascular development.

    Science.gov (United States)

    Duval, Isabelle; Lachance, Denis; Giguère, Isabelle; Bomal, Claude; Morency, Marie-Josée; Pelletier, Gervais; Boyle, Brian; MacKay, John J; Séguin, Armand

    2014-06-01

    This research aimed to investigate the role of diverse transcription factors (TFs) and to delineate gene regulatory networks directly in conifers at a relatively high-throughput level. The approach integrated sequence analyses, transcript profiling, and development of a conifer-specific activation assay. Transcript accumulation profiles of 102 TFs and potential target genes were clustered to identify groups of coordinately expressed genes. Several different patterns of transcript accumulation were observed by profiling in nine different organs and tissues: 27 genes were preferential to secondary xylem both in stems and roots, and other genes were preferential to phelloderm and periderm or were more ubiquitous. A robust system has been established as a screening approach to define which TFs have the ability to regulate a given promoter in planta. Trans-activation or repression effects were observed in 30% of TF-candidate gene promoter combinations. As a proof of concept, phylogenetic analysis and expression and trans-activation data were used to demonstrate that two spruce NAC-domain proteins most likely play key roles in secondary vascular growth as observed in other plant species. This study tested many TFs from diverse families in a conifer tree species, which broadens the knowledge of promoter-TF interactions in wood development and enables comparisons of gene regulatory networks found in angiosperms and gymnosperms.

  6. Large Scale Screening of Southern African Plant Extracts for the Green Synthesis of Gold Nanoparticles Using Microtitre-Plate Method.

    Science.gov (United States)

    Elbagory, Abdulrahman M; Cupido, Christopher N; Meyer, Mervin; Hussein, Ahmed A

    2016-11-08

    The preparation of gold nanoparticles (AuNPs) involves a variety of chemical and physical methods. These methods use toxic and environmentally harmful chemicals. Consequently, the synthesis of AuNPs using green chemistry has been under investigation to develop eco-friendly nanoparticles. One approach to achieve this is the use of plant-derived phytochemicals that are capable of reducing gold ions to produce AuNPs. The aim of this study was to implement a facile microtitre-plate method to screen a large number of aqueous plant extracts to determine the optimum concentration (OC) for the bio-synthesis of the AuNPs. Several AuNPs of different sizes and shapes were successfully synthesized and characterized from 17 South African plants. The characterization was done using Ultraviolet-Visible Spectroscopy, Dynamic Light Scattering, High Resolution Transmission Electron Microscopy and Energy-Dispersive X-ray Spectroscopy. We also studied the effects of temperature on the synthesis of the AuNPs and showed that changes in temperature affect the size and dispersity of the generated AuNPs. We also evaluated the stability of the synthesized AuNPs and showed that some of them are stable in biological buffer solutions.

  7. Screening of ARHSP-TCC patients expands the spectrum of SPG11 mutations and includes a large scale gene deletion.

    Science.gov (United States)

    Denora, Paola S; Schlesinger, David; Casali, Carlo; Kok, Fernando; Tessa, Alessandra; Boukhris, Amir; Azzedine, Hamid; Dotti, Maria Teresa; Bruno, Claudio; Truchetto, Jeremy; Biancheri, Roberta; Fedirko, Estelle; Di Rocco, Maja; Bueno, Clarissa; Malandrini, Alessandro; Battini, Roberta; Sickl, Elisabeth; de Leva, Maria Fulvia; Boespflug-Tanguy, Odile; Silvestri, Gabriella; Simonati, Alessandro; Said, Edith; Ferbert, Andreas; Criscuolo, Chiara; Heinimann, Karl; Modoni, Anna; Weber, Peter; Palmeri, Silvia; Plasilova, Martina; Pauri, Flavia; Cassandrini, Denise; Battisti, Carla; Pini, Antonella; Tosetti, Michela; Hauser, Erwin; Masciullo, Marcella; Di Fabio, Roberto; Piccolo, Francesca; Denis, Elodie; Cioni, Giovanni; Massa, Roberto; Della Giustina, Elvio; Calabrese, Olga; Melone, Marina A B; De Michele, Giuseppe; Federico, Antonio; Bertini, Enrico; Durr, Alexandra; Brockmann, Knut; van der Knaap, Marjo S; Zatz, Mayana; Filla, Alessandro; Brice, Alexis; Stevanin, Giovanni; Santorelli, Filippo M

    2009-03-01

    Autosomal recessive spastic paraplegia with thinning of the corpus callosum (ARHSP-TCC) is a complex form of HSP initially described in Japan but subsequently reported to have a worldwide distribution, with a particularly high frequency in multiple families from the Mediterranean basin. We recently showed that ARHSP-TCC is commonly associated with mutations in SPG11/KIAA1840 on chromosome 15q. We have now screened a collection of new patients mainly originating from Italy and Brazil, in order to further ascertain the spectrum of mutations in SPG11, enlarge the ethnic origin of SPG11 patients, determine the relative frequency in individual countries (i.e., Italy), and establish whether there are one or more common mutations. In 25 index cases we identified 32 mutations; 22 are novel, including 9 nonsense, 3 small deletions, 4 insertions, 1 in/del, 1 small duplication, 1 missense, 2 splice-site mutations, and, for the first time, a large genomic rearrangement. This brings the total number of SPG11-mutated patients in the SPATAX collection to 111 cases, in 44 families and 17 isolated cases, from 16 countries, all assessed using homogeneous clinical criteria. While expanding the spectrum of mutations in SPG11, this larger series also corroborated the notion that even within an apparently homogeneous population a molecular diagnosis cannot be achieved without full gene sequencing.


  9. Large-scale screening of metal hydrides for hydrogen storage from first-principles calculations based on equilibrium reaction thermodynamics.

    Science.gov (United States)

    Kim, Ki Chul; Kulkarni, Anant D; Johnson, J Karl; Sholl, David S

    2011-04-21

    Systematic thermodynamics calculations based on density functional theory-calculated energies for crystalline solids have been a useful complement to experimental studies of hydrogen storage in metal hydrides. We report the most comprehensive set of thermodynamics calculations for mixtures of light metal hydrides to date by performing grand canonical linear programming screening on a database of 359 compounds, including 147 compounds not previously examined by us. This database is used to categorize the reaction thermodynamics of all mixtures containing any four non-H elements among Al, B, C, Ca, K, Li, Mg, N, Na, Sc, Si, Ti, and V. Reactions are categorized according to the amount of H(2) that is released and the reaction's enthalpy. This approach identifies 74 distinct single-step reactions having a storage capacity >6 wt.% and zero-temperature heats of reaction 15 ≤ ΔU(0) ≤ 75 kJ mol(-1) H(2). Many of these reactions, however, are likely to be problematic experimentally because of the role of refractory compounds, B(12)H(12)-containing compounds, or carbon. The single most promising reaction identified in this way involves LiNH(2)/LiH/KBH(4), storing 7.48 wt.% H(2) and having ΔU(0) = 43.6 kJ mol(-1) H(2). We also examined the complete range of reaction mixtures to identify multi-step reactions with useful properties; this yielded 23 multi-step reactions of potential interest.
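The screening window described in the abstract (capacity above 6 wt.% and zero-temperature reaction energy between 15 and 75 kJ mol(-1) H(2)) reduces to a simple filter over computed reaction properties. Only the LiNH(2)/LiH/KBH(4) values below are taken from the abstract; the other two entries are invented to illustrate rejection.

```python
def promising(reaction, min_wt_pct=6.0, du_lo=15.0, du_hi=75.0):
    """Apply the screening window: >6 wt.% H2 released and
    15 <= dU0 <= 75 kJ/mol H2."""
    return (reaction["wt_pct_H2"] > min_wt_pct
            and du_lo <= reaction["dU0_kJ_per_mol"] <= du_hi)

reactions = [
    {"mix": "LiNH2/LiH/KBH4", "wt_pct_H2": 7.48, "dU0_kJ_per_mol": 43.6},  # from abstract
    {"mix": "hypothetical-1", "wt_pct_H2": 5.2, "dU0_kJ_per_mol": 40.0},   # too little H2
    {"mix": "hypothetical-2", "wt_pct_H2": 8.1, "dU0_kJ_per_mol": 92.0},   # too endothermic
]
hits = [r["mix"] for r in reactions if promising(r)]  # only the first passes
```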

  10. The use of saliva as a practical and feasible alternative to urine in large-scale screening for congenital cytomegalovirus infection increases inclusion and detection rates

    Directory of Open Access Journals (Sweden)

    Emanuelle Santos de Carvalho Cardoso

    2015-04-01

    Full Text Available INTRODUCTION: Although urine is considered the gold-standard material for the detection of congenital cytomegalovirus (CMV) infection, it can be difficult to obtain in newborns. The aim of this study was to compare the efficiency of detection of congenital CMV infection in saliva and urine samples. METHODS: One thousand newborns were included in the study. Congenital cytomegalovirus deoxyribonucleic acid (DNA) was detected by polymerase chain reaction (PCR). RESULTS: Saliva samples were obtained from all the newborns, whereas urine collection was successful in only 333 cases. There was no statistically significant difference between the use of saliva alone or saliva and urine collected simultaneously for the detection of CMV infection. CONCLUSIONS: Saliva samples can be used in large-scale neonatal screening for CMV infection.

  11. Large-scale microfluidics providing high-resolution and high-throughput screening of Caenorhabditis elegans poly-glutamine aggregation model

    Science.gov (United States)

    Mondal, Sudip; Hegarty, Evan; Martin, Chris; Gökçe, Sertan Kutal; Ghorashian, Navid; Ben-Yakar, Adela

    2016-10-01

    Next generation drug screening could benefit greatly from in vivo studies, using small animal models such as Caenorhabditis elegans for hit identification and lead optimization. Current in vivo assays can operate either at low throughput with high resolution or at high throughput with low resolution. To enable both high-throughput and high-resolution imaging of C. elegans, we developed an automated microfluidic platform. This platform can image 15 z-stacks of ~4,000 C. elegans from 96 different populations using a large-scale chip at micron resolution in 16 min. Using this platform, we screened ~100,000 animals of the poly-glutamine aggregation model on 25 chips. We tested the efficacy of ~1,000 FDA-approved drugs in improving the aggregation phenotype of the model and identified four confirmed hits. This robust platform now enables high-content screening of various C. elegans disease models at the speed and cost of in vitro cell-based assays.
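The quoted figures imply a substantial imaging rate. A back-of-the-envelope check, assuming the 15 z-planes are acquired per animal as the abstract suggests:

```python
animals = 4000          # ~4,000 C. elegans per run
z_planes = 15           # z-stack depth per animal (assumed per-animal)
minutes = 16            # total acquisition time

animals_per_min = animals / minutes            # 250 animals per minute
planes_per_min = animals * z_planes / minutes  # 3,750 z-plane images per minute
```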

  12. Database for exchangeable gene trap clones: pathway and gene ontology analysis of exchangeable gene trap clone mouse lines.

    Science.gov (United States)

    Araki, Masatake; Nakahara, Mai; Muta, Mayumi; Itou, Miharu; Yanai, Chika; Yamazoe, Fumika; Miyake, Mikiko; Morita, Ayaka; Araki, Miyuki; Okamoto, Yoshiyuki; Nakagata, Naomi; Yoshinobu, Kumiko; Yamamura, Ken-ichi; Araki, Kimi

    2014-02-01

    Gene trapping in embryonic stem (ES) cells is a proven method for large-scale random insertional mutagenesis in the mouse genome. We have established an exchangeable gene trap system, in which a reporter gene can be exchanged for any other DNA of interest through Cre/mutant lox-mediated recombination. We isolated trap clones, analyzed trapped genes, and constructed the database for Exchangeable Gene Trap Clones (EGTC) [http://egtc.jp]. The number of registered ES cell lines was 1162 on 31 August 2013. We also established 454 mouse lines from trap ES clones and deposited them in the mouse embryo bank at the Center for Animal Resources and Development, Kumamoto University, Japan. The EGTC database is the most extensive academic resource for gene-trap mouse lines. Because we used a promoter-trap strategy, all trapped genes were expressed in ES cells. To understand the general characteristics of the trapped genes in the EGTC library, we used Kyoto Encyclopedia of Genes and Genomes (KEGG) for pathway analysis and found that the EGTC ES clones covered a broad range of pathways. We also used Gene Ontology (GO) classification data provided by Mouse Genome Informatics (MGI) to compare the functional distribution of genes in each GO term between trapped genes in the EGTC mouse lines and total genes annotated in MGI. We found the functional distributions for the trapped genes in the EGTC mouse lines and for the RefSeq genes for the whole mouse genome were similar, indicating that the EGTC mouse lines had trapped a wide range of mouse genes. © 2014 The Authors Development, Growth & Differentiation © 2014 Japanese Society of Developmental Biologists.

  13. A large-scale complex haploinsufficiency-based genetic interaction screen in Candida albicans: analysis of the RAM network during morphogenesis.

    Directory of Open Access Journals (Sweden)

    Nike Bharucha

    2011-04-01

    Full Text Available The morphogenetic transition between yeast and filamentous forms of the human fungal pathogen Candida albicans is regulated by a variety of signaling pathways. How these pathways interact to orchestrate morphogenesis, however, has not been as well characterized. To address this question and to identify genes that interact with the Regulation of Ace2 and Morphogenesis (RAM) pathway during filamentation, we report the first large-scale genetic interaction screen in C. albicans. Our strategy for this screen was based on the concept of complex haploinsufficiency (CHI). A heterozygous mutant of CBK1 (cbk1Δ/CBK1), a key RAM pathway protein kinase, was subjected to transposon-mediated, insertional mutagenesis. The resulting double heterozygous mutants (6,528 independent strains) were screened for decreased filamentation on Spider Medium (SM). From the 441 mutants showing altered filamentation, 139 transposon insertion sites were sequenced, yielding 41 unique CBK1-interacting genes. This gene set was enriched in transcriptional targets of Ace2 and, strikingly, the cAMP-dependent protein kinase A (PKA) pathway, suggesting an interaction between these two pathways. Further analysis indicates that the RAM and PKA pathways co-regulate a common set of genes during morphogenesis and that hyperactivation of the PKA pathway may compensate for loss of RAM pathway function. Our data also indicate that the PKA-regulated transcription factor Efg1 primarily localizes to yeast phase cells, while the RAM-pathway-regulated transcription factor Ace2 localizes to daughter nuclei of filamentous cells, suggesting that Efg1 and Ace2 regulate a common set of genes at separate stages of morphogenesis. Taken together, our observations indicate that CHI-based screening is a useful approach to genetic interaction analysis in C. albicans and support a model in which these two pathways regulate a common set of genes at different stages of filamentation.

  14. Large-Scale Functional RNAi Screen in C. elegans Identifies TGF-β and Notch Signaling Pathways as Modifiers of CACNA1A

    Directory of Open Access Journals (Sweden)

    Maria da Conceição Pereira

    2016-03-01

    Full Text Available Variants in CACNA1A, which encodes the pore-forming α1-subunit of human voltage-gated Cav2.1 (P/Q-type) Ca2+ channels, cause several autosomal-dominant neurologic disorders, including familial hemiplegic migraine type 1, episodic ataxia type 2, and spinocerebellar ataxia type 6. To identify modifiers of incoordination in movement disorders, we performed a large-scale functional RNAi screen, using the Caenorhabditis elegans strain CB55, which carries a truncating mutation in the unc-2 gene, the worm ortholog of human CACNA1A. The screen was carried out by the feeding method in 96-well liquid culture format, using the ORFeome v1.1 feeding library, and time-lapse imaging of worms in liquid culture was used to assess changes in thrashing behavior. We looked for genes that, when silenced, either ameliorated the slow and uncoordinated phenotype of unc-2 or interacted to produce a more severe phenotype. Of the 350 putative hits from the primary screen, 37 genes consistently showed reproducible results. At least 75% of these are specifically expressed in C. elegans neurons. Functional network analysis and gene ontology revealed overrepresentation of genes involved in development, growth, locomotion, signal transduction, and vesicle-mediated transport. We have expanded the functional network of genes involved in neurodegeneration leading to cerebellar ataxia related to unc-2/CACNA1A, further confirming the involvement of the transforming growth factor β pathway and adding a novel signaling cascade, the Notch pathway.

  15. Large-scale RNAi screen of G protein-coupled receptors involved in larval growth, molting and metamorphosis in the red flour beetle

    Directory of Open Access Journals (Sweden)

    Shah Kapil

    2011-08-01

    Full Text Available Abstract Background The G protein-coupled receptors (GPCRs) belong to the largest superfamily of integral cell membrane proteins and play crucial roles in physiological processes including behavior, development and reproduction. Because of their broad and diverse roles in cellular signaling, GPCRs are the therapeutic targets for many prescription drugs. However, there is no commercial pesticide targeting insect GPCRs. In this study, we employed functional genomics methods and used the red flour beetle, Tribolium castaneum, as a model system to study the physiological roles of GPCRs during larval growth, molting and metamorphosis. Results A total of 111 non-sensory GPCRs were identified in the T. castaneum genome. Thirty-nine of them were not reported previously. A large-scale RNA interference (RNAi) screen was used to study the function of all these GPCRs during immature stages. Double-stranded RNA (dsRNA)-mediated knockdown of genes coding for eight GPCRs caused severe developmental arrest and ecdysis failure (with more than 90% mortality after dsRNA injection). These GPCRs include the dopamine-2-like receptor (TC007490/D2R) and the latrophilin receptor (TC001872/Cirl). The majority of larvae injected with TC007490/D2R dsRNA died during the larval stage prior to entering the pupal stage, suggesting that this GPCR is essential for larval growth and development. Conclusions The results from our study revealed the physiological roles of some GPCRs in T. castaneum. These findings could help in the development of novel pesticides targeting these GPCRs.

  16. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    World-famous architects today challenge the exposure of concrete in their architecture. It is my hope to be able to complement these. I try to develop new aesthetic potentials for concrete and ceramics, at large scales that have not been seen before in the ceramic area. It is expected to result...

  17. Cellular dissection of the spinal cord motor column by BAC transgenesis and gene trapping in zebrafish

    Directory of Open Access Journals (Sweden)

    Kazuhide Asakawa

    2013-05-01

    Full Text Available Bacterial artificial chromosome (BAC) transgenesis and gene/enhancer trapping are effective approaches for identification of genetically defined neuronal populations in the central nervous system (CNS). Here, we applied these techniques to zebrafish (Danio rerio) in order to obtain insights into the cellular architecture of the axial motor column in vertebrates. First, by using the BAC for the Mnx class homeodomain protein gene mnr2b/mnx2b, we established the mnGFF7 transgenic line expressing the Gal4FF transcriptional activator in a large part of the motor column. Single cell labelling of Gal4FF-expressing cells in the mnGFF7 line enabled a detailed investigation of the morphological characteristics of individual spinal motoneurons, as well as the overall organisation of the motor column in a spinal segment. Secondly, from a large-scale gene trap screen, we identified transgenic lines that marked discrete subpopulations of spinal motoneurons with Gal4FF. Molecular characterisation of these lines led to the identification of the ADAMTS3 gene, which encodes an evolutionarily conserved ADAMTS family peptidase and is dynamically expressed in the ventral spinal cord. The transgenic fish established here, along with the identified gene, should facilitate an understanding of the cellular and molecular architecture of the spinal cord motor column and its connection to muscles in vertebrates.

  18. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control-supporting tool for daily operation and performance prediction of central solar heating plants. Finally, the CSHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers and breakthrough of the technology is given.

  19. Study on the breaking of the crossbeam of large-scale banana screens and solutions

    Institute of Scientific and Technical Information of China (English)

    王二锋; 王新文

    2011-01-01

    The large-scale banana screen is applied in classification, desliming, medium removal and dewatering operations in coal preparation plants. In actual application, large-scale banana screens have exhibited problems such as breaking of the crossbeam, cracking of the side plates and crushing of the springs. Among these problems, breaking of the crossbeam is the most common, and it has seriously affected normal production in coal preparation plants. Taking the BLS3661 large-scale banana screen as an example, the breaking of the crossbeam is studied and solutions are proposed.

  20. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but, unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.


  2. Deciphering Seed Sequence Based Off-Target Effects in a Large-Scale RNAi Reporter Screen for E-Cadherin Expression.

    Directory of Open Access Journals (Sweden)

    Robert Adams

    Full Text Available Functional RNAi-based screening is affected by large numbers of false-positive and false-negative hits due to prevalent sequence-based off-target effects. We performed a druggable genome-targeting siRNA screen intended to identify novel regulators of E-cadherin (CDH1) expression, a known key player in epithelial-mesenchymal transition (EMT). Analysis of the primary screening results indicated a large number of false-positive hits. To address these difficulties we developed an analysis method, SENSORS, which, similar to published methods, is a seed-enrichment strategy for analyzing siRNA off-targets in RNAi screens. Using our approach, we were able to demonstrate that accounting for seed-based off-target effects stratifies primary screening results and enables the discovery of additional screening hits. Whereas traditional hit-detection methods are prone to false positives that go undetected, we were able to identify false-positive hits robustly. The transcription factor MYBL1 was identified as a putative novel target required for CDH1 expression and verified experimentally. No siRNA pool targeting MYBL1 was present in the siRNA library used. Instead, MYBL1 was identified as a putative CDH1-regulating target solely on the basis of the SENSORS off-target score, i.e. as a gene that causes off-target effects which down-regulate E-cadherin expression.
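The general seed-enrichment idea behind methods of this kind can be sketched as follows (a minimal illustration of the concept, not the published SENSORS algorithm; all sequences are invented): count how often each 7-mer seed (guide-strand positions 2-8) appears among scoring siRNAs relative to the whole library, so that seeds carried by suspiciously many hits flag off-target-driven phenotypes.

```python
from collections import Counter

def seed(guide):
    """7-mer seed region: positions 2-8 of the siRNA guide strand."""
    return guide[1:8]

def seed_enrichment(library, hits):
    """Enrichment of each seed among hits vs. its library-wide frequency."""
    lib_counts = Counter(seed(g) for g in library)
    hit_counts = Counter(seed(g) for g in hits)
    return {s: (h / len(hits)) / (lib_counts[s] / len(library))
            for s, h in hit_counts.items()}

# Invented toy library; the first three guides share the seed GCAUGCU.
library = ["AGCAUGCUAGCUAGCUAGC", "UGCAUGCUAAAAAGCUAGC",
           "CGCAUGCUCCCCAGCUAGC", "AUUUACGUAGCUAGCUAGC"]
hits = ["AGCAUGCUAGCUAGCUAGC", "UGCAUGCUAAAAAGCUAGC"]
scores = seed_enrichment(library, hits)
print(scores)
```

A score well above 1 for a seed means the phenotype tracks the seed rather than the intended target, which is exactly the signature exploited to call genes such as MYBL1 from off-target behaviour.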

  3. The Nonmydriatic Fundus Camera in Diabetic Retinopathy Screening: A Cost-Effective Study with Evaluation for Future Large-Scale Application

    Science.gov (United States)

    Scarpa, Giuseppe; Urban, Francesca; Tessarin, Michele; Gallo, Giovanni; Midena, Edoardo

    2016-01-01

    Aims. The study aimed to present the experience of a screening programme for early detection of diabetic retinopathy (DR) using a nonmydriatic fundus camera, evaluating its feasibility in terms of validity, resource absorption, and the future advantages of a potential application in an Italian local health authority. Methods. Diabetic patients living in the town of Ponzano, Veneto Region (Northern Italy), were invited to enrol in the screening programme. The “no prevention strategy”, with the inclusion of the estimation of blindness-related costs, was compared with screening costs in order to evaluate a future extensive and feasible implementation of the procedure, through a budget impact approach. Results. Of the 498 eligible diabetic patients, 80% were enrolled in the screening programme. 115 patients (34%) were referred to an ophthalmologist and 9 cases required prompt treatment for either proliferative DR or macular edema. Based on the pilot data, it emerged that an extensive use of the investigated screening programme, within the Greater Treviso area, could prevent 6 cases of blindness every year, resulting in a saving of €271,543.32 (−13.71%). Conclusions. Fundus images obtained with a nonmydriatic fundus camera could be considered an effective, cost-sparing, and feasible screening tool for the early detection of DR, preventing blindness as a result of diabetes. PMID:27885337

  4. Implementation and Evaluation of a Large-Scale Teleretinal Diabetic Retinopathy Screening Program in the Los Angeles County Department of Health Services.

    Science.gov (United States)

    Daskivich, Lauren P; Vasquez, Carolina; Martinez, Carlos; Tseng, Chi-Hong; Mangione, Carol M

    2017-05-01

    Diabetic retinopathy (DR) is the leading cause of blindness in adults of working age in the United States. In the Los Angeles County safety net, a nonvertically integrated system serving underinsured and uninsured patients, the prevalence of DR is approximately 50%, and owing to limited specialty care resources, the average wait times for screening for DR have been 8 months or more. To determine whether a primary care-based teleretinal DR screening (TDRS) program reduces wait times for screening and improves timeliness of needed care in the Los Angeles County safety net. Quasi-experimental, pretest-posttest evaluation of exposure to primary care-based TDRS at 5 of 15 Los Angeles County Department of Health Services safety net clinics from September 1, 2013, to December 31, 2015, with a subgroup analysis of random samples of 600 patients before and after the intervention (1200 total). Primary care clinic-based teleretinal screening for DR. Annual rates of screening for DR before and after implementation of the TDRS program across the 5 clinics, time to screening for DR in a random sample of patients from these clinics, and a description of the larger framework of program implementation. Among the 21 222 patients who underwent the screening (12 790 female, 8084 male, and 348 other gender or not specified; mean [SD] age, 57.4 [9.6] years), the median time to screening for DR decreased from 158 days (interquartile range, 68-324 days) before the intervention to 17 days (interquartile range, 8-50 days) after initiation of the program (P < .001). Overall annual screening rates for DR increased from 5942 of 14 633 patients (40.6%) before implementation to 7470 of 13 133 patients (56.9%) after initiation of the program at all 15 targeted clinics (odds ratio, 1.9; 95% CI, 1.3-2.9; P = .002). Of the 21 222 patients who were screened, 14 595 (68.8%) did not require referral to an eye care professional, 4160 (19.6%) were referred for treatment or monitoring

  5. [A community-based genetic screening of large-scale population and prenatal diagnosis for alpha and beta thalassemia in Zhuhai city of Guangdong province].

    Science.gov (United States)

    Zhou, Yu-qiu; Mo, Qiu-hua; Lu, Jin-han; Li, Li-yan; Liang, Xiong; Jia, Shi-qi; Xiao, Ge-fei; Zhou, Wan-jun; Xiao, Qi-zhi; Xu, Xiang-min

    2008-06-01

    To describe a community-based model for the prevention and control of severe alpha and beta thalassemias in Zhuhai city of Guangdong province. Couples attending premarital medical examination or regular healthcare examination in pregnancy were enrolled in this prospective screening program, which was supported by a two-level network composed of 6 local hospitals for thalassemia testing and follow-up genetic counseling. A conventional heterozygote screening strategy was used to determine alpha and beta thalassemia traits in women and their partners according to the standard procedures of hematological phenotype analysis. Confirmative diagnosis of alpha and beta thalassemia was then performed on couples suspected to be at risk for severe thalassemia, using PCR-based molecular diagnostic assays. The couples at risk for severe thalassemia were counseled and offered prenatal diagnosis and termination of pregnancy in case of an affected fetus. During the period between January 1998 and December 2005, the screened records included 85522 young females and their partners for premarital screening and 10439 pregnant women for prenatal screening, with 71.38% coverage of the total population recorded in this city for premarital screening. In total, 6563 individuals were found to be carriers of thalassemias: 4312 for alpha thalassemia (4.5%) and 2251 for beta thalassemia (2.3%), respectively. One hundred and forty-eight couples were diagnosed to be at risk for thalassemias, including 103 for alpha thalassemia and 45 for beta thalassemia. Successful prenatal diagnosis was made for 142 (98 for alpha thalassemia and 44 for beta thalassemia) out of 148 (95.9%) pregnancies at risk for severe thalassemias. Twenty-three cases of hydrops fetalis, 4 of Hb H disease and 14 of beta thalassemia were identified. All 41 pregnancies with affected fetuses were voluntarily terminated.
Thus, this has led to a marked decrease of severe

  6. Decision aid on breast cancer screening reduces attendance rate: results of a large-scale, randomized, controlled study by the DECIDEO group.

    Science.gov (United States)

    Bourmaud, Aurelie; Soler-Michel, Patricia; Oriol, Mathieu; Regnier, Véronique; Tinquaut, Fabien; Nourissat, Alice; Bremond, Alain; Moumjid, Nora; Chauvin, Franck

    2016-03-15

    Controversies regarding the benefits of breast cancer screening programs have led to the promotion of new strategies taking individual preferences into account, such as decision aids. The aim of this study was to assess the impact of a decision-aid leaflet on the participation of women invited to take part in a national breast cancer screening program. This was a randomized, multicentre, controlled trial. Women aged 50 to 74 years were randomly assigned to receive either a decision aid or the usual invitation letter. The primary outcome was the participation rate 12 months after the invitation. 16 000 women were randomized and 15 844 included in the modified intention-to-treat analysis. The participation rate in the intervention group was 40.25% (3174/7885 women) compared with 42.13% (3353/7959) in the control group (p = 0.02). Previous attendance for screening was strongly associated with participation (RR = 6.24; 95% CI: 5.75-6.77). The decision aid reduced the participation rate, apparently by activating the women's decision-making process toward non-attendance. These results show the importance of promoting informed patient choices, especially when those choices cannot be anticipated.
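The headline comparison in this record can be reproduced with a standard two-proportion z-test under a normal approximation (a generic sketch, not the DECIDEO group's actual analysis):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Normal-approximation z statistic for comparing two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return p1, p2, (p1 - p2) / se

# Counts as reported: 3174/7885 (intervention) vs 3353/7959 (control)
p_int, p_ctl, z = two_proportion_z(3174, 7885, 3353, 7959)
print(f"intervention {p_int:.2%}, control {p_ctl:.2%}, z = {z:.2f}")
```

The z statistic of about -2.4 is consistent with the reported two-sided p = 0.02.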

  7. Preliminary design of large-scale origin antiresonant vibrating screen

    Institute of Scientific and Technical Information of China (English)

    马超; 邵帅; 吴腾健; 王珊; 赵祥镇雄; 党梦飞; 王新文

    2014-01-01

    The side walls of traditional large vibrating-screen boxes are prone to cracking, the vibrating mass is large, and vibration isolation is poor. To resolve these problems, anti-resonance theory was applied to the design of the vibrating screen. A model of the antiresonant vibrating screen was built and its equation of motion derived, yielding the steady-state response amplitudes of the screen box and the lower mass. From the amplitude-frequency characteristic curve of the two-mass system, the excitation frequency of the antiresonant vibrating screen at its working point was calculated. A 3D model of the antiresonant vibrating screen was produced in Solidworks, completing the preliminary design. The results show that an inclined mounting surface, on which the exciter position can be adjusted, allows the line of the exciting force to pass through the centroid of the upper mass, ensuring normal operation of the screen box. Spring seats consisting of cylindrical coil springs connect the upper mass, the lower mass and the ground, providing the elastic coefficients required by the system. The amplitudes of the upper and lower masses can be monitored and controlled through proportional-integral-derivative (PID) control, and the amplitude of the origin antiresonant vibrating screen can be stabilized by adjusting the excitation frequency.
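The antiresonance point exploited by such a design can be sketched with an undamped two-mass model: if the exciting force acts on mass 1 (the screen box), its steady-state amplitude vanishes when the excitation frequency equals sqrt(k2/m2), the natural frequency of the attached mass-spring absorber. The parameter values below are hypothetical, not the paper's design figures.

```python
import math

def amplitudes(F, m1, m2, k1, k2, w):
    """Steady-state amplitudes of an undamped two-mass system;
    the harmonic force F acts on mass 1 (the screen box)."""
    det = (k1 + k2 - m1 * w**2) * (k2 - m2 * w**2) - k2**2
    x1 = F * (k2 - m2 * w**2) / det   # screen-box amplitude
    x2 = F * k2 / det                 # absorber-mass amplitude
    return x1, x2

# Hypothetical parameters (kg, N/m); antiresonance at w = sqrt(k2/m2)
m1, m2, k1, k2 = 1000.0, 500.0, 4.0e5, 2.0e5
w_anti = math.sqrt(k2 / m2)
x1, x2 = amplitudes(1.0e4, m1, m2, k1, k2, w_anti)
print(w_anti, x1, x2)
```

At the antiresonance frequency the screen-box amplitude is exactly zero while the absorber mass oscillates, which is why tuning the excitation frequency stabilizes the screen box.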

  8. Large-scale virtual high-throughput screening for the identification of new battery electrolyte solvents: evaluation of electronic structure theory methods.

    Science.gov (United States)

    Korth, Martin

    2014-05-07

    The performance of semi-empirical quantum mechanical (SQM), density functional theory (DFT) and wave function theory (WFT) methods is evaluated for the purpose of screening a large number of molecular structures with respect to their electrochemical stability to identify new battery electrolyte solvents. Starting from 100,000 database entries and based on more than 46,000 DFT calculations, 83 candidate molecules are identified and then used for benchmarking lower-level computational models (SQM, DFT) with respect to higher-level WFT reference data. A combination of SQM and WFT methods is suggested as a screening strategy at the electronic structure theory level. Using a subset of over 11,000 typical organic molecules and based on over 22,000 high-level WFT calculations, several simple models are tested for the prediction of ionization potentials (IPs) and electron affinities (EAs). Reference data are made available for the development of more sophisticated QSPR models.
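A minimal sketch of the kind of simple model benchmarked here: a Koopmans-type linear regression from frontier-orbital energies to vertical IP and EA. The coefficients and orbital energies below are hypothetical placeholders; in practice the coefficients would be fitted against the high-level WFT reference data.

```python
def predict_ip_ea(e_homo, e_lumo, a_ip=-0.9, b_ip=1.9, a_ea=-0.8, b_ea=-1.1):
    """Koopmans-type linear QSPR: IP ~ a*E_HOMO + b, EA ~ a*E_LUMO + b.
    Coefficients are hypothetical; fit them to WFT reference data."""
    return a_ip * e_homo + b_ip, a_ea * e_lumo + b_ea

# Screen a few candidate solvents (orbital energies in eV, illustrative)
candidates = {"ethylene_carbonate": (-8.6, 1.1),
              "dimethyl_carbonate": (-8.2, 1.4)}
for name, (homo, lumo) in candidates.items():
    ip, ea = predict_ip_ea(homo, lumo)
    print(f"{name}: IP ~ {ip:.2f} eV, EA ~ {ea:.2f} eV")
```

A high predicted IP suggests oxidative stability and a low EA suggests reductive stability, so ranking candidates by these two numbers is a cheap first filter before DFT or WFT refinement.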

  9. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  10. Second-generation sequencing supply an effective way to screen RNAi targets in large scale for potential application in pest insect control.

    Science.gov (United States)

    Wang, Yubing; Zhang, Hao; Li, Haichao; Miao, Xuexia

    2011-04-11

    The success of the RNAi approach for potential insect pest control depends mainly on careful target selection and a convenient delivery system. We adopted second-generation sequencing technology to screen RNAi targets. Illumina's RNA-seq and digital gene expression tag profiling (DGE-tag) technologies were used to screen optimal RNAi targets from Ostrinia furnacalis. In total, 14,690 stage-specific genes were obtained that can be considered potential targets, and 47 were confirmed by qRT-PCR. Ten larval-stage-specific expression genes were selected for RNAi tests. When 50 ng/µl dsRNAs of the genes DS10 and DS28 were sprayed directly on newly hatched larvae placed on filter paper, larval mortalities were around 40∼50%, whereas when the dsRNAs of the ten genes were sprayed on the larvae along with artificial diet, mortalities reached 73% to 100% at 5 d after treatment. qRT-PCR analysis verified the correlation between larval mortality and the down-regulation of target gene expression. Topically applied fluorescent dsRNA confirmed that dsRNA did penetrate the body wall and circulate in the body cavity. It seems likely that the combination of DGE-tag with RNA-seq is a rapid, high-throughput, low-cost and easy way to select candidate target genes for RNAi. More importantly, it demonstrated that dsRNAs are able to penetrate the integument and cause larval developmental stunting and/or death in a lepidopteran insect. This finding greatly broadens target selection for RNAi from gut-specific genes to targets in the whole insect and may lead to new strategies for designing RNAi-based technology against insect damage.
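Stage-specific target selection of the kind described can be sketched as a fold-change filter over per-stage tag counts. The counts and the simple rule below are illustrative; the study used Illumina DGE-tag statistics rather than this toy filter.

```python
def stage_specific(expr, stage, min_fold=10.0, pseudo=1.0):
    """Genes whose count in `stage` exceeds `min_fold` times the
    maximum count in every other stage (pseudocount avoids /0)."""
    hits = []
    for gene, counts in expr.items():
        target = counts[stage]
        rest = max(c for s, c in counts.items() if s != stage)
        if (target + pseudo) / (rest + pseudo) >= min_fold:
            hits.append(gene)
    return hits

# Tag counts per developmental stage (illustrative, not study data)
expr = {
    "DS10": {"egg": 2, "larva": 480, "pupa": 5, "adult": 0},
    "DS28": {"egg": 0, "larva": 210, "pupa": 1, "adult": 3},
    "hkp1": {"egg": 300, "larva": 320, "pupa": 310, "adult": 305},
}
print(stage_specific(expr, "larva"))
```

Housekeeping-like genes expressed in all stages (here `hkp1`) are filtered out, leaving larva-specific candidates for dsRNA testing.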

  11. Automated large-scale culture and medium-throughput chemical screen for modulators of proliferation and viability of human induced pluripotent stem cell-derived neuroepithelial-like stem cells.

    Science.gov (United States)

    McLaren, Donna; Gorba, Thorsten; Marguerie de Rotrou, Anita; Pillai, Gopalan; Chappell, Clare; Stacey, Alison; Lingard, Sarah; Falk, Anna; Smith, Austin; Koch, Philipp; Brüstle, Oliver; Vickers, Richard; Tinsley, Jon; Flanders, David; Bello, Paul; Craig, Stewart

    2013-03-01

    The aim of this study was to demonstrate proof-of-concept feasibility for the use of human neural stem cells (NSCs) in high-throughput screening (HTS) applications. For this study, an adherent human induced pluripotent stem (iPS) cell-derived long-term, self-renewing, neuroepithelial-like stem (lt-NES) cell line was selected as a representative NSC. Here, we describe the automated large-scale serum-free culture ("scale-up") of human lt-NES cells on the CompacT SelecT cell culture robotic platform, followed by their subsequent automated "scale-out" into a microwell plate format. We also report a medium-throughput screen of 1000 compounds to identify modulators of neural stem cell proliferation and/or survival. The screen was performed on two independent occasions using a cell viability assay with end-point reading, resulting in the identification of 24 potential hit compounds, 5 of which were found to increase the proliferation and/or survival of human lt-NES cells on both occasions. Follow-up studies confirmed a dose-dependent effect of one of the hit compounds, a Cdk-2 modulator. This approach could be further developed as part of a strategy to screen compounds either to improve procedures for the in vitro expansion of neural stem cells or to potentially modulate endogenous neural stem cell behavior in the diseased nervous system.
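Hit calling in a two-run viability screen of this kind is commonly done with robust plate-wise z-scores, requiring a compound to score in both independent runs. The sketch below is a generic illustration with invented read-outs and thresholds, not the study's actual analysis.

```python
import statistics

def z_scores(values):
    """Robust z-score of each well against the plate median and MAD."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-9
    return [(v - med) / (1.4826 * mad) for v in values]

def hits(run1, run2, threshold=3.0):
    """Compounds scoring above threshold in BOTH independent runs."""
    z1, z2 = z_scores(run1), z_scores(run2)
    return [i for i, (a, b) in enumerate(zip(z1, z2))
            if a >= threshold and b >= threshold]

# Viability read-outs for 8 compounds in two independent runs (invented)
run1 = [100, 98, 101, 99, 160, 102, 97, 155]
run2 = [101, 99, 100, 98, 158, 140, 96, 152]
print(hits(run1, run2))
```

Compound 5 scores only in the second run and is rejected, mirroring how requiring reproducibility across occasions shrinks 24 potential hits to a confirmed subset.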

  12. Large scale screening of digeneans for Neorickettsia endosymbionts using real-time PCR reveals new Neorickettsia genotypes, host associations and geographic records.

    Directory of Open Access Journals (Sweden)

    Stephen E Greiman

    Full Text Available Digeneans are endoparasitic flatworms with complex life cycles including one or two intermediate hosts (the first of which is always a mollusk) and a vertebrate definitive host. Digeneans may harbor intracellular endosymbiotic bacteria belonging to the genus Neorickettsia (order Rickettsiales, family Anaplasmataceae). Some Neorickettsia are able to invade cells of the digenean's vertebrate host and are known to cause diseases of wildlife and humans. In this study we report the results of screening 771 digenean samples for Neorickettsia, collected from various vertebrates in terrestrial, freshwater, brackish, and marine habitats in the United States, China and Australia. Neorickettsia were detected using a newly designed real-time PCR protocol targeting a 152 bp fragment of the heat shock protein coding gene GroEL, and verified with nested PCR and sequencing of a 1371 bp long region of 16S rRNA. Eight isolates of Neorickettsia have been obtained. Sequence comparison and phylogenetic analysis demonstrated that 7 of these isolates, provisionally named Neorickettsia sp. 1-7 (obtained from the allocreadiid Crepidostomum affine, the haploporids Saccocoelioides beauforti and Saccocoelioides lizae, the faustulid Bacciger sprenti, the deropegid Deropegus aspina, a lecithodendriid, and a pleurogenid), represent new genotypes, and one (obtained from Metagonimoides oregonensis) was identical to a published sequence of Neorickettsia known as the SF agent. All digenean species reported in this study represent new host records. Three of the 6 digenean families (Haploporidae, Pleurogenidae, and Faustulidae) are also reported for the first time as hosts of Neorickettsia. We have detected Neorickettsia in digeneans from China and Australia for the first time based on PCR and sequencing evidence.
Our findings suggest that further surveys from broader geographic regions and wider selection of digenean taxa are likely to reveal new Neorickettsia lineages as well as new digenean host associations.
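A minimal sketch of the screening logic for a real-time PCR assay of this kind: score a sample positive when its Ct falls below a cutoff, then pass positives to a confirmatory stage (here, nested PCR and sequencing). The cutoff and sample values are hypothetical.

```python
def screen_qpcr(ct_values, cutoff=35.0):
    """Flag samples whose qPCR Ct is below the cutoff; samples with
    no amplification are recorded as None and never flagged."""
    return [s for s, ct in ct_values.items() if ct is not None and ct < cutoff]

# Hypothetical Ct values from the GroEL-targeting real-time PCR
ct = {"sample_A": 22.4, "sample_B": None, "sample_C": 34.1, "sample_D": 38.7}
positives = screen_qpcr(ct)
print(positives)   # candidates for nested-PCR / 16S rRNA confirmation
```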

  13. A spatio-temporal screening tool for outlier detection in long term / large scale air quality observation time series and monitoring networks

    Science.gov (United States)

    Kracht, Oliver; Reuter, Hannes I.; Gerboles, Michel

    2013-04-01

    We present a consolidated screening tool for the detection of outliers in air quality monitoring data, which considers both attribute values and spatio-temporal relationships. Furthermore, an application example of warnings on abnormal values in time series of PM10 datasets in AirBase is presented. Spatial or temporal outliers in air quality datasets represent stations or individual measurements which differ significantly from other recordings within their spatio-temporal neighbourhood. Such abnormal values can be identified as being extreme compared to their neighbours, even though they do not necessarily differ significantly from the statistical distribution of the entire population. The identification of such outliers can be of interest as the basis of data quality control systems when several contributors report their measurements for collection into larger datasets. Beyond this, it can also provide a simple way to investigate the accuracy of station classifications. Seen from another viewpoint, it can be used as a tool to detect irregular air pollution emission events (e.g. the influence of fires, wind erosion events, or other accidental situations). The presented procedure for outlier detection was designed on the basis of existing literature. Specifically, we adapted the "Smooth Spatial Attribute Method" first developed for the identification of outlier values in networks of traffic sensors [1]. Since a free and extensible simulation platform was considered important, all codes were prototyped in the R environment, which is available under the GNU General Public License [2]. Our algorithms are based on the definition of a neighbourhood for each air quality measurement, corresponding to a spatio-temporal domain limited by time (e.g., +/- 2 days) and distance (e.g., +/- 1 spherical degree) around the location of ambient air monitoring stations.
The objective of the method is that within such a given spatio-temporal domain, in which
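A simplified version of the neighbourhood test described in this record (a sketch of the general idea, not the adapted Smooth Spatial Attribute Method itself): for each measurement, collect the values within ±2 days and ±1 degree, and flag it when it deviates from the neighbourhood median by more than k scaled median absolute deviations. All station data below are invented.

```python
import statistics

def outliers(obs, dt=2.0, dd=1.0, k=4.0):
    """obs: list of (day, lat, lon, value). Flag indices whose value is
    more than k scaled MADs from the median of their spatio-temporal
    neighbourhood (within dt days and dd degrees in each coordinate)."""
    flagged = []
    for i, (t, la, lo, v) in enumerate(obs):
        neigh = [w for j, (t2, la2, lo2, w) in enumerate(obs)
                 if j != i and abs(t2 - t) <= dt
                 and abs(la2 - la) <= dd and abs(lo2 - lo) <= dd]
        if len(neigh) < 3:
            continue                       # too few neighbours to judge
        med = statistics.median(neigh)
        mad = statistics.median([abs(w - med) for w in neigh]) or 1e-9
        if abs(v - med) > k * 1.4826 * mad:
            flagged.append(i)
    return flagged

# PM10-like values from 4 nearby stations over 5 days, plus one spike
obs = [(d, 48.0 + s * 0.1, 11.0, 20.0 + d) for d in range(5) for s in range(4)]
obs.append((2, 48.05, 11.0, 180.0))        # e.g. a local fire event
print(outliers(obs))
```

Only the spike is flagged: it is extreme relative to its neighbourhood even though a slow regional trend (values drifting from 20 to 24) is left untouched.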

  14. Large Scale Dynamos in Stars

    Science.gov (United States)

    Vishniac, Ethan T.

    2015-01-01

    We show that a differentially rotating conducting fluid automatically creates a magnetic helicity flux with components along the rotation axis and in the direction of the local vorticity. This drives a rapid growth in the local density of current helicity, which in turn drives a large scale dynamo. The dynamo growth rate derived from this process is not constant, but depends inversely on the large scale magnetic field strength. This dynamo saturates when buoyant losses of magnetic flux compete with the large scale dynamo, providing a simple prediction for magnetic field strength as a function of Rossby number in stars. Increasing anisotropy in the turbulence produces a decreasing magnetic helicity flux, which explains the flattening of the B/Rossby number relation at low Rossby numbers. We also show that the kinetic helicity is always a subdominant effect. There is no kinematic dynamo in real stars.

  15. Large-scale circuit simulation

    Science.gov (United States)

    Wei, Y. P.

    1982-12-01

    The simulation of VLSI (Very Large Scale Integration) circuits falls beyond the capabilities of conventional circuit simulators like SPICE. On the other hand, conventional logic simulators can only give results as logic levels 1 and 0, with the attendant loss of detail in the waveforms. The aim of developing large-scale circuit simulation is to bridge the gap between conventional circuit simulation and logic simulation. This research investigates new approaches for fast and relatively accurate time-domain simulation of MOS (Metal Oxide Semiconductor), LSI (Large Scale Integration) and VLSI circuits. New techniques and new algorithms are studied in the following areas: (1) analysis sequencing, (2) nonlinear iteration, (3) the modified Gauss-Seidel method, and (4) latency criteria and a timestep control scheme. The developed methods have been implemented in a simulation program, PREMOS, which can be used as a design verification tool for MOS circuits.
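The latency idea named above can be sketched on a linear stand-in: during each Gauss-Seidel sweep, a node whose previous update fell below a latency tolerance is assumed inactive and skipped. This is an illustrative toy (a small linear system rather than a MOS circuit) and not the PREMOS implementation.

```python
def gauss_seidel_latent(A, b, x, tol=1e-10, lat_tol=1e-12, max_sweeps=200):
    """Gauss-Seidel with a simple latency test: a node whose last update
    was below lat_tol is treated as latent and skipped in later sweeps."""
    n = len(b)
    last_change = [float("inf")] * n
    for _ in range(max_sweeps):
        max_change = 0.0
        for i in range(n):
            if last_change[i] < lat_tol:
                continue                                  # latent node
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new = (b[i] - s) / A[i][i]
            last_change[i] = abs(new - x[i])
            max_change = max(max_change, last_change[i])
            x[i] = new
        if max_change < tol:
            return x
    return x

# Diagonally dominant test system; exact solution is x = (1, 1, 1)
A = [[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = gauss_seidel_latent(A, b, [0.0, 0.0, 0.0])
print(x)
```

In a circuit simulator the same test is applied per subcircuit: blocks whose node voltages have settled are bypassed until activity propagates back to them, which is where the large-scale speed-up comes from.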

  16. Quantum Signature of Cosmological Large Scale Structures

    CERN Document Server

    Capozziello, S; De Siena, S; Illuminati, F; Capozziello, Salvatore; Martino, Salvatore De; Siena, Silvio De; Illuminati, Fabrizio

    1998-01-01

    We demonstrate that to all large scale cosmological structures where gravitation is the only overall relevant interaction assembling the system (e.g. galaxies), there is associated a characteristic unit of action per particle whose order of magnitude coincides with the Planck action constant $h$. This result extends the class of physical systems for which quantum coherence can act on macroscopic scales (as e.g. in superconductivity) and agrees with the absence of screening mechanisms for the gravitational forces, as predicted by some renormalizable quantum field theories of gravity. It also seems to support those lines of thought invoking that large scale structures in the Universe should be connected to quantum primordial perturbations as requested by inflation, that the Newton constant should vary with time and distance and, finally, that gravity should be considered as an effective interaction induced by quantization.

  17. Very Large Scale Integration (VLSI).

    Science.gov (United States)

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  18. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  19. Testing gravity on Large Scales

    OpenAIRE

    Raccanelli Alvise

    2013-01-01

    We show how it is possible to test general relativity and different models of gravity via Redshift-Space Distortions using forthcoming cosmological galaxy surveys. However, the theoretical models currently used to interpret the data often rely on simplifications that make them not accurate enough for precise measurements. We will discuss improvements to the theoretical modeling at very large scales, including wide-angle and general relativistic corrections; we then show that for wide and deep...

  20. Strings and large scale magnetohydrodynamics

    CERN Document Server

    Olesen, P

    1995-01-01

    From computer simulations of magnetohydrodynamics one knows that a turbulent plasma becomes very intermittent, with the magnetic fields concentrated in thin flux tubes. This situation looks very "string-like", so we investigate whether strings could be solutions of the magnetohydrodynamics equations in the limit of infinite conductivity. We find that the induction equation is satisfied, and we discuss the Navier-Stokes equation (without viscosity) with the Lorentz force included. We argue that the string equations (with non-universal maximum velocity) should describe the large scale motion of narrow magnetic flux tubes, because of a large reparametrization (gauge) invariance of the magnetic and electric string fields.

  1. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  2. Models of large scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Frenk, C.S. (Physics Dept., Univ. of Durham (UK))

    1991-01-01

    The ingredients required to construct models of the cosmic large scale structure are discussed. Input from particle physics leads to a considerable simplification by offering concrete proposals for the geometry of the universe, the nature of the dark matter and the primordial fluctuations that seed the growth of structure. The remaining ingredient is the physical interaction that governs dynamical evolution. Empirical evidence provided by an analysis of a redshift survey of IRAS galaxies suggests that gravity is the main agent shaping the large-scale structure. In addition, this survey implies large values of the mean cosmic density, Ω ≳ 0.5, and is consistent with a flat geometry if IRAS galaxies are somewhat more clustered than the underlying mass. Together with current limits on the density of baryons from Big Bang nucleosynthesis, this lends support to the idea of a universe dominated by non-baryonic dark matter. Results from cosmological N-body simulations evolved from a variety of initial conditions are reviewed. In particular, neutrino dominated and cold dark matter dominated universes are discussed in detail. Finally, it is shown that apparent periodicities in the redshift distributions in pencil-beam surveys arise frequently from distributions which have no intrinsic periodicity but are clustered on small scales. (orig.).

  3. Large-Scale Galaxy Bias

    CERN Document Server

    Desjacques, Vincent; Schmidt, Fabian

    2016-01-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a pedagogical proof of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which includes the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in i...

  4. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 +/- 5 mu m. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further development of sensitive biosensor assays...

  5. Testing gravity on Large Scales

    Directory of Open Access Journals (Sweden)

    Raccanelli Alvise

    2013-09-01

    Full Text Available We show how it is possible to test general relativity and different models of gravity via Redshift-Space Distortions using forthcoming cosmological galaxy surveys. However, the theoretical models currently used to interpret the data often rely on simplifications that make them not accurate enough for precise measurements. We will discuss improvements to the theoretical modeling at very large scales, including wide-angle and general relativistic corrections; we then show that for wide and deep surveys those corrections need to be taken into account if we want to measure the growth of structures at a few percent level, and so perform tests on gravity, without introducing systematic errors. Finally, we report the results of some recent cosmological model tests carried out using those precise models.

  6. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad...

  7. Large Scale Correlation Clustering Optimization

    CERN Document Server

    Bagon, Shai

    2011-01-01

Clustering is a fundamental task in unsupervised learning. The focus of this paper is the Correlation Clustering functional which combines positive and negative affinities between the data points. The contribution of this paper is twofold: (i) providing a theoretic analysis of the functional; (ii) new optimization algorithms which can cope with large scale problems (>100K variables) that are infeasible using existing methods. Our theoretic analysis provides a probabilistic generative interpretation for the functional, and justifies its intrinsic "model-selection" capability. Furthermore, we draw an analogy between optimizing this functional and the well known Potts energy minimization. This analogy allows us to suggest several new optimization algorithms, which exploit the intrinsic "model-selection" capability of the functional to automatically recover the underlying number of clusters. We compare our algorithms to existing methods on both synthetic and real data. In addition we suggest two new applications t...
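The "model-selection" behaviour the abstract describes, where the number of clusters emerges from the affinities rather than being fixed in advance, can be illustrated with a minimal stdlib Python sketch of a classic greedy pivot heuristic for correlation clustering (an illustration of the problem, not the paper's algorithms; all names are invented):

```python
def correlation_cluster(nodes, affinity):
    """Greedy pivot heuristic for correlation clustering.

    affinity(u, v) > 0 means u and v should share a cluster,
    affinity(u, v) < 0 means they should be separated.  The number
    of clusters is not fixed in advance -- it emerges from the data.
    """
    remaining = list(nodes)
    clusters = []
    while remaining:
        pivot = remaining.pop(0)          # deterministic pivot choice
        cluster = [pivot]
        rest = []
        for v in remaining:
            # Attach v to the pivot's cluster iff the affinity is positive.
            (cluster if affinity(pivot, v) > 0 else rest).append(v)
        clusters.append(cluster)
        remaining = rest
    return clusters

# Toy example: positive affinity within a group, negative across groups.
groups = {0: "a", 1: "a", 2: "b", 3: "b"}
aff = lambda u, v: 1 if groups[u] == groups[v] else -1
print(correlation_cluster([0, 1, 2, 3], aff))  # [[0, 1], [2, 3]]
```

Two clusters are recovered automatically here, without any `k` parameter, which is the intrinsic model-selection property the functional is credited with.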

  8. Large scale cluster computing workshop

    Energy Technology Data Exchange (ETDEWEB)

    Dane Skow; Alan Silverman

    2002-12-23

Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  9. Large Scale Magnetostrictive Valve Actuator

    Science.gov (United States)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware and testing is complete. This paper will discuss: the potential applications of the technology; an overview of the as-built actuator design; problems that were uncovered during development testing; a review of test data and an evaluation of weaknesses of the design; and areas for improvement for future work. This actuator holds promise as a low power, high load, proportionally controlled actuator for valves requiring 440 to 1500 newtons load.

  10. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.

  11. Handbook of Large-Scale Random Networks

    CERN Document Server

    Bollobas, Bela; Miklos, Dezso

    2008-01-01

    Covers various aspects of large-scale networks, including mathematical foundations and rigorous results of random graph theory, modeling and computational aspects of large-scale networks, as well as areas in physics, biology, neuroscience, sociology and technical areas

  12. Large-Scale Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  13. Conundrum of the Large Scale Streaming

    CERN Document Server

    Malm, T M

    1999-01-01

The etiology of the large scale peculiar velocity (large scale streaming motion) of clusters seems increasingly tenuous within the context of the gravitational instability hypothesis. Are there any alternative testable models that might account for such large scale streaming of clusters?

  14. A gene-trap strategy identifies quiescence-induced genes in synchronized myoblasts

    Indian Academy of Sciences (India)

    Ramkumar Sambasivan; Grace K Pavlath; Jyotsna Dhawan

    2008-03-01

Cellular quiescence is characterized not only by reduced mitotic and metabolic activity but also by altered gene expression. Growing evidence suggests that quiescence is not merely a basal state but is regulated by active mechanisms. To understand the molecular programme that governs reversible cell cycle exit, we focused on quiescence-related gene expression in a culture model of myogenic cell arrest and activation. Here we report the identification of quiescence-induced genes using a gene-trap strategy. Using a retroviral vector, we generated a library of gene traps in C2C12 myoblasts that were screened for arrest-induced insertions by live cell sorting (FACS-gal). Several independent gene-trap lines revealed arrest-dependent induction of β-gal activity, confirming the efficacy of the FACS screen. The locus of integration was identified in 15 lines. In three lines, insertion occurred in genes previously implicated in the control of quiescence, i.e. EMSY (a BRCA2-interacting protein), p8/com1 (a p300 HAT-binding protein) and MLL5 (a SET domain protein). Our results demonstrate that expression of chromatin modulatory genes is induced in G0, providing support to the notion that this reversibly arrested state is actively regulated.

  15. Large-scale autostereoscopic outdoor display

    Science.gov (United States)

    Reitterer, Jörg; Fidler, Franz; Saint Julien-Wallsee, Ferdinand; Schmid, Gerhard; Gartner, Wolfgang; Leeb, Walter; Schmid, Ulrich

    2013-03-01

    State-of-the-art autostereoscopic displays are often limited in size, effective brightness, number of 3D viewing zones, and maximum 3D viewing distances, all of which are mandatory requirements for large-scale outdoor displays. Conventional autostereoscopic indoor concepts like lenticular lenses or parallax barriers cannot simply be adapted for these screens due to the inherent loss of effective resolution and brightness, which would reduce both image quality and sunlight readability. We have developed a modular autostereoscopic multi-view laser display concept with sunlight readable effective brightness, theoretically up to several thousand 3D viewing zones, and maximum 3D viewing distances of up to 60 meters. For proof-of-concept purposes a prototype display with two pixels was realized. Due to various manufacturing tolerances each individual pixel has slightly different optical properties, and hence the 3D image quality of the display has to be calculated stochastically. In this paper we present the corresponding stochastic model, we evaluate the simulation and measurement results of the prototype display, and we calculate the achievable autostereoscopic image quality to be expected for our concept.

  16. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  18. Isolation of Novel CreERT2-Driver Lines in Zebrafish Using an Unbiased Gene Trap Approach.

    Directory of Open Access Journals (Sweden)

    Peggy Jungke

Full Text Available Gene manipulation using the Cre/loxP-recombinase system has been successfully employed in zebrafish to study gene functions and lineage relationships. Recently, gene trapping approaches have been applied to produce large collections of transgenic fish expressing conditional alleles in various tissues. However, the limited number of available cell- and tissue-specific Cre/CreERT2-driver lines still constrains widespread application in this model organism. To enlarge the pool of existing CreERT2-driver lines, we performed a genome-wide gene trap screen using a Tol2-based mCherry-T2a-CreERT2 (mCT2aC) gene trap vector. This cassette consists of a splice acceptor and a mCherry-tagged variant of CreERT2 which enables simultaneous labeling of the trapping event, as well as CreERT2 expression from the endogenous promoter. Using this strategy, we generated 27 novel functional CreERT2-driver lines expressing in a cell- and tissue-specific manner during development and adulthood. This study summarizes the analysis of the generated CreERT2-driver lines with respect to functionality, expression, integration, as well as associated phenotypes. Our results significantly enlarge the existing pool of CreERT2-driver lines in zebrafish and combined with Cre-dependent effector lines, the new CreERT2-driver lines will be important tools to manipulate the zebrafish genome.

  19. Isolation of Novel CreERT2-Driver Lines in Zebrafish Using an Unbiased Gene Trap Approach.

    Science.gov (United States)

    Jungke, Peggy; Hammer, Juliane; Hans, Stefan; Brand, Michael

    2015-01-01

    Gene manipulation using the Cre/loxP-recombinase system has been successfully employed in zebrafish to study gene functions and lineage relationships. Recently, gene trapping approaches have been applied to produce large collections of transgenic fish expressing conditional alleles in various tissues. However, the limited number of available cell- and tissue-specific Cre/CreERT2-driver lines still constrains widespread application in this model organism. To enlarge the pool of existing CreERT2-driver lines, we performed a genome-wide gene trap screen using a Tol2-based mCherry-T2a-CreERT2 (mCT2aC) gene trap vector. This cassette consists of a splice acceptor and a mCherry-tagged variant of CreERT2 which enables simultaneous labeling of the trapping event, as well as CreERT2 expression from the endogenous promoter. Using this strategy, we generated 27 novel functional CreERT2-driver lines expressing in a cell- and tissue-specific manner during development and adulthood. This study summarizes the analysis of the generated CreERT2-driver lines with respect to functionality, expression, integration, as well as associated phenotypes. Our results significantly enlarge the existing pool of CreERT2-driver lines in zebrafish and combined with Cre-dependent effector lines, the new CreERT2-driver lines will be important tools to manipulate the zebrafish genome.

  20. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  1. Network robustness under large-scale attacks

    CERN Document Server

    Zhou, Qing; Liu, Ruifang; Cui, Shuguang

    2014-01-01

    Network Robustness under Large-Scale Attacks provides the analysis of network robustness under attacks, with a focus on large-scale correlated physical attacks. The book begins with a thorough overview of the latest research and techniques to analyze the network responses to different types of attacks over various network topologies and connection models. It then introduces a new large-scale physical attack model coined as area attack, under which a new network robustness measure is introduced and applied to study the network responses. With this book, readers will learn the necessary tools to evaluate how a complex network responds to random and possibly correlated attacks.
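The notion of an area attack, in which all nodes inside a physical region fail together rather than independently, can be sketched in a few lines of stdlib Python (an illustrative toy model with invented names, not the book's formalism or measure):

```python
from collections import deque

def largest_component(nodes, edges):
    """Size of the largest connected component, via BFS over an adjacency map."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            n = queue.popleft()
            size += 1
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        best = max(best, size)
    return best

def area_attack(positions, edges, center, radius):
    """Remove every node whose coordinates fall inside the attacked disc
    (a correlated geographic failure), then measure what survives."""
    cx, cy = center
    alive = {n for n, (x, y) in positions.items()
             if (x - cx) ** 2 + (y - cy) ** 2 > radius ** 2}
    kept = [(u, v) for u, v in edges if u in alive and v in alive]
    return largest_component(list(alive), kept)

# A 5-node chain laid out on a line; an attack centred on the middle
# node splits the network into two surviving components of size 2.
positions = {i: (float(i), 0.0) for i in range(5)}
edges = [(i, i + 1) for i in range(4)]
print(area_attack(positions, edges, center=(2.0, 0.0), radius=0.5))  # 2
```

Sweeping `center` and `radius` over such a model is one simple way to compare how different topologies respond to correlated versus random failures.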

  2. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
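The display-aware principle described in the course abstract, making the working set proportional to what is visible on screen rather than to the full data size, can be sketched as a small budgeting calculation (a toy model; the function name and page scheme are hypothetical, not from the course):

```python
def working_set_pages(screen_w, screen_h, page, data_w, data_h):
    """Pick the multi-resolution level whose size just covers the screen,
    then count the texture pages needed at that level.  The GPU working
    set thus scales with screen resolution, not with the data size."""
    level = 0
    w, h = data_w, data_h
    # Halve the resolution until the level fits within the screen.
    while w > screen_w or h > screen_h:
        w = max(1, w // 2)
        h = max(1, h // 2)
        level += 1
    pages = -(-w // page) * -(-h // page)   # ceil division on both axes
    return level, pages

# A terapixel image (10^6 x 10^6) viewed on a 1920x1080 screen with
# 256x256 pages needs only mip level 10 and a handful of pages.
print(working_set_pages(1920, 1080, 256, 1_000_000, 1_000_000))  # (10, 16)
```

A real virtual-texturing system additionally tracks which pages are visible per frame and streams them asynchronously, but the core working-set bound follows this arithmetic.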

  3. Large Scale Metal Additive Techniques Review

    Energy Technology Data Exchange (ETDEWEB)

    Nycz, Andrzej [ORNL; Adediran, Adeola I [ORNL; Noakes, Mark W [ORNL; Love, Lonnie J [ORNL

    2016-01-01

In recent years additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large scale metal additive has not yet reached parity with large scale polymer. This paper is a review study of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post processing, as well as potential applications. This paper focuses on the current state of the art of large scale metal additive technology with a focus on expanding the geometric limits.

  4. Large-scale mapping of mutations affecting zebrafish development

    Directory of Open Access Journals (Sweden)

    Neuhauss Stephan C

    2007-01-01

Full Text Available. Background: Large-scale mutagenesis screens in the zebrafish employing the mutagen ENU have isolated several hundred mutant loci that represent putative developmental control genes. In order to realize the potential of such screens, systematic genetic mapping of the mutations is necessary. Here we report on a large-scale effort to map the mutations generated in mutagenesis screening at the Max Planck Institute for Developmental Biology by genome scanning with microsatellite markers. Results: We have selected a set of microsatellite markers and developed methods and scoring criteria suitable for efficient, high-throughput genome scanning. We have used these methods to successfully obtain a rough map position for 319 mutant loci from the Tübingen I mutagenesis screen and subsequent screening of the mutant collection. For 277 of these the corresponding gene is not yet identified. Mapping was successful for 80% of the tested loci. By comparing 21 mutation and gene positions of cloned mutations we have validated the correctness of our linkage group assignments and estimated the standard error of our map positions to be approximately 6 cM. Conclusion: By obtaining rough map positions for over 300 zebrafish loci with developmental phenotypes, we have generated a dataset that will be useful not only for cloning of the affected genes, but also to suggest allelism of mutations with similar phenotypes that will be identified in future screens. Furthermore this work validates the usefulness of our methodology for rapid, systematic and inexpensive microsatellite mapping of zebrafish mutations.

  5. Comparison of methods for genomic localization of gene trap sequences

    Directory of Open Access Journals (Sweden)

    Ferrin Thomas E

    2006-09-01

Full Text Available. Background: Gene knockouts in a model organism such as mouse provide a valuable resource for the study of basic biology and human disease. Determining which gene has been inactivated by an untargeted gene trapping event poses a challenging annotation problem because gene trap sequence tags, which represent sequence near the vector insertion site of a trapped gene, are typically short and often contain unresolved residues. To understand better the localization of these sequences on the mouse genome, we compared stand-alone versions of the alignment programs BLAT, SSAHA, and MegaBLAST. A set of 3,369 sequence tags was aligned to build 34 of the mouse genome using default parameters for each algorithm. Known genome coordinates for the cognate set of full-length genes (1,659 sequences) were used to evaluate localization results. Results: In general, all three programs performed well in terms of localizing sequences to a general region of the genome, with only relatively subtle errors identified for a small proportion of the sequence tags. However, large differences in performance were noted with regard to correctly identifying exon boundaries. BLAT correctly identified the vast majority of exon boundaries, while SSAHA and MegaBLAST missed the majority of exon boundaries. SSAHA consistently reported the fewest false positives and is the fastest algorithm. MegaBLAST was comparable to BLAT in speed, but was the most susceptible to localizing sequence tags incorrectly to pseudogenes. Conclusion: The differences in performance for sequence tags and full-length reference sequences were surprisingly small. Characteristic variations in localization results for each program were noted that affect the localization of sequence at exon boundaries, in particular.
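For readers reproducing such a comparison: BLAT reports alignments in the 21-column PSL format, whose per-block coordinate lists are what make exon-boundary checks straightforward. A minimal stdlib Python parser might look like the sketch below (illustrative; assumes a headerless PSL data line, and the helper names are invented):

```python
def parse_psl_line(line):
    """Parse one data line of BLAT's 21-column PSL output into a dict,
    keeping only the fields relevant to locating aligned blocks."""
    f = line.rstrip("\n").split("\t")
    ints = lambda s: [int(x) for x in s.rstrip(",").split(",")]
    return {
        "matches": int(f[0]),
        "strand": f[8],
        "q_name": f[9], "q_start": int(f[11]), "q_end": int(f[12]),
        "t_name": f[13], "t_start": int(f[15]), "t_end": int(f[16]),
        "block_count": int(f[17]),
        "block_sizes": ints(f[18]),
        "t_starts": ints(f[20]),   # genomic start of each aligned block
    }

def exon_intervals(psl):
    """Genomic (start, end) intervals implied by the alignment blocks --
    for a spliced tag these approximate the exon boundaries."""
    return [(s, s + l) for s, l in zip(psl["t_starts"], psl["block_sizes"])]
```

Comparing these block-derived intervals against annotated exon coordinates is one direct way to score boundary accuracy across aligners.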

  6. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  7. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  8. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...
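The quantities such a study examines, width/height ratios and how rarely they equal one, are straightforward to compute; a minimal stdlib Python sketch (illustrative names, not the study's pipeline):

```python
from statistics import median

def proportion_stats(dimensions):
    """dimensions: iterable of (width, height) pixel sizes.
    Returns the median width/height ratio and the share of items
    that are exactly square (ratio == 1)."""
    ratios = [w / h for w, h in dimensions]
    square_share = sum(1 for r in ratios if r == 1.0) / len(ratios)
    return median(ratios), square_share

# Four toy items: landscape, portrait, square, widescreen.
med, square = proportion_stats([(400, 300), (300, 400), (500, 500), (160, 90)])
```

Run over a few hundred thousand items, the distribution of these ratios is what reveals whether constants like 1 (or the golden ratio) actually dominate.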

  9. Large-scale Complex IT Systems

    CERN Document Server

    Sommerville, Ian; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challenges and issues in the development of large-scale complex, software-intensive systems. Central to this is the notion that we cannot separate software from the socio-technical environment in which it is used.

  10. Topological Routing in Large-Scale Networks

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Knudsen, Thomas Phillip; Madsen, Ole Brun

    2004-01-01

A new routing scheme, Topological Routing, for large-scale networks is proposed. It allows for efficient routing without large routing tables as known from traditional routing schemes. It presupposes a certain level of order in the networks, known from Structural QoS. The main issues in applying Topological Routing to large-scale networks are discussed. Hierarchical extensions are presented along with schemes for shortest path routing, fault handling and path restoration. Further research in the area is discussed, along with perspectives on the prerequisites for practical deployment of Topological Routing...
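The core idea, deriving the next hop from node addresses alone instead of consulting routing tables, can be illustrated with the textbook example of bit-fixing routing on a hypercube (a generic illustration of address-derived routing, not the specific scheme proposed in the paper):

```python
def next_hop(current, dest):
    """Topology-derived routing on an n-dimensional hypercube: the next
    hop is the neighbour reached by flipping the lowest differing
    address bit, so only the two addresses are needed -- no tables."""
    diff = current ^ dest
    if diff == 0:
        return None                      # already at the destination
    lowest = diff & -diff                # lowest set bit of the difference
    return current ^ lowest

def route(src, dest):
    """Full path from src to dest, one corrected bit per hop."""
    path, node = [src], src
    while node != dest:
        node = next_hop(node, dest)
        path.append(node)
    return path

print(route(0b000, 0b101))  # [0, 1, 5]
```

Because each hop fixes one address bit, the path length is bounded by the address width, and every node computes its forwarding decision in constant time from structure alone, which is the property table-free schemes exploit.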

  12. Large scale topic modeling made practical

    DEFF Research Database (Denmark)

    Wahlgreen, Bjarne Ørum; Hansen, Lars Kai

    2011-01-01

    Topic models are of broad interest. They can be used for query expansion and result structuring in information retrieval and as an important component in services such as recommender systems and user adaptive advertising. In large scale applications both the size of the database (number of docume...... topics at par with a much larger case specific vocabulary....

  13. Quantification of histone H3 Lys27 trimethylation (H3K27me3) by high-throughput microscopy enables cellular large-scale screening for small-molecule EZH2 inhibitors.

    Science.gov (United States)

    Luense, Svenja; Denner, Philip; Fernández-Montalván, Amaury; Hartung, Ingo; Husemann, Manfred; Stresemann, Carlo; Prechtl, Stefan

    2015-02-01

    EZH2 inhibition can decrease global histone H3 lysine 27 trimethylation (H3K27me3) and thereby reactivates silenced tumor suppressor genes. Inhibition of EZH2 is regarded as an option for therapeutic cancer intervention. To identify novel small-molecule (SMOL) inhibitors of EZH2 in drug discovery, trustworthy cellular assays amenable for phenotypic high-throughput screening (HTS) are crucial. We describe a reliable approach that quantifies changes in global levels of histone modification marks using high-content analysis (HCA). The approach was validated in different cell lines by using small interfering RNA and SMOL inhibitors. By automation and miniaturization from a 384-well to 1536-well plate, we demonstrated its utility in conducting phenotypic HTS campaigns and assessing structure-activity relationships (SAR). This assay enables screening of SMOL EZH2 inhibitors and can advance the mechanistic understanding of H3K27me3 suppression, which is crucial with regard to epigenetic therapy. We observed that a decrease in global H3K27me3, induced by EZH2 inhibition, comprises two distinct mechanisms: (I) inhibition of de novo DNA methylation and (II) inhibition of dynamic, replication-independent H3K27me3 turnover. This report describes an HCA assay for primary HTS to identify, profile, and optimize cellular active SMOL inhibitors targeting histone methyltransferases, which could benefit epigenetic drug discovery. © 2014 Society for Laboratory Automation and Screening.
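
Whether such an HCA assay window is "HTS-ready" is conventionally judged with the Z'-factor statistic (Zhang et al.); this is standard screening practice rather than something stated in the abstract, and the well readouts below are invented for illustration.

```python
import statistics

def z_prime(positives, negatives):
    """Z'-factor assay-quality statistic:
    1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are usually taken to indicate an HTS-ready window."""
    mp, mn = statistics.mean(positives), statistics.mean(negatives)
    sp, sn = statistics.stdev(positives), statistics.stdev(negatives)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

# Hypothetical per-well H3K27me3 intensity readouts (arbitrary units).
inhibitor_wells = [110, 105, 95, 100, 98, 102]        # EZH2 inhibitor: low mark
dmso_wells = [1000, 980, 1020, 990, 1010, 1000]       # DMSO control: high mark
zp = z_prime(inhibitor_wells, dmso_wells)
```

With a well-separated window like this toy data set, the Z'-factor comes out well above the 0.5 threshold commonly required before launching a primary HTS campaign.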

  14. CAS to set up large-scale gardens for energy-rich plants

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Studies under the title of "Screening & Assessment of Energy Plants & Core Technology for Large-Scale Plantation of the Physic Nut Tree" have recently been initiated as a major project at the CAS Science Cluster for Advanced Industrial Biotechnology.

  15. Large-scale multimedia modeling applications

    Energy Technology Data Exchange (ETDEWEB)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  16. Evaluating Large-Scale Interactive Radio Programmes

    Science.gov (United States)

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  17. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  18. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data dissem

  19. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  20. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that cont

  1. Ensemble methods for large scale inverse problems

    NARCIS (Netherlands)

    Heemink, A.W.; Umer Altaf, M.; Barbu, A.L.; Verlaan, M.

    2013-01-01

    Variational data assimilation, also sometimes simply called the ‘adjoint method’, is used very often for large scale model calibration problems. Using the available data, the uncertain parameters in the model are identified by minimizing a certain cost function that measures the difference between t

  2. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  3. Inflation, large scale structure and particle physics

    Indian Academy of Sciences (India)

    S F King

    2004-02-01

    We review experimental and theoretical developments in inflation and its application to structure formation, including the curvaton idea. We then discuss a particle physics model of supersymmetric hybrid inflation at the intermediate scale in which the Higgs scalar field is responsible for large scale structure, and show how such a theory is completely natural in the framework of extra dimensions with an intermediate string scale.

  4. Effective Expression-Independent Gene Trapping and Mutagenesis Mediated by Sleeping Beauty Transposon

    Institute of Scientific and Technical Information of China (English)

    Guili Song; Qing Li; Yong Long; Perry B. Hackett; Zongbin Cui

    2012-01-01

    Expression-independent gene or polyadenylation [poly(A)] trapping is a powerful tool for genome-wide mutagenesis regardless of whether a targeted gene is expressed. Although a number of poly(A)-trap vectors have been developed for the capture and mutation of genes across a vertebrate genome, further efforts are needed to avoid the 3′-terminal insertion bias and the splice donor (SD) read-through, and to improve the mutagenicity. Here, we present a Sleeping Beauty (SB) transposon-based vector that can overcome these limitations through the inclusion of three functional cassettes required for gene-finding, gene-breaking and large-scale mutagenesis, respectively. The functional cassette contained a reporter/selective marker gene driven by a constitutive promoter in front of a strong SD signal and an AU-rich RNA-destabilizing element (ARE), which greatly reduced the SD read-through events; in addition, an internal ribosomal entry site (IRES) element was introduced in front of the SD signal to overcome the phenomenon of 3′-biased gene trapping. The breaking cassette, consisting of an enhanced splicing acceptor (SA) and a poly(A) signal coupled with a transcriptional terminator (TT), effectively disrupted the transcription of trapped genes. Moreover, the Hsp70 promoter from the tilapia genome was employed to drive the inducible expression of SB11, which allows the conditional remobilization of a trap insert from a non-coding region. The combination of the three cassettes led to effective capture and disruption of endogenous genes in HeLa cells. In addition, the Cre/LoxP system was introduced to delete the Hsp70-SB11 cassette for stabilization of trapped gene interruption and biosafety. Thus, this poly(A)-trap vector is an alternative and effective tool for identification and mutation of endogenous genes in cells and animals.

  5. The large-scale structure of vacuum

    CERN Document Server

    Albareti, F D; Maroto, A L

    2014-01-01

    The vacuum state in quantum field theory is known to exhibit an important number of fundamental physical features. In this work we explore the possibility that this state could also present a non-trivial space-time structure on large scales. In particular, we will show that by imposing the renormalized vacuum energy-momentum tensor to be conserved and compatible with cosmological observations, the vacuum energy of sufficiently heavy fields behaves at late times as non-relativistic matter rather than as a cosmological constant. In this limit, the vacuum state supports perturbations whose speed of sound is negligible and accordingly allows the growth of structures in the vacuum energy itself. This large-scale structure of vacuum could seed the formation of galaxies and clusters very much in the same way as cold dark matter does.

  6. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological...... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its...... main focus. Here the general perception of the nature and role in society of large scale networks as a fundamental infrastructure is analysed. This analysis focuses on the effects of the technical DDN projects and on the perception of network infrastructure as expressed by key decision makers...

  7. Process Principles for Large-Scale Nanomanufacturing.

    Science.gov (United States)

    Behrens, Sven H; Breedveld, Victor; Mujica, Maritza; Filler, Michael A

    2017-06-07

    Nanomanufacturing-the fabrication of macroscopic products from well-defined nanoscale building blocks-in a truly scalable and versatile manner is still far from our current reality. Here, we describe the barriers to large-scale nanomanufacturing and identify routes to overcome them. We argue for nanomanufacturing systems consisting of an iterative sequence of synthesis/assembly and separation/sorting unit operations, analogous to those used in chemicals manufacturing. In addition to performance and economic considerations, phenomena unique to the nanoscale must guide the design of each unit operation and the overall process flow. We identify and discuss four key nanomanufacturing process design needs: (a) appropriately selected process break points, (b) synthesis techniques appropriate for large-scale manufacturing, (c) new structure- and property-based separations, and (d) advances in stabilization and packaging.

  8. Condition Monitoring of Large-Scale Facilities

    Science.gov (United States)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  9. Large-scale structure of the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Shandarin, S.F.; Doroshkevich, A.G.; Zel' dovich, Ya.B. (Inst. Prikladnoj Matematiki, Moscow, USSR)

    1983-01-01

    A review of the theory of the large-scale structure of the Universe is given, including the formation of clusters and superclusters of galaxies as well as large voids. Particular attention is paid to the theory of the neutrino-dominated Universe - the cosmological model in which neutrinos with a rest mass of several tens of eV dominate the mean density. The evolution of small perturbations is discussed, and estimates of microwave background radiation fluctuations are given for different angular scales. The adiabatic theory of structure formation, known as the 'pancake' scenario, and the successive fragmentation of pancakes are described. This scenario is based on an approximate nonlinear theory of gravitational instability. Results of numerical experiments modeling the processes of large-scale structure formation are discussed.

  10. Large-scale structure of the universe

    Energy Technology Data Exchange (ETDEWEB)

    Shandarin, S.F.; Doroshkevich, A.G.; Zel' dovich, Y.B.

    1983-01-01

    A survey is given of theories for the origin of large-scale structure in the universe: clusters and superclusters of galaxies, and vast black regions practically devoid of galaxies. Special attention is paid to the theory of a neutrino-dominated universe: a cosmology in which electron neutrinos with a rest mass of a few tens of electron volts would contribute the bulk of the mean density. The evolution of small perturbations is discussed, and estimates are made for the temperature anisotropy of the microwave background radiation on various angular scales. The nonlinear stage in the evolution of smooth irrotational perturbations in a low-pressure medium is described in detail. Numerical experiments simulating large-scale structure formation processes are discussed, as well as their interpretation in the context of catastrophe theory.

  11. Wireless Secrecy in Large-Scale Networks

    CERN Document Server

    Pinto, Pedro C; Win, Moe Z

    2011-01-01

    The ability to exchange secret information is critical to many commercial, governmental, and military networks. The intrinsically secure communications graph (iS-graph) is a random graph which describes the connections that can be securely established over a large-scale network, by exploiting the physical properties of the wireless medium. This paper provides an overview of the main properties of this new class of random graphs. We first analyze the local properties of the iS-graph, namely the degree distributions and their dependence on fading, target secrecy rate, and eavesdropper collusion. To mitigate the effect of the eavesdroppers, we propose two techniques that improve secure connectivity. Then, we analyze the global properties of the iS-graph, namely percolation on the infinite plane, and full connectivity on a finite region. These results help clarify how the presence of eavesdroppers can compromise secure communication in a large-scale network.

  12. ELASTIC: A Large Scale Dynamic Tuning Environment

    Directory of Open Access Journals (Sweden)

    Andrea Martínez

    2014-01-01

    Full Text Available The spectacular growth in the number of cores in current supercomputers poses design challenges for the development of performance analysis and tuning tools. To be effective, such analysis and tuning tools must be scalable and be able to manage the dynamic behaviour of parallel applications. In this work, we present ELASTIC, an environment for dynamic tuning of large-scale parallel applications. To be scalable, the architecture of ELASTIC takes the form of a hierarchical tuning network of nodes that perform a distributed analysis and tuning process. Moreover, the tuning network topology can be configured to adapt itself to the size of the parallel application. To guide the dynamic tuning process, ELASTIC supports a plugin architecture. These plugins, called ELASTIC packages, allow the integration of different tuning strategies into ELASTIC. We also present experimental tests conducted using ELASTIC, showing its effectiveness to improve the performance of large-scale parallel applications.

  13. Measuring Bulk Flows in Large Scale Surveys

    CERN Document Server

    Feldman, H A; Feldman, Hume A.; Watkins, Richard

    1993-01-01

    We follow a formalism presented by Kaiser to calculate the variance of bulk flows in large scale surveys. We apply the formalism to a mock survey of Abell clusters à la Lauer & Postman and find the variance in the expected bulk velocities in a universe with CDM, MDM and IRAS-QDOT power spectra. We calculate the velocity variance as a function of the 1-D velocity dispersion of the clusters and the size of the survey.
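
For readers unfamiliar with this formalism, the bulk-flow variance of a survey has a standard linear-theory form; the expression below is a generic textbook sketch rather than the specific formula of this paper, and the symbols (survey window W̃, growth rate f, matter power spectrum P(k)) are assumptions of the sketch:

```latex
\sigma_B^2(R) \;=\; \frac{H_0^2\, f^2(\Omega_m)}{2\pi^2}
\int_0^\infty P(k)\, \widetilde{W}^2(kR)\, \mathrm{d}k
```

Here $\sigma_B$ is the rms bulk velocity within a window of scale $R$, $H_0$ the Hubble constant, and $\widetilde{W}$ the Fourier transform of the survey window function; different power spectra (CDM, MDM, IRAS-QDOT) enter only through $P(k)$.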

  14. Statistical characteristics of Large Scale Structure

    OpenAIRE

    Demianski; Doroshkevich

    2002-01-01

    We investigate the mass functions of different elements of the Large Scale Structure -- walls, pancakes, filaments and clouds -- and the impact of transverse motions -- expansion and/or compression -- on their statistical characteristics. Using the Zel'dovich theory of gravitational instability we show that the mass functions of all structure elements are approximately the same and the mass of all elements is found to be concentrated near the corresponding mean mass. At high redshifts, both t...

  15. Topologies for large scale photovoltaic power plants

    OpenAIRE

    Cabrera Tobar, Ana; Bullich Massagué, Eduard; Aragüés Peñalba, Mònica; Gomis Bellmunt, Oriol

    2016-01-01

    © 2016 Elsevier Ltd. All rights reserved. Growing renewable energy penetration into the grid, together with the reduction in prices of photovoltaic solar panels during the last decade, has enabled the development of large scale solar power plants connected to the medium and high voltage grid. Photovoltaic generation components, the internal layout and the ac collection grid are being investigated for ensuring the best design, operation and control of these power plants. This ...

  16. Echinococcus multilocularis--adaptation of a worm egg isolation procedure coupled with a multiplex PCR assay to carry out large-scale screening of red foxes (Vulpes vulpes) in Norway.

    Science.gov (United States)

    Davidson, Rebecca K; Oines, Oivind; Madslien, Knut; Mathis, Alexander

    2009-02-01

    Echinococcus multilocularis, the causative agent of alveolar echinococcosis in humans, is responsible for a highly pathogenic emerging zoonosis in central Europe. The gold standard for the identification of this parasite in the main host, the red fox, namely identification of the adult parasite in the intestine at necropsy, is very laborious. Copro-enzyme-linked immunosorbent assay (ELISA) with confirmatory polymerase chain reaction (PCR) has been suggested as an acceptable alternative, but no commercial copro-ELISA tests are currently available and an in-house test is therefore required. Published methods for taeniid egg isolation and a multiplex PCR assay for simultaneous identification of E. multilocularis, E. granulosus and other cestodes were adapted to be carried out on pooled faecal samples from red foxes in Norway. None of the 483 fox faecal samples screened were PCR-positive for E. multilocularis, indicating an apparent prevalence of between 0% and 1.5%. The advantages and disadvantages of using the adapted method are discussed as well as the results pertaining to taeniid and non-taeniid cestodes as identified by multiplex PCR.

  17. Large-scale instabilities of helical flows

    CERN Document Server

    Cameron, Alexandre; Brachet, Marc-Étienne

    2016-01-01

    Large-scale hydrodynamic instabilities of periodic helical flows are investigated using 3D Floquet numerical computations. A minimal three-mode analytical model that reproduces and explains some of the full Floquet results is derived. The growth rate σ of the most unstable modes (at small scale, low Reynolds number Re and small wavenumber q) is found to scale differently in the presence or absence of the anisotropic kinetic alpha (AKA) effect. When an AKA effect is present the scaling σ ∝ q Re predicted by the AKA effect theory [U. Frisch, Z. S. She, and P. L. Sulem, Physica D: Nonlinear Phenomena 28, 382 (1987)] is recovered for Re ≪ 1 as expected (with most of the energy of the unstable mode concentrated in the large scales). However, as Re increases, the growth rate is found to saturate and most of the energy is found at small scales. In the absence of AKA effect, it is found that flows can still have large-scale instabilities, but with a negative eddy-viscosity sca...

  18. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  19. Large-Scale Visual Data Analysis

    Science.gov (United States)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high performance visualization research challenges and opportunities.

  20. Large-scale neuromorphic computing systems

    Science.gov (United States)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  1. Non-target screening and prioritization of potentially persistent, bioaccumulating and toxic domestic wastewater contaminants and their removal in on-site and large-scale sewage treatment plants.

    Science.gov (United States)

    Blum, Kristin M; Andersson, Patrik L; Renman, Gunno; Ahrens, Lutz; Gros, Meritxell; Wiberg, Karin; Haglund, Peter

    2017-01-01

    On-site sewage treatment facilities (OSSFs), which are used to reduce nutrient emissions in rural areas, were screened for anthropogenic compounds with two-dimensional gas chromatography-mass spectrometry (GC×GC-MS). The detected compounds were prioritized based on their persistence, bioaccumulation, ecotoxicity, removal efficiency, and concentrations. This comprehensive prioritization strategy, which was used for the first time on OSSF samples, ranked galaxolide, α-tocopheryl acetate, octocrylene, 2,4,7,9-tetramethyl-5-decyn-4,7-diol, several chlorinated organophosphorus flame retardants and linear alkyl benzenes as the most relevant compounds being emitted from OSSFs. Twenty-six target analytes were then selected for further removal efficiency analysis, including compounds from the priority list along with substances from the same chemical classes, and a few reference compounds. We found significantly better removal of the two polar contaminants 2,4,7,9-tetramethyl-5-decyn-4,7-diol (p=0.0003) and tris(2-butoxyethyl) phosphate (p=0.005) in soil beds, a common type of OSSF in Sweden, compared with conventional sewage treatment plants. We also report median removal efficiencies in OSSFs for compounds not studied in this context before, viz. α-tocopheryl acetate (96%), benzophenone (83%), 2-(methylthio)benzothiazole (64%), 2,4,7,9-tetramethyl-5-decyn-4,7-diol (33%), and a range of organophosphorus flame retardants (19% to 98%). The environmental loads of the top prioritized compounds in soil bed effluents were in the thousands of nanograms per liter range, viz. 2,4,7,9-tetramethyl-5-decyn-4,7-diol (3000 ng/L), galaxolide (1400 ng/L), octocrylene (1200 ng/L), and α-tocopheryl acetate (660 ng/L). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
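
Removal efficiencies of the kind quoted above (96%, 83%, ...) are conventionally computed from paired influent and effluent concentrations; the sketch below uses that standard definition with made-up concentration values, not data from the study.

```python
def removal_efficiency(c_in, c_out):
    """Per-compound removal efficiency in percent, from influent and
    effluent concentrations in the same units (e.g. ng/L)."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical influent/effluent pairs (ng/L), chosen for illustration only.
samples = {
    "galaxolide": (2000.0, 600.0),
    "benzophenone": (500.0, 85.0),
}
efficiencies = {name: removal_efficiency(ci, co)
                for name, (ci, co) in samples.items()}
```

In practice a median over many facilities is reported, as in the abstract, since single-sample efficiencies vary strongly with load and flow conditions.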

  2. Application of Thinprep liquid based cytology in large-scale screening of cervical cancer in Zhongshan city

    Institute of Scientific and Technical Information of China (English)

    王莹; 陈昂; 米贤军; 沈铿; 余艳红; 肖琳; 徐秀梅; 孪峰; 钟守军; 段立锋

    2012-01-01

    Objective: To evaluate the application value of the Thinprep liquid based cytologic test (TCT) in large-scale screening of cervical cancer. Methods: The cervical cells of 44 936 cases were examined by TCT and classified according to the TBS system; cases with lesions of atypical squamous cells of undetermined significance (ASCUS)/atypical glandular cells (AGC) or above were designated cytological positive cases, and colposcopic multiple punch biopsy was conducted; cases with cervical intraepithelial neoplasia (CIN) or lesions above CIN were designated histopathological positive cases. The pathological results were taken as the gold standard, and the cytological results were compared with the pathological results after biopsy. Results: Among the 44 936 cases, 1 413 were diagnosed as cytological positive, a total detection rate of 3.14%, including 202 cases with LSIL (0.45%), 128 cases with HSIL (0.28%), 4 cases with squamous cell carcinoma (SCC) (0.09‰), one case with AC (0.02‰), 119 cases with ASC-H (0.27%), 919 cases with ASCUS (2.05%), and 40 cases with AGC (0.09%). A total of 761 cases underwent pathological examination, and the results were compared with the cytological results; the pathological positive rates of SCC, AC, HSIL, LSIL, ASC-H, ASCUS, and AGC detected by TCT were 100.00% (4/4), 100.00% (1/1), 90.40% (66/73), 47.27% (52/110), 67.69% (44/65), 16.32% (79/484), and 41.66% (10/24), respectively. The pathological accuracy rates of SCC, AC, HSIL, and LSIL detected by TCT were 100.00% (4/4), 100.00% (1/1), 63.01% (46/73), and 37.20% (41/110), respectively. Conclusion: TCT is a convenient and efficient screening method for cervical cancer and cervical precancerous lesions, useful for their early diagnosis and early treatment.
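
The headline figures in this abstract are simple proportions over the reported counts; a minimal sketch reproducing two of them from the numbers given above (the function name is ours, not from the paper):

```python
def rate_percent(positive, total):
    """Proportion expressed as a percentage, rounded to two decimals."""
    return round(100.0 * positive / total, 2)

# Counts taken from the abstract above.
overall = rate_percent(1413, 44936)    # cytological positives among all screened
hsil_confirmed = rate_percent(66, 73)  # HSIL cases confirmed by pathology
```

This reproduces the reported 3.14% overall detection rate; the HSIL confirmation rate comes out at ~90.4%, matching the abstract to rounding.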

  3. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinklers were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler reached 95 ha. In 1989 there were 98 sprinklers, covering an area of more than 10 130 ha. The study was conducted on 7 large sprinklers with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 90s and ownership changes in agriculture, large-scale sprinklers underwent significant or total devastation. Land on the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure, a change in demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. Operation of large-scale irrigation encountered all kinds of practical barriers, limitations of system solutions, supply difficulties, and high levels of equipment failure, which did not encourage rational use of the available sprinklers. A survey of the local area was carried out to show the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler system.

  4. Large-Scale PV Integration Study

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  5. Conformal Anomaly and Large Scale Gravitational Coupling

    CERN Document Server

    Salehi, H

    2000-01-01

    We present a model in which the breakdown of conformal symmetry of a quantum stress-tensor due to the trace anomaly is related to a cosmological effect in a gravitational model. This is done by characterizing the traceless part of the quantum stress-tensor in terms of the stress-tensor of a conformally invariant classical scalar field. We introduce a conformal frame in which the anomalous trace is identified with a cosmological constant. In this conformal frame we establish the Einstein field equations by connecting the quantum stress-tensor with the large scale distribution of matter in the universe.

  6. Large Scale Quantum Simulations of Nuclear Pasta

    Science.gov (United States)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 [...] pasta configurations. This work is supported in part by DOE Grants DE-FG02-87ER40365 (Indiana University) and DE-SC0008808 (NUCLEI SciDAC Collaboration).

  7. Large scale wind power penetration in Denmark

    DEFF Research Database (Denmark)

    Karnøe, Peter

    2013-01-01

    The Danish electricity generating system prepared to adopt nuclear power in the 1970s, yet has become the world's front runner in wind power, with a national plan for 50% wind power penetration by 2020. This paper deploys a sociotechnical perspective to explain the historical transformation of "net...... expertise evolves and contributes to the normalization and large-scale penetration of wind power in the electricity generating system. The analysis teaches us how technological paths become locked in, but also indicates keys for locking them out....

  8. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  9. What is a large-scale dynamo?

    Science.gov (United States)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  10. Large scale phononic metamaterials for seismic isolation

    Energy Technology Data Exchange (ETDEWEB)

    Aravantinos-Zafiris, N. [Department of Sound and Musical Instruments Technology, Ionian Islands Technological Educational Institute, Stylianou Typaldou ave., Lixouri 28200 (Greece); Sigalas, M. M. [Department of Materials Science, University of Patras, Patras 26504 (Greece)

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to compute the band structures of the proposed metamaterials.

  11. Hierarchical Engine for Large Scale Infrastructure Simulation

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-15

    HELICS is a new open-source, cyber-physical-energy co-simulation framework for electric power systems. HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
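The co-iteration idea mentioned above can be sketched without the HELICS API: at one time step, two toy "federates" exchange boundary values and iterate to a fixed point before time advances. The models and all parameter values below are made up for illustration (a source with internal resistance coupled to a resistive load).

```python
# Toy fixed-point co-iteration between two "federates" at a single time
# step. This is a sketch of the co-iteration concept, NOT the HELICS API:
# federate A models a source with internal resistance, federate B a load;
# they exchange a bus voltage and drawn current until both converge.

R_SOURCE, V_SOURCE = 1.0, 10.0   # federate A parameters (illustrative)
R_LOAD = 4.0                     # federate B parameter (illustrative)

def federate_a(current):
    """Source model: given the load current, publish the bus voltage."""
    return V_SOURCE - R_SOURCE * current

def federate_b(voltage):
    """Load model: given the bus voltage, publish the drawn current."""
    return voltage / R_LOAD

voltage, current = V_SOURCE, 0.0
for iteration in range(100):
    new_voltage = federate_a(current)
    new_current = federate_b(new_voltage)
    if abs(new_voltage - voltage) < 1e-10 and abs(new_current - current) < 1e-10:
        break  # models agree: physical convergence at this time step
    voltage, current = new_voltage, new_current

# Analytic answer for this circuit: V = V_SOURCE * R_LOAD / (R_SOURCE + R_LOAD) = 8.0
print(round(voltage, 6), round(current, 6))
```

Only when the exchanged values stop changing would a real co-simulation engine grant the federates permission to advance to the next time step.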

  12. Colloquium: Large scale simulations on GPU clusters

    Science.gov (United States)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics and network analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may also be applied to other problems like the solution of Partial Differential Equations.

  13. Establishing a Gene Trap System Mediated by T-DNA(GUS) in Rice

    Institute of Scientific and Technical Information of China (English)

    Shi-Yan Chen; Ai-Min Wang; Wei Li; Zong-Yang Wang; Xiu-Ling Cai

    2008-01-01

    Two plasmids, p13GUS and p13GUS2, were constructed to create a gene trap system containing the promoterless β-glucuronidase (GUS) reporter gene in the T-DNA region. Transformation of these two plasmids into the rice variety Zhonghua 11 (Oryza sativa ssp. japonica cv.), mediated by Agrobacterium tumefaciens, resulted in 942 independent transgenic lines. Histochemical GUS assays revealed that 31 T0 plants had various patterns of reporter gene expression, including expression in only one tissue and simultaneously in two or more tissues. Hygromycin-resistant (hygr) homozygotes were screened and the copy number of the T-DNA inserts was determined in the GUS-positive transgenic plants. The flanking sequences of the T-DNA were isolated by inverse polymerase chain reaction, and the insertion positions of the T-DNA in the rice genome were determined by basic local alignment search tool (BLAST) searches in the GUS-positive transgenic plants transformed with plasmid p13GUS. Moreover, calli induced from the seeds of the T1 generation of 911 GUS-negative transgenic lines were subjected to stress and hormone treatments. Histochemical GUS assays were carried out on the calli before and after treatment. The results revealed that calli from 21 lines displayed differential GUS expression after treatment. All of these data demonstrate that this trap system is suitable for identifying rice genes, including those that are sensitive to induction.
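The flanking-sequence mapping step can be illustrated with a toy stand-in. A real pipeline recovers the flank by inverse PCR and maps it onto the genome with BLAST; here exact substring search on short made-up sequences plays that role (both sequences below are hypothetical, not from the study):

```python
# Toy illustration of mapping a T-DNA flanking sequence onto a genome.
# Real pipelines use BLAST against the full rice genome; exact substring
# search on invented sequences is used here purely as a stand-in.

genome = "ATGGCGTACCGGATTACAGGCTTAACGGATCCGTTACGA"  # hypothetical reference
flank = "GGCTTAACGG"                                # hypothetical flanking read

position = genome.find(flank)  # 0-based insertion locus, -1 if unmapped
print(position)
```

A BLAST search differs from this in allowing mismatches and gaps and in scoring local alignments, but the output of interest is the same: the genomic coordinate of the insertion.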

  14. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for recruiting the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for scientific mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces at which project management, university administration, researchers and international partners can work together, exchange information and improve processes, in order to recruit, support and retain the brightest heads for a project.

  15. Large-scale Globally Propagating Coronal Waves

    Directory of Open Access Journals (Sweden)

    Alexander Warmuth

    2015-09-01

    Full Text Available Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous spaced-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the “classical” interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which “pseudo waves” are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  16. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  17. Accelerated large-scale multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Lloyd Scott

    2011-12-01

    Full Text Available Abstract Background Multiple sequence alignment (MSA is a fundamental analysis method used in bioinformatics and many comparative genomic applications. Prior MSA acceleration attempts with reconfigurable computing have only addressed the first stage of progressive alignment and consequently exhibit performance limitations according to Amdahl's Law. This work is the first known to accelerate the third stage of progressive alignment on reconfigurable hardware. Results We reduce subgroups of aligned sequences into discrete profiles before they are pairwise aligned on the accelerator. Using an FPGA accelerator, an overall speedup of up to 150 has been demonstrated on a large data set when compared to a 2.4 GHz Core2 processor. Conclusions Our parallel algorithm and architecture accelerates large-scale MSA with reconfigurable computing and allows researchers to solve the larger problems that confront biologists today. Program source is available from http://dna.cs.byu.edu/msa/.
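The Amdahl's Law limitation named in the abstract is easy to make concrete: if only a fraction p of the runtime is accelerated by a factor s, the overall speedup is bounded by 1/(1-p). The fractions below are illustrative, not measurements from the paper:

```python
# Amdahl's Law: overall speedup when a fraction p of the runtime is
# accelerated by a factor s. Illustrates why accelerating only the first
# stage of progressive alignment caps the end-to-end gain.

def amdahl(p, s):
    """Overall speedup for accelerated fraction p, per-stage speedup s."""
    return 1.0 / ((1.0 - p) + p / s)

# Accelerating 50% of the work "infinitely" still only doubles throughput:
print(round(amdahl(0.5, 1e12), 2))
# Accelerating 90% of the work by 150x yields well under 10x overall:
print(round(amdahl(0.9, 150.0), 2))
```

This is why extending the acceleration to the third stage of progressive alignment, as the paper does, matters more than pushing the per-stage factor higher.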

  18. Clumps in large scale relativistic jets

    CERN Document Server

    Tavecchio, F; Celotti, A

    2003-01-01

    The relatively intense X-ray emission from large scale (tens to hundreds of kpc) jets discovered with Chandra likely implies that jets (at least in powerful quasars) are still relativistic at those distances from the active nucleus. In this case the emission is due to Compton scattering off seed photons provided by the Cosmic Microwave Background, which on the one hand permits magnetic fields close to equipartition with the emitting particles, and on the other hand minimizes the requirements on the total power carried by the jet. The emission comes from compact (kpc scale) knots, and here we investigate what we can predict about the possible emission between the bright knots. This is motivated by the fact that bulk relativistic motion makes Compton scattering off the CMB photons efficient even when electrons are cold or mildly relativistic in the comoving frame. This implies relatively long cooling times, dominated by adiabatic losses. Therefore the relativistically moving plasma can emit, by Compton sc...

  19. Large-scale parametric survival analysis.

    Science.gov (United States)

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades, with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computation power have led to considerable interest in analyzing very-high-dimensional data where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that applying regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
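Cyclic coordinate descent updates one coefficient at a time while holding the others fixed. A minimal sketch, using lasso-regularized least squares as a stand-in for the parametric survival likelihood the paper optimizes (the data and penalty value below are made up):

```python
# Cyclic coordinate descent for lasso-regularized least squares, as a
# simplified stand-in for the regularized-survival variant: the survival
# log-likelihood replaces the squared loss in the real method.

def soft_threshold(z, gamma):
    """Shrinkage operator arising from the L1 penalty."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def lasso_cd(X, y, lam, sweeps=200):
    n, d = len(X), len(X[0])
    beta = [0.0] * d
    for _ in range(sweeps):
        for j in range(d):
            # Partial residual with coordinate j removed from the fit
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(d) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / norm
    return beta

X = [[1, 0], [0, 1], [1, 0], [0, 1]]  # toy design with orthogonal columns
y = [2, 0, 2, 0]                      # generated by beta = (2, 0)
beta = lasso_cd(X, y, lam=0.5)
print(beta)
```

Each one-dimensional update has a closed form, which is what makes the method cheap per pass and attractive at the 10^4-10^6 scales the paper targets.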

  20. Curvature constraints from Large Scale Structure

    CERN Document Server

    Di Dio, Enea; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-01-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter $\\Omega_K$ with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on the spatial curvature parameter estimation. We show that constraints on the curvature para...

  1. Large-scale simulations of reionization

    Energy Technology Data Exchange (ETDEWEB)

    Kohler, Katharina; /JILA, Boulder /Fermilab; Gnedin, Nickolay Y.; /Fermilab; Hamilton, Andrew J.S.; /JILA, Boulder

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range--from galaxies to rare bright quasars--we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h^-1 Mpc with 10 h^-1 Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-alpha forest.
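A common definition of the clumping factor used to fold unresolved sub-cell structure into the radiative transfer is C = ⟨n²⟩/⟨n⟩², computed over the density samples inside a coarse cell; C > 1 boosts the effective recombination rate relative to a uniform cell. A minimal sketch (the sample values are made up):

```python
# Clumping factor C = <n^2> / <n>^2 over the density samples inside one
# coarse simulation cell. C = 1 for a uniform cell; C > 1 means the
# unresolved structure enhances the effective recombination rate.

def clumping_factor(densities):
    n = len(densities)
    mean = sum(densities) / n
    mean_sq = sum(d * d for d in densities) / n
    return mean_sq / mean ** 2

uniform = [1.0, 1.0, 1.0, 1.0]      # homogeneous cell
clumpy = [0.1, 0.1, 0.1, 3.7]       # same mean density, concentrated mass
print(clumping_factor(uniform), round(clumping_factor(clumpy), 2))
```

Both cells have the same mean density, but the clumpy one recombines several times faster, which is exactly the sub-grid effect the coarse simulation would otherwise miss.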

  2. Large-Scale Tides in General Relativity

    CERN Document Server

    Ip, Hiu Yan

    2016-01-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lema\\^itre-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation ...

  3. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  4. Large scale water lens for solar concentration.

    Science.gov (United States)

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials, normal tap water and hyper-elastic linear low density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with a raytracing software based on the lens shape. We have achieved a good match of experimental and theoretical data by considering wavelength dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation.
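In the thin-lens limit, the focal length of such a lens can be estimated from the lensmaker's equation, 1/f = (n - 1)(1/R1 - 1/R2); for a plano-convex shape (R2 → ∞) this reduces to f = R/(n - 1). A sketch with n = 1.33 for water and an illustrative radius of curvature (the real foil lens is not exactly spherical, so this is only a first approximation):

```python
# Thin-lens estimate for a plano-convex water lens via the lensmaker's
# equation with the flat side taken as R2 -> infinity: f = R / (n - 1).
# The radius of curvature below is illustrative, not from the paper.

N_WATER = 1.33  # refractive index of water (visible light)

def focal_length_plano_convex(radius_m, n=N_WATER):
    """Focal length in metres for radius of curvature radius_m."""
    return radius_m / (n - 1.0)

# A 0.5 m radius of curvature gives roughly a 1.5 m focal length:
print(round(focal_length_plano_convex(0.5), 3))
```

The strong sensitivity of f to R explains the paper's observation that changing the water volume, and hence the foil's shape, shifts both the focal length and the concentration factor.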

  5. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...... within the development of our urban landscapes. At the same time, urban and landscape designers are confronted with new methodological problems. Within a strategic transformation perspective, the formulation of the design problem or brief becomes an integrated part of the design process. This paper...... discusses new design (education) methods based on a relational concept of urban sites and design processes. Within this logic site survey is not simply a pre-design activity nor is it a question of comprehensive analysis. Site survey is an integrated part of the design process. By means of active site...

  6. Supporting large-scale computational science

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: 1. Several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets. 2. Several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme. 3. Several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data. 4. In some cases, performance is a moot issue. This is true in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  7. Introducing Large-Scale Innovation in Schools

    Science.gov (United States)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms, especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme, which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: one out of three registered users (36%) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76% (offer of useful support through teacher academies) to 87% (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  8. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.

  9. Cold flows and large scale tides

    Science.gov (United States)

    van de Weygaert, R.; Hoffman, Y.

    1999-01-01

    Within the context of the general cosmological setting, it has remained puzzling that the local Universe is a relatively cold environment, in the sense that small-scale peculiar velocities are relatively small. Indeed, this has long figured as an important argument for the Universe having a low Ω or, if the Universe has a high Ω, for the existence of a substantial bias between the galaxy and the matter distribution. Here we investigate the dynamical impact of neighbouring matter concentrations on the local small-scale characteristics of cosmic flows. While regions whose local dynamics and kinematics are dominated by huge nearby matter clumps may experience a faster collapse on account of the corresponding tidal influence, that influence will also slow down or even prevent a thorough mixing and virialization of the collapsing region. By means of N-body simulations starting from constrained realizations of regions of modest density surrounded by more pronounced massive structures, we have explored the extent to which the large scale tidal fields may indeed suppress the `heating' of the small-scale cosmic velocities. Among other measures, we quantify the resulting cosmic flows through the cosmic Mach number. This allows us to draw conclusions about the validity of estimates of global cosmological parameters from local cosmic phenomena and the necessity of taking into account the structure and distribution of mass in the local Universe.

  10. Large scale mechanical metamaterials as seismic shields

    Science.gov (United States)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including for the first time numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.
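The band gaps that both phononic-crystal records above rely on already appear in the simplest lattice model: a 1D diatomic mass-spring chain, whose acoustic and optical branches separate at the zone boundary. A sketch with illustrative parameters (this toy model is ours; the papers use full 2D/3D dispersion analysis):

```python
# 1D diatomic mass-spring chain: the simplest model exhibiting a phononic
# band gap. Branch frequencies from the standard dispersion relation
#   w^2 = K*s -/+ K*sqrt(s^2 - 4*sin(k*a/2)^2/(m1*m2)),  s = 1/m1 + 1/m2
import math

K, M1, M2, A = 1.0, 1.0, 3.0, 1.0  # spring constant, masses, lattice const

def branch_frequencies(k):
    """Angular frequencies (acoustic, optical) at wavenumber k."""
    s = 1.0 / M1 + 1.0 / M2
    root = math.sqrt(s * s - 4.0 * math.sin(k * A / 2.0) ** 2 / (M1 * M2))
    acoustic = math.sqrt(K * (s - root))
    optical = math.sqrt(K * (s + root))
    return acoustic, optical

# Band edges sit at the zone boundary k = pi/a; no real wave propagates
# at frequencies between them -- that window is the band gap.
acoustic_top, optical_bottom = branch_frequencies(math.pi / A)
gap = optical_bottom - acoustic_top
print(round(acoustic_top, 4), round(optical_bottom, 4), round(gap, 4))
```

Scaling masses and stiffnesses up (concrete blocks, steel inclusions, metres-scale lattice constants) pushes such gaps down into the low-frequency range relevant for seismic waves, which is the design question both papers address with full dispersion analysis.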

  11. Management of large-scale multimedia conferencing

    Science.gov (United States)

    Cidon, Israel; Nachum, Youval

    1998-12-01

    The goal of this work is to explore management strategies and algorithms for large-scale multimedia conferencing over a communication network. Since the use of multimedia conferencing is still limited, the management of such systems has not yet been studied in depth. A well organized and human-friendly multimedia conference management should utilize its limited resources efficiently and fairly, as well as take into account the requirements of the conference participants. The ability of the management to enforce fair policies and to quickly take into account the participants' preferences may even lead to a conference environment that is more pleasant and more effective than a similar face-to-face meeting. We suggest several principles for defining and solving resource sharing problems in this context. The conference resources addressed in this paper are the bandwidth (conference network capacity), time (participants' scheduling) and the limitations of audio and visual equipment. The participants' requirements for these resources are defined and translated in terms of Quality of Service requirements and fairness criteria.
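One classic way to realize the fair bandwidth sharing discussed above is max-min fairness; the sketch below (illustrative, not the authors' algorithm) allocates conference capacity by water-filling: participants with small demands are fully satisfied, and the remaining capacity is split equally among the rest.

```python
def max_min_fair(capacity, demands):
    """Water-filling max-min fair allocation of `capacity` among `demands`."""
    alloc = [0.0] * len(demands)
    remaining = list(range(len(demands)))
    cap = float(capacity)
    while remaining:
        share = cap / len(remaining)
        # satisfy every participant whose demand fits under the equal share
        satisfied = [i for i in remaining if demands[i] <= share]
        if not satisfied:
            # nobody's demand fits: split the leftover capacity equally
            for i in remaining:
                alloc[i] = share
            return alloc
        for i in satisfied:
            alloc[i] = demands[i]
            cap -= demands[i]
        remaining = [i for i in remaining if i not in satisfied]
    return alloc
```

For example, sharing 10 Mb/s among demands of 2, 2.6, 4 and 5 Mb/s yields 2, 2.6, 2.7 and 2.7 Mb/s: the two large demands get equal shares of what is left.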

  12. Large-scale wind turbine structures

    Science.gov (United States)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, the modeling of structural dynamic response, and innovative structural design. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.
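Fatigue-life requirements of the kind mentioned above are commonly checked with the Palmgren-Miner linear damage rule; the sketch below (with hypothetical cycle counts, not Mod-5B data) sums the damage fractions of a load spectrum.

```python
def miners_damage(cycles_and_limits):
    """Palmgren-Miner damage sum D = sum(n_i / N_i); failure is predicted
    when D reaches 1. Each entry is (cycles applied, cycles to failure)."""
    return sum(n / N for n, N in cycles_and_limits)

# Hypothetical load spectrum for one structural detail over its lifetime:
D = miners_damage([
    (1e8, 5e8),   # many low-amplitude cycles: n applied vs N to failure
    (1e6, 1e7),   # moderate-amplitude cycles
    (1e4, 1e5),   # rare high-amplitude gust cycles
])
# D = 0.4 < 1, so the detail is predicted to survive this spectrum
```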

  13. Large-scale tides in general relativity

    Science.gov (United States)

    Ip, Hiu Yan; Schmidt, Fabian

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  14. Large scale probabilistic available bandwidth estimation

    CERN Document Server

    Thouin, Frederic; Rabbat, Michael

    2010-01-01

    The common utilization-based definition of available bandwidth and many of the existing tools to estimate it suffer from several important weaknesses: i) most tools report a point estimate of average available bandwidth over a measurement interval and do not provide a confidence interval; ii) the commonly adopted models used to relate the available bandwidth metric to the measured data are invalid in almost all practical scenarios; iii) existing tools do not scale well and are not suited to the task of multi-path estimation in large-scale networks; iv) almost all tools use ad-hoc techniques to address measurement noise; and v) tools do not provide enough flexibility in terms of accuracy, overhead, latency and reliability to adapt to the requirements of various applications. In this paper we propose a new definition for available bandwidth and a novel framework that addresses these issues. We define probabilistic available bandwidth (PAB) as the largest input rate at which we can send a traffic flow along a pa...
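A minimal sketch of the idea of reporting a confidence statement instead of a point estimate (assumed probing model, not the authors' estimator): probe each candidate rate several times, score each probe as "no congestion observed" or not, and report the largest rate whose lower confidence bound on the success probability still exceeds a target.

```python
import math

def wilson_lower(successes, trials, z=1.96):
    """Lower bound of the 95% Wilson score interval for a success probability."""
    if trials == 0:
        return 0.0
    p = successes / trials
    denom = 1 + z * z / trials
    centre = p + z * z / (2 * trials)
    margin = z * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials ** 2))
    return (centre - margin) / denom

def probabilistic_available_bandwidth(outcomes, confidence=0.9):
    """Largest probed rate whose lower confidence bound on the probability of
    'no congestion' still exceeds `confidence`.
    `outcomes` maps rate -> (successful probes, total probes)."""
    feasible = [rate for rate, (s, n) in outcomes.items()
                if wilson_lower(s, n) >= confidence]
    return max(feasible) if feasible else None
```

With few probes per rate even a perfect success record yields a modest lower bound, which is exactly the kind of honesty about measurement uncertainty the abstract argues for.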

  15. Gravitational redshifts from large-scale structure

    CERN Document Server

    Croft, Rupert A C

    2013-01-01

    The recent measurement of the gravitational redshifts of galaxies in galaxy clusters by Wojtak et al. has opened a new observational window on dark matter and modified gravity. By stacking clusters this determination effectively used the line of sight distortion of the cross-correlation function of massive galaxies and lower mass galaxies to estimate the gravitational redshift profile of clusters out to 4 Mpc/h. Here we use a halo model of clustering to predict the distortion due to gravitational redshifts of the cross-correlation function on scales from 1 - 100 Mpc/h. We compare our predictions to simulations and use the simulations to make mock catalogues relevant to current and future galaxy redshift surveys. Without formulating an optimal estimator, we find that the full BOSS survey should be able to detect gravitational redshifts from large-scale structure at the ~4 sigma level. Upcoming redshift surveys will greatly increase the number of galaxies useable in such studies and the BigBOSS and Euclid exper...
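For orientation, the order of magnitude of the effect can be recovered from the weak-field formula Δz ≈ GM/(c²r); the numbers below are illustrative, not the halo-model prediction of the paper.

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
MSUN = 1.989e30      # solar mass, kg
MPC = 3.086e22       # megaparsec, m

def grav_redshift_kms(mass_msun, radius_mpc):
    """Velocity equivalent c*dz of the gravitational redshift for light
    climbing out of the potential of a point mass: dz ~ GM / (c^2 r)."""
    dz = G * mass_msun * MSUN / (C ** 2 * radius_mpc * MPC)
    return dz * C / 1e3   # km/s

# A ~1e15 Msun cluster probed at 1 Mpc gives a shift of order 10 km/s,
# which is why stacking many clusters is needed to detect it.
v_shift = grav_redshift_kms(1e15, 1.0)
```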

  16. Food appropriation through large scale land acquisitions

    Science.gov (United States)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security for the local populations.
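The people-fed arithmetic reduces to a calorie balance; the sketch below uses assumed, illustrative numbers rather than the paper's land-deal dataset.

```python
def people_fed(area_ha, yield_kcal_per_ha_yr, diet_kcal_per_person_day=2400):
    """Number of people a cropped area could nourish on a pure kcal basis."""
    annual_supply = area_ha * yield_kcal_per_ha_yr       # kcal per year
    annual_need = diet_kcal_per_person_day * 365         # kcal per person-year
    return annual_supply / annual_need

# Hypothetical example: 40 Mha of acquired cropland at 8 million kcal/ha/yr
# supports a population in the hundreds of millions.
n = people_fed(40e6, 8e6)
```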

  17. Large-scale clustering of cosmic voids

    Science.gov (United States)

    Chan, Kwan Chuen; Hamaus, Nico; Desjacques, Vincent

    2014-11-01

    We study the clustering of voids using N -body simulations and simple theoretical models. The excursion-set formalism describes fairly well the abundance of voids identified with the watershed algorithm, although the void formation threshold required is quite different from the spherical collapse value. The void cross bias bc is measured and its large-scale value is found to be consistent with the peak background split results. A simple fitting formula for bc is found. We model the void auto-power spectrum taking into account the void biasing and exclusion effect. A good fit to the simulation data is obtained for voids with radii ≳30 Mpc h-1 , especially when the void biasing model is extended to 1-loop order. However, the best-fit bias parameters do not agree well with the peak-background results. Being able to fit the void auto-power spectrum is particularly important not only because it is the direct observable in galaxy surveys, but also our method enables us to treat the bias parameters as nuisance parameters, which are sensitive to the techniques used to identify voids.

  18. Large scale digital atlases in neuroscience

    Science.gov (United States)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software, to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome-wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  19. Clinical significance of large-scale screening of the mitochondrial DNA A1555G mutation in neonates

    Institute of Scientific and Technical Information of China (English)

    蔡筠; 罗彩群; 谢建生; 吴维青; 耿茜; 徐志勇; 郝颖; 徐晓昕

    2011-01-01

    Objective: To explore the necessity of large-scale screening for the mitochondrial DNA (mtDNA) A1555G mutation in newborns for the prevention of aminoglycoside-antibiotic-induced deafness. Methods: One thousand blood filter-paper samples were collected from neonates born in July 2008 in Shenzhen. DNA was extracted with Chelex-100 resin and amplified by PCR, and the mtDNA A1555G mutation was detected in the PCR products by denaturing high-performance liquid chromatography (DHPLC). The positive mutation frequency was calculated. Results: The mtDNA A1555G mutation was detected in 2 of the 1000 neonates, a mutation frequency of 0.2%. Conclusion: The mtDNA A1555G mutation occurs at an appreciable frequency in neonates. Large-scale screening of newborns for the mutation can identify individuals sensitive to aminoglycoside antibiotics, which is helpful for guiding rational medication for the newborns and their high-risk maternal relatives, and may thus help prevent aminoglycoside-antibiotic-induced deafness.
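The reported frequency, and a sense of its statistical uncertainty, follow from elementary binomial reasoning (a sketch; the normal approximation used here is crude for such rare variants, and an exact binomial interval would be preferable):

```python
import math

def mutation_frequency(carriers, screened):
    return carriers / screened

def wald_ci(carriers, screened, z=1.96):
    """Normal-approximation 95% CI for the carrier frequency, clipped at 0.
    For rare variants this is only a rough guide."""
    p = carriers / screened
    half = z * math.sqrt(p * (1 - p) / screened)
    return max(0.0, p - half), p + half

freq = mutation_frequency(2, 1000)   # 0.002, i.e. the reported 0.2%
lo, hi = wald_ci(2, 1000)            # wide interval: 2 carriers is few
```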

  20. Developing Large-Scale Bayesian Networks by Composition

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale...

  1. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology Enables practitioners to study distributed large scale dimensional metrology independently Includes specific examples of the development of new system prototypes

  2. Sensitivity technologies for large scale simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias (Rice University, Houston, TX); Wilcox, Lucas C. (Brown University, Providence, RI); Hill, Judith C. (Carnegie Mellon University, Pittsburgh, PA); Ghattas, Omar (Carnegie Mellon University, Pittsburgh, PA); Berggren, Martin Olof (University of UppSala, Sweden); Akcelik, Volkan (Carnegie Mellon University, Pittsburgh, PA); Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing codes; equally important, the work focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations.
The hybrid automatic differentiation method was applied to a first
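The adjoint idea summarized above can be condensed into a toy example: for a model A(p)u = b with objective J = gᵀu, a single adjoint solve Aᵀλ = g gives the gradient dJ/dp = -λᵀ(∂A/∂p)u at the cost of one extra linear solve, independent of the number of parameters. The 2x2 model below is purely illustrative, not the report's code.

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def objective(p):
    A = [[2.0 + p, 1.0], [1.0, 3.0]]   # system matrix depends on parameter p
    u = solve2(A, [1.0, 2.0])          # forward solve A(p) u = b
    return u[0] + u[1]                 # J = g^T u with g = (1, 1)

def adjoint_gradient(p):
    A = [[2.0 + p, 1.0], [1.0, 3.0]]
    u = solve2(A, [1.0, 2.0])
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
    lam = solve2(At, [1.0, 1.0])       # adjoint solve: A^T lam = g
    dA_dp_u = [u[0], 0.0]              # (dA/dp) u, since only A[0][0] varies
    return -(lam[0] * dA_dp_u[0] + lam[1] * dA_dp_u[1])
```

A finite-difference check on the forward solve confirms the adjoint gradient, which is the standard sanity test for such implementations.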

  4. Large Scale Flame Spread Environmental Characterization Testing

    Science.gov (United States)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as the sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation
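The link between total heat release and chamber pressure rise can be seen from a lumped constant-volume energy balance (an idealized estimate that ignores the wall heat losses the tests were designed to measure; numbers are illustrative):

```python
def pressure_rise_pa(heat_release_j, chamber_volume_m3, gamma=1.4):
    """Lumped adiabatic constant-volume estimate: with internal energy
    U = p V / (gamma - 1), adding heat Q at fixed V gives
    dP = (gamma - 1) * Q / V. Heat loss to the walls is neglected."""
    return (gamma - 1.0) * heat_release_j / chamber_volume_m3

# Hypothetical example: 20 kJ released in a sealed 0.4 m^3 chamber
# raises the pressure by about 20 kPa if no heat is lost.
dp = pressure_rise_pa(2.0e4, 0.4)
```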

  5. Synchronization of coupled large-scale Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fangfei, E-mail: li-fangfei@163.com [Department of Mathematics, East China University of Science and Technology, No. 130, Meilong Road, Shanghai, Shanghai 200237 (China)

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm towards large-scale Boolean network is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
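A minimal sketch of what complete synchronization means for a drive-response pair of Boolean networks (a toy model, not the aggregation algorithm of the paper): after a transient, the response network's state must track the drive network's state exactly at every step.

```python
def evolve(step_drive, step_response, x0, y0, transient=4, horizon=4):
    """Return True if the response network completely synchronizes with the
    drive network, i.e. y(t) == x(t) at every step after the transient."""
    x, y = x0, y0
    for _ in range(transient):
        x, y = step_drive(x), step_response(x, y)
    for _ in range(horizon):
        x, y = step_drive(x), step_response(x, y)
        if x != y:
            return False
    return True

# Two-node drive network; the coupled response simply follows the drive's
# state (unidirectional coupling), while the isolated one ignores it.
drive = lambda x: (x[0] and x[1], x[0] or x[1])
coupled = lambda x, y: drive(x)
isolated = lambda x, y: (y[1], y[0])
```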

  7. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the neutral gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of a stratified disk embedded in a warm atmosphere. 
This model is the first to compute non-ideal effects from

  8. Large Scale Reduction of Graphite Oxide Project

    Science.gov (United States)

    Calle, Carlos; Mackey, Paul; Falker, John; Zeitlin, Nancy

    2015-01-01

    This project seeks to develop an optical method to reduce graphite oxide into graphene efficiently and in larger formats than currently available. Current reduction methods are expensive, time-consuming or restricted to small, limited formats. Graphene has potential uses in ultracapacitors, energy storage, solar cells, flexible and light-weight circuits, touch screens, and chemical sensors. In addition, graphite oxide is a sustainable material that can be produced from any form of carbon, making this method environmentally friendly and adaptable for in-situ reduction.

  9. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area and low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. 
Capillary condensation of a condensable vapor in the interconnected macropores leads to the
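The structural color described above follows the Bragg-Snell relation; the sketch below estimates the normal-incidence reflection peak for an fcc array of air cavities in a polymer, with an assumed refractive index (illustrative parameters, not the study's measured values). When the cavities are filled with an index-matched solvent the refractive-index contrast vanishes, which is why the film turns transparent.

```python
import math

def reflected_wavelength_nm(cavity_diameter_nm, n_polymer=1.49,
                            air_fraction=0.74):
    """Normal-incidence Bragg-Snell peak for an fcc array of air cavities:
    lambda = 2 * d111 * n_eff, with interplanar spacing d111 = sqrt(2/3) * D
    and an effective index from volume-weighted averaging of n^2."""
    d111 = math.sqrt(2.0 / 3.0) * cavity_diameter_nm
    n_eff = math.sqrt(air_fraction * 1.0 ** 2
                      + (1.0 - air_fraction) * n_polymer ** 2)
    return 2.0 * d111 * n_eff

# 280 nm cavities in an acrylate-like polymer reflect green light (~525 nm);
# tuning the cavity size sweeps the peak across the visible spectrum.
lam = reflected_wavelength_nm(280.0)
```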

  10. Measurement of ionospheric large-scale irregularity

    Institute of Scientific and Technical Information of China (English)

    韩文焌; 郑怡嘉; 张喜镇

    1996-01-01

    Based on observations with a meter-wave aperture-synthesis radio telescope, and given that the scale length of ionospheric irregularities is much larger than the interferometer baseline, the phase error in the interferometer output due to the ionosphere is proportional to the baseline length; expressions for extracting ionospheric information are derived accordingly. Using ray theory, and taking into account that the antenna continuously tracks the radio source during astronomical observation, the wave-motion expression for a traveling ionospheric disturbance (TID) observed in the total electron content is also derived; it is consistent with that obtained from the thin phase-screen concept. The Doppler velocity due to antenna tracking is then introduced. Finally, an inversion analysis for the horizontal phase velocity of the TID from observed data is given.
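The proportionality between ionospheric phase error and baseline length can be made concrete with the standard ionospheric refraction constant 40.3 m^3 s^-2: the excess phase at frequency f is 2*pi*40.3*TEC/(c*f), so a linear TEC gradient across the baseline produces a differential phase that grows linearly with baseline. The gradient, baseline, and observing frequency below are illustrative values, not those of the cited telescope.

```python
import math

C = 2.998e8   # speed of light, m/s
K = 40.3      # standard ionospheric refraction constant, m^3 s^-2

def phase_error_rad(tec_gradient_el_m2_per_m, baseline_m, freq_hz):
    """Interferometer phase error from a linear TEC gradient across the
    baseline; proportional to the baseline length, as stated in the text."""
    delta_tec = tec_gradient_el_m2_per_m * baseline_m   # TEC difference, el/m^2
    return 2.0 * math.pi * K * delta_tec / (C * freq_hz)

# Assumed values: 0.01 TECU/km gradient (1 TECU = 1e16 el/m^2),
# a 1 km baseline, and a 232 MHz metre-wave observing frequency.
dphi = phase_error_rad(0.01 * 1e16 / 1e3, 1000.0, 232e6)   # ~0.36 rad
```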

  11. Jacobsen protocols for large-scale epoxidation of cyclic dienyl sulfones: application to the (+)-pretazettine core.

    Science.gov (United States)

    Ebrahimian, G Reza; du Jourdin, Xavier Mollat; Fuchs, Philip L

    2012-05-18

    A Jacobsen epoxidation protocol using H2O2 as oxidant was designed for the large-scale preparation of various epoxy vinyl sulfones. A number of cocatalysts were screened, and pH control led to increased reaction rate, higher turnover number, and improved reliability.

  12. Thermal power generation projects "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of the technology. The demonstration programme developed from it was judged favourably by the expert reviewers but was not immediately (1996) accepted for financial support. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  13. Multitree Algorithms for Large-Scale Astrostatistics

    Science.gov (United States)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations become quickly unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, thereby enabling two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. 
In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being
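The pairwise-distance speedups sketched above can be illustrated in miniature. The toy example below — not the authors' multitree code — counts 2-point correlation pairs within a radius r in one dimension: sorting plus binary search replaces the quadratic double loop, a 1-D analogue of the tree-based pruning the chapter describes.

```python
import bisect
import random

def pair_count_naive(xs, r):
    """O(N^2): count unordered pairs with |xi - xj| <= r."""
    n = len(xs)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(xs[i] - xs[j]) <= r)

def pair_count_sorted(xs, r):
    """O(N log N): sort once, then one binary search per point."""
    s = sorted(xs)
    total = 0
    for i, x in enumerate(s):
        # index of the first element greater than x + r
        j = bisect.bisect_right(s, x + r)
        total += j - i - 1  # partners at positions i+1 .. j-1
    return total

# Integer data keeps the two predicates exactly equivalent.
random.seed(0)
data = [random.randrange(1000) for _ in range(300)]
assert pair_count_naive(data, 5) == pair_count_sorted(data, 5)
```

The same prune-or-recurse idea, generalised to k-d trees traversed over several datasets at once, is what gives multitree methods their near-linear behaviour in higher dimensions.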

  14. A human haploid gene trap collection to study lncRNAs with unusual RNA biology.

    Science.gov (United States)

    Kornienko, Aleksandra E; Vlatkovic, Irena; Neesen, Jürgen; Barlow, Denise P; Pauler, Florian M

    2016-01-01

    Many thousand long non-coding (lnc) RNAs are mapped in the human genome. Time-consuming studies using reverse genetic approaches by post-transcriptional knock-down or genetic modification of the locus demonstrated diverse biological functions for a few of these transcripts. The Human Gene Trap Mutant Collection in haploid KBM7 cells is a ready-to-use tool for studying protein-coding gene function. As lncRNAs show remarkable differences in RNA biology compared to protein-coding genes, it is unclear if this gene trap collection is useful for functional analysis of lncRNAs. Here we use the uncharacterized LOC100288798 lncRNA as a model to answer this question. Using public RNA-seq data we show that LOC100288798 is ubiquitously expressed, but inefficiently spliced. The minor spliced LOC100288798 isoforms are exported to the cytoplasm, whereas the major unspliced isoform is nuclear localized. This shows that LOC100288798 RNA biology differs markedly from typical mRNAs. De novo assembly from RNA-seq data suggests that LOC100288798 extends 289 kb beyond its annotated 3' end and overlaps the downstream SLC38A4 gene. Three cell lines with independent gene trap insertions in LOC100288798 were available from the KBM7 gene trap collection. RT-qPCR and RNA-seq confirmed successful lncRNA truncation and its extended length. Expression analysis from RNA-seq data shows significant deregulation of 41 protein-coding genes upon LOC100288798 truncation. Our data show that gene trap collections in human haploid cell lines are useful tools to study lncRNAs, and identify the previously uncharacterized LOC100288798 as a potential gene regulator.

  15. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi–site data by using within–year recaptures to provide a covariate of between–year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands, where a re–introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap–happiness” effects. As the data are based on resightings, such trap–happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  16. Development of large-scale structure in the Universe

    CERN Document Server

    Ostriker, J P

    1991-01-01

    This volume grew out of the 1988 Fermi lectures given by Professor Ostriker, and is concerned with cosmological models that take into account the large scale structure of the universe. He starts with homogeneous isotropic models of the universe and then, by considering perturbations, he leads us to modern cosmological theories of the large scale, such as superconducting strings. This will be an excellent companion for all those interested in the cosmology and the large scale nature of the universe.

  17. Research on application of TCT combined with TIS technology in large-scale cervical cancer screening in Zhongshan city

    Institute of Scientific and Technical Information of China (English)

    米贤军; 王莹; 沈铿; 吴秋良; 肖琳; 陈昂; 徐秀梅; 孪峰; 钟守军

    2013-01-01

    Objective: To evaluate the application value of ThinPrep liquid-based cytology test (TCT) combined with computer-assisted ThinPrep imaging system (TIS) in large-scale cervical cancer screening. Methods: The cervical cells of 10 000 cases were prepared for TCT, then cytopathological doctors read the smears two times: the first cytological diagnosis was obtained by manually reading ThinPrep thin-layer cytology (TP) smears, and the second diagnosis was obtained after an appropriate interval by TIS; the cytological results were compared and analyzed. The cases with atypical squamous cells of undetermined significance (ASCUS)/atypical glandular cells (AGC) or above lesions were designated as cytologically positive cases, and colposcopic multipoint biopsy was conducted; the cases with cervical intraepithelial neoplasia (CIN) I or above lesions were designated as pathologically positive cases. The results of pathological examination were designated as the gold standard; the cytological results and pathological results were compared and analyzed. Results: TP could screen 13.8 smears every hour, but TIS could screen 29.8 smears every hour; the work efficiency was improved by 115.94%. Among 10 000 smears, 568 positive smears were detected by TP, a total detection rate of 5.68%, including 151 smears of low grade squamous intraepithelial lesion (LSIL), 66 smears of high grade squamous intraepithelial lesion (HSIL), 3 smears of cervical cancer, 57 smears of atypical squamous cells - cannot exclude HSIL (ASC-H), 259 smears of ASCUS, and 32 smears of AGC; 621 positive smears were detected by TIS, a total detection rate of 6.21%, including 162 smears of LSIL, 85 smears of HSIL, 3 smears of cervical cancer, 65 smears of ASC-H, 273 smears of ASCUS, and 33 smears of AGC. The positive detection rates of LSIL, HSIL, ASC-H, and ASCUS detected by TIS were higher than those detected by TP, increasing by 9.33%, 7.28%, 28.79%, 14.04%, and 5.41
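The efficiency figure quoted above is simply the relative throughput gain between the two reading modes; a quick arithmetic check of the reported numbers:

```python
tp_rate = 13.8   # smears read per hour, manual TP reading
tis_rate = 29.8  # smears read per hour, TIS-assisted reading

gain_percent = (tis_rate - tp_rate) / tp_rate * 100
assert round(gain_percent, 2) == 115.94  # the reported 115.94% improvement

# Total positive detection rates among the 10 000 smears
assert 568 / 10000 == 0.0568  # TP:  5.68%
assert 621 / 10000 == 0.0621  # TIS: 6.21%
```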

  18. Large-scale functional purification of recombinant HIV-1 capsid.

    Directory of Open Access Journals (Sweden)

    Magdeleine Hung

    Full Text Available During human immunodeficiency virus type-1 (HIV-1 virion maturation, capsid proteins undergo a major rearrangement to form a conical core that protects the viral nucleoprotein complexes. Mutations in the capsid sequence that alter the stability of the capsid core are deleterious to viral infectivity and replication. Recently, capsid assembly has become an attractive target for the development of a new generation of anti-retroviral agents. Drug screening efforts and subsequent structural and mechanistic studies require gram quantities of active, homogeneous and pure protein. Conventional means of laboratory purification of Escherichia coli expressed recombinant capsid protein rely on column chromatography steps that are not amenable to large-scale production. Here we present a function-based purification of wild-type and quadruple mutant capsid proteins, which relies on the inherent propensity of capsid protein to polymerize and depolymerize. This method does not require the packing of sizable chromatography columns and can generate double-digit gram quantities of functionally and biochemically well-behaved proteins with greater than 98% purity. We have used the purified capsid protein to characterize two known assembly inhibitors in our in-house developed polymerization assay and to measure their binding affinities. Our capsid purification procedure provides a robust method for purifying large quantities of a key protein in the HIV-1 life cycle, facilitating identification of the next generation anti-HIV agents.

  19. Conditional gene expression in the mouse using a Sleeping Beauty gene-trap transposon

    Directory of Open Access Journals (Sweden)

    Hackett Perry B

    2006-06-01

    Full Text Available Abstract Background Insertional mutagenesis techniques with transposable elements have been popular among geneticists studying model organisms from E. coli to Drosophila and, more recently, the mouse. One such element is the Sleeping Beauty (SB) transposon, which has been shown in several studies to be an effective insertional mutagen in the mouse germline. SB transposon vector studies have employed different functional elements and reporter molecules to disrupt and report the expression of endogenous mouse genes. We sought to generate a transposon system that would be capable of reporting the expression pattern of a mouse gene while allowing for conditional expression of a gene of interest in a tissue- or temporal-specific pattern. Results Here we report the systematic development and testing of a transposon-based gene-trap system incorporating the doxycycline-repressible Tet-Off (tTA) system that is capable of activating the expression of genes under control of a Tet response element (TRE) promoter. We demonstrate that the gene trap system is fully functional in vitro by introducing the "gene-trap tTA" vector into human cells by transposition and identifying clones that activate expression of a TRE-luciferase transgene in a doxycycline-dependent manner. In transgenic mice, we mobilize gene-trap tTA vectors, discover parameters that can affect germline mobilization rates, and identify candidate gene insertions to demonstrate the in vivo functionality of the vector system. We further demonstrate that the gene-trap can act as a reporter of endogenous gene expression and can be coupled with bioluminescent imaging to identify genes with tissue-specific expression patterns. Conclusion Akin to the GAL4/UAS system used in the fly, we have made progress developing a tool for mutating and revealing the expression of mouse genes by generating the tTA transactivator in the presence of a secondary TRE-regulated reporter molecule. A vector like the gene-trap

  20. Safeguards instruments for Large-Scale Reprocessing Plants

    Energy Technology Data Exchange (ETDEWEB)

    Hakkila, E.A. [Los Alamos National Lab., NM (United States); Case, R.S.; Sonnier, C. [Sandia National Labs., Albuquerque, NM (United States)

    1993-06-01

    Between 1987 and 1992 a multi-national forum known as LASCAR (Large Scale Reprocessing Plant Safeguards) met to assist the IAEA in development of effective and efficient safeguards for large-scale reprocessing plants. The US provided considerable input for safeguards approaches and instrumentation. This paper reviews and updates instrumentation of importance in measuring plutonium and uranium in these facilities.

  1. Electrodialysis system for large-scale enantiomer separation

    NARCIS (Netherlands)

    Ent, van der E.M.; Thielen, T.P.H.; Cohen Stuart, M.A.; Padt, van der A.; Keurentjes, J.T.F.

    2001-01-01

    In contrast to analytical methods, the range of technologies currently applied for large-scale enantiomer separations is not very extensive. Therefore, a new system has been developed for large-scale enantiomer separations that can be regarded as the scale-up of a capillary electrophoresis system.

  3. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  4. Characterization of enhancer trap and gene trap harboring Ac/Ds transposon in transgenic rice

    Institute of Scientific and Technical Information of China (English)

    金维正; 汪少敏; 徐敏; 段瑞君; 吴平

    2004-01-01

    Insertion mutagenesis has become one of the most popular methods for gene function analysis. Here we report a two-element Ac/Ds transposon system containing enhancer trap and gene trap constructs for gene tagging in rice. The excision of the Ds element was examined by PCR amplification. The excision frequency of the Ds element varied from 0% to 40% among 20 F2 populations derived from 11 different Ds parents. Southern blot analysis revealed that more than 70% of excised Ds elements reinserted into the rice genome and above 70% of the reinserted Ds elements were located at different positions of the chromosome in rice. The result of histochemical GUS analysis indicated that 28% of enhancer trap and 22% of gene trap tagging plants displayed GUS activity in leaves, roots, flowers or seeds. The GUS-positive lines will be useful for identifying gene function in rice.

  5. Large scale calcium channel gene rearrangements in episodic ataxia and hemiplegic migraine: implications for diagnostic testing.

    Science.gov (United States)

    Labrum, R W; Rajakulendran, S; Graves, T D; Eunson, L H; Bevan, R; Sweeney, M G; Hammans, S R; Tubridy, N; Britton, T; Carr, L J; Ostergaard, J R; Kennedy, C R; Al-Memar, A; Kullmann, D M; Schorge, S; Temple, K; Davis, M B; Hanna, M G

    2009-11-01

    Episodic ataxia type 2 (EA2) and familial hemiplegic migraine type 1 (FHM1) are autosomal dominant disorders characterised by paroxysmal ataxia and migraine, respectively. Point mutations in CACNA1A, which encodes the neuronal P/Q-type calcium channel, have been detected in many cases of EA2 and FHM1. The genetic basis of typical cases without CACNA1A point mutations is not fully known. Standard DNA sequencing methods may miss large scale genetic rearrangements such as deletions and duplications. The authors investigated whether large scale genetic rearrangements in CACNA1A can cause EA2 and FHM1. The authors used multiplex ligation dependent probe amplification (MLPA) to screen for intragenic CACNA1A rearrangements. The authors identified five previously unreported large scale deletions in CACNA1A in seven families with episodic ataxia and in one case with hemiplegic migraine. One of the deletions (exon 6 of CACNA1A) segregated with episodic ataxia in a four generation family with eight affected individuals previously mapped to 19p13. In addition, the authors identified the first pathogenic duplication in CACNA1A in an index case with isolated episodic diplopia without ataxia and in a first degree relative with episodic ataxia. Large scale deletions and duplications can cause CACNA1A associated channelopathies. Direct DNA sequencing alone is not sufficient as a diagnostic screening test.
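MLPA calls deletions and duplications from dosage rather than sequence: each probe's signal is normalized within its run and compared against a control. The sketch below is a generic illustration of that dosage-quotient logic; the probe names, signal values, and thresholds are invented for the example, not taken from the study.

```python
def dosage_quotient(sample_signals, control_signals):
    """Per-probe ratio of run-normalized sample signal to control signal.
    Values near 1.0 suggest two copies, near 0.5 a heterozygous deletion,
    near 1.5 a duplication (thresholds here are illustrative only)."""
    s_total = sum(sample_signals.values())
    c_total = sum(control_signals.values())
    return {p: (sample_signals[p] / s_total) / (control_signals[p] / c_total)
            for p in sample_signals}

def call_copy_number(dq, del_thresh=0.65, dup_thresh=1.35):
    """Classify one dosage quotient (illustrative thresholds)."""
    if dq < del_thresh:
        return "deletion"
    if dq > dup_thresh:
        return "duplication"
    return "normal"

# Hypothetical CACNA1A probe signals: exon 6 shows half dosage in the sample.
sample  = {"exon5": 100.0, "exon6": 52.0, "exon7": 98.0}
control = {"exon5": 100.0, "exon6": 100.0, "exon7": 100.0}
dq = dosage_quotient(sample, control)
assert call_copy_number(dq["exon6"]) == "deletion"
assert call_copy_number(dq["exon5"]) == "normal"
```

This is why the authors argue direct sequencing alone is insufficient: a whole-exon deletion leaves no sequence change to read, only a dosage change to measure.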

  6. Distribution probability of large-scale landslides in central Nepal

    Science.gov (United States)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related with such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area: there, the area under the receiver operating curve of the landslide distribution probability is 0.699, and the distribution probability could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
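The distribution equation described above is a logistic model, and its validation uses the area under the receiver operating characteristic curve. A generic sketch of both steps follows; the coefficients, feature names, and scores are placeholders, not the fitted values from the study.

```python
import math

def landslide_probability(features, coeffs, intercept):
    """Logistic model: P = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of (positive, negative) pairs ranked correctly."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Placeholder coefficients for, e.g., slope (degrees) and relief (m).
p = landslide_probability([30.0, 800.0], [0.05, 0.001], -2.0)
assert 0.0 < p < 1.0

# A model that ranks every landslide cell above every stable cell scores 1.0;
# the study reports AUC = 0.699 on its validation area.
assert auc([0.9, 0.8], [0.2, 0.1]) == 1.0
```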

  7. Organised convection embedded in a large-scale flow

    Science.gov (United States)

    Naumann, Ann Kristin; Stevens, Bjorn; Hohenegger, Cathy

    2017-04-01

    In idealised simulations of radiative convective equilibrium, convection aggregates spontaneously from randomly distributed convective cells into organized mesoscale convection despite homogeneous boundary conditions. Although these simulations apply very idealised setups, the process of self-aggregation is thought to be relevant for the development of tropical convective systems. One feature that idealised simulations usually neglect is the occurrence of a large-scale background flow. In the tropics, organised convection is embedded in a large-scale circulation system, which advects convection in the along-wind direction and alters near-surface convergence in the convective areas. A large-scale flow also modifies the surface fluxes, which are expected to be enhanced upwind of the convective area if a large-scale flow is applied. Convective clusters that are embedded in a large-scale flow therefore experience an asymmetric component of the surface fluxes, which influences the development and the pathway of a convective cluster. In this study, we use numerical simulations with explicit convection and add a large-scale flow to the established setup of radiative convective equilibrium. We then analyse how aggregated convection evolves when being exposed to wind forcing. The simulations suggest that convective line structures are more prevalent if a large-scale flow is present and that convective clusters move considerably slower than advection by the large-scale flow would suggest. We also study the asymmetric component of convective aggregation due to enhanced surface fluxes, and discuss the pathway and speed of convective clusters as a function of the large-scale wind speed.

  8. Probabilistic cartography of the large-scale structure

    CERN Document Server

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin

    2015-01-01

    The BORG algorithm is an inference engine that derives the initial conditions given a cosmological model and galaxy survey data, and produces physical reconstructions of the underlying large-scale structure by assimilating the data into the model. We present the application of BORG to real galaxy catalogs and describe the primordial and late-time large-scale structure in the considered volumes. We then show how these results can be used for building various probabilistic maps of the large-scale structure, with rigorous propagation of uncertainties. In particular, we study dynamic cosmic web elements and secondary effects in the cosmic microwave background.

  9. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-based

  10. The theory of large-scale ocean circulation

    National Research Council Canada - National Science Library

    Samelson, R. M

    2011-01-01

    "This is a concise but comprehensive introduction to the basic elements of the theory of large-scale ocean circulation for advanced students and researchers"-- "Mounting evidence that human activities...

  11. Learning networks for sustainable, large-scale improvement.

    Science.gov (United States)

    McCannon, C Joseph; Perla, Rocco J

    2009-05-01

    Large-scale improvement efforts known as improvement networks offer structured opportunities for exchange of information and insights into the adaptation of clinical protocols to a variety of settings.

  12. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Document Server

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  13. An Evaluation Framework for Large-Scale Network Structures

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Knudsen, Thomas Phillip; Madsen, Ole Brun

    2004-01-01

    An evaluation framework for large-scale network structures is presented, which facilitates evaluations and comparisons of different physical network structures. A number of quantitative and qualitative parameters are presented, and their importance to networks discussed. Choosing a network...

  14. Modified gravity and large scale flows, a review

    Science.gov (United States)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  15. Some perspective on the Large Scale Scientific Computation Research

    Institute of Scientific and Technical Information of China (English)

    DU Qiang

    2004-01-01

    The "Large Scale Scientific Computation (LSSC) Research" project is one of the State Major Basic Research projects funded by the Chinese Ministry of Science and Technology in the field of information science and technology.

  17. PetroChina to Expand Dushanzi Refinery on Large Scale

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A large-scale expansion project for PetroChina Dushanzi Petrochemical Company has been given the green light, a move which will make it one of the largest refineries and petrochemical complexes in the country.

  18. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  19. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    small and large scale model tests show no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for waveheights below Hs = 0.5m as the water sunk into the voids between the stones on the crest. For low overtopping scale effects...... are presented as the small-scale model underpredicts the overtopping discharge....

  20. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses...... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems....

  1. Balancing modern Power System with large scale of wind power

    OpenAIRE

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact on the control of power system balance and thus deviations in the power system frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the s...

  2. A study of MLFMA for large-scale scattering problems

    Science.gov (United States)

    Hastriter, Michael Larkin

    This research is centered on computational electromagnetics with a focus on solving large-scale problems accurately in a timely fashion using first-principle physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied for parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high-frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through multiple computation method hybridization.

  3. Vector dissipativity theory for large-scale impulsive dynamical systems

    Directory of Open Access Journals (Sweden)

    Haddad Wassim M.

    2004-01-01

    Full Text Available Modern complex large-scale impulsive systems involve multiple modes of operation placing stringent demands on controller analysis of increasing complexity. In analyzing these large-scale systems, it is often desirable to treat the overall impulsive system as a collection of interconnected impulsive subsystems. Solution properties of the large-scale impulsive system are then deduced from the solution properties of the individual impulsive subsystems and the nature of the impulsive system interconnections. In this paper, we develop vector dissipativity theory for large-scale impulsive dynamical systems. Specifically, using vector storage functions and vector hybrid supply rates, dissipativity properties of the composite large-scale impulsive systems are shown to be determined from the dissipativity properties of the impulsive subsystems and their interconnections. Furthermore, extended Kalman-Yakubovich-Popov conditions, in terms of the impulsive subsystem dynamics and interconnection constraints, characterizing vector dissipativeness via vector system storage functions, are derived. Finally, these results are used to develop feedback interconnection stability results for large-scale impulsive dynamical systems using vector Lyapunov functions.

  4. Large-scale-vortex dynamos in planar rotating convection

    CERN Document Server

    Guervilly, Céline; Jones, Chris A

    2016-01-01

    Several recent studies have demonstrated how large-scale vortices may arise spontaneously in rotating planar convection. Here we examine the dynamo properties of such flows in rotating Boussinesq convection. For moderate values of the magnetic Reynolds number ($100 \\lesssim Rm \\lesssim 550$, with $Rm$ based on the box depth and the convective velocity), a large-scale (i.e. system-size) magnetic field is generated. The amplitude of the magnetic energy oscillates in time, out of phase with the oscillating amplitude of the large-scale vortex. The dynamo mechanism relies on those components of the flow that have length scales lying between that of the large-scale vortex and the typical convective cell size; smaller-scale flows are not required. The large-scale vortex plays a crucial role in the magnetic induction despite being essentially two-dimensional. For larger magnetic Reynolds numbers, the dynamo is small scale, with a magnetic energy spectrum that peaks at the scale of the convective cells. In this case, ...

  5. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises to prepare hard rocks for excavation a drilling and blasting method is used. With the approach of mining operations to settlements the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts the scientific staff of Siberian State Industrial University carried out expertise for coal mines and iron ore enterprises. Determination of the magnitude of surface seismic vibrations caused by mass explosions was performed using seismic receivers, an analog-digital converter with recording on a laptop. The registration results of surface seismic vibrations during production of more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth’s surface vibrations are determined. The safety evaluation of seismic effect was carried out according to the permissible value of vibration velocity. For cases with exceedance of permissible values recommendations were developed to reduce the level of seismic impact.

  6. Acoustic Studies of the Large Scale Ocean Circulation

    Science.gov (United States)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  7. Human pescadillo induces large-scale chromatin unfolding

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hao; FANG Yan; HUANG Cuifen; YANG Xiao; YE Qinong

    2005-01-01

    The human pescadillo gene encodes a protein with a BRCT domain. Pescadillo plays an important role in DNA synthesis, cell proliferation and transformation. Since BRCT domains have been shown to induce chromatin large-scale unfolding, we tested the role of Pescadillo in regulation of large-scale chromatin unfolding. To this end, we isolated the coding region of Pescadillo from human mammary MCF10A cells. Compared with the reported sequence, the isolated Pescadillo contains in-frame deletion from amino acid 580 to 582. Targeting the Pescadillo to an amplified, lac operator-containing chromosome region in the mammalian genome results in large-scale chromatin decondensation. This unfolding activity maps to the BRCT domain of Pescadillo. These data provide a new clue to understanding the vital role of Pescadillo.

  8. Transport of Large Scale Poloidal Flux in Black Hole Accretion

    CERN Document Server

    Beckwith, Kris; Krolik, Julian H

    2009-01-01

We perform a global, three-dimensional GRMHD simulation of an accretion torus embedded in a large scale vertical magnetic field orbiting a Schwarzschild black hole. This simulation investigates how a large scale vertical field evolves within a turbulent accretion disk and whether global magnetic field configurations suitable for launching jets and winds can develop. We identify a "coronal mechanism" of magnetic flux motion, which dominates the global flux evolution. In this coronal mechanism, magnetic stresses driven by orbital shear create large-scale half-loops of magnetic field that stretch radially inward and then reconnect, leading to discontinuous jumps in the location of magnetic flux. This mechanism is supplemented by a smaller amount of flux advection in the accretion flow proper. Because the black hole in this case does not rotate, the magnetic flux on the horizon determines the mean magnetic field strength in the funnel around the disk axis; this field strength is regulated by a combination of th...

  9. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data...... is very consistent, while the WMAP 9 year release appears more contaminated by non-CMB residuals than the 7 year release. The second part is concerned with the anomalies of the CMB from two approaches. One is based on an extended inflationary model as the origin of one specific large scale anomaly, namely....... Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole and the dipolemodulation....

  10. Large Scale Magnetohydrodynamic Dynamos from Cylindrical Differentially Rotating Flows

    CERN Document Server

    Ebrahimi, F

    2015-01-01

    For cylindrical differentially rotating plasmas threaded with a uniform vertical magnetic field, we study large-scale magnetic field generation from finite amplitude perturbations using analytic theory and direct numerical simulations. Analytically, we impose helical fluctuations, a seed field, and a background flow and use quasi-linear theory for a single mode. The predicted large-scale field growth agrees with numerical simulations in which the magnetorotational instability (MRI) arises naturally. The vertically and azimuthally averaged toroidal field is generated by a fluctuation-induced EMF that depends on differential rotation. Given fluctuations, the method also predicts large-scale field growth for MRI-stable rotation profiles and flows with no rotation but shear.

  11. A relativistic signature in large-scale structure

    Science.gov (United States)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales-even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  12. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large propor-tion may impact on the control of power system balance and thus deviations in the power system...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with the large scale wind power integration, where power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the secure and stable grid operation. To ensure the stable power system operation, the evolving power system has...

  14. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  15. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Morteza Hajizadeh-Oghaz; Reza Shoja Razavi; Mohammadreza Loghman Estarki

    2014-08-01

Yttria-stabilized zirconia nanopowders were synthesized on a relatively large scale using the Pechini method. In the present paper, nearly spherical yttria-stabilized zirconia nanopowders with tetragonal structure were synthesized by the Pechini process from zirconium oxynitrate hexahydrate, yttrium nitrate, citric acid and ethylene glycol. The phase and structural analyses were accomplished by X-ray diffraction; morphological analysis was carried out by field emission scanning electron microscopy and transmission electron microscopy. The results revealed nearly spherical yttria-stabilized zirconia powder with a tetragonal crystal structure and a chemical purity of 99.1%, as determined by inductively coupled plasma optical emission spectroscopy, on a large scale.

  16. Practical Large Scale Syntheses of New Drug Candidates

    Institute of Scientific and Technical Information of China (English)

    Hui-Yin; Li

    2001-01-01

This presentation focuses on practical large-scale syntheses of lead compounds and drug candidates from three major therapeutic areas at the DuPont Pharmaceuticals Research Laboratory: 1) DMP777, a selective, non-toxic, orally active human elastase inhibitor; 2) DMP754, a potent glycoprotein IIb/IIIa antagonist; 3) R-warfarin, the pure enantiomeric form of warfarin. The key technology used to prepare these drug candidates is asymmetric hydrogenation under very mild reaction conditions, which produced very high-quality final products at large scale (>99% de, >99 A% and >99 wt%). Some practical and GMP aspects of process development will also be discussed.

  17. Statistical equilibria of large scales in dissipative hydrodynamic turbulence

    CERN Document Server

    Dallas, Vassilios; Alexakis, Alexandros

    2015-01-01

    We present a numerical study of the statistical properties of three-dimensional dissipative turbulent flows at scales larger than the forcing scale. Our results indicate that the large scale flow can be described to a large degree by the truncated Euler equations with the predictions of the zero flux solutions given by absolute equilibrium theory, both for helical and non-helical flows. Thus, the functional shape of the large scale spectra can be predicted provided that scales sufficiently larger than the forcing length scale but also sufficiently smaller than the box size are examined. Deviations from the predictions of absolute equilibrium are discussed.

  18. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

Full Text Available The paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC Marc/Mentat and its fatigue strain is analyzed; the flange fatigue load case is simulated with Bladed software, and the flange fatigue load spectrum is obtained with the rain-flow counting method. Finally, fatigue analysis of the top flange is carried out with the fatigue analysis software MSC Fatigue and Palmgren-Miner linear cumulative damage theory. The results provide a new approach for flange fatigue analysis of large-scale wind turbine generators and have practical engineering value.
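The Palmgren-Miner rule cited in this abstract is simple to state: the damage contributions n_i/N_i of the counted load cycles are summed, and failure is predicted when the sum reaches one. A minimal sketch (the S-N curve parameters and cycle counts below are illustrative, not taken from the paper):

```python
def miner_damage(cycles, sn_curve):
    """Palmgren-Miner linear damage sum: D = sum(n_i / N_i).

    cycles   -- list of (stress_amplitude, n_applied) pairs,
                e.g. from a rain-flow count of the load history
    sn_curve -- callable mapping stress amplitude to cycles-to-failure N
    """
    return sum(n / sn_curve(s) for s, n in cycles)

# Illustrative Basquin-type S-N curve N = C / S**m (made-up parameters)
def sn_curve(s, C=1e12, m=3.0):
    return C / s ** m

# Three hypothetical load levels from a rain-flow count
cycles = [(100.0, 1e4), (200.0, 1e3), (400.0, 1e2)]
D = miner_damage(cycles, sn_curve)
print(D)  # failure is predicted when D >= 1
```

Note how the highest amplitude dominates the damage sum even though it contributes the fewest cycles, because N drops with the m-th power of stress.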

  19. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  20. [Issues of large scale tissue culture of medicinal plant].

    Science.gov (United States)

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzes the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, problems remain, such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a sound evaluation system according to the characteristics of each medicinal plant is the key measure to ensure the sustainable development of large-scale tissue culture of medicinal plants.

  1. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    Generation expansion planning (GEP) is the problem of finding the optimal strategy to plan the Construction of new generation while satisfying technical and economical constraints. In the deregulated and competitive environment, large-scale integration of wind generation (WG) in power system has...... necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is investment decision, while the second phase is production...

  2. The fractal octahedron network of the large scale structure

    CERN Document Server

    Battaner, E

    1998-01-01

    In a previous article, we have proposed that the large scale structure network generated by large scale magnetic fields could consist of a network of octahedra only contacting at their vertexes. Assuming such a network could arise at different scales producing a fractal geometry, we study here its properties, and in particular how a sub-octahedron network can be inserted within an octahedron of the large network. We deduce that the scale of the fractal structure would range from $\\approx$100 Mpc, i.e. the scale of the deepest surveys, down to about 10 Mpc, as other smaller scale magnetic fields were probably destroyed in the radiation dominated Universe.

  3. Large-scale streaming motions and microwave background anisotropies

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Gonzalez, E.; Sanz, J.L. (Cantabria Universidad, Santander (Spain))

    1989-12-01

    The minimal microwave background radiation is calculated on each angular scale implied by the existence of large-scale streaming motions. These minimal anisotropies, due to the Sachs-Wolfe effect, are obtained for different experiments, and give quite different results from those found in previous work. They are not in conflict with present theories of galaxy formation. Upper limits are imposed on the scale at which large-scale streaming motions can occur by extrapolating results from present double-beam-switching experiments. 17 refs.

  4. Distributed chaos tuned to large scale coherent motions in turbulence

    CERN Document Server

    Bershadskii, A

    2016-01-01

It is shown, using direct numerical simulations and laboratory experiment data, that distributed chaos is often tuned to large scale coherent motions in anisotropic inhomogeneous turbulence. The examples considered are: fully developed turbulent boundary layer (range of coherence: $14 < y^{+} < 80$), turbulent thermal convection (in a horizontal cylinder), and Couette-Taylor flow. Two ways of tuning are described: one via the fundamental frequency (wavenumber) and another via a subharmonic (period doubling). In the second way the large scale coherent motions are a natural component of distributed chaos. In all considered cases spontaneous breaking of space translational symmetry is accompanied by reflexional symmetry breaking.

  5. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  6. Large-scale liquid scintillation detectors for solar neutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Benziger, Jay B.; Calaprice, Frank P. [Princeton University Princeton, Princeton, NJ (United States)

    2016-04-15

    Large-scale liquid scintillation detectors are capable of providing spectral yields of the low energy solar neutrinos. These detectors require > 100 tons of liquid scintillator with high optical and radiopurity. In this paper requirements for low-energy neutrino detection by liquid scintillation are specified and the procedures to achieve low backgrounds in large-scale liquid scintillation detectors for solar neutrinos are reviewed. The designs, operations and achievements of Borexino, KamLAND and SNO+ in measuring the low-energy solar neutrino fluxes are reviewed. (orig.)

  7. Optimal Dispatching of Large-scale Water Supply System

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

This paper deals with the use of optimal control techniques in large-scale water distribution networks. According to the network characteristics and the actual state of the water supply system in China, an implicit model, which may be solved using the hierarchical optimization method, is established. In particular, based on analyses of a water supply system containing variable-speed pumps, a software tool has been developed successfully. The application of this model to the city of Shenyang (China) is compared to an experience-based strategy. The results of this study show that the developed model is a very promising optimization method for controlling large-scale water supply systems.

  8. Reliability Evaluation considering Structures of a Large Scale Wind Farm

    DEFF Research Database (Denmark)

    Shin, Je-Seok; Cha, Seung-Tae; Wu, Qiuwei

    2012-01-01

    evaluation on wind farm is necessarily required. Also, because large scale offshore wind farm has a long repair time and a high repair cost as well as a high investment cost, it is essential to take into account the economic aspect. One of methods to efficiently build and to operate wind farm is to construct......Wind energy is one of the most widely used renewable energy resources. Wind power has been connected to the grid as large scale wind farm which is made up of dozens of wind turbines, and the scale of wind farm is more increased recently. Due to intermittent and variable wind source, reliability...

  10. Fast paths in large-scale dynamic road networks

    CERN Document Server

    Nannicini, Giacomo; Barbier, Gilles; Krob, Daniel; Liberti, Leo

    2007-01-01

    Efficiently computing fast paths in large scale dynamic road networks (where dynamic traffic information is known over a part of the network) is a practical problem faced by several traffic information service providers who wish to offer a realistic fast path computation to GPS terminal enabled vehicles. The heuristic solution method we propose is based on a highway hierarchy-based shortest path algorithm for static large-scale networks; we maintain a static highway hierarchy and perform each query on the dynamically evaluated network.
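The query phase described above ultimately answers point-to-point shortest-path questions on a graph whose edge weights change with traffic updates. As a baseline illustration only (plain Dijkstra, not the paper's highway-hierarchy algorithm), a sketch:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest travel time from source to target.
    graph: {node: [(neighbor, travel_time), ...]}"""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # target unreachable

# Static weights, then a dynamic update simulating congestion on A->B
g = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0)], "C": []}
print(dijkstra(g, "A", "C"))  # 3.0, via B
g["A"][0] = ("B", 9.0)        # traffic update arrives
print(dijkstra(g, "A", "C"))  # 5.0, direct edge now faster
```

Hierarchy-based methods precompute structure so that each query explores far fewer nodes than this exhaustive search; the dynamic difficulty is that weight updates can invalidate the precomputation.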

  11. Large-scale growth of the Plasmodium falciparum malaria parasite in a wave bioreactor.

    Science.gov (United States)

    Dalton, John P; Demanga, Corine G; Reiling, Sarah J; Wunderlich, Juliane; Eng, Jenny W L; Rohrbach, Petra

    2012-01-01

We describe methods for the large-scale in vitro culturing of synchronous and asynchronous blood-stage Plasmodium falciparum parasites in sterile disposable plastic bioreactors controlled by wave-induced motion (wave bioreactor). These cultures perform better than static flask cultures in terms of preserving parasite cell cycle synchronicity and reducing the number of multiply-infected erythrocytes. The straightforward methods described here will facilitate the large-scale production of malaria parasites for antigen and organelle isolation and characterisation, for the high-throughput screening of compound libraries with whole cells or extracts, and for the development of live- or whole-cell malaria vaccines under good manufacturing practice compliant standards. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  12. Spectator Higgs, large-scale gauge fields and the non-minimal coupling to gravity

    CERN Document Server

    Giovannini, Massimo

    2016-01-01

    Even if the Higgs field does not affect the evolution of the background geometry, its massive inhomogeneities induce large-scale gauge fields whose energy density depends on the slow-roll parameters, on the effective scalar mass and, last but not least, on the dimensionless coupling to the space-time curvature. Since the non-Abelian gauge modes are screened, the non-minimal coupling to gravity predominantly affects the evolution of the hypercharge and electromagnetic fields. While in the case of minimal coupling the obtained constraints are immaterial, as soon as the coupling increases beyond one fourth the produced fields become overcritical. We chart the whole parameter space of this qualitatively new set of bounds. Whenever the limits on the curvature coupling are enforced, the magnetic field may still be partially relevant for large-scale magnetogenesis and exceed $10^{-20}$ G for the benchmark scale of the protogalactic collapse.

  13. GroFi: Large-scale fiber placement research facility

    Directory of Open Access Journals (Sweden)

    Christian Krombholz

    2016-03-01

and processes for large-scale composite components. Due to the use of coordinated and simultaneously working layup units, a high flexibility of the research platform is achieved. This allows the investigation of new materials, technologies and processes on both small coupons and large components such as wing covers or fuselage skins.

  14. Flexibility in design of large-scale methanol plants

    Institute of Scientific and Technical Information of China (English)

Esben Lauge Sørensen; Helge Holm-Larsen; Haldor Topsøe A/S

    2006-01-01

    This paper presents a cost effective design for large-scale methanol production. It is demonstrated how recent technological progress can be utilised to design a methanol plant,which is inexpensive and easy to operate, while at the same time very robust towards variations in feed-stock composition and product specifications.

  15. Large-scale search for dark-matter axions

    Energy Technology Data Exchange (ETDEWEB)

    Hagmann, C.A., LLNL; Kinion, D.; Stoeffl, W.; Van Bibber, K.; Daw, E.J. [Massachusetts Inst. of Tech., Cambridge, MA (United States); McBride, J. [Massachusetts Inst. of Tech., Cambridge, MA (United States); Peng, H. [Massachusetts Inst. of Tech., Cambridge, MA (United States); Rosenberg, L.J. [Massachusetts Inst. of Tech., Cambridge, MA (United States); Xin, H. [Massachusetts Inst. of Tech., Cambridge, MA (United States); Laveigne, J. [Florida Univ., Gainesville, FL (United States); Sikivie, P. [Florida Univ., Gainesville, FL (United States); Sullivan, N.S. [Florida Univ., Gainesville, FL (United States); Tanner, D.B. [Florida Univ., Gainesville, FL (United States); Moltz, D.M. [Lawrence Berkeley Lab., CA (United States); Powell, J. [Lawrence Berkeley Lab., CA (United States); Clarke, J. [Lawrence Berkeley Lab., CA (United States); Nezrick, F.A. [Fermi National Accelerator Lab., Batavia, IL (United States); Turner, M.S. [Fermi National Accelerator Lab., Batavia, IL (United States); Golubev, N.A. [Russian Academy of Sciences, Moscow (Russia); Kravchuk, L.V. [Russian Academy of Sciences, Moscow (Russia)

    1998-01-01

    Early results from a large-scale search for dark matter axions are presented. In this experiment, axions constituting our dark-matter halo may be resonantly converted to monochromatic microwave photons in a high-Q microwave cavity permeated by a strong magnetic field. Sensitivity at the level of one important axion model (KSVZ) has been demonstrated.

  16. Temporal Variation of Large Scale Flows in the Solar Interior

    Indian Academy of Sciences (India)

    Sarbani Basu; H. M. Antia

    2000-09-01

    We attempt to detect short-term temporal variations in the rotation rate and other large scale velocity fields in the outer part of the solar convection zone using the ring diagram technique applied to Michelson Doppler Imager (MDI) data. The measured velocity field shows variations by about 10 m/s on the scale of few days.

  17. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the homogeniza

  18. Quantized pressure control in large-scale nonlinear hydraulic networks

    NARCIS (Netherlands)

    Persis, Claudio De; Kallesøe, Carsten Skovmose; Jensen, Tom Nørgaard

    2010-01-01

    It was shown previously that semi-global practical pressure regulation at designated points of a large-scale nonlinear hydraulic network is guaranteed by distributed proportional controllers. For a correct implementation of the control laws, each controller, which is located at these designated poin

  19. Main Achievements of Cotton Large-scale Transformation System

    Institute of Scientific and Technical Information of China (English)

    LI Fu-guang; LIU Chuan-liang; WU Zhi-xia; ZHANG Chao-jun; ZHANG Xue-yan

    2008-01-01

A large-scale cotton transformation system was established based on innovations in cotton transformation methods. It obtains 8000 transgenic cotton plants per year by efficiently combining Agrobacterium tumefaciens-mediated, pollen-tube pathway and biolistic methods. More than 1000 transgenic lines are selected from the transgenic plants with molecular marker-assisted breeding and conventional breeding methods.

  20. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated...
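The idea of segmentation as outlier detection can be illustrated with a toy model: fit a background distribution to the data and flag values whose tail probability falls below a chosen significance level. A sketch assuming a Gaussian background (purely illustrative; the paper's threshold-selection procedure is more involved):

```python
import math

def segment_outliers(values, alpha=0.01):
    """Flag values unlikely under a Gaussian background model.

    Returns a boolean mask: True = segment (outlier), False = background.
    For simplicity the background mean/sd are fit on all values.
    """
    n = len(values)
    mu = sum(values) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in values) / n)

    def p_two_sided(v):
        # two-sided Gaussian tail probability via the complementary error function
        z = abs(v - mu) / sd
        return math.erfc(z / math.sqrt(2.0))

    return [p_two_sided(v) < alpha for v in values]

background = [0.0, 0.1, -0.1, 0.05, -0.05, 0.02, -0.02, 0.08]
signal = background + [5.0]        # one bright outlier "pixel"
mask = segment_outliers(signal)
print(mask)                         # only the last entry is flagged
```

In an image setting the same test runs per pixel against a locally estimated background, which is why a principled, data-driven choice of alpha (the hypothesis-testing threshold) matters.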

  1. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  2. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  3. Regeneration and propagation of reed grass for large-scale ...

    African Journals Online (AJOL)

    전서범

    2012-01-26

    Jan 26, 2012 ... containing different sucrose concentrations; this experiment found that 60 g L-1 ... All these uses of reeds require the large-scale rege- ... numbers of plant in a small space within a short time ... callus stock and grown in vitro were used in this study. .... presence of 4-FA were converted to friable and light-.

  4. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation...
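The scheme sketched in this abstract can be illustrated on a toy problem: each unit minimizes its own cost plus a price on consumption, and a coordinator updates the price by subgradient steps until total consumption matches a target. A sketch with made-up quadratic costs (not the paper's model):

```python
def local_response(a, r, price):
    """Each unit minimizes 0.5*a*(u - r)**2 + price*u  =>  u = r - price/a."""
    return r - price / a

def balance(a, r, target, step=0.5, iters=200):
    """Subgradient ascent on the dual variable (the price) to
    enforce the coupling constraint sum(u) == target."""
    price = 0.0
    for _ in range(iters):
        u = [local_response(ai, ri, price) for ai, ri in zip(a, r)]
        price += step * (sum(u) - target)  # raise price if consumption exceeds target
    return price, u

a = [1.0, 2.0, 4.0]   # local cost curvatures (illustrative)
r = [3.0, 2.0, 1.0]   # preferred consumptions of the three units
price, u = balance(a, r, target=4.0)
print(round(sum(u), 6))  # converges to the 4.0 target
```

The appeal of the decomposition is that each unit only ever sees the price, never the other units' costs, so the subproblems can be solved in parallel across the grid.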

  5. Large-Scale Assessment and English Language Learners with Disabilities

    Science.gov (United States)

    Liu, Kristin K.; Ward, Jenna M.; Thurlow, Martha L.; Christensen, Laurene L.

    2017-01-01

    This article highlights a set of principles and guidelines, developed by a diverse group of specialists in the field, for appropriately including English language learners (ELLs) with disabilities in large-scale assessments. ELLs with disabilities make up roughly 9% of the rapidly increasing ELL population nationwide. In spite of the small overall…

  6. Large scale radial stability density of Hill's equation

    NARCIS (Netherlands)

    Broer, Henk; Levi, Mark; Simo, Carles

    2013-01-01

    This paper deals with large scale aspects of Hill's equation (sic) + (a + bp(t)) x = 0, where p is periodic with a fixed period. In particular, the interest is the asymptotic radial density of the stability domain in the (a, b)-plane. It turns out that this density changes discontinuously in a certa
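A standard way to probe the stability domain of Hill's equation in the (a, b)-plane numerically is to integrate two basis solutions over one period and examine the trace of the monodromy matrix. The sketch below assumes p(t) = cos t (the Mathieu case) as the periodic forcing; it is an illustration, not the paper's method:

```python
import math

def monodromy_trace(a, b, steps=2000):
    """Trace of the monodromy matrix of x'' + (a + b*cos t) x = 0 over
    one period 2*pi, via classical RK4. |trace| < 2 means stability."""
    h = 2 * math.pi / steps

    def f(t, y):
        x, v = y
        return (v, -(a + b * math.cos(t)) * x)

    def propagate(y):
        t = 0.0
        for _ in range(steps):
            k1 = f(t, y)
            k2 = f(t + h / 2, (y[0] + h / 2 * k1[0], y[1] + h / 2 * k1[1]))
            k3 = f(t + h / 2, (y[0] + h / 2 * k2[0], y[1] + h / 2 * k2[1]))
            k4 = f(t + h, (y[0] + h * k3[0], y[1] + h * k3[1]))
            y = (y[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
                 y[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))
            t += h
        return y

    x1 = propagate((1.0, 0.0))  # solution with x(0)=1, x'(0)=0
    x2 = propagate((0.0, 1.0))  # solution with x(0)=0, x'(0)=1
    return x1[0] + x2[1]

# sanity check: for b = 0 the trace reduces to 2*cos(2*pi*sqrt(a))
```

Sampling this trace on a grid of (a, b) values and counting where |trace| < 2 gives an empirical estimate of the radial stability density studied in the paper.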

  7. Water Implications of Large-Scale Land Acquisitions in Ghana

    Directory of Open Access Journals (Sweden)

    Timothy Olalekan Williams

    2012-06-01

    The paper offers recommendations which can help the government to achieve its stated objective of developing a "policy framework and guidelines for large-scale land acquisitions by both local and foreign investors for biofuels that will protect the interests of investors and the welfare of Ghanaian farmers and landowners".

  8. Evaluating Large-scale National Public Management Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Morten Balle

    This article explores differences and similarities between two evaluations of large-scale administrative reforms which were carried out in the 2000s: The evaluation of the Norwegian NAV reform (EVANAV) and the evaluation of the Danish Local Government Reform (LGR). We provide a comparative analys...

  9. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena i

  10. Main Achievements of Cotton Large-scale Transformation System

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Cotton large-scale transformation methods system was established based on innovation of cotton transformation methods.It obtains 8000 transgenic cotton plants per year by combining Agrobacterium tumefaciens-mediated,pollen-tube pathway and biolistic methods together efficiently.More than

  11. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  12. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  13. Cost Overruns in Large-scale Transportation Infrastructure Projects

    DEFF Research Database (Denmark)

    Cantarelli, Chantal C; Flyvbjerg, Bent; Molin, Eric J. E

    2010-01-01

    Managing large-scale transportation infrastructure projects is difficult due to frequent misinformation about the costs which results in large cost overruns that often threaten the overall project viability. This paper investigates the explanations for cost overruns that are given in the literature...

  14. Ultra-Large-Scale Systems: Scale Changes Everything

    Science.gov (United States)

    2008-03-06

Presentation excerpts: statistical mechanics and complexity; networks are everywhere, with a recurring "scale-free" structure (e.g., the Internet and yeast protein networks) and analogous dynamics; design, design representation and analysis, assimilation, and determining and managing requirements. (Ultra-Large-Scale Systems, Linda Northrop, March 2008.)

  15. The Role of Plausible Values in Large-Scale Surveys

    Science.gov (United States)

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1) address…
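A common use of plausible values is to combine a statistic across the M imputations with Rubin's rules. The sketch below (toy numbers, not NAEP/TIMSS/PISA data) estimates a mean and its imputation-adjusted variance:

```python
import statistics

def combine_plausible_values(pv_sets):
    """Rubin's rules for M sets of plausible values: pv_sets[m][j] is
    the m-th plausible value drawn for student j."""
    M = len(pv_sets)
    means = [statistics.fmean(pv) for pv in pv_sets]
    estimate = statistics.fmean(means)        # combined point estimate
    # average sampling variance of the mean within each imputed data set
    within = statistics.fmean(statistics.variance(pv) / len(pv)
                              for pv in pv_sets)
    between = statistics.variance(means)      # variance across imputations
    total_var = within + (1 + 1 / M) * between
    return estimate, total_var

# five plausible values per student, three students (toy numbers)
pvs = [[10.1, 9.8, 10.4], [10.0, 9.9, 10.6],
       [10.2, 9.7, 10.3], [9.9, 10.0, 10.5], [10.1, 9.9, 10.2]]
est, var = combine_plausible_values(pvs)
```

The between-imputation term is what captures the measurement uncertainty that a single point estimate per student would hide.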

  16. Large-scale data analysis using the Wigner function

    Science.gov (United States)

    Earnshaw, R. A.; Lei, C.; Li, J.; Mugassabi, S.; Vourdas, A.

    2012-04-01

    Large-scale data are analysed using the Wigner function. It is shown that the 'frequency variable' provides important information, which is lost with other techniques. The method is applied to 'sentiment analysis' in data from social networks and also to financial data.

  17. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  18. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  19. High-Throughput, Large-Scale SNP Genotyping: Bioinformatics Considerations

    OpenAIRE

    Margetic, Nino

    2004-01-01

    In order to provide a high-throughput, large-scale genotyping facility at the national level we have developed a set of inter-dependent information systems. A combination of commercial, publicly-available and in-house developed tools links a series of data repositories based both on flat files and relational databases providing an almost complete semi-automated pipeline.

  20. Chain Analysis for large-scale Communication systems

    NARCIS (Netherlands)

    Grijpink, Jan

    2010-01-01

    The chain concept is introduced to explain how large-scale information infrastructures so often fail and sometimes even backfire. Next, the assessment framework of the doctrine of Chain-computerisation and its chain analysis procedure are outlined. In this procedure chain description precedes assess

  1. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  2. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  3. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach...

  4. How large-scale subsidence affects stratocumulus transitions (discussion paper)

    NARCIS (Netherlands)

    Van der Dussen, J.J.; De Roode, S.R.; Siebesma, A.P.

    2015-01-01

    Some climate modeling results suggest that the Hadley circulation might weaken in a future climate, causing a subsequent reduction in the large-scale subsidence velocity in the subtropics. In this study we analyze the cloud liquid water path (LWP) budget from large-eddy simulation (LES) results of

  5. Large-Scale Innovation and Change in UK Higher Education

    Science.gov (United States)

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  6. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    Science.gov (United States)

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  7. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena i

  8. Large-Scale Innovation and Change in UK Higher Education

    Science.gov (United States)

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  9. Primordial non-Gaussianity from the large scale structure

    CERN Document Server

    Desjacques, Vincent

    2010-01-01

    Primordial non-Gaussianity is a potentially powerful discriminant of the physical mechanisms that generated the cosmological fluctuations observed today. Any detection of non-Gaussianity would have profound implications for our understanding of cosmic structure formation. In this paper, we review past and current efforts in the search for primordial non-Gaussianity in the large scale structure of the Universe.

  10. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured...

  11. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...

  12. Large scale solar district heating. Evaluation, modelling and designing - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The appendices present the following: A) Cad-drawing of the Marstal CSHP design. B) Key values - large-scale solar heating in Denmark. C) Monitoring - a system description. D) WMO-classification of pyranometers (solarimeters). E) The computer simulation model in TRNSYS. F) Selected papers from the author. (EHS)

  13. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  14. Large-scale flow generation by inhomogeneous helicity

    CERN Document Server

    Yokoi, Nobumitsu

    2015-01-01

The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (a pseudoscalar) enters the Reynolds stress (a mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (an axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with non-uniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of hom...

  15. Series Design of Large-Scale NC Machine Tool

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi

    2007-01-01

    Product system design is a mature concept in western developed countries. It has been applied in war industry during the last century. However, up until now, functional combination is still the main method for product system design in China. Therefore, in terms of a concept of product generation and product interaction we are in a weak position compared with the requirements of global markets. Today, the idea of serial product design has attracted much attention in the design field and the definition of product generation as well as its parameters has already become the standard in serial product designs. Although the design of a large-scale NC machine tool is complicated, it can be further optimized by the precise exercise of object design by placing the concept of platform establishment firmly into serial product design. The essence of a serial product design has been demonstrated by the design process of a large-scale NC machine tool.

  16. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

Research on the evaluation of large-scale public sector reforms is rare. This article sets out to fill that gap in the evaluation literature and argues that it is of vital importance. The impact of such reforms is considerable. Furthermore, they change the context in which evaluations of other and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We compare the evaluation process (focus and purpose), the evaluators and the organization of the evaluation as well as the utilization of the evaluation results. The analysis uncovers several significant findings, including how the initial organization of the evaluation shows a strong impact on the utilization...

  17. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    CERN Document Server

    Blackman, Eric G

    2014-01-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. H...

  18. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    According to the complex real-time water situation, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposed a robust, two-dimensional, shallow water model based on the unstructured Godunov- type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.

  19. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  20. Constraining cosmological ultra-large scale structure using numerical relativity

    CERN Document Server

    Braden, Jonathan; Peiris, Hiranya V; Aguirre, Anthony

    2016-01-01

    Cosmic inflation, a period of accelerated expansion in the early universe, can give rise to large amplitude ultra-large scale inhomogeneities on distance scales comparable to or larger than the observable universe. The cosmic microwave background (CMB) anisotropy on the largest angular scales is sensitive to such inhomogeneities and can be used to constrain the presence of ultra-large scale structure (ULSS). We numerically evolve nonlinear inhomogeneities present at the beginning of inflation in full General Relativity to assess the CMB quadrupole constraint on the amplitude of the initial fluctuations and the size of the observable universe relative to a length scale characterizing the ULSS. To obtain a statistically significant number of simulations, we adopt a toy model in which inhomogeneities are injected along a preferred direction. We compute the likelihood function for the CMB quadrupole including both ULSS and the standard quantum fluctuations produced during inflation. We compute the posterior given...

  1. Ultra-large scale cosmology with next-generation experiments

    CERN Document Server

    Alonso, David; Ferreira, Pedro G; Maartens, Roy; Santos, Mario G

    2015-01-01

    Future surveys of large-scale structure will be able to measure perturbations on the scale of the cosmological horizon, and so could potentially probe a number of novel relativistic effects that are negligibly small on sub-horizon scales. These effects leave distinctive signatures in the power spectra of clustering observables and, if measurable, would open a new window on relativistic cosmology. We quantify the size and detectability of the effects for a range of future large-scale structure surveys: spectroscopic and photometric galaxy redshift surveys, intensity mapping surveys of neutral hydrogen, and continuum surveys of radio galaxies. Our forecasts show that next-generation experiments, reaching out to redshifts z ~ 4, will not be able to detect previously-undetected general-relativistic effects from the single-tracer power spectra alone, although they may be able to measure the lensing magnification in the auto-correlation. We also perform a rigorous joint forecast for the detection of primordial non-...

  2. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
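The random-access bottleneck can be illustrated with a toy Monte Carlo model in which each device independently picks one of k preambles and colliding picks fail. The device count and the 54 contention preambles below are assumptions for illustration, not figures from the article:

```python
import random
from collections import Counter

def collision_fraction(n_devices, n_preambles, trials=20000, seed=1):
    """Monte Carlo estimate of the fraction of devices whose random
    preamble choice collides in a single random-access opportunity."""
    rng = random.Random(seed)
    collided = 0
    for _ in range(trials):
        picks = Counter(rng.randrange(n_preambles) for _ in range(n_devices))
        # every device that shares its preamble with another one collides
        collided += sum(c for c in picks.values() if c > 1)
    return collided / (trials * n_devices)

# 30 contending devices, 54 contention-based preambles (assumed numbers)
rate = collision_fraction(30, 54)
# closed form for comparison: a device collides with prob 1 - (1 - 1/k)**(n - 1)
```

Even with modest contention the collision fraction is substantial, which is the scaling problem the article highlights.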

  3. Supermassive black holes, large scale structure and holography

    CERN Document Server

    Mongan, T R

    2013-01-01

A holographic analysis of large scale structure in the universe estimates the mass of supermassive black holes at the center of large scale structures with matter density varying inversely as the square of the distance from their center. The estimate is consistent with two important test cases involving observations of the supermassive black hole with mass 3.6×10⁻⁶ times the galactic mass in Sagittarius A* near the center of our Milky Way and the 2×10⁹ solar mass black hole in the quasar ULAS J112001.48+064124.3 at redshift z = 7.085. It is also consistent with upper bounds on central black hole masses in globular clusters M15, M19 and M22 developed using the Jansky Very Large Array in New Mexico.

  4. Optimization of Survivability Analysis for Large-Scale Engineering Networks

    CERN Document Server

    Poroseva, S V

    2012-01-01

    Engineering networks fall into the category of large-scale networks with heterogeneous nodes such as sources and sinks. The survivability analysis of such networks requires the analysis of the connectivity of the network components for every possible combination of faults to determine a network response to each combination of faults. From the computational complexity point of view, the problem belongs to the class of exponential time problems at least. Partially, the problem complexity can be reduced by mapping the initial topology of a complex large-scale network with multiple sources and multiple sinks onto a set of smaller sub-topologies with multiple sources and a single sink connected to the network of sources by a single link. In this paper, the mapping procedure is applied to the Florida power grid.

  5. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    Full Text Available This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  6. Distant galaxy clusters in the XMM Large Scale Structure survey

    CERN Document Server

    Willis, J P; Bremer, M N; Pierre, M; Adami, C; Ilbert, O; Maughan, B; Maurogordato, S; Pacaud, F; Valtchanov, I; Chiappetti, L; Thanjavur, K; Gwyn, S; Stanway, E R; Winkworth, C

    2012-01-01

    (Abridged) Distant galaxy clusters provide important tests of the growth of large scale structure in addition to highlighting the process of galaxy evolution in a consistently defined environment at large look back time. We present a sample of 22 distant (z>0.8) galaxy clusters and cluster candidates selected from the 9 deg2 footprint of the overlapping X-ray Multi Mirror (XMM) Large Scale Structure (LSS), CFHTLS Wide and Spitzer SWIRE surveys. Clusters are selected as extended X-ray sources with an accompanying overdensity of galaxies displaying optical to mid-infrared photometry consistent with z>0.8. Nine clusters have confirmed spectroscopic redshifts in the interval 0.80.8 clusters.

  7. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    Science.gov (United States)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a sub-orbital flight of a 3.0 m diameter vehicle. To further this technology, large-scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large-scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  8. The complexity nature of large-scale software systems

    Institute of Scientific and Technical Information of China (English)

    Yan Dong; Qi Guo-Ning; Gu Xin-Jian

    2006-01-01

In software engineering, class diagrams are often used to describe a system's class structure in the Unified Modelling Language (UML). A class diagram, as a graph, is a collection of static declarative model elements, such as classes, interfaces, and the relationships of their connections with each other. In this paper, class graphs are examined within several Java software systems provided by Sun and IBM, and some new features are found. For a large-scale Java software system, its in-degree distribution tends to an exponential distribution, while its out-degree and degree distributions reveal power-law behaviour. A directed preferential-random model is then established to describe the corresponding degree-distribution features and to evolve large-scale Java software systems.
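Extracting the in- and out-degree distributions examined in such studies from a dependency edge list is straightforward; the sketch below uses a tiny hypothetical class graph:

```python
from collections import Counter

def degree_distributions(edges):
    """Count in- and out-degrees of a directed class graph given as
    (source, target) dependency edges, then tabulate how many nodes
    have each degree value (the distribution whose tail is examined)."""
    out_deg, in_deg = Counter(), Counter()
    for u, v in edges:
        out_deg[u] += 1
        in_deg[v] += 1
    return Counter(in_deg.values()), Counter(out_deg.values())

# hypothetical class-dependency edges: A uses B, C, D; B and D use C
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "C")]
in_dist, out_dist = degree_distributions(edges)
# C is depended on by three classes, so in_dist records one node of in-degree 3
```

Plotting these frequency tables on log-log axes is the usual way to check for the exponential versus power-law tails the paper reports.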

  9. Electron drift in a large scale solid xenon

    CERN Document Server

    Yoo, J

    2015-01-01

A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. It is thereby demonstrated that the electron drift speed in large scale solid xenon is a factor of two faster than in the liquid phase.
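A quick back-of-the-envelope check of the quoted numbers: at the reported drift speeds, the transit times over the 8.0 cm uniform-field region differ by about a factor of two.

```python
# Values quoted in the abstract; this is only an arithmetic check.
d_cm = 8.0                   # length of the uniform-field drift region
v_liquid = 0.193             # cm/us in liquid xenon (163 K)
v_solid = 0.397              # cm/us in solid xenon (157 K)
t_liquid = d_cm / v_liquid   # transit time in liquid, ~41.5 us
t_solid = d_cm / v_solid     # transit time in solid, ~20.2 us
ratio = v_solid / v_liquid   # ~2.06, the reported "factor two"
```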

  10. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.

  11. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    As a result of the growing demand for food, feed and industrial raw materials in the first decade of this century, and the usually welcoming policies regarding investors amongst the governments of developing countries, there has been a renewed interest in agriculture and an increase in large...... to ‘land grabbing’ for large-scale farming (i.e. outgrower schemes and contract farming could modernise agricultural production while allowing smallholders to maintain their land ownership), to integrate them into global agro-food value chains and to increase their productivity and welfare. However......, the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...

  12. Clusters and Large-Scale Structure: the Synchrotron Keys

    CERN Document Server

    Rudnick, L; Andernach, H; Battaglia, N; Brown, S; Brunetti, Gf; Burns, J; Clarke, T; Dolag, K; Farnsworth, D; Giovannini, G; Hallman, E; Johnston-Hollit, M; Jones, T W; Kang, H; Kassim, N; Kravtsov, A; Lazio, J; Lonsdale, C; McNamara, B; Myers, S; Owen, F; Pfrommer, C; Ryu, D; Sarazin, C; Subrahmanyan, R; Taylor, G; Taylor, R

    2009-01-01

    For over four decades, synchrotron-radiating sources have played a series of pathfinding roles in the study of galaxy clusters and large scale structure. Such sources are uniquely sensitive to the turbulence and shock structures of large-scale environments, and their cosmic rays and magnetic fields often play important dynamic and thermodynamic roles. They provide essential complements to studies at other wavebands. Over the next decade, they will fill essential gaps in both cluster astrophysics and the cosmological growth of structure in the universe, especially where the signatures of shocks and turbulence, or even the underlying thermal plasma itself, are otherwise undetectable. Simultaneously, synchrotron studies offer a unique tool for exploring the fundamental question of the origins of cosmic magnetic fields. This work will be based on the new generation of m/cm-wave radio telescopes now in construction, as well as major advances in the sophistication of 3-D MHD simulations.

  13. Cluster Galaxy Dynamics and the Effects of Large Scale Environment

    CERN Document Server

    White, Martin; Smit, Renske

    2010-01-01

    We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters. We pay particular attention to velocity dispersions, matching galaxies to subhalos which are explicitly tracked in the simulation. We find that not only do halos persist as subhalos when they fall into a larger host, but groups of subhalos also retain their identity for long periods within larger host halos. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and ...

  14. Quantum noise in large-scale coherent nonlinear photonic circuits

    CERN Document Server

    Santori, Charles; Beausoleil, Raymond G; Tezak, Nikolas; Hamerly, Ryan; Mabuchi, Hideo

    2014-01-01

    A semiclassical simulation approach is presented for studying quantum noise in large-scale photonic circuits incorporating an ideal Kerr nonlinearity. A netlist-based circuit solver is used to generate matrices defining a set of stochastic differential equations, in which the resonator field variables represent random samplings of the Wigner quasi-probability distributions. Although the semiclassical approach involves making a large-photon-number approximation, tests on one- and two-resonator circuits indicate satisfactory agreement between the semiclassical and full-quantum simulation results in the parameter regime of interest. The semiclassical model is used to simulate random errors in a large-scale circuit that contains 88 resonators and hundreds of components in total, and functions as a 4-bit ripple counter. The error rate as a function of on-state photon number is examined, and it is observed that the quantum fluctuation amplitudes do not increase as signals propagate through the circuit, an important...

  15. Imaging HF-induced large-scale irregularities above HAARP

    Science.gov (United States)

    Djuth, Frank T.; Reinisch, Bodo W.; Kitrosser, David F.; Elder, John H.; Snyder, A. Lee; Sales, Gary S.

    2006-02-01

    The University of Massachusetts-Lowell digisonde is used with the HAARP high-frequency (HF), ionospheric modification facility to obtain radio images of artificially-produced, large-scale, geomagnetic field-aligned irregularities. F region irregularities generated with the HAARP beam pointed in the vertical and geomagnetic field-aligned directions are examined in a smooth background plasma. It is found that limited large-scale irregularity production takes place with vertical transmissions, whereas there is a dramatic increase in the number of source irregularities with the beam pointed parallel to the geomagnetic field. Strong irregularity production appears to be confined to within ~5° of the geomagnetic zenith and does not fill the volume occupied by the HF beam. A similar effect is observed in optical images of artificial airglow.

  16. In the fast lane: large-scale bacterial genome engineering.

    Science.gov (United States)

    Fehér, Tamás; Burland, Valerie; Pósfai, György

    2012-07-31

    The last few years have witnessed rapid progress in bacterial genome engineering. The long-established, standard ways of DNA synthesis, modification, transfer into living cells, and incorporation into genomes have given way to more effective, large-scale, robust genome modification protocols. Expansion of these engineering capabilities is due to several factors. Key advances include: (i) progress in oligonucleotide synthesis and in vitro and in vivo assembly methods, (ii) optimization of recombineering techniques, (iii) introduction of parallel, large-scale, combinatorial, and automated genome modification procedures, and (iv) rapid identification of the modifications by barcode-based analysis and sequencing. Combination of the brute force of these techniques with sophisticated bioinformatic design and modeling opens up new avenues for the analysis of gene functions and cellular network interactions, as well as for engineering more effective producer strains. This review presents a summary of recent technological advances in bacterial genome engineering.

  17. Robust regression for large-scale neuroimaging studies.

    OpenAIRE

    2015-01-01

    PUBLISHED Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypot...

  18. Measuring large scale space perception in literary texts

    Science.gov (United States)

    Rossi, Paolo

    2007-07-01

    A center and radius of “perception” (in the sense of environmental cognition) can be formally associated with a written text and operationally defined. Simple algorithms for their computation are presented, and indicators for anisotropy in large scale space perception are introduced. The relevance of these notions for the analysis of literary and historical records is briefly discussed and illustrated with an example taken from medieval historiography.
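    Operationally, the "center of perception" can be taken as the weighted centroid of the places a text mentions, and the "radius" as the weighted mean distance from that center. A minimal sketch follows; the place coordinates and mention counts are invented for illustration, and the paper's exact algorithm may differ:

    ```python
    import math

    # Hypothetical place mentions extracted from a text: ((lon, lat), mention count).
    places = [((11.25, 43.77), 5), ((12.48, 41.89), 3), ((2.35, 48.86), 1)]

    total = sum(w for _, w in places)
    # Weighted centroid = "center of perception".
    cx = sum(p[0] * w for p, w in places) / total
    cy = sum(p[1] * w for p, w in places) / total
    # Weighted mean distance from the center = "radius of perception".
    radius = sum(math.dist((cx, cy), p) * w for p, w in places) / total
    print((round(cx, 3), round(cy, 3)), round(radius, 3))
    ```

    An anisotropy indicator could then compare, e.g., the spread of mentions along different bearings from the center.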

  19. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  20. Large-scale Alfvén vortices

    Energy Technology Data Exchange (ETDEWEB)

    Onishchenko, O. G., E-mail: onish@ifz.ru [Institute of Physics of the Earth, 10 B. Gruzinskaya, 123242 Moscow, Russian Federation and Space Research Institute, 84/32 Profsouznaya str., 117997 Moscow (Russian Federation); Pokhotelov, O. A., E-mail: pokh@ifz.ru [Institute of Physics of the Earth, 10 B. Gruzinskaya, 123242 Moscow (Russian Federation); Horton, W., E-mail: wendell.horton@gmail.com [Institute for Fusion Studies and Applied Research Laboratory, University of Texas at Austin, Austin, Texas 78713 (United States); Scullion, E., E-mail: scullie@tcd.ie [School of Physics, Trinity College Dublin, Dublin 2 (Ireland); Fedun, V., E-mail: v.fedun@sheffield.ac.uk [Department of Automatic Control and Systems Engineering, University of Sheffield, Sheffield S13JD (United Kingdom)

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic-field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  1. UAV Data Processing for Large Scale Topographical Mapping

    Science.gov (United States)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    Large-scale topographical mapping in third-world countries is a prominent challenge in the geospatial industry nowadays. On one side the demand is increasing significantly, while on the other it is constrained by the limited budgets available for mapping projects. Since the advent of Act No. 4/2011 on Geospatial Information in Indonesia, large-scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Large-scale topographical mapping usually relies on conventional aerial survey campaigns to provide high-resolution 3D geospatial data sources. Having grown out of a leisure hobby, aero models in the form of the so-called Unmanned Aerial Vehicle (UAV) offer an alternative, semi-photogrammetric means of aerial data acquisition suitable for a relatively small Area of Interest (AOI); in Indonesia this area size can serve as a mapping unit, since mapping usually concentrates at the sub-district (kecamatan) level. In this paper different camera and processing software systems are analyzed to identify the optimum components of a UAV data acquisition campaign in combination with the data processing scheme. The selected AOI covers the cultural heritage of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates on the object features of the temple in the first place. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) is integrated in order to provide Digital Elevation Models (DEM) as the main interest of the topographic mapping activity. In this research, incorporating the optimum number of GCPs in the UAV photo data processing increases the accuracy along with its high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will serve as the benchmark for alternative geospatial data acquisition in the future in which it can support

  2. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) cosimulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross platform operating system support, the integration of both event driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
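    The co-iteration feature mentioned above can be pictured with a toy fixed-point loop. This is a generic sketch, not the HELICS API: both "federates" are reduced to made-up scalar models that exchange boundary values and iterate to convergence before each time step advances:

    ```python
    # Toy "power system" federate: voltage responds to communication load.
    def power_flow(comm_load):
        return 1.0 / (1.0 + comm_load)

    # Toy "communication" federate: load responds to voltage.
    def comm_model(voltage):
        return 0.5 * voltage

    def co_iterate(t_steps=3, tol=1e-9, max_iter=100):
        v, load = 1.0, 0.0
        for _ in range(t_steps):
            # Co-iterate the coupled federates until the exchanged
            # boundary values reach a fixed point for this time step.
            for _ in range(max_iter):
                v_new = power_flow(load)
                load_new = comm_model(v_new)
                converged = abs(v_new - v) < tol and abs(load_new - load) < tol
                v, load = v_new, load_new
                if converged:
                    break
        return v, load

    v, load = co_iterate()
    print(v, load)
    ```

    In a real co-simulation the two callables would be full simulators behind a message-passing interface, and the broker would mediate the value exchange and time advancement.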

  3. Large scale-small scale duality and cosmological constant

    CERN Document Server

    Darabi, F

    1999-01-01

    We study a model of quantum cosmology originating from a classical model of gravitation where a self interacting scalar field is coupled to gravity with the metric undergoing a signature transition. We show that there are dual classical signature changing solutions, one at large scales and the other at small scales. It is possible to fine-tune the physics in both scales with an infinitesimal effective cosmological constant.

  4. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  5. Information Tailoring Enhancements for Large-Scale Social Data

    Science.gov (United States)

    2016-09-26

    Information Tailoring Enhancements for Large-Scale Social Data: Final Report. Reporting period: September 22, 2015 – September 16, 2016. Contract No. N00014-15-P-5138, sponsored by ONR. ...goals of (i) further enhancing the capability to analyze unstructured social media data at scale and rapidly, and (ii) improving IAI social media software

  6. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula

    2008-01-01

    , but also provides a more global view on drug-target relations. Here we review recent attempts to apply large-scale computational analyses to predict novel interactions of drugs and targets from molecular and cellular features. In this context, we quantify the family-dependent probability of two proteins...... to bind the same ligand as function of their sequence similarity. We finally discuss how phenotypic data could help to expand our understanding of the complex mechanisms of drug action....

  7. One-dimensional adhesion model for large scale structures

    Directory of Open Access Journals (Sweden)

    Kayyunnapara Thomas Joseph

    2010-05-01

    Full Text Available We discuss initial value problems and initial boundary value problems for some systems of partial differential equations appearing in the modelling of large-scale structure formation in the universe. We restrict the initial data to bounded measurable functions of locally bounded variation and use the Volpert product to justify the products which appear in the equations. For more general initial data in the class of generalized functions of Colombeau, we construct the solution in the sense of association.

  8. Robust regression for large-scale neuroimaging studies.

    OpenAIRE

    BOKDE, ARUN

    2015-01-01

    PUBLISHED Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypot...

  9. Multimodel Design of Large Scale Systems with Multiple Decision Makers.

    Science.gov (United States)

    1982-08-01

    virtue. Lead me from darkness to light. Lead me from death to eternal life. (Vedic Prayer) ... (BFI-S2L) is stable for all e in H. To avoid mathematical complications, the feedback matrices of (2.31) are restricted to be of the form S(e) = ... control values used during all past sampling intervals. This information pattern, though not of much practical importance, is mathematically con

  10. A Large-Scale Study of Online Shopping Behavior

    OpenAIRE

    Nalchigar, Soroosh; Weber, Ingmar

    2012-01-01

    The continuous growth of electronic commerce has stimulated great interest in studying online consumer behavior. Given the significant growth in online shopping, better understanding of customers allows better marketing strategies to be designed. While studies of online shopping attitudes are widespread in the literature, studies of differences in browsing habits in relation to online shopping are scarce. This research performs a large-scale study of the relationship between Internet browsing hab...

  11. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  12. Multivariate Clustering of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Critchlow, T

    2003-06-13

    Simulations of complex scientific phenomena involve the execution of massively parallel computer programs. These simulation programs generate large-scale data sets over the spatio-temporal space. Modeling such massive data sets is an essential step in helping scientists discover new information from their computer simulations. In this paper, we present a simple but effective multivariate clustering algorithm for large-scale scientific simulation data sets. Our algorithm utilizes the cosine similarity measure to cluster the field variables in a data set. Field variables include all variables except the spatial (x, y, z) and temporal (time) variables. The exclusion of the spatial dimensions is important since "similar" characteristics could be located (spatially) far from each other. To scale our multivariate clustering algorithm for large-scale data sets, we take advantage of the geometrical properties of the cosine similarity measure. This allows us to reduce the modeling time from O(n^2) to O(n x g(f(u))), where n is the number of data points, f(u) is a function of the user-defined clustering threshold, and g(f(u)) is the number of data points satisfying f(u). We show that on average g(f(u)) is much less than n. Finally, even though spatial variables do not play a role in building clusters, it is desirable to associate each cluster with its correct spatial region. To achieve this, we present a linking algorithm for connecting each cluster to the appropriate nodes of the data set's topology tree (where the spatial information of the data set is stored). Our experimental evaluations on two large-scale simulation data sets illustrate the value of our multivariate clustering and linking algorithms.
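    A minimal version of such threshold-based cosine clustering might look as follows. This is a greedy single-pass sketch under assumed semantics (each variable joins the first cluster whose representative it matches above the threshold); the paper's exact grouping rule and its topology-tree linking step are not reproduced here:

    ```python
    import numpy as np

    def cosine_clusters(vars_matrix, threshold=0.95):
        """Cluster field variables (rows) by cosine similarity: a row joins
        the first cluster whose representative it matches above `threshold`,
        otherwise it starts a new cluster."""
        reps, clusters = [], []
        normed = vars_matrix / np.linalg.norm(vars_matrix, axis=1, keepdims=True)
        for i, v in enumerate(normed):
            for c, r in enumerate(reps):
                if float(v @ r) >= threshold:   # cosine of unit vectors
                    clusters[c].append(i)
                    break
            else:
                reps.append(v)
                clusters.append([i])
        return clusters

    data = np.array([[1.0, 0.0, 0.0],
                     [0.99, 0.05, 0.0],   # nearly parallel to row 0
                     [0.0, 1.0, 0.0]])    # orthogonal to the others
    print(cosine_clusters(data))
    ```

    Because each new point is compared only against cluster representatives rather than all points, the cost is governed by the number of clusters, echoing the O(n x g(f(u))) scaling argument in the abstract.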

  13. Large-Scale Integrated Carbon Nanotube Gas Sensors

    OpenAIRE

    Kim, Joondong

    2012-01-01

    The carbon nanotube (CNT) is a promising one-dimensional nanostructure for various nanoscale electronics. Additionally, nanostructures provide a significantly large surface area at a fixed volume, which is an advantage for highly responsive gas sensors. However, the difficulty of the fabrication processes limits CNT gas sensors for large-scale production. We review a viable scheme for large-area application, including CNT gas sensor fabrication and the reaction mechanism, with a practical d...

  14. Turbulent large-scale structure effects on wake meandering

    Science.gov (United States)

    Muller, Y.-A.; Masson, C.; Aubrun, S.

    2015-06-01

    This work studies the effects of large-scale turbulent structures on wake meandering using Large Eddy Simulations (LES) over an actuator disk. Other potential sources of wake meandering, such as the instability mechanisms associated with tip vortices, are not treated in this study. A crucial element of efficient, pragmatic and successful simulation of large-scale turbulent structures in the Atmospheric Boundary Layer (ABL) is the generation of the stochastic turbulent atmospheric flow. This is an essential capability, since one source of wake meandering is these large (larger than the turbine diameter) turbulent structures. The unsteady wind turbine wake in the ABL is simulated using a combination of LES and actuator disk approaches. In order to dedicate the large majority of the available computing power to the wake, the ABL ground region of the flow is not part of the computational domain. Instead, mixed Dirichlet/Neumann boundary conditions are applied at all the computational surfaces except at the outlet. Prescribed values for the Dirichlet contribution of these boundary conditions are provided by a stochastic turbulent wind generator. This allows large-scale turbulent structures (larger than the computational domain) to be simulated, leading to an efficient simulation technique for wake meandering. Since the stochastic wind generator includes shear, turbulence production is included in the analysis without the necessity of resolving the flow near the ground. The classical Smagorinsky sub-grid model is used. The resulting numerical methodology has been implemented in OpenFOAM. Comparisons with experimental measurements in porous-disk wakes have been undertaken, and the agreement is good. While the temporal resolution of experimental measurements is high, the spatial resolution is often too low. The LES numerical results provide a more complete spatial description of the flow. They tend to demonstrate that inflow low-frequency content, or large-scale turbulent structures, is

  15. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    Science.gov (United States)

    2015-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. ...sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over... develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse

  16. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA; G. Agha et al. [The record's abstract field contains only unrecoverable OCR of the report documentation page.]

  17. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adopt Mars’ full topography, compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes, indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre

  18. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions are investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  19. Robust regression for large-scale neuroimaging studies.

    Science.gov (United States)

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies.

  20. Systematic Literature Review of Agile Scalability for Large Scale Projects

    Directory of Open Access Journals (Sweden)

    Hina Saeeda

    2015-09-01

    Full Text Available Among recent methods, "agile" has emerged as the leading approach in the software industry for software development. In its different forms, agile is applied to handle issues such as low cost, tight time-to-market schedules, continuously changing requirements, communication and coordination, team size, and distributed environments. Agile has proved successful in small and medium-sized projects; however, it has several limitations when applied to large projects. The purpose of this study is to examine agile techniques in detail, identifying and highlighting their restrictions for large projects with the help of a systematic literature review. The systematic literature review addresses the following research questions: (1) How can agile approaches be made scalable and adoptable for large projects? (2) What existing methods, approaches, frameworks and practices support the agile process in large-scale projects? (3) What are the limitations of existing agile approaches, methods, frameworks and practices with respect to large-scale projects? This study identifies the current research problems of agile scalability for large projects by giving a detailed literature review of the identified problems and of the existing work proposed to solve them, and it points out the limitations of that work in covering agile scalability. All results are summarized statistically, and based on these findings remedial work will be planned in the future for handling the identified limitations of agile approaches in large-scale projects.

  1. Critical thinking, politics on a large scale and media democracy

    Directory of Open Access Journals (Sweden)

    José Antonio IBÁÑEZ-MARTÍN

    2015-06-01

    Full Text Available A first look at current social reality offers numerous reasons for worry. The spectacle of violence and immorality can easily frighten us. More worrying still is to observe that the horizon of coexistence, peace and well-being that Europe had been building since the Treaty of Rome of 1957 has been seriously compromised by the economic crisis. Today we face an assault on democratic politics, which media democracy characterizes as an exhausted system that must be replaced by a new and great politics, a politics on a large scale. The article analyses the concept of a politics on a large scale, attending primarily to Nietzsche, and noting its connection with great philosophy and great education. The study of Nietzsche's texts leads us to the conclusion that in them we often find an insightful analysis of the problems alongside misguided proposals for solutions. We cannot pretend to offer solutions to every problem, but we outline various proposals for changes in political activity that can reasonably be defended within a media democracy. In conclusion, we point out that a politics on a large scale requires statesmen capable of proposing modes of life in common that can structure long-term coexistence.

  2. Large Scale Magnetic Fields: Density Power Spectrum in Redshift Space

    Indian Academy of Sciences (India)

    Rajesh Gopal; Shiv K. Sethi

    2003-09-01

    We compute the density redshift-space power spectrum in the presence of tangled magnetic fields and compare it with existing observations. Our analysis shows that if these magnetic fields originated in the early universe, then it is possible to construct models for which the shape of the power spectrum agrees with the large scale slope of the observed power spectrum. However, requiring compatibility with observed CMBR anisotropies, the normalization of the power spectrum is too low for magnetic fields to have significant impact on the large scale structure at present. Magnetic fields of a more recent origin generically give a density power spectrum ∝ k⁴, which does not agree with the shape of the observed power spectrum at any scale. Magnetic fields generate curl modes of the velocity field, which increase both the quadrupole and the hexadecapole of the redshift space power spectrum. For curl modes, the hexadecapole dominates over the quadrupole. So the presence of curl modes could be indicated by an anomalously large hexadecapole, which has not yet been computed from observation. It appears difficult to construct models in which tangled magnetic fields could have played a major role in shaping the large scale structure in the present epoch. However, if they did, one of the best ways to infer their presence would be from the redshift space effects in the density power spectrum.
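    The quadrupole and hexadecapole referred to above are Legendre multipoles of the anisotropic power spectrum, P_ℓ(k) = (2ℓ+1)/2 ∫₋₁¹ P(k,μ) L_ℓ(μ) dμ. A minimal sketch of extracting them by Gauss-Legendre quadrature, applied to a toy Kaiser-type distortion model (the curl-mode contribution itself is not modeled here, and the value of β is illustrative):

```python
import numpy as np

def multipole(P_of_mu, ell, n_quad=32):
    """ell-th Legendre multipole: (2l+1)/2 * integral of P(mu)*L_l(mu) over [-1, 1]."""
    mu, w = np.polynomial.legendre.leggauss(n_quad)
    L = {0: np.ones_like(mu),
         2: 0.5 * (3 * mu**2 - 1),
         4: 0.125 * (35 * mu**4 - 30 * mu**2 + 3)}[ell]
    return (2 * ell + 1) / 2 * np.sum(w * P_of_mu(mu) * L)

beta = 0.5
kaiser = lambda mu: (1 + beta * mu**2) ** 2   # toy linear redshift-distortion shape

P0 = multipole(kaiser, 0)   # monopole
P2 = multipole(kaiser, 2)   # quadrupole
P4 = multipole(kaiser, 4)   # hexadecapole
```

For this polynomial model the quadrature reproduces the classical closed forms P0 = 1 + 2β/3 + β²/5, P2 = 4β/3 + 4β²/7 and P4 = 8β²/35 exactly.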

  3. A visualization framework for large-scale virtual astronomy

    Science.gov (United States)

    Fu, Chi-Wing

    Motivated by advances in modern positional astronomy, this research attempts to digitally model the entire Universe through computer graphics technology. Our first challenge is space itself. The gigantic size of the Universe makes it impossible to put everything into a typical graphics system at its own scale. The graphics rendering process can easily fail because of limited computational precision. The second challenge is that the enormous amount of data could slow down the graphics; we need clever techniques to speed up the rendering. Third, since the Universe is dominated by empty space, objects are widely separated; this makes navigation difficult. We attempt to tackle these problems through various techniques designed to extend and optimize the conventional graphics framework, including the following: power homogeneous coordinates for large-scale spatial representations, generalized large-scale spatial transformations, and rendering acceleration via environment caching and object disappearance criteria. Moreover, we implemented an assortment of techniques for modeling and rendering a variety of astronomical bodies, ranging from the Earth up to faraway galaxies, and attempted to visualize cosmological time; a method we call the Lightcone representation was introduced to visualize the whole space-time of the Universe at a single glance. In addition, several navigation models were developed to handle the large-scale navigation problem. Our final results include a collection of visualization tools, two educational animations appropriate for planetarium audiences, and rendering techniques that advance the state of the art and can be transferred to practice in digital planetarium systems.

  4. Impact of Large-scale Geological Architectures On Recharge

    Science.gov (United States)

    Troldborg, L.; Refsgaard, J. C.; Engesgaard, P.; Jensen, K. H.

    Geological and hydrogeological data constitute the basis for assessment of groundwater flow patterns and recharge zones. The accessibility and applicability of hard geological data is often a major obstacle in deriving plausible conceptual models. Nevertheless, focus is often on parameter uncertainty caused by the effect of geological heterogeneity due to lack of hard geological data, thus neglecting the possibility of alternative conceptualizations of the large-scale geological architecture. For a catchment in the eastern part of Denmark we have constructed different geological models based on different conceptualizations of the major geological trends and facies architecture. The geological models are equally plausible in a conceptual sense and they are all calibrated to well head and river flow measurements. Comparison of differences in recharge zones, and subsequently in well protection zones, emphasizes the importance of assessing large-scale geological architecture in hydrological modeling on a regional scale in a non-deterministic way. Geostatistical modeling carried out in a transitional probability framework shows the possibility of assessing multiple realizations of large-scale geological architecture from a combination of soft and hard geological information.

  5. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
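    The per-pixel linked-list idea can be sketched as follows. This is a schematic CPU-side analogue of the GPU data structure; the segment record layout (pathline id, depth, scalar attribute) and the helper names are hypothetical:

```python
from collections import defaultdict

def build_pixel_lists(segments, width, height):
    """After projecting each pathline segment into screen space, append it to
    the list of the pixel it covers -- the 'per-pixel linked list'.
    segments: iterable of (x, y, pathline_id, depth, attr) in screen space."""
    plists = defaultdict(list)
    for x, y, pid, depth, attr in segments:
        if 0 <= x < width and 0 <= y < height:
            plists[(x, y)].append((pid, depth, attr))
    return plists

def filter_pixel(plists, pixel, attr_min):
    """Re-filter one pixel's segment list (e.g. for interactive filtering or
    color-coding) without re-accessing the raw flow dataset."""
    return [s for s in plists.get(pixel, []) if s[2] >= attr_min]

segs = [(1, 1, 0, 0.2, 5.0), (1, 1, 1, 0.5, 1.0), (3, 2, 2, 0.9, 7.5)]
plists = build_pixel_lists(segs, width=4, height=4)
visible = filter_pixel(plists, (1, 1), attr_min=2.0)
```

Because each pixel keeps all segments that project onto it, view-dependent operations like filtering only touch the compact per-pixel lists.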

  6. Large-scale magnetic topologies of early M dwarfs

    CERN Document Server

    Donati, JF; Petit, P; Delfosse, X; Forveille, T; Aurière, M; Cabanac, R; Dintrans, B; Fares, R; Gastine, T; Jardine, MM; Lignières, F; Paletou, F; Velez, J Ramirez; Théado, S

    2008-01-01

    We present here additional results of a spectropolarimetric survey of a small sample of stars ranging from spectral type M0 to M8 aimed at investigating observationally how dynamo processes operate in stars on both sides of the full convection threshold (spectral type M4). The present paper focuses on early M stars (M0--M3), i.e. above the full convection threshold. Applying tomographic imaging techniques to time series of rotationally modulated circularly polarised profiles collected with the NARVAL spectropolarimeter, we determine the rotation period and reconstruct the large-scale magnetic topologies of 6 early M dwarfs. We find that early-M stars preferentially host large-scale fields with dominantly toroidal and non-axisymmetric poloidal configurations, along with significant differential rotation (and long-term variability); only the lowest-mass star of our subsample is found to host an almost fully poloidal, mainly axisymmetric large-scale field resembling those found in mid-M dwarfs. This abrupt chan...

  7. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  8. Reliability assessment for components of large scale photovoltaic systems

    Science.gov (United States)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of the various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in the various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the status of the system components. This approach can be used to ensure secure operation of the system through its flexibility in monitoring system applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate more system maintenance plans and diagnostic strategies.
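    A minimal sketch of the fault-tree arithmetic with exponential component failure laws. The component set and the failure rates below are illustrative, not taken from the study:

```python
import math

def comp_unreliability(lam, t):
    """Failure probability of a component with an exponential failure law:
    Q(t) = 1 - exp(-lambda * t)."""
    return 1.0 - math.exp(-lam * t)

def or_gate(*q):
    """Top event occurs if ANY input fails (series combination)."""
    p_ok = 1.0
    for qi in q:
        p_ok *= (1.0 - qi)
    return 1.0 - p_ok

def and_gate(*q):
    """Top event occurs only if ALL redundant inputs fail (parallel combination)."""
    p = 1.0
    for qi in q:
        p *= qi
    return p

# Hypothetical failure rates (per hour) for one PV branch: a panel string,
# a DC combiner, and a redundant pair of inverters.
t = 8760.0  # one year of operation, in hours
q_string   = comp_unreliability(3e-6, t)
q_combiner = comp_unreliability(1e-6, t)
q_inverter = comp_unreliability(5e-6, t)

q_inv_pair = and_gate(q_inverter, q_inverter)      # redundancy helps a lot
q_top = or_gate(q_string, q_combiner, q_inv_pair)  # system-level top event
```

Comparing each component's contribution to `q_top` is what singles out the critical components that maintenance should focus on.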

  9. Searching for Large Scale Structure in Deep Radio Surveys

    CERN Document Server

    Baleisis, A; Loan, A J; Wall, J V; Baleisis, Audra; Lahav, Ofer; Loan, Andrew J.; Wall, Jasper V.

    1997-01-01

    (Abridged Abstract) We calculate the expected amplitude of the dipole and higher spherical harmonics in the angular distribution of radio galaxies. The median redshift of radio sources in existing catalogues is z=1, which allows us to study large scale structure on scales between those accessible to present optical and infrared surveys, and that of the Cosmic Microwave Background (CMB). The dipole is due to 2 effects which turn out to be of comparable magnitude: (i) our motion with respect to the CMB, and (ii) large scale structure, parameterised here by a family of Cold Dark Matter power-spectra. We make specific predictions for the Green Bank (87GB) and Parkes-MIT-NRAO (PMN) catalogues. For these relatively sparse catalogues both the motion and large scale structure dipole effects are expected to be smaller than the Poisson shot-noise. However, we detect dipole and higher harmonics in the combined 87GB-PMN catalogue which are far larger than expected. We attribute this to a 2 % flux mismatch between the two...

  10. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  11. Star formation associated with a large-scale infrared bubble

    CERN Document Server

    Xu, Jin-Long

    2014-01-01

    Using data from the Galactic Ring Survey (GRS) and the Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE), we performed a study of a large-scale infrared bubble with a size of about 16 pc at a distance of 2.0 kpc. We present the 12CO J=1-0, 13CO J=1-0 and C18O J=1-0 observations of HII region G53.54-0.01 (Sh2-82) obtained at the Purple Mountain Observatory (PMO) 13.7 m radio telescope to investigate the detailed distribution of the associated molecular material. The large-scale infrared bubble shows a half-shell morphology at 8 um. H II regions G53.54-0.01, G53.64+0.24, and G54.09-0.06 are situated on the bubble. Comparing the radio recombination line velocities and associated 13CO J=1-0 components of the three H II regions, we found that the 8 um emission associated with H II region G53.54-0.01 should belong to the foreground emission, and only overlap with the large-scale infrared bubble in the line of sight. Three extended green objects (EGOs, the candidate massive young stellar objects), ...

  12. Equivalent common path method in large-scale laser comparator

    Science.gov (United States)

    He, Mingzhao; Li, Jianshuang; Miao, Dongjing

    2015-02-01

    The large-scale laser comparator is the main standard device providing accurate, reliable and traceable measurements for high-precision large-scale line and 3D measurement instruments. It is mainly composed of a guide rail, a motion control system, an environmental parameters monitoring system and a displacement measurement system. In the laser comparator, the main error sources are the temperature distribution, the straightness of the guide rail, and the pitch and yaw of the measuring carriage. To minimize the measurement uncertainty, an equivalent common optical path scheme is proposed and implemented. Three laser interferometers are adjusted to be parallel with the guide rail. The displacement in an arbitrary virtual optical path is calculated from the three measured displacements without knowledge of the carriage orientations at the start and end positions. The orientation of the air-floating carriage is calculated from the displacements of the three optical paths and the positions of the three retroreflectors, which are precisely measured by a laser tracker. A fourth laser interferometer is used in the virtual optical path as a reference to verify this compensation method. This paper analyzes the effect of rail straightness on the displacement measurement. The proposed method, through experimental verification, can improve the measurement uncertainty of the large-scale laser comparator.
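    The virtual-optical-path reconstruction can be sketched as a plane fit over the carriage cross section: for a rigid carriage, the measured displacement varies linearly with the lateral position of the beam, so three parallel interferometers determine it everywhere. The retroreflector positions and readings below are illustrative, not actual instrument data:

```python
import numpy as np

# Hypothetical lateral (y, z) positions of the three retroreflectors on the
# carriage (as measured by a laser tracker), and the three interferometer
# displacement readings, in metres.
P = np.array([[0.0, 0.0],
              [0.2, 0.0],
              [0.0, 0.15]])
d = np.array([1.000000, 1.000040, 0.999970])

# Rigid-body assumption: d(y, z) = a + b*y + c*z across the cross section,
# where b and c encode the yaw and pitch of the carriage.
A = np.column_stack([np.ones(3), P])
a, b, c = np.linalg.solve(A, d)

def virtual_displacement(y, z):
    """Displacement of an arbitrary virtual optical path at lateral (y, z)."""
    return a + b * y + c * z

d_virtual = virtual_displacement(0.1, 0.075)
```

A fourth interferometer placed at the virtual (y, z) position can then check the reconstructed value directly, which is the role of the reference interferometer in the abstract.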

  13. Large scale structure around a z=2.1 cluster

    CERN Document Server

    Hung, Chao-Ling; Chiang, Yi-Kuan; Capak, Peter; Cowley, Michael J; Darvish, Behnam; Kacprzak, Glenn G; Kovac, K; Lilly, Simon J; Nanayakkara, Themiya; Spitler, Lee R; Tran, Kim-Vy H; Yuan, Tiantian

    2016-01-01

    The most prodigious starburst galaxies are absent in massive galaxy clusters today, but their connection with large scale environments is less clear at $z\gtrsim2$. We present a search of large scale structure around a galaxy cluster core at $z=2.095$ using a set of spectroscopically confirmed galaxies. We find that both color-selected star-forming galaxies (SFGs) and dusty star-forming galaxies (DSFGs) show significant overdensities around the $z=2.095$ cluster. A total of 8 DSFGs (including 3 X-ray luminous active galactic nuclei, AGNs) and 34 SFGs are found within a 10 arcmin radius (corresponds to $\sim$15 cMpc at $z\sim2.1$) from the cluster center and within a redshift range of $\Delta z=0.02$, which leads to galaxy overdensities of $\delta_{\rm DSFG}\sim12.3$ and $\delta_{\rm SFG}\sim2.8$. The cluster core and the extended DSFG- and SFG-rich structure together demonstrate an active cluster formation phase, in which the cluster is accreting a significant amount of material from large scale structure whi...

  14. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
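    The bilateral update can be illustrated on a toy problem. Here we project onto a PSD target under the Frobenius norm rather than solving a general SDP as in the paper; each iteration forms X ← αX + βvvᵀ with v the leading eigenvector of the descent direction and (α, β) chosen by an exact two-variable least-squares fit (clamped to keep X PSD). All names and the target matrix are illustrative:

```python
import numpy as np

def bilateral_sketch(M, n_iter=50):
    """Toy bilateral greedy scheme for min_{X PSD} ||X - M||_F^2."""
    n = M.shape[0]
    X = np.zeros((n, n))
    for _ in range(n_iter):
        D = M - X                         # negative gradient direction
        w, V = np.linalg.eigh(D)
        v = V[:, np.argmax(w)]            # leading eigenvector of the descent direction
        B = np.outer(v, v)                # rank-1 update candidate (||B||_F = 1)
        # Exact minimization of ||alpha*X + beta*B - M||_F over (alpha, beta):
        G = np.array([[np.sum(X * X), np.sum(X * B)],
                      [np.sum(X * B), 1.0]])
        rhs = np.array([np.sum(X * M), np.sum(B * M)])
        ab = np.linalg.lstsq(G, rhs, rcond=None)[0]
        alpha, beta = max(ab[0], 0.0), max(ab[1], 0.0)  # keep X PSD
        X = alpha * X + beta * B
    return X

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5))
M = A @ A.T                               # a PSD target
X_star = bilateral_sketch(M)
err = np.linalg.norm(X_star - M) / np.linalg.norm(M)
```

On this toy objective the greedy rank-1 updates recover one eigen-component of the residual per iteration, so the iterate converges to the target after a handful of steps.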

  15. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  16. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
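    Broyden's "good" method, the secant-update idea referred to above, can be sketched in a few lines. This is a dense textbook version, not Sandia's limited-memory implementation, and the test system is illustrative:

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method: quasi-Newton root finding that replaces the
    Jacobian with a secant approximation, updated by rank-1 corrections, so
    F can be solved without ever evaluating an analytic Jacobian."""
    x = np.asarray(x0, dtype=float)
    J = np.eye(len(x))                    # crude initial Jacobian approximation
    f = F(x)
    for _ in range(max_iter):
        dx = np.linalg.solve(J, -f)       # quasi-Newton step
        x_new = x + dx
        f_new = F(x_new)
        if np.linalg.norm(f_new) < tol:
            return x_new
        df = f_new - f
        J += np.outer(df - J @ dx, dx) / (dx @ dx)  # rank-1 secant update
        x, f = x_new, f_new
    return x

# Toy nonlinear system: x^2 + y^2 = 4 and x*y = 1, no Jacobian supplied.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
root = broyden(F, [2.0, 0.5])
```

The rank-1 update is what a limited-memory variant stores as a short history of vector pairs instead of the full matrix J, which is the key to scaling the method to large problems.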

  18. On the Phenomenology of an Accelerated Large-Scale Universe

    Directory of Open Access Journals (Sweden)

    Martiros Khurshudyan

    2016-10-01

    Full Text Available In this review paper, several new results towards the explanation of the accelerated expansion of the large-scale universe are discussed. Inflation, on the other hand, is the early-time accelerated era, and the universe is symmetric in the sense of accelerated expansion. The accelerated expansion of the universe is one of the long-standing problems in modern cosmology, and in physics in general. There are several well-defined approaches to solving this problem. One of them is the assumption that dark energy exists in the recent universe. It is believed that dark energy is responsible for antigravity, while dark matter has a gravitational nature and is responsible, in general, for structure formation. A different approach is an appropriate modification of general relativity, including, for instance, f(R) and f(T) theories of gravity. On the other hand, attempts to build theories of quantum gravity, and assumptions about the existence of extra dimensions and the possible variability of the gravitational constant and the speed of light (among others), provide interesting modifications of general relativity applicable to problems of modern cosmology, too. In particular, two groups of cosmological models are discussed here. In the first group, the problem of the accelerated expansion of the large-scale universe is discussed involving a new idea, named varying ghost dark energy. The second group contains cosmological models addressed to the same problem involving either new parameterizations of the equation-of-state parameter of dark energy (like a varying polytropic gas), or nonlinear interactions between dark energy and dark matter. Moreover, for cosmological models involving varying ghost dark energy, massless particle creation in an appropriate radiation-dominated universe (when the background dynamics is due to general relativity) is demonstrated as well.
Exploring the nature of the accelerated expansion of the large-scale universe involving generalized

  19. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    Energy Technology Data Exchange (ETDEWEB)

    Hamid Sarv

    1999-03-01

    Technical and economic feasibility of large-scale CO{sub 2} transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing of the captured CO{sub 2}. In one case, CO{sub 2} was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe, and attaching it to the offshore structure, was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provides the best method for delivering liquid CO{sub 2} to deep ocean floor depressions. For shorter distances, CO{sub 2} delivery by parallel-laid, subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO{sub 2}, respectively. At these prices, the economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO{sub 2} transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO{sub 2} effluent plumes have been identified as areas that require further attention. Our planned activities in the next Phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO{sub 2} discharge and dispersion, and the conceptual economic and engineering evaluation of large-scale implementation.

  20. Statistical Modeling of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer sizes of the generated data sets make efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatio-temporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second statistical modeling technique (called univariate goodness-of-fit modeler) uses the Anderson-Darling goodness-of-fit method on systematic partitions of the data. Finally, AQSim's third statistical modeling technique (called multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.
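    The multivariate clusterer mentioned above groups data partitions by cosine similarity. The following is a minimal, self-contained sketch of that idea as a greedy single-pass clusterer; the threshold value, data layout, and function names are illustrative assumptions, not AQSim's actual implementation:

```python
import math

def cosine_sim(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cluster_partitions(partitions, threshold=0.95):
    """Greedy single-pass clustering: each partition joins the first
    cluster whose representative it matches above `threshold`,
    otherwise it starts a new cluster."""
    clusters = []  # list of (representative, members) pairs
    for p in partitions:
        for rep, members in clusters:
            if cosine_sim(p, rep) >= threshold:
                members.append(p)
                break
        else:
            clusters.append((p, [p]))
    return clusters

# Two nearly parallel partition summaries plus one orthogonal direction
parts = [[1.0, 0.0], [0.99, 0.1], [0.0, 1.0]]
groups = cluster_partitions(parts)  # two clusters: {first two}, {third}
```

    A real system would cluster higher-dimensional partition summaries and pick the threshold from the user-defined modeling error, but the grouping step has this shape.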

  1. Nonlinear evolution of large-scale structure in the universe

    Energy Technology Data Exchange (ETDEWEB)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-08-15

    Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein-de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function ξ(r) and the visual appearance of our adiabatic (or "pancake") models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of ξ(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be λ_min/r₀ = 5.1; its expected value in a neutrino-dominated universe is 4(Ωh)⁻¹ (H₀ = 100h km s⁻¹ Mpc⁻¹). At early epochs these models predict a negligible amplitude for ξ(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω < 1.

  2. Design techniques for large scale linear measurement systems

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.

    1979-03-01

    Techniques to design measurement schemes for systems modeled by large scale linear time invariant systems, i.e., physical systems modeled by a large number (> 5) of ordinary differential equations, are described. The techniques are based on transforming the physical system model to a coordinate system facilitating the design and then transforming back to the original coordinates. An example of a three-stage, four-species, extraction column used in the reprocessing of spent nuclear fuel elements is presented. The basic ideas are briefly discussed in the case of noisy measurements. An example using a plutonium nitrate storage vessel (reprocessing) with measurement uncertainty is also presented.

  3. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P. (PA Energy, Malling (Denmark)); Vedde, J. (SiCon. Silicon and PV consulting, Birkeroed (Denmark))

    2011-04-15

    Large scale PV (LPV) plants, i.e. plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers to LPV projects, but since no real LPV projects have been processed so far, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark falls into three main categories: PV installations on flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km². In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce Denmark's present annual electricity consumption of 33-35 TWh is about 300 km². The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines.
A number of LPV plant scenarios have been investigated in detail based on real commercial offers and

  4. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred

    2010-08-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one billion elements. We investigate communication protocols for the GPU cluster to compensate for the slow Gigabit Ethernet network between the GPU compute nodes and to maintain overall efficiency. A diesel engine intake-port and a nozzle, meshed in different resolutions, give good real world examples for the scalability tests on the GPU cluster. © 2010 IEEE.

  5. Using Large Scale Test Results for Pedagogical Purposes

    DEFF Research Database (Denmark)

    Dolin, Jens

    2012-01-01

    The use and influence of large scale tests (LST), both national and international, has increased dramatically within the last decade. This process has revealed a tension between the legitimate need for information about the performance of the educational system and teachers to inform policy...... wash back effects known from other research but gave additionally some insight in teachers’ attitudes towards LSTs. To account for these findings results from another research project - the Validation of PISA – will be included. This project analyzed how PISA has influenced the Danish educational...

  6. Cosmological parameters from large scale structure - geometric versus shape information

    CERN Document Server

    Hamann, Jan; Lesgourgues, Julien; Rampf, Cornelius; Wong, Yvonne Y Y

    2010-01-01

    The matter power spectrum as derived from large scale structure (LSS) surveys contains two important and distinct pieces of information: an overall smooth shape and the imprint of baryon acoustic oscillations (BAO). We investigate the separate impact of these two types of information on cosmological parameter estimation, and show that for the simplest cosmological models, the broad-band shape information currently contained in the SDSS DR7 halo power spectrum (HPS) is by far superseded by geometric information derived from the baryonic features. An immediate corollary is that, contrary to popular belief, the upper limit on the neutrino mass m_\

  7. Practical Optimal Control of Large-scale Water Distribution Network

    Institute of Scientific and Technical Information of China (English)

    Lv Mou(吕谋); Song Shuang

    2004-01-01

    According to the network characteristics and the actual state of the water supply system in China, an implicit model that can be solved by a hierarchical optimization method was established. In particular, based on analyses of a water supply system containing variable-speed pumps, software has been successfully developed. The application of this model to the city of Hangzhou (China) was compared with an experience-based strategy. The results of this study showed that the developed model is a promising optimization method for controlling large-scale water supply systems.

  8. Controlled growth of large-scale silver nanowires

    Institute of Scientific and Technical Information of China (English)

    Xiao Cong-Wen; Yang Hai-Tao; Shen Cheng-Min; Li Zi-An; Zhang Huai-Ruo; Liu Fei; Yang Tian-Zhong; Chen Shu-Tang; Gao Hong-Jun

    2005-01-01

    Large-scale silver nanowires with controlled aspect ratio were synthesized via reducing silver nitrate with 1,2-propanediol in the presence of poly(vinyl pyrrolidone) (PVP). Scanning electron microscopy, transmission electron microscopy and x-ray powder diffraction were employed to characterize these silver nanowires. The diameter of the silver nanowires can be readily controlled in the range of 100 to 400 nm by varying the experimental conditions. X-ray photoelectron spectroscopy and Fourier transform infrared spectroscopy results show that there exists no chemical bond between the silver and the nitrogen atoms. The interaction between PVP and silver nanowires is mainly through the oxygen atom in the carbonyl group.

  9. Accurate emulators for large-scale computer experiments

    CERN Document Server

    Haaland, Ben; 10.1214/11-AOS929

    2012-01-01

    Large-scale computer experiments are becoming increasingly important in science. A multi-step procedure for modeling such experiments is introduced, which builds an accurate interpolator in stages. In practice, the procedure shows substantial improvements in overall accuracy, but its theoretical properties are not well established. We introduce the terms nominal and numeric error and decompose the overall error of an interpolator into nominal and numeric portions. Bounds on the numeric and nominal error are developed to show theoretically that substantial gains in overall accuracy can be attained with the multi-step approach.

  10. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  11. Application of methanol synthesis reactor to large-scale plants

    Institute of Scientific and Technical Information of China (English)

    LOU Ren; XU Rong-liang; LOU Shou-lin

    2006-01-01

    The development status of world large-scale methanol production technology is analyzed, and Linda's JW low-pressure methanol synthesis reactor with uniform temperature is described. JW serial reactors have been successfully introduced and applied in the Harbin Gasification Plant, where productivity has increased by 50%; nine sets of equipment are now running successfully at Harbin Gasification Plant, Jiangsu Xinya, Shandong Kenli, Henan Zhongyuan, Handan Xinyangguang, Shanxi Weihua and Inner Mongolia Tianye. Reactors of 300,000 t/a are now being manufactured for Liaoning Dahua. Some solutions for the structural problems of 1000-5000 t/d methanol synthesis reactors are put forward.

  12. Large-scale magnetic fields from inflation in teleparallel gravity

    CERN Document Server

    Bamba, Kazuharu; Luo, Ling-Wei

    2013-01-01

    Generation of large-scale magnetic fields in inflationary cosmology is studied in teleparallelism, where instead of the scalar curvature in general relativity, the torsion scalar describes the gravity theory. In particular, we investigate a coupling of the electromagnetic field to the torsion scalar during inflation, which leads to the breaking of conformal invariance of the electromagnetic field. We demonstrate that for a power-law type coupling, the current magnetic field strength of $\\sim 10^{-9}$ G on 1 Mpc scale can be generated, if the backreaction effects and strong coupling problem are not taken into consideration.

  13. An Evaluation Framework for Large-Scale Network Structures

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Knudsen, Thomas Phillip; Madsen, Ole Brun

    2004-01-01

    An evaluation framework for large-scale network structures is presented, which facilitates evaluations and comparisons of different physical network structures. A number of quantitative and qualitative parameters are presented, and their importance to networks discussed. Choosing a network structure is a matter of trade-offs between different desired properties, and given a specific case with specific known or expected demands and constraints, the parameters presented will be weighted differently. The decision of such a weighting is supported by a discussion of each parameter. The paper......

  14. Large scale solar cooling plants in America, Asia and Europe

    Energy Technology Data Exchange (ETDEWEB)

    Holter, Christian; Olsacher, Nicole [S.O.L.I.D. GmbH, Graz (Austria)

    2010-07-01

    Large scale solar cooling plants with an area between 120 - 1600 m{sup 2} are representative examples to illustrate S.O.L.I.D.'s experiences. The selected three reference solar cooling plants are located on three different continents: America, Asia and Europe. Every region has different framework conditions and its unforeseen challenges but professional experience and innovative ideas form the basis that each plant is operating well and satisfying the customer's demand. This verifies that solar cooling already is a proven technology. (orig.)

  15. Simple Method for Large-Scale Fabrication of Plasmonic Structures

    CERN Document Server

    Makarov, Sergey V; Mukhin, Ivan S; Shishkin, Ivan I; Mozharov, Alexey M; Krasnok, Alexander E; Belov, Pavel A

    2015-01-01

    A novel method for single-step, lithography-free, large-scale laser writing of nanoparticle-based plasmonic structures has been developed. By changing the energy of the femtosecond laser pulses and the thickness of the irradiated gold film, it is possible to vary the diameter of the gold nanoparticles, while the distance between them can be varied via the laser scanning parameters. This method has an advantage over most previously demonstrated methods in its simplicity and versatility, while the quality of the structures is good enough for many applications. In particular, resonant light absorption/scattering and surface-enhanced Raman scattering have been demonstrated on the fabricated nanostructures.

  16. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    Science.gov (United States)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for the large scale composite manufacturing is an important goal to produce light weight components for heavy lift launch vehicles. NASA and an industry team successfully employed a building block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10 meter cylindrical barrel. Lessons learned highlight the manufacturing challenges required to produce light weight composite structures such as fairings for heavy lift launch vehicles.

  17. Petascale computations for Large-scale Atomic and Molecular collisions

    CERN Document Server

    McLaughlin, Brendan M

    2014-01-01

    Petaflop architectures are currently being utilized efficiently to perform large scale computations in Atomic, Molecular and Optical Collisions. We solve the Schroedinger or Dirac equation for the appropriate collision problem using the R-matrix or R-matrix with pseudo-states approach. We briefly outline the parallel methodology used and implemented for the current suite of Breit-Pauli and DARC codes. Various examples are shown of our theoretical results compared with those obtained from Synchrotron Radiation facilities and from Satellite observations. We also indicate future directions and implementation of the R-matrix codes on emerging GPU architectures.

  18. An iterative decoupling solution method for large scale Lyapunov equations

    Science.gov (United States)

    Athay, T. M.; Sandell, N. R., Jr.

    1976-01-01

    A great deal of attention has been given to the numerical solution of the Lyapunov equation. A useful classification of the variety of solution techniques is the grouping into direct, transformation, and iterative methods. The paper summarizes those methods that are at least partly favorable numerically, giving special attention to two criteria: exploitation of a general sparse system matrix structure and efficiency in resolving the governing linear matrix equation for different matrices. An iterative decoupling solution method is proposed as a promising approach for solving large-scale Lyapunov equations when the system matrix exhibits a general sparse structure. A Fortran computer program that realizes the iterative decoupling algorithm is also discussed.
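    To make the iterative idea concrete, here is a minimal sketch of a fixed-point iteration for the discrete-time Lyapunov equation X = A X Aᵀ + Q, which converges when the spectral radius of A is below one. This is a generic textbook illustration, not the paper's iterative decoupling algorithm; the 2×2 matrices and helper functions are illustrative:

```python
def matmul(A, B):
    """Product of two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def lyapunov_fixed_point(A, Q, iters=200):
    """Iterate X <- A X A^T + Q; the fixed point solves the discrete
    Lyapunov equation X = A X A^T + Q when spectral radius(A) < 1."""
    X = [[0.0] * len(A) for _ in A]
    for _ in range(iters):
        X = add(matmul(matmul(A, X), transpose(A)), Q)
    return X

A = [[0.5, 0.1], [0.0, 0.4]]   # stable: eigenvalues 0.5 and 0.4
Q = [[1.0, 0.0], [0.0, 1.0]]
X = lyapunov_fixed_point(A, Q)
```

    Exploiting sparsity, as the paper's decoupling method does, amounts to never forming dense products like the ones above.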

  19. Large-Scale Self-Consistent Nuclear Mass Calculations

    CERN Document Server

    Stoitsov, M V; Dobaczewski, J; Nazarewicz, W

    2006-01-01

    The program of systematic large-scale self-consistent nuclear mass calculations that is based on the nuclear density functional theory represents a rich scientific agenda that is closely aligned with the main research directions in modern nuclear structure and astrophysics, especially the radioactive nuclear beam physics. The quest for the microscopic understanding of the phenomenon of nuclear binding represents, in fact, a number of fundamental and crucial questions of the quantum many-body problem, including the proper treatment of correlations and dynamics in the presence of symmetry breaking. Recent advances and open problems in the field of nuclear mass calculations are presented and discussed.

  20. Generation of large-scale winds in horizontally anisotropic convection

    CERN Document Server

    von Hardenberg, J; Provenzale, A; Spiegel, E A

    2015-01-01

    We simulate three-dimensional, horizontally periodic Rayleigh-Bénard convection between free-slip horizontal plates, rotating about a horizontal axis. When both the temperature difference between the plates and the rotation rate are sufficiently large, a strong horizontal wind is generated that is perpendicular to both the rotation vector and the gravity vector. The wind is turbulent, large-scale, and vertically sheared. Horizontal anisotropy, engendered here by rotation, appears necessary for such wind generation. Most of the kinetic energy of the flow resides in the wind, and the vertical turbulent heat flux is much lower on average than when there is no wind.

  1. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , whereas Chapter 4 indicates that sugarcane outgrowers’ easy access to credit and technology and their high productivity compared to the plantation does not necessarily improve their income and asset stocks particularly when participation in outgrower schemes is mandatory, the buyer has monopsony market...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  2. Structure and function of large-scale brain systems.

    Science.gov (United States)

    Koziol, Leonard F; Barker, Lauren A; Joyce, Arthur W; Hrin, Skip

    2014-01-01

    This article introduces the functional neuroanatomy of large-scale brain systems. Both the structure and functions of these brain networks are presented. All human behavior is the result of interactions within and between these brain systems. This system of brain function completely changes our understanding of how cognition and behavior are organized within the brain, replacing the traditional lesion model. Understanding behavior within the context of brain network interactions has profound implications for modifying abstract constructs such as attention, learning, and memory. These constructs also must be understood within the framework of a paradigm shift, which emphasizes ongoing interactions within a dynamically changing environment.

  3. Clusters as cornerstones of large-scale structure.

    Science.gov (United States)

    Gottlöber, S.; Retzlaff, J.; Turchaninov, V.

    1997-04-01

    Galaxy clusters are one of the best tracers of large-scale structure in the Universe on scales well above 100 Mpc. The authors investigate here the clustering properties of a redshift sample of Abell/ACO clusters and compare the observational sample with mock samples constructed from N-body simulations on the basis of four different cosmological models. The authors discuss the power spectrum, the Minkowski functionals and the void statistics of these samples and conclude that the SCDM and TCDM models are ruled out, whereas the ΛCDM and BSI models are in agreement with the observational data.

  4. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    ...... of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should contribute to the total amount of Frequency Containment Reserves (FCR) required by TSOs, reserves which are released during transients. To realize this, PVPPs have to operate below their maximum available power and operate in Frequency Sensitive Mode (FSM). The reserve can also be used to fulfill future grid codes (GCs) requirements such as Power Ramp Limitation (PRL) during high slopes of irradiance.

  5. Cost Overruns in Large-scale Transportation Infrastructure Projects

    DEFF Research Database (Denmark)

    Cantarelli, Chantal C; Flyvbjerg, Bent; Molin, Eric J. E

    2010-01-01

    Managing large-scale transportation infrastructure projects is difficult due to frequent misinformation about the costs, which results in large cost overruns that often threaten overall project viability. This paper investigates the explanations for cost overruns that are given in the literature....... Overall, four categories of explanations can be distinguished: technical, economic, psychological, and political. Political explanations have been seen to be the most dominant explanations for cost overruns. Agency theory is considered the most interesting for political explanations and an eclectic theory...

  6. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
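    As a toy illustration of the grouping idea described above, the sketch below greedily groups cab requests whose pickup points lie close together, up to a fixed cab capacity. The distance threshold, capacity, and coordinate format are illustrative assumptions; the paper's scalable trip-grouping algorithms are considerably more sophisticated:

```python
def group_requests(requests, max_dist=1.0, capacity=3):
    """Greedily assign each (x, y) pickup request to the first group
    whose seed request lies within `max_dist`, up to `capacity`
    riders; otherwise start a new group (i.e. a new cab)."""
    groups = []
    for x, y in requests:
        for g in groups:
            gx, gy = g[0]  # the group's seed request
            if len(g) < capacity and ((x - gx) ** 2 + (y - gy) ** 2) ** 0.5 <= max_dist:
                g.append((x, y))
                break
        else:
            groups.append([(x, y)])
    return groups

# Three close-by requests share one cab; the distant one gets its own
reqs = [(0.0, 0.0), (0.3, 0.4), (5.0, 5.0), (0.2, 0.1)]
grps = group_requests(reqs)
```

    Scaling this to city-sized request streams is exactly where the naive all-pairs comparison breaks down, motivating the paper's generalized algorithms.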

  7. Facilitating dynamo action via control of large-scale turbulence.

    Science.gov (United States)

    Limone, A; Hatch, D R; Forest, C B; Jenko, F

    2012-12-01

    The magnetohydrodynamic dynamo effect is considered to be the major cause of magnetic field generation in geo- and astrophysical systems. Recent experimental and numerical results show that turbulence constitutes an obstacle to dynamos; yet its role in this context is not totally clear. Via numerical simulations, we identify large-scale turbulent vortices with a detrimental effect on the amplification of the magnetic field in a geometry of experimental interest and propose a strategy for facilitating the dynamo instability by manipulating these detrimental "hidden" dynamics.

  8. Large-Scale Patterns of Filament Channels and Filaments

    Science.gov (United States)

    Mackay, Duncan

    2016-07-01

    In this review the properties and large-scale patterns of filament channels and filaments will be considered. Initially, the global formation locations of filament channels and filaments are discussed, along with their hemispheric pattern. Next, observations of the formation of filament channels and filaments are described where two opposing views are considered. Finally, the wide range of models that have been constructed to consider the formation of filament channels and filaments over long time-scales are described, along with the origin of the hemispheric pattern of filaments.

  9. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    , among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the nongray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609 MW utility boiler is numerically studied, in which....... The simulation results show that the gray and non-gray calculations of the same oxy-fuel WSGGM make distinctly different predictions in the wall radiative heat transfer, incident radiative flux, radiative source, gas temperature and species profiles. Relative to the non-gray implementation, the gray...

  10. Adiabatic hyperspherical approach to large-scale nuclear dynamics

    CERN Document Server

    Suzuki, Yasuyuki

    2015-01-01

    We formulate a fully microscopic approach to large-scale nuclear dynamics using a hyperradius as a collective coordinate. An adiabatic potential is defined by taking account of all possible configurations at a fixed hyperradius, and its hyperradius dependence plays a key role in governing the global nuclear motion. In order to go to larger systems beyond few-body systems, we suggest basis functions of a microscopic multicluster model, propose a method for calculating matrix elements of an adiabatic Hamiltonian with use of Fourier transforms, and test its effectiveness.

  11. Laser Welding of Large Scale Stainless Steel Aircraft Structures

    Science.gov (United States)

    Reitemeyer, D.; Schultz, V.; Syassen, F.; Seefeld, T.; Vollertsen, F.

    In this paper a welding process for large scale stainless steel structures is presented. The process was developed according to the requirements of an aircraft application; stringers are therefore welded onto a skin sheet in a T-joint configuration. The 0.6 mm thick parts are welded with a thin-disc laser, and seam lengths up to 1920 mm are demonstrated. The welding process causes angular distortions of the skin sheet, which are compensated by a subsequent laser straightening process. Based on a model, straightening process parameters matching the induced welding distortion are predicted. The process combination is successfully applied to stringer-stiffened specimens.

  12. Less is more: regularization perspectives on large scale machine learning

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Deep learning based techniques provide a possible solution at the expense of theoretical guidance and, especially, of computational requirements. It is then a key challenge for large scale machine learning to devise approaches guaranteed to be accurate and yet computationally efficient. In this talk, we will consider a regularization perspective on machine learning, appealing to classical ideas in linear algebra and inverse problems to dramatically scale up nonparametric methods such as kernel methods, often dismissed because of prohibitive costs. Our analysis derives optimal theoretical guarantees while providing experimental results on par with or outperforming state-of-the-art approaches.

  13. Search for Large Scale Anisotropies with the Pierre Auger Observatory

    Science.gov (United States)

    Bonino, R.; Pierre Auger Collaboration

    The Pierre Auger Observatory studies the nature and the origin of Ultra High Energy Cosmic Rays (> 3×10¹⁸ eV). Completed at the end of 2008, it has been continuously operating for more than six years. Using data collected from 1 January 2004 until 31 March 2009, we search for large scale anisotropies with two complementary analyses in different energy windows. No significant anisotropies are observed, resulting in bounds on the first harmonic amplitude at the 1% level at EeV energies.
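    The first harmonic amplitude bounded above is, in its simplest form, the classical Rayleigh amplitude of the arrival-direction angles. The sketch below shows that textbook estimator (the Auger analyses use refined, exposure-corrected variants; the variable names and sample sizes here are illustrative):

```python
import math
import random

def first_harmonic_amplitude(phis):
    """Classical Rayleigh analysis: amplitude of the first harmonic
    of a set of right-ascension angles given in radians,
    r = sqrt(a^2 + b^2) with a = (2/N) sum cos(phi), b = (2/N) sum sin(phi)."""
    n = len(phis)
    a = 2.0 / n * sum(math.cos(p) for p in phis)
    b = 2.0 / n * sum(math.sin(p) for p in phis)
    return math.sqrt(a * a + b * b)

random.seed(1)
# For an isotropic sky the amplitude shrinks roughly like 1/sqrt(N)
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100000)]
r_iso = first_harmonic_amplitude(iso)
# A fully clustered sample (all events at phi = 0) gives the maximum r = 2
```

    Bounding r at the 1% level, as in the abstract, means no dipole-like modulation larger than that is compatible with the data.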

  14. Inflation in de Sitter spacetime and CMB large scales anomaly

    CERN Document Server

    Zhao, Dong; Wang, Ping; Chang, Zhe

    2014-01-01

    The influence of cosmological constant type dark energy in the early universe is investigated. This is accommodated by a new dispersion relation in de Sitter spacetime. We perform a global fitting to explore the cosmological parameters space by using the CosmoMC package with the recently released Planck TT and WMAP Polarization datasets. Using the results from global fitting, we compute a new CMB temperature-temperature spectrum. The obtained TT spectrum has lower power compared with the one based on $\\Lambda$CDM model at large scales.

  15. Destruction of Be star disk by large scale magnetic fields

    Science.gov (United States)

    Ud-Doula, Asif; Owocki, Stanley P.; Kee, Nathaniel; Vanyo, Michael

    2017-01-01

    Classical Be stars are rapidly rotating stars with circumstellar disks that come and go on time scales of years. Recent observational data strongly suggest that these stars lack the ~10% incidence of global magnetic fields observed in other main-sequence B stars. Such an apparent lack of magnetic fields may indicate that Be disks are fundamentally incompatible with a significant large scale magnetic field. In this work, using numerical magnetohydrodynamics (MHD) simulations, we show that a dipole field of only 100 G can lead to the quick disruption of a Be disk. Such a limit is in line with the observational upper limits for these objects.

  16. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Full Text Available Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. Data at this scale poses serious challenges for storage and computing technologies. Cloud computing is a promising alternative because it jointly addresses the storage of, and high-performance computing on, large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make its vast amounts of diverse data meaningful and usable.

  17. Large scale obscuration and related climate effects open literature bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  18. An Atmospheric Large-Scale Cold Plasma Jet

    Institute of Scientific and Technical Information of China (English)

    吕晓桂; 任春生; 马腾才; 冯岩; 王德真

    2012-01-01

    This letter reports on the generation and characteristics of a large-scale dielectric barrier discharge plasma jet at atmospheric pressure. With appropriate parameters, a diffuse plasma with a 50×5 mm^2 cross-sectional area is obtained. The characteristics of the discharges are diagnosed by electrical and optical methods. In addition to being generated in helium, plasma is also generated in a mixture of helium and oxygen. The experimental results show that the atomic oxygen emission intensity (3p^5P→3s^5S and 3p^3P→3s^3S transitions) is not proportional to the fraction of oxygen in the gas mixture.

  19. An iterative decoupling solution method for large scale Lyapunov equations

    Science.gov (United States)

    Athay, T. M.; Sandell, N. R., Jr.

    1976-01-01

    A great deal of attention has been given to the numerical solution of the Lyapunov equation. A useful classification of the variety of solution techniques is the grouping into direct, transformation, and iterative methods. The paper summarizes those methods that are at least partly favorable numerically, giving special attention to two criteria: exploitation of a general sparse system-matrix structure, and efficiency in re-solving the governing linear matrix equation for different matrices. An iterative decoupling solution method is proposed as a promising approach for solving large-scale Lyapunov equations when the system matrix exhibits a general sparse structure. A Fortran computer program that implements the iterative decoupling algorithm is also discussed.
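The paper's specific iterative decoupling algorithm is not reproduced here, but the flavor of iterative Lyapunov solvers can be conveyed with Smith's method, a standard fixed-point scheme obtained via a Cayley transform (a generic sketch under the assumption that A is Hurwitz, not the authors' algorithm):

```python
import numpy as np

def lyap_smith(A, Q, q=1.0, tol=1e-10, max_iter=500):
    """Solve A X + X A^T + Q = 0 for Hurwitz A by Smith's iteration.

    The Cayley transform U = (A - qI)^{-1}(A + qI) maps the stable
    half-plane into the unit disk, turning the continuous Lyapunov
    equation into the discrete fixed point X = U X U^T + V, which is
    then iterated to convergence.
    """
    n = A.shape[0]
    I = np.eye(n)
    Aq = A - q * I
    U = np.linalg.solve(Aq, A + q * I)
    W = np.linalg.solve(Aq, Q)                 # (A - qI)^{-1} Q
    V = 2.0 * q * np.linalg.solve(Aq, W.T).T   # 2q (A-qI)^{-1} Q (A-qI)^{-T}
    X = V.copy()
    for _ in range(max_iter):
        Xn = U @ X @ U.T + V
        if np.linalg.norm(Xn - X, 'fro') < tol:
            return Xn
        X = Xn
    return X
```

Because each sweep only needs matrix products with U, such iterations can exploit sparsity in the system matrix, which is exactly the setting the abstract targets.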

  20. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  1. Large-scale biophysical evaluation of protein PEGylation effects

    DEFF Research Database (Denmark)

    Vernet, Erik; Popa, Gina; Pozdnyakova, Irina

    2016-01-01

    PEGylation is the most widely used method to chemically modify protein biopharmaceuticals, but surprisingly limited public data is available on the biophysical effects of protein PEGylation. Here we report the first large-scale study, with site-specific mono-PEGylation of 15 different proteins...... and characterization of 61 entities in total using a common set of analytical methods. Predictions of molecular size were typically accurate in comparison with actual size determined by size-exclusion chromatography (SEC) or dynamic light scattering (DLS). In contrast, there was no universal trend regarding the effect...

  2. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in a large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in the daily production. In the kitchen...... not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results...... show that Lean can be applied and used to manage the production of meals in the kitchen....

  3. Measuring Large-Scale Social Networks with High Resolution

    DEFF Research Database (Denmark)

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation...

  4. Quantum computation for large-scale image classification

    Science.gov (United States)

    Ruan, Yue; Chen, Hanwu; Tan, Jianing; Li, Xi

    2016-10-01

    Due to the lack of an effective quantum feature extraction method, there is currently no effective way to perform quantum image classification or recognition. In this paper, for the first time, a global quantum feature extraction method based on Schmidt decomposition is proposed. A revised quantum learning algorithm is also proposed that will classify images by computing the Hamming distance of these features. From the experimental results derived from the benchmark database Caltech 101, and an analysis of the algorithm, an effective approach to large-scale image classification is derived and proposed against the background of big data.
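The classification step described here, assigning a query image the label of the training example with the smallest Hamming distance between binary feature vectors, can be sketched classically as follows (a generic nearest-neighbour illustration; the paper's quantum feature extraction and quantum algorithm are not reproduced):

```python
import numpy as np

def hamming_classify(train_feats, train_labels, query):
    """Nearest-neighbour classification by Hamming distance.

    Features are binary vectors (a classical stand-in for the quantum
    features in the abstract); the query receives the label of the
    training vector with the fewest differing bits.
    """
    dists = (train_feats != query).sum(axis=1)   # Hamming distances
    return train_labels[int(np.argmin(dists))]
```

The quantum speed-up in such schemes comes from evaluating these distances over superposed feature states rather than one training vector at a time.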

  5. Development of large-scale functional brain networks in children.

    Directory of Open Access Journals (Sweden)

    Kaustubh Supekar

    2009-07-01

    Full Text Available The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y and 22 young-adults (ages 19-22 y. Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.

  6. Simulating the Large-Scale Structure of HI Intensity Maps

    CERN Document Server

    Seehars, Sebastian; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2015-01-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations, the halo model, and a phenomenological prescription for assigning HI mass to halos. The simulations span a redshift range of 0.35 < z < 0.9 in redshift bins of width $\\Delta z \\approx 0.05$ and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects on the angular clustering of HI. We apply and compare several estimators for the angular power spectrum and its covariance. We verify that they agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.

  7. Simulating the large-scale structure of HI intensity maps

    Science.gov (United States)

    Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048^3 particles (particle mass 1.6 × 10^11 Msolar/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10^8 Msolar/h), we assign HI to those halos according to a phenomenological halo-to-HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.

  8. Large-scale stabilization control of input-constrained quadrotor

    Directory of Open Access Journals (Sweden)

    Jun Jiang

    2016-10-01

    Full Text Available The quadrotor has been the most popular aircraft in the last decade due to its excellent dynamics and continues to attract ever-increasing research interest. Delivering a quadrotor from a large fixed-wing aircraft is a promising application of quadrotors. In such an application, the quadrotor needs to switch from a highly unstable status, featured as large initial states, to a safe and stable flight status. This is the so-called large-scale stability control problem. In such an extreme scenario, the quadrotor is at risk of actuator saturation. This can cause the controller to update incorrectly and lead the quadrotor to spiral and crash. In this article, to safely control the quadrotor in such scenarios, the control input constraint is analyzed. The key states of a quadrotor dynamic model are selected, and a two-dimensional dynamic model is extracted based on a symmetrical body configuration. A generalized point-wise min-norm nonlinear control method is proposed based on the Lyapunov function, and large-scale stability control is hence achieved. An enhanced point-wise min-norm control is further provided to improve the attitude control performance, with altitude performance degrading slightly. Simulation results showed that the proposed control methods can stabilize the input-constrained quadrotor and the enhanced method can improve the performance of the quadrotor in critical states.

  9. Maestro: an orchestration framework for large-scale WSN simulations.

    Science.gov (United States)

    Riliskis, Laurynas; Osipov, Evgeny

    2014-03-18

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.

  10. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.

  11. Large Scale Land Acquisition as a driver of slope instability

    Science.gov (United States)

    Danilo Chiarelli, Davide; Rulli, Maria Cristina; Davis, Kyle F.; D'Odorico, Paolo

    2017-04-01

    Forests play a key role in preventing shallow landslides, and deforestation has been identified as one of the main causes of increased mass wasting on hillslopes undergoing land cover change. In recent years, vast tracts of land have been acquired by foreign investors to satisfy an increasing demand for agricultural products. Large Scale Land Acquisitions (LSLAs) often entail the conversion of forested landscapes into agricultural fields. Mozambique has been a major target of LSLAs, and there is evidence that much of the acquired land has recently undergone forest clearing. The Zambezia Province in Mozambique lost more than 500,000 ha of forest from 2000 to 2014; 25.4% of this loss occurred in areas acquired by large-scale land investors. According to Land Matrix, an open-source database of reported land deals, there are currently 123 intended and confirmed deals in Mozambique; collectively, they account for 2.34 million ha, the majority of which are located in forested areas. This study analyses the relationship between deforestation taking place inside LSLA areas (usually for agricultural purposes) and the likelihood of landslide occurrence in the Zambezia Province of Mozambique. To this aim we use a spatially distributed, physically based model that couples slope stability analysis with a hillslope-scale hydrological model, and we compare the change in slope stability associated with the forest loss documented by satellite imagery.

  12. High Speed Networking and Large-scale Simulation in Geodynamics

    Science.gov (United States)

    Kuang, Weijia; Gary, Patrick; Seablom, Michael; Truszkowski, Walt; Odubiyi, Jide; Jiang, Weiyuan; Liu, Dong

    2004-01-01

    Large-scale numerical simulation has been one of the most important approaches for understanding global geodynamical processes. In this approach, peta-scale floating point operations (pflops) are often required to carry out a single physically-meaningful numerical experiment. For example, to model convective flow in the Earth's core and generation of the geomagnetic field (geodynamo), simulation for one magnetic free-decay time (approximately 15000 years) with a modest resolution of 150 in three spatial dimensions would require approximately 0.2 pflops. If such a numerical model is used to predict geomagnetic secular variation over decades and longer, with e.g. an ensemble Kalman filter assimilation approach, approximately 30 (and perhaps more) independent simulations of similar scales would be needed for one data assimilation analysis. Obviously, such a simulation would require an enormous computing resource that exceeds the capacity of a single facility currently available at our disposal. One solution is to utilize a very fast network (e.g. 10Gb optical networks) and available middleware (e.g. Globus Toolkit) to allocate available but often heterogeneous resources for such large-scale computing efforts. At NASA GSFC, we are experimenting with such an approach by networking several clusters for geomagnetic data assimilation research. We shall present our initial testing results in the meeting.

  13. Literature Review: Herbal Medicine Treatment after Large-Scale Disasters.

    Science.gov (United States)

    Takayama, Shin; Kaneko, Soichiro; Numata, Takehiro; Kamiya, Tetsuharu; Arita, Ryutaro; Saito, Natsumi; Kikuchi, Akiko; Ohsawa, Minoru; Kohayagawa, Yoshitaka; Ishii, Tadashi

    2017-09-27

    Large-scale natural disasters, such as earthquakes, tsunamis, volcanic eruptions, and typhoons, occur worldwide. After the Great East Japan earthquake and tsunami, our medical support operation's experiences suggested that traditional medicine might be useful for treating the various symptoms of the survivors. However, little information is available regarding herbal medicine treatment in such situations. Considering that further disasters will occur, we performed a literature review and summarized the traditional medicine approaches for treatment after large-scale disasters. We searched PubMed and Cochrane Library for articles written in English, and Ichushi for those written in Japanese. Articles published before 31 March 2016 were included. Keywords "disaster" and "herbal medicine" were used in our search. Among studies involving herbal medicine after a disaster, we found two randomized controlled trials investigating post-traumatic stress disorder (PTSD), three retrospective investigations of trauma or common diseases, and seven case series or case reports of dizziness, pain, and psychosomatic symptoms. In conclusion, herbal medicine has been used to treat trauma, PTSD, and other symptoms after disasters. However, few articles have been published, likely due to the difficulty in designing high quality studies in such situations. Further study will be needed to clarify the usefulness of herbal medicine after disasters.

  14. The effect of large scale inhomogeneities on the luminosity distance

    Science.gov (United States)

    Brouzakis, Nikolaos; Tetradis, Nikolaos; Tzavara, Eleftheria

    2007-02-01

    We study the form of the luminosity distance as a function of redshift in the presence of large-scale inhomogeneities, with sizes of order 10 Mpc or larger. We approximate the Universe through the Swiss-cheese model, with each spherical region described by the Lemaître-Tolman-Bondi metric. We study the propagation of light beams in this background, assuming that the locations of the source and the observer are random. We derive the optical equations for the evolution of the beam area and shear. Through their integration we determine the configurations that can lead to an increase of the luminosity distance relative to the homogeneous cosmology. We find that this can be achieved if the Universe is composed of spherical void-like regions, with matter concentrated near their surface. For inhomogeneities consistent with the observed large scale structure, the relative increase of the luminosity distance is of the order of a few per cent at redshifts near 1, and falls short of explaining the substantial increase required by the supernova data. On the other hand, the effect we describe is important for the correct determination of the energy content of the Universe from observations.

  15. Large Scale and Performance tests of the ATLAS Online Software

    Institute of Scientific and Technical Information of China (English)

    Alexandrov; H. Wolters; et al.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs, in a configuration approaching the final size. Large-scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems was emulated. This paper presents a brief overview of the online system structure, its components, and the large-scale integration tests and their results.

  16. Modulation of energetic coherent motions by large-scale topography

    Science.gov (United States)

    Lai, Wing; Hamed, Ali M.; Troolin, Dan; Chamorro, Leonardo P.

    2016-11-01

    The distinctive characteristics and dynamics of the large-scale coherent motions induced over 2D and 3D large-scale wavy walls were explored experimentally with time-resolved volumetric PIV, and selected wall-normal high-resolution stereo PIV, in a refractive-index-matching channel. The 2D wall consists of a sinusoidal wave in the streamwise direction with amplitude-to-wavelength ratio a/λx = 0.05, while the 3D wall has an additional wave in the spanwise direction with a/λy = 0.1. The flow was characterized at Re ≈ 8000, based on the bulk velocity and the channel half height. The walls are such that the amplitude-to-boundary-layer-thickness ratio is a/δ99 ≈ 0.1, which resembles geophysical-like topography. Insight into the dynamics of the coherent motions, Reynolds stress, and the spatial interaction of sweep and ejection events will be discussed in terms of the wall topography modulation.

  17. Large Scale Organization of a Near Wall Turbulent Boundary Layer

    Science.gov (United States)

    Stanislas, Michel; Dekou Tiomajou, Raoul Florent; Foucaut, Jean Marc

    2016-11-01

    This study lies in the context of large-scale coherent structure investigation in a near-wall turbulent boundary layer. An experimental database at high Reynolds numbers (Reθ = 9830 and Reθ = 19660) was obtained in the LML wind tunnel with stereo-PIV at 4 Hz and hot-wire anemometry at 30 kHz. A Linear Stochastic Estimation procedure is used to reconstruct a three-component field resolved in space and time. Algorithms were developed to extract coherent structures from the reconstructed field. A sample 3D view of the structures is depicted in Figure 1. Uniform momentum regions are characterized by their mean hydraulic diameter in the YZ plane, their lifetime, and their contribution to the Reynolds stresses. The vortical motions are characterized by their position, radius, circulation and vorticity, in addition to their lifetime and their number computed at a fixed position from the wall. The spatial organization of the structures was investigated through a correlation of their respective indicator functions in the spanwise direction. The simplified large-scale model that arises is compared to the ones available in the literature. Figure 1: streamwise low (green) and high (yellow) uniform momentum regions with positive (red) and negative (blue) vortical motions. This work was supported by Campus International pour la Sécurité et l'Intermodalité des Transports.

  18. Large-scale direct shear testing of geocell reinforced soil

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The tests on the shear properties of geocell-reinforced soils were carried out using large-scale direct shear equipment with shear-box dimensions of 500 mm×500 mm×400 mm (length×width×height). Three types of specimens, silty gravel soil, geocell-reinforced silty gravel soil, and geocell-reinforced cement-stabilized silty gravel soil, were used to investigate the shear stress-displacement behavior, the shear strength and the strengthening mechanism of geocell-reinforced soils. Comparisons of the large-scale shear tests with triaxial compression tests on the same type of soil were also conducted to evaluate the influence of the testing method on the shear strength. The test results show that the unreinforced soil and the geocell-reinforced soil exhibit similar nonlinear behavior of shear stress versus displacement. The geocell-reinforced cement-stabilized soil shows a quasi-elastic characteristic for normal stresses up to 1.0 GPa. The tests with geocell reinforcement show an increase of 244% in cohesion, and the tests with geocell and cement stabilization show a tenfold increase in cohesion compared with the unreinforced soil. The friction angle does not change markedly. Thus geocell reinforcement develops a large amount of cohesion in the shear strength of soils.

  19. Brief Mental Training Reorganizes Large-Scale Brain Networks.

    Science.gov (United States)

    Tang, Yi-Yuan; Tang, Yan; Tang, Rongxiang; Lewis-Peacock, Jarrod A

    2017-01-01

    Emerging evidence has shown that one form of mental training, mindfulness meditation, can improve attention, emotion regulation and cognitive performance by changing brain activity and structural connectivity. However, whether and how short-term mindfulness meditation alters large-scale brain networks is not well understood. Here, we applied a novel data-driven technique, multivariate pattern analysis (MVPA), to resting-state fMRI (rsfMRI) data to identify changes in brain activity patterns and assess the neural mechanisms induced by a brief mindfulness training, integrative body-mind training (IBMT), which was previously reported in our series of randomized studies. Whole-brain rsfMRI was performed on an undergraduate group who received 2 weeks of IBMT with 30 min per session (5 h of training in total). Classifiers were trained on measures of functional connectivity in these fMRI data, and they were able to reliably differentiate (with 72% accuracy) patterns of connectivity from before vs. after the IBMT training. After training, an increase in positive functional connections (60 connections) was detected, primarily involving the bilateral superior/middle occipital gyrus, bilateral frontal operculum, bilateral superior temporal gyrus, right superior temporal pole, bilateral insula, caudate and cerebellum. These results suggest that brief mental training alters the functional connectivity of large-scale brain networks at rest, which may involve a portion of the neural circuitry supporting attention, cognitive and affective processing, awareness, sensory integration and reward processing.

  20. Large-scale Direct Targeting for Drug Repositioning and Discovery

    Science.gov (United States)

    Zheng, Chunli; Guo, Zihu; Huang, Chao; Wu, Ziyin; Li, Yan; Chen, Xuetong; Fu, Yingxue; Ru, Jinlong; Ali Shar, Piar; Wang, Yuan; Wang, Yonghua

    2015-01-01

    A system-level identification of direct drug-target interactions is vital to drug repositioning and discovery. However, establishing them biologically on a large scale remains challenging and expensive even today. The available computational models mainly focus on predicting indirect interactions, or direct interactions on a small scale. To address these problems, in this work a novel algorithm termed weighted ensemble similarity (WES) has been developed to identify direct drug targets based on a large-scale set of 98,327 drug-target relationships. WES includes: (1) identifying the key ligand structural features that are highly related to the pharmacological properties in an ensemble framework; (2) determining a drug's affiliation with a target by evaluating the overall (ensemble) similarity rather than a single-ligand judgment; and (3) integrating the standardized ensemble similarities (Z scores) by Bayesian network and multi-variate kernel approaches to make predictions. All these lead WES to predict direct drug targets with external and experimental test accuracies of 70% and 71%, respectively. This shows that the WES method provides a potential in silico model for drug repositioning and discovery. PMID:26155766
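The idea of scoring a drug against a target by a standardized ensemble similarity, rather than a single-ligand comparison, can be illustrated with a minimal sketch (hypothetical names and toy background statistics; not the authors' WES implementation, which further integrates scores via Bayesian networks and kernel methods):

```python
import numpy as np

def ensemble_z_score(sims, background_mean, background_std):
    """Standardized ensemble similarity for one drug-target pair.

    sims: similarities of a query drug to the ensemble of known
    ligands of one target. The ensemble score (here the mean) is
    standardized against a background distribution so that scores
    are comparable across targets.
    """
    ensemble = float(np.mean(sims))   # overall, not single-ligand, judgment
    return (ensemble - background_mean) / background_std

def rank_targets(query_sims_by_target, background):
    """Rank candidate targets for a drug by standardized ensemble similarity."""
    scores = {t: ensemble_z_score(s, *background[t])
              for t, s in query_sims_by_target.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

Standardizing per target keeps promiscuous targets with generically high ligand similarity from dominating the ranking.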

  1. Large-scale magnetic fields in magnetohydrodynamic turbulence.

    Science.gov (United States)

    Alexakis, Alexandros

    2013-02-22

    High-Reynolds-number magnetohydrodynamic turbulence in the presence of zero-flux large-scale magnetic fields is investigated as a function of the magnetic field strength. For a variety of flow configurations, the energy dissipation rate ε follows the scaling ε ∝ U_rms^3/ℓ even when the large-scale magnetic field energy is twenty times larger than the kinetic energy. A further increase of the magnetic energy showed a transition to the scaling ε ∝ U_rms^2 B_rms/ℓ, implying that at this point magnetic shear becomes more efficient than the velocity fluctuations at cascading the energy. Strongly helical configurations form nonturbulent helicity condensates that deviate from these scalings. Weak-turbulence scaling was absent from the investigation. Finally, the magnetic energy spectra support the Kolmogorov spectrum k^(-5/3), while kinetic energy spectra are closer to the Iroshnikov-Kraichnan spectrum k^(-3/2), as observed in the solar wind.

  2. Using Large Scale Structure to test Multifield Inflation

    CERN Document Server

    Ferraro, Simone

    2014-01-01

    Primordial non-Gaussianity of local type is known to produce a scale-dependent contribution to the galaxy bias. Several classes of multi-field inflationary models predict non-Gaussian bias which is stochastic, in the sense that dark matter and halos don't trace each other perfectly on large scales. In this work, we forecast the ability of next-generation Large Scale Structure surveys to constrain common types of primordial non-Gaussianity like $f_{NL}$, $g_{NL}$ and $\\tau_{NL}$ using halo bias, including stochastic contributions. We provide fitting functions for statistical errors on these parameters which can be used for rapid forecasting or survey optimization. A next-generation survey with volume $V = 25 h^{-3}$Mpc$^3$, median redshift $z = 0.7$ and mean bias $b_g = 2.5$, can achieve $\\sigma(f_{NL}) = 6$, $\\sigma(g_{NL}) = 10^5$ and $\\sigma(\\tau_{NL}) = 10^3$ if no mass information is available. If halo masses are available, we show that optimally weighting the halo field in order to reduce sample variance...

  3. Large-scale magnetic topologies of mid-M dwarfs

    CERN Document Server

    Morin, J; Petit, P; Delfosse, X; Forveille, T; Albert, L; Aurière, M; Cabanac, R; Dintrans, B; Fares, R; Gastine, T; Jardine, M M; Lignières, F; Paletou, F; Velez, J C Ramirez; Théado, S

    2008-01-01

    We present in this paper the first results of a spectropolarimetric analysis of a small sample (~ 20) of active stars ranging from spectral type M0 to M8, which are either fully convective or possess a very small radiative core. This study aims at providing new constraints on dynamo processes in fully convective stars. The present paper focuses on 5 stars of spectral type ~M4, i.e. with masses close to the full convection threshold (~ 0.35 Msun), and with short rotational periods. Tomographic imaging techniques allow us to reconstruct the surface magnetic topologies from the rotationally modulated time-series of circularly polarised profiles. We find that all stars host mainly axisymmetric large-scale poloidal fields. Three stars were observed at two different epochs separated by ~1 yr; we find the magnetic topologies to be globally stable on this timescale. We also provide an accurate estimation of the rotational period of all stars, thus allowing us to start studying how rotation impacts the large-scale magn...

  4. Exploring Cloud Computing for Large-scale Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Han, Binh; Yin, Jian; Gorton, Ian

    2013-06-27

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  5. The Impact of Large Scale Environments on Cluster Entropy Profiles

    Science.gov (United States)

    Trierweiler, Isabella; Su, Yuanyuan

    2017-01-01

    We perform a systematic analysis of 21 clusters imaged by the Suzaku satellite to determine the relation between the richness of cluster environments and entropy at large radii. Entropy profiles for clusters are expected to follow a power-law, but Suzaku observations show that the entropy profiles of many clusters are significantly flattened beyond 0.3 Rvir. While the entropy at the outskirts of clusters is thought to be highly dependent on the large scale cluster environment, the exact nature of the environment/entropy relation is unclear. Using the Sloan Digital Sky Survey and 6dF Galaxy Survey, we study the 20 Mpc large scale environment for all clusters in our sample. We find no strong relation between the entropy deviations at the virial radius and the total luminosity of the cluster surroundings, indicating that accretion and mergers have a more complex and indirect influence on the properties of the gas at large radii. We see a possible anti-correlation between virial temperature and richness of the cluster environment and find that density excess appears to play a larger role in the entropy flattening than temperature, suggesting that clumps of gas can lower entropy.

  6. Solving Large Scale Structure in Ten Easy Steps with COLA

    CERN Document Server

    Tassev, Svetlin; Eisenstein, Daniel

    2013-01-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100Mpc/h with particles of mass ~5*10^9Msolar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11Msolar/h. This is only at a modest speed penalty when compared to mocks obt...

  7. IP over optical multicasting for large-scale video delivery

    Science.gov (United States)

    Jin, Yaohui; Hu, Weisheng; Sun, Weiqiang; Guo, Wei

    2007-11-01

    In IPTV systems, multicasting will play a crucial role in the delivery of high-quality video services, as it can significantly improve bandwidth efficiency. However, the scalability and the signal quality of current IPTV can barely compete with existing broadcast digital TV systems, since it is difficult to implement large-scale multicasting with end-to-end guaranteed quality of service (QoS) in a packet-switched IP network. The China 3TNet project aimed to build a high-performance broadband trial network to support large-scale concurrent streaming media and interactive multimedia services. The innovative idea of 3TNet is that an automatically switched optical network (ASON) with the capability of dynamic point-to-multipoint (P2MP) connections replaces the conventional IP multicasting network in the transport core, while the edge remains an IP multicasting network. In this paper, we introduce the network architecture and discuss challenges in such IP-over-optical multicasting for video delivery.

  8. A Novel Approach Towards Large Scale Cross-Media Retrieval

    Institute of Scientific and Technical Information of China (English)

    Bo Lu; Guo-Ren Wang; Ye Yuan

    2012-01-01

    With the rapid development of the Internet and multimedia technology, cross-media retrieval aims to retrieve all the related media objects across modalities in response to a query media object. Unfortunately, the complexity and heterogeneity of multiple modalities pose two major challenges for cross-media retrieval: 1) how to construct a unified and compact model for media objects with multiple modalities, and 2) how to improve the retrieval performance for large-scale cross-media databases. In this paper, we propose a novel method dedicated to solving these issues and achieving effective and accurate cross-media retrieval. Firstly, a multi-modality semantic relationship graph (MSRG) is constructed using the semantic correlation amongst the media objects of multiple modalities. Secondly, all the media objects in the MSRG are mapped onto an isomorphic semantic space. Further, an efficient index, the MK-tree, based on heterogeneous data distribution is proposed to manage the media objects within the semantic space and improve the performance of cross-media retrieval. Extensive experiments on real large-scale cross-media datasets indicate that our proposal dramatically improves the accuracy and efficiency of cross-media retrieval, significantly outperforming the existing methods.

  9. Large scale petroleum reservoir simulation and parallel preconditioning algorithms research

    Institute of Scientific and Technical Information of China (English)

    SUN Jiachang; CAO Jianwen

    2004-01-01

    Solving large-scale linear systems efficiently plays an important role in a petroleum reservoir simulator, and the key part is how to choose an effective parallel preconditioner. Properly choosing a good preconditioner goes beyond the purely algebraic domain. An integrated preconditioner should include such components as the physical background, the characteristics of the PDE mathematical model, the nonlinear solving method, the linear solving algorithm, domain decomposition and parallel computation. We first discuss some parallel preconditioning techniques, and then construct an integrated, reservoir-simulation-oriented preconditioner based on large-scale distributed parallel processing. The infrastructure of this preconditioner contains such well-known preconditioning construction techniques as coarse-grid correction, constraint residual correction and subspace projection correction. We essentially use a multi-step approach to integrate a total of eight types of preconditioning components into the final preconditioner. Industrial reservoir data sets at the scale of millions of grid cells were tested on high-performance computers. Numerical statistics and analyses show that this preconditioner achieves satisfactory parallel efficiency and acceleration.
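
    At the core of any such integrated preconditioner sits a preconditioned Krylov solver. The sketch below shows the simplest building block, a Jacobi (diagonal) preconditioned conjugate gradient in pure Python on a toy 1D Laplacian system; in the paper's setting the diagonal preconditioner would be replaced by the multi-step composition of coarse-grid, constraint-residual and subspace-projection corrections. This is an illustrative sketch, not the authors' solver.

```python
def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    """Conjugate gradient for SPD systems with a diagonal preconditioner."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                                   # residual for x = 0
    z = [M_inv_diag[i] * r[i] for i in range(n)]   # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if max(abs(v) for v in r) < tol:
            break
        z = [M_inv_diag[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        beta = rz_new / rz
        rz = rz_new
        p = [z[i] + beta * p[i] for i in range(n)]
    return x

# Toy SPD system: 1D Laplacian, tridiagonal (-1, 2, -1), right-hand side of ones
n = 5
A = [[2.0 if i == j else -1.0 if abs(i - j) == 1 else 0.0
      for j in range(n)] for i in range(n)]
b = [1.0] * n
x = pcg(A, b, [1.0 / A[i][i] for i in range(n)])
```

    For this system the exact solution is x = (2.5, 4, 4.5, 4, 2.5), which CG recovers in at most n iterations.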

  10. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    Science.gov (United States)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large-scale mapping of limited areas, especially cultural heritage sites, things become critical. Optical and non-optical sensors have been developed to such sizes and weights that they can be lifted by unmanned aerial platforms, e.g. LiDAR units. At the same time there is increasing emphasis on solutions that enable users to access 3D information faster and more cheaply. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase of available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of the UAS technologies of today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As test data set, we use a rich optical and thermal data set from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, acquired using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  11. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large-scale observations hint at possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (the “alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is strongly constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large-scale cosmic anomalies.

  12. Large-scale columnar vortices in rotating turbulence

    Science.gov (United States)

    Yokoyama, Naoto; Takaoka, Masanori

    2016-11-01

    In rotating turbulence, flow structures are affected by the angular velocity of the system's rotation. When the angular velocity is small, a three-dimensional, statistically isotropic flow, which has the Kolmogorov spectrum over the whole inertial subrange, is formed. When the angular velocity increases, the flow becomes two-dimensional and anisotropic, and the energy spectrum follows a k^(-2) power law at small wavenumbers in addition to the Kolmogorov spectrum at large wavenumbers. When the angular velocity decreases, the flow returns to the isotropic state. It is numerically found that the transition between the isotropic and anisotropic flows is hysteretic: the critical angular velocity at which the flow transitions from the anisotropic state to the isotropic one differs from that of the reverse transition. It is also observed that the large-scale columnar structures in the anisotropic flow depend on the external force which maintains a statistically steady state. In some cases, small-scale anticyclonic structures are aligned in a columnar structure apart from the cyclonic Taylor column. The formation mechanism of the large-scale columnar structures will be discussed. This work was partially supported by JSPS KAKENHI.

  13. EVALUATING UNMANNED AERIAL PLATFORMS FOR CULTURAL HERITAGE LARGE SCALE MAPPING

    Directory of Open Access Journals (Sweden)

    A. Georgopoulos

    2016-06-01

    When it comes to large-scale mapping of limited areas, especially cultural heritage sites, things become critical. Optical and non-optical sensors have been developed to such sizes and weights that they can be lifted by unmanned aerial platforms, e.g. LiDAR units. At the same time there is increasing emphasis on solutions that enable users to access 3D information faster and more cheaply. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase of available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of the UAS technologies of today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As test data set, we use a rich optical and thermal data set from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, acquired using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  14. Evaluating large scale orthophotos derived from high resolution satellite imagery

    Science.gov (United States)

    Ioannou, Maria Teresa; Georgopoulos, Andreas

    2013-08-01

    For the purposes of a research project compiling the archaeological and environmental digital map of the island of Antiparos, the production of updated large-scale orthophotos was required. Hence suitable stereoscopic high-resolution satellite imagery was acquired. Two GeoEye-1 stereopairs were enough to cover this small island of the Cyclades complex in the central Aegean. For the orientation of the two stereopairs, numerous ground control points were determined using GPS observations; some of them would also serve as check points. The images were processed using commercial stereophotogrammetric software suitable for processing satellite stereoscopic imagery. The results of the orientations were evaluated, and the digital terrain model was produced using automated and manual procedures. The DTM was checked both internally and externally by comparison with other available DTMs. In this paper the procedures for producing the desired orthophotography are critically presented, and the final result is compared and evaluated for its accuracy, completeness and efficiency. The final product is also compared against the orthophotography produced by Ktimatologio S.A. using aerial images in 2007. The orthophotography produced has been evaluated metrically using the available check points, while qualitative evaluation has also been performed. The results are presented, and a critical assessment of the usability of satellite imagery for the production of large-scale orthophotos is attempted.

  15. A study of synthetic large scales in turbulent boundary layers

    Science.gov (United States)

    Duvvuri, Subrahmanyam; Luhar, Mitul; Barnard, Casey; Sheplak, Mark; McKeon, Beverley

    2013-11-01

    Synthetic spanwise-constant spatio-temporal disturbances are excited in a turbulent boundary layer through a spatially impulsive patch of dynamic wall-roughness. The downstream flow response is studied through hot wire anemometry, pressure measurements at the wall and direct measurements of wall-shear-stress made using a novel micro-machined capacitive floating element sensor. These measurements are phase-locked to the input perturbation to recover the synthetic large-scale motion and characterize its structure and wall signature. The phase relationship between the synthetic large scale and small scale activity provides further insights into the apparent amplitude modulation effect between them, and the dynamics of wall-bounded turbulent flows in general. Results from these experiments will be discussed in the context of the critical-layer behavior revealed by the resolvent analysis of McKeon & Sharma (J Fluid Mech, 2010), and compared with similar earlier work by Jacobi & McKeon (J Fluid Mech, 2011). Model predictions are shown to be in broad agreement with experiments. The support of AFOSR grant #FA 9550-12-1-0469, Resnick Institute Graduate Research Fellowship (S.D.) and Sandia Graduate Fellowship (C.B.) are gratefully acknowledged.

  16. Brief Mental Training Reorganizes Large-Scale Brain Networks

    Science.gov (United States)

    Tang, Yi-Yuan; Tang, Yan; Tang, Rongxiang; Lewis-Peacock, Jarrod A.

    2017-01-01

    Emerging evidence has shown that one form of mental training, mindfulness meditation, can improve attention, emotion regulation and cognitive performance by changing brain activity and structural connectivity. However, whether and how short-term mindfulness meditation alters large-scale brain networks is not well understood. Here, we applied a novel data-driven technique, multivariate pattern analysis (MVPA), to resting-state fMRI (rsfMRI) data to identify changes in brain activity patterns and assess the neural mechanisms induced by a brief mindfulness training, integrative body-mind training (IBMT), which was previously reported in our series of randomized studies. Whole-brain rsfMRI was performed on an undergraduate group who received 2 weeks of IBMT with 30 min per session (5 h of training in total). Classifiers were trained on measures of functional connectivity in these fMRI data, and they were able to reliably differentiate (with 72% accuracy) patterns of connectivity from before vs. after the IBMT training. After training, an increase in positive functional connections (60 connections) was detected, primarily involving bilateral superior/middle occipital gyrus, bilateral frontal operculum, bilateral superior temporal gyrus, right superior temporal pole, bilateral insula, caudate and cerebellum. These results suggest that brief mental training alters the functional connectivity of large-scale brain networks at rest, and that this may involve a portion of the neural circuitry supporting attention, cognitive and affective processing, awareness and sensory integration, and reward processing.
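
    The classification step, training a classifier on connectivity features and testing whether it separates pre- from post-training scans, can be illustrated with a toy stand-in. The study likely used a different classifier and cross-validation scheme; everything below (nearest-centroid rule, synthetic "connectivity" vectors with a positive post-training shift) is an assumed, minimal sketch.

```python
import random

def nearest_centroid_fit(X, y):
    """Compute one centroid per class from connectivity feature vectors."""
    centroids = {}
    for label in set(y):
        rows = [x for x, l in zip(X, y) if l == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Assign x to the class whose centroid is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lab: dist(centroids[lab], x))

# Synthetic "connectivity" vectors: post-training samples shifted positive
random.seed(0)
pre = [[random.gauss(0.0, 0.1) for _ in range(10)] for _ in range(20)]
post = [[random.gauss(0.2, 0.1) for _ in range(10)] for _ in range(20)]
X, y = pre + post, ["pre"] * 20 + ["post"] * 20
model = nearest_centroid_fit(X, y)
acc = sum(nearest_centroid_predict(model, x) == lab
          for x, lab in zip(X, y)) / len(X)  # high: the classes are separable
```

    With well-separated classes the classifier scores near-perfect training accuracy; the 72% figure in the study reflects the far noisier real fMRI data and held-out testing.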

  17. The large scale magnetic fields of thin accretion disks

    CERN Document Server

    Cao, Xinwu

    2013-01-01

    A large-scale magnetic field threading an accretion disk is a key ingredient in the jet formation model. The most attractive scenario for the origin of such a large-scale field is advection of the field by the gas in the accretion disk from the interstellar medium or a companion star. However, it has been realized that outward diffusion of the accreted field is fast compared with the inward accretion velocity in a geometrically thin accretion disk if the Prandtl number Pm is around unity. In this work, we revisit this problem, considering that the angular momentum of the disk is removed predominantly by magnetically driven outflows. The radial velocity of the disk is significantly increased due to the presence of the outflows. Using a simplified model for the vertical disk structure, we find that even moderately weak fields can cause sufficient angular momentum loss via a magnetic wind to balance outward diffusion. There are two equilibrium points, one at low field strengths corresponding to a plasma-bet...

  18. Summarizing Large-Scale Database Schema Using Community Detection

    Institute of Scientific and Technical Information of China (English)

    Xue Wang; Xuan Zhou; Shan Wang

    2012-01-01

    Schema summarization on large-scale databases is a challenge. In a typical large database schema, a great proportion of the tables are closely connected through a few high-degree tables. It is thus difficult to separate these tables into clusters that represent different topics. Moreover, as a schema can be very big, the schema summary needs to be structured into multiple levels to further improve usability. In this paper, we introduce a new schema summarization approach utilizing techniques of community detection in social networks. Our approach contains three steps. First, we use a community detection algorithm to divide a database schema into subject groups, each representing a specific subject. Second, we cluster the subject groups into abstract domains to form a multi-level navigation structure. Third, we discover representative tables in each cluster to label the schema summary. We evaluate our approach on Freebase, a real-world large-scale database. The results show that our approach can identify subject groups precisely. The generated abstract schema layers are very helpful for users exploring the database.
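
    The first step, dividing the schema graph into subject groups, can be sketched with a simple community detection algorithm. The paper does not specify which algorithm it uses; label propagation is shown here only as a self-contained stand-in, applied to a hypothetical toy schema where tables are nodes and foreign-key links are edges.

```python
import random

def label_propagation(graph, seed=42, max_rounds=20):
    """Asynchronous label propagation: each table adopts the most common
    community label among its neighbours until labels stabilise."""
    rng = random.Random(seed)
    labels = {node: node for node in graph}  # start: every table is its own community
    nodes = list(graph)
    for _ in range(max_rounds):
        changed = False
        rng.shuffle(nodes)
        for node in nodes:
            counts = {}
            for nb in graph[node]:
                counts[labels[nb]] = counts.get(labels[nb], 0) + 1
            if not counts:
                continue
            best = max(counts, key=counts.get)
            if counts[best] > counts.get(labels[node], 0):
                labels[node] = best
                changed = True
        if not changed:
            break
    return labels

# Hypothetical toy schema: two subject groups of tables joined by one link
schema = {
    "customer": {"order", "address"},
    "order": {"customer", "address", "product"},
    "address": {"customer", "order"},
    "product": {"order", "supplier", "stock"},
    "supplier": {"product", "stock"},
    "stock": {"product", "supplier"},
}
communities = label_propagation(schema)  # maps each table to a community label
```

    High-degree hub tables such as "order" are exactly what makes this hard in real schemas: they pull otherwise distinct subject groups toward a single community, which is why the paper layers clustering and representative-table selection on top.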

  19. Modelling large-scale halo bias using the bispectrum

    CERN Document Server

    Pollack, Jennifer E; Porciani, Cristiano

    2011-01-01

    We study the relation between the halo and matter density fields -- commonly termed bias -- in the LCDM framework. In particular, we examine the local model of biasing at quadratic order in matter density. This model is characterized by parameters b_1 and b_2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales and find that the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo power spectra and construct estimates for an effective large-scale bias. We measure the configuration dependence of the halo bispectra B_hhh and reduced bispectra Q_hhh for very large-scale k-space triangles. From this we constrain b_1 and b_2. Using the lowest-order perturbation theory, we find that for B_hhh the...

  20. Large-Scale Mass Distribution in the Illustris-Simulation

    CERN Document Server

    Haider, Markus; Vogelsberger, Mark; Genel, Shy; Springel, Volker; Torrey, Paul; Hernquist, Lars

    2015-01-01

    Observations at low redshifts thus far fail to account for all of the baryons expected in the Universe according to cosmological constraints. A large fraction of the baryons presumably resides in a thin and warm-hot medium between the galaxies, where they are difficult to observe due to their low densities and high temperatures. Cosmological simulations of structure formation can be used to verify this picture and provide quantitative predictions for the distribution of mass in different large-scale structure components. Here we study the distribution of baryons and dark matter at different epochs using data from the Illustris Simulation. We identify regions of different dark matter density with the primary constituents of large-scale structure, allowing us to measure mass and volume of haloes, filaments and voids. At redshift zero, we find that 49 % of the dark matter and 23 % of the baryons are within haloes. The filaments of the cosmic web host a further 45 % of the dark matter and 46 % of the baryons. The...

  1. Halo detection via large-scale Bayesian inference

    Science.gov (United States)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
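
    The final step above, summing over density-field realisations to build detection-probability maps, reduces to a Monte Carlo estimate per voxel. The sketch below uses a toy Gaussian ensemble and a hypothetical detection threshold; in the real pipeline the threshold comes from a simulation-calibrated relation between density amplitude and halo detection probability.

```python
import random

def detection_probability(realisations, threshold):
    """For each voxel, the fraction of posterior density-field realisations
    in which the local density exceeds the detection threshold."""
    n_vox = len(realisations[0])
    probs = []
    for v in range(n_vox):
        hits = sum(1 for real in realisations if real[v] > threshold)
        probs.append(hits / len(realisations))
    return probs

random.seed(1)
# Toy ensemble: 500 realisations of a 4-voxel field. Voxel 0 is a strong
# overdensity, voxel 3 is underdense, the middle voxels are marginal.
means = [3.0, 1.2, 0.8, 0.2]
ensemble = [[random.gauss(m, 0.5) for m in means] for _ in range(500)]
pmap = detection_probability(ensemble, threshold=1.0)
```

    Because each realisation carries the full uncertainty of the inference, the resulting map directly quantifies how confidently a halo above the mass threshold can be claimed at each location.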

  2. The combustion behavior of large scale lithium titanate battery

    Science.gov (United States)

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety problems are always a big obstacle for lithium batteries on the way to large-scale application. However, knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to analyze the combustion behavior directly. The mass loss rate, temperature and heat release rate are used to analyze the combustion behavior in terms of the underlying reactions. Based on these phenomena, the combustion process is divided into three basic stages, becoming more complicated at higher SOC, with sudden ejections of smoke. The reason is that a phase change occurs in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112–121°C on the anode tab and 139–147°C on the upper surface for all cells, but the heating time and combustion time become shorter with ascending SOC. The results indicate that the battery fire hazard increases with SOC. The analysis suggests that internal shorting and the Li+ distribution are the main causes of this difference. PMID:25586064

  3. Large-scale climatic control on European precipitation

    Science.gov (United States)

    Lavers, David; Prudhomme, Christel; Hannah, David

    2010-05-01

    Precipitation variability has a significant impact on society. Sectors such as agriculture and water resources management rely on a predictable and reliable precipitation supply, and extreme variability can have adverse socio-economic impacts. Understanding the climate drivers of precipitation is therefore of societal relevance. This research examines the strength, location and seasonality of links between precipitation and large-scale Mean Sea Level Pressure (MSLP) fields across Europe. In particular, we aim to evaluate whether European precipitation is correlated with the same atmospheric circulation patterns or whether there is strong spatial and/or seasonal variation in the strength and location of the centres of correlation. The work exploits time series of gridded ERA-40 MSLP on a 2.5°×2.5° grid (0°N-90°N and 90°W-90°E) and gridded European precipitation from the ENSEMBLES project on a 0.5°×0.5° grid (36.25°N-74.25°N and 10.25°W-24.75°E). Monthly Spearman rank correlation analysis was performed between MSLP and precipitation. During winter, a significant MSLP-precipitation correlation dipole pattern exists across Europe. Strong negative (positive) correlations located near the Icelandic Low and positive (negative) correlations near the Azores High pressure centres are found in northern (southern) Europe. These correlation dipoles resemble the structure of the North Atlantic Oscillation (NAO). The reversal in the correlation dipole patterns occurs at the latitude of central France, with regions to the north (British Isles, northern France, Scandinavia) having a positive relationship with the NAO, and regions to the south (Italy, Portugal, southern France, Spain) exhibiting a negative relationship with the NAO. In the lee of the mountain ranges of eastern Britain and central Sweden, correlation with North Atlantic MSLP is reduced, reflecting a reduced influence of westerly flow on precipitation generation as the mountains act as a barrier to moist
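
    The Spearman rank correlation used here is just the Pearson correlation of the ranked series, computed for each MSLP grid point against each precipitation grid point. A minimal self-contained version, with average-rank tie handling, is sketched below on made-up monthly values (the numbers are illustrative only, not ERA-40 or ENSEMBLES data):

```python
def rank(values):
    """Ranks (1-based), assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Toy monthly series: precipitation falls monotonically as pressure rises
mslp = [1012.0, 1018.5, 1009.2, 1021.3, 1015.7]
precip = [55.0, 30.1, 80.4, 12.9, 41.6]
rho = spearman(mslp, precip)  # -1.0: perfectly anti-monotone
```

    Because it operates on ranks, Spearman correlation is insensitive to the skewed distributions typical of precipitation, which is one reason to prefer it over plain Pearson correlation in this setting.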

  4. Large-scale shielding structures in low earth orbits

    Science.gov (United States)

    Panov, D. V.; Silnikov, M. V.; Mikhaylin, A. I.; Rubzov, I. S.; Nosikov, V. B.; Minenko, E. Yu.; Murtazin, D. A.

    2015-04-01

The problems involved in the design and engineering simulation of large-size transformable screen structures for protecting spacecraft and equipment from space debris and meteoroids are considered. The engineering principles used to improve the design and efficiency of protective screens are presented. The use of embedded matrix transducers distributed throughout the composite material of the armor tiles is proposed for the construction of protective clad screens; this approach enables efficient detection of damaged areas of the protective screen, assessment of the level of damage, and prediction of damage to spacecraft and equipment structures.

  5. Probing large-scale structure with radio observations

    Science.gov (United States)

    Brown, Shea D.

This thesis focuses on detecting magnetized relativistic plasma in the intergalactic medium (IGM) of filamentary large-scale structure (LSS) by observing synchrotron emission emitted by structure formation shocks. Little is known about the IGM beyond the largest clusters of galaxies, and synchrotron emission holds enormous promise as a means of probing magnetic fields and relativistic particle populations in these low-density regions. I first report on observations taken at the Very Large Array and the Westerbork Synthesis Radio Telescope of the diffuse radio source 0809+39. I use these observations to demonstrate that 0809+39 is likely the first "radio relic" discovered that is not associated with a rich X-ray emitting cluster of galaxies. I then demonstrate that an unconventional reprocessing of the NVSS polarization survey can reveal structures on scales from 15' to hundreds of degrees, far larger than the nominal shortest-baseline scale. This yields hundreds of new diffuse sources as well as the identification of a new nearby Galactic loop. These observations also highlight the major obstacle that diffuse Galactic foreground emission poses for any search for large-scale, low surface-brightness extragalactic emission. I therefore explore the cross-correlation of diffuse radio emission with optical tracers of LSS as a means of statistically detecting the presence of magnetic fields in the low-density regions of the cosmic web. This initial study with the Bonn 1.4 GHz radio survey yields an upper limit of 0.2 μG for large-scale filament magnetic fields. Finally, I report on new Green Bank Telescope and Westerbork Synthesis Radio Telescope observations of the famous Coma cluster of galaxies. Major findings include an extension to the Coma cluster radio relic source 1253+275 which makes its total extent ~2 Mpc, as well as a sharp edge, or "front", on the western side of the radio halo which shows a strong correlation with merger activity associated with an

  6. The predictability of large-scale wind-driven flows

    Directory of Open Access Journals (Sweden)

    A. Mahadevan

    2001-01-01

    Full Text Available The singular values associated with optimally growing perturbations to stationary and time-dependent solutions for the general circulation in an ocean basin provide a measure of the rate at which solutions with nearby initial conditions begin to diverge, and hence, a measure of the predictability of the flow. In this paper, the singular vectors and singular values of stationary and evolving examples of wind-driven, double-gyre circulations in different flow regimes are explored. By changing the Reynolds number in simple quasi-geostrophic models of the wind-driven circulation, steady, weakly aperiodic and chaotic states may be examined. The singular vectors of the steady state reveal some of the physical mechanisms responsible for optimally growing perturbations. In time-dependent cases, the dominant singular values show significant variability in time, indicating strong variations in the predictability of the flow. When the underlying flow is weakly aperiodic, the dominant singular values co-vary with integral measures of the large-scale flow, such as the basin-integrated upper ocean kinetic energy and the transport in the western boundary current extension. Furthermore, in a reduced gravity quasi-geostrophic model of a weakly aperiodic, double-gyre flow, the behaviour of the dominant singular values may be used to predict a change in the large-scale flow, a feature not shared by an analogous two-layer model. When the circulation is in a strongly aperiodic state, the dominant singular values no longer vary coherently with integral measures of the flow. Instead, they fluctuate in a very aperiodic fashion on mesoscale time scales. The dominant singular vectors then depend strongly on the arrangement of mesoscale features in the flow and the evolved forms of the associated singular vectors have relatively short spatial scales. These results have several implications. In weakly aperiodic, periodic, and stationary regimes, the mesoscale energy
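The role of singular values as a measure of perturbation growth can be illustrated with a toy linear propagator; the matrix below is invented, not derived from the quasi-geostrophic model. The point is that a non-normal operator whose eigenvalues all lie inside the unit circle, so that perturbations decay asymptotically, can still amplify a well-chosen perturbation over one step, and the leading right singular vector is that optimally growing perturbation.

```python
import numpy as np

# Hypothetical one-step propagator M for perturbations about a steady flow.
# The off-diagonal term makes M strongly non-normal.
M = np.array([[0.9, 5.0],
              [0.0, 0.8]])

# Eigenvalues: both inside the unit circle, so all perturbations decay eventually.
eigvals = np.linalg.eigvals(M)

# Singular values: the largest one is the maximal one-step amplification of any
# initial perturbation; the corresponding right singular vector (row of Vt) is
# the optimally growing perturbation.
U, s, Vt = np.linalg.svd(M)
optimal_perturbation = Vt[0]
print("max |eigenvalue| =", abs(eigvals).max())   # below 1: asymptotic decay
print("leading singular value =", s[0])           # above 1: transient growth
```

In the time-dependent cases discussed above, the same computation is applied to the propagator linearized about the evolving flow, and the dominant singular value then varies in time with the predictability of the circulation.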

  7. Large-Scale Graphene Film Deposition for Monolithic Device Fabrication

    Science.gov (United States)

    Al-shurman, Khaled

Since 1958, the concept of the integrated circuit (IC) has driven great technological development and helped shrink electronic devices. Nowadays, an IC consists of more than a million closely packed transistors, and the majority of current ICs use silicon as the semiconductor material. According to Moore's law, the number of transistors built on a microchip doubles roughly every two years. However, silicon device manufacturing is reaching its physical limits: as circuitry shrinks toward seven nanometers, quantum effects such as tunneling can no longer be controlled. Hence, there is an urgent need for a new platform material to replace Si. Graphene is considered a promising material with enormous potential applications in many electronic and optoelectronic devices due to its superior properties. There are several techniques to produce graphene films. Among these, chemical vapor deposition (CVD) offers a very convenient method for fabricating large-scale graphene films. Though the CVD method is suitable for large-area growth of graphene, transferring the graphene film to silicon-based substrates is then required. Furthermore, the graphene films thus achieved are, in fact, not single crystalline. Also, graphene fabrication utilizing Cu and Ni at high growth temperature contaminates both the substrate that holds the Si CMOS circuitry and the CVD chamber. So, lowering the deposition temperature is another technological milestone for the successful adoption of graphene in integrated circuit fabrication. In this research, direct large-scale graphene film fabrication on silicon-based platforms (i.e. SiO2 and Si3N4) at low temperature was achieved. With a focus on low-temperature graphene growth, hot-filament chemical vapor deposition (HF-CVD) was utilized to synthesize graphene film using a 200 nm thick nickel film.
Raman spectroscopy was utilized to examine graphene formation on the bottom side of the Ni film

  8. Climatological context for large-scale coral bleaching

    Science.gov (United States)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in the longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three, 132 year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) The historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs and (2) While coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those conditions observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will
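The notion of cumulative thermal stress used in such analyses can be made concrete with a rough degree-heating-weeks (DHW) style calculation. The SST series, the maximum-monthly-mean value `mmm`, and the weekly 12-week accumulation below are all illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

rng = np.random.default_rng(3)
weeks = 52
# Hypothetical weekly SST series (deg C) for one reef grid cell:
# a seasonal cycle plus noise, with an imposed warm anomaly near the seasonal peak.
sst = 27.0 + 1.5 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 0.3, weeks)
sst[9:17] += 1.8  # imposed warm event

mmm = 28.5  # assumed maximum monthly mean climatology for the cell

def degree_heating_weeks(sst, mmm, window=12):
    # DHW-style metric: accumulate weekly "hotspots" that exceed MMM + 1 degC
    # over a rolling 12-week window (a weekly simplification of the NOAA product).
    hotspot = np.where(sst >= mmm + 1.0, sst - mmm, 0.0)
    return np.array([hotspot[max(0, i - window + 1):i + 1].sum()
                     for i in range(len(sst))])

dhw = degree_heating_weeks(sst, mmm)
print("peak cumulative stress:", dhw.max())
```

Applied to the 132-year SST reconstructions, this kind of accumulation is what allows pre-1979 thermal stress to be compared with that during the well-documented bleaching era.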

  9. Screening CO2

    NARCIS (Netherlands)

    Ramírez, A.; Hagedoorn, S.; Kramers, L.; Wildenborg, T.; Hendriks, C.

    2010-01-01

    This paper describes the development and application of a methodology to screen and rank Dutch reservoirs suitable for long-term large scale CO2 storage. The screening focuses on off- and on-shore individual aquifers, gas and oil fields. In total 176 storage reservoirs have been taken int

  10. Large-scale structure non-Gaussianities with modal methods

    Science.gov (United States)

    Schmittfull, Marcel

    2016-10-01

    Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ~ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).
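A direct (non-modal) bispectrum estimate for a single triangle configuration illustrates what the modal expansion accelerates. This 1-d toy, with invented fields and normalizations, is not the paper's separable estimator: for a Gaussian field the estimate averages to zero, while a quadratically transformed ("local-type") field yields a positive signal.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256

def bispectrum(delta, k1, k2):
    # Naive single-realization estimator B(k1, k2) = Re[d(k1) d(k2) d*(k1+k2)]
    # for a periodic 1-d field (the closed triangle k1 + k2 = k3).
    d = np.fft.fft(delta) / len(delta)
    return (d[k1] * d[k2] * np.conj(d[k1 + k2])).real

def avg_bispectrum(make_field, k1=3, k2=5, n_real=500):
    # Average over realizations to beat down the Gaussian scatter.
    return float(np.mean([bispectrum(make_field(), k1, k2) for _ in range(n_real)]))

fnl = 2.0
b_gauss = avg_bispectrum(lambda: rng.normal(size=N))

def local_ng():
    # Quadratic (local-type) non-Gaussianity: delta = g + fnl*(g^2 - <g^2>)
    g = rng.normal(size=N)
    return g + fnl * (g**2 - (g**2).mean())

b_ng = avg_bispectrum(local_ng)
print("Gaussian:", b_gauss, " local-type NG:", b_ng)
```

The direct estimator above costs one triple product per triangle; the modal method of the paper replaces the sum over all triangles with ~50 separable mode coefficients, which is what makes full 3-d bispectrum estimation negligible next to the simulation cost.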

  11. Order reduction of large-scale linear oscillatory system models

    Energy Technology Data Exchange (ETDEWEB)

Trudnowski, D.J. (Pacific Northwest Lab., Richland, WA (United States))

    1994-02-01

Eigenanalysis and signal analysis techniques for deriving representations of power system oscillatory dynamics result in very high-order linear models. In order to apply many modern control design methods, the models must be reduced to a more manageable order while preserving essential characteristics. Presented in this paper is a model reduction method well suited for large-scale power systems. The method searches for the optimal subset of the high-order model that best represents the system. An Akaike information criterion is used to define the optimal reduced model. The method is first presented, and then examples of applying it to Prony analysis and eigenanalysis models of power systems are given.
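The idea of scoring candidate reduced models with an Akaike information criterion can be sketched on a synthetic ringdown. The candidate mode list, the fixed dampings and frequencies, and the least-squares amplitude fit below are illustrative simplifications, not the paper's subset-search procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
t = np.arange(n) * 0.05

# Candidate modes (damping, frequency rad/s); only the first two drive the signal,
# mimicking a high-order identified model with superfluous modes.
candidates = [(0.10, 2.0), (0.30, 5.0), (0.20, 8.0), (0.15, 11.0)]
y = 1.0 * np.exp(-0.10 * t) * np.cos(2.0 * t) + 0.5 * np.exp(-0.30 * t) * np.cos(5.0 * t)
y += rng.normal(scale=0.05, size=n)

def aic(k):
    # Fit amplitudes of the first k candidate modes by least squares,
    # then score with AIC = n*log(RSS/n) + 2k (k fitted parameters).
    A = np.column_stack([np.exp(-d * t) * np.cos(w * t) for d, w in candidates[:k]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ coef) ** 2)
    return n * np.log(rss / n) + 2 * k

scores = {k: aic(k) for k in range(1, len(candidates) + 1)}
print(scores)  # the AIC penalty discourages modes beyond those the data support
```

The large AIC drop from one to two retained modes, and the near-flat scores beyond that, is the signature the criterion uses to pick the reduced order.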

  12. Solving large scale traveling salesman problems by chaotic neurodynamics.

    Science.gov (United States)

Hasegawa, Mikio; Ikeguchi, Tohru; Aihara, Kazuyuki

    2002-03-01

We propose a novel approach for solving large scale traveling salesman problems (TSPs) by chaotic dynamics. First, we realize the tabu search on a neural network, by utilizing the refractory effects as the tabu effects. Then, we extend it to a chaotic neural network version. We propose two types of chaotic searching methods, which are based on two different tabu searches. While the first one requires neurons of the order of n^2 for an n-city TSP, the second one requires only n neurons. Moreover, an automatic parameter tuning method of our chaotic neural network is presented for easy application to various problems. Last, we show that our method with n neurons is applicable to large TSPs such as an 85,900-city problem and exhibits better performance than the conventional stochastic searches and the tabu searches.
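The tabu search that the chaotic network mimics can be sketched conventionally. The 2-opt neighbourhood, tabu tenure, and aspiration rule below follow textbook tabu search on a small random instance; they are a stand-in for, not a reproduction of, the paper's neural formulation.

```python
import itertools
import math
import random

random.seed(0)

# Hypothetical small Euclidean TSP instance (the paper tackles up to 85,900 cities).
n = 12
cities = [(random.random(), random.random()) for _ in range(n)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]]) for i in range(n))

def tabu_search(iters=500, tenure=20):
    tour = list(range(n))
    best, best_len = tour[:], tour_length(tour)
    tabu = {}  # move -> last iteration at which it is still forbidden
    for it in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(n), 2):
            new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
            length = tour_length(new)
            # aspiration: a tabu move is allowed if it beats the best tour so far
            if tabu.get((i, j), -1) < it or length < best_len:
                candidates.append((length, (i, j), new))
        length, move, tour = min(candidates)  # best admissible move, even if uphill
        tabu[move] = it + tenure
        if length < best_len:
            best, best_len = tour[:], length
    return best, best_len

best, best_len = tabu_search()
print("tour length:", best_len)
```

In the paper's first formulation the O(n^2) neurons encode exactly such pairwise moves, with neural refractoriness playing the role of the `tabu` dictionary above.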

  13. The XMM/Megacam-VST/VIRMOS Large Scale Structure Survey

    CERN Document Server

    Pierre, M

    2000-01-01

    The objective of the XMM-LSS Survey is to map the large scale structure of the universe, as highlighted by clusters and groups of galaxies, out to a redshift of about 1, over a single 8x8 sq.deg. area. For the first time, this will reveal the topology of the distribution of the deep potential wells and provide statistical measurements at truly cosmological distances. In addition, clusters identified via their X-ray properties will form the basis for the first uniformly-selected, multi-wavelength survey of the evolution of clusters and individual cluster galaxies as a function of redshift. The survey will also address the very important question of the QSO distribution within the cosmic web.

  14. Chirping for large-scale maritime archaeological survey

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2014-01-01

Archaeological wrecks exposed on the sea floor are mapped using side-scan and multibeam techniques, whereas the detection of submerged archaeological sites, such as Stone Age settlements, and wrecks, partially or wholly embedded in sea-floor sediments, requires the application of high-resolution subbottom profilers. This paper presents a strategy for cost-effective, large-scale mapping of previously undetected sediment-embedded sites and wrecks based on subbottom profiling with chirp systems. The mapping strategy described includes (a) definition of line spacing depending on the target … The strategy differs from those employed in several detailed studies of known wreck sites and from the way in which geologists map the sea floor and the geological column beneath it, and has been developed on the basis of extensive practical experience gained during the use of an off-the-shelf 2D chirp system …

  15. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborative research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other institutional settings. Policies mandating LSRFs should consider that research prioritized on the basis of technological relevance limits the international reach of collaborations. Additionally, the propensity for international collaboration is lower for resident scientists than for those affiliated …

  16. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Marcello [Univ. of Toronto, ON (Canada); Baldauf, T. [Inst. of Advanced Studies, Princeton, NJ (United States); Bond, J. Richard [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Dalal, N. [Univ. of Illinois, Urbana-Champaign, IL (United States); Putter, R. D. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Dore, O. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Green, Daniel [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Hirata, Chris [The Ohio State Univ., Columbus, OH (United States); Huang, Zhiqi [Univ. of Toronto, ON (Canada); Huterer, Dragan [Univ. of Michigan, Ann Arbor, MI (United States); Jeong, Donghui [Pennsylvania State Univ., University Park, PA (United States); Johnson, Matthew C. [York Univ., Toronto, ON (Canada); Perimeter Inst., Waterloo, ON (Canada); Krause, Elisabeth [Stanford Univ., CA (United States); Loverde, Marilena [Univ. of Chicago, IL (United States); Meyers, Joel [Univ. of Toronto, ON (Canada); Meeburg, Daniel [Univ. of Toronto, ON (Canada); Senatore, Leonardo [Stanford Univ., CA (United States); Shandera, Sarah [Pennsylvania State Univ., University Park, PA (United States); Silverstein, Eva [Stanford Univ., CA (United States); Slosar, Anze [Brookhaven National Lab. (BNL), Upton, NY (United States); Smith, Kendrick [Perimeter Inst., Waterloo, Toronto, ON (Canada); Zaldarriaga, Matias [Univ. of Toronto, ON (Canada); Assassi, Valentin [Cambridge Univ. (United Kingdom); Braden, Jonathan [Univ. of Toronto, ON (Canada); Hajian, Amir [Univ. of Toronto, ON (Canada); Kobayashi, Takeshi [Perimeter Inst., Waterloo, Toronto, ON (Canada); Univ. of Toronto, ON (Canada); Stein, George [Univ. of Toronto, ON (Canada); Engelen, Alexander van [Univ. of Toronto, ON (Canada)

    2014-12-15

The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, from where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc (f_NL^eq), natural target levels of sensitivity are Δf_NL^loc,eq ≃ 1. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.

  17. U-shaped Vortex Structures in Large Scale Cloud Cavitation

    Science.gov (United States)

    Cao, Yantao; Peng, Xiaoxing; Xu, Lianghao; Hong, Fangwen

    2015-12-01

The control of cloud cavitation, especially large-scale cloud cavitation (LSCC), is always a hot issue in the field of cavitation research. However, little is known about the evolution of cloud cavitation since it is associated with turbulence and vortex flow. In this article, the structure of cloud cavitation shed by sheet cavitation around different hydrofoils and a wedge was observed in detail with a high-speed camera (HSC). It was found that U-shaped vortex structures always existed in the development process of LSCC. The results indicated that LSCC evolution was related to this kind of vortex structure, and it may be a universal character of LSCC. The vortex strength of the U-shaped vortex structures over a cycle was then analyzed with numerical results.

  18. Spatial solitons in photonic lattices with large-scale defects

    Institute of Scientific and Technical Information of China (English)

    Yang Xiao-Yu; Zheng Jiang-Bo; Dong Liang-Wei

    2011-01-01

We address the existence, stability and propagation dynamics of solitons supported by large-scale defects surrounded by the harmonic photonic lattices imprinted in the defocusing saturable nonlinear medium. Several families of soliton solutions, including flat-topped, dipole-like, and multipole-like solitons, can be supported by the defected lattices with different heights of defects. The width of the existence domain of solitons is determined solely by the saturable parameter. The existence domains of various types of solitons can be shifted by the variations of defect size, lattice depth and soliton order. Solitons in the model are stable in a wide parameter window, provided that the propagation constant exceeds a critical value, which is in sharp contrast to the case where soliton trains are supported by periodic lattices imprinted in the defocusing saturable nonlinear medium. We also find stable solitons in the semi-infinite gap which rarely occur in defocusing media.

  19. Large scale protein separations: engineering aspects of chromatography.

    Science.gov (United States)

    Chisti, Y; Moo-Young, M

    1990-01-01

    The engineering considerations common to large scale chromatographic purification of proteins are reviewed. A discussion of the industrial chromatography fundamentals is followed by aspects which affect the scale of separation. The separation column geometry, the effect of the main operational parameters on separation performance, and the physical characteristics of column packing are treated. Throughout, the emphasis is on ion exchange and size exclusion techniques which together constitute the major portion of commercial chromatographic protein purifications. In all cases, the state of current technology is examined and areas in need of further development are noted. The physico-chemical advances now underway in chromatographic separation of biopolymers would ensure a substantially enhanced role for these techniques in industrial production of products of new biotechnology.

  20. Planck intermediate results. XLII. Large-scale Galactic magnetic fields

    CERN Document Server

    Adam, R; Alves, M I R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Bartolo, N; Battaner, E; Benabed, K; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Bucher, M; Burigana, C; Butler, R C; Calabrese, E; Cardoso, J -F; Catalano, A; Chiang, H C; Christensen, P R; Colombo, L P L; Combet, C; Couchot, F; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Dickinson, C; Diego, J M; Dolag, K; Doré, O; Ducout, A; Dupac, X; Elsner, F; Enßlin, T A; Eriksen, H K; Ferrière, K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Ghosh, T; Giard, M; Gjerløw, E; González-Nuevo, J; Górski, K M; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F K; Harrison, D L; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hobson, M; Hornstrup, A; Hurier, G; Jaffe, A H; Jaffe, T R; Jones, W C; Juvela, M; Keihänen, E; Keskitalo, R; Kisner, T S; Knoche, J; Kunz, M; Kurki-Suonio, H; Lamarre, J -M; Lasenby, A; Lattanzi, M; Lawrence, C R; Leahy, J P; Leonardi, R; Levrier, F; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maggio, G; Maino, D; Mandolesi, N; Mangilli, A; Maris, M; Martin, P G; Masi, S; Melchiorri, A; Mennella, A; Migliaccio, M; Miville-Deschênes, M -A; Moneti, A; Montier, L; Morgante, G; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Nørgaard-Nielsen, H U; Oppermann, N; Orlando, E; Pagano, L; Pajot, F; Paladini, R; Paoletti, D; Pasian, F; Perotto, L; Pettorino, V; Piacentini, F; Piat, M; Pierpaoli, E; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Pratt, G W; Prunet, S; Puget, J -L; Rachen, J P; Reinecke, M; Remazeilles, M; Renault, C; Renzi, A; Ristorcelli, I; Rocha, G; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Savelainen, M; Scott, D; Spencer, L D; Stolyarov, V; Stompor, R; Strong, A 
W; Sudiwala, R; Sunyaev, R; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L A; Wandelt, B D; Wehus, I K; Yvon, D; Zacchei, A; Zonca, A

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature were largely constrained by synchrotron emission and Faraday rotation measures. We select three different but representative models and compare their predicted polarized synchrotron and dust emission with that measured by the Planck satellite. We first update these models to match the Planck synchrotron products using a common model for the cosmic-ray leptons. We discuss the impact on this analysis of the ongoing problems of component separation in the Planck microwave bands and of the uncertain cosmic-ray spectrum. In particular, the inferred degree of ordering in the magnetic fields is sensitive to these systematic uncertainties. We then compare the resulting simulated emission to the observed dust emission and find that the dust predictions do not match the morphology in the Planck data, particularly the vertical profile in latitude. We show how the dust data can then be used to further improve these magnetic field models, particu...

  1. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-systems risk management best practices. We analyze their risk management process and investigate the tools they use in order to support decision making processes within the company. First, we identify the following challenges in the current risk management practices that are in line with literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) the quality of traditional models in such situations is open to debate; (3) the quality of input …

  2. A mini review: photobioreactors for large scale algal cultivation.

    Science.gov (United States)

    Gupta, Prabuddha L; Lee, Seung-Mok; Choi, Hee-Jeong

    2015-09-01

Microalgae cultivation has gained much interest in terms of the production of foods, biofuels, and bioactive compounds, and offers a great potential option for cleaning the environment through CO2 sequestration and wastewater treatment. Although open pond cultivation is the most affordable option, it tends to offer insufficient control of growth conditions and carries a risk of contamination. In contrast, while posing minimal risk of contamination, closed photobioreactors offer better control of culture conditions, such as CO2 supply, water supply, optimal temperature, efficient exposure to light, culture density, pH level, and mixing rate. For large-scale production of biomass, efficient photobioreactors are required. This review paper describes general design considerations pertaining to photobioreactor systems for cultivating microalgae for biomass production. It also discusses the current challenges in designing photobioreactors for the production of low-cost biomass.

  3. Lightweight computational steering of very large scale molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Beazley, D.M. [Univ. of Utah, Salt Lake City, UT (United States). Dept. of Computer Science; Lomdahl, P.S. [Los Alamos National Lab., NM (United States)

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  4. Large-scale comparative visualisation of sets of multidimensional data

    CERN Document Server

    Vohl, Dany; Fluke, Christopher J; Poudel, Govinda; Georgiou-Karistianis, Nellie; Hassan, Amr H; Benovitski, Yuri; Wong, Tsz Ho; Kaluza, Owen; Nguyen, Toan D; Bonnington, C Paul

    2016-01-01

We present encube, a qualitative, quantitative and comparative visualisation and analysis system, with application to high-resolution, immersive three-dimensional environments and desktop displays. encube extends previous comparative visualisation systems by considering: 1) the integration of comparative visualisation and analysis into a unified system; 2) the documentation of the discovery process; and 3) an approach that enables scientists to continue the research process once back at their desktop. Our solution enables tablets, smartphones or laptops to be used as interaction units for manipulating, organising, and querying data. We highlight the modularity of encube, allowing additional functionalities to be included as required. Additionally, our approach supports a high level of collaboration within the physical environment. We show how our implementation of encube operates in a large-scale, hybrid visualisation and supercomputing environment using the CAVE2 at Monash University, and on a local deskt...

  5. Large-scale nanostructured low-temperature solar selective absorber.

    Science.gov (United States)

    Chi, Kequn; Yang, Liu; Liu, Zhaolang; Gao, PingQi; Ye, Jichun; He, Sailing

    2017-05-15

A large-scale nanostructured low-temperature solar selective absorber is demonstrated experimentally. It consists of a silicon dioxide thin-film coating on a rough refractory tantalum substrate, fabricated simply from self-assembled, closely packed polystyrene nanospheres. Because of the strong light harvesting of the surface nanopatterns and constructive interference within the top silicon dioxide coating, our absorber has a much higher solar absorption (0.84) than its planar counterpart (0.78). Though its absorption is lower than that of commercial black paint with ultra-broad absorption, the greatly suppressed absorption/emission in the long-wavelength range still enables superior heat accumulation. The working temperature is as high as 196.3°C under 7-sun solar illumination in ambient conditions, much higher than those achieved by the two comparables.

  6. Towards online multiresolution community detection in large-scale networks.

    Directory of Open Access Journals (Sweden)

    Jianbin Huang

Full Text Available The investigation of community structure in networks has aroused great interest in multiple disciplines. One of the challenges is to find local communities from a starting vertex in a network without global information about the entire network. The accuracy of many existing methods depends on a priori assumptions about network properties and on predefined parameters. In this paper, we introduce a new quality function of local community and present a fast local expansion algorithm for uncovering communities in large-scale networks. The proposed algorithm can detect a multiresolution community from a source vertex or communities covering the whole network. Experimental results show that the proposed algorithm is efficient and well-behaved in both real-world and synthetic networks.
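Greedy local expansion from a seed vertex can be sketched with a simple internal-versus-outgoing-edge quality function. Both the quality function and the toy graph below are generic stand-ins, not the quality function proposed in the paper.

```python
# Toy graph: two 4-cliques joined by a single bridge edge (3-4),
# stored as an adjacency dict.
graph = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
    4: {3, 5, 6, 7}, 5: {4, 6, 7}, 6: {4, 5, 7}, 7: {4, 5, 6},
}

def quality(community):
    # Fraction of edges touching the community that stay inside it.
    internal = sum(1 for u in community for v in graph[u] if v in community) / 2
    outgoing = sum(1 for u in community for v in graph[u] if v not in community)
    return internal / (internal + outgoing)

def local_community(seed):
    # Greedily absorb the neighbouring vertex that most improves quality;
    # stop when no neighbour improves it. No global information is used.
    community = {seed}
    while True:
        frontier = {v for u in community for v in graph[u]} - community
        if not frontier:
            break
        gains = {v: quality(community | {v}) for v in frontier}
        best = max(gains, key=gains.get)
        if gains[best] <= quality(community):
            break
        community.add(best)
    return community

print(local_community(0))  # recovers the clique containing the seed
```

Varying the stopping rule (or a resolution parameter inside the quality function) is what lets such expansion schemes return communities at multiple resolutions.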

  7. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  8. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operation: 1. congestion of transmission lines, 2. transmission line expansion, 3. large-scale wind energy integration, and 4. optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, expansion of transmission line capacity must be evaluated with methods that ensure optimal electric grid operation. Therefore, the expansion of transmission line capacity must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission systems and the competitive nature of current grid operation, it is important for grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are "where" to add capacity, "how much" to add, and at "which voltage level". Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest is to generate revenue, to build new transmission lines. Adding new transmission capacity will help the system relieve transmission congestion, create...

  9. Coordinated SLNR based Precoding in Large-Scale Heterogeneous Networks

    KAUST Repository

    Boukhedimi, Ikram

    2017-03-06

This work focuses on the downlink of large-scale two-tier heterogeneous networks composed of a macro-cell overlaid by micro-cell networks. Our interest is in the design of coordinated beamforming techniques that mitigate inter-cell interference. In particular, we consider the case in which the coordinating base stations (BSs) have imperfect knowledge of the channel state information. Under this setting, we propose a regularized SLNR-based precoding design in which the regularization factor is used to provide better resilience to channel estimation errors. Using tools from random matrix theory, we provide an analysis of the SINR and SLNR performance. These results are then exploited to propose a proper setting of the regularization factor. Simulation results are finally provided to validate our findings and confirm the performance of the proposed precoding scheme.
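    The closed form of an SLNR precoder is standard in the beamforming literature and can be sketched for a single cell. This is the generic textbook expression, not the paper's coordinated multi-cell design; the function name, channel layout, and the use of `alpha` as the regularization factor are illustrative assumptions.

    ```python
    import numpy as np

    def slnr_precoders(H, alpha):
        """Regularized SLNR beamforming sketch. H is K x M: row k holds user k's
        conjugated channel h_k^H; alpha > 0 is the regularization factor.
        Returns unit-norm columns w_k proportional to
        (sum_{j != k} h_j h_j^H + alpha * I)^{-1} h_k,
        the maximizer of a regularized signal-to-leakage-plus-noise ratio."""
        K, M = H.shape
        W = np.zeros((M, K), dtype=complex)
        for k in range(K):
            # leakage covariance: channels of all other users
            leak = sum(np.outer(H[j].conj(), H[j]) for j in range(K) if j != k)
            A = leak + alpha * np.eye(M)
            w = np.linalg.solve(A, H[k].conj())
            W[:, k] = w / np.linalg.norm(w)
        return W
    ```

    With mutually orthogonal channels the precoders simply align with each user's own channel and the leakage to other users is zero, which is a quick sanity check on the expression.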

  10. Unfolding large-scale online collaborative human dynamics

    CERN Document Server

    Zha, Yilong; Zhou, Changsong

    2015-01-01

Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of their regular patterns and mechanisms is very challenging and still rare. Self-organized online collaborative activities with precise records of event timing provide an unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to others with a power-law waiting time, and (iii) population growth due to an increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal "simplicity" beyond complex interac...

  11. Large-scale Cosmic Flows from Cosmicflows-2 Catalog

    CERN Document Server

    Watkins, Richard

    2014-01-01

We calculate the large-scale bulk flow from the Cosmicflows-2 peculiar velocity catalog (Tully et al. 2013) using the minimum variance method introduced in Watkins et al. (2009). We find a bulk flow of 262 +/- 60 km/sec on a scale of 100 Mpc/h, a result somewhat smaller than that found from the COMPOSITE catalog introduced in Watkins et al. (2009), a compendium of peculiar velocity data that has many objects in common with the Cosmicflows-2 sample. We find that distances are systematically larger in the Cosmicflows-2 catalog for objects in common due to a different approach to bias correction, and that this explains the difference in the bulk flows derived from the two catalogs. The bulk flow result from the Cosmicflows-2 survey is consistent with expectations from LCDM, and thus this catalog potentially resolves an important challenge to the standard cosmological model.

  12. Theoretical expectations for bulk flows in large scale surveys

    CERN Document Server

    Feldman, H A; Hume A Feldman; Richard Watkins

    1993-01-01

    We calculate the theoretical expectation for the bulk motion of a large scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We consider the power spectrum calculated from the IRAS-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that sparse sampling and clustering can lead to an unexpectedly large bulk flow, even in a very deep survey. Our results suggest that the expected bulk motion is inconsistent with that reported by Lauer and Postman at the 90-94% confidence level.

  13. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.

  14. Theoretical expectations for bulk flows in large-scale surveys

    Science.gov (United States)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We considered the power spectrum calculated from the Infrared Astronomy Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.
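    The linear-theory expectation underlying such calculations is standard and can be written down compactly. This is the textbook expression for an idealized top-hat survey window, not the paper's full treatment, which additionally folds in survey geometry, distance errors, and sparse sampling:

    ```latex
    \sigma_v^2 \;=\; \frac{H_0^2 f^2}{2\pi^2} \int_0^\infty P(k)\,\widetilde{W}^2(kR)\,\mathrm{d}k,
    \qquad
    \widetilde{W}(x) \;=\; \frac{3\,(\sin x - x\cos x)}{x^3},
    ```

    where $P(k)$ is the matter power spectrum, $f$ is the linear growth rate, and $\widetilde{W}$ is the Fourier transform of a top-hat window of radius $R$. The clustering and sampling effects discussed in the abstract enter as modifications to this idealized window.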

  15. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD approach applied throughout design and organizational implementation. To pursue this aim we extend the iterative PD prototyping approach by (1) emphasizing PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporating improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extending initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The extended approach is exemplified through...

  16. A Large-Scale 3D Object Recognition dataset

    DEFF Research Database (Denmark)

    Sølund, Thomas; Glent Buch, Anders; Krüger, Norbert

    2016-01-01

This paper presents a new large-scale dataset targeting evaluation of local shape descriptors and 3D object recognition algorithms. The dataset consists of point clouds and triangulated meshes from 292 physical scenes taken from 11 different views; a total of approximately 3204 views. Each ... geometric groups: concave, convex, cylindrical and flat 3D object models. The object models have varying amounts of local geometric features to challenge existing local shape feature descriptors in terms of descriptiveness and robustness. The dataset is validated in a benchmark which evaluates the matching performance of 7 different state-of-the-art local shape descriptors. Further, we validate the dataset in a 3D object recognition pipeline. Our benchmark shows, as expected, that local shape feature descriptors without any global point relation across the surface have a poor matching performance with flat...

  17. SOLVING TRUST REGION PROBLEM IN LARGE SCALE OPTIMIZATION

    Institute of Scientific and Technical Information of China (English)

    Bing-sheng He

    2000-01-01

This paper presents a new method for solving the basic problem in the "model trust region" approach to large-scale minimization: compute a vector x such that 1/2 x^T H x + c^T x = min, subject to the constraint ‖x‖2 ≤ a. The method is a combination of the conjugate gradient (CG) method and a projection and contraction (PC) method. The first (CG) method, with x0 = 0 as the starting point, either directly offers a solution of the problem or, as soon as the norm of the iterate becomes greater than a, gives a suitable starting point and a favourable choice of a crucial scaling parameter for the second (PC) method. Some numerical examples are given, which indicate that the method is applicable.
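    The CG phase described above can be sketched as a Steihaug-style truncated CG: run plain CG from x0 = 0 and stop either at an interior solution or at the first iterate that would leave the ball of radius a. This is our reading of the abstract's first stage only (the PC phase is omitted), with hypothetical function names.

    ```python
    import numpy as np

    def trust_region_cg(H, c, a, tol=1e-10, max_iter=None):
        """Truncated CG for min 1/2 x^T H x + c^T x subject to ||x||_2 <= a,
        starting from x0 = 0. Returns either the interior minimizer or a point
        on the boundary ||x|| = a reached along the last search direction."""
        n = len(c)
        max_iter = max_iter or 2 * n
        x = np.zeros(n)
        r = -c.copy()            # residual r = -(H x + c); at x = 0 this is -c
        p = r.copy()
        for _ in range(max_iter):
            if np.linalg.norm(r) < tol:
                return x         # interior solution: H x + c ~ 0 with ||x|| < a
            Hp = H @ p
            pHp = p @ Hp
            if pHp <= 0:         # negative curvature: march to the boundary
                return _to_boundary(x, p, a)
            alpha = (r @ r) / pHp
            x_next = x + alpha * p
            if np.linalg.norm(x_next) >= a:   # iterate leaves the trust region
                return _to_boundary(x, p, a)
            r_next = r - alpha * Hp
            beta = (r_next @ r_next) / (r @ r)
            x, r = x_next, r_next
            p = r + beta * p
        return x

    def _to_boundary(x, p, a):
        """Solve ||x + tau * p|| = a for the positive root tau."""
        pp, xp, xx = p @ p, x @ p, x @ x
        tau = (-xp + np.sqrt(xp**2 + pp * (a**2 - xx))) / pp
        return x + tau * p
    ```

    For a well-conditioned positive-definite H whose unconstrained minimizer lies inside the ball, CG terminates at that minimizer; when the minimizer lies outside, the returned point sits on the boundary, which is where a second-stage method such as the PC method would take over.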

  18. Statistics of Caustics in Large-Scale Structure Formation

    Science.gov (United States)

    Feldbrugge, Job L.; Hidding, Johan; van de Weygaert, Rien

    2016-10-01

The cosmic web is a complex spatial pattern of walls, filaments, cluster nodes and underdense void regions. It emerged through gravitational amplification from the Gaussian primordial density field. Here we infer analytical expressions for the spatial statistics of caustics in the evolving large-scale mass distribution. In our analysis, following the quasi-linear Zel'dovich formalism and confined to the 1D and 2D situation, we compute number density and correlation properties of caustics in cosmic density fields that evolve from Gaussian primordial conditions. The analysis can be straightforwardly extended to the 3D situation. Moreover, we are currently extending the approach to the non-linear regime of structure formation by including higher order Lagrangian approximations and Lagrangian effective field theory.

  19. On the Hyperbolicity of Large-Scale Networks

    CERN Document Server

    Kennedy, W Sean; Saniee, Iraj

    2013-01-01

    Through detailed analysis of scores of publicly available data sets corresponding to a wide range of large-scale networks, from communication and road networks to various forms of social networks, we explore a little-studied geometric characteristic of real-life networks, namely their hyperbolicity. In smooth geometry, hyperbolicity captures the notion of negative curvature; within the more abstract context of metric spaces, it can be generalized as d-hyperbolicity. This generalized definition can be applied to graphs, which we explore in this report. We provide strong evidence that communication and social networks exhibit this fundamental property, and through extensive computations we quantify the degree of hyperbolicity of each network in comparison to its diameter. By contrast, and as evidence of the validity of the methodology, applying the same methods to the road networks shows that they are not hyperbolic, which is as expected. Finally, we present practical computational means for detection of hyperb...

  20. Visualizing large-scale uncertainty in astrophysical data.

    Science.gov (United States)

    Li, Hongwei; Fu, Chi-Wing; Li, Yinggang; Hanson, Andrew

    2007-01-01

Visualization of uncertainty or error in astrophysical data is seldom available in simulations of astronomical phenomena, and yet almost all rendered attributes possess some degree of uncertainty due to observational error. Uncertainties associated with spatial location typically vary significantly with scale and thus introduce further complexity in the interpretation of a given visualization. This paper introduces effective techniques for visualizing uncertainty in large-scale virtual astrophysical environments. Building upon our previous transparently scalable visualization architecture, we develop tools that enhance the perception and comprehension of uncertainty across wide scale ranges. Our methods include a unified color-coding scheme for representing log-scale distances and percentage errors, an ellipsoid model to represent positional uncertainty, an ellipsoid envelope model to expose trajectory uncertainty, and a magic-glass design supporting the selection of ranges of log-scale distance and uncertainty parameters, as well as an overview mode and a scalable WIM tool for exposing the magnitudes of spatial context and uncertainty.

  1. The gamma ray background from large scale structure formation

    CERN Document Server

    Gabici, S; Gabici, Stefano; Blasi, Pasquale

    2003-01-01

    Hierarchical clustering of dark matter halos is thought to describe well the large scale structure of the universe. The baryonic component of the halos is shock heated to the virial temperature while a small fraction of the energy flux through the shocks may be energized through the first order Fermi process to relativistic energy per particle. It has been proposed that the electrons accelerated in this way may upscatter the photons of the universal microwave background to gamma ray energies and indeed generate a diffuse background of gamma rays that compares well to the observations. In this paper we calculate the spectra of the particles accelerated at the merger shocks and re-evaluate the contribution of structure formation to the extragalactic diffuse gamma ray background (EDGRB), concluding that this contribution adds up to at most 10% of the observed EDGRB.

  2. The large scale dust distribution in the inner galaxy

    Science.gov (United States)

    Hauser, M. G.; Dwek, E.; Gezari, D.; Silverberg, R.; Kelsall, T.; Stier, M.; Cheung, L.

    1983-01-01

    Initial results are presented from a new large-scale survey of the first quadrant of the galactic plane at wavelengths of 160, 260, and 300 microns. The submillimeter wavelength emission, interpreted as thermal radiation by dust grains, reveals an optically thin disk of angular width about 0.09 deg (FWHM) with a mean dust temperature of 23 K and significant variation of the dust mass column density. Comparison of the dust column density with the gas column density inferred from CO survey data shows a striking spatial correlation. The mean luminosity per hydrogen atom is found to be 2.5 x 10 to the -30th W/H, implying a radiant energy density in the vicinity of the dust an order of magnitude larger than in the solar neighborhood. The data favor dust in molecular clouds as the dominant submillimeter radiation source.

  3. Building a Large-Scale Knowledge Base for Machine Translation

    CERN Document Server

    Knight, K; Knight, Kevin; Luk, Steve K.

    1994-01-01

    Knowledge-based machine translation (KBMT) systems have achieved excellent results in constrained domains, but have not yet scaled up to newspaper text. The reason is that knowledge resources (lexicons, grammar rules, world models) must be painstakingly handcrafted from scratch. One of the hypotheses being tested in the PANGLOSS machine translation project is whether or not these resources can be semi-automatically acquired on a very large scale. This paper focuses on the construction of a large ontology (or knowledge base, or world model) for supporting KBMT. It contains representations for some 70,000 commonly encountered objects, processes, qualities, and relations. The ontology was constructed by merging various online dictionaries, semantic networks, and bilingual resources, through semi-automatic methods. Some of these methods (e.g., conceptual matching of semantic taxonomies) are broadly applicable to problems of importing/exporting knowledge from one KB to another. Other methods (e.g., bilingual match...

  4. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

Catastrophic storms and storm surges induce rapid and substantial changes along sandy barrier coasts, potentially causing severe environmental and economic damage. Coastal impacts of modern storms are associated with washover deposition, dune erosion, barrier breaching, and coastline and shoreface erosion. Little is, however, known about the impact of major storms and their post-storm coastal recovery on the geologic and historic evolution of barrier systems. We apply high-resolution optically stimulated luminescence dating to a barrier system in the Wadden Sea (Denmark) and show that 5 to 8 meters of marine sand accumulated in an aggrading-prograding shoal and on a prograding shoreface during and within 3 to 4 decades ("healing phase") after the most destructive storm documented for the Wadden Sea. Furthermore, we show that the impact of this storm caused large-scale shoreline erosion and barrier...

  5. Magnetic fields and the large-scale structure

    CERN Document Server

    Battaner, E

    1999-01-01

    The large-scale structure of the Universe has been observed to be characterized by long filaments, forming polyhedra, with a remarkable 100-200 Mpc periodicity, suggesting a regular network. The introduction of magnetic fields into the physics of the evolution of structure formation provides some clues to understanding this unexpected lattice structure. A relativistic treatment of the evolution of pre-recombination inhomogeneities, including magnetic fields, is presented to show that equivalent-to-present field strengths of the order of $10^{-8}$ G could have played an important role. Primordial magnetic tubes generated at inflation, at scales larger than the horizon before recombination, could have produced filamentary density structures, with comoving lengths larger than about 10 Mpc. Structures shorter than this would have been destroyed by diffusion due to the small pre-recombination conductivity. If filaments constitute a lattice, the primordial magnetic field structures that produced the post-recombinat...

  6. Supersymmetry and Large Scale Left-Right Symmetry

    CERN Document Server

    Aulakh, Charanjit S; Rasin, A; Senjanovic, G; Aulakh, Charanjit S.; Melfo, Alejandra; Rasin, Andrija; Senjanovic, Goran

    1998-01-01

We present a systematic study of the construction of large scale supersymmetric left-right theories, by utilizing holomorphic invariants to characterize flat directions, both at the renormalizable and the non-renormalizable level. We show that the low energy limit of the minimal supersymmetric Left-Right models is the supersymmetric standard model with an exact R-parity. Whereas in the renormalizable version the scale of parity breaking is undetermined, in the non-renormalizable one it must be bigger than about $10^{10} - 10^{12}$ GeV. The precise nature of the see-saw mechanism differs in the two versions, and we discuss it at length. In both versions of the theory a number of Higgs scalars and fermions with masses much below the $B-L$ and $SU(2)_R$ breaking scales is predicted. For a reasonable choice of parameters, either charged or doubly-charged such particles may be accessible to experiment.

  7. Hashkat: Large-scale simulations of online social networks

    CERN Document Server

    Ryczko, Kevin; Buhagiar, Nicholas; Tamblyn, Isaac

    2016-01-01

    Hashkat (http://hashkat.org) is a free, open source, agent based simulation software package designed to simulate large-scale online social networks (e.g. Twitter, Facebook, LinkedIn, etc). It allows for dynamic agent generation, edge creation, and information propagation. The purpose of hashkat is to study the growth of online social networks and how information flows within them. Like real life online social networks, hashkat incorporates user relationships, information diffusion, and trending topics. Hashkat was implemented in C++, and was designed with extensibility in mind. The software includes Shell and Python scripts for easy installation and usability. In this report, we describe all of the algorithms and features integrated into hashkat before moving on to example use cases. In general, hashkat can be used to understand the underlying topology of social networks, validate sampling methods of such networks, develop business strategy for advertising on online social networks, and test new features of ...

  8. Interloper bias in future large-scale structure surveys

    CERN Document Server

    Pullen, Anthony R; Dore, Olivier; Raccanelli, Alvise

    2015-01-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing inferences based on the power spectrum. We also construct a formalism for predicting biases for cosmological parameter measurements, and we demonstrate that a 0.3% interloper fraction could bias measurements of the growth rate by more than 10% of the error, which can affect constraints from upcoming surveys o...

  9. High pressure sheet metal forming of large scale body structures

    Energy Technology Data Exchange (ETDEWEB)

    Trompeter, M.; Krux, R.; Homberg, W.; Kleiner, M. [Dortmund Univ. (Germany). Inst. of Forming Technology and Lightweight Construction

    2005-07-01

    An important trend in the automotive industry is the weight reduction of car bodies by lightweight construction. One approach to realise lightweight structures is the use of load optimised sheet metal parts (e.g. tailored blanks), especially for crash relevant car body structures. To form such parts which are mostly complex and primarily made of high strength steels, the use of working media based forming processes is favorable. The paper presents the manufacturing of a large scale structural component made of tailor rolled blanks (TRB) by high pressure sheet metal forming (HBU). The paper focuses mainly on the tooling system, which is integrated into a specific 100 MN hydroform press at the IUL. The HBU tool basically consists of a multipoint blankholder, a specially designed flange draw-in sensor, which is necessary to determine the material flow, and a sealing system. Furthermore, the paper presents a strategy for an effective closed loop flange draw-in control. (orig.)

  10. Mass Efficiencies for Common Large-Scale Precision Space Structures

    Science.gov (United States)

    Williams, R. Brett; Agnes, Gregory S.

    2005-01-01

This paper presents a mass-based trade study for large-scale deployable triangular trusses, where the longerons can be monocoque tubes, isogrid tubes, or coilable longeron trusses. Such structures are typically used to support heavy reflectors, solar panels, or other instruments, and are subject to thermal gradients that can vary a great deal based on orbital altitude, location in orbit, and self-shadowing. While multilayer insulation (MLI) blankets are commonly used to minimize the magnitude of these thermal disturbances, they subject the truss to a nonstructural mass penalty. This paper investigates the impact of these add-on thermal protection layers on selecting the lightest precision structure for a given loading scenario.

  11. Computational solutions to large-scale data management and analysis.

    Science.gov (United States)

    Schadt, Eric E; Linderman, Michael D; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P

    2010-09-01

Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist, such as cloud and heterogeneous computing, to successfully tackle our big data problems.

  12. Statistics of Caustics in Large-Scale Structure Formation

    CERN Document Server

    Feldbrugge, Job; van de Weygaert, Rien

    2014-01-01

    The cosmic web is a complex spatial pattern of walls, filaments, cluster nodes and underdense void regions. It emerged through gravitational amplification from the Gaussian primordial density field. Here we infer analytical expressions for the spatial statistics of caustics in the evolving large-scale mass distribution. In our analysis, following the quasi-linear Zeldovich formalism and confined to the 1D and 2D situation, we compute number density and correlation properties of caustics in cosmic density fields that evolve from Gaussian primordial conditions. The analysis can be straightforwardly extended to the 3D situation. We moreover, are currently extending the approach to the non-linear regime of structure formation by including higher order Lagrangian approximations and Lagrangian effective field theory.

  13. Battery technologies for large-scale stationary energy storage.

    Science.gov (United States)

    Soloveichik, Grigorii L

    2011-01-01

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  14. Large scale land use cartography of special areas

    Energy Technology Data Exchange (ETDEWEB)

Amico, F.D.; Maccarone, D.; Pandiscia, G.V. [Nuova Telespazio S.p.A., Rome (Italy)] [and others]

    1996-11-01

On 06 October 1993, an aerial remote sensing mission was carried out over the "Mounts of the Sila" area, using a DAEDALUS ATM multispectral scanner, in the framework of the TELAER project, supported by I.A.S.M. (Istituto per l'Assistenza e lo Sviluppo del Mezzogiorno). The study area is inside the National Park of Calabria, well known for its coniferous forests. The collected imagery was used to produce large-scale land use cartography, at a scale of 1 to 5000, extracting information on natural and anthropic vegetation from the multispectral images, with the aid of stereo photos acquired simultaneously. 5 refs., 1 fig., 1 tab.

  15. Recovery Act - Large Scale SWNT Purification and Solubilization

    Energy Technology Data Exchange (ETDEWEB)

    Michael Gemano; Dr. Linda B. McGown

    2010-10-07

    The goal of this Phase I project was to establish a quantitative foundation for development of binary G-gels for large-scale, commercial processing of SWNTs and to develop scientific insight into the underlying mechanisms of solubilization, selectivity and alignment. In order to accomplish this, we performed systematic studies to determine the effects of G-gel composition and experimental conditions that will enable us to achieve our goals that include (1) preparation of ultra-high purity SWNTs from low-quality, commercial SWNT starting materials, (2) separation of MWNTs from SWNTs, (3) bulk, non-destructive solubilization of individual SWNTs in aqueous solution at high concentrations (10-100 mg/mL) without sonication or centrifugation, (4) tunable enrichment of subpopulations of the SWNTs based on metallic vs. semiconductor properties, diameter, or chirality and (5) alignment of individual SWNTs.

  16. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas such as sensor placement in large-scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. It demonstrates a generalized method for stochastic nonlinear programming based on a sampling-based approach for uncertainty analysis, using statistical reweighting to obtain probability information. Stochastic optimization problems are difficult to solve because they involve nested optimization and uncertainty loops. There are two fundamental approaches to such problems: the first uses decomposition techniques, while the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques place significant limitations on either the type of objective function or the underlying distributions of the uncertain variables. Moreover, these ...
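    The core reweighting idea behind such sampling-based methods can be illustrated with a toy sketch (this is an illustration of the general technique, not the authors' implementation; the Gaussian distributions, the quadratic cost, and all names here are assumptions): the model is evaluated once on a base sample of the uncertain variable, and expectations under design-dependent distributions are then estimated by density-ratio reweighting instead of resampling.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # One base sample from a broad sampling distribution N(0, 2^2) for the
    # uncertain variable w; the (toy) model is evaluated only on these points.
    w = rng.normal(0.0, 2.0, size=5000)
    cost = w**2                              # model output, computed once

    def pdf(x, mu, sigma):
        """Gaussian density, used to form importance weights."""
        return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    base_pdf = pdf(w, 0.0, 2.0)

    def expected_cost(mu):
        """Estimate E[cost] when w ~ N(mu, 1), by reweighting the base sample."""
        wts = pdf(w, mu, 1.0) / base_pdf     # density ratio (importance weights)
        wts /= wts.sum()                     # self-normalize
        return float(np.sum(wts * cost))

    # Outer optimization loop: a simple grid search over the design variable mu.
    # Analytically E[w^2 | w ~ N(mu, 1)] = mu^2 + 1, so the minimum is at mu = 0.
    grid = np.linspace(-2.0, 2.0, 41)
    best_mu = min(grid, key=expected_cost)
    ```

    The key saving is that `cost` is never recomputed inside the optimization loop; only the cheap density ratio changes with the design variable.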

  17. A large-scale crop protection bioassay data set.

    Science.gov (United States)

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  18. Automatic Installation and Configuration for Large Scale Farms

    CERN Document Server

    Novák, J

    2005-01-01

    Since the early appearance of commodity hardware, the use of computers has risen rapidly, and they have become essential in all areas of life. It was soon realized that nodes could work cooperatively to solve new, more complex tasks. This concept materialized in coherent aggregations of computers called farms and clusters. The collective use of nodes, being efficient and economical, was soon adopted in education, research and industry. But maintenance, especially at large scale, emerged as a problem to be solved. New challenges required new methods and tools, and development work was started to build farm management applications and frameworks. In the first part of the thesis, these systems are introduced. After a general description of the subject, a comparative analysis of different approaches and tools illustrates the practical aspects of the theoretical discussion. CERN, the European Organization for Nuclear Research, is the largest particle physics laboratory in the world....

  19. Transition from large-scale to small-scale dynamo.

    Science.gov (United States)

    Ponty, Y; Plunian, F

    2011-04-15

    The dynamo equations are solved numerically with a helical forcing corresponding to the Roberts flow. In the fully turbulent regime the flow behaves as a Roberts flow on long time scales, plus turbulent fluctuations at short time scales. The dynamo onset is controlled by the long time scales of the flow, in agreement with the earlier Karlsruhe experimental results. The dynamo mechanism is governed by a generalized α effect, which includes the usual α effect and turbulent diffusion, plus all higher-order effects. Beyond the onset we find that this generalized α effect scales as O(Rm^-1), suggesting the takeover of small-scale dynamo action. This is confirmed by simulations in which dynamo action occurs even if the large-scale field is artificially suppressed.
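    For reference, the "usual α effect and turbulent diffusion" mentioned above are the first two terms of the standard mean-field induction equation; in conventional mean-field notation (not taken from the paper itself) it reads:

    ```latex
    % Mean-field induction equation for the large-scale field <B>:
    % alpha <B> is the usual alpha effect, beta the turbulent diffusivity,
    % eta the molecular magnetic diffusivity.
    \partial_t \langle \mathbf{B} \rangle
      = \nabla \times \left( \langle \mathbf{u} \rangle \times \langle \mathbf{B} \rangle
        + \alpha \langle \mathbf{B} \rangle
        - \beta \, \nabla \times \langle \mathbf{B} \rangle \right)
      + \eta \nabla^2 \langle \mathbf{B} \rangle
    ```

    The "generalized α effect" of the paper subsumes both the α and β terms, together with all higher-order contributions of the turbulent electromotive force.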

  20. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
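    The basic mechanics of matrix sampling can be sketched in a few lines (a hypothetical illustration only; the function name and the choice of non-overlapping forms are assumptions, since real testing programs may also share anchor items across forms):

    ```python
    import random

    def matrix_sample(items, n_forms, seed=0):
        """Divide an item pool into n_forms non-overlapping test forms.

        Items are shuffled (reproducibly, via the seed) and dealt out
        round-robin, so each student answers only one shorter form while
        the forms together cover the full pool.
        """
        pool = list(items)
        random.Random(seed).shuffle(pool)
        return [pool[i::n_forms] for i in range(n_forms)]

    # A pool of 30 items split into 3 forms of 10 items each;
    # every item appears in exactly one form.
    forms = matrix_sample(range(30), n_forms=3)
    ```

    Score comparability across forms, the main cost noted above, then has to be restored statistically (e.g. by equating), which this sketch does not attempt.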