WorldWideScience

Sample records for potential large-scale re-emergence

  1. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  2. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  3. How is Europe positioned for a re-emergence of Schmallenberg virus?

    Science.gov (United States)

    Stavrou, Anastasios; Daly, Janet M; Maddison, Ben; Gough, Kevin; Tarlinton, Rachael

    2017-12-01

    Schmallenberg virus (SBV) caused a large scale epidemic in Europe from 2011 to 2013, infecting ruminants and causing foetal deformities after infection of pregnant animals. The main impact of the virus was financial loss due to restrictions on trade of animals, meat and semen. Although effective vaccines were produced, their uptake was never high. Along with the subsequent decline in new SBV infections and natural replacement of previously exposed livestock, this has resulted in a decrease in the number of protected animals. Recent surveillance has shown that a large population of naïve animals is currently present in Europe and that the virus is circulating at a low level. These changes in animal status, in combination with favourable conditions for insect vectors, may open the door to the re-emergence of SBV and another large scale outbreak in Europe. This review details the potential and preparedness for SBV re-emergence in Europe, discusses possible co-ordinated sentinel monitoring programmes for ruminant seroconversion and the presence of SBV in the insect vectors, and provides an overview of the economic impact associated with diagnosis, control and the effects of non-vaccination. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Software Toolchain for Large-Scale RE-NFA Construction on FPGA

    Directory of Open Access Journals (Sweden)

    Yi-Hua E. Yang

    2009-01-01

    and O(n×m) memory by our software. A large number of RE-NFAs are placed onto a two-dimensional staged pipeline, allowing scalability to thousands of RE-NFAs with linear area increase and little clock rate penalty due to scaling. On a PC with a 2 GHz Athlon64 processor and 2 GB memory, our prototype software constructs hundreds of RE-NFAs used by Snort in less than 10 seconds. We also designed a benchmark generator which can produce RE-NFAs with configurable pattern complexity parameters, including state count, state fan-in, and loop-back and feed-forward distances. Several regular expressions with various complexities are used to test the performance of our RE-NFA construction software.
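
    The O(n×m) figure above reflects a general property of Thompson-style constructions: a regular expression of length m compiles to an epsilon-NFA with O(m) states, so n expressions need O(n×m) states in total. As a rough illustration (a textbook construction in Python, not the paper's FPGA toolchain; the regex subset and function names are invented for this sketch), the following builds and simulates such an NFA:

```python
import itertools

def thompson(regex):
    """Thompson construction for a tiny regex subset: literals,
    concatenation, '|', '*', and parentheses.  Returns (start, accept,
    trans); trans maps (state, symbol) -> set(states), and the symbol
    None marks an epsilon move.  Each operator adds at most two states,
    so the NFA has O(m) states for a length-m expression."""
    new_state = itertools.count().__next__
    trans = {}
    def add(s, sym, t):
        trans.setdefault((s, sym), set()).add(t)
    pos = 0

    def alternation():
        nonlocal pos
        s, a = concatenation()
        while pos < len(regex) and regex[pos] == '|':
            pos += 1
            s2, a2 = concatenation()
            ns, na = new_state(), new_state()
            add(ns, None, s); add(ns, None, s2)
            add(a, None, na); add(a2, None, na)
            s, a = ns, na
        return s, a

    def concatenation():
        nonlocal pos
        s, a = starred_atom()
        while pos < len(regex) and regex[pos] not in '|)':
            s2, a2 = starred_atom()
            add(a, None, s2)      # epsilon-link the pieces in sequence
            a = a2
        return s, a

    def starred_atom():
        nonlocal pos
        if regex[pos] == '(':
            pos += 1
            s, a = alternation()
            pos += 1                       # consume ')'
        else:
            s, a = new_state(), new_state()
            add(s, regex[pos], a)          # literal character
            pos += 1
        if pos < len(regex) and regex[pos] == '*':
            pos += 1
            ns, na = new_state(), new_state()
            add(ns, None, s); add(ns, None, na)   # skip or enter loop
            add(a, None, s); add(a, None, na)     # repeat or leave
            s, a = ns, na
        return s, a

    start, accept = alternation()
    return start, accept, trans

def matches(nfa, text):
    """Simulate the epsilon-NFA on `text` via subset tracking."""
    start, accept, trans = nfa
    def closure(states):
        stack, seen = list(states), set(states)
        while stack:
            for t in trans.get((stack.pop(), None), ()):
                if t not in seen:
                    seen.add(t)
                    stack.append(t)
        return seen
    current = closure({start})
    for ch in text:
        step = set()
        for s in current:
            step |= trans.get((s, ch), set())
        current = closure(step)
    return accept in current
```

    The state count stays linear in the expression length; the paper's contribution is mapping many such NFAs onto a staged FPGA pipeline rather than simulating them in software as here.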

  5. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts that play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law has motivated differing views of the dependence between the two scalings, which has yet to be clarified. In this article, we observe an evolution of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at early times, while a crossover emerges as the two become inconsistent at later times, before the system reaches a stable state in which Heaps' law persists while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and empirical results from pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model simulating pandemic spread at the country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help in understanding the temporal evolution of the scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of the epidemic process, and the heterogeneity of epidemic spread underlines the importance of targeted containment strategies early in a pandemic.
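
    Both scalings referred to above can be measured on any stream of discrete events. As a hedged illustration (synthetic data only, not the paper's metapopulation model; all parameter values are invented), the sketch below draws tokens from a Zipf-like rank distribution and records the Heaps vocabulary-growth curve V(N):

```python
import math
import random

def zipf_sample(n_types, exponent, size, seed=0):
    """Draw `size` tokens over ranks 1..n_types with P(rank r) ∝ r**(-exponent),
    i.e. a Zipf-like rank-frequency distribution."""
    rng = random.Random(seed)
    weights = [r ** -exponent for r in range(1, n_types + 1)]
    return rng.choices(range(1, n_types + 1), weights=weights, k=size)

def heaps_curve(tokens):
    """V(N): the number of distinct types seen after the first N tokens."""
    seen, curve = set(), []
    for tok in tokens:
        seen.add(tok)
        curve.append(len(seen))
    return curve

tokens = zipf_sample(n_types=5000, exponent=1.0, size=20000)
curve = heaps_curve(tokens)
# Heaps' law predicts V(N) ~ N**beta with 0 < beta < 1: the vocabulary
# keeps growing, but sublinearly.  A crude two-point estimate:
beta = math.log(curve[-1] / curve[999]) / math.log(len(tokens) / 1000)
```

    Plotting the rank-frequency of `tokens` on log-log axes would show the Zipf side; the abstract's point is that the two exponents need not remain mutually consistent as the process evolves.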

  6. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  7. Carbon dioxide recycling: emerging large-scale technologies with industrial potential.

    Science.gov (United States)

    Quadrelli, Elsje Alessandra; Centi, Gabriele; Duplan, Jean-Luc; Perathoner, Siglinda

    2011-09-19

    This Review introduces this special issue of ChemSusChem dedicated to CO2 recycling. Its aim is to offer an up-to-date overview of CO2 chemical utilization (inorganic mineralization, organic carboxylation, reduction reactions, and biochemical conversion), as a continuation and extension of earlier books and reviews on this topic, but with a specific focus on large-volume routes and projects/pilot plants that are currently emerging at (pre-)industrial level. The Review also highlights how some of these routes will offer a valuable opportunity to introduce renewable energy into the existing energy and chemical infrastructure (i.e., "drop-in" renewable energy) by synthesizing from CO2 chemicals that are easy to transport and store. CO2 conversion therefore has the potential to become a key pillar of the sustainable and resource-efficient production of chemicals and energy from renewables. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Recent Regional Climate State and Change - Derived through Downscaling Homogeneous Large-scale Components of Re-analyses

    Science.gov (United States)

    Von Storch, H.; Klehmet, K.; Geyer, B.; Li, D.; Schubert-Frisius, M.; Tim, N.; Zorita, E.

    2015-12-01

    Global re-analyses suffer from inhomogeneities, as they process data from observational networks that were still developing. However, the large-scale component of such re-analyses is mostly homogeneous; additional observational data mostly improve the description of regional details and contribute less to the large-scale states. Therefore, the concept of downscaling may be applied to complement the large-scale state of the re-analyses homogeneously with regional detail, wherever the condition of homogeneity of the large scales is fulfilled. Technically this can be done with a regional climate model, or with a global climate model that is constrained on the large scale by spectral nudging. This approach has been developed and tested for Europe, and a skillful representation of regional risks, in particular marine risks, was identified. While the data density in Europe is considerably better than in most other regions of the world, even here insufficient spatial and temporal coverage limits risk assessments; downscaled data sets are therefore frequently used by off-shore industries. We have also run this system in regions with reduced or absent data coverage, such as the Lena catchment in Siberia, the Yellow Sea/Bo Hai region in East Asia, and Namibia and the adjacent Atlantic Ocean. A global (large-scale constrained) simulation has also been performed. It turns out that a spatially detailed reconstruction of the state and change of climate over the past three to six decades is doable for any region of the world. The different data sets are archived and may be freely used for scientific purposes. Of course, before application, a careful analysis of their quality for the intended application is needed, as unexpected changes in the quality of the description of the large-scale driving states sometimes prevail.
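
    Spectral nudging, mentioned above, constrains only the low-wavenumber (large-scale) part of the model state toward the driving re-analysis, leaving the small scales free to develop regional detail. A minimal one-dimensional sketch of the idea (illustrative only; `k_max` and `alpha` are hypothetical knobs, and real implementations apply this to selected modes inside a running atmosphere model):

```python
import numpy as np

def spectral_nudge(model_field, driving_field, k_max, alpha=0.5):
    """Relax the Fourier modes 0..k_max of `model_field` toward
    `driving_field` with strength alpha in [0, 1]; higher wavenumbers
    (the small scales) are left untouched."""
    fm = np.fft.rfft(model_field)
    fd = np.fft.rfft(driving_field)
    fm[:k_max + 1] += alpha * (fd[:k_max + 1] - fm[:k_max + 1])
    return np.fft.irfft(fm, n=len(model_field))

# Toy example: the model has a biased large scale plus small-scale detail.
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
driving = np.sin(x)                              # "re-analysis" large scale (k = 1)
model = 0.5 * np.sin(x) + 0.2 * np.sin(20 * x)   # biased k = 1, free k = 20
nudged = spectral_nudge(model, driving, k_max=5, alpha=1.0)
```

    With alpha = 1 the k = 1 mode is replaced by the driving field's, while the k = 20 detail survives unchanged; with 0 < alpha < 1 the large scales are only relaxed toward the driving state.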

  9. Utilisation of ISA Reverse Genetics and Large-Scale Random Codon Re-Encoding to Produce Attenuated Strains of Tick-Borne Encephalitis Virus within Days.

    Science.gov (United States)

    de Fabritus, Lauriane; Nougairède, Antoine; Aubry, Fabien; Gould, Ernest A; de Lamballerie, Xavier

    2016-01-01

    Large-scale codon re-encoding is a new method of attenuating RNA viruses. However, the use of infectious clones to generate attenuated viruses has inherent technical problems. We previously developed a bacterium-free reverse genetics protocol, designated ISA, and have now combined it with a large-scale random codon re-encoding method to produce attenuated tick-borne encephalitis virus (TBEV), a pathogenic flavivirus which causes febrile illness and encephalitis in humans. We produced wild-type (WT) and two re-encoded TBEVs, containing 273 or 273+284 synonymous mutations in the NS5 and NS5+NS3 coding regions, respectively. Both re-encoded viruses were attenuated compared with the WT virus in a laboratory mouse model, and the relative level of attenuation increased with the degree of re-encoding. Moreover, all infected animals produced neutralizing antibodies. This novel, rapid and efficient approach to engineering attenuated viruses could potentially expedite the development of safe and effective new-generation live attenuated vaccines.
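
    Random codon re-encoding exploits the degeneracy of the genetic code: swapping codons for synonyms leaves the encoded protein unchanged while rewriting the nucleotide sequence. A minimal sketch of that operation (illustrative only; the table below covers just three amino acids, and the study's actual procedure placed hundreds of synonymous mutations across the NS5 and NS3 regions):

```python
import random

# Partial standard codon table, grouped into synonymous families
# (illustration only; a real tool would carry all 61 sense codons).
SYNONYMS = {}
for family in (['GCT', 'GCC', 'GCA', 'GCG'],   # Ala
               ['AAA', 'AAG'],                 # Lys
               ['GAT', 'GAC']):                # Asp
    for codon in family:
        SYNONYMS[codon] = family

def re_encode(cds, seed=0):
    """Replace each codon with a randomly chosen synonym (identity is
    allowed); codons absent from the table are left unchanged, so the
    encoded protein is preserved by construction."""
    rng = random.Random(seed)
    codons = [cds[i:i + 3] for i in range(0, len(cds), 3)]
    return ''.join(rng.choice(SYNONYMS.get(c, [c])) for c in codons)
```

    Applying `re_encode` to a coding sequence such as "GCTAAAGAT" (Ala-Lys-Asp) returns a sequence encoding the same tripeptide; raising the number of re-encoded codons raises the mutational load, which is the attenuation lever described in the abstract.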

  10. Bactericidal assessment of nano-silver on emerging and re-emerging human pathogens.

    Science.gov (United States)

    Anuj, Samir A; Gajera, Harsukh P; Hirpara, Darshna G; Golakiya, Baljibhai A

    2018-04-24

    With the threat of a growing number of antibiotic-resistant bacteria, the re-emergence of previously deadly infections and the emergence of new infections, there is an urgent need for novel therapeutic agents. Silver in nano form, which is increasingly used as an antibacterial agent, may extend its antibacterial application to emerging and re-emerging multidrug-resistant pathogens, the main cause of nosocomial diseases worldwide. In the present study, a completely bottom-up method was used to prepare green nano-silver. To explore the action of nano-silver on the emerging pathogen Bacillus megaterium MTCC 7192 and the re-emerging pathogen Pseudomonas aeruginosa MTCC 741, the study includes an analysis of bacterial membrane damage by Scanning Electron Microscopy (SEM) as well as alteration of the zeta potential and intracellular leakage. In this work, we observed genuine bactericidal activity of nano-silver, compared with broad-spectrum antibiotics, against both the emerging and the re-emerging pathogen. After exposure to nano-silver, the membrane becomes scattered from its original ordered arrangement, based on SEM observation. Moreover, our results also suggested that alteration of the zeta potential enhanced membrane permeability and, beyond a critical point, leads to cell death. The leakage of intracellular constituents was confirmed by Gas Chromatography-Mass Spectrometry (GC-MS). In conclusion, the combined results suggested that, at a specific dose, nano-silver may destroy the structure of the bacterial membrane and depress its activity, eventually causing the bacteria to die. Copyright © 2018 Elsevier GmbH. All rights reserved.

  11. Large scale integration of flexible non-volatile, re-addressable memories using P(VDF-TrFE) and amorphous oxide transistors

    International Nuclear Information System (INIS)

    Gelinck, Gerwin H; Cobb, Brian; Van Breemen, Albert J J M; Myny, Kris

    2015-01-01

    Ferroelectric polymers and amorphous metal oxide semiconductors have emerged as important materials for re-programmable non-volatile memories and high-performance, flexible thin-film transistors, respectively. However, realizing sophisticated transistor memory arrays has proven to be a challenge, and reliable writing to and reading from such a large-scale memory has thus far not been demonstrated. Here, we report an integration of ferroelectric, P(VDF-TrFE), transistor memory arrays with thin-film circuitry that can address each individual memory element in that array. n-type indium gallium zinc oxide is used as the active channel material in both the memory and logic thin-film transistors. The maximum process temperature is 200 °C, allowing plastic films to be used as substrate material. The technology was scaled up to 150 mm wafer size, and offers good reproducibility, high device yield and low device variation. This forms the basis for the successful demonstration of memory arrays, read and write circuitry, and the integration of these. (paper)

  12. A re-entrant flowshop heuristic for online scheduling of the paper path in a large scale printer

    NARCIS (Netherlands)

    Waqas, U.; Geilen, M.C.W.; Kandelaars, J.; Somers, L.J.A.M.; Basten, T.; Stuijk, S.; Vestjens, P.G.H.; Corporaal, H.

    2015-01-01

    A Large Scale Printer (LSP) is a Cyber Physical System (CPS) printing thousands of sheets per day with high quality. The print requests arrive at run-time requiring online scheduling. We capture the LSP scheduling problem as online scheduling of re-entrant flowshops with sequence dependent setup

  13. "Large"- vs small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998 and Phys. Rev. Fluids 2, 062601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Re_τ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more recent developments in the field, in particular those related to the discovery of (very) large-scale motions. The goals of the paper are as follows: first, to better characterise the physics of the control and assess which external contributions (vortices, forcing, wall motion) are actually needed; then, to investigate the optimal parameters; and finally, to determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution also addresses the potential effect of the naturally occurring large-scale motions on frictional drag, and gives indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  14. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    Science.gov (United States)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements, from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.
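
    The factor-of-eight growth quoted above is just the cube of the doubling: halving the grid spacing doubles the point count along each of three dimensions. A trivial sketch of that arithmetic (hypothetical sizes, not NCCS figures):

```python
def dataset_size(base_tb, doublings):
    """Data volume after `doublings` successive resolution doublings:
    each doubling multiplies the grid point count by 2 in each of three
    dimensions, i.e. by 2**3 = 8 overall."""
    return base_tb * 8 ** doublings
```

    Three doublings turn a 2 TB archive into 2 * 8**3 = 1024 TB, about a petabyte, which is why high-resolution global models quickly fill a repository of the 15 PB scale mentioned above.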

  15. Threats from emerging and re-emerging neglected tropical diseases (NTDs).

    Science.gov (United States)

    Mackey, Tim K; Liang, Bryan A

    2012-01-01

    Neglected tropical diseases impact over 1 billion of the world's poorest populations and require special attention. However, within the NTDs recognized by the World Health Organization, some are also dually categorized as emerging and re-emerging infectious diseases, requiring more detailed examination of potential global health risks. We reviewed the 17 NTDs classified by the WHO to determine if those NTDs were also categorized by the US Centers for Disease Control and Prevention as emerging and re-emerging infectious diseases ("EReNTDs"). We then identified common characteristics and risks associated with EReNTDs. The identified EReNTDs of dengue, rabies, Chagas disease, and cysticercosis disproportionately impact resource-poor settings with poor social determinants of health, spread through globalization, are impacted by vector control, lack available treatments, and threaten global health security. This traditionally neglected subset of diseases requires urgent attention and unique incentive structures to encourage investment in innovation and coordination. Multi-sectorial efforts and targeted public-private partnerships would spur needed R&D for effective and accessible EReNTD treatments, improvement of social determinants of health, crucial low-income country development, and health system strengthening efforts. Utilization of One Health principles is essential for enhancing knowledge to efficaciously address public health aspects of these EReNTDs globally.

  16. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4·10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = -0.0909 to Rot = 0.3 are simulated. First, the LES of TC is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
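
    The static Smagorinsky model mentioned above closes the subgrid stresses with an eddy viscosity nu_t = (c_s * Delta)**2 * |S|, where |S| is the resolved strain-rate magnitude and Delta the filter width. A one-dimensional sketch of that formula (illustrative only; a real LES uses the full 3-D strain-rate tensor and ties Delta to the grid):

```python
import numpy as np

def smagorinsky_viscosity(u, dx, cs=0.1):
    """Static Smagorinsky eddy viscosity for a 1-D velocity profile,
    nu_t = (cs * dx)**2 * |du/dx|, with cs = 0.1 as quoted in the
    abstract and the grid spacing dx standing in for the filter width."""
    dudx = np.gradient(u, dx)
    return (cs * dx) ** 2 * np.abs(dudx)
```

    An "over-damped" LES in the abstract's sense corresponds to raising cs well above 0.1, which increases nu_t quadratically and smooths the small scales while leaving the largest structures.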

  17. Processing and properties of large grain (RE)BCO

    International Nuclear Information System (INIS)

    Cardwell, D.A.

    1998-01-01

    The potential of high temperature superconductors to generate large magnetic fields and to carry current with low power dissipation at 77 K is particularly attractive for a variety of permanent magnet applications. As a result, large grain bulk (RE)-Ba-Cu-O ((RE)BCO) materials have been developed by melt process techniques in an attempt to fabricate practical materials for use in high field devices. This review outlines the current state of the art in this field of processing, including seeding requirements for the controlled fabrication of these materials and the origin of striking growth features such as the formation of a facet plane around the seed, platelet boundaries and (RE)2BaCuO5 (RE-211) inclusions in the seeded melt grown microstructure. An observed variation in critical current density in large grain (RE)BCO samples is accounted for by Sm contamination of the material in the vicinity of the seed and by the development of a non-uniform growth morphology at ∼4 mm from the seed position. (RE)Ba2Cu3O7−δ (RE-123) dendrites are observed to form and broaden preferentially within the a/b plane of the lattice in this growth regime. Finally, trapped fields in excess of 3 T have been reported in irradiated U-doped YBCO, and (RE)1+xBa2−xCu3Oy (RE = Sm, Nd) materials have been observed to carry transport current in fields of up to 10 T at 77 K. This underlines the potential of bulk (RE)BCO materials for practical permanent magnet type applications. (orig.)

  18. Indicators for early identification of re-emerging mycotoxins

    NARCIS (Netherlands)

    Fels-Klerx, van der H.J.; Dekkers, S.; Kandhai, M.C.; Jeurissen, S.M.F.; Booij, C.J.H.; Heer, de C.

    2010-01-01

    The aim of this study was to select the most important indicators for early identification of re-emerging mycotoxins in wheat, maize, peanuts and tree nuts. The study was based on a holistic approach and, consequently, potential indicators were evaluated not only from the food production chain but

  19. Threats from emerging and re-emerging neglected tropical diseases (NTDs

    Directory of Open Access Journals (Sweden)

    Tim K. Mackey

    2012-08-01

    Background: Neglected tropical diseases impact over 1 billion of the world's poorest populations and require special attention. However, within the NTDs recognized by the World Health Organization, some are also dually categorized as emerging and re-emerging infectious diseases, requiring more detailed examination of potential global health risks. Methods: We reviewed the 17 NTDs classified by the WHO to determine if those NTDs were also categorized by the US Centers for Disease Control and Prevention as emerging and re-emerging infectious diseases ("EReNTDs"). We then identified common characteristics and risks associated with EReNTDs. Results: The identified EReNTDs of dengue, rabies, Chagas disease, and cysticercosis disproportionately impact resource-poor settings with poor social determinants of health, spread through globalization, are impacted by vector control, lack available treatments, and threaten global health security. This traditionally neglected subset of diseases requires urgent attention and unique incentive structures to encourage investment in innovation and coordination. Discussion: Multi-sectorial efforts and targeted public-private partnerships would spur needed R&D for effective and accessible EReNTD treatments, improvement of social determinants of health, crucial low-income country development, and health system strengthening efforts. Utilization of One Health principles is essential for enhancing knowledge to efficaciously address public health aspects of these EReNTDs globally.

  20. Emerging and Re-Emerging Zoonoses of Dogs and Cats

    Directory of Open Access Journals (Sweden)

    Bruno B. Chomel

    2014-07-01

    Since the middle of the 20th century, pets have increasingly been considered "family members" within households. However, cats and dogs still can be a source of human infection by various zoonotic pathogens. Among emerging or re-emerging zoonoses, viral diseases such as rabies (mainly from the dog pet trade or travel abroad), but also feline cowpox and newly recognized noroviruses, rotaviruses or influenza viruses, can sicken our pets and be transmitted to humans. Bacterial zoonoses include bacteria transmitted by bites or scratches, such as pasteurellosis or cat scratch disease, leading to severe clinical manifestations in people because of their age or immune status, and also because of our closeness, not to say intimacy, with our pets. Cutaneous contamination with methicillin-resistant Staphylococcus aureus or Leptospira spp., and/or aerosolization of bacteria causing tuberculosis or kennel cough, are also emerging/re-emerging routes of pathogen transmission from our pets, as are gastro-intestinal pathogens such as Salmonella or Campylobacter. Parasitic and fungal pathogens, such as echinococcosis, leishmaniasis, onchocercosis, or sporotrichosis, are also re-emerging or emerging pet-related zoonoses. Common sense and good personal and pet hygiene are the key elements to prevent such a risk of zoonotic infection.

  1. Assessment of renewable energy resources potential for large scale and standalone applications in Ethiopia

    NARCIS (Netherlands)

    Tucho, Gudina Terefe; Weesie, Peter D.M.; Nonhebel, Sanderine

    2014-01-01

    This study aims to determine the contribution of renewable energy to large scale and standalone application in Ethiopia. The assessment starts by determining the present energy system and the available potentials. Subsequently, the contribution of the available potentials for large scale and

  2. [Emergent viral infections]

    NARCIS (Netherlands)

    Galama, J.M.D.

    2001-01-01

    The emergence and re-emergence of viral infections is an ongoing process. Large-scale vaccination programmes led to the eradication or control of some viral infections in the last century, but new viruses are always emerging. Increased travel is leading to a rise in the importation of exotic

  3. Burden of emerging/re-emerging diseases in India

    Indian Academy of Sciences (India)

    Burden of emerging/re-emerging diseases in India. 1-2 million deaths for 1994 epidemic of plague. 20,565 deaths in 2004 due to rabies. 400 million chronic carriers of hepatitis B virus. More than 18 million carriers of hepatitis C virus. 'Mutant' measles virus infection in ...

  4. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Science.gov (United States)

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  5. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Directory of Open Access Journals (Sweden)

    Xinhua He

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

  6. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    Science.gov (United States)

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
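
    The "minimal queuing response time" objective in these abstracts is not spelled out beyond the summary, but a standard building block for such location-allocation models is the mean response time of an M/M/c queue via the Erlang C formula. A hedged sketch of that classical result (queueing-theory background, not the authors' actual model):

```python
import math

def mmc_mean_response_time(lam, mu, c):
    """Mean response time W = Wq + 1/mu of an M/M/c queue with arrival
    rate lam, per-server service rate mu and c servers, using the
    Erlang C probability that an arrival has to wait.  Stable only
    for lam < c * mu."""
    rho = lam / (c * mu)
    if rho >= 1.0:
        raise ValueError("unstable queue: need lam < c * mu")
    a = lam / mu                                   # offered load
    below = sum(a ** k / math.factorial(k) for k in range(c))
    tail = a ** c / (math.factorial(c) * (1.0 - rho))
    p_wait = tail / (below + tail)                 # Erlang C
    wq = p_wait / (c * mu - lam)                   # mean wait in queue
    return wq + 1.0 / mu
```

    A dispatcher deciding how many servers to open at each rescue-demand location could minimise a demand-weighted sum of such response times; for c = 1 the formula reduces to the familiar M/M/1 value 1/(mu - lam).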

  7. The Staff Observation Aggression Scale - Revised (SOAS-R) - adjustment and validation for emergency primary health care.

    Science.gov (United States)

    Morken, Tone; Baste, Valborg; Johnsen, Grethe E; Rypdal, Knut; Palmstierna, Tom; Johansen, Ingrid Hjulstad

    2018-05-08

Many emergency primary health care workers experience aggressive behaviour from patients or visitors. Simple incident-reporting procedures exist for inpatient psychiatric care, but a similar and simple incident report for other health care settings is lacking. The aim was to adjust a pre-existing form for reporting aggressive incidents in a psychiatric inpatient setting to the emergency primary health care settings. We also wanted to assess the validity of the severity scores in emergency primary health care. The Staff Observation Aggression Scale - Revised (SOAS-R) was adjusted to create a pilot version of the Staff Observation Aggression Scale - Revised Emergency (SOAS-RE). A Visual Analogue Scale (VAS) was added to the form to judge the severity of the incident. Data for validation of the pilot version of the SOAS-RE were collected from ten casualty clinics in Norway during 12 months. Analysis of variance was used to test gender and age differences. Linear regression analysis was performed to evaluate the relative impact that each of the five SOAS-RE columns had on the VAS score. The association between the SOAS-RE severity score and the VAS severity score was calculated by the Pearson correlation coefficient. The SOAS-R was adjusted to emergency primary health care, refined, and called the Staff Observation Aggression Scale - Revised Emergency (SOAS-RE). A total of 350 SOAS-RE forms were collected from the casualty clinics, but due to missing data, 291 forms were included in the analysis. SOAS-RE scores ranged from 1 to 22. The mean total severity score of the SOAS-RE was 10.0 (standard deviation (SD) = 4.1) and the mean VAS score was 45.4 (SD = 26.7). We found a significant correlation of 0.45 between the SOAS-RE total severity scores and the VAS severity ratings. The linear regression analysis showed that, individually, each of the categories which described the incident had a low impact on the VAS score. The SOAS-RE seems to be a useful instrument for research, incident-recording and management.
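The validation step above correlates two severity measures (SOAS-RE total score vs. VAS rating) with the Pearson coefficient. A minimal sketch of that arithmetic, using made-up toy scores rather than the study's data:

```python
import math

# Hypothetical paired severity scores: SOAS-RE totals (1-22 scale)
# and VAS ratings (0-100 scale). Toy data for illustration only.
soas = [4, 7, 9, 10, 12, 14, 16, 18, 20, 22]
vas = [15, 30, 25, 45, 40, 55, 60, 50, 70, 80]

def pearson(x, y):
    """Pearson correlation: covariance divided by the product of SDs."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(soas, vas)
print(round(r, 2))
```

A coefficient near 1 would mean the two severity judgements rank incidents almost identically; the study's reported 0.45 indicates only moderate agreement.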

  8. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed for its members a study in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then, a state of the art of the electrolysis modules currently available was established. A review of the large-scale electrolysis plants that have been installed in the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers was discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis was evaluated. (authors)

  9. Disease elimination and re-emergence in differential-equation models.

    Science.gov (United States)

    Greenhalgh, Scott; Galvani, Alison P; Medlock, Jan

    2015-12-21

Traditional differential equation models of disease transmission are often used to predict disease trajectories and evaluate the effectiveness of alternative intervention strategies. However, such models cannot account explicitly for probabilistic events, such as those that dominate dynamics when disease prevalence is low during the elimination and re-emergence phases of an outbreak. To account for the dynamics at low prevalence, i.e. the elimination and risk of disease re-emergence, without the added analytical and computational complexity of a stochastic model, we develop a novel application of control theory. We apply our approach to analyze historical data of measles elimination and re-emergence in Iceland from 1923 to 1938, predicting the temporal trajectory of local measles elimination and re-emergence as a result of disease migration from Copenhagen, Denmark. Copyright © 2015 Elsevier Ltd. All rights reserved.
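For context, the "traditional differential equation model" the record above starts from can be as simple as a deterministic SIR system integrated forward in time. The sketch below is a generic forward-Euler SIR integration with illustrative parameters, not the paper's fitted Iceland measles model; it shows the kind of smooth trajectory that, as the authors argue, misrepresents dynamics once prevalence becomes very low.

```python
def sir(beta=0.5, gamma=0.2, s=0.99, i=0.01, dt=0.1, steps=1000):
    """Forward-Euler integration of the deterministic SIR equations:
    ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i, dr/dt = gamma*i.
    Parameters here are illustrative (basic reproduction number beta/gamma = 2.5)."""
    r = 0.0
    for _ in range(steps):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

s, i, r = sir()
print(round(s, 3), round(i, 6), round(r, 3))
```

By the end of the run the infected fraction is a vanishingly small real number rather than an integer count of cases, which is exactly the low-prevalence regime where deterministic models cannot represent stochastic extinction or re-introduction.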

  10. The Re-Emergence and Emergence of Vector-Borne Rickettsioses in Taiwan

    Directory of Open Access Journals (Sweden)

    Nicholas T. Minahan

    2017-12-01

Rickettsial diseases, particularly vector-borne rickettsioses (VBR), have a long history in Taiwan, with studies on scrub typhus and murine typhus dating back over a century. The climatic and geographic diversity of Taiwan’s main island and its offshore islands provides many ecological niches for the diversification and maintenance of rickettsiae. In recent decades, scrub typhus has re-emerged as the most prevalent type of rickettsiosis in Taiwan, particularly in eastern Taiwan and its offshore islands. While murine typhus has also re-emerged on Taiwan’s western coast, it remains neglected. Perhaps more alarming than the re-emergence of these rickettsioses is the emergence of newly described VBR. The first case of human infection with Rickettsia felis was confirmed in 2005, and undetermined spotted fever group rickettsioses have recently been detected. Taiwan is at a unique advantage in terms of detecting and characterizing VBR, as it has universal health coverage and a national communicable disease surveillance system; however, these systems have not been fully utilized for this purpose. Here, we review the existing knowledge on the eco-epidemiology of VBR in Taiwan and recommend future courses of action.

  11. Re-Emergent Tremor of Parkinson's Disease Masquerading as Essential Tremor

    Directory of Open Access Journals (Sweden)

    Sarah Morgan

    2016-03-01

Background: The re-emergent tremor of Parkinson’s disease (PD) is generally recognized as a postural tremor. Phenomenology Shown: A PD patient with a re-emergent tremor occurring during a task (spiral drawing), which on the surface produced a tremor that resembled that of essential tremor (ET). Educational Value: Researchers and clinicians should be aware of features of this re-emergent tremor to help distinguish it from that of ET.

  12. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and in control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free featured mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
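The power-law claims above are typically backed by fitting an exponent to the empirical distribution. A minimal sketch of that fitting step, using the standard maximum-likelihood estimator alpha = 1 + n / sum(ln(x_i / x_min)) on a synthetic sample (not the paper's data or method details):

```python
import math
import random

def powerlaw_alpha(xs, xmin=1.0):
    """MLE of the exponent alpha for p(x) ~ x^(-alpha), x >= xmin."""
    xs = [x for x in xs if x >= xmin]
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

# Synthetic sample from a power law with alpha = 2.5 (tail exponent 1.5),
# drawn by inverse-transform sampling: X = (1 - U)^(-1/(alpha-1)).
rng = random.Random(0)
sample = [1.0 / (1.0 - rng.random()) ** (1.0 / 1.5) for _ in range(20000)]

alpha = powerlaw_alpha(sample)
print(round(alpha, 2))
```

The estimator recovers the generating exponent to within sampling error, which is the basic check one would run before claiming Lévy-flight-like behavior in dwell-time or displacement data.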

  13. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

NSWC TR 86-32, Research and Technology Department, March 1987, by T. P. Liddiard and D. Price. Approved for public release. Among the aims of the expanded test was to reduce the spread in the LSGT 50% gap value for the worst charges, such as those with the highest or lowest densities.

  14. Investigations on efficiency of the emergency cooling by means of large-scale tests

    International Nuclear Information System (INIS)

    Hicken, E.F.

    1982-01-01

The RSK guidelines contain the maximum permissible loads (max. cladding tube temperature 1200 °C, max. Zr/H2O reaction of 1% of the Zr). Their observance implies that only a small number of fuel rods fail. Safety research has to produce the evidence that these limiting loads are not exceeded. The analytical investigations of emergency cooling behaviour could so far only be verified in scaled-down test facilities. After about 100 tests in four different large-scale test facilities, the experimental investigations of the blowdown phase for large breaks are essentially complete. For the refill and reflood phases, the system behaviour can be simulated in scaled-down test stands; the multidimensional conditions in the reactor pressure vessel, however, can only be simulated at the original scale. More experiments are planned as part of the 2D/3D project (CCTF, SCTF, UPTF) and as part of the PKL tests, so that more than 200 tests in seven plants will then be available. As to small breaks, the physical phenomena are known; the current investigations serve to increase the reliability of the statements. Once they are finished, approximately 300 tests in seven plants will be available. (orig./HP)

  15. Towards Agent-Based Simulation of Emerging and Large-Scale Social Networks. Examples of the Migrant Crisis and MMORPGs

    Directory of Open Access Journals (Sweden)

    Schatten, Markus

    2016-10-01

Large-scale agent-based simulation of social networks is described in the context of the migrant crisis in Syria and the EU as well as massively multi-player on-line role playing games (MMORPG). The recipeWorld system by Terna and Fontana is proposed as a possible solution to simulating large-scale social networks. The initial system has been re-implemented using the Smart Python multi-Agent Development Environment (SPADE), and Pyinteractive was used for visualization. We present initial models of simulation that we plan to develop further in future studies. Thus this paper is research in progress that will hopefully establish a novel agent-based modelling system in the context of the ModelMMORPG project.

  16. Potential climatic impacts and reliability of large-scale offshore wind farms

    International Nuclear Information System (INIS)

    Wang Chien; Prinn, Ronald G

    2011-01-01

The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power, would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to the lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land installations.

  17. Backup flexibility classes in emerging large-scale renewable electricity systems

    International Nuclear Information System (INIS)

    Schlachtberger, D.P.; Becker, S.; Schramm, S.; Greiner, M.

    2016-01-01

Highlights: • Flexible backup demand in a European wind and solar based power system is modelled. • Three flexibility classes are defined based on production and consumption timescales. • Seasonal backup capacities are shown to be only used below 50% renewable penetration. • Large-scale transmission between countries can reduce fast flexible capacities. - Abstract: High shares of intermittent renewable power generation in a European electricity system will require flexible backup power generation on the dominant diurnal, synoptic, and seasonal weather timescales. The same three timescales are already covered by today’s dispatchable electricity generation facilities, which are able to follow the typical load variations on the intra-day, intra-week, and seasonal timescales. This work aims to quantify the changing demand for those three backup flexibility classes in emerging large-scale electricity systems, as they transform from low to high shares of variable renewable power generation. A weather-driven modelling is used, which aggregates eight years of wind and solar power generation data as well as load data over Germany and Europe, and splits the backup system required to cover the residual load into three flexibility classes distinguished by their respective maximum rates of change of power output. This modelling shows that the slowly flexible backup system is dominant at low renewable shares, but its optimized capacity decreases and drops close to zero once the average renewable power generation exceeds 50% of the mean load. The medium flexible backup capacities increase for modest renewable shares, peak at around a 40% renewable share, and then continuously decrease to almost zero once the average renewable power generation becomes larger than 100% of the mean load. The dispatch capacity of the highly flexible backup system becomes dominant for renewable shares beyond 50%, and reaches its maximum around a 70% renewable share. For renewable shares
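The core operation described above is splitting a residual load time series into slow, medium, and fast backup components by timescale. A hedged sketch of that decomposition idea, using running means over weekly and daily windows on a synthetic hourly series (window choices and the toy signal are illustrative; the paper's weather-driven model is far more detailed):

```python
import math

def running_mean(xs, w):
    """Centered running mean with window w (shrinks at the edges)."""
    half = w // 2
    out = []
    for t in range(len(xs)):
        window = xs[max(0, t - half): t + half + 1]
        out.append(sum(window) / len(window))
    return out

# Synthetic residual load over four weeks, hourly resolution, with
# "seasonal" (two-week), synoptic (four-day), and diurnal components.
hours = range(24 * 28)
residual = [1.0 + 0.3 * math.sin(2 * math.pi * t / (24 * 14))
                + 0.2 * math.sin(2 * math.pi * t / (24 * 4))
                + 0.1 * math.sin(2 * math.pi * t / 24)
            for t in hours]

slow = running_mean(residual, 24 * 7)                       # seasonal backup
medium = [m - s for m, s in zip(running_mean(residual, 24), slow)]  # synoptic
fast = [r - m - s for r, m, s in zip(residual, medium, slow)]       # diurnal

# The three classes re-sum to the residual load at every hour.
err = max(abs(r - (s + m + f))
          for r, s, m, f in zip(residual, slow, medium, fast))
print(err)
```

By construction the three components sum back to the residual load, so each hour of backup demand is attributed to exactly one flexibility class.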

  18. Response of human populations to large-scale emergencies

    Science.gov (United States)

    Bagrow, James; Wang, Dashun; Barabási, Albert-László

    2010-03-01

    Until recently, little quantitative data regarding collective human behavior during dangerous events such as bombings and riots have been available, despite its importance for emergency management, safety and urban planning. Understanding how populations react to danger is critical for prediction, detection and intervention strategies. Using a large telecommunications dataset, we study for the first time the spatiotemporal, social and demographic response properties of people during several disasters, including a bombing, a city-wide power outage, and an earthquake. Call activity rapidly increases after an event and we find that, when faced with a truly life-threatening emergency, information rapidly propagates through a population's social network. Other events, such as sports games, do not exhibit this propagation.

  19. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  20. EPA RE-Powering Mapper Large Scale

    Data.gov (United States)

    U.S. Environmental Protection Agency — The U.S. Environmental Protection Agency (EPA) Office of Land and Emergency Management (OLEM) Office of Communications, Partnerships and Analysis (OCPA) initiated...

  1. Large-Nc nuclear potential puzzle

    International Nuclear Information System (INIS)

    Belitsky, A.V.; Cohen, T.D.

    2002-01-01

An analysis of the baryon-baryon potential from the point of view of large-N_c QCD is performed. A comparison is made between the N_c-scaling behavior directly obtained from an analysis at the quark-gluon level and the N_c scaling of the potential for a generic hadronic field theory in which it arises via meson exchanges and for which the parameters of the theory are given by their canonical large-N_c scaling behavior. The purpose of this comparison is to use large-N_c consistency to test the widespread view that the interaction between nuclei arises from QCD through the exchange of mesons. Although at the one- and two-meson exchange level the scaling rules for the potential derived from the hadronic theory match the quark-gluon level prediction, at the three- and higher-meson exchange level a generic hadronic theory yields a potential which scales with N_c faster than that of the quark-gluon theory

  2. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

The concept of “large scale” depends, obviously, on the phenomenon we are interested in. For example, in the field of the foundation of Thermodynamics from microscopic dynamics, the large spatial and time scales are of the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres for space and many years for time, and are compared to the local and daily/monthly time scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain classes of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundation of Thermodynamics problems. However, in geophysical Fluid Dynamics, Biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  3. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability in which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).

  4. Assessing the Capacity of the US Health Care System to Use Additional Mechanical Ventilators During a Large-Scale Public Health Emergency.

    Science.gov (United States)

    Ajao, Adebola; Nystrom, Scott V; Koonin, Lisa M; Patel, Anita; Howell, David R; Baccam, Prasith; Lant, Tim; Malatino, Eileen; Chamberlin, Margaret; Meltzer, Martin I

    2015-12-01

    A large-scale public health emergency, such as a severe influenza pandemic, can generate large numbers of critically ill patients in a short time. We modeled the number of mechanical ventilators that could be used in addition to the number of hospital-based ventilators currently in use. We identified key components of the health care system needed to deliver ventilation therapy, quantified the maximum number of additional ventilators that each key component could support at various capacity levels (ie, conventional, contingency, and crisis), and determined the constraining key component at each capacity level. Our study results showed that US hospitals could absorb between 26,200 and 56,300 additional ventilators at the peak of a national influenza pandemic outbreak with robust pre-pandemic planning. The current US health care system may have limited capacity to use additional mechanical ventilators during a large-scale public health emergency. Emergency planners need to understand their health care systems' capability to absorb additional resources and expand care. This methodology could be adapted by emergency planners to determine stockpiling goals for critical resources or to identify alternatives to manage overwhelming critical care need.
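The "constraining key component" logic described above reduces to simple arithmetic: at each capacity level, the number of additional ventilators the system can absorb is the minimum over key components of what each component could support. A sketch with hypothetical component names and numbers (not the study's estimates):

```python
def absorbable(capacity_by_component):
    """Return the constraining component and the ventilators it can support:
    the system-wide absorbable total is the minimum across components."""
    component = min(capacity_by_component, key=capacity_by_component.get)
    return component, capacity_by_component[component]

# Hypothetical per-component capacities at one (e.g. contingency) level.
contingency = {
    "trained respiratory therapists": 35000,
    "ICU-capable beds": 42000,
    "oxygen supply": 56000,
}

component, n = absorbable(contingency)
print(component, n)
```

Here the stockpile target would be bounded by staffing rather than equipment, which illustrates why the study emphasizes identifying the constraining component at each capacity level before setting stockpiling goals.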

  5. Re-evaluation of the 1995 Hanford Large Scale Drum Fire Test Results

    International Nuclear Information System (INIS)

    Yang, J M

    2007-01-01

A large-scale drum performance test was conducted at the Hanford Site in June 1995, in which over one hundred (100) 55-gal drums in each of two storage configurations were subjected to severe fuel pool fires. The two storage configurations in the test were pallet storage and rack storage. The description and results of the large-scale drum test at the Hanford Site were reported in WHC-SD-WM-TRP-246, "Solid Waste Drum Array Fire Performance," Rev. 0, 1995. This was one of the main references used to develop the analytical methodology to predict drum failures in WHC-SD-SQA-ANAL-501, "Fire Protection Guide for Waste Drum Storage Array," September 1996. Three drum failure modes were observed from the test reported in WHC-SD-WM-TRP-246. They consisted of seal failure, lid warping, and catastrophic lid ejection. There was no discernible failure criterion that distinguished one failure mode from another. Hence, all three failure modes were treated equally for the purpose of determining the number of failed drums. General observations from the results of the test are as follows: • Trash expulsion was negligible. • Flame impingement was identified as the main cause of failure. • The range of drum temperatures at failure was 600 °C to 800 °C. This is above the yield strength temperature for steel, approximately 540 °C (1,000 °F). • The critical heat flux required for failure is above 45 kW/m². • Fire propagation from one drum to the next was not observed. The statistical evaluation of the test results using, for example, the Student's t-distribution will demonstrate that the failure criteria for TRU waste drums currently employed at nuclear facilities are very conservative relative to the large-scale test results. Hence, the safety analysis utilizing the general criteria described in the five bullets above will lead to a technically robust and defensible product that bounds the potential consequences from postulated
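The Student's-t evaluation mentioned in the record above amounts to forming a confidence interval for the mean failure temperature from a small sample. A minimal sketch, using ten made-up temperatures inside the reported 600-800 °C range (not the actual test data) and a hard-coded two-sided 95% critical value for 9 degrees of freedom:

```python
import math
import statistics

# Hypothetical drum failure temperatures (°C) for illustration only.
temps_c = [640, 660, 690, 700, 710, 720, 730, 750, 770, 790]

n = len(temps_c)
mean = statistics.mean(temps_c)
sd = statistics.stdev(temps_c)          # sample standard deviation (n-1)
t_crit = 2.262                          # t(0.975, df=9), two-sided 95%
half_width = t_crit * sd / math.sqrt(n)

print(round(mean, 1), "+/-", round(half_width, 1))
```

Comparing such an interval against the failure criteria in use is how one would quantify the claim that current criteria are conservative relative to the test results.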

  6. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  7. Jets from jets: re-clustering as a tool for large radius jet reconstruction and grooming at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Nachman, Benjamin; Nef, Pascal; Schwartzman, Ariel; Swiatlowski, Maximilian [SLAC National Accelerator Laboratory, Stanford University,2575 Sand Hill Rd, Menlo Park, CA 94025 (United States); Wanotayaroj, Chaowaroj [Center for High Energy Physics, University of Oregon,1371 E. 13th Ave, Eugene, OR 97403 (United States)

    2015-02-12

Jets with a large radius R≳1 and grooming algorithms are widely used to fully capture the decay products of boosted heavy particles at the Large Hadron Collider (LHC). Unlike most discriminating variables used in such studies, the jet radius is usually not optimized for specific physics scenarios. This is because every jet configuration must be calibrated, in situ, to account for detector response and other experimental effects. One solution to enhance the availability of large-R jet configurations used by the LHC experiments is jet re-clustering. Jet re-clustering introduces an intermediate scale r < R: calibrated small-radius jets are used as inputs to build large radius jets. In this paper we systematically study and propose new jet re-clustering configurations and show that re-clustered large radius jets have essentially the same jet mass performance as large radius groomed jets. Jet re-clustering has the benefit that no additional large-R calibration is necessary, allowing the re-clustered large radius parameter to be optimized in the context of specific precision measurements or searches for new physics.
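The two-scale structure described above (small-radius jets as inputs to large radius jets) can be illustrated with a toy sketch. Real analyses run a full clustering algorithm such as anti-kt (e.g. via FastJet) with radius R on the calibrated small-R jets; the greedy Delta-R grouping below is only meant to show the re-clustering idea, with made-up (pt, eta, phi) inputs:

```python
import math

def delta_r(j1, j2):
    """Angular distance in (eta, phi), with phi wrap-around."""
    dphi = abs(j1[2] - j2[2])
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return math.hypot(j1[1] - j2[1], dphi)

def recluster(small_jets, R=1.0):
    """Toy re-clustering: hardest small-R jets seed groups; any small-R jet
    within Delta-R < R of a seed joins that group. Returns (sum pt, eta, phi)
    per large-R jet, keeping each seed's axis for simplicity."""
    jets = sorted(small_jets, reverse=True)   # sort by pt, hardest first
    groups = []
    for j in jets:
        for grp in groups:
            if delta_r(grp[0], j) < R:
                grp.append(j)
                break
        else:
            groups.append([j])
    return [(sum(c[0] for c in grp), grp[0][1], grp[0][2]) for grp in groups]

# Hypothetical calibrated small-R jets: (pt in GeV, eta, phi).
small = [(200.0, 0.1, 0.2), (150.0, 0.4, 0.5), (90.0, -2.0, 2.8), (30.0, -1.8, 3.0)]
print(recluster(small, R=1.0))
```

Because the inputs are already calibrated at the small-r scale, the large radius parameter R can be varied freely without redoing the calibration, which is the benefit the abstract highlights.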

  8. Jets from jets: re-clustering as a tool for large radius jet reconstruction and grooming at the LHC

    International Nuclear Information System (INIS)

    Nachman, Benjamin; Nef, Pascal; Schwartzman, Ariel; Swiatlowski, Maximilian; Wanotayaroj, Chaowaroj

    2015-01-01

Jets with a large radius R≳1 and grooming algorithms are widely used to fully capture the decay products of boosted heavy particles at the Large Hadron Collider (LHC). Unlike most discriminating variables used in such studies, the jet radius is usually not optimized for specific physics scenarios. This is because every jet configuration must be calibrated, in situ, to account for detector response and other experimental effects. One solution to enhance the availability of large-R jet configurations used by the LHC experiments is jet re-clustering. Jet re-clustering introduces an intermediate scale r < R: calibrated small-radius jets are used as inputs to build large radius jets. In this paper we systematically study and propose new jet re-clustering configurations and show that re-clustered large radius jets have essentially the same jet mass performance as large radius groomed jets. Jet re-clustering has the benefit that no additional large-R calibration is necessary, allowing the re-clustered large radius parameter to be optimized in the context of specific precision measurements or searches for new physics.

  9. Threats and Re-emergence of Chikungunya Fever in Indian Sub-continent

    Directory of Open Access Journals (Sweden)

    Mahajan S

    2009-02-01

Zoonoses are among the most frequent and dreaded risks to which mankind is exposed today; human health is inextricably linked to animal health and production. Over the past 6 years, a number of zoonotic and vector-borne viral diseases were recorded in South-east Asia and the Western Pacific, and there was a sudden upsurge in the number of emerging and re-emerging zoonotic diseases in the Indian Sub-continent, Chikungunya fever being one of them. The precise reasons for the re-emergence of Chikungunya in the Indian subcontinent as well as in the other small countries of the southern Indian Ocean are an enigma. Although it is well recognized that the re-emergence of viral infections is due to a variety of social, environmental, behavioural and biological changes, which of these contributed to the re-emergence of Chikungunya virus would be interesting to unravel. Chikungunya is generally spread through bites of infected mosquitoes; mosquitoes become infected when they feed on an animal infected with CHIK virus. Monkeys and possibly other wild animals may serve as reservoirs of infection. [Vet. World 2009; 2(1): 40-42]

  10. Emerging and Re-Emerging Infectious Diseases. Grades 9-12. NIH Curriculum Supplement Series.

    Science.gov (United States)

    Biological Sciences Curriculum Study, Colorado Springs.

    This curriculum supplement guide brings the latest medical discoveries to classrooms. This module focuses on the objectives of introducing students to major concepts related to emerging and re-emerging infectious diseases, and developing an understanding of the relationship between biomedical research and personal and public health. This module…

  11. Emerging & re-emerging infections in India: An overview

    Directory of Open Access Journals (Sweden)

    T Dikid

    2013-01-01

    Full Text Available The incidence of emerging infectious diseases in humans has increased within the recent past or threatens to increase in the near future. Over 30 new infectious agents have been detected worldwide in the last three decades; 60 per cent of these are of zoonotic origin. Developing countries such as India suffer disproportionately from the burden of infectious diseases given the confluence of existing environmental, socio-economic, and demographic factors. In the recent past, India has seen outbreaks of eight emerging and re-emerging infectious diseases in various parts of the country, six of which are of zoonotic origin. Prevention and control of emerging infectious diseases will increasingly require the application of sophisticated epidemiologic and molecular biologic technologies, changes in human behaviour, a national policy on early detection of and rapid response to emerging infections, and a plan of action. WHO has made several recommendations for national response mechanisms. Many of these are in various stages of implementation in India. However, for a country of the size and population of India, emerging infections remain a real and present danger. A meaningful response must approach the problem at the systems level. A comprehensive national strategy on infectious diseases cutting across all relevant sectors, with emphasis on strengthened surveillance, rapid response, partnership building and research to guide public policy, is needed.

  12. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  13. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  14. Assessment of clean development mechanism potential of large-scale energy efficiency measures in heavy industries

    International Nuclear Information System (INIS)

    Hayashi, Daisuke; Krey, Matthias

    2007-01-01

    This paper assesses the clean development mechanism (CDM) potential of large-scale energy efficiency measures in selected heavy industries (iron and steel, cement, aluminium, pulp and paper, and ammonia), taking India and Brazil as examples of CDM project host countries. We have chosen two criteria for identification of the CDM potential of each energy efficiency measure: (i) the emission reductions volume (in CO2e) that can be expected from the measure and (ii) the likelihood of the measure passing the additionality test of the CDM Executive Board (EB) when submitted as a proposed CDM project activity. The paper shows that the CDM potential of large-scale energy efficiency measures strongly depends on the project-specific and country-specific context. In particular, technologies for the iron and steel industry (coke dry quenching (CDQ), top pressure recovery turbine (TRT), and basic oxygen furnace (BOF) gas recovery), the aluminium industry (point feeder prebake (PFPB) smelter), and the pulp and paper industry (continuous digester technology) offer promising CDM potential.

  15. Jets from jets: re-clustering as a tool for large radius jet reconstruction and grooming at the LHC

    Science.gov (United States)

    Nachman, Benjamin; Nef, Pascal; Schwartzman, Ariel; Swiatlowski, Maximilian; Wanotayaroj, Chaowaroj

    2015-02-01

    Jets with a large radius R ≳ 1 and grooming algorithms are widely used to fully capture the decay products of boosted heavy particles at the Large Hadron Collider (LHC). Unlike most discriminating variables used in such studies, the jet radius is usually not optimized for specific physics scenarios. This is because every jet configuration must be calibrated, in situ, to account for detector response and other experimental effects. One solution to enhance the availability of large-R jet configurations used by the LHC experiments is jet re-clustering. Jet re-clustering introduces an intermediate scale r < R, at which jets are calibrated and then used as inputs to build large radius jets, with essentially the same jet mass performance as large radius groomed jets. Jet re-clustering has the benefit that no additional large-R calibration is necessary, allowing the re-clustered large radius parameter to be optimized in the context of specific precision measurements or searches for new physics.
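
The re-clustering idea in the abstract above — calibrate small-r jets, then assemble them into large-R jets — can be sketched with a toy greedy grouping. This is only an illustration: real re-clustering runs a full jet algorithm such as anti-kt with radius R over the calibrated small-r jets (e.g. via FastJet), and all names and thresholds below are hypothetical.

```python
import math

def delta_r(j1, j2):
    """Angular distance between two jets in (eta, phi) space."""
    deta = j1["eta"] - j2["eta"]
    dphi = abs(j1["phi"] - j2["phi"])
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return math.hypot(deta, dphi)

def recluster(small_jets, R=1.0, pt_min=25.0):
    """Greedily group calibrated small-r jets into large-R jets.

    Small jets below pt_min are dropped (a trimming-like cut), and each
    surviving jet is attached to the highest-pt seed within distance R.
    """
    jets = sorted((j for j in small_jets if j["pt"] >= pt_min),
                  key=lambda j: j["pt"], reverse=True)
    large = []
    for j in jets:
        for seed in large:
            if delta_r(j, seed) < R:
                seed["pt"] += j["pt"]          # scalar pt sum, for illustration
                seed["constituents"].append(j)
                break
        else:
            large.append({"pt": j["pt"], "eta": j["eta"], "phi": j["phi"],
                          "constituents": [j]})
    return large

small = [
    {"pt": 200.0, "eta": 0.1, "phi": 0.0},
    {"pt": 80.0,  "eta": 0.5, "phi": 0.3},   # close to the leading jet
    {"pt": 30.0,  "eta": 2.5, "phi": 2.0},   # isolated
    {"pt": 10.0,  "eta": 0.2, "phi": 0.1},   # fails pt_min, removed
]
print([round(j["pt"], 1) for j in recluster(small)])  # → [280.0, 30.0]
```

Because the small-r inputs are already calibrated, the large-R parameter can be scanned freely without a dedicated calibration, which is the benefit the abstract highlights.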

  16. Large-scale CO2 injection demos for the development of monitoring and verification technology and guidelines (CO2ReMoVe)

    Energy Technology Data Exchange (ETDEWEB)

    Wildenborg, T.; David, P. [TNO Built Environment and Geosciences, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Bentham, M.; Chadwick, A.; Kirk, K. [British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham NG12 5GG (United Kingdom); Dillen, M. [SINTEF Petroleum Research, Trondheim (Norway); Groenenberg, H. [Unit Policy Studies, Energy Research Centre of the Netherlands ECN, Amsterdam (Netherlands); Deflandre, J.P.; Le Gallo, J. [Institut Francais du Petrole, Rueil-Malmaison (France)

    2009-04-15

    The objectives of the EU project CO2ReMoVe are to undertake the research and development necessary to establish scientifically based standards for monitoring future CCS operations and to develop the performance assessment methodologies necessary to demonstrate the long-term reliability of geological storage of CO2. This could in turn lead to guidelines for the certification of sites suitable for CCS on a wide scale. Crucial to the project portfolio are the continuing large-scale CO2 injection operation at Sleipner, the injection operation at In Salah (Algeria) and the recently started injection project at Snoehvit (Norway). Two pilot sites are also currently in the project portfolio, Ketzin in Germany and K12-B in the offshore continental shelf of the Netherlands.

  17. Mechanisms of innate immune evasion in re-emerging RNA viruses.

    Science.gov (United States)

    Ma, Daphne Y; Suthar, Mehul S

    2015-06-01

    Recent outbreaks of Ebola, West Nile, Chikungunya, Middle Eastern Respiratory and other emerging/re-emerging RNA viruses continue to highlight the need to further understand the virus-host interactions that govern disease severity and infection outcome. As part of the early host antiviral defense, the innate immune system mediates pathogen recognition and initiation of potent antiviral programs that serve to limit virus replication, limit virus spread and activate adaptive immune responses. Concordantly, viral pathogens have evolved several strategies to counteract pathogen recognition and cell-intrinsic antiviral responses. In this review, we highlight the major mechanisms of innate immune evasion by emerging and re-emerging RNA viruses, focusing on pathogens that pose significant risk to public health. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  19. Emergency response preparedness: the French experience of large scale exercises

    International Nuclear Information System (INIS)

    Chanson, D.; Desnoyers, B.; Chabane, J.M.

    2004-01-01

    In compliance with the IAEA regulations for the transport of radioactive material, emergency provisions to protect persons, property and the environment in the event of a transport accident have to be established and developed by the relevant national organisations. In France, the prefect of the department where the accident occurs is responsible for the decisions and measures required to ensure the protection of both the population and the property at risk owing to the accident. During an accident, the ministers concerned provide the prefect with recommendations and information in order to help him take the requisite decisions. On their side, the nuclear industry and transport companies also have to be prepared to intervene and to support the authorities at their request, depending on their capacities and their specialities. To prepare the emergency teams properly and acquire effective emergency plans, training exercises have to be conducted regularly with every ministerial department involved, the nuclear industry and transport companies, members of the public and the media. The feedback from such exercises shall then be taken into account to improve the emergency procedures. This paper will introduce: - emergency response preparedness: what is required by the relevant regulations? - emergency response preparedness: how is France organised? - the French experience of conducting large training exercises simulating accidents involving the transport of radioactive material; - the main difficulties and lessons learned; - the perspectives.

  20. Re-Emergence of Rift Valley Fever in Madagascar

    Centers for Disease Control (CDC) Podcasts

    2010-05-27

    This podcast describes the re-emergence of Rift Valley Fever in Madagascar during two rainy seasons in 2008 and 2009. CDC epidemiologist Dr. Pierre Rollin discusses what researchers learned about the outbreak and about infections in the larger population in Madagascar.  Created: 5/27/2010 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 5/27/2010.

  1. Overdamped large-eddy simulations of turbulent pipe flow up to Reτ = 1500

    Science.gov (United States)

    Feldmann, Daniel; Avila, Marc

    2018-04-01

    We present results from large-eddy simulations (LES) of turbulent pipe flow in a computational domain of 42 radii in length. Wide ranges of the shear Reynolds number and the Smagorinsky model parameter are covered, 180 ≤ Reτ ≤ 1500 and 0.05 ≤ Cs ≤ 1.2, respectively. The aim is to assess the effect of Cs on the resolved flow field and turbulence statistics, as well as to test whether very large scale motions (VLSM) in pipe flow can be isolated from the near-wall cycle by enhancing the dissipative character of the static Smagorinsky model with elevated Cs values. We found that the optimal Cs to achieve best agreement with reference data varies with Reτ and further depends on the wall-normal location and the quantity of interest. Furthermore, for increasing Reτ, the optimal Cs for pipe flow LES seems to approach the theoretically optimal value for LES of isotropic turbulence. In agreement with previous studies, we found that for increasing Cs small-scale streaks in simple flow field visualisations are gradually quenched and replaced by much larger smooth streaks. Our analysis of low-order turbulence statistics suggests that these structures originate from an effective reduction of the Reynolds number and thus represent modified low-Reynolds-number near-wall streaks rather than VLSM. We argue that overdamped LES with the static Smagorinsky model cannot be used to unambiguously determine the origin and the dynamics of VLSM in pipe flow. The approach might be salvaged by e.g. using more sophisticated LES models accounting for energy flux towards large scales or explicit anisotropic filter kernels.
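
The static Smagorinsky model referred to above closes the LES equations with an eddy viscosity ν_t = (Cs Δ)² |S̄|, so raising Cs increases dissipation quadratically — the "overdamping" exploited in the study. A minimal sketch of this standard formula (the velocity-gradient values are hypothetical):

```python
import numpy as np

def smagorinsky_nu_t(grad_u, delta, cs):
    """Static Smagorinsky eddy viscosity: nu_t = (Cs * Delta)**2 * |S|.

    grad_u : 3x3 resolved velocity-gradient tensor du_i/dx_j
    delta  : filter width (e.g. the grid spacing)
    cs     : Smagorinsky model parameter
    """
    s = 0.5 * (grad_u + grad_u.T)          # resolved strain-rate tensor S_ij
    s_mag = np.sqrt(2.0 * np.sum(s * s))   # |S| = sqrt(2 S_ij S_ij)
    return (cs * delta) ** 2 * s_mag

# Simple shear du/dy = 1, for which |S| = 1 and so nu_t = (cs * delta)**2
grad_u = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
for cs in (0.05, 0.17, 1.2):               # spans the range scanned in the abstract
    print(cs, smagorinsky_nu_t(grad_u, delta=0.01, cs=cs))
```

Going from Cs = 0.05 to Cs = 1.2 raises ν_t by a factor of (1.2/0.05)² = 576, which illustrates why large Cs quenches the small-scale near-wall streaks.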

  2. Computational domain length and Reynolds number effects on large-scale coherent motions in turbulent pipe flow

    Science.gov (United States)

    Feldmann, Daniel; Bauer, Christian; Wagner, Claus

    2018-03-01

    We present results from direct numerical simulations (DNS) of turbulent pipe flow at shear Reynolds numbers up to Reτ = 1500 using different computational domains with lengths up to ?. The objectives are to analyse the effect of the finite size of the periodic pipe domain on large flow structures as a function of Reτ and to assess the minimum ? required for relevant turbulent scales to be captured and the minimum Reτ for very large-scale motions (VLSM) to be analysed. Analysing one-point statistics revealed that the mean velocity profile is invariant for ?. The wall-normal location at which deviations occur in shorter domains changes strongly with increasing Reτ from the near-wall region to the outer layer, where VLSM are believed to live. The root mean square velocity profiles exhibit domain length dependencies for pipes shorter than 14R and 7R, depending on Reτ. For all Reτ, the higher-order statistical moments show only weak dependencies, and only for the shortest domain considered here. However, the analysis of one- and two-dimensional pre-multiplied energy spectra revealed that even for larger ?, not all physically relevant scales are fully captured, even though the aforementioned statistics are in good agreement with the literature. We found ? to be sufficiently large to capture VLSM-relevant turbulent scales in the considered range of Reτ, based on our definition of an integral energy threshold of 10%. The requirement to capture at least 1/10 of the global maximum energy level is justified by a 14% increase of the streamwise turbulence intensity in the outer region between Reτ = 720 and 1500, which can be related to VLSM-relevant length scales. Based on this scaling anomaly, we found Reτ ⪆ 1500 to be a necessary minimum requirement to investigate VLSM-related effects in pipe flow, even though the streamwise energy spectra do not yet indicate sufficient scale separation between the most energetic and the very long motions.

  3. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent emission intensity reduction (with respect to 2005 levels) along with large-scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted under the Paris Agreement. But large-scale integration of renewable energy is a complex process which faces a number of problems, such as capital intensiveness, the matching of intermittent supply to load with limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. The base case scenario (no RE addition), along with an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass) and a low RE scenario (50 GW solar, 30 GW wind), has been created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the investment decisions involved. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
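
The operational trade-off described above — must-run thermal minima forcing renewable curtailment — can be shown with a toy merit-order dispatch. The paper itself uses the TIMES optimization model; the plant capacities, the coal minimum, and the greedy logic below are purely hypothetical.

```python
def dispatch(demand, solar, wind):
    """Toy merit-order dispatch with a must-run coal minimum (all values in MW).

    Coal is committed at coal_min; zero-cost renewables then serve the
    residual demand, followed by coal above its minimum, then gas.
    Renewable output that cannot be absorbed is curtailed.
    """
    coal_min, coal_max, gas_max = 80.0, 200.0, 120.0   # hypothetical fleet
    gen = {"coal": coal_min}
    remaining = demand - coal_min
    re_avail = solar + wind
    gen["renewables"] = min(re_avail, max(remaining, 0.0))
    remaining -= gen["renewables"]
    extra_coal = min(coal_max - coal_min, max(remaining, 0.0))
    gen["coal"] += extra_coal
    remaining -= extra_coal
    gen["gas"] = min(gas_max, max(remaining, 0.0))
    gen["curtailed"] = re_avail - gen["renewables"]
    return gen

# A low-demand hour with strong sun and wind: the 80 MW coal minimum forces
# 90 MW of the 160 MW renewable availability to be curtailed.
print(dispatch(demand=150.0, solar=100.0, wind=60.0))
```

A real unit-commitment model adds ramp rates, start-up costs, and network constraints, but even this sketch shows why minimum-generation constraints limit how much variable renewable energy the system can absorb.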

  4. Life as an emergent phenomenon: studies from a large-scale boid simulation and web data

    Science.gov (United States)

    Ikegami, Takashi; Mototake, Yoh-ichi; Kobori, Shintaro; Oka, Mizuki; Hashimoto, Yasuhiro

    2017-11-01

    A large group with a special structure can become the mother of emergence. We discuss this hypothesis in relation to large-scale boid simulations and web data. In the boid swarm simulations, the nucleation, organization and collapse dynamics were found to be more diverse in larger flocks than in smaller flocks. In the second analysis, large web data, consisting of shared photos with descriptive tags, tended to group together users with similar tendencies, allowing the network to develop a core-periphery structure. We show that the generation rate of novel tags and their usage frequencies are high in the higher-order cliques. In this case, novelty is not considered to arise randomly; rather, it is generated as a result of a large and structured network. We contextualize these results in terms of adjacent possible theory and as a new way to understand collective intelligence. We argue that excessive information and material flow can become a source of innovation. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  5. Chikungunya fever: A re-emerging viral infection

    Directory of Open Access Journals (Sweden)

    Chhabra M

    2008-01-01

    Full Text Available Chikungunya (CHIK) fever is a re-emerging viral disease characterized by abrupt onset of fever with severe arthralgia followed by constitutional symptoms and rash lasting for 1-7 days. The disease is almost self-limiting and rarely fatal. Chikungunya virus (CHIKV) is an RNA virus belonging to the family Togaviridae, genus Alphavirus. Molecular characterization has demonstrated two distinct lineages of strains which cause epidemics in Africa and Asia. These geographical genotypes exhibit differences in their transmission cycles. In contrast to Africa, where a sylvatic cycle is maintained between monkeys and wild mosquitoes, in Asia the cycle continues between humans and the Aedes aegypti mosquito. CHIKV is known to cause epidemics after periods of quiescence. The first recorded epidemic occurred in Tanzania in 1952-1953. In Asia, CHIK activity was documented since its isolation in Bangkok, Thailand in 1958. Virus transmission continued till 1964. After a hiatus, virus activity re-appeared in the mid-1970s and declined by 1976. In India, well-documented outbreaks occurred in 1963 and 1964 in Kolkata and southern India, respectively. Thereafter, a small outbreak of CHIK was reported from Sholapur district, Maharashtra in 1973. CHIKV emerged in the islands of the South West Indian Ocean, viz. the French islands of La Reunion and Mayotte, Mauritius and the Seychelles, which have been reporting the outbreak since February 2005. After quiescence of about three decades, CHIKV re-emerged in India in the states of Andhra Pradesh, Karnataka, Maharashtra, Madhya Pradesh and Tamil Nadu in December 2005. Cases have also been reported from Rajasthan, Gujarat and Kerala. The outbreak is still continuing. The National Institute of Communicable Diseases has conducted epidemiological, entomological and laboratory investigations for confirmation of the outbreak. These are discussed in detail along with the major challenges that the country faced during the current outbreak.

  6. Re-Emergence of Rift Valley Fever in Madagascar

    Centers for Disease Control (CDC) Podcasts

    This podcast describes the re-emergence of Rift Valley Fever in Madagascar during two rainy seasons in 2008 and 2009. CDC epidemiologist Dr. Pierre Rollin discusses what researchers learned about the outbreak and about infections in the larger population in Madagascar.

  7. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  8. Structure function scaling in a Reλ = 250 turbulent mixing layer

    KAUST Repository

    Attili, Antonio

    2011-12-22

    A highly resolved Direct Numerical Simulation of a spatially developing turbulent mixing layer is presented. In the fully developed region, the flow achieves a turbulent Reynolds number Reλ = 250, high enough for a clear separation between large and dissipative scales, and thus for the presence of an inertial range. Structure functions have been calculated in the self-similar region using velocity time series and Taylor's frozen turbulence hypothesis. The Extended Self-Similarity (ESS) concept has been employed to evaluate relative scaling exponents. A wide range of scales with scaling exponents and intermittency levels equal to those of homogeneous isotropic turbulence has been identified. Moreover, an additional scaling range exists for larger scales; it is characterized by smaller exponents, similar to the values reported in the literature for flows with strong shear.

  9. Structure function scaling in a Reλ = 250 turbulent mixing layer

    KAUST Repository

    Attili, Antonio; Bisetti, Fabrizio

    2011-01-01

    A highly resolved Direct Numerical Simulation of a spatially developing turbulent mixing layer is presented. In the fully developed region, the flow achieves a turbulent Reynolds number Reλ = 250, high enough for a clear separation between large and dissipative scales, and thus for the presence of an inertial range. Structure functions have been calculated in the self-similar region using velocity time series and Taylor's frozen turbulence hypothesis. The Extended Self-Similarity (ESS) concept has been employed to evaluate relative scaling exponents. A wide range of scales with scaling exponents and intermittency levels equal to those of homogeneous isotropic turbulence has been identified. Moreover, an additional scaling range exists for larger scales; it is characterized by smaller exponents, similar to the values reported in the literature for flows with strong shear.
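
The structure-function and ESS procedure described in the two records above can be sketched on a synthetic signal. The paper uses DNS velocity time series; here a Brownian-motion surrogate stands in, for which the relative exponent ζ6/ζ3 should come out near 2 (a self-similar, non-intermittent signal).

```python
import numpy as np

rng = np.random.default_rng(0)
# Surrogate for a velocity time series (Taylor's hypothesis would map
# time lags to spatial separations r in the real analysis).
u = np.cumsum(rng.standard_normal(200_000)) * 1e-3

def structure_function(u, p, lags):
    """S_p(r) = <|u(x + r) - u(x)|**p> estimated at each separation (lag) r."""
    return np.array([np.mean(np.abs(u[lag:] - u[:-lag]) ** p)
                     for lag in lags])

lags = np.unique(np.logspace(0.5, 3.0, 20).astype(int))
s3 = structure_function(u, 3, lags)
s6 = structure_function(u, 6, lags)

# Extended Self-Similarity: fit S_6 against S_3 rather than against r,
# which yields the relative scaling exponent zeta_6 / zeta_3.
slope = np.polyfit(np.log(s3), np.log(s6), 1)[0]
print(round(slope, 2))  # close to 2.0 for this non-intermittent signal
```

In real shear flows the measured relative exponents at high orders fall below such self-similar values, which is the intermittency signature the records discuss.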

  10. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    Full Text Available This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  11. Potential climatic impacts and reliability of very large-scale wind farms

    Directory of Open Access Journals (Sweden)

    C. Wang

    2010-02-01

    Full Text Available Meeting future world energy needs while addressing climate change requires large-scale deployment of low or zero greenhouse gas (GHG) emission technologies such as wind energy. The widespread availability of wind power has fueled substantial interest in this renewable energy source as one of the needed technologies. For very large-scale utilization of this resource, there are however potential environmental impacts, and also problems arising from its inherent intermittency, in addition to the present need to lower unit costs. To explore some of these issues, we use a three-dimensional climate model to simulate the potential climate effects associated with installation of wind-powered generators over vast areas of land or coastal ocean. Using wind turbines to meet 10% or more of global energy demand in 2100 could cause surface warming exceeding 1 °C over land installations. In contrast, surface cooling exceeding 1 °C is computed over ocean installations, but the validity of simulating the impacts of wind turbines by simply increasing the ocean surface drag needs further study. Significant warming or cooling remote from both the land and ocean installations, and alterations of the global distributions of rainfall and clouds, also occur. These results are influenced by the competing effects of increases in roughness and decreases in wind speed on near-surface turbulent heat fluxes, the differing nature of land and ocean surface friction, and the dimensions of the installations parallel and perpendicular to the prevailing winds. These results are also dependent on the accuracy of the model used, and the realism of the methods applied to simulate wind turbines. Additional theory and new field observations will be required for their ultimate validation. Intermittency of wind power on daily, monthly and longer time scales as computed in these simulations and inferred from meteorological observations, poses a demand for one or more options to ensure

  12. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large scale features of the turbulent channel flow is obtained using two-point correlations of velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large scale features of turbulence and the temperature field.
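
The diagnostics named in the abstract — two-point correlations and POD — can be sketched on toy data. The synthetic snapshot matrix below (two imposed large-scale modes plus noise) is only a stand-in for the channel-flow DNS fields; with snapshots in rows, the method of snapshots reduces to an SVD of the fluctuation matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: 50 snapshots of a 1-D field on 128 points, built from two
# "large-scale" modes with random amplitudes plus small-scale noise.
x = np.linspace(0.0, 2.0 * np.pi, 128)
modes = np.stack([np.sin(x), np.sin(2.0 * x)])
coeffs = rng.standard_normal((50, 2)) * np.array([3.0, 1.0])
snapshots = coeffs @ modes + 0.1 * rng.standard_normal((50, 128))

# Two-point correlation R(x0, x) = <u'(x0) u'(x)>, averaged over snapshots
fluct = snapshots - snapshots.mean(axis=0)
x0 = 20                                        # reference-point index
two_point = (fluct * fluct[:, [x0]]).mean(axis=0)

# POD via SVD of the fluctuation matrix: singular values give modal energies
_, sing, _ = np.linalg.svd(fluct, full_matrices=False)
energy = sing**2 / np.sum(sing**2)
print(np.round(energy[:3], 3))   # the two imposed large-scale modes dominate
```

The same machinery applied to velocity and temperature snapshots side by side is what allows the coupling between large-scale motions and the passive scalar to be quantified.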

  13. A world wide public health problem: the principal re-emerging infectious diseases.

    Science.gov (United States)

    De Luca D'Alessandro, E; Giraldi, G

    2011-01-01

    The extraordinary progress in the knowledge of infectious disease and the discovery of antibiotics and effective vaccines are among the great achievements of the nineteenth and twentieth centuries. These achievements have led to a dramatic reduction in the levels of mortality from these diseases. According to the World Health Organization, the term "re-emerging infectious diseases" refers to infectious diseases which, although well known, have not recently been of public health importance. However, climate change, migration, changes in health services, antibiotic resistance, population increase, international travel, the increase in the number of immune-depressed patients, etc. have led to the re-emergence of these diseases. Climate changes are exposing sectors of the population to inadequate fresh air, water, food and resources for survival which, in consequence, provoke increases in both internal and international migration. In this particular period in which we find ourselves, characterized by globalization, the international community has become aware that the re-emergence of these diseases poses an important risk to public health, which underlines the necessity to adopt appropriate strategies for their prevention and control. The re-emerging diseases of the twenty-first century are a serious problem for public health, and even though there has been enormous progress in medical science and in the battle against infectious diseases, they are still a long way from being really brought under control. A well organized monitoring system would enable the epidemiological characteristics of infectious diseases to be analyzed and the success or otherwise of preventive interventions to be precisely evaluated. For this reason, the World Health Organization and the European Union have discussed the formation of a collaborative network for the monitoring and control of re-emerging diseases and have initiated special programmes. The battle between humanity and infectious disease

  14. Assessing the Challenges in the Application of Potential Probiotic Lactic Acid Bacteria in the Large-Scale Fermentation of Spanish-Style Table Olives

    Directory of Open Access Journals (Sweden)

    Francisco Rodríguez-Gómez

    2017-05-01

    Full Text Available This work studies the inoculation conditions for allowing the survival/predominance of a potential probiotic strain (Lactobacillus pentosus TOMC-LAB2) when used as a starter culture in large-scale fermentations of green Spanish-style olives. The study was performed in two successive seasons (2011/2012 and 2012/2013), using about 150 tons of olives. Inoculation immediately after brining (to prevent wild initial microbiota growth), followed by re-inoculation 24 h later (to improve competitiveness), was essential for inoculum predominance. Processing early in the season (September) showed a favorable effect on fermentation and strain predominance on olives (particularly when using acidified brines containing 25 L HCl/vessel) but caused the disappearance of the target strain from both brines and olives during the storage phase. On the contrary, processing in October slightly reduced the target strain predominance on olives (70–90%) but allowed longer survival. The type of inoculum used (laboratory vs. industry pre-adapted) never had significant effects. Thus, this investigation discloses key issues for the survival and predominance of starter cultures in large-scale industrial fermentations of green Spanish-style olives. Results can be of interest for producing probiotic table olives and open new research challenges on the causes of inoculum vanishing during the storage phase.

  15. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
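    The per-pixel linked-list structure described above can be sketched on the CPU as follows. This is a minimal illustration, not the thesis implementation (which builds the lists on the GPU, typically with atomic counters over a flat node buffer); the class and field names are hypothetical.

```python
# Minimal CPU sketch of a per-pixel linked list of pathline segments.
# Each screen pixel stores the head index of a singly linked list;
# segments are prepended in O(1), and traversal supports filtering
# and color-coding without touching the original flow data.

class PixelLinkedLists:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.head = [-1] * (width * height)   # per-pixel head pointers (-1 = empty)
        self.segments = []                    # flat pool of segment records
        self.next_ptr = []                    # per-node "next" links

    def insert(self, x, y, segment):
        """Prepend a segment record to pixel (x, y)'s list."""
        idx = y * self.width + x
        node = len(self.segments)
        self.segments.append(segment)
        self.next_ptr.append(self.head[idx])  # link new node to previous head
        self.head[idx] = node

    def segments_at(self, x, y):
        """Walk pixel (x, y)'s list, e.g. to filter or color-code it."""
        node = self.head[y * self.width + x]
        while node != -1:
            yield self.segments[node]
            node = self.next_ptr[node]

ppll = PixelLinkedLists(4, 4)
ppll.insert(1, 2, {"line_id": 7, "depth": 0.3})
ppll.insert(1, 2, {"line_id": 9, "depth": 0.1})
print([s["line_id"] for s in ppll.segments_at(1, 2)])  # most recent first: [9, 7]
```

A GPU version would additionally sort each list by depth before compositing, but the access pattern is the same.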

  16. Differential responses of emergent intertidal coral reef fauna to a large-scale El-Niño southern oscillation event: sponge and coral resilience.

    Science.gov (United States)

    Kelmo, Francisco; Bell, James J; Moraes, Simone Souza; Gomes, Rilza da Costa Tourinho; Mariano-Neto, Eduardo; Attrill, Martin J

    2014-01-01

    There is a paucity of information on the impacts of the 1997-8 El Niño event and subsequent climatic episodes on emergent intertidal coral reef assemblages. Given the environmental variability intertidal reefs experience, such reefs may potentially be more resilient to climatic events and provide important insights into the adaptation of reef fauna to future ocean warming. Here we report the results of a 17-year (1995-2011) biodiversity survey of four emergent coral reef ecosystems in Bahia, Brazil, to assess the impact of a major El Niño event on the reef fauna, and determine any subsequent recovery. The densities of two species of coral, Favia gravida and Siderastrea stellata, did not vary significantly across the survey period, indicating a high degree of tolerance to the El Niño associated stress. However, there were marked decreases in the diversity of other taxa. Molluscs, bryozoans and ascidians suffered severe declines in diversity and abundance and had not recovered to pre-El Niño levels by the end of the study. Echinoderms were reduced to a single species in 1999, Echinometra lucunter, although diversity levels had recovered by 2002. Sponge assemblages were not impacted by the 1997-8 event and their densities had increased by the study end. Multivariate analysis indicated that a stable invertebrate community had re-established on the reefs after the El Niño event, but it has a different overall composition to the pre-El Niño community. It is unclear if community recovery will continue given more time, but our study highlights that any increase in the frequency of large-scale climatic events to more than one a decade is likely to result in a persistent lower-diversity state. Our results also suggest some coral and sponge species are particularly resilient to the El Niño-associated stress and therefore represent suitable models to investigate temperature adaptation in reef organisms.

  17. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    Science.gov (United States)

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  18. Large-scale urban point cloud labeling and reconstruction

    Science.gov (United States)

    Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu

    2018-04-01

    The large number of object categories and many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges in point cloud classification. In this paper, a novel framework is proposed for classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a rectified linear units neural network named ReLu-NN where the rectified linear units (ReLu) instead of the traditional sigmoid are taken as the activation function in order to speed up the convergence. Since the features of the point cloud are sparse, we reduce the number of neurons by the dropout to avoid over-fitting of the training process. The set of feature descriptors for each 3D point is encoded through self-taught learning, and forms a discriminative feature representation which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then they are reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the ReLu-NN introduced can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
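    The two ingredients the abstract highlights, ReLU activations for faster convergence and dropout against over-fitting, can be sketched as follows. This is a minimal NumPy illustration under assumed layer sizes, not the authors' ReLu-NN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # max(0, x): avoids the saturating gradients of the sigmoid
    return np.maximum(0.0, x)

def dropout(x, rate, training):
    # Inverted dropout: randomly zero activations during training and
    # rescale the survivors so the expected activation is unchanged.
    if not training or rate == 0.0:
        return x
    mask = (rng.random(x.shape) >= rate).astype(x.dtype)
    return x * mask / (1.0 - rate)

def forward(points, w1, b1, w2, b2, training=False):
    """One hidden layer mapping per-point feature descriptors to class scores."""
    hidden = dropout(relu(points @ w1 + b1), rate=0.5, training=training)
    return hidden @ w2 + b2

# 10 points with 8-dimensional descriptors, 4 classes (all sizes illustrative)
x = rng.standard_normal((10, 8))
w1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
w2, b2 = rng.standard_normal((16, 4)), np.zeros(4)
scores = forward(x, w1, b1, w2, b2, training=False)
print(scores.shape)  # (10, 4)
```

At inference time dropout is disabled, which is why `training` defaults to `False`.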

  19. Future potential distribution of the emerging amphibian chytrid fungus under anthropogenic climate change.

    Science.gov (United States)

    Rödder, Dennis; Kielgast, Jos; Lötters, Stefan

    2010-11-01

    Anthropogenic climate change poses a major threat to global biodiversity with a potential to alter biological interactions at all spatial scales. Amphibians are the most threatened vertebrates and have been subject to increasing conservation attention over the past decade. A particular concern is the pandemic emergence of the parasitic chytrid fungus Batrachochytrium dendrobatidis, which has been identified as the cause of extremely rapid large-scale declines and species extinctions. Experimental and observational studies have demonstrated that the host-pathogen system is strongly influenced by climatic parameters and thereby potentially affected by climate change. Herein we project a species distribution model of the pathogen onto future climatic scenarios generated by the IPCC to examine their potential implications on the pandemic. Results suggest that predicted anthropogenic climate change may reduce the geographic range of B. dendrobatidis and its potential influence on amphibian biodiversity.

  20. Emerging mosquito-borne viruses: transmission and modulation of host defence

    NARCIS (Netherlands)

    Fros, J.J.

    2015-01-01

    Summary

    Two highly pathogenic arthropod-borne (arbo)viruses, West Nile virus (WNV) and chikungunya virus (CHIKV), recently (re-)emerged in both Europe and the Americas. This resulted in large-scale epidemics of severe encephalitic and arthritogenic human disease,

  1. Re-emergence of Chikungunya in India: Molecular studies

    Indian Academy of Sciences (India)

    Re-emergence of Chikungunya in India: Molecular studies (slide presentation). In view of the long absence of CHIK epidemics, it was postulated that the CHIK virus had disappeared from India and South-East Asia. Serological surveys supported this view.

  2. Experts' Perceptions on China's Capacity to Manage Emerging and Re-emerging Zoonotic Diseases in an Era of Climate Change.

    Science.gov (United States)

    Hansen, A; Xiang, J; Liu, Q; Tong, M X; Sun, Y; Liu, X; Chen, K; Cameron, S; Hanson-Easey, S; Han, G-S; Weinstein, P; Williams, C; Bi, P

    2017-11-01

    Zoonotic diseases transmitted by arthropods and rodents are a major public health concern in China. However, interventions in recent decades have helped lower the incidence of several diseases despite the country's large, frequently mobile population and socio-economic challenges. Increasing globalization, rapid urbanization and a warming climate now add to the complexity of disease control and prevention and could challenge China's capacity to respond to threats of emerging and re-emerging zoonoses. To investigate this notion, face-to-face interviews were conducted with 30 infectious disease experts in four cities in China. The case study diseases under discussion were malaria, dengue fever and haemorrhagic fever with renal syndrome, all of which may be influenced by changing meteorological conditions. Data were analysed using standard qualitative techniques. The study participants viewed the current disease prevention and control system favourably and were optimistic about China's capacity to manage climate-sensitive diseases in the future. Several recommendations emerged from the data including the need to improve health literacy in the population regarding the transmission of infectious diseases and raising awareness of the health impacts of climate change amongst policymakers and health professionals. Participants thought that research capacity could be strengthened and human resources issues for front-line staff should be addressed. It was considered important that authorities are well prepared in advance for outbreaks such as dengue fever in populous subtropical areas, and a prompt and coordinated response is required when outbreaks occur. Furthermore, health professionals need to remain skilled in the identification of diseases for which incidence is declining, so that re-emerging or emerging trends can be rapidly identified. Recommendations such as these may be useful in formulating adaptation plans and capacity building for the future control and

  3. Measurements of the large-scale direct-current Earth potential and possible implications for the geomagnetic dynamo.

    Science.gov (United States)

    1985-07-05

    The magnitude of the large-scale direct-current earth potential was measured on a section of a recently laid transatlantic telecommunications cable. Analysis of the data acquired on the 4476-kilometer cable yielded a mean direct-current potential drop of less than about 0.072 ± 0.050 millivolts per kilometer. Interpreted in terms of a generation of the potential by the earth's geodynamo, such a small value of the mean potential implies that the toroidal and poloidal magnetic fields of the dynamo are approximately equal at the core-mantle boundary.

  4. Human tularemia in Italy. Is it a re-emerging disease?

    Science.gov (United States)

    D'Alessandro, D; Napoli, C; Nusca, A; Bella, A; Funari, E

    2015-07-01

    Tularemia is a contagious infectious disease due to Francisella tularensis that can cause serious clinical manifestations and significant mortality if untreated. Although the frequency and significance of the disease have diminished over recent decades in Central Europe, over the past few years new evidence has emerged suggesting that tularemia has re-emerged worldwide. Knowing the real epidemiology of the disease is at the root of correct control measures. In order to evaluate whether tularemia is re-emerging in Italy, data on mortality and morbidity (obtained from the National Institute of Statistics; ISTAT), Italian cases described in the scientific literature and data concerning hospitalizations for tularemia (obtained from the National Hospital Discharge Database) were analysed. From 1979 to 2010, ISTAT reported 474 cases and no deaths. The overall number of cases obtained from the literature review was at least 31% higher than that reported by ISTAT. Moreover, the number of cases reported by ISTAT was 3.5 times smaller than the number of hospitalized cases. In Italy tularemia is sporadic, rarely endemic and self-limiting; although the trend of reported tularemia does not support the hypothesis of a re-emerging disease, the study demonstrates wide underreporting of the disease. The real frequency of the disease should be carefully investigated and taken into account in order to implement specific prevention measures.

  5. Re-thinking China's densified biomass fuel policies: Large or small scale?

    International Nuclear Information System (INIS)

    Shan, Ming; Li, Dingkai; Jiang, Yi; Yang, Xudong

    2016-01-01

    Current policies and strategies related to the utilization of densified biomass fuel (DBF) in China are mainly focused on medium- or large-scale manufacturing modes, which cannot provide feasible solutions to solve the household energy problems in China's rural areas. To simplify commercial processes related to the collection of DBF feedstock and the production and utilization of fuel, a novel village-scale DBF approach is proposed. Pilot demonstration projects have shown the feasibility and flexibility of this new approach in realizing sustainable development in rural China. Effective utilization of DBF in rural China will lead to gains for global, regional, and local energy savings, environmental protection, sustainable development, and related social benefits. It could also benefit other developing countries for better utilization of biomass as a viable household energy source. This proposal therefore delivers the possibility of reciprocal gains, and as such deserves the attention of policy makers and various stakeholders. - Highlights: •A field survey of Chinese densified biomass fuel (DBF) development is conducted. •The current situation and problems related to China's DBF industry are analyzed. •A novel and viable village-scale DBF utilization mode is proposed. •Further actions are suggested to boost the utilization of DBF in rural China.

  6. Landscape of emerging and re-emerging infectious diseases in China: impact of ecology, climate, and behavior.

    Science.gov (United States)

    Liu, Qiyong; Xu, Wenbo; Lu, Shan; Jiang, Jiafu; Zhou, Jieping; Shao, Zhujun; Liu, Xiaobo; Xu, Lei; Xiong, Yanwen; Zheng, Han; Jin, Sun; Jiang, Hai; Cao, Wuchun; Xu, Jianguo

    2018-02-01

    For the past several decades, the infectious disease profile in China has been shifting with rapid developments in social and economic aspects, environment, quality of food, water, housing, and public health infrastructure. Notably, 5 notifiable infectious diseases have been almost eradicated, and the incidence of 18 additional notifiable infectious diseases has been significantly reduced. Unexpectedly, the incidence of over 10 notifiable infectious diseases, including HIV, brucellosis, syphilis, and dengue fever, has been increasing. Nevertheless, frequent infectious disease outbreaks/events have been reported almost every year, and imported infectious diseases have increased since 2015. New pathogens and over 100 new genotypes or serotypes of known pathogens have been identified. Some infectious diseases seem to be exacerbated by various factors, including rapid urbanization, large numbers of migrant workers, changes in climate, ecology, and policies, such as returning farmland to forests. This review summarizes the current experiences and lessons from China in managing emerging and re-emerging infectious diseases, especially the effects of ecology, climate, and behavior, which should have merits in helping other countries to control and prevent infectious diseases.

  7. Rapid BAL Variability: Re-Emerging Absorption

    Energy Technology Data Exchange (ETDEWEB)

    Erakuman, Damla [Department of Astronomy and Space Sciences, Faculty of Science, Erciyes University, Kayseri (Turkey); Filiz Ak, Nurten, E-mail: damla.erakuman@gmail.com [Department of Astronomy and Space Sciences, Faculty of Science, Erciyes University, Kayseri (Turkey); Astronomy and Space Sciences Observatory and Research Center, Erciyes University, Kayseri (Turkey)

    2017-11-08

    We study BAL variations of SDSS J141955.28+522741.4 utilizing 32 epochs of spectroscopic observations from SDSS. We identify three individual BAL troughs for C iv and one BAL trough for Si iv. The deepest C iv BAL trough shows significant EW variations on timescales of a few tens of hours. The fast component of the deepest C iv BAL presents disappearance and re-emergence, preserving its initial velocity range and profile. All identified BAL troughs show coordinated variations, supporting the idea that the mechanism behind the variations is a change in the ionization level of the absorbing gas.

  8. Assessing the capacity of the healthcare system to use additional mechanical ventilators during a large-scale public health emergency (PHE)

    Science.gov (United States)

    Ajao, Adebola; Nystrom, Scott V.; Koonin, Lisa M.; Patel, Anita; Howell, David R.; Baccam, Prasith; Lant, Tim; Malatino, Eileen; Chamberlin, Margaret; Meltzer, Martin I.

    2015-01-01

    A large-scale Public Health Emergency (PHE), like a severe influenza pandemic can generate large numbers of critically ill patients in a short time. We modeled the number of mechanical ventilators that could be used in addition to the number of hospital-based ventilators currently in use. We identified key components of the healthcare system needed to deliver ventilation therapy, quantified the maximum number of additional ventilators that each key component could support at various capacity levels (i.e. conventional, contingency and crisis) and determined the constraining key component at each capacity level. Our study results showed that U.S. hospitals could absorb between 26,200 and 56,300 additional ventilators at the peak of a national influenza pandemic outbreak with robust pre-pandemic planning. This methodology could be adapted by emergency planners to determine stockpiling goals for critical resources or identify alternatives to manage overwhelming critical care need. PMID:26450633

  9. A quantitative risk assessment approach for mosquito-borne diseases: malaria re-emergence in southern France

    Directory of Open Access Journals (Sweden)

    Luty Adrian JF

    2008-08-01

    Full Text Available Abstract Background The Camargue region is a former malaria-endemic area, where potential Anopheles vectors are still abundant. Considering the importation of Plasmodium due to the high number of imported malaria cases in France, the aim of this article was to make predictions regarding the risk of malaria re-emergence in the Camargue. Methods Receptivity (vectorial capacity) and infectivity (vector susceptibility) were inferred using an innovative probabilistic approach, considering both Plasmodium falciparum and Plasmodium vivax. Each parameter of receptivity (human biting rate, anthropophily, length of trophogonic cycle, survival rate, length of sporogonic cycle) and infectivity was estimated based on field surveys, bibliographic data and expert knowledge, and fitted with probability distributions taking into account the variability and the uncertainty of the estimation. Spatial and temporal variations of the parameters were determined using environmental factors derived from satellite imagery, meteorological data and entomological field data. The entomological risk (receptivity/infectivity) was calculated using 10,000 different randomly selected sets of values extracted from the probability distributions. The result was mapped for the Camargue area. Finally, vulnerability (number of imported malaria cases) was inferred using data collected in regional hospitals. Results The entomological risk presented large spatial, temporal and Plasmodium species-dependent variations. The sensitivity analysis showed that susceptibility, survival rate and human biting rate were the three most influential parameters for entomological risk. Assessment of vulnerability showed that among the imported cases in the region, only very few were imported into at-risk areas. Conclusion The current risk of malaria re-emergence seems negligible due to the very low number of imported Plasmodium. This model demonstrated its efficiency for mosquito-borne diseases risk
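    The probabilistic scheme the abstract describes (drawing 10,000 parameter sets from fitted distributions and propagating each through a receptivity formula) can be illustrated with Macdonald's classical vectorial-capacity expression. The distributions below are purely illustrative placeholders, not the paper's fitted ones.

```python
import math
import random

random.seed(42)
N = 10_000

def vectorial_capacity(m, a, p, n):
    """Macdonald's formula: C = m * a**2 * p**n / (-ln p)."""
    return m * a * a * p ** n / -math.log(p)

samples = []
for _ in range(N):
    m = random.uniform(5, 50)      # mosquito density per human (illustrative)
    a = random.betavariate(2, 5)   # human biting rate x anthropophily (illustrative)
    p = random.uniform(0.80, 0.95) # daily survival probability (illustrative)
    n = random.uniform(10, 16)     # sporogonic cycle length in days (illustrative)
    samples.append(vectorial_capacity(m, a, p, n))

samples.sort()
print(f"median C = {samples[N // 2]:.2f}, 95th percentile = {samples[int(0.95 * N)]:.2f}")
```

Mapping the resulting distribution per grid cell, as the paper does with satellite-derived covariates, then turns this point estimate into a spatial risk surface.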

  10. Acoustic scaling: A re-evaluation of the acoustic model of Manchester Studio 7

    Science.gov (United States)

    Walker, R.

    1984-12-01

    The reasons for the reconstruction and re-evaluation of the acoustic scale model of a large music studio are discussed. The design and construction of the model using mechanical and structural considerations, rather than purely acoustic absorption criteria, is described and the results obtained are given. The results confirm that structural elements within the studio gave rise to unexpected and unwanted low-frequency acoustic absorption. The results also show that, at least for the relatively well understood mechanisms of sound energy absorption, physical modelling of the structural and internal components gives an acoustically accurate scale model, within the usual tolerances of acoustic design. The poor reliability of measurements of acoustic absorption coefficients is well illustrated. The conclusion is reached that such acoustic scale modelling is a valid and, for large-scale projects, financially justifiable technique for predicting fundamental acoustic effects. It is not appropriate for the prediction of fine details, because such small details are unlikely to be reproduced exactly at a different size without extensive measurements of the material's performance at both scales.

  11. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    DEFF Research Database (Denmark)

    Jensen, Tue Vissing; Pinson, Pierre

    2017-01-01

    …we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven … to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation…

  12. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E{sup 3}-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
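    The evolutionary-algorithm loop the abstract refers to can be sketched in its most basic form below. This is a generic elitist (mu + lambda) sketch under an illustrative stand-in objective, not the paper's E³-ISO methodology, which adds multi-objective handling, decomposition and constraint techniques on top of such a loop.

```python
import random

random.seed(0)

def sphere(x):
    # Stand-in single objective; the paper's energy/economy/ecology
    # objectives would replace this in a multi-objective setting.
    return sum(v * v for v in x)

def evolve(obj, dim=5, pop_size=20, generations=100, sigma=0.3):
    """Basic elitist loop: mutate every parent, evaluate, keep the best."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        children = [[v + random.gauss(0, sigma) for v in p] for p in pop]
        pop = sorted(pop + children, key=obj)[:pop_size]  # survivor selection
    return pop[0]

best = evolve(sphere)
print(f"best objective value: {sphere(best):.4f}")
```

A self-adaptive variant, as mentioned in the paper, would additionally evolve `sigma` alongside the decision variables.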

  13. Large scale electronic structure calculations in the study of the condensed phase

    NARCIS (Netherlands)

    van Dam, H.J.J.; Guest, M.F.; Sherwood, P.; Thomas, J.M.H.; van Lenthe, J.H.; van Lingen, J.N.J.; Bailey, C.L.; Bush, I.J.

    2006-01-01

    We consider the role that large-scale electronic structure computations can now play in the modelling of the condensed phase. To structure our analysis, we consider four distinct ways in which today's scientific targets can be re-scoped to take advantage of advances in computing resources: 1. time to

  14. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The Subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems … the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its…

  15. Identification of a potential superhard compound ReCN

    International Nuclear Information System (INIS)

    Fan, Xiaofeng; Li, M.M.; Singh, David J.; Jiang, Qing; Zheng, W.T.

    2015-01-01

    Highlights: • We identify a new ternary compound, ReCN, with theoretical calculations. • The ternary compound ReCN has two stable structures, with space groups P63mc and P3m1. • ReCN is a semiconductor, from calculations of its electronic structure. • ReCN is found to possess outstanding mechanical properties. • ReCN may be synthesized relatively easily. - Abstract: We identify a new ternary compound, ReCN, and characterize its properties, including structural stability and indicators of hardness, using first-principles calculations. We find that there are two stable structures, with space groups P63mc (HI) and P3m1 (HII), in which there are no C–C or N–N bonds. Both structures, HI and HII, are elastically and dynamically stable. The electronic structures show that ReCN is a semiconductor, although the parent compounds, ReC2 and ReN2, are both metallic. ReCN is found to possess outstanding mechanical properties, with a large bulk modulus, shear modulus and excellent ideal strengths. In addition, ReCN may perhaps be synthesized relatively easily because it becomes thermodynamically stable with respect to decomposition at very low pressures.

  16. Demonstration of Mobile Auto-GPS for Large Scale Human Mobility Analysis

    Science.gov (United States)

    Horanont, Teerayut; Witayangkurn, Apichon; Shibasaki, Ryosuke

    2013-04-01

    The greater affordability of digital devices and the advancement of positioning and tracking capabilities have presided over today's age of geospatial Big Data. Moreover, the emergence of massive mobile location data and the rapid increase in computational capabilities open up new opportunities for modeling large-scale urban dynamics. In this research, we demonstrate a new type of mobile location data called "Auto-GPS" and its potential use cases for urban applications. More than one million Auto-GPS mobile phone users in Japan were observed nationwide, in a completely anonymous form, for an entire year from August 2010 to July 2011 for this analysis. A spate of natural disasters and other emergencies during the past few years has prompted new interest in how mobile location data can help enhance our security, especially in urban areas, which are highly vulnerable to these impacts. New insights gleaned from mining the Auto-GPS data suggest a number of promising directions for modeling human movement during a large-scale crisis. We question how people react under critical situations and how their movement changes during severe disasters. Our results demonstrate the case of a major earthquake and explain how people who live in the Tokyo metropolitan area and its vicinity behaved and returned home after the Great East Japan Earthquake on March 11, 2011.

  17. Toxoplasmosis a re-emerging ancient disease | Neils | Zoologist (The)

    African Journals Online (AJOL)

    Toxoplasmosis a re-emerging ancient disease. JS Neils, IA Lawal. No abstract available. http://dx.doi.org/10.4314/tzool.v4i1.45219

  18. Re-spending rebound: A macro-level assessment for OECD countries and emerging economies

    International Nuclear Information System (INIS)

    Antal, Miklós; Bergh, Jeroen C.J.M. van den

    2014-01-01

    It is well-known that energy conservation can lead to rebound effects that partly offset the original energy savings. One particular rebound mechanism is re-spending of money savings associated with energy savings on energy-intensive goods or services. We calculate the average magnitude of this “re-spending rebound” for different fuels and countries, and for both energy and carbon (CO2) emissions. We find that emerging economies, neglected in past studies, typically have larger rebounds than OECD countries. Since such economies play an increasingly important role in the global economy, the re-spending rebound is a growing concern. The re-spending effect is generally larger for gasoline than for natural gas and electricity. Paradoxically, stronger financial incentives to conserve energy tend to increase the rebound. This suggests that with climate regulation and peak oil the re-spending rebound may become more important. We discuss the policy implications of our findings. - Highlights: • Energy and carbon rebound due to re-spending of money savings is analyzed. • The average magnitude of this rebound is calculated for several countries. • Emerging economies typically have substantially larger rebounds than OECD countries. • The effect is generally stronger for gasoline than for natural gas and electricity. • Policy conclusions are drawn

  19. Large-scale simulations with distributed computing: Asymptotic scaling of ballistic deposition

    International Nuclear Information System (INIS)

    Farnudi, Bahman; Vvedensky, Dimitri D

    2011-01-01

    Extensive kinetic Monte Carlo simulations are reported for ballistic deposition (BD) in (1 + 1) dimensions. The large system size observed for the onset of asymptotic scaling (L ≅ 2^12) explains the widespread discrepancies in previous reports for exponents of BD in one and likely in higher dimensions. The exponents obtained directly from our simulations, α = 0.499 ± 0.004 and β = 0.336 ± 0.004, capture the exact values α = 1/2 and β = 1/3 for the one-dimensional Kardar-Parisi-Zhang equation. An analysis of our simulations suggests a criterion for identifying the onset of true asymptotic scaling, which enables a more informed evaluation of exponents for BD in higher dimensions. These simulations were made possible by the Simulation through Social Networking project at the Institute for Advanced Studies in Basic Sciences, started in 2007 and re-launched in November 2010.
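    For readers unfamiliar with the model, a minimal (1 + 1)-dimensional ballistic deposition simulation fits in a few lines. The sketch below uses simple sequential random deposition rather than the paper's distributed kinetic Monte Carlo setup, and the lattice sizes are illustrative only.

    ```python
    import random

    def ballistic_deposition(n_cols, n_particles, seed=0):
        """Drop particles on a 1D substrate with periodic boundaries; each
        particle sticks at first contact, so the new column height is
        max(left neighbor, own height + 1, right neighbor)."""
        rng = random.Random(seed)
        h = [0] * n_cols
        for _ in range(n_particles):
            i = rng.randrange(n_cols)
            h[i] = max(h[(i - 1) % n_cols], h[i] + 1, h[(i + 1) % n_cols])
        return h

    def width(h):
        """Interface width W: RMS fluctuation of the height profile."""
        mean = sum(h) / len(h)
        return (sum((x - mean) ** 2 for x in h) / len(h)) ** 0.5

    # Before saturation the width grows roughly as t**beta (beta = 1/3 for KPZ).
    L = 256
    print(width(ballistic_deposition(L, 10 * L)),
          width(ballistic_deposition(L, 1000 * L)))
    ```

    Measuring the saturated width as a function of L would give the roughness exponent α; the abstract's point is that clean power laws only appear at system sizes far larger than this toy run.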

  20. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up on the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach: first, we estimate the physical land suitability and the inputs necessary to reach certain yields; subsequently, we estimate costs and benefits for different input-output levels; finally, to incorporate the increased awareness of sustainability in the export sector, we also introduce certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site-specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large-scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments.

  1. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  2. Co-evolution of intelligent socio-technical systems modelling and applications in large scale emergency and transport domains

    CERN Document Server

    2013-01-01

    As the interconnectivity between humans through technical devices becomes ubiquitous, the next step is already in the making: ambient intelligence, i.e. smart (technical) environments, which will eventually play the same active role in communication as the human players, leading to a co-evolution in all domains where real-time communication is essential. This topical volume, based on the findings of the Socionical European research project, gives equal attention to two highly relevant application domains: transport, specifically traffic dynamics from the viewpoint of socio-technical interaction, and evacuation scenarios for large-scale emergency situations. Care was taken to probe the limits of scalability as far as possible and to combine modeling based on complex systems science approaches with relevant data analysis.

  3. RED Alert – Early warning or detection of global re-emerging infectious disease (RED)

    Energy Technology Data Exchange (ETDEWEB)

    Deshpande, Alina [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-13

    This is the PDF of a presentation for a webinar given by Los Alamos National Laboratory (LANL) on the early warning or detection of global re-emerging infectious disease (RED). First, there is an overview of LANL biosurveillance tools. Then, information is given about RED Alert. Next, a demonstration is given of a component prototype. RED Alert is an analysis tool that can provide early warning or detection of the re-emergence of an infectious disease at the global level, but through a local lens.

  4. EPA RE-Powering Mapper: Alternative Energy Potential at Cleanup Sites

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Office of Land and Emergency Management's (OLEM) Office of Communications, Partnerships and Analysis (OCPA) initiated the RE-Powering America's Land Initiative to demonstrate the enormous potential that contaminated lands, landfills, and mine sites provide for developing renewable energy in the United States. EPA developed national-level site screening criteria in partnership with the U.S. Department of Energy (DOE) National Renewable Energy Laboratory (NREL) for wind, solar, biomass, and geothermal facilities. While the screening criteria demonstrate the potential to reuse contaminated land for renewable energy facilities, the criteria and data are not designed to identify the best sites for developing renewable energy, nor are they all-inclusive. More detailed, site-specific analysis is therefore necessary to identify or prioritize the best sites for developing renewable energy facilities based on their technical and economic potential. Please note that these sites were only pre-screened for renewable energy potential; they were not evaluated for land use constraints or current on-the-ground conditions. Additional research and site-specific analysis are needed to verify the viability of renewable energy development at a given site.

  5. Evaluation of biochar powder on oxygen supply efficiency and global warming potential during mainstream large-scale aerobic composting.

    Science.gov (United States)

    He, Xueqin; Chen, Longjian; Han, Lujia; Liu, Ning; Cui, Ruxiu; Yin, Hongjie; Huang, Guangqun

    2017-12-01

    This study investigated the effects of biochar powder on oxygen supply efficiency and global warming potential (GWP) in the large-scale aerobic composting pattern used in China, which combines cyclical forced turning with aeration at the bottom of the composting tanks. A 55-day large-scale aerobic composting experiment was conducted with two groups, without (CK) and with (BC) 10% biochar powder addition by weight. The results show that biochar powder improves oxygen retention: the fraction of time with O_2 > 5% is around 80%. The composting process with the above pattern significantly reduces CH_4 and N_2O emissions compared to static or turning-only styles. Given that the average GWP of the BC group was 19.82% lower than that of the CK group, rational addition of biochar powder has the potential to reduce the energy consumption of turning, improve the effectiveness of the oxygen supply, and reduce comprehensive greenhouse effects. Copyright © 2017. Published by Elsevier Ltd.
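    The GWP comparison in this kind of study is simple arithmetic once the emission totals are measured: each gas is weighted by its 100-year GWP factor. The sketch below uses the IPCC AR4 factors (25 for CH_4, 298 for N_2O); the emission totals are invented placeholders, not the study's data.

    ```python
    GWP_CH4, GWP_N2O = 25, 298  # IPCC AR4 100-year factors, kg CO2-eq per kg

    def composting_gwp(ch4_kg, n2o_kg):
        """CO2-equivalent of the CH4 and N2O emitted during a composting run.
        Biogenic CO2 from composting is conventionally excluded."""
        return GWP_CH4 * ch4_kg + GWP_N2O * n2o_kg

    # Placeholder emission totals (kg per tonne of compost), NOT measured values.
    control = composting_gwp(ch4_kg=2.0, n2o_kg=0.15)
    biochar = composting_gwp(ch4_kg=1.5, n2o_kg=0.12)
    print(f"GWP reduction: {100 * (control - biochar) / control:.1f}%")
    ```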

  6. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ..... Final report on environmental impact assessment of large scale.

  7. Challenges and opportunities for large landscape-scale management in a shifting climate: The importance of nested adaptation responses across geospatial and temporal scales

    Science.gov (United States)

    Gary M. Tabor; Anne Carlson; Travis Belote

    2014-01-01

    The Yellowstone to Yukon Conservation Initiative (Y2Y) was established over 20 years ago as an experiment in large landscape conservation. Initially, Y2Y emerged as a response to large scale habitat fragmentation by advancing ecological connectivity. It also laid the foundation for large scale multi-stakeholder conservation collaboration with almost 200 non-...

  8. Quasi-potential and Two-Scale Large Deviation Theory for Gillespie Dynamics

    KAUST Repository

    Li, Tiejun; Li, Fangting; Li, Xianggang; Lu, Cheng

    2016-01-01

    theory for Gillespie-type jump dynamics. In the application to a typical genetic switching model, the two-scale large deviation theory is developed to take into account the fast switching of DNA states. The comparison with other proposals are also

  9. Kinematic morphology of large-scale structure: evolution from potential to rotational flow

    International Nuclear Information System (INIS)

    Wang, Xin; Szalay, Alex; Aragón-Calvo, Miguel A.; Neyrinck, Mark C.; Eyink, Gregory L.

    2014-01-01

    As an alternative way to describe the cosmological velocity field, we discuss the evolution of rotational invariants constructed from the velocity gradient tensor. Compared with the traditional divergence-vorticity decomposition, these invariants, defined as coefficients of the characteristic equation of the velocity gradient tensor, enable a complete classification of all possible flow patterns in the dark-matter comoving frame, including both potential and vortical flows. We show that this tool, first introduced in turbulence two decades ago, is very useful for understanding the evolution of the cosmic web structure, and in classifying its morphology. Before shell crossing, different categories of potential flow are highly associated with the cosmic web structure because of the coherent evolution of density and velocity. This correspondence is even preserved at some level when vorticity is generated after shell crossing. The evolution from the potential to vortical flow can be traced continuously by these invariants. With the help of this tool, we show that the vorticity is generated in a particular way that is highly correlated with the large-scale structure. This includes a distinct spatial distribution and different types of alignment between the cosmic web and vorticity direction for various vortical flows. Incorporating shell crossing into closed dynamical systems is highly non-trivial, but we propose a possible statistical explanation for some of the phenomena relating to the internal structure of the three-dimensional invariant space.
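    The invariants used here are just the characteristic-polynomial coefficients of the 3×3 velocity gradient tensor A_ij = ∂v_i/∂x_j. A minimal sketch of computing them (the function name and example tensor are mine, not the paper's):

    ```python
    def vg_invariants(A):
        """Coefficients (I1, I2, I3) of det(A - x*I) = -x^3 + I1*x^2 - I2*x + I3
        for a 3x3 velocity gradient tensor A[i][j] = dv_i/dx_j.
        I1 is the trace (flow divergence), I3 the determinant."""
        tr = A[0][0] + A[1][1] + A[2][2]
        tr_sq = sum(A[i][k] * A[k][i] for i in range(3) for k in range(3))  # tr(A @ A)
        det = (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
               - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
               + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))
        return tr, 0.5 * (tr * tr - tr_sq), det

    # For a diagonal tensor the invariants reduce to the elementary symmetric
    # polynomials of the eigenvalues: 1+2+3, 1*2+2*3+3*1, 1*2*3.
    diag = [[1, 0, 0], [0, 2, 0], [0, 0, 3]]
    print(vg_invariants(diag))  # (6, 11.0, 6)
    ```

    Classifying a flow pattern then amounts to locating the point (I1, I2, I3) within regions of this three-dimensional invariant space, which is what lets potential and vortical flows be told apart.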

  10. Kinematic morphology of large-scale structure: evolution from potential to rotational flow

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xin; Szalay, Alex; Aragón-Calvo, Miguel A.; Neyrinck, Mark C.; Eyink, Gregory L. [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-09-20

    As an alternative way to describe the cosmological velocity field, we discuss the evolution of rotational invariants constructed from the velocity gradient tensor. Compared with the traditional divergence-vorticity decomposition, these invariants, defined as coefficients of the characteristic equation of the velocity gradient tensor, enable a complete classification of all possible flow patterns in the dark-matter comoving frame, including both potential and vortical flows. We show that this tool, first introduced in turbulence two decades ago, is very useful for understanding the evolution of the cosmic web structure, and in classifying its morphology. Before shell crossing, different categories of potential flow are highly associated with the cosmic web structure because of the coherent evolution of density and velocity. This correspondence is even preserved at some level when vorticity is generated after shell crossing. The evolution from the potential to vortical flow can be traced continuously by these invariants. With the help of this tool, we show that the vorticity is generated in a particular way that is highly correlated with the large-scale structure. This includes a distinct spatial distribution and different types of alignment between the cosmic web and vorticity direction for various vortical flows. Incorporating shell crossing into closed dynamical systems is highly non-trivial, but we propose a possible statistical explanation for some of the phenomena relating to the internal structure of the three-dimensional invariant space.

  11. Systematic review of surveillance systems and methods for early detection of exotic, new and re-emerging diseases in animal populations.

    Science.gov (United States)

    Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M

    2015-07-01

    In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential applications of methodologies for the early detection of new, exotic and re-emerging diseases.

  12. PathlinesExplorer — Image-based exploration of large-scale pathline fields

    KAUST Repository

    Nagoor, Omniah H.

    2015-10-25

    PathlinesExplorer is a novel image-based tool designed to visualize large-scale pathline fields on a single computer [7]. PathlinesExplorer integrates the explorable images (EI) technique [4] with the order-independent transparency (OIT) method [2]. What makes this method different is that it allows users to handle large data on a single workstation. Although it is a view-dependent method, PathlinesExplorer combines both exploration and modification of visual aspects without re-accessing the original huge data. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method, it is possible to filter, color-code, and explore large-scale flow data in real time. In addition, optimization techniques such as early ray termination and deferred shading are applied, which further improve the performance and scalability of our approach.
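    The per-pixel linked list at the core of this approach is an A-buffer-style structure: a head-pointer image plus a flat node pool. A toy CPU version in Python (the paper's implementation is GPU-side; all names here are mine):

    ```python
    class PerPixelLists:
        """Head-pointer image plus a flat node pool; each pixel's fragments
        form a singly linked list threaded through the pool."""
        def __init__(self, width, height):
            self.head = [[-1] * width for _ in range(height)]  # -1 = empty list
            self.nodes = []  # each node: (payload, depth, index of next node)

        def insert(self, x, y, payload, depth):
            # Prepend in O(1); ordering is deferred to read time, which is
            # what makes the transparency order-independent.
            self.nodes.append((payload, depth, self.head[y][x]))
            self.head[y][x] = len(self.nodes) - 1

        def resolve(self, x, y):
            """Return this pixel's fragments sorted front-to-back by depth."""
            frags, idx = [], self.head[y][x]
            while idx != -1:
                payload, depth, idx = self.nodes[idx]
                frags.append((depth, payload))
            return [p for _, p in sorted(frags)]

    buf = PerPixelLists(4, 4)
    buf.insert(1, 2, "pathline segment A", depth=0.8)
    buf.insert(1, 2, "pathline segment B", depth=0.3)
    print(buf.resolve(1, 2))  # ['pathline segment B', 'pathline segment A']
    ```

    Filtering or re-coloring then only walks these lists per pixel, which is why visual aspects can change without touching the original flow data.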

  13. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, which complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has yet been made in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  14. Estimating the electricity prices, generation costs and CO_2 emissions of large scale wind energy exports from Ireland to Great Britain

    International Nuclear Information System (INIS)

    Cleary, Brendan; Duffy, Aidan; Bach, Bjarne; Vitina, Aisma; O’Connor, Alan; Conlon, Michael

    2016-01-01

    The share of wind generation in the Irish and British electricity markets is set to increase by 2020 due to renewable energy (RE) targets. The United Kingdom (UK) and Ireland have set ambitious targets that require 30% and 40% of electricity demand, respectively, to come from RE, mainly wind, by 2020. Ireland has sufficient indigenous onshore wind energy resources to exceed its RE target, while the UK faces uncertainty in achieving its target. A possible solution for the UK is to import RE directly from large-scale onshore and offshore wind energy projects in Ireland; this possibility has recently been explored by both governments but is currently on hold. Thus, the aim of this paper is to estimate the effects of large-scale wind energy in the Irish and British electricity markets in terms of wholesale system marginal prices, total generation costs and CO_2 emissions. The results indicate that when the large-scale Irish-based wind energy projects are connected directly to the UK, there is a decrease of 0.6% and 2% in the Irish and British wholesale system marginal prices, respectively, under the UK National Grid slow progression scenario. - Highlights: • Modelling the Irish and British electricity markets. • Investigating the impacts of large scale wind energy within the markets. • Results indicate a reduction in wholesale system marginal prices in both markets. • Decrease in total generation costs and CO_2 emissions in both markets.

  15. Spontaneous emergence of large-scale cell cycle synchronization in amoeba colonies

    International Nuclear Information System (INIS)

    Segota, Igor; Boulet, Laurent; Franck, David; Franck, Carl

    2014-01-01

    Unicellular eukaryotic amoebae Dictyostelium discoideum are generally believed to grow in their vegetative state as single cells until starvation, when their collective aspect emerges and they differentiate to form a multicellular slime mold. While major efforts continue to be aimed at their starvation-induced social aspect, our understanding of population dynamics and cell cycle in the vegetative growth phase has remained incomplete. Here we show that cell populations grown on a substrate spontaneously synchronize their cell cycles within several hours. These collective population-wide cell cycle oscillations span millimeter length scales and can be completely suppressed by washing away putative cell-secreted signals, implying signaling by means of a diffusible growth factor or mitogen. These observations give strong evidence for collective proliferation behavior in the vegetative state. (paper)

  16. Large-Scale Flows and Magnetic Fields Produced by Rotating Convection in a Quasi-Geostrophic Model of Planetary Cores

    Science.gov (United States)

    Guervilly, C.; Cardin, P.

    2017-12-01

    Convection is the main heat transport process in the liquid cores of planets. The convective flows are thought to be turbulent and constrained by rotation (corresponding to high Reynolds numbers Re and low Rossby numbers Ro). Under these conditions, and in the absence of magnetic fields, the convective flows can produce coherent Reynolds stresses that drive persistent large-scale zonal flows. The formation of large-scale flows has crucial implications for the thermal evolution of planets and the generation of large-scale magnetic fields. In this work, we explore this problem with numerical simulations using a quasi-geostrophic approximation to model convective and zonal flows at Re ≈ 10^4 and Ro ≈ 10^-4 for Prandtl numbers relevant for liquid metals (Pr ≈ 0.1). The formation of intense multiple zonal jets strongly affects the convective heat transport, leading to the formation of a mean temperature staircase. We also study the generation of magnetic fields by the quasi-geostrophic flows at low magnetic Prandtl numbers.

  17. Integration, Provenance, and Temporal Queries for Large-Scale Knowledge Bases

    OpenAIRE

    Gao, Shi

    2016-01-01

    Knowledge bases that summarize web information in RDF triples deliver many benefits, including support for natural language question answering and powerful structured queries that extract encyclopedic knowledge via SPARQL. Large scale knowledge bases grow rapidly in terms of scale and significance, and undergo frequent changes in both schema and content. Two critical problems have thus emerged: (i) how to support temporal queries that explore the history of knowledge bases or flash-back to th...

  18. SCALING AN URBAN EMERGENCY EVACUATION FRAMEWORK: CHALLENGES AND PRACTICES

    Energy Technology Data Exchange (ETDEWEB)

    Karthik, Rajasekar [ORNL; Lu, Wei [ORNL

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems under critical infrastructure disruption in order to aid real-time emergency evacuation. This framework uses large-scale datasets to provide a scalable tool for emergency planning and management. Our framework, World-Wide Emergency Evacuation (WWEE), integrates population distribution and urban infrastructure networks to model travel demand in emergency situations at the global level. A computational model of agent-based traffic simulation is also used to provide an optimal evacuation plan for traffic operation purposes [1]. In addition, our framework provides a web-based high-resolution visualization tool for emergency evacuation modelers and practitioners. We have successfully tested our framework with scenarios in both the United States (Alexandria, VA) and Europe (Berlin, Germany) [2]. However, there are still some major drawbacks to scaling this framework to handle big data workloads in real time. On the back end, lack of proper infrastructure limits our ability to process large amounts of data, run the simulation efficiently and quickly, and provide fast retrieval and serving of data. On the front end, the visualization performance for microscopic evacuation results is still not efficient enough, owing to the high volume of data communicated between server and client. We are addressing these drawbacks by using cloud computing and next-generation web technologies, namely Node.js, NoSQL, WebGL, OpenLayers 3 and HTML5. We briefly describe each of these technologies and how we use and leverage them to provide an efficient tool for emergency management organizations. Our early experimentation demonstrates that these technologies are a promising approach to building a scalable, high-performance urban emergency evacuation framework.

  19. Goethite Bench-scale and Large-scale Preparation Tests

    Energy Technology Data Exchange (ETDEWEB)

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

    ferrous ion, Fe{sup 2+}-Fe{sup 2+} is oxidized to Fe{sup 3+} - in the presence of goethite seed particles. Rhenium does not mimic that process; it is not a strong enough reducing agent to duplicate the TcO{sub 4}{sup -}/Fe{sup 2+} redox reactions. Laboratory tests conducted in parallel with these scaled tests identified modifications to the liquid chemistry necessary to reduce ReO{sub 4}{sup -} and capture rhenium in the solids at levels similar to those achieved by Um (2010) for inclusion of Tc into goethite. By implementing these changes, Re was incorporated into Fe-rich solids for testing at VSL. The changes also changed the phase of iron that was in the slurry product: rather than forming goethite ({alpha}-FeOOH), the process produced magnetite (Fe{sub 3}O{sub 4}). Magnetite was considered by Pacific Northwest National Laboratory (PNNL) and VSL to probably be a better product to improve Re retention in the melter because it decomposes at a higher temperature than goethite (1538 C vs. 136 C). The feasibility tests at VSL were conducted using Re-rich magnetite. The tests did not indicate an improved retention of Re in the glass during vitrification, but they did indicate an improved melting rate (+60%), which could have significant impact on HLW processing. It is still to be shown whether the Re is a solid solution in the magnetite as {sup 99}Tc was determined to be in goethite.

  20. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    Science.gov (United States)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  1. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.

    Science.gov (United States)

    Jensen, Tue V; Pinson, Pierre

    2017-11-28

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  2. Study of elemental mercury re-emission through a lab-scale simulated scrubber

    Energy Technology Data Exchange (ETDEWEB)

    Cheng-Li Wu; Yan Cao; Cheng-Chun He; Zhong-Bing Dong; Wei-Ping Pan [Western Kentucky University, KY (United States). Institute for Combustion Science and Environmental Technology

    2010-08-15

    This paper describes a lab-scale simulated scrubber that was designed and built in the laboratory at Western Kentucky University's Institute for Combustion Science and Environmental Technology. A series of tests on slurries of CaO, CaSO{sub 3}, CaSO{sub 4}/CaSO{sub 3} and Na{sub 2}SO{sub 3} were carried out to simulate recirculating slurries in different oxidation modes. Elemental mercury (Hg{sup 0}) re-emission was replicated through the simulated scrubber. The relationship between the oxidation-reduction potential (ORP) of the slurries and the Hg{sup 0} re-emissions was evaluated. Elemental mercury re-emission occurred when Hg{sup 2+} that was absorbed in the simulated scrubber was converted to Hg{sup 0}; then, Hg{sup 0} was emitted from the slurry together with the carrier gas. The effects of both the reagents and the operational conditions (including the temperature, pH, and oxygen concentrations in the carrier gas) on the Hg{sup 0} re-emission rates in the simulated scrubber were investigated. The results indicated that as the operational temperature of the scrubber and the pH value of the slurry increased, the Hg{sup 0} concentrations that were emitted from the simulated scrubber increased. The Hg{sup 0} re-emission rates decreased as the O{sub 2} concentration in the carrier gas increased. In addition, the effects of additives to suppress Hg{sup 0} re-emission were evaluated in this paper. Sodium tetrasulfide, TMT 15, NaHS and HI were added to the slurry, while Hg{sup 2+}, which was absorbed in the slurry, was retained in the slurry as mercury precipitates. Therefore, there was a significant capacity for the additives to suppress Hg{sup 0} re-emission. 11 refs., 11 figs., 5 tabs.

  3. Identity Statuses throughout Adolescence and Emerging Adulthood: A Large-Scale Study into Gender, Age, and Contextual Differences

    Directory of Open Access Journals (Sweden)

    Margaux Verschueren

    2017-04-01

    Identity formation constitutes a core developmental task during adolescence and emerging adulthood. However, it remains unclear how identity formation may vary across age, gender, and context (education vs. employment) in these developmental periods. The present study used a recently developed model to examine identity statuses or types in a sample of 7,906 Flemish individuals (14–30 years old; 64% female). As expected, achievement, foreclosure, moratorium, carefree diffusion, troubled diffusion, and an undifferentiated status emerged through cluster analysis. Women were overrepresented in the moratorium status (characterized by high exploration), whereas men were mainly situated in the foreclosure and carefree diffusion statuses (both characterized by low exploration, with individuals in foreclosure also having strong identity commitments). Individuals in the carefree and troubled diffusion statuses, which represent the least adaptive statuses, were the youngest. High school students were overrepresented in the diffusion statuses, and college students were mostly present in achievement (representing the most mature status) and moratorium. Finally, employed individuals were overrepresented in foreclosure, whereas unemployed individuals were mainly situated in troubled diffusion. In sum, the present study systematically examined relationships between empirically identified identity statuses and socio-demographic variables in a large-scale sample, generating important information on age, gender, and contextual differences in identity.

  4. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  5. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  6. Re-attenders to the emergency department of a major urban hospital serving a population of 290,000.

    LENUS (Irish Health Repository)

    Ramasubbu, B

    2015-01-01

    The national Emergency Medicine Programme (EMP) in Ireland defines a re-attender as any patient re-presenting to the Emergency Department (ED) within 28 days with the same chief complaint. A retrospective, electronic patient record audit was carried out on all re-attenders to Connolly ED during November 2012. There were 2919 attendances made up from 2530 patients; 230 patients re-attended a total of 389 times. The re-attendance rate was 13% (389/2919). 63 (27%) were frequent presenters. There was a significantly higher admission rate at second attendance than first (89 (39%) vs 39 (17%), p < 0.001). 25% (57/230) of patients 'left before completion of treatment' (LBCT) at first attendance (significantly higher than the number at second attendance (p < 0.01)). 14/57 (25%) of those who LBCT at first attendance required admission at second attendance. 28/89 (31%) of second-attendance admissions were failed discharges from first attendance. Reasons for re-attendance are multi-factorial and include both patient and departmental factors.

  7. A large-scale peer teaching programme - acceptance and benefit.

    Science.gov (United States)

    Schuetz, Elisabeth; Obirei, Barbara; Salat, Daniela; Scholz, Julia; Hann, Dagmar; Dethleffsen, Kathrin

    2017-08-01

    The involvement of students in the delivery of university teaching through peer-assisted learning formats is commonly applied. Publications on this topic exclusively focus on strictly defined situations within the curriculum and selected target groups. This study, in contrast, presents and evaluates a large-scale, structured and quality-assured peer teaching programme, which offers diverse and targeted courses throughout the preclinical part of the medical curriculum. The large-scale peer teaching programme consists of subject-specific and interdisciplinary tutorials that address all scientific, physiological and anatomic subjects of the preclinical curriculum, as well as tutorials with contents exceeding the formal curriculum. In the study year 2013/14 a total of 1,420 lessons were offered as part of the programme. Paper-based evaluations were conducted over the full range of courses. Acceptance and benefit of this peer teaching programme were evaluated in a retrospective study covering the period 2012 to 2014. Usage of tutorials by students who commenced their studies in 2012/13 (n=959) was analysed from 2012 till 2014. Based on the results of 13 first assessments in the preclinical subjects anatomy, biochemistry and physiology, the students were assigned to one of five groups. These groups were compared according to participation in the tutorials. To investigate the benefit of tutorials of the peer teaching programme, the results of biochemistry re-assessments of participants and non-participants of tutorials in the years 2012 till 2014 (n=188, 172 and 204, respectively) were compared using Kolmogorov-Smirnov and Chi-square tests as well as the effect size Cohen's d. Almost 70% of the students attended the voluntary additional programme during their preclinical studies. The students participating in the tutorials had achieved different levels of proficiency in first assessments. The acceptance of different kinds of tutorials appears to correlate with their

  8. The Climate Potentials and Side-Effects of Large-Scale terrestrial CO2 Removal - Insights from Quantitative Model Assessments

    Science.gov (United States)

    Boysen, L.; Heck, V.; Lucht, W.; Gerten, D.

    2015-12-01

    Terrestrial carbon dioxide removal (tCDR) through dedicated biomass plantations is considered one climate engineering (CE) option if implemented at large scale. While the risks and costs are assumed to be small, the effectiveness depends strongly on the spatial and temporal scales of implementation. Based on simulations with a dynamic global vegetation model (LPJmL) we comprehensively assess the effectiveness, biogeochemical side-effects and tradeoffs from an earth system-analytic perspective. We analyzed systematic land-use scenarios in which all, 25%, or 10% of natural and/or agricultural areas are converted to tCDR plantations, assuming that biomass plantations are established once the 2°C target is crossed in a business-as-usual climate change trajectory. The resulting tCDR potentials in year 2100 include the net accumulated annual biomass harvests and changes in all land carbon pools. We find that only the most spatially extensive, and thus undesirable, scenario would be capable of restoring the 2°C target by 2100 under continuing high emissions (with a cooling of 3.02°C). Large-scale biomass plantations covering areas between 1.1 and 4.2 Gha would produce a warming reduction potential of 0.8-1.4°C. tCDR plantations at smaller scales do not build up enough biomass over the considered period, and the achievable reductions in global warming are substantially lower, no more than 0.5-0.6°C. Finally, we demonstrate that the (non-economic) costs for the Earth system include negative impacts on the water cycle and on ecosystems, which are already under pressure due to both land use change and climate change. Overall, tCDR may lead to a further transgression of land- and water-related planetary boundaries while not being able to set back the crossing of the planetary boundary for climate change. tCDR could still be considered in the near-future mitigation portfolio if implemented on small scales on wisely chosen areas.

  9. The re-emergence of sodium ion batteries: testing, processing, and manufacturability

    Science.gov (United States)

    Roberts, Samuel; Kendrick, Emma

    2018-01-01

    With the re-emergence of sodium ion batteries (NIBs), we discuss the reasons for the recent interest in this technology, the synergies between lithium ion battery (LIB) and NIB technologies, and the potential for NIB as a “drop-in” technology for LIB manufacturing. The electrochemical testing of sodium materials in sodium metal anode arrangements is reviewed. The performance, stability, and polarization of the sodium in these test cells lead to alternative testing in three-electrode and alternative anode cell configurations. NIB manufacturability is also discussed, together with the impact that material stability has upon the electrodes and coating. Finally, full-cell NIB technologies are reviewed, and literature proof-of-concept cells give an idea of some of the key differences in the testing protocols of these batteries. For more commercially relevant formats, safety, passive voltage control through cell balancing, and cell formation aspects are discussed. PMID:29910609

  10. Analysis of genetic variation and potential applications in genome-scale metabolic modeling

    DEFF Research Database (Denmark)

    Cardoso, Joao; Andersen, Mikael Rørdam; Herrgard, Markus

    2015-01-01

    Genetic variation is the motor of evolution and allows organisms to overcome the environmental challenges they encounter. It can be both beneficial and harmful in the process of engineering cell factories for the production of proteins and chemicals. Throughout the history of biotechnology..., there have been efforts to exploit genetic variation in our favor to create strains with favorable phenotypes. Genetic variation can either be present in natural populations or it can be artificially created by mutagenesis and selection or adaptive laboratory evolution. On the other hand, unintended genetic... ...scale and resolution by re-sequencing thousands of strains systematically. In this article, we review challenges in the integration and analysis of large-scale re-sequencing data, present an extensive overview of bioinformatics methods for predicting the effects of genetic variants on protein function...

  11. Are we prepared for emerging and re-emerging diseases? Experience and lessons from epidemics that occurred in Tanzania during the last five decades.

    Science.gov (United States)

    Karimuribo, Esron D; Mboera, Leonard E G; Mbugi, Erasto; Simba, Azma; Kivaria, Fredrick M; Mmbuji, Peter; Rweyemamu, Mark M

    2011-12-01

    This paper reviews preparedness for containing and controlling emerging and re-emerging diseases, drawing lessons from disease events that occurred in animal and human populations in the last five decades (1961-2011). A comprehensive analysis based on retrieval and analysis of grey and published literature as well as reported cases was carried out to document the type and trend of occurrence of emerging and re-emerging infectious diseases in different parts of Tanzania. Overall, the majority of diseases reported in the country were viral in nature, followed by bacterial diseases. The trend of occurrence shows a number of newly emerging diseases as well as the re-occurrence of old diseases in both animal (domestic and wild) and human populations. In humans, the major disease epidemics reported in the last five decades include cholera, influenza A H1N1, plague and rubella. In animals, the major epidemic diseases reported were Contagious Bovine Pleuropneumonia, Contagious Caprine Pleuropneumonia, Peste des petits ruminants and Giraffe Ear and Skin Diseases. Some epidemics have been reported in both human and animal populations, including Rift Valley fever and anthrax. The emergence of 'fit-for-purpose' approaches and technologies, such as the discipline of One Health, the use of participatory epidemiology and disease surveillance, and mobile technologies, offers an opportunity for optimal use of limited resources to improve early detection, diagnosis and response to disease events and consequently reduce the impact of such diseases in animal and human populations.

  12. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    Research into early warning systems for large-scale network security incidents is of great significance: such systems can improve a network's emergency response capabilities, alleviate the damage caused by cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a complete topology based on the automatic distribution algorithm of force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
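
    The divide-and-conquer pipeline described above (partition, per-subnetwork layout, global placement) can be sketched as follows. This is an illustrative stand-in only: a trivial chunk partitioner and circular placements replace the MLkP/CR algorithm and the force-based distribution, which are not specified in the abstract.

```python
import math

def circular_layout(nodes, center, radius):
    """Place nodes evenly on a circle around `center`."""
    pos = {}
    for i, n in enumerate(nodes):
        angle = 2 * math.pi * i / max(len(nodes), 1)
        pos[n] = (center[0] + radius * math.cos(angle),
                  center[1] + radius * math.sin(angle))
    return pos

def divide_and_conquer_layout(nodes, k=4, sub_radius=1.0, spread=5.0):
    """Split `nodes` into k chunks; lay out each chunk on its own circle,
    with the chunk centers themselves distributed on a larger circle."""
    chunks = [nodes[i::k] for i in range(k)]  # stand-in for MLkP/CR partitioning
    layout = {}
    for j, chunk in enumerate(chunks):
        angle = 2 * math.pi * j / k
        center = (spread * math.cos(angle), spread * math.sin(angle))
        layout.update(circular_layout(chunk, center, sub_radius))
    return layout

pos = divide_and_conquer_layout(list(range(100)), k=4)
print(len(pos))  # -> 100: every node receives a coordinate
```

    Because each chunk is laid out independently, the per-chunk work parallelizes naturally, which is the property the abstract emphasizes.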

  13. Quasi-potential and Two-Scale Large Deviation Theory for Gillespie Dynamics

    KAUST Repository

    Li, Tiejun

    2016-01-07

    The construction of energy landscapes for bio-dynamics has been attracting more and more attention in recent years. In this talk, I will introduce a strategy to construct the landscape from its connection to rare events, which relies on the large deviation theory for Gillespie-type jump dynamics. In an application to a typical genetic switching model, a two-scale large deviation theory is developed to take into account the fast switching of DNA states. The comparison with other proposals is also discussed. We demonstrate that different diffusive limits arise when considering different regimes for the genetic translation and switching processes.
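
    As a concrete anchor for the Gillespie-type jump dynamics mentioned above, here is a minimal stochastic simulation algorithm (SSA) for a toy birth-death process. The model and rates are illustrative assumptions, not the genetic switching model of the talk.

```python
import math
import random

def gillespie_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0, seed=1):
    """Simulate X -> X+1 at rate k_birth and X -> X-1 at rate k_death*X."""
    random.seed(seed)
    t, x = 0.0, x0
    while t < t_end:
        a_birth = k_birth         # constant production propensity
        a_death = k_death * x     # linear degradation propensity
        a_total = a_birth + a_death
        # waiting time to the next reaction is exponential with rate a_total
        t += -math.log(random.random()) / a_total
        # pick which reaction fires, with probability proportional to its propensity
        if random.random() * a_total < a_birth:
            x += 1
        else:
            x -= 1
    return x

# The stationary copy number fluctuates around k_birth / k_death = 10.
print(gillespie_birth_death())
```

    Large deviation theory for such dynamics characterizes how unlikely excursions of the trajectory away from this mean scale with system size.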

  14. The re-emergence of tuberculosis: what have we learnt from molecular epidemiology?

    NARCIS (Netherlands)

    Borgdorff, M.W.; Soolingen, D. van

    2013-01-01

    Tuberculosis (TB) has re-emerged over the past two decades: in industrialized countries in association with immigration, and in Africa owing to the human immunodeficiency virus epidemic. Drug-resistant TB is a major threat worldwide. The variable and uncertain impact of TB control necessitates not

  15. Emergency planning and preparedness for re-entry of a nuclear powered satellite

    International Nuclear Information System (INIS)

    1996-01-01

    This safety practice report provides a general overview of the management of incidents or emergencies that may be created when nuclear power sources employed in space systems accidentally re-enter the earth's atmosphere and impact on its surface. 8 refs, 4 figs, 7 tabs

  16. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024^3 grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  17. PREFACE PASREG: The 7th International Workshop on the Processing and Applications of Superconducting (RE)BCO Large Grain Materials (Washington DC, 29-31 July 2010)

    Science.gov (United States)

    Freyhardt, Herbert; Cardwell, David; Strasik, Mike

    2010-12-01

    Large grain, (RE)BCO bulk superconductors fabricated by top seeded melt growth (TSMG) are able to generate large magnetic fields compared to conventional, iron-based permanent magnets. Following 20 years of development, these materials are now beginning to realize their considerable potential for a variety of engineering applications such as magnetic separators, flywheel energy storage and magnetic bearings. MgB2 has also continued to emerge as a potentially important bulk superconducting material for engineering applications below 20 K due to its lack of granularity and the ease with which complex shapes of this material can be fabricated. This issue of Superconductor Science and Technology contains a selection of papers presented at the 7th International Workshop on the Processing and Applications of Superconducting (RE)BCO Large Grain Materials, including MgB2, held 29-31 July 2010 at the Omni Shoreham Hotel, Washington DC, USA, to report progress made in this field in the previous three-year period. The workshop followed those held previously in Cambridge, UK (1997), Morioka, Japan (1999), Seattle, USA (2001), Jena, Germany (2003), Tokyo, Japan (2005) and again in Cambridge, UK (2007). The scope of the seventh PASREG workshop was extended to include processing and characterization aspects of the broader spectrum of bulk high temperature superconducting (HTS) materials, including melt-cast Bi-HTS and bulk MgB2, recent developments in the field and innovative applications of bulk HTS. A total of 38 papers were presented at this workshop, of which 30 were presented in oral form and 8 as posters. The organizers wish to acknowledge the efforts of Sue Butler of the University of Houston for her local organization of the workshop. The eighth PASREG workshop will be held in Taiwan in the summer of 2012.

  18. The Emergence of Large-Scale Computer Assisted Summative Examination Facilities in Higher Education

    NARCIS (Netherlands)

    Draaijer, S.; Warburton, W. I.

    2014-01-01

    A case study is presented of VU University Amsterdam where a dedicated large-scale CAA examination facility was established. In the facility, 385 students can take an exam concurrently. The case study describes the change factors and processes leading up to the decision by the institution to

  19. Physiological responses of astronaut candidates to simulated +Gx orbital emergency re-entry.

    Science.gov (United States)

    Wu, Bin; Xue, Yueying; Wu, Ping; Gu, Zhiming; Wang, Yue; Jing, Xiaolu

    2012-08-01

    We investigated astronaut candidates' physiological and pathological responses to +Gx exposure during a simulated emergency return from a running orbit, to advance astronaut +Gx tolerance training and medical support in manned spaceflight. Thirteen male astronaut candidates were exposed to a simulated high +Gx acceleration profile of a spacecraft during an emergency return lasting 230 s. The peak value was 8.5 G. Subjective feelings and symptoms, cardiovascular and respiratory responses, and changes in urine components before, during, and after +Gx exposure were investigated. Under high +Gx exposure, 15.4% of subjects exhibited arrhythmia. Heart rate (HR) increased significantly and four different types of HR response curves were distinguished. The ratio of QT to RR interval on the electrocardiograms was significantly increased. Arterial oxygen saturation (SaO2) declined with increasing G value and then returned gradually. SaO2 reached a minimum (87.7%) at 3 G during the decline phase of the +Gx curve. Respiratory rate increased significantly with increasing G value, while the amplitude and area of the respiratory waves were significantly reduced. An overshoot appeared immediately after +Gx exposure. A few subjects suffered slight injuries, including positive urine protein (1/13), positive urinary occult blood (1/13), and a large area of petechiae on the back (1/13). Astronaut candidates have relatively good tolerance to the +Gx profile of a simulated spacecraft emergency ballistic re-entry. However, a few subjects exhibited adverse physiological responses and slight reversible pathological injuries.

  20. Time-scale and extent at which large-scale circulation modes determine the wind and solar potential in the Iberian Peninsula

    International Nuclear Information System (INIS)

    Jerez, Sonia; Trigo, Ricardo M

    2013-01-01

    The North Atlantic Oscillation (NAO), the East Atlantic (EA) and the Scandinavian (SCAND) modes are the three main large-scale circulation patterns driving the climate variability of the Iberian Peninsula. This study assesses their influence in terms of solar (photovoltaic) and wind power generation potential (SP and WP) and evaluates their skill as predictors. For that we use a hindcast regional climate simulation to retrieve the primary meteorological variables involved, surface solar radiation and wind speed. First we identify that the maximum influence of the various modes occurs in the interannual variations of the monthly mean SP and WP series, being generally more relevant in winter. Second we find that in this time-scale and season, SP (WP) varies up to 30% (40%) with respect to the mean climatology between years with opposite phases of the modes, although the strength and the spatial distribution of the signals differ from one month to another. Last, the skill of a multi-linear regression model (MLRM), built using the NAO, EA and SCAND indices, to reconstruct the original wintertime monthly series of SP and WP was investigated. The reconstructed series (when the MLRM is calibrated for each month individually) correlate with the original ones up to 0.8 at the interannual time-scale. Besides, when the modeled series for each individual month are merged to construct an October-to-March monthly series, and after removing the annual cycle in order to account for monthly anomalies, these correlate 0.65 (0.55) with the original SP (WP) series on average. These values remain fairly stable when the calibration and reconstruction periods differ, thus supporting, up to a point, the predictive potential of the method at the time-scale assessed here. (letter)
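
    The reconstruction step described above amounts to a multi-linear least-squares fit of a monthly potential series on the three circulation indices. The sketch below uses synthetic stand-ins for the NAO/EA/SCAND indices and the solar-potential anomalies; the coefficients and noise level are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 30
# Synthetic winter-month circulation indices (stand-ins for NAO, EA, SCAND)
nao, ea, scand = rng.standard_normal((3, n_years))

# Synthetic "observed" solar-potential anomalies driven by the three modes
sp = 0.5 * nao - 0.3 * ea + 0.2 * scand + 0.1 * rng.standard_normal(n_years)

# Fit SP ~ b0 + b1*NAO + b2*EA + b3*SCAND by ordinary least squares
X = np.column_stack([np.ones(n_years), nao, ea, scand])
coef, *_ = np.linalg.lstsq(X, sp, rcond=None)
reconstructed = X @ coef

# Skill of the MLRM: correlation between reconstructed and original series
r = np.corrcoef(reconstructed, sp)[0, 1]
print(f"correlation: {r:.2f}")
```

    Calibrating such a fit month by month, and then correlating the merged reconstructed series with the observations, mirrors how the study quantifies the predictive skill of the modes.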

  1. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale showed performance equal to other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
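
    The scoring rule described above reduces to counting abnormal findings on three NIHSS-derived items and applying the published cut point of ≥2. The sketch below uses illustrative parameter names, not the paper's exact NIHSS item definitions.

```python
def pass_score(loc_month_age_abnormal: bool,
               gaze_palsy_or_deviation: bool,
               arm_weakness: bool) -> int:
    """Return the PASS score (0-3): one point per abnormal item."""
    return sum([loc_month_age_abnormal, gaze_palsy_or_deviation, arm_weakness])

def predicts_elvo(score: int, cut_point: int = 2) -> bool:
    """Apply the published cut point: PASS >= 2 suggests ELVO."""
    return score >= cut_point

# Example: gaze deviation plus arm weakness -> PASS = 2 -> ELVO suspected
s = pass_score(False, True, True)
print(s, predicts_elvo(s))  # -> 2 True
```

    A rule this simple is exactly what makes the scale usable in the prehospital phase, at the cost of the moderate sensitivity reported above.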

  2. Flat tree-level inflationary potentials in the light of cosmic microwave background and large scale structure data

    CERN Document Server

    Ballesteros, G; Espinosa, J R; de Austri, R Ruiz; Trotta, R

    2008-01-01

    We use cosmic microwave background and large scale structure data to test a broad and physically well-motivated class of inflationary models: those with flat tree-level potentials (typical in supersymmetry). The non-trivial features of the potential arise from radiative corrections which give a simple logarithmic dependence on the inflaton field, making the models very predictive. We also consider a modified scenario with new physics beyond a certain high-energy cut-off showing up as non-renormalizable operators (NRO) in the inflaton field. We find that both kinds of models fit CMB and LSS data remarkably well, with very few free parameters. Besides, a large part of these models naturally predicts a reasonable number of e-folds. A robust feature of these scenarios is the smallness of tensor perturbations (r < 10^{-3}). The NRO case can give a sizeable running of the spectral index while achieving a sufficient number of e-folds. We use Bayesian model comparison tools to assess the relative performance of the...

  3. Emerging economies. Potentials, pledges and fair shares of greenhouse gas reduction

    Energy Technology Data Exchange (ETDEWEB)

    Fekete, Hanna; Hoehne, Niklas; Hagemann, Markus [Ecofys Germany GmbH, Koeln (Germany); Wehnert, Timon; Mersmann, Florian [Wuppertal Institute for Climate, Environment, Energy GmbH (Germany); Vieweg, Marion; Rocha, Marcia; Schaeffer, Michiel; Hare, William [Climate Analytics gGmbH, Berlin (Germany)

    2013-04-15

    Greenhouse gas emissions need to decrease substantially to limit global average temperature to a maximum of 2°C warming above the preindustrial level in 2100. Emerging economies are of increasing importance in this global effort. In this report we assess how ambitious the emission reduction pledges of emerging economies are compared to business-as-usual emissions, the countries' mitigation potential, and their respective efforts based on different equity principles. We also compare the pledges and the identified mitigation potential of emerging economies to a global emissions pathway needed to limit global temperature increase to 2°C. Our assessment includes Brazil, China, India, Mexico, South Africa and South Korea. We find that emerging economies have a substantial impact on future global emission levels, due to their high current levels and high projected growth rates. Also, in most of these countries a large emission reduction potential is available. Action needs to be taken soon to enable full use of this potential by 2020, and most emerging economies will need significant support from developed countries to implement it.

  4. Imprint of thawing scalar fields on the large scale galaxy overdensity

    Science.gov (United States)

    Dinda, Bikash R.; Sen, Anjan A.

    2018-04-01

    We investigate the observed galaxy power spectrum for the thawing class of scalar field models, taking into account various general relativistic corrections that occur on very large scales. We consider the full general relativistic perturbation equations for the matter as well as the dark energy fluid. We form a single autonomous system of equations containing both the background and the perturbed equations of motion, which we subsequently solve for different scalar field potentials. First we study the percentage deviation from the ΛCDM model for different cosmological parameters as well as in the observed galaxy power spectra on different scales in scalar field models for various choices of scalar field potentials. Interestingly, the difference in background expansion results in an enhancement of power relative to ΛCDM on small scales, whereas the inclusion of general relativistic (GR) corrections results in a suppression of power relative to ΛCDM on large scales. This can be useful to distinguish scalar field models from ΛCDM with future optical/radio surveys. We also compare the observed galaxy power spectra for tracking and thawing types of scalar field using some particular choices for the scalar field potentials. We show that thawing and tracking models can have large differences in observed galaxy power spectra on large scales and for smaller redshifts due to different GR effects. But on smaller scales and for larger redshifts, the difference is small and is mainly due to the difference in background expansion.

  5. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  6. Testing Einstein's Gravity on Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chandra

    2011-01-01

    A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.

  7. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring (PHM) for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control System then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  8. Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.

    Science.gov (United States)

    Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong

    2017-10-11

    The development of silicon (Si) materials during the past decades has boosted the prosperity of the modern semiconductor industry. In comparison with bulk-Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention for solar energy applications. To achieve practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of energy conversion devices with high efficiency are prerequisites. This review focuses on recent progress in the large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.

  9. Long-term change of potential evapotranspiration over Southwest China and teleconnections with large-scale climate anomalies

    Science.gov (United States)

    Liu, B.; Chen, X.; Li, Y.; Chen, Z.

    2017-12-01

    Potential evapotranspiration (PET) is a sensitive factor for atmospheric and ecological systems over Southwest China, a region characterized by intensive karst geomorphology and a fragile environment. Based on daily meteorological data from 94 stations during 1961-2013, the spatiotemporal characteristics of PET are analyzed. The changing characteristics of local meteorological factors and large-scale climatic features are also investigated to explain the potential reasons for changing PET. Study results are as follows: (1) The high-value center of PET, with a mean value of 1097 mm/a, is located in the south, mainly resulting from the regional climatic features of higher air temperature (TEM), longer sunshine duration (SSD) and lower relative humidity (RHU); the low-value center of PET, with a mean value of 831 mm/a, is in the northeast, primarily attributed to higher RHU and weaker SSD. (2) Annual PET decreases at -10.04 mm per decade before the year 2000 but increases at 50.65 mm per decade thereafter; the dominant factors of PET change are SSD, RHU and wind speed (WIN), with relative contributions of 33.29%, 25.42% and 22.16%, respectively. (3) The abrupt change of PET in 2000 is strongly dominated by large-scale climatic anomalies. The strengthened 850 hPa geostrophic wind (0.51 m/s per decade), weakened total cloud cover (-2.25% per decade) and 500 hPa water vapor flux (-2.85% per decade) have provided advantageous dynamic, thermal and dry conditions for PET over Southwest China since the start of the 21st century.
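
    The decadal trends quoted above are the kind of quantity obtained from an ordinary least-squares fit of an annual series against time. A minimal sketch with synthetic data (not the study's 94-station record):

```python
# Hypothetical illustration: estimating a decadal trend in annual PET by
# ordinary least squares. The series below is synthetic, declining by
# exactly 1 mm per year, so the recovered trend is -10 mm per decade.

def ols_slope(years, values):
    """Least-squares slope of values against years (units per year)."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1961, 2000))
pet = [1000.0 - 1.0 * (y - 1961) for y in years]

slope_per_decade = ols_slope(years, pet) * 10
print(round(slope_per_decade, 2), "mm per decade")  # -10.0 mm per decade
```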

  10. Collective response to public health emergencies and large-scale disasters: putting hospitals at the core of community resilience.

    Science.gov (United States)

    Paturas, James L; Smith, Deborah; Smith, Stewart; Albanese, Joseph

    2010-07-01

    Healthcare organisations are a critical part of a community's resilience and play a prominent role as the backbone of medical response to natural and manmade disasters. The importance of healthcare organisations, in particular hospitals, remaining operational extends beyond the necessity to sustain uninterrupted medical services for the community in the aftermath of a large-scale disaster. Hospitals are viewed as safe havens where affected individuals go for shelter, food, water and psychosocial assistance, as well as to obtain information about missing family members or learn of impending dangers related to the incident. The ability of hospitals to respond effectively to high-consequence incidents producing a massive arrival of patients that disrupts daily operations requires surge capacity and capability. The activation of hospital emergency support functions provides an approach by which hospitals manage a short-term shortfall of hospital personnel through the reallocation of hospital employees, thereby obviating the reliance on external qualified volunteers for surge capacity and capability. Recent revisions to the Joint Commission's hospital emergency preparedness standard have impelled healthcare facilities to participate actively in community-wide planning, rather than confining planning exclusively to a single healthcare facility, in order to harmonise disaster management strategies and effectively coordinate the allocation of community resources and expertise across all local response agencies.

  11. The Potential of Unmanned Aerial Vehicle for Large Scale Mapping of Coastal Area

    International Nuclear Information System (INIS)

    Darwin, N; Ahmad, A; Zainon, O

    2014-01-01

    Many countries in the tropical region are covered with cloud most of the time; hence, it is difficult to get clear images, especially from high-resolution satellite imagery. Aerial photogrammetry can be used, but most of the time the cloud problem still exists. Today, this problem can be solved using a system known as an unmanned aerial vehicle (UAV), where aerial images can be acquired at low altitude and the system can fly under the cloud. The UAV system can be used in various applications, including mapping of coastal areas. The UAV system is equipped with an autopilot and an automatic method known as autonomous flying that can be utilized for data acquisition. To achieve high-resolution imagery, a compact digital camera of high resolution was used to acquire the aerial images at low altitude. In this study, the UAV system was employed to acquire aerial images of a coastal simulation model at low altitude. From the aerial images, photogrammetric image processing was executed to produce photogrammetric outputs such as a digital elevation model (DEM), contour lines and an orthophoto. In this study, ground control points (GCPs) and check points (CPs) were established using a conventional ground surveying method (i.e., total station). The GCPs are used for exterior orientation in the photogrammetric processes and the CPs for accuracy assessment based on Root Mean Square Error (RMSE). From this study, it was found that the UAV system can be used for large scale mapping of a coastal simulation model with accuracy at the millimeter level. It is anticipated that the same system could be used for large scale mapping of real coastal areas and produce good accuracy. Finally, the UAV system has great potential to be used for various applications that require accurate results or products in limited time and with less manpower.
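
    The accuracy assessment described above reduces to computing the RMSE between check-point coordinates surveyed with the total station and the same points measured on the photogrammetric output. A minimal sketch with invented coordinates:

```python
import math

# Sketch of a check-point accuracy assessment: RMSE between reference
# coordinates (total station) and photogrammetrically measured ones.
# All coordinates here are invented, in metres.

def rmse(observed, reference):
    """Root Mean Square Error over paired (x, y, z) coordinates."""
    sq = [sum((o - r) ** 2 for o, r in zip(obs, ref))
          for obs, ref in zip(observed, reference)]
    return math.sqrt(sum(sq) / len(sq))

surveyed = [(10.000, 20.000, 1.500), (15.000, 25.000, 1.200)]
measured = [(10.001, 20.002, 1.501), (14.999, 25.001, 1.199)]

print(round(rmse(measured, surveyed) * 1000, 2), "mm")  # 2.12 mm
```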

  12. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
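
    The core of such a tool is a user-weighted multi-criteria score evaluated for each candidate site. The toy sketch below shows the idea only; criteria names, weights and values are invented, not the tool's actual inputs:

```python
# Toy multi-criteria siting score: each candidate cell gets a user-weighted
# sum of criteria already normalized to [0, 1] (higher = better for siting).
# Weights and values are hypothetical.

weights = {"solar_resource": 0.5, "grid_distance": 0.3, "habitat_risk": 0.2}

cells = {
    "cell_1": {"solar_resource": 0.9, "grid_distance": 0.4, "habitat_risk": 0.8},
    "cell_2": {"solar_resource": 0.7, "grid_distance": 0.9, "habitat_risk": 0.9},
    "cell_3": {"solar_resource": 0.95, "grid_distance": 0.2, "habitat_risk": 0.3},
}

def score(cell):
    """Weighted sum of a cell's normalized criteria values."""
    return sum(weights[c] * v for c, v in cell.items())

best = max(cells, key=lambda name: score(cells[name]))
print(best, round(score(cells[best]), 2))  # cell_2 0.8
```

    Changing the weights re-ranks the candidates, which is exactly why a user-driven, transparent tool is preferable to fixed published guidelines.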

  13. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances of drawing robust conclusions from subsequent modelling.
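
    The two water-balance checks described above are simple to state: flag a basin when its runoff coefficient Q/P exceeds 1, or when its apparent losses P - Q exceed the potential-evaporation limit. A minimal sketch of such a screen, with invented basin values:

```python
# Pre-modelling data screen, as described above, on invented basins.
# Values are (precipitation P, discharge Q, potential evaporation PET)
# in mm/yr; both checks assume long-term storage change is negligible.

basins = {
    "A": (800.0, 900.0, 600.0),  # Q > P: possible precipitation undercatch
    "B": (800.0, 100.0, 500.0),  # P - Q > PET: water balance cannot close
    "C": (800.0, 400.0, 500.0),  # consistent
}

suspect = {}
for name, (p, q, pet) in basins.items():
    flags = []
    if q / p > 1.0:
        flags.append("runoff coefficient > 1")
    if p - q > pet:
        flags.append("losses exceed PET limit")
    if flags:
        suspect[name] = flags

print(suspect)
```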

  14. A Conceptual Framework for Allocation of Federally Stockpiled Ventilators During Large-Scale Public Health Emergencies.

    Science.gov (United States)

    Zaza, Stephanie; Koonin, Lisa M; Ajao, Adebola; Nystrom, Scott V; Branson, Richard; Patel, Anita; Bray, Bruce; Iademarco, Michael F

    2016-01-01

    Some types of public health emergencies could result in large numbers of patients with respiratory failure who need mechanical ventilation. Federal public health planning has included needs assessment and stockpiling of ventilators. However, additional federal guidance is needed to assist states in further allocating federally supplied ventilators to individual hospitals to ensure that ventilators are shipped to facilities where they can best be used during an emergency. A major consideration in planning is a hospital's ability to absorb additional ventilators, based on available space and staff expertise. A simple pro rata plan that does not take these factors into account might result in suboptimal use or unused scarce resources. This article proposes a conceptual framework that identifies the steps in planning and an important gap in federal guidance regarding the distribution of stockpiled mechanical ventilators during an emergency.
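
    The contrast the authors draw, between a simple pro rata allocation and one constrained by each hospital's capacity to absorb ventilators, can be sketched as follows. This illustrates the concept only, with hypothetical hospitals and numbers, not the proposed federal framework:

```python
# Hypothetical comparison: pro rata allocation of stockpiled ventilators by
# bed count versus an allocation capped by each hospital's absorption
# capacity (space and staff), with leftover units redistributed.

def pro_rata(stock, beds):
    """Allocate proportionally to bed count (exact for these numbers;
    in general, rounding would need a remainder-reconciliation step)."""
    total = sum(beds.values())
    return {h: round(stock * b / total) for h, b in beds.items()}

def capacity_capped(stock, beds, cap):
    alloc = {h: min(n, cap[h]) for h, n in pro_rata(stock, beds).items()}
    remaining = stock - sum(alloc.values())
    # redistribute leftover units to hospitals with the most spare capacity
    for h in sorted(cap, key=lambda h: cap[h] - alloc[h], reverse=True):
        extra = min(remaining, cap[h] - alloc[h])
        alloc[h] += extra
        remaining -= extra
    return alloc

beds = {"H1": 500, "H2": 300, "H3": 200}
cap = {"H1": 40, "H2": 60, "H3": 50}  # ventilators each site can absorb

print(pro_rata(100, beds))             # {'H1': 50, 'H2': 30, 'H3': 20}
print(capacity_capped(100, beds, cap)) # {'H1': 40, 'H2': 40, 'H3': 20}
```

    Under pure pro rata, H1 receives 10 ventilators it cannot staff while H2 has idle capacity; the capped allocation shifts those units where they can be used.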

  15. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  16. Large scale laboratory diffusion experiments in clay rocks

    International Nuclear Information System (INIS)

    Garcia-Gutierrez, M.; Missana, T.; Mingarro, M.; Martin, P.L.; Cormenzana, J.L.

    2005-01-01

    Clay formations are potential host rocks for high-level radioactive waste repositories. In clay materials, radionuclide diffusion is the main transport mechanism. Thus, understanding the diffusion processes and determining diffusion parameters under conditions as similar as possible to the real ones are critical for the performance assessment of a deep geological repository. Diffusion coefficients are mainly measured in the laboratory using small samples, after preparation to fit the diffusion cell. In addition, a few field tests are usually performed to confirm laboratory results and to analyse scale effects. In field or 'in situ' tests the experimental set-up usually includes the injection of a tracer diluted in reconstituted formation water into a packed-off section of a borehole. Both experimental systems may produce artefacts in the determination of diffusion coefficients. In the laboratory, sample preparation can generate structural changes, mainly if the consolidated clay has a layered fabric, and in field tests the introduction of water could modify the properties of the saturated clay in the first few centimeters, just where radionuclide diffusion is expected to take place. In this work, a large-scale laboratory diffusion experiment is proposed, using a large cylindrical sample of consolidated clay that can overcome the above-mentioned problems. The tracers used were mixed with clay obtained by drilling a central hole, re-compacted into the hole at approximately the same density as the consolidated block and finally sealed. Neither additional treatment of the sample nor external monitoring is needed. After the experimental time needed for diffusion to take place (estimated by scoping calculations), the block was sampled to obtain a 3D distribution of the tracer concentration and the results were modelled. An additional advantage of the proposed configuration is that it could be used in 'in situ' tests.
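
    Scoping calculations of the kind mentioned above can be sketched from the classical plane-source solution of Fick's second law, whose Gaussian spread grows as sqrt(2*D*t). The diffusivity and time below are illustrative values typical of compacted clay, not the experiment's parameters:

```python
from math import sqrt, pi, exp

# Illustrative scoping calculation: 1-D diffusion from an instantaneous
# plane source at x = 0 into an infinite medium. D and t are assumed,
# order-of-magnitude values for a conservative tracer in compacted clay.

def conc(x_m, t_s, d_m2s, m0=1.0):
    """Concentration C(x, t) for an instantaneous plane source of mass m0."""
    return m0 / sqrt(4 * pi * d_m2s * t_s) * exp(-x_m**2 / (4 * d_m2s * t_s))

D = 1e-11   # m^2/s, assumed apparent diffusivity
t = 3.15e7  # ~1 year, in seconds

spread_cm = sqrt(2 * D * t) * 100          # characteristic penetration depth
peak_ratio = conc(0.025, t, D) / conc(0.0, t, D)

print(round(spread_cm, 1), "cm penetration in about a year")
print(round(peak_ratio, 2), "of peak concentration at 2.5 cm")
```

    Centimetre-scale penetration over a year is what makes a large block, sampled in 3D after the run, a practical geometry for this kind of experiment.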

  17. Generating descriptive visual words and visual phrases for large-scale image applications.

    Science.gov (United States)

    Zhang, Shiliang; Tian, Qi; Hua, Gang; Huang, Qingming; Gao, Wen

    2011-09-01

    Bag-of-visual Words (BoWs) representation has been applied for various problems in the fields of multimedia and computer vision. The basic idea is to represent images as visual documents composed of repeatable and distinctive visual elements, which are comparable to the text words. Notwithstanding its great success and wide adoption, visual vocabulary created from single-image local descriptors is often shown to be not as effective as desired. In this paper, descriptive visual words (DVWs) and descriptive visual phrases (DVPs) are proposed as the visual correspondences to text words and phrases, where visual phrases refer to the frequently co-occurring visual word pairs. Since images are the carriers of visual objects and scenes, a descriptive visual element set can be composed by the visual words and their combinations which are effective in representing certain visual objects or scenes. Based on this idea, a general framework is proposed for generating DVWs and DVPs for image applications. In a large-scale image database containing 1506 object and scene categories, the visual words and visual word pairs descriptive to certain objects or scenes are identified and collected as the DVWs and DVPs. Experiments show that the DVWs and DVPs are informative and descriptive and, thus, are more comparable with the text words than the classic visual words. We apply the identified DVWs and DVPs in several applications including large-scale near-duplicated image retrieval, image search re-ranking, and object recognition. The combination of DVW and DVP performs better than the state of the art in large-scale near-duplicated image retrieval in terms of accuracy, efficiency and memory consumption. The proposed image search re-ranking algorithm, DWPRank, outperforms the state-of-the-art algorithm by 12.4% in mean average precision and is about 11 times faster.

  18. Development of a generic seed crystal for the fabrication of large grain (RE)-Ba-Cu-O bulk superconductors

    International Nuclear Information System (INIS)

    Shi, Y; Babu, N Hari; Cardwell, D A

    2005-01-01

    The critical current density, Jc, irreversibility field, Birr, and magnetic field trapping ability of (LRE)-Ba-Cu-O bulk superconductors, where LRE is a light rare earth element such as Nd, Sm, Eu or Gd, are generally superior to those of the more common melt-processed Y-Ba-Cu-O (YBCO). The lack of a suitable seed crystal for growing large, single-grain (LRE)-Ba-Cu-O superconductors with controlled orientation, however, has severely hindered the development of these materials for engineering applications over the past ten years. In this communication we report for the first time the development of a generic seed crystal that can be used to fabricate any rare earth (RE) based (RE)-Ba-Cu-O ((RE)BCO) superconductor in the form of a large single grain with controlled orientation. The new seed crystal will potentially enable large-grain (LRE)-Ba-Cu-O bulk superconductors to be fabricated routinely, as is the case for YBCO. This will enable the field trapping and current-carrying characteristics of these materials to be explored in more detail than has been possible to date. (rapid communication)

  19. Role of India's wildlife in the emergence and re-emergence of zoonotic pathogens, risk factors and public health implications.

    Science.gov (United States)

    Singh, B B; Gajadhar, A A

    2014-10-01

    Evolving land use practices have led to an increase in interactions at the human/wildlife interface. The presence and poor knowledge of zoonotic pathogens in India's wildlife and the occurrence of enormous human populations interfacing with, and critically linked to, forest ecosystems warrant attention. Factors such as diverse migratory bird populations, climate change, expanding human population and shrinking wildlife habitats play a significant role in the emergence and re-emergence of zoonotic pathogens from India's wildlife. The introduction of a novel Kyasanur Forest disease virus (family Flaviviridae) into human populations in 1957 and subsequent occurrence of seasonal outbreaks illustrate the key role that India's wild animals play in the emergence and re-emergence of zoonotic pathogens. Other high priority zoonotic diseases of wildlife origin which could affect both livestock and humans include influenza, Nipah, Japanese encephalitis, rabies, plague, leptospirosis, anthrax and leishmaniasis. Continuous monitoring of India's extensively diverse and dispersed wildlife is challenging, but their use as indicators should facilitate efficient and rapid disease-outbreak response across the region and occasionally the globe. Defining and prioritizing research on zoonotic pathogens in wildlife are essential, particularly in a multidisciplinary one-world, one-health approach which includes human and veterinary medical studies at the wildlife-livestock-human interfaces. This review indicates that wild animals play an important role in the emergence and re-emergence of zoonotic pathogens and provides brief summaries of the zoonotic diseases that have occurred in wild animals in India. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Leveraging Subsidence in Permafrost with Remotely Sensed Active Layer Thickness (ReSALT) Products

    Science.gov (United States)

    Schaefer, K. M.; Chen, A.; Chen, J.; Chen, R. H.; Liu, L.; Michaelides, R. J.; Moghaddam, M.; Parsekian, A.; Tabatabaeenejad, A.; Thompson, J. A.; Zebker, H. A.; Meyer, F. J.

    2017-12-01

    The Remotely Sensed Active Layer Thickness (ReSALT) product uses the Interferometric Synthetic Aperture Radar (InSAR) technique to measure ground subsidence in permafrost regions. Seasonal subsidence results from the expansion of soil water into ice as the surface soil or active layer freezes and thaws each year. Subsidence trends result from large-scale thaw of permafrost and from the melting and subsequent drainage of excess ground ice in permafrost-affected soils. The attached figure shows the 2006-2010 average seasonal subsidence from ReSALT around Barrow, Alaska. The average active layer thickness (the maximum surface thaw depth during summer) is 30-40 cm, resulting in an average seasonal subsidence of 1-3 cm. Analysis of the seasonal subsidence and subsidence trends provides valuable insights into important permafrost processes, such as the freeze/thaw of the active layer, large-scale thawing due to climate change, the impact of fire, and infrastructure vulnerability. ReSALT supports the Arctic-Boreal Vulnerability Experiment (ABoVE) field campaign in Alaska and northwest Canada and is a precursor for a potential NASA-ISRO Synthetic Aperture Radar (NISAR) product. ReSALT includes uncertainties for all parameters and is validated against in situ measurements from the Circumpolar Active Layer Monitoring (CALM) network, Ground Penetrating Radar and mechanical probe measurements. Here we present examples of ReSALT products in Alaska to highlight the untapped potential of the InSAR technique to understand permafrost dynamics, with a strong emphasis on the underlying processes that drive the subsidence.
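
    Why 30-40 cm of thaw yields only 1-3 cm of seasonal subsidence follows from the roughly 9% density difference between ice and water. The back-of-envelope sketch below is our illustration of that relationship, not the ReSALT retrieval algorithm; the porosity and saturation values are assumptions:

```python
# Back-of-envelope link between active layer thickness (ALT) and seasonal
# subsidence: when pore ice in the thawing layer melts to denser water,
# the column settles. Porosity and saturation below are assumed values.

RHO_WATER = 1000.0  # kg/m^3
RHO_ICE = 917.0

def seasonal_subsidence(alt_m, porosity, saturation=1.0):
    """Settlement (m) when pore ice in a thawed layer melts to water."""
    ice_fraction = porosity * saturation
    return alt_m * ice_fraction * (RHO_WATER - RHO_ICE) / RHO_WATER

sub = seasonal_subsidence(alt_m=0.35, porosity=0.5)
print(round(sub * 100, 2), "cm")  # 1.45 cm, within the observed 1-3 cm range
```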

  1. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
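
    The scaling problem with contention-based random access can be illustrated with the classic slotted-ALOHA model (our illustration, not the article's experiments): even at the optimal attempt probability p = 1/N, the chance that a slot carries exactly one transmission is capped near 1/e, so each device's share of successful access opportunities shrinks roughly as 1/N:

```python
# Slotted-ALOHA sketch of the random access bottleneck: N devices each
# transmit in a slot with probability p; a slot succeeds only if exactly
# one device transmits.

def success_prob(n, p):
    """Probability that a given slot carries exactly one transmission."""
    return n * p * (1 - p) ** (n - 1)

for n in (10, 100, 1000):
    p = 1.0 / n  # optimal per-device attempt probability
    # aggregate throughput stays near 1/e ~ 0.37 as N grows, so the
    # per-device success rate collapses roughly as 1/(N*e)
    print(n, round(success_prob(n, p), 3))
```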

  2. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
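
    The combinatorial explosion mentioned above is easy to quantify in the simplest setting: with n tracks and n detections, the number of one-to-one association hypotheses is n! (real multi-hypothesis trackers branch further over missed detections and clutter, which is worse):

```python
from math import factorial

# Number of one-to-one data-association hypotheses for n tracks and n
# detections in a single frame: n!. This is why hypothesis pruning and
# gating are essential as target counts grow.

for n in (5, 10, 15):
    print(n, factorial(n))
```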

  3. European Wintertime Windstorms and its Links to Large-Scale Variability Modes

    Science.gov (United States)

    Befort, D. J.; Wild, S.; Walz, M. A.; Knight, J. R.; Lockwood, J. F.; Thornton, H. E.; Hermanson, L.; Bett, P.; Weisheimer, A.; Leckebusch, G. C.

    2017-12-01

    Winter storms associated with extreme wind speeds and heavy precipitation are the most costly natural hazard in several European countries. Improved understanding and seasonal forecast skill of winter storms will thus help society, policy-makers and the (re)insurance industry to be better prepared for such events. We first assess the ability of three seasonal forecast ensemble suites to represent extra-tropical windstorms over the Northern Hemisphere: ECMWF System3, ECMWF System4 and GloSea5. Our results show significant skill for inter-annual variability of windstorm frequency over parts of Europe in two of these forecast suites (ECMWF-S4 and GloSea5), indicating the potential use of current seasonal forecast systems. In a regression model we further derive windstorm variability using the forecasted NAO from the seasonal model suites, thus estimating the suitability of the NAO as the only predictor. We find that the NAO, as the main large-scale mode over Europe, can explain some of the achieved skill and is therefore an important source of variability in the seasonal models. However, our results show that the regression model fails to reproduce the skill level of the directly forecast windstorm frequency over large areas of central Europe. This suggests that the seasonal models also capture sources of windstorm variability/predictability other than the NAO. In order to investigate which other large-scale variability modes steer the interannual variability of windstorms, we develop a statistical model using a Poisson GLM. We find that the Scandinavian Pattern (SCA) in fact explains a larger amount of variability for Central Europe during the 20th century than the NAO. This statistical model is able to skilfully reproduce the interannual variability of windstorm frequency, especially for the British Isles and Central Europe, with correlations up to 0.8.
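
    A Poisson GLM of the kind described, with storm counts regressed on a large-scale index through a log link, can be fitted by Newton-Raphson on the log-likelihood. The sketch below is self-contained and uses synthetic data placed exactly on the model surface (rate = exp(0.5 + 0.8 * index)) so the fitted coefficients are checkable; it is not the authors' NAO/SCA data or model:

```python
from math import exp, log

# Minimal Poisson regression (log link) with one predictor, fitted by
# Newton-Raphson. x plays the role of a large-scale index; y are synthetic
# "storm counts" placed exactly at rate = exp(0.5 + 0.8 * x).

x = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
y = [exp(0.5 + 0.8 * xi) for xi in x]

b0, b1 = log(sum(y) / len(y)), 0.0  # standard starting point: log mean rate
for _ in range(50):
    lam = [exp(b0 + b1 * xi) for xi in x]
    # score vector of the Poisson log-likelihood
    g0 = sum(yi - li for yi, li in zip(y, lam))
    g1 = sum((yi - li) * xi for yi, li, xi in zip(y, lam, x))
    # 2x2 observed information, inverted in closed form
    h00 = sum(lam)
    h01 = sum(li * xi for li, xi in zip(lam, x))
    h11 = sum(li * xi * xi for li, xi in zip(lam, x))
    det = h00 * h11 - h01 * h01
    b0 += (h11 * g0 - h01 * g1) / det
    b1 += (h00 * g1 - h01 * g0) / det

print(round(b0, 3), round(b1, 3))  # 0.5 0.8
```

    In practice one would add further predictors (e.g. a second index such as the SCA) by extending the design matrix and using a library fitter rather than the hand-rolled Newton step.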

  4. Entropy Production of Emerging Turbulent Scales in a Temporal Supercritical N-Neptane/Nitrogen Three-Dimensional Mixing Layer

    Science.gov (United States)

    Bellan, J.; Okongo, N.

    2000-01-01

    A study of the entropy production of emerging turbulent scales is conducted for a supercritical shear layer, as a precursor to the eventual modeling of Subgrid Scales (from a turbulent state) leading to Large Eddy Simulations.

  5. EEG potentials predict upcoming emergency brakings during simulated driving

    Science.gov (United States)

    Haufe, Stefan; Treder, Matthias S.; Gugler, Manfred F.; Sagebaum, Max; Curio, Gabriel; Blankertz, Benjamin

    2011-10-01

    Emergency braking assistance has the potential to prevent a large number of car crashes. State-of-the-art systems operate in two stages. Basic safety measures are adopted once external sensors indicate a potential upcoming crash. If further activity at the brake pedal is detected, the system automatically performs emergency braking. Here, we present the results of a driving simulator study indicating that the driver's intention to perform emergency braking can be detected based on muscle activation and cerebral activity prior to the behavioural response. Identical levels of predictive accuracy were attained using electroencephalography (EEG), which worked more quickly than electromyography (EMG), and using EMG, which worked more quickly than pedal dynamics. A simulated assistance system using EEG and EMG was found to detect emergency brakings 130 ms earlier than a system relying only on pedal responses. At 100 km/h driving speed, this amounts to reducing the braking distance by 3.66 m. This result motivates a neuroergonomic approach to driving assistance. Our EEG analysis yielded a characteristic event-related potential signature that comprised components related to the sensory registration of a critical traffic situation, mental evaluation of the sensory percept and motor preparation. While all these components should occur often during normal driving, we conjecture that it is their characteristic spatio-temporal superposition in emergency braking situations that leads to the considerable prediction performance we observed.
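
    The braking-distance figure above is straightforward kinematics: the distance saved equals the driving speed times the detection lead time. With the rounded 130 ms lead time this gives about 3.61 m; the reported 3.66 m presumably reflects the unrounded detection times:

```python
# Distance saved by earlier braking detection: speed (converted from km/h
# to m/s) multiplied by the detection lead time in seconds.

def distance_saved(speed_kmh, lead_time_s):
    return speed_kmh / 3.6 * lead_time_s

d = distance_saved(100.0, 0.130)
print(round(d, 2), "m")  # 3.61 m
```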

  6. Nuclear emergency planning and response in the Netherlands: Experiences obtained from large scale exercises

    International Nuclear Information System (INIS)

    Smetsers, R.C.G.M.; Pruppers, M.J.M.; Sonderen, J.F. van

    2000-01-01

    In 1986 the Chernobyl accident led the Dutch Government to a reconsideration of their possibilities for managing nuclear emergencies. It was decided to improve both the national emergency management organization and the infrastructure for collecting and presenting technical information. The first improvement resulted in the National Plan for Nuclear Emergency Planning and Response (EPR) and the second in a series of technical facilities for the assessment of radiation doses. Since 1990, following the implementation of the EPR and most of the technical facilities, several emergency exercises have taken place to test the effectiveness of organization and infrastructure. Special emphasis has been given to the early phase of the simulated accidents. This paper summarises the experiences obtained from these exercises. Major obstacles appear to be: (1) keeping all participants properly informed during the process, (2) the difference in working attitude of technical experts and decision-makers, (3) premature orders for countermeasures and (4) the (too) large number of people involved in the decision-making process. From these experiences requirements for instruments can be deduced. Such instruments include predictive models, to be used for dose assessment in the early phase of an accident which, apart from being fast, should yield uncomplicated results suitable for decision-makers. Refinements of models, such as taking into account the specific nature of the (urban) environment, are not needed until the recovery phase of a nuclear accident. (author)

  7. Large-scale bioenergy production from soybeans and switchgrass in Argentina: Part A: Potential and economic feasibility for national and international markets

    NARCIS (Netherlands)

    van Dam, J.; Faaij, A.P.C.; Hilbert, J.; Petruzzi, H.; Turkenburg, W.C.

    2009-01-01

    This study focuses on the economic feasibility for large-scale biomass production from soybeans or switchgrass from a region in Argentina. This is determined, firstly, by estimating whether the potential supply of biomass, when food and feed demand are met, is sufficient under different scenarios to

  8. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  9. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  10. A psychometric evaluation of the Pediatric Anesthesia Emergence Delirium scale.

    Science.gov (United States)

    Ringblom, Jenny; Wåhlin, Ingrid; Proczkowska, Marie

    2018-04-01

    Emergence delirium and emergence agitation have been a subject of interest since the early 1960s. This behavior has been associated with increased risk of injury in children and dissatisfaction with anesthesia care in their parents. The Pediatric Anesthesia Emergence Delirium scale is a commonly used instrument for codifying and recording this behavior. The aim of this study was to psychometrically evaluate the Pediatric Anesthesia Emergence Delirium scale, focusing on the factor structure, in a sample of children recovering from anesthesia after surgery or diagnostic procedures. The reliability of the Pediatric Anesthesia Emergence Delirium scale was also tested. One hundred and twenty-two children younger than seven years were observed at postoperative care units during recovery from anesthesia. Two or three observers independently assessed the children using the Pediatric Anesthesia Emergence Delirium scale. The factor analysis clearly revealed a one-factor solution, which accounted for 82% of the variation in the data. Internal consistency, calculated with Cronbach's alpha, was good (0.96). The Intraclass Correlation Coefficient, which was used to assess interrater reliability for the Pediatric Anesthesia Emergence Delirium scale sum score, was 0.97. These results support the use of the Pediatric Anesthesia Emergence Delirium scale for assessing emergence delirium in children recovering from anesthesia after surgery or diagnostic procedures. The kappa statistics for the Pediatric Anesthesia Emergence Delirium scale items essentially indicated good agreement between independent raters, supporting interrater reliability. © 2018 John Wiley & Sons Ltd.
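
The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute from an observations × items score matrix; a minimal sketch (the toy data are illustrative, not the study's scores):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (observations x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items
    sum_item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of the sum score
    return k / (k - 1) * (1.0 - sum_item_var / total_var)

# Perfectly correlated items give alpha = 1; weakly related items push it lower.
x = np.arange(10.0)
print(round(cronbach_alpha(np.column_stack([x, x, x])), 3))  # 1.0
```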

  11. Diffusion Experiments in Opalinus Clay: Laboratory, Large-Scale Diffusion Experiments and Microscale Analysis by RBS.

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Gutierrez, M.; Alonso de los Rios, U.; Missana, T.; Cormenzana, J.L.; Mingarro, M.; Morejon, J.; Gil, P.

    2008-08-06

    The Opalinus Clay (OPA) formation in the Zürcher Weinland (Switzerland) is a potential host rock for a repository for high-level radioactive waste. Samples collected in the Mont Terri Underground Rock Laboratory (URL), where the OPA formation is located at a depth between -200 and -300 m below the surface, were used to study radionuclide diffusion in clay materials. Classical laboratory assays and a novel experimental set-up for large-scale diffusion experiments were combined with a novel application of the nuclear ion beam technique Rutherford Backscattering Spectrometry (RBS), to understand the transport properties of the OPA and to enhance the methodologies used for in situ diffusion experiments. Through-diffusion and in-diffusion conventional laboratory experiments were carried out with HTO, ³⁶Cl⁻, I⁻, ²²Na, ⁷⁵Se, ⁸⁵Sr, ²³³U, ¹³⁷Cs, ⁶⁰Co and ¹⁵²Eu. Large-scale diffusion experiments were performed with HTO, ³⁶Cl and ⁸⁵Sr, and new experiments with ⁶⁰Co, ¹³⁷Cs and ¹⁵²Eu are ongoing. Diffusion experiments with the RBS technique were done with Sr, Re, U and Eu. (Author) 38 refs.
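
Through-diffusion experiments of the kind described are typically evaluated with Fick's first law once the cell reaches steady state. A minimal sketch; the effective diffusion coefficient and geometry below are illustrative assumptions, not measurements from the OPA programme:

```python
# Steady-state through-diffusion: flux across a clay sample of thickness L is
# J = De * (C_high - C_low) / L (Fick's first law, per unit area).
def steady_flux(d_e: float, c_high: float, c_low: float, thickness_m: float) -> float:
    """Steady-state diffusive flux through a sample (per unit area)."""
    return d_e * (c_high - c_low) / thickness_m

# e.g. an effective diffusion coefficient of order 1e-11 m^2/s (assumed, of the
# order expected for HTO in clay) across a 1 cm sample with a unit
# concentration difference:
flux = steady_flux(1e-11, 1.0, 0.0, 0.01)
print(flux)
```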

  12. The Potential and Utilization of Unused Energy Sources for Large-Scale Horticulture Facility Applications under Korean Climatic Conditions

    Directory of Open Access Journals (Sweden)

    In Tak Hyun

    2014-07-01

    Full Text Available As the use of fossil fuel has increased, not only in construction but also in agriculture, due to the drastic industrial development in recent times, the problems of heating costs and global warming are getting worse. Therefore, the introduction of more reliable and environmentally-friendly alternative energy sources has become urgent, and the same trend is found in large-scale horticulture facilities. In this study, among many alternative energy sources, we investigated the reserves and the potential of various unused energy sources which have infinite potential but are nowadays wasted due to limitations in their utilization. In addition, we utilized available unused energy as a heat source for a heat pump in a large-scale horticulture facility and analyzed its feasibility through EnergyPlus simulation modeling. Accordingly, the discharge flow rate from the Fan Coil Unit (FCU) in the horticulture facility, the discharge air temperature, and the return temperature were analyzed. The performance and heat consumption of each heat source were compared with those of conventional boilers. The result showed that the power load of the heat pump decreased, and thus the heat efficiency increased, as the temperature of the heat source increased. Among the analyzed heat sources, power plant waste heat, which had the highest heat source temperature, consumed the least electric energy and showed the highest efficiency.
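
The reported trend (a warmer heat source lowers the heat pump's power load) is bounded by the ideal Carnot COP. A minimal sketch; the 45 °C supply temperature and the source temperatures are assumptions for illustration, not values from the study:

```python
# Ideal (Carnot) heating COP of a heat pump:
# COP = T_supply / (T_supply - T_source), temperatures in kelvin.
def carnot_cop_heating(t_source_c: float, t_supply_c: float) -> float:
    t_source = t_source_c + 273.15
    t_supply = t_supply_c + 273.15
    return t_supply / (t_supply - t_source)

# Warmer heat sources raise the ceiling on achievable COP, which is consistent
# with the highest-temperature source (power plant waste heat) consuming the
# least electric energy.
for t_src in (5, 15, 30):  # e.g. river water, sewage, power-plant waste heat
    print(t_src, round(carnot_cop_heating(t_src, 45.0), 1))
```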

  13. An improved method to characterise the modulation of small-scale turbulence by large-scale structures

    Science.gov (United States)

    Agostini, Lionel; Leschziner, Michael; Gaitonde, Datta

    2015-11-01

    A key aspect of turbulent boundary layer dynamics is ``modulation,'' which refers to the degree to which coherent large-scale structures (LS) amplify or attenuate the intensity of the small-scale structures (SS) through large-scale linkage. In order to identify the variation of the amplitude of the SS motion, the envelope of the fluctuations needs to be determined. Mathis et al. (2009) proposed defining this envelope by low-pass filtering the modulus of the analytic signal built from the Hilbert transform of the SS. The validity of this definition, as a basis for quantifying the modulated SS signal, is re-examined on the basis of DNS data for a channel flow. The analysis shows that the modulus of the analytic signal is very sensitive to the skewness of its PDF, which depends, in turn, on the sign of the LS fluctuation and thus on whether these fluctuations are associated with sweeps or ejections. The conclusion is that generating an envelope by use of a low-pass filtering step leads to an important loss of information associated with the effects of the local skewness of the PDF of the SS on the modulation process. An improved Hilbert-transform-based method is proposed to characterize the modulation of SS turbulence by LS structures.
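
The envelope construction attributed to Mathis et al. (2009) can be sketched on a synthetic amplitude-modulated signal (an illustrative toy, not the channel-flow DNS data):

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

# Synthetic toy signal: a small-scale (SS) 80 Hz carrier whose amplitude is
# modulated by a large-scale (LS) 2 Hz fluctuation.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
large = np.sin(2 * np.pi * 2 * t)
small = (1 + 0.5 * large) * np.sin(2 * np.pi * 80 * t)

# Envelope in the style of Mathis et al. (2009): modulus of the analytic
# signal (via the Hilbert transform), then low-pass filtered.
envelope = np.abs(hilbert(small))
b, a = butter(4, 5.0 / (fs / 2))  # 4th-order Butterworth, 5 Hz cutoff
envelope_lp = filtfilt(b, a, envelope)

# The filtered envelope tracks the imposed LS modulation, 1 + 0.5*large.
```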

  14. Two-loop scale-invariant scalar potential and quantum effective operators

    CERN Document Server

    Ghilencea, D.M.

    2016-11-29

    Spontaneous breaking of quantum scale invariance may provide a solution to the hierarchy and cosmological constant problems. In a scale-invariant regularization, we compute the two-loop potential of a higgs-like scalar $\\phi$ in theories in which scale symmetry is broken only spontaneously by the dilaton ($\\sigma$). Its vev $\\langle\\sigma\\rangle$ generates the DR subtraction scale ($\\mu\\sim\\langle\\sigma\\rangle$), which avoids the explicit scale symmetry breaking by traditional regularizations (where $\\mu$=fixed scale). The two-loop potential contains effective operators of non-polynomial nature as well as new corrections, beyond those obtained with explicit breaking ($\\mu$=fixed scale). These operators have the form: $\\phi^6/\\sigma^2$, $\\phi^8/\\sigma^4$, etc, which generate an infinite series of higher dimensional polynomial operators upon expansion about $\\langle\\sigma\\rangle\\gg \\langle\\phi\\rangle$, where such hierarchy is arranged by {\\it one} initial, classical tuning. These operators emerge at the quantum...
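
As an illustration of the expansion mentioned in the abstract, the non-polynomial operator $\phi^6/\sigma^2$, expanded about the dilaton vev with $\sigma=\langle\sigma\rangle+\delta\sigma$, generates an infinite series of polynomial higher-dimensional operators:

```latex
\frac{\phi^{6}}{\sigma^{2}}
  = \frac{\phi^{6}}{\langle\sigma\rangle^{2}}
    \left(1+\frac{\delta\sigma}{\langle\sigma\rangle}\right)^{-2}
  = \frac{\phi^{6}}{\langle\sigma\rangle^{2}}
    \left(1-2\,\frac{\delta\sigma}{\langle\sigma\rangle}
      +3\,\frac{\delta\sigma^{2}}{\langle\sigma\rangle^{2}}-\cdots\right),
\qquad \langle\sigma\rangle \gg \langle\phi\rangle .
```

The hierarchy $\langle\sigma\rangle\gg\langle\phi\rangle$ keeps successive terms suppressed, which is the single classical tuning the abstract refers to.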

  15. Large-area perovskite nanowire arrays fabricated by large-scale roll-to-roll micro-gravure printing and doctor blading

    Science.gov (United States)

    Hu, Qiao; Wu, Han; Sun, Jia; Yan, Donghang; Gao, Yongli; Yang, Junliang

    2016-02-01

    Organic-inorganic hybrid halide perovskite nanowires (PNWs) show great potential applications in electronic and optoelectronic devices such as solar cells, field-effect transistors and photodetectors. It is very meaningful to fabricate ordered, large-area PNW arrays and thereby greatly accelerate their applications and commercialization in electronic and optoelectronic devices. Herein, highly oriented and ultra-long methylammonium lead iodide (CH3NH3PbI3) PNW array thin films were fabricated by large-scale roll-to-roll (R2R) micro-gravure printing and doctor blading in ambient environments (humidity ~45%, temperature ~28 °C), which produced PNW lengths as long as 15 mm. Furthermore, photodetectors based on these PNWs were successfully fabricated on both silicon oxide (SiO2) and flexible polyethylene terephthalate (PET) substrates and showed moderate performance. This study provides low-cost, large-scale techniques to fabricate large-area PNW arrays with great potential applications in flexible electronic and optoelectronic devices.

  16. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regards to large-scale projects (LS-RE). However EIA laws have also been neglected by many decision-makers who have been underestimating its impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments momentarily have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper attempts to assess how such reform efforts can act as a means to support the balancing of environmental protection and climate change mitigation with socio-economic challenges. Thereby this paper fills this intellectual void by identifying the strengths and weaknesses of the Japanese EIA law by contrasting it with the recently revised EIA Directive of the European Union (EU). This enables the identification of the regulatory provisions that impact RE development the most and the determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through the application of a mixed-methods approach, consisting of in-depth literary and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies in combination with the views of EIA professionals, academics and law- and policymakers, allowed for a more comprehensive assessment of what streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European

  17. Uncovering Nature’s 100 TeV Particle Accelerators in the Large-Scale Jets of Quasars

    Science.gov (United States)

    Georganopoulos, Markos; Meyer, Eileen; Sparks, William B.; Perlman, Eric S.; Van Der Marel, Roeland P.; Anderson, Jay; Sohn, S. Tony; Biretta, John A.; Norman, Colin Arthur; Chiaberge, Marco

    2016-04-01

    Since the first jet X-ray detections sixteen years ago, the adopted paradigm for the X-ray emission has been the IC/CMB model, which requires highly relativistic (Lorentz factors of 10-20), extremely powerful (sometimes super-Eddington) kpc-scale jets. I will discuss recently obtained strong evidence, from two different avenues, IR to optical polarimetry for PKS 1136-135 and gamma-ray observations for 3C 273 and PKS 0637-752, ruling out the IC/CMB model. Our work constrains the jet Lorentz factors to less than ~a few, and leaves as the only reasonable alternative synchrotron emission from ~100 TeV jet electrons, accelerated hundreds of kpc away from the central engine. This refutes over a decade of work on the jet X-ray emission mechanism and overall energetics and, if confirmed in more sources, it will constitute a paradigm shift in our understanding of powerful large-scale jets and their role in the universe. Two important findings emerging from our work will also be discussed: (i) the solid angle-integrated luminosity of the large-scale jet is comparable to that of the jet core, contrary to the current belief that the core is the dominant jet radiative outlet, and (ii) the large-scale jets are the main source of TeV photons in the universe, something potentially important, as TeV photons have been suggested to heat up the intergalactic medium and reduce the number of dwarf galaxies formed.

  18. Recent developments in large-scale ozone generation with dielectric barrier discharges

    Science.gov (United States)

    Lopez, Jose L.

    2014-10-01

    Large-scale ozone generation for industrial applications has been entirely based on the creation of microplasmas or microdischarges created using dielectric barrier discharge (DBD) reactors. Although versions of DBD-generated ozone have been in continuous use for over a hundred years, especially in water treatment, recent changes in environmental awareness and sustainability have led to a surge of ozone generating facilities throughout the world. As a result of this enhanced global usage of this environmental cleaning application, various new discoveries have emerged in the science and technology of ozone generation. This presentation will describe some of the most recent breakthrough developments in large-scale ozone generation while further addressing some of the current scientific and engineering challenges of this technology.

  19. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither possesses the political system's competence to make decisions, nor can it judge successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organization form, (3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  20. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    Science.gov (United States)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially reinstated the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large-scale

  1. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Martin, C D; Chandler, N A; Brown, Anton

    1994-09-01

    The potential for large-scale fracturing (>10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition because σ₃ is near vertical any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs.
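
The report's argument can be illustrated with the standard Mode I stress-intensity criterion. This is a textbook sketch with an assumed toughness value, not data from the report:

```python
import math

# Textbook Mode I criterion for a through crack of half-length a under a
# far-field stress sigma (tension positive): K_I = Y * sigma * sqrt(pi * a).
# The crack can only propagate if K_I exceeds the fracture toughness K_Ic.
def mode_i_stress_intensity(sigma_mpa: float, half_len_m: float, y: float = 1.0) -> float:
    return y * sigma_mpa * math.sqrt(math.pi * half_len_m)

K_IC = 1.5  # assumed toughness for a granitic rock, MPa*sqrt(m) (illustrative)

# A compressive far-field stress (sigma < 0) gives no Mode I driving force ...
print(mode_i_stress_intensity(-15.0, 0.05) > K_IC)  # False
# ... while a tensile stress of the same magnitude would propagate the crack.
print(mode_i_stress_intensity(15.0, 0.05) > K_IC)   # True
```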

  2. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    International Nuclear Information System (INIS)

    Martin, C.D.; Chandler, N.A.; Brown, Anton.

    1994-09-01

    The potential for large-scale fracturing (>10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition because σ₃ is near vertical any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs.

  3. Automatic Generation of Connectivity for Large-Scale Neuronal Network Models through Structural Plasticity.

    Science.gov (United States)

    Diaz-Pier, Sandra; Naveau, Mikaël; Butz-Ostendorf, Markus; Morrison, Abigail

    2016-01-01

    With the emergence of new high performance computation technology in the last decade, the simulation of large scale neural networks which are able to reproduce the behavior and structure of the brain has finally become an achievable target of neuroscience. Due to the number of synaptic connections between neurons and the complexity of biological networks, most contemporary models have manually defined or static connectivity. However, it is expected that modeling the dynamic generation and deletion of the links among neurons, locally and between different regions of the brain, is crucial to unravel important mechanisms associated with learning, memory and healing. Moreover, for many neural circuits that could potentially be modeled, activity data is more readily and reliably available than connectivity data. Thus, a framework that enables networks to wire themselves on the basis of specified activity targets can be of great value in specifying network models where connectivity data is incomplete or has large error margins. To address these issues, in the present work we present an implementation of a model of structural plasticity in the neural network simulator NEST. In this model, synapses consist of two parts, a pre- and a post-synaptic element. Synapses are created and deleted during the execution of the simulation following local homeostatic rules until a mean level of electrical activity is reached in the network. We assess the scalability of the implementation in order to evaluate its potential usage in the self generation of connectivity of large scale networks. We show and discuss the results of simulations on simple two population networks and more complex models of the cortical microcircuit involving 8 populations and 4 layers using the new framework.
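
The homeostatic rule described above (synapses are created and deleted until a target level of electrical activity is reached) can be caricatured in a few lines. This is a toy sketch of the principle, not the NEST implementation; the rates, gains and the linear activity model are all assumptions:

```python
# Toy homeostatic structural-plasticity loop: a neuron grows synaptic elements
# while its activity is below a target rate and retracts them when above, so
# connectivity self-organizes until the target activity is reached.
TARGET_RATE = 5.0   # desired mean activity (assumed units)
GROWTH_NU = 0.1     # growth-rule gain (assumed)

activity, elements = 0.0, 0.0
for _ in range(200):
    elements += GROWTH_NU * (TARGET_RATE - activity)  # dz/dt = nu * (target - a)
    elements = max(elements, 0.0)                     # element counts stay non-negative
    activity = 0.8 * elements  # toy assumption: activity scales with connectivity

print(round(activity, 2))  # converges to the target rate: 5.0
```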

  4. Measles, One of the Re-emerging Diseases

    Directory of Open Access Journals (Sweden)

    Zeynep Türe

    2016-03-01

    Full Text Available Objective: The aim of this study is to highlight measles, a highly contagious, re-emerging viral illness that may cause severe complications in susceptible populations. Methods: This retrospective study was conducted on patients diagnosed with measles in the Department of Infectious Diseases, Erciyes University Hospital, between January 2013 and February 2014. The diagnosis of measles was confirmed by measles-specific immunoglobulin M (IgM) antibody positivity in serum samples. Results: Nine patients were included in the study. Three patients had a co-morbid condition: hematopoietic stem cell transplantation, pregnancy and diabetes mellitus. Four of the patients had hepatitis and one had pneumonia as a complication. Conclusion: Susceptible populations, especially immunocompromised people, are still at risk of measles. Adherence to universal vaccination programs is decisive in preventing outbreaks. J Microbiol Infect Dis 2016;6(1): 19-22

  5. The Eruption of a Small-scale Emerging Flux Rope as the Driver of an M-class Flare and of a Coronal Mass Ejection

    Energy Technology Data Exchange (ETDEWEB)

    Yan, X. L.; Xue, Z. K.; Wang, J. C.; Yang, L. H.; Kong, D. F. [Yunnan Observatories, Chinese Academy of Sciences, 396 Yangfangwang, Guandu District, Kunming 650216, Yunnan (China); Jiang, C. W. [Institute of Space Science and Applied Technology, Harbin Institute of Technology, Shenzhen, 5180055 (China); Priest, E. R. [Mathematics Institute, University of St Andrews, St Andrews, KY16 9SS (United Kingdom); Cao, W. D. [Big Bear Solar Observatory, 40386 North Shore Lane, Big Bear City, CA 92314 (United States); Ji, H. S., E-mail: yanxl@ynao.ac.cn [Key Laboratory for Dark Matter and Space Science, Purple Mountain Observatory, Chinese Academy of Sciences, Nanjing 210008, Jiangsu (China)

    2017-08-10

    Solar flares and coronal mass ejections are the most powerful explosions in the Sun. They are major sources of potentially destructive space weather conditions. However, the possible causes of their initiation remain controversial. Using high-resolution data observed by the New Solar Telescope of Big Bear Solar Observatory, supplemented by Solar Dynamics Observatory observations, we present unusual observations of a small-scale emerging flux rope near a large sunspot, whose eruption produced an M-class flare and a coronal mass ejection. The presence of the small-scale flux rope was indicated by static nonlinear force-free field extrapolation as well as data-driven magnetohydrodynamics modeling of the dynamic evolution of the coronal three-dimensional magnetic field. During the emergence of the flux rope, rotation of satellite sunspots at the footpoints of the flux rope was observed. Meanwhile, the Lorentz force, magnetic energy, vertical current, and transverse fields were increasing during this phase. The free energy from the magnetic flux emergence and twisting magnetic fields is sufficient to power the M-class flare. These observations present, for the first time, the complete process, from the emergence of the small-scale flux rope, to the production of solar eruptions.

  6. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns have increased about the safety of the public and property in the event of an accidental, or more importantly an intentional, spill. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data cover much smaller sizes and scales than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
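A back-of-the-envelope check of the quantities discussed above can be sketched in code. The burn rate, heat of combustion and the use of Heskestad's flame-height correlation are illustrative textbook assumptions, not values or fits from the Phoenix test series:

```python
import math

def pool_fire_hrr_kw(diameter_m, burn_rate_kg_m2s=0.14, heat_of_combustion_kj_kg=50000.0):
    """Heat release rate of a pool fire (kW); defaults are generic LNG-like values."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return burn_rate_kg_m2s * heat_of_combustion_kj_kg * area

def q_star(q_kw, d_m, rho=1.2, cp=1.0, t_amb=293.0, g=9.81):
    """Nondimensional heat release rate used to correlate flame geometry."""
    return q_kw / (rho * cp * t_amb * math.sqrt(g * d_m) * d_m ** 2)

def flame_height_heskestad(q_kw, d_m):
    """Heskestad's mean flame height correlation: L = 0.235 Q^(2/5) - 1.02 D (Q in kW, L and D in m)."""
    return 0.235 * q_kw ** 0.4 - 1.02 * d_m
```

For a 21 m diameter pool these assumptions give a heat release of roughly 2.4 GW and a flame height to diameter ratio near 3, in the range typically quoted for large pool fires.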

  7. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

We review different theories of the formation of the large-scale structure of the Universe. Special emphasis is placed on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large-scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of the formation of large-scale structure. (orig.)

  8. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
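The decomposition described above, where local units solve small subproblems under coordination by an aggregator, can be illustrated with a minimal price-coordination (dual decomposition) loop. Everything below, including the quadratic cost model and all numbers, is a toy sketch, not the paper's formulation:

```python
# Each unit minimizes cost_i(p_i) - price * p_i locally; the aggregator
# adjusts the price by a subgradient step until production matches demand.
def local_best_response(price, a, p_min, p_max):
    # Quadratic cost a*p^2 -> unconstrained optimum p = price / (2a), then clip.
    p = price / (2.0 * a)
    return max(p_min, min(p_max, p))

def balance(demand, units, steps=2000, lr=0.05):
    """units: list of (cost coefficient a, p_min, p_max) tuples."""
    price = 0.0
    for _ in range(steps):
        total = sum(local_best_response(price, *u) for u in units)
        price += lr * (demand - total)   # raise price if short, lower if long
    return price, [local_best_response(price, *u) for u in units]
```

The design point is that the aggregator never sees the units' cost functions, only their responses to a price signal, which is what makes the scheme distributable.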

  9. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

Both in practical engineering and in the control of muscular systems, low-level subsystems automatically provide crude approximations to the proper response. Through low-level tuning of these approximations, the proper response variant can emerge from standardized high-level commands. Such systems are expressly suited to emerging large-scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape the responses of low-level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control is formulated, together with software for realizing such information structures.

  10. Nanomedicine - emerging or re-emerging ethical issues? A discussion of four ethical themes.

    Science.gov (United States)

    Lenk, Christian; Biller-Andorno, Nikola

    2007-06-01

Nanomedicine plays a prominent role among emerging technologies. The spectrum of potential applications is as broad as it is promising. It includes the use of nanoparticles and nanodevices for diagnostics, targeted drug delivery in the human body, the production of new therapeutic materials, as well as nanorobots or nanoprostheses. Funding agencies are investing large sums in the development of this area, among them the European Commission, which has launched a large network for life-sciences-related nanotechnology. At the same time, government agencies as well as the private sector are putting forward reports of working groups that have looked into the promises and risks of these developments. This paper begins with an introduction to the central ethical themes as identified by selected reports from Europe and beyond. In a next step, it analyses the most frequently invoked ethical concerns: risk assessment and management, the issues of human identity and enhancement, possible implications for civil liberties (e.g. nanodevices that might be used for covert surveillance), and concerns about equity and fair access. Although it seems that the main ethical issues are not unique to nanotechnologies, the conclusion argues against shrugging them off as non-specific items that have been considered before in the context of other biomedical technologies, such as gene therapy or xenotransplantation. Rather, the paper calls on ethicists to help foster a rational, fair and participatory discourse on the different potential applications of nanotechnologies in medicine, which can form the basis for informed and responsible societal and political decisions.

  11. Polymerase-endonuclease amplification reaction (PEAR) for large-scale enzymatic production of antisense oligonucleotides.

    Directory of Open Access Journals (Sweden)

    Xiaolong Wang

Antisense oligonucleotides targeting microRNAs or their mRNA targets have proven to be powerful tools for molecular biology research and may eventually emerge as new therapeutic agents. Synthetic oligonucleotides are often contaminated with highly homologous failure sequences. Synthesis of a given oligonucleotide is difficult to scale up because it requires expensive equipment, hazardous chemicals and a tedious purification process. Here we report a novel thermocyclic reaction, the polymerase-endonuclease amplification reaction (PEAR), for the amplification of oligonucleotides. A target oligonucleotide and a tandemly repeated antisense probe are subjected to repeated cycles of denaturing, annealing, elongation and cleaving, in which thermostable DNA polymerase elongation and strand slipping generate duplex tandem repeats, and thermostable endonuclease (PspGI) cleavage releases monomeric duplex oligonucleotides. Each round of PEAR achieves over 100-fold amplification. The product can be used directly in a further round of PEAR, and the process can be repeated. In addition to avoiding hazardous materials and improving product purity, this reaction is easy to scale up and amenable to full automation. PEAR has the potential to be a useful tool for the large-scale production of antisense oligonucleotide drugs.

  12. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  13. Evaluation of re-criticality potential in Fukushima Dai-ichi reactors following core damage accidents

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

The re-criticality potential of the debris bed formed from the degraded core materials cannot be ruled out during the cooling-down procedure of the Fukushima Dai-ichi NPPs. In this study the re-criticality potential has been systematically investigated based on analysis of the core disruption phase using the IMPACT-SAMPSON code prepared by The Institute of Applied Energy (IAE). The results obtained for the re-criticality potential, characterized by the eigenvalue k-eff as a function of the composition of the debris formed at the core, the RPV bottom, and the PCV pedestal, inform the arguments on re-criticality prevention measures, such as the timing and concentration of boron compounds, during the cooling-down process of the Fukushima Dai-ichi NPPs. (author)

  14. Emergence of multi-scaling in fluid turbulence

    Science.gov (United States)

    Donzis, Diego; Yakhot, Victor

    2017-11-01

We present new theoretical and numerical results on the transition to strong turbulence in an infinite fluid stirred by a Gaussian random force. The transition is defined as a first appearance of anomalous scaling of normalized moments of velocity derivatives (or dissipation rates) emerging from the low-Reynolds-number Gaussian background. It is shown that due to multi-scaling, strongly intermittent rare events can be quantitatively described in terms of an infinite number of different "Reynolds numbers" reflecting a multitude of anomalous scaling exponents. We found that anomalous scaling for high-order moments emerges at very low Reynolds numbers, implying that intense dissipative-range fluctuations are established at even lower Reynolds number than that required for an inertial range. Thus, our results suggest that information about inertial range dynamics can be obtained from dissipative scales even when the former does not exist. We discuss our further prediction that transition to fully anomalous turbulence disappears at Rλ < 3. Support from NSF is acknowledged.

  15. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  17. Drivers for the emergence and re-emergence of vector-borne protozoal and bacterial diseases.

    Science.gov (United States)

    Harrus, S; Baneth, G

    2005-10-01

In recent years, vector-borne parasitic and bacterial diseases have emerged or re-emerged in many geographical regions, causing global health and economic problems that involve humans, livestock, companion animals and wildlife. The ecology and epidemiology of vector-borne diseases are affected by the interrelations between three major factors comprising the pathogen, the host (human, animal or vector) and the environment. Important drivers for the emergence and spread of vector-borne parasites include habitat changes, alterations in water storage and irrigation habits, atmospheric and climate changes, immunosuppression by HIV, pollution, development of insecticide and drug resistance, globalization and the significant increase in international trade, tourism and travel. War and civil unrest, and governmental or global management failure, are also major contributors to the spread of infectious diseases. The improvement of epidemic understanding and planning, together with the development of new molecular diagnostic techniques in the last few decades, has allowed researchers to better diagnose and trace pathogens, their origin and routes of infection, and to develop preventive public health and intervention programs. Health care workers, physicians, veterinarians and biosecurity officers should play a key role in future prevention of vector-borne diseases. A coordinated global approach for the prevention of vector-borne diseases should be implemented by international organizations and governmental agencies in collaboration with research institutions.

  18. Sustainable (Re)Construction : The Potential of the Renovation Market

    NARCIS (Netherlands)

    Usanov, A.; Chivot, E.

    2013-01-01

    The Sustainable Urban (Re)Construction Briefing argues that renovation is going to play an increasingly important role in the overall construction market – for several reasons. One of them is the urgency of climate change mitigation. Europe has a large stock of buildings, which together contribute

  19. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    Science.gov (United States)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In this work, the geometric accuracy of the two most widely used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino-Alto Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphologies, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The accuracy of the DSMs has been evaluated through the computation of standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy, in terms of standard deviation and NMAD, ranges for SRTM from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessments on a global scale.
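The per-slope-class statistics mentioned above (standard deviation and NMAD of the elevation differences) are straightforward to compute. The sketch below uses the conventional NMAD definition with the 1.4826 consistency factor; the slope-class boundaries are hypothetical, not the paper's:

```python
from statistics import median, pstdev

def nmad(errors):
    """Normalized median absolute deviation: 1.4826 * median(|e - median(e)|).
    A robust spread estimate, less sensitive to DSM blunders than the stdev."""
    m = median(errors)
    return 1.4826 * median(abs(e - m) for e in errors)

def per_slope_class(dh, slope, bins=(5.0, 15.0, 30.0)):
    """Group elevation differences dh by slope class (bin edges in degrees,
    assumed values) and report (stdev, NMAD) per class."""
    classes = {}
    for e, s in zip(dh, slope):
        k = sum(s >= b for b in bins)          # class index 0..len(bins)
        classes.setdefault(k, []).append(e)
    return {k: (pstdev(v), nmad(v)) for k, v in classes.items()}
```

NMAD is preferred over the standard deviation here precisely because DSM error distributions are heavy-tailed in steep terrain.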

  1. Scaling-up Support for Emergency Response Organizations

    NARCIS (Netherlands)

    Oomes, A.H.J.; Neef, R.M.

    2005-01-01

    We present the design of an information system that supports the process of scaling-up of emergency response organizations. This process is vital for effective emergency response but tends to go awry in practice. Our proposed system consists of multiple distributed agents that are capable of

  2. HTS cables open the window for large-scale renewables

    International Nuclear Information System (INIS)

    Geschiere, A; Willen, D; Piga, E; Barendregt, P

    2008-01-01

In a realistic approach to future energy consumption, the effects of sustainable power sources and of growing welfare, with its increased use of electricity, need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently across Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons more infrastructure will be built underground. Nuon is studying HTS technology as a component to solve these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short-circuit currents, making the integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while meeting future energy challenges. In a 6 km backbone structure in Amsterdam, Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless, several barriers have to be overcome.

  3. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

This article describes a method for evaluating the faultless function (reliability) of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. It contains a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.
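As a generic illustration of the kind of fault-rate arithmetic such a program performs, the sketch below assumes independent components in series with constant (exponential) failure rates. This is the textbook reliability model, not the authors' algorithm:

```python
import math

def series_failure_rate(lambdas):
    """For independent components in series, failure rates add."""
    return sum(lambdas)

def reliability(rate, t):
    """Survival probability R(t) = exp(-lambda * t) for a constant failure rate."""
    return math.exp(-rate * t)

def mttf(rate):
    """Mean time to failure of an exponential model: 1 / lambda."""
    return 1.0 / rate
```

For example, a chip modeled as 1000 series elements at 1e-6 failures/hour each has a combined rate of 1e-3/hour, an MTTF of 1000 hours, and R(1000 h) = e^-1 ≈ 0.37.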

  4. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    Science.gov (United States)

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  5. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment...

  6. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

Typically, oil field production operations have only been automated at fields with long-term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large-scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system and the related benefits; and the research and development of the new electronic custody transfer method.

  7. Large arterial occlusive strokes as a medical emergency: need to accurately predict clot location.

    Science.gov (United States)

    Vanacker, Peter; Faouzi, Mohamed; Eskandari, Ashraf; Maeder, Philippe; Meuli, Reto; Michel, Patrik

    2017-10-01

Endovascular treatment for acute ischemic stroke with a large intracranial occlusion was recently shown to be effective. Timely knowledge of the presence, site, and extent of arterial occlusions in the ischemic territory has the potential to influence patient selection for endovascular treatment. We aimed to find predictors of large vessel occlusive strokes on the basis of demographic, clinical, radiological, and laboratory data available in the emergency setting. Patients enrolled in the ASTRAL registry with acute ischemic stroke and computed tomography (CT) angiography within 12 h of stroke onset were selected and categorized according to occlusion site. Easily accessible variables were used in a multivariate analysis. Of 1645 patients enrolled, a significant proportion (46.2%) had a large vessel occlusion in the ischemic territory. The main clinical predictors of any arterial occlusion were in-hospital stroke [odds ratio (OR) 2.1, 95% confidence interval 1.4-3.1], higher initial National Institutes of Health Stroke Scale score (OR 1.1, 1.1-1.2), presence of visual field defects (OR 1.9, 1.3-2.6), dysarthria (OR 1.4, 1.0-1.9), or hemineglect (OR 2.0, 1.4-2.8) at admission, and atrial fibrillation (OR 1.7, 1.2-2.3). Further, the following radiological predictors were identified: time to imaging (OR 0.9, 0.9-1.0), early ischemic changes (OR 2.3, 1.7-3.2), and silent lesions on CT (OR 0.7, 0.5-1.0). The area under the curve for this analysis was 0.85. Looking at different occlusion sites, the National Institutes of Health Stroke Scale score and early ischemic changes on CT were independent predictors in all subgroups. Neurological deficits, stroke risk factors, and CT findings accurately identify acute ischemic stroke patients at risk of symptomatic vessel occlusion. Predicting the presence of these occlusions may impact emergency stroke care in regions with limited access to noninvasive vascular imaging.
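Odds ratios like those reported above can, in principle, be folded into an additive log-odds score of the usual logistic form. The sketch below is purely illustrative: the intercept is an assumed placeholder and the abstract does not publish such a bedside score:

```python
import math

# Point estimates taken from the abstract (per-predictor odds ratios).
ORS = {"in_hospital": 2.1, "visual_field_defect": 1.9, "dysarthria": 1.4,
       "hemineglect": 2.0, "atrial_fibrillation": 1.7}
OR_PER_NIHSS_POINT = 1.1

def occlusion_probability(nihss, findings, intercept=-3.0):
    """Logistic score: intercept (an assumed placeholder, not reported in the
    abstract) plus the log(OR) of each present finding, mapped through a sigmoid."""
    logit = intercept + nihss * math.log(OR_PER_NIHSS_POINT)
    logit += sum(math.log(ORS[f]) for f in findings)
    return 1.0 / (1.0 + math.exp(-logit))
```

The point of the sketch is only the mechanics: independent multiplicative odds ratios become additive contributions on the log-odds scale.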

  8. Guidance for Large-scale Implementation of Alternate Wetting and Drying: A Biophysical Suitability Assessment

    Science.gov (United States)

    Sander, B. O.; Wassmann, R.; Nelson, A.; Palao, L.; Wollenberg, E.; Ishitani, M.

    2014-12-01

The alternate wetting and drying (AWD) technology for rice production not only saves 15-30% of irrigation water, it also reduces methane emissions by up to 70%. AWD is defined by periodic drying and re-flooding of a rice field. Due to its high mitigation potential and the simplicity of the practice, AWD has gained a lot of attention in recent years. The Climate and Clean Air Coalition (CCAC) has put AWD high on its agenda and funds a project to guide implementation of this technology in Vietnam, Bangladesh and Colombia. One crucial activity is a biophysical suitability assessment for AWD in the three countries. For this, we analyzed rainfall and soil data as well as potential evapotranspiration to assess whether the water balance allows practicing AWD or whether precipitation is too high for rice fields to fall dry. In my talk I will outline key factors for a successful large-scale implementation of AWD, with a focus on the biophysical suitability assessment. The seasonal suitability maps that we generated highlight priority areas for AWD implementation and guide policy makers toward informed decisions about meaningful investments in infrastructure and extension work.
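A minimal version of the water-balance screening described above might track ponded water from weekly rainfall and potential evapotranspiration. The ponding depth and dry-week thresholds here are hypothetical illustrations, not the study's criteria:

```python
def awd_suitable(rain_mm, pet_mm, ponding_mm=50.0, dry_weeks_needed=2):
    """Track ponded water week by week (clipped between 0 and the assumed
    maximum ponding depth); the field supports AWD if it can dry out often enough."""
    water, dry_weeks = ponding_mm, 0
    for r, p in zip(rain_mm, pet_mm):
        water = min(ponding_mm, max(0.0, water + r - p))
        if water == 0.0:
            dry_weeks += 1
    return dry_weeks >= dry_weeks_needed
```

Under this toy balance, a season where rainfall consistently exceeds evaporative demand never dries and is flagged unsuitable, which is exactly the distinction the seasonal suitability maps are meant to capture.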

  9. Potential of electrical gas discharges for pollution control of large gas volumes

    International Nuclear Information System (INIS)

    Kogelschatz, U.

    1997-01-01

Non-equilibrium gas discharges in many cases offer an innovative approach to the solution of industrial air pollution problems. Negative corona discharges are used in electrostatic precipitators to collect dust and fly ash particles. Pulsed positive streamer coronas, dielectric-barrier discharges and possibly also flow-stabilised high-pressure glow discharges are emerging technologies for the destruction of air pollutants like nitrogen oxides and sulfur dioxide in flue gases and volatile organic compounds (VOCs) in industrial effluents. The different discharge types are discussed with special emphasis on their potential for upscaling. Major applications are expected particularly in the removal of dilute concentrations of air pollutants, in odour control and in the simultaneous removal of different pollutants. Dielectric-barrier discharges exhibit disposal efficiencies similar to those of pulsed positive streamer coronas and require less sophisticated feeding circuits in large-scale industrial applications. (author)

  10. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets.

    Science.gov (United States)

    Wjst, Matthias

    2010-12-29

Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs, but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach. Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.
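A toy version of the "Netflix-type" linkage attack is easy to sketch: match a small identified SNP panel against every record of an anonymized genotype table. All data names and encodings below are synthetic illustrations:

```python
def reidentify(anon_genotypes, probe):
    """Return ids of anonymized samples whose genotypes match the identified
    probe at every probed SNP (missing probe calls, coded None, match anything).
    Genotypes are allele counts 0/1/2 per SNP id."""
    hits = []
    for sample_id, geno in anon_genotypes.items():
        if all(call is None or geno.get(snp) == call
               for snp, call in probe.items()):
            hits.append(sample_id)
    return hits
```

With a few dozen SNPs the match is essentially unique, which is why even a "small" identified panel suffices to re-identify a sample in a large anonymized set.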

  11. Caught you: threats to confidentiality due to the public release of large-scale genetic data sets

    Directory of Open Access Journals (Sweden)

    Wjst Matthias

    2010-12-01

    Full Text Available Abstract Background Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers. Discussion The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with less than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples or forensic testing. When merged to the primary, unidentified set it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case as a real-life example is used to illustrate that approach. Summary Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.

  12. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.

  13. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  14. Large-Area CVD-Grown Sub-2 V ReS2 Transistors and Logic Gates.

    Science.gov (United States)

    Dathbun, Ajjiporn; Kim, Youngchan; Kim, Seongchan; Yoo, Youngjae; Kang, Moon Sung; Lee, Changgu; Cho, Jeong Ho

    2017-05-10

    We demonstrated the fabrication of large-area ReS₂ transistors and logic gates composed of a chemical vapor deposition (CVD)-grown multilayer ReS₂ semiconductor channel and graphene electrodes. Single-layer graphene was used as the source/drain and coplanar gate electrodes. An ion gel with an ultrahigh capacitance effectively gated the ReS₂ channel at a low voltage, below 2 V, through a coplanar gate. The contact resistance of the ion gel-gated ReS₂ transistors with graphene electrodes decreased dramatically compared with that of the SiO₂ devices prepared with Cr electrodes. The resulting transistors exhibited good device performances, including a maximum electron mobility of 0.9 cm²/(V s) and an on/off current ratio exceeding 10⁴. NMOS logic devices, such as NOT, NAND, and NOR gates, were assembled using the resulting transistors as a proof-of-concept demonstration of the applicability of the devices to complex logic circuits. The large-area synthesis of ReS₂ semiconductors and graphene electrodes and their applications in logic devices open up new opportunities for realizing future flexible electronics based on 2D nanomaterials.

  15. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  16. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  17. PERSEUS-HUB: Interactive and Collective Exploration of Large-Scale Graphs

    Directory of Open Access Journals (Sweden)

    Di Jin

    2017-07-01

    Full Text Available Graphs emerge naturally in many domains, such as social science, neuroscience, transportation engineering, and more. In many cases, such graphs have millions or billions of nodes and edges, and their sizes increase daily at a fast pace. How can researchers from various domains explore large graphs interactively and efficiently to find out what is ‘important’? How can multiple researchers explore a new graph dataset collectively and “help” each other with their findings? In this article, we present Perseus-Hub, a large-scale graph mining tool that computes a set of graph properties in a distributed manner, performs ensemble, multi-view anomaly detection to highlight regions that are worth investigating, and provides users with uncluttered visualization and easy interaction with complex graph statistics. Perseus-Hub uses a Spark cluster to calculate various statistics of large-scale graphs efficiently, and aggregates the results in a summary on the master node to support interactive user exploration. In Perseus-Hub, the visualized distributions of graph statistics provide preliminary analysis to understand a graph. To perform a deeper analysis, users with little prior knowledge can leverage patterns (e.g., spikes in the power-law degree distribution) marked by other users or experts. Moreover, Perseus-Hub guides users to regions of interest by highlighting anomalous nodes and helps users establish a more comprehensive understanding about the graph at hand. We demonstrate our system through the case study on real, large-scale networks.

  18. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes housed in large machine rooms, managers can easily confuse one machine with another. How, then, can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms for large-scale cluster systems and, furthermore, proposes an approach to realizing automatic management of such systems. (authors)

  19. Epidemiological characterization of Plasmodium falciparum in the Republic of Cabo Verde: implications for potential large-scale re-emergence of malaria

    Science.gov (United States)

    Alves, Joana; Roque, Ana Luísa; Cravo, Pedro; Valdez, Tomás; Jelinek, Tomas; do Rosário, Virgílio E; Arez, Ana Paula

    2006-01-01

    Background Malaria came close to eradication in the archipelago of Cabo Verde in 1970. Infections are now only observed in Santiago, where outbreaks occur. In these islands, malaria is considered by the international community as being of limited risk and, therefore, no prophylaxis is recommended. Since understanding the factors that determine malaria outbreaks is crucial for controlling the disease, the present study aimed to investigate whether the malaria infections observed in Santiago Island are maintained in isolated foci and in asymptomatic individuals. Methods The occurrence of asymptomatic carriers in villages with a history of malaria, as well as the level of exposure of these populations, was investigated using PCR and serological analyses. Results Results indicate that malaria is maintained as asymptomatic and sub-patent infections and that the majority of the circulating parasite populations harbour chloroquine-resistant mutations. Conclusion These observations highlight the alarming prospect of malaria becoming a serious public health problem and underscore the need for tighter surveillance. PMID:16630349

  20. Homelessness: patterns of emergency department use and risk factors for re-presentation.

    Science.gov (United States)

    Moore, G; Gerdtz, M F; Hepworth, G; Manias, E

    2011-05-01

    To describe patterns of service use and to predict risk factors for re-presentation to a metropolitan emergency department (ED) among people who are homeless. A retrospective cohort analysis was undertaken over a 24-month period from a principal referral hospital in Melbourne, Australia. All ED visits relating to people classified as homeless were included. A predictive model for risk of re-presentation was developed using logistic regression with random effects. Rates of re-presentation, defined as the total number of visits to the same ED within 28 days of discharge, were measured. The study period was 1 January 2003 to 31 December 2004. The re-presentation rate for homeless people was 47.8% (3199/6689) of ED visits and 45.5% (725/1595) of the patients. The final predictive model included risk factors, which incorporated both hospital and community service use. Those characteristics that resulted in significantly increased odds of re-presentation were leaving hospital at own risk (OR 1.31; 95% CI 1.10 to 1.56), treatment in another hospital (OR 1.45, 95% CI 1.23 to 1.72) and being in receipt of community-based case management (OR 1.31, 95% CI 1.11 to 1.54) or pension (OR 1.34, 95% CI 1.12 to 1.62). The predictive model identified nine risk factors of re-presentation to the ED for people who are homeless. Early identification of these factors among homeless people may alert clinicians to the complexity of issues influencing an individual ED visit. This information can be used at admission and discharge by ensuring that homeless people have access to services commensurate with their health needs. Improved linkage between community and hospital services must be underscored by the capacity to provide safe and secure housing.

  1. Experiences of Emerging Economy Firms

    DEFF Research Database (Denmark)

    Experiences of Emerging Economy Firms investigates the different elements of the experiences of emerging economy firms and sheds essential light on a large variety of aspects associated with their functioning in both home and host contexts. For example, firms must be able to overcome the liability...... of foreign and emerging issues when they expand their activities in various contexts, enter, exit, and re-enter overseas markets; they have to overcome institutional barriers, adapt the cultural challenges in foreign markets, undergo the impact of large multinational firms from developed economies...

  2. Diagnosis and management of new and re-emerging diseases of highbush blueberries in Michigan

    Science.gov (United States)

    Blueberries are an important commodity in Michigan and disease management is crucial for production of high-quality fruit. Over the past 6 years, a number of new and re-emerging diseases have been diagnosed in the state. In 2009, Blueberry scorch virus (BlScV) and Blueberry shock virus (BlShV) were ...

  3. A mixed-methods study of system-level sustainability of evidence-based practices in 12 large-scale implementation initiatives.

    Science.gov (United States)

    Scudder, Ashley T; Taber-Thomas, Sarah M; Schaffner, Kristen; Pemberton, Joy R; Hunter, Leah; Herschell, Amy D

    2017-12-07

    In recent decades, evidence-based practices (EBPs) have been broadly promoted in community behavioural health systems in the United States of America, yet reported EBP penetration rates remain low. Determining how to systematically sustain EBPs in complex, multi-level service systems has important implications for public health. This study examined factors impacting the sustainability of parent-child interaction therapy (PCIT) in large-scale initiatives in order to identify potential predictors of sustainment. A mixed-methods approach to data collection was used. Qualitative interviews and quantitative surveys examining sustainability processes and outcomes were completed by participants from 12 large-scale initiatives. Sustainment strategies fell into nine categories, including infrastructure, training, marketing, integration and building partnerships. Strategies involving integration of PCIT into existing practices and quality monitoring predicted sustainment, while financing also emerged as a key factor. The reported factors and strategies impacting sustainability varied across initiatives; however, integration into existing practices, monitoring quality and financing appear central to high levels of sustainability of PCIT in community-based systems. More detailed examination of the progression of specific activities related to these strategies may aid in identifying priorities to include in strategic planning of future large-scale initiatives. ClinicalTrials.gov ID NCT02543359 ; Protocol number PRO12060529.

  4. Catalogue of antibiotic resistome and host-tracking in drinking water deciphered by a large scale survey.

    Science.gov (United States)

    Ma, Liping; Li, Bing; Jiang, Xiao-Tao; Wang, Yu-Lin; Xia, Yu; Li, An-Dong; Zhang, Tong

    2017-11-28

    Excesses of antibiotic resistance genes (ARGs), which are regarded as emerging environmental pollutants, have been observed in various environments. The incidence of ARGs in drinking water causes potential risks to human health and receives more attention from the public. However, ARGs harbored in drinking water remain largely unexplored. In this study, we aimed at establishing an antibiotic resistome catalogue in drinking water samples from a wide range of regions and to explore the potential hosts of ARGs. A catalogue of antibiotic resistome in drinking water was established, and the host-tracking of ARGs was conducted through a large-scale survey using a metagenomic approach. The drinking water samples were collected at the point of use in 25 cities in mainland China, Hong Kong, Macau, Taiwan, South Africa, Singapore and the USA. In total, 181 ARG subtypes belonging to 16 ARG types were detected with an abundance range of 2.8 × 10⁻² to 4.2 × 10⁻¹ copies of ARG per cell. The highest abundance was found in northern China (Henan Province). Bacitracin, multidrug, aminoglycoside, sulfonamide, and beta-lactam resistance genes were dominant in drinking water. Of the drinking water samples tested, 84% had a higher ARG abundance than typical environmental ecosystems of sediment and soil. Metagenomic assembly-based host-tracking analysis identified Acidovorax, Acinetobacter, Aeromonas, Methylobacterium, Methyloversatilis, Mycobacterium, Polaromonas, and Pseudomonas as the hosts of ARGs. Moreover, potential horizontal transfer of ARGs in drinking water systems was proposed by network and Procrustes analyses. The antibiotic resistome catalogue compiled using a large-scale survey provides a useful reference for future studies on the global surveillance and risk management of ARGs in drinking water.

  5. Snow Tweets: Emergency Information Dissemination in a US County During 2014 Winter Storms.

    Science.gov (United States)

    Bonnan-White, Jess; Shulman, Jason; Bielecke, Abigail

    2014-12-22

    This paper describes how American federal, state, and local organizations created, sourced, and disseminated emergency information via social media in preparation for several winter storms in one county in the state of New Jersey (USA). Postings submitted to Twitter for three winter storm periods were collected from selected organizations, along with a purposeful sample of select private local users. Storm-related posts were analyzed for stylistic features (hashtags, retweet mentions, embedded URLs). Sharing and re-tweeting patterns were also mapped using NodeXL. Results indicate emergency management entities were active in providing preparedness and response information during the selected winter weather events. A large number of posts, however, did not include unique Twitter features that maximize dissemination and discovery by users. Visual representations of interactions illustrate opportunities for developing stronger relationships among agencies. Whereas previous research predominantly focuses on large-scale national or international disaster contexts, the current study instead provides needed analysis in a small-scale context. With practice during localized events like extreme weather, effective information dissemination in large events can be enhanced.

  6. Cell therapy-processing economics: small-scale microfactories as a stepping stone toward large-scale macrofactories.

    Science.gov (United States)

    Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A

    2018-03-01

    Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights that there is a strong pressure to automate processes and split the quality control cost-burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as a potential exemplar on which to base manufacturing strategy.

  7. Assessing emotional status following acquired brain injury: the clinical potential of the depression, anxiety and stress scales.

    Science.gov (United States)

    Ownsworth, Tamara; Little, Trudi; Turner, Ben; Hawkes, Anna; Shum, David

    2008-10-01

    To investigate the clinical potential of the Depression, Anxiety and Stress Scales (DASS 42) and its shorter version (DASS 21) for assessing emotional status following acquired brain injury. Participants included 23 individuals with traumatic brain injury (TBI), 25 individuals with brain tumour and 29 non-clinical controls. Investigations of internal consistency, test-re-test reliability, theory-consistent differences, sensitivity to change and concurrent validity were conducted. Internal consistency of the DASS was generally acceptable (r > 0.70), with the exception of the anxiety scale for the TBI sample. Test-re-test reliability (1-3 weeks) was sound for the depression scale (r > 0.75) and significant but comparatively lower for other scales (r = 0.60-0.73, p < 0.05). Theory-consistent differences between the clinical samples and controls were observed for each scale (p < 0.05). Sensitivity to change of the DASS in the context of hospital discharge was demonstrated for depression and stress (p < 0.05). Concurrent validity with the Hospital Anxiety and Depression Scale was significant for all scales of the DASS (p < 0.05). While these findings support the clinical potential of the DASS following ABI, further research examining the factor structure of existing and modified versions of the DASS is recommended.

  8. Keeping the ‘Great’ in the Great Barrier Reef: large-scale governance of the Great Barrier Reef Marine Park

    Directory of Open Access Journals (Sweden)

    Louisa S. Evans

    2014-08-01

    Full Text Available As part of an international collaboration to compare large-scale commons, we used the Social-Ecological Systems Meta-Analysis Database (SESMAD to systematically map out attributes of and changes in the Great Barrier Reef Marine Park (GBRMP in Australia. We focus on eight design principles from common-pool resource (CPR theory and other key social-ecological systems governance variables, and explore to what extent they help explain the social and ecological outcomes of park management through time. Our analysis showed that commercial fisheries management and the re-zoning of the GBRMP in 2004 led to improvements in ecological condition of the reef, particularly fisheries. These boundary and rights changes were supported by effective monitoring, sanctioning and conflict resolution. Moderate biophysical connectivity was also important for improved outcomes. However, our analysis also highlighted that continued challenges to improved ecological health in terms of coral cover and biodiversity can be explained by fuzzy boundaries between land and sea, and the significance of external drivers to even large-scale social-ecological systems (SES. While ecological and institutional fit in the marine SES was high, this was not the case when considering the coastal SES. Nested governance arrangements become even more important at this larger scale. To our knowledge, our paper provides the first analysis linking the re-zoning of the GBRMP to CPR and SES theory. We discuss important challenges to coding large-scale systems for meta-analysis.

  9. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues

  10. LandCaRe DSS--an interactive decision support system for climate change impact assessment and the analysis of potential agricultural land use adaptation strategies.

    Science.gov (United States)

    Wenkel, Karl-Otto; Berg, Michael; Mirschel, Wilfried; Wieland, Ralf; Nendel, Claas; Köstner, Barbara

    2013-09-01

    Decision support to develop viable climate change adaptation strategies for agriculture and regional land use management encompasses a wide range of options and issues. Up to now, only a few suitable tools and methods have existed for farmers and regional stakeholders that support the process of decision-making in this field. The interactive model-based spatial information and decision support system LandCaRe DSS attempts to close the existing methodical gap. This system supports interactive spatial scenario simulations, multi-ensemble and multi-model simulations at the regional scale, as well as the complex impact assessment of potential land use adaptation strategies at the local scale. The system is connected to a local geo-database and via the internet to a climate data server. LandCaRe DSS uses a multitude of scale-specific ecological impact models, which are linked in various ways. At the local scale (farm scale), biophysical models are directly coupled with a farm economy calculator. New or alternative simulation models can easily be added, thanks to the innovative architecture and design of the DSS. Scenario simulations can be conducted with a reasonable amount of effort. The interactive LandCaRe DSS prototype also offers a variety of data analysis and visualisation tools, a help system for users and a farmer information system for climate adaptation in agriculture. This paper presents the theoretical background, the conceptual framework, and the structure and methodology behind LandCaRe DSS. Scenario studies at the regional and local scale for the two Eastern German regions of Uckermark (dry lowlands, 2,600 km²) and Weißeritz (humid mountain area, 400 km²) were conducted in close cooperation with stakeholders to test the functionality of the DSS prototype. The system is gradually being transformed into a web version (http://www.landcare-dss.de) to ensure the broadest possible distribution of LandCaRe DSS to the public. The system will be continuously

  11. On the contribution of external cost calculations to energy system governance: The case of a potential large-scale nuclear accident

    International Nuclear Information System (INIS)

    Laes, Erik; Meskens, Gaston; Sluijs, Jeroen P. van der

    2011-01-01

    The contribution of nuclear power to a sustainable energy future is a contested issue. This paper presents a critical review of an attempt to objectify this debate through the calculation of the external costs of a potential large-scale nuclear accident in the ExternE project. A careful dissection of the ExternE approach resulted in a list of 30 calculation steps and assumptions, from which the 6 most contentious ones were selected through a stakeholder internet survey. The policy robustness and relevance of these key assumptions were then assessed in a workshop using the concept of a 'pedigree of knowledge'. Overall, the workshop outcomes revealed the stakeholder and expert panel's scepticism about the assumptions made: generally these were considered not very plausible, subjected to disagreement, and to a large extent inspired by contextual factors. Such criticism indicates a limited validity and usability of the calculated nuclear accident externality as a trustworthy sustainability indicator. Furthermore, it is our contention that the ExternE project could benefit greatly - in terms of gaining public trust - from employing highly visible procedures of extended peer review such as the pedigree assessment applied to our specific case of the external costs of a potential large-scale nuclear accident. - Highlights: → Six most contentious assumptions were selected through a stakeholder internet survey. → Policy robustness of these assumptions was assessed in a pedigree assessment workshop. → Assumptions were considered implausible, controversial, and inspired by contextual factors. → This indicates a limited validity and usability as a trustworthy sustainability indicator.

  12. Academic Training Lecture Regular Programme: How Large-Scale Civil Engineering Projects Realise the Potential of a City

    CERN Multimedia

    2012-01-01

    How Large-Scale Civil Engineering Projects Realise the Potential of a City (1/3), by Bill Hanway (Executive Director of Operations, AECOM Europe).   Wednesday, June 6, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 80-1-001 - Globe 1st Floor ) In this series of three special lectures, leading experts from AECOM will explore the impact of a trio of major projects on a single city. In common with every metropolis, London has run-down districts and infrastructure in need of upgrading. The lectures propose to cover three of the biggest challenges: regenerating run-down areas; reducing congestion and transporting people more efficiently; and improving water and wastewater systems. Each project contributes to a collective public aim - to realise the potential of a growing city, and ensure its healthy, sustainable and competitive future. Lecture 1: Intro to the lecture series and The London 2012 Olympic Games Most cities share a group of common complex challenges – growing populations, agei...

  13. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  14. Re-analysis of correlations among four impulsivity scales.

    Science.gov (United States)

    Gallardo-Pujol, David; Andrés-Pueyo, Antonio

    2006-08-01

    Impulsivity plays a key role in normal and pathological behavior. Although there is some consensus about its conceptualization, there have been many attempts to build a multidimensional tool due to the lack of agreement on how to measure it. A recent study claimed support for a three-dimensional structure of impulsivity, but that claim had weak empirical support. By re-analysing those data, a four-factor structure was found to describe the correlation matrix much better. The debate remains open and further research is needed to clarify the factor structure. The desirability of constructing new measures, perhaps analogously to the Wechsler Intelligence Scale, is emphasized.

  15. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  16. Large Eddy simulation of turbulence: A subgrid scale model including shear, vorticity, rotation, and buoyancy

    Science.gov (United States)

    Canuto, V. M.

    1994-01-01

    The Reynolds numbers that characterize geophysical and astrophysical turbulence (Re ≈ 10^8 for the planetary boundary layer and Re ≈ 10^14 for the Sun's interior) are too large to allow a direct numerical simulation (DNS) of the fundamental Navier-Stokes and temperature equations. In fact, the spatial number of grid points N ~ Re^(9/4) exceeds the computational capability of today's supercomputers. Alternative treatments are the ensemble-time average approach, and/or the volume average approach. Since the first method (Reynolds stress approach) is largely analytical, the resulting turbulence equations entail manageable computational requirements and can thus be linked to a stellar evolutionary code or, in the geophysical case, to general circulation models. In the volume average approach, one carries out a large eddy simulation (LES) which resolves numerically the largest scales, while the unresolved scales must be treated theoretically with a subgrid scale model (SGS). Contrary to the ensemble average approach, the LES+SGS approach has considerable computational requirements. Even if this prevents (for the time being) a LES+SGS model from being linked to stellar or geophysical codes, it is still of the greatest relevance as an 'experimental tool' to be used, inter alia, to improve the parameterizations needed in the ensemble average approach. Such a methodology has been successfully adopted in studies of the convective planetary boundary layer. Experience with the LES+SGS approach from different fields has shown that its reliability depends on the healthiness of the SGS model for numerical stability as well as for physical completeness. At present, the most widely used SGS model, the Smagorinsky model, accounts for the effect of the shear induced by the large resolved scales on the unresolved scales but does not account for the effects of buoyancy, anisotropy, rotation, and stable stratification. The
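The N ~ Re^(9/4) scaling quoted above can be turned into a quick back-of-the-envelope estimate. The sketch below takes the prefactor to be unity (an assumption; real estimates carry an O(1) constant) and reproduces the abstract's point that both geophysical and astrophysical Reynolds numbers put DNS far beyond current machines:

```python
# Back-of-the-envelope DNS cost from the N ~ Re^(9/4) scaling.
# The O(1) prefactor is assumed to be unity for illustration.

def dns_grid_points(reynolds: float) -> float:
    """Approximate number of spatial grid points needed for DNS at a given Re."""
    return reynolds ** (9.0 / 4.0)

for name, reynolds in [("planetary boundary layer", 1e8), ("solar interior", 1e14)]:
    print(f"{name}: Re = {reynolds:.0e} -> N ~ {dns_grid_points(reynolds):.1e} grid points")
```

For Re = 10^8 this already gives N ~ 10^18 points, which is why the abstract turns to ensemble averaging or to LES with a subgrid-scale model.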

  17. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  18. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  19. Predators on private land: broad-scale socioeconomic interactions influence large predator management

    Directory of Open Access Journals (Sweden)

    Hayley S. Clements

    2016-06-01

    The proliferation of private land conservation areas (PLCAs) is placing increasing pressure on conservation authorities to effectively regulate their ecological management. Many PLCAs depend on tourism for income, and charismatic large mammal species are considered important for attracting international visitors. Broad-scale socioeconomic factors therefore have the potential to drive fine-scale ecological management, creating a systemic scale mismatch that can reduce long-term sustainability in cases where economic and conservation objectives are not perfectly aligned. We assessed the socioeconomic drivers and outcomes of large predator management on 71 PLCAs in South Africa. Owners of PLCAs that are stocking free-roaming large predators identified revenue generation as influencing most or all of their management decisions, and rated profit generation as a more important objective than did the owners of PLCAs that did not stock large predators. Ecotourism revenue increased with increasing lion (Panthera leo) density, which created a potential economic incentive for stocking lion at high densities. Despite this potential mismatch between economic and ecological objectives, lion densities were sustainable relative to available prey. Regional-scale policy guidelines for free-roaming lion management were ecologically sound. By contrast, policy guidelines underestimated the area required to sustain cheetah (Acinonyx jubatus), which occurred at unsustainable densities relative to available prey. Evidence of predator overstocking included predator diet supplementation and frequent reintroduction of game. We conclude that effective facilitation of conservation on private land requires consideration of the strong and not necessarily beneficial multiscale socioeconomic factors that influence private land management.

  20. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations however further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  1. Hantaviruses in the Americas and Their Role as Emerging Pathogens

    Directory of Open Access Journals (Sweden)

    Fernando Torres-Pérez

    2010-11-01

    The continued emergence and re-emergence of pathogens represent an ongoing, sometimes major, threat to populations. Hantaviruses (family Bunyaviridae) and their associated human diseases were considered to be confined to Eurasia, but the occurrence of an outbreak in 1993–94 in the southwestern United States led to a great increase in their study among virologists worldwide. Well over 40 hantaviral genotypes have been described, the large majority since 1993, and nearly half of them pathogenic for humans. Hantaviruses cause persistent infections in their reservoir hosts, and in the Americas, human disease is manifest as a cardiopulmonary compromise, hantavirus cardiopulmonary syndrome (HCPS), with case-fatality ratios, for the most common viral serotypes, between 30% and 40%. Habitat disturbance and larger-scale ecological disturbances, perhaps including climate change, are among the factors that may have increased the human caseload of HCPS between 1993 and the present. We consider here the features that influence the structure of host population dynamics that may lead to viral outbreaks, as well as the macromolecular determinants of hantaviruses that have been regarded as having potential contribution to pathogenicity.

  2. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  3. Outcome of emergency endovascular treatment of large internal iliac artery aneurysms with guidewires

    International Nuclear Information System (INIS)

    Cambj-Sapunar, Liana; Maskovic, Josip; Brkljacic, Boris; Radonic, Vedran; Dragicevic, Dragan; Ajduk, Marko

    2010-01-01

    Purpose: Guidewires have been reported as a useful occlusion material for large aneurysms of different locations with good short-term results. In this study we retrospectively evaluate long-term results of emergency embolization technique with guidewires in symptomatic internal iliac artery aneurysm (IIAA) impending rupture. Patients and methods: In four patients presented with acute abdominal pain, multidetector computed tomography revealed unstable, 7-14 cm large, IIAAs. Two patients were treated with coil embolization of distal branches followed by occlusion of aneurysmal sac with guidewires. In two patients embolization of aneurysmal sac alone was performed. Results: In three patients complete or near complete occlusion of the aneurysmal sac was achieved and abdominal pain ceased within hours. Two patients treated with embolization of distal iliac artery branches and aneurysmal sac developed claudication that lasted up to 1 year. Their aneurysms remained thrombosed and they were without symptoms until they died 31 and 56 months later of causes unrelated to IIAA. Two patients treated with embolization of the aneurysm alone were free of ischemic symptoms. Because of incomplete embolization of the sac in one patient open surgery treatment in a non-emergency setting was performed. Complete filling of aneurysmal sac was achieved in other patient but 2 years later his aneurysm re-opened and required open surgery treatment. Conclusion: Embolization of aneurysmal sac of large IIAA with guidewires may be effective for immediate treatment of impending rupture. Long-term results were better when embolization of the aneurysmal sac was combined with embolization of distal IIA branches.

  4. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  5. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas, they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
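The kind of correlation analysis described above can be illustrated with a toy scale decomposition. This is a sketch, not the paper's pipeline: a synthetic 1D signal stands in for a velocity field, an arbitrary moving-average width provides the scale separation, and the smoothed squared residual stands in for the vorticity r.m.s.

```python
import numpy as np

# Toy amplitude-modulation analysis: correlate "small-scale activity" with a
# "large-scale gradient" after a crude low-pass/high-pass scale decomposition.
# Signal, filter width, and all parameters are illustrative assumptions.

rng = np.random.default_rng(0)
n = 4096
u = np.cumsum(rng.standard_normal(n))            # synthetic "velocity" signal

def moving_average(x, w):
    """Low-pass filter by a simple moving average of width w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

u_large = moving_average(u, 129)                 # large-scale component
u_small = u - u_large                            # small-scale residual
small_activity = moving_average(u_small**2, 129) # local small-scale intensity
large_gradient = np.gradient(u_large)            # large-scale gradient

r = np.corrcoef(small_activity, large_gradient)[0, 1]
print(f"correlation coefficient: {r:+.3f}")
```

In the paper's setting the decomposition is done on DNS velocity fields and the small-scale measure is the vorticity r.m.s.; the structure of the computation, however, is the same: filter, subtract, and correlate.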

  6. Effective potentials for supersymmetric three-scale hierarchies

    International Nuclear Information System (INIS)

    Polchinski, J.

    1983-01-01

    We consider the effective potential in models in which supersymmetry breaks at a scale μ but the Goldstone fermion couples only to fields of mass M>>μ. We show that all large perturbative logarithms are removed by taking the renormalization point to be O(M). This makes it possible to calculate the effective potential at large X in those inverted-hierarchy models where the Goldstone fermion couples only to superheavy fields. A general formula for the one-loop logarithm in these models is given. We illustrate the results with an SU(n) example in which the direction as well as the magnitude of the gauge symmetry breaking is undetermined at the tree level. For this example a large perturbative hierarchy does not form and the unbroken subgroup is always SU(n-1) x U(1). In an appendix we show that O'Raifeartaigh models with just one undetermined scalar field always have a decoupled Goldstone fermion when the undetermined field is large, but that this need not be true in more general inverted-hierarchy models

  7. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  8. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  9. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  10. Calculations in support of a potential definition of large release

    International Nuclear Information System (INIS)

    Hanson, A.L.; Davis, R.E.; Mubayi, V.

    1994-05-01

    The Nuclear Regulatory Commission has stated a hierarchy of safety goals with the qualitative safety goals as Level I of the hierarchy, backed up by the quantitative health objectives as Level II and the large release guideline as Level III. The large release guideline has been stated in qualitative terms as a magnitude of release of the core inventory whose frequency should not exceed 10^-6 per reactor year. However, the Commission did not provide a quantitative specification of a large release. This report describes various specifications of a large release and focuses, in particular, on an examination of releases which have a potential to lead to one prompt fatality in the mean. The basic information required to set up the calculations was derived from the simplified source terms which were obtained from approximations of the NUREG-1150 source terms. Since the calculation of consequences is affected by a large number of assumptions, a generic site with a (conservatively determined) population density and meteorology was specified. At this site, various emergency responses (including no response) were assumed based on information derived from earlier studies. For each of the emergency response assumptions, a set of calculations were performed with the simplified source terms; these included adjustments to the source terms, such as the timing of the release, the core inventory, and the release fractions of different radionuclides, to arrive at a result of one mean prompt fatality in each case. Each of the source terms, so defined, has the potential to be a candidate for a large release. The calculations show that there are many possible candidate source terms for a large release depending on the characteristics which are felt to be important.

  11. Using GeoRePORT to report socio-economic potential for geothermal development

    Energy Technology Data Exchange (ETDEWEB)

    Young, Katherine R.; Levine, Aaron

    2018-07-01

    The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT, http://en.openei.org/wiki/GeoRePORT) was developed for reporting resource grades and project readiness levels, providing the U.S. Department of Energy a consistent and comprehensible means of evaluating projects. The tool helps funding organizations (1) quantitatively identify barriers, (2) develop measurable goals, (3) objectively evaluate proposals, including contribution to goals, (4) monitor progress, and (5) report portfolio performance. GeoRePORT assesses three categories: geological, technical, and socio-economic. Here, we describe GeoRePORT, then focus on the socio-economic assessment and its applications for assessing deployment potential in the U.S. Socio-economic attributes include land access, permitting, transmission, and market.

  12. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  13. Generation and saturation of large-scale flows in flute turbulence

    International Nuclear Information System (INIS)

    Sandberg, I.; Isliker, H.; Pavlenko, V. P.; Hizanidis, K.; Vlahos, L.

    2005-01-01

    The excitation and suppression of large-scale anisotropic modes during the temporal evolution of a magnetic-curvature-driven electrostatic flute instability are numerically investigated. The formation of streamerlike structures is attributed to the linear development of the instability while the subsequent excitation of the zonal modes is the result of the nonlinear coupling between linearly grown flute modes. When the amplitudes of the zonal modes become of the same order as that of the streamer modes, the flute instabilities get suppressed and poloidal (zonal) flows dominate. In the saturated state that follows, the dominant large-scale modes of the potential and the density are self-organized in different ways, depending on the value of the ion temperature

  14. Mycoplasmas and their host: emerging and re-emerging minimal pathogens.

    Science.gov (United States)

    Citti, Christine; Blanchard, Alain

    2013-04-01

    Commonly known as mycoplasmas, bacteria of the class Mollicutes include the smallest and simplest life forms capable of self replication outside of a host. Yet, this minimalism hides major human and animal pathogens whose prevalence and occurrence have long been underestimated. Owing to advances in sequencing methods, large data sets have become available for a number of mycoplasma species and strains, providing new diagnostic approaches, typing strategies, and means for comprehensive studies. A broader picture is thus emerging in which mycoplasmas are successful pathogens having evolved a number of mechanisms and strategies for surviving hostile environments and adapting to new niches or hosts. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost, large-scale production, has made these materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically...

  16. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  17. Wuchereria bancrofti infection in Haitian immigrants and the risk of re-emergence of lymphatic filariasis in the Brazilian Amazon

    Directory of Open Access Journals (Sweden)

    Edson Fidelis da Silva Junior

    INTRODUCTION: Lymphatic filariasis (LF) is a public health problem in Haiti. Thus, the emigration of Haitians to Brazil is worrisome because of the risk for LF re-emergence. METHODS: Blood samples of Haitian immigrants, aged ≥18 years, who emigrated to Manaus (Brazilian Amazon), were examined using thick blood smears, membrane blood filtration, and immunochromatography. RESULTS: Of the 244 immigrants evaluated, 1 (0.4%) tested positive for W. bancrofti; 11.5% reported as having received LF treatment in Haiti. CONCLUSIONS: The re-emergence of LF in Manaus is unlikely, due to its low prevalence and low density of microfilaremia among the assessed Haitian immigrants.
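The prevalence figure quoted in the RESULTS section follows directly from the counts; as a quick illustrative check:

```python
# Verify the quoted prevalence: 1 positive out of 244 assessed immigrants.

positives, sample_size = 1, 244
prevalence_pct = 100.0 * positives / sample_size
print(f"prevalence: {prevalence_pct:.1f}%")  # rounds to the 0.4% quoted above
```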

  18. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  19. High spatial resolution measurements of large-scale three-dimensional structures in a turbulent boundary layer

    Science.gov (United States)

    Atkinson, Callum; Buchmann, Nicolas; Kuehn, Matthias; Soria, Julio

    2011-11-01

    Large-scale three-dimensional (3D) structures in a turbulent boundary layer at Reθ = 2000 are examined via the streamwise extrapolation of time-resolved stereo particle image velocimetry (SPIV) measurements in a wall-normal spanwise plane using Taylor's hypothesis. Two overlapping SPIV systems are used to provide a field of view similar to that of direct numerical simulations (DNS) on the order of 50δ × 1.5δ × 3.0δ in the streamwise, wall-normal and spanwise directions, respectively, with an interrogation window size of 40+ × 20+ × 60+ wall units. Velocity power spectra are compared with DNS to examine the effective resolution of these measurements and two-point correlations are performed to investigate the integral length scales associated with coherent velocity and vorticity fluctuations. Individual coherent structures are detected to provide statistics on the 3D size, spacing, and angular orientation of large-scale structures, as well as their contribution to the total turbulent kinetic energy and Reynolds shear stress. The support of the ARC through Discovery (and LIEF) grants is gratefully acknowledged.
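Taylor's hypothesis, as used above, maps a time-resolved record in a fixed measurement plane onto streamwise positions by treating the turbulence as "frozen" while it convects past the plane. A minimal sketch, with illustrative values for the convection velocity and sampling rate (not taken from the abstract):

```python
import numpy as np

# Taylor's (frozen-turbulence) hypothesis: snapshots taken at times t in a fixed
# crossflow plane are assigned streamwise positions x = -U_c * t, where U_c is
# an assumed convection velocity. All numbers below are illustrative.

U_c = 10.0          # convection velocity (m/s), assumed
dt = 1e-3           # sampling interval (s), assumed
n_snapshots = 100

t = np.arange(n_snapshots) * dt
x = -U_c * t        # streamwise coordinate assigned to each snapshot

# streamwise extent of the pseudo-3D volume reconstructed from the record
extent = U_c * dt * (n_snapshots - 1)
print(f"streamwise extent: {extent:.3f} m")
```

This is how a time series in one plane becomes a long streamwise field of view (tens of boundary-layer thicknesses in the study above); the accuracy depends on how well a single convection velocity describes the structures of interest.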

  20. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Owens, B. C. (Senior Member of Technical Staff, Analytical Structural Dynamics, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185, United States); Griffith, D. T. (Principal Member of Technical Staff, Wind Energy Technologies, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185, United States)

    2014-01-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs.

  1. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    Science.gov (United States)

    Owens, B. C.; Griffith, D. T.

    2014-06-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs.
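
At its core, an aeroelastic stability scoping of the kind described above reduces to an eigenvalue analysis of the linearized coupled system. The sketch below illustrates only that idea; it is not the BLAST formulation, and the mass, damping and stiffness matrices are invented placeholders.

```python
import numpy as np

# Minimal aeroelastic stability check (illustrative placeholder values,
# not the BLAST formulation): linearize the coupled structural/aerodynamic
# system as M q'' + C q' + K q = 0, recast it in first-order form
# x' = A x, and inspect the eigenvalues of A. Any eigenvalue with a
# positive real part flags an instability such as flutter.
M = np.array([[2.0, 0.0], [0.0, 1.0]])      # mass matrix
C = np.array([[0.1, 0.0], [0.0, 0.05]])     # damping (incl. aerodynamic)
K = np.array([[50.0, -2.0], [-2.0, 30.0]])  # stiffness (incl. aero coupling)

n = M.shape[0]
A = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])

eigvals = np.linalg.eigvals(A)
stable = bool(np.all(eigvals.real < 0))
```

With positive-definite mass, damping and stiffness, as here, the system is stable; an aeroelastic tool sweeps such an analysis over operating conditions where the aerodynamic terms can drive real parts positive.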

  2. Preparation of 188Re-lanreotide as a potential tumor therapeutic agent

    International Nuclear Information System (INIS)

    Bai Hongsheng; Jin Xiaohai; Fan Hongqiang; Jia Bing; Wang Yuqing; Lu Weiwei

    2001-01-01

    Radiolabeled peptides hold great potential for diagnostic applications and the therapy of malignant tumors. The somatostatin analogue peptide Lanreotide is labeled directly with 188Re via a mixture of citrate and tartrate. The influences of reaction conditions such as pH, temperature, amount of stannous chloride, Lanreotide quantity and reaction time on the labeling yield are investigated in detail. In vitro stability, quality control and animal tests are also evaluated. The experimental results show that when Lanreotide reacts with 188Re for 40 min at pH 2-3 and 60 °C, the labeling yield is in the range of 88%-94%. After purification of 188Re-Lanreotide with a Sep-Pak C18 reverse-phase extraction cartridge, the radiochemical purity (RP) is more than 95%. 188Re-Lanreotide is eliminated rapidly from the blood and is excreted through the liver; uptake in lung and intestine is high.

  3. Particulate matter from re-suspended mineral dust and emergency cause-specific respiratory hospitalizations in Hong Kong

    Science.gov (United States)

    Pun, Vivian C.; Tian, Linwei; Ho, Kin-fai

    2017-09-01

    While the contribution of non-exhaust particulate matter (PM) emissions to traffic-related emissions is increasing, little epidemiologic evidence on their health impact is available. We examined the association of short-term exposure to PM10 apportioned to re-suspended mineral dust with emergency hospitalizations for three major respiratory causes in Hong Kong between 2001 and 2008. A time-series regression model was constructed to examine the association of PM10 from re-suspended mineral dust with emergency hospitalizations for upper respiratory infection (URI), chronic obstructive pulmonary disease (COPD) and asthma at exposure lags of 0-5 days, adjusting for time trends, seasonality, temperature and relative humidity. An interquartile range (6.8 μg/m3) increment in re-suspended mineral dust on the previous day was associated with a 0.66% (95% CI: 0.12, 0.98) increase in total respiratory hospitalizations and a 1.01% (95% CI: 0.14, 1.88) increase in URI hospitalizations. Significant 0.66%-0.80% increases in the risk of COPD hospitalizations were found after exposure to re-suspended mineral dust at lag 3 or later. Exposure to mineral dust at lag 4 was linked to a 1.71% increase (95% CI: 0.14, 2.22) in asthma hospitalizations. Associations from single-pollutant models remained significant in multi-pollutant models, which additionally adjusted for PM10 from vehicle exhaust, regional combustion, residual oil, fresh sea salt, aged sea salt, secondary nitrate and secondary sulfate, or for gaseous pollutants (i.e., nitrogen dioxide, sulfur dioxide, or ozone), respectively. Our findings provide insight into the biological mechanism by which non-exhaust pollution may be associated with the risk of adverse respiratory outcomes, and stress the need for strategies to reduce the emission and re-suspension of mineral dust. More research is warranted to assess the health effects of different non-exhaust PM emissions under various roadway conditions and vehicle fleets.
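
In this kind of time-series study, the reported percentage increases per interquartile-range (IQR) increment come from a log-linear model coefficient: the excess risk is (exp(β · IQR) − 1) × 100. A small sketch of that conversion; the coefficient below is illustrative, not a value from the article:

```python
import math

def excess_risk_pct(beta, iqr, se=None, z=1.96):
    """Percent change in admissions per IQR increment of a pollutant,
    from a log-linear (e.g. quasi-Poisson) regression coefficient beta.
    If a standard error is given, also return the 95% CI bounds."""
    point = (math.exp(beta * iqr) - 1) * 100
    if se is None:
        return point
    lo = (math.exp((beta - z * se) * iqr) - 1) * 100
    hi = (math.exp((beta + z * se) * iqr) - 1) * 100
    return point, lo, hi

# Illustrative: beta = 0.00097 per ug/m3, IQR = 6.8 ug/m3 of mineral dust
risk = excess_risk_pct(0.00097, 6.8)
```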

  4. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  5. Large-scale ab initio configuration interaction calculations for light nuclei

    International Nuclear Information System (INIS)

    Maris, Pieter; Potter, Hugh; Vary, James P; Aktulga, H Metin; Ng, Esmond G; Yang Chao; Caprio, Mark A; Çatalyürek, Ümit V; Saule, Erik; Oryspayev, Dossay; Sosonkina, Masha; Zhou Zheng

    2012-01-01

    In ab initio Configuration Interaction calculations, the nuclear wavefunction is expanded in Slater determinants of single-nucleon wavefunctions, and the many-body Schrödinger equation becomes a large sparse matrix problem. The challenge is to reach numerical convergence, to within quantified numerical uncertainties, for physical observables using finite truncations of the infinite-dimensional basis space. We discuss strategies for constructing and solving the resulting large sparse matrix eigenvalue problems on current multicore computer architectures. Several of these strategies have been implemented in the code MFDn, a hybrid MPI/OpenMP Fortran code for ab initio nuclear structure calculations that can scale to 100,000 cores and more. We conclude with some recent results for 12C, including emerging collective phenomena such as rotational band structures, using SRG-evolved chiral N3LO interactions.
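
Large sparse eigenvalue problems of this kind are typically attacked with Lanczos-type iterations that only need matrix-vector products. A toy-scale sketch using SciPy; MFDn itself is a distributed Fortran code, and the random matrix below merely stands in for a CI Hamiltonian:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Stand-in for a CI Hamiltonian: a large, sparse, symmetric matrix.
n = 2000
rng = np.random.default_rng(0)
offd = sp.random(n, n, density=1e-3, random_state=0, format='csr')
H = sp.diags(rng.normal(size=n)) + offd + offd.T  # symmetric by construction

# Lanczos-type iteration for the few lowest eigenvalues
# ("SA" = smallest algebraic), i.e. ground and low-lying excited states.
vals, vecs = eigsh(H, k=5, which='SA')
```

The iterative solver never forms a dense matrix, which is what makes bases with dimensions in the billions tractable on distributed machines.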

  6. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production would imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  7. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  8. [Discussion on development of four diagnostic information scale for clinical re-evaluation of postmarketing herbs].

    Science.gov (United States)

    He, Wei; Xie, Yanming; Wang, Yongyan

    2011-12-01

    Post-marketing re-evaluation of Chinese herbs can well reflect the characteristics of Chinese medicine, yet it is the most easily overlooked part of clinical re-evaluation. Because little attention has been paid to it, the study of clinical trial design methods has been neglected, making it difficult to improve the effectiveness and safety of traditional Chinese medicine. More attention should therefore be paid to the design of clinical re-evaluation trials concerning TCM syndrome, including the type of research program design, the development of a scale for collecting four-diagnostic Chinese medical information, and statistical analysis methods, so as to improve the state of clinical trial design for the post-marketing re-evaluation of Chinese herbs.

  9. Dairy farm demographics and management factors that played a role in the re-emergence of brucellosis on dairy cattle farms in Fiji.

    Science.gov (United States)

    Tukana, Andrew; Gummow, B

    2017-08-01

    Little is published on risk factors associated with bovine brucellosis in Pacific island communities. The 2009 re-emergence of bovine brucellosis in Fiji enabled us to carry out an interview-based questionnaire survey of 81 farms in the Wainivesi locality of the Tailevu province on the main island of Fiji, to investigate what risk factors could have played a role in the re-emergence of the disease. The survey covered 68 farms that had no positive cases of bovine brucellosis and 13 farms in the same area where cattle had returned a positive result to the Brucella Rose Bengal test. Descriptive statistical methods were used to describe the demographic data, while univariate analysis and multivariate logistic regression were used to evaluate the association between the selected risk factors and the presence of brucellosis on the farms at the time of the outbreak. The demographics of Fijian dairy farms are presented in the article and the biosecurity implications of those farming systems are discussed. Two risk factors were strongly associated with farms having brucellosis: a history of cattle reacting to brucellosis and/or bovine tuberculosis tests on the farm (OR = 29, P ≤ 0.01), and the practice of sharing water sources for cattle within and with outside farms (OR = 39, P ≤ 0.01). Possible reasons why these were risk factors are also discussed. The potential risk to human health was also high, as the use of personal protective equipment was low (15%). A high proportion of farmers (62%) could not recognise brucellosis, contributing to the low frequency of disease reports made (44%). The article also highlights other important risk factors attributable to farming practices in the region, which could contribute to public health risks and the re-emergence of diseases.
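
The odds ratios quoted above come from logistic regression, but the same quantity can be illustrated with Woolf's method on a 2×2 exposure table. The counts below are invented for illustration; they are not the Fiji data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: shared water source vs. brucellosis status
or_est, ci_lo, ci_hi = odds_ratio(10, 3, 8, 60)
```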

  10. Re-Emerging Vaccine-Preventable Diseases in War-Affected Peoples of the Eastern Mediterranean Region—An Update

    Directory of Open Access Journals (Sweden)

    Rasha Raslan

    2017-10-01

    For the past few decades, the Eastern Mediterranean Region has been one area of the world profoundly shaped by war and political instability. On-going conflict and destruction have left the region struggling with innumerable health concerns that have claimed the lives of many. Wars, and the chaos they leave behind, often provide the optimal conditions for the growth and re-emergence of communicable diseases. In this article, we highlight a few of the major re-emerging vaccine preventable diseases in four countries of the Eastern Mediterranean Region that are currently affected by war leading to a migration crisis: Iraq, South Sudan, Syria, and Yemen. We will also describe the impact these infections have had on patients, societies, and national health care services. This article also describes the efforts, both local and international, which have been made to address these crises, as well as future endeavors that can be done to contain and control further devastation left by these diseases.

  11. Re-evaluation of emergency planning zone for 3 NPPS in Taiwan

    International Nuclear Information System (INIS)

    Chiou, S.-T.; Yin, H.-L.; Chen, C.-S.; Shih, C.-L.

    2004-01-01

    The emergency planning zones for the three nuclear power plants in Taiwan are re-evaluated. The analysis is performed with the CRAC2 code, and the basic approach follows the NUREG-0396 evaluation procedure. Meteorological data are provided by the Taiwan Power Company and reviewed by Taiwan University and the Central Weather Bureau. Accident source terms are provided by the Institute of Nuclear Energy Research (INER) using probabilistic risk assessment methods, with consideration of actual plant system improvements and/or modifications. The dose rate distribution and acute and latent cancer fatalities are evaluated and compared with proposed EPZ decision criteria, including protective action guide dose levels and individual and societal risk safety goals. (author)

  12. Re-passivation Potential of Alloy 22 in Chloride plus Nitrate Solutions using the Potentiodynamic-Galvanostatic-Potentiostatic Method

    International Nuclear Information System (INIS)

    Evans, Kenneth J.; Rebak, Raul B.

    2007-01-01

    In general, the susceptibility of Alloy 22 to crevice corrosion is measured using the Cyclic Potentiodynamic Polarization (CPP) technique. This is a fast technique that gives rather accurate and reproducible values of the re-passivation potential (ER1) in most cases. At the fringes of susceptibility, when the environment is not highly aggressive, the values of re-passivation potential obtained with the CPP technique may not be highly reproducible, mainly because the technique is fast. To circumvent this, the re-passivation potential of Alloy 22 was measured using a slower method that combines Potentiodynamic-Galvanostatic-Potentiostatic steps (called here the Tsujikawa-Hisamatsu Electrochemical or THE method). The THE method applies the charge to the specimen in a more controlled way, which may give more reproducible re-passivation potential values, especially when the environment is not aggressive. The values of re-passivation potential of Alloy 22 in sodium chloride plus potassium nitrate solutions were measured using both the THE and CPP methods. Results show that both methods yield similar values of re-passivation potential, especially under aggressive conditions. (authors)

  13. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  14. Small-scale fuel cell cogen: application potentials and market strategies

    International Nuclear Information System (INIS)

    Vogel, Bernd

    2000-01-01

    Small (less than 5 kW) fuel-cell cogeneration systems are now being developed for use in residential buildings. The devices are expected to be on the market in five years. The article discusses the potential for their large-scale introduction, the impact of this new technology on the natural gas business, potential applications and marketing strategies

  15. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from economies of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO2 emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m2 of collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m2. Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants for housing developments in six European countries show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)
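
Solar heat cost figures of the kind quoted above follow from annualizing the specific investment over the yearly collector yield. A minimal sketch using a standard annuity factor; the interest rate and lifetime below are assumptions for illustration, not values from the paper:

```python
def solar_heat_cost(invest_per_m2, yield_kwh_per_m2,
                    interest=0.05, lifetime_yr=20):
    """Levelized solar heat cost (currency units per kWh):
    specific investment annualized with the annuity factor,
    divided by the yearly collector yield."""
    annuity = interest / (1 - (1 + interest) ** -lifetime_yr)
    return invest_per_m2 * annuity / yield_kwh_per_m2

# e.g. 450 ECU/m2 invested, 450 kWh/m2 yearly solar gain
cost = solar_heat_cost(450, 450)
```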

  16. Prospective validation of a predictive model that identifies homeless people at risk of re-presentation to the emergency department.

    Science.gov (United States)

    Moore, Gaye; Hepworth, Graham; Weiland, Tracey; Manias, Elizabeth; Gerdtz, Marie Frances; Kelaher, Margaret; Dunt, David

    2012-02-01

    To prospectively evaluate the accuracy of a predictive model to identify homeless people at risk of re-presentation to an emergency department. A prospective cohort analysis utilised one month of data from a Principal Referral Hospital in Melbourne, Australia. All visits involving people classified as homeless were included, excluding those who died. Homelessness was defined as living on the streets, in crisis accommodation, in boarding houses or residing in unstable housing. Rates of re-presentation, defined as the total number of visits to the same emergency department within 28 days of discharge from hospital, were measured. Performance of the risk screening tool was assessed by calculating sensitivity, specificity, positive and negative predictive values and likelihood ratios. Over the study period (April 1, 2009 to April 30, 2009), 3298 presentations from 2888 individuals were recorded. The homeless population accounted for 10% (n=327) of all visits and 7% (n=211) of all patients. A total of 90 (43%) homeless people re-presented to the emergency department. The predictive model included nine variables and achieved 98% (CI, 0.92-0.99) sensitivity and 66% (CI, 0.57-0.74) specificity. The positive predictive value was 68% and the negative predictive value was 98%. The positive likelihood ratio was 2.9 (CI, 2.2-3.7) and the negative likelihood ratio was 0.03 (CI, 0.01-0.13). The high emergency department re-presentation rate for people who were homeless identifies unresolved psychosocial health needs. The emergency department remains a vital access point for homeless people, particularly after hours. The risk screening tool is key to identifying the medical and social aspects of a homeless patient's presentation, to assist early identification and referral. Copyright © 2012 College of Emergency Nursing Australasia Ltd. Published by Elsevier Ltd. All rights reserved.
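
The performance figures reported (sensitivity, specificity, predictive values, likelihood ratios) all derive from the screening tool's 2×2 confusion counts via standard definitions. A sketch of those definitions; the counts below are invented, not the study's:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from confusion counts:
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_pos": sens / (1 - spec),   # how much a positive result raises the odds
        "lr_neg": (1 - sens) / spec,   # how much a negative result lowers the odds
    }

m = screening_metrics(tp=90, fp=30, fn=10, tn=70)
```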

  17. 188Re-microspheres of albumin - a potential preparation for radiotherapy

    International Nuclear Information System (INIS)

    Dyomin, D.N.; Petriev, V.M.

    2000-01-01

    In this paper the authors describe the preparation of albumin microspheres labelled with rhenium-188. We undertook to develop kits, based on albumin microspheres, for the rhenium-188 generator, for radiotherapy of both oncological and non-oncological diseases. 188Re microspheres with sizes of 10-20 microns are intended for the treatment of rheumatoid arthritis (damage of large and intermediate joints) and for intraperitoneal and intrapleural administration for metastases covering a cavity. 188Re microspheres with sizes of 40-60 microns are intended for the treatment of disseminated kidney cancer (intraarterial, selective) and for intratumoral administration to damaged nodules of less than 2-3 cm. 188Re microspheres with sizes of 80-100 microns are intended for large neoplasms and liver metastases (intraarterial, selective) and for intratumoral administration to damaged nodules with sizes over 3 cm. The albumin microspheres are prepared by thermal denaturation of protein in vegetable oil, and microspheres in the required size range are obtained by ultrasonic fractionation. At our laboratory, a method has been developed for preparing albumin microspheres with any particle size from 5-10 up to 800-1000 microns. (authors)

  18. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    SDG agenda. Based on this, we argue that the development of policies for regulating externalities of large-scale bioenergy production should rely on broad sustainability assessments to discover potential trade-offs with the SDG agenda before implementation.

  19. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP

    Science.gov (United States)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-01

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. A gridded large-scale forcing data during the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.
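
The benefit of gridded forcing follows from the nonlinearity of moist processes: forcing a single column with the domain mean is not the same as forcing each subcolumn and averaging the results. A toy illustration of that asymmetry; the threshold model below is invented, not SCAM5 physics:

```python
import numpy as np

def toy_column_precip(forcing):
    """Toy column response: convection triggers only above a threshold."""
    return np.maximum(forcing - 1.0, 0.0)

# Gridded large-scale forcing across four subcolumns, e.g. a front
# occupying only part of the domain (units arbitrary).
forcing = np.array([0.2, 0.4, 3.0, 0.1])

precip_mean_forcing = toy_column_precip(forcing.mean())  # domain mean first: front erased
precip_subcolumns = toy_column_precip(forcing).mean()    # per subcolumn, then average
```

Averaging the forcing first misses the convection entirely, whereas running each subcolumn and averaging the output retains it, which mirrors the improvement the study reports for frontal cases.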

  20. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g. in number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  1. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-making).

  2. Evaluation of the small-scale hydro-energetic potential in micro-basins in Colombia

    International Nuclear Information System (INIS)

    Torres Q, E.; Castillo C, J.J.

    1995-01-01

    A definition of small-scale hydroelectric power plants (PCHs, for their Spanish abbreviation) is presented, together with their classification by power and head and by form of utilization. The general design parameters for PCHs are described, covering topographic, geological, geotechnical and hydrological studies. The primary elements of a PCH are shown: dam (small dam), conduction, forebay tank, sand trap (desander), gate, trash rack, penstock, head, main valve and turbine. In the study of the potential of micro-basins, general aspects such as topography, drainage, population, supply and demand of electric energy, morphology, hydrology, geology and hydraulic potential are considered.
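
The hydraulic potential of a micro-basin site comes down to the standard relation P = ρ g Q H η between flow, head and overall efficiency. A minimal sketch; the 70% default efficiency is an assumption typical of small plants, not a figure from the article:

```python
def hydro_power_kw(flow_m3s, head_m, efficiency=0.7,
                   rho=1000.0, g=9.81):
    """Hydraulic power of a small hydro site in kW:
    P = rho * g * Q * H * eta, with eta lumping turbine,
    generator and conveyance losses."""
    return rho * g * flow_m3s * head_m * efficiency / 1000.0

# e.g. 0.5 m3/s over a 20 m head at 70% overall efficiency
p_kw = hydro_power_kw(0.5, 20.0)
```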

  3. Onboard autonomous mission re-planning for multi-satellite system

    Science.gov (United States)

    Zheng, Zixuan; Guo, Jian; Gill, Eberhard

    2018-04-01

    This paper presents an onboard autonomous mission re-planning system for a Multi-Satellite System (MSS) to perform onboard re-planning in disruptive situations. The proposed re-planning system can deal with different potential emergency situations. This paper uses a Multi-Objective Hybrid Dynamic Mutation Genetic Algorithm (MO-HDM GA) combined with re-planning techniques as the core algorithm. The Cyclically Re-planning Method (CRM) and the Near Real-time Re-planning Method (NRRM) are developed to meet different mission requirements. Simulation results show that both methods can provide feasible re-planning sequences under unforeseen situations. The comparisons illustrate that the CRM is on average 20% faster than the NRRM in computation time. However, using the NRRM, more raw data can be observed and transmitted than with the CRM within the same period. The usability of this onboard re-planning system is not limited to multi-satellite systems; it is also applicable to other mission planning and re-planning problems involving autonomous multiple vehicles with similar demands.

  4. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  5. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner, along with the other major thrusts supported by funding agencies, in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  6. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  7. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  8. Emergence of Scale-Free Syntax Networks

    Science.gov (United States)

    Corominas-Murtra, Bernat; Valverde, Sergi; Solé, Ricard V.

    The evolution of human language allowed the efficient propagation of nongenetic information, thus creating a new form of evolutionary change. Language development in children offers the opportunity of exploring the emergence of such a complex communication system and provides a window to understanding the transition from protolanguage to language. Here we present the first analysis of the emergence of syntax in terms of complex networks. A previously unreported, sharp transition is shown to occur around two years of age, from a (pre-syntactic) tree-like structure to a scale-free, small-world syntax network. The observed combinatorial patterns provide valuable data for understanding the nature of the cognitive processes involved in the acquisition of syntax, introducing a new ingredient for understanding the possible biological endowment of human beings that results in the emergence of complex language. We explore this problem by using a minimal, data-driven model that is able to capture several statistical traits, although some key features related to the emergence of syntactic complexity display important divergences.

  9. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. Particularly, we advocate that the scale of image retrieval systems should be significantly increased, to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  10. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 based on the special account act for power source development promotion measures, under entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus to examine the overall system effect, and one using a plate core testing apparatus to test individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  11. Biochemical analysis of force-sensitive responses using a large-scale cell stretch device.

    Science.gov (United States)

    Renner, Derrick J; Ewald, Makena L; Kim, Timothy; Yamada, Soichiro

    2017-09-03

    Physical force has emerged as a key regulator of tissue homeostasis, and plays an important role in embryogenesis, tissue regeneration, and disease progression. Currently, the details of protein interactions under elevated physical stress are largely missing, therefore, preventing the fundamental, molecular understanding of mechano-transduction. This is in part due to the difficulty isolating large quantities of cell lysates exposed to force-bearing conditions for biochemical analysis. We designed a simple, easy-to-fabricate, large-scale cell stretch device for the analysis of force-sensitive cell responses. Using proximal biotinylation (BioID) analysis or phospho-specific antibodies, we detected force-sensitive biochemical changes in cells exposed to prolonged cyclic substrate stretch. For example, using promiscuous biotin ligase BirA* tagged α-catenin, the biotinylation of myosin IIA increased with stretch, suggesting the close proximity of myosin IIA to α-catenin under a force bearing condition. Furthermore, using phospho-specific antibodies, Akt phosphorylation was reduced upon stretch while Src phosphorylation was unchanged. Interestingly, phosphorylation of GSK3β, a downstream effector of Akt pathway, was also reduced with stretch, while the phosphorylation of other Akt effectors was unchanged. These data suggest that the Akt-GSK3β pathway is force-sensitive. This simple cell stretch device enables biochemical analysis of force-sensitive responses and has potential to uncover molecules underlying mechano-transduction.

  12. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load-balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio of time to solution of MN to PN decreases.
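
The record above concerns entropy-based (MN) closures, which at each cell solve a convex dual optimization for the Lagrange multipliers of a maximum-entropy ansatz; this is the per-cell cost the paper's GPU optimizations target. Below is a minimal, illustrative sketch of that dual Newton solve for the simplest 1D case (M1, slab geometry); the function name, quadrature order, and initial guess are assumptions, not the paper's implementation.

```python
import numpy as np

def m1_closure(rho, g, tol=1e-10, max_iter=50):
    """Solve the M1 entropy-closure dual problem for moments u = (rho, g).

    Ansatz: f(mu) = exp(alpha . m(mu)) with m(mu) = (1, mu), mu in [-1, 1].
    Newton iteration drives the moments of the ansatz to the targets.
    """
    mu, w = np.polynomial.legendre.leggauss(40)  # quadrature on [-1, 1]
    m = np.vstack([np.ones_like(mu), mu])        # moment basis rows (1, mu)
    u = np.array([rho, g])
    alpha = np.array([np.log(rho / 2.0), 0.0])   # isotropic initial guess
    for _ in range(max_iter):
        f = np.exp(m.T @ alpha)                  # ansatz evaluated at nodes
        grad = m @ (w * f) - u                   # moment residual
        if np.linalg.norm(grad) < tol:
            break
        hess = (m * (w * f)) @ m.T               # <m m^T f>, symmetric PD
        alpha -= np.linalg.solve(hess, grad)
    return alpha, m @ (w * np.exp(m.T @ alpha))
```

The convexity of the dual is what makes the Newton iteration robust here, but as the abstract notes, doing this in every cell is far more expensive than evaluating a fixed PN expansion.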

  13. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10⁵ and 10⁸ and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  14. Large-Scale Battery System Development and User-Specific Driving Behavior Analysis for Emerging Electric-Drive Vehicles

    Directory of Open Access Journals (Sweden)

    Yihe Sun

    2011-04-01

    Full Text Available Emerging green-energy transportation, such as hybrid electric vehicles (HEVs and plug-in HEVs (PHEVs, has a great potential for reduction of fuel consumption and greenhouse emissions. The lithium-ion battery system used in these vehicles, however, is bulky, expensive and unreliable, and has been the primary roadblock for transportation electrification. Meanwhile, few studies have considered user-specific driving behavior and its significant impact on (P)HEV fuel efficiency, battery system lifetime, and the environment. This paper presents a detailed investigation of battery system modeling and real-world user-specific driving behavior analysis for emerging electric-drive vehicles. The proposed model is fast to compute and accurate for analyzing battery system run-time and long-term cycle life, with a focus on temperature-dependent battery system capacity fading and variation. The proposed solution is validated against physical measurement using real-world user driving studies, and has been adopted to facilitate battery system design and optimization. Using the collected real-world hybrid vehicle and run-time driving data, we have also conducted detailed analytical studies of users’ specific driving patterns and their impacts on hybrid vehicle electric energy and fuel efficiency. This work provides a solid foundation for future energy control with emerging electric-drive applications.
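
The record above models temperature-dependent capacity fading. The paper's own model is not reproduced here; a common semi-empirical form in the literature combines an Arrhenius temperature factor with a power law in charge throughput. The sketch below uses that generic form with purely illustrative parameter values (`b`, `ea`, `z` are assumptions):

```python
import math

def capacity_fade_pct(ah_throughput, temp_k, b=30330.0, ea=31500.0, z=0.552):
    """Illustrative semi-empirical capacity fade (percent).

    Arrhenius factor captures temperature dependence; the power law in
    Ah throughput captures cycling-driven degradation. Parameters are
    placeholders, not fitted values from the record above.
    """
    R = 8.314  # gas constant, J/(mol K)
    return b * math.exp(-ea / (R * temp_k)) * ah_throughput ** z
```

With this form, fade grows with both accumulated throughput and operating temperature, which is the qualitative behavior the abstract's user-specific driving analysis exploits: aggressive driving patterns increase throughput and cell temperature, accelerating capacity loss.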

  15. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted to the visual domain an auditory paradigm developed by Sussman, Ritter, and Vaughan (1998, Predictability of stimulus deviance and the mismatch negativity, NeuroReport, 9, 4167-4170) and Sussman and Gumenyuk (2005, Organization of sequential sounds in auditory memory, NeuroReport, 16, 1519-1523), presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). A comparison of the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.
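
The two conditions in the record above (fixed SSSSD cycles versus randomized placement of the same 20% deviant rate) can be generated with a short helper; the function name and signature are illustrative, not from the study's materials:

```python
import random

def make_sequence(n_trials, mode, seed=0):
    """Generate a standard/deviant stimulus sequence.

    "fixed": a deviant (D) at every 5th position, SSSSD SSSSD ...
    "randomized": the same 20% deviant rate with shuffled positions.
    """
    if mode == "fixed":
        return ["D" if (i + 1) % 5 == 0 else "S" for i in range(n_trials)]
    seq = ["D"] * (n_trials // 5) + ["S"] * (n_trials - n_trials // 5)
    random.Random(seed).shuffle(seq)
    return seq
```

Both sequences have identical deviant probability, so any difference in the evoked visual MMN must come from the sequential regularity itself, which is the logic of the paradigm.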

  16. Sensitivity of local air quality to the interplay between small- and large-scale circulations: a large-eddy simulation study

    Science.gov (United States)

    Wolf-Grosse, Tobias; Esau, Igor; Reuder, Joachim

    2017-06-01

    Street-level urban air pollution is a challenging concern for modern urban societies. Pollution dispersion models assume that the concentrations decrease monotonically with rising wind speed. This convenient assumption breaks down when applied to flows with local recirculations such as those found in topographically complex coastal areas. This study looks at a practically important and sufficiently common case of air pollution in a coastal valley city. Here, the observed concentrations are determined by the interaction between large-scale topographically forced and local-scale breeze-like recirculations. Analysis of a long observational dataset in Bergen, Norway, revealed that the most extreme cases of recurring wintertime air pollution episodes were accompanied by increased large-scale wind speeds above the valley. Contrary to the theoretical assumption and intuitive expectations, the maximum NO2 concentrations were not found for the lowest 10 m ERA-Interim wind speeds but in situations with wind speeds of 3 m s−1. To explain this phenomenon, we investigated empirical relationships between the large-scale forcing and the local wind and air quality parameters. We conducted 16 large-eddy simulation (LES) experiments with the Parallelised Large-Eddy Simulation Model (PALM) for atmospheric and oceanic flows. The LES accounted for the realistic relief and coastal configuration as well as for the large-scale forcing and local surface condition heterogeneity in Bergen. They revealed that emerging local breeze-like circulations strongly enhance the urban ventilation and dispersion of the air pollutants in situations with weak large-scale winds. Slightly stronger large-scale winds, however, can counteract these local recirculations, leading to enhanced surface air stagnation. Furthermore, this study looks at the concrete impact of the relative configuration of warmer water bodies in the city and the major transport corridor. We found that a relatively small local water

  17. Sensitivity of local air quality to the interplay between small- and large-scale circulations: a large-eddy simulation study

    Directory of Open Access Journals (Sweden)

    T. Wolf-Grosse

    2017-06-01

    Full Text Available Street-level urban air pollution is a challenging concern for modern urban societies. Pollution dispersion models assume that the concentrations decrease monotonically with rising wind speed. This convenient assumption breaks down when applied to flows with local recirculations such as those found in topographically complex coastal areas. This study looks at a practically important and sufficiently common case of air pollution in a coastal valley city. Here, the observed concentrations are determined by the interaction between large-scale topographically forced and local-scale breeze-like recirculations. Analysis of a long observational dataset in Bergen, Norway, revealed that the most extreme cases of recurring wintertime air pollution episodes were accompanied by increased large-scale wind speeds above the valley. Contrary to the theoretical assumption and intuitive expectations, the maximum NO2 concentrations were not found for the lowest 10 m ERA-Interim wind speeds but in situations with wind speeds of 3 m s−1. To explain this phenomenon, we investigated empirical relationships between the large-scale forcing and the local wind and air quality parameters. We conducted 16 large-eddy simulation (LES) experiments with the Parallelised Large-Eddy Simulation Model (PALM) for atmospheric and oceanic flows. The LES accounted for the realistic relief and coastal configuration as well as for the large-scale forcing and local surface condition heterogeneity in Bergen. They revealed that emerging local breeze-like circulations strongly enhance the urban ventilation and dispersion of the air pollutants in situations with weak large-scale winds. Slightly stronger large-scale winds, however, can counteract these local recirculations, leading to enhanced surface air stagnation. Furthermore, this study looks at the concrete impact of the relative configuration of warmer water bodies in the city and the major transport corridor. We found that a

  18. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large-scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. For number- and spin-projected two-quasiparticle excitations with realistic forces, this model yields results in sd-shell nuclei of similar quality to the 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus ⁴⁶Ti and for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean-field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to ¹³⁰Ce and ¹²⁸Ba using the same effective nucleon-nucleon interaction. (Auth.)

  19. Importance of regional species pools and functional traits in colonization processes: predicting re-colonization after large-scale destruction of ecosystems

    NARCIS (Netherlands)

    Kirmer, A.; Tischew, S.; Ozinga, W.A.; Lampe, von M.; Baasch, A.; Groenendael, van J.M.

    2008-01-01

    Large-scale destruction of ecosystems caused by surface mining provides an opportunity for the study of colonization processes starting with primary succession. Surprisingly, over several decades and without any restoration measures, most of these sites spontaneously developed into valuable biotope

  20. The effective potential in the presence of several mass scales

    International Nuclear Information System (INIS)

    Casas, J.A.; Di Clemente, V.; Quiros, M.

    1999-01-01

    We consider the problem of improving the effective potential in mass-independent schemes, as e.g. the MS-bar or DR-bar renormalization scheme, in the presence of an arbitrary number of fields with φ-dependent masses M_i(φ_c). We use the decoupling theorem at the scales μ_i ∼ M_i(φ_c) such that the matching between the effective (low-energy) and complete (high-energy) one-loop theories contains no thresholds. We find that for any value of φ_c there is a convenient scale μ* ≡ min_i M_i(φ_c), at which the loop expansion has the best behaviour and the effective potential has the least μ-dependence. Furthermore, at this scale the effective potential coincides with the (improved) tree-level one in the effective field theory. The decoupling method is explicitly illustrated with a simple Higgs-Yukawa model, along with its relationship with other decoupling prescriptions and with proposed multi-scale renormalization approaches. The procedure leads to a nice suppression of potentially large logarithms and can be easily adapted to include higher-loop effects, which is explicitly shown at the two-loop level

  1. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  2. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Generated by landscape discontinuities (e.g., sea breezes), mesoscale circulation processes are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large scale, mesoscale, and turbulent scale, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit of mass is used, defined as Ẽ = 0.5⟨u′_i u′_i⟩, where u′_i represents the three Cartesian components of a mesoscale circulation, the angle brackets denote the grid-scale, horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for Ẽ, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of Ẽ. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes.
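
The abstract above defines mean mesoscale kinetic energy as half the grid-box-averaged variance of the mesoscale velocity components. A minimal sketch of that diagnostic on gridded fields follows; the function name and the use of a simple box mean as the large-scale averaging operator are assumptions for illustration:

```python
import numpy as np

def mesoscale_ke(u, v, w):
    """Mean mesoscale kinetic energy per unit mass over one grid box.

    Primes are deviations of each velocity component from its grid-box
    mean (a stand-in for the large-scale averaging operator), and the
    result is 0.5 * <u'_i u'_i>.
    """
    up, vp, wp = (x - x.mean() for x in (u, v, w))
    return 0.5 * np.mean(up**2 + vp**2 + wp**2)
```

A uniform flow has zero MKE (no sub-grid circulation), while an alternating breeze-like pattern contributes its full perturbation variance, which is the quantity the prognostic Ẽ equation evolves.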

  3. Predicting the effect of fire on large-scale vegetation patterns in North America.

    Science.gov (United States)

    Donald McKenzie; David L. Peterson; Ernesto. Alvarado

    1996-01-01

    Changes in fire regimes are expected across North America in response to anticipated global climatic changes. Potential changes in large-scale vegetation patterns are predicted as a result of altered fire frequencies. A new vegetation classification was developed by condensing Kuchler potential natural vegetation types into aggregated types that are relatively...

  4. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing the large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  5. A large scale double beta and dark matter experiment: On the physics potential of GENIUS

    International Nuclear Information System (INIS)

    Klapdor-Kleingrothaus, H.V.; Hirsch, M.

    1997-01-01

    The physics potential of GENIUS, a recently proposed double beta decay and dark matter experiment, is discussed. The experiment will allow to probe neutrino masses down to 10⁻²-10⁻³ eV. GENIUS will test the structure of the neutrino mass matrix, and therefore implicitly neutrino oscillation parameters, with sensitivity comparable or superior to the best proposed dedicated terrestrial neutrino oscillation experiments. If the 10⁻³ eV level is reached, GENIUS will even allow to test the large-angle MSW solution of the solar neutrino problem. Even in its first stage GENIUS will confirm or rule out degenerate or inverted neutrino mass scenarios, which have been widely discussed in the literature as a possible solution to current hints of finite neutrino masses, and also test the ν_e-ν_μ hypothesis of the atmospheric neutrino problem. GENIUS would contribute to the search for R-parity violating SUSY and right-handed W bosons on a scale similar or superior to the LHC. In addition, GENIUS would largely improve the current 0νββ decay searches for R-parity conserving SUSY and leptoquarks. Concerning cold dark matter (CDM) search, the low background anticipated for GENIUS would, for the first time ever, allow to cover the complete MSSM neutralino parameter space, making GENIUS competitive with the LHC in SUSY discovery. If GENIUS could find SUSY CDM as a by-product, it would confirm that R-parity must be conserved exactly. GENIUS will thus be a major tool for future non-accelerator particle physics. (orig.)

  6. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; et al.

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  7. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  8. Large-Scale Urban Decontamination; Developments, Historical Examples and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Rick Demmer

    2007-02-01

    Recent terrorist threats and actual events have led to a renewed interest in the technical field of large-scale, urban-environment decontamination. One of the driving forces for this interest is the real potential for the cleanup and removal of radioactive dispersal device (RDD or “dirty bomb”) residues. In response the U.S. Government has spent many millions of dollars investigating RDD contamination and novel decontamination methodologies. Interest in chemical and biological (CB) cleanup has also peaked with the threat of terrorist action like the anthrax attack at the Hart Senate Office Building and with catastrophic natural events such as Hurricane Katrina. The efficiency of cleanup response will be improved with these new developments and a better understanding of the “old reliable” methodologies. Perhaps the most interesting area of investigation for large-area decontamination is that of the RDD. While primarily an economic and psychological weapon, the need to clean up and return valuable or culturally significant resources to the public is nonetheless valid. Several private companies, universities and National Laboratories are currently developing novel RDD cleanup technologies. Because of their longstanding association with radioactive facilities, the U.S. Department of Energy National Laboratories are at the forefront in developing and testing new RDD decontamination methods. However, such cleanup technologies are likely to be fairly task-specific, as many different contamination mechanisms, substrates and environmental conditions will make actual application more complicated. Some major efforts have also been made to model potential contamination, to evaluate both old and new decontamination techniques and to assess their readiness for use. Non-radioactive, CB threats each have unique decontamination challenges and recent events have provided some examples. The U.S. Environmental Protection Agency (EPA), as lead agency for these emergency

  9. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi-site data by using within-year recaptures to provide a covariate of between-year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper, by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004), illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands, where a re-introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap-happiness” effects. As the data are based on resightings, such trap-happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  10. Virtual neutron scattering experiments - Training and preparing students for large-scale facility experiments

    Directory of Open Access Journals (Sweden)

    Julie Hougaard Overgaard

    2016-11-01

    Full Text Available We describe how virtual experiments can be utilized in a learning design that prepares students for hands-on experiments at large-scale facilities. We illustrate the design by showing how virtual experiments are used at the Niels Bohr Institute in a master-level course on neutron scattering. In the last week of the course, students travel to a large-scale neutron scattering facility to perform real neutron scattering experiments. Through student interviews and survey answers, we argue that the virtual training prepares the students to engage more fruitfully with experiments by letting them focus on physics and data rather than the overwhelming instrumentation. We argue that this is because they can transfer their virtual experimental experience to the real-life situation. However, we also find that learning is still situated in the sense that only knowledge of particular experiments is transferred. We proceed to

  11. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, and provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
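
The scale-by-scale correlation idea described above can be sketched with a toy multiresolution decomposition. The Haar-style block averaging below, the synthetic "SLP"/"streamflow" series and all parameter choices are our own illustration, not the authors' method or data:

```python
import numpy as np

def haar_mra(x, levels):
    """Additive Haar-style multiresolution: per-scale detail components
    plus a coarse residual that sum back to the original signal."""
    details, approx = [], np.asarray(x, dtype=float)
    for k in range(1, levels + 1):
        w = 2 ** k                              # block size for this scale
        n = len(approx) // w * w
        smooth = approx.copy()
        blocks = approx[:n].reshape(-1, w).mean(axis=1)
        smooth[:n] = np.repeat(blocks, w)       # block average, upsampled
        details.append(approx - smooth)         # what the averaging removed
        approx = smooth
    return details, approx

# scale-dependent correlation between a large-scale predictor (toy "SLP")
# and a local predictand (toy "streamflow") sharing a ~64-step oscillation
rng = np.random.default_rng(0)
t = np.arange(512)
slp = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(512)
flow = np.sin(2 * np.pi * t / 64 + 0.3) + 0.5 * rng.standard_normal(512)

d_slp, a_slp = haar_mra(slp, 5)
d_flow, a_flow = haar_mra(flow, 5)
for lev, (ds, df) in enumerate(zip(d_slp, d_flow), start=1):
    print(f"level {lev} (~{2 ** lev}-step scale): r = "
          f"{np.corrcoef(ds, df)[0, 1]:+.2f}")
```

In a real ESD application each detail level of the predictor would feed its own regression model, and the per-level predictions would be summed to reconstruct the predictand.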

  12. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  13. Planck 2013 results. XVII. Gravitational lensing by large-scale structure

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Basak, S.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Dechelette, T.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Ho, S.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lavabre, A.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Lewis, A.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Mendes, L.; Mennella, 
A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Pullen, A.R.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    On the arcminute angular scales probed by Planck, the CMB anisotropies are gently perturbed by gravitational lensing. Here we present a detailed study of this effect, detecting lensing independently in the 100, 143, and 217 GHz frequency bands with an overall significance of greater than 25σ. We use the temperature-gradient correlations induced by lensing to reconstruct a (noisy) map of the CMB lensing potential, which provides an integrated measure of the mass distribution back to the CMB last-scattering surface. Our lensing potential map is significantly correlated with other tracers of mass, a fact which we demonstrate using several representative tracers of large-scale structure. We estimate the power spectrum of the lensing potential, finding generally good agreement with expectations from the best-fitting LCDM model for the Planck temperature power spectrum, showing that this measurement at z=1100 correctly predicts the properties of the lower-redshift, later-time structures which source the lensing ...

  14. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
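
The modulating influence described above can be illustrated with a toy decomposition. The synthetic velocity signal, the moving-average filter width and the envelope definition below are our own assumptions, not the authors' hot-wire data or processing:

```python
import numpy as np

def moving_average(x, w):
    """Centered moving average via convolution: a crude low-pass filter."""
    return np.convolve(x, np.ones(w) / w, mode="same")

# synthetic fluctuating velocity: a large-scale wave whose local phase
# modulates the intensity of superposed small-scale noise
rng = np.random.default_rng(1)
n = 4096
t = np.arange(n)
large = np.sin(2 * np.pi * t / 512)                   # large-scale motion
small = (1.0 + 0.5 * large) * rng.standard_normal(n)  # modulated small scales
u = large + 0.3 * small

# scale decomposition: the low-pass gives large scales, the residual small
u_L = moving_average(u, 129)
u_S = u - u_L

# small-scale envelope: low-passed |u_S| tracks local small-scale intensity
env = moving_average(np.abs(u_S), 129)

# amplitude-modulation coefficient: large scales vs. small-scale envelope
R_AM = np.corrcoef(u_L, env)[0, 1]
print(f"AM correlation coefficient: {R_AM:.2f}")
```

A clearly positive coefficient indicates that small-scale activity is amplified where the large-scale signal is high, i.e. amplitude modulation.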

  15. Human anthrax as a re-emerging disease.

    Science.gov (United States)

    Doganay, Mehmet; Demiraslan, Hayati

    2015-01-01

    Anthrax is primarily a disease of herbivores; the etiological agent, Bacillus anthracis, is a gram-positive, aerobic, spore-forming, rod-shaped bacterium. B. anthracis spores are highly resistant to heat, pressure, ultraviolet and ionizing radiation, chemical agents and disinfectants. For these reasons, B. anthracis spores are an attractive choice as biological agents for bioweapons and/or bioterrorism. Soil is the main reservoir of the infectious agent. The disease most commonly affects wild and domestic mammals. Humans are secondarily infected by contact with infected animals and contaminated animal products, or by direct exposure to B. anthracis spores. Anthrax occurs worldwide. The infection is still endemic or hyperendemic in both animals and humans in some areas of the world, particularly the Middle East, West Africa, Central Asia, parts of India and South America. However, some countries claim to be free of anthrax, and with the intentional outbreak anthrax has become a re-emerging disease in western countries. Currently, anthrax is classified according to its setting as (1) naturally occurring anthrax or (2) bioterrorism-related anthrax. The vast majority of human anthrax cases worldwide are naturally occurring. It also remains a threat for western countries. The aim of this paper is to review the relevant patents, a short historical perspective, microbiological and epidemiological features, clinical presentations and treatment.

  16. Tularemia in Germany—A Re-emerging Zoonosis

    Directory of Open Access Journals (Sweden)

    Mirko Faber

    2018-02-01

    Full Text Available Tularemia, also known as “rabbit fever,” is a zoonosis caused by the facultative intracellular, gram-negative bacterium Francisella tularensis. Infection occurs through contact with infected animals (often hares), arthropod vectors (such as ticks or deer flies), inhalation of contaminated dust or through contaminated food and water. In this review, we would like to provide an overview of the current epidemiological situation in Germany using published studies and case reports, an analysis of recent surveillance data and our own experience from the laboratory diagnostics, and investigation of cases. While in Germany tularemia is a rarely reported disease, there is evidence of recent re-emergence. We also describe some peculiarities that were observed in Germany, such as a broad genetic diversity, and a recently discovered new genus of Francisella and protracted or severe clinical courses of infections with the subspecies holarctica. Because tularemia is a zoonosis, we also touch upon the situation in the animal reservoir and one-health aspects of this disease. Apparently, many pieces of the puzzle need to be found and put into place before the complex interaction between wildlife, the environment and humans are fully understood. Funding for investigations into rare diseases is scarce. Therefore, combining efforts in several countries in the framework of international projects may be necessary to advance further our understanding of this serious but also scientifically interesting disease.

  17. Tularemia in Germany—A Re-emerging Zoonosis

    Science.gov (United States)

    Faber, Mirko; Heuner, Klaus; Jacob, Daniela; Grunow, Roland

    2018-01-01

    Tularemia, also known as “rabbit fever,” is a zoonosis caused by the facultative intracellular, gram-negative bacterium Francisella tularensis. Infection occurs through contact with infected animals (often hares), arthropod vectors (such as ticks or deer flies), inhalation of contaminated dust or through contaminated food and water. In this review, we would like to provide an overview of the current epidemiological situation in Germany using published studies and case reports, an analysis of recent surveillance data and our own experience from the laboratory diagnostics, and investigation of cases. While in Germany tularemia is a rarely reported disease, there is evidence of recent re-emergence. We also describe some peculiarities that were observed in Germany, such as a broad genetic diversity, and a recently discovered new genus of Francisella and protracted or severe clinical courses of infections with the subspecies holarctica. Because tularemia is a zoonosis, we also touch upon the situation in the animal reservoir and one-health aspects of this disease. Apparently, many pieces of the puzzle need to be found and put into place before the complex interaction between wildlife, the environment and humans are fully understood. Funding for investigations into rare diseases is scarce. Therefore, combining efforts in several countries in the framework of international projects may be necessary to advance further our understanding of this serious but also scientifically interesting disease. PMID:29503812

  18. Wind speed reductions by large-scale wind turbine deployments lower turbine efficiencies and set low wind power potentials

    Science.gov (United States)

    Miller, Lee; Kleidon, Axel

    2017-04-01

    Wind turbines generate electricity by removing kinetic energy from the atmosphere. Large numbers of wind turbines are likely to reduce wind speeds, which lowers estimates of electricity generation below what would be presumed from unaffected conditions. Here, we test how well wind power potentials that account for this effect can be estimated without explicitly simulating atmospheric dynamics. We first use simulations with an atmospheric general circulation model (GCM) that explicitly simulates the effects of wind turbines to derive wind power limits (GCM estimate), and compare them to a simple approach derived from the climatological conditions without turbines [vertical kinetic energy (VKE) estimate]. On land, we find strong agreement between the VKE and GCM estimates with respect to electricity generation rates (0.32 and 0.37 We m-2) and wind speed reductions (42 and 44%). Over ocean, the GCM estimate is about twice the VKE estimate (0.59 and 0.29 We m-2), yet with comparable wind speed reductions (50 and 42%). We then show that this bias can be corrected by modifying the downward momentum flux to the surface. Thus, large-scale limits to wind power can be derived from climatological conditions without explicitly simulating atmospheric dynamics. Consistent with the GCM simulations, the approach estimates that only comparatively few land areas are suitable for generating more than 1 We m-2 of electricity and that larger deployment scales are likely to reduce the expected electricity generation rate of each turbine. We conclude that these atmospheric effects are relevant for planning the future expansion of wind power.
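
Because turbine power scales with roughly the cube of wind speed, the reported wind speed reductions imply large losses relative to unaffected-wind estimates. The sketch below applies this cubic scaling to the reduction percentages quoted in the abstract; the cubic rule is a textbook simplification, not the paper's GCM or VKE calculation:

```python
# Cubic-scaling illustration: a wind speed reduction r leaves (1 - r)^3 of
# the power that unaffected winds would suggest. Reduction percentages are
# taken from the abstract above.
cases = [("land, GCM", 0.44), ("land, VKE", 0.42),
         ("ocean, GCM", 0.50), ("ocean, VKE", 0.42)]
remaining = {}
for label, r in cases:
    remaining[label] = (1.0 - r) ** 3
    print(f"{label}: {r:.0%} slower wind -> "
          f"{remaining[label]:.0%} of the unaffected-wind power estimate")
```

A 42% speed reduction thus leaves only about a fifth of the naively estimated power, which is why estimates from unaffected climatology overstate large-scale potentials so strongly.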

  19. On Feature Extraction from Large Scale Linear LiDAR Data

    Science.gov (United States)

    Acharjee, Partha Pratim

    Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrain. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two algorithms for feature extraction and demonstrated uses of the extracted features in practical applications. One of the developed algorithms can map still and flowing waterbody features, and the other can extract building features and estimate solar potential on rooftops and facades. Remote sensing capabilities, the distinguishing characteristics of laser returns from water surfaces and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation and updating, and hydro-flattening of LiDAR data for many other applications, are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms using multiple semi-automated tools and human interventions. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features such as the flatness of the water surface and higher elevation change at the water-land interface, and optical properties such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated by automated and intelligent windowing, by resolving boundary issues and by integrating all results into a single output. The whole algorithm is implemented as an ArcGIS toolbox using Python libraries. Testing and validation are performed on large datasets to determine the effectiveness of the toolbox and results are

  20. Detecting differential protein expression in large-scale population proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

    Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues and challenges unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another issue is that the quantification of peptides is sometimes absent, especially for less abundant peptides, and such missing values contain information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values caused by the two mechanisms mentioned above. Our model has robust performance on both simulated data and proteomics data from a large clinical study. Because varying patient sample qualities and deviating instrument performances are not avoidable in clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.
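
The informativeness of missing values can be illustrated with a crude stand-in for the problem SALPS addresses. This is not the SALPS model itself; the toy intensities and the two summary statistics are our own:

```python
import numpy as np

def summarize(case, control):
    """Compare observed intensity means AND missingness rates per group.

    Low-abundance peptides are quantified less often, so a higher
    missingness rate in one group is itself evidence of lower abundance.
    """
    case, control = np.asarray(case, float), np.asarray(control, float)
    obs_diff = np.nanmean(case) - np.nanmean(control)
    miss_diff = np.isnan(case).mean() - np.isnan(control).mean()
    return obs_diff, miss_diff

# a peptide quantified in most controls but often missing in cases:
# evidence of down-regulation beyond the modest observed-mean difference
case = [20.1, np.nan, np.nan, 19.8, np.nan, 20.3]
control = [21.0, 21.4, 20.8, np.nan, 21.2, 21.1]
obs_diff, miss_diff = summarize(case, control)
print(f"observed-intensity difference: {obs_diff:+.2f}")
print(f"missingness-rate difference:  {miss_diff:+.2f}")
```

A method that discards or naively imputes the NaNs would use only the first statistic; the point of modeling abundance-dependent missingness is to exploit the second as well.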

  1. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, 
Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. There are now also a limited number of studies that have investigated molecular genetic candidate gene-environment interactions (G × E), however, so far, thorough replication of findings is rare and G × E research still faces several conceptual and methodological challenges. In this article, we aim to review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype.

  2. Design and assembly of ternary Pt/Re/SnO2 NPs by controlling the zeta potential of individual Pt, Re, and SnO2 NPs

    Science.gov (United States)

    Drzymała, Elżbieta; Gruzeł, Grzegorz; Pajor-Świerzy, Anna; Depciuch, Joanna; Socha, Robert; Kowal, Andrzej; Warszyński, Piotr; Parlinska-Wojtan, Magdalena

    2018-05-01

    In this study, Pt, Re, and SnO2 nanoparticles (NPs) were combined in a controlled manner into binary and ternary combinations for possible application in ethanol oxidation. For this purpose, the zeta potentials of the individual NP solutions were measured as a function of pH. In order to successfully combine the NPs into Pt/SnO2 and Re/SnO2 NPs, the solutions were mixed together at a pH guaranteeing opposite zeta potentials of the metal and oxide NPs. The individually synthesized NPs and their binary/ternary combinations were characterized by Fourier transform infrared spectroscopy (FTIR) and scanning transmission electron microscopy (STEM) combined with energy dispersive X-ray spectroscopy (EDS) analysis. FTIR and XPS spectroscopy showed that the individually synthesized Pt and Re NPs are metallic and the Sn component was oxidized to SnO2. STEM showed that all NPs are well crystallized, and the sizes of the Pt, Re, and SnO2 NPs were 2.2, 1.0, and 3.4 nm, respectively. Moreover, EDS analysis confirmed the successful formation of binary Pt/SnO2 and Re/SnO2 NPs, as well as ternary Pt/Re/SnO2 NP combinations. This study shows that by controlling the zeta potential of individual metal and oxide NPs, it is possible to assemble them into binary and ternary combinations.
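
The selection logic of mixing at a pH where the two sols carry opposite zeta potentials can be sketched as follows. The zeta-vs-pH curves and the 5 mV magnitude threshold are hypothetical placeholders, not the measured curves from the study:

```python
import numpy as np

# Hypothetical linear zeta-potential curves (mV) vs. pH for a metal NP sol
# and an oxide NP sol; real measured curves would replace these.
ph = np.arange(2.0, 10.5, 0.5)
zeta_metal = 25 - 6.0 * (ph - 3.0)   # positive at low pH, IEP near pH 7.2
zeta_oxide = -5 - 3.0 * (ph - 3.0)   # negative over the whole range

# usable window: opposite signs (electrostatic attraction) and both
# magnitudes above a minimum so the attraction is meaningful
opposite = zeta_metal * zeta_oxide < 0
strong = (np.abs(zeta_metal) > 5) & (np.abs(zeta_oxide) > 5)
window = ph[opposite & strong]
print(f"pH window for assembly: {window.min()} - {window.max()}")
```

Mixing anywhere inside the printed window would drive heteroaggregation of the oppositely charged particles; outside it, like charges repel and the sols stay separate.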

  3. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKIs.

  4. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    Science.gov (United States)

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support for high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image-based computer-aided diagnosis. One major requirement for this is effective querying of such an enormous amount of data with fast response, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
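
A minimal sketch of the partition-merge idea on bounding boxes (our own toy code, not the authors' MapReduce framework): partition objects onto grid tiles, join within each tile independently, then merge the tile results and de-duplicate pairs that span several tiles.

```python
from collections import defaultdict

def tiles_for(box, tile):
    """All grid tiles a bounding box (xmin, ymin, xmax, ymax) overlaps."""
    xmin, ymin, xmax, ymax = box
    for tx in range(int(xmin // tile), int(xmax // tile) + 1):
        for ty in range(int(ymin // tile), int(ymax // tile) + 1):
            yield (tx, ty)

def overlaps(a, b):
    """Axis-aligned bounding-box intersection test."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def spatial_join(set_a, set_b, tile=10.0):
    partitions = defaultdict(lambda: ([], []))
    for i, box in enumerate(set_a):           # "map": partition by tile
        for t in tiles_for(box, tile):
            partitions[t][0].append(i)
    for j, box in enumerate(set_b):
        for t in tiles_for(box, tile):
            partitions[t][1].append(j)
    result = set()                            # "reduce": join, then merge
    for ids_a, ids_b in partitions.values():
        for i in ids_a:
            for j in ids_b:
                if overlaps(set_a[i], set_b[j]):
                    result.add((i, j))        # set() de-duplicates pairs
    return sorted(result)

nuclei = [(1, 1, 3, 3), (9, 9, 12, 12)]       # toy bounding boxes
vessels = [(2, 2, 4, 4), (50, 50, 60, 60)]
print(spatial_join(nuclei, vessels))          # -> [(0, 0)]
```

In the MapReduce setting, each tile's join would run as an independent reduce task; the final set-union is what makes the merge step cheap.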

  5. Seismic Parameters of Mining-Induced Aftershock Sequences for Re-entry Protocol Development

    Science.gov (United States)

    Vallejos, Javier A.; Estay, Rodrigo A.

    2018-03-01

    A common characteristic of deep mines in hard rock is induced seismicity, which results from stress changes and rock failure around mining excavations. Following large seismic events, there is an increase in the level of seismicity, which gradually decays with time. Restricting access to areas of a mine for enough time to allow this decay is the main approach in re-entry strategies. The statistical properties of aftershock sequences can be studied with three scaling relations: (1) the Gutenberg-Richter frequency-magnitude relation, (2) the modified Omori law (MOL) for the temporal decay, and (3) Båth's law for the magnitude of the largest aftershock. In this paper, these three scaling relations, in addition to the stochastic Reasenberg-Jones model, are applied to study the characteristic parameters of 11 large-magnitude mining-induced aftershock sequences in four mines in Ontario, Canada. To provide guidelines for re-entry protocol development, the dependence of the scaling relation parameters on the magnitude of the main event is studied, and some relations between the parameters and the magnitude of the main event are found. Using these relationships and the scaling relations, a space-time-magnitude re-entry protocol is developed. These findings provide a first approximation to concise and well-justified guidelines for re-entry protocol development applicable to the range of mining conditions found in Ontario, Canada.
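Two of the scaling relations named above are simple enough to state numerically. The sketch below uses the standard forms of the modified Omori law and the maximum-likelihood (Aki) Gutenberg-Richter b-value estimator; the parameter values and magnitudes are illustrative, not fitted mine data.

```python
# Numeric sketch of two aftershock scaling relations.
import math

def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (c + t)**p."""
    return K / (c + t) ** p

def gr_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki estimator):
    b = log10(e) / (mean(M) - m_min), for magnitudes M >= m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

# The rate decays with elapsed time -- the basis of a time-based re-entry rule.
assert omori_rate(10.0) < omori_rate(1.0) < omori_rate(0.1)

mags = [0.1, 0.2, 0.15, 0.4, 0.3, 0.7, 0.5, 1.2, 0.25, 0.35]
print(round(gr_b_value(mags, 0.1), 2))  # ~1.38 for this synthetic catalogue
```

A re-entry protocol of the kind described would combine the fitted decay rate (when has seismicity returned to background?) with the b-value and Båth's law (how large could the next event be?).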

  6. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large-scale PV (LPV) plants, i.e. plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large-scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as no real LPV projects have been processed so far, these findings have to be regarded as preliminary. The fast-growing number of very large-scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space - and cost - against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large-scale infrastructure such as noise barriers, and ground-mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km2. In terms of energy harvest, PV plants will under Danish conditions exhibit an overall efficiency of about 10 % in converting the energy content of the light, compared to about 0.3 % for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark, 33-35 TWh, is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines.
A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
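The ~300 km2 figure quoted above can be checked with back-of-envelope arithmetic. The insolation value below is an assumption (roughly 1000 kWh/m2/yr of horizontal irradiation is typical for Denmark; the report's exact irradiance basis is not given here); the 10 % system efficiency and 33-35 TWh demand come from the abstract.

```python
# Back-of-envelope check of the "about 300 km2" ground-area figure.
insolation_kwh_m2_yr = 1000.0  # ASSUMED Danish annual horizontal irradiation
system_efficiency = 0.10       # "overall efficiency of about 10 %" (from text)
annual_demand_twh = 33.0       # lower end of the 33-35 TWh quoted

yield_kwh_m2_yr = insolation_kwh_m2_yr * system_efficiency  # 100 kWh/m2/yr
area_m2 = annual_demand_twh * 1e9 / yield_kwh_m2_yr         # 1 TWh = 1e9 kWh
print(f"{area_m2 / 1e6:.0f} km2")  # ~330 km2, consistent with "about 300 km2"
```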

  7. MetReS, an Efficient Database for Genomic Applications.

    Science.gov (United States)

    Vilaplana, Jordi; Alves, Rui; Solsona, Francesc; Mateo, Jordi; Teixidó, Ivan; Pifarré, Marc

    2018-02-01

    MetReS (Metabolic Reconstruction Server) is a genomic database that is shared between two software applications that address important biological problems. Biblio-MetReS is a data-mining tool that enables the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the processes of interest and their function. The main goal of this work was to identify the areas where the performance of the MetReS database could be improved and to test whether this improvement would scale to larger datasets and more complex types of analysis. The study was started with a relational database, MySQL, which is the current database server used by the applications. We also tested the performance of an alternative data-handling framework, Apache Hadoop, which is currently used for large-scale data processing. We found that this data-handling framework is likely to greatly improve the efficiency of the MetReS applications as the dataset and the processing needs increase by several orders of magnitude, as is expected to happen in the near future.

  8. Concentration of 188Re-Perrhenate for Therapeutic Radiopharmaceuticals

    International Nuclear Information System (INIS)

    Bokhari, T.H.; Hina, S.; Ahmad, M.; Iqbal, M.

    2013-01-01

    Summary: Rhenium-188 (T1/2 = 16.9 h) has great potential for a variety of therapeutic applications, including radionuclide synovectomy, oncology and bone pain palliation. The radioactive concentration of 188Re depends on the specific activity of 188W, which dictates the bed size of the alumina/gel column. Due to the high content of inactive tungsten in neutron-irradiated WO3, large columns containing aluminum oxide or gel are needed to prepare double-neutron-capture-based 188W/188Re generators, which results in large elution volumes containing relatively high 188W contents and low concentrations of 188ReO4-. This decrease in the specific volume of 188ReO4- is a limitation, because a high radioactive concentration of 188ReO4- is always needed for filling angioplasty balloons or for other therapeutic radiopharmaceuticals such as 188Re-EHDP, 188Re-EDTMP, 188Re-MAG3 and 188Re-DTPA. We report post-elution concentration of 188ReO4- using in-house prepared lead cation-exchange and alumina columns. Using these columns, a high bolus volume (10 mL saline) of 188ReO4- can conveniently be concentrated into 1 mL of physiological saline for therapeutic use. (author)

  9. Potential of renewable energy in large fossil-fuelled boilers; Potential erneuerbarer Energien in groesseren fossilen Feuerungen

    Energy Technology Data Exchange (ETDEWEB)

    Dettli, R.; Baur, M.; Philippen, D. [Econcept AG, Zuerich (Switzerland); Kernen, M. [Planair SA, La Sagne (Switzerland)

    2007-01-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the findings of a project that examined large heat generation systems used in Switzerland for the supply of heating services to several buildings via small and large district heating systems. Focus is placed on those using fossil fuels and the potential of using combined heat and power plants and renewable forms of energy such as heat-pumps and boilers fired with wood-chippings. The study was also extended to other large-scale, fossil-fuelled heating installations. The report discusses the setting up of a data base, the assessment of the potentials for fuel substitution, the economic viability of wood-fired systems and heat-pumps and the analysis of various factors that can obstruct the use of systems employing renewable forms of energy. Around 20 owners of large installations were interviewed on the subject. Strategic planning, studies, putting to tender, realisation and operation aspects are reviewed.

  10. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World-famous architects such as Herzog & de Meuron, Zaha Hadid, Richard Meier and David Chipperfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and in the black curved smooth concrete surfaces of Zaha Hadid's Ordrupgård. Furthermore, one can point to initiatives such as "Synlig beton" (visible concrete), presented at www.synligbeton.dk, and Spæncom's aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research and development project "Lasting large scale glazed concrete formwork," which I am working on at DTU's Department of Architectural Engineering, will be able to complement these. It is a project where I...

  11. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    Science.gov (United States)

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  12. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  13. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  14. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  15. Hartle-Hawking wave function and large-scale power suppression of CMB*

    Directory of Open Access Journals (Sweden)

    Yeom Dong-han

    2018-01-01

    Full Text Available In this presentation, we first describe the Hartle-Hawking wave function in the Euclidean path integral approach. After we introduce perturbations to the background instanton solution, following the formalism developed by Halliwell-Hawking and Laflamme, one can obtain the scale-invariant power spectrum at small scales. We further emphasize that the Hartle-Hawking wave function can explain the large-scale power suppression by choosing suitable potential parameters, which provides a possible window to confirm or falsify models of quantum cosmology. Finally, we comment on possible future applications, e.g., Euclidean wormholes, which can result in distinct signatures in the power spectrum.

  16. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors

    Science.gov (United States)

    Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Vrendenburg, Vance T.; Rosenblum, Erica Bree; Briggs, Cheryl J.

    2016-01-01

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.
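The abstract's two headline numbers (a sevenfold increase, growth of 11 % per year, over a 20-year study) can be checked for internal consistency with compound-growth arithmetic:

```python
# Quick consistency check on the abstract's numbers: does growth of 11 % per
# year plausibly yield a sevenfold increase within a 20-year study window?
import math

annual_rate = 0.11
fold_increase = 7.0
years_needed = math.log(fold_increase) / math.log(1 + annual_rate)
print(round(years_needed, 1))  # ~18.6 years, inside the 20-year survey period
```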

  17. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  18. A emergência do autor Pierre Rivière/The emergence of the Pierre Rivière author

    Directory of Open Access Journals (Sweden)

    Adriana Duarte Bonini Mariguela

    2007-01-01

    Full Text Available This paper presents an analysis of the case of parricide/fratricide committed by a young French peasant, Pierre Rivière, born in the commune of Courvaudon, who on June 3rd, 1835, at the age of twenty, killed his mother (then seven months pregnant), his 18-year-old sister and his 7-year-old brother with a sickle. The book "I, Pierre Rivière, Having Slaughtered My Mother, My Sister, and My Brother...: A Case of Parricide in the Nineteenth Century," produced by a team of researchers at the Collège de France under the coordination of Michel Foucault in 1973, brings together newspaper reports, witness testimony, interrogations, forensic medical reports and a range of other discourses. To analyse the knot between the writing and the murder, the relationship between writing and the author is used, situating the character Rivière in the unfolding of body and language.

  19. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  20. Synthesis of Large-Scale Single-Crystalline Monolayer WS2 Using a Semi-Sealed Method

    Directory of Open Access Journals (Sweden)

    Feifei Lan

    2018-02-01

    Full Text Available As a two-dimensional semiconductor, WS2 has attracted great attention due to its rich physical properties and potential applications. However, it is still difficult to synthesize single-crystalline monolayer WS2 on a large scale. Here, we report the growth of large-scale triangular single-crystalline WS2 with a semi-sealed installation by chemical vapor deposition (CVD). Through this method, triangular single-crystalline WS2 with an average length of more than 300 µm was obtained, the largest being about 405 µm in length. WS2 triangles of different sizes and thicknesses were analyzed by optical microscopy and atomic force microscopy (AFM), and their optical properties were evaluated by Raman and photoluminescence (PL) spectra. This report paves the way to fabricating large-scale single-crystalline monolayer WS2, which is useful for the growth of high-quality WS2 and its potential applications in the future.

  1. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronics process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatible integration on the back end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  2. Traffic assignment models in large-scale applications

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Kjær

    ... the potential of the method proposed and the possibility to use individual-based GPS units for travel surveys in real-life large-scale multi-modal networks. Congestion is known to highly influence the way we act in the transportation network (and organise our lives) because of longer travel times ... of observations of actual behaviour to obtain estimates of the (monetary) value of different travel time components, thereby increasing the behavioural realism of large-scale models. The generation of choice sets is a vital component in route choice models. This is, however, not a straightforward task in real ... but the reliability of the travel time also has a large impact on our travel choices. Consequently, in order to improve the realism of transport models, correct understanding and representation of two values that are related to the value of time (VoT) are essential: (i) the value of congestion (VoC), as the Vo...

  3. Capsid coding region diversity of re-emerging lineage C foot-and-mouth disease virus serotype Asia1 from India.

    Science.gov (United States)

    Subramaniam, Saravanan; Mohapatra, Jajati K; Das, Biswajit; Sharma, Gaurav K; Biswal, Jitendra K; Mahajan, Sonalika; Misri, Jyoti; Dash, Bana B; Pattnaik, Bramhadev

    2015-07-01

    Foot-and-mouth disease virus (FMDV) serotype Asia1 was first reported in India in 1951, and three major genetic lineages (B, C and D) of this serotype have been described there to date. In this study, the capsid protein coding region of serotype Asia1 viruses (n = 99) from India was analyzed, with emphasis on viruses circulating since 2007. All of the isolates (n = 50) recovered during 2007-2013 were found to group within the re-emerging cluster of lineage C (designated as sublineage C(R)). The evolutionary rate of sublineage C(R) was estimated to be slightly higher than that of the serotype as a whole, and the time of the most recent common ancestor for this cluster was estimated to be approximately 2001. In comparison to the older isolates of lineage C (1993-2001), the re-emerging viruses showed variation at eight amino acid positions, including substitutions at the antigenically critical VP2 residues 79 and 131. However, no direct correlation was found between sequence variations and antigenic relationships. The number of codons under positive selection and the nature of the selection pressure varied widely among the structural proteins, implying a heterogeneous pattern of evolution in serotype Asia1. While episodic diversifying selection appears to play a major role in shaping the evolution of VP1 and VP3, selection pressure acting on codons of VP2 is largely pervasive. Further, episodic positive selection appears to be responsible for the early diversification of lineage C. Recombination events identified in the structural protein coding region indicate its probable role in the adaptive evolution of serotype Asia1 viruses.

  4. The Schroedinger-Poisson equations as the large-N limit of the Newtonian N-body system. Applications to the large scale dark matter dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Briscese, Fabio [Northumbria University, Department of Mathematics, Physics and Electrical Engineering, Newcastle upon Tyne (United Kingdom); Citta Universitaria, Istituto Nazionale di Alta Matematica Francesco Severi, Gruppo Nazionale di Fisica Matematica, Rome (Italy)

    2017-09-15

    In this paper it is argued how the dynamics of the classical Newtonian N-body system can be described in terms of the Schroedinger-Poisson equations in the large-N limit. This result is based on the stochastic quantization introduced by Nelson, and on the Calogero conjecture. According to the Calogero conjecture, the emerging effective Planck constant is computed in terms of the parameters of the N-body system as ħ ∝ M^(5/3) G^(1/2) (N/⟨ρ⟩)^(1/6), where G is the gravitational constant, N and M are the number and the mass of the bodies, and ⟨ρ⟩ is their average density. The relevance of this result in the context of large-scale structure formation is discussed. In particular, this finding gives a further argument in support of the validity of the Schroedinger method as a numerical double of the N-body simulations of dark matter dynamics at large cosmological scales. (orig.)
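Since the quoted relation is a proportionality, only its scaling behaviour is meaningful. The sketch below fixes the unknown prefactor to 1 and checks one consequence of the exponents; the input values for M, N and ⟨ρ⟩ are arbitrary illustrations, not taken from the paper.

```python
# Sketch of the quoted scaling relation for the emergent effective Planck
# constant, h_eff ∝ M^(5/3) G^(1/2) (N/<rho>)^(1/6). The proportionality
# constant is unknown and set to 1 here; only the scaling is meaningful.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def h_eff(M, N, rho, prefactor=1.0):
    return prefactor * M ** (5 / 3) * G ** 0.5 * (N / rho) ** (1 / 6)

# Doubling the number of bodies at fixed M and <rho> raises h_eff by 2^(1/6).
base = h_eff(M=1e30, N=1e6, rho=1e-21)
assert abs(h_eff(M=1e30, N=2e6, rho=1e-21) / base - 2 ** (1 / 6)) < 1e-9
```

The weak 1/6-power dependence on N is what makes the effective constant well defined in the large-N limit the paper considers.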

  5. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors to be used by 100s to 1000s of independent users, expanding the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  6. The Field Assessment Stroke Triage for Emergency Destination (FAST-ED): a Simple and Accurate Pre-Hospital Scale to Detect Large Vessel Occlusion Strokes

    Science.gov (United States)

    Lima, Fabricio O.; Silva, Gisele S.; Furie, Karen L.; Frankel, Michael R.; Lev, Michael H.; Camargo, Érica CS; Haussen, Diogo C.; Singhal, Aneesh B.; Koroshetz, Walter J.; Smith, Wade S.; Nogueira, Raul G.

    2016-01-01

    Background and Purpose Patients with large vessel occlusion strokes (LVOS) may be better served by direct transfer to endovascular-capable centers, avoiding hazardous delays between primary and comprehensive stroke centers. However, accurate stroke field triage remains challenging. We aimed to develop a simple field scale to identify LVOS. Methods The FAST-ED scale was based on items of the NIHSS with higher predictive value for LVOS and tested in the STOPStroke cohort, in which patients underwent CT angiography within the first 24 hours of stroke onset. LVOS were defined by total occlusions involving the intracranial ICA, MCA-M1, MCA-M2, or basilar arteries. Patients with partial, bi-hemispheric, and/or anterior + posterior circulation occlusions were excluded. Receiver operating characteristic (ROC) curve, sensitivity, specificity, positive (PPV) and negative predictive values (NPV) of FAST-ED were compared with the NIHSS, the Rapid Arterial oCclusion Evaluation (RACE) scale and the Cincinnati Prehospital Stroke Severity Scale (CPSSS). Results LVO was detected in 240 of the 727 qualifying patients (33%). FAST-ED had accuracy comparable to the NIHSS and higher than RACE and CPSSS in predicting LVO (area under the ROC curve: FAST-ED=0.81 as reference; NIHSS=0.80, p=0.28; RACE=0.77, p=0.02; and CPSSS=0.75, p=0.002). A FAST-ED ≥4 had sensitivity of 0.60, specificity 0.89, PPV 0.72, and NPV 0.82, versus RACE ≥5 of 0.55, 0.87, 0.68, 0.79 and CPSSS ≥2 of 0.56, 0.85, 0.65, 0.78, respectively. Conclusions FAST-ED is a simple scale that, if successfully validated in the field, may be used by medical emergency professionals to identify LVOS in the pre-hospital setting, enabling rapid triage of patients. PMID:27364531
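The cut-off metrics reported above all derive from a 2x2 confusion matrix. The formulas below are the standard definitions used in such evaluations; the counts are invented for illustration (chosen so the totals match the cohort's 240 LVO-positive and 487 LVO-negative patients), not the paper's actual patient-level data.

```python
# Standard diagnostic metrics from a 2x2 confusion matrix, as used to
# evaluate FAST-ED cut-offs. The counts below are HYPOTHETICAL: they only
# match the cohort totals (240 LVO-positive, 487 negative), not real data.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = diagnostic_metrics(tp=144, fp=54, fn=96, tn=433)
print({k: round(v, 2) for k, v in m.items()})
```

A field scale trades these quantities off: raising the FAST-ED cut-off improves specificity (fewer unnecessary transfers) at the cost of sensitivity (missed occlusions).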

  7. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  8. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job. The solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the sub-problems' solving efficiency and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed, whereby the unscheduled operations can be decomposed into bottleneck operations and non-bottleneck operations. According to the principle that "the bottleneck leads the performance of the whole manufacturing system" in TOC (Theory of Constraints), the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency. Findings: In the process of constructing the sub-problems, some operations in the previously scheduled sub-problem are assigned to the successive sub-problem for re-optimization; this strategy improves the solution quality of the algorithm. In the process of solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality. Research limitations/implications: This research makes some assumptions that reduce the complexity of the large-scale scheduling problem: the processing route of each job is predetermined, the processing time of each operation is fixed, there are no machine breakdowns, and no preemption of operations is allowed. These assumptions should be considered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
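The dispatching-rule side of the hybrid approach above can be sketched compactly. The code below greedily dispatches by shortest processing time (SPT), one common rule for non-bottleneck operations; the 3-job, 3-machine instance is invented, and the paper's GA-based bottleneck scheduling is not reproduced.

```python
# Minimal SPT (shortest processing time) dispatching for a job shop.
# jobs: list of routes; each route is a list of (machine, duration) operations
# that must be processed in order. Returns the makespan of the greedy schedule.
def spt_schedule(jobs):
    next_op = [0] * len(jobs)   # index of the next operation of each job
    job_ready = [0] * len(jobs) # time at which each job's previous op finishes
    mach_ready = {}             # time at which each machine becomes free
    while any(next_op[j] < len(jobs[j]) for j in range(len(jobs))):
        # among all jobs with work left, dispatch the shortest next operation
        candidates = [(jobs[j][next_op[j]][1], j) for j in range(len(jobs))
                      if next_op[j] < len(jobs[j])]
        _, j = min(candidates)
        machine, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = start + dur
        mach_ready[machine] = start + dur
        next_op[j] += 1
    return max(job_ready)

jobs = [[("M1", 3), ("M2", 2)],
        [("M2", 2), ("M1", 4)],
        [("M1", 2), ("M3", 3)]]
print(spt_schedule(jobs))  # makespan 9 for this toy instance
```

In the decomposition scheme described above, a cheap rule like this handles the non-bottleneck operations quickly, reserving the expensive genetic search for operations on the detected bottleneck machines.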

  9. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  10. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.
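One of the critical aspects the abstract flags, ortho-para hydrogen conversion, can be quantified: the equilibrium para-H2 fraction follows from the rotational partition functions (para: even J; ortho: odd J, with nuclear-spin weight 3). A sketch, assuming the standard rotational temperature of hydrogen, about 87.6 K:

```python
import math

def para_fraction(T, theta_rot=87.6, jmax=20):
    """Equilibrium para-H2 fraction at temperature T (K) from the
    rotational partition functions: para levels have even J, ortho
    levels have odd J and a nuclear-spin degeneracy of 3."""
    z_para = sum((2 * J + 1) * math.exp(-theta_rot * J * (J + 1) / T)
                 for J in range(0, jmax, 2))
    z_ortho = sum((2 * J + 1) * math.exp(-theta_rot * J * (J + 1) / T)
                  for J in range(1, jmax, 2))
    return z_para / (z_para + 3 * z_ortho)
```

At room temperature the mixture is roughly 25% para ("normal hydrogen"), while near the 20 K liquefaction point the equilibrium is almost pure para, which is why liquefiers include catalytic converters: conversion happening after liquefaction releases heat that boils off product.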

  11. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate a deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories including: (i) CAD-to-PART software, (ii) selection of energy source, (iii

  12. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  13. Beam test results of the first full-scale prototype of CMS RE 1/2 resistive plate chamber

    International Nuclear Information System (INIS)

    Ying Jun; Ban Yong; Ye Yanlin; Cai Jianxin; Qian Sijin; Wang Quanjin; Liu Hongtao

    2005-01-01

    The authors reported the muon beam test results of the first full-scale prototype of CMS RE 1/2 Resistive Plate Chamber (RPC). The bakelite surface is treated using a special technology without oil to make it smooth enough. The full scale RE 1/2 RPC with honeycomb supporting frame is strong and thin enough to be fitted to the limited space of CMS design for the inner Forward RPC. The muon beam test was performed at CERN Gamma Irradiation Facility (GIF). The detection efficiency of this full scale RPC prototype is >95% even at very high irradiation background. The time resolution (less than 1.2 ns) and spatial resolution are satisfactory for the muon trigger device in future CMS experiments. The noise rate is also calculated and discussed

  14. Bio-inspired wooden actuators for large scale applications.

    Science.gov (United States)

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

    Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.
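The abstract states that actuation amplitude can be predicted from the geometry and constitution of the bilayers but does not give the model. A plausible candidate (an assumption on my part, not stated in the record) is Timoshenko's classical bimetal formula, with the thermal mismatch strain replaced by the differential hygroscopic swelling strain; all parameter values below are illustrative:

```python
def bilayer_curvature(delta_strain, h1, h2, E1, E2):
    """Curvature (1/m) of a two-layer strip with differential in-plane
    swelling strain delta_strain between the bonded layers
    (Timoshenko's bimetal formula, hygroscopic strain replacing
    alpha * delta_T). h1, h2: layer thicknesses (m); E1, E2: moduli (Pa)."""
    m = h1 / h2          # thickness ratio
    n = E1 / E2          # stiffness ratio
    h = h1 + h2          # total thickness
    return (6 * delta_strain * (1 + m) ** 2) / (
        h * (3 * (1 + m) ** 2 + (1 + m * n) * (m ** 2 + 1.0 / (m * n))))
```

For equal layers (m = n = 1) this reduces to κ = 3Δε/(2h), so curvature scales inversely with total thickness: the up-scaling trade-off the abstract addresses by exploiting wood's high stiffness rather than thinner laminates.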

  15. Large Scale Relationship between Aquatic Insect Traits and Climate.

    Science.gov (United States)

    Bhowmik, Avit Kumar; Schäfer, Ralf B

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of temperature preference grouping feature that are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that the seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  16. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

Full Text Available Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small-scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research has demanded their production on a much larger scale, a task that can be difficult to manage with the number of producer cell culture flasks required for large volumes of vector. To generate a large-scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm² flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation.

  17. Fracture appraisal of large scale glass block under various realistic thermal conditions

    International Nuclear Information System (INIS)

    Laude, F.; Vernaz, E.; Saint-Gaudens, M.

    1982-06-01

Fracturing of nuclear waste glass, caused primarily by thermal and residual stresses during cooling, increases the potential leaching surface area and the number of small particles. A theoretical study shows that it is possible to calculate the stresses created, but it is difficult to evaluate the state of fracture. The theoretical results are complemented by an experimental study with inactive industrial-scale glass blocks. The critical stages of the blocks' thermal history are simulated, and the total surface area of the pieces is measured by comparing the leaching rate of the fractured glass with that of known samples under the same conditions. Quenching due to water impact, air cooling in a storage pit, and experimental reassembly of fractured glass by re-heating are examined

  18. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  19. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generated by kerosene pool fires, probably because the fire module of the code is a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  20. Large magnetoresistance in (AA')2FeReO6 double perovskites

    International Nuclear Information System (INIS)

    Teresa, J.M. de; Serrate, D.; Blasco, J.; Ibarra, M.R.; Morellon, L.

    2005-01-01

We review the main structural, magnetic and magnetotransport properties of the intriguing (AA')2FeReO6 magnetic double perovskites. As the average cation size decreases, the crystallographic structure at room temperature evolves from cubic [(AA')2 = Ba2, Ba1.5Sr0.5, BaSr, Ba0.5Sr1.5] to tetragonal [(AA')2 = Sr2] and monoclinic [(AA')2 = Ca0.5Sr1.5, CaSr, Ca1.5Sr0.5, Ca2]. The Curie temperature increases anomalously from ∼303 K for Ba2 to ∼522 K for Ca2, in sharp contrast with the behaviour observed in the isostructural compounds (AA')2FeMoO6. Other anomalous features of the (AA')2FeReO6 series are the large magnetic anisotropy, the large magnetoelastic coupling and the semiconducting behaviour of the monoclinic compounds. The monoclinic compounds undergo a structural/magnetic transition at T_S below 125 K. Three different magnetoresistance mechanisms have been identified: an intergrain negative magnetoresistance effect, present across the whole series of compounds, and, in the monoclinic compounds below T_S, a negative magnetoresistance effect associated with the melting of the low-temperature phase and a positive magnetoresistance effect present only for (AA')2 = Ca2 below T∼50 K

  1. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

During the past few years an increasing interest in large-scale computation has been developing. Several initiatives have been taken to evaluate and exploit the potential of ''supercomputers'' like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., the Lax report appeared first, in 1982, and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. In Europe, too, several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method: between theory and experiment a third methodology, ''computational science'', has become or is becoming operational

  2. Results of research and development in large-scale research centers as an innovation source for firms

    International Nuclear Information System (INIS)

    Theenhaus, R.

    1978-01-01

The twelve large-scale research centres of the Federal Republic of Germany, with their 16,000 employees, represent a considerable scientific and technical potential. Cooperation with industry on large-scale projects has already become very close, and the associated flow of know-how and contributions to innovation are largely established. The first successful steps towards utilizing the results of basic research, of spin-off, and of research and development work, as well as the provision of services, are encouraging. However, there are a number of detailed problems which can only be solved jointly by all parties concerned, in particular between industry and the large-scale research centres. (orig./RW) [de

  3. Lymphogranuloma venereum in Quebec: Re-emergence among men who have sex with men.

    Science.gov (United States)

    Boutin, C A; Venne, S; Fiset, M; Fortin, C; Murphy, D; Severini, A; Martineau, C; Longtin, J; Labbé, A C

    2018-02-01

Lymphogranuloma venereum (LGV) is a sexually transmitted infection (STI) caused by Chlamydia trachomatis genotypes L1, L2 and L3. LGV is associated with significant morbidity and an increased risk of HIV transmission. While fewer than two cases per year were reported in Quebec before 2005, LGV emerged in 2005-2006 with 69 cases, followed by a period of low incidence (2007-2012) and a subsequent re-emergence since 2013. The objective was to describe the incidence of LGV in Quebec and the characteristics of the affected population, including demographics and risk factors, clinical manifestations, laboratory tests, treatments and reinfection rates. Descriptive data were collected from the notifiable diseases records through the Institut national de santé publique du Québec (INSPQ) infocentre portal. Questionnaires were obtained through the enhanced surveillance system and transmitted anonymously to the Quebec Ministry of Health. In-depth analysis was performed on cases from 2013 to 2016. There were 338 cases of LGV over the four-year period in Quebec. All cases were male, except one transsexual person. Mean age was 41 years. Most lived in Montréal (81%) and were men who have sex with men (MSM; 99%). The majority (83%) reported four or more sexual partners in the last year, met mostly through the Internet (77%) and in saunas (73%). The frequency of sexual intercourse with out-of-province residents decreased in 2013-2016 (27%) compared with 2005-2012 (38%). A history of STIs was frequent: 83% were HIV-infected, 81% reported previous syphilis and 78% previous gonorrhea. Recreational drug use was frequent (57%), reaching 71% in 2016. Most cases were symptomatic, a proportion which decreased in 2016 (68%) compared with 2013-2015 (82%; p = 0.006). Clinical presentations included proctitis (86%), lymphadenopathy (13%) and ulcer/papule (12%). Reinfections, mostly within two years of the first infection, occurred in 35 individuals (10%). Conclusion: The re-emergence of LGV in Quebec involves an urban

  4. Microarray data re-annotation reveals specific lncRNAs and their potential functions in non-small cell lung cancer subtypes

    OpenAIRE

    Zhou, Dongbo; Xie, Mingxuan; He, Baimei; Gao, Ying; Yu, Qiao; He, Bixiu; Chen, Qiong

    2017-01-01

    Non-small-cell lung cancer (NSCLC) is a leading cause of cancer mortality worldwide. The most common subtypes of NSCLC are adenocarcinoma (AC) and squamous cell carcinoma (SCC). However, the pathophysiological mechanisms contributing to AC and SCC are still largely unknown, especially the roles of long non-coding RNAs (lncRNAs). The present study identified differentially expressed lncRNAs between lung AC and SCC by re-annotation of NSCLC microarray data analysis profiling. The potential func...

  5. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  6. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  7. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  8. Rapid Large Scale Reprocessing of the ODI Archive using the QuickReduce Pipeline

    Science.gov (United States)

    Gopu, A.; Kotulla, R.; Young, M. D.; Hayashi, S.; Harbeck, D.; Liu, W.; Henschel, R.

    2015-09-01

    The traditional model of astronomers collecting their observations as raw instrument data is being increasingly replaced by astronomical observatories serving standard calibrated data products to observers and to the public at large once proprietary restrictions are lifted. For this model to be effective, observatories need the ability to periodically re-calibrate archival data products as improved master calibration products or pipeline improvements become available, and also to allow users to rapidly calibrate their data on-the-fly. Traditional astronomy pipelines are heavily I/O dependent and do not scale with increasing data volumes. In this paper, we present the One Degree Imager - Portal, Pipeline and Archive (ODI-PPA) calibration pipeline framework which integrates the efficient and parallelized QuickReduce pipeline to enable a large number of simultaneous, parallel data reduction jobs - initiated by operators AND/OR users - while also ensuring rapid processing times and full data provenance. Our integrated pipeline system allows re-processing of the entire ODI archive (˜15,000 raw science frames, ˜3.0 TB compressed) within ˜18 hours using twelve 32-core compute nodes on the Big Red II supercomputer. Our flexible, fast, easy to operate, and highly scalable framework improves access to ODI data, in particular when data rates double with an upgraded focal plane (scheduled for 2015), and also serve as a template for future data processing infrastructure across the astronomical community and beyond.
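The framework's core idea — a large number of simultaneous, independent per-exposure reduction jobs — can be sketched with a worker pool. The `calibrate` step and all names below are placeholders, not the actual QuickReduce or ODI-PPA API:

```python
from concurrent.futures import ThreadPoolExecutor

def calibrate(frame):
    """Placeholder for one QuickReduce-style job (bias/flat correction,
    etc. on a single raw frame); each job is independent of the others."""
    return {"frame": frame, "status": "calibrated"}

def reprocess_archive(frames, workers=8):
    """Fan independent reduction jobs out over a worker pool and
    collect the results in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(calibrate, frames))
```

Because the jobs share no state, throughput scales with the number of workers until I/O saturates — the property that lets the full archive be reprocessed in hours on a cluster.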

  9. Estimating GHG emission mitigation supply curves of large-scale biomass use on a country level

    International Nuclear Information System (INIS)

    Dornburg, Veronika; Dam, Jinke van; Faaij, Andre

    2007-01-01

This study evaluates the possible influences of a large-scale introduction of biomass material and energy systems, and of their market volumes, on land, material and energy market prices, and the feedback of these prices on greenhouse gas (GHG) emission mitigation costs. GHG emission mitigation supply curves for large-scale biomass use were compiled using a methodology that combines a bottom-up analysis of biomass applications, biomass cost supply curves and market prices of land, biomaterials and bioenergy carriers. These market prices depend on the scale of biomass use and the market volume of materials and energy carriers, and were estimated using own-price elasticities of demand. The methodology was demonstrated for a case study of Poland in the year 2015, applying different scenarios of economic development and trade in Europe. For the key technologies considered, i.e. medium density fibreboard, polylactic acid, electricity and methanol production, GHG emission mitigation costs increase strongly with the scale of biomass production. Large-scale introduction of biomass use decreases the GHG emission reduction potential at costs below 50 Euro/Mg CO2eq by about 13-70%, depending on the scenario. Biomaterial production accounts for only a small part of this GHG emission reduction potential due to relatively small material markets and the consequent strong decrease of biomaterial market prices at large scales of production. GHG emission mitigation costs depend strongly on the biomass supply curves, the own-price elasticity of land and the market volumes of bioenergy carriers. The analysis shows that these influences should be taken into account when developing biomass implementation strategies
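The key feedback in the study — large-scale production moving market prices, which in turn move mitigation costs — rests on constant own-price elasticities of demand. A minimal sketch of that price adjustment (the elasticity value in the usage note is illustrative, not the paper's):

```python
def market_price(p0, q0, q, elasticity):
    """Clearing price after the quantity supplied to a market moves
    from q0 to q, under a constant own-price elasticity of demand
    (elasticity < 0): q/q0 = (p/p0)**elasticity, hence
    p = p0 * (q/q0)**(1/elasticity)."""
    return p0 * (q / q0) ** (1.0 / elasticity)
```

With an inelastic demand of, say, -0.5, doubling the quantity pushed onto a small biomaterial market collapses its price to a quarter of the initial value — the mechanism behind the abstract's finding that biomaterials contribute little to the low-cost mitigation potential at large scale.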

  10. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  11. Two micro-models of tourism capitalism and the (re)scaling of state-business relations

    NARCIS (Netherlands)

    Erkuş-Öztürk, H.; Terhorst, P.

    2011-01-01

    This paper aims to show that (i) there are two micro-models of tourism capitalism in Antalya (Turkey) and (ii) different trajectories of (re)scaling of state-business relations form an integral part of each model of tourism capitalism. The paper bridges two debates in the literature that generally

  12. Development of the re-engineered European decision support system for off-site nuclear and radiological emergencies - JRODOS. Application to air pollution transport modelling

    International Nuclear Information System (INIS)

    Ievdin, I.; Treebushny, D.; Raskob, W.; Zheleznyak, M.

    2008-01-01

Full text: The European decision support system for nuclear and radiological emergencies, RODOS, includes a set of numerical models simulating the transport of radionuclides in the environment, estimating potential doses to the public, and simulating and evaluating the efficiency of countermeasures. The re-engineering of the RODOS system using Java technology has recently started; it will allow the new system, called JRODOS, to run on nearly any computational platform with a Java virtual machine. Modern software development approaches were used for the JRODOS system architecture and implementation: a distributed system design (client, management server, computational server), geo-database utilization, a plug-in model structure and OpenMI-like compatibility to support seamless model inter-connection. Stable open source components such as an ORM solution (Hibernate), an OpenGIS component (GeoTools) and a charting/reporting component (JFree, Pentaho) were utilized to optimize the development effort and allow a fast completion of the project. The architecture of the system is presented and illustrated for the atmospheric dispersion module ALSMC (Atmospheric Local Scale Model Chain), which performs calculations of atmospheric pollution transport and the corresponding acute doses and dose rates. The example application is based on a synthetic scenario of a release from a nuclear power plant located in Europe. (author)

  13. The transport sectors potential contribution to the flexibility in the power sector required by large-scale wind power integration

    DEFF Research Database (Denmark)

    Nørgård, Per Bromand; Lund, H.; Mathiesen, B.V.

    2007-01-01

    -scale integration of renewable energy in the power system – in specific wind power. In the plan, 20 % of the road transport is based on electricity and 20 % on bio- fuels. This, together with other initiatives allows for up to 55-60 % wind power penetration in the power system. A fleet of 0.5 mio electrical...... vehicles in Denmark in 2030 connected to the grid 50 % of the time represents an aggregated flexible power capacity of 1- 1.5 GW and an energy capacity of 10-150 GWh.......In 2006, the Danish Society of Engineers developed a visionary plan for the Danish energy system in 2030. The paper presents and qualifies selected part of the analyses, illustrating the transport sectors potential to contribute to the flexibility in the power sector, necessary for large...

  14. Measles re-emergence in Northern Italy: Pathways of measles virus genotype D8, 2013-2014.

    Science.gov (United States)

    Amendola, Antonella; Bianchi, Silvia; Lai, Alessia; Canuti, Marta; Piralla, Antonio; Baggieri, Melissa; Ranghiero, Alberto; Piatti, Alessandra; Tanzi, Elisabetta; Zehender, Gianguglielmo; Magurano, Fabio; Baldanti, Fausto

    2017-03-01

Molecular surveillance and advanced phylogenetic methods are important tools to track the pathways of measles virus (MV) genotypes, provide evidence for the interruption of endemic transmission and verify the elimination of the disease. The aims of this study were to describe the genetic profile of MV genotype D8 (D8-MV) strains circulating in Northern Italy (Lombardy Region) during the 2013-2014 period and to analyze the transmission chains and estimate the introduction time points using a phylogenetic approach. Forty-four strains of D8-MV identified from 12 outbreaks and 28 cases reported as sporadic were analyzed. Molecular analysis was performed by sequencing the highly variable 450-nt region of the N gene of the MV genome (N-450), as recommended by the WHO. Phylogenetic analyses and time-scaled tree reconstruction were performed with the BEAST software. We could trace back the transmission pathways, which resulted in three chains of transmission, two introductions with limited spread (two familial outbreaks), and two single introductions (true sporadic cases). The D8-Taunton transmission chain, which was involved in 7 outbreaks and 13 sporadic cases, was endemic during the studied period. Furthermore, two novel local variants emerged independently in March 2014 and caused two transmission chains linked to at least 3 outbreaks. Overall, viral diversity was high and strains belonging to 5 different variants were identified. The results of this study clearly demonstrate that multiple lineages of D8-MV co-circulated in Northern Italy. Measles can be considered a re-emerging disease in Italy, and additional efforts are necessary to achieve the measles elimination goal. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
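The hyperuniformity described in the abstract above can be illustrated numerically: a hyperuniform point pattern shows anomalously suppressed large-scale density fluctuations, so the variance of the number of points in a large observation window grows much more slowly than for a Poisson (ideal-gas-like) pattern. A minimal 1-D sketch, using a perturbed lattice as a stand-in for a hyperuniform system (the specific patterns and window sizes here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 10000.0

# Poisson pattern: 10000 uniformly random points (density 1 per unit length)
poisson = rng.uniform(0.0, L, 10000)
# Hyperuniform stand-in: a unit-spacing lattice with small random perturbations
lattice = np.arange(10000) + rng.uniform(-0.3, 0.3, 10000)

def count_variance(points, R, n_windows=2000, rng=rng):
    """Variance of the number of points in random intervals of half-width R."""
    centers = rng.uniform(R, L - R, n_windows)
    pts = np.sort(points)
    lo = np.searchsorted(pts, centers - R)
    hi = np.searchsorted(pts, centers + R)
    return np.var(hi - lo)

# For a Poisson pattern sigma^2(R) grows proportionally to R;
# for a hyperuniform pattern it stays bounded.
for R in (10.0, 100.0):
    print(R, count_variance(poisson, R), count_variance(lattice, R))
```

The suppressed growth of the count variance for the perturbed lattice is the one-dimensional analogue of the anomalous suppression of large-scale density fluctuations quantified in the study.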

  16. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans.

  17. Estimation of Radiative Efficiency of Chemicals with Potentially Significant Global Warming Potential.

    Science.gov (United States)

    Betowski, Don; Bevington, Charles; Allison, Thomas C

    2016-01-19

    Halogenated chemical substances are used in a broad array of applications, and new chemical substances are continually being developed and introduced into commerce. While recent research has considerably increased our understanding of the global warming potentials (GWPs) of multiple individual chemical substances, this research inevitably lags behind the development of new chemical substances. There are currently over 200 substances known to have high GWPs. Schemes to estimate radiative efficiency (RE) based on computational chemistry are useful where no measured IR spectrum is available. This study assesses the reliability of RE values calculated using computational chemistry techniques for 235 chemical substances against the best available values. Computed vibrational frequency data are used to estimate RE values using several Pinnock-type models, and reasonable agreement with reported values is found. Significant improvement is obtained through scaling of both vibrational frequencies and intensities. The effect of varying the computational method and basis set used to calculate the frequency data is discussed. It is found that the vibrational intensities have a strong dependence on the basis set and are largely responsible for differences in computed RE values.
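A Pinnock-type estimate, as referenced above, weights each computed IR absorption band by the instantaneous radiative forcing per unit absorption at that wavenumber. A minimal sketch of the summation step: note that the forcing function below is a crude Gaussian placeholder peaking in the atmospheric window, NOT the tabulated Pinnock et al. curve, and the band centers and intensities are hypothetical, so the result is only a relative illustration:

```python
import numpy as np

# Placeholder for the instantaneous forcing-per-band function F(nu);
# the real tabulated curve peaks in the ~800-1200 cm^-1 atmospheric window.
def forcing_per_band(nu):
    return np.exp(-((nu - 1000.0) / 250.0) ** 2)  # NOT the tabulated values

# Hypothetical computed IR spectrum: band centers (cm^-1) and intensities
freqs = np.array([650.0, 1105.0, 1280.0])   # scaled harmonic frequencies
intens = np.array([120.0, 340.0, 95.0])     # intensities, km mol^-1

# Pinnock-type estimate: sum of band intensity times the forcing function
# evaluated at the (scaled) band center.
re_estimate = float(np.sum(intens * forcing_per_band(freqs)))
print(f"relative RE estimate: {re_estimate:.1f}")
```

The study's frequency/intensity scaling corresponds to adjusting `freqs` and `intens` before this weighted sum, which is why intensity errors propagate directly into the computed RE.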

  18. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular products queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large scale deployment in commercial search engines.
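The core of the VisualRank idea described above is ordinary PageRank run on an inferred visual-similarity graph rather than a hyperlink graph. A minimal power-iteration sketch, with a hypothetical 4-image similarity matrix standing in for the feature-matching scores:

```python
import numpy as np

# Hypothetical symmetric visual-similarity matrix for 4 images
# (e.g., derived from matched local features); diagonal unused.
S = np.array([[0.0, 0.8, 0.6, 0.1],
              [0.8, 0.0, 0.7, 0.2],
              [0.6, 0.7, 0.0, 0.1],
              [0.1, 0.2, 0.1, 0.0]])

def visual_rank(S, d=0.85, tol=1e-10):
    """PageRank-style power iteration on a column-normalized similarity graph."""
    n = S.shape[0]
    P = S / S.sum(axis=0)            # column-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_new = d * P @ r + (1 - d) / n
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

r = visual_rank(S)
print(r, r.argmax())  # the most strongly connected image ranks highest
```

Image 1, which has the strongest similarity ties, emerges as the "authority" node; the weakly connected image 3 ranks last, mirroring how VisualRank promotes images that many visually similar results point to.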

  19. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    Science.gov (United States)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regards to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3m lengths in current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus improve technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.

  20. Sub-surface laser nanostructuring in stratified metal/dielectric media: a versatile platform towards flexible, durable and large-scale plasmonic writing

    International Nuclear Information System (INIS)

    Siozios, A; Bellas, D V; Lidorikis, E; Patsalas, P; Kalfagiannis, N; Cranton, W M; Koutsogeorgis, D C; Bazioti, C; Dimitrakopulos, G P; Vourlias, G

    2015-01-01

    Laser nanostructuring of pure ultrathin metal layers or ceramic/metal composite thin films has emerged as a promising route for the fabrication of plasmonic patterns with applications in information storage, cryptography, and security tagging. However, the environmental sensitivity of pure Ag layers and the complexity of ceramic/metal composite film growth hinder the implementation of this technology to large-scale production, as well as its combination with flexible substrates. In the present work we investigate an alternative pathway, namely, starting from non-plasmonic multilayer metal/dielectric layers, whose growth is compatible with large scale production such as in-line sputtering and roll-to-roll deposition, which are then transformed into plasmonic templates by single-shot UV-laser annealing (LA). This entirely cold, large-scale process leads to a subsurface nanoconstruction involving plasmonic Ag nanoparticles (NPs) embedded in a hard and inert dielectric matrix on top of both rigid and flexible substrates. The subsurface encapsulation of Ag NPs provides durability and long-term stability, while the cold character of LA suits the use of sensitive flexible substrates. The morphology of the final composite film depends primarily on the nanocrystalline character of the dielectric host and its thermal conductivity. We demonstrate the emergence of a localized surface plasmon resonance, and its tunability depending on the applied fluence and environmental pressure. The results are well explained by theoretical photothermal modeling. Overall, our findings qualify the proposed process as an excellent candidate for versatile, large-scale optical encoding applications. (paper)

  1. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Science.gov (United States)

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
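One simple way to relate search-attention series to a confidence index, in the spirit of the C3I construction above, is to standardize each query-volume series and combine them with signs reflecting whether a query expresses economic optimism or anxiety. The query names, weights, and data below are entirely hypothetical, not the paper's actual index definition:

```python
import numpy as np

rng = np.random.default_rng(7)
weeks = 104

# Hypothetical weekly search-volume series (Google Trends style, 0-100 scale)
series = {
    "economic crisis": rng.uniform(20, 80, weeks),   # negative-sentiment query
    "new car prices":  rng.uniform(20, 80, weeks),   # positive-sentiment query
}
signs = {"economic crisis": -1.0, "new car prices": +1.0}

def zscore(x):
    """Standardize a series to zero mean and unit variance."""
    return (x - x.mean()) / x.std()

# Composite behavioral index: sign-weighted mean of the standardized series.
index = np.mean([signs[q] * zscore(v) for q, v in series.items()], axis=0)
print(index.shape, round(float(index.mean()), 6))
```

In practice such an index would then be compared (e.g., by lagged correlation) against the officially surveyed consumer confidence series to assess its forecasting value.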

  2. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Directory of Open Access Journals (Sweden)

    Xianlei Dong

    Full Text Available Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  3. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.

  4. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the on-site consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
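The hourly comparison described above reduces to a simple computation: at each hour, the PV energy consumed on site is the minimum of production and demand, and summing gives the self-consumption share (of PV output) and the solar coverage (of demand). A minimal sketch with synthetic one-day profiles (the shapes and magnitudes are illustrative, not the Scharnhauser Park measurements):

```python
import numpy as np

hours = np.arange(24)
# Synthetic hourly profiles for one day (kWh); illustrative only
pv = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 30    # daylight bell
load = 10 + 5 * np.sin((hours - 18) / 24 * 2 * np.pi)           # evening-leaning demand

direct_use = np.minimum(pv, load)                # PV energy consumed on site
self_consumption = direct_use.sum() / pv.sum()   # share of PV output used directly
coverage = direct_use.sum() / load.sum()         # share of demand met by PV
print(f"self-consumption {self_consumption:.0%}, solar coverage {coverage:.0%}")
```

With measured data, the same `minimum`-and-sum step over a full year of hourly values yields figures comparable to the paper's 35% coverage with roughly half of the PV output consumed directly.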

  5. Assessment of economically optimal water management and geospatial potential for large-scale water storage

    Science.gov (United States)

    Weerasinghe, Harshi; Schneider, Uwe A.

    2010-05-01

    Water is an essential but limited and vulnerable resource for all socio-economic development and for maintaining healthy ecosystems. Water scarcity, accelerated by population expansion, improved living standards, and rapid growth in economic activities, has profound environmental and social implications. These include severe environmental degradation, declining groundwater levels, and increasing water conflicts. Water scarcity is predicted to be one of the key factors limiting development in the 21st century. Climate scientists have projected spatial and temporal changes in precipitation and changes in the probability of intense floods and droughts in the future. As scarcity of accessible and usable water increases, demand for efficient water management and adaptation strategies increases as well. Addressing water scarcity requires an intersectoral and multidisciplinary approach to managing water resources, which would in turn keep social welfare and economic benefit in optimal balance without compromising the sustainability of ecosystems. This paper presents a geographically explicit method to assess the potential for water storage with reservoirs and a dynamic model that identifies the dimensions and material requirements under an economically optimal water management plan. The methodology is applied to the Elbe and Nile river basins. Input data for geospatial analysis at the watershed level are taken from global data repositories and include data on elevation, rainfall, soil texture, soil depth, drainage, and land use and land cover, which are then downscaled to 1 km spatial resolution. Runoff potential for different combinations of land use and hydraulic soil groups and for mean annual precipitation levels is derived by the SCS-CN method. Using the overlay and decision tree algorithms
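The SCS-CN (curve number) method cited above estimates direct runoff depth Q from event rainfall P and a curve number CN that encodes land cover and hydrologic soil group. A sketch of the standard formulation in millimetres, with the common initial-abstraction assumption Ia = 0.2·S (the rainfall and CN values below are illustrative):

```python
def scs_cn_runoff(p_mm: float, cn: float) -> float:
    """Direct runoff depth Q (mm) from event rainfall P (mm) via the SCS
    curve-number method, using the common initial abstraction Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm)
    ia = 0.2 * s                  # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Higher curve numbers (less permeable cover) yield more runoff from the same storm
for cn in (60, 75, 90):
    print(cn, round(scs_cn_runoff(50.0, cn), 1))
```

Applied per grid cell to the overlaid land-use/soil-group layers, this is the step that turns the downscaled input maps into the runoff-potential surface used in the storage assessment.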

  6. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  7. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, the possibilities of small-scale model experiments are also analyzed, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  8. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  9. ENCoRE: an efficient software for CRISPR screens identifies new players in extrinsic apoptosis.

    Science.gov (United States)

    Trümbach, Dietrich; Pfeiffer, Susanne; Poppe, Manuel; Scherb, Hagen; Doll, Sebastian; Wurst, Wolfgang; Schick, Joel A

    2017-11-25

    As CRISPR/Cas9 mediated screens with pooled guide libraries in somatic cells become increasingly established, an unmet need for rapid and accurate companion informatics tools has emerged. We have developed a lightweight and efficient software tool to easily manipulate large raw next generation sequencing datasets derived from such screens into informative relational context with graphical support. The advantages of the software, entitled ENCoRE (Easy NGS-to-Gene CRISPR REsults), include a simple graphical workflow, platform independence, local and fast multithreaded processing, data pre-processing, and gene mapping with custom library import. We demonstrate the capabilities of ENCoRE to interrogate results from a pooled CRISPR cellular viability screen following Tumor Necrosis Factor-alpha challenge. The results not only identified stereotypical players in extrinsic apoptotic signaling but also two as yet uncharacterized members of the extrinsic apoptotic cascade, Smg7 and Ces2a. We further validated and characterized cell lines carrying mutations in these genes against a panel of cell death stimuli and examined their involvement in p53 signaling. In summary, this software enables bench scientists with sensitive data, or without access to informatics cores, to rapidly interpret results from large-scale experiments resulting from pooled CRISPR/Cas9 library screens.
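The central mapping step in any pooled-screen pipeline like the one described above is tallying guide sequences found in the sequencing reads and rolling the counts up to genes. A toy sketch of that step only (not the actual ENCoRE implementation; the 20-nt guides, gene names, and reads below are hypothetical):

```python
from collections import Counter

# Hypothetical guide library: 20-nt protospacers mapped to target genes
library = {
    "ACGTACGTACGTACGTACGT": "Smg7",
    "TTTTCCCCGGGGAAAATTTT": "Ces2a",
    "GATCGATCGATCGATCGATC": "Trp53",
}

def count_guides(reads, library, offset=0, length=20):
    """Tally exact guide matches at a fixed position in each read, then
    roll the per-guide counts up to per-gene counts."""
    guide_counts = Counter()
    for read in reads:
        spacer = read[offset:offset + length]
        if spacer in library:
            guide_counts[spacer] += 1
    gene_counts = Counter()
    for guide, n in guide_counts.items():
        gene_counts[library[guide]] += n
    return guide_counts, gene_counts

reads = [
    "ACGTACGTACGTACGTACGTNNN",
    "ACGTACGTACGTACGTACGTAAA",
    "GATCGATCGATCGATCGATCCCC",
    "NNNNNNNNNNNNNNNNNNNNNNN",   # unmatched read is ignored
]
guides, genes = count_guides(reads, library)
print(genes)  # Counter({'Smg7': 2, 'Trp53': 1})
```

Comparing such counts between treated and control populations is what flags depleted or enriched genes, such as the Smg7 and Ces2a hits reported here.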

  10. Oil supply security: the emergency response potential of IEA countries

    International Nuclear Information System (INIS)

    1995-01-01

    This work deals with oil supply security and, more particularly, with the emergency response potential of International Energy Agency (IEA) countries. The first part describes the changing pattern of IEA emergency response requirements. It begins with the experience from the past, then gives the energy outlook to 2010, and ends with the emergency response policy issues for the future. The second part is an overview of the IEA emergency response potential, which includes the organisation, the emergency reserves, the demand restraint and the other response mechanisms. The third part gives the response potential of individual IEA countries. The last part deals with IEA emergency response in practice, and more particularly with the Gulf crisis of 1990-1991. It includes the initial problems raised by the Gulf crisis, the adjustment and preparation, and the onset of military action with the IEA response. (O.L.). 7 figs., 85 tabs

  11. Formation of outflow channels on Mars: Testing the origin of Reull Vallis in Hesperia Planum by large-scale lava-ice interactions and top-down melting

    Science.gov (United States)

    Cassanelli, James P.; Head, James W.

    2018-05-01

    The Reull Vallis outflow channel is a segmented system of fluvial valleys which originates from the volcanic plains of the Hesperia Planum region of Mars. Explanation of the formation of the Reull Vallis outflow channel by canonical catastrophic groundwater release models faces difficulties: generating sufficient hydraulic head requires unreasonably high aquifer permeability, and recharge sources are limited. Recent work has proposed that large-scale lava-ice interactions could serve as an alternative mechanism for outflow channel formation on the basis of predictions of regional ice sheet formation in areas that also underwent extensive contemporaneous volcanic resurfacing. Here we assess in detail the potential formation of outflow channels by large-scale lava-ice interactions through an applied case study of the Reull Vallis outflow channel system, selected for its close association with the effusive volcanic plains of the Hesperia Planum region. We first review the geomorphology of the Reull Vallis system to outline criteria that must be met by the proposed formation mechanism. We then assess local and regional lava heating and loading conditions and generate model predictions for the formation of Reull Vallis to test against the outlined geomorphic criteria. We find that successive events of large-scale lava-ice interactions that melt ice deposits, which then undergo re-deposition due to climatic mechanisms, best explain the observed geomorphic criteria, offering improvements over previously proposed formation models, particularly in the ability to supply adequate volumes of water.

  12. Can Large Scale Land Acquisition for Agro-Development in Indonesia be Managed Sustainably?

    NARCIS (Netherlands)

    Obidzinski, K.; Takahashi, I.; Dermawan, A.; Komarudin, H.; Andrianto, A.

    2013-01-01

    This paper explores the impacts of large scale land acquisition for agro-development by analyzing the Merauke Integrated Food and Energy Estate (MIFEE) in Indonesia. It also examines the potential for MIFEE to meet sustainability requirements under RSPO, ISPO, and FSC. The plantation development

  13. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.

    Science.gov (United States)

    Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt

    2009-05-30

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18 bits of A/D resolution, are capable of resolving extracellular voltages spanning single-neuron action potentials, high-frequency oscillations, and high-amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
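The data rate quoted above is easy to verify, and the benefit of the "range-encoded differences" scheme can be illustrated: a smooth signal's sample-to-sample differences are small and repetitive, so they compress far better than the raw samples. A sketch using zlib as a stand-in for the range coder (the synthetic sine signal is illustrative only):

```python
import zlib
import numpy as np

# Back-of-envelope daily volume for the quoted acquisition parameters:
# 320 channels x 32 kHz x 4 bytes/sample x 86400 s
bytes_per_day = 320 * 32_000 * 4 * 86_400
print(f"{bytes_per_day / 1e12:.2f} TB/day")   # ~3.5 TB/day

# Why delta encoding helps: differences of a smooth signal are small
# integers with highly redundant byte patterns.
t = np.arange(200_000)
signal = (50_000 * np.sin(2 * np.pi * t / 3_000)).astype(np.int32)
raw = zlib.compress(signal.tobytes())
diff = zlib.compress(np.diff(signal).tobytes())
print(len(diff) / len(raw))  # differences compress substantially better
```

Real neural data is noisier than a sine, so the gain is smaller in practice, but the same principle underlies the lossless format described in the abstract.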

  14. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    Science.gov (United States)

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, of which the ranking performances are severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on the real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.

  15. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5m as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  16. Preparation of a large-scale and multi-layer molybdenum crystal and its characteristics

    International Nuclear Information System (INIS)

    Fujii, Tadayuki

    1989-01-01

    In the present work, the secondary recrystallization method was applied to obtain a large-scale and multi-layer crystal from a hot-rolled multi-laminated molybdenum sheet doped and stacked alternately with different amounts of dopant. It was found that the time and/or temperature at which secondary recrystallization commences in the multi-layer sheet is strongly dependent on the amount of dopant. Potential nuclei of secondary grains therefore appeared first in the layers with a small amount of dopant and then grew into the layers with a large amount of dopant after annealing at 1800-2000°C. Consequently, a large-scale and multi-layer molybdenum crystal can easily be obtained. 12 refs., 9 figs., 2 tabs. (Author)

  17. Bilevel Traffic Evacuation Model and Algorithm Design for Large-Scale Activities

    Directory of Open Access Journals (Sweden)

    Danwen Bao

    2017-01-01

    Full Text Available This paper establishes a bilevel planning model with one master and multiple slaves to solve traffic evacuation problems. The minimum evacuation network saturation and the shortest evacuation time are used as the objective functions for the upper- and lower-level models, respectively. The optimality conditions of this model are also analyzed. An improved particle swarm optimization (PSO) method is proposed by introducing an electromagnetism-like mechanism to solve the bilevel model and enhance its convergence efficiency. A case study is carried out using the Nanjing Olympic Sports Center. The results indicate that, for large-scale activities, the average evacuation time of the classic model is shorter but the road saturation distribution is more uneven, so the overall evacuation efficiency of the network is not high. For induced emergencies, the evacuation time of the bilevel planning model is shortened. When the audience arrival rate is increased from 50% to 100%, the evacuation time is shortened by 22% to 35%, indicating that the optimization effect of the bilevel planning model is stronger than that of the classic model. Therefore, the model and algorithm presented in this paper can provide a theoretical basis for the traffic-induced evacuation decision making of large-scale activities.
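For readers unfamiliar with the optimizer named above, a minimal standard PSO loop is sketched below on a toy sphere function; the paper's electromagnetism-like enhancement and the bilevel evacuation objectives are not reproduced, and all parameter values are conventional defaults rather than the authors' settings:

```python
import numpy as np

def pso(f, dim=2, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization minimizing f over [-5, 5]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))          # particle positions
    v = np.zeros((n, dim))                    # particle velocities
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()           # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        # inertia + cognitive pull toward pbest + social pull toward g
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

best, val = pso(lambda p: float(np.sum(p ** 2)))  # sphere test function
print(best, val)
```

In the bilevel setting, each particle would encode an upper-level decision, and evaluating `f` would require solving the lower-level shortest-evacuation-time problem, which is why convergence efficiency matters enough to motivate the hybrid mechanism.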

  18. Bio-inspired wooden actuators for large scale applications.

    Directory of Open Access Journals (Sweden)

    Markus Rüggeberg

    Full Text Available Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.

  19. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  20. Pre-Flight Ground Testing of the Full-Scale HIFiRE-1 at Fully Duplicated Flight Conditions

    National Research Council Canada - National Science Library

    Wadhams, Tim P; MacLean, Matthew G; Holden, Michael S; Mundy, Erik

    2008-01-01

    As part of an experimental study to obtain detailed heating and pressure data over the full-scale HIFiRE-1 flight geometry, CUBRC has completed a 30-run matrix of ground tests, sponsored by the AFOSR...

  1. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the "Large-scale Structure of the Universe" symposium are considered at a popular level. Described are the cell structure of galaxy distribution in the Universe and principles of mathematical modelling of galaxy distribution. The images of cell structures obtained after processing with a computer are given. Three hypotheses are discussed (vortical, entropic and adiabatic), suggesting various processes of galaxy and galaxy-cluster origin. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation, as a method of directly studying the processes taking place in the Universe, is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the inter-galaxy medium is recognized to be a notable contribution to the development of theoretical and observational cosmology

  2. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises to prepare hard rocks for excavation a drilling and blasting method is used. With the approach of mining operations to settlements the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts the scientific staff of Siberian State Industrial University carried out expertise for coal mines and iron ore enterprises. Determination of the magnitude of surface seismic vibrations caused by mass explosions was performed using seismic receivers, an analog-digital converter with recording on a laptop. The registration results of surface seismic vibrations during production of more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth’s surface vibrations are determined. The safety evaluation of seismic effect was carried out according to the permissible value of vibration velocity. For cases with exceedance of permissible values recommendations were developed to reduce the level of seismic impact.

  3. Anomalous scaling of structure functions and dynamic constraints on turbulence simulations

    International Nuclear Information System (INIS)

    Yakhot, Victor; Sreenivasan, Katepalli R.

    2006-12-01

    The connection between anomalous scaling of structure functions (intermittency) and numerical methods for turbulence simulations is discussed. It is argued that the computational work for direct numerical simulations (DNS) of fully developed turbulence increases as Re^4, and not as Re^3 as expected from Kolmogorov's theory, where Re is a large-scale Reynolds number. Various relations for the moments of acceleration and velocity derivatives are derived. An infinite set of exact constraints on dynamically consistent subgrid models for Large Eddy Simulations (LES) is derived from the Navier-Stokes equations, and some problems of principle associated with existing LES models are highlighted. (author)

  4. Study of Potential Cost Reductions Resulting from Super-Large-Scale Manufacturing of PV Modules: Final Subcontract Report, 7 August 2003--30 September 2004

    Energy Technology Data Exchange (ETDEWEB)

    Keshner, M. S.; Arya, R.

    2004-10-01

    Hewlett Packard has created a design for a "Solar City" factory that will process 30 million sq. meters of glass panels per year and produce 2.1-3.6 GW of solar panels per year, 100x the volume of a typical thin-film solar panel manufacturer in 2004. We have shown that with a reasonable selection of materials, and conservative assumptions, this "Solar City" can produce solar panels and hit the price target of $1.00 per peak watt (6.5x-8.5x lower than prices in 2004) as the total price for a complete and installed rooftop (or ground mounted) solar energy system. This breakthrough in the price of solar energy comes without the need for any significant new invention. It comes entirely from the manufacturing scale of a large plant and the cost savings inherent in operating at such a large manufacturing scale. We expect that further optimizations from these simple designs will lead to further improvements in cost. The manufacturing process and cost depend on the choice of the active layer that converts sunlight into electricity. The efficiency with which sunlight is converted into electricity can range from 7% to 15%. This parameter has a large effect on the overall price per watt. There are other impacts as well, and we have attempted to capture them without creating undue distractions. Our primary purpose is to demonstrate the impact of large-scale manufacturing. This impact is largely independent of the choice of active layer. It is not our purpose to compare the pros and cons of various types of active layers. Significant improvements in cost per watt can also come from scientific advances in active layers that lead to higher efficiency. But, again, our focus is on manufacturing gains and not on the potential advances in the basic technology.

  5. Searching for animal models and potential target species for emerging pathogens: Experience gained from Middle East respiratory syndrome (MERS coronavirus

    Directory of Open Access Journals (Sweden)

    Júlia Vergara-Alert

    2017-06-01

    Full Text Available Emerging and re-emerging pathogens represent a substantial threat to public health, as demonstrated by numerous outbreaks over the past years, including the 2013-2016 outbreak of Ebola virus in western Africa. Coronaviruses are also a threat for humans, as evidenced in 2002/2003 with infection by the severe acute respiratory syndrome coronavirus (SARS-CoV), which caused more than 8000 human infections with a 10% fatality rate in 37 countries. Ten years later, a novel human coronavirus (Middle East respiratory syndrome coronavirus, MERS-CoV), associated with severe pneumonia, arose in the Kingdom of Saudi Arabia. Until December 2016, MERS had accounted for more than 1800 cases and a 35% fatality rate. Finding an animal model of disease is key to developing vaccines or antivirals against such emerging pathogens and to understanding their pathogenesis. Knowledge of the potential role of domestic livestock and other animal species in the transmission of pathogens is important for understanding the epidemiology of the disease. Little is known about the MERS-CoV animal host range. In this paper, experimental data on potential hosts for MERS-CoV are reviewed. Advantages and limitations of different animal models are evaluated in relation to viral pathogenesis and transmission studies. Finally, the relevance of potential new target species is discussed.

  6. Using radar altimetry to update a large-scale hydrological model of the Brahmaputra river basin

    DEFF Research Database (Denmark)

    Finsen, F.; Milzow, Christian; Smith, R.

    2014-01-01

    Measurements of river and lake water levels from space-borne radar altimeters (past missions include ERS, Envisat, Jason, Topex) are useful for calibration and validation of large-scale hydrological models in poorly gauged river basins. Altimetry data availability over the downstream reaches of the Brahmaputra is excellent (17 high-quality virtual stations from ERS-2, 6 from Topex and 10 from Envisat are available for the Brahmaputra). In this study, altimetry data are used to update a large-scale Budyko-type hydrological model of the Brahmaputra river basin in real time. Altimetry measurements improved model performance considerably. The Nash-Sutcliffe model efficiency increased from 0.77 to 0.83. Real-time river basin modelling using radar altimetry has the potential to improve the predictive capability of large-scale hydrological models elsewhere on the planet.
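
    The skill metric reported above, the Nash-Sutcliffe efficiency, compares the squared model error against the variance of the observations: NSE = 1 means a perfect fit, NSE = 0 means the model is no better than predicting the observed mean. The jump from 0.77 to 0.83 corresponds to removing a further (0.23 - 0.17)/0.23 ≈ 26% of the residual error variance. A minimal implementation, with made-up discharge values for illustration:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 is no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Hypothetical discharge series (the mean of obs is exactly 14.0):
obs = [10.0, 12.0, 14.0, 16.0, 18.0]
perfect = nash_sutcliffe(obs, obs)          # a perfect simulation scores 1.0
mean_only = nash_sutcliffe(obs, [14.0] * 5) # predicting the mean scores 0.0
```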

  7. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities.

    Science.gov (United States)

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality also highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlight a dissociation among brain networks.
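
    Granger causality, as used in this study, asks whether the past of one signal improves prediction of another beyond that signal's own past. A minimal lag-1 sketch on synthetic data follows; the helper names and the data-generating process are invented for the example, and real analyses (including this study's, applied to IC time courses) use multi-lag vector autoregression with significance testing.

```python
import math
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rss(y, X):
    """Residual sum of squares of the least-squares fit y ~ X (rows of X)."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * xi for b, xi in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def granger_gain(x, y):
    """Lag-1 Granger sketch: relative drop in y's one-step prediction error
    when x's past is added to y's own past. Larger = stronger x -> y influence."""
    target = y[1:]
    restricted = [[1.0, y[t]] for t in range(len(y) - 1)]  # y's own past only
    full = [[1.0, y[t], x[t]] for t in range(len(y) - 1)]  # ... plus x's past
    r0, r1 = rss(target, restricted), rss(target, full)
    return (r0 - r1) / r0

# Synthetic pair: x is AR(1) and drives y with a one-step delay, so x should
# "Granger-cause" y much more strongly than the reverse.
rng = random.Random(1)
x = [0.0]
for _ in range(199):
    x.append(0.9 * x[-1] + 0.3 * rng.gauss(0, 1))
y = [0.0] + [0.8 * x[t - 1] + 0.1 * rng.gauss(0, 1) for t in range(1, 200)]
gain_xy = granger_gain(x, y)
gain_yx = granger_gain(y, x)
```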

  8. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities

    Directory of Open Access Journals (Sweden)

    Valerio Santangelo

    2018-02-01

    Full Text Available Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality also highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlight a dissociation among brain networks.

  9. Thermal anchoring of wires in large scale superconducting coil test experiment

    International Nuclear Information System (INIS)

    Patel, Dipak; Sharma, A.N.; Prasad, Upendra; Khristi, Yohan; Varmora, Pankaj; Doshi, Kalpesh; Pradhan, S.

    2013-01-01

    Highlights: • We address how thermal anchoring in a large scale coil test differs from that in small cryogenic apparatus. • We give a precise estimation of the thermal anchoring length at the 77 K and 4.2 K heat sinks in a large scale superconducting coil test experiment. • We address the quality of anchoring without covering the entire wires with Kapton/Teflon tape. • We obtained excellent results in temperature measurement without using GE varnish by doubling the estimated anchoring length. -- Abstract: Effective and precise thermal anchoring of wires in a cryogenic experiment is mandatory to measure temperature with millikelvin accuracy and to avoid unnecessary cooling power loss due to additional heat conduction from room temperature (RT) to operating temperature (OT) through potential, field, displacement and stress measurement instrumentation wires. Instrumentation wires used in large scale superconducting coil test experiments differ from those in small cryogenic apparatus in construction and overall diameter/area; for error-free measurement in large time-varying magnetic fields, shielded wires are often used, unlike in small cryogenic apparatus. Hence, along with other variables, the anchoring techniques and required thermal anchoring lengths in this experiment are entirely different from those in small cryogenic apparatus. In the present paper, the estimation of the thermal anchoring lengths of five different types of instrumentation wires used in the coils test campaign at the Institute for Plasma Research (IPR), India is discussed and some temperature measurement results of the coils test campaign are presented

  10. Evaluating the CO2 emissions reduction potential and cost of power sector re-dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Steinberg, Daniel C.; Bielen, David A.; Townsend, Aaron

    2018-01-01

    Prior studies of the U.S. electricity sector have recognized the potential to reduce carbon dioxide (CO2) emissions by substituting generation from coal-fired units with generation from under-utilized and lower-emitting natural gas-fired units; in fact, this type of 're-dispatch' was invoked as one of the three building blocks used to set the emissions targets under the Environmental Protection Agency's Clean Power Plan. Despite the existence of surplus natural gas capacity in the U.S., power system operational constraints not often considered in power sector policy analyses, such as transmission congestion, generator ramping constraints, minimum generation constraints, planned and unplanned generator outages, and ancillary service requirements, could limit the potential and increase the cost of coal-to-gas re-dispatch. Using a highly detailed power system unit commitment and dispatch model, we estimate the maximum potential for re-dispatch in the Eastern Interconnection, which accounts for the majority of coal capacity and generation in the U.S. Under our reference assumptions, we find that maximizing coal-to-gas re-dispatch yields emissions reductions of 230 million metric tons (Mt), or 13% of power sector emissions in the Eastern Interconnection, with a corresponding average abatement cost of $15-$44 per metric ton of CO2, depending on the assumed supply elasticity of natural gas.
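
    The average abatement cost of coal-to-gas re-dispatch follows directly from the fuel-cost and emissions-rate gap between the two units: the extra generation cost divided by the emissions avoided. A toy sketch of that arithmetic (the unit costs and emission rates below are illustrative assumptions, not the paper's data):

```python
def redispatch_abatement(coal, gas, mwh):
    """Shift `mwh` of generation from a coal unit to a gas unit; return
    (tonnes of CO2 avoided, average abatement cost in $ per tonne)."""
    extra_cost = (gas["cost"] - coal["cost"]) * mwh   # $ of costlier gas generation
    avoided = (coal["emis"] - gas["emis"]) * mwh      # tonnes of CO2 avoided
    return avoided, extra_cost / avoided

# Illustrative unit data: marginal cost in $/MWh, emissions rate in tCO2/MWh.
coal = {"cost": 22.0, "emis": 1.0}
gas = {"cost": 35.0, "emis": 0.4}
avoided, cost_per_t = redispatch_abatement(coal, gas, 1000.0)  # shift 1000 MWh
```

    In the paper's full analysis the cost additionally reflects transmission congestion, ramping and minimum-generation constraints, outages, ancillary services, and the gas supply elasticity, which this sketch ignores.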

  11. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
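
    Ecological diffusion differs from Fickian diffusion in that the motility coefficient sits inside both spatial derivatives, u_t = (mu(x) u)_xx, so at steady state the density piles up where movement is slow (u proportional to 1/mu) rather than equilibrating to a constant. A 1-D finite-difference sketch of that behaviour (the grid, motility values and step counts are illustrative choices, not the paper's model):

```python
def ecological_diffusion_step(u, mu, dx, dt):
    """One explicit step of u_t = (mu(x) * u)_xx with reflecting ends.
    Note mu multiplies u *inside* the Laplacian (ecological diffusion),
    unlike Fickian diffusion u_t = (mu * u_x)_x."""
    w = [m * ui for m, ui in zip(mu, u)]
    n = len(u)
    out = []
    for i in range(n):
        wl = w[i - 1] if i > 0 else w[0]        # Neumann (zero-flux) boundary
        wr = w[i + 1] if i < n - 1 else w[-1]
        out.append(u[i] + dt / dx ** 2 * (wl - 2.0 * w[i] + wr))
    return out

# Two habitat types: fast movement (mu = 1.0) on the left half of the domain,
# slow movement (mu = 0.2) on the right.
n, dx, dt = 60, 1.0, 0.4                 # dt * max(mu) / dx^2 = 0.4 <= 0.5: stable
mu = [1.0] * (n // 2) + [0.2] * (n // 2)
u = [1.0] * n                            # start from a uniform population
for _ in range(12000):
    u = ecological_diffusion_step(u, mu, dx, dt)

fast = sum(u[: n // 2]) / (n // 2)       # mean density in the fast habitat
slow = sum(u[n // 2:]) / (n // 2)        # mean density in the slow habitat
```

    The scheme conserves total population exactly, and the density in the slow habitat approaches mu_fast/mu_slow = 5 times that in the fast habitat, which is the small-scale structure the homogenized large-scale equations must encode.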

  12. Time-Efficient Cloning Attacks Identification in Large-Scale RFID Systems

    Directory of Open Access Journals (Sweden)

    Ju-min Zhao

    2017-01-01

    Full Text Available Radio Frequency Identification (RFID) is an emerging technology for the electronic labeling of objects for the purpose of automatically identifying, categorizing, locating, and tracking the objects. But in their current form RFID systems are susceptible to cloning attacks that seriously threaten RFID applications but are hard to prevent. Existing protocols aim at detecting whether there are cloning attacks in single-reader RFID systems. In this paper, we investigate cloning attack identification in the multi-reader scenario and propose a time-efficient protocol, called the time-efficient Cloning Attacks Identification Protocol (CAIP), to identify all cloned tags in multi-reader RFID systems. We evaluate the performance of CAIP through extensive simulations. The results show that CAIP can identify all the cloned tags in large-scale RFID systems fairly fast with the required accuracy.
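
    The CAIP protocol itself operates at the level of framed-slotted-ALOHA inventory rounds; purely to illustrate what "identifying cloned tags across multiple readers" means, here is a naive physical-consistency check that flags a tag ID seen in two places it could not have travelled between. The reader layout, speed limit, and all helper names are invented for the example and are not part of CAIP.

```python
from collections import defaultdict

def find_cloned_tags(sightings, reader_pos, max_speed):
    """Flag a tag as cloned when two consecutive sightings are farther apart
    than the tag could physically have travelled in the elapsed time."""
    by_tag = defaultdict(list)
    for tag, reader, t in sightings:
        by_tag[tag].append((t, reader))
    cloned = set()
    for tag, events in by_tag.items():
        events.sort()                                   # order sightings by time
        for (t1, r1), (t2, r2) in zip(events, events[1:]):
            (x1, y1), (x2, y2) = reader_pos[r1], reader_pos[r2]
            d = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            if d > max_speed * (t2 - t1):               # impossible movement
                cloned.add(tag)
    return cloned

# Two readers 100 m apart; tagA "teleports" in 1 s (a clone), tagB moves plausibly.
readers = {"R1": (0.0, 0.0), "R2": (100.0, 0.0)}
sightings = [("tagA", "R1", 0.0), ("tagA", "R2", 1.0),
             ("tagB", "R1", 0.0), ("tagB", "R2", 60.0)]
cloned = find_cloned_tags(sightings, readers, max_speed=5.0)
```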

  13. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.

  14. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors have once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates a new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  15. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) in large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
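
    Latent semantic analysis, as used in LCL, amounts to a low-rank factorization of a region-by-visual-word count matrix; the leading latent directions then correspond to recurring "categories" such as a dominant background (e.g. sky). A dependency-free sketch that extracts the top latent direction by power iteration on A^T A (the toy counts and the background/object split are invented for illustration, and the real method learns many components with a selection step):

```python
def top_latent_direction(A, iters=100):
    """Leading right-singular vector of A via power iteration on A^T A.
    In LSA terms, this is the dominant latent 'category' over the vocabulary."""
    n = len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        Av = [sum(a * x for a, x in zip(row, v)) for row in A]               # A v
        v = [sum(A[r][j] * Av[r] for r in range(len(A))) for j in range(n)]  # A^T (A v)
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    return v

# Toy region-by-visual-word counts. Words 0-1 are "sky-like" background words,
# words 2-3 are object words; background regions dominate the collection.
A = [
    [9.0, 8.0, 0.0, 1.0],
    [8.0, 9.0, 1.0, 0.0],
    [9.0, 9.0, 0.0, 0.0],
    [0.0, 1.0, 7.0, 8.0],
    [1.0, 0.0, 8.0, 7.0],
]
v = top_latent_direction(A)  # loads mainly on the background words 0 and 1
```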

  16. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  17. The Potential of Threshold Concepts: An Emerging Framework for Educational Research and Practice

    Science.gov (United States)

    Lucas, Ursula; Mladenovic, Rosina

    2007-01-01

    This paper explores the notion of a "threshold concept" and discusses its possible implications for higher education research and practice. Using the case of introductory accounting as an illustration, it is argued that the idea of a threshold concept provides an emerging theoretical framework for a "re-view" of educational…

  18. Emerging and potentially emerging viruses in water environments

    Directory of Open Access Journals (Sweden)

    Giuseppina La Rosa

    2012-12-01

    Full Text Available Among microorganisms, viruses are best fit to become emerging pathogens, since they are able to adapt not only by mutation but also through recombination and reassortment and can thus become able to infect new hosts and adjust to new environments. Enteric viruses are among the commonest and most hazardous waterborne pathogens, causing both sporadic and outbreak-related illness. The main health effect associated with enteric viruses is gastrointestinal illness, but they can also cause respiratory symptoms, conjunctivitis, hepatitis, central nervous system infections, and chronic diseases. Non-enteric viruses, such as respiratory and epitheliotrophic viruses, are not considered waterborne, as they are not readily transmitted to water sources from infected individuals. The present review will focus on viral pathogens shown to be transmitted through water. It will also provide an overview of viruses that had not been a concern for waterborne transmission in the past, but that may represent potentially emerging waterborne pathogens due to their occurrence and persistence in water environments.

  19. Biofuel Development and Large-Scale Land Deals in Sub-Saharan Africa

    OpenAIRE

    Giorgia Giovannetti; Elisa Ticci

    2013-01-01

    Africa's biofuel potential over the last ten years has increasingly attracted foreign investors' attention. We estimate the determinants of foreign investors' land demand for biofuel production in SSA, using Poisson specifications of the gravity model. Our estimates suggest that land availability, abundance of water resources and weak land governance are significant determinants of large-scale land acquisitions for biofuel production. This in turn suggests that this type of investment is mainl...

  20. Relativistic jets without large-scale magnetic fields

    Science.gov (United States)

    Parfrey, K.; Giannios, D.; Beloborodov, A.

    2014-07-01

    The canonical model of relativistic jets from black holes requires a large-scale ordered magnetic field to provide a significant magnetic flux through the ergosphere; in the Blandford-Znajek process, the jet power scales with the square of the magnetic flux. In many jet systems the presence of the required flux in the environment of the central engine is questionable. I will describe an alternative scenario, in which jets are produced by the continuous sequential accretion of small magnetic loops. The magnetic energy stored in these coronal flux systems is amplified by the differential rotation of the accretion disc and by the rotating spacetime of the black hole, leading to runaway field line inflation, magnetic reconnection in thin current layers, and the ejection of discrete bubbles of Poynting-flux-dominated plasma. For illustration I will show the results of general-relativistic force-free electrodynamic simulations of rotating black hole coronae, performed using a new resistivity model. The dissipation of magnetic energy by coronal reconnection events, as demonstrated in these simulations, is a potential source of the observed high-energy emission from accreting compact objects.

  1. A global classification of coastal flood hazard climates associated with large-scale oceanographic forcing.

    Science.gov (United States)

    Rueda, Ana; Vitousek, Sean; Camus, Paula; Tomás, Antonio; Espejo, Antonio; Losada, Inigo J; Barnard, Patrick L; Erikson, Li H; Ruggiero, Peter; Reguero, Borja G; Mendez, Fernando J

    2017-07-11

    Coastal communities throughout the world are exposed to numerous and increasing threats, such as coastal flooding and erosion, saltwater intrusion and wetland degradation. Here, we present the first global-scale analysis of the main drivers of coastal flooding due to large-scale oceanographic factors. Given the large dimensionality of the problem (e.g. spatiotemporal variability in flood magnitude and the relative influence of waves, tides and surge levels), we have performed a computer-based classification to identify geographical areas with homogeneous climates. Results show that 75% of coastal regions around the globe have the potential for very large flooding events with low probabilities (unbounded tails), 82% are tide-dominated, and almost 49% are highly susceptible to increases in flooding frequency due to sea-level rise.
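
    The classification described above is, in essence, a clustering of coastal sites by the relative contribution of the large-scale drivers (waves, tides, surge) to flooding. A self-contained two-cluster k-means sketch of that idea (the site signatures, the deterministic initialization, and the helper names are invented for illustration; the study's classification uses a richer driver parameterization):

```python
def dist2(p, q):
    """Squared Euclidean distance between two driver signatures."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans2(points, iters=50):
    """Two-cluster k-means, initialized from the first and last point
    so the sketch is deterministic and reproducible."""
    centers = [list(points[0]), list(points[-1])]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each site to the nearest center, then move each center
        # to the mean of its members.
        labels = [0 if dist2(p, centers[0]) <= dist2(p, centers[1]) else 1
                  for p in points]
        for c in (0, 1):
            members = [p for p, a in zip(points, labels) if a == c]
            if members:
                centers[c] = [sum(m[d] for m in members) / len(members)
                              for d in range(len(points[0]))]
    return centers, labels

# Hypothetical normalized flood-driver signatures per site: (wave, tide, surge).
sites = [[0.90, 0.10, 0.10], [0.80, 0.20, 0.10], [0.85, 0.15, 0.20],  # wave-dominated
         [0.10, 0.90, 0.10], [0.20, 0.80, 0.20], [0.15, 0.85, 0.10]]  # tide-dominated
centers, labels = kmeans2(sites)
```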

  2. A Novel Architecture of Large-scale Communication in IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described in detail a large-scale communication architecture for the IoT. In fact, the non-uniformity of technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  3. Dust Destruction in the ISM: A Re-Evaluation of Dust Lifetimes

    Science.gov (United States)

    Jones, A. P.; Nuth, J. A., III

    2011-01-01

    There is a long-standing conundrum in interstellar dust studies relating to the discrepancy between the time-scales for dust formation from evolved stars and the apparently more rapid destruction in supernova-generated shock waves. Aims. We re-examine some of the key issues relating to dust evolution and processing in the interstellar medium. Methods. We use recent and new constraints from observations, experiments, modelling and theory to re-evaluate dust formation in the interstellar medium (ISM). Results. We find that the discrepancy between the dust formation and destruction time-scales may not be as significant as has previously been assumed because of the very large uncertainties involved. Conclusions. The derived silicate dust lifetime could be compatible with its injection time-scale, given the inherent uncertainties in the dust lifetime calculation. The apparent need to re-form significant quantities of silicate dust in the tenuous interstellar medium may therefore not be a strong requirement. Carbonaceous matter, on the other hand, appears to be rapidly recycled in the ISM and, in contrast to silicates, there are viable mechanisms for its re-formation in the ISM.

  4. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  5. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    Science.gov (United States)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study the reliability of surface supply in the canal systems of Punjab, Pakistan, using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high-frequency (10-minute) flow measurements over one year. The reliability quantification is based on a comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspective of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when the data are examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (which are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
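The comparison of actual deliveries with entitlements can be expressed as a simple reliability ratio. This is a hedged sketch of the idea, not the study's metric set: the 10% tolerance and the 10-daily volumes below are illustrative assumptions.

```python
def supply_reliability(entitlements, deliveries, tolerance=0.10):
    """Fraction of periods in which the actual delivery falls within
    a relative `tolerance` of the entitled volume."""
    assert len(entitlements) == len(deliveries)
    ok = sum(
        1 for e, d in zip(entitlements, deliveries)
        if e > 0 and abs(d - e) / e <= tolerance
    )
    return ok / len(entitlements)

# hypothetical 10-daily entitlements vs. actual deliveries (arbitrary volume units)
entitled  = [100, 100, 120, 120, 80, 80]
delivered = [ 95, 102, 100, 118, 60, 81]
r = supply_reliability(entitled, delivered)  # 4 of 6 periods within 10%
```

Computing this ratio per canal and per season is one way to expose the multi-scale variation in reliability that the abstract describes.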

  6. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, whose data volumes and throughput rates are orders of magnitude larger than those of present-day missions. Additionally, traditional on-premise hardware procurement is already limited by facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require short turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full-physics data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to run processing on Amazon's spot market, which can potentially offer ~10X cost savings but provides an unpredictable computing environment driven by market forces. We present how we enabled fault-tolerant computing in order to achieve large-scale processing as well as operational cost savings.

  7. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows processing of the signal coming from a single neutronic sensor in three different running modes: pulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large-scale channel and a brief description of it, together with the results obtained so far in this domain. - A thorough study of the fluctuation mode and a definition of the improvements to be made. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large-scale channel the data-processing method is analog. - To avoid the problems generated by analog processing of the fluctuation signal, a digital data-processing method is tested and its validity demonstrated. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr

  8. Large-Scale Analysis of Network Bistability for Human Cancers

    Science.gov (United States)

    Shiraishi, Tetsuya; Matsuyama, Shinako; Kitano, Hiroaki

    2010-01-01

    Protein–protein interaction and gene regulatory networks are likely to be locked in a state corresponding to a disease by the behavior of one or more bistable circuits exhibiting switch-like behavior. Sets of genes could be over-expressed or repressed when anomalies due to disease appear, and the circuits responsible for this over- or under-expression might persist for as long as the disease state continues. This paper shows how a large-scale analysis of network bistability for various human cancers can identify genes that can potentially serve as drug targets or diagnosis biomarkers. PMID:20628618
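The switch-like behavior invoked above can be illustrated with a toy bistable circuit: a self-activating gene with Hill-type production and linear decay. The parameters are invented for illustration and this is not the paper's network model; it only shows how one circuit can hold two stable expression states.

```python
def rate(x, beta=2.0, K=1.0, n=4, basal=0.05, gamma=1.0):
    """dx/dt for a self-activating gene: basal + Hill production - linear decay."""
    return basal + beta * x**n / (K**n + x**n) - gamma * x

# locate fixed points (dx/dt = 0) by scanning for sign changes on a grid
xs = [i * 0.001 for i in range(3001)]          # x in [0, 3]
fixed = []
for a, b in zip(xs, xs[1:]):
    ra, rb = rate(a), rate(b)
    if (ra >= 0 > rb) or (ra < 0 <= rb):
        # a downward crossing (+ to -) is a stable fixed point
        fixed.append(((a + b) / 2, ra >= 0 > rb))

stable = [x for x, s in fixed if s]            # low "off" state and high "on" state
```

With these parameters the scan finds three fixed points, two of them stable, so the circuit can be "locked" in either the low or the high state, which is the behavior the bistability analysis searches for at network scale.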

  9. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    ERDC/CHL CHETN-I-88, April 2016. Approved for public release; distribution is unlimited. This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  10. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO2 region leading to large-scale convective mixing that can be a significant driver for CO2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large and small-scale effects are essential to predict the role of these processes on the long-term storage security of CO2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO2 emissions, and explore the sensitivity of CO2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO2 storage sites. © 2012 Elsevier Ltd.

  11. Emergency contraception - potential for women's health.

    Science.gov (United States)

    Mittal, Suneeta

    2014-11-01

    Emergency contraception (EC) is a safe and effective method used to prevent unwanted pregnancy after unprotected sexual intercourse. Many unwanted pregnancies end in unsafe abortions. The search for an ideal contraceptive, which does not interfere with the spontaneity or pleasure of the sexual act yet effectively controls fertility, is still continuing. Numerous contraceptive techniques are available, yet contraceptive coverage continues to be poor in India. Thus, even when a pregnancy is not planned, exposure to unprotected sex often takes place, necessitating the use of emergency contraception. This need may also arise due to failure of the contraceptive method being used (condom rupture, diaphragm slippage, forgotten oral pills) or following sexual assault. Emergency contraception is an intervention that can prevent a large number of unwanted pregnancies resulting from failure of regular contraception or unplanned sexual activity, which in turn helps reduce the maternal mortality and morbidity due to unsafe abortions. However, concern has been expressed regarding repeated and indiscriminate usage of the e-pill; currently, the rational use of emergency contraception is being promoted, as it is expected to make a significant dent in the number of unwanted pregnancies and unsafe abortions. In fact, since the introduction of emergency contraception, the contribution of unsafe abortion to maternal mortality has declined from 13 to 8 per cent.

  12. Very-large-scale production of antibodies in plants: The biologization of manufacturing.

    Science.gov (United States)

    Buyel, J F; Twyman, R M; Fischer, R

    2017-07-01

    Gene technology has facilitated the biologization of manufacturing, i.e. the use and production of complex biological molecules and systems at an industrial scale. Monoclonal antibodies (mAbs) are currently the major class of biopharmaceutical products, but they are typically used to treat specific diseases which individually have comparatively low incidences. The therapeutic potential of mAbs could also be applied to more prevalent diseases, but this would require a massive increase in production capacity that could not be met by traditional fermenter systems. Here we outline the potential of plants to be used for the very-large-scale (VLS) production of biopharmaceutical proteins such as mAbs. We discuss the potential market sizes and their corresponding production capacities. We then consider available process technologies and scale-down models and how these can be used to develop VLS processes. Finally, we discuss which adaptations will likely be required for VLS production, lessons learned from existing cell culture-based processes and the food industry, and practical requirements for the implementation of a VLS process. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they concern: species limitations (market, biological and technological); site selection; feed; manpower needs; and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  14. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  15. Addressing Criticisms of Large-Scale Marine Protected Areas

    Science.gov (United States)

    Ban, Natalie C; Fernandez, Miriam; Friedlander, Alan M; García-Borboroglu, Pablo; Golbuu, Yimnang; Guidetti, Paolo; Harris, Jean M; Hawkins, Julie P; Langlois, Tim; McCauley, Douglas J; Pikitch, Ellen K; Richmond, Robert H; Roberts, Callum M

    2018-01-01

    Abstract Designated large-scale marine protected areas (LSMPAs, 100,000 or more square kilometers) constitute over two-thirds of the approximately 6.6% of the ocean and approximately 14.5% of the exclusive economic zones within marine protected areas. Although LSMPAs have received support among scientists and conservation bodies for wilderness protection, regional ecological connectivity, and improving resilience to climate change, there are also concerns. We identified 10 common criticisms of LSMPAs along three themes: (1) placement, governance, and management; (2) political expediency; and (3) social–ecological value and cost. Through critical evaluation of scientific evidence, we discuss the value, achievements, challenges, and potential of LSMPAs in these arenas. We conclude that although some criticisms are valid and need addressing, none pertain exclusively to LSMPAs, and many involve challenges ubiquitous in management. We argue that LSMPAs are an important component of a diversified management portfolio that tempers potential losses, hedges against uncertainty, and enhances the probability of achieving sustainably managed oceans. PMID:29731514

  16. THE DECAY OF A WEAK LARGE-SCALE MAGNETIC FIELD IN TWO-DIMENSIONAL TURBULENCE

    Energy Technology Data Exchange (ETDEWEB)

    Kondić, Todor; Hughes, David W.; Tobias, Steven M., E-mail: t.kondic@leeds.ac.uk [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2016-06-01

    We investigate the decay of a large-scale magnetic field in the context of incompressible, two-dimensional magnetohydrodynamic turbulence. It is well established that a very weak mean field, of strength significantly below the equipartition value, induces a small-scale field strong enough to inhibit the process of turbulent magnetic diffusion. In light of ever-increasing computer power, we revisit this problem to investigate fluid and magnetic Reynolds numbers that were previously inaccessible. Furthermore, by exploiting the relation between the turbulent diffusion of the magnetic potential and that of the magnetic field, we are able to calculate the turbulent magnetic diffusivity extremely accurately through the imposition of a uniform mean magnetic field. We confirm the strong dependence of the turbulent diffusivity on the product of the magnetic Reynolds number and the energy of the large-scale magnetic field. We compare our findings with various theoretical descriptions of this process.
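For context, a hedged sketch of the standard mean-field framework behind such a measurement (the generic picture, not this paper's specific result): with molecular diffusivity \(\eta\) and turbulent diffusivity \(\eta_t\), a large-scale field of wavenumber \(k\) obeys

```latex
\frac{\partial \overline{B}}{\partial t}
  = (\eta + \eta_t)\,\nabla^{2}\overline{B}
\quad\Longrightarrow\quad
\overline{B}(t) \propto e^{-(\eta + \eta_t)\,k^{2} t},
```

so measuring the decay rate, or equivalently the response to an imposed uniform field, yields \(\eta_t\). Suppression of turbulent diffusion corresponds to \(\eta_t\) falling well below its kinematic estimate as the small-scale field grows.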

  17. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    Science.gov (United States)

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

    Isoprenoids and alkanes produced and secreted by microorganisms are emerging as alternative biofuels for diesel and jet fuel replacements. As in other bioprocesses comprising an organic liquid phase, the presence of microorganisms, the medium composition, and the process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process that overcomes this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding at the microscale can give insights into how to improve large-scale processes, and we review the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. The impact of movements and animal density on continental scale cattle disease outbreaks in the United States.

    Directory of Open Access Journals (Sweden)

    Michael G Buhnerkempe

    Full Text Available Globalization has increased the potential for the introduction and spread of novel pathogens over large spatial scales, necessitating continental-scale disease models to guide emergency preparedness. Livestock disease spread models, such as those for the 2001 foot-and-mouth disease (FMD) epidemic in the United Kingdom, represent some of the best case studies of large-scale disease spread. However, generalization of these models to explore disease outcomes in other systems, such as the United States' cattle industry, has been hampered by differences in system size and complexity and the absence of suitable livestock movement data. Here, a unique database of US cattle shipments allows estimation of synthetic movement networks that inform a near-continental-scale disease model of a potential FMD-like (i.e., rapidly spreading) epidemic in US cattle. The largest epidemics may affect over one-third of the US and 120,000 cattle premises, but cattle movement restrictions from infected counties, as opposed to national movement moratoriums, are found to effectively contain outbreaks. Slow detection or weak compliance may necessitate more severe state-level bans for similar control. Such results highlight the role of large-scale disease models in emergency preparedness, particularly for systems lacking comprehensive movement and outbreak data, and the need to rapidly implement multi-scale contingency plans during a potential US outbreak.

  19. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein-coding sequences require a number of preparatory, inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  20. P08.52 Proton therapy re-Irradiation in large-volume recurrent glioblastoma.

    Science.gov (United States)

    Amelio, D.; Widesott, L.; Vennarini, S.; Fellin, F.; Maines, F.; Righetto, R.; Lorentini, S.; Farace, P.; Schwarz, M.; Amichetti, M.

    2016-01-01

    Abstract Purpose: To report preliminary results of re-irradiation with proton therapy (PT) in large-volume recurrent glioblastoma (rGBM). Material/Methods: Between January and December 2015, ten patients (pts) with rGBM were re-irradiated with PT. All pts were previously treated with photon radiotherapy (60 Gy) with concomitant and adjuvant TMZ for 1–20 cycles (median, 7). Seven pts were re-irradiated at first relapse/progression. Four patients were re-irradiated after partial tumor resection. Median age and Karnofsky performance status at re-irradiation were 57 years (range, 41–68) and 80% (range, 70–100), respectively. Median time between prior radiotherapy and PT was 9 months (range, 5–24). Target definition was based on CT, MR, and 18F-DOPA PET imaging. GTV included any area of contrast enhancement after contrast medium administration plus any pathological PET uptake regions. CTV was generated by adding to GTV a 3-mm uniform margin, manually corrected in proximity of anatomical barriers. CTV was expanded by 4 mm to create PTV. Median PTV volume was 90 cc (range, 46–231). All pts received 36 GyRBE in 18 fractions. Four pts also received concomitant temozolomide (75 mg/m2/day, 7 days/week). All pts were treated with active beam scanning PT using 2–3 fields with a single-field optimization technique. Results: All pts completed the treatment without breaks. Registered acute side effects (according to Common Terminology Criteria for Adverse Events version 4.0 - CTCAE) include grade 1–2 skin erythema, alopecia, fatigue, conjunctivitis, concentration impairment, dysphasia, and headache. There were no grade 3 or higher toxicities. One patient developed grade 1 neutropenia. Five pts started PT on steroids (2–7 mg/day); two of them reduced the dose during PT, while three kept the same steroid dose. None of the remaining pts needed steroid therapy. Registered late side effects (according to CTCAE version 4.0) include grade 1–2 alopecia, fatigue

  1. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    Science.gov (United States)

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons of the distributions of large-scale structures in turbulent flow with distributions based on time-dependent signals from stationary probes and the Taylor hypothesis are presented. The study investigated a region in the near field of a 7.62 cm circular air jet at Re = 32,000, in which coherent structures were induced through small-amplitude controlled excitation and stable vortex pairing in the jet-column mode. Hot-wire and X-wire anemometry were employed to establish phase-averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties, with results indicating that use of the local time-average velocity or streamwise velocity produces large distortions.

  2. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    Science.gov (United States)

    Her, Y. G.

    2017-12-01

    Fine-resolution large-scale (FL) modeling can provide an overall picture of the hydrological cycle and transport while taking unique local conditions into account in the simulation. It can also help develop water resources management plans that are consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to become common in the near future as global-scale remotely sensed data emerge and computing resources advance rapidly. There are several spatially distributed models available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM) to describe two-dimensional overland processes, which require excessive computing resources to manipulate large matrices (implicit schemes) or small simulation time intervals to maintain the stability of the solution (explicit schemes). Others make unrealistic assumptions, such as a constant overland flow velocity, to reduce the computational load of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds of different landscapes and sizes, from 3.5 km2 to 2,800 km2, at a spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs, including maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges were discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved by employing advanced computing techniques and hydrological understandings, by using remotely sensed hydrological
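The stability constraint on explicit schemes mentioned above is the CFL condition, which ties the allowable time step to the grid spacing. A minimal sketch (the 30 m cell size matches the abstract; the 1.5 m/s flow velocity and unit Courant number are illustrative assumptions):

```python
def max_stable_dt(dx, velocity, courant=1.0):
    """Largest explicit time step satisfying the CFL condition
    velocity * dt / dx <= courant for a 1-D kinematic-wave (advection) scheme."""
    return courant * dx / velocity

dt = max_stable_dt(dx=30.0, velocity=1.5)   # 20 s per explicit step
steps_per_hour = 3600.0 / dt                # 180 steps per simulated hour
```

Halving the cell size halves the allowable step, so doubling the spatial resolution roughly quadruples the work per unit area, which is why fine resolution over large domains is computationally demanding.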

  3. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinklers were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler reached 95 ha. In 1989 there were 98 sprinklers, covering an area of more than 10 130 ha. The study was conducted in 1986÷1998 on 7 large sprinklers with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 90s and the ownership changes in agriculture, the large-scale sprinklers underwent significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers and limitations (system constraints, supply difficulties, high equipment failure rates) that did not encourage rational use of the available sprinklers. A field inspection of the local area was carried out to document the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler system.

  4. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol-gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  5. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region.

    Science.gov (United States)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-07-01

    The causation between the bioavailability of heavy metals and environmental factors is generally established from field experiments at local scales, and sufficient evidence from large scales is lacking. Inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessment across large-scale regions can, at the expense of actual causation, yield spurious insights. In this study, a general approach framework, Intervention calculus when the directed acyclic graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and the potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0-20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors inferred from field experiments is consistent with that inferred on the large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management. Copyright © 2017 Elsevier Ltd. All
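The backdoor-criterion part of the framework can be illustrated with a minimal stratification sketch. This is a generic example of backdoor adjustment on synthetic data (the samples, the binary variables, and the effect sizes are all invented), not the IDA algorithm or the study's data:

```python
from collections import defaultdict

def backdoor_effect(samples):
    """Average causal effect of binary X on Y, adjusting for the confounder Z
    via the backdoor formula: E[Y|do(X=x)] = sum_z E[Y | X=x, Z=z] * P(Z=z)."""
    by_zx = defaultdict(list)
    z_counts = defaultdict(int)
    for z, x, y in samples:
        by_zx[(z, x)].append(y)
        z_counts[z] += 1
    n = len(samples)
    effect = 0.0
    for z, nz in z_counts.items():
        mean1 = sum(by_zx[(z, 1)]) / len(by_zx[(z, 1)])
        mean0 = sum(by_zx[(z, 0)]) / len(by_zx[(z, 0)])
        effect += (mean1 - mean0) * nz / n
    return effect

# synthetic samples (z, x, y): z raises y and also makes exposure x more likely
samples = ([(0, 0, 0)] * 8 + [(0, 1, 2)] * 2 +
           [(1, 0, 5)] * 2 + [(1, 1, 7)] * 8)
adjusted = backdoor_effect(samples)            # recovers the true effect of 2.0

treated = [y for z, x, y in samples if x == 1]
control = [y for z, x, y in samples if x == 0]
naive = sum(treated) / len(treated) - sum(control) / len(control)  # 5.0, confounded
```

The naive difference of means is inflated by the confounder, while adjusting within strata of Z recovers the true effect; the same logic, applied with estimated DAG structures, underlies the large-scale causal inference described above.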

  6. Re-entering fast ion effects on NBI heating power in high-beta plasmas of the Large Helical Device

    International Nuclear Information System (INIS)

    Seki, Ryosuke; Watanabe, Kiyomasa; Funaba, Hisamichi; Suzuki, Yasuhiro; Sakakibara, Satoru; Ohdachi, Satoshi; Matsumoto, Yutaka; Hamamatsu, Kiyotaka

    2011-10-01

    We calculate the heating power of neutral beam injection (NBI) in the β = 4.8% high-beta discharge achieved in the Large Helical Device (LHD), and investigate how the heating efficiency and the heating power profile differ with and without re-entering fast-ion effects. When re-entering fast ions are taken into account, the heating efficiency for co-injected NBI (co-NBI case) improves, becoming about 1.8 times larger than that without the re-entering effects. In contrast, the heating efficiency with the re-entering effects for counter-injected NBI (ctr-NBI case) hardly differs from that without them. We also study the re-entering fast-ion effects on the transport properties in LHD high-beta discharges, and find that the dependence of the thermal conductivities on the beta value is not very sensitive to the re-entering effects. In addition, we investigate how the re-entering fast-ion effects change with field strength and magnetic configuration. In the co-NBI case, the re-entering fast-ion effect on the heating efficiency increases as the field strength decreases; in the ctr-NBI case, it hardly changes with field strength. (author)

  7. In situ vitrification: Preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-02-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic-contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: (1) determine large-scale processing performance and (2) produce a waste form that can be fully evaluated. The test thus provides technical data for evaluating ISV as a potential technique for the final disposal of transuranic-contaminated soil sites at Hanford. Because of the test's successful completion, technical data on the vitrified soil will be available within a year to determine how well the process incorporates transuranics into the waste form and how well the form resists leaching of transuranics. Preliminary results available include retention of transuranics and other elements within the waste form during processing and the efficiency of the off-gas treatment system in removing contaminants from the gaseous effluents. 13 refs., 10 figs., 5 tabs

  8. Systems Perturbation Analysis of a Large-Scale Signal Transduction Model Reveals Potentially Influential Candidates for Cancer Therapeutics

    Science.gov (United States)

    Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš

    2016-01-01

    -related components and tumor-suppressor genes, suggesting that this combinatorial perturbation may lead to a better target for decreasing cell proliferation and inducing apoptosis. Finally, our approach shows a potential to identify and prioritize therapeutic targets through systemic perturbation analysis of large-scale computational models of signal transduction. Although some components of the presented computational results have been validated against independent gene expression data sets, more laboratory experiments are warranted to more comprehensively validate the presented results. PMID:26904540

  9. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region

    International Nuclear Information System (INIS)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-01-01

    The causation between the bioavailability of heavy metals and environmental factors is generally established from field experiments at local scales, and sufficient evidence at large scales is lacking. Inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessment across large-scale regions can, at the expense of actual causation, yield spurious insights. In this study, a general approach framework, Intervention calculus when the directed acyclic graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and the potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0–20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors found in field experiments is consistent with that at the large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management. - Causation between the
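    The backdoor adjustment underlying the IDA+BC framework can be illustrated with a toy example: a confounder that drives both a candidate cause and the bioavailability outcome biases the naive regression, while including the confounder as an adjustment variable recovers the causal effect. All variable names and coefficients below are invented for illustration; the study itself learns the causal structure from data rather than assuming it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical confounder: soil organic matter drives both pH and Cd uptake
om = rng.normal(0.0, 1.0, n)
ph = 0.8 * om + rng.normal(0.0, 1.0, n)           # candidate causal factor
cd = 2.0 * ph + 1.5 * om + rng.normal(0.0, 1.0, n)  # Cd bioavailability (true effect of ph is 2.0)

# Naive regression of cd on ph alone is biased upward by the confounder
naive = np.polyfit(ph, cd, 1)[0]

# Backdoor adjustment: regress on ph while controlling for the confounder
X = np.column_stack([ph, om, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, cd, rcond=None)
adjusted = beta[0]

print(f"naive={naive:.2f} adjusted={adjusted:.2f}")
```

With this data-generating process the naive slope overestimates the true coefficient of 2.0, while the adjusted slope is close to it, which is exactly the spurious-insight problem the abstract attributes to correlation-based approaches.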

  10. Disappearing scales in carps: re-visiting Kirpichnikov's model on the genetics of scale pattern formation.

    Directory of Open Access Journals (Sweden)

    Laura Casas

    Full Text Available The body of most fishes is fully covered by scales that typically form tight, partially overlapping rows. While some of the genes controlling the formation and growth of fish scales have been studied, very little is known about the genetic mechanisms regulating scale pattern formation. Although the existence of two genes with two pairs of alleles (S&s and N&n) regulating scale coverage in cyprinids has been predicted by Kirpichnikov and colleagues nearly eighty years ago, their identity was unknown until recently. In 2009, the 'S' gene was found to be a paralog of fibroblast growth factor receptor 1, fgfr1a1, while the second gene called 'N' has not yet been identified. We re-visited the original model of Kirpichnikov that proposed four major scale pattern types and observed a high degree of variation within the so-called scattered phenotype due to which this group was divided into two sub-types: classical mirror and irregular. We also analyzed the survival rates of offspring groups and found a distinct difference between Asian and European crosses. Whereas nude × nude crosses involving at least one parent of Asian origin or hybrid with Asian parent(s) showed the 25% early lethality predicted by Kirpichnikov (due to the lethality of the NN genotype), those with two Hungarian nude parents did not. We further extended Kirpichnikov's work by correlating changes in phenotype (scale-pattern) to the deformations of fins and losses of pharyngeal teeth. We observed phenotypic changes which were not restricted to nudes, as described by Kirpichnikov, but were also present in mirrors (and presumably in linears as well; not analyzed in detail here). We propose that the gradation of phenotypes observed within the scattered group is caused by a gradually decreasing level of signaling (a dose-dependent effect) probably due to a concerted action of multiple pathways involved in scale formation.

  11. Disappearing scales in carps: Re-visiting Kirpichnikov's model on the genetics of scale pattern formation

    KAUST Repository

    Casas, Laura; Szűcs, Réka; Vij, Shubha; Goh, Chin Heng; Kathiresan, Purushothaman; Németh, Sándor; Jeney, Zsigmond; Bercsényi, Miklós; Orbán, László

    2013-01-01

    The body of most fishes is fully covered by scales that typically form tight, partially overlapping rows. While some of the genes controlling the formation and growth of fish scales have been studied, very little is known about the genetic mechanisms regulating scale pattern formation. Although the existence of two genes with two pairs of alleles (S&s and N&n) regulating scale coverage in cyprinids has been predicted by Kirpichnikov and colleagues nearly eighty years ago, their identity was unknown until recently. In 2009, the 'S' gene was found to be a paralog of fibroblast growth factor receptor 1, fgfr1a1, while the second gene called 'N' has not yet been identified. We re-visited the original model of Kirpichnikov that proposed four major scale pattern types and observed a high degree of variation within the so-called scattered phenotype due to which this group was divided into two sub-types: classical mirror and irregular. We also analyzed the survival rates of offspring groups and found a distinct difference between Asian and European crosses. Whereas nude × nude crosses involving at least one parent of Asian origin or hybrid with Asian parent(s) showed the 25% early lethality predicted by Kirpichnikov (due to the lethality of the NN genotype), those with two Hungarian nude parents did not. We further extended Kirpichnikov's work by correlating changes in phenotype (scale-pattern) to the deformations of fins and losses of pharyngeal teeth. We observed phenotypic changes which were not restricted to nudes, as described by Kirpichnikov, but were also present in mirrors (and presumably in linears as well; not analyzed in detail here). We propose that the gradation of phenotypes observed within the scattered group is caused by a gradually decreasing level of signaling (a dose-dependent effect) probably due to a concerted action of multiple pathways involved in scale formation. © 2013 Casas et al.

  12. Disappearing scales in carps: Re-visiting Kirpichnikov's model on the genetics of scale pattern formation

    KAUST Repository

    Casas, Laura

    2013-12-30

    The body of most fishes is fully covered by scales that typically form tight, partially overlapping rows. While some of the genes controlling the formation and growth of fish scales have been studied, very little is known about the genetic mechanisms regulating scale pattern formation. Although the existence of two genes with two pairs of alleles (S&s and N&n) regulating scale coverage in cyprinids has been predicted by Kirpichnikov and colleagues nearly eighty years ago, their identity was unknown until recently. In 2009, the 'S' gene was found to be a paralog of fibroblast growth factor receptor 1, fgfr1a1, while the second gene called 'N' has not yet been identified. We re-visited the original model of Kirpichnikov that proposed four major scale pattern types and observed a high degree of variation within the so-called scattered phenotype due to which this group was divided into two sub-types: classical mirror and irregular. We also analyzed the survival rates of offspring groups and found a distinct difference between Asian and European crosses. Whereas nude × nude crosses involving at least one parent of Asian origin or hybrid with Asian parent(s) showed the 25% early lethality predicted by Kirpichnikov (due to the lethality of the NN genotype), those with two Hungarian nude parents did not. We further extended Kirpichnikov's work by correlating changes in phenotype (scale-pattern) to the deformations of fins and losses of pharyngeal teeth. We observed phenotypic changes which were not restricted to nudes, as described by Kirpichnikov, but were also present in mirrors (and presumably in linears as well; not analyzed in detail here). We propose that the gradation of phenotypes observed within the scattered group is caused by a gradually decreasing level of signaling (a dose-dependent effect) probably due to a concerted action of multiple pathways involved in scale formation. © 2013 Casas et al.

  13. Re-thinking stages of cognitive development: an appraisal of connectionist models on the balance scale task

    NARCIS (Netherlands)

    Quinlan, P.T.; van der Maas, H.L.J.; Jansen, B.R.J.; Booij, O.; Rendell, M.

    2007-01-01

    The present paper re-appraises connectionist attempts to explain how human cognitive development appears to progress through a series of sequential stages. Models of performance on the Piagetian balance scale task are the focus of attention. Limitations of these models are discussed and replications

  14. The linearly scaling 3D fragment method for large scale electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Zhao Zhengji [National Energy Research Scientific Computing Center (NERSC) (United States); Meza, Juan; Shan Hongzhang; Strohmaier, Erich; Bailey, David; Wang Linwang [Computational Research Division, Lawrence Berkeley National Laboratory (United States); Lee, Byounghak, E-mail: ZZhao@lbl.go [Physics Department, Texas State University (United States)

    2009-07-01

    The linearly scaling three-dimensional fragment (LS3DF) method is an O(N) ab initio electronic structure method for large-scale nano material simulations. It is a divide-and-conquer approach with a novel patching scheme that effectively cancels out the artificial boundary effects, which exist in all divide-and-conquer schemes. This method has made ab initio simulations of thousand-atom nanosystems feasible in a couple of hours, while retaining essentially the same accuracy as the direct calculation methods. The LS3DF method won the 2008 ACM Gordon Bell Prize for algorithm innovation. Our code has reached 442 Tflop/s running on 147,456 processors on the Cray XT5 (Jaguar) at OLCF, and has been run on 163,840 processors on the Blue Gene/P (Intrepid) at ALCF, and has been applied to a system containing 36,000 atoms. In this paper, we will present the recent parallel performance results of this code, and will apply the method to asymmetric CdSe/CdS core/shell nanorods, which have potential applications in electronic devices and solar cells.

  15. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries: Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  16. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)
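    The parenthetical remark above ("distance is proportional to radial velocity") is the Hubble law; a galaxy's peculiar velocity is then defined as the residual from that relation (standard notation, not specific to this review):

```latex
cz = H_0 d + v_{\mathrm{pec}}, \qquad v_{\mathrm{pec}} = cz - H_0 d ,
```

where $z$ is the observed redshift, $H_0$ the Hubble constant, and $d$ the distance measured by a redshift-independent indicator. Measuring $v_{\mathrm{pec}}$ therefore requires exactly the careful distance determinations the review emphasizes.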

  17. Use of human immunoglobulins as an anti-infective treatment: the experience so far and their possible re-emerging role.

    Science.gov (United States)

    Bozzo, Jordi; Jorquera, Juan I

    2017-06-01

    Pooled human immunoglobulins (IGs) are prepared from plasma obtained from healthy donors as a concentrated antibody-containing solution. In addition, high-titer IGs (hyperimmune) against a specific pathogen can be obtained from vaccinated or convalescing donors. Currently, IGs can be used for the treatment of a variety of infections for which no specific therapy exists or that remain difficult to treat. Moreover, the recent pathogen outbreaks for which there is no approved treatment have renewed attention to the role of convalescent plasma and IGs. Areas covered: In this review, a historical perspective of the use of sera and IGs in humans as anti-infective agents (any viral, bacterial, parasitic infection), excluding immunodeficient patients, is presented from early development to the latest clinical studies. A Medline search was conducted to examine the peer-reviewed literature, with no date limits. Expert commentary: Human pooled plasma-derived IG products benefit from the polyclonal response of every individual donor and from the interindividual variability in such response. The trend to increased availability of vaccines for infectious diseases also opens new potential applications of hyperimmune IGs for emerging or re-emerging infectious diseases (e.g.: Ebola, Zika, Dengue), for the prevention and treatment in the general population, healthcare personnel and caregivers.

  18. The Vietnam Initiative on Zoonotic Infections (VIZIONS): A Strategic Approach to Studying Emerging Zoonotic Infectious Diseases.

    Science.gov (United States)

    Rabaa, Maia A; Tue, Ngo Tri; Phuc, Tran My; Carrique-Mas, Juan; Saylors, Karen; Cotten, Matthew; Bryant, Juliet E; Nghia, Ho Dang Trung; Cuong, Nguyen Van; Pham, Hong Anh; Berto, Alessandra; Phat, Voong Vinh; Dung, Tran Thi Ngoc; Bao, Long Hoang; Hoa, Ngo Thi; Wertheim, Heiman; Nadjm, Behzad; Monagin, Corina; van Doorn, H Rogier; Rahman, Motiur; Tra, My Phan Vu; Campbell, James I; Boni, Maciej F; Tam, Pham Thi Thanh; van der Hoek, Lia; Simmonds, Peter; Rambaut, Andrew; Toan, Tran Khanh; Van Vinh Chau, Nguyen; Hien, Tran Tinh; Wolfe, Nathan; Farrar, Jeremy J; Thwaites, Guy; Kellam, Paul; Woolhouse, Mark E J; Baker, Stephen

    2015-12-01

    The effect of newly emerging or re-emerging infectious diseases of zoonotic origin in human populations can be potentially catastrophic, and large-scale investigations of such diseases are highly challenging. The monitoring of emergence events is subject to ascertainment bias, whether at the level of species discovery, emerging disease events, or disease outbreaks in human populations. Disease surveillance is generally performed post hoc, driven by a response to recent events and by the availability of detection and identification technologies. Additionally, the inventory of pathogens that exist in mammalian and other reservoirs is incomplete, and identifying those with the potential to cause disease in humans is rarely possible in advance. A major step in understanding the burden and diversity of zoonotic infections, the local behavioral and demographic risks of infection, and the risk of emergence of these pathogens in human populations is to establish surveillance networks in populations that maintain regular contact with diverse animal populations, and to simultaneously characterize pathogen diversity in human and animal populations. Vietnam has been an epicenter of disease emergence over the last decade, and practices at the human/animal interface may facilitate the likelihood of spillover of zoonotic pathogens into humans. To tackle the scientific issues surrounding the origins and emergence of zoonotic infections in Vietnam, we have established The Vietnam Initiative on Zoonotic Infections (VIZIONS). This countrywide project, in which several international institutions collaborate with Vietnamese organizations, is combining clinical data, epidemiology, high-throughput sequencing, and social sciences to address relevant one-health questions. Here, we describe the primary aims of the project, the infrastructure established to address our scientific questions, and the current status of the project. Our principal objective is to develop an integrated approach to

  19. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  20. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (α_eo = T_e⊥/T_e||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (α_io = T_i⊥/T_i||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the saturation level obtained is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to restore quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of the reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering once the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  1. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  2. Electricity network limitations on large-scale deployment of wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Fairbairn, R.J.

    1999-07-01

    This report sought to identify limitations on the large-scale deployment of wind energy in the UK. It describes the existing electricity supply system in England, Scotland and Wales, and addresses operational aspects of the integrated electricity networks, licence conditions, types of wind turbine generators, and the scope for deployment of wind energy in the UK. Technical limitations and the technical criteria stipulated by the Distribution and Grid Codes, the effects of system losses, and commercial issues are reviewed. Potential solutions to the technical limitations are proposed, and recommendations are outlined.

  3. EFT of large scale structures in redshift space

    Science.gov (United States)

    Lewandowski, Matthew; Senatore, Leonardo; Prada, Francisco; Zhao, Cheng; Chuang, Chia-Hsun

    2018-03-01

    We further develop the description of redshift-space distortions within the effective field theory of large scale structures. First, we generalize the counterterms to include the effect of baryonic physics and primordial non-Gaussianity. Second, we evaluate the IR resummation of the dark matter power spectrum in redshift space. This requires us to identify a controlled approximation that makes the numerical evaluation straightforward and efficient. Third, we compare the predictions of the theory at one loop with the power spectrum from numerical simulations up to ℓ = 6. We find that the IR resummation allows us to correctly reproduce the baryon acoustic oscillation peak. The k reach (or, equivalently, the precision for a given k) depends on additional counterterms that need to be matched to simulations. Since the nonlinear scale for the velocity is expected to be longer than the one for the overdensity, we consider a minimal and a nonminimal set of counterterms. The quality of our numerical data makes it hard to firmly establish the performance of the theory at high wave numbers. Within this limitation, we find that the theory at redshift z = 0.56 and up to ℓ = 2 matches the data at the percent level approximately up to k ∼ 0.13 h Mpc⁻¹ or k ∼ 0.18 h Mpc⁻¹, depending on the number of counterterms used, with a potentially large improvement over former analytical techniques.
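    At tree level, the redshift-space power spectrum that such analyses build on reduces to the standard Kaiser result, which the EFT counterterms then correct at higher wave numbers (standard notation, not taken from the paper itself):

```latex
P_s(k,\mu) = \left(b_1 + f\mu^2\right)^2 P_{\mathrm{lin}}(k),
\qquad f \equiv \frac{d\ln D}{d\ln a},
```

where $\mu$ is the cosine of the angle between $\mathbf{k}$ and the line of sight, $b_1$ the linear bias, and $D$ the linear growth factor. The multipoles $\ell = 0, 2, 4, \dots$ compared against simulations in the abstract are the Legendre moments of $P_s(k,\mu)$ in $\mu$.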

  4. Geometagenomics illuminates the impact of agriculture on the distribution and prevalence of plant viruses at the ecosystem scale

    OpenAIRE

    Bernardo, Pauline; Charles-Dominique, Tristan; Barakat, Mohamed; Ortet, Philippe; Fernandez, Emmanuel; Filloux, Denis; Hartnady, Penelope; Rebelo, Tony A; Cousins, Stephen R; Mesleard, François; Cohez, Damien; Yavercovski, Nicole; Varsani, Arvind; Harkins, Gordon W; Peterschmitt, Michel

    2017-01-01

    Disease emergence events regularly result from human activities such as agriculture, which frequently brings large populations of genetically uniform hosts into contact with potential pathogens. Although viruses cause nearly 50% of emerging plant diseases, there is little systematic information about virus distribution across agro-ecological interfaces and large gaps in understanding of virus diversity in nature. Here we applied a novel landscape-scale geometagenomics approach to examine rela...

  5. Geometagenomics illuminates the impact of agriculture on the distribution and prevalence of plant viruses at the ecosystem scale

    OpenAIRE

    Bernardo, Pauline; Charles-Dominique, Tristan; Barakat, Mohamed; Ortet, Philippe; Fernandez, Emmanuel; Filloux, Denis; Hartnady, Penelope; Rebelo, Tony A.; Cousins, Stephen; Mesleard, François; Cohez, Damien; Yaverkovski, Nicole; Varsani, Arvind; Harkins, Gordon William; Peterschmitt, Michel

    2018-01-01

    Disease emergence events regularly result from human activities such as agriculture, which frequently brings large populations of genetically uniform hosts into contact with potential pathogens. Although viruses cause nearly 50% of emerging plant diseases, there is little systematic information about virus distribution across agro-ecological interfaces and large gaps in understanding of virus diversity in nature. Here we applied a novel landscape-scale geometagenomics approach to examine rela...

  6. Spider Transcriptomes Identify Ancient Large-Scale Gene Duplication Event Potentially Important in Silk Gland Evolution.

    Science.gov (United States)

    Clarke, Thomas H; Garb, Jessica E; Hayashi, Cheryl Y; Arensburger, Peter; Ayoub, Nadia A

    2015-06-08

    The evolution of specialized tissues with novel functions, such as the silk synthesizing glands in spiders, is likely an influential driver of adaptive success. Large-scale gene duplication events and subsequent paralog divergence are thought to be required for generating evolutionary novelty. Such an event has been proposed for spiders, but not tested. We de novo assembled transcriptomes from three cobweb weaving spider species. Based on phylogenetic analyses of gene families with representatives from each of the three species, we found numerous duplication events indicative of a whole genome or segmental duplication. We estimated the age of the gene duplications relative to several speciation events within spiders and arachnids and found that the duplications likely occurred after the divergence of scorpions (order Scorpionida) and spiders (order Araneae), but before the divergence of the spider suborders Mygalomorphae and Araneomorphae, near the evolutionary origin of spider silk glands. Transcripts that are expressed exclusively or primarily within black widow silk glands are more likely to have a paralog descended from the ancient duplication event and have elevated amino acid replacement rates compared with other transcripts. Thus, an ancient large-scale gene duplication event within the spider lineage was likely an important source of molecular novelty during the evolution of silk gland-specific expression. This duplication event may have provided genetic material for subsequent silk gland diversification in the true spiders (Araneomorphae). © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  7. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  8. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC.Marc/Mentat and its fatigue strain is analyzed; loads for the flange fatigue working condition are simulated with the Bladed software; the flange fatigue load spectrum is obtained with the rain-flow counting method; finally, fatigue analysis of the top flange is carried out with the fatigue analysis software MSC.Fatigue and Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis of large-scale wind turbine generators and have practical engineering value.
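The damage bookkeeping described in the abstract above (a rain-flow load spectrum fed into Palmgren-Miner linear accumulation) can be sketched in a few lines. The S-N curve constants `C` and `m` and the example spectrum below are illustrative assumptions, not values from the paper:

```python
def basquin_life(stress, C=1e12, m=3.0):
    """Basquin-type S-N curve: cycles to failure N(S) = C * S**(-m).
    C and m are made-up illustrative constants."""
    return C * stress ** (-m)

def miner_damage(spectrum, C=1e12, m=3.0):
    """Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
    Failure is predicted when D reaches 1."""
    return sum(n / basquin_life(s, C, m) for s, n in spectrum)

# (stress amplitude in MPa, cycle count) pairs such as rain-flow counting
# of a measured flange load history would produce (made-up numbers):
spectrum = [(100.0, 1e5), (200.0, 1e4)]
print(miner_damage(spectrum))  # ≈ 0.18, i.e. about 18 % of fatigue life consumed
```

Rain-flow counting itself (e.g. the ASTM E1049 procedure) would supply the amplitude/count pairs from the simulated load history.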

  9. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
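The paper's solver is a 2D unstructured Godunov-type finite-volume scheme; as a rough illustration of the same idea, here is a minimal 1D shallow-water update using a Rusanov (local Lax-Friedrichs) flux on a uniform periodic grid. The grid size, CFL number, and dam-break initial condition are assumptions for the sketch, not the paper's setup:

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def step(h, hu, dx, dt):
    """One first-order finite-volume step for the 1D shallow water equations
    with a Rusanov (local Lax-Friedrichs) flux and periodic boundaries."""
    def phys_flux(h, hu):
        return hu, hu * hu / h + 0.5 * G * h * h

    hL, huL = h, hu
    hR, huR = np.roll(h, -1), np.roll(hu, -1)           # right-neighbour states
    f1L, f2L = phys_flux(hL, huL)
    f1R, f2R = phys_flux(hR, huR)
    s = np.maximum(np.abs(huL / hL) + np.sqrt(G * hL),  # max wave speed |u| + c
                   np.abs(huR / hR) + np.sqrt(G * hR))
    F1 = 0.5 * (f1L + f1R) - 0.5 * s * (hR - hL)        # interface fluxes
    F2 = 0.5 * (f2L + f2R) - 0.5 * s * (huR - huL)
    h_new = h - dt / dx * (F1 - np.roll(F1, 1))
    hu_new = hu - dt / dx * (F2 - np.roll(F2, 1))
    return h_new, hu_new

# Dam-break initial condition on the unit interval (illustrative)
n = 200
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx
h = np.where(x < 0.5, 2.0, 1.0)
hu = np.zeros(n)
mass0 = h.sum() * dx

for _ in range(100):
    dt = 0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(G * h))  # CFL condition
    h, hu = step(h, hu, dx, dt)

print("mass drift:", h.sum() * dx - mass0)  # flux form conserves mass to roundoff
```

The conservative flux form is what gives the scheme its robustness for flood fronts; the paper's wet/dry treatment and unstructured mesh are omitted here.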

  10. Market Potential Estimation for Tourism in Emerging Markets

    Directory of Open Access Journals (Sweden)

    Baimai, Chaiwat

    2009-10-01

    Full Text Available The objective of this paper was to develop a useful framework for estimating demand for tourism in emerging markets. Tourism has become one of the most crucial sectors in a large number of emerging countries. Moreover, the tourism industry in such markets is forecast to keep growing over the next decade. Hence, understanding and accurately forecasting demand in the industry is essential for managing this sector effectively. Using stepwise regression analysis, we found a number of important variables for estimating demand for tourism in emerging markets. Our regression model can benefit travel agencies and policy makers dealing with the tourism industry.
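The stepwise regression used above can be illustrated with a minimal forward-selection sketch on synthetic data. The data, true coefficients, and stopping tolerance below are invented for illustration; the paper's actual predictors are not given here:

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of an OLS fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def forward_stepwise(X, y, tol=1e-6):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the RSS, stopping when the improvement falls below tol."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    current = float(y @ y - n * y.mean() ** 2)  # RSS of intercept-only model
    while remaining:
        scores = [(rss(X[:, selected + [j]], y), j) for j in remaining]
        best_rss, best_j = min(scores)
        if current - best_rss < tol:
            break
        selected.append(best_j)
        remaining.remove(best_j)
        current = best_rss
    return selected

# Synthetic example: demand driven by predictors 1 and 3 only
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # five candidate predictors
y = 2.0 * X[:, 1] + 3.0 * X[:, 3]
print(forward_stepwise(X, y))          # selects predictors 1 and 3
```

In practice a criterion such as an F-test, AIC, or adjusted R² would replace the raw RSS tolerance, and backward elimination steps are often interleaved.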

  11. Measurements of the microwave spectrum, Re-H bond length, and Re quadrupole coupling for HRe(CO)5

    Science.gov (United States)

    Kukolich, Stephen G.; Sickafoose, Shane M.

    1993-11-01

    Rotational transition frequencies for rhenium pentacarbonyl hydride were measured in the 4-10 GHz range using a Flygare-Balle type microwave spectrometer. The rotational constants and Re nuclear quadrupole coupling constants for the four isotopomers, (1) H187Re(CO)5, (2) H185Re(CO)5, (3) D187Re(CO)5, and (4) D185Re(CO)5, were obtained from the spectra. For the most common isotopomer, B(1)=818.5464(2) MHz and eq Q(187Re)=-900.13(3) MHz. The Re-H bond length (r0) determined by fitting the rotational constants is 1.80(1) Å. Although the Re atom is located at a site of near-octahedral symmetry, the quadrupole coupling is large due to the large Re nuclear moments. A 2.7% increase in Re quadrupole coupling was observed for D-substituted isotopomers, giving a rather large isotope effect on the quadrupole coupling. The Cax-Re-Ceq angle is 96(1)°, when all Re-C-O angles are constrained to 180°.

  12. Ten Questions about Emergence

    OpenAIRE

    Fromm, Jochen

    2005-01-01

    Self-Organization is of growing importance for large distributed computing systems. In these systems, central control and manual management are exceedingly difficult or even impossible. Emergence is widely recognized as the core principle behind self-organization. Therefore the idea to use both principles to control and organize large-scale distributed systems is very attractive and not so far off. Yet there are many open questions about emergence and self-organization, ranging from a clear ...

  13. Re-Emergent Inhibition of Cochlear Inner Hair Cells in a Mouse Model of Hearing Loss.

    Science.gov (United States)

    Zachary, Stephen Paul; Fuchs, Paul Albert

    2015-07-01

    Hearing loss among the elderly correlates with diminished social, mental, and physical health. Age-related cochlear cell death does occur, but growing anatomical evidence suggests that synaptic rearrangements on sensory hair cells also contribute to auditory functional decline. Here we present voltage-clamp recordings from inner hair cells of the C57BL/6J mouse model of age-related hearing loss, which reveal that cholinergic synaptic inputs re-emerge during aging. These efferents are functionally inhibitory, using the same ionic mechanisms as do efferent contacts present transiently before the developmental onset of hearing. The strength of efferent inhibition of inner hair cells increases with hearing threshold elevation. These data indicate that the aged cochlea regains features of the developing cochlea and that efferent inhibition of the primary receptors of the auditory system re-emerges with hearing impairment. Synaptic changes in the auditory periphery are increasingly recognized as important factors in hearing loss. To date, anatomical work has described the loss of afferent contacts from cochlear hair cells. However, relatively little is known about the efferent innervation of the cochlea during hearing loss. We performed intracellular recordings from mouse inner hair cells across the lifespan and show that efferent innervation of inner hair cells arises in parallel with the loss of afferent contacts and elevated hearing threshold during aging. These efferent neurons inhibit inner hair cells, raising the possibility that they play a role in the progression of age-related hearing loss. Copyright © 2015 the authors.

  14. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  15. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory, was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. 

  16. DNA/MVA Vaccination of HIV-1 Infected Participants with Viral Suppression on Antiretroviral Therapy, followed by Treatment Interruption: Elicitation of Immune Responses without Control of Re-Emergent Virus.

    Directory of Open Access Journals (Sweden)

    Melanie Thompson

    Full Text Available GV-TH-01, a Phase 1 open-label trial of a DNA prime—Modified Vaccinia Ankara (MVA) boost vaccine (GOVX-B11), was undertaken in HIV infected participants on antiretroviral treatment (ART) to evaluate safety and vaccine-elicited T cell responses, and explore the ability of elicited CD8+ T cells to control viral rebound during analytical treatment interruption (TI). Nine men who began antiretroviral therapy (ART) within 18 months of seroconversion and had sustained plasma HIV-1 RNA <50 copies/mL for at least 6 months were enrolled. Median age was 38 years, median pre-ART HIV-1 RNA was 140,000 copies/ml and mean baseline CD4 count was 755/μl. Two DNA, followed by 2 MVA, inoculations were given 8 weeks apart. Eight subjects completed all vaccinations and TI. Clinical and laboratory adverse events were generally mild, with no serious or grade 4 events. Only reactogenicity events were considered related to study drug. No treatment emergent viral resistance was seen. The vaccinations did not reduce viral reservoirs and virus re-emerged in all participants during TI, with a median time to re-emergence of 4 weeks. Eight of 9 participants had CD8+ T cells that could be stimulated by vaccine-matched Gag peptides prior to vaccination. Vaccinations boosted these responses as well as eliciting previously undetected CD8+ responses. Elicited T cells did not display signs of exhaustion. During TI, temporal patterns of viral re-emergence and Gag-specific CD8+ T cell expansion suggested that vaccine-specific CD8+ T cells had been stimulated by re-emergent virus in only 2 of 8 participants. In these 2, transient decreases in viremia were associated with Gag selection in known CD8+ T cell epitopes. We hypothesize that escape mutations, already archived in the viral reservoir, plus a poor ability of CD8+ T cells to traffic to and control virus at sites of re-emergence, limited the therapeutic efficacy of the DNA/MVA vaccine. clinicaltrials.gov NCT01378156.

  17. DNA/MVA Vaccination of HIV-1 Infected Participants with Viral Suppression on Antiretroviral Therapy, followed by Treatment Interruption: Elicitation of Immune Responses without Control of Re-Emergent Virus.

    Science.gov (United States)

    Thompson, Melanie; Heath, Sonya L; Sweeton, Bentley; Williams, Kathy; Cunningham, Pamela; Keele, Brandon F; Sen, Sharon; Palmer, Brent E; Chomont, Nicolas; Xu, Yongxian; Basu, Rahul; Hellerstein, Michael S; Kwa, Suefen; Robinson, Harriet L

    2016-01-01

    GV-TH-01, a Phase 1 open-label trial of a DNA prime—Modified Vaccinia Ankara (MVA) boost vaccine (GOVX-B11), was undertaken in HIV infected participants on antiretroviral treatment (ART) to evaluate safety and vaccine-elicited T cell responses, and explore the ability of elicited CD8+ T cells to control viral rebound during analytical treatment interruption (TI). Nine men who began antiretroviral therapy (ART) within 18 months of seroconversion and had sustained plasma HIV-1 RNA <50 copies/mL for at least 6 months were enrolled. Median age was 38 years, median pre-ART HIV-1 RNA was 140,000 copies/ml and mean baseline CD4 count was 755/μl. Two DNA, followed by 2 MVA, inoculations were given 8 weeks apart. Eight subjects completed all vaccinations and TI. Clinical and laboratory adverse events were generally mild, with no serious or grade 4 events. Only reactogenicity events were considered related to study drug. No treatment emergent viral resistance was seen. The vaccinations did not reduce viral reservoirs and virus re-emerged in all participants during TI, with a median time to re-emergence of 4 weeks. Eight of 9 participants had CD8+ T cells that could be stimulated by vaccine-matched Gag peptides prior to vaccination. Vaccinations boosted these responses as well as eliciting previously undetected CD8+ responses. Elicited T cells did not display signs of exhaustion. During TI, temporal patterns of viral re-emergence and Gag-specific CD8+ T cell expansion suggested that vaccine-specific CD8+ T cells had been stimulated by re-emergent virus in only 2 of 8 participants. In these 2, transient decreases in viremia were associated with Gag selection in known CD8+ T cell epitopes. We hypothesize that escape mutations, already archived in the viral reservoir, plus a poor ability of CD8+ T cells to traffic to and control virus at sites of re-emergence, limited the therapeutic efficacy of the DNA/MVA vaccine. clinicaltrials.gov NCT01378156.

  18. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response plans. Large commercial shopping areas are typical service systems, and their emergency evacuation is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model is proposed and examined in a case study of evacuation from a commercial shopping mall. Pedestrian movement is modeled with Cellular Automata and the event-driven model; the simulation is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, customer layer, clerk layer, and trajectory layer. When simulating pedestrian movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on this combined model, the behavior of customers and clerks can be represented in both normal and emergency situations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that an evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
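The core of such a model, a static floor field plus agents stepping downhill on it, can be sketched as follows. This is a toy version with an assumed open 5 x 5 floor and a single exit; it omits the paper's dynamic floor field, event-driven layers, and purchase-intention logic:

```python
from collections import deque

MOVES = ((1, 0), (-1, 0), (0, 1), (0, -1))

def static_floor_field(grid, exits):
    """Static floor field: BFS walking distance from every free cell
    (grid value 0) to the nearest exit cell."""
    rows, cols = len(grid), len(grid[0])
    field = [[float("inf")] * cols for _ in range(rows)]
    queue = deque(exits)
    for r, c in exits:
        field[r][c] = 0
    while queue:
        r, c = queue.popleft()
        for dr, dc in MOVES:
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0
                    and field[nr][nc] > field[r][c] + 1):
                field[nr][nc] = field[r][c] + 1
                queue.append((nr, nc))
    return field

def evacuate(grid, exits, agents, max_steps=100):
    """Sequential-update CA: each agent moves to the free neighbour with the
    lowest field value (or stays put); an agent reaching an exit leaves.
    Returns the number of time steps until the floor is empty."""
    rows, cols = len(grid), len(grid[0])
    field = static_floor_field(grid, exits)
    agents = list(agents)
    for step in range(1, max_steps + 1):
        occupied = set(agents)
        moved = []
        for r, c in agents:
            options = [(r, c)] + [
                (r + dr, c + dc) for dr, dc in MOVES
                if 0 <= r + dr < rows and 0 <= c + dc < cols
                and grid[r + dr][c + dc] == 0
                and (r + dr, c + dc) not in occupied]
            best = min(options, key=lambda p: field[p[0]][p[1]])
            occupied.discard((r, c))
            if field[best[0]][best[1]] == 0:
                continue                      # stepped onto an exit: evacuated
            occupied.add(best)
            moved.append(best)
        agents = moved
        if not agents:
            return step
    return max_steps

grid = [[0] * 5 for _ in range(5)]            # open 5 x 5 floor, no obstacles
steps = evacuate(grid, exits=[(0, 0)], agents=[(4, 4), (4, 3)])
print(steps)                                  # both agents reach the corner exit
```

A dynamic floor field would additionally deposit and diffuse a "virtual trace" left by moving agents, biasing followers toward well-used routes.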

  19. Long-lived large-scale deformation under Central and Western Europe

    Science.gov (United States)

    Qorbani, Ehsan; Bokelmann, Götz

    2016-04-01

    We investigate the past and present-day deformation pattern under Central and Western Europe through seismic anisotropy. We use all SK(K)S splitting results that have so far been presented for this region and compile an image of upper mantle deformation. A large-scale deformation pattern emerges in which NE-SW fast orientations under the Aegean change smoothly to NW-SE beneath the Hellenides-Dinarides conjunction. NW-SE is the dominant pattern under the whole Carpathian-Pannonian region. Towards Bohemia, the pattern rotates to E-W. The rotation continues until the Rhine valley, and further within the Alps, all the way to Southern France. Outside the Alpine-deformation-influenced region, we observe a jump in fast orientation between the Ardennes and the Massif Central in France, where the fast axis orientation is back to NW-SE. That anisotropy pattern may correlate with the arcuate shape of the Variscan orogeny. It agrees with the Rheic suture line and the borders of the two main tectonic units of the European Variscides, the Saxothuringian and the Moldanubian. Previous studies on upper mantle anisotropy have interpreted such patterns mainly as frozen-in deformation from past tectonic episodes, though this has so far remained ambiguous. Here we assess the relation between deformation at depth and shallower structure, as evidenced by the stress field and topography. We discuss the presence of a long-lived, large-scale upper mantle deformation that has been acting ever since the Cambrian through different orogenic phases (Caledonian, Variscan, Alpine).

  20. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    Science.gov (United States)

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  1. Exploring fish microbial communities to mitigate emerging diseases in aquaculture.

    Science.gov (United States)

    de Bruijn, Irene; Liu, Yiying; Wiegertjes, Geert F; Raaijmakers, Jos M

    2018-01-01

    Aquaculture is the fastest growing animal food sector worldwide and expected to further increase to feed the growing human population. However, existing and (re-)emerging diseases are hampering fish and shellfish cultivation and yield. For many diseases, vaccination protocols are not in place and the excessive use of antibiotics and other chemicals is of substantial concern. A more sustainable disease control strategy to protect fish and shellfish from (re-)emerging diseases could be achieved by introduction or augmentation of beneficial microbes. To establish and maintain a 'healthy' fish microbiome, a fundamental understanding of the diversity and temporal-spatial dynamics of fish-associated microbial communities and their impact on growth and health of their aquatic hosts is required. This review describes insights in the diversity and functions of the fish bacterial communities elucidated with next-generation sequencing and discusses the potential of the microbes to mitigate (re-)emerging diseases in aquaculture. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. EvArnoldi: A New Algorithm for Large-Scale Eigenvalue Problems.

    Science.gov (United States)

    Tal-Ezer, Hillel

    2016-05-19

    Eigenvalues and eigenvectors are an essential theme in numerical linear algebra. Their study is mainly motivated by their high importance in a wide range of applications. Knowledge of eigenvalues is essential in quantum molecular science. Solutions of the Schrödinger equation for the electrons composing the molecule are the basis of electronic structure theory. Electronic eigenvalues compose the potential energy surfaces for nuclear motion. The eigenvectors allow calculation of dipole transition matrix elements, the core of spectroscopy. The vibrational dynamics of the molecule also require knowledge of the eigenvalues of the vibrational Hamiltonian. Typically in these problems, the dimension of the Hilbert space is huge. Practically, only a small subset of eigenvalues is required. In this paper, we present a highly efficient algorithm, named EvArnoldi, for solving large-scale eigenvalue problems. The algorithm, in its basic formulation, is mathematically equivalent to ARPACK (Sorensen, D. C. Implicitly Restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations; Springer, 1997; Lehoucq, R. B.; Sorensen, D. C. SIAM Journal on Matrix Analysis and Applications 1996, 17, 789; Calvetti, D.; Reichel, L.; Sorensen, D. C. Electronic Transactions on Numerical Analysis 1994, 2, 21) (or Eigs of Matlab) but significantly simpler.
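EvArnoldi itself is not reproduced in the abstract, but the classical Arnoldi iteration that it is stated to be mathematically equivalent to can be sketched briefly: build an orthonormal Krylov basis, project the matrix onto it, and take the eigenvalues of the small projected matrix (Ritz values) as estimates. The test matrix below (a diagonal with one well-separated eigenvalue) is an assumption for illustration:

```python
import numpy as np

def arnoldi_ritz(A, b, k):
    """k-step Arnoldi iteration: returns the Ritz values, i.e. eigenvalue
    estimates from the k-dimensional Krylov subspace span{b, Ab, ..., A^(k-1)b}."""
    n = b.size
    Q = np.zeros((n, k + 1))        # orthonormal Krylov basis
    H = np.zeros((k + 1, k))        # upper Hessenberg projection of A
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        for i in range(j + 1):      # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:     # exact invariant subspace found
            return np.linalg.eigvals(H[:j + 1, :j + 1])
        Q[:, j + 1] = w / H[j + 1, j]
    return np.linalg.eigvals(H[:k, :k])

# Assumed test matrix: a dominant eigenvalue (10) well separated from a
# cluster of 99 eigenvalues in [0, 1]
A = np.diag(np.concatenate([np.linspace(0.0, 1.0, 99), [10.0]]))
ritz = arnoldi_ritz(A, np.ones(100), k=15)
print(np.max(ritz.real))  # converges rapidly toward the dominant eigenvalue 10
```

Practical solvers such as ARPACK add implicit restarting so that the subspace dimension stays small while only the wanted part of the spectrum is refined.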

  3. Large-scale dynamic compaction demonstration using WIPP salt: Fielding and preliminary results

    International Nuclear Information System (INIS)

    Ahrens, E.H.; Hansen, F.D.

    1995-10-01

    Reconsolidation of crushed rock salt is a phenomenon of great interest to programs studying isolation of hazardous materials in natural salt geologic settings. Of particular interest is the potential for disaggregated salt to be restored to nearly an impermeable state. For example, reconsolidated crushed salt is proposed as a major shaft seal component for the Waste Isolation Pilot Plant (WIPP) Project. The concept for a permanent shaft seal component of the WIPP repository is to densely compact crushed salt in the four shafts; an effective seal will then be developed as the surrounding salt creeps into the shafts, further consolidating the crushed salt. Fundamental information on placement density and permeability is required to ensure attainment of the design function. The work reported here is the first large-scale compaction demonstration to provide information on initial salt properties applicable to design, construction, and performance expectations. The shaft seals must function for 10,000 years. Over this period a crushed salt mass will become less permeable as it is compressed by creep closure of salt surrounding the shaft. These facts preclude the possibility of conducting a full-scale, real-time field test. Because permanent seals taking advantage of salt reconsolidation have never been constructed, performance measurements have not been made on an appropriately large scale. An understanding of potential construction methods, achievable initial density and permeability, and performance of reconsolidated salt over time is required for seal design and performance assessment. This report discusses fielding and operations of a nearly full-scale dynamic compaction of mine-run WIPP salt, and presents preliminary density and in situ (in place) gas permeability results

  4. A Complementary Resistive Switch-based Crossbar Array Adder

    OpenAIRE

    Siemon, A.; Menzel, S.; Waser, R.; Linn, E.

    2014-01-01

    Redox-based resistive switching devices (ReRAM) are an emerging class of non-volatile storage elements suited for nanoscale memory applications. In terms of logic operations, ReRAM devices were suggested to be used as programmable interconnects, large-scale look-up tables or for sequential logic operations. However, without additional selector devices these approaches are not suited for use in large scale nanocrossbar memory arrays, which is the preferred architecture for ReRAM devices due to...

  5. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    Full Text Available In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, of which 103 may have affected human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis modes and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  6. Local Helioseismology of Emerging Active Regions: A Case Study

    Science.gov (United States)

    Kosovichev, Alexander G.; Zhao, Junwei; Ilonidis, Stathis

    2018-04-01

    Local helioseismology provides a unique opportunity to investigate the subsurface structure and dynamics of active regions and their effect on the large-scale flows and global circulation of the Sun. We use measurements of plasma flows in the upper convection zone, provided by the Time-Distance Helioseismology Pipeline developed for analysis of solar oscillation data obtained by the Helioseismic and Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO), to investigate the subsurface dynamics of emerging active region NOAA 11726. The active region emergence was detected in deep layers of the convection zone about 12 hours before the first bipolar magnetic structure appeared on the surface, and 2 days before the emergence of most of the magnetic flux. The speed of emergence determined by tracking the flow divergence with depth is about 1.4 km/s, very close to the emergence speed in the deep layers. As the emerging magnetic flux becomes concentrated in sunspots, local converging flows are observed beneath the forming sunspots. These flows are most prominent in the depth range 1-3 Mm, and remain converging after the formation process is completed. On the larger scale, converging flows around the active region appear as a diversion of the zonal shearing flows towards the active region, accompanied by formation of a large-scale vortex structure. This process occurs when a substantial amount of the magnetic flux has emerged on the surface, and the converging flow pattern remains stable during the following evolution of the active region. The Carrington synoptic flow maps show that the large-scale subsurface inflows are typical for active regions. In the deeper layers (10-13 Mm) the flows become diverging, and surprisingly strong beneath some active regions. In addition, the synoptic maps reveal a complex evolving pattern of large-scale flows on scales much larger than supergranulation.

  7. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual surveying and photogrammetry. Manual surveying is inefficient and limited by field conditions, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide, regular areas at large scale but do not work well in small areas because of the high cost in manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large-scale topographic maps, based on a new Chinese vehicle-borne LIDAR system. It studied how to use this new measurement technology to map large-scale topographic maps. After field data capture, maps can be produced in the office from the LIDAR point cloud using software we programmed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method of generating large-scale topographic maps that is highly efficient and accurate compared with traditional methods.

  8. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  9. The Role of Remote Sensing for Understanding Large-Scale Rubber Concession Expansion in Southern Laos

    Directory of Open Access Journals (Sweden)

    Mutlu Özdoğan

    2018-04-01

    Full Text Available Increasing global demand for natural rubber began in the mid-2000s and led to large-scale expansion of plantations in Laos until rubber latex prices declined greatly beginning in 2011. The expansion of rubber did not, however, occur uniformly across the country. While the north and central Laos experienced mostly local and smallholder plantations, rubber expansion in the south was dominated by transnational companies from Vietnam, China and Thailand through large-scale land concessions, often causing conflicts with local communities. In this study we use satellite remote sensing to identify and map the expansion of large-scale rubber plantations in Champasak Province—the first area in southern Laos to host large-scale rubber development—and document the biophysical impacts on the local landscape, which of course is linked to social impacts on local people. Our study demonstrates that the expansion of rubber in the province was rapid and did not always conform to approved concession area locations. The mono-culture nature of rubber plantations also had the effect of homogenizing the landscape, eclipsing the changes caused by local populations. We argue that by providing a relatively inexpensive way to track the expansion of rubber plantations over space and time, remote sensing has the potential to provide advocates and other civil society groups with data that might otherwise remain limited to the restricted domains of state regulation and private sector reporting. However, we also caution that while remote sensing has the potential to provide strong public evidence about plantation expansion, access to and control of this information ultimately determines its value.

  10. Policies and Livestock Systems Driving Brucellosis Re-emergence in Kazakhstan.

    Science.gov (United States)

    Beauvais, Wendy; Coker, Richard; Nurtazina, Gulzhan; Guitian, Javier

    2017-06-01

    Brucellosis is a considerable public health and economic burden in many areas of the world including sub-Saharan Africa, the Middle East and former USSR countries. The collapse of the USSR has been cited as a driver for re-emergence of diseases including brucellosis, and human incidence rates in the former Soviet republics have been estimated as high as 88 per 100,000 per year. The aim of this paper is to examine the historical trends in brucellosis in Kazakhstan and to explore how livestock systems, veterinary services and control policies may have influenced them. In conclusion, a brucellosis epidemic most likely began before the collapse of the USSR and high livestock densities may have played an important role. Changes to the livestock systems in Kazakhstan, as well as other factors, are likely to have an impact on the success of brucellosis policies in the future. Incentives and practicalities of different policies in smallholder settings should be considered. However, the lack of reliable estimates of brucellosis prevalence and difficulties in understanding exactly how policy is being applied in Kazakhstan, which is a vast country with low population density, prevent firm conclusions from being drawn.

  11. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller-scale "integral" migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  12. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and they exhibit electrochemical performance superior to graphite. Highlights: ► HGCNSs were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  13. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were each solved with the GPU implementation to test the performance of the package. Comparison of results between the single-CPU solver and the GPU solver shows that the GPU version runs up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
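    The semi-implicit Fourier method named in the abstract can be sketched in a few lines for the Allen-Cahn equation: the stiff gradient term is treated implicitly in Fourier space while the nonlinear double-well term stays explicit, which permits much larger time steps than fully explicit schemes. This is a minimal single-CPU NumPy sketch of the generic method, not the paper's CUDA package; grid size, mobility and time step are illustrative assumptions.

    ```python
    import numpy as np

    # Semi-implicit Fourier spectral step for the Allen-Cahn equation
    #   d(phi)/dt = -M * (df/dphi - kappa * laplacian(phi)),
    # with double-well bulk free energy f = (phi^2 - 1)^2 / 4.

    N, dx = 128, 1.0                  # grid size and spacing (illustrative)
    M, kappa, dt = 1.0, 1.0, 0.1      # mobility, gradient coefficient, step

    k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2                # |k|^2 gives the spectral Laplacian

    rng = np.random.default_rng(0)
    phi = 0.1 * rng.standard_normal((N, N))   # small random initial field

    for _ in range(200):
        dfdphi = phi**3 - phi                 # explicit nonlinear term
        phi_hat = np.fft.fft2(phi)
        rhs_hat = np.fft.fft2(dfdphi)
        # linear -M*kappa*laplacian(phi) term absorbed into the denominator
        phi_hat = (phi_hat - dt * M * rhs_hat) / (1.0 + dt * M * kappa * k2)
        phi = np.real(np.fft.ifft2(phi_hat))
    ```

    A GPU port of this loop (the subject of the paper) mainly replaces the FFT and elementwise operations with their CUDA counterparts; the algorithm itself is unchanged.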

  14. Solving large scale unit dilemma in electricity system by applying commutative law

    Science.gov (United States)

    Legino, Supriadi; Arianto, Rakhmat

    2018-03-01

    The conventional system pools resources, with large centralized power plants interconnected as a network. This provides many advantages over isolated systems, including optimized efficiency and reliability. However, such large plants need huge capital, and further problems have emerged that hinder the construction of big power plants and their associated transmission lines. By applying the commutative law of mathematics, ab = ba for all a, b ∈ ℝ, the problems associated with the conventional system described above can be reduced. The idea of having many small power plants rather than a few large units, namely “Listrik Kerakyatan” (LK), provides both social and environmental benefits that can be capitalized under proper assumptions. This study compares the costs and benefits of LK to those of the conventional system, using simulation to show that LK offers an alternative solution to many problems associated with the large system. The commutative law of algebra can be used as a simple mathematical model to analyze whether the LK system, as an eco-friendly form of distributed generation, can solve the various problems associated with a large-scale conventional system. The simulation results show that LK provides more value if its plants operate for less than 11 hours as peaker or load-follower plants to improve the load-curve balance of the power system, and indicate that the investment cost of an LK plant should be optimized to minimize plant investment cost. This study indicates that the benefit of the economies-of-scale principle does not always apply in every condition, particularly when the share of intangible costs and benefits is relatively high.

  15. Zika, chikungunya and dengue: the causes and threats of new and re-emerging arboviral diseases.

    Science.gov (United States)

    Paixão, Enny S; Teixeira, Maria Gloria; Rodrigues, Laura C

    2018-01-01

    The recent emergence and re-emergence of viral infections transmitted by vectors—Zika, chikungunya, dengue, Japanese encephalitis, West Nile, yellow fever and others—is a cause for international concern. Using Zika, chikungunya and dengue as examples, we summarise current knowledge on the characteristics of the viruses and their transmission, clinical features, laboratory diagnosis, burden, history, possible causes of their spread and the expectation of future epidemics. Arboviruses are transmitted by mosquitoes, are difficult to diagnose, can have surprising clinical complications and impose a severe burden. The current situation is complex: there is no vaccine for Zika or chikungunya and no specific treatment for any of the three arboviruses. Vector control is the only comprehensive solution available now, and it remains a challenge because it has so far not been very effective. Until we develop new technologies to control mosquito populations, the globalised and urbanised world we live in will remain vulnerable to the threat of successive arbovirus epidemics.

  16. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analyses focused on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international electricity market by locating exports in hours of high prices are included in the analyses. This paper focuses on results that are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the degree of excess electricity production caused by fluctuations in wind and CHP heat demands; the ability to utilise wind power to reduce CO2 emissions in the system; and the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed over a range of wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced by CHP, a number of future energy systems with CO2 reduction potential are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation tasks of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare the short-term and long-term potentials of different strategies for large-scale integration of wind power.
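    The first of the three factors above, excess electricity production, can be illustrated with a toy hourly balance. The demand, CHP and wind series below are synthetic stand-ins chosen only to show the mechanics of the calculation, not data from the Danish system the paper analyses.

    ```python
    import numpy as np

    # Toy illustration of the "excess production" factor: the share of annual
    # wind+CHP output that exceeds demand, as wind penetration grows.

    rng = np.random.default_rng(1)
    hours = 8760
    # Normalised hourly demand with a simple daily cycle (synthetic)
    demand = 1.0 + 0.2 * np.sin(np.arange(hours) * 2 * np.pi / 24)
    chp = 0.5 * np.ones(hours)  # heat-driven CHP covers ~50% of mean demand
    # Fluctuating wind capacity factor, clipped to physical bounds (synthetic)
    wind_profile = np.clip(rng.normal(0.3, 0.2, hours), 0.0, 1.0)

    def excess_fraction(wind_share):
        """Share of annual supply exceeding demand at a given wind share."""
        wind = wind_share * demand.mean() * wind_profile / wind_profile.mean()
        supply = chp + wind
        excess = np.maximum(supply - demand, 0.0).sum()
        return excess / supply.sum()

    for share in (0.0, 0.25, 0.5, 1.0):
        print(f"wind = {share:.0%} of demand -> excess {excess_fraction(share):.1%}")
    ```

    In this toy setup excess production grows with wind penetration; the paper's point is that different system designs and regulation strategies shift this curve.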

  17. Emerging and re-emerging bacterial diseases in India

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    et al (2003) have discussed the epidemiology of V. cholerae and Aeromonas in a five-year prospective study in Mumbai. 3.3 Listeria monocytogenes. Listeriosis is an emerging zoonotic disease. It is estimated that L. monocytogenes is responsible for 28% of deaths due to foodborne illnesses in the United States. The organism.

  18. Large-scale dynamical influence of a gravity wave generated over the Antarctic Peninsula – regional modelling and budget analysis

    Directory of Open Access Journals (Sweden)

    JOEL Arnault

    2013-03-01

    Full Text Available The case study of a mountain wave triggered by the Antarctic Peninsula on 6 October 2005, which has already been documented in the literature, is chosen here to quantify the associated gravity wave forcing on the large-scale flow, with a budget analysis of the horizontal wind components and horizontal kinetic energy. In particular, a numerical simulation using the Weather Research and Forecasting (WRF model is compared to a control simulation with flat orography to separate the contribution of the mountain wave from that of other synoptic processes of non-orographic origin. The so-called differential budgets of horizontal wind components and horizontal kinetic energy (after subtracting the results from the simulation without orography are then averaged horizontally and vertically in the inner domain of the simulation to quantify the mountain wave dynamical influence at this scale. This allows for a quantitative analysis of the simulated mountain wave's dynamical influence, including the orographically induced pressure drag, the counterbalancing wave-induced vertical transport of momentum from the flow aloft, the momentum and energy exchanges with the outer flow at the lateral and upper boundaries, the effect of turbulent mixing, the dynamics associated with geostrophic re-adjustment of the inner flow, the deceleration of the inner flow, the secondary generation of an inertia–gravity wave and the so-called baroclinic conversion of energy between potential energy and kinetic energy.
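    The budget analysis summarised above rests on standard forms of the horizontal momentum and kinetic-energy budgets. A generic sketch (the paper's own term grouping may differ) is:

    ```latex
    % Horizontal momentum budget (generic form, illustrative term grouping)
    \frac{\partial \mathbf{v}_h}{\partial t} =
      \underbrace{-\,\mathbf{v}\cdot\nabla\mathbf{v}_h}_{\text{advection}}
      \;\underbrace{-\,\frac{1}{\rho}\nabla_h p}_{\text{pressure gradient}}
      \;\underbrace{-\,f\,\mathbf{k}\times\mathbf{v}_h}_{\text{Coriolis}}
      \;+\;\underbrace{\mathbf{F}_h}_{\text{turbulent mixing}}

    % Horizontal kinetic energy K = |v_h|^2/2 follows by dotting with v_h;
    % the Coriolis term vanishes because k x v_h is perpendicular to v_h:
    \frac{\partial K}{\partial t} =
      -\,\mathbf{v}\cdot\nabla K
      \;-\;\frac{1}{\rho}\,\mathbf{v}_h\cdot\nabla_h p
      \;+\;\mathbf{v}_h\cdot\mathbf{F}_h
    ```

    Differencing each term between the full-orography run and the flat-orography control run, then averaging over the inner domain, isolates the mountain wave's contribution, which is the "differential budget" procedure the abstract describes.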

  19. Thermal power generation projects "Large Scale Solar Heating"; EU Thermie projects "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, requested under the lead of Chalmers Industriteknik (CIT) in Sweden and carried out mainly for technology transfer. (orig.)

  20. A Novel CPU/GPU Simulation Environment for Large-Scale Biologically-Realistic Neural Modeling

    Directory of Open Access Journals (Sweden)

    Roger V Hoang

    2013-10-01

    Full Text Available Computational neuroscience is an emerging field that provides unique opportunities to study complex brain structures through realistic neural simulations. However, as biological details are added to models, the execution time for the simulation becomes longer. Graphics Processing Units (GPUs) are now being utilized to accelerate simulations due to their ability to perform computations in parallel. As such, they have shown significant improvements in execution time compared to Central Processing Units (CPUs). Most neural simulators utilize either multiple CPUs or a single GPU for better performance, but still show limitations in execution time when biological details are not sacrificed. Therefore, we present a novel CPU/GPU simulation environment for large-scale biological networks, the NeoCortical Simulator version 6 (NCS6). NCS6 is a free, open-source, parallelizable, and scalable simulator, designed to run on clusters of multiple machines, potentially with high-performance computing devices in each of them. It has built-in leaky integrate-and-fire (LIF) and Izhikevich (IZH) neuron models, but users also have the capability to design their own plug-in interface for different neuron types as desired. NCS6 is currently able to simulate one million cells and 100 million synapses in quasi real time by distributing data across these heterogeneous clusters of CPUs and GPUs.
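    The Izhikevich model mentioned as one of NCS6's two built-in neuron types can be written in a few lines. This is a generic textbook implementation of the published model with standard regular-spiking parameters, not code from NCS6 itself; the input current and simulation length are illustrative.

    ```python
    import numpy as np

    # Minimal single-neuron Izhikevich (IZH) model:
    #   dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    #   du/dt = a (b v - u)
    # with reset v -> c, u -> u + d when v reaches 30 mV.

    def izhikevich(I, dt=0.5, steps=2000, a=0.02, b=0.2, c=-65.0, d=8.0):
        """Forward-Euler integration; defaults are the standard
        regular-spiking parameters from Izhikevich's 2003 paper."""
        v, u = c, b * c
        trace, spikes = [], []
        for t in range(steps):
            v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
            u += dt * a * (b * v - u)
            trace.append(min(v, 30.0))  # clip the spike peak for plotting
            if v >= 30.0:               # spike: record time, then reset
                spikes.append(t * dt)
                v, u = c, u + d
        return np.array(trace), spikes

    trace, spikes = izhikevich(I=10.0)  # constant suprathreshold input
    print(f"{len(spikes)} spikes in {len(trace) * 0.5:.0f} ms")
    ```

    A simulator like NCS6 evaluates updates of this kind for millions of neurons per time step, which is what makes the CPU/GPU data distribution described in the abstract the central engineering problem.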