WorldWideScience

Sample records for large-scale production processes

  1. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  2. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension culture. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 × 10¹⁵ particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz, and the corresponding permittivity values were correlated with rAAV production. The permittivity profiles of both infected and uninfected cell cultures reached a maximum value; however, only the profile of the infected cultures reached a second maximum. This effect was correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the production process allows sufficient time for an additional 18 runs a year, corresponding to an extra production of ~2 × 10¹⁶ particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
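
    The "18 extra runs a year" figure implies a total per-run cycle time well beyond the 48-72 h infection phase (seed train, turnaround, cleaning). A back-of-envelope check, in which the ~120 h total cycle is an assumption chosen to reproduce the abstract's numbers, not a reported value:

```python
# Sanity check of the "18 extra runs/year ~ 2e16 extra particles" claim.
HOURS_PER_YEAR = 365 * 24
PARTICLES_PER_RUN = 1e15          # reported yield of one 40-l run

def runs_per_year(cycle_h: float) -> float:
    return HOURS_PER_YEAR / cycle_h

cycle_72hpi = 120.0               # assumed total cycle with 72 hpi harvest
cycle_48hpi = cycle_72hpi - 24.0  # harvesting at 48 hpi saves 24 h per run

extra_runs = runs_per_year(cycle_48hpi) - runs_per_year(cycle_72hpi)
print(f"extra runs/year: {extra_runs:.1f}")                      # ~18
print(f"extra particles: {extra_runs * PARTICLES_PER_RUN:.1e}")  # ~2e16
```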

  3. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process-scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large-scale operation. The process conditions identified are finally translated into microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Large-scale production of UO2 kernels by sol–gel process at INET

    International Nuclear Information System (INIS)

    Hao, Shaochang; Ma, Jingtao; Zhao, Xingyu; Wang, Yang; Zhou, Xiangwen; Deng, Changsheng

    2014-01-01

    In order to supply fuel elements (300,000 elements per year) for the Chinese pebble-bed modular high-temperature gas-cooled reactor (HTR-PM), it is necessary to scale up the production of UO₂ kernels to 3-6 kgU per batch. The sol-gel process for the preparation of UO₂ kernels has been improved and optimized at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, PR China, and a whole set of facilities was designed and constructed based on the process. This report briefly describes the main steps of the process, the key equipment and the production capacity of each step. Six batches of kernels for scale-up verification and four batches of kernels for fuel elements for in-pile irradiation tests have been successfully produced. The quality of the produced kernels meets the design requirements, and the production capacity of the process reaches 3-6 kgU per batch.
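
    A rough consistency check on the implied batch throughput, assuming ~7 g of uranium per fuel pebble (a typical pebble-bed figure; the abstract itself does not state the per-element loading):

```python
# Rough batch-count estimate; the per-pebble uranium mass is an assumption.
elements_per_year = 300_000
u_per_element_kg = 0.007          # assumption: ~7 g U per pebble
batch_size_kg = (3.0, 6.0)        # reported batch scale, kgU

annual_u_kg = elements_per_year * u_per_element_kg   # ~2100 kgU/year
batches = [annual_u_kg / b for b in batch_size_kg]
print(f"annual kernel demand: {annual_u_kg:.0f} kgU")
print(f"batches/year needed: {batches[0]:.0f} to {batches[1]:.0f}")
```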

  5. Computational Modelling of Large Scale Phage Production Using a Two-Stage Batch Process

    Directory of Open Access Journals (Sweden)

    Konrad Krysiak-Baltyn

    2018-04-01

    Cost-effective and scalable methods for phage production are required to meet an increasing demand for phage as an alternative to antibiotics. Computational models can assist the optimization of such production processes. A model is developed here that can simulate the dynamics of phage population growth and production in a two-stage, self-cycling process. The model incorporates variable infection parameters as a function of bacterial growth rate and employs ordinary differential equations, allowing application to a setup with multiple reactors. The model provides simple cost estimates as a function of key operational parameters, including substrate concentration, feed volume and cycling times. For the phage and bacteria pairing examined, costs and productivity varied by three orders of magnitude, with the lowest cost found to be most sensitive to the influent substrate concentration and the low-level setting in the first vessel. An example case study of phage production is also presented, showing how parameter values affect production costs and estimated production times. The approach presented is flexible and can be used to optimize phage production at laboratory or factory scale by minimizing costs or maximizing productivity.
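
    A minimal sketch of the kind of ODE system such models build on (Monod growth, phage adsorption, lysis). All parameter values here are illustrative, not the paper's; the authors' model additionally varies infection parameters with growth rate and spans two self-cycling stages:

```python
# Minimal phage-host batch dynamics (Monod growth + adsorption + lysis).
from scipy.integrate import solve_ivp

mu_max, Ks = 0.7, 0.1     # max growth rate (1/h), half-saturation (g/L)
Y = 5e11                  # cell yield (cells per g substrate)
delta = 1e-9              # phage adsorption rate constant (L/h per phage)
beta, lam = 100.0, 0.5    # burst size, lysis rate of infected cells (1/h)

def rhs(t, y):
    S, I, P, R = y                       # susceptible, infected, phage, substrate
    mu = mu_max * R / (Ks + R)           # Monod growth on remaining substrate
    infection = delta * S * P            # mass-action adsorption
    return [mu * S - infection,          # dS/dt
            infection - lam * I,         # dI/dt
            beta * lam * I - infection,  # dP/dt: bursts minus adsorbed phage
            -mu * S / Y]                 # dR/dt: substrate consumed by growth

sol = solve_ivp(rhs, (0.0, 24.0), [1e6, 0.0, 1e4, 5.0], method="LSODA")
print(f"final phage titer ~ {sol.y[2, -1]:.2e} per L")
```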

  6. Operational experience with large-scale biogas production at the Promest manure processing plant in Helmond, the Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig-breeding farms. The Dutch government has set a three-way policy to reduce this surplus: 1. conversion of animal fodder into a product with fewer and better ingestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large-scale processing plants. The first large-scale plant for the processing of liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to demonstrate at short notice whether large-scale manure processing might contribute to the solution of the manure surplus problem in The Netherlands. This steering committee is a partnership of the national and provincial governments and the agricultural industry. (au)

  7. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

    In today's world, manufacturing industries have to sustain their development and continuity in an increasingly competitive environment by decreasing their costs. The first step in the lean production transformation process is to analyze value-adding and non-value-adding activities. This study aims at applying the concepts of Value Stream Mapping (VSM) in a large-scale tractor company in Sakarya. Waste and process times are identified by mapping the current state of the platform production line. A future state is suggested, with improvements for the elimination of waste and a reduction of lead time, which went from 13.08 to 4.35 days. Analyses of the current and future states support the suggested improvements, and the cycle time of the platform production line is improved by 8%. Results show that VSM is a good alternative in decision-making for change in production processes.
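
    A quick check of what the reported lead-time figures amount to:

```python
# Lead-time improvement implied by the reported current/future states.
current, future = 13.08, 4.35                # days, from the abstract
reduction = (current - future) / current
print(f"lead time cut by {reduction:.1%}")   # ~66.7%
```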

  8. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources, and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  9. Large-scale production of poly(3-hydroxyoctanoic acid) by Pseudomonas putida GPo1 and a simplified downstream process.

    Science.gov (United States)

    Elbahloul, Yasser; Steinbüchel, Alexander

    2009-02-01

    The suitability of Pseudomonas putida GPo1 for large-scale cultivation and production of poly(3-hydroxyoctanoate) (PHO) was investigated in this study. Three fed-batch cultivations of P. putida GPo1 at the 350- or 400-liter scale were carried out in a bioreactor with a capacity of 650 liters, in mineral salts medium initially containing 20 mM sodium octanoate as the carbon source. The feeding solution included ammonium octanoate, which was fed at a relatively low concentration to promote PHO accumulation under nitrogen-limited conditions. During cultivation, the pH was regulated by addition of NaOH, NH₄OH, or octanoic acid, the latter serving as an additional carbon source. The partial O₂ pressure (pO₂) was adjusted to 20 to 40% by controlling the airflow and stirrer speed. Under the optimized conditions, P. putida GPo1 grew to cell densities as high as 18, 37, and 53 g cells (dry mass) (CDM) per liter, containing 49, 55, and 60% (wt/wt) PHO, respectively. The resulting 40 kg CDM from these three cultivations was used directly for extraction of PHO. Three different methods of PHO extraction were applied; of these, acetone extraction performed best and resulted in 94% recovery of the PHO content of the cells. A novel mixture of precipitation solvents composed of 70% (vol/vol) methanol and 70% (vol/vol) ethanol was identified in this study. A ratio of PHO concentrate to the mixture of 0.2:1 (vol/vol) allowed complete precipitation of PHO as white flakes, while at a ratio of 1:1 (vol/vol) of the solvent mixture to PHO concentrate a highly purified PHO was obtained. Precipitation yielded a dough-like polymeric material which was cast into thin layers and then shredded into small strips to allow evaporation of the remaining solvents. Gas chromatographic analysis revealed a purity of about 99% ± 0.2% (wt/wt) of the polymer, which consisted mainly of 3-hydroxyoctanoic acid (96 mol%).
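
    A rough overall-yield estimate from the reported numbers; the abstract does not give each cultivation's share of the 40 kg total, so a simple mean PHO content is assumed here:

```python
# Back-of-envelope recovered-PHO estimate from the reported figures.
cdm_total_kg = 40.0
pho_content = (0.49 + 0.55 + 0.60) / 3     # assumed mean PHO fraction of CDM
recovery = 0.94                            # acetone-extraction recovery
pho_recovered = cdm_total_kg * pho_content * recovery
print(f"~{pho_recovered:.1f} kg PHO recovered")   # ~20 kg
```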

  10. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus to a large extent the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP...

  11. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions to provide base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate the large-scale topographic base map provision by the Geospatial Information

  12. Towards large-scale production of solution-processed organic tandem modules based on ternary composites: Design of the intermediate layer, device optimization and laser based module processing

    DEFF Research Database (Denmark)

    Li, Ning; Kubis, Peter; Forberich, Karen

    2014-01-01

    on commercially available materials, which enhances the absorption of poly(3-hexylthiophene) (P3HT) and as a result increases the PCE of P3HT-based large-scale OPV devices; 3. laser-based module processing, which provides an excellent processing resolution and as a result can bring the power conversion... efficiency (PCE) of mass-produced organic photovoltaic (OPV) devices close to the highest PCE values achieved for lab-scale solar cells through a significant increase in the geometrical fill factor. We believe that the combination of the above-mentioned concepts provides a clear roadmap to push OPV towards...

  13. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  14. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  15. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the share of processing cost in the final value of the improved product. Another factor has been taken into account: the saturation of the market with the new product. In the case of successful projects the share of irradiation cost is low, and the demand for the better product is covered. The limited availability of sources makes even modest saturation of the market difficult if all food were subjected to correct radiation treatment. The implementation of food preservation requires a decided selection of those kinds of food which comply with all conditions, i.e. acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favours the use of low-energy electron beams. The conditions for successful processing are best fulfilled in the group of dry food, in expensive spices in particular. (author)

  16. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physiochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolutions. The increasing demand

  17. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in a large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in the daily production. In the kitchen...... not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results...... show that Lean can be applied and used to manage the production of meals in the kitchen....

  18. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
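
    The dimensionality-reduction and clustering step can be illustrated with scikit-learn. Note the study uses a supervised kernel PCA variant, which scikit-learn does not provide; the sketch below shows the unsupervised analogue on synthetic stand-ins for flattened moisture-flux fields:

```python
# Kernel PCA projection of per-event moisture-flux fields, then clustering.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_events, n_gridpoints = 200, 500
X = rng.normal(size=(n_events, n_gridpoints))   # one flattened field per flood event

# Project events into a low-dimensional space with an RBF kernel
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
Z = kpca.fit_transform(X)

# Cluster events, e.g. to separate locally driven from remotely forced floods
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
print(np.bincount(labels))
```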

  19. LARGE SCALE METHOD FOR THE PRODUCTION AND PURIFICATION OF CURIUM

    Science.gov (United States)

    Higgins, G.H.; Crane, W.W.T.

    1959-05-19

    A large-scale process for the production and purification of ²⁴²Cm is described. Aluminum slugs containing Am are irradiated and declad in a NaOH-NaNO₃ solution at 85 to 100°C. The resulting slurry is filtered and washed with NaOH, NH₄OH, and H₂O. Recovery of Cm from the filtrate and washings is effected by an Fe(OH)₃ precipitation. The precipitates are then combined and dissolved in HCl, and refractory oxides are centrifuged out. These oxides are then fused with Na₂CO₃ and dissolved in HCl. The solution is evaporated and LiCl solution added. The Cm, rare earths, and anionic impurities are adsorbed on a strong-base anion exchange resin. Impurities are eluted with LiCl-HCl solution; rare earths and Cm are eluted by HCl. Other ion exchange steps further purify the Cm. The Cm is then precipitated as fluoride and used in this form or further purified and processed. (T.R.H.)

  1. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  2. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  3. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. The chemical process is decomposed into smaller sub-systems at the first level, with a coordinator at the second level, for which a two-level hierarchical control strategy is designed. Each sub-system at the first level can be solved separately, using any conventional optimization algorithm. At the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear continuous stirred-tank reactor (CSTR), and its solution is compared with the one obtained using the centralized approach. The simulation results show the efficiency and capability of the proposed hierarchical approach in finding the optimal solution, over the centralized method.
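
    The general two-level idea can be shown on a toy problem: each first-level subproblem minimizes its own cost for a fixed coordination (price) variable, and the second-level coordinator updates the price with the coupling error. This is a generic goal-coordination sketch with an invented problem and step size; it is not the paper's algorithm or its CSTR case:

```python
# Two-level coordination: min (u1-2)^2 + (u2+1)^2  subject to  u1 + u2 = 0.
from scipy.optimize import minimize_scalar

def sub1(p):  # first-level subproblem 1: local cost plus priced coupling term
    return minimize_scalar(lambda u: (u - 2.0) ** 2 + p * u).x

def sub2(p):  # first-level subproblem 2
    return minimize_scalar(lambda u: (u + 1.0) ** 2 + p * u).x

p, step = 0.0, 0.8
for _ in range(50):                  # second-level coordinator iterations
    u1, u2 = sub1(p), sub2(p)
    error = u1 + u2                  # violation of the coupling constraint
    p += step * error                # gradient-type update on the price
print(f"u1={u1:.3f}, u2={u2:.3f}, p={p:.3f}, coupling error={error:.1e}")
```

    Here the coordinator converges to the price p = 1, giving u1 = 1.5 and u2 = -1.5, which is the optimum of the coupled problem.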

  4. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provide information critical to understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data-processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band-frequency waveforms, (3) spike sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
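
    A generic sketch of such a linked pipeline (not the authors' code): band-filter an epoch, count threshold-crossing spike samples, and compute the power spectral density in one automated pass. The sampling rate, band and threshold are assumptions:

```python
# One automated pass over a (stand-in) post-stimulation EEG epoch.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 1000.0  # sampling rate, Hz (assumed)

def analyze_epoch(eeg: np.ndarray, lo=4.0, hi=30.0, thresh_sd=3.0):
    # (2) user-defined frequency band
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    banded = filtfilt(b, a, eeg)
    # (3)-(4) crude spike detection and quantification via an SD threshold
    spikes = np.flatnonzero(np.abs(banded) > thresh_sd * banded.std())
    # (5) power spectral density
    freqs, psd = welch(banded, fs=FS, nperseg=1024)
    return {"n_spike_samples": spikes.size, "peak_freq_hz": freqs[np.argmax(psd)]}

post_stim = np.random.default_rng(1).normal(size=10_000)  # stand-in epoch
print(analyze_epoch(post_stim))
```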

  5. Large-scale enzymatic production of natural flavour esters in organic solvent with continuous water removal.

    Science.gov (United States)

    Gubicza, L; Kabiri-Badr, A; Keoves, E; Belafi-Bako, K

    2001-11-30

    A new, large-scale process was developed for the enzymatic production of low molecular weight flavour esters in organic solvent. Solutions for the elimination of substrate and product inhibitions are presented. The excess water produced during the process was continuously removed by hetero-azeotropic distillation and esters were produced at yields of over 90%.

  6. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    Numerous studies have been conducted in order to implement oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective effort toward carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing..., among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the non-gray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609 MW utility boiler is numerically studied, in which... calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also results in higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same...
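
    For context, the WSGG model referred to here expresses total gas emissivity as a weighted sum over a few gray gases plus one clear gas; a gray calculation collapses this sum to a single effective gas. The standard form (generic, not this paper's fitted coefficients) is:

```latex
% Weighted-sum-of-gray-gases (WSGG) total emissivity: a clear gas (k_0 = 0)
% plus N gray gases with temperature-dependent weights that sum to one;
% pL is the product of partial pressure and path length.
\varepsilon(T, pL) \;=\; \sum_{i=0}^{N} a_i(T)\,\bigl(1 - e^{-k_i\,pL}\bigr),
\qquad \sum_{i=0}^{N} a_i(T) = 1 .
```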

  7. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing computational requirements in JAERI, the introduction of a high-speed computer with a vector processing facility (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for the pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) The present computational load in JAERI is analyzed by compiling computer utilization statistics. 2) Vector processing efficiency is estimated for ten heavily-used nuclear codes by analyzing their dynamic behavior when run on a scalar machine. 3) Vector processing efficiency is measured for another five nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) The effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics of JAERI jobs. Problems with vector processors are also discussed from the viewpoints of code performance and ease of use. (author)
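
    The payoff from vectorization of the kind assessed here is often summarized with an Amdahl-type estimate. The sketch below uses illustrative numbers, not the report's measurements, and shows why codes with a low vectorizable fraction gain little even on a fast vector unit:

```python
# Amdahl-style estimate: f is the fraction of runtime that vectorizes,
# v is the vector/scalar speed ratio of the hardware.
def vector_speedup(f: float, v: float) -> float:
    return 1.0 / ((1.0 - f) + f / v)

for f in (0.5, 0.8, 0.95):
    print(f"vectorizable fraction {f:.0%}: speedup {vector_speedup(f, 10):.2f}x")
# 50% -> 1.82x, 80% -> 3.57x, 95% -> 6.90x with a 10x vector unit
```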

  8. Toyota production system beyond large-scale production

    CERN Document Server

    Ohno, Taiichi

    1998-01-01

    In this classic text, Taiichi Ohno--inventor of the Toyota Production System and Lean manufacturing--shares the genius that sets him apart as one of the most disciplined and creative thinkers of our time. Combining his candid insights with a rigorous analysis of Toyota's attempts at Lean production, Ohno's book explains how Lean principles can improve any production endeavor. A historical and philosophical description of just-in-time and Lean manufacturing, this work is a must read for all students of human progress. On a more practical level, it continues to provide inspiration and instruction for those seeking to improve efficiency through the elimination of waste.

  9. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  10. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, as well as the perspective of low-cost and large-scale production, has made these kinds of materials attractive in solar cell research.... The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed.... Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically...

  11. Quality Assurance in Large Scale Online Course Production

    Science.gov (United States)

    Holsombach-Ebner, Cinda

    2013-01-01

    The course design and development process (often referred to here as the "production process") at Embry-Riddle Aeronautical University (ERAU-Worldwide) aims to produce turnkey style courses to be taught by a highly-qualified pool of over 800 instructors. Given the high number of online courses and tremendous number of live sections…

  12. Evaluation of enzymatic reactors for large-scale panose production.

    Science.gov (United States)

    Fernandes, Fabiano A N; Rodrigues, Sueli

    2007-07-01

    Panose is a trisaccharide consisting of a maltose molecule bonded to a glucose molecule by an alpha-1,6-glycosidic bond. This trisaccharide has potential for use in the food industry as a noncariogenic sweetener, as the oral flora does not ferment it. Panose can also be considered prebiotic, since it stimulates the growth of beneficial microorganisms such as lactobacilli and bifidobacteria and inhibits the growth of undesired microorganisms such as E. coli and Salmonella. In this paper, the production of panose by enzymatic synthesis in batch and fed-batch reactors was optimized using a mathematical model developed to simulate the process. Results show that optimum production is obtained in a fed-batch process, with an optimum productivity of 11.23 g/l h of panose, which is 51.5% higher than the production in the batch reactor.
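
    The batch-reactor productivity implied by the reported improvement:

```python
# Back-calculate the batch productivity from the stated 51.5% advantage.
fed_batch = 11.23                 # g/(L*h), reported fed-batch productivity
batch = fed_batch / 1.515         # "51.5% higher" => fed_batch = 1.515 * batch
print(f"implied batch productivity ~= {batch:.2f} g/(L*h)")   # ~7.41
```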

  13. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so-called Workflow Management software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an information System for Tracking Assembly Lifecycles) in use in CMS which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  14. The potential for large scale uses for fission product xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-01-01

    Of all fission products in spent, low-enrichment uranium power reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the U.S., radioactive xenon isotopes approach the lowest limits of detection. The separation from accompanying radioactive ⁸⁵Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of ⁸⁵Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission-product krypton during fuel processing. Economic restraints ensure that the bulk of this krypton will need to be separated from the much more voluminous xenon fraction of the total fission gas. Xenon may thus be discarded or made available for use at probably very low cost. In contrast with many other fission products, which have unique radioactive characteristics that make them useful as sources of heat, gamma and X-rays and luminescence, as well as for medical diagnostics and therapeutics, fission-product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the ¹³⁴Xe and ¹³⁶Xe isotopes. Therefore, fission-product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent lighting in place of krypton and in human anesthesia.

  15. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  16. Logistics of large scale commercial IVF embryo production.

    Science.gov (United States)

    Blondin, P

    2016-01-01

    The use of IVF in agriculture is growing worldwide. This can be explained by the development of better IVF media and techniques, development of sexed semen and the recent introduction of bovine genomics on farms. Being able to perform IVF on a large scale, with multiple on-farm experts to perform ovum pick-up and IVF laboratories capable of handling large volumes in a consistent and sustainable way, remains a huge challenge. To be successful, there has to be a partnership between veterinarians on farms, embryologists in the laboratory and animal owners. Farmers must understand the limits of what IVF can or cannot do under different conditions; veterinarians must manage expectations of farmers once strategies have been developed regarding potential donors; and embryologists must maintain fluent communication with both groups to make sure that objectives are met within predetermined budgets. The logistics of such operations can be very overwhelming, but the return can be considerable if done right. The present mini review describes how such operations can become a reality, with an emphasis on the different aspects that must be considered by all parties.

  17. Large Scale Production of Stem Cells and Their Derivatives

    Science.gov (United States)

    Zweigerdt, Robert

    Stem cells have been envisioned to become an unlimited cell source for regenerative medicine. Notably, the interest in stem cells lies beyond direct therapeutic applications: they might also provide a previously unavailable source of valuable human cell types for screening platforms, which might facilitate the development of more efficient and safer drugs. The heterogeneity of stem cell types as well as the numerous areas of application suggests that distinct processes are mandatory for their in vitro culture. Many of the envisioned applications would require the production of a high number of stem cells and their derivatives in a scalable, well-defined and potentially clinically compliant manner under current good manufacturing practice (cGMP). In this review we provide an overview of recent strategies to develop bioprocesses for the expansion, differentiation and enrichment of stem cells and their progeny, presenting examples for adult and embryonic stem cells alike.

  18. Metoder for Modellering, Simulering og Regulering af Større Termiske Processer anvendt i Sukkerproduktion. Methods for Modelling, Simulation and Control of Large Scale Thermal Systems Applied in Sugar Production

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    The subject of this Ph.D. thesis is to investigate and develop methods for modelling, simulation and control applicable to large-scale thermal industrial plants. An ambition has been to evaluate the results on a physical process, and sugar production is well suited for the purpose. In collaboration... a simulator has been developed. The simulator handles the normal working conditions relevant to control engineers. A non-linear dynamic model based on mass and energy balances has been developed, and the model parameters have been adjusted to data measured at a Danish sugar plant. The simulator consists... of a computer, a data terminal and an electric interface corresponding to the interface at the sugar plant. The simulator operates in real time, and thus a realistic test of controllers is possible. The idiomatic control methodology has been investigated by developing a control concept for the evaporation...
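
    A minimal single-effect evaporator balance of the kind such mass-and-energy-balance simulators build on. All values are illustrative, not the thesis model:

```python
# Single-effect evaporator: total mass and solute (dry-substance) balances,
# with a fixed heat input driving vapor removal (vapor carries no solute).
def evaporator_step(M, x, dt, F=10.0, xF=0.15, Q=2.0e6, h_vap=2.2e6):
    """M: holdup (kg), x: solute mass fraction, F: feed (kg/s),
    xF: feed fraction, Q: heating power (W), h_vap: latent heat (J/kg)."""
    V = Q / h_vap                          # vapor draw-off (kg/s), energy balance
    L = F - V                              # liquor draw-off keeping holdup steady
    dM = (F - V - L) * dt                  # total mass balance (zero here)
    dx = (F * (xF - x) + V * x) * dt / M   # solute balance
    return M + dM, x + dx

M, x = 5000.0, 0.15
for _ in range(3600):                      # simulate one hour at 1 s steps
    M, x = evaporator_step(M, x, dt=1.0)
print(f"holdup {M:.0f} kg, concentration {x:.3f}")   # concentration rises toward ~0.165
```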

  19. Market competitive Fischer-Tropsch diesel production. Techno-economic and environmental analysis of a thermo-chemical Biorefinery process for large scale biosyngas-derived FT-diesel production

    International Nuclear Information System (INIS)

    Van Ree, R.; Van der Drift, A.; Zwart, R.W.R.; Boerrigter, H.

    2005-08-01

    The contents of the presentation are summarized as follows: an introduction to the Dutch policy framework; biomass availability and contractibility; and biomass transportation fuels: current use and perspectives. The next subject concerns large-scale BioSyngas production: the optimum gasification technology; the slagging EF gasifier; identification and modelling of biomass-conversion chains; overall energetic chain efficiencies, economics and environmental characteristics; and a comparison with fossil-derived diesel. Further subjects are the current technological state of the art and the R, D and D trajectory; the pre-design of a 600 MWth demonstration plant; and the conclusions.

  1. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, and there has been no research work in related fields in China or abroad. The mode of production should be transformed from the existing Industry 2.0, or partly Industry 3.0, that is, from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great number of tasks need to be settled in terms of management and technology, such as the evolution of the workshop structure, the development of intelligent equipment and changes in the business model. Along with these comes the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop, and analyzes the workshop's working efficiency, which is significant for the next step of the transformation to the plane segmentation intelligent workshop.

  2. Ethanol Production from Biomass: Large Scale Facility Design Project

    Energy Technology Data Exchange (ETDEWEB)

    Berson, R. Eric [Univ. of Louisville, KY (United States)

    2009-10-29

    High-solids processing of biomass slurries provides the following benefits: maximized product concentration in the fermentable sugar stream, reduced water usage, and reduced reactor size. However, high-solids processing poses mixing and heat transfer problems above about 15% solids for pretreated corn stover, due to the high slurry viscosity. Highly viscous slurries also require high power consumption in conventional stirred tanks, since they must be run at high rotational speeds to maintain proper mixing. An 8-liter scraped-surface bioreactor (SSBR) is employed here that is designed to efficiently handle high solids loadings for enzymatic saccharification of pretreated corn stover (PCS) while maintaining power requirements on the order of those for low-viscosity liquids in conventional stirred tanks. Saccharification of biomass exhibits slow reaction rates and incomplete conversion, which may be attributed to enzyme deactivation and loss of activity through a variety of mechanisms. Enzyme deactivation is classified into two categories here: first, deactivation due to enzyme-substrate interactions; and second, deactivation due to all other factors, which are grouped together and termed "non-specific" deactivation. A study was conducted to investigate the relative extents of non-specific deactivation and deactivation due to enzyme-substrate interactions, and a model was developed that describes the kinetics of cellulose hydrolysis by considering the observed deactivation effects. Enzyme-substrate interactions had a much more significant effect on overall deactivation, with a deactivation rate constant about 20 times higher than the non-specific deactivation rate constant (0.35 h⁻¹ vs 0.018 h⁻¹). The model is well validated by the experimental data and predicts complete conversion of cellulose within 30 hours in the absence of enzyme-substrate interactions.
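
    The two reported deactivation constants can be dropped into a simple first-order deactivation model to see their relative effect. The sketch below uses the reported constants but a simplified, invented hydrolysis rate law; k_cat and the initial enzyme load are illustrative, and the paper's validated model is more detailed:

```python
# Conversion with and without enzyme-substrate deactivation.
from scipy.integrate import solve_ivp

kd_substrate = 0.35   # 1/h, deactivation via enzyme-substrate interactions
kd_nonspec = 0.018    # 1/h, "non-specific" deactivation
k_cat = 0.05          # 1/h per unit active enzyme (illustrative value)

def rhs(t, y, kd_total):
    C, E = y                       # cellulose conversion, active enzyme
    dC = k_cat * E * (1.0 - C)     # simplified hydrolysis rate law
    dE = -kd_total * E             # first-order enzyme deactivation
    return [dC, dE]

for label, kd in [("both mechanisms", kd_substrate + kd_nonspec),
                  ("non-specific only", kd_nonspec)]:
    sol = solve_ivp(rhs, (0.0, 30.0), [0.0, 10.0], args=(kd,))
    print(f"{label}: conversion at 30 h = {sol.y[0, -1]:.2f}")
# With non-specific deactivation only, conversion is essentially complete by 30 h.
```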

  3. Performance of mushroom fruiting for large scale commercial production

    International Nuclear Information System (INIS)

    Mat Rosol Awang; Rosnani Abdul Rashid; Hassan Hamdani Mutaat; Mohd Meswan Maskom

    2012-01-01

    The paper describes the determination of mushroom fruiting yield, which is vital to the economics of mushroom production. Consistency in mushroom yields enables revenues to be estimated and hence profitability to be predicted. As reported by many growers, there is large variation in mushroom yields over different production periods. To assess such claims we ran four batches of mushroom fruiting, and the fruiting body production performance is presented. (author)

  4. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancements of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study proposes a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology covers part of the overall design procedure and has various limitations. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operating restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system.

  5. Scenario analysis of large scale algae production in tubular photobioreactors

    NARCIS (Netherlands)

    Slegers, P.M.; Beveren, van P.J.M.; Wijffels, R.H.; Straten, van G.; Boxtel, van A.J.B.

    2013-01-01

    Microalgae productivity in tubular photobioreactors depends on algae species, location, tube diameter, biomass concentration, distance between tubes and for vertically stacked systems, the number of horizontal tubes per stack. A simulation model for horizontal and vertically stacked horizontal

  6. Large-scale distribution of tritium in a commercial product

    International Nuclear Information System (INIS)

    Combs, F.; Doda, R.J.

    1979-01-01

    Tritium enters the environment from various sources, including nuclear reactor operations, weapons testing, natural production, and the manufacture, use and ultimate disposal of commercial products containing tritium. A recent commercial application of tritium in the United States of America involves the backlighting of liquid crystal displays (LCDs) in digital electronic watches. These watches are distributed through normal commercial channels to the general public. One million curies (1 MCi) of tritium were distributed in 1977 in this product. This is a significant quantity of tritium compared with power-reactor-produced tritium (3 MCi yearly) or with naturally produced tritium (6 MCi yearly), and it is the single largest commercial application involving tritium to date. The final disposition of tritium from large quantities of this product, after its useful life, must be estimated by considering the means of disposal and the possibility of dispersal of tritium concurrent with disposal. The most likely method of final disposition of this product will be disposal in solid refuse; this includes burial in landfills and incineration. Burial in landfills will probably contain the tritium for its effective lifetime, whereas incineration will release all the tritium gas (as the oxide) to the atmosphere. The use and disposal of this product will be studied as part of an environmental study at present being prepared for the U.S. Nuclear Regulatory Commission. (author)

  7. Large scale production of antitumor cucurbitacins from Ecballium ...

    African Journals Online (AJOL)

    2012-08-16

    ¹Department of Plant Biotechnology, National Research Center, Cairo, 12622 Egypt. ... Bioreactor plays a vital role in the commercial production of secondary metabolites ... comparing the peak area with that at the same retention time with ... air dried by rotary evaporator and then extracted using ethanol: ...

  8. Guide to Large Scale Production of Moringa oleifera (Lam.) for ...

    African Journals Online (AJOL)

    Thus, the use of environmentally safe natural pesticides as an alternative is being embraced in aquaculture because they have a short time to toxicity disappearance and are biodegradable. This will enhance the principles of sustainable aquaculture production and its management in Nigeria. In Southwestern Nigeria, there is ...

  9. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University of Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) programme of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Science. The aim of the EET project is to reduce the costs of a solar module by 50%, by means of increasing the conversion efficiency as well as developing cheap processes for large-scale production.

  10. Manufacturing Process Simulation of Large-Scale Cryotanks

    Science.gov (United States)

    Babai, Majid; Phillips, Steven; Griffin, Brian

    2003-01-01

    NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the varying approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows for cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank. Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing

  11. LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY

    International Nuclear Information System (INIS)

    SCHULTZ, K.R.; BROWN, L.C.; BESENBRUCH, G.E.; HAMILTON, C.J.

    2003-01-01

    OAK B202 LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY. The "Hydrogen Economy" will reduce petroleum imports and greenhouse gas emissions. However, current commercial hydrogen production processes use fossil fuels and release carbon dioxide. Hydrogen produced from nuclear energy could avoid these concerns. The authors have recently completed a three-year project for the US Department of Energy whose objective was to "define an economically feasible concept for production of hydrogen, by nuclear means, using an advanced high-temperature nuclear reactor as the energy source". Thermochemical water-splitting, a chemical process that accomplishes the decomposition of water into hydrogen and oxygen, met this objective. The goal of the first phase of this study was to evaluate thermochemical processes which offer the potential for efficient, cost-effective, large-scale production of hydrogen and to select one for further detailed consideration; the authors selected the Sulfur-Iodine cycle. In the second phase, they reviewed all the basic reactor types for suitability to provide the high-temperature heat needed by the selected thermochemical water-splitting cycle, and chose the helium gas-cooled reactor. In the third phase they designed the chemical flowsheet for the thermochemical process and estimated the efficiency and cost of the process and the projected cost of producing hydrogen. These results are summarized in this paper.

  12. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Faced with similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction.

  13. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by the ongoing environmental degradation through processes, such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although estimated gains in food production deliberately have been overestimated and losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  14. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    Science.gov (United States)

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

    Isoprenoids and alkanes produced and secreted by microorganisms are emerging as an alternative biofuel for diesel and jet fuel replacements. In a similar way as for other bioprocesses comprising an organic liquid phase, the presence of microorganisms, medium composition, and process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding on the microscale can give insights into how to improve large-scale processes and the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population increases and economic developments would strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop growth varies by region and crop. To deal with such variations, we used the Markov chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
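
    A minimal sketch of the Markov chain Monte Carlo calibration step described above, using a Metropolis random walk; the one-parameter stand-in for the crop model, the observed yields, and the Gaussian likelihood are placeholders for illustration, not the study's actual model:

        import numpy as np

        rng = np.random.default_rng(0)
        obs_yield = np.array([4.1, 4.5, 3.9, 4.8, 4.3])     # hypothetical regional yields (t/ha)

        def model(theta):                                    # stand-in for the crop growth model
            return theta * np.ones_like(obs_yield)

        def log_post(theta, sigma=0.4):                      # flat prior + Gaussian likelihood
            return -0.5 * np.sum(((obs_yield - model(theta)) / sigma) ** 2)

        theta, samples = 3.0, []
        for _ in range(20_000):                              # Metropolis random walk
            prop = theta + 0.1 * rng.normal()
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta)
        print(np.mean(samples[5000:]), np.std(samples[5000:]))   # posterior mean ≈ 4.3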

  16. Very-large-scale production of antibodies in plants: The biologization of manufacturing.

    Science.gov (United States)

    Buyel, J F; Twyman, R M; Fischer, R

    2017-07-01

    Gene technology has facilitated the biologization of manufacturing, i.e. the use and production of complex biological molecules and systems at an industrial scale. Monoclonal antibodies (mAbs) are currently the major class of biopharmaceutical products, but they are typically used to treat specific diseases which individually have comparatively low incidences. The therapeutic potential of mAbs could also be used for more prevalent diseases, but this would require a massive increase in production capacity that could not be met by traditional fermenter systems. Here we outline the potential of plants to be used for the very-large-scale (VLS) production of biopharmaceutical proteins such as mAbs. We discuss the potential market sizes and their corresponding production capacities. We then consider available process technologies and scale-down models and how these can be used to develop VLS processes. Finally, we discuss which adaptations will likely be required for VLS production, lessons learned from existing cell culture-based processes and the food industry, and practical requirements for the implementation of a VLS process. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    Full Text Available The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g/h at an energy consumption of about 10 Wh/g. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39×10⁻⁴ Ωm and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 litres was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.
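
    The reported throughput figures imply a continuous power draw and per-batch yield that follow from simple arithmetic, using only the numbers quoted above (the batch duration is taken as the stated 4-hour upper bound):

        rate_g_per_h = 500           # production rate quoted above (g/h, lower bound)
        energy_Wh_per_g = 10         # specific energy consumption quoted above (Wh/g)
        print(rate_g_per_h * energy_Wh_per_g / 1000, "kW")   # ≈ 5 kW continuous draw
        batch_hours = 4              # a 100-litre batch is processed in under 4 hours
        print(rate_g_per_h * batch_hours / 1000, "kg")       # order of 2 kg nanographite per batch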

  18. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
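
    As an illustration of the static case described above, the following minimal sketch regresses a KPI on process variables from normal operation, then flags samples whose prediction residual exceeds a threshold; the synthetic data, model coefficients, and 3-sigma limit are assumptions for illustration, not the paper's implementation:

        import numpy as np

        rng = np.random.default_rng(0)
        beta_true = np.array([1.0, -0.5, 0.3, 0.0, 0.2])      # unknown "true" process-to-KPI link
        X_train = rng.normal(size=(1000, 5))                  # process variables, normal operation
        kpi_train = X_train @ beta_true + 0.05 * rng.normal(size=1000)

        theta, *_ = np.linalg.lstsq(X_train, kpi_train, rcond=None)  # least-squares KPI model
        limit = 3 * (kpi_train - X_train @ theta).std()       # simple 3-sigma detection threshold

        X_new = rng.normal(size=(10, 5))                      # new operating data
        kpi_meas = X_new @ beta_true + 0.05 * rng.normal(size=10)
        kpi_meas[0] += 1.0                                    # inject a KPI-relevant fault in sample 0
        print(np.abs(kpi_meas - X_new @ theta) > limit)       # only the first sample should be flagged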

  19. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  20. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large-scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept, and the supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  1. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.

  2. Large-scale production and properties of human plasma-derived activated Factor VII concentrate.

    Science.gov (United States)

    Tomokiyo, K; Yano, H; Imamura, M; Nakano, Y; Nakagaki, T; Ogata, Y; Terano, T; Miyamoto, S; Funatsu, A

    2003-01-01

    An activated Factor VII (FVIIa) concentrate, prepared from human plasma on a large scale, has to date not been available for clinical use for haemophiliacs with antibodies against FVIII and FIX. In the present study, we attempted to establish a large-scale manufacturing process to obtain plasma-derived FVIIa concentrate with high recovery and safety, and to characterize its biochemical and biological properties. FVII was purified from human cryoprecipitate-poor plasma by a combination of anion exchange and immunoaffinity chromatography, using a Ca2+-dependent anti-FVII monoclonal antibody. To activate FVII, a FVII preparation that was nanofiltered using a Bemberg Microporous Membrane-15 nm was partially converted to FVIIa by autoactivation on an anion-exchange resin. The residual FVII in the FVII and FVIIa mixture was completely activated by further incubating the mixture in the presence of Ca2+ for 18 h at 10 °C, without any additional activators. For preparation of the FVIIa concentrate, after dialysis of FVIIa against 20 mM citrate, pH 6.9, containing 13 mM glycine and 240 mM NaCl, the FVIIa preparation was supplemented with 2.5% human albumin (which was first pasteurized at 60 °C for 10 h) and lyophilized in vials. To inactivate viruses contaminating the FVIIa concentrate, the lyophilized product was further heated at 65 °C for 96 h in a water bath. Total recovery of FVII from 15 000 l of plasma was approximately 40%, and the FVII preparation was fully converted to FVIIa with trace amounts of degraded products (FVIIaβ and FVIIaγ). The specific activity of the FVIIa was approximately 40 U/µg. Furthermore, virus-spiking tests demonstrated that immunoaffinity chromatography, nanofiltration and dry-heating effectively removed and inactivated the spiked viruses in the FVIIa. These results indicated that the FVIIa concentrate had both high specific activity and safety. We established a large-scale manufacturing process of human plasma

  3. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention on networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
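
    A minimal sketch of this decomposition idea, using scikit-learn's affinity propagation clustering and canonical correlation analysis on synthetic data; the data, the use of correlation as cluster similarity, and the number of retained inputs are assumptions, not the authors' implementation:

        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 12))                        # candidate input (process) variables
        Y = np.hstack([X[:, :3] @ rng.normal(size=(3, 4)),    # two groups of controlled variables,
                       X[:, 3:6] @ rng.normal(size=(3, 4))])  # driven by different inputs
        Y += 0.1 * rng.normal(size=Y.shape)

        # Step 1: partition the controlled variables by clustering their correlation pattern.
        labels = AffinityPropagation(affinity="precomputed",
                                     random_state=0).fit(np.corrcoef(Y.T)).labels_

        # Step 2: for each cluster (subsystem), rank candidate inputs by canonical correlation.
        for k in np.unique(labels):
            idx = np.where(labels == k)[0]
            scores = []
            for j in range(X.shape[1]):
                u, v = CCA(n_components=1).fit(X[:, [j]], Y[:, idx]).transform(X[:, [j]], Y[:, idx])
                scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
            print(f"subsystem {k}: outputs {idx.tolist()}, "
                  f"top inputs {np.argsort(scores)[-3:].tolist()}")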

  4. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  5. The use of soil moisture - remote sensing products for large-scale groundwater modeling and assessment

    NARCIS (Netherlands)

    Sutanudjaja, E.H.

    2012-01-01

    In this thesis, the possibilities of using spaceborne remote sensing for large-scale groundwater modeling are explored. We focus on a soil moisture product called European Remote Sensing Soil Water Index (ERS SWI, Wagner et al., 1999) - representing the upper profile soil moisture. As a test-bed, we

  6. Polymerase-endonuclease amplification reaction (PEAR) for large-scale enzymatic production of antisense oligonucleotides.

    Directory of Open Access Journals (Sweden)

    Xiaolong Wang

    Full Text Available Antisense oligonucleotides targeting microRNAs or their mRNA targets prove to be powerful tools for molecular biology research and may eventually emerge as new therapeutic agents. Synthetic oligonucleotides are often contaminated with highly homologous failure sequences. Synthesis of a given oligonucleotide is difficult to scale up because it requires expensive equipment, hazardous chemicals and a tedious purification process. Here we report a novel thermocyclic reaction, the polymerase-endonuclease amplification reaction (PEAR), for the amplification of oligonucleotides. A target oligonucleotide and a tandem repeated antisense probe are subjected to repeated cycles of denaturing, annealing, elongation and cleaving, in which thermostable DNA polymerase elongation and strand slipping generate duplex tandem repeats, and thermostable endonuclease (PspGI) cleavage releases monomeric duplex oligonucleotides. Each round of PEAR achieves over 100-fold amplification. The product can be used directly in one more round of PEAR, and the process can be further repeated. In addition to avoiding hazardous materials and improving product purity, this reaction is easy to scale up and amenable to full automation. PEAR has the potential to be a useful tool for large-scale production of antisense oligonucleotide drugs.
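
    Since each round achieves over 100-fold amplification, the attainable amounts compound geometrically across rounds; a quick check of the arithmetic (the starting amount and round count below are hypothetical):

        start_ng = 1.0               # hypothetical starting amount of target oligonucleotide (ng)
        fold_per_round = 100         # each PEAR round achieves over 100-fold amplification (see above)
        for rounds in range(1, 4):
            print(rounds, "round(s):", start_ng * fold_per_round ** rounds, "ng")
        # 1 round: 1e2 ng; 2 rounds: 1e4 ng; 3 rounds: 1e6 ng (= 1 mg), at minimum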

  7. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal mass spectrum distribution exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  8. Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity

    Science.gov (United States)

    Morris, C. K.; Knighton, J.

    2017-12-01

    Nitrous oxide is produced by the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how the global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that the representation of spatial heterogeneity in inputs, specifically soil moisture, causes inaccuracies in estimating the average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is underpredicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5 (CLM). The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide a wider context to the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
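
    The under-prediction can be illustrated with a toy example: apply a nonlinear response function to averaged soil moisture versus averaging the locally computed responses. The convex moisture response below is an assumed placeholder, not the CLM4.5 parameterization:

        import numpy as np

        def n2o_rate(theta):
            """Hypothetical convex N2O response to soil moisture (illustrative only)."""
            return theta ** 4        # denitrification rises sharply as soils approach saturation

        theta = np.random.default_rng(1).uniform(0.2, 0.9, size=10_000)  # heterogeneous grid cell
        print(n2o_rate(theta.mean()))    # flux from the averaged moisture (what a coarse model sees)
        print(n2o_rate(theta).mean())    # average of the local fluxes (the "true" grid-cell flux)
        # The second value is larger: averaging the input before applying a convex
        # response under-predicts grid-cell N2O production (Jensen's inequality).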

  9. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  10. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.

  11. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    One simple but efficient carbon-supplying device was designed and developed, and the associated carbon-supplying technology is described. The absorption characteristics of this device were studied. The carbon-supplying system proved to be economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond, and the gaseous carbon dioxide absorptivity was enhanced to above 78%, which could greatly reduce the production cost.

  12. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergistic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development that often focus on a single chemical entity. Therefore, methods employing a few markers or pharmacologically active constituents to assess the quality and authenticity of the complex preparations face a number of severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has been reported to be applicable to the quality analysis of traditional Chinese medicine. Metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and, by means of chemometrics, classification of samples of diverse biological status, origin, or quality. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single analyses and has considerable potential to become a powerful tool for traditional Chinese medicine quality assessment. Georg Thieme Verlag KG Stuttgart · New York.

  13. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements that form the interconnection of large-scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to construct large-scale SCI-based processing systems from thousands of nodes, these nodes have to be interconnected by switch elements in different topologies. A summary of the requirements and key points of interconnection networks and switches is presented. Two models of SCI switch elements are proposed. The authors investigate, through simulations, several examples of systems constructed with these switches, and the results are analyzed. Issues and enhancements behind the switch design that can improve performance and reduce latency are discussed. 29 refs., 11 figs., 3 tabs

  14. Large-scale methanol plants. [Based on Japanese-developed process

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made on how to produce methanol economically; methanol is expected to grow in demand as a feedstock for pollution-free energy or for chemical use. The study centered on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems attending the expansion of scale. The results of this study have already been adopted in actual plants with good results, and large-scale methanol plants are going to be realized.

  15. Large-scale calculations of the beta-decay rates and r-process nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Borzov, I N; Goriely, S [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Pearson, J M [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Lab. de Physique Nucleaire, Univ. de Montreal, Montreal (Canada)]

    1998-06-01

    An approximation to a self-consistent model of the ground state and β-decay properties of neutron-rich nuclei is outlined. The structure of the β-strength functions in stable and short-lived nuclei is discussed. The results of large-scale calculations of the β-decay rates for spherical and slightly deformed nuclides of relevance to the r-process are analysed and compared with the results of existing global calculations and recent experimental data. (orig.)

  16. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  17. The effects of large scale processing on caesium leaching from cemented simulant sodium nitrate waste

    International Nuclear Information System (INIS)

    Lee, D.J.; Brown, D.J.

    1982-01-01

    The effects of large-scale processing on the properties of cemented simulant sodium nitrate waste have been investigated. Leach tests have been performed on full-size drums, cores and laboratory samples of cement formulations containing Ordinary Portland Cement (OPC), Sulphate Resisting Portland Cement (SRPC) and a blended cement (90% ground granulated blast furnace slag/10% OPC). In addition, the development of the cement hydration exotherms with time and the temperature distribution in 220 dm³ samples have been followed. (author)

  18. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    Science.gov (United States)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

    Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize very large scale α-Si3N4 nanowires (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas precursor supersaturation and liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, transmission electron microscopy and room-temperature photoluminescence measurement. The yield of the products relates not only to the reaction temperature (a thermodynamic condition) but also to the distribution of gas precursors (a kinetic condition). As revealed in this research, by controlling the gas diffusion process, the yield of the nanowire products could be greatly improved. The experimental results indicate that supersaturation is the dominant factor in the as-designed system rather than the catalyst. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift which could be valuable for future applications in blue-green emitting devices. There is no doubt that the large-scale products are the basis of these applications.

  19. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  20. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  1. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software

  2. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of the world's hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then, a state-of-the-art survey of the electrolysis modules currently available was made. A review of the large-scale electrolysis plants that have been installed in the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers is discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)
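
    The influence of electricity price on hydrogen production cost can be sketched with a typical specific electricity consumption of roughly 50 kWh per kg of hydrogen; this figure is a generic assumption for water electrolysis, not a number from the study:

        kwh_per_kg_h2 = 50.0                    # assumed specific consumption of electrolysis
        for price in (0.02, 0.05, 0.10):        # electricity price scenarios ($/kWh)
            print(f"{price:.2f} $/kWh -> electricity cost ≈ {kwh_per_kg_h2 * price:.2f} $/kg H2")
        # At 50 kWh/kg, every cent per kWh adds about 0.50 $ to each kilogram of hydrogen.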

  3. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus the problem of fault detection and isolation for these processes is concerned more with the root cause and fault propagation before quantitative methods are applied to local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed at the end.
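
    Among the data-driven methods listed above, Granger causality is simple to demonstrate; a minimal two-variable test using statsmodels on synthetic data (the lag structure and coefficients below are arbitrary illustrations):

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        x = rng.normal(size=600)
        y = np.zeros(600)
        for t in range(2, 600):                 # y depends on past x: x "Granger-causes" y
            y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

        # Columns are ordered [effect, cause]; small p-values suggest x helps predict y.
        grangercausalitytests(np.column_stack([y, x]), maxlag=3)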

  4. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    Science.gov (United States)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of a Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variations in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of the Six Sigma methodology.

  5. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high-quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, an ongoing D&D project for which a firm fixed-price contract had been issued to the D&D contractor. Thus, interferences with the baseline D&D project could have significant financial implications. Other challenges included defining and selecting meaningful technology demonstrations, finding and selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D

  6. Large-scale functional networks connect differently for processing words and symbol strings.

    Science.gov (United States)

    Liljeström, Mia; Vartiainen, Johanna; Kujala, Jan; Salmelin, Riitta

    2018-01-01

    Reconfigurations of synchronized large-scale networks are thought to be central neural mechanisms that support cognition and behavior in the human brain. Magnetoencephalography (MEG) recordings together with recent advances in network analysis now allow for sub-second snapshots of such networks. In the present study, we compared frequency-resolved functional connectivity patterns underlying reading of single words and visual recognition of symbol strings. Word reading emphasized coherence in a left-lateralized network with nodes in classical perisylvian language regions, whereas symbol processing recruited a bilateral network, including connections between frontal and parietal regions previously associated with spatial attention and visual working memory. Our results illustrate the flexible nature of functional networks, whereby processing of different form categories, written words vs. symbol strings, leads to the formation of large-scale functional networks that operate at distinct oscillatory frequencies and incorporate task-relevant regions. These results suggest that category-specific processing should be viewed not so much as a local process but as a distributed neural process implemented in signature networks. For words, increased coherence was detected particularly in the alpha (8-13 Hz) and high gamma (60-90 Hz) frequency bands, whereas increased coherence for symbol strings was observed in the high beta (21-29 Hz) and low gamma (30-45 Hz) frequency range. These findings attest to the role of coherence in specific frequency bands as a general mechanism for integrating stimulus-dependent information across brain regions.
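
    Frequency-resolved coherence of the kind used in this study can be sketched with scipy; the two synthetic "regional" signals, the sampling rate, and the band edges below are illustrative assumptions only:

        import numpy as np
        from scipy.signal import coherence

        fs = 600.0                                          # MEG-like sampling rate (Hz), assumed
        rng = np.random.default_rng(0)
        t = np.arange(0, 60, 1 / fs)
        shared = np.sin(2 * np.pi * 10 * t)                 # common 10 Hz (alpha) drive
        sig_a = shared + rng.normal(size=t.size)            # two "regions" sharing the alpha rhythm
        sig_b = shared + rng.normal(size=t.size)

        f, Cxy = coherence(sig_a, sig_b, fs=fs, nperseg=1024)
        alpha = (f >= 8) & (f <= 13)
        print("mean alpha-band coherence:", Cxy[alpha].mean())  # elevated versus other bands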

  7. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  8. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources being in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  9. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available This paper discusses key points of operational activities in large-scale, geographically spread software development projects and examines the required QA process structure in such projects. Up-to-date methods of integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed, namely sequential, agile, and PRINCE2-based, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in a large, geographically spread hybrid software development project are reviewed, and recommendations are given to tackle those challenges. Conclusions are drawn about choosing the most suitable methodology and applying it to a particular project.

  10. Luminescence property and large-scale production of ZnO nanowires by current heating deposition

    International Nuclear Information System (INIS)

    Singjai, P.; Jintakosol, T.; Singkarat, S.; Choopun, S.

    2007-01-01

    Large-scale production of ZnO nanowires has been demonstrated by current heating deposition. Based on a solid-vapor phase carbothermal sublimation technique, a ZnO-graphite mixed rod was placed between two copper bars and gradually heated by passing current through it under a constant flow of argon gas at atmospheric pressure. The product, seen as white films deposited on the rod surface, was separated for further characterization. The results show mainly comb-like structures of ZnO nanowires with diameters ranging from 50 to 200 nm and lengths up to several tens of micrometers. In optical testing, ionoluminescence spectra of as-grown and annealed samples showed high green emission intensities centered at 510 nm. In contrast, the small UV peak centered at 390 nm was observed clearly in the as-grown sample and almost disappeared after the annealing treatment.

  11. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    Energy Technology Data Exchange (ETDEWEB)

    Crater, Jason [Gemomatica, Inc., San Diego, CA (United States); Galleher, Connor [Gemomatica, Inc., San Diego, CA (United States); Lievense, Jeff [Gemomatica, Inc., San Diego, CA (United States)

    2017-05-12

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment-model approach with an integrated black-box kinetic model of the production microbe, and also suggests including calculations for stirred-tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica further suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting or engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.

  12. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in unsaturated zone hydrology. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow identifying research needs in the interdisciplinary domain of modelling and monitoring and improving the integration of unsaturated zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.

  13. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Full Text Available Abstract Background Large-scale protein structure alignment, an indispensable tool in structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign can take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues of protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs.

  14. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large-scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is the production of uniformly mixed metal oxides on a fine scale. Production of both yttrium- and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas. Evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination.

  15. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions, such as the super-spectral Copernicus Sentinels, EnMAP and FLEX, will soon provide unprecedented data streams. Very high resolution (VHR) sensors like WorldView-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, COSMO-SkyMed, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms can cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation in temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
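
    Among the speed-up strategies listed, random Fourier features are the most compact to sketch: a randomized feature map whose inner products approximate an RBF kernel, so that a linear solve replaces the intractable n x n kernel matrix. A minimal Python sketch with illustrative dimensions and bandwidth (not values from the study):

        import numpy as np

        def rff(X, D=500, gamma=0.1, seed=0):
            # Random Fourier features: inner products of the mapped data
            # approximate the RBF kernel exp(-gamma * ||x - y||^2).
            rng = np.random.default_rng(seed)
            W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(X.shape[1], D))
            b = rng.uniform(0.0, 2.0 * np.pi, size=D)
            return np.sqrt(2.0 / D) * np.cos(X @ W + b)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 8))            # stand-in for sounder channels
        y = X[:, 0] ** 2 + 0.1 * rng.normal(size=1000)
        Z = rff(X)                                # n x D, no n x n kernel matrix
        w = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(Z.shape[1]), Z.T @ y)
        print(round(float(np.abs(Z @ w - y).mean()), 3))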

  16. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimizing key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also the communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communication on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  17. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

    The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities at the expense of high spatial and temporal resolution of computing. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity to develop a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework simulating the ductile fracture process zone in microstructural detail. The experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been linked to multiscale microstructures in a realistic 3D numerical model in a direct manner. (author)

  18. Large-Scale Selection and Breeding To Generate Industrial Yeasts with Superior Aroma Production

    Science.gov (United States)

    Steensels, Jan; Meersman, Esther; Snoek, Tim; Saels, Veerle

    2014-01-01

    The concentrations and relative ratios of various aroma compounds produced by fermenting yeast cells are essential for the sensory quality of many fermented foods, including beer, bread, wine, and sake. Since the production of these aroma-active compounds varies highly among different yeast strains, careful selection of variants with optimal aromatic profiles is of crucial importance for a high-quality end product. This study evaluates the production of different aroma-active compounds in 301 different Saccharomyces cerevisiae, Saccharomyces paradoxus, and Saccharomyces pastorianus yeast strains. Our results show that the production of key aroma compounds like isoamyl acetate and ethyl acetate varies by an order of magnitude between natural yeasts, with the concentrations of some compounds showing significant positive correlation, whereas others vary independently. Targeted hybridization of some of the best aroma-producing strains yielded 46 intraspecific hybrids, of which some show a distinct heterosis (hybrid vigor) effect and produce up to 45% more isoamyl acetate than the best parental strains while retaining their overall fermentation performance. Together, our results demonstrate the potential of large-scale outbreeding to obtain superior industrial yeasts that are directly applicable for commercial use. PMID:25192996

  19. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Nunes, Suzana Pereira; Amy, Gary L.

    2014-01-01

    The flux performance of different hydrophobic microporous flat-sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench-scale (high δT) and large-scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m²h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large-scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench-scale (high δT) operating conditions.
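
    The headline numbers can be cross-checked with simple arithmetic; the module area below is a hypothetical figure for illustration only:

        # Bench-scale flux and the stated 8x reduction at module scale.
        bench_flux = 88.8               # kg/m2.h, PTFE membrane, high dT (60 C)
        module_flux = bench_flux / 8    # abstract: ~8x lower at low dT
        print(round(module_flux, 1))    # ~11.1 kg/m2.h

        # Daily permeate for a hypothetical 10 m2 module at that flux:
        print(round(module_flux * 10 * 24), "kg/day")   # ~2664 kg/day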

  20. The large-scale process of microbial carbonate precipitation for nickel remediation from an industrial soil.

    Science.gov (United States)

    Zhu, Xuejiao; Li, Weila; Zhan, Lu; Huang, Minsheng; Zhang, Qiuzhuo; Achal, Varenyam

    2016-12-01

    Microbial carbonate precipitation is known as an efficient process for the remediation of heavy metals from contaminated soils. In the present study, a urease-positive bacterial isolate, identified as Bacillus cereus NS4 through 16S rDNA sequencing, was utilized on a large scale to remove nickel from industrial soil contaminated by the battery industry. The soil was highly contaminated with an initial total nickel concentration of approximately 900 mg kg⁻¹. The soluble-exchangeable fraction was reduced to 38 mg kg⁻¹ after treatment. The primary objective of metal stabilization was achieved by reducing the bioavailability through immobilizing the nickel in the urease-driven carbonate precipitation. The nickel removal in the soils contributed to the transformation of nickel from mobile species into stable biominerals identified as calcite, vaterite, aragonite and nickelous carbonate when analyzed under XRD. It was proven that during precipitation of calcite, Ni²⁺, with an ion radius close to that of Ca²⁺, was incorporated into the CaCO₃ crystal. The biominerals were also characterized by using SEM-EDS to observe the crystal shape and Raman-FTIR spectroscopy to predict the responsible bonding during bioremediation with respect to Ni immobilization. The electronic structure and chemical-state information of the detected elements during the MICP bioremediation process was studied by XPS. This is the first study in which microbial carbonate precipitation was used for the large-scale remediation of metal-contaminated industrial soil. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Optimization of Large-Scale Culture Conditions for the Production of Cordycepin with Cordyceps militaris by Liquid Static Culture

    Directory of Open Access Journals (Sweden)

    Chao Kang

    2014-01-01

    Full Text Available Cordycepin is one of the most important bioactive compounds produced by species of Cordyceps sensu lato, but it is hard to produce large amounts of this substance in industrial production. In this work, single-factor design, Plackett-Burman design, and central composite design were employed to establish the key factors and identify optimal culture conditions which improved cordycepin production. Using these culture conditions, a maximum cordycepin production of 2008.48 mg/L was achieved for a 700 mL working volume in 1000 mL glass jars, and the total content of cordycepin reached 1405.94 mg/bottle. This method provides an effective way of increasing cordycepin production at a large scale. The strategies used in this study could find wide application in other fermentation processes.
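
    The two reported figures are mutually consistent, as a one-line check shows:

        # Titer times working volume reproduces the reported per-bottle total.
        titer_mg_per_L, working_volume_L = 2008.48, 0.7
        print(titer_mg_per_L * working_volume_L)   # 1405.936 ~ 1405.94 mg/bottle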

  3. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the voluminous data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise from processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also
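
    The quoted ingest rate alone conveys the storage scale involved:

        # ~100 TB/day for a NISAR-class mission accumulates to tens of PB/year.
        tb_per_day = 100
        print(tb_per_day * 365 / 1000, "PB/year")   # 36.5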

  4. New systems for the large-scale production of male tsetse flies (Diptera: Glossinidae)

    International Nuclear Information System (INIS)

    Opiyo, E.; Luger, D.; Robinson, A.S.

    2000-01-01

    morsitans morsitans Westwood produced a total of 500,000 sterile males. In Burkina Faso, between 1976 and 1984, a colony of 330,000 G. palpalis gambiensis Vanderplank and G. tachinoides Westwood provided 950,000 sterile males for release into an area of 3,000 km 2 (Clair et al. 1990) while during the Bicot project in Nigeria in an area of 1,500 km 2 , 1.5 million sterile male G. p. palpalis Robineau-Desvoidy were released (Olandunmade et al. 1990). Recently, 8.5 million sterile males were released on Unguja Island, Zanzibar, the United Republic of Tanzania in an area of 1,600 km 2 produced by a colony of about 600,000 G. austeni Newstead (Saleh et al. 1997, Kitwika et al. 1997). This led to the eradication of the tsetse population and a massive reduction in disease incidence in cattle (Saleh et al. 1997). Tsetse fly SIT has been applied on a limited scale because of the inability to provide large numbers of sterile males for release. The present rearing system is labour intensive and too many quality sensitive steps in the mass production system are not sufficiently standardised to transfer the system directly to large-scale production. Tsetse rearing evolved from feeding on live hosts to an in vitro rearing system where blood is fed to flies through a silicone membrane (Feldmann 1994a). At present, cages are small, hold a small number of flies and have to be manually transferred for feeding and then returned for pupal collection. This limits the number of flies that can be handled at any one time. In order to improve these processes, a Tsetse Production Unit (TPU) was developed and evaluated. During conventional tsetse rearing, flies need to be sexed with the correct number and sex of flies, whether for stocking production cages or for the release of males only. This has to be done by hand on an individual fly basis following the immobilisation of adults at C. A procedure is reported in this paper for the self-stocking of production cages (SSPC) which enables flies to

  5. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Huiqiang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China); Shi, Zhenyu; Li, Shuang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); Liu, Na, E-mail: Naliujlu@163.com [College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China)

    2016-08-30

    Highlights: • A microwave method for synthesizing g-C₃N₄ with N₂ photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C₃N₄. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C₃N₄) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N₂ adsorption, UV–vis spectroscopy, SEM, N₂-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment can form many irregular pores in the as-prepared g-C₃N₄, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C₃N₄. These nitrogen vacancies not only serve as active sites to adsorb and activate N₂ molecules but also promote interfacial charge transfer from the catalysts to N₂ molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for the large-scale production of g-C₃N₄, which is significantly important for practical application.

  7. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

    Full Text Available Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small-scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research have demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large-scale, partially closed system method for the manufacturing of clinical-grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm² flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation.
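
    The stated volume ratios also imply a rough flask equivalence:

        # Harvest volume: ~145x a T225 flask and ~3x a CF10 (from the abstract),
        # implying one CF10 yields roughly 145/3 ~ 48 T225-equivalents here.
        print(round(145 / 3))   # ~48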

  8. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10⁵ mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.
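
    A rough order-of-magnitude check connects the quoted yield to "several transfusion units"; the platelets-per-MK figure below is an assumed illustrative in vitro value, not a number from the study, and a transfusion unit is taken as roughly 3 x 10^11 platelets:

        mks_per_hpsc = 2e5       # mature MKs per input hPSC (abstract)
        starting_hpscs = 1e6     # "as few as 1 million starting hPSCs"
        platelets_per_mk = 10    # assumed illustrative in vitro yield
        unit = 3e11              # platelets commonly taken per transfusion unit
        print(mks_per_hpsc * starting_hpscs * platelets_per_mk / unit)  # ~6.7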

  9. Production of microbial biosurfactants: Status quo of rhamnolipid and surfactin towards large-scale production.

    Science.gov (United States)

    Henkel, Marius; Geissler, Mareen; Weggenmann, Fabiola; Hausmann, Rudolf

    2017-07-01

    Surfactants are an important class of industrial chemicals. Nowadays, oleochemical surfactants such as alkyl polyglycosides (APGs) are becoming increasingly important. This trend towards the utilization of renewable resources continues, and consumers increasingly demand environmentally friendly products. Consequently, research on microbial surfactants has drastically increased in recent years. While established industrial processes exist for mannosylerythritol lipids and sophorolipids, an implementation of other microbially derived surfactants has not yet been achieved. Amongst these biosurfactants, rhamnolipids synthesized by Pseudomonas aeruginosa and surfactin produced by Bacillus subtilis are so far the most analyzed biosurfactants due to their exceptional properties and the concomitant possible applications. In this review, a general overview is given of the current status of biosurfactants and the benefits attributed to these molecules. Furthermore, the most recent research approaches for both rhamnolipids and surfactin are presented with respect to possible methods for industrial processes and the drawbacks and limitations researchers have to address and overcome. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, and 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pigs causes inner ear malfunction and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  11. Rain forest nutrient cycling and productivity in response to large-scale litter manipulation.

    Science.gov (United States)

    Wood, Tana E; Lawrence, Deborah; Clark, Deborah A; Chazdon, Robin L

    2009-01-01

    Litter-induced pulses of nutrient availability could play an important role in the productivity and nutrient cycling of forested ecosystems, especially tropical forests. Tropical forests experience such pulses as a result of wet-dry seasonality and during major climatic events, such as strong El Niños. We hypothesized that (1) an increase in the quantity and quality of litter inputs would stimulate leaf litter production, woody growth, and leaf litter nutrient cycling, and (2) the timing and magnitude of this response would be influenced by soil fertility and forest age. To test these hypotheses in a Costa Rican wet tropical forest, we established a large-scale litter manipulation experiment in two secondary forest sites and four old-growth forest sites of differing soil fertility. In replicated plots at each site, leaves and twigs were removed from, or added to, the forest floor. We analyzed leaf litter mass, [N] and [P], and N and P inputs for addition, removal, and control plots over a two-year period. We also evaluated the basal area increment of trees in removal and addition plots. There was no response of forest productivity or nutrient cycling to litter removal; however, litter addition significantly increased leaf litter production and N and P inputs 4-5 months following litter application. Litter production increased by as much as 92%, and P and N inputs by as much as 85% and 156%, respectively. In contrast, litter manipulation had no significant effect on woody growth. The increases in leaf litter production and N and P inputs were significantly positively related to the total P that was applied in litter form. Neither litter treatment nor forest type influenced the temporal pattern of any of the variables measured. Thus, environmental factors such as rainfall drive temporal variability in litter and nutrient inputs, while nutrient release from decomposing litter influences the magnitude. Seasonal or annual variation in leaf litter mass, such as occurs in strong El Niño events, could positively

  13. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.
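
    The core Six Sigma metric such an integration would track is defects per million opportunities (DPMO) and the corresponding sigma level under the conventional 1.5-sigma shift; a minimal Python sketch with an illustrative defect count:

        from statistics import NormalDist

        defects, opportunities = 70, 1_000_000   # illustrative counts
        dpmo = defects / opportunities * 1e6
        sigma = NormalDist().inv_cdf(1 - dpmo / 1e6) + 1.5
        print(dpmo, round(sigma, 2))   # 70.0 -> ~5.3; 3.4 DPMO would give 6.0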

  14. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    Science.gov (United States)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering firms, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  15. Large-scale grain growth in the solid-state process: From "Abnormal" to "Normal"

    Science.gov (United States)

    Jiang, Minhong; Han, Shengnan; Zhang, Jingwei; Song, Jiageng; Hao, Chongyan; Deng, Manjiao; Ge, Lingjing; Gu, Zhengfei; Liu, Xinyu

    2018-02-01

    Abnormal grain growth (AGG) has been a common phenomenon in ceramic and metallurgical processing since prehistoric times. However, it has usually been very difficult to grow big single crystals (over centimeter scale) using the AGG method due to its so-called occasionality. Based on AGG, a solid-state crystal growth (SSCG) method was developed. The greatest advantages of the SSCG technology are the simplicity and cost-effectiveness of the technique. But the traditional SSCG technology is still uncontrollable. This article first summarizes the history and current status of AGG, then reports recent technical developments from AGG to SSCG, and further introduces a new seed-free, solid-state crystal growth (SFSSCG) technology. This SFSSCG method allows us to repeatedly and controllably fabricate large-scale single crystals with appreciably high quality and relatively stable chemical composition at a relatively low temperature, at least in (K0.5Na0.5)NbO3 (KNN) and Cu-Al-Mn systems. In this sense, the exaggerated grain growth is no longer 'Abnormal' but 'Normal', since it can now be artificially controlled and repeated. This article also provides a crystal growth model to qualitatively explain the mechanism of SFSSCG for the KNN system. Compared with the traditional melt and high-temperature solution growth methods, the SFSSCG method has the advantages of low energy consumption, low investment, simple technique and composition homogeneity, overcoming the issues with incongruent melting and high volatility. SFSSCG could be helpful for improving the mechanical and physical properties of single crystals, which should be promising for industrial applications.

  16. Forest landscape models, a tool for understanding the effect of the large-scale and long-term landscape processes

    Science.gov (United States)

    Hong S. He; Robert E. Keane; Louis R. Iverson

    2008-01-01

    Forest landscape models have become important tools for understanding large-scale and long-term landscape (spatial) processes such as climate change, fire, windthrow, seed dispersal, insect outbreak, disease propagation, forest harvest, and fuel treatment, because controlled field experiments designed to study the effects of these processes are often not possible (...

  17. Large-scale production of Fischer-Tropsch diesel from biomass. Optimal gasification and gas cleaning systems

    International Nuclear Information System (INIS)

    Boerrigter, H.; Van der Drift, A.

    2004-12-01

    The paper is presented in the form of copies of overhead sheets. The contents cover definitions, an overview of integrated biomass gasification and Fischer-Tropsch (FT) systems (state of the art, gas cleaning and biosyngas production, experimental demonstration, and conclusions), some aspects of large-scale systems (motivation, biomass import), and an outlook.

  18. Large-Scale Production of Fuel and Feed from Marine Microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Huntley, Mark [Cornell Univ., Ithaca, NY (United States)

    2015-09-30

    In summary, this Consortium has demonstrated a fully integrated process for the production of biofuels and high-value nutritional bioproducts at pre-commercial scale. We have achieved unprecedented yields of algal oil, and converted the oil to viable fuels. We have demonstrated the potential value of the residual product as a viable feed ingredient for many important animals in the global food supply.

  19. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  20. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian. Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10⁷ ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  1. Technical data summary: Uranium(IV) production using a large scale electrochemical cell

    International Nuclear Information System (INIS)

    Hsu, T.C.

    1984-05-01

    This Technical Data Summary outlines an electrochemical process to produce U(IV), in the form of uranous nitrate, from U(VI), as uranyl nitrate. U(IV) with hydrazine could then be used as an alternative plutonium reductant to substantially reduce the waste volume from the Purex solvent extraction process. This TDS is divided into three parts. The first part (Chapters I to IV) generally describes the electrochemical production of U(IV). The second part (Chapters V to VII) describes a pilot-scale U(IV) production facility that was constructed and operated at an engineering semiworks area of SRP, referred to as TNX. The last part (Chapter VIII) describes a preliminary design for a full-scale facility that would meet the projected need for U(IV) as a reductant in SRP's separations processes. The preliminary design was described in a Basic Data Summary for the U(IV) production facility, and a Venture Guidance Appraisal (VGA) was prepared from the Basic Data Summary. The VGA for the U(IV) process showed that, because of the large capital investment required, this approach to waste reduction was not economically competitive with another alternative that required only modifying the ongoing Purex process at no additional capital cost. However, implementing the U(IV) process as part of an overall canyon renovation, presently scheduled for the 1990's, may be economically attractive. The purpose of this TDS is therefore to bring together the information and experience obtained thus far in the U(IV) program so that a useful body of information will be available to support any future development of this process.
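
    The underlying cathode chemistry, as commonly written for uranyl reduction in nitric acid media (a textbook form, not quoted from this report), with hydrazine present to scavenge the nitrous acid that would otherwise reoxidize the U(IV), is:

        \mathrm{UO_2^{2+} + 4\,H^+ + 2\,e^- \longrightarrow U^{4+} + 2\,H_2O}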

  2. Authentication of Fish Products by Large-Scale Comparison of Tandem Mass Spectra

    DEFF Research Database (Denmark)

    Wulff, Tune; Nielsen, Michael Engelbrecht; Deelder, André M.

    2013-01-01

    Authentication of food is a major concern worldwide to ensure that food products are correctly labeled in terms of which animals are actually processed for consumption. Normally, authentication is based on species recognition by comparison of selected sequences of DNA or protein. We here present a new robust, proteome-wide tandem mass spectrometry method for species recognition and food product authentication. The method does not use or require any genome sequences or selection of tandem mass spectra but uses all acquired data. The experimental steps were performed in a simple, standardized
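
    One common way to compare two tandem mass spectra is to bin peaks onto a shared m/z grid and take the cosine similarity; the Python sketch below illustrates that idea only and is not the paper's proteome-wide scoring scheme:

        import numpy as np

        def spectrum_cosine(spec_a, spec_b, bin_width=1.0, mz_max=2000.0):
            # Bin (m/z, intensity) peaks onto a shared grid, then take
            # the cosine similarity of the two binned vectors.
            n_bins = int(mz_max / bin_width)
            va, vb = np.zeros(n_bins), np.zeros(n_bins)
            for mz, intensity in spec_a:
                va[min(int(mz / bin_width), n_bins - 1)] += intensity
            for mz, intensity in spec_b:
                vb[min(int(mz / bin_width), n_bins - 1)] += intensity
            denom = np.linalg.norm(va) * np.linalg.norm(vb)
            return float(va @ vb / denom) if denom else 0.0

        a = [(175.1, 30.0), (262.6, 100.0), (376.2, 55.0)]   # toy peak lists
        b = [(175.1, 28.0), (262.7, 90.0), (420.0, 10.0)]
        print(round(spectrum_cosine(a, b), 3))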

  3. Investigating Coastal Processes Responsible for Large-Scale Shoreline Responses to Human Shoreline Stabilization

    Science.gov (United States)

    Slott, J. M.; Murray, A. B.; Ashton, A. D.

    2006-12-01

    Human shoreline stabilization practices, such as beach nourishment (i.e. placing sand on an eroding beach), have become more prevalent as erosion threatens coastal communities. On sandy shorelines, recent experiments with a numerical model of shoreline change (Slott et al., in press) indicate that moderate shifts in storminess patterns, one possible outcome of global warming, may accelerate the rate at which shorelines erode or accrete, by altering the angular distribution of approaching waves (the 'wave climate'). Accelerated erosion would undoubtedly place greater demands on stabilization. Scientists and coastal engineers have typically only considered the site-specific consequences of shoreline stabilization; here we explore the coastal processes responsible for large-scale (tens of km) and long-term (decades) effects using a numerical model developed by Ashton et al. (2001). In this numerical model, waves breaking at oblique angles drive a flux of sediment along the shoreline, where gradients in this flux can shape the coastline into surprisingly complex forms (e.g. the cuspate capes found on the Carolina coast). Wave "shadowing" plays a major role in shoreline evolution, whereby coastline features may block incoming waves from reaching distant parts. In this work, we include beach nourishment in the Ashton et al. (2001) model. Using a cuspate-cape shoreline as our initial model condition, we conducted pairs of experiments and varied the wave-climate forcing across each pair, each representing a different storminess scenario. Here we report on one scenario featuring increased extra-tropical storm influence. For each experiment pair we ran a control experiment with no shoreline stabilization and a second where a beach nourishment project stabilized a cape tip. By comparing the results of these two parallel runs, we isolate the tendency of the shoreline to migrate landward or seaward along the domain due solely to beach nourishment. Significant effects from beach
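
    The bookkeeping at the heart of such models is a one-dimensional conservation law: the shoreline advances or retreats in proportion to alongshore gradients in sediment flux, d(eta)/dt = -(1/D) dQ/dx. A minimal Python sketch with an illustrative flux field and parameters (not the Ashton et al. flux law):

        import numpy as np

        n, dx, dt, depth = 100, 100.0, 0.1, 10.0  # cells, m, yr, closure depth (m)
        eta = np.zeros(n)                         # shoreline position (m)
        x = np.linspace(0.0, 2.0 * np.pi, n)
        Q = 1e4 * np.sin(x)                       # alongshore sediment flux (m3/yr)

        for _ in range(100):                      # 10 simulated years
            # d(eta)/dt = -(1/D) dQ/dx, central differences on interior cells
            eta[1:-1] -= dt / depth * (Q[2:] - Q[:-2]) / (2.0 * dx)

        print(round(eta.min(), 1), round(eta.max(), 1))  # erosion vs accretion (m)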

  4. Large-scale gas dynamical processes affecting the origin and evolution of gaseous galactic halos

    Science.gov (United States)

    Shapiro, Paul R.

    1991-01-01

    Observations of galactic halo gas are consistent with an interpretation in terms of the galactic fountain model, in which supernova-heated gas in the galactic disk escapes into the halo, radiatively cools and forms clouds which fall back to the disk. The results of a new study of several large-scale gas dynamical effects which are expected to occur in such a model for the origin and evolution of galactic halo gas will be summarized, including the following: (1) nonequilibrium absorption line and emission spectrum diagnostics for radiatively cooling halo gas in our own galaxy, as well as the implications of such absorption line diagnostics for the origin of quasar absorption lines in the galactic halo clouds of high-redshift galaxies; (2) numerical MHD simulations and analytical analysis of large-scale explosions and superbubbles in the galactic disk and halo; (3) numerical MHD simulations of halo cloud formation by thermal instability, with and without magnetic field; and (4) the effect of the galactic fountain on the galactic dynamo.

  5. Potential for large-scale uses for fission-product Xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-03-01

    Of all fission products in spent, low-enrichment-uranium power-reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the US, radioactive xenon isotopes approach the lowest limits of detection. The separation from the accompanying radioactive ⁸⁵Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of ⁸⁵Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission-product krypton during fuel processing. Economic restraints assure that the bulk of this krypton will need to be separated from the much-more-voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for uses at probably very low cost. In contrast with many other fission products which have unique radioactive characteristics that make them useful as sources of heat, gamma and x-rays, and luminescence - as well as for medicinal diagnostics and therapeutics - fission-product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the ¹³⁴Xe and ¹³⁶Xe isotopes. Therefore, fission-product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent light illumination in place of krypton and in human anesthesia.

  6. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...
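
    The continuous KPI-and-baseline idea is easy to sketch; the following is an illustrative toy in Python, not the FeedZai Pulse API: an exponentially weighted baseline with a simple deviation alert.

        def stream_kpi(events, alpha=0.1, rel_threshold=0.5):
            # Exponentially weighted baseline; flag values deviating from
            # it by more than rel_threshold * baseline as they stream in.
            baseline = None
            for value in events:
                if baseline is None:
                    baseline = float(value)
                    continue
                if abs(value - baseline) > rel_threshold * baseline:
                    yield value, round(baseline, 2)
                baseline += alpha * (value - baseline)

        print(list(stream_kpi([10, 11, 10, 12, 11, 30, 11, 10])))  # [(30, 10.35)]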

  7. H1 Grid production tool for large scale Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lobodzinski, B; Wissing, Ch [DESY, Hamburg (Germany); Bystritskaya, E; Vorobiew, M [ITEP, Moscow (Russian Federation); Karbach, T M [University of Dortmund (Germany); Mitsyn, S [JINR, Moscow (Russian Federation); Mudrinic, M, E-mail: bogdan.lobodzinski@desy.d [VINS, Belgrad (Serbia)

    2010-04-01

    The H1 Collaboration at HERA has entered the period of high-precision analyses based on the final data sample. These analyses require a massive production of simulated Monte Carlo (MC) events. The H1 MC framework (H1MC) is a software for mass MC production on the LCG Grid infrastructure and on a local batch system, created by the H1 Collaboration. The aim of the tool is full automatisation of the MC production workflow, including management of the MC jobs on the Grid down to copying of the resulting files from the Grid to the H1 mass storage tape device. The H1 MC framework has a modular structure, delegating a specific task to each module, including tasks specific to the H1 experiment: automatic building of steer and input files, simulation of the H1 detector, reconstruction of particle tracks and post-processing calculations. Each module provides data or functionality needed by other modules via a local database. The Grid jobs created for detector simulation and reconstruction from generated MC input files are fully independent and fault-tolerant for the 32- and 64-bit LCG Grid architectures, and in the Grid running state they can be continuously monitored using the Relational Grid Monitoring Architecture (R-GMA) service. To monitor the full production chain and detect potential problems, regular checks of the job state are performed using the local database and the Service Availability Monitoring (SAM) framework. The improved stability of the system has resulted in a dramatic increase in the production rate, which exceeded two billion MC events in 2008.

  8. Large Scale Product Recommendation of Supermarket Ware Based on Customer Behaviour Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Kanavos

    2018-05-01

    Full Text Available In this manuscript, we present a prediction model based on the behaviour of each customer using data mining techniques. The proposed model utilizes a supermarket database and an additional database from Amazon, both containing information about customers' purchases. Subsequently, our model analyzes these data in order to classify customers as well as products, being trained and validated with real data. This model is targeted towards classifying customers according to their consuming behaviour and consequently proposes new products more likely to be purchased by them. The corresponding prediction model is intended to be utilized as a tool for marketers so as to provide analytically targeted and specified consumer behaviour. Our algorithmic framework and the subsequent implementation employ a cloud infrastructure and use the MapReduce programming environment, a model for processing large data sets in a parallel manner with a distributed algorithm on computer clusters, as well as Apache Spark, a newer framework built on the same principles as Hadoop. Through a MapReduce model application on each step of the proposed method, text processing speed and scalability are enhanced compared to traditional methods. Our results show that the proposed method predicts the purchases of a supermarket with high accuracy.
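
    The map and reduce steps the abstract alludes to can be illustrated in miniature with co-purchase counting; in production the same two steps would run distributed on Hadoop or Spark (a toy sketch, not the paper's implementation):

        from collections import defaultdict
        from itertools import combinations

        baskets = [["milk", "bread", "eggs"], ["milk", "bread"], ["bread", "jam"]]

        # Map: emit ((a, b), 1) for every product pair in each basket.
        mapped = [(pair, 1) for basket in baskets
                  for pair in combinations(sorted(basket), 2)]

        # Reduce: sum counts per pair; frequent pairs feed the recommendations.
        counts = defaultdict(int)
        for pair, one in mapped:
            counts[pair] += one
        print(max(counts.items(), key=lambda kv: kv[1]))  # (('bread', 'milk'), 2)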

  9. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    OpenAIRE

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been...

  10. Large-scale membrane transfer process: its application to single-crystal-silicon continuous membrane deformable mirror

    International Nuclear Information System (INIS)

    Wu, Tong; Sasaki, Takashi; Hane, Kazuhiro; Akiyama, Masayuki

    2013-01-01

    This paper describes a large-scale membrane transfer process developed for the construction of large-scale membrane devices via the transfer of continuous single-crystal-silicon membranes from one substrate to another. This technique is applied for fabricating a large stroke deformable mirror. A bimorph spring array is used to generate a large air gap between the mirror membrane and the electrode. A 1.9 mm × 1.9 mm × 2 µm single-crystal-silicon membrane is successfully transferred to the electrode substrate by Au–Si eutectic bonding and the subsequent all-dry release process. This process provides an effective approach for transferring a free-standing large continuous single-crystal-silicon to a flexible suspension spring array with a large air gap. (paper)

  11. A 3D Sphere Culture System Containing Functional Polymers for Large-Scale Human Pluripotent Stem Cell Production

    Directory of Open Access Journals (Sweden)

    Tomomi G. Otsuji

    2014-05-01

    Full Text Available Utilizing human pluripotent stem cells (hPSCs) in cell-based therapy and drug discovery requires large-scale cell production. However, scaling up conventional adherent cultures presents challenges of maintaining a uniform high quality at low cost. In this regard, suspension cultures are a viable alternative, because they are scalable and do not require adhesion surfaces. 3D culture systems such as bioreactors can be exploited for large-scale production. However, the limitations of current suspension culture methods include spontaneous fusion between cell aggregates and suboptimal passaging methods by dissociation and reaggregation. 3D culture systems that dynamically stir carrier beads or cell aggregates should be refined to reduce the shearing forces that damage hPSCs. Here, we report a simple 3D sphere culture system that incorporates mechanical passaging and functional polymers. This setup resolves major problems associated with suspension culture methods and dynamic stirring systems and may be optimal for applications involving large-scale hPSC production.

  12. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  13. Constructing Model of Relationship among Behaviors and Injuries to Products Based on Large Scale Text Data on Injuries

    Science.gov (United States)

    Nomori, Koji; Kitamura, Koji; Motomura, Yoichi; Nishida, Yoshifumi; Yamanaka, Tatsuhiro; Komatsubara, Akinori

    In Japan, childhood injury prevention is an urgent issue. Safety measures based on knowledge created from injury data are essential for preventing childhood injuries; the injury-prevention approach of product modification is especially important. Risk assessment is one of the most fundamental methods for designing safe products. Conventional risk assessment has been carried out subjectively because product makers have little data on injuries. This paper deals with evidence-based risk assessment, for which artificial intelligence technologies are strongly needed. It describes a new method of foreseeing product usage, which is the first step of evidence-based risk assessment, and presents a retrieval system for injury data. The system enables a product designer to foresee how children use a product and which types of injuries occur due to the product in daily environments. The developed system consists of large-scale injury data, text mining technology and probabilistic modeling technology. Large-scale text data on childhood injuries were collected from medical institutions by an injury surveillance system. Types of behaviors towards a product were derived from the injury text data using text mining technology. The relationship among products, types of behaviors, types of injuries and characteristics of children was modeled with a Bayesian Network. The fundamental functions of the developed system and examples of new findings obtained with it are reported in this paper.
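
    As a minimal sketch of the probabilistic modelling step described above, assuming the pgmpy library; the variable names, their states, and the data file are hypothetical stand-ins, not the actual surveillance data:

    ```python
    # Minimal sketch (hypothetical variables and data) of a Bayesian Network
    # relating product, child characteristics, behaviour, and injury type.
    import pandas as pd
    from pgmpy.models import BayesianNetwork
    from pgmpy.estimators import MaximumLikelihoodEstimator
    from pgmpy.inference import VariableElimination

    # One row per injury report: product, age group, behaviour, injury type.
    data = pd.read_csv("injury_reports.csv")

    model = BayesianNetwork([("product", "behaviour"),
                             ("age_group", "behaviour"),
                             ("behaviour", "injury_type")])
    model.fit(data, estimator=MaximumLikelihoodEstimator)

    # Foresee likely behaviours/injuries conditioned on a given product.
    infer = VariableElimination(model)
    print(infer.query(["injury_type"], evidence={"product": "chair"}))
    ```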

  14. Investigation of factors influencing biogas production in a large-scale thermophilic municipal biogas plant

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, Agnes; Jerome, Valerie; Freitag, Ruth [Bayreuth Univ. (Germany). Chair for Process Biotechnology; Burghardt, Diana; Likke, Likke; Peiffer, Stefan [Bayreuth Univ. (Germany). Dept. of Hydrology; Hofstetter, Eugen M. [RVT Process Equipment GmbH, Steinwiesen (Germany); Gabler, Ralf [BKW Biokraftwerke Fuerstenwalde GmbH, Fuerstenwalde (Germany)

    2009-10-15

    A continuously operated, thermophilic, municipal biogas plant was monitored over 26 months (sampling twice per month) with regard to a number of physicochemical parameters and biogas production. Biogas yields were correlated with parameters such as the volatile fatty acid concentration, the pH and the ammonium concentration. When the resident microbiota was classified via analysis of the 16S rRNA genes, most bacterial sequences matched unidentified or uncultured bacteria from similar habitats. Of the archaeal sequences, 78.4% were identified as belonging to the genus Methanoculleus, which had not previously been reported for biogas plants but is known to efficiently use the H2 and CO2 produced during the degradation of fatty acids by syntrophic microorganisms. In order to further investigate the influence of varied amounts of ammonia (2-8 g/L) and volatile fatty acids on biogas production and composition (methane/CO2), laboratory-scale satellite experiments were performed in parallel to the technical plant. Finally, ammonia stripping of the process water of the technical plant was implemented, a measure through which the ammonia entering the biogas reactor via the mash could be nearly halved and which increased the energy output of the biogas plant by almost 20%. (orig.)
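
    The correlation step reported above can be illustrated in a few lines of pandas; the column names and data file are hypothetical stand-ins for the plant's monitoring records:

    ```python
    # Illustrative sketch: correlating biogas yield with the monitored
    # physicochemical parameters. File and column names are hypothetical.
    import pandas as pd

    df = pd.read_csv("biogas_monitoring.csv")  # two samples/month, 26 months
    params = ["vfa_g_per_L", "ph", "nh4_g_per_L"]

    # Pearson correlation of each parameter with the biogas yield.
    print(df[params + ["biogas_yield"]].corr()["biogas_yield"])
    ```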

  15. Comparing centralised and decentralised anaerobic digestion of stillage from a large-scale bioethanol plant to animal feed production.

    Science.gov (United States)

    Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R

    2008-01-01

    A comparison of stillage treatment options for large-scale bioethanol plants was based on data from an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production, the state-of-the-art technology at the plant, was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate: for land application, a minimum of 36,000 ha of available agricultural area and 600,000 m³ of storage volume would be needed; secondly, membrane purification of the digestate, consisting of a decanter, microfiltration and reverse osmosis, was investigated; as a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome is an economic evaluation of the three stillage treatment options, intended as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.
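
    A back-of-envelope check of the land-application figures quoted above (approximating the digestate mass by the stillage mass, which the abstract does not state exactly):

    ```python
    # Rough check using only the figures quoted in the abstract.
    stillage_t_per_yr = 1_400_000  # stillage output of the bioethanol plant
    area_ha = 36_000               # minimum agricultural area stated
    storage_m3 = 600_000           # required digestate storage volume

    print(stillage_t_per_yr / area_ha)  # ~39 t of material per hectare per year
    print(storage_m3 / area_ha)         # ~17 m3 of storage per hectare
    ```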

  16. Optimizing in vitro large scale production of giant reed (Arundo donax L.) by liquid medium culture

    International Nuclear Information System (INIS)

    Cavallaro, Valeria; Patanè, Cristina; Cosentino, Salvatore L.; Di Silvestro, Isabella; Copani, Venera

    2014-01-01

    Tissue culture methods offer the potential for large-scale propagation of giant reed (Arundo donax L.), a promising energy biomass crop. In previous trials, giant reed proved particularly suitable for in vitro culture. In this paper, with the final goal of enhancing the efficiency of the in vitro production process and reducing costs, the influence of four different culture media (agar- or gellan-gum-solidified medium, and liquid medium either in a temporary immersion system (RITA®) or in a stationary state) on in vitro shoot proliferation of giant reed was evaluated. Giant reed exhibited a particular sensitivity to gelling agents during the phase of secondary shoot formation. Gellan gum, compared to agar, improved the efficiency of in vitro culture, giving more shoots with higher mean fresh and dry weight. Moreover, cultivation of this species in a liquid medium, under temporary immersion conditions or in a stationary state, was as effective as, and cheaper than, cultivation on a gellan-gum medium. Increasing 6-benzylaminopurine (BA) up to 4 mg l−1 also resulted in a further enhancement of secondary shoot proliferation. The good adaptability of this species to liquid medium and the high multiplication rates observed indicate the possibility of obtaining from a single node at least 1200 plantlets every six multiplication cycles (about 6 months), a number 100-fold higher than that obtained yearly per plant by conventional methods of vegetative multiplication. In the open field, micropropagated plantlets guaranteed a higher number of surviving plants, secondary stems and above-ground biomass compared to rhizome-derived ones. - Highlights: • In vitro propagation offers the potential for large-scale propagation of giant reed. • The success of an in vitro protocol depends on the rate and mode of shoot proliferation. • Substituting liquid media for solid ones may decrease propagation costs in Arundo donax. • Giant reed showed good proliferation rates in
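
    The quoted multiplication figure implies a per-cycle rate that is easy to back out; a simple geometric-growth reading of the abstract's numbers:

    ```python
    # 1200 plantlets from a single node after six multiplication cycles
    # implies the effective per-cycle multiplication rate below.
    plantlets, cycles = 1200, 6
    rate = plantlets ** (1 / cycles)
    print(f"effective multiplication rate per cycle: {rate:.2f}")  # ~3.26x
    ```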

  17. Large-Scale Reactive Atomistic Simulation of Shock-induced Initiation Processes in Energetic Materials

    Science.gov (United States)

    Thompson, Aidan

    2013-06-01

    Initiation in energetic materials is fundamentally dependent on the interaction between a host of complex chemical and mechanical processes, occurring on scales ranging from intramolecular vibrations through molecular crystal plasticity up to hydrodynamic phenomena at the mesoscale. A variety of methods (e.g. quantum electronic structure methods (QM), non-reactive classical molecular dynamics (MD), mesoscopic continuum mechanics) exist to study processes occurring on each of these scales in isolation, but cannot describe how these processes interact with each other. In contrast, the ReaxFF reactive force field, implemented in the LAMMPS parallel MD code, allows us to routinely perform multimillion-atom reactive MD simulations of shock-induced initiation in a variety of energetic materials. This is done either by explicitly driving a shock-wave through the structure (NEMD) or by imposing thermodynamic constraints on the collective dynamics of the simulation cell e.g. using the Multiscale Shock Technique (MSST). These MD simulations allow us to directly observe how energy is transferred from the shockwave into other processes, including intramolecular vibrational modes, plastic deformation of the crystal, and hydrodynamic jetting at interfaces. These processes in turn cause thermal excitation of chemical bonds leading to initial chemical reactions, and ultimately to exothermic formation of product species. Results will be presented on the application of this approach to several important energetic materials, including pentaerythritol tetranitrate (PETN) and ammonium nitrate/fuel oil (ANFO). In both cases, we validate the ReaxFF parameterizations against QM and experimental data. For PETN, we observe initiation occurring via different chemical pathways, depending on the shock direction. For PETN containing spherical voids, we observe enhanced sensitivity due to jetting, void collapse, and hotspot formation, with sensitivity increasing with void size. For ANFO, we
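
    As an illustration of how such simulations are typically driven, the sketch below uses LAMMPS's Python interface with a ReaxFF pair style and the MSST fix named in the abstract. It assumes a LAMMPS build exposing the Python interface and the (historical) reax/c package; the data file, force-field file, element mapping, and every numeric parameter are placeholders, not values from the work described:

    ```python
    # Illustrative only: all file names and numeric parameters are placeholders.
    from lammps import lammps

    lmp = lammps()
    for cmd in [
        "units real",
        "atom_style charge",
        "read_data crystal.data",               # hypothetical structure file
        "pair_style reax/c NULL",
        "pair_coeff * * ffield.reax C H O N",   # hypothetical ReaxFF file
        "fix qeq1 all qeq/reax 1 0.0 10.0 1.0e-6 reax/c",  # charge equilibration
        "timestep 0.1",
        # MSST: constrain cell dynamics to follow a shock along z;
        # 0.08 A/fs corresponds to an 8 km/s shock in 'real' units.
        "fix shock all msst z 0.08 q 25.0 mu 0.9",
        "run 100000",
    ]:
        lmp.command(cmd)
    ```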

  18. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach; first, we estimate the physical land suitability and the necessary inputs to reach certain amounts of yields. Subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we introduce also certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for the economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to be able to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments

  19. Open-Source Pipeline for Large-Scale Data Processing, Analysis and Collaboration, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's observational and modeled data products encompass petabytes of earth science data available for analysis, analytics, and exploitation. Unfortunately, these...

  20. Ocean Acidification Experiments in Large-Scale Mesocosms Reveal Similar Dynamics of Dissolved Organic Matter Production and Biotransformation

    Directory of Open Access Journals (Sweden)

    Maren Zark

    2017-09-01

    Full Text Available Dissolved organic matter (DOM) represents a major reservoir of carbon in the oceans. Environmental stressors such as ocean acidification (OA) potentially affect DOM production and degradation processes, e.g., phytoplankton exudation or microbial uptake and biotransformation of molecules. Resulting changes in the carbon storage capacity of the ocean may thus cause feedbacks on the global carbon cycle. Previous experiments studying OA effects on the DOM pool under natural conditions, however, were mostly conducted in temperate and coastal eutrophic areas. Here, we report on OA effects on the existing and newly produced DOM pool during an experiment in the subtropical North Atlantic Ocean at the Canary Islands during (1) an oligotrophic phase and (2) after simulated deep-water upwelling. The latter is a frequently occurring event in this region, controlling nutrient and phytoplankton dynamics. We manipulated nine large-scale mesocosms with a gradient of pCO2 ranging from ~350 up to ~1,030 μatm and monitored the DOM molecular composition using ultrahigh-resolution Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). An increase of 37 μmol L−1 DOC was observed in all mesocosms during a phytoplankton bloom induced by simulated upwelling. Indications of enhanced DOC accumulation under elevated CO2 became apparent during a phase of nutrient recycling toward the end of the experiment. The production of DOM was reflected in changes of the molecular DOM composition. Of the 7,212 molecular formulae detected throughout the experiment, ~50% correlated significantly in mass spectrometric signal intensity with cumulative bacterial protein production (BPP) and are likely a product of microbial transformation. However, no differences in the produced compounds were found with respect to CO2 levels. Comparing the results of this experiment with a comparable OA experiment in the Swedish Gullmar Fjord, reveals

  1. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A wide range of experimental tests, covering various temperatures and conditions of the atmosphere surrounding the fuel (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the ability of the analytical models to describe fission product release in severe accident conditions

  2. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facility capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full-physics data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled highly fault-tolerant computing in order to achieve large-scale processing as well as operational cost savings.
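
    A minimal sketch, assuming the boto3 AWS SDK, of bidding on the spot market as described above; the AMI, instance type, region, and price are hypothetical, and real pipelines such as the one described must additionally handle spot interruptions, e.g. by re-queuing work lost when the market reclaims a node:

    ```python
    # Illustrative spot-market request (hypothetical AMI, type, and bid).
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")
    resp = ec2.request_spot_instances(
        SpotPrice="0.10",              # maximum bid in USD/hour (hypothetical)
        InstanceCount=20,
        LaunchSpecification={
            "ImageId": "ami-12345678", # hypothetical worker image
            "InstanceType": "c4.xlarge",
        },
    )
    for req in resp["SpotInstanceRequests"]:
        print(req["SpotInstanceRequestId"], req["State"])
    ```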

  3. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    Science.gov (United States)

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to the individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study of 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  4. Large-Scale Urban Projects, Production of Space and Neo-liberal Hegemony: A Comparative Study of Izmir

    Directory of Open Access Journals (Sweden)

    Mehmet PENPECİOĞLU

    2013-04-01

    Full Text Available With the rise of neo-liberalism, large-scale urban projects (LDPs) have become a powerful mechanism of urban policy. Creating spaces of neo-liberal urbanization such as central business districts, tourism centers, gated residences and shopping malls, LDPs play a role not only in the reproduction of capital accumulation relations but also in shifting urban political priorities towards the construction of neo-liberal hegemony. The construction of neo-liberal hegemony, and the role played by LDPs in this process, cannot be investigated through the analysis of capital accumulation alone. Such an investigation must also examine the role of state and civil society actors in LDPs and their collaborative and conflictual relationships, and reveal their functions in hegemony. In the case of Izmir’s two LDPs, namely the New City Center (NCC) and Inciraltı Tourism Center (ITC) projects, this study analyzes the relationship between the production of space and neo-liberal hegemony. In the NCC project, local governments, investors, local capital organizations and professional chambers collaborated and disseminated a hegemonic discourse, which provided social support for the project. Through these relationships and discourses, the NCC project became a hegemonic project for producing space and constructed neo-liberal hegemony over urban political priorities. In contrast to the NCC project, the ITC project saw no collaboration between state and organized civil society actors. The social opposition against the ITC project, initiated by professional chambers, has brought legal action against the ITC development plans in order to prevent their implementation. As a result, the ITC project did not acquire the consent of organized social groups and failed to become a hegemonic project for producing space.

  5. Modeling of a Large-Scale High Temperature Regenerative Sulfur Removal Process

    DEFF Research Database (Denmark)

    Konttinen, Jukka T.; Johnsson, Jan Erik

    1999-01-01

    Regenerable mixed metal oxide sorbents are prime candidates for the removal of hydrogen sulfide from hot gasifier gas in the simplified integrated gasification combined cycle (IGCC) process. As part of the regenerative sulfur removal process development, reactor models are needed for scale-up: steady-state kinetic reactor models are needed for reactor sizing, and dynamic models can be used for process control design and operator training. The regenerative sulfur removal process studied in this paper consists of two side-by-side fluidized bed reactors operating at temperatures of 400... Pilot-scale test run results, obtained in test runs of the sulfur removal process with real coal gasifier gas, have been used for parameter estimation in a model that does not account for bed hydrodynamics. The validity of the reactor model for commercial-scale design applications is discussed.

  6. Third generation design solar cell module LSA task 5, large scale production

    Science.gov (United States)

    1980-01-01

    A total of twelve (12) preproduction modules were constructed, tested, and delivered. A new concept for the frame assembly was designed and proved to be quite reliable. This frame design, as well as the rest of the assembly, was designed with future high-volume production and the use of automated equipment in mind.

  7. Large-scale module production for the CMS silicon strip tracker

    CERN Document Server

    Cattai, A

    2005-01-01

    The Silicon Strip Tracker (SST) for the CMS experiment at LHC consists of 210 m² of silicon strip detectors grouped into four distinct sub-systems. We present a brief description of the CMS Tracker, the industrialised detector module production methods and the current status of the SST, with reference to some problems encountered at the factories and in the construction centres.

  8. Large-scale production of PWO scintillation elements for CMS ECAL

    International Nuclear Information System (INIS)

    Annenkov, A.; Auffray, E.; Drobychev, G.; Korzhik, M.; Kostylev, V.; Kovalev, O.; Lecoq, P.; Ligoun, V.; Missevitch, O.; Zouevski, R.

    2005-01-01

    JSC Bogoroditsk Technical Chemical Plant (BTCP) has to date produced more than 20,000 lead tungstate scintillation elements for the electromagnetic calorimeter of the CMS Collaboration. Here we report on the status of the crystal production and the results of the quality assurance program, which is performed by the Collaboration in cooperation with BTCP to keep crystal properties within specifications

  9. A roadmap for natural product discovery based on large-scale genomics and metabolomics

    Science.gov (United States)

    Actinobacteria encode a wealth of natural product biosynthetic gene clusters, whose systematic study is complicated by numerous repetitive motifs. By combining several metrics we developed a method for global classification of these gene clusters into families (GCFs) and analyzed the biosynthetic ca...

  10. Innovative Techniques for Large-Scale Collection, Processing, and Storage of Eelgrass (Zostera marina) Seeds

    National Research Council Canada - National Science Library

    Orth, Robert J; Marion, Scott R

    2007-01-01

    .... Although methods for hand-collecting, processing and storing eelgrass seeds have advanced to match the scale of collections, the number of seeds collected has limited the scale of restoration efforts...

  11. Large scale production of densified hydrogen to the triple point and below

    Science.gov (United States)

    Swanger, A. M.; Notardonato, W. U.; E Fesmire, J.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage technology at NASA Kennedy Space Center led to the production of large quantities of densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels and LH2 temperatures were measured by twenty silicon diode temperature sensors. Overall densification performance of the system is explored, and solid mass fractions are calculated. Experimental data reveal hydrogen temperatures dropped well below the triple point during testing, and were continuing to trend downward prior to system shutdown. Sub-triple point temperatures were seen to evolve in a time dependent manner along the length of the horizontal, cylindrical vessel. The phenomenon, observed at two fill levels, is detailed herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  12. Some Examples of Residence-Time Distribution Studies in Large-Scale Chemical Processes by Using Radiotracer Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R. M.; Johnson, P.; Whiston, J. [Imperial Chemical Industries Ltd., Billingham, Co., Durham (United Kingdom)

    1967-06-15

    The application of radiotracers to determine flow patterns in chemical processes is discussed, with particular reference to the derivation of design data from model reactors for translation to large-scale units, the study of operating efficiency and design attainment in established plant, and the rapid identification of various types of process malfunction. The requirements governing the selection of tracers for various types of media are considered, and an example is given of the testing of the behaviour of a typical tracer before use in a particular large-scale process operating at 250 atm and 200 °C. Information which may be derived from flow patterns is discussed, including the determination of mixing parameters, gas hold-up in gas/liquid reactions and the detection of channelling and stagnant regions. Practical results and their interpretation are given in relation to an olefin hydroformylation reaction system, a process for the conversion of propylene to isopropanol, a moving-bed catalyst system for the isomerization of xylenes and a three-stage gas-liquid reaction system. The use of mean residence-time data for the detection of leakage between reaction vessels and a heat interchanger system is given as an example of the identification of process malfunction. (author)
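
    A worked example, not from the paper, of the basic residence-time arithmetic behind such tracer tests; the times and counts are made up:

    ```python
    # Estimating the mean residence time from a measured tracer response
    # curve (hypothetical data, uniform one-minute sampling).
    import numpy as np

    t = np.arange(0.0, 9.0)                        # minutes
    c = np.array([0, 2, 9, 15, 12, 7, 4, 1, 0.0])  # tracer counts at the outlet

    t_mean = (t * c).sum() / c.sum()               # first moment of E(t)
    var = ((t - t_mean) ** 2 * c).sum() / c.sum()  # spread reflects mixing
    print(f"mean residence time ~ {t_mean:.2f} min, variance ~ {var:.2f} min^2")
    ```

    A shift in the measured mean residence time against the design value is exactly the kind of signature used above to detect leakage between vessels.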

  13. Study of the environmental impacts of large scale bioethanol production in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1991-01-01

    The report provides an analysis of the energy balance, the carbon dioxide balance, and other environmental effects. Four crops which might be used as bioethanol feedstock were considered. These were: wheat, sugar beet, sweet sorghum and Jerusalem artichoke. Given the current agricultural capabilities in Europe, wheat and sugar beet could be cultivated immediately for bioethanol production whilst sweet sorghum and Jerusalem artichoke represent crops which are under investigation as potential bioethanol feedstock in the longer term. (author).

  14. Study of the environmental impacts of large scale bioethanol production in Europe

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    The report provides an analysis of the energy balance, the carbon dioxide balance, and other environmental effects. Four crops which might be used as bioethanol feedstock were considered. These were: wheat, sugar beet, sweet sorghum and Jerusalem artichoke. Given the current agricultural capabilities in Europe, wheat and sugar beet could be cultivated immediately for bioethanol production whilst sweet sorghum and Jerusalem artichoke represent crops which are under investigation as potential bioethanol feedstock in the longer term. (author)

  15. Integrating large-scale functional genomics data to dissect metabolic networks for hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Harwood, Caroline S

    2012-12-17

    The goal of this project is to identify gene networks that are critical for efficient biohydrogen production by leveraging variation in gene content and gene expression in independently isolated Rhodopseudomonas palustris strains. Coexpression methods were applied to large data sets that we have collected in order to define probabilistic causal gene networks. To our knowledge, this is the first systems-level approach that takes advantage of strain-to-strain variability to computationally define networks critical for a particular bacterial phenotypic trait.

  16. Large-scale Roll-to-Roll Fabrication of Organic Solar Cells for Energy Production

    DEFF Research Database (Denmark)

    Hösel, Markus

    Organic solar cells can be produced cheaply and very fast from solution with printing processes. Current research around the world is still focused on lab-scale devices (≪ cm²), ITO-glass substrates, and spin coating as the main fabrication method. These OPV devices are far from any practical application, although record... This work instead used flexible substrates and ITO-free transparent conductive electrodes made from specially designed flexo-printed silver grids, rotary screen-printed PEDOT:PSS, and slot-die coated ZnO (= Flextrode). The organic solar cell was fabricated by slot-die coating a light-absorbing photoactive layer (e.g. P3HT:PCBM) on top... A further challenge was the intelligent connection of single cells, which should involve as few manual processes, such as wiring or soldering, as possible. The problem was solved by serially connecting thousands of single cells entirely during the R2R processing by printing thin-film silver conductors. High-voltage networks require only...

  17. New method for large scale production of medically applicable Actinium-225 and Radium-223

    International Nuclear Information System (INIS)

    Aliev, R.A.; Vasilyev, A.N.; Ostapenko, V.; Kalmykov, S.N.; Zhuikov, B.L.; Ermolaev, S.V.; Lapshina, E.V.

    2014-01-01

    Alpha-emitters (211At, 212Bi, 213Bi, 223Ra, 225Ac) are promising for targeted radiotherapy of cancer. Only two alpha decays near a cell membrane result in 50% death of cancer cells, and only a single decay inside the cell is required for this. 225Ac may be used either directly or as a mother radionuclide in a 213Bi isotope generator. Production of 225Ac is provided by three main suppliers: the Institute for Transuranium Elements in Germany, Oak Ridge National Laboratory in the USA and the Institute of Physics and Power Engineering in Obninsk, Russia. The current worldwide production of 225Ac is approximately 1.7 Ci per year, which corresponds to only 100-200 patients that could be treated annually. The common approach to 225Ac production is separation from mother 229Th or irradiation of 226Ra with protons in a cyclotron. Both methods have practical limitations for routine application. 225Ac can also be produced by irradiation of natural thorium with medium-energy protons. Cumulative cross sections for the formation of 225Ac, 227Ac, 227Th and 228Th have been obtained recently. Thorium targets (1-9 g) were irradiated by a 114-91 MeV proton beam (1-50 μA) at the INR linear accelerator. After dissolution in 8 M HNO3 + 0.004 M HF, thorium was removed by double liquid-liquid extraction (LLX) with HDEHP in toluene (1:1). Ac and REE were pre-concentrated and separated from Ra and most fission products on DGA Resin (Triskem). After washing out with 0.01 M HNO3, Ac was separated from REE on TRU Resin (Triskem) in 3 M HNO3 media. About 6 mCi of 225Ac were separated in a hot cell with a chemical yield of 85%. The method may be upscaled for production of Ci amounts of the radionuclide. The main impurity is 227Ac (0.1% at the EOB), but it does not hinder 225Ac from being used for medical 225Ac/213Bi generators. (author)

  18. A Recommender System for an IPTV Service Provider: a Real Large-Scale Production Environment

    Science.gov (United States)

    Bambini, Riccardo; Cremonesi, Paolo; Turrin, Roberto

    In this chapter we describe the integration of a recommender system into the production environment of Fastweb, one of the largest European IP Television (IPTV) providers. The recommender system implements both collaborative and content-based techniques, suitably tailored to the specific requirements of an IPTV architecture, such as the limited screen definition, the reduced navigation capabilities, and the strict time constraints. The algorithms are extensively analyzed by means of off-line and on-line tests, showing the effectiveness of the recommender system: up to 30% of the recommendations are followed by a purchase, with an estimated lift factor (increase in sales) of 15%.
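
    A toy rendition of the item-based collaborative-filtering idea mentioned above (this is not Fastweb's algorithm; the viewing matrix is hypothetical):

    ```python
    # Item-based collaborative filtering on a tiny hypothetical matrix:
    # recommend programmes similar to those a user has already watched.
    import numpy as np

    # Rows = users, columns = programmes; 1 = watched.
    views = np.array([[1, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 1, 1]], dtype=float)

    # Cosine similarity between programme columns.
    norms = np.linalg.norm(views, axis=0)
    sim = (views.T @ views) / np.outer(norms, norms)
    np.fill_diagonal(sim, 0)

    user = views[0]                    # user 0 watched programmes 0 and 1
    scores = sim @ user                # score candidate programmes
    scores[user > 0] = -np.inf         # exclude already-watched items
    print("recommend programme", int(np.argmax(scores)))
    ```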

  19. Breakthrough In Current In Plane Metrology For Monitoring Large Scale MRAM Production

    DEFF Research Database (Denmark)

    Cagliani, Alberto; Østerberg, Frederik Westergaard; Hansen, Ole

    2017-01-01

    The current-in-plane tunneling (CIPT) technique has been a crucial tool in the development of magnetic tunnel junction stacks suitable for Magnetic Random Access Memories (MRAM) for more than a decade. MRAM development has now reached the maturity to make the transition from R&D to large... of the Resistance Area product (RA) and the Tunnel Magnetoresistance (TMR) measurements, compared to state-of-the-art CIPT metrology tools dedicated to R&D. On two test wafers, the repeatability of RA and TMR was improved by up to 350% and the measurement reproducibility by up to 1700%. We believe that CIPT metrology now...

  20. Large-scale bioreactor production of the herbicide-degrading Aminobacter sp. strain MSH1

    DEFF Research Database (Denmark)

    Schultz-Jensen, Nadja; Knudsen, Berith Elkær; Frkova, Zuzana

    2014-01-01

    The Aminobacter sp. strain MSH1 has potential for pesticide bioremediation because it degrades the herbicide metabolite 2,6-dichlorobenzamide (BAM). Production of the BAM-degrading bacterium using aerobic bioreactor fermentation was investigated. A mineral salt medium limited for carbon and with an element composition similar to that of the strain was generated. The optimal pH and temperature for strain growth were determined using shaker flasks and verified in bioreactors. Glucose, fructose, and glycerol were suitable carbon sources for MSH1 (μ = 0.1 h−1); slower growth was observed on succinate and acetic acid (μ = 0.01 h−1). Standard conditions for growth of the MSH1 strain were defined at pH 7 and 25 °C, with glucose as the carbon source. In bioreactors (1 and 5 L), the specific growth rate of MSH1 increased from μ = 0.1 h−1 on traditional mineral salt medium to μ = 0.18 h−1 on the optimized mineral salt...

  1. Experience with LHC Magnets from Prototyping to Large Scale Industrial Production and Integration

    CERN Multimedia

    Rossi, L

    2004-01-01

    The construction of the LHC superconducting magnets is approaching the halfway point. At the end of 2003, main dipole cold masses for more than one octant had been delivered; meanwhile, the winding for the second octant was almost completed. The other large magnets, like the main quadrupoles and the insertion quadrupoles, have entered series production as well. Providing more than 20 km of superconducting magnets, with the quality required for an accelerator like the LHC, is an unprecedented challenge in terms of complexity that has required many steps, from the construction of 1 meter-long magnets in the laboratory to today’s production of more than one 15 meter-long magnet per day in industry. The work and its organization are made even more complex by the fact that CERN supplies most of the critical components and part of the main tooling to the magnet manufacturers, both for cost reduction and for quality issues. In this paper the critical aspects of the construction will be reviewed and the actual ...

  2. Large Scale Production of Densified Hydrogen Using Integrated Refrigeration and Storage

    Science.gov (United States)

    Notardonato, William U.; Swanger, Adam Michael; Jumper, Kevin M.; Fesmire, James E.; Tomsik, Thomas M.; Johnson, Wesley L.

    2017-01-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage (IRAS) technology at NASA Kennedy Space Center led to the production of large quantities of densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels, and LH2 temperatures were measured by twenty silicon diode temperature sensors. System energy balances and solid mass fractions are calculated. Experimental data reveal that hydrogen temperatures dropped well below the triple point during testing (up to 1 K below), and were continuing to trend downward prior to system shutdown. Sub-triple-point temperatures were seen to evolve in a time-dependent manner along the length of the horizontal, cylindrical vessel. The twenty temperature sensors were recorded over approximately one month of testing at two different fill levels (33% and 67%). The phenomenon, observed at both fill levels, is described and explained in detail herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  3. Bioprocessing strategies for the large-scale production of human mesenchymal stem cells: a review.

    Science.gov (United States)

    Panchalingam, Krishna M; Jung, Sunghoon; Rosenberg, Lawrence; Behie, Leo A

    2015-11-23

    Human mesenchymal stem cells (hMSCs), also called mesenchymal stromal cells, have been of great interest in regenerative medicine applications because of not only their differentiation potential but also their ability to secrete bioactive factors that can modulate the immune system and promote tissue repair. This potential has initiated many early-phase clinical studies for the treatment of various diseases, disorders, and injuries by using either hMSCs themselves or their secreted products. Currently, hMSCs for clinical use are generated through conventional static adherent cultures in the presence of fetal bovine serum or human-sourced supplements. However, these methods suffer from variable culture conditions (i.e., ill-defined medium components and heterogeneous culture environment) and thus are not ideal procedures to meet the expected future demand of quality-assured hMSCs for human therapeutic use. Optimizing a bioprocess to generate hMSCs or their secreted products (or both) promises to improve the efficacy as well as safety of this stem cell therapy. In this review, current media and methods for hMSC culture are outlined and bioprocess development strategies discussed.

  4. Large-scale production of tannase using the yeast Arxula adeninivorans.

    Science.gov (United States)

    Böer, Erik; Breuer, Friederike Sophie; Weniger, Michael; Denter, Sylvia; Piontek, Michael; Kunze, Gotthard

    2011-10-01

    Tannase (tannin acyl hydrolase, EC 3.1.1.20) hydrolyses the ester and depside bonds of gallotannins and gallic acid esters and is an important industrial enzyme. In the present study, transgenic Arxula adeninivorans strains were optimised for tannase production. Various plasmids carrying one or two expression modules for constitutive expression of tannase were constructed. Transformant strains that overexpress the ATAN1 gene from the strong A. adeninivorans TEF1 promoter produce levels of up to 1,642 U L−1 when grown in glucose medium in shake flasks. The effect of fed-batch fermentation on tannase productivity was then investigated in detail. Under these conditions, a transgenic strain containing one ATAN1 expression module produced 51,900 U of tannase activity per litre after 142 h of fermentation at a dry cell weight of 162 g L−1. The highest yield obtained from a transgenic strain with two ATAN1 expression modules was 31,300 U after 232 h at a dry cell weight of 104 g L−1. Interestingly, the maximum achieved yield coefficients (YP/X) for the two strains were essentially identical.
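
    Two derived figures follow directly from the fermentation numbers quoted above:

    ```python
    # Quick derived figures from the quoted fed-batch fermentation results.
    units_per_L, hours, dcw_g_per_L = 51_900, 142, 162

    print(units_per_L / hours)        # volumetric productivity ~365 U L-1 h-1
    print(units_per_L / dcw_g_per_L)  # specific yield Y(P/X) ~320 U per g DCW
    ```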

  5. Large-scale production of paper-based Li-ion cells

    CERN Document Server

    Zolin, Lorenzo

    2017-01-01

    This book describes in detail the use of natural cellulose fibers for the production of innovative, low-cost, and easily recyclable lithium-ion (Li-ion) cells by means of fast and reliable papermaking procedures that employ water as a solvent. In addition, it proposes specific methods to optimize the safety features of these paper-based cells and to improve the electronic conductivity of the electrodes by means of a carbonization process, an interesting novel technology that enables higher current-rate capabilities to be achieved. The in-depth descriptions of materials, methods, and techniques are complemented by a general overview of electrochemical devices and, in particular, of different Li-ion battery configurations. Presenting the outcomes of this important research, the work is of wide interest to electrochemical engineers in both research institutions and industry.

  6. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  7. Efficient large-scale protein production of larvae and pupae of silkworm by Bombyx mori nuclear polyhedrosis virus bacmid system

    International Nuclear Information System (INIS)

    Motohashi, Tomoko; Shimojima, Tsukasa; Fukagawa, Tatsuo; Maenaka, Katsumi; Park, Enoch Y.

    2005-01-01

    Silkworm is one of the most attractive hosts for large-scale production of eukaryotic proteins, as well as of recombinant baculoviruses for gene transfer to mammalian cells. The bacmid system of Autographa californica nuclear polyhedrosis virus (AcNPV) has already been established and is widely used. However, AcNPV cannot infect silkworm. We developed the first practical Bombyx mori nuclear polyhedrosis virus bacmid system directly applicable to protein expression in silkworm. Using this system, green fluorescent protein was successfully expressed in silkworm larvae and pupae, not only by infection with its recombinant virus but also by direct injection of its bacmid DNA. This method provides rapid protein production in silkworm in as little as 10 days, is free from biohazard, and thus will be a powerful tool for the future production of recombinant eukaryotic proteins and baculoviruses

  8. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has created the need to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation may seem straightforward to exploit, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out downstream. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources due to limited availability. In order to address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
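
    The first challenge, propagating a human correction through all dependent reprocessing, amounts to invalidating everything downstream in the task graph; a minimal sketch with a hypothetical pipeline:

    ```python
    # Minimal sketch: find every task affected by a human correction so that
    # only those are re-run. The pipeline graph here is hypothetical.
    from collections import defaultdict, deque

    downstream = defaultdict(list)  # edge: task -> tasks consuming its output
    for src, dst in [("ingest", "extract"), ("extract", "classify"),
                     ("extract", "index"), ("classify", "report")]:
        downstream[src].append(dst)

    def tasks_to_rerun(corrected):
        """Breadth-first walk collecting everything downstream of a fix."""
        dirty, queue = set(), deque([corrected])
        while queue:
            task = queue.popleft()
            for nxt in downstream[task]:
                if nxt not in dirty:
                    dirty.add(nxt)
                    queue.append(nxt)
        return dirty

    print(tasks_to_rerun("extract"))  # {'classify', 'index', 'report'}
    ```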

  9. Combining Vertex-centric Graph Processing with SPARQL for Large-scale RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2017-06-27

    Modern applications, such as drug repositioning, require sophisticated analytics on RDF graphs that combine structural queries with generic graph computations. Existing systems support either declarative SPARQL queries or generic graph processing, but not both. We bridge the gap by introducing Spartex, a versatile framework for complex RDF analytics. Spartex extends SPARQL to support programs that seamlessly combine generic graph algorithms (e.g., PageRank, shortest paths) with SPARQL queries. Spartex builds on existing vertex-centric graph processing frameworks, such as GraphLab or Pregel. It implements a generic SPARQL operator as a vertex-centric program that interprets SPARQL queries and executes them efficiently using a built-in optimizer. In addition, any graph algorithm implemented in the underlying vertex-centric framework can be executed in Spartex. We present various scenarios where our framework significantly simplifies the implementation of complex RDF data analytics programs. We demonstrate that Spartex scales to datasets with billions of edges, and show that our core SPARQL engine is at least as fast as state-of-the-art specialized RDF engines. For complex analytical tasks that combine generic graph processing with SPARQL, Spartex is at least an order of magnitude faster than existing alternatives.
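
    A toy, single-process rendition of the vertex-centric model that Spartex builds on (not Spartex itself), computing PageRank on a three-vertex graph:

    ```python
    # Vertex-centric iteration in miniature: each superstep, every vertex
    # sends rank/out-degree to its neighbours, then updates its own rank.
    edges = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    rank = {v: 1.0 / len(edges) for v in edges}

    for _ in range(30):                   # supersteps
        msgs = {v: 0.0 for v in edges}
        for v, outs in edges.items():
            for w in outs:
                msgs[w] += rank[v] / len(outs)
        rank = {v: 0.15 / len(edges) + 0.85 * msgs[v] for v in edges}

    print(rank)
    ```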

  10. Large-scale environmental effects and ecological processes in the Baltic Sea

    International Nuclear Information System (INIS)

    Wulff, F.

    1990-11-01

    A Swedish research programme concerning the Baltic Sea has been initiated by the SNV to produce budgets and models of eutrophying substances (nitrogen, phosphorus, silicate, some organic substances) and toxic substances (PCB, lindane and PAH). A description of the distribution and turnover of these substances, including their transformation, will be necessary in the evaluation of the critical processes controlling concentrations in relation to external load. A geographical information system will be made available as a database and analytical tool for all participants (BED, Baltic Ecosystem Data). This project is designed around cooperation between the Baltic Sea countries. (au)

  11. Large-Scale, Continuous-Flow Production of Stressed Biomass (Desulfovibrio vulgaris Hildenborough)

    Energy Technology Data Exchange (ETDEWEB)

    Geller, Jil T.; Borglin, Sharon E.; Fortney, Julian L.; Lam, Bonita R.; Hazen, Terry C.; Biggin, Mark D.

    2010-05-01

    The Protein Complex Analysis Project (PCAP, http://pcap.lbl.gov/) focuses on high-throughput analysis of microbial protein complexes in the anaerobic, sulfate-reducing organism Desulfovibrio vulgaris Hildenborough (DvH). Interest in DvH as a model organism for bioremediation of contaminated groundwater sites arises from its ability to reduce heavy metals. D. vulgaris has been isolated from contaminated groundwater at sites in the DOE complex. To understand the effect of environmental changes on the organism, mid-log-phase cultures are exposed to nitrate and salt stresses (at the minimum inhibitory concentration, which reduces growth rates by 50%) and compared to control cultures at mid-log and stationary phases. Large volumes of culture of consistent quality (up to 100 liters) are needed because of the relatively low cell density of DvH cultures (one order of magnitude lower than E. coli, for example) and PCAP's challenge to characterize low-abundance membrane proteins. Cultures are grown in continuous-flow stirred tank reactors (CFSTRs) to produce consistent cell densities. Stressor is added to the outflow from the CFSTR, and the mixture is pumped through a plug flow reactor (PFR) to provide a stress exposure time of 2 hours. Effluent is chilled and held in large carboys until it is centrifuged. A variety of analyses, including metabolites, total proteins, cell density and phospholipid fatty acids, track culture consistency within a production run, and differences due to stress exposure and growth phase for the different conditions used. With our system we are able to produce the requisite 100 L of culture for a given condition within a week.

  12. Cost-effective large-scale fabrication of diffractive optical elements by using conventional semiconducting processes.

    Science.gov (United States)

    Yoo, Seunghwan; Song, Ho Young; Lee, Junghoon; Jang, Cheol-Yong; Jeong, Hakgeun

    2012-11-20

    In this article, we introduce a simple fabrication method for SiO2-based thin diffractive optical elements (DOEs) that uses conventional processes widely employed in the semiconductor industry. Photolithography and inductively coupled plasma etching are easy and cost-effective methods for fabricating subnanometer-scale, thin DOEs with a refractive index of 1.45, based on SiO2. After fabricating the DOEs, we confirmed the shape of the output light emitted from a laser diode light source and applied the elements to a light-emitting diode (LED) module. The results represent a new approach to mass-producing DOEs and realizing a high-brightness LED module.

  13. On the self-organizing process of large scale shear flows

    Energy Technology Data Exchange (ETDEWEB)

    Newton, Andrew P. L. [Department of Applied Maths, University of Sheffield, Sheffield, Yorkshire S3 7RH (United Kingdom); Kim, Eun-jin [School of Mathematics and Statistics, University of Sheffield, Sheffield, Yorkshire S3 7RH (United Kingdom); Liu, Han-Li [High Altitude Observatory, National Centre for Atmospheric Research, P. O. BOX 3000, Boulder, Colorado 80303-3000 (United States)

    2013-09-15

    Self-organization is invoked as a paradigm to explore the processes governing the evolution of shear flows. By examining the probability density function (PDF) of the local flow gradient (shear), we show that shear flows reach a quasi-equilibrium state as growth of shear is balanced by shear relaxation. Specifically, the PDFs of the local shear are calculated numerically and analytically in reduced 1D and 0D models, where the PDFs are shown to converge to a bimodal distribution in the case of finite temporally correlated forcing. This bimodal PDF is then shown to be reproduced in nonlinear simulations of 2D hydrodynamic turbulence. Furthermore, the bimodal PDF is demonstrated to result from a self-organizing shear flow with a linear profile. A similar bimodal structure and linear profile of the shear flow are observed in the Gulf Stream, suggesting self-organization.

  14. Formation and fate of marine snow: small-scale processes with large- scale implications

    Directory of Open Access Journals (Sweden)

    Thomas Kiørboe

    2001-12-01

    Full Text Available Marine snow aggregates are believed to be the main vehicles for vertical material transport in the ocean. However, aggregates are also sites of elevated heterotrophic activity, which may rather cause enhanced retention of aggregated material in the upper ocean. Small-scale biological-physical interactions govern the formation and fate of marine snow. Aggregates may form by physical coagulation: fluid motion causes collisions between small primary particles (e.g. phytoplankton) that may then stick together to form aggregates with enhanced sinking velocities. Bacteria may subsequently solubilise and remineralise aggregated particles. Because the solubilization rate exceeds the remineralization rate, organic solutes leak out of sinking aggregates. The leaking solutes spread by diffusion and advection and form a chemical trail in the wake of the sinking aggregate that may guide small zooplankters to the aggregate. Also, suspended bacteria may enjoy the elevated concentration of organic solutes in the plume. I explore these small-scale formation and degradation processes by means of models, experiments and field observations. The larger scale implications for the structure and functioning of pelagic food chains of export vs. retention of material will be discussed.

  15. Deep-inelastic processes: a workbench for large scale motion in nuclear matter

    International Nuclear Information System (INIS)

    Moretto, L.G.; Schmitt, R.P.

    1978-07-01

    The most prominent collective modes excited in deep-inelastic reactions are reviewed, and the natural hierarchy provided by their characteristic relaxation times is described. A model is presented which treats the relaxation of the mass asymmetry mode in terms of a diffusion process. Charge distributions and angular distributions as a function of Z calculated with this model are in good agreement with experimental data. An extension of this diffusion model which treats the transfer of energy and angular momentum in terms of particle transfer is described, and is successfully compared with experimental γ-ray multiplicities as a function of both Q-value and mass asymmetry. The problem of angular momentum transfer is again considered in connection with the sequential fission of heavy, deep-inelastic fragments and the excitation of collective modes in the exit channel is suggested. Lastly, the role of the giant E1 mode in the equilibration of the neutron-to-proton ratio is discussed. 14 figures, 39 references

  16. A High Density Low Cost Digital Signal Processing Module for Large Scale Radiation Detectors

    International Nuclear Information System (INIS)

    Tan, Hui; Hennig, Wolfgang; Walby, Mark D.; Breus, Dimitry; Harris, Jackson T.; Grudberg, Peter M.; Warburton, William K.

    2013-06-01

    A 32-channel digital spectrometer, PIXIE-32, is being developed for nuclear physics and other radiation detection applications requiring digital signal processing with a large number of channels at relatively low cost. A single PIXIE-32 provides spectrometry and waveform acquisition for 32 input signals per module, and multiple modules can be combined into larger systems. It is based on the PCI Express standard, which allows data transfer rates to the host computer of up to 800 MB/s. Each of the 32 channels in a PIXIE-32 module accepts signals directly from a detector preamplifier or photomultiplier. Digitally controlled offsets can be individually adjusted for each channel. Signals are digitized in 12-bit, 50 MHz multi-channel ADCs. Triggering, pile-up inspection and filtering of the data stream are performed in real time, and pulse heights and other event data are calculated on an event-by-event basis. The hardware architecture, internal and external triggering features, and the spectrometry and waveform acquisition capabilities of the PIXIE-32, as well as its ability to distribute clocks and triggers among multiple modules, are presented. (authors)
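
    As a generic illustration of the kind of real-time pulse processing such a module performs (a textbook trapezoidal shaping filter, not the actual PIXIE-32 firmware), pulse heights can be extracted from a digitized preamplifier signal as follows:

        import numpy as np

        def trapezoidal_filter(x, L=40, G=10):
            """Difference of two length-L moving sums separated by a gap G."""
            c = np.concatenate(([0.0], np.cumsum(x)))
            out = np.empty(len(x) - (2 * L + G))
            for i in range(len(out)):
                early = c[i + L] - c[i]                  # pre-edge window
                late = c[i + 2 * L + G] - c[i + L + G]   # post-edge window
                out[i] = (late - early) / L
            return out

        # synthetic preamplifier pulse: a step at sample 200 with a slow
        # exponential decay, as sampled by a 50 MHz ADC like the one above
        t = np.arange(1000)
        pulse = np.where(t >= 200, np.exp(-(t - 200) / 5000.0), 0.0)
        shaped = trapezoidal_filter(pulse)
        print(shaped.max())    # flat-top amplitude ~ the pulse height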

  17. NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.

    Science.gov (United States)

    Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S

    2016-01-14

    Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.
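
    The greedy term-to-concept idea can be illustrated with a minimal longest-match-first dictionary tagger (the vocabulary below is a hypothetical three-entry example; NOBLE Coder's actual algorithm and its configurable matching options are considerably richer):

        # hypothetical term -> concept ID map
        vocabulary = {
            "heart attack": "C0027051",
            "heart": "C0018787",
            "attack": "C4086268",
        }

        def greedy_match(text, vocab, max_len=5):
            tokens = text.lower().split()
            i, mentions = 0, []
            while i < len(tokens):
                match = None
                # try the longest candidate phrase first (greedy)
                for j in range(min(len(tokens), i + max_len), i, -1):
                    phrase = " ".join(tokens[i:j])
                    if phrase in vocab:
                        match = (phrase, vocab[phrase], i, j)
                        break
                if match:
                    mentions.append(match)
                    i = match[3]        # resume after the matched span
                else:
                    i += 1
            return mentions

        print(greedy_match("patient had a heart attack last night", vocabulary))
        # -> [('heart attack', 'C0027051', 3, 5)]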

  18. Large-scale production of bioenergy by the side of fuel-peat; Bioenergian suurtuotanto polttoturpeen rinnalla

    Energy Technology Data Exchange (ETDEWEB)

    Heikkilae, K. [Vapo Oy, Jyvaeskylae (Finland)]

    1996-12-31

    The objective of the project was to clarify the possibilities for large-scale production of bioenergy and the structure of its costs, and to develop operating practices in which smaller volumes of biomass are integrated into prevailing peat production and delivery, with peat ensuring the quality, price and reliability of the fuel supply. In this way the same organisation, machinery and volumes can be utilized. Operations will be designed to run all year round so that profitability can be improved. Another aim is to bring currently unutilized wood wastes into use, which would also serve silvicultural purposes. Usable municipal and other wastes and sludges could be blended with the biomass to produce, with proper mixing ratios, biofuels precisely suited to the purposes of the customer. In grain-growing areas it is possible to utilize straw, and at the seaside, reed grass.

  19. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    Science.gov (United States)

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be ≈50% α-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices. PMID:19581598

  1. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Leveraging an advanced hybrid-cloud computing science data system for large-scale processing, machine learning approaches were added for automated analysis of various quality metrics. Machine learning-based user training of features, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics for improving the production quality of geodetic data products.

  2. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50% by 2010 and 100% by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only as far as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually, when no specific cost limit is applied. This corresponds to 2-3 Mtoe, or 6-9% of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create the techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good chip quality. The two guiding principles of the programme are: (1) close cooperation between researchers and practitioners and (2) applying research and development to practical applications and commercialization. As of November

  3. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  4. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  5. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    Science.gov (United States)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems currently operate at national and international scales. Most are based on methods derived from pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to the others in order to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels. An overall diagram summarized the diverse relationships between satellite EO and agriculture information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, the vegetation indices and the biophysical variables

  6. Formation of Large-scale Coronal Loops Interconnecting Two Active Regions through Gradual Magnetic Reconnection and an Associated Heating Process

    Science.gov (United States)

    Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin

    2018-06-01

    Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and therefore are considered to be an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″–200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona still remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, with successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXRs and EUVs through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.

  7. Inverse problem to constrain the controlling parameters of large-scale heat transport processes: The Tiberias Basin example

    Science.gov (United States)

    Goretzki, Nora; Inbar, Nimrod; Siebert, Christian; Möller, Peter; Rosenthal, Eliyahu; Schneider, Michael; Magri, Fabien

    2015-04-01

    Salty and thermal springs exist along the lakeshore of the Sea of Galilee, which covers most of the Tiberias Basin (TB) in the northern Jordan-Dead Sea Transform, Israel/Jordan. As it is the only freshwater reservoir of the entire area, it is important to study the salinisation processes that pollute the lake. Simulations of thermohaline flow along a 35 km NW-SE profile show that meteoric and relic brines are flushed by the regional flow from the surrounding heights and by thermally induced groundwater flow within the faults (Magri et al., 2015). Several trial-and-error model runs were necessary to calibrate the hydraulic conductivity of both the faults and the major aquifers in order to fit temperature logs and spring salinity. It turned out that the hydraulic conductivity of the faults ranges between 30 and 140 m/yr, whereas the hydraulic conductivity of the Upper Cenomanian aquifer is as high as 200 m/yr. However, large-scale transport processes also depend on other physical parameters, such as thermal conductivity, porosity and the fluid thermal expansion coefficient, which are hardly known. Here, inverse problems (IP) are solved along the NW-SE profile to better constrain the physical parameters (a) hydraulic conductivity, (b) thermal conductivity and (c) thermal expansion coefficient. The PEST code (Doherty, 2010) is applied via the graphical interface FePEST in FEFLOW (Diersch, 2014). The results show that both the thermal and the hydraulic conductivity are consistent with the values determined by the trial-and-error calibrations. Besides being an automatic approach that speeds up the calibration process, the IP covers a wide range of parameter values, providing additional solutions not found with the trial-and-error method. Our study shows that geothermal systems like the TB are more comprehensively understood when inverse models are applied to constrain coupled fluid flow processes over large spatial scales. References Diersch, H.-J.G., 2014. FEFLOW Finite
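
    The inverse-problem workflow sketched above (adjust parameters until a forward model reproduces observed logs) can be miniaturized as follows; the forward model is a deliberately simple hypothetical stand-in for the coupled FEFLOW thermohaline simulation, and only the rough conductivity magnitudes echo the record:

        import numpy as np
        from scipy.optimize import least_squares

        depths = np.linspace(0.0, 1000.0, 25)              # m

        def forward(params, z):
            k_fault, k_aquifer = params                    # m/yr
            # hypothetical response kernels with distinct depth shapes,
            # so that both parameters are identifiable from one profile
            return (15.0 + 0.02 * z
                    + 0.04 * k_fault * np.exp(-z / 400.0)
                    + 0.01 * k_aquifer * np.sin(z / 300.0))

        true = np.array([100.0, 200.0])
        rng = np.random.default_rng(1)
        obs = forward(true, depths) + rng.normal(0.0, 0.3, depths.size)

        fit = least_squares(lambda p: forward(p, depths) - obs,
                            x0=[50.0, 100.0],
                            bounds=([1.0, 1.0], [500.0, 500.0]))
        print(fit.x)   # recovers values close to the "true" conductivities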

  8. Survey of high-voltage pulse technology suitable for large-scale plasma source ion implantation processes

    International Nuclear Information System (INIS)

    Reass, W.A.

    1994-01-01

    Many new plasma process ideas are finding their way from the research lab to the manufacturing plant floor. These require high voltage (HV) pulse power equipment, which must be optimized for the application, system efficiency, and reliability. Although no single HV pulse technology is suitable for all plasma processes, various classes of high voltage pulsers may offer greater versatility and economy to the manufacturer. Technology developed for existing radar and particle accelerator modulator power systems can be utilized to develop a modern large-scale plasma source ion implantation (PSII) system. The HV pulse networks can be broadly divided into two classes of systems: those that generate the voltage directly, and those that use some type of pulse forming network and step-up transformer. This article examines these HV pulse technologies and discusses their applicability to the specific PSII process. Typical systems reviewed include high power solid state, hard tube systems such as crossed-field 'hollow beam' switch tubes and planar tetrodes, and 'soft' tube systems with crossatrons and thyratrons. Results are tabulated and suggestions provided for a particular PSII process

  9. Large scale hydrogen production from wind energy in the Magallanes area for consumption in the central zone of Chile

    International Nuclear Information System (INIS)

    Zolezzi, J.M.; Garay, A.; Reveco, M.

    2010-01-01

    The energy proposal of this research suggests the use of places with abundant wind resources for the production of H2 on a large scale, to be transported and used in the central zone of Chile with the purpose of diversifying the country's energy matrix in order to decrease its dependence on fossil fuels, increase its autonomy, and cover future increases in energy demand. This research showed that the load factor of the proposed wind park reaches 54.5%, putting in evidence the excellent wind conditions of the zone. This implies that the electricity produced by a wind park located in the Chilean Patagonia would have a cost of 0.0213 US$/kWh in the year 2030. The low price of the electricity obtained from the park, thanks to the economy of scale and the huge wind potential, represents a very attractive scenario for the production of H2 in the future. The study concludes that by the year 2030 the cost of the H2 generated in Magallanes and transported to the port of Quinteros would be 18.36 US$/MBTU, while by that time the cost of oil would be about 17.241 US$/MBTU, a situation that places H2 in a very competitive position as a fuel. (author)
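
    A back-of-the-envelope check of the competitiveness claim (only the electricity price and the two delivered fuel prices come from the record; the electrolyzer consumption and the unit conversion are common rule-of-thumb assumptions):

        # assumed electrolyzer consumption and H2 energy content; only the
        # three prices below are taken from the record itself
        elec_price = 0.0213       # US$/kWh, Magallanes wind power by 2030
        kwh_per_kg_h2 = 50.0      # assumed electrolyzer demand, kWh/kg H2
        mbtu_per_kg_h2 = 0.1345   # HHV of H2: ~141.9 MJ/kg / 1055 MJ/MBTU

        electricity_cost = elec_price * kwh_per_kg_h2 / mbtu_per_kg_h2
        print(f"electricity share of H2 cost: {electricity_cost:.2f} US$/MBTU")

        h2_delivered, oil = 18.36, 17.241     # US$/MBTU from the record
        print(f"H2 premium over oil: {100 * (h2_delivered / oil - 1):.1f} %")
        # ~7.9 US$/MBTU of electricity input and a ~6.5 % premium over oil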

  10. Evaluation of hollow fiber culture for large-scale production of mouse embryonic stem cell-derived hematopoietic stem cells.

    Science.gov (United States)

    Nakano, Yu; Iwanaga, Shinya; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2018-03-03

    Hematopoietic stem cells (HSCs) have the ability to differentiate into all types of blood cells and can be transplanted to treat blood disorders. However, it is difficult to obtain HSCs in large quantities because of the shortage of donors. Recent efforts have focused on acquiring HSCs by differentiation of pluripotent stem cells. As a conventional differentiation method of pluripotent stem cells, the formation of embryoid bodies (EBs) is often employed. However, the size of EBs is limited by depletion of oxygen and nutrients, which prevents them from being efficient for the production of HSCs. In this study, we developed a large-scale hematopoietic differentiation approach for mouse embryonic stem (ES) cells by applying a hollow fiber (HF)/organoid culture method. Cylindrical organoids, which had the potential for further spontaneous differentiation, were established inside of hollow fibers. Using this method, we improved the proliferation rate of mouse ES cells to produce an increased HSC population and achieved around a 40-fold higher production volume of HSCs in HF culture than in conventional EB culture. Therefore, the HF/organoid culture method may be a new mass culture method to acquire pluripotent stem cell-derived HSCs.

  11. Spatiotemporal complexity of 2-D rupture nucleation process observed by direct monitoring during large-scale biaxial rock friction experiments

    Science.gov (United States)

    Fukuyama, Eiichi; Tsuchida, Kotoyo; Kawakata, Hironori; Yamashita, Futoshi; Mizoguchi, Kazuo; Xu, Shiqing

    2018-05-01

    We successfully captured rupture nucleation processes on a 2-D fault surface during large-scale biaxial friction experiments using metagabbro rock specimens. Several rupture nucleation patterns were detected by a strain gauge array embedded inside the rock specimens as well as by one installed along the edge walls of the fault. In most cases, the unstable rupture started just after the rupture front touched both ends of the rock specimen (i.e., when the rupture front extended across the entire width of the fault). In some cases, rupture initiated at multiple locations and the rupture fronts coalesced to generate unstable ruptures, which could only be detected from the observations inside the rock specimen. Therefore, the 2-D nucleation process of the rupture needs to be examined carefully, especially when analyzing data measured only outside the rock specimen. At a minimum, measurements should be made on both sides of the fault to identify asymmetric rupture propagation on the fault surface, although even this is not complete. In the present experiments, we observed three typical 2-D rupture propagation patterns, two of which initiated at a single location either close to the fault edge or inside the fault. This initiation could be accelerated by the free-surface effect at the fault edge. The third initiated at multiple locations, with rupture coalescence in the middle of the fault. These geometrically complicated rupture initiation patterns are important for understanding the earthquake nucleation process in nature.

  12. Manufacturing and mechanical property test of the large-scale oxide dispersion strengthened martensitic mother tube by hot isostatic pressing and hot extrusion process

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2003-09-01

    Mass production capability of Oxide Dispersion Strengthened (ODS) ferritic steel cladding (9Cr) is evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle Systems. The cost of manufacturing the mother tube is a dominant factor in the total cost of manufacturing ODS ferritic cladding. In this study, a large-scale 9Cr-ODS martensitic mother tube was produced by an overseas supplier using mass production equipment for commercialized ODS steels. The process of manufacturing the ODS mother tube consists of raw material powder production, mechanical alloying in a high energy ball mill, hot isostatic pressing (HIP), and hot extrusion. The following results were obtained in this study. (1) The microstructure of the ODS steels is equivalent to that of domestic products, and fine oxides are uniformly distributed. Mechanical alloying in a large-capacity (1 ton) ball mill can be carried out satisfactorily. (2) A large-scale mother tube (65 mm OD x 48 mm ID x 10,000 mm L), from which about 60 pieces of 3 m long ODS ferritic cladding can be produced by four passes of cold rolling, has been successfully manufactured through the HIP and hot extrusion process. (3) The rough surface of the mother tubes produced in this study can be improved by selecting a suitable hot extrusion condition. (4) The hardness and tensile strength of the manufactured ODS steels are lower than those of domestic products with the same chemical composition. This is owing to the high aluminum content in the product, and those properties could be improved by decreasing the aluminum content of the raw material powder. (author)

  13. Large-Scale Consumption and Zero-Waste Recycling Method of Red Mud in Steel Making Process

    Directory of Open Access Journals (Sweden)

    Guoshan Ning

    2018-03-01

    To relieve the environmental pressure from the massive discharge of bauxite residue (red mud), a novel method of recycling red mud in the steel making process was investigated through high-temperature experiments and thermodynamic analysis. The results showed that after reduction roasting of the carbon-bearing red mud pellets at 1100-1200 °C for 12-20 min, metallic pellets were obtained with a metallization ratio of ≥88%. Then, separation of slag and iron was achieved from the metallic pellets at 1550 °C, after composition adjustment targeting the primary crystal region of the 12CaO·7Al2O3 phase. After iron removal and composition adjustment, the smelting-separation slag had good smelting performance and desulfurization capability, meeting the requirements for a desulfurization flux in the steel making process. The pig iron quality meets the requirements for a high-quality raw material for steel making. By virtue of the huge scale and output of the steel industry, a large-scale consumption and zero-waste recycling method for red mud is proposed, comprising roasting of the carbon-bearing red mud pellets in a rotary hearth furnace and smelting separation in an electric arc furnace after composition adjustment.

  14. An integrated assessment of a large-scale biodiesel production in Italy: Killing several birds with one stone?

    International Nuclear Information System (INIS)

    Russi, Daniela

    2008-01-01

    Biofuels are often presented as a contribution towards solving the problems related to our strong dependency on fossil fuels, i.e. the greenhouse effect, energy dependency and urban pollution, besides being a way to support rural development. In this paper, an integrated assessment approach is employed to discuss the social desirability of large-scale biodiesel production in Italy, taking into account social, environmental and economic factors. The conclusion is that the advantages in terms of reduction of greenhouse gas emissions, energy dependency and urban pollution would be very modest. The small benefits would not be enough to offset the huge costs in terms of land requirement: if the target of the European Directive 2003/30/EC were reached (5.75% of the energy used for transport by 2010), the equivalent of about one-third of Italian agricultural land would be needed. The consequences would be a considerable increase in food imports and large environmental impacts in the agricultural phase. Also, since biodiesel must be de-taxed to make it competitive with oil-derived diesel, Italian energy tax revenues would be reduced. In the end, rural development remains the only sound reason to promote biodiesel, but even for this objective other strategies look more advisable, such as supporting organic agriculture. (author)
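
    The land-requirement claim can be checked in order of magnitude with simple arithmetic; every input below is an assumed round figure for illustration, and only the "about one-third" conclusion is the record's:

        transport_energy_pj = 1800.0   # assumed Italian transport energy, PJ/yr
        target_share = 0.0575          # Directive 2003/30/EC target for 2010
        biodiesel_gj_per_t = 37.0      # assumed energy content, GJ/t
        yield_t_per_ha = 0.9           # assumed biodiesel yield, t/ha/yr
        agri_land_mha = 12.7           # assumed Italian agricultural land, Mha

        biodiesel_t = transport_energy_pj * 1e6 * target_share / biodiesel_gj_per_t
        land_mha = biodiesel_t / yield_t_per_ha / 1e6
        print(f"{biodiesel_t / 1e6:.1f} Mt/yr of biodiesel needs "
              f"{land_mha:.1f} Mha, i.e. {100 * land_mha / agri_land_mha:.0f}% "
              f"of agricultural land")
        # with these assumptions ~25%; less optimistic yields push the figure
        # toward the one-third reported in the record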

  15. Large-scale hydrological model river storage and discharge correction using a satellite altimetry-based discharge product

    Science.gov (United States)

    Emery, Charlotte Marie; Paris, Adrien; Biancamaria, Sylvain; Boone, Aaron; Calmant, Stéphane; Garambois, Pierre-André; Santos da Silva, Joecila

    2018-04-01

    Land surface models (LSMs) are widely used to study the continental part of the water cycle. However, even though their accuracy is increasing, inherent model uncertainties cannot be avoided. In the meantime, remotely sensed observations of continental water cycle variables such as soil moisture, lake and river elevations are becoming more frequent and accurate. Therefore, these two types of information can be combined, using data assimilation techniques, to reduce a model's uncertainties in its state variables and/or its input parameters. The objective of this study is to present a data assimilation platform that assimilates into the large-scale ISBA-CTRIP LSM a point-wise river discharge product, derived from ENVISAT nadir altimeter water elevation measurements and rating curves, over the whole Amazon basin. To deal with the scale difference between the model and the observations, the study also presents an initial development of a localization treatment that limits the impact of an observation to areas close to it and within the same hydrological network. The assimilation platform is based on the ensemble Kalman filter and can correct either the CTRIP river water storage or the discharge. Root mean square error (RMSE) compared to gauge discharges is reduced globally by up to 21%, and at Óbidos, near the outlet, RMSE is reduced by up to 52% compared to the ENVISAT-based discharge. Finally, it is shown that localization improves results along the main tributaries.
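
    A minimal sketch of the analysis step described above (a stochastic ensemble Kalman filter update of river storage with a hard localization mask; the ISBA-CTRIP platform is far more elaborate, and every number below is arbitrary):

        import numpy as np

        rng = np.random.default_rng(0)
        n_state, n_ens = 20, 32                 # river reaches, ensemble size
        X = rng.normal(1000.0, 100.0, (n_state, n_ens))   # storage ensemble

        H = np.zeros((1, n_state)); H[0, 10] = 1.0        # gauge at reach 10
        y_obs, obs_err = 900.0, 30.0

        # localization: only reaches close to the gauge and in the same
        # hydrological network (here indices 5..15) feel the observation
        rho = np.zeros((n_state, 1)); rho[5:16] = 1.0

        A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
        P_HT = rho * (A @ (H @ A).T) / (n_ens - 1)        # localized cov @ H.T
        HPH = (H @ A) @ (H @ A).T / (n_ens - 1)
        K = P_HT / (HPH + obs_err**2)                     # gain (scalar obs)
        y_pert = y_obs + rng.normal(0.0, obs_err, n_ens)  # perturbed obs
        X_a = X + K @ (y_pert[None, :] - H @ X)           # analysis ensemble
        print(X[10].mean(), "->", X_a[10].mean())         # pulled toward gauge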

  16. Large-Scale Processes Associated with Inter-Decadal and Inter-Annual Early Spring Rainfall Variability in Taiwan

    Directory of Open Access Journals (Sweden)

    Jau-Ming Chen

    2016-02-01

    Early spring (March-April) rainfall in Taiwan exhibits evident and distinct inter-annual and inter-decadal variability. The inter-annual variability has a positive correlation with the El Niño/Southern Oscillation, while the inter-decadal variability features a phase change beginning in the late 1970s, coherent with the major phase change in the Pacific decadal oscillation. Rainfall variability on both timescales is regulated by large-scale processes showing consistent dynamic features. Rainfall increases are associated with positive sea surface temperature (SST) anomalies in the tropical eastern Pacific and negative SST anomalies in the tropical central Pacific. An anomalous lower-level divergent center appears in the tropical central Pacific. Via a Rossby-wave-like response, an anomalous lower-level anticyclone appears to the southeast of Taiwan over the Philippine Sea-tropical western Pacific region, accompanied by an anomalous cyclone to the north-northeast of Taiwan. Both circulation anomalies induce anomalous southwesterly flows that enhance moisture flux from the South China Sea onto Taiwan, resulting in significant moisture convergence near Taiwan. With enhanced moisture supplied by the anomalous southwesterly flows, significant increases in early spring rainfall over Taiwan occur on both inter-annual and inter-decadal timescales.

  17. Theory and algorithms for solving large-scale numerical problems. Application to the management of electricity production

    International Nuclear Information System (INIS)

    Chiche, A.

    2012-01-01

    This manuscript deals with large-scale optimization problems, and more specifically with solving the electricity unit commitment problem arising at EDF. First, we focus on the augmented Lagrangian algorithm. The behavior of that algorithm on an infeasible convex quadratic optimization problem is analyzed. It is shown that the algorithm finds a point that satisfies the shifted constraints with the smallest possible shift in the sense of the Euclidean norm, and that it minimizes the objective on the corresponding shifted constraint set. Convergence to such a point occurs at a global linear rate, which depends explicitly on the augmentation parameter. This suggests a rule for determining the augmentation parameter so as to control the speed of convergence of the shifted constraint norm to zero. This rule has the advantage of generating bounded augmentation parameters even when the problem is infeasible. As a by-product, the algorithm computes the smallest translation in the Euclidean norm that makes the constraints feasible. Furthermore, this work provides solution methods for stochastic optimization industrial problems decomposed on a scenario tree, based on the progressive hedging algorithm introduced by [Rockafellar and Wets, 1991]. We also focus on the convergence of that algorithm. On the one hand, we offer a counter-example showing that the algorithm can diverge if its augmentation parameter is updated iteratively. On the other hand, we show how to recover the multipliers associated with the non-dualized constraints defined on the scenario tree from those associated with the corresponding constraints of the scenario subproblems. Their convergence is also analyzed for convex problems. The practical interest of these solution techniques is corroborated by numerical experiments performed on the electric production management problem. We apply the progressive hedging algorithm to a realistic industrial problem. More precisely, we solve the French medium
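
    The advertised behaviour on infeasible problems is easy to demonstrate on a two-variable quadratic program (a minimal sketch of the augmented Lagrangian iteration, not EDF's production solver):

        import numpy as np

        # Infeasible convex QP: minimize 0.5*||x||^2 subject to the
        # inconsistent constraints x1 = 0 and x1 = 1. Per the result above,
        # the residual A x - b converges linearly to the smallest shift,
        # here (0.5, -0.5), at a rate governed by the augmentation parameter.
        A = np.array([[1.0, 0.0],
                      [1.0, 0.0]])
        b = np.array([0.0, 1.0])
        r = 10.0                      # augmentation parameter
        lam = np.zeros(2)
        I = np.eye(2)

        for _ in range(30):
            # the inner minimization is quadratic, so it has a closed form
            x = np.linalg.solve(I + r * A.T @ A, A.T @ (r * b - lam))
            residual = A @ x - b
            lam += r * residual       # first-order multiplier update
        print(x, residual)            # residual -> smallest shift (0.5, -0.5)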

  18. Applications of Neutron Scattering in the Chemical Industry: Proton Dynamics of Highly Dispersed Materials, Characterization of Fuel Cell Catalysts, and Catalysts from Large-Scale Chemical Processes

    Science.gov (United States)

    Albers, Peter W.; Parker, Stewart F.

    The attractiveness of neutron scattering techniques for the detailed characterization of materials of high dispersity and structural complexity, as encountered in the chemical industry, is discussed. Neutron scattering picks up where other analytical methods leave off: the physico-chemical properties of finely divided products and materials, whose absorption of electromagnetic radiation and whose electrical conductivity cause serious problems for other probes. This is demonstrated by presenting typical applications from large-scale production technology and industrial catalysis. These include the determination of the proton-related surface chemistry of advanced materials that are used as reinforcing fillers in the manufacture of tires, where interrelations between surface chemistry, rheological properties, improved safety, and significant reduction of fuel consumption are the focus of recent developments. Neutron scattering allows surface science studies of the dissociative adsorption of hydrogen on nanodispersed, supported precious metal particles of fuel cell catalysts under in situ loading at realistic gas pressures of about 1 bar. Insight into the occupation of catalytically relevant surface sites provides valuable information about the catalyst in the working state and supplies essential scientific input for tailoring better catalysts by technologists. The deactivation of industrial catalysts by coke deposition, chemical transformation of carbonaceous deposits, and other processes in catalytic hydrogenation, which significantly shortens the time of useful operation in large-scale plants, can often be traced back in detail to surface or bulk properties of the catalysts or of materials of catalytic relevance. A better understanding of avoidable or unavoidable aspects of catalyst deactivation phenomena under certain in-process conditions and the development of effective means for reducing deactivation leads to more energy

  19. Production of recombinant antigens and antibodies in Nicotiana benthamiana using 'magnifection' technology: GMP-compliant facilities for small- and large-scale manufacturing.

    Science.gov (United States)

    Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh

    2014-01-01

    This review describes the adaptation of the plant virus-based transient expression system magnICON® for at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for the production of hetero-oligomeric proteins such as immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest in this system, product concepts can reach the manufacturing stage within highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.

  20. Cattle mammary bioreactor generated by a novel procedure of transgenic cloning for large-scale production of functional human lactoferrin.

    Directory of Open Access Journals (Sweden)

    Penghua Yang

    Large-scale production of biopharmaceuticals by current bioreactor techniques is limited by low transgenic efficiency and low expression of foreign proteins. In general, a bacterial artificial chromosome (BAC) harboring most regulatory elements is capable of overcoming these limitations, but transferring a BAC into donor cells is difficult. We describe here the use of a cattle mammary bioreactor to produce functional recombinant human lactoferrin (rhLF) by a novel transgenic cloning procedure, which employs microinjection to generate transgenic somatic cells as donor cells. Bovine fibroblast cells were co-microinjected for the first time with a 150-kb BAC carrying the human lactoferrin gene and a marker gene. The resulting transfection efficiency of up to 15.79 x 10⁻² percent was notably higher than that of electroporation and lipofection. Following somatic cell nuclear transfer, we obtained two transgenic cows that secreted rhLF at high levels, 2.5 g/l and 3.4 g/l, respectively. The rhLF had a pattern of glycosylation and proteolytic susceptibility similar to that of its natural human counterpart. Biochemical analysis revealed that the iron-binding and releasing properties of rhLF were identical to those of native hLF. Importantly, an antibacterial experiment further demonstrated that rhLF was functional. Our results indicate that co-microinjection of a BAC and a marker gene into donor cells for somatic cell cloning indeed improves transgenic efficiency. Moreover, cattle mammary bioreactors generated with this novel procedure can produce functional rhLF on an industrial scale.

  1. Magma viscosity estimation based on analysis of erupted products. Potential assessment for large-scale pyroclastic eruptions

    International Nuclear Information System (INIS)

    Takeuchi, Shingo

    2010-01-01

    After the formulation of guidelines for volcanic hazards in site evaluation for nuclear installations (e.g. JEAG4625-2009), it is required to establish appropriate methods to assess the potential for large-scale pyroclastic eruptions at long-dormant volcanoes, which are among the volcanic phenomena most hazardous to the safety of the installations. In considering volcanic dormancy, magma eruptability is an important concept. Magma eruptability is dominantly controlled by magma viscosity, which can be estimated from petrological analysis of erupted materials. Therefore, viscosity estimates for magmas erupted in past eruptions should provide important information for assessing future activity at hazardous volcanoes. In order to show the importance of magma viscosity in the concept of magma eruptability, this report reviews dike propagation processes from a magma chamber and the nature of magma viscosity. Magma viscosities at the pre-eruptive conditions of magma chambers were compiled from previous petrological studies of past eruptions in Japan. There are only 16 examples of eruptions at 9 volcanoes satisfying the data requirements for magma viscosity estimation. Estimated magma viscosities range from 10² to 10⁷ Pa·s for basaltic to rhyolitic magmas. Most examples fall below the dike propagation limit of magma viscosity (ca. 10⁶ Pa·s) estimated with a dike propagation model. Magmas more viscous than the dike propagation limit (ca. 10⁷ Pa·s) are considered to lose eruptability, i.e. the ability to form dikes and initiate eruptions. However, in some cases, small precursory eruptions of less viscous magmas occurred just before climactic eruptions of the highly viscous magmas, suggesting that precursory dike propagation by the less viscous magmas induced the subsequent eruptions of the highly viscous magmas (ca. 10⁷ Pa·s). (author)

  2. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  3. Engineered catalytic biofilms for continuous large scale production of n-octanol and (S)-styrene oxide.

    Science.gov (United States)

    Gross, Rainer; Buehler, Katja; Schmid, Andreas

    2013-02-01

    This study evaluates the technical feasibility of biofilm-based biotransformations at an industrial scale by theoretically designing a process employing membrane fiber modules of the kind used in the chemical industry and comparing the resulting process parameters to classical stirred-tank studies. To our knowledge, catalytic biofilm processes for fine chemicals production have so far not been reported on a technical scale. As model reactions, we applied the previously studied asymmetric styrene epoxidation employing Pseudomonas sp. strain VLB120ΔC biofilms and the selective alkane hydroxylation described here. Using the non-heme iron-containing alkane hydroxylase system (AlkBGT) from P. putida Gpo1 in the recombinant P. putida PpS81 pBT10 biofilm, we were able to continuously produce 1-octanol from octane with a maximal productivity of 1.3 g L⁻¹(aq) day⁻¹ in a single-tube micro reactor. For a possible industrial application, a cylindrical membrane fiber module packed with 84,000 polypropylene fibers is proposed. Based on the calculations presented here, 59 membrane fiber modules (of 0.9 m diameter and 2 m length) would suffice to realize a production process of 1,000 tons/year for styrene oxide. Moreover, the product yield on carbon can at least be doubled, and over 400-fold less biomass waste would be generated compared to classical stirred-tank reactor processes. For the octanol process, instead, further intensification of the biological activity and/or enlargement of the membrane surface is required to reach production scale. Taking into consideration challenges such as controlling biomass growth and maintaining constant biological activity, this study shows that a biofilm process at an industrial scale for the production of fine chemicals is a sustainable alternative in terms of product yield and biomass waste production. Copyright © 2012 Wiley Periodicals, Inc.
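
    The scale-up numbers above imply straightforward per-module and per-fiber rates (pure arithmetic on the record's own figures):

        target_t_per_yr = 1000.0      # styrene oxide production target
        n_modules = 59
        fibers_per_module = 84_000

        per_module_kg_day = target_t_per_yr * 1000.0 / n_modules / 365.0
        per_fiber_g_day = per_module_kg_day * 1000.0 / fibers_per_module
        print(f"{per_module_kg_day:.0f} kg/day per module, "
              f"{per_fiber_g_day:.2f} g/day per fiber")
        # ~46 kg/day per module and ~0.55 g/day per fiber of styrene oxide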

  4. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large-scale office building to reach ultra-high-efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment selected for use.

  5. Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process

    Science.gov (United States)

    Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue

    2014-04-01

    The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended here to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable to large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low and high humidity levels as a result of the highly aligned and overlapping arrangement of the nanoplatelets. This work shows that large-scale ordering of high-aspect-ratio nanoplatelets is easier to achieve than previously thought and may have implications for the technological applications of similar materials.

  6. Large-scale bioenergy production from soybeans and switchgrass in Argentina: Part A: Potential and economic feasibility for national and international markets

    NARCIS (Netherlands)

    van Dam, J.; Faaij, A.P.C.; Hilbert, J.; Petruzzi, H.; Turkenburg, W.C.

    2009-01-01

    This study focuses on the economic feasibility for large-scale biomass production from soybeans or switchgrass from a region in Argentina. This is determined, firstly, by estimating whether the potential supply of biomass, when food and feed demand are met, is sufficient under different scenarios to

  7. Understory fern community structure, growth and spore production responses to a large-scale hurricane experiment in a Puerto Rico rainforest

    Science.gov (United States)

    Joanne M. Sharpe; Aaron B. Shiels

    2014-01-01

    Ferns are abundant in most rainforest understories yet their responses to hurricanes have not been well studied. Fern community structure, growth and spore production were monitored for two years before and five years after a large-scale experiment that simulated two key components of severe hurricane disturbance: canopy openness and debris deposition. The canopy was...

  8. Parametric Evaluation of Large-Scale High-Temperature Electrolysis Hydrogen Production Using Different Advanced Nuclear Reactor Heat Sources

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; McKellar, Michael G.; O'Brien, James E.; Herring, J. Stephen

    2009-01-01

    High Temperature Electrolysis (HTE), when coupled to an advanced nuclear reactor capable of operating at reactor outlet temperatures of 800°C to 950°C, has the potential to efficiently produce the large quantities of hydrogen needed to meet future energy and transportation needs. To evaluate the potential benefits of nuclear-driven hydrogen production, the UniSim process analysis software was used to evaluate different reactor concepts coupled to a reference HTE process design concept. The reference HTE concept included an intermediate heat exchanger and intermediate helium loop to separate the reactor primary system from the HTE process loops, and additional heat exchangers to transfer reactor heat from the intermediate loop to the HTE process loops. The two process loops consisted of the water/steam loop feeding the cathode side of the HTE electrolysis stack, and the sweep gas loop used to remove oxygen from the anode side. The UniSim model of the process loops included pumps to circulate the working fluids and heat exchangers to recover heat from the oxygen and hydrogen product streams, improving the overall hydrogen production efficiency. The reference HTE process loop model was coupled to separate UniSim models developed for three different advanced reactor concepts (a high-temperature helium-cooled reactor concept and two different supercritical CO2 reactor concepts). Sensitivity studies were then performed to evaluate the effect of reactor outlet temperature on the power cycle efficiency and the overall hydrogen production efficiency for each of the reactor power cycles. The results of these sensitivity studies showed that overall power cycle and hydrogen production efficiencies increased with reactor outlet temperature, but the power cycles producing the highest efficiencies varied depending on the temperature range considered.
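
    The reported trend (efficiency rising with reactor outlet temperature) can be illustrated with a toy thermodynamic sweep; the "65% of Carnot" power-cycle assumption and the linearized steam-electrolysis fits below are illustrative stand-ins for the UniSim flowsheets:

        import numpy as np

        T_sink = 313.0                  # K, assumed heat-rejection temperature
        lhv_h2 = 242.0                  # kJ/mol H2

        for T_out in np.arange(1073.0, 1274.0, 50.0):     # 800 C to 1000 C
            eta_cycle = 0.65 * (1.0 - T_sink / T_out)     # assumed power cycle
            dG = 225.0 - 0.05 * (T_out - 373.0)           # rough fit, kJ/mol
            dH_steam = 249.0                              # ~constant, kJ/mol
            # electricity made via the power cycle covers dG; the remaining
            # T*dS demand is supplied directly as reactor heat
            heat_in = dG / eta_cycle + (dH_steam - dG)
            print(f"{T_out - 273:.0f} C: cycle {eta_cycle:.1%}, "
                  f"overall H2 {lhv_h2 / heat_in:.1%}")
        # both efficiencies rise monotonically with outlet temperature,
        # consistent with the sensitivity studies summarized above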

  9. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but will, unfortunately, also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied in detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
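
    The combinatorial explosion mentioned above is easy to quantify by counting the legal track-to-measurement assignments per scan in the worst case where every measurement gates with every track (a generic multi-hypothesis bookkeeping exercise, not one of the report's candidate algorithms):

        from math import comb, factorial

        def assignments(n_tracks, n_meas):
            # choose k tracks, choose k measurements, pair them in k! ways;
            # unchosen tracks are missed detections, unchosen measurements
            # are false alarms
            return sum(comb(n_tracks, k) * comb(n_meas, k) * factorial(k)
                       for k in range(min(n_tracks, n_meas) + 1))

        hyps = 1
        for scan in range(1, 6):
            hyps *= assignments(10, 10)    # 10 closely spaced targets
            print(f"scan {scan}: ~{hyps:.2e} cumulative hypotheses")
        # the count is astronomical after a few scans, which is why practical
        # trackers must prune, gate, or use cheaper association schemes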

  10. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    Energy Technology Data Exchange (ETDEWEB)

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  11. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    …is the idea that large-scale foodservice such as hospital food service should adopt a buy-organic policy due to their large buying volume. But whereas implementation of organic foods has developed quite unproblematically in smaller institutions such as kindergartens and nurseries, introduction of organic...... foods into large-scale foodservice such as that taking place in hospitals and larger homes for the elderly has proven to be quite difficult. The very complex planning, procurement and processing procedures used in such facilities are among the reasons for this. Against this background an evaluation...

  12. The LHC Cryomagnet Supports in Glass-Fiber Reinforced Epoxy A Large Scale Industrial Production with High Reproducibility in Performance

    CERN Document Server

    Poncet, A; Trigo, J; Parma, V

    2008-01-01

    The approximately 1700 LHC main ring superconducting magnets are supported within their cryostats on 4700 low-heat-in-leak column-type supports. The supports were designed to ensure precise and stable positioning of the heavy dipole and quadrupole magnets while keeping thermal conduction heat loads within budget. A trade-off between mechanical and thermal properties, as well as cost considerations, led to the choice of glass-fibre reinforced epoxy (GFRE). Resin Transfer Moulding (RTM), featuring a high level of automation and control, was the manufacturing process retained to ensure the reproducibility of the performance of the supports throughout the large production run. The Spanish aerospace company EADS-CASA Espacio developed the specific RTM process and produced the total quantity of supports between 2001 and 2004. This paper describes the development and production of the supports, and presents the production experience and the achieved performance.

  13. THE LHC CRYOMAGNET SUPPORTS IN GLASS-FIBER REINFORCED EPOXY: A LARGE SCALE INDUSTRIAL PRODUCTION WITH HIGH REPRODUCIBILITY IN PERFORMANCE

    International Nuclear Information System (INIS)

    Poncet, A.; Struik, M.; Parma, V.; Trigo, J.

    2008-01-01

    The approximately 1700 LHC main ring superconducting magnets are supported within their cryostats on 4700 low-heat-in-leak column-type supports. The supports were designed to ensure precise and stable positioning of the heavy dipole and quadrupole magnets while keeping thermal conduction heat loads within budget. A trade-off between mechanical and thermal properties, as well as cost considerations, led to the choice of glass-fibre reinforced epoxy (GFRE). Resin Transfer Moulding (RTM), featuring a high level of automation and control, was the manufacturing process retained to ensure the reproducibility of the performance of the supports throughout the large production run. The Spanish aerospace company EADS-CASA Espacio developed the specific RTM process and produced the total quantity of supports between 2001 and 2004. This paper describes the development and production of the supports, and presents the production experience and the achieved performance.

  14. Large scale carbon dioxide production from coal-fired power stations for enhanced oil recovery: a new economic feasibility study

    International Nuclear Information System (INIS)

    Tontiwachwuthikul, P.; Chan, C. W.; Kritpiphat, W.; Demontigny, D.; Skoropad, D.; Gelowitz, D.; Aroonwilas, A.; Mourits, F.; Wilson, M.; Ward, L.

    1998-01-01

    The concept of capturing carbon dioxide from fossil-fuelled electric power generating plants and utilizing it as a flooding agent in enhanced oil recovery (EOR) processes was explored. In this context, this paper describes how cogeneration concepts, together with process optimization strategies, help to reduce the carbon dioxide production cost by utilizing low-pressure steam and waste heat from various sections of the power generation process. Based on these optimization strategies, the recovery cost of carbon dioxide from coal-fired power stations is estimated to be in the range of $0.50 to $2.00/mscf. Assuming an average cost of $1.25/mscf, the production cost of incremental oil would be about $18.00 per barrel. This means that even with today's modest oil prices, there is room for profit to be made operating a carbon dioxide flood with flue-gas-extracted carbon dioxide.
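
    The abstract's cost figures imply a simple arithmetic chain, reproduced in the sketch below. The net CO2 utilization per incremental barrel and the non-CO2 operating cost are assumptions chosen only so the quoted numbers come out mutually consistent; the paper's actual basis is not stated in the abstract.

```python
# Back-of-the-envelope CO2-EOR economics implied by the quoted figures.
co2_cost_per_mscf = 1.25    # $/mscf, mid-range of the $0.50-$2.00 estimate
co2_per_bbl = 10.0          # mscf CO2 per incremental bbl (assumed utilization)
other_cost_per_bbl = 5.50   # $/bbl non-CO2 operating cost (assumed)

oil_cost = co2_cost_per_mscf * co2_per_bbl + other_cost_per_bbl
print(f"incremental oil cost ~ ${oil_cost:.2f}/bbl")  # -> $18.00/bbl
```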

  15. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    …was carried out of the change process related to the implementation of organic foods in large-scale foodservice facilities in Greater Copenhagen county, in order to study the effects of such a change. Based on the findings, a set of guidelines has been developed for the successful implementation of organic foods...

  16. Importance of regional species pools and functional traits in colonization processes: predicting re-colonization after large-scale destruction of ecosystems

    NARCIS (Netherlands)

    Kirmer, A.; Tischew, S.; Ozinga, W.A.; Lampe, von M.; Baasch, A.; Groenendael, van J.M.

    2008-01-01

    Large-scale destruction of ecosystems caused by surface mining provides an opportunity for the study of colonization processes starting with primary succession. Surprisingly, over several decades and without any restoration measures, most of these sites spontaneously developed into valuable biotope…

  17. Inducing a health-promoting change process within an organization the Effectiveness of a Large-Scale Intervention on Social Capital, Openness, and Autonomous Motivation Toward Health

    NARCIS (Netherlands)

    Scheppingen, A.R. van; Vroome, E.M.M. de; Have, K.C.J.M. ten; Bos, E.H.; Zwetsloot, G.I.J.M.; Mechelen, W. van

    2014-01-01

    Objective: To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. Design and Methods: A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324)…

  18. History matching of large scale fractures to production data; Calage de la geometrie des reseaux de fractures aux donnees hydrodynamiques de production d'un champ petrolier

    Energy Technology Data Exchange (ETDEWEB)

    Jenni, S.

    2005-01-01

    Object-based models are very helpful for representing complex geological media such as fractured reservoirs. For building realistic fracture networks, these models have to be constrained to both static (seismic, geomechanics, geology) and dynamic data (well tests and production history). In this report we present a procedure for the calibration of large-scale fracture networks to production history. The history matching procedure includes realistic geological modeling and a parameterization method coherent with the geological model and allowing an efficient optimization. Fluid flow modeling is based on a double-medium approach. The calibration procedure was applied to a semi-synthetic case based on a real fractured reservoir, and calibration to water-cut data was performed. (author)

  19. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim

    Directory of Open Access Journals (Sweden)

    José Medina Pestana

    The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

  20. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim.

    Science.gov (United States)

    Pestana, José Medina

    2016-10-01

    The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

  1. Cell therapy-processing economics: small-scale microfactories as a stepping stone toward large-scale macrofactories.

    Science.gov (United States)

    Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A

    2018-03-01

    Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet the emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights that there is strong pressure to automate processes and to split the quality-control cost burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as a potential exemplar on which to base manufacturing strategy.

  2. Inducing a health-promoting change process within an organization: the effectiveness of a large-scale intervention on social capital, openness, and autonomous motivation toward health.

    Science.gov (United States)

    van Scheppingen, Arjella R; de Vroome, Ernest M M; Ten Have, Kristin C J M; Bos, Ellen H; Zwetsloot, Gerard I J M; van Mechelen, W

    2014-11-01

    To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324) were used to examine the effects on bonding social capital, openness, and autonomous motivation toward health and on employees' lifestyle, health, vitality, and sustainable employability. Also, the sensitivity of the intervention components was examined. Intervention effects were found for bonding social capital, openness toward health, smoking, healthy eating, and sustainable employability. The effects were primarily attributable to the intervention's dialogue component. The change process initiated by the large-scale intervention contributed to a social climate in the workplace that promoted health and ownership toward health. The study confirms the relevance of collective change processes for health promotion.

  3. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, undertaken with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  4. Anti-L. donovani activity in macrophage/amastigote model of palmarumycin CP18 and its large scale production.

    Science.gov (United States)

    Ortega, Humberto E; Teixeira, Eliane de Morais; Rabello, Ana; Higginbotham, Sarah; Cubilla-Ríos, Luis

    2014-01-01

    Palmarumycin CP18, isolated from an extract of the fermentation broth and mycelium of the Panamanian endophytic fungus Edenia sp., was previously reported to have strong and specific activity against Leishmania donovani. Here we report that when the same strain was cultured on different solid media--Harrold Agar, Leonian Agar, Potato Dextrose Agar (PDA), Corn Meal Agar, Honey Peptone Agar, and eight vegetables (V8) Agar--in order to determine the optimal conditions for isolation of palmarumycin CP18, no signal for this compound was observed in any of the 1H NMR spectra of fractions obtained from these extracts. However, the extract prepared from the fungal culture on PDA contained significant amounts of CJ-12,372, a possible biosynthetic precursor of palmarumycin CP18. Edenia sp. was therefore cultivated on a large scale on PDA, and CJ-12,372 was converted to palmarumycin CP18 by oxidation of its p-hydroquinone moiety with DDQ in dioxane. Palmarumycin CP18 showed anti-leishmanial activity against L. donovani in a macrophage/amastigote model, with an IC50 value of 23.5 microM.

  5. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    Science.gov (United States)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  6. Effect of grain boundary phase on the magnetization reversal process of nanocrystalline magnet using large-scale micromagnetic simulation

    Directory of Open Access Journals (Sweden)

    Hiroshi Tsukahara

    2018-05-01

    We investigated the effects of grain boundary phases on magnetization reversal in permanent magnets by performing large-scale micromagnetic simulations based on the Landau–Lifshitz–Gilbert equation with periodic boundary conditions. We considered planar grain boundary phases parallel and perpendicular to the easy axis of the permanent magnet, and assumed the saturation magnetization and exchange stiffness constant of the grain boundary phase to be 10% and 1%, respectively, of the values for the Nd2Fe14B grains. The grain boundary phase parallel to the easy axis effectively inhibits the propagation of magnetization reversal. In contrast, the domain wall moves across the grain boundary perpendicular to the easy axis. These properties of the domain wall motion are explained by the dipole interaction, which stabilizes the antiparallel magnetic configuration in the direction perpendicular to the magnetization orientation; parallel to the magnetization orientation, on the other hand, the dipole interaction aligns the magnetization in the same direction. This anisotropy of the grain boundary effect shows that improving the grain boundary phase perpendicular to the easy axis effectively enhances the coercivity of permanent magnets.
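
    For reference, the Landau–Lifshitz–Gilbert equation integrated in simulations of this kind has the standard form (precession plus Gilbert damping, with the effective field collecting exchange, anisotropy, dipole/demagnetizing and Zeeman contributions):

```latex
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma \, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
  + \frac{\alpha}{M_s} \, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}
```

    where γ is the gyromagnetic ratio, α the Gilbert damping constant and M_s the saturation magnetization.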

  7. Large scale carbon dioxide production from coal-fired power stations for enhanced oil recovery : a new economic feasibility study

    International Nuclear Information System (INIS)

    Tontiwachwuthikul, P.; Chan, C.W.; Kritpiphat, W.; DeMontigny, D.; Skoropad, D.; Gelowitz, D.; Aroonwilas, A.; Mourits, F.; Wilson, M.; Ward, L.

    1998-01-01

    A study was conducted to investigate the economics of capturing carbon dioxide from coal-fired power plants to be subsequently used as a flooding agent in enhanced oil recovery (EOR) technologies. It was shown that the production of CO2 for EOR projects can be technically and economically feasible, particularly when the concepts of cogeneration and optimization are used to reduce steam and electricity expenditures. This is done by using low-pressure steam and waste heat from various sections of the power generation process. It was shown that recovery costs could range between $0.50 and $2.00 per mscf. This translates to a recovered oil price in the range of $17.39 to $19.95 per bbl, suggesting that even at today's low oil prices there is room for CO2 flooding with flue-gas-extracted CO2. Practical implications for Saskatchewan were examined. 15 refs., 4 tabs., 7 figs.

  8. PROVING THE CAPABILITY FOR LARGE SCALE REGIONAL LAND-COVER DATA PRODUCTION BY SELF-FUNDED COMMERCIAL OPERATORS

    Directory of Open Access Journals (Sweden)

    M. W. Thompson

    2017-11-01

    For service providers developing commercial value-added data content based on remote sensing technologies, the focus is typically on creating commercially appropriate geospatial information which has downstream business value, the primary aim being to link locational intelligence with business intelligence in order to make better-informed decisions. From a geospatial perspective this locational information must be relevant, informative, and most importantly current, with the ability to maintain the information in a timely manner into the future for change-detection purposes. Aligned with this, GeoTerraImage has successfully embarked on the production of land-cover/land-use content over southern Africa. The ability of a private company to successfully implement and complete such an exercise rests on the capability to leverage the combined advantages of cutting-edge data processing technologies and methodologies, with emphasis on processing repeatability and speed, and the use of a wide range of readily available imagery. These production workflows utilise a wide range of integrated procedures, including machine learning algorithms, innovative use of non-specialists for sourcing of reference data, conventional pixel- and object-based image classification routines, and experienced/expert landscape interpretation. This multi-faceted approach to data product development demonstrates the capability of SMME-level commercial entities such as GeoTerraImage to generate industry-applicable large data content, in this case wide-area-coverage land-cover and land-use data across the sub-continent. Within this development, emphasis has been placed on key land-use information, such as mining, human settlements, and agriculture, given the importance of this geospatial land-use information in business and socio-economic applications and decision making.

  9. A test trial irradiation of natural rubber latex on large scale for the production of examination gloves in a production scale

    International Nuclear Information System (INIS)

    Devendra, R.; Kulatunge, S.; Chandralal, H.N.K.K.; Kalyani, N.M.V.; Seneviratne, J.; Wellage, S.

    1996-01-01

    Radiation vulcanization of natural rubber latex has been developed extensively through various research and development programmes. During these investigations much data was collected, and from these data it was established that radiation-vulcanized natural rubber latex (RVNRL) can be used as a new material for industry (RVNRL symposium 1989; Makuuchi, IAEA report). The material has been extensively tested in the making of dipped goods and extruded products. However, these investigations were confined to laboratory experiments, which mainly reflected the material properties of RVNRL; little was observed about its behavior in actual production-scale operation. The present exercise was carried out mainly to study the behavior of the material at production scale, by irradiating latex on a large scale and producing gloves in a production-scale plant. It was found that RVNRL can be used in conventional glove plants without major alterations to the plant, and the quality of the gloves produced using RVNRL is acceptable. It was also found that a small deviation in the vulcanization dose will affect the crosslinking density of the films, which can drastically reduce the tensile strength of the film. Crosslinking density, or pre-vulcanized relaxed modulus (PRM) at 100%, is a reliable property for controlling the pre-vulcanization of latex by radiation.

  10. How Close We Are to Achieving Commercially Viable Large-Scale Photobiological Hydrogen Production by Cyanobacteria: A Review of the Biological Aspects

    Science.gov (United States)

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2015-01-01

    Photobiological production of H2 by cyanobacteria is considered to be an ideal source of renewable energy because the inputs, water and sunlight, are abundant. The products of photobiological systems are H2 and O2; the H2 can be used as the energy source of fuel cells, etc., which generate electricity at high efficiency and with minimal pollution, as the waste product is H2O. Overall, production of commercially viable algal fuels in any form, including biomass and biodiesel, is challenging, and the very few systems that are operational have yet to be evaluated. In this paper we briefly review some of the necessary conditions for economical production, summarize the reports of photobiological H2 production by cyanobacteria, present our schemes for future production, and discuss the necessity for further progress in the research needed to achieve commercially viable large-scale H2 production. PMID:25793279

  11. Sensing across large-scale cognitive radio networks: Data processing, algorithms, and testbed for wireless tomography and moving target tracking

    Science.gov (United States)

    Bonior, Jason David

    As the use of wireless devices has become more widespread, so has the potential for utilizing wireless networks for remote sensing applications. Regular wireless communication devices are not typically designed for remote sensing, so remote sensing techniques must be carefully tailored to the capabilities of these networks before they can be applied. Experimental verification of these techniques and algorithms requires robust yet flexible testbeds. In this dissertation, two experimental testbeds for the advancement of research into sensing across large-scale cognitive radio networks are presented. System architectures, implementations, capabilities, experimental verification, and performance are discussed. One testbed is designed for the collection of scattering data to be used in RF and wireless tomography research. This system is used to collect full complex scattering data using a vector network analyzer (VNA) and amplitude-only data using non-synchronous software-defined radios (SDRs). Collected data is used to experimentally validate a technique for phase reconstruction using semidefinite relaxation and demonstrate the feasibility of wireless tomography. The second testbed is an SDR network for the collection of experimental data. The development of tools for network maintenance and data collection is presented and discussed. A novel recursive weighted centroid algorithm for device-free target localization using the variance of received signal strength of wireless links is proposed. The signal variance resulting from a moving target is modeled as having contours related to Cassini ovals. This model is used to formulate recursive weights which reduce the influence of wireless links that are farther from the target location estimate. The algorithm and its implementation on this testbed are presented and experimental results discussed.
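
    A minimal sketch of the recursive weighted-centroid idea described above: start from a variance-weighted centroid of the link positions, then iteratively down-weight links far from the current estimate. The Gaussian weighting below is an illustrative stand-in for the dissertation's Cassini-oval-based weights, and the data layout is assumed.

```python
import math

def recursive_weighted_centroid(links, n_iter=10, sigma=3.0):
    """Estimate a device-free target position from wireless-link RSS variance.

    links: list of ((x, y) link position, rss_variance) pairs.
    """
    def centroid(weights):
        total = sum(weights) or 1.0
        x = sum(pt[0] * w for (pt, _), w in zip(links, weights)) / total
        y = sum(pt[1] * w for (pt, _), w in zip(links, weights)) / total
        return x, y

    # Initial estimate: variance-weighted centroid of all links.
    x, y = centroid([v for _, v in links])
    for _ in range(n_iter):
        # Reduce the influence of links far from the current estimate.
        weights = [v * math.exp(-(math.hypot(px - x, py - y) / sigma) ** 2)
                   for (px, py), v in links]
        x, y = centroid(weights)
    return x, y
```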

  12. Process evaluation and assessment of use of a large scale water filter and cookstove program in Rwanda

    Directory of Open Access Journals (Sweden)

    Christina K. Barstow

    2016-07-01

    …financed, public health intervention can achieve high levels of initial adoption and usage of household-level water filtration and improved cookstoves at a large scale.

  13. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th…

  14. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for the application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control-supporting tool for daily operation and performance prediction of central solar heating plants. Finally, the CSHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers to and breakthroughs of the technology is given....

  15. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are reported from testing the material resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  16. Large-scale inference of gene function through phylogenetic annotation of Gene Ontology terms: case study of the apoptosis and autophagy cellular processes.

    Science.gov (United States)

    Feuermann, Marc; Gaudet, Pascale; Mi, Huaiyu; Lewis, Suzanna E; Thomas, Paul D

    2016-01-01

    We previously reported a paradigm for large-scale phylogenomic analysis of gene families that takes advantage of the large corpus of experimentally supported Gene Ontology (GO) annotations. This 'GO Phylogenetic Annotation' approach integrates GO annotations from evolutionarily related genes across ∼100 different organisms in the context of a gene family tree, in which curators build an explicit model of the evolution of gene functions. GO Phylogenetic Annotation models the gain and loss of functions in a gene family tree, which is used to infer the functions of uncharacterized (or incompletely characterized) gene products, even for human proteins that are relatively well studied. Here, we report our results from applying this paradigm to two well-characterized cellular processes, apoptosis and autophagy. This revealed several important observations with respect to GO annotations and how they can be used for function inference. Notably, we applied only a small fraction of the experimentally supported GO annotations to infer function in other family members. The majority of other annotations describe indirect effects, phenotypes or results from high throughput experiments. In addition, we show here how feedback from phylogenetic annotation leads to significant improvements in the PANTHER trees, the GO annotations and GO itself. Thus GO phylogenetic annotation both increases the quantity and improves the accuracy of the GO annotations provided to the research community. We expect these phylogenetically based annotations to be of broad use in gene enrichment analysis as well as other applications of GO annotations.Database URL: http://amigo.geneontology.org/amigo. © The Author(s) 2016. Published by Oxford University Press.

  17. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large-scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase to a multiple of today's typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure manageable plant complexity and flexible operability. In the phase of process development and selection, the dimensioning of key equipment for large-scale liquefiers, such as turbines, compressors and heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  18. Production of margarine fats by enzymatic interesterification with silica-granulated Thermomyces lanuginosa lipase in a large-scale study

    DEFF Research Database (Denmark)

    Zhang, Hong; Xu, Xuebing; Nilsson, Jörgen

    2001-01-01

    Interesterification of a blend of palm stearin and coconut oil (75:25, w/w), catalyzed by a Thermomyces lanuginosa lipase immobilized by silica granulation (Lipozyme TL IM), was studied for the production of margarine fats in a 1- or 300-kg pilot-scale batch-stirred tank reactor. Parameters…

  19. Linking small-scale circulation dynamics with large-scale seasonal production (phytoplankton) in the Southern Ocean

    CSIR Research Space (South Africa)

    Nicholson, S

    2012-10-01

    Full Text Available Understanding the seasonal and intra-seasonal (daily to weekly) changes of the upper ocean and the impact on the primary production in the Southern Ocean is key to better understanding the sensitivities of the global carbon cycle....

  20. Integrating the NEPA 216 process with large-scale privatization projects under the US Department of Energy

    International Nuclear Information System (INIS)

    Eccleston, C.H.

    1994-05-01

    The US Department of Energy (DOE) is considering the possibility of replacing the existing Hanford Site 200 Area steam system through a privatization effort. Such an action would be subject to the requirements of the National Environmental Policy Act (NEPA) of 1969. Section 216 of the DOE NEPA Implementation Procedures (216 Process) provides a specific mechanism for integrating the DOE procurement process with NEPA compliance requirements.

  1. Large-scale purification of 90Sr from nuclear waste materials for production of 90Y, a therapeutic medical radioisotope.

    Science.gov (United States)

    Wester, Dennis W; Steele, Richard T; Rinehart, Donald E; DesChane, Jaquetta R; Carson, Katharine J; Rapko, Brian M; Tenforde, Thomas S

    2003-07-01

    A major limitation on the supply of the short-lived medical isotope 90Y (t1/2 = 64 h) is the available quantity of highly purified 90Sr generator material. A radiochemical production campaign was therefore undertaken to purify 1,500 Ci of 90Sr that had been isolated from fission waste materials. A series of alkaline precipitation steps removed all detectable traces of 137Cs, alpha emitters, and uranium and transuranic elements. Technical obstacles such as the buildup of gas pressure generated upon mixing large quantities of acid with solid 90Sr carbonate were overcome through safety features incorporated into the custom-built equipment used for 90Sr purification. Methods are described for analyzing the chemical and radiochemical purity of the final product and for accurately determining by gravimetry the quantities of 90Sr immobilized on stainless steel filters for future use.

  2. Large-scale purification of 90Sr from nuclear waste materials for production of 90Y, a therapeutic medical radioisotope

    International Nuclear Information System (INIS)

    Wester, D.W.; Steele, R.T.; Rinehart, D.E.; DesChane, J.R.; Carson, K.J.; Rapko, B.M.; Tenforde, T.S.

    2003-01-01

    A major limitation on the supply of the short-lived medical isotope 90Y (t1/2 = 64 h) is the available quantity of highly purified 90Sr generator material. A radiochemical production campaign was therefore undertaken to purify 1500 Ci of 90Sr that had been isolated from fission waste materials. A series of alkaline precipitation steps removed all detectable traces of 137Cs, alpha emitters, and uranium and transuranic elements. Technical obstacles such as the buildup of gas pressure generated upon mixing large quantities of acid with solid 90Sr carbonate were overcome through safety features incorporated into the custom-built equipment used for 90Sr purification. Methods are described for analyzing the chemical and radiochemical purity of the final product and for accurately determining by gravimetry the quantities of 90Sr immobilized on stainless steel filters for future use.

  3. Nuclear criticality safety practices in digestion systems of the large scale production facility of the Department of Energy at Fernald

    International Nuclear Information System (INIS)

    Dolan, L.C.

    1982-01-01

    Nuclear criticality safety practices used at the Feed Materials Production Center at Fernald, Ohio in conjunction with its metal-dissolving and nonmetal (e.g., ash and ore concentrate) digestion operations are reviewed. Operating procedures with several different types of dissolver or digester systems (metal dissolver, continuous, drum, and safe-geometry) are discussed. Calculations performed to verify the criticality safety of the operations are described.

  4. A novel plasmid addiction system for large-scale production of cyanophycin in Escherichia coli using mineral salts medium.

    Science.gov (United States)

    Kroll, Jens; Klinter, Stefan; Steinbüchel, Alexander

    2011-02-01

    Hitherto, the production of the biopolymer cyanophycin (CGP) using recombinant Escherichia coli strains and cheap mineral salts medium has yielded only trace amounts of CGP. The novel plasmid addiction system (PAS) presented here combines two elements: (1) disruption of the native succinylase pathway in E. coli by deletion of dapE, and (2) complementation by the plasmid-encoded artificial aminotransferase pathway mediated by the dapL gene from Synechocystis sp. PCC 6308, which allows the synthesis of the essential lysine precursor L,L-2,6-diaminopimelate. In addition, this plasmid also harbors cphAC595S, an engineered cyanophycin synthetase gene responsible for CGP production. Cultivation experiments in Erlenmeyer flasks and in bioreactors with mineral salts medium without antibiotics revealed at least 4.5-fold enhanced production of CGP in comparison to control cultivations without the PAS. Fermentation experiments with culture volumes of up to 400 l yielded a maximum of 18% CGP (w/w) and a final cell density of 15.2 g CDM/l. Lactose was used constantly as an effective inducer and carbon source. Thus, we present a convenient option to produce CGP with E. coli at a technical scale, without the need to add antibiotics or amino acids, using the mineral salts medium designed in this study.

  5. Evaluation of the Potential Environmental Impacts from Large-Scale Use and Production of Hydrogen in Energy and Transportation Applications

    Energy Technology Data Exchange (ETDEWEB)

    Wuebbles, D.J.; Dubey, M.K., Edmonds, J.; Layzell, D.; Olsen, S.; Rahn, T.; Rocket, A.; Wang, D.; Jia, W.

    2010-06-01

    The purpose of this project is to systematically identify and examine possible near- and long-term ecological and environmental effects from the production of hydrogen from various energy sources, based on the DOE hydrogen production strategy, and from the use of that hydrogen in transportation applications. This project uses state-of-the-art numerical modeling tools of the environment and energy system emissions in combination with relevant new and prior measurements and other analyses to assess the understanding of the potential ecological and environmental impacts from hydrogen market penetration. H2 technology options and market penetration scenarios will be evaluated using energy-technology-economics models as well as atmospheric trace gas projections based on the IPCC SRES scenarios, including the decline in halocarbons due to the Montreal Protocol. Specifically, we investigate the impact of hydrogen releases on the oxidative capacity of the atmosphere, the long-term stability of the ozone layer due to changes in hydrogen emissions, the impact of hydrogen emissions and resulting concentrations on climate, the impact on microbial ecosystems involved in hydrogen uptake, and criteria pollutants emitted from distributed and centralized hydrogen production pathways and their impacts on human health, air quality, ecosystems, and structures under different penetration scenarios.

  6. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts that (1) number crunching is usually carried out using software that was developed before information technology existed, and (2) educational research is to a great extent trapped…

  7. Enantiomeric separation of pharmaceutically important drug intermediates using a Metagenomic lipase and optimization of its large scale production.

    Science.gov (United States)

    Kumar, Rakesh; Banoth, Linga; Banerjee, Uttam Chand; Kaur, Jagdeep

    2017-02-01

    In the present study, efficient enzymatic methods were developed using a recombinant metagenomic lipase (LipR1) for the synthesis of the corresponding esters by transesterification of five different pharmaceutically important secondary alcohols. The recombinant lipase (specific activity = 87.6 U/mg) showed maximum conversion in the presence of ionic liquid, with naphthyl-ethanol (eeP = 99%) and with indanol and methyl-4-pyridine methanol (eeS of 98% and 99%, respectively) in 1 h. Vinyl acetate was found to be a suitable acyl donor in the transesterification reactions. Interestingly, a maximum eeP of 85% was observed in just 15 min with 1-indanol. As this enzyme demonstrated pharmaceutical applications, attempts were made to scale up enzyme production on a pilot scale in a 5-litre bioreactor. Different physical parameters affecting enzyme production and biomass concentration, such as agitation rate, aeration rate and inoculum concentration, were evaluated. A maximum lipase activity of 8463 U/ml was obtained at 7 h of cultivation at 1 lpm, 300 rpm and 1.5% inoculum. Copyright © 2016 Elsevier B.V. All rights reserved.
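
    The enantiomeric excess values quoted above (eeS for the remaining substrate, eeP for the product ester) follow the standard definition, written here for an R-enriched mixture:

```latex
\mathrm{ee} = \frac{[R] - [S]}{[R] + [S]} \times 100\%
```

    so eeP = 99% corresponds to roughly a 199:1 ratio of the major to the minor product enantiomer.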

  8. Sustainable production of toxin free marine microalgae biomass as fish feed in large scale open system in the Qatari desert.

    Science.gov (United States)

    Das, Probir; Thaher, Mahmoud Ibrahim; Hakim, Mohammed Abdul Quadir Mohd Abdul; Al-Jabri, Hareb Mohammed S J

    2015-09-01

    Mass cultivation of microalgae biomass for feed should be cost-effective and toxin-free. Evaporation loss in Qatar can be as high as 2 cm/d; hence, production of marine microalgae biomass in Qatar also requires mitigating water loss, as there is only a very limited groundwater reserve. To address these issues, a combination of four growth conditions was applied to a 25,000 L raceway pond: a locally isolated microalgae strain was selected that could grow at elevated salinity; the strain did not require silica or vitamins; the culture volume was increased over time, keeping a denser inoculum at the beginning; and evaporative water loss was balanced by adding seawater only. A local salinity-tolerant Nannochloropsis sp. that required neither silica nor vitamins was selected. When the above conditions were combined in the pond, average areal biomass productivity reached 20.37 g/m(2)/d, and the culture was not contaminated by any toxic microalgae. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. [Energy Consumption Comparison and Energy Saving Approaches for Different Wastewater Treatment Processes in a Large-scale Reclaimed Water Plant].

    Science.gov (United States)

    Yang, Min; Li, Ya-ming; Wei, Yuan-song; Lü, Jian; Yu, Da-wei; Liu, Ji-bao; Fan, Yao-bo

    2015-06-01

    Energy consumption is the main performance indicator of reclaimed water plant (RWP) operation. Specific energy consumption analysis, unit energy consumption analysis and redundancy analysis were applied to investigate the composition and spatio-temporal distribution of energy consumption in the Qinghe RWP, which runs inverted A2/O, A2/O and A2/O-MBR processes. The A2/O-MBR process was analyzed in detail to identify the main nodes of, and causes for, high energy consumption; approaches for energy saving were explored; and the energy consumption before and after an energy-saving upgrade was compared. The results showed that aeration was the key factor affecting energy consumption in both the conventional and the A2/O-MBR processes, accounting for 42.97% and 50.65% of total energy consumption, respectively. Pulsating aeration allowed an increased membrane flux and markedly reduced the energy consumption of the A2/O-MBR process while still meeting the effluent standard: the membrane flux was increased by 20%, and the energy consumption per cubic meter of wastewater treated and per kilogram of COD removed decreased by 42.39% to 0.53 kW·h·m(-3) and by 54.74% to 1.29 kW·h·kg(-1), respectively. A decrease of the backflow ratio in the A2/O-MBR process within a certain range would not deteriorate the effluent quality, given its insignificant correlation with effluent quality, and may therefore be considered one route to further energy saving.
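
    The two indicators quoted above are simple quotients; the sketch below computes them, with example inputs chosen purely for illustration so that they reproduce the post-upgrade figures (0.53 kW·h per m3 treated and 1.29 kW·h per kg COD removed).

```python
def sec_per_m3(energy_kwh, volume_m3):
    """Specific energy consumption per m3 of wastewater treated."""
    return energy_kwh / volume_m3

def sec_per_kg_cod(energy_kwh, cod_in_mg_l, cod_out_mg_l, volume_m3):
    """Specific energy consumption per kg of COD removed."""
    # mg/L equals g/m3, so dividing by 1000 converts grams to kilograms.
    cod_removed_kg = (cod_in_mg_l - cod_out_mg_l) * volume_m3 / 1000.0
    return energy_kwh / cod_removed_kg

# Illustrative day: 5300 kWh to treat 10,000 m3, COD reduced 430 -> 20 mg/L.
print(sec_per_m3(5300, 10_000))                          # 0.53 kWh/m3
print(round(sec_per_kg_cod(5300, 430, 20, 10_000), 2))   # 1.29 kWh/kg COD
```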

  10. How engineering data management and system support the main process[-oriented] functions of a large-scale project

    CERN Document Server

    Hameri, A P

    1999-01-01

    By dividing the development process into successive functional operations, this paper studies the benefits of establishing configuration management procedures and of using an engineering data management system (EDMS) to execute the tasks. The underlying environment is that of CERN and the ongoing, decade-long Large Hadron Collider (LHC) project. By identifying the main functional groups who will use the EDMS, the paper outlines the basic motivations for, and services provided by, such a system for each process function. The implications of strict configuration management for the daily operation of each functional user group are also discussed. The main argument of the paper is that each and every user of the EDMS must act in compliance with the configuration management procedures to guarantee the overall benefits of the system. The pilot EDMS being developed at CERN, which serves as a test-bed to discover the real functional needs of the organisation, supports the conclusions. The preliminary …

  11. Large-scale freestanding nanometer-thick graphite pellicles for mass production of nanodevices beyond 10 nm.

    Science.gov (United States)

    Kim, Seul-Gi; Shin, Dong-Wook; Kim, Taesung; Kim, Sooyoung; Lee, Jung Hun; Lee, Chang Gu; Yang, Cheol-Woong; Lee, Sungjoo; Cho, Sang Jin; Jeon, Hwan Chul; Kim, Mun Ja; Kim, Byung-Gook; Yoo, Ji-Beom

    2015-09-21

    Extreme ultraviolet lithography (EUVL) has received much attention in the semiconductor industry as a promising candidate to extend dimensional scaling beyond 10 nm. We present a new pellicle material, nanometer-thick graphite film (NGF), which shows an extreme ultraviolet (EUV) transmission of 92% at a thickness of 18 nm. The maximum temperature induced by laser irradiation (λ = 800 nm) at 9.9 W cm(-2) was 267 °C, due to the high thermal conductivity of the NGF. The freestanding NGF was found to be chemically stable during annealing at 500 °C in a hydrogen environment. A 50 × 50 mm large-area freestanding NGF was fabricated using the wet and dry transfer (WaDT) method. The NGF can be used as an EUVL pellicle for the mass production of nanodevices beyond 10 nm.

  12. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan; Bachl, Fabian E.; Lindgren, Finn; Borchers, David L.; Illian, Janine B.; Buckland, Stephen T.; Rue, Haavard; Gerrodette, Tim

    2017-01-01

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
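
    The "thinned spatial point process" view of distance sampling can be made concrete with a few lines of simulation: animals form a Poisson process, and detection acts as thinning with a distance-dependent retention probability. The half-normal detection function used below is a standard choice in distance sampling; the homogeneous intensity and all parameter values are illustrative, not the paper's inhomogeneous log-Gaussian Cox model.

```python
import math
import random

def simulate_line_transect(intensity, half_width, length, sigma, seed=1):
    """Simulate distance-sampling data as a thinned Poisson point process.

    Animals form a homogeneous Poisson process with the given intensity
    (animals per unit area) in a strip of 2*half_width by length; each is
    detected (retained by the thinning) with half-normal probability
    exp(-d^2 / (2 sigma^2)) in perpendicular distance d.
    """
    rng = random.Random(seed)
    mean = intensity * 2.0 * half_width * length
    # Poisson draw by inversion (adequate for small means).
    n, p, threshold = 0, 1.0, math.exp(-mean)
    while True:
        p *= rng.random()
        if p <= threshold:
            break
        n += 1
    detected = []
    for _ in range(n):
        d = rng.uniform(-half_width, half_width)  # perpendicular distance
        if rng.random() < math.exp(-d * d / (2.0 * sigma ** 2)):
            detected.append(abs(d))
    return detected

print(simulate_line_transect(intensity=0.05, half_width=10.0,
                             length=100.0, sigma=4.0))
```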

  13. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan

    2017-12-28

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.

  14. Heat recovery networks synthesis of large-scale industrial sites: Heat load distribution problem with virtual process subsystems

    International Nuclear Information System (INIS)

    Pouransari, Nasibeh; Maréchal, Francois

    2015-01-01

    Highlights: • Synthesizing an industrial-size heat recovery network with a match-reduction approach. • Targeting TSI with minimum exchange between process subsystems. • Generating a feasible, close-to-optimum network. • Reducing tremendously the HLD computational time and complexity. • Generating a realistic network with respect to the plant layout. - Abstract: This paper presents a targeting strategy to design a heat recovery network for an industrial plant by dividing the system into subsystems while considering the heat transfer opportunities between them. The methodology is based on a sequential approach. The heat recovery opportunities between process units and the optimal flow rates of utilities are first identified using a Mixed Integer Linear Programming (MILP) model. The site is then divided into a number of subsystems, where the overall interaction is summarized by a pair of virtual hot and cold streams per subsystem, reconstructed by solving the heat cascade inside each subsystem. The Heat Load Distribution (HLD) problem is then solved between these packed subsystems in a sequential procedure where, each time, one of the subsystems is unpacked by switching from the virtual stream pair back to the original streams. The main advantages are to minimize the number of connections between process subsystems, to alleviate the computational complexity of the HLD problem, and to generate a feasible network which is compatible with the minimum energy consumption objective. The application of the proposed methodology is illustrated through a number of case studies, discussed and compared with relevant results from the literature.
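
    The heat cascade referred to above is, at its simplest, the problem-table algorithm of pinch analysis: net interval heat is cascaded from hot to cold, and the minimum hot utility is whatever lift removes negative inter-interval flows. The sketch below shows only this cascading step, not the paper's MILP formulation; the interval data are illustrative.

```python
def heat_cascade(interval_surpluses):
    """Problem-table heat cascade for utility targeting.

    interval_surpluses: net heat surplus (+) or deficit (-) per shifted
    temperature interval, hottest first (e.g. in kW). Returns the minimum
    (hot utility, cold utility) targets.
    """
    flows, running = [], 0.0
    for net in interval_surpluses:
        running += net
        flows.append(running)
    hot_utility = max(0.0, -min(flows))   # lift so no inter-interval flow < 0
    cold_utility = flows[-1] + hot_utility
    return hot_utility, cold_utility

# Illustrative four-interval example: targets are (2.5, 1.0).
print(heat_cascade([+2.0, -4.5, +1.5, -0.5]))
```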

  15. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    Science.gov (United States)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly showed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments and bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions, i.e. relations between process intensities and the extent of losses gathered from the analysis of historic hazard events and object-specific restoration values. This approach is not a physics-based, integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Our work therefore aims at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are carried out with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  16. Computation of 2-D pinhole image-formation process of large-scale furnaces using the discrete ordinates method

    CERN Document Server

    Li Hong; Lu Ji Dong; Zheng Chu Guan

    2003-01-01

    In most of the discrete ordinate schemes (DOS) reported in the literature, the discrete directions are fixed, and unable to be arbitrarily adjusted; therefore, it is difficult to employ these schemes to calculate the radiative energy image-formation of pulverized-coal furnaces. On the basis of a new DOS, named the discrete ordinate scheme with (an) infinitely small weight(s), which was recently proposed by the authors, a novel algorithm for computing the pinhole image-formation process is developed in this work. The performance of this algorithm is tested, and is found to be also suitable for parallel computation.

  17. Computation of 2-D pinhole image-formation process of large-scale furnaces using the discrete ordinates method

    International Nuclear Information System (INIS)

    Li Hongshun; Zhou Huaichun; Lu Jidong; Zheng Chuguang

    2003-01-01

    In most of the discrete ordinate schemes (DOS) reported in the literature, the discrete directions are fixed, and unable to be arbitrarily adjusted; therefore, it is difficult to employ these schemes to calculate the radiative energy image-formation of pulverized-coal furnaces. On the basis of a new DOS, named the discrete ordinate scheme with (an) infinitely small weight(s), which was recently proposed by the authors, a novel algorithm for computing the pinhole image-formation process is developed in this work. The performance of this algorithm is tested, and is found to be also suitable for parallel computation

  18. Evaluation of landfill gas production and emissions in a MSW large-scale Experimental Cell in Brazil.

    Science.gov (United States)

    Maciel, Felipe Jucá; Jucá, José Fernando Thomé

    2011-05-01

    Landfill gas (LFG) emissions from municipal solid waste (MSW) landfills are an important environmental concern in Brazil due to the existence of several uncontrolled disposal sites. A program of laboratory and field tests was conducted to investigate gas generation in, and emission from, an Experimental Cell with a 36,659-ton capacity in Recife/PE, Brazil. This investigation involved waste characterisation, gas production and emission monitoring, and geotechnical and biological evaluations, and was performed using three types of final cover layers. The results obtained in this study showed that waste decomposes 4-5 times faster in a tropical wet climate than predicted by traditional first-order models using default parameters. This must be taken into account in the technical and economic assessment of projects developed in tropical-climate countries. The design of the final cover layer and its geotechnical and biological behaviour proved to play an important role in minimising gas emissions to the atmosphere. The capillary and methanotrophic final cover layers presented lower CH4 flux rates than the conventional layer. Copyright © 2011 Elsevier Ltd. All rights reserved.
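
    A minimal sketch of the first-order decay model referred to above (LandGEM-style; the parameter values are illustrative assumptions, not the paper's fitted ones), showing how a larger decay constant k, as suggested for tropical wet climates, shifts gas generation earlier:

        import numpy as np

        # First-order LFG generation: rate ~ k * L0 * M * exp(-k t).
        # k and L0 below are illustrative assumptions, not the paper's values.
        def lfg_rate(t_years, mass_tons, k, l0=100.0):
            """CH4 generation rate (m3/yr) from waste of age t (years)."""
            return k * l0 * mass_tons * np.exp(-k * t_years)

        t = np.linspace(0.0, 20.0, 5)
        print(lfg_rate(t, mass_tons=36659, k=0.05))   # temperate default
        print(lfg_rate(t, mass_tons=36659, k=0.22))   # ~4-5x faster decay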

  19. Large-scale analyses of synonymous substitution rates can be sensitive to assumptions about the process of mutation.

    Science.gov (United States)

    Aris-Brosou, Stéphane; Bielawski, Joseph P

    2006-08-15

    A popular approach to examine the roles of mutation and selection in the evolution of genomes has been to consider the relationship between codon bias and synonymous rates of molecular evolution. A significant relationship between these two quantities is taken to indicate the action of weak selection on substitutions among synonymous codons. The neutral theory predicts that the rate of evolution is inversely related to the level of functional constraint. Therefore, selection against the use of non-preferred codons among those coding for the same amino acid should result in lower rates of synonymous substitution as compared with sites not subject to such selection pressures. However, reliably measuring the extent of such a relationship is problematic, as estimates of synonymous rates are sensitive to our assumptions about the process of molecular evolution. Previous studies showed the importance of accounting for unequal codon frequencies, in particular when synonymous codon usage is highly biased. Yet, unequal codon frequencies can be modeled in different ways, making different assumptions about the mutation process. Here we conduct a simulation study to evaluate two different ways of modeling uneven codon frequencies and show that both model parameterizations can have a dramatic impact on rate estimates and affect biological conclusions about genome evolution. We reanalyze three large data sets to demonstrate the relevance of our results to empirical data analysis.
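
    To make the modeling choice concrete, the sketch below contrasts two standard parameterizations of unequal codon frequencies, F1X4 (one nucleotide distribution shared by all codon positions) and F3X4 (one distribution per position). The nucleotide frequencies are invented, and this is not the authors' simulation code.

        from itertools import product

        # Invented nucleotide frequencies for illustration:
        pi_global = {'T': 0.20, 'C': 0.30, 'A': 0.25, 'G': 0.25}   # F1X4
        pi_pos2 = {'T': 0.15, 'C': 0.35, 'A': 0.30, 'G': 0.20}     # F3X4, pos 2
        pi_pos3 = {'T': 0.25, 'C': 0.25, 'A': 0.20, 'G': 0.30}     # F3X4, pos 3

        stops = {'TAA', 'TAG', 'TGA'}
        codons = [''.join(c) for c in product('TCAG', repeat=3)
                  if ''.join(c) not in stops]

        def codon_freqs(pis):
            """Codon frequencies as the product of per-position nucleotide
            frequencies, renormalized over the 61 sense codons."""
            raw = {c: pis[0][c[0]] * pis[1][c[1]] * pis[2][c[2]] for c in codons}
            z = sum(raw.values())
            return {c: f / z for c, f in raw.items()}

        f1x4 = codon_freqs([pi_global] * 3)
        f3x4 = codon_freqs([pi_global, pi_pos2, pi_pos3])
        print(f1x4['CTG'], f3x4['CTG'])   # same codon, different assumed frequency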

  20. Beyond single syllables: large-scale modeling of reading aloud with the Connectionist Dual Process (CDP++) model.

    Science.gov (United States)

    Perry, Conrad; Ziegler, Johannes C; Zorzi, Marco

    2010-09-01

    Most words in English have more than one syllable, yet the most influential computational models of reading aloud are restricted to processing monosyllabic words. Here, we present CDP++, a new version of the Connectionist Dual Process model (Perry, Ziegler, & Zorzi, 2007). CDP++ is able to simulate the reading aloud of mono- and disyllabic words and nonwords, and learns to assign stress in exactly the same way as it learns to associate graphemes with phonemes. CDP++ is able to simulate the monosyllabic benchmark effects its predecessor could, and therefore shows full backwards compatibility. CDP++ also accounts for a number of novel effects specific to disyllabic words, including the effects of stress regularity and syllable number. In terms of database performance, CDP++ accounts for over 49% of the reaction time variance on items selected from the English Lexicon Project, a very large database of several thousand words. With its lexicon of over 32,000 words, CDP++ is therefore a notable example of the successful scaling-up of a connectionist model to a size that more realistically approximates the human lexical system. Copyright © 2010 Elsevier Inc. All rights reserved.

  1. Advanced Heat Transfer Studies in Superfluid Helium for Large-scale High-yield Production of Superconducting Radio Frequency Cavities

    CERN Document Server

    Peters, Benedikt J; Schirm, Karl-Martin; Koettig, Torsten

    Oscillating Superleak Transducers (OSTs) can be used to localize quenches in superconducting radio frequency cavities. In the present work, the thermal effects occurring during such events are investigated both theoretically and experimentally. The theoretical part covers the entire heat transfer process from heat generation to detection. The experimental part focuses on the effects in superfluid helium. Previous publications reported the detection of OST signals propagating faster than the second-sound velocity. This fast propagation could be verified in dedicated small-scale experiments, in which resistors were used to simulate the quench spots under controlled conditions. The three-dimensional propagation of second sound was linked to OST signals for the first time, which improves the understanding of the OST signal and makes it possible to gather information about the heating pulse. Additionally, OSTs were used as a tool for quench localisation on a real-size cavity. Their sensitivity as well as the time resol...
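
    A hedged sketch of the quench-localization idea: given arrival times at several OSTs and an assumed second-sound velocity, the quench position can be recovered by least squares. The sensor coordinates, the velocity value and the quench location are all invented for illustration.

        import numpy as np
        from scipy.optimize import least_squares

        v2 = 20.0                               # m/s second-sound speed (assumed)
        osts = np.array([[0.0, 0.0, 0.0],       # sensor positions (m), invented
                         [0.5, 0.0, 0.0],
                         [0.0, 0.5, 0.0],
                         [0.0, 0.0, 0.5]])
        quench = np.array([0.12, 0.20, 0.05])   # "true" spot used to fake data
        t_arrival = np.linalg.norm(osts - quench, axis=1) / v2

        def residuals(p):
            # Mismatch between predicted and measured arrival times.
            return np.linalg.norm(osts - p, axis=1) / v2 - t_arrival

        fit = least_squares(residuals, x0=np.zeros(3))
        print(fit.x)   # recovers approximately (0.12, 0.20, 0.05)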

  2. FENIX experimental results of large-scale CICC made of bronze-processed Nb3Sn strands

    International Nuclear Information System (INIS)

    Shen, S.S.; Felker, B.; Moller, J.M.; Parker, J.M.; Isono, T.; Yasukawa, Y.; Hosono, F.; Nishi, M.

    1994-01-01

    The Fusion ENgineering International eXperiments (FENIX) Test Facility has recently completed the testing of a pair of Nb3Sn cable-in-conduit conductors developed by the Japan Atomic Energy Research Institute. These conductors, made of bronze-processed strands, were designed to operate stably with a 40-kA transport current at a magnetic field of 13 T. In addition to measurements of major design parameters such as the current-sharing temperature, FENIX provided several experiments specifically designed to deliver results urgently needed by magnet designers. These include measurements of the ramp-rate limit, current distribution, stability, and joint performance. This paper presents the design and results of these special experiments.

  3. MacroBac: New Technologies for Robust and Efficient Large-Scale Production of Recombinant Multiprotein Complexes.

    Science.gov (United States)

    Gradia, Scott D; Ishida, Justin P; Tsai, Miaw-Sheue; Jeans, Chris; Tainer, John A; Fuss, Jill O

    2017-01-01

    Recombinant expression of large, multiprotein complexes is essential and often rate limiting for determining structural, biophysical, and biochemical properties of DNA repair, replication, transcription, and other key cellular processes. Baculovirus-infected insect cell expression systems are especially well suited for producing large, human proteins recombinantly, and multigene baculovirus systems have facilitated studies of multiprotein complexes. In this chapter, we describe a multigene baculovirus system called MacroBac that uses a Biobricks-type assembly method based on restriction and ligation (Series 11) or ligation-independent cloning (Series 438). MacroBac cloning and assembly is efficient and equally well suited for either single subcloning reactions or high-throughput cloning using 96-well plates and liquid handling robotics. MacroBac vectors are polypromoter with each gene flanked by a strong polyhedrin promoter and an SV40 poly(A) termination signal that minimize gene order expression level effects seen in many polycistronic assemblies. Large assemblies are robustly achievable, and we have successfully assembled as many as 10 genes into a single MacroBac vector. Importantly, we have observed significant increases in expression levels and quality of large, multiprotein complexes using a single, multigene, polypromoter virus rather than coinfection with multiple, single-gene viruses. Given the importance of characterizing functional complexes, we believe that MacroBac provides a critical enabling technology that may change the way that structural, biophysical, and biochemical research is done. © 2017 Elsevier Inc. All rights reserved.

  4. Large-scale evaluation of β -decay rates of r -process nuclei with the inclusion of first-forbidden transitions

    Science.gov (United States)

    Marketin, T.; Huther, L.; Martínez-Pinedo, G.

    2016-02-01

    Background: r-process nucleosynthesis models rely, by necessity, on nuclear structure models for input. Particularly important are β-decay half-lives of neutron-rich nuclei. At present only a single systematic calculation exists that provides values for all relevant nuclei, making it difficult to test the sensitivity of nucleosynthesis models to this input. Additionally, even though there are indications that their contribution may be significant, the impact of first-forbidden transitions on decay rates has not been systematically studied within a consistent model. Purpose: Our goal is to provide a table of β-decay half-lives and β-delayed neutron emission probabilities, including first-forbidden transitions, calculated within a fully self-consistent microscopic theoretical framework. The results are used in an r-process nucleosynthesis calculation to assess the sensitivity of heavy element nucleosynthesis to weak interaction reaction rates. Method: We use a fully self-consistent covariant density functional theory (CDFT) framework. The ground state of all nuclei is calculated with the relativistic Hartree-Bogoliubov (RHB) model, and excited states are obtained within the proton-neutron relativistic quasiparticle random phase approximation (pn-RQRPA). Results: The β-decay half-lives, β-delayed neutron emission probabilities, and the average number of emitted neutrons have been calculated for 5409 nuclei in the neutron-rich region of the nuclear chart. We observe a significant contribution of the first-forbidden transitions to the total decay rate in nuclei far from the valley of stability. The experimental half-lives are in general well reproduced for even-even, odd-A, and odd-odd nuclei, in particular for short-lived nuclei. The resulting data table is included with the article as Supplemental Material. Conclusions: In certain regions of the nuclear chart, first-forbidden transitions constitute a large fraction of the total decay rate and must be
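
    A toy illustration of why first-forbidden transitions matter for half-lives: partial decay rates add, so any FF contribution shortens the total half-life. The rates below are invented numbers, not values from the calculated table.

        import math

        # Partial decay rates add; a first-forbidden (FF) contribution on top
        # of the Gamow-Teller (GT) rate therefore shortens the half-life.
        def half_life(rate_gt, rate_ff=0.0):
            return math.log(2.0) / (rate_gt + rate_ff)

        print(half_life(rate_gt=2.0))                # GT only
        print(half_life(rate_gt=2.0, rate_ff=1.0))   # GT + FF: shorter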

  5. Modeling temporal and large-scale spatial variability of soil respiration from soil water availability, temperature and vegetation productivity indices

    Science.gov (United States)

    Reichstein, Markus; Rey, Ana; Freibauer, Annette; Tenhunen, John; Valentini, Riccardo; Banza, Joao; Casals, Pere; Cheng, Yufu; Grünzweig, Jose M.; Irvine, James; Joffre, Richard; Law, Beverly E.; Loustau, Denis; Miglietta, Franco; Oechel, Walter; Ourcival, Jean-Marc; Pereira, Joao S.; Peressotti, Alessandro; Ponti, Francesca; Qi, Ye; Rambal, Serge; Rayment, Mark; Romanya, Joan; Rossi, Federica; Tedeschi, Vanessa; Tirone, Giampiero; Xu, Ming; Yakir, Dan

    2003-12-01

    Field-chamber measurements of soil respiration from 17 different forest and shrubland sites in Europe and North America were summarized and analyzed with the goal to develop a model describing seasonal, interannual and spatial variability of soil respiration as affected by water availability, temperature, and site properties. The analysis was performed at a daily and at a monthly time step. With the daily time step, the relative soil water content in the upper soil layer expressed as a fraction of field capacity was a good predictor of soil respiration at all sites. Among the site variables tested, those related to site productivity (e.g., leaf area index) correlated significantly with soil respiration, while carbon pool variables like standing biomass or the litter and soil carbon stocks did not show a clear relationship with soil respiration. Furthermore, it was shown that the effect of precipitation on soil respiration stretched beyond its direct effect via soil moisture. A general statistical nonlinear regression model was developed to describe soil respiration as dependent on soil temperature, soil water content, and site-specific maximum leaf area index. The model explained nearly two thirds of the temporal and intersite variability of soil respiration with a mean absolute error of 0.82 μmol m-2 s-1. The parameterized model exhibits the following principal properties: (1) At a relative amount of upper-layer soil water of 16% of field capacity, half-maximal soil respiration rates are reached. (2) The apparent temperature sensitivity of soil respiration measured as Q10 varies between 1 and 5 depending on soil temperature and water content. (3) Soil respiration under reference moisture and temperature conditions is linearly related to maximum site leaf area index. At a monthly timescale, we employed the approach of Raich et al. [2002] that used monthly precipitation and air temperature to globally predict soil respiration (T&P model). While this model was able to
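
    An illustrative reconstruction (not the authors' fitted equation) combining the three stated properties: a Lloyd-Taylor-type temperature response, a water response that is half-maximal at 16% of field capacity, and a linear dependence on maximum site LAI. The scaling constant and temperature parameters are assumptions.

        import numpy as np

        # Hypothetical model combining the abstract's three stated properties;
        # the scaling constant a and the Lloyd-Taylor parameters e0, t0 are
        # assumptions, not the authors' fitted values.
        def soil_respiration(t_soil_c, rsw, lai_max, a=0.8, e0=308.56, t0=227.13):
            """Soil respiration (umol m-2 s-1) from soil temperature (deg C),
            relative soil water content (fraction of field capacity) and
            maximum site leaf area index."""
            f_t = np.exp(e0 * (1.0 / (283.15 - t0)
                               - 1.0 / (t_soil_c + 273.15 - t0)))
            g_w = rsw / (rsw + 0.16)        # half-maximal at rsw = 0.16
            return a * lai_max * f_t * g_w  # linear in maximum site LAI

        print(soil_respiration(t_soil_c=15.0, rsw=0.30, lai_max=4.0))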

  6. Modelling temporal and large-scale spatial variability of soil respiration from soil water availability, temperature and vegetation productivity indices

    Science.gov (United States)

    Reichstein, M.; Rey, A.; Freibauer, A.; Tenhunen, J.; Valentini, R.; Soil Respiration Synthesis Team

    2003-04-01

    Field-chamber measurements of soil respiration from 17 different forest and shrubland sites in Europe and North America were summarized and analyzed with the goal to develop a model describing seasonal, inter-annual and spatial variability of soil respiration as affected by water availability, temperature and site properties. The analysis was performed at a daily and at a monthly time step. With the daily time step, the relative soil water content in the upper soil layer expressed as a fraction of field capacity was a good predictor of soil respiration at all sites. Among the site variables tested, those related to site productivity (e.g. leaf area index) correlated significantly with soil respiration, while carbon pool variables like standing biomass or the litter and soil carbon stocks did not show a clear relationship with soil respiration. Furthermore, it was evidenced that the effect of precipitation on soil respiration stretched beyond its direct effect via soil moisture. A general statistical non-linear regression model was developed to describe soil respiration as dependent on soil temperature, soil water content and site-specific maximum leaf area index. The model explained nearly two thirds of the temporal and inter-site variability of soil respiration with a mean absolute error of 0.82 µmol m-2 s-1. The parameterised model exhibits the following principal properties: 1) At a relative amount of upper-layer soil water of 16% of field capacity half-maximal soil respiration rates are reached. 2) The apparent temperature sensitivity of soil respiration measured as Q10 varies between 1 and 5 depending on soil temperature and water content. 3) Soil respiration under reference moisture and temperature conditions is linearly related to maximum site leaf area index. At a monthly time-scale we employed the approach by Raich et al. (2002, Global Change Biol. 8, 800-812) that used monthly precipitation and air temperature to globally predict soil respiration (T

  7. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend in large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  8. Large-scale Lurgi plant would be uneconomic: study group

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-21

    The Gas Council and the National Coal Board agreed that the building of a large-scale Lurgi plant on the basis of the study is not at present acceptable on economic grounds. The committee considered that new processes based on naphtha offered more economic sources of base and peak load production. Tables listing data provided in the contractors' design studies and a summary of the contractors' process designs are included.

  9. Improved Large-Scale Inundation Modelling by 1D-2D Coupling and Consideration of Hydrologic and Hydrodynamic Processes - a Case Study in the Amazon

    Science.gov (United States)

    Hoch, J. M.; Bierkens, M. F.; Van Beek, R.; Winsemius, H.; Haag, A.

    2015-12-01

    Understanding the dynamics of fluvial floods is paramount to accurate flood hazard and risk modeling. Currently, economic losses due to flooding constitute about one third of all damage resulting from natural hazards. Given future projections of climate change, the anticipated increase in the world's population and the associated implications, sound knowledge of flood hazard and related risk is crucial. Fluvial floods are cross-border phenomena that need to be addressed accordingly. Yet only a few studies model floods at the large scale, which is preferable to tiling the output of small-scale models. Most models cannot realistically simulate flood wave propagation due to a lack of either detailed channel and floodplain geometry or the absence of hydrologic processes. This study aims to develop a large-scale modeling tool that accounts for both hydrologic and hydrodynamic processes, to find and understand possible sources of errors and improvements and to assess how the added hydrodynamics affect flood wave propagation. Flood wave propagation is simulated by DELFT3D-FM (FM), a hydrodynamic model using a flexible mesh to schematize the study area. It is coupled to PCR-GLOBWB (PCR), a macro-scale hydrological model, which has its own simpler 1D routing scheme (DynRout) that has already been used for global inundation modeling and flood risk assessments (GLOFRIS; Winsemius et al., 2013). A number of model set-ups are compared and benchmarked for the simulation period 1986-1996: (0) PCR with DynRout; (1) using a FM 2D flexible mesh forced with PCR output and (2) as in (1) but discriminating between 1D channels and 2D floodplains, and, for comparison, (3) and (4) the same set-ups as (1) and (2) but forced with observed GRDC discharge values. Outputs are subsequently validated against observed GRDC data at Óbidos and flood extent maps from the Dartmouth Flood Observatory. The present research constitutes a first step towards a globally applicable approach to fully couple

  10. Karhunen-Loève (PCA) based detection of multiple oscillations in multiple measurement signals from large-scale process plants

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Wickerhauser, M.V.

    2007-01-01

    In the perspective of optimizing the control and operation of large-scale process plants, it is important to detect and locate oscillations in the plants. This paper presents a scheme for detecting and localizing multiple oscillations in multiple measurements from such a large-scale power plant. The scheme is based on a Karhunen-Loève analysis of the data from the plant. The proposed scheme is subsequently tested on two sets of data: a set of synthetic data and a set of data from a coal-fired power plant. In both cases the scheme detects the beginning of the oscillation within only a few samples. In addition, the oscillation localization has shown its potential by localizing the oscillations in both data sets.
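
    A minimal sketch in the spirit of the scheme (not its exact implementation): project multichannel data onto the leading Karhunen-Loève/principal component and inspect its spectrum for a dominant oscillation; the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.arange(4096) / 100.0                       # 100 Hz sampling
        x = 0.5 * rng.normal(size=(8, t.size))            # 8 noisy channels
        x[2:5] += np.sin(2.0 * np.pi * 1.3 * t)           # shared 1.3 Hz oscillation

        x -= x.mean(axis=1, keepdims=True)                # centre each channel
        _, _, vt = np.linalg.svd(x, full_matrices=False)  # Karhunen-Loeve basis
        pc1 = vt[0]                                       # leading temporal mode

        spec = np.abs(np.fft.rfft(pc1))**2                # spectrum of PC1
        freqs = np.fft.rfftfreq(pc1.size, d=0.01)
        print(f"dominant frequency in PC1: {freqs[spec.argmax()]:.2f} Hz")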

  11. Enhanced saturated fatty acids accumulation in cultures of newly-isolated strains of Schizochytrium sp. and Thraustochytriidae sp. for large-scale biodiesel production.

    Science.gov (United States)

    Wang, Qiuzhen; Sen, Biswarup; Liu, Xianhua; He, Yaodong; Xie, Yunxuan; Wang, Guangyi

    2018-08-01

    Heterotrophic marine protists (thraustochytrids) have received increasing global attention as a renewable, sustainable and alternative source of biodiesel because of their high capacity for saturated fatty acid (SFA) accumulation. Yet the influence of extrinsic factors (nutrients and environmental conditions) on thraustochytrid cultures and the optimal conditions for high SFA production are poorly described. In the present study, two different thraustochytrid strains, Schizochytrium sp. PKU#Mn4 and Thraustochytriidae sp. PKU#Mn16, were studied for their growth and SFA production profiles under various conditions (carbon, nitrogen, temperature, pH, KH2PO4, salinity, and agitation speed). Of the culture conditions, substrate (C and N) source and concentration, temperature, and agitation speed significantly influenced the cell growth and SFA production of both strains. Although both strains were capable of growth and SFA production over a broad range of culture conditions, their physiological responses to KH2PO4, pH, and salinity were dissimilar. Under their optimal batch culture conditions, peak SFA productions of 3.3 g/L and 2.2 g/L with 62% and 49% SFA contents (relative to total fatty acids) were achieved, respectively. The results of 5-L fed-batch fermentation under optimal conditions showed a nearly 4.5-fold increase in SFA production (i.e., 7.5 g/L) by both strains compared to unoptimized conditions. Of the two strains, the quality of the biodiesel produced from the fatty acids of PKU#Mn4 met the biodiesel standard defined by ASTM6751. This study, to the knowledge of the authors, is the first comprehensive report of optimal fermentation conditions demonstrating enhanced SFA production by strains belonging to two different thraustochytrid genera, and it provides the basis for large-scale biodiesel production. Copyright © 2018. Published by Elsevier B.V.

  12. Toxic Combustion Product Yields as a Function of Equivalence Ratio and Flame Retardants in Under-Ventilated Fires: Bench-Large-Scale Comparisons

    Directory of Open Access Journals (Sweden)

    David A. Purser

    2016-09-01

    In large-scale compartment fires, combustion product yields vary with combustion conditions, mainly in relation to the fuel:air equivalence ratio (Φ) and the effects of gas-phase flame retardants. Yields of products of inefficient combustion, including the major toxic products CO, HCN and organic irritants, increase considerably as combustion changes from well-ventilated (Φ < 1) to under-ventilated (Φ = 1–3). It is therefore essential that bench-scale toxicity tests reproduce this behaviour across the Φ range. Yield data from repeat compartment fire tests for any specific fuel show some variation on either side of a best-fit curve for CO yield as a function of Φ. In order to quantify the extent to which data from the steady state tube furnace (SSTF) [1] (ISO TS19700 [2]) represent compartment fire yields, the range and average deviations of SSTF CO yield data from the compartment fire best-fit curve were compared to those for direct compartment fire measurements for six different polymeric fuels with textile and non-textile applications, and for generic post-flashover fire CO yield data. The average yields, range and standard deviations of the SSTF data around the best-fit compartment fire curves were found to be close to those for the compartment fire data. It is concluded that SSTF data are as good a predictor of compartment fire yields as repeat compartment fire test data.
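
    For reference, the equivalence ratio that classifies the combustion regime can be computed as follows; the flow rates and stoichiometric ratio are illustrative values, not data from the paper.

        # phi = (fuel/air)_actual / (fuel/air)_stoichiometric
        def equivalence_ratio(fuel_rate, air_rate, stoich_fuel_air):
            return (fuel_rate / air_rate) / stoich_fuel_air

        # Invented example: fuel with a stoichiometric fuel:air mass ratio of 1:15.
        phi = equivalence_ratio(fuel_rate=2.0, air_rate=15.0,
                                stoich_fuel_air=1.0 / 15.0)
        regime = "under-ventilated" if phi > 1.0 else "well-ventilated"
        print(f"phi = {phi:.2f} -> {regime}")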

  13. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ('the Task') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large-scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large-scale solar purchasing amongst potential large-scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large-scale active solar heating purchasing activity within the UK. (author)

  14. Scale-up and large-scale production of Tetraselmis sp. CTP4 (Chlorophyta) for CO2 mitigation: from an agar plate to 100-m3 industrial photobioreactors.

    Science.gov (United States)

    Pereira, Hugo; Páramo, Jaime; Silva, Joana; Marques, Ana; Barros, Ana; Maurício, Dinis; Santos, Tamára; Schulze, Peter; Barros, Raúl; Gouveia, Luísa; Barreira, Luísa; Varela, João

    2018-03-23

    Industrial production of novel microalgal isolates is key to improving the current portfolio of available strains that are able to grow in large-scale production systems for different biotechnological applications, including carbon mitigation. In this context, Tetraselmis sp. CTP4 was successfully scaled up from an agar plate to 35- and 100-m3 industrial-scale tubular photobioreactors (PBR). Growth was performed semi-continuously for 60 days in the autumn-winter season (17th October - 14th December). Optimisation of tubular PBR operations showed that improved productivities were obtained at a culture velocity of 0.65-1.35 m s-1 and a pH set-point for CO2 injection of 8.0. The highest volumetric (0.08 ± 0.01 g L-1 d-1) and areal (20.3 ± 3.2 g m-2 d-1) biomass productivities were attained in the 100-m3 PBR compared to those of the 35-m3 PBR (0.05 ± 0.02 g L-1 d-1 and 13.5 ± 4.3 g m-2 d-1, respectively). Lipid contents were similar in both PBRs (9-10% of ash-free dry weight). CO2 sequestration was followed in the 100-m3 PBR, revealing a mean CO2 mitigation efficiency of 65% and a biomass-to-carbon ratio of 1.80. Tetraselmis sp. CTP4 is thus a robust candidate for industrial-scale production, with promising biomass productivities and photosynthetic efficiencies up to 3.5% of total solar irradiance.
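
    A back-of-envelope sketch of the reported quantities: volumetric productivity from a hypothetical daily harvest, and CO2 fixed per unit biomass from the stated biomass-to-carbon ratio of 1.80 (the 44/12 factor converts carbon mass to CO2 mass).

        # CO2 fixed per gram of biomass, from a biomass:carbon ratio of 1.80;
        # the daily harvest figure below is hypothetical.
        def co2_fixed_per_g(biomass_to_carbon=1.80):
            return (1.0 / biomass_to_carbon) * (44.0 / 12.0)

        volume_l, harvest_g_per_day = 100_000, 8_000   # 100-m3 PBR, invented harvest
        print(harvest_g_per_day / volume_l, "g L-1 d-1 volumetric productivity")
        print(co2_fixed_per_g(), "g CO2 fixed per g biomass")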

  15. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolated mode from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A red line in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  16. Sound to language: different cortical processing for first and second languages in elementary school children as revealed by a large-scale study using fNIRS.

    Science.gov (United States)

    Sugiura, Lisa; Ojima, Shiro; Matsuba-Kurita, Hiroko; Dan, Ippeita; Tsuzuki, Daisuke; Katura, Takusige; Hagiwara, Hiroko

    2011-10-01

    A large-scale study of 484 elementary school children (6-10 years) performing word repetition tasks in their native language (L1-Japanese) and a second language (L2-English) was conducted using functional near-infrared spectroscopy. Three factors presumably associated with cortical activation, language (L1/L2), word frequency (high/low), and hemisphere (left/right), were investigated. L1 words elicited significantly greater brain activation than L2 words, regardless of semantic knowledge, particularly in the superior/middle temporal and inferior parietal regions (angular/supramarginal gyri). The greater L1-elicited activation in these regions suggests that they are phonological loci, reflecting processes tuned to the phonology of the native language, while phonologically unfamiliar L2 words were processed like nonword auditory stimuli. The activation was bilateral in the auditory and superior/middle temporal regions. Hemispheric asymmetry was observed in the inferior frontal region (right dominant), and in the inferior parietal region with interactions: low-frequency words elicited more right-hemispheric activation (particularly in the supramarginal gyrus), while high-frequency words elicited more left-hemispheric activation (particularly in the angular gyrus). The present results reveal the strong involvement of a bilateral language network in children's brains depending more on right-hemispheric processing while acquiring unfamiliar/low-frequency words. A right-to-left shift in laterality should occur in the inferior parietal region, as lexical knowledge increases irrespective of language.

  17. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting have an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background guarantees that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither has the political system's competence to make decisions, nor can it judge successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  18. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  19. Gravitation on large scales

    Science.gov (United States)

    Giraud, E.

    found to be E(Γ, r) = (Γ²/G) r. 8) A quantized model is deduced from a Schrödinger-type equation −D² d²Ψ(r)/dr² = [E − GM/r] Ψ(r), where D² is the product of the energy Γ M^(1/2) and the square of the radius r at which GM/r = Γ_f M^(1/2). The boundary conditions are given by Ψ(0) = 0 and the effective potential 9) The data are in agreement with the hypothesis of quantization, but that hypothesis is not proved because, the mass-to-light ratio being a 'free' variable, it is always possible to shift a Γ-curve out of its best 'energy level'. However, if one moves a Γ-fit from an 'energy level' to the next, the fitting of the curve becomes clearly poorer. 10) The Newtonian mass-to-light ratios of Class I galaxies range from ~7 to ~75. The mass-to-light ratios of the same objects deduced from the Γ-dynamics are reduced to ~1.1. The Γ-dynamics is sensitive to the integrated mass through the term Γ M^(1/2), and to the mass and density through the Newtonian term GM/r. This kind of coupling is particularly efficient in galaxies like NGC 1560, whose rotation curve shows conspicuous structure.

  20. Numerical and experimental simulation of accident processes using KMS large-scale test facility under the program of training university students for nuclear power industry

    International Nuclear Information System (INIS)

    Aniskevich, Yu.N.

    2005-01-01

    The KMS large-scale test facility is being constructed at the NITI site and is designed to model accident processes in VVER reactor plants and to provide experimental data for the safety analysis of both existing and future NPPs. KMS phase I is at the completion stage: a containment model of 2000 m3 volume intended for experimentally simulating the heat and mass transfer of steam-gas mixtures and aerosols inside the containment. KMS phase II will incorporate a reactor model (1:27 scale) and be used for analysing a number of events, including primary and secondary LOCA. The KMS program for background training of university students in the nuclear field will include: participation in the development and application of experiment procedures; preparation and carrying out of experiments; analysis of experiment data; pretest and post-test calculations with different computer codes; on-the-job training as operators of experiment scenarios; and training of specialists in measurement and information acquisition technologies. (author)

  1. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  2. The impact of a large-scale quality improvement programme on work engagement: preliminary results from a national cross-sectional-survey of the 'Productive Ward'.

    Science.gov (United States)

    White, Mark; Wells, John S G; Butterworth, Tony

    2014-12-01

    Quality improvement (QI) programmes, like the Productive Ward: Releasing-time-to-care initiative, aim to 'engage' and 'empower' ward teams to actively participate, innovate and lead quality improvement at the front line. However, little is known about the relationship and impact that QI work has on the 'engagement' of the clinical teams who participate, and vice versa. This paper explores and examines the impact of a large-scale QI programme, the Productive Ward, on the 'work engagement' of the nurses and ward teams involved. Using the Utrecht Work Engagement Scale (UWES), we surveyed, measured and analysed work engagement in a representative test group of hospital-based ward teams who had recently commenced the latest phase of the national 'Productive Ward' initiative in Ireland, and compared them to a control group of similar size, matched (as far as possible) on variables such as ward size, employment grade and clinical specialty area. 338 individual datasets were recorded, n=180 (53.6%) from the Productive Ward group and n=158 (46.4%) from the control group; the overall response rate was 67% and did not differ significantly between the Productive Ward and control groups. The work engagement mean score (±standard deviation) was 4.33 (±0.88) in the Productive group and 4.07 (±1.06) in the control group, representing a modest but statistically significant between-group difference (p=0.013, independent samples t-test). Similarly modest differences were observed in all three dimensions of the work engagement construct. Employment grade and clinical specialty area were also significantly related to the work engagement score. The findings suggest that the initiative positively affects the work engagement (the vigour, absorption and dedication) of ward-based teams. The use and suitability of the UWES as an appropriate measure of 'engagement' in QI interventions was confirmed. The engagement of nurses and front-line clinical teams is a major component of creating, developing and sustaining a culture of improvement.

  3. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  4. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of a scale of 1000s of processors, used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  5. Large scale expression changes of genes related to neuronal signaling and developmental processes found in lateral septum of postpartum outbred mice.

    Directory of Open Access Journals (Sweden)

    Brian E Eisinger

    Coordinated gene expression changes across the CNS are required to produce the mammalian maternal phenotype. The lateral septum (LS) is a brain region critically involved with aspects of maternal care, and we recently examined gene expression of the whole septum (LS and medial septum) in selectively bred maternal mice. Here, we expand on the prior study by 1) conducting microarray analysis solely on LS in virgin and postpartum mice, 2) using outbred mice, and 3) evaluating the role of sensory input on gene expression changes. Large-scale changes in genes related to neuronal signaling were identified, including four GABAA receptor subunits. Subunits α4 and δ were downregulated in maternal LS, likely reflecting a reduction in the extrasynaptic, neurosteroid-sensitive α4/δ-containing receptor subtype. Conversely, subunits ε and θ were increased in maternal LS. Fifteen K+ channel related genes showed altered expression, as did dopamine receptors Drd1a and Drd2 (both downregulated), hypocretin receptor 1 (Hcrtr1), kappa opioid receptor 1 (Oprk1), and transient receptor potential channel 4 (Trpc4). Expression of a large number of genes linked to developmental processes or cell differentiation was also altered in postpartum LS, including chemokine (C-X-C motif) ligand 12 (Cxcl12), fatty acid binding protein 7 (Fabp7), plasma membrane proteolipid (Pllp), and suppressor of cytokine signaling 2 (Socs2). Additional genes linked to anxiety, such as glutathione reductase (Gsr), exhibited altered expression. Pathway analysis also identified changes in genes related to cyclic nucleotide metabolism, chromatin structure, and the Ras gene family. The sensory presence of pups was found to contribute to the altered expression of a subset of genes across all categories. This study suggests that both large changes in neuronal signaling and the possible terminal differentiation of neuronal and/or glial cells play important roles in producing the maternal state.

  6. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h-1 Mpc in the distribution of the visible matter of the universe is provided. We discuss the possibility of generating a periodic distribution with the characteristic scale 120 h-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  7. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    Science.gov (United States)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    multi-core programming techniques. Currently, Cloud Computing environments make available large collections of computing resources and storage that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at a very large scale in reduced time. This also allows us to deal with the problems connected with the use of the S1 P-SBAS chain in operational contexts, related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result, we performed a large spatial-scale SBAS analysis of Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms and thus finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.

  8. Use of large-scale multi-configuration EMI measurements to characterize heterogeneous subsurface structures and their impact on crop productivity

    Science.gov (United States)

    Brogi, Cosimo; Huisman, Johan Alexander; Kaufmann, Manuela Sarah; von Hebel, Christian; van der Kruk, Jan; Vereecken, Harry

    2017-04-01

    Soil subsurface structures can play a key role in crop performance, especially during water stress periods. Geophysical techniques like electromagnetic induction (EMI) have been shown to be capable of providing information about dominant shallow subsurface features. However, previous work with EMI has typically not reached beyond the field scale. The objective of this study is to use large-scale multi-configuration EMI to characterize patterns of soil structural organization (layering and texture) and the associated impact on crop vegetation at the km2 scale. For this, we carried out an intensive measurement campaign and collected high-spatial-resolution multi-configuration EMI data on an agricultural area of approx. 1 km2 (102 ha) near Selhausen (North Rhine-Westphalia, Germany) with a maximum depth of investigation of around 2.5 m. We measured with two EMI instruments simultaneously, using a total of nine coil configurations. The instruments were placed inside polyethylene sleds that were pulled by an all-terrain vehicle along parallel lines with a spacing of 2 to 2.5 m. The driving speed was between 5 and 7 km h-1 and we used a 0.2 Hz sampling frequency to obtain an in-line resolution of approximately 0.3 m. The survey area consists of almost 50 fields managed in different ways. The EMI measurements were collected between April and December 2016, within a few days after the harvest of each field. After data acquisition, the EMI data were automatically filtered, temperature-corrected, and interpolated onto a common grid. The resulting EMI maps allowed us to identify three main areas with different subsurface heterogeneities. The differences between these areas are likely related to the late Quaternary geological history (Pleistocene and Holocene) of the area, which resulted in spatially variable soil texture and layering with a strong impact on spatio-temporal soil water content variability. The high-resolution surveys also allowed us to identify small scale

  9. Studies on improvement of tomato productivity in a large-scale greenhouse: Prediction of tomato yield based on integrated solar radiation

    International Nuclear Information System (INIS)

    Hisaeda, K.; Nishina, H.

    2007-01-01

    As there are currently many large-scale production facilities that have contracts with large retailing companies, accurate prediction of yield is necessary. The present study developed a method to predict tomato yield accurately using data on the outside solar radiation. The study was conducted in a Venlo-type greenhouse (29,568 m2) at Sera Farm Co., Ltd. in Sera-cho, Hiroshima prefecture. The cultivar used for this experiment was plum tomato. Sowing took place on July 18, planting on August 30, and harvesting started on October 9, 2002. The planting density was 2.5 plants m-2. Analysis of the correlation between the weekly tomato yield and the integrated solar radiation for the period from October 7 to July 28 (43 weeks) showed the highest correlation (r = 0.518) between the weekly tomato yield and the solar radiation integrated from seven to one weeks before harvest. Further investigation by the same correlation analysis was conducted for the 25-week period from December 8 to May 26, during which the effects of growing stage and air temperature were considered to be relatively small. The results showed the highest correlation (r = 0.730) between the weekly tomato yield and the solar radiation integrated from eight to one weeks before harvest. The tomato yield occasionally needed to be adjusted at Sera Farm. Consequently, the correlation between the three-week moving average of tomato yield and the integrated solar radiation was calculated, and the highest correlation was again obtained for the period from eight to one weeks before harvest (r = 0.860). This study therefore showed that it is possible to predict the tomato yield (y: kg m-2 week-1) from the solar radiation integrated from eight to one weeks before harvest (x: MJ m-2) using the equation y = 7.50 × 10-6 x + 0.148 (r2 = 0.740).
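
    The fitted relation from the abstract, wrapped as a function for convenience (the example input value is arbitrary):

        # Fitted relation from the abstract: weekly yield (kg m-2 week-1) from
        # solar radiation integrated over weeks eight to one before harvest.
        def predicted_yield(x_mj_m2):
            return 7.50e-6 * x_mj_m2 + 0.148   # r^2 = 0.740

        print(predicted_yield(2000.0))   # arbitrary input: ~0.163 kg m-2 week-1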

  10. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  11. Image-processing of time-averaged interface distributions representing CCFL characteristics in a large scale model of a PWR hot-leg pipe geometry

    International Nuclear Information System (INIS)

    Al Issa, Suleiman; Macián-Juan, Rafael

    2017-01-01

    Highlights: • CCFL characteristics are investigated in a PWR large-scale hot-leg pipe geometry. • Image processing of the air-water interface produced time-averaged interface distributions. • Time-averages provide a comparative method for CCFL characteristics among different studies. • CCFL correlations depend upon the range of investigated water delivery for Dh ≫ 50 mm. • 1D codes are incapable of investigating CCFL because of the lack of interface distribution. - Abstract: Countercurrent Flow Limitation (CCFL) was experimentally investigated in the 1/3.9 downscaled COLLIDER facility with a 190 mm pipe diameter, using air/water at 1 atmospheric pressure. Previous investigations provided knowledge of the onset of CCFL mechanisms. In the current article, CCFL characteristics at the COLLIDER facility are measured and discussed along with time-averaged distributions of the air/water interface for a selected matrix of liquid/gas velocities. The article demonstrates the time-averaged interface as a useful method to identify CCFL characteristics at quasi-stationary flow conditions, eliminating variations that appear in single images and showing essential comparative flow features such as the degree of restriction at the bend, the extension and intensity of the two-phase mixing zones, and the average water level within the horizontal part and the steam generator. Consequently, it makes it possible to compare interface distributions obtained in different investigations. The distributions are also beneficial for CFD validation of CCFL, as the instantaneous chaotic gas/liquid interface is impossible to reproduce in CFD simulations. The current study shows that the final CCFL characteristics curve (and the corresponding CCFL correlation) depends upon the covered measuring range of water delivery. It also shows that the hydraulic diameter should be sufficiently larger than 50 mm in order to obtain CCFL characteristics comparable to the 1:1 scale data (namely the UPTF data). Finally
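
    A minimal sketch of the time-averaging step itself (the image data here are random placeholders, not COLLIDER recordings): stacking binarized interface images and averaging over frames yields a water-presence probability map of the kind used for the comparisons above.

        import numpy as np

        # Average a stack of binarized interface images (True = water pixel);
        # the frames here are random placeholders standing in for video data.
        rng = np.random.default_rng(2)
        frames = rng.uniform(size=(500, 240, 320)) < 0.3
        time_avg = frames.mean(axis=0)          # water-presence probability map
        print(time_avg.shape, float(time_avg.min()), float(time_avg.max()))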

  12. Evaluation of the social and economical impacts related to the large scale production of bioethanol in Brazil; Avaliacao dos impactos socioeconomicos relacionados a producao em larga escala do bioetanol no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-10-15

    The social and economic impacts related to large-scale bioethanol production were evaluated considering the replacement of 10% of the world's gasoline-equivalent consumption forecast for the year 2025, assessing the impacts not only in the sectors directly involved (bioethanol and sugar cane production) but also taking into account the effects across the entire production chain of the economy (direct, indirect and induced effects). For this analysis, an income-product model was developed that allows production gains during the agricultural phase to be simulated, different technologies for bioethanol production to be combined, and the impacts of advancing second-generation technology to be quantified.

  13. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography.

  14. Solar heating of air used for the drying at medium and large scale, of forestry, fishery, agriculture, cattle and industrial products

    International Nuclear Information System (INIS)

    Gutierrez, F.

    1991-01-01

    The drying and/or preservation of grains is improved by preheating the air. In many cases it is enough to raise the temperature only a few degrees (10 to 15 °C) in order to increase its capacity to absorb moisture, which can be done with very simple solar collectors. Massive use of solar energy for drying products by means of hot air at higher temperatures would require very expensive equipment. For this reason, the use of low-temperature heaters, which also cost less, is recommended. (Author)

  15. The utilization of gum tragacanth to improve the growth of Rhodotorula aurantiaca and the production of gamma-decalactone in large scale.

    Science.gov (United States)

    Alchihab, Mohamed; Destain, Jacqueline; Aguedo, Mario; Wathelet, Jean-Paul; Thonart, Philippe

    2010-09-01

    The production of gamma-decalactone and 4-hydroxydecanoic acid by the psychrophilic yeast R. aurantiaca was studied. The effect of both compounds on the growth of R. aurantiaca was also investigated, and our results show that gamma-decalactone must be one of the limiting factors for its own production. The addition of gum tragacanth to the medium at concentrations of 3 and 4 g/l seems to be an adequate strategy to enhance gamma-decalactone production and to reduce its toxicity towards the cells. The production of gamma-decalactone and 4-hydroxydecanoic acid was significantly higher in the 20-l bioreactor than in the 100-l bioreactor. Using 20 g/l of castor oil, 6.5 and 4.5 g/l of gamma-decalactone were extracted after acidification at pH 2.0 and distillation at 100 °C for 45 min in the 20- and 100-l bioreactors, respectively. We propose an industrial-scale process using a psychrophilic yeast to produce natural gamma-decalactone from castor oil, which also acts as a detoxifying agent; moreover, the process was improved by adding a natural gum.

  16. The synthesis of alternatives for the bioconversion of waste-monoethanolamine from large-scale CO₂-removal processes

    Energy Technology Data Exchange (ETDEWEB)

    Ohtaguchi, Kazuhisa; Yokoyama, Takahisa [Tokyo Inst. of Tech. (Japan). Dept. of Chemical Engineering

    1998-12-31

    Alternatives for the bioconversion of monoethanolamine (MEA), which would appear in large quantities in the industrial effluent of the CO₂-removal processes of power companies, have been proposed by investigating the ability of several microorganisms to deaminate MEA. An evaluation of the biotechnology, which includes the production from MEA of acetic acid and acetaldehyde with Escherichia coli and of formic and acetic acids with Clostridium formicoaceticum, confirms and extends our earlier remarks on the availability of ecotechnology for solving the above problem. (Author)

  17. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule; this mother tube has high dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a mother-tube manufacturing process using large-scale hollow capsules is promising. (author)

  18. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre-scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R&D.

  19. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. The cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of galaxy distributions are described, and images of cell structures obtained after computer reprocessing are given. Three hypotheses - vortical, entropic, and adiabatic - suggesting different processes for the origin of galaxies and galaxy clusters are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a means of directly studying the processes taking place in the Universe: the large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of turbulence at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters and to the interactions within galaxy clusters and with the intergalactic medium is recognized as a notable contribution to the development of theoretical and observational cosmology.

  20. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  1. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO₂ laser micro-structured 8 × 8 aperture partition arrays with average aperture diameters of 301 ± 5 μm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 × 24 and hexagonal 24 × 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays...

  2. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored the attendance of thirteen graduate students from universities in the United States and abroad.

  3. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. This model for number- and spin-projected two-quasiparticle excitations with realistic forces yields results in sd-shell nuclei that are as good as those of the 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus ⁴⁶Ti and for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean-field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to ¹³⁰Ce and ¹²⁸Ba using the same effective nucleon-nucleon interaction. (Auth.)

  4. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  5. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61 × 0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61 × 0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  6. Electro-spray deposition of a mesoporous TiO2 charge collection layer: toward large scale and continuous production of high efficiency perovskite solar cells.

    Science.gov (United States)

    Kim, Min-cheol; Kim, Byeong Jo; Yoon, Jungjin; Lee, Jin-wook; Suh, Dongchul; Park, Nam-gyu; Choi, Mansoo; Jung, Hyun Suk

    2015-12-28

    The spin-coating method, which is widely used for thin film device fabrication, is incapable of large-area deposition or of being performed continuously. In perovskite hybrid solar cells using CH₃NH₃PbI₃ (MAPbI₃), large-area deposition is essential for their potential use in mass production. As a first step towards replacing all the spin-coating processes in the fabrication of perovskite solar cells, a mesoporous TiO₂ electron-collection layer is here fabricated using an electro-spray deposition (ESD) system. Moreover, impedance spectroscopy and transient photocurrent and photovoltage measurements reveal that the electro-sprayed mesoscopic TiO₂ film facilitates charge collection from the perovskite. The series resistance of the perovskite solar cell is also reduced owing to the highly porous nature of, and the low density of point defects in, the film. An optimized power conversion efficiency of 15.11% is achieved under an illumination of 1 sun; this efficiency is higher than that (13.67%) of a perovskite solar cell with a conventional spin-coated TiO₂ film. Furthermore, the large-area coating capability of the ESD process is verified through the coating of uniform 10 × 10 cm² TiO₂ films. This study clearly shows that ESD therefore constitutes a viable alternative for the fabrication of high-throughput, large-area perovskite solar cells.

  7. A remote-control datalogger for large-scale resistivity surveys and robust processing of its signals using a software lock-in approach

    Science.gov (United States)

    Oppermann, Frank; Günther, Thomas

    2018-02-01

    We present a new versatile datalogger that can be used for a wide range of possible applications in geosciences. It is adjustable in signal strength and sampling frequency, is battery-saving, and can be controlled remotely over a Global System for Mobile Communication (GSM) connection, which saves running costs, particularly in monitoring experiments. The internet connection allows for checking functionality, controlling schedules and optimizing pre-amplification. We mainly use it for large-scale electrical resistivity tomography (ERT), where it independently registers voltage time series on three channels while a square-wave current is injected. For the analysis of these time series we present a new approach that is based on the lock-in (LI) method, mainly known from electronic circuits. The method searches for the working point (phase) using three different functions based on a mask signal, and determines the amplitude using a direct current (DC) correlation function. We use synthetic data with different types of noise to compare the new method with existing approaches, i.e. selective stacking and a modified fast Fourier transformation (FFT)-based approach that assumes a 1/f noise characteristic. All methods give comparable results, but LI is better than the well-established stacking method. The FFT approach can be even better, but only if the noise strictly follows the assumed characteristic. If overshoots are present in the data, which is typical in the field, FFT performs worse even with good data, which is why we conclude that the new LI approach is the most robust solution. This is also proved by a field data set from a long 2-D ERT profile.
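
    The LI principle can be sketched in a simplified single-function variant (not the authors' three-function implementation): the recorded voltage is correlated with a square-wave mask at the known injection frequency, the phase is scanned for the working point, and the amplitude follows from the DC correlation. Names and test data below are illustrative:

    ```python
    import numpy as np

    def software_lockin(v, fs, f_inj, n_phases=100):
        """Estimate the amplitude of a square-wave modulated signal.
        v: sampled voltage time series; fs: sampling rate (Hz);
        f_inj: injection frequency of the square-wave current (Hz)."""
        t = np.arange(len(v)) / fs
        best_amp, best_phase = 0.0, 0.0
        for phase in np.linspace(0, 2 * np.pi, n_phases, endpoint=False):
            ref = np.sign(np.sin(2 * np.pi * f_inj * t + phase))  # mask signal
            amp = np.mean(v * ref)  # DC correlation with the mask
            if abs(amp) > abs(best_amp):
                best_amp, best_phase = amp, phase  # best working point so far
        return best_amp, best_phase

    # Synthetic test: 0.5 Hz square wave of amplitude 2 V plus noise, sampled at 100 Hz
    fs, f_inj = 100.0, 0.5
    t = np.arange(0, 60, 1 / fs)
    v = 2.0 * np.sign(np.sin(2 * np.pi * f_inj * t + 0.3)) + np.random.randn(t.size)
    print(software_lockin(v, fs, f_inj))  # amplitude close to 2.0
    ```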

  8. Dynamics of large-scale cortical interactions at high gamma frequencies during word production: event related causality (ERC) analysis of human electrocorticography (ECoG).

    Science.gov (United States)

    Korzeniewska, Anna; Franaszczuk, Piotr J; Crainiceanu, Ciprian M; Kuś, Rafał; Crone, Nathan E

    2011-06-15

    Intracranial EEG studies in humans have shown that functional brain activation in a variety of functional-anatomic domains of human cortex is associated with an increase in power at a broad range of high gamma (>60 Hz) frequencies. Although these electrophysiological responses are highly specific for the location and timing of cortical processing and in animal recordings are highly correlated with increased population firing rates, there has been little direct empirical evidence for causal interactions between different recording sites at high gamma frequencies. Such causal interactions are hypothesized to occur during cognitive tasks that activate multiple brain regions. To determine whether such causal interactions occur at high gamma frequencies and to investigate their functional significance, we used event-related causality (ERC) analysis to estimate the dynamics, directionality, and magnitude of event-related causal interactions using subdural electrocorticography (ECoG) recorded during two word production tasks: picture naming and auditory word repetition. A clinical subject who had normal hearing but was skilled in American Sign Language (ASL) provided a unique opportunity to test our hypothesis with reference to a predictable pattern of causal interactions, i.e. that language cortex interacts with different areas of sensorimotor cortex during spoken vs. signed responses. Our ERC analyses confirmed this prediction. During word production with spoken responses, perisylvian language sites had prominent causal interactions with mouth/tongue areas of motor cortex, and when responses were gestured in sign language, the most prominent interactions involved hand and arm areas of motor cortex. Furthermore, we found that the sites from which the most numerous and prominent causal interactions originated, i.e. sites with a pattern of ERC "divergence", were also sites where high gamma power increases were most prominent and where electrocortical stimulation mapping

  9. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  10. In Vitro Large Scale Production of Human Mature Red Blood Cells from Hematopoietic Stem Cells by Coculturing with Human Fetal Liver Stromal Cells

    Directory of Open Access Journals (Sweden)

    Jiafei Xi

    2013-01-01

    Full Text Available In vitro models of human erythropoiesis are useful in studying the mechanisms of erythroid differentiation in normal and pathological conditions. Here we describe an erythroid liquid culture system starting from cord blood derived hematopoietic stem cells (HSCs). HSCs were cultured for more than 50 days in erythroid differentiation conditions and resulted in a more than 10⁹-fold expansion within 50 days under optimal conditions. Homogeneous erythroid cells were characterized by cell morphology, flow cytometry, and hematopoietic colony assays. Furthermore, terminal erythroid maturation was improved by coculturing with human fetal liver stromal cells. Cocultured erythroid cells underwent multiple maturation events, including a decrease in size, an increase in glycophorin A expression, and nuclear condensation. This process resulted in extrusion of the pycnotic nuclei in up to 80% of the cells. Importantly, they possessed the capacity to express the adult definitive β-globin chain upon further maturation. We also show that the oxygen equilibrium curves of the cord blood-differentiated red blood cells (RBCs) are comparable to those of normal RBCs. The large number and purity of erythroid cells and RBCs produced from cord blood make this method useful for fundamental research in erythroid development, and they also provide a basis for future production of available RBCs for transfusion.

  11. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World famous architects such as Herzog De Meuron, Zaha Hadid, Richard Meyer and David Chippenfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and in the black, curved, smooth concrete surfaces of Zaha Hadid's Ordrupgård. Furthermore, one can point to initiatives such as "Synlig beton" (visible concrete), presented on the website www.synligbeton.dk, and spæncom's aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research and development project "Lasting large scale glazed concrete formwork," which I am working on at DTU's Department of Architectural Engineering, will be able to complement these. It is a project where I

  12. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large scale cross hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones, and to obtain the hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous, and this heterogeneity is not smoothed out even over scales of hundreds of metres. Results of the interpretation validate the hypothesis of the major fracture zones A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by specifically testing the high hydraulic conductivity regions. The Piezomac and single hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response, either because there is no hydraulic connection or because a connection exists but no response is seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  13. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities which lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here, modeling tools, including RANS- and LES-based computational fluid dynamics codes, can be effectively exploited in order to investigate the fluid flow phenomena. The latter are well-suited to representing the turbulent motions, but a number of challenges remain in their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  14. Diacetyl and 2,3-pentanedione in breathing zone and area air during large-scale commercial coffee roasting, blending and grinding processes.

    Science.gov (United States)

    McCoy, Michael J; Hoppe Parr, Kimberly A; Anderson, Kim E; Cornish, Jim; Haapala, Matti; Greivell, John

    2017-01-01

    Recently described scientific literature has identified the airborne presence of 2,3-butanedione (diacetyl) and 2,3-pentanedione at concentrations approaching or potentially exceeding the current American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Values (TLVs) at commercial coffee roasting and production facilities. Newly established National Institute for Occupational Safety and Health (NIOSH) Recommended Exposure Limits for diacetyl and 2,3-pentanedione are even more conservative. Chronic exposure to these alpha-diketones at elevated airborne concentrations has been associated with lung damage, specifically bronchiolitis obliterans, most notably in industrial food processing facilities. Workers at a large commercial coffee roaster were monitored for both eight-hour and task-based, short-term, 15-min sample durations for airborne concentrations of these alpha-diketones during specific work processes, including the coffee bean roasting, blending and grinding processes, during two separate 8-h work periods. Additionally, the authors performed real-time Fourier transform infrared spectroscopy (FTIR) analysis of the workers' breathing zone as well as the area workplace air for the presence of organic compounds to determine the sources, as well as to quantitate and identify various organic compounds proximal to the roasting and grinding processes. Real-time FTIR measurements provided both the identification and quantitation of diacetyl and 2,3-pentanedione, as well as other organic compounds generated during coffee bean roasting and grinding operations. Airborne concentrations of diacetyl in the workers' breathing zone, as eight-hour time-weighted averages, were less than the ACGIH TLVs for diacetyl, while concentrations of 2,3-pentanedione were below the limit of detection in all samples. Short-term breathing zone samples revealed airborne concentrations for diacetyl that exceeded the ACGIH short-term exposure limit of 0.02 parts per million (ppm) in

  15. Diacetyl and 2,3-pentanedione in breathing zone and area air during large-scale commercial coffee roasting, blending and grinding processes

    Directory of Open Access Journals (Sweden)

    Michael J. McCoy

    Full Text Available Recently described scientific literature has identified the airborne presence of 2,3-butanedione (diacetyl) and 2,3-pentanedione at concentrations approaching or potentially exceeding the current American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Values (TLVs) at commercial coffee roasting and production facilities. Newly established National Institute for Occupational Safety and Health (NIOSH) Recommended Exposure Limits for diacetyl and 2,3-pentanedione are even more conservative. Chronic exposure to these alpha-diketones at elevated airborne concentrations has been associated with lung damage, specifically bronchiolitis obliterans, most notably in industrial food processing facilities. Workers at a large commercial coffee roaster were monitored for both eight-hour and task-based, short-term, 15-min sample durations for airborne concentrations of these alpha-diketones during specific work processes, including the coffee bean roasting, blending and grinding processes, during two separate 8-h work periods. Additionally, the authors performed real-time Fourier transform infrared spectroscopy (FTIR) analysis of the workers' breathing zone as well as the area workplace air for the presence of organic compounds to determine the sources, as well as to quantitate and identify various organic compounds proximal to the roasting and grinding processes. Real-time FTIR measurements provided both the identification and quantitation of diacetyl and 2,3-pentanedione, as well as other organic compounds generated during coffee bean roasting and grinding operations. Airborne concentrations of diacetyl in the workers' breathing zone, as eight-hour time-weighted averages, were less than the ACGIH TLVs for diacetyl, while concentrations of 2,3-pentanedione were below the limit of detection in all samples. Short-term breathing zone samples revealed airborne concentrations for diacetyl that exceeded the ACGIH short-term exposure limit of 0.02 parts per million (ppm)

  16. Materials specification VGB-R 109 and processing standards. First experiences of a large-scaled power plant for quality control purposes

    Energy Technology Data Exchange (ETDEWEB)

    Bareiss, J.; Nothdurft, R.; Kurtz, M. [EnBW Kraftwerke AG, Stuttgart (Germany); Helmrich, A.; Hartwig, R. [Alstom Power Systems GmbH, Stuttgart (Germany); Bantle, M. [TUEV SUED Industrie Service GmbH, Filderstadt (Germany)

    2009-07-01

    New boilers in Europe shall be manufactured by the Manufacturer, as contractor of the Customer, based on the European Pressure Equipment Directive 97/23/EC (PED), applicable as a legal directive since May 2002. According to the PED, the Manufacturer is the legal person responsible for calculation, design and fabrication at workshop and site, including final inspection and the declaration of conformity, regardless of whether work packages are subcontracted by the Manufacturer or not. Based on the Customer contract, Module G shall be used as the process to prove conformity according to the PED. In principle, the PED specifies fundamental safety requirements with a main focus on materials and fabrication. For 600 °C/620 °C power plants with advanced steam conditions and considerably improved efficiency, new materials are necessary. In selecting the materials, attention has to be paid to the long-term design strength values, the manufacturability of the materials, and their corrosion and oxidation behaviour. In particular, correct fabrication according to the state of the art, closely linked to new findings about material behaviour for semi-finished products as well as for boiler components, has to be ensured. These new materials are largely not covered by the PED or the harmonized standard EN 12952. For that reason, and because the Customer contract, in view of national legal directives, specifies the period of in-service inspection during the operational lifetime of the boiler, additional codes and standards shall be applied for boiler manufacturing. All these requirements shall be specified in the Manufacturer's Quality Engineering Documents. The presentation gives an overview of the fundamentals of the PED and describes the implementation of the requirements for materials, fabrication and inspections in the Quality Engineering Documents, both in the framework of the PED and of the Customer contract. As part of the Design Approval by the NoBo according to PED Module G, the Quality Engineering Documents are fundamental

  17. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  18. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as follows. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  19. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large-scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building users' electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the photovoltaic energy potential and to evaluate the local self-consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available, and only selected buildings have electronic metering equipment. The available roof area of one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter's total electricity consumption, and half of this generated electricity is directly used within the buildings.
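
    The core of the hourly comparison reduces to summing, hour by hour, the part of the PV production that can be consumed directly. A minimal sketch under simplifying assumptions (aligned hourly series, no storage); it stands in for, and is much simpler than, the study's full 3-D simulation chain:

    ```python
    def direct_use_fraction(pv_production, consumption):
        """pv_production, consumption: aligned hourly energy series (kWh).
        Returns the share of PV energy that is consumed directly in the
        building (no storage or export credit considered)."""
        direct = sum(min(p, c) for p, c in zip(pv_production, consumption))
        total_pv = sum(pv_production)
        return direct / total_pv if total_pv else 0.0

    # Hypothetical example covering three daylight hours
    print(direct_use_fraction([0.0, 2.5, 4.0], [1.0, 1.5, 2.0]))  # ~0.54
    ```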

  20. Isolation of monocytes from leukapheretic products for large-scale GMP-grade generation of cytomegalovirus-specific T-cell lines by means of an automated elutriation device.

    Science.gov (United States)

    Perseghin, Paolo; D'Amico, Giovanna; Dander, Erica; Gaipa, Giuseppe; Dassi, Maria; Biagi, Ettore; Biondi, Andrea

    2008-08-01

    Dendritic cells (DCs) act as antigen-presenting cells in immune response-mediated mechanisms against malignant cells and/or viral or fungal pathogens. CD14+ monocytes have so far been isolated by plastic adherence techniques or by immunomagnetic methods. Here, the effectiveness of a commercially available cell separation system (Elutra, Gambro BCT) in the separation of monocytes and the large-scale production of cytomegalovirus (CMV)-specific T-cell lines was investigated. Six mononuclear cell (MNC) collections were processed with the Elutra system. The monocyte-enriched fraction was differentiated into DCs by addition of granulocyte-macrophage colony-stimulating factor and interleukin (IL)-4. After 6 days of culture, DCs were matured in the presence of interferon (IFN)-gamma, IFN-alpha, IL-1beta, tumor necrosis factor-alpha, and poly(I:C) and pulsed with a pool of 48 MHC Class I- and II-binding CMV peptides. Lymphocytes were then stimulated with mature autologous CMV peptide-pulsed DCs. After elutriation, the mean monocyte yield was 0.89 × 10⁹ ± 0.65 × 10⁹, with 51.0 ± 31.6 percent recovery and 51.1 ± 35.4 percent purity. A significant correlation was observed between the basal monocyte content and the postelutriation recovery (p < 0.0116). More than 60 percent of plated monocytes differentiated into DCs, which, after pulsing with CMV peptides, were able to stimulate a robust enrichment in CMV antigen-specific T cells in all tested samples (mean percentage of pentamer-positive CD8+ cells, 35%, compared to the initial 2%). Our findings might be helpful for appropriate MNC collection, to maximize the efficiency of the elutriation system and subsequently obtain an optimal monocyte-enriched yield for further DC generation and T-cell stimulation.

  1. Large-scale management of common reed, Phragmites australis, for paper production: A case study from the Liaohe Delta, China

    DEFF Research Database (Denmark)

    Brix, Hans; Ye, Siyuan; Laws, Edward A.

    2014-01-01

    The largest Phragmites reed field in the world, with a historical area of approximately 1000 km², is located in the Liaohe Delta in northeastern China. The Phragmites wetlands are extensively managed to maximize the production of reed biomass for the paper industry. Based on satellite remote sens...

  2. Impact of large-scale organic conversion on food production and food security in two Indian states, Tamil Nadu and Madhya Pradesh

    DEFF Research Database (Denmark)

    Panneerselvam, P.; Hermansen, John Erik; Halberg, Niels

    2015-01-01

    farmers in Tamil Nadu and Madhya Pradesh, and on the total food production in these states. This study also considered a situation where fertilizer subsidies would be discontinued, with farmers having to carry the full cost of fertilizer. Results show that conversion to organic improved the economic...

  3. The impact of a large-scale quality improvement programme on work engagement: Preliminary results from a national cross-sectional-survey of the 'Productive Ward'

    LENUS (Irish Health Repository)

    White, Mark

    2014-05-14

    Quality improvement (QI) programmes, like the Productive Ward: Releasing-time-to-care initiative, aim to 'engage' and 'empower' ward teams to actively participate, innovate and lead quality improvement at the front line. However, little is known about the relationship and impact that QI work has on the 'engagement' of the clinical teams who participate, and vice versa.

  4. Modelling and assessment of algae cultivation for large scale biofuel production – sustainability and aspects of up-scaling of algae biorefineries

    NARCIS (Netherlands)

    Hingsamer, Maria; Jungmeier, Gerfried; Kleinegris, Dorinde; Barbosa, Maria

    2016-01-01

    Microalgae are currently considered to be highly attractive as a raw material for production of bioenergy and biomaterials in the future BioEconomy. However, a number of successful developments are still necessary before algae can reach commercial applications. These include the development of

  5. Large-scale feasibility of organic acids as a permanent preharvest intervention in drinking water of broilers and their effect on foodborne Campylobacter spp. before processing.

    Science.gov (United States)

    Jansen, W; Reich, F; Klein, G

    2014-06-01

    The aim was to evaluate the effect of a commercially available organic acid water additive on Campylobacter spp. in conventional broiler production. The organic acid water additive was added to the drinking water from chick housing to catching in three consecutive rearing cycles. The broiler performance data were evaluated, and the load of thermophilic Campylobacter spp. was analysed in water, feed and the environment, as well as determined in caecum content and on carcasses at the abattoir according to ISO 10272:1.2-2002. The results indicated that permanent application of acidified drinking water did not have detrimental effects on production parameters or animal welfare. The quantitative results obtained at slaughter were ambiguous, but suggested a reduced carriage of Campylobacter spp. by the flock and in caecum content. Such reduction did not result in lower Campylobacter carriage of the carcasses after slaughter. Organic acids in the drinking water of broilers can partly reduce the caecal Campylobacter spp. load, but this did not reduce carcass contamination. Broiler meat is a major source of foodborne campylobacteriosis, and public health would benefit considerably from controlling Campylobacter in the food chain. The addition of organic acid to the drinking water of broilers can potentially lower the caecal carriage in primary production. However, in this field trial, a commercial product failed to have an impact on the bacterial load after slaughter. © 2014 The Society for Applied Microbiology.

  6. Improving Prediction Accuracy of a Rate-Based Model of an MEA-Based Carbon Capture Process for Large-Scale Commercial Deployment

    Directory of Open Access Journals (Sweden)

    Xiaobo Luo

    2017-04-01

    Full Text Available Carbon capture and storage (CCS) technology will play a critical role in reducing anthropogenic carbon dioxide (CO2) emission from fossil-fired power plants and other energy-intensive processes. However, the increase in energy cost caused by equipping a carbon capture process is the main barrier to its commercial deployment. To reduce the capital and operating costs of carbon capture, great efforts have been made to achieve optimal design and operation through process modeling, simulation, and optimization. Accurate models form an essential foundation for this purpose. This paper presents a study on developing a more accurate rate-based model in Aspen Plus® for the monoethanolamine (MEA)-based carbon capture process by multistage model validation. The modeling framework for this process was established first. The steady-state process model was then developed and validated at three stages, which included a thermodynamic model, physical property calculations, and a process model at the pilot plant scale, covering a wide range of pressures, temperatures, and CO2 loadings. The calculation correlations for liquid density and interfacial area were updated by coding Fortran subroutines in Aspen Plus®. The validation results show that the correlation combination for the thermodynamic model used in this study has higher accuracy than those of three other key publications, and the predictions of the process model agree well with the pilot plant experimental data. A case study was carried out for carbon capture from a 250 MWe combined cycle gas turbine (CCGT) power plant. Shorter packing height and lower specific duty were achieved using this accurate model.
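
    The multistage validation described above ultimately amounts to comparing model predictions with pilot-plant measurements over a range of conditions. A minimal sketch of such an agreement check, with purely illustrative numbers; the Aspen Plus® model itself is not scripted here:

    ```python
    def mean_absolute_relative_error(predicted, measured):
        """Average of |pred - meas| / |meas| over paired validation points."""
        pairs = list(zip(predicted, measured))
        return sum(abs(p - m) / abs(m) for p, m in pairs) / len(pairs)

    # Hypothetical CO2 capture levels (%): pilot-plant data vs. model output
    measured = [85.2, 88.1, 90.4, 76.5]
    predicted = [84.0, 89.0, 91.2, 78.0]
    print(f"MARE = {mean_absolute_relative_error(predicted, measured):.2%}")
    ```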

  7. Large-scale purification of ⁹⁰Sr from nuclear waste materials for production of ⁹⁰Y, a therapeutic medical radioisotope

    Energy Technology Data Exchange (ETDEWEB)

    Wester, D.W.; Steele, R.T.; Rinehart, D.E.; DesChane, J.R.; Carson, K.J.; Rapko, B.M.; Tenforde, T.S. E-mail: tenforde@ncrp.com

    2003-07-01

    A major limitation on the supply of the short-lived medical isotope ⁹⁰Y (t₁/₂ = 64 h) is the available quantity of highly purified ⁹⁰Sr generator material. A radiochemical production campaign was therefore undertaken to purify 1500 Ci of ⁹⁰Sr that had been isolated from fission waste materials. A series of alkaline precipitation steps removed all detectable traces of ¹³⁷Cs, alpha emitters, and uranium and transuranic elements. Technical obstacles, such as the buildup of gas pressure generated upon mixing large quantities of acid with solid ⁹⁰Sr carbonate, were overcome through safety features incorporated into the custom-built equipment used for ⁹⁰Sr purification. Methods are described for analyzing the chemical and radiochemical purity of the final product and for accurately determining by gravimetry the quantities of ⁹⁰Sr immobilized on stainless steel filters for future use.

  8. Large-scale biodiesel production using flue gas from coal-fired power plants with Nannochloropsis microalgal biomass in open raceway ponds.

    Science.gov (United States)

    Zhu, Baohua; Sun, Faqiang; Yang, Miao; Lu, Lin; Yang, Guanpin; Pan, Kehou

    2014-12-01

    The potential use of microalgal biomass as a biofuel source has raised broad interest. Highly effective and economically feasible biomass generating techniques are essential to realize such potential. Flue gas from coal-fired power plants may serve as an inexpensive carbon source for microalgal culture, and it may also facilitate improvement of the environment once the gas is fixed in biomass. In this study, three strains of the genus Nannochloropsis (4-38, KA2 and 75B1) survived this type of culture and bloomed using flue gas from coal-fired power plants in 8000-L open raceway ponds. Lower temperatures and solar irradiation reduced the biomass yield and lipid productivities of these strains. Strain 4-38 performed better than the other two as it contained higher amounts of triacylglycerols and fatty acids, which are used for biodiesel production. Further optimization of the application of flue gas to microalgal culture should be undertaken. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Anaerobic digestion of crude glycerol from biodiesel manufacturing using a large-scale pilot plant: methane production and application of digested sludge as fertilizer.

    Science.gov (United States)

    Baba, Yasunori; Tada, Chika; Watanabe, Ryoya; Fukuda, Yasuhiro; Chida, Nobuyoshi; Nakai, Yutaka

    2013-07-01

    This report is the first to consider the methane production energy balance from crude glycerol at a practical rather than a laboratory scale. Crude glycerol was added to the plant progressively at between 5 and 75 L glycerol/30 m³-day for 1.5 years, and the energy balance was positive at a loading rate of 30 L glycerol/30 m³-day (1 ml/L-day). At this loading rate over one year, an energy output equivalent to 106% of the energy input was achieved. The surplus energy was equivalent to transport over 1200 km, so the proper feedstock-transportation distance was within a 12.5-km radius of the biogas plant. In addition, the digested sludge contained fertilizer components (T-N: 0.11%, P₂O₅: 0.036%, K₂O: 0.19%) that increased grass yield by 1.2 times when applied to grass fields. Thus, crude glycerol is an attractive bioresource that can be used both as a feedstock for methane production and as a liquid fertilizer. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Large-Scale Total Water Storage and Water Flux Changes over the Arid and Semiarid Parts of the Middle East from GRACE and Reanalysis Products

    Science.gov (United States)

    Forootan, E.; Safari, A.; Mostafaie, A.; Schumacher, M.; Delavar, M.; Awange, J. L.

    2017-05-01

    Previous studies indicate that water storage over a large part of the Middle East has decreased over the last decade. Variability in the total (hydrological) water flux (TWF, i.e., precipitation minus evapotranspiration minus runoff) and in the water storage changes of the Tigris-Euphrates river basin and Iran's six major basins (Khazar, Persian, Urmia, Markazi, Hamun, and Sarakhs) over 2003-2013 is assessed in this study. Our investigation is based on TWF estimates computed as temporal derivatives of terrestrial water storage (TWS) changes from the Gravity Recovery and Climate Experiment (GRACE) products and those from the reanalysis products of ERA-Interim and MERRA-Land. An inversion approach is applied to consistently estimate the spatio-temporal changes of the soil moisture and groundwater storage compartments of the seven basins during the study period from GRACE TWS, altimetry, and land surface model products. The influence of TWF trends on the separated water storage compartments is then explored. Our results, estimated as basin averages, indicate negative trends in the maxima of TWF peaks that reach up to -5.2 and -2.6 mm/month/year over 2003-2013 for the Urmia and Tigris-Euphrates basins, respectively, most likely due to the reported meteorological drought. Maximum amplitudes of the soil moisture compartment exhibit negative trends of -11.1, -6.6, -6.1, -4.8, -4.7, -3.8, and -1.2 mm/year for the Urmia, Tigris-Euphrates, Khazar, Persian, Markazi, Sarakhs, and Hamun basins, respectively. Strong groundwater storage decreases are found within the Khazar (-8.6 mm/year) and Sarakhs (-7.0 mm/year) basins. The magnitude of the water storage decline in the Urmia and Tigris-Euphrates basins is found to be bigger than the decrease in the monthly accumulated TWF, indicating a contribution of human water use, as well as surface and groundwater flow, to the storage decline over the study area.
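
    The TWF estimate used above is the temporal derivative of the TWS series; with monthly GRACE sampling, a finite difference is the natural discrete form. A minimal sketch, assuming an evenly spaced monthly series and ignoring the study's filtering and inversion steps; data are placeholders:

    ```python
    import numpy as np

    def twf_from_tws(tws_mm, dt_months=1.0):
        """Total water flux (P - ET - R) approximated as d(TWS)/dt.
        tws_mm: monthly terrestrial water storage anomalies (mm).
        Returns central-difference fluxes in mm/month (one-sided at the ends)."""
        return np.gradient(np.asarray(tws_mm, dtype=float), dt_months)

    # Placeholder basin-average TWS anomalies (mm) for six consecutive months
    tws = [12.0, 18.5, 22.0, 15.0, 6.0, -2.5]
    print(twf_from_tws(tws))
    ```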

  11. Water quality and quantity in the context of large-scale cellulosic biofuel production in the Mississippi-Atchafalaya River Basin

    Science.gov (United States)

    VanLoocke, A.; Bernacchi, C. J.; Twine, T. E.; Kucharik, C. J.

    2012-12-01

    Numerous socio-economic and environmental pressures have driven the need to increase domestic renewable energy production in the Midwest. The primary attempt at addressing this need has been to use maize; however, the leaching of residual nitrate from maize fertilizer into runoff drives the formation of the Gulf of Mexico hypoxic or "Dead" zone, which can have significant environmental impacts on marine ecosystems. As a result of the threat to benthic organisms and fisheries in this region, the Mississippi Basin/Gulf of Mexico Task Force has set goals to reduce the size of the hypoxic zone from the current size of ~20,000 km2 to less than 5,000 km2; dissolved inorganic nitrogen (DIN) export would have to decrease by 30 to 55% to meet this goal. An alternative option to meet renewable energy needs while reducing the environmental impacts associated with DIN export is to produce high-yielding, low-fertilizer-input perennial grasses such as switchgrass and miscanthus. Miscanthus and switchgrass have been shown to greatly reduce nitrate leaching at the plot scale, even during the establishment phase. This reduction in leaching is attributed to their perennial nature and the efficient recycling of nutrients via nutrient translocation. While these feedstocks achieve higher productivity than maize grain with fewer inputs, they require more water, presenting the potential for impacts on the regional hydrologic cycle, including reductions in streamflow. The goal of this research is to determine the change in streamflow in the Mississippi-Atchafalaya River Basin (MARB) and the export of nitrogen from fertilizer to the Gulf of Mexico. To address this goal, we adapted a vegetation model capable of simulating the biogeochemistry of current crops as well as miscanthus and switchgrass, the Integrated Biosphere Simulator - agricultural version (Agro-IBIS), and coupled it with a hydrology model capable of simulating streamflow and nitrogen export, the Terrestrial Hydrology Model with Biogeochemistry (THMB).

  12. Fabrication of High Strength Lightweight Metals for Armor and Structural Applications: Large Scale Equal Channel Angular Extrusion Processing of Aluminum 5083 Alloy

    Science.gov (United States)

    2017-06-01

    Only fragments of the report abstract are available. They describe issues in large-scale equal channel angular extrusion (ECAE) processing of aluminum 5083 alloy, including the need to maintain the tooling at isothermal conditions and to reduce the time delay between successive extrusion passes, and note an auxiliary hydraulic power control unit on the press for ejecting the billet.

  13. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience in conducting these measures and making internationalization a cost-efficient and useful activity. Furthermore, such undertakings permanently have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to support universities in international recruitment. These include, e.g., institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads in a project.

  14. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, as well as Traffic Management Centers (TMC). The TMC has probe-vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
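
    The vehicles-as-processes design can be illustrated with a toy sketch; the class and method names below are invented for illustration and are not taken from the ANL prototype (Python):

        # Each vehicle is an autonomous agent with its own behaviour model that
        # re-plans its route when the TMC broadcasts an advisory.
        class Vehicle:
            def __init__(self, vid, route):
                self.vid = vid
                self.route = route                  # link ids still to traverse

            def on_advisory(self, congested_link, alternative):
                # Behaviour model: independently swap out a congested link.
                if congested_link in self.route:
                    self.route[self.route.index(congested_link)] = alternative

            def step(self):
                return f"vehicle {self.vid} traverses {self.route.pop(0)}"

        class TrafficManagementCenter:
            def __init__(self, probes):
                self.probes = probes                # probe-vehicle tracking

            def broadcast(self, congested_link, alternative):
                for v in self.probes:               # 2-way interaction: advisories out
                    v.on_advisory(congested_link, alternative)

        fleet = [Vehicle(i, ["A", "B", "C"]) for i in range(3)]
        tmc = TrafficManagementCenter(fleet)
        tmc.broadcast("B", "B_alt")
        for v in fleet:
            print(v.step(), "remaining:", v.route)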

  15. Two case studies on the interaction of large-scale transport, mesoscale photochemistry, and boundary-layer processes on the lower tropospheric ozone dynamics in early spring

    Energy Technology Data Exchange (ETDEWEB)

    Broennimann, S.; Siegrist, F.C.; Eugster, W.; Cattin, R.; Sidle, C.; Wanner, H. [Inst. of Geography, Univ. of Bern (Switzerland); Hirschberg, M.M. [Lehrstuhl fuer Bioklimatologie und Immissionsforschung, TU Muenchen, Freising-Weihenstephan (Germany); Schneiter, D. [MeteoSwiss, Station Aerologique, Payerne (Switzerland); Perego, S. [IBM Switzerland, Zuerich (Switzerland)

    2001-04-01

    The vertical distribution of ozone in the lower troposphere over the Swiss Plateau is investigated in detail for two episodes in early spring (February 1998 and March 1999). Profile measurements of boundary-layer ozone performed during two field campaigns with a tethered balloon sounding system and a kite are investigated using regular aerological and ozone soundings from a nearby site, measurements from monitoring stations at various altitudes, backward trajectories, and synoptic analyses of meteorological fields. Additionally, the effect of in situ photochemistry was estimated for one of the episodes employing the Metphomod Eulerian photochemical model. Although the meteorological situations were completely different, both cases had elevated layers with high ozone concentrations, which is not untypical for late winter and early spring. In the February episode, the highest ozone concentrations of 55 to 60 ppb, which were found at around 1100 m asl, were partly advected from Southern France, but a considerable contribution of in situ photochemistry is also predicted by the model. Below that elevation, the local chemical sinks and surface deposition probably overcompensated for chemical production, and the vertical ozone distribution was governed by boundary-layer dynamics. In the March episode, the results suggest that ozone-rich air parcels, probably of stratospheric or upper tropospheric origin, were advected above the boundary layer on the Swiss Plateau. (orig.)

  16. Two case studies on the interaction of large-scale transport, mesoscale photochemistry, and boundary-layer processes on the lower tropospheric ozone dynamics in early spring

    Directory of Open Access Journals (Sweden)

    S. Brönnimann

    2001-04-01

    The vertical distribution of ozone in the lower troposphere over the Swiss Plateau is investigated in detail for two episodes in early spring (February 1998 and March 1999). Profile measurements of boundary-layer ozone performed during two field campaigns with a tethered balloon sounding system and a kite are investigated using regular aerological and ozone soundings from a nearby site, measurements from monitoring stations at various altitudes, backward trajectories, and synoptic analyses of meteorological fields. Additionally, the effect of in situ photochemistry was estimated for one of the episodes employing the Metphomod Eulerian photochemical model. Although the meteorological situations were completely different, both cases had elevated layers with high ozone concentrations, which is not untypical for late winter and early spring. In the February episode, the highest ozone concentrations of 55 to 60 ppb, which were found at around 1100 m asl, were partly advected from Southern France, but a considerable contribution of in situ photochemistry is also predicted by the model. Below that elevation, the local chemical sinks and surface deposition probably overcompensated for chemical production, and the vertical ozone distribution was governed by boundary-layer dynamics. In the March episode, the results suggest that ozone-rich air parcels, probably of stratospheric or upper tropospheric origin, were advected above the boundary layer on the Swiss Plateau.

    Key words. Atmospheric composition and structure (pollution, urban and regional; troposphere, composition and chemistry) – Meteorology and atmospheric dynamics (mesoscale meteorology)

  17. Femtosecond laser ablation of highly oriented pyrolytic graphite: a green route for large-scale production of porous graphene and graphene quantum dots

    Science.gov (United States)

    Russo, Paola; Hu, Anming; Compagnini, Giuseppe; Duley, Walter W.; Zhou, Norman Y.

    2014-01-01

    Porous graphene (PG) and graphene quantum dots (GQDs) are attracting attention due to their potential applications in photovoltaics, catalysis, and bio-related fields. We present a novel way for mass production of these promising materials. The femtosecond laser ablation of highly oriented pyrolytic graphite (HOPG) is employed for their synthesis. Porous graphene (PG) layers were found to float at the water-air interface, while graphene quantum dots (GQDs) were dispersed in the solution. The sheets consist of one to six stacked layers of spongy graphene, which form an irregular 3D porous structure that displays pores with an average size of 15-20 nm. Several characterization techniques have confirmed the porous nature of the collected layers. The analyses of the aqueous solution confirmed the presence of GQDs with dimensions of about 2-5 nm. It is found that the formation of both PG and GQDs depends on the fs-laser ablation energy. At laser fluences less than 12 J cm(-2), no evidence of either PG or GQDs is detected. However, polyynes with six and eight carbon atoms per chain are found in the solution. For laser energies in the 20-30 J cm(-2) range, these polyynes disappeared, while PG and GQDs were found at the water-air interface and in the solution, respectively. The origin of these materials can be explained based on the mechanisms for water breakdown and coal gasification. The absence of PG and GQDs after the laser ablation of HOPG in liquid nitrogen confirms the proposed mechanisms.

  18. Femtosecond laser ablation of highly oriented pyrolytic graphite: a green route for large-scale production of porous graphene and graphene quantum dots.

    Science.gov (United States)

    Russo, Paola; Hu, Anming; Compagnini, Giuseppe; Duley, Walter W; Zhou, Norman Y

    2014-02-21

    Porous graphene (PG) and graphene quantum dots (GQDs) are attracting attention due to their potential applications in photovoltaics, catalysis, and bio-related fields. We present a novel way for mass production of these promising materials. The femtosecond laser ablation of highly oriented pyrolytic graphite (HOPG) is employed for their synthesis. Porous graphene (PG) layers were found to float at the water-air interface, while graphene quantum dots (GQDs) were dispersed in the solution. The sheets consist of one to six stacked layers of spongy graphene, which form an irregular 3D porous structure that displays pores with an average size of 15-20 nm. Several characterization techniques have confirmed the porous nature of the collected layers. The analyses of the aqueous solution confirmed the presence of GQDs with dimensions of about 2-5 nm. It is found that the formation of both PG and GQDs depends on the fs-laser ablation energy. At laser fluences less than 12 J cm(-2), no evidence of either PG or GQDs is detected. However, polyynes with six and eight carbon atoms per chain are found in the solution. For laser energies in the 20-30 J cm(-2) range, these polyynes disappeared, while PG and GQDs were found at the water-air interface and in the solution, respectively. The origin of these materials can be explained based on the mechanisms for water breakdown and coal gasification. The absence of PG and GQDs, after the laser ablation of HOPG in liquid nitrogen, confirms the proposed mechanisms.

  19. Treatments of tilapia (Oreochromis niloticus) using nitric oxide for quality improvement: Establishing a potential method for large-scale processing of farmed fish.

    Science.gov (United States)

    Wang, Zi-Chao; Yan, Yuzhen; Su, Ping; Zhao, Mou-Ming; Xia, Ning; Chen, De-Wei

    2018-07-01

    To find a substitute for present methods of slaughtering tilapia, we demonstrated the influence of nitric oxide (NO) (saturated NO solution) euthanasia before slaughter on the animal welfare and muscle color of tilapia. The results suggested that NO euthanasia significantly improved the animal welfare and muscle color. In addition, the investigation of NO postmortem treatment on the muscle color and color stability of tilapia fillets suggested that NO postmortem treatment not only improved the muscle color and color stability but also prolonged the shelf-life of tilapia fillets during refrigerated storage. To further investigate the effect of NO euthanasia on the quality of tilapia fillets, and to assess the safety of NO treatments (NO euthanasia and NO postmortem treatment) for industrial manufacturing of tilapia and possibly of other fish species, NO euthanasia was adopted in this study following a simulated fish processing line. HbNO and MbNO values were measured to clarify the mechanism and process of NO euthanasia. Blood parameters, muscle pH, rigor index, drip loss and total volatile basic nitrogen (TVB-N) values were measured to evaluate the quality of the fillets obtained from NO-euthanized tilapia. In addition, the nitrate (NO3(-)) levels in the muscles after refrigerated storage were measured to assess the food safety of both NO euthanasia and NO postmortem treatment. Fillets obtained from tilapia euthanized by NO showed a later reduction of muscle pH, a later onset of rigor mortis postmortem and less drip loss during refrigerated storage than controls. NO euthanasia caused less TVB-N than controls and prolonged the shelf life of tilapia fillets. Moreover, the NO3(-) levels in the muscles after both NO euthanasia and NO postmortem treatment remained below the maximum permitted limit after refrigerated storage. Both the NO euthanasia and NO postmortem treatment are suitable for improving the

  1. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive unit, the fixture and the wheels. The control system design covers hardware and software: the hardware is based on a single-chip microcontroller system, and the software implements the photoelectric autocollimator readout and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right angle test procedure.

  2. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in organizing engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  3. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  4. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  5. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large-scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers to LPV projects, but as no real LPV projects have been processed so far, these findings have to be regarded as preliminary. The fast-growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large-scale infrastructure such as noise barriers, and ground-mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km2. In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark, at 33-35 TWh, is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
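
    The ~300 km2 figure is easy to sanity-check. In the sketch below, the assumed horizontal insolation of roughly 1,000 kWh/m2 per year for Denmark is this sketch's own assumption; the 10% overall conversion efficiency and the 33-35 TWh demand are the abstract's figures (Python):

        insolation_kwh_per_m2_yr = 1000.0     # assumed Danish horizontal insolation
        system_efficiency = 0.10              # overall light-to-electricity, per abstract
        demand_kwh = 34e9                     # midpoint of the quoted 33-35 TWh

        yield_kwh_per_m2 = insolation_kwh_per_m2_yr * system_efficiency
        area_km2 = demand_kwh / yield_kwh_per_m2 / 1e6
        print(f"required ground area: {area_km2:.0f} km^2")  # ~340, same order as ~300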

  6. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  7. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

    Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical application: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and is inspired by the wall of the fungus-like oomycetes, which is reproduced by introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive materials (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  8. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long-term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large-scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper describes the progression of the company's automation objectives in the area, the field operator's interaction with the system and the related benefits, and the research and development of the new electronic custody transfer method

  9. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recently, data centers in cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security and performance optimization, which constitute a big task for the network administrator. This research studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and newer Gossip-based...

  10. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

    Almost all the solar collectors installed in Taiwan, China, are used for the production of hot water for homeowners (residential systems), in which the area of solar collectors is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as systems with a solar collector area of over 100 m{sup 2}) were installed. They are used for rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of these large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to obtain feedback from the individual users. It was found that lack of experience in system design and maintenance are the key factors for reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  11. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  12. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  13. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    The power balancing problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
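
    A minimal dual-decomposition sketch of this idea: an aggregator prices the shared balance constraint, and each unit solves a small local problem. The quadratic costs and the demand value below are made up for illustration (Python/NumPy):

        import numpy as np

        costs = np.array([1.0, 2.0, 4.0])   # local cost coefficients c_i * p_i^2
        demand = 10.0                       # power balance target
        price = 0.0                         # dual variable on the balance constraint

        for _ in range(200):
            # Local subproblems: min c_i p_i^2 - price * p_i  ->  p_i = price / (2 c_i)
            p = price / (2.0 * costs)
            # Aggregator: raise the price while demand is unmet (dual ascent).
            price += 0.5 * (demand - p.sum())

        print(np.round(p, 2), round(price, 2))  # cheap units carry more of the load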

  14. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information still lack adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-system risk management best practices. We analyze their risk management process and investigate the tools they use to support decision-making processes within the company. We identify the following challenges in current risk management practices, in line with the literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) the quality of traditional models in such situations is open to debate; (3) the quality of input...

  15. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown on the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced on the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested on the acquired land could ensure food security to the local populations. (letter)

  16. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  17. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of the digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations are supplying the grid and district heating systems. Other plants use only either the electricity or the heat. (au)

  18. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  19. Process and equipment design optimising product properties and attributes

    NARCIS (Netherlands)

    Bongers, P.M.M.; Thullie, J.

    2009-01-01

    Classically, when products have been developed at the bench, process engineers will search for equipment to manufacture the product at large scale. More often than not, this search is constrained to the existing equipment base or a catalog search for standard equipment. It is then not surprising that

  20. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  1. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  2. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  3. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  4. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

    Concerning the research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand, and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other hand. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap, Rheinbraun started a research project supported by the North-Rhine Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project. The latest results of the current project will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  5. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  6. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  7. Large Scale EOF Analysis of Climate Data

    Science.gov (United States)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach to extracting EOFs from 3D climate data. We implement the method in Apache Spark and process multi-TB sized datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2 terabyte-sized data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6 hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare them to the EOFs computed on the surface temperature field alone. Our analyses provide evidence of Kelvin and Rossby waves and components of large-scale modes of oscillation, including the ENSO and PDO, that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time-scale than the ocean, we expect the results to demonstrate an even greater advantage of computing 3D EOFs in lieu of 2D EOFs.
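
    For reference, the core linear algebra of EOF extraction fits in a few lines. The sketch below runs on synthetic single-machine data rather than the distributed Spark implementation described above; the latitude weighting mirrors the preprocessing named in the abstract (Python/NumPy):

        import numpy as np

        rng = np.random.default_rng(0)
        n_t, n_lat, n_lon = 120, 18, 36
        field = rng.standard_normal((n_t, n_lat, n_lon))   # (time, lat, lon)

        # Weight rows by sqrt(cos(latitude)): grid cells shrink toward the poles.
        lats = np.linspace(-85.0, 85.0, n_lat)
        w = np.sqrt(np.cos(np.deg2rad(lats)))[None, :, None]

        X = (field * w).reshape(n_t, -1)
        X -= X.mean(axis=0)                                # remove the time mean

        # EOFs are the right singular vectors; the PCs are the projections.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        eofs = Vt[:100].reshape(-1, n_lat, n_lon)          # first 100 spatial patterns
        explained = s**2 / (s**2).sum()
        print(eofs.shape, explained[:3])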

  8. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ~ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

   9. DECOVALEX III/BENCHPAR PROJECTS. Approaches to Upscaling Thermal-Hydro-Mechanical Processes in a Fractured Rock Mass and its Significance for Large-Scale Repository Performance Assessment. Summary of Findings. Report of BMT2/WP3

    International Nuclear Information System (INIS)

    Andersson, Johan; Staub, Isabelle; Knight, Les

    2005-02-01

    The Benchmark Test 2 of DECOVALEX III and Work Package 3 of BENCHPAR concern the upscaling of Thermal (T), Hydrological (H) and Mechanical (M) processes in a fractured rock mass and its significance for large-scale repository performance assessment. The work is primarily concerned with the extent to which various thermo-hydro-mechanical couplings in a fractured rock mass adjacent to a repository are significant in terms of the solute transport typically calculated in large-scale repository performance assessments. Since the presence of even quite small fractures may control the hydraulic, mechanical and coupled hydromechanical behaviour of the rock mass, a key aim of the work has been to explore the extent to which these can be upscaled and represented by 'equivalent' continuum properties appropriate for PA calculations. From these general aims, the BMT was set up as a numerical study of a large-scale reference problem. Analysing this reference problem should: help explore how different means of simplifying the geometrical detail of a site ('upscaling'), with their implications for model parameters, impact model predictions of relevance to repository performance; explore to what extent the THM coupling needs to be considered in relation to PA measures; and compare the uncertainties in upscaling (both the uncertainty in how to upscale and the uncertainty that arises from the upscaling process) and in the consideration of THM couplings with the inherent uncertainty and spatial variability of the site-specific data. Furthermore, it has been an essential component of the work that individual teams not only produce numerical results but are forced to make their own judgements and to provide the proper justification for their conclusions based on their analysis. It should also be understood that the conclusions drawn will partly be specific to the problem analysed, in particular as it mainly concerns a 2D application. This means that specific conclusions may have limited applicability to real problems in

  10. DECOVALEX III/BENCHPAR PROJECTS. Approaches to Upscaling Thermal-Hydro-Mechanical Processes in a Fractured Rock Mass and its Significance for Large-Scale Repository Performance Assessment. Summary of Findings. Report of BMT2/WP3

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan (comp.) [JA Streamflow AB, Aelvsjoe (Sweden); Staub, Isabelle (comp.) [Golder Associates AB, Stockholm (Sweden); Knight, Les (comp.) [Nirex UK Ltd, Oxon (United Kingdom)

    2005-02-15

    The Benchmark Test 2 of DECOVALEX III and Work Package 3 of BENCHPAR concern the upscaling of Thermal (T), Hydrological (H) and Mechanical (M) processes in a fractured rock mass and its significance for large-scale repository performance assessment. The work is primarily concerned with the extent to which various thermo-hydro-mechanical couplings in a fractured rock mass adjacent to a repository are significant in terms of the solute transport typically calculated in large-scale repository performance assessments. Since the presence of even quite small fractures may control the hydraulic, mechanical and coupled hydromechanical behaviour of the rock mass, a key aim of the work has been to explore the extent to which these can be upscaled and represented by 'equivalent' continuum properties appropriate for PA calculations. From these general aims, the BMT was set up as a numerical study of a large-scale reference problem. Analysing this reference problem should: help explore how different means of simplifying the geometrical detail of a site ('upscaling'), with their implications for model parameters, impact model predictions of relevance to repository performance; explore to what extent the THM coupling needs to be considered in relation to PA measures; and compare the uncertainties in upscaling (both the uncertainty in how to upscale and the uncertainty that arises from the upscaling process) and in the consideration of THM couplings with the inherent uncertainty and spatial variability of the site-specific data. Furthermore, it has been an essential component of the work that individual teams not only produce numerical results but are forced to make their own judgements and to provide the proper justification for their conclusions based on their analysis. It should also be understood that the conclusions drawn will partly be specific to the problem analysed, in particular as it mainly concerns a 2D application. This means that specific conclusions may have limited applicability

  11. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos ("one-halo conformity"). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. However, recent observations have further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or "large-scale conformity"). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  12. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude flammable clouds being formed during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus, combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present there is no combustion model which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore, the major attention in model development has to be paid to the adoption of existing approaches, or the creation of new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reaktor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of the numerical simulations are presented together with comparisons, critical discussions and conclusions. (authors)

  13. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential, in particular, for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  14. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  15. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The seismic effect was evaluated against the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.
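
    The safety evaluation described amounts to comparing the measured peak particle velocity against a permissible limit; a minimal sketch in Python (the limit value is hypothetical, since actual norms depend on the building class):

        # Compare a recorded ground-vibration velocity trace (mm/s) against a
        # permissible peak particle velocity; the 20 mm/s limit is illustrative.
        def seismic_safety_ok(velocities_mm_s, permissible_mm_s=20.0):
            peak = max(abs(v) for v in velocities_mm_s)
            return peak <= permissible_mm_s, peak

        ok, peak = seismic_safety_ok([3.1, -7.4, 12.8, -5.0])
        print(f"peak PPV = {peak} mm/s, within limit: {ok}")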

  16. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
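
    The point about never forming the tensor product design matrix can be made concrete with the standard Kronecker identity that GLAM-type algorithms exploit; a minimal sketch (toy sizes, assumed two-way array structure):

        import numpy as np

        # Two marginal design matrices of a 2-way GLAM (illustrative sizes).
        m1, n1, m2, n2 = 50, 10, 40, 8
        rng = np.random.default_rng(0)
        X1, X2 = rng.standard_normal((m1, n1)), rng.standard_normal((m2, n2))
        beta = rng.standard_normal(n1 * n2)

        # Naive: form the full Kronecker design matrix (memory-hungry).
        full = np.kron(X1, X2) @ beta

        # GLAM-style: (X1 kron X2) vec(B) = vec(X2 @ B @ X1.T), where vec
        # stacks columns, so the product uses only the small factors.
        B = beta.reshape((n2, n1), order="F")
        fast = (X2 @ B @ X1.T).reshape(-1, order="F")

        assert np.allclose(full, fast)

    The saving is in storage and time: the factors cost O(m1*n1 + m2*n2) memory, while the explicit Kronecker product costs O(m1*m2*n1*n2).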

  17. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its...

  18. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Experience in operating and developing a large scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases

  19. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  20. Sensitivity technologies for large scale simulation

    International Nuclear Information System (INIS)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    order approximation of the Euler equations and used as a preconditioner. In comparison to other methods, the AD preconditioner showed better convergence behavior. Our ultimate target is to perform shape optimization and hp adaptivity using adjoint formulations in the Premo compressible fluid flow simulator. A mathematical formulation for mixed-level simulation algorithms has been developed where different physics interact at potentially different spatial resolutions in a single domain. To minimize the implementation effort, explicit solution methods can be considered; however, implicit methods are preferred if computational efficiency is of high priority. We present the use of a partial elimination nonlinear solver technique to solve these mixed-level problems and show how these formulations are closely coupled to intrusive optimization approaches and sensitivity analyses. Production codes are typically not designed for sensitivity analysis or large-scale optimization. The implementation of our optimization libraries into multiple production simulation codes, in which each code has its own linear algebra interface, becomes an intractable problem. In an attempt to streamline this task, we have developed a standard interface between the numerical algorithm (such as optimization) and the underlying linear algebra. These interfaces (TSFCore and TSFCoreNonlin) have been adopted by the Trilinos framework, and the goal is to promote the use of these interfaces especially in new developments. Finally, an adjoint-based a posteriori error estimator has been developed for discontinuous Galerkin discretization of Poisson's equation. The goal is to investigate other ways to leverage the adjoint calculations, and we show how the convergence of the forward problem can be improved by adapting the grid using adjoint-based error estimates. Error estimation is usually conducted with continuous adjoints but if discrete adjoints are available it may be possible to reuse the discrete version
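
    As a pointer to why adjoints are attractive here, a minimal discrete-adjoint sensitivity sketch (a toy linear problem, not the Premo/Trilinos implementation): for A(p)u = b and objective J = g·u, one extra linear solve yields the gradient.

        import numpy as np

        # Toy parameterized system A(p) = A0 + p*dAdp, objective J(u) = g @ u.
        n = 5
        rng = np.random.default_rng(0)
        A0 = rng.standard_normal((n, n)) + n * np.eye(n)   # well conditioned
        dAdp = rng.standard_normal((n, n))
        b, g, p = rng.standard_normal(n), rng.standard_normal(n), 0.3

        def J(p):
            return g @ np.linalg.solve(A0 + p * dAdp, b)

        # Adjoint gradient: dJ/dp = -lambda @ (dA/dp) u, with A.T lambda = g.
        u = np.linalg.solve(A0 + p * dAdp, b)
        lam = np.linalg.solve((A0 + p * dAdp).T, g)
        adjoint_grad = -lam @ (dAdp @ u)

        eps = 1e-6   # central finite-difference check
        fd_grad = (J(p + eps) - J(p - eps)) / (2 * eps)
        print(adjoint_grad, fd_grad)   # the two values agree to ~1e-6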

  1. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200.000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  2. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public... arises, to reduce the spread in the LSGT 50% gap value.) The worst charges, such as those with the highest or lowest densities, the largest re-pressed...

  3. Large scale Brownian dynamics of confined suspensions of rigid particles

    Science.gov (United States)

    Sprinkle, Brennan; Balboa Usabiaga, Florencio; Patankar, Neelesh A.; Donev, Aleksandar

    2017-12-01

    We introduce methods for large-scale Brownian Dynamics (BD) simulation of many rigid particles of arbitrary shape suspended in a fluctuating fluid. Our method adds Brownian motion to the rigid multiblob method [F. Balboa Usabiaga et al., Commun. Appl. Math. Comput. Sci. 11(2), 217-296 (2016)] at a cost comparable to the cost of deterministic simulations. We demonstrate that we can efficiently generate deterministic and random displacements for many particles using preconditioned Krylov iterative methods, if kernel methods to efficiently compute the action of the Rotne-Prager-Yamakawa (RPY) mobility matrix and its "square" root are available for the given boundary conditions. These kernel operations can be computed with near linear scaling for periodic domains using the positively split Ewald method. Here we study particles partially confined by gravity above a no-slip bottom wall using a graphical processing unit implementation of the mobility matrix-vector product, combined with a preconditioned Lanczos iteration for generating Brownian displacements. We address a major challenge in large-scale BD simulations, capturing the stochastic drift term that arises because of the configuration-dependent mobility. Unlike the widely used Fixman midpoint scheme, our methods utilize random finite differences and do not require the solution of resistance problems or the computation of the action of the inverse square root of the RPY mobility matrix. We construct two temporal schemes which are viable for large-scale simulations, an Euler-Maruyama traction scheme and a trapezoidal slip scheme, which minimize the number of mobility problems to be solved per time step while capturing the required stochastic drift terms. We validate and compare these schemes numerically by modeling suspensions of boomerang-shaped particles sedimented near a bottom wall. Using the trapezoidal scheme, we investigate the steady-state active motion in dense suspensions of confined microrollers, whose
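
    The random finite difference idea mentioned above admits a compact statement; a toy sketch (the mobility below is a made-up diagonal model, not the RPY tensor used in the paper):

        import numpy as np

        # RFD estimator of the stochastic drift kT * div_x M(x) arising from a
        # configuration-dependent mobility M(x): with W ~ N(0, I),
        # E[(kT/d) (M(x + dW/2) - M(x - dW/2)) W] -> kT * div M(x) as d -> 0.
        def rfd_drift(M, x, kT=1.0, delta=1e-4, rng=None):
            rng = rng or np.random.default_rng()
            W = rng.standard_normal(x.shape)
            return (kT / delta) * ((M(x + delta * W / 2) - M(x - delta * W / 2)) @ W)

        M = lambda x: np.diag(1.0 + 0.1 * x**2)   # toy mobility, div M = 0.2*x
        x = np.array([0.5, -1.0, 2.0])
        samples = [rfd_drift(M, x, rng=np.random.default_rng(s)) for s in range(2000)]
        print(np.mean(samples, axis=0))           # ~ [0.1, -0.2, 0.4]

    No resistance solves and no inverse square roots are needed: two mobility evaluations per step suffice, which is what makes the estimator attractive at large scale.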

  4. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. The traditional workflow requested quotes from construction companies for construction works where the works to be

  5. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  6. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
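
    For the free-vibration case, the sensitivity derivative has a classical closed form that procedures of this kind build on (a standard result, stated for orientation; not necessarily the authors' exact formulation). For the generalized eigenproblem with mass-normalized modes,

        \[ (\mathbf{K} - \lambda\mathbf{M})\boldsymbol{\phi} = \mathbf{0}, \qquad \boldsymbol{\phi}^{T}\mathbf{M}\boldsymbol{\phi} = 1, \]

    the derivative of an eigenvalue with respect to a design parameter p is

        \[ \frac{d\lambda}{dp} = \boldsymbol{\phi}^{T}\left(\frac{\partial\mathbf{K}}{\partial p} - \lambda\,\frac{\partial\mathbf{M}}{\partial p}\right)\boldsymbol{\phi}, \]

    so each sensitivity costs only matrix-vector products with the already-computed mode shape.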

  7. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  8. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  9. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  10. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  11. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  12. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  13. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  14. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  15. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai, Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  16. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  17. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the disk gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  18. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  19. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  20. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  1. Production process of VE

    International Nuclear Information System (INIS)

    1987-07-01

    This book gives a synopsis of the production process of VE (value engineering): object selection methods and target setting, collection of object information, function design, writing improvement proposals, evaluation of improvement proposals, the various worksheets used in the VE production process, and explanations of IE and PERT.

  2. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  3. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305 METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...typical iteration can be partitioned so that where B is an m x m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  4. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmologic principles and possible large scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities, is a full conformal symmetry in the 3-space defined by the cosmological velocity field, but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  5. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  6. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  7. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt*, Patrick Hulin†, Tim Leek†, Fredrich Ulrich†, Ryan Whelan† (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  8. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels all over the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there exist two main types of methods for geospatial database updating: direct updating with remote sensing images or field survey materials, and indirect updating with other updated data, such as newly updated data at a larger scale. The former method is fundamental, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after a larger-scale database is updated, the smaller-scale databases should be updated correspondingly in order to keep multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger scales in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and the review, we analyze the key factors for implementing updating geospatial data from large scale including technical

  9. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses... on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  10. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees, to assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  11. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models which are related to weather forecast and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how the precipitation uncertainty would affect the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North America gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, results provide an assessment of possible applications for various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing performance of spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  12. THE DECAY OF A WEAK LARGE-SCALE MAGNETIC FIELD IN TWO-DIMENSIONAL TURBULENCE

    Energy Technology Data Exchange (ETDEWEB)

    Kondić, Todor; Hughes, David W.; Tobias, Steven M., E-mail: t.kondic@leeds.ac.uk [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2016-06-01

    We investigate the decay of a large-scale magnetic field in the context of incompressible, two-dimensional magnetohydrodynamic turbulence. It is well established that a very weak mean field, of strength significantly below the equipartition value, induces a small-scale field strong enough to inhibit the process of turbulent magnetic diffusion. In light of ever-increasing computer power, we revisit this problem to investigate fluid and magnetic Reynolds numbers that were previously inaccessible. Furthermore, by exploiting the relation between the turbulent diffusion of the magnetic potential and that of the magnetic field, we are able to calculate the turbulent magnetic diffusivity extremely accurately through the imposition of a uniform mean magnetic field. We confirm the strong dependence of the turbulent diffusivity on the product of the magnetic Reynolds number and the energy of the large-scale magnetic field. We compare our findings with various theoretical descriptions of this process.
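
    The relation exploited here can be stated compactly (a schematic form in standard notation, shown for orientation): in two-dimensional incompressible MHD the field derives from a scalar potential,

        \[ \mathbf{B} = \nabla A \times \hat{\mathbf{z}}, \qquad \frac{\partial A}{\partial t} + \mathbf{u}\cdot\nabla A = \eta\,\nabla^{2}A, \]

    and the mean potential obeys an effective diffusion equation \( \partial_t\langle A\rangle \approx (\eta + \eta_t)\,\nabla^{2}\langle A\rangle \), so the decay of an imposed large-scale field directly measures the turbulent diffusivity \( \eta_t \).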

  13. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems where some of them may be observable --- with respect to parameters, disturbances, and states --- while some may not. The state estimation process is carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
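
    As a minimal numerical analogue of the observability step (the classical rank test on a toy linear system, not the structural-observability tooling the paper describes):

        import numpy as np

        # For x' = A x, y = C x, the pair (A, C) is observable iff the
        # observability matrix [C; CA; ...; CA^(n-1)] has full column rank.
        def observability_matrix(A, C):
            blocks = [C]
            for _ in range(A.shape[0] - 1):
                blocks.append(blocks[-1] @ A)
            return np.vstack(blocks)

        A = np.array([[0.0, 1.0], [-2.0, -3.0]])
        C = np.array([[1.0, 0.0]])              # measure the first state only
        O = observability_matrix(A, C)
        print(np.linalg.matrix_rank(O) == A.shape[0])   # True -> observable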

  14. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the fault-free operation of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article provides a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the fault-free operation of LSI and VLSI circuits. The main part describes a proposed algorithm and program for analyzing the fault rate in LSI and VLSI circuits.

  15. Large scale injection test (LASGIT) modelling

    International Nuclear Information System (INIS)

    Arnedo, D.; Olivella, S.; Alonso, E.E.

    2010-01-01

    Document available in extended abstract form only. With the objective of understanding gas flow processes through clay barriers in radioactive waste disposal schemes, the Lasgit in situ experiment was planned and is currently in progress. Modelling of the experiment will permit a better understanding of the responses, confirm hypotheses about mechanisms and processes, and provide lessons for the design of future experiments. The experiment and modelling activities are included in the project FORGE (FP7). The in situ large scale injection test Lasgit is currently being performed at the Aespoe Hard Rock Laboratory by SKB and BGS. A schematic layout of the test is shown. The deposition hole follows the KBS3 scheme. A copper canister is installed on the axis of the deposition hole, surrounded by blocks of highly compacted MX-80 bentonite. A concrete plug is placed at the top of the buffer. A metallic lid anchored to the surrounding host rock is included in order to prevent vertical movements of the whole system during gas injection stages (high gas injection pressures are expected to be reached). Hydration of the buffer material is achieved by injecting water through filter mats, two placed at the rock walls and two at the interfaces between bentonite blocks. Water is also injected through the 12 canister filters. Gas injection stages are performed by injecting gas into some of the canister injection filters. Since the water pressure and the stresses (swelling pressure development) will be high during gas injection, it is necessary to inject at high gas pressures. This implies mechanical couplings: as gas penetrates after the gas entry pressure is reached, it may produce deformations which in turn lead to permeability increments. A 3D hydro-mechanical numerical model of the test using CODE-BRIGHT is presented. The domain considered for the modelling is shown. The materials considered in the simulation are the MX-80 bentonite blocks (cylinders and rings), the concrete plug

  16. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention in large area and low cost color reflective displays. This invention is inspired by the heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffractive of visible light form the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  17. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far in excess of the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on the infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can productively produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.

  18. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler system reached 95 ha. In 1989 there were 98 sprinkler systems, covering more than 10 130 ha. The study was conducted in 1986÷1998 on 7 large sprinkler systems with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 90's and the ownership changes in agriculture, the large-scale sprinkler systems suffered significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers, limitations of system solutions, supply difficulties and high levels of equipment failure, which did not encourage rational use of the available sprinklers. A survey of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  19. Processed Products Database System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  20. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  1. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
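
    The selection rule itself can be as simple as thresholding the nodal density; a sketch of the idea (the threshold values and style names below are hypothetical):

        # Pick a graph rendering style from the nodal density in a region;
        # the threshold values here are illustrative assumptions only.
        def graph_style(n_nodes, area_px):
            density = n_nodes / area_px
            if density < 1e-4:
                return "node-link with labels"
            elif density < 1e-2:
                return "node-link, no labels"
            return "aggregated clusters / density map"

        print(graph_style(n_nodes=50_000, area_px=800 * 600))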

  2. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos

  3. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'scaled' quantum-mechanical-like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  4. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  5. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  6. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different and easy to fabricate structures were examined made from construction materials such as concrete and steel. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials

  7. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around times for processing different science scenarios where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facility capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.

  8. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E{sup 3}-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
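
    At the core of any such multi-objective EA is a Pareto-dominance test; a minimal sketch (illustrative objective values, with minimization assumed for all three criteria):

        # a dominates b if it is no worse in every objective and strictly
        # better in at least one (here: energy, cost, emissions, minimized).
        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        candidates = [(3.2, 100.0, 0.8), (2.9, 110.0, 0.7), (3.5, 120.0, 0.9)]
        pareto = [c for c in candidates
                  if not any(dominates(o, c) for o in candidates if o is not c)]
        print(pareto)   # the non-dominated front: the first two points survive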

  9. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows the processing of the signal coming from a single neutron sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large-scale channel and a brief description of it are given, together with the results obtained so far in this domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large-scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested. The validity of that method is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined
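
    The fluctuation mode mentioned in the note is usually based on Campbell's theorem: the variance of the detector signal is proportional to the neutron interaction rate. A small digital sketch, with an assumed pulse shape and made-up rates, follows.

```python
# Digital fluctuation (Campbelling) estimate on a simulated detector signal.
# Pulse shape, sample rate and event rate are all hypothetical.
import numpy as np

rng = np.random.default_rng(1)
fs, T, rate = 1e6, 1.0, 5e4                  # sample rate [Hz], duration [s], events/s
n = int(fs * T)
signal = np.zeros(n)
pulse = np.exp(-np.arange(50) / 10.0)        # assumed exponential pulse shape
for t in rng.uniform(0, n - 50, size=rng.poisson(rate * T)).astype(int):
    signal[t:t + 50] += pulse                # pile-up is allowed; Campbelling copes

# Campbell's theorem: Var(signal) = rate * sum(pulse^2) / fs
est = signal.var() / np.sum(pulse ** 2) * fs
print(f"true rate {rate:.0f}/s, Campbell estimate {est:.0f}/s")
```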

  10. Radiation processed polysaccharide products

    International Nuclear Information System (INIS)

    Nguyen, Quoc Hien

    2007-01-01

    Radiation crosslinking, degradation and grafting techniques for modification of polymeric materials including natural polysaccharides have been providing many unique products. In this communication, typical products from radiation processed polysaccharides particularly plant growth promoter from alginate, plant protector and elicitor from chitosan, super water absorbent containing starch, hydrogel sheet containing carrageenan/CM-chitosan as burn wound dressing, metal ion adsorbent from partially deacetylated chitin were described. The procedures for producing those above products were also outlined. Future development works on radiation processing of polysaccharides were briefly presented. (author)

  11. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryon acoustic oscillations (BAO). These observations have led us to a great deal of consensus on the cosmological model, so-called LambdaCDM, and to tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, quantified by its direction and amplitude. We measure a large dipolar modulation in the CMB, which mainly originates from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated the requirements for detection of the kinematic dipole in future surveys.
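
    A dipolar modulation in number counts can be estimated by a linear least-squares fit of a monopole plus dipole template. The sketch below injects and recovers a made-up dipole on randomly placed sky cells; real analyses use proper sky pixelisation, masks and systematics corrections.

```python
# Illustrative least-squares dipole estimate from number counts.
import numpy as np

rng = np.random.default_rng(2)
ncell = 3000
vecs = rng.normal(size=(ncell, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)   # cell directions on the sphere

d_true = np.array([0.01, 0.0, 0.02])                  # injected (made-up) dipole
mean_counts = 500.0
counts = rng.poisson(mean_counts * (1.0 + vecs @ d_true))

# Model counts = m * (1 + n . d); linearise as A @ [m, m*dx, m*dy, m*dz].
A = np.column_stack([np.ones(ncell), vecs])
coef, *_ = np.linalg.lstsq(A, counts, rcond=None)
print("recovered dipole:", coef[1:] / coef[0])
```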

  12. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors have once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates a new technology, cold compressors, to obtain the low vapor pressure needed for low-temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K is reviewed. 28 refs., 4 figs., 7 tabs

  13. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  14. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
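
    The amplitude-modulation diagnostic described above can be reproduced schematically: low-pass the velocity signal to isolate the large scales, take the Hilbert envelope of the small-scale residual, and correlate the two. The signal below is synthetic, not hot-wire data.

```python
# Sketch of the amplitude-modulation diagnostic on a synthetic signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, T = 10_000, 10.0
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(3)
large = np.sin(2 * np.pi * 2 * t)                     # "large-scale" motion
small = (1 + 0.5 * large) * rng.normal(size=t.size)   # modulated small scales
u = large + 0.2 * small                               # composite "velocity"

b, a = butter(4, 20 / (fs / 2))                       # 20 Hz low-pass filter
u_L = filtfilt(b, a, u)                               # large-scale component
u_S = u - u_L                                         # small-scale component
env = np.abs(hilbert(u_S))                            # small-scale envelope
env_L = filtfilt(b, a, env)                           # low-passed envelope

r = np.corrcoef(u_L, env_L)[0, 1]
print(f"modulation coefficient R = {r:.2f}")          # positive -> modulation
```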

  15. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety of application areas, including biotechnology, food, polymer and human health. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  16. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large-scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level, and with the support of public funds, medical care and follow-up as well as systematic controls are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured values, and because they know that some parts of the area are at least as contaminated as the surroundings of Chernobyl, some people are reluctant to go back home

  17. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  18. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
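
    The generalization step (a focal majority within a moving window) can be sketched as follows on a synthetic class raster; the minimum-area condition applied in the study is a separate step not shown here.

```python
# Focal majority filter over an integer landform-class raster (toy data).
import numpy as np
from scipy.ndimage import generic_filter

def majority(values):
    # generic_filter hands us the window as a flat float array.
    return np.bincount(values.astype(int)).argmax()

rng = np.random.default_rng(4)
landform = rng.integers(0, 5, size=(200, 200))        # toy 5-class raster
generalised = generic_filter(landform, majority, size=7, mode="nearest")

# Echo the study's area comparison: cell counts per class before/after.
print("before:", np.bincount(landform.ravel()))
print("after: ", np.bincount(generalised.ravel()))
```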

  19. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines... for setting the design brief in a large-scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...

  20. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large-scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  1. Testing Einstein's Gravity on Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chandra

    2011-01-01

    A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.

  2. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
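
    To make the KKT structure behind SQP concrete, here is a minimal equality-constrained iteration on a toy problem with a BFGS Hessian approximation. The algorithm in the abstract differs in exactly the respects it lists (reduced Hessian only, sparse structures, incomplete QP solves); this is just the textbook skeleton.

```python
# Minimal equality-constrained SQP with a BFGS Hessian approximation.
# Toy problem: min x1^4 + x2^2  subject to  x1 + x2 = 1.
import numpy as np

def grad(x):
    return np.array([4 * x[0] ** 3, 2 * x[1]])

def con(x):
    return np.array([x[0] + x[1] - 1.0])

A = np.array([[1.0, 1.0]])          # constraint Jacobian (linear, constant)
x = np.array([0.8, 0.2])
H = np.eye(2)                       # quasi-Newton Hessian approximation

for it in range(50):
    # Each SQP step solves the KKT system of a local QP subproblem:
    #   [H  A^T] [s  ]   [-grad]
    #   [A   0 ] [lam] = [-con ]
    KKT = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    sol = np.linalg.solve(KKT, np.concatenate([-grad(x), -con(x)]))
    s = sol[:2]
    if np.linalg.norm(s) < 1e-12:
        break
    y = grad(x + s) - grad(x)       # curvature pair for the BFGS update
    if s @ y > 1e-12:               # skip update when curvature is not positive
        H += np.outer(y, y) / (y @ s) - np.outer(H @ s, H @ s) / (s @ H @ s)
    x = x + s

print("x* =", x, " constraint residual =", con(x))
```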

  3. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi–site data by using within–year recaptures to provide a covariate of between–year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands where a re–introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap–happiness” effects. As the data are based on resightings such trap–happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  4. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from an accidental, and even more importantly intentional spills, have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.

  5. Hydrogen production processes

    International Nuclear Information System (INIS)

    2003-01-01

    The goals of this first Gedepeon workshop on hydrogen production processes are: to stimulate the information exchange about research programs and research advances in the domain of hydrogen production processes, to indicate the domains of interest of these processes and the potentialities linked with the coupling of a nuclear reactor, to establish the actions of common interest for the CEA, the CNRS, and eventually EDF, that can be funded in the framework of the Gedepeon research group. This document gathers the slides of the 17 presentations given at this workshop and dealing with: the H2 question and the international research programs (Lucchese P.); the CEA's research program (Lucchese P., Anzieu P.); processes based on the iodine/sulfur cycle: efficiency of a facility - flow-sheets, efficiencies, hard points (Borgard J.M.), R and D about the I/S cycle: Bunsen reaction (Colette S.), R and D about the I/S cycle: the HI/I2/H2O system (Doizi D.), demonstration loop/chemical engineering (Duhamet J.), materials and corrosion (Terlain A.); other processes under study: the Westinghouse cycle (Eysseric C.), other processes under study at the CEA (UT3, plasma,...) (Lemort F.), database about thermochemical cycles (Abanades S.), Zn/ZnO cycle (Broust F.), H2 production by cracking, high temperature reforming with carbon trapping (Flamant G.), membrane technology (De Lamare J.); high-temperature electrolysis: SOFC used as electrolyzers (Grastien R.); generic aspects linked with hydrogen production: technical-economical evaluation of processes (Werkoff F.), thermodynamic tools (Neveu P.), the reactor-process coupling (Aujollet P.). (J.S.)

  6. Blackthorn: Large-Scale Interactive Multimodal Learning

    DEFF Research Database (Denmark)

    Zahálka, Jan; Rudinac, Stevan; Jónsson, Björn Thór

    2018-01-01

    The Ratio-64 data representation introduced in this work only costs tens of bytes per item yet preserves most of the visual and textual semantic information with good accuracy. The optimized interactive learning model scores the Ratio-64-compressed data directly, greatly reducing the computational cost of the interactive learning process... Blackthorn outperforms the baseline with respect to the relevance of results: it vastly outperforms the baseline on recall over time and reaches up to 108% of its precision. Compared to the product quantization variant, Blackthorn is just as fast, while producing more relevant results. On the full YFCC100M dataset...

  7. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  8. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  9. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Recent explosion of biological data brings a great challenge for the traditional clustering algorithms. With increasing scale of data sets, much larger memory and longer runtime are required for the cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, the time and space complexity become a great bottleneck when handling the large-scale data sets. Moreover, the similarity matrix, whose constructing procedure takes long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix constructing procedure and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
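
    At small scale the same pipeline can be mimicked with off-the-shelf tools: build the similarity matrix in parallel, then cluster with scikit-learn's (serial) affinity propagation. This is a stand-in, not the paper's MPI/shared-memory implementation.

```python
# Parallel similarity-matrix construction + affinity propagation (toy scale).
import numpy as np
from joblib import Parallel, delayed
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(m, 0.3, size=(50, 10)) for m in (0.0, 2.0, 4.0)])

def sim_row(i):
    # Negative squared Euclidean distance, the usual AP similarity.
    return -np.sum((X - X[i]) ** 2, axis=1)

S = np.array(Parallel(n_jobs=4)(delayed(sim_row)(i) for i in range(len(X))))
ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
print("clusters found:", len(ap.cluster_centers_indices_))
```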

  10. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
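
    The central data structure is easy to sketch: each pixel keeps a list of the pathline fragments projected onto it, so view-dependent filtering and recolouring never touch the original flow data. Field names below are illustrative, not from the thesis.

```python
# Per-pixel fragment lists for view-dependent pathline exploration (sketch).
from collections import defaultdict

framebuffer = defaultdict(list)      # (px, py) -> list of fragment records

def splat(px, py, depth, line_id, seed_time):
    framebuffer[(px, py)].append(
        {"depth": depth, "line": line_id, "t0": seed_time}
    )

splat(10, 20, 0.4, line_id=7, seed_time=0.0)
splat(10, 20, 0.9, line_id=3, seed_time=2.5)

# View-dependent pass: filter by attribute, then resolve nearest-first,
# all without re-reading the simulation output.
for pixel, frags in framebuffer.items():
    visible = sorted((f for f in frags if f["t0"] < 1.0),
                     key=lambda f: f["depth"])
    if visible:
        print(pixel, "front fragment from line", visible[0]["line"])
```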

  11. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    Science.gov (United States)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The use of the large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions that is difficult to obtain under natural conditions due to the variable nature of rainfall intensities in the field. Stemflow generation and production were examined for three species - Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), and Zelkova serrata Thunb. (Japanese zelkova) - under both leafed and leafless conditions at several different rainfall intensities (15, 20, 30, 40, 50, and 100 mm h-1) using a large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. Funding note: This research was supported by JSPS Invitation Fellowship for Research in
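
    Funneling ratios compare stemflow yield with the rain that would have fallen on the trunk's basal area alone; values above one mean the canopy concentrates water toward the stem. A small sketch over the tested intensities, with an entirely made-up stemflow response and tree size:

```python
# Funneling ratio sketch; stemflow response and tree diameter are hypothetical.
import math

def funneling_ratio(stemflow_litres, dbh_m, rainfall_mm):
    # 1 mm of rain over 1 m^2 equals 1 litre, so the ratio is dimensionless.
    basal_area_m2 = math.pi * (dbh_m / 2.0) ** 2
    return stemflow_litres / (basal_area_m2 * rainfall_mm)

for intensity in (15, 20, 30, 40, 50, 100):        # mm/h, as in the experiments
    stemflow = 0.5 * intensity ** 1.2              # made-up response curve [L]
    print(intensity, "mm/h ->",
          round(funneling_ratio(stemflow, 0.25, intensity), 1))
```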

  12. The potential of optimized process design to advance LCA performance of algae production systems

    NARCIS (Netherlands)

    Boxtel, van A.J.B.; Perez-Lopez, P.; Breitmayer, E.; Slegers, P.M.

    2015-01-01

    Environmental impact is an essential aspect of the introduction of algae production systems. As information on large-scale algae production is hardly available, process simulation is the only way to evaluate environmental sustainability in an early phase of process design. Simulation results allow

  13. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v^2/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
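
    Chirikov's standard map mentioned in the abstract is two lines of code, and scanning the stochasticity parameter K illustrates the transition to large-scale stochasticity near K of order one:

```python
# Chirikov standard map: p' = p + K sin(x), x' = x + p' (both mod 2*pi).
import numpy as np

def orbit(K, x0, p0, n=2000):
    x, p = x0, p0
    pts = np.empty((n, 2))
    for i in range(n):
        p = (p + K * np.sin(x)) % (2 * np.pi)
        x = (x + p) % (2 * np.pi)
        pts[i] = x, p
    return pts

# Below the critical value (K ~ 0.97) momentum stays confined by KAM tori;
# above it the orbit wanders over the whole phase space.
for K in (0.5, 0.97, 2.0):
    spread = orbit(K, 1.0, 1.0)[:, 1].std()
    print(f"K={K}: momentum spread {spread:.2f}")
```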

  14. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, on how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence showing that nanotoxicity can have implications in de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ is shown to inhibit tumor growth and metastasis by inhibiting enzyme MMP-9, and graphene is illustrated to disrupt bacteria cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential by using the cytotoxicity of nanoparticles against cancer or bacteria cells. © 2014 Wiley Periodicals, Inc.

  15. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  16. Abnormal binding and disruption in large scale networks involved in human partial seizures

    Directory of Open Access Journals (Sweden)

    Bartolomei Fabrice

    2013-12-01

    There is a marked increase in the amount of electrophysiological and neuroimaging work dealing with the study of large-scale brain connectivity in the epileptic brain. Our view of the epileptogenic process in the brain has evolved considerably over the last twenty years, from the historical concept of the “epileptic focus” to a more complex description of “epileptogenic networks” involved in the genesis and “propagation” of epileptic activities. In particular, a large number of studies have been dedicated to the analysis of intracerebral EEG signals to characterize the dynamics of interactions between brain areas during temporal lobe seizures. These studies have reported that large-scale functional connectivity is dramatically altered during seizures, particularly during temporal lobe seizure genesis and development. Dramatic changes in neural synchrony provoked by epileptic rhythms are also responsible for the production of ictal symptoms or changes in the patient's behaviour, such as automatisms, emotional changes or consciousness alteration. Besides these studies dedicated to seizures, large-scale network connectivity during the interictal state has also been investigated, not only to define biomarkers of epileptogenicity but also to better understand the cognitive impairments observed between seizures.

  17. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    The integration of large-scale wind generation has necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is the investment decision, while the second phase is the production optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can both accelerate the convergence and guarantee the diversity of the Pareto-optimal front set. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...

  18. Superconductivity for Large Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    R. Fair; W. Stautner; M. Douglass; R. Rajput-Ghoshal; M. Moscinski; P. Riley; D. Wagner; J. Kim; S. Hou; F. Lopez; K. Haran; J. Bray; T. Laskaris; J. Rochford; R. Duckworth

    2012-10-12

    A conceptual design has been completed for a 10MW superconducting direct drive wind turbine generator employing low temperature superconductors for the field winding. Key technology building blocks from the GE Wind and GE Healthcare businesses have been transferred across to the design of this concept machine. Wherever possible, conventional technology and production techniques have been used in order to support the case for commercialization of such a machine. Appendices A and B provide further details of the layout of the machine and the complete specification table for the concept design. Phase 1 of the program has allowed us to understand the trade-offs between the various sub-systems of such a generator and its integration with a wind turbine. A Failure Modes and Effects Analysis (FMEA) and a Technology Readiness Level (TRL) analysis have been completed resulting in the identification of high risk components within the design. The design has been analyzed from a commercial and economic point of view and Cost of Energy (COE) calculations have been carried out with the potential to reduce COE by up to 18% when compared with a permanent magnet direct drive 5MW baseline machine, resulting in a potential COE of 0.075 $/kWh. Finally, a top-level commercialization plan has been proposed to enable this technology to be transitioned to full volume production. The main body of this report will present the design processes employed and the main findings and conclusions.

  19. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with large number of high capacity nodes and transmission links, and shared by a large number of users...

  20. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  1. Safeguarding aspects of large-scale commercial reprocessing plants

    International Nuclear Information System (INIS)

    1979-03-01

    The paper points out that several solutions to the problems of safeguarding large-scale plants have been put forward: (1) Increased measurement accuracy. This does not remove the problem of timely detection. (2) Continuous in-process measurement. As yet unproven and likely to be costly. (3) More extensive use of containment and surveillance. The latter appears to be feasible but requires the incorporation of safeguards into plant design and sufficient redundancy to protect the operator's interests. The advantages of altering the emphasis of safeguards philosophy from quantitative goals to the analysis of diversion strategies should be considered

  2. Extending SME to Handle Large-Scale Cognitive Modeling.

    Science.gov (United States)

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into the SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n^2 log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.

  3. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  4. Cod Gadus morhua and climate change: processes, productivity and prediction

    DEFF Research Database (Denmark)

    Brander, Keith

    2010-01-01

    Environmental factors act on individual fishes directly and indirectly. The direct effects on rates and behaviour can be studied experimentally and in the field, particularly with the advent of ever smarter tags for tracking fishes and their environment. Indirect effects due to changes in food, predators, parasites and diseases are much more difficult to estimate and predict. Climate can affect all life-history stages through direct and indirect processes and although the consequences in terms of growth, survival and reproductive output can be monitored, it is often difficult to determine the causes. Investigation of cod Gadus morhua populations across the whole North Atlantic Ocean has shown large-scale patterns of change in productivity due to lower individual growth and condition, caused by large-scale climate forcing. If a population is being heavily exploited then a drop in productivity...

  5. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguard of strategic special nuclear materials - plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000t/a) can reprocess fuel from reactors producing 50,000-65,000MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  6. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ... Final report on environmental impact assessment of large scale.

  7. Continuous downstream processing for high value biological products: A Review.

    Science.gov (United States)

    Zydney, Andrew L

    2016-03-01

    There is growing interest in the possibility of developing truly continuous processes for the large-scale production of high value biological products. Continuous processing has the potential to provide significant reductions in cost and facility size while improving product quality and facilitating the design of flexible multi-product manufacturing facilities. This paper reviews the current state-of-the-art in separations technology suitable for continuous downstream bioprocessing, focusing on unit operations that would be most appropriate for the production of secreted proteins like monoclonal antibodies. This includes cell separation/recycle from the perfusion bioreactor, initial product recovery (capture), product purification (polishing), and formulation. Of particular importance are the available options, and alternatives, for continuous chromatographic separations. Although there are still significant challenges in developing integrated continuous bioprocesses, recent technological advances have provided process developers with a number of attractive options for development of truly continuous bioprocessing operations. © 2015 Wiley Periodicals, Inc.

  8. Large-scale fuel cycle centers

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The United States Nuclear Regulatory Commission (NRC) has considered the nuclear energy center concept for fuel cycle plants in the Nuclear Energy Center Site Survey - 1975 (NECSS-75) -- an important study mandated by the U.S. Congress in the Energy Reorganization Act of 1974 which created the NRC. For the study, NRC defined fuel cycle centers to consist of fuel reprocessing and mixed oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle center sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000 - 300,000 MWe. The types of fuel cycle facilities located at the fuel cycle center permit the assessment of the role of fuel cycle centers in enhancing safeguarding of strategic special nuclear materials -- plutonium and mixed oxides. Siting of fuel cycle centers presents a considerably smaller problem than the siting of reactors. A single reprocessing plant of the scale projected for use in the United States (1500-2000 MT/yr) can reprocess the fuel from reactors producing 50,000-65,000 MWe. Only two or three fuel cycle centers of the upper limit size considered in the NECSS-75 would be required in the United States by the year 2000 . The NECSS-75 fuel cycle center evaluations showed that large scale fuel cycle centers present no real technical difficulties in siting from a radiological effluent and safety standpoint. Some construction economies may be attainable with fuel cycle centers; such centers offer opportunities for improved waste management systems. Combined centers consisting of reactors and fuel reprocessing and mixed oxide fuel fabrication plants were also studied in the NECSS. Such centers can eliminate not only shipment of plutonium, but also mixed oxide fuel. Increased fuel cycle costs result from implementation of combined centers unless the fuel reprocessing plants are commercial-sized. Development of plutonium-burning reactors could reduce any

  9. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large scale impact events continue to affect the local geology of the planets. 92 references

  10. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  11. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
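
    The tile low-rank idea can be illustrated with truncated SVDs: off-diagonal tiles of a data-sparse matrix are stored as thin factors while the diagonal stays dense. The kernel and sizes below are toy choices, not HiCMA's implementation.

```python
# Tile low-rank compression sketch: truncated SVD of off-diagonal tiles.
import numpy as np

def compress_tile(T, tol):
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    k = max(1, int(np.sum(s > tol * s[0])))      # numerical rank at tolerance tol
    return U[:, :k] * s[:k], Vt[:k]              # store U*S and V^T factors

rng = np.random.default_rng(6)
n, nb = 1024, 256
# A smooth kernel yields data-sparse (low-rank) off-diagonal blocks.
x = np.sort(rng.uniform(0, 1, n))
A = 1.0 / (1.0 + np.abs(x[:, None] - x[None, :]))

stored, dense = 0, 0
for i in range(0, n, nb):
    for j in range(0, n, nb):
        tile = A[i:i + nb, j:j + nb]
        dense += tile.size
        if i == j:
            stored += tile.size                   # diagonal tiles stay dense
        else:
            U, Vt = compress_tile(tile, 1e-8)
            stored += U.size + Vt.size
print(f"memory ratio vs dense: {stored / dense:.2%}")
```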

  12. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  13. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    For a description of efficient large-scale explosions it will be necessary to consider three stages: a) the setting up of a quasi-stable initial configuration; b) the triggering of this configuration; c) the propagation of the explosion. In this paper we consider each stage in turn, reviewing the relevant experimental information and theory to see to what extent the requirements for energetic explosions, and the physical processes that can satisfy these requirements, are understood. We pay particular attention to an attractively simple criterion for explosiveness, suggested by Fauske, that the contact temperature should exceed the temperature for spontaneous nucleation of the coolant, because on this criterion, sodium and UO2 in particular are not explosive

  14. In Situ Vitrification preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-01-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic-contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: 1) determine large-scale processing performance and 2) produce a waste form that can be fully evaluated as a potential technique for the final disposal of transuranic-contaminated soil sites at Hanford. The test provided technical data for evaluating the ISV process for this purpose. The LSRT was completed in June 1987 after 295 hours of operation and 460 MWh of electrical energy dissipated to the molten soil. This resulted in a minimum of a 450-t block of vitrified soil extending to a depth of 7.3 m (24 ft). The primary contaminants vitrified during the demonstration were Pu and Am transuranics, but also included up to 26,000 ppm fluorides. Preliminary data show that their retention in the vitrified product exceeded predictions, meaning that fewer contaminants needed to be removed from the gaseous effluents by the processing equipment. The gaseous effluents were contained and treated throughout the run; that is, no radioactive or hazardous chemical releases were detected.
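
    For scale, the quoted figures imply the following energy balance (simple arithmetic on the numbers in the record):

      # Energy-balance figures implied by the numbers quoted in the record.
      energy_mwh, mass_t, hours = 460.0, 450.0, 295.0
      print(f"average power   : {energy_mwh / hours:.2f} MW")      # ~1.56 MW
      print(f"specific energy : {energy_mwh / mass_t:.2f} MWh/t")  # ~1.02 MWh/t; 450 t is a minimum, so this is an upper bound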

  15. Petroleum product deparaffination process

    Energy Technology Data Exchange (ETDEWEB)

    Martynenko, A.G.; Dorodnova, V.S.; Korzhov, Yu.A.

    1980-12-23

    In the process for deparaffination of petroleum products (NP) by treating them with carbamide (KA) in the presence of an activator (AK) and a solvent, with subsequent separation of the resulting paraffin-carbamide complex (PKC), the NP is mixed with the AK at a temperature of 5-40° in an NP:AK ratio of 1:0.01-0.03, with subsequent addition of portions of KA at a temperature of 5-20°. The advantage of the process over the known one is that complex formation proceeds practically without an induction period, and the paraffin yield is increased by 27% on the potential, with a simultaneous decrease in reagent consumption: KA by 20% and MeOH by 35%. Example: 100 g of diesel fuel (a 200-360° fraction) is mixed with MeOH (1.3% on the feedstock), which is present in the resulting mixture in a liquid droplet state. To the mixture 40 g of KA is added (2/3 of the total amount of KA to be added). The complex formation temperature is held at 35°. To the PKC, which forms practically at once, 260 g of solvent (an 85-120° gasoline fraction) and then 20 g of KA are added. The temperature is held at 20°. The resulting suspension of the complex is distinguished by its uniformity and the absence of coarse conglomerates. The paraffin yield amounts to 11.8% on the feedstock, or 74% on the potential. The melting point of the paraffin is 22°. In the described deparaffination scheme, complex formation is carried out in two steps; in a third step the suspension of KA and centrifuged paraffin is separated; the KA is returned to the complex-formation reactor, and the centrifuged paraffin is sent for solvent regeneration.
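
    The example's yield figures are mutually consistent, as a quick check shows (interpreting "on the potential" as the fraction of the feedstock's recoverable paraffin content, an assumption):

      # Cross-check of the worked example's yield figures (values from the record).
      paraffin_pct_on_feed = 11.8        # % of feedstock recovered as paraffin
      fraction_of_potential = 0.74       # 74% "on the potential"
      potential_pct = paraffin_pct_on_feed / fraction_of_potential
      print(f"implied potential paraffin content: {potential_pct:.1f}% of feedstock")  # ~15.9%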

  16. Thermal power generation projects "Large Scale Solar Heating" (EU THERMIE projects "Large Scale Solar Heating")

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the reviewers but was not accepted for funding at the first attempt (1996). In November 1997 the EU Commission provided 1.5 million ECU at short notice, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, which had been applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and which mainly serves technology transfer. (orig.)

  17. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    Science.gov (United States)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large-scale mapping of limited areas, especially cultural heritage sites, requirements become critical. Optical and non-optical sensors, e.g. LiDAR units, have been developed to sizes and weights that such platforms can lift. At the same time there is increasing emphasis on solutions that give users access to 3D information faster and more cheaply. Considering the multitude of platforms and cameras, the advancement of algorithms, and the increase in available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of today's UAS technologies is attempted. A discussion follows of their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As a test data set we use rich optical and thermal data from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, captured with different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  18. Mapping spatial patterns of denitrifiers at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Ramette, A.; Saby, N.; Bru, D.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31,500 km2 region of France, using a 16 × 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 739 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of denitrifiers at large scale can help close the artificial gap between the investigation of microbial processes and microbial community ecology, therefore facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.
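
    A standard building block of geostatistical modelling of this kind is the empirical semivariogram; the sketch below shows the basic computation, with synthetic coordinates and values standing in for the 107-site survey data:

      # Minimal empirical semivariogram; synthetic data stand in for the survey.
      import numpy as np

      def semivariogram(coords, values, bin_edges):
          """gamma(h) = mean of 0.5*(z_i - z_j)^2 over site pairs at lag distance h."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          g = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)           # count each pair once
          d, g = d[iu], g[iu]
          return [g[(d >= lo) & (d < hi)].mean() if ((d >= lo) & (d < hi)).any() else float("nan")
                  for lo, hi in zip(bin_edges[:-1], bin_edges[1:])]

      rng = np.random.default_rng(0)
      coords = rng.uniform(0, 180, size=(107, 2))          # km; roughly a 180 x 180 km region
      values = np.sin(coords[:, 0] / 40.0) + 0.3 * rng.standard_normal(107)
      print(np.round(semivariogram(coords, values, np.linspace(0, 180, 10)), 3))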

  19. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 kW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. The estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications.
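
    Translated into code, the dose estimate described above is a one-liner, plus a consistency check against the quoted beam power and flow rate (the specific heat value is a standard constant, not from the record):

      # Calorimetric dose estimate: 1 Gy = 1 J/kg, so absorbed dose = c_p * dT.
      CP_WATER = 4186.0                    # J/(kg K), specific heat of water

      def dose_gray(delta_t_kelvin):
          return CP_WATER * delta_t_kelvin

      # Consistency check against the record's figures: 75 kW into 450 L/min.
      flow_kg_s = 450.0 / 60.0             # ~7.5 kg/s (1 L of water ~ 1 kg)
      max_dose = 75e3 / flow_kg_s          # J/kg if all beam power were absorbed
      print(f"upper bound: {max_dose / 1e3:.0f} kGy, i.e. dT ~ {max_dose / CP_WATER:.1f} K")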

  20. Development of large-scale functional brain networks in children.

    Directory of Open Access Journals (Sweden)

    Kaustubh Supekar

    2009-07-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.

  1. Development of large-scale functional brain networks in children.

    Science.gov (United States)

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-07-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.
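
    The network properties named above are standard graph metrics; as a rough illustration, a networkx sketch on a synthetic graph (a stand-in for the fMRI-derived connectivity graphs used in the study), with the small-world coefficient computed against a size-matched random graph:

      # "Small-world" comparison on a synthetic stand-in graph.
      import networkx as nx

      def small_world_metrics(G, seed=0):
          rand = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=seed)
          if not nx.is_connected(rand):                    # keep path lengths defined
              rand = rand.subgraph(max(nx.connected_components(rand), key=len)).copy()
          L, C = nx.average_shortest_path_length(G), nx.average_clustering(G)
          Lr, Cr = nx.average_shortest_path_length(rand), nx.average_clustering(rand)
          return {"path_length": L, "clustering": C, "sigma": (C / Cr) / (L / Lr)}

      G = nx.connected_watts_strogatz_graph(90, k=6, p=0.1, seed=1)  # toy 90-region graph
      print(small_world_metrics(G))                        # sigma > 1 -> small-world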

  2. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system, and feedback is returned into the development process. Studies of the system behaviour have been performed on a set of up to 111 PCs, a configuration approaching the final size. Large-scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems was emulated. The authors present a brief overview of the online system structure, its components, and the large-scale integration tests and their results.
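
    The run control state transitions exercised in these tests follow a hierarchical state-machine pattern; a generic sketch (states, commands, and the depth-first propagation order are illustrative choices, not the actual ATLAS run control model):

      # Hierarchical run control in miniature: a controller propagates each
      # transition to its children before changing its own state.
      ALLOWED = {("INITIAL", "configure"): "CONFIGURED",
                 ("CONFIGURED", "start"): "RUNNING",
                 ("RUNNING", "stop"): "CONFIGURED",
                 ("CONFIGURED", "unconfigure"): "INITIAL"}

      class RunController:
          def __init__(self, name, children=()):
              self.name, self.state, self.children = name, "INITIAL", list(children)

          def command(self, cmd):
              for child in self.children:                  # leaves transition first
                  child.command(cmd)
              self.state = ALLOWED[(self.state, cmd)]      # then this controller

      root = RunController("root", [RunController(f"pc{i:03d}") for i in range(111)])
      for cmd in ("configure", "start", "stop"):
          root.command(cmd)
          print(cmd, "->", root.state)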

  3. The combustion behavior of large scale lithium titanate battery

    Science.gov (United States)

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety problems are a major obstacle to large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to characterize the combustion behavior directly, and the mass loss rate, temperature and heat release rate are used to analyze the underlying reactions. Based on these observations, the combustion process is divided into three basic stages; it is more complicated at higher SOC, with sudden ejections of smoke. The reason is a phase change in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112-121°C on the anode tab and 139-147°C on the upper surface for all cells, but the heating time and combustion time become shorter with ascending SOC. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal shorting and the Li+ distribution are the main causes of these differences. PMID:25586064

  4. Sample-Starved Large Scale Network Analysis

    Science.gov (United States)

    2016-05-05

    As reported in our journal publication: G. Marjanovic and A. O. Hero, "l0 Sparse Inverse Covariance Estimation," IEEE Transactions on Signal Processing, vol. 63, no. 12, pp. 3218-3231, May 2015.

  5. Pro website development and operations: streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design, problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary complexity.

  6. In situ vitrification large-scale operational acceptance test analysis

    International Nuclear Information System (INIS)

    Buelt, J.L.; Carter, J.G.

    1986-05-01

    A thermal treatment process is currently under study to provide possible enhancement of in-place stabilization of transuranic and chemically contaminated soil sites. The process is known as in situ vitrification (ISV). In situ vitrification is a remedial action process that destroys solid and liquid organic contaminants and incorporates radionuclides into a glass-like material that renders contaminants substantially less mobile and less likely to impact the environment. A large-scale operational acceptance test (LSOAT) was recently completed in which more than 180 t of vitrified soil were produced in each of three adjacent settings. The LSOAT demonstrated that the process conforms to the functional design criteria necessary for the large-scale radioactive test (LSRT) to be conducted following verification of the performance capabilities of the process. The energy requirements and vitrified block size, shape, and mass are sufficiently equivalent to those predicted by the ISV mathematical model to confirm its usefulness as a predictive tool. The LSOAT demonstrated an electrode replacement technique, which can be used if an electrode fails, and techniques have been identified to minimize air oxidation, thereby extending electrode life. A statistical analysis was employed during the LSOAT to identify graphite collars and an insulative surface as successful cold cap subsidence techniques. The LSOAT also showed that even under worst-case conditions, the off-gas system exceeds the flow requirements necessary to maintain a negative pressure on the hood covering the area being vitrified. The retention of simulated radionuclides and chemicals in the soil and off-gas system exceeds requirements so that projected emissions are one to two orders of magnitude below the maximum permissible concentrations of contaminants at the stack.

  7. Iodine oxides in large-scale THAI tests

    International Nuclear Information System (INIS)

    Funke, F.; Langrock, G.; Kanzleiter, T.; Poss, G.; Fischer, K.; Kühnel, A.; Weber, G.; Allelein, H.-J.

    2012-01-01

    Highlights: Iodine oxide particles were produced from gaseous iodine and ozone. Ozone replaced the effect of ionizing radiation in the large-scale THAI facility. The mean diameter of the iodine oxide particles was about 0.35 μm. Particle formation was faster than the chemical reaction between iodine and ozone. Deposition of iodine oxide particles was slow in the absence of other aerosols. Abstract: The conversion of gaseous molecular iodine into iodine oxide aerosols has significant relevance in the understanding of the fission product iodine volatility in a LWR containment during severe accidents. In containment, the high radiation field caused by fission products released from the reactor core induces radiolytic oxidation into iodine oxides. To study the characteristics and the behaviour of iodine oxides at large scale, two THAI tests, Iod-13 and Iod-14, were performed, simulating radiolytic oxidation of molecular iodine by reaction of iodine with ozone, with ozone injected from an ozone generator. The observed iodine oxides form submicron particles with mean volume-related diameters of about 0.35 μm and show low deposition rates in the THAI tests performed in the absence of other nuclear aerosols. Formation of iodine aerosols from the gaseous precursors iodine and ozone is fast compared to their chemical interaction. The current approach in empirical iodine containment behaviour models in severe accidents, including the radiolytic production of I2-oxidizing agents followed by the I2 oxidation itself, is confirmed by these THAI tests.

  8. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogeneous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem-solving scheme with variable ordering. The scheme is implemented as cen...
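
    The breakout strategy that DisBO distributes is easiest to see in its centralized form: hill-climb on weighted constraint violations and, at a local minimum, raise the weights of the violated constraints to escape it. A sketch on a toy graph-colouring CSP (the problem encoding is illustrative, not from the thesis):

      # Centralized breakout sketch: the weight-escalation idea behind DisBO.
      import random

      def breakout_coloring(nodes, edges, colors=3, steps=10_000, seed=0):
          rng = random.Random(seed)
          assign = {v: rng.randrange(colors) for v in nodes}
          weight = {e: 1 for e in edges}

          def cost(a):                       # weighted count of violated constraints
              return sum(weight[e] for e in edges if a[e[0]] == a[e[1]])

          for _ in range(steps):
              current = cost(assign)
              if current == 0:
                  return assign              # all constraints satisfied
              best = min(((c, v) for v in nodes for c in range(colors)
                          if c != assign[v]),
                         key=lambda cv: cost({**assign, cv[1]: cv[0]}))
              if cost({**assign, best[1]: best[0]}) < current:
                  assign[best[1]] = best[0]  # greedy improving move
              else:                          # local minimum: breakout step
                  for e in edges:
                      if assign[e[0]] == assign[e[1]]:
                          weight[e] += 1
          return None

      nodes = list(range(8))
      edges = [(i, j) for i in nodes for j in nodes if i < j and (i + j) % 3 == 0]
      print(breakout_coloring(nodes, edges))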

  9. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus; Krueger, Jens; Beyer, Johanna; Bruckner, Stefan

    2013-01-01

    and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems

  10. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment.

  11. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.
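
    For reference, the two control parameters varied in these simulations are the Rayleigh number and the aspect ratio; the standard definitions (not quoted from the paper) are

      \mathrm{Ra} = \frac{g \,\beta \,\Delta T \, H^{3}}{\nu \,\kappa},
      \qquad
      \Gamma = \frac{L}{H},

    where g is the gravitational acceleration, β the thermal expansion coefficient, ΔT the temperature difference across a fluid layer of depth H, L the horizontal extent, ν the kinematic viscosity and κ the thermal diffusivity. The runs above span Ra = 10^5 to 10^8 at Γ up to 12π.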

  12. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  13. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D on distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in this R&D, is an MPI library executable in a heterogeneous computing environment. This improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  14. Decomposition of residual oil by large scale HSC plant

    Energy Technology Data Exchange (ETDEWEB)

    Washimi, Koichi; Ogata, Yoshitaka; Limmer, H.; Schuetter, H. (Toyo Engineering Corp., Funabashi (Japan); VEB Petrolchemisches Kombinat Schwedt, Schwedt (East Germany))

    1989-07-01

    The characteristics and operating conditions of HSC, a large-scale, high-decomposition-ratio visbreaker, are introduced with reference to a new plant in East Germany. The noted characteristics of the process are a high decomposition ratio with stable decomposed oil; the ability to accept high-sulfur feedstock or even the decomposed residuum of another visbreaker; the stability of the produced light oil, with a low content of unsaturated components; and low investment and running costs. To achieve the high decomposition ratio, the design suppresses decomposition in the heating furnace and accelerates it in the soaking drum, with a high gas-phase space velocity for better agitation. The main subject of technical development was the design of the soaking drum, and its main dimensions are given. Operating conditions are presented for the plant in East Germany, which uses residual oil supplied by an existing visbreaker processing USSR crude oil. 6 refs., 4 figs., 2 tabs.

  15. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  16. Satellite Imagery Production and Processing Using Apache Hadoop

    Science.gov (United States)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large-scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations and finally challenges and ongoing activities. It will also present how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
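
    The "few simple modifications" pattern maps naturally onto Hadoop Streaming, where any executable can serve as a map task; a hedged sketch of such a mapper, with each input line a Landsat scene identifier (process_scene is a placeholder, not the LSRD project's code):

      #!/usr/bin/env python3
      # Hadoop Streaming mapper sketch: one input line = one Landsat scene ID.
      # The map plumbing is standard Hadoop Streaming; the processing body
      # stands in for the per-scene science algorithm.
      import sys

      def process_scene(scene_id: str) -> str:
          """Placeholder for per-scene processing (e.g., an ECV computation)."""
          return f"{scene_id}\tprocessed"

      for line in sys.stdin:
          scene_id = line.strip()
          if scene_id:
              print(process_scene(scene_id))   # emit "key <tab> value"

    Submitted with the standard streaming jar (along the lines of hadoop jar hadoop-streaming.jar -input scenes.txt -output out -mapper mapper.py), every scene becomes an independent map task, which is how a modest cluster can parallelize an archive-scale workload.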

  17. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    Large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time is spent on management and maintenance. The nodes of a large-scale cluster system easily fall into disarray; with thousands of nodes installed in large machine rooms, managers can easily confuse one machine with another. How can a large-scale cluster system be managed accurately and effectively? This article introduces ELFms for large-scale cluster systems and proposes an approach to realizing their automatic management. (authors)

  18. Signaling in large-scale neural networks

    DEFF Research Database (Denmark)

    Berg, Rune W; Hounsgaard, Jørn

    2009-01-01

    We examine the recent finding that neurons in spinal motor circuits enter a high conductance state during functional network activity. The underlying concomitant increase in random inhibitory and excitatory synaptic activity leads to stochastic signal processing. The possible advantages of this metabolically costly organization are analyzed by comparing with synaptically less intense networks driven by the intrinsic response properties of the network neurons.

  19. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large-scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.
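
    The record does not give PDD's concrete API, but the underlying idea of runtime-injectable decision points can be sketched generically (all names here are invented for illustration):

      # Generic sketch of runtime policy injection: workflow steps consult a
      # mutable registry of policy rules, so new stakeholder policies can be
      # composed without redeploying the workflow. Not the PDD API.
      from typing import Callable, Dict, List

      policies: Dict[str, List[Callable[[dict], bool]]] = {}

      def inject_policy(decision_point: str, rule: Callable[[dict], bool]) -> None:
          policies.setdefault(decision_point, []).append(rule)   # composable at runtime

      def decide(decision_point: str, context: dict) -> bool:
          return all(rule(context) for rule in policies.get(decision_point, []))

      def submit_job(user: str, feature: str) -> str:            # existing workflow step
          if not decide("submit_job", {"user": user, "feature": feature}):
              return "denied by policy"
          return "submitted"

      # A stakeholder group injects an access-control rule after deployment:
      inject_policy("submit_job", lambda ctx: ctx["user"] != "guest")
      print(submit_job("alice", "report"))   # submitted
      print(submit_job("guest", "report"))   # denied by policy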

  20. Lightweight computational steering of very large scale molecular dynamics simulations

    International Nuclear Information System (INIS)

    Beazley, D.M.

    1996-01-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy-to-use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.
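
    The steering pattern described above can be sketched generically: the simulation exposes a small command set to an embedded scripting layer, so a running job can be queried and adjusted without stopping. In the sketch below the MD integrator is a stub and the command names are invented; the real system wrapped existing C simulation code:

      # Generic scripting-based steering sketch: query/control a running
      # simulation through registered commands. The integrator is a stub.
      import numpy as np

      class MDSimulation:
          def __init__(self, n_atoms, seed=0):
              self.pos = np.random.default_rng(seed).random((n_atoms, 3))
              self.dt = 1.0e-3

          def step(self, n=1):                 # placeholder for the C integrator
              for _ in range(n):
                  self.pos = (self.pos + self.dt * 0.01) % 1.0
              return f"advanced {n} step(s)"

          def count_in_slab(self, z0, z1):     # analysis command usable mid-run
              z = self.pos[:, 2]
              return int(((z >= z0) & (z < z1)).sum())

      sim = MDSimulation(1_000_000)            # the production runs used 10^8+ atoms
      commands = {"step": sim.step, "slab": sim.count_in_slab}
      print(commands["step"](10))
      print("atoms in slab [0.4, 0.6):", commands["slab"](0.4, 0.6))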