WorldWideScience

Sample records for large-scale production processes

  1. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 x 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz. The corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cultures reached a maximum permittivity value; however, only the permittivity profile of infected cell cultures reached a second maximum. This effect was correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process gives sufficient time for an additional 18 runs a year, corresponding to an extra production of ~2 x 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
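
    As a back-of-envelope check of the scheduling arithmetic in this abstract, the sketch below reproduces the ~18 extra runs per year and ~2 x 10^16 extra particles, assuming a hypothetical ~48 h of seeding/turnaround time per run (a figure not stated in the record):

    ```python
    # Scheduling check: harvest at 48 hpi vs 72 hpi, with an ASSUMED
    # 48 h of seeding/turnaround per run (not stated in the abstract).
    HOURS_PER_YEAR = 365 * 24
    turnaround_h = 48
    yield_per_run = 1e15          # ~1 x 10^15 particles per 40-l run

    runs_72 = HOURS_PER_YEAR / (72 + turnaround_h)
    runs_48 = HOURS_PER_YEAR / (48 + turnaround_h)
    extra_runs = runs_48 - runs_72

    print(f"extra runs per year: {extra_runs:.0f}")                       # ~18
    print(f"extra particles per year: {extra_runs * yield_per_run:.1e}")  # ~1.8e16
    ```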

  2. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  3. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  5. Operational experience with large scale biogas production at the Promest manure processing plant in Helmond, the Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig-breeding farms. The Dutch government has set a three-way policy to reduce this excess of manure: 1. conversion of animal fodder into a product with fewer and more digestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large-scale processing plants. The first large-scale plant for the processing of liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to prove at short notice whether large-scale manure processing might contribute to solving the manure surplus problem in The Netherlands. This steering committee is a partnership of the national and provincial governments and the agricultural industry. (au)

  6. Charm production and mass scales in deep inelastic processes

    International Nuclear Information System (INIS)

    Close, F.E.; Scott, D.M.; Sivers, D.

    1976-07-01

    Because of their large mass, the production of charmed particles offers the possibility of new insight into fundamental dynamics. An approach to deep inelastic processes is discussed in which Generalized Vector Meson Dominance (GVMD) is used to extend parton model results away from the strict Bjorken scaling limit into regions where mass scales play an important role. The processes e+e- annihilation, photoproduction, deep inelastic leptoproduction, photon-photon scattering and the production of lepton pairs in hadronic collisions are discussed. The GVMD approach provides a reasonably unified framework and makes specific predictions concerning the way in which these reactions reflect an underlying flavour symmetry, broken by large mass differences. (author)

  7. Large-scale production of UO2 kernels by sol–gel process at INET

    International Nuclear Information System (INIS)

    Hao, Shaochang; Ma, Jingtao; Zhao, Xingyu; Wang, Yang; Zhou, Xiangwen; Deng, Changsheng

    2014-01-01

    In order to supply fuel elements (300,000 elements per year) for the Chinese pebble-bed modular high temperature gas cooled reactor (HTR-PM), it is necessary to scale up the production of UO2 kernels to 3–6 kgU per batch. The sol–gel process for the preparation of UO2 kernels has been improved and optimized at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, PR China, and a whole set of facilities was designed and constructed based on the process. This report briefly describes the main steps of the process, the key equipment and the production capacity of each step. Six batches of kernels for scale-up verification and four batches of kernels for fuel elements for in-pile irradiation tests have been successfully produced. The quality of the produced kernels meets the design requirements. The production capacity of the process reaches 3–6 kgU per batch

  8. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  9. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    Full Text Available The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g h^-1 at an energy consumption of about 10 Wh g^-1. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39×10^-4 Ω m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.
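
    The throughput and energy figures above can be cross-checked with a couple of lines of arithmetic; the numbers below are taken directly from the abstract:

    ```python
    # Consistency check of the reported figures.
    rate_g_per_h = 500            # production rate > 500 g/h
    energy_Wh_per_g = 10          # energy consumption ~10 Wh/g
    power_kW = rate_g_per_h * energy_Wh_per_g / 1000
    print(f"implied processing power draw: ~{power_kW:.0f} kW")    # ~5 kW

    batch_L, batch_h = 100, 4     # 100-liter batch in < 4 hours
    print(f"implied batch throughput: >= {batch_L / batch_h:.0f} L/h")
    ```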

  10. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Facing similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  11. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Facing similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction

  12. Very-large-scale production of antibodies in plants: The biologization of manufacturing.

    Science.gov (United States)

    Buyel, J F; Twyman, R M; Fischer, R

    2017-07-01

    Gene technology has facilitated the biologization of manufacturing, i.e. the use and production of complex biological molecules and systems at an industrial scale. Monoclonal antibodies (mAbs) are currently the major class of biopharmaceutical products, but they are typically used to treat specific diseases which individually have comparatively low incidences. The therapeutic potential of mAbs could also be used for more prevalent diseases, but this would require a massive increase in production capacity that could not be met by traditional fermenter systems. Here we outline the potential of plants to be used for the very-large-scale (VLS) production of biopharmaceutical proteins such as mAbs. We discuss the potential market sizes and their corresponding production capacities. We then consider available process technologies and scale-down models and how these can be used to develop VLS processes. Finally, we discuss which adaptations will likely be required for VLS production, lessons learned from existing cell culture-based processes and the food industry, and practical requirements for the implementation of a VLS process. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. For individual sites, we investigate the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
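
    The dimension-reduction-plus-clustering step described above can be sketched as follows; scikit-learn ships only an unsupervised KernelPCA, which is used here as a stand-in for the supervised variant named in the abstract, and the "moisture flux fields" are synthetic:

    ```python
    # Kernel PCA + clustering sketch with synthetic data (illustrative only).
    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # 200 flood events, each a flattened 10x10 moisture-flux field
    X = rng.normal(size=(200, 100))

    Z = KernelPCA(n_components=2, kernel="rbf", gamma=0.01).fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
    print(labels[:10])  # e.g. "regional recycling" vs "teleconnected moisture" clusters
    ```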

  14. LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY

    International Nuclear Information System (INIS)

    SCHULTZ, K.R.; BROWN, L.C.; BESENBRUCH, G.E.; HAMILTON, C.J.

    2003-01-01

    OAK B202 LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY. The ''Hydrogen Economy'' will reduce petroleum imports and greenhouse gas emissions. However, current commercial hydrogen production processes use fossil fuels and release carbon dioxide. Hydrogen produced from nuclear energy could avoid these concerns. The authors have recently completed a three-year project for the US Department of Energy whose objective was to ''define an economically feasible concept for production of hydrogen, by nuclear means, using an advanced high-temperature nuclear reactor as the energy source''. Thermochemical water-splitting, a chemical process that accomplishes the decomposition of water into hydrogen and oxygen, met this objective. The goal of the first phase of this study was to evaluate thermochemical processes which offer the potential for efficient, cost-effective, large-scale production of hydrogen and to select one for further detailed consideration. The authors selected the Sulfur-Iodine cycle. In the second phase, they reviewed all the basic reactor types for suitability to provide the high-temperature heat needed by the selected thermochemical water-splitting cycle and chose the helium gas-cooled reactor. In the third phase they designed the chemical flowsheet for the thermochemical process and estimated the efficiency and cost of the process and the projected cost of producing hydrogen. These results are summarized in this paper

  15. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  16. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
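
    To make the vertex-centric abstraction concrete, here is a toy single-machine rendering in Python of the Pregel-style superstep model that Giraph implements, using single-source shortest paths as the canonical example; Giraph itself is a Java framework, so this illustrates the programming model only, not the Giraph API:

    ```python
    # Toy Pregel-style supersteps: single-source shortest paths from "a".
    INF = float("inf")
    graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 1)], "c": []}
    dist = {v: INF for v in graph}
    messages = {"a": [0]}                    # seed message to the source

    while messages:                          # one loop iteration = one superstep
        inbox, messages = messages, {}
        for v, incoming in inbox.items():
            best = min(incoming)
            if best < dist[v]:               # vertex updates its own value...
                dist[v] = best
                for nbr, w in graph[v]:      # ...and messages its neighbours
                    messages.setdefault(nbr, []).append(best + w)

    print(dist)                              # {'a': 0, 'b': 1, 'c': 2}
    ```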

  17. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

    Full Text Available In today's world, manufacturing industries have to maintain their development and continuity in an increasingly competitive environment by decreasing their costs. The first step in the lean production transformation process is to analyze value-adding and non-value-adding activities. This study aims at applying the concepts of Value Stream Mapping (VSM) in a large-scale tractor company in Sakarya. Waste and process time are identified by mapping the current state of the platform production line. A future state is suggested, with improvements for the elimination of waste and reduction of lead time, which went from 13.08 to 4.35 days. Analyses were made using the current and future states to support the suggested improvements, and the cycle time of the platform production line improved by 8%. Results showed that VSM is a good alternative in decision-making for change in production processes.
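
    For reference, the quoted lead-time figures correspond to roughly a two-thirds reduction:

    ```python
    # Quick check of the improvement figures quoted above.
    before, after = 13.08, 4.35           # lead time, days
    print(f"lead-time reduction: {1 - after / before:.0%}")   # ~67%
    ```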

  18. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  19. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a breakeven price level. Norway possesses vast energy resources, and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  20. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study in 2005 for its members to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then a state-of-the-art survey of the electrolysis modules currently available was made. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis is evaluated. (authors)
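
    The electricity-price sensitivity evaluated in the study can be illustrated with a one-line cost model; the ~50 kWh/kg specific consumption is a typical alkaline-electrolyser figure assumed here, not a number from the record:

    ```python
    # Electricity cost per kg H2 (excludes capex and O&M); 50 kWh/kg is an
    # ASSUMED typical specific consumption for alkaline electrolysis.
    def h2_electricity_cost(price_per_kWh, specific_kWh_per_kg=50.0):
        return price_per_kWh * specific_kWh_per_kg

    for p in (0.02, 0.05, 0.10):   # $/kWh
        print(f"{p:.2f} $/kWh -> {h2_electricity_cost(p):.2f} $/kg H2")
    ```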

  1. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus, to a large extent, the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP...

  2. Comparative Study of Laboratory-Scale and Prototypic Production-Scale Fuel Fabrication Processes and Product Characteristics

    International Nuclear Information System (INIS)

    Marshall, Douglas W.

    2014-01-01

    An objective of the High Temperature Gas Reactor fuel development and qualification program for the United States Department of Energy has been to qualify fuel fabricated in prototypic production-scale equipment. The quality and characteristics of the tristructural isotropic (TRISO) coatings on fuel kernels are influenced by the equipment scale and processing parameters. The standard deviations of some TRISO layer characteristics were diminished while others became more significant in the larger processing equipment. The impact on the statistical variability of the processes and the products, as the equipment was scaled, is discussed. The prototypic production-scale processes produce test fuels meeting all fuel quality specifications. (author)

  3. Comparative Study of Laboratory-Scale and Prototypic Production-Scale Fuel Fabrication Processes and Product Characteristics

    International Nuclear Information System (INIS)

    2014-01-01

    An objective of the High Temperature Gas Reactor fuel development and qualification program for the United States Department of Energy has been to qualify fuel fabricated in prototypic production-scale equipment. The quality and characteristics of the tristructural isotropic coatings on fuel kernels are influenced by the equipment scale and processing parameters. Some characteristics affecting product quality were suppressed while others became more significant in the larger equipment. Changes to the composition and method of producing resinated graphite matrix material have eliminated the use of hazardous, flammable liquids and enabled it to be procured as a vendor-supplied feedstock. A new method of overcoating TRISO particles with the resinated graphite matrix eliminates the use of hazardous, flammable liquids, produces highly spherical particles with a narrow size distribution, and attains product yields in excess of 99%. Compact fabrication processes have been scaled up and automated, with relatively minor changes in compact quality compared with the manual laboratory-scale processes. The impact on the statistical variability of the processes and the products as the equipment was scaled is discussed. The prototypic production-scale processes produce test fuels that meet fuel quality specifications.

  4. Computational Modelling of Large Scale Phage Production Using a Two-Stage Batch Process

    Directory of Open Access Journals (Sweden)

    Konrad Krysiak-Baltyn

    2018-04-01

    Full Text Available Cost effective and scalable methods for phage production are required to meet an increasing demand for phage, as an alternative to antibiotics. Computational models can assist the optimization of such production processes. A model is developed here that can simulate the dynamics of phage population growth and production in a two-stage, self-cycling process. The model incorporates variable infection parameters as a function of bacterial growth rate and employs ordinary differential equations, allowing application to a setup with multiple reactors. The model provides simple cost estimates as a function of key operational parameters including substrate concentration, feed volume and cycling times. For the phage and bacteria pairing examined, costs and productivity varied by three orders of magnitude, with the lowest cost found to be most sensitive to the influent substrate concentration and low level setting in the first vessel. An example case study of phage production is also presented, showing how parameter values affect the production costs and estimating production times. The approach presented is flexible and can be used to optimize phage production at laboratory or factory scale by minimizing costs or maximizing productivity.
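
    A minimal batch version of this kind of phage-production model is sketched below: mass-action infection of a logistically growing host, integrated with scipy. All parameter values are illustrative assumptions; the published model additionally makes the infection parameters depend on bacterial growth rate and couples multiple vessels:

    ```python
    # Minimal bacteria/phage batch ODE sketch (illustrative parameters).
    import numpy as np
    from scipy.integrate import odeint

    mu, K = 0.8, 1e9          # host growth rate (1/h), carrying capacity (cells/mL)
    k_ads = 1e-9              # phage adsorption rate constant (mL/h)
    burst, tau = 100, 2.0     # burst size, effective lysis time (h)

    def rhs(y, t):
        S, I, P = y           # susceptible cells, infected cells, free phage
        infection = k_ads * S * P
        dS = mu * S * (1 - (S + I) / K) - infection
        dI = infection - I / tau
        dP = burst * I / tau - infection
        return [dS, dI, dP]

    t = np.linspace(0, 12, 200)
    S, I, P = odeint(rhs, [1e6, 0.0, 1e4], t).T
    print(f"final phage titre: {P[-1]:.2e} PFU/mL")
    ```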

  5. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large-scale irradiation is discussed in terms of the share of processing cost in the final value of the improved product. Another factor taken into account is the saturation of the market with the new product. In successful projects the share of the irradiation cost is low, and the demand for the better product is covered. The limited availability of sources makes even modest saturation of the market difficult if all food is to receive correct radiation treatment. Implementation of food preservation requires a deliberate selection of those kinds of food which meet all the conditions, i.e. acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favours the use of low-energy electron beams. The conditions for successful processing are best fulfilled in the group of dry foods, expensive spices in particular. (author)

  6. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large-scale irradiation is discussed in terms of the share of processing cost in the final value of the improved product. Another factor taken into account is the saturation of the market with the new product. In successful projects the share of the irradiation cost is low, and the demand for the better product is covered. The limited availability of sources makes even modest saturation of the market difficult if all food is to receive correct radiation treatment. Implementation of food preservation requires a deliberate selection of those kinds of food which meet all the conditions, i.e. acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favours the use of low-energy electron beams. The conditions for successful processing are best fulfilled in the group of dry foods, expensive spices in particular.

  7. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    Science.gov (United States)

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

    Isoprenoids and alkanes produced and secreted by microorganisms are emerging as alternative biofuels for diesel and jet fuel replacement. As in other bioprocesses comprising an organic liquid phase, the presence of microorganisms, the medium composition, and the process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding at the microscale can give insights into how to improve large-scale processes, and we review the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Large-scale enzymatic production of natural flavour esters in organic solvent with continuous water removal.

    Science.gov (United States)

    Gubicza, L; Kabiri-Badr, A; Keoves, E; Belafi-Bako, K

    2001-11-30

    A new, large-scale process was developed for the enzymatic production of low molecular weight flavour esters in organic solvent. Solutions for the elimination of substrate and product inhibitions are presented. The excess water produced during the process was continuously removed by hetero-azeotropic distillation and esters were produced at yields of over 90%.
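
    The role of water removal can be made explicit with the esterification equilibrium A + B <=> E + W. For an equimolar feed where only a fraction f of the produced water remains in the reactor, the equilibrium conversion x follows from K = f x^2 / (1 - x)^2; the K value below is an assumption for illustration:

    ```python
    # Equilibrium conversion vs. water removal (K_eq = 2 is an ASSUMED value).
    from math import sqrt

    def conversion(K, f):
        """Solve K = (x * f*x) / (1 - x)**2 for conversion x in (0, 1)."""
        if f == 0:
            return 1.0      # complete water removal drives the reaction to completion
        r = sqrt(K / f)
        return r / (1 + r)

    for f in (1.0, 0.1, 0.01):
        print(f"water retained f = {f:>4}: conversion = {conversion(2.0, f):.0%}")
    ```

    With 99% of the water removed, the computed conversion exceeds 90%, consistent with the yields reported above.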

  9. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost large-scale production, has made these materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically...

  10. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
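
    The demonstration figures can be cross-checked with simple arithmetic, assuming the 5500 L/hr refers to standard litres of hydrogen:

    ```python
    # Back-of-envelope energy check of the 15 kW-scale demonstration.
    mol_per_h = 5500 / 22.414          # ideal-gas molar volume, L/mol at STP
    kg_per_h = mol_per_h * 2.016e-3    # molar mass of H2
    lhv_kWh_per_kg = 33.3              # lower heating value of hydrogen

    print(f"H2 output: {kg_per_h:.2f} kg/h")
    print(f"energy content: {kg_per_h * lhv_kWh_per_kg:.1f} kW (LHV)")
    # ~16 kW of hydrogen from a ~15 kW electrical stack: the balance comes
    # from high-temperature process heat, which is the point of HTE.
    ```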

  11. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full physics data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.

  12. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.
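
    The core step of the wind model, converting mesoscale wind speeds into normalized power, can be sketched with a generic turbine power curve; the piecewise-cubic curve below is a common textbook approximation, not the model's actual curve:

    ```python
    # Generic normalized turbine power curve (illustrative approximation).
    import numpy as np

    def normalized_power(v, v_in=3.0, v_rated=12.0, v_out=25.0):
        """Normalized output in [0, 1] for wind speed v (m/s)."""
        v = np.asarray(v, dtype=float)
        p = np.clip((v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0, 1.0)
        return np.where((v < v_in) | (v > v_out), 0.0, p)

    hourly_wind = np.array([2.0, 6.5, 11.0, 14.0, 26.0])  # m/s from a mesoscale model
    print(normalized_power(hourly_wind))   # [0.   0.15 0.77 1.   0.  ]
    ```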

  13. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, and there has been no research work in related fields at home or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) paradigm of "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many tasks need to be settled in the areas of management and technology, such as workshop structure evolution, development of intelligent equipment and changes in the business model, and with them comes the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the plane segmentation intelligent workshop transformation.

  14. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physicochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolution. The increasing demand

  15. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  16. Polymerase-endonuclease amplification reaction (PEAR) for large-scale enzymatic production of antisense oligonucleotides.

    Directory of Open Access Journals (Sweden)

    Xiaolong Wang

    Full Text Available Antisense oligonucleotides targeting microRNAs or their mRNA targets prove to be powerful tools for molecular biology research and may eventually emerge as new therapeutic agents. Synthetic oligonucleotides are often contaminated with highly homologous failure sequences. Synthesis of a given oligonucleotide is difficult to scale up because it requires expensive equipment, hazardous chemicals and a tedious purification process. Here we report a novel thermocyclic reaction, the polymerase-endonuclease amplification reaction (PEAR), for the amplification of oligonucleotides. A target oligonucleotide and a tandem repeated antisense probe are subjected to repeated cycles of denaturing, annealing, elongation and cleaving, in which thermostable DNA polymerase elongation and strand slipping generate duplex tandem repeats, and thermostable endonuclease (PspGI) cleavage releases monomeric duplex oligonucleotides. Each round of PEAR achieves over 100-fold amplification. The product can be used directly in a further round of PEAR, and the process can be repeated. In addition to avoiding dangerous materials and improving product purity, this reaction is easy to scale up and amenable to full automation. PEAR has the potential to be a useful tool for large-scale production of antisense oligonucleotide drugs.
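
    The ">100-fold per round" figure implies rapid scale-up over repeated rounds; starting from an assumed 1 ng of target oligonucleotide:

    ```python
    # Cumulative PEAR yield at 100-fold amplification per round,
    # from an ASSUMED 1 ng starting quantity.
    start_g = 1e-9
    for n in range(1, 6):
        print(f"after round {n}: {start_g * 100**n:.0e} g")   # round 5: ~10 g
    ```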

  17. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10^6 m^3/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet) including liquid fuels with no net greenhouse gas emissions and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen, and (3) create the industrial partnerships to commercialize such technologies. (author)
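
    For scale, the "large user" threshold quoted above corresponds to roughly 90 tonnes of hydrogen per day, assuming normal cubic metres:

    ```python
    # >10^6 m^3/day expressed as mass (H2 density ~0.0899 kg per normal m^3).
    mass_t_per_day = 1e6 * 0.0899 / 1000
    print(f"about {mass_t_per_day:.0f} t H2/day")   # ~90 t/day
    ```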

  18. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and challenges that arise from being able to process SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  19. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported, as has the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation, including reducing tritium removal effectiveness; energy recovery; improving the tolerance of impurities; and use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  20. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  1. Electrolytic production of light lanthanides from molten chloride alloys on a large laboratory scale

    International Nuclear Information System (INIS)

    Szklarski, W.; Bogacz, A.; Strzyzewska, M.

    1979-01-01

    Literature data relating to the electrolytic production of rare earth metals are presented. Conditions and results are given for the authors' own investigations into the electrolysis of light lanthanide (La-Nd) chloride solutions in molten potassium and sodium chlorides, conducted on a large laboratory scale using molybdenum, iron, cobalt and zinc cathodes. Design schemes of the electrolysers employed are included. (author)

  2. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by ongoing environmental degradation through processes such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although the estimated gains in food production have deliberately been overestimated and the losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  3. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)
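
    For orientation, the "simple scale breaking behaviour" of moments in models with logarithmic corrections to scaling typically takes the generic leading-log form below (schematic, not the paper's exact result):

    ```latex
    M_n(Q^2) \equiv \int_0^1 dx\, x^{n-2} F_2(x, Q^2)
    \;\sim\; M_n(Q_0^2)\left[\frac{\ln(Q^2/\Lambda^2)}{\ln(Q_0^2/\Lambda^2)}\right]^{-d_n}
    ```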

  4. Technical data summary: Uranium(IV) production using a large scale electrochemical cell

    International Nuclear Information System (INIS)

    Hsu, T.C.

    1984-05-01

    This Technical Data Summary outlines an electrochemical process to produce U(IV), in the form of uranous nitrate, from U(VI), as uranyl nitrate. U(IV) with hydrazine could then be used as an alternative plutonium reductant to substantially reduce the waste volume from the Purex solvent extraction process. This TDS is divided into three parts. The first part (Chapters I to IV) generally describes the electrochemical production of U(IV). The second part (Chapters V to VII) describes a pilot-scale U(IV) production facility that was constructed and operated at an engineering semiworks area of SRP, referred to as TNX. The last part (Chapter VIII) describes a preliminary design for a full-scale facility that would meet the projected need for U(IV) as a reductant in SRP's separations processes. The preliminary design was described in a Basic Data Summary for the U(IV) production facility, and a Venture Guidance Appraisal (VGA) was prepared from the Basic Data Summary. The VGA for the U(IV) process showed that because of the large capital investment required, this approach to waste reduction was not economically competitive with another alternative that required only modifying the ongoing Purex process at no additional capital cost. However, implementing the U(IV) process as part of an overall canyon renovation, presently scheduled for the 1990's, may be economically attractive. The purpose of this TDS is therefore to bring together the information and experience obtained thus far in the U(IV) program so that a useful body of information will be available to support any future development of this process

  5. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    Energy Technology Data Exchange (ETDEWEB)

    Crater, Jason [Genomatica, Inc., San Diego, CA (United States); Galleher, Connor [Genomatica, Inc., San Diego, CA (United States); Lievense, Jeff [Genomatica, Inc., San Diego, CA (United States)

    2017-05-12

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
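
    The compartment-model idea recommended above can be illustrated with a few well-mixed zones in series, each with oxygen transfer and a black-box uptake term; every parameter below is an illustrative assumption, not a value from NREL's or Genomatica's models:

    ```python
    # Toy bubble-column compartment model (illustrative parameters only).
    import numpy as np
    from scipy.integrate import odeint

    N = 5
    kla = np.linspace(150.0, 90.0, N)   # 1/h; more transfer near the sparger
    c_sat = 7.0e-3                      # g/L dissolved-O2 saturation
    our = 0.25                          # g/L/h black-box oxygen uptake rate
    q = 10.0                            # 1/h exchange between neighbouring zones

    def rhs(c, t):
        dc = kla * (c_sat - c) - our          # transfer minus uptake
        dc[:-1] += q * (c[1:] - c[:-1])       # mixing with the zone above
        dc[1:] += q * (c[:-1] - c[1:])        # mixing with the zone below
        return dc

    t = np.linspace(0.0, 1.0, 50)
    profile = odeint(rhs, np.full(N, c_sat), t)[-1]
    print(profile)   # quasi-steady dissolved-O2 profile along the column
    ```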

  6. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) is being evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube was made with a large-scale hollow capsule, long-length claddings were manufactured from it, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacture of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tubes manufactured with small-scale cans, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using the large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the process of manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  7. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the costs of a solar module by 50%, by means of increasing the conversion efficiency as well as developing cheap processes for large-scale production.

  8. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in its daily production. In the kitchen … not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results … show that Lean can be applied and used to manage the production of meals in the kitchen…

  9. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
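
    The static, least-squares branch of such a framework can be sketched generically: regress the KPI on the process variables using fault-free training data, then flag samples whose squared prediction residual exceeds a threshold taken from the training distribution. This is a generic residual-based detector on simulated data, not the authors' exact algorithm; the 99th-percentile threshold is an assumption.

```python
# Sketch of KPI-based fault detection via least squares: model the KPI from
# process variables on fault-free data, then flag large squared residuals.
# Data, noise level and the 99th-percentile threshold are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated fault-free training data: 5 process variables, 1 KPI.
X_train = rng.normal(size=(500, 5))
beta_true = np.array([1.0, -0.5, 0.3, 0.0, 2.0])
y_train = X_train @ beta_true + 0.1 * rng.normal(size=500)

# Least-squares model and residual-based detection threshold.
beta, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
r2_train = (y_train - X_train @ beta) ** 2
threshold = np.quantile(r2_train, 0.99)

def detect(X, y):
    """Boolean fault flag per sample."""
    return (y - X @ beta) ** 2 > threshold

# Healthy test batch vs. the same batch with an additive KPI fault.
X_test = rng.normal(size=(100, 5))
y_ok = X_test @ beta_true + 0.1 * rng.normal(size=100)
y_fault = y_ok + 1.0   # additive fault on the KPI (assumption)
print("false alarm rate:", detect(X_test, y_ok).mean())
print("detection rate  :", detect(X_test, y_fault).mean())
```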

  10. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is that of producing uniformly mixed metal oxides on a fine scale. Production of both yttrium and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas. Evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination

  11. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  12. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study has proposed a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology covers only part of the overall design procedure and has various limitations. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operation restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system.

  13. Large-scale methanol plants. [Based on Japanese-developed process

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made on how to produce methanol economically; methanol is expected to be a growth product for use as a pollution-free energy carrier and as a chemical feedstock. The study centered on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems attending the expansion of scale. The results of this study have already been adopted in actual plants, with good results, and large-scale methanol plants are about to be realized.

  14. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software

  15. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by the labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Initially, an advanced hybrid-cloud computing science data system was leveraged for large-scale processing, and machine learning approaches were augmented for automated analysis of various quality metrics. Machine learning-based user training of features, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics for improving the production quality of geodetic data products.
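
    The cross-validated prediction-model step can be sketched with a toy classifier over hypothetical per-product QA metrics; the features, labels and model below are illustrative assumptions, not the ARIA pipeline.

```python
# Toy sketch of ML-based QA triage for SAR products: train a classifier on
# per-product quality metrics and cross-validate it. Features, labels and
# the model choice are illustrative assumptions, not the ARIA pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400

# Hypothetical QA metrics per interferogram product.
coherence = rng.uniform(0.1, 0.9, n)       # mean coherence
residues = rng.poisson(50, n)              # phase-unwrapping residue count
masked = rng.uniform(0.0, 0.5, n)          # fraction of masked pixels
X = np.column_stack([coherence, residues, masked])

# Hypothetical QA label: good products are coherent with few residues.
y = (coherence > 0.4) & (residues < 70)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated QA accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```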

  16. Large-scale production and properties of human plasma-derived activated Factor VII concentrate.

    Science.gov (United States)

    Tomokiyo, K; Yano, H; Imamura, M; Nakano, Y; Nakagaki, T; Ogata, Y; Terano, T; Miyamoto, S; Funatsu, A

    2003-01-01

    An activated Factor VII (FVIIa) concentrate, prepared from human plasma on a large scale, has to date not been available for clinical use for haemophiliacs with antibodies against FVIII and FIX. In the present study, we attempted to establish a large-scale manufacturing process to obtain plasma-derived FVIIa concentrate with high recovery and safety, and to characterize its biochemical and biological properties. FVII was purified from human cryoprecipitate-poor plasma by a combination of anion exchange and immunoaffinity chromatography, using a Ca2+-dependent anti-FVII monoclonal antibody. To activate FVII, a FVII preparation that was nanofiltered using a Bemberg Microporous Membrane-15 nm was partially converted to FVIIa by autoactivation on an anion-exchange resin. The residual FVII in the FVII and FVIIa mixture was completely activated by further incubating the mixture in the presence of Ca2+ for 18 h at 10 degrees C, without any additional activators. For preparation of the FVIIa concentrate, after dialysis of FVIIa against 20 mM citrate, pH 6.9, containing 13 mM glycine and 240 mM NaCl, the FVIIa preparation was supplemented with 2.5% human albumin (which was first pasteurized at 60 degrees C for 10 h) and lyophilized in vials. To inactivate viruses contaminating the FVIIa concentrate, the lyophilized product was further heated at 65 degrees C for 96 h in a water bath. Total recovery of FVII from 15 000 l of plasma was approximately 40%, and the FVII preparation was fully converted to FVIIa with trace amounts of degraded products (FVIIabeta and FVIIagamma). The specific activity of the FVIIa was approximately 40 U/µg. Furthermore, virus-spiking tests demonstrated that immunoaffinity chromatography, nanofiltration and dry-heating effectively removed and inactivated the spiked viruses in the FVIIa. These results indicated that the FVIIa concentrate had both high specific activity and safety. We established a large-scale manufacturing process of human plasma-derived FVIIa concentrate.

  17. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements that form the interconnection of large-scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to connect thousands of nodes into large-scale SCI-based processing systems, these nodes have to be interconnected by switch elements forming different topologies. A summary of the requirements and key points of interconnection networks and switches is presented. Two models of SCI switch elements are proposed. The authors investigate, by simulation, several examples of systems constructed from 4-switches and analyze the results. Some issues and enhancements are discussed to provide the ideas behind the switch design that can improve performance and reduce latency. 29 refs., 11 figs., 3 tabs

  18. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large-scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today's typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, the dimensioning of key equipment for large-scale liquefiers, such as turbines, compressors and heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  19. Large-Scale Production of Fuel and Feed from Marine Microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Huntley, Mark [Cornell Univ., Ithaca, NY (United States)

    2015-09-30

    In summary, this Consortium has demonstrated a fully integrated process for the production of biofuels and high-value nutritional bioproducts at pre-commercial scale. We have achieved unprecedented yields of algal oil, and converted the oil to viable fuels. We have demonstrated the potential value of the residual product as a viable feed ingredient for many important animals in the global food supply.

  20. Higher-Twist Dynamics in Large Transverse Momentum Hadron Production

    International Nuclear Information System (INIS)

    Arleo, Francois

    2009-01-01

    A scaling law analysis of the world data on inclusive large-p⊥ hadron production in hadronic collisions is carried out. A significant deviation from leading-twist perturbative QCD predictions at next-to-leading order is reported. The observed discrepancy is largest at high values of x⊥ = 2p⊥/√s. In contrast, the production of prompt photons and jets exhibits a scaling behavior close to the conformal limit, in agreement with the leading-twist expectation. These results bring evidence for a non-negligible contribution of higher-twist processes in large-p⊥ hadron production in hadronic collisions, where the hadron is produced directly in the hard subprocess rather than by gluon or quark jet fragmentation. Predictions for scaling exponents at RHIC and LHC are given, and it is suggested to trigger on isolated large-p⊥ hadron production to enhance higher-twist processes.

  1. LARGE SCALE METHOD FOR THE PRODUCTION AND PURIFICATION OF CURIUM

    Science.gov (United States)

    Higgins, G.H.; Crane, W.W.T.

    1959-05-19

    A large-scale process for the production and purification of Cm-242 is described. Aluminum slugs containing Am are irradiated and declad in a NaOH–NaNO3 solution at 85 to 100 deg C. The resulting slurry is filtered and washed with NaOH, NH4OH, and H2O. Recovery of Cm from the filtrate and washings is effected by an Fe(OH)3 precipitation. The precipitates are then combined and dissolved in HCl, and refractory oxides are centrifuged out. These oxides are then fused with Na2CO3 and dissolved in HCl. The solution is evaporated and LiCl solution added. The Cm, rare earths, and anionic impurities are adsorbed on a strong-base anion exchange resin. Impurities are eluted with a LiCl–HCl solution; the rare earths and Cm are then eluted by HCl. Other ion exchange steps further purify the Cm. The Cm is then precipitated as the fluoride and used in this form or further purified and processed. (T.R.H.)

  2. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
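
    Steps (2), (3) and (5) of such a pipeline can be sketched for a single channel with standard signal-processing calls; the filter band, threshold rule and synthetic signal below are illustrative assumptions, not the authors' parameter choices.

```python
# Sketch of three pipeline steps on one synthetic EEG channel: band-limited
# filtering, amplitude-threshold spike detection, and Welch PSD estimation.
# Band, threshold and the synthetic signal are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks, welch

fs = 1000.0                          # sampling rate, Hz (assumption)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
eeg = rng.normal(0, 10, t.size)      # synthetic background activity, uV
eeg[::1000] += 120.0                 # injected spike-like events

# (2) user-defined band frequency waveform, e.g. a 4-12 Hz band (assumption)
b, a = butter(4, [4.0, 12.0], btype="bandpass", fs=fs)
band_wave = filtfilt(b, a, eeg)

# (3) simple amplitude-threshold spike detection at 5 standard deviations
peaks, _ = find_peaks(eeg, height=5 * eeg.std(), distance=int(0.05 * fs))

# (5) power spectral density via Welch's method
freqs, psd = welch(eeg, fs=fs, nperseg=2048)

print("detected spikes:", peaks.size)
print("dominant PSD frequency: %.1f Hz" % freqs[np.argmax(psd)])
```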

  3. Multi-scale process and supply chain modelling: from lignocellulosic feedstock to process and products.

    Science.gov (United States)

    Hosseini, Seyed Ali; Shah, Nilay

    2011-04-06

    There is a large body of literature regarding the choice and optimization of different processes for converting feedstock to bioethanol and bio-commodities; moreover, there has been some reasonable technological development in bioconversion methods over the past decade. However, the eventual cost and other important metrics relating to sustainability of biofuel production will be determined not only by the performance of the conversion process, but also by the performance of the entire supply chain from feedstock production to consumption. Moreover, in order to ensure world-class biorefinery performance, both the network and the individual components must be designed appropriately, and allocation of resources over the resulting infrastructure must effectively be performed. The goal of this work is to describe the key challenges in bioenergy supply chain modelling and then to develop a framework and methodology to show how multi-scale modelling can pave the way to answer holistic supply chain questions, such as the prospects for second generation bioenergy crops.

  5. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also the communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  6. Low-Cost and Scaled-Up Production of Fluorine-Free, Substrate-Independent, Large-Area Superhydrophobic Coatings Based on Hydroxyapatite Nanowire Bundles.

    Science.gov (United States)

    Chen, Fei-Fei; Yang, Zi-Yue; Zhu, Ying-Jie; Xiong, Zhi-Chao; Dong, Li-Ying; Lu, Bing-Qiang; Wu, Jin; Yang, Ri-Long

    2018-01-09

    To date, the scaled-up production and large-area applications of superhydrophobic coatings have been limited by the complicated procedures, environmentally harmful fluorinated compounds, restrictive substrates, and expensive equipment and raw materials usually involved in the fabrication process. Herein, the facile, low-cost, and green production of superhydrophobic coatings based on hydroxyapatite nanowire bundles (HNBs) is reported. Hydrophobic HNBs are synthesised using a one-step solvothermal method with oleic acid as the structure-directing and hydrophobic agent. During the reaction process, highly hydrophobic C-H groups of oleic acid molecules are attached in situ to the surface of the HNBs through the chelate interaction between Ca²⁺ ions and carboxylic groups. This facile synthetic method allows the scaled-up production of HNBs up to about 8 L, which is the largest production scale of superhydrophobic paint based on HNBs ever reported. The design of a 100 L reaction system is also shown. The HNBs can be coated on any substrate with an arbitrary shape by the spray-coating technique. The self-cleaning ability in air and oil, high-temperature stability, and excellent mechanical durability of the as-prepared superhydrophobic coatings are demonstrated. More importantly, the HNBs are coated on large-sized practical objects to form large-area superhydrophobic coatings. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set for the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  8. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    Science.gov (United States)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  9. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    Science.gov (United States)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

    Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize α-Si3N4 nanowires on a very large scale (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas precursor supersaturation and of the liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, transmission electron microscopy and room-temperature photoluminescence measurements. The yield of the products relates not only to the reaction temperature (thermodynamic condition) but also to the distribution of gas precursors (kinetic condition). As revealed in this research, by controlling the gas diffusion process, the yield of the nanowire products could be greatly improved. The experimental results indicate that supersaturation is the dominant factor in the as-designed system rather than the catalyst. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products would have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift that could be valuable for future applications in blue-green emitting devices. There is no doubt that large-scale products are the basis of these applications.

  10. Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity

    Science.gov (United States)

    Morris, C. K.; Knighton, J.

    2017-12-01

    Nitrous oxide is produced by the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that the representation of spatial heterogeneity in inputs, specifically soil moisture, causes inaccuracies in estimating the average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is under-predicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5 (CLM). The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide a wider context to the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
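
    The under-prediction the authors describe is essentially Jensen's inequality applied to a nonlinear flux response, which a toy calculation makes concrete; the convex response function below is a hypothetical stand-in, not the CLM4.5 parameterization.

```python
# Toy illustration of the averaging bias: for a convex response of N2O flux
# to soil moisture, the flux computed at the grid-average moisture is lower
# than the average of the local fluxes (Jensen's inequality). The response
# function is a hypothetical stand-in, not CLM4.5's parameterization.
import numpy as np

rng = np.random.default_rng(3)

def n2o_flux(theta):
    """Hypothetical convex flux response to soil moisture theta in [0, 1]."""
    return theta ** 4

theta = rng.uniform(0.1, 0.9, 10_000)   # heterogeneous sub-grid moisture
flux_at_mean = n2o_flux(theta.mean())   # what a coarse model computes
mean_of_flux = n2o_flux(theta).mean()   # what the surface actually emits

print("flux at mean moisture :", round(flux_at_mean, 4))
print("mean of local fluxes  :", round(mean_of_flux, 4))
# The second value is larger: averaging the input under-predicts production.
```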

  11. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  12. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither has the political system's competence to make decisions, nor can it successfully judge by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science. These are: 1) external control, 2) the form of organization, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  13. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    Science.gov (United States)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among the different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of a Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variations in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit from the application of the Six Sigma methodology.

  14. A test trial irradiation of natural rubber latex on large scale for the production of examination gloves in a production scale

    International Nuclear Information System (INIS)

    Devendra, R.; Kulatunge, S.; Chandralal, H.N.K.K.; Kalyani, N.M.V.; Seneviratne, J.; Wellage, S.

    1996-01-01

    Radiation vulcanization of natural rubber latex has been developed extensively through various research and development programmes. During these investigations much data was collected, and from these data it was proved that radiation-vulcanized natural rubber latex (RVNRL) can be used as a new material for industry (RVNRL symposium 1989; Makuuchi IAEA report). This material has been extensively tested in the making of dipped goods and extruded products. However, these investigations were confined to laboratory experiments, which mainly reflected the material properties of RVNRL; little was observed about its behaviour in actual production-scale operation. The present exercise was carried out mainly to study the behaviour of the material in production scale by irradiating latex on a large scale and producing gloves in a production-scale plant. It was found that RVNRL can be used in conventional glove plants without major alterations to the plant. The quality of the gloves that were produced using RVNRL is acceptable. It was also found that a small deviation of the vulcanization dose will affect the crosslinking density of the films; this will drastically reduce the tensile strength of the film. Crosslinking density, or pre-vulcanized relax modulus (PRM) at 100%, is a reliable property with which to control the pre-vulcanization of latex by radiation.

  15. Inkjet printing as a roll-to-roll compatible technology for the production of large area electronic devices on a pre-industrial scale

    NARCIS (Netherlands)

    Teunissen, P.; Rubingh, E.; Lammeren, T. van; Abbel, R.J.; Groen, P.

    2014-01-01

    Inkjet printing is a promising approach towards the solution processing of electronic devices on an industrial scale. Of particular interest is the production of high-end applications such as large area OLEDs on flexible substrates. Roll-to-roll (R2R) processing technologies involving inkjet

  16. Optimization of Large-Scale Culture Conditions for the Production of Cordycepin with Cordyceps militaris by Liquid Static Culture

    Directory of Open Access Journals (Sweden)

    Chao Kang

    2014-01-01

    Cordycepin is one of the most important bioactive compounds produced by species of Cordyceps sensu lato, but it is hard to produce large amounts of this substance industrially. In this work, single-factor design, Plackett-Burman design, and central composite design were employed to establish the key factors and identify the optimal culture conditions that improved cordycepin production. Using these culture conditions, a maximum cordycepin production of 2008.48 mg/L was achieved for a 700 mL working volume in 1000 mL glass jars, and the total content of cordycepin reached 1405.94 mg/bottle. This method provides an effective way of increasing cordycepin production at a large scale. The strategies used in this study could have wide application in other fermentation processes.
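
    The response-surface step behind a central composite design (CCD) can be sketched as follows: fit a quadratic model of the titre to coded factor levels and solve for the stationary point. The design points, the simulated "titres" and the optimum location below are illustrative assumptions, not the paper's data.

```python
# Sketch of CCD response-surface analysis: fit a quadratic model of product
# titre to two coded factors and locate the stationary point. The simulated
# data and the optimum location are illustrative assumptions.
import numpy as np

# Two-factor face-centred CCD in coded units: factorial, axial and centre
# points (three centre replicates).
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1],
                [0, 0], [0, 0], [0, 0]], dtype=float)
x1, x2 = pts[:, 0], pts[:, 1]

# Simulated cordycepin titre with a maximum near (0.4, -0.2) (assumption).
rng = np.random.default_rng(4)
y = 2000 - 300 * (x1 - 0.4) ** 2 - 200 * (x2 + 0.2) ** 2 \
    + rng.normal(0, 10, len(pts))

# Quadratic model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point: set the gradient of the fitted surface to zero.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
optimum = np.linalg.solve(H, -b[1:3])
print("estimated optimum (coded factor levels):", optimum.round(2))
```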

  17. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    Science.gov (United States)

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
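
    The partition-then-infer idea can be sketched with one common network-construction method, a thresholded co-expression (Pearson correlation) network built separately per tissue partition; the data, labels and threshold below are illustrative assumptions, and the paper's two construction methods may differ.

```python
# Toy sketch of partitioned network inference: split samples by tissue
# label, then build a thresholded Pearson co-expression network for each
# partition. Data, labels and the 0.3 threshold are assumptions, and
# correlation thresholding is just one common construction method.
import numpy as np

rng = np.random.default_rng(6)
n_genes, n_samples = 50, 120
expr = rng.normal(size=(n_samples, n_genes))          # simulated expression
tissue = rng.choice(["root", "leaf"], size=n_samples) # hypothetical labels

for t in ("root", "leaf"):
    sub = expr[tissue == t]                # tissue-specific partition
    corr = np.corrcoef(sub, rowvar=False)  # gene-gene Pearson correlation
    np.fill_diagonal(corr, 0.0)
    edges = np.abs(corr) > 0.3             # threshold is an assumption
    print(t, "network edges:", int(edges.sum() // 2))
```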

  18. Engineered catalytic biofilms for continuous large scale production of n-octanol and (S)-styrene oxide.

    Science.gov (United States)

    Gross, Rainer; Buehler, Katja; Schmid, Andreas

    2013-02-01

    This study evaluates the technical feasibility of biofilm-based biotransformations at an industrial scale by theoretically designing a process employing membrane fiber modules of the kind used in the chemical industry, and compares the respective process parameters to classical stirred-tank studies. To our knowledge, catalytic biofilm processes for fine chemicals production have so far not been reported on a technical scale. As model reactions, we applied the previously studied asymmetric styrene epoxidation employing Pseudomonas sp. strain VLB120ΔC biofilms and the selective alkane hydroxylation described here. Using the non-heme iron containing alkane hydroxylase system (AlkBGT) from P. putida GPo1 in the recombinant P. putida PpS81 pBT10 biofilm, we were able to continuously produce 1-octanol from octane with a maximal productivity of 1.3 g L⁻¹ day⁻¹ (aqueous phase) in a single-tube micro reactor. For a possible industrial application, a cylindrical membrane fiber module packed with 84,000 polypropylene fibers is proposed. Based on the calculations presented here, 59 membrane fiber modules (of 0.9 m diameter and 2 m length) would suffice to realize a production process of 1,000 tons/year of styrene oxide. Moreover, the product yield on carbon can at least be doubled, and over 400-fold less biomass waste would be generated compared to classical stirred-tank reactor processes. For the octanol process, instead, further intensification of the biological activity and/or enlargement of the membrane surface is required to reach production scale. By taking into consideration challenges such as biomass growth control and maintaining a constant biological activity, this study shows that a biofilm process at an industrial scale for the production of fine chemicals is a sustainable alternative in terms of product yield and biomass waste production. Copyright © 2012 Wiley Periodicals, Inc.

  19. Large-scale membrane transfer process: its application to single-crystal-silicon continuous membrane deformable mirror

    International Nuclear Information System (INIS)

    Wu, Tong; Sasaki, Takashi; Hane, Kazuhiro; Akiyama, Masayuki

    2013-01-01

    This paper describes a large-scale membrane transfer process developed for the construction of large-scale membrane devices via the transfer of continuous single-crystal-silicon membranes from one substrate to another. The technique is applied to fabricating a large-stroke deformable mirror. A bimorph spring array is used to generate a large air gap between the mirror membrane and the electrode. A 1.9 mm × 1.9 mm × 2 µm single-crystal-silicon membrane is successfully transferred to the electrode substrate by Au–Si eutectic bonding and a subsequent all-dry release process. This process provides an effective approach for transferring a free-standing, large, continuous single-crystal-silicon membrane onto a flexible suspension spring array with a large air gap. (paper)

  20. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such sometimes inherently unstable molecules, it is important to minimize the product's residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality was shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for the manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  1. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information

  2. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    International Nuclear Information System (INIS)

    Ma, Huiqiang; Shi, Zhenyu; Li, Shuang; Liu, Na

    2016-01-01

    Highlights: • A microwave method for synthesizing g-C₃N₄ with N₂ photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C₃N₄. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C₃N₄) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N₂ adsorption, UV–vis spectroscopy, SEM, N₂-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment can form many irregular pores in the as-prepared g-C₃N₄, which causes an increased surface area and separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C₃N₄. These nitrogen vacancies not only serve as active sites to adsorb and activate N₂ molecules but also promote interfacial charge transfer from the catalyst to N₂ molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for the large-scale production of g-C₃N₄, which is significantly important for practical application.

  3. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  4. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow identifying research needs in the interdisciplinary domain of modelling and monitoring, and improving the integration of unsaturated zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.

  5. A 3D Sphere Culture System Containing Functional Polymers for Large-Scale Human Pluripotent Stem Cell Production

    Directory of Open Access Journals (Sweden)

    Tomomi G. Otsuji

    2014-05-01

    Utilizing human pluripotent stem cells (hPSCs) in cell-based therapy and drug discovery requires large-scale cell production. However, scaling up conventional adherent cultures presents challenges of maintaining a uniform high quality at low cost. In this regard, suspension cultures are a viable alternative, because they are scalable and do not require adhesion surfaces. 3D culture systems such as bioreactors can be exploited for large-scale production. However, the limitations of current suspension culture methods include spontaneous fusion between cell aggregates and suboptimal passaging methods by dissociation and reaggregation. 3D culture systems that dynamically stir carrier beads or cell aggregates should be refined to reduce the shearing forces that damage hPSCs. Here, we report a simple 3D sphere culture system that incorporates mechanical passaging and functional polymers. This setup resolves major problems associated with suspension culture methods and dynamic stirring systems and may be optimal for applications involving large-scale hPSC production.

  6. Micro-scaled products development via microforming deformation behaviours, processes, tooling and its realization

    CERN Document Server

    Fu, Ming Wang

    2014-01-01

    ‘Micro-scaled Products Development via Microforming’ presents state-of-the-art research on microforming processes and focuses on the development of micro-scaled metallic parts via microforming processes. Microforming refers to the fabrication of microparts via micro-scaled plastic deformation and represents a promising micromanufacturing process. When compared to other micromanufacturing processes, microforming offers advantages such as high productivity and good mechanical properties of the deformed microparts. This book provides extensive and informative illustrations, tables and photos in order to convey this information clearly and directly to readers. Although the knowledge of macroforming processes is abundant and widely used in industry, microparts cannot be developed by leveraging existing knowledge of macroforming, because the size effect presents a barrier to this knowledge transfer. Therefore systematic knowledge of microforming needs to be developed. In tandem with product miniaturization, t...

  7. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population increases and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, there are few studies that dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop growth varies for each region and each crop. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consisted of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model was based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model were input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, was input as irrigation water to the land surface sub-model. The timing and amount of irrigation water were simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
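
    The MCMC calibration step can be sketched with a generic random-walk Metropolis sampler estimating one region-specific parameter from observed yields under a Gaussian error model; the toy crop model, data, prior and proposal scale below are illustrative assumptions, not the coupled crop/H08 implementation.

```python
# Generic random-walk Metropolis sketch of region-specific parameter
# calibration against observed yields. The toy crop model, error model,
# flat positive prior and proposal scale are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)

def crop_model(p, temperature):
    """Hypothetical yield response with one regional parameter p."""
    return p * np.exp(-(((temperature - 22.0) / 8.0) ** 2))

temps = rng.uniform(15, 30, 30)                            # observed seasons
yields = crop_model(6.0, temps) + rng.normal(0, 0.3, 30)   # truth: p = 6

def log_post(p):
    if p <= 0:                           # flat prior on p > 0 (assumption)
        return -np.inf
    resid = yields - crop_model(p, temps)
    return -0.5 * np.sum((resid / 0.3) ** 2)

chain, p = [], 1.0
lp = log_post(p)
for _ in range(20_000):
    q = p + rng.normal(0, 0.2)           # random-walk proposal (assumption)
    lq = log_post(q)
    if np.log(rng.uniform()) < lq - lp:  # Metropolis acceptance rule
        p, lp = q, lq
    chain.append(p)

burned = np.array(chain[5_000:])
print("posterior mean of p: %.2f +/- %.2f" % (burned.mean(), burned.std()))
```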

  8. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

    Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small-scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research have demanded their production on a much larger scale, a task that can be difficult to manage with the number of producer cell culture flasks required for large volumes of vector. To generate a large-scale, partially closed-system method for manufacturing clinical-grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm² flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed-system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open-system manipulation.

  9. Towards large-scale production of solution-processed organic tandem modules based on ternary composites: Design of the intermediate layer, device optimization and laser based module processing

    DEFF Research Database (Denmark)

    Li, Ning; Kubis, Peter; Forberich, Karen

    2014-01-01

    on commercially available materials, which enhances the absorption of poly(3-hexylthiophene) (P3HT) and as a result increases the PCE of P3HT-based large-scale OPV devices; 3. laser-based module processing, which provides excellent processing resolution and as a result can bring the power conversion efficiency (PCE) of mass-produced organic photovoltaic (OPV) devices close to the highest PCE values achieved for lab-scale solar cells, through a significant increase in the geometrical fill factor. We believe that the combination of the above-mentioned concepts provides a clear roadmap to push OPV towards...

  10. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian. Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10^7 ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  11. Large-scale Lurgi plant would be uneconomic: study group

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-21

    The Gas Council and National Coal Board agreed that the building of a large-scale Lurgi plant on the basis of the study is not at present acceptable on economic grounds. The committee considered that new processes based on naphtha offered more economic sources of base- and peak-load production. Tables listing data provided in the contractors' design studies and a summary of the contractors' process designs are included.

  12. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Huiqiang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China); Shi, Zhenyu; Li, Shuang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); Liu, Na, E-mail: Naliujlu@163.com [College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China)

    2016-08-30

    Highlights: • A microwave method for synthesizing g-C{sub 3}N{sub 4} with N{sub 2} photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for the large-scale production of g-C{sub 3}N{sub 4}. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C{sub 3}N{sub 4}) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N{sub 2} adsorption, UV–vis spectroscopy, SEM, N{sub 2}-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment forms many irregular pores in the as-prepared g-C{sub 3}N{sub 4}, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C{sub 3}N{sub 4}. These nitrogen vacancies not only serve as active sites to adsorb and activate N{sub 2} molecules but also promote interfacial charge transfer from the catalysts to N{sub 2} molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for the large-scale production of g-C{sub 3}N{sub 4}, which is of significant importance for practical applications.

  13. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability; this analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable (with respect to parameters, disturbances, and states) while others may not. The state estimation process is carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
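
    As a small illustration of the analysis described, the sketch below checks output-connectability (a necessary condition for structural observability) on a toy influence graph with networkx. The graph and variable names are invented; this is not the paper's Modelica/Python toolchain, which also verifies the structural rank condition.

    ```python
    import networkx as nx

    # Directed influence graph: edge u -> v means "u influences v".
    G = nx.DiGraph()
    G.add_edges_from([
        ("x1", "x2"),   # state x1 drives state x2
        ("x2", "y1"),   # state x2 is measured by output y1
        ("x3", "x3"),   # self-coupled state with no path to any output
    ])

    states = {"x1", "x2", "x3"}
    outputs = {"y1"}

    # A state is output-connectable if some directed path reaches an output.
    connected = set()
    for y in outputs:
        connected |= nx.ancestors(G, y)

    print("states lacking a path to any output:", states - connected)
    # -> {'x3'}: additional measurements would be prescribed for this subsystem
    ```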

  14. Semihard processes with BLM renormalization scale setting

    Energy Technology Data Exchange (ETDEWEB)

    Caporale, Francesco [Instituto de Física Teórica UAM/CSIC, Nicolás Cabrera 15 and U. Autónoma de Madrid, E-28049 Madrid (Spain); Ivanov, Dmitry Yu. [Sobolev Institute of Mathematics and Novosibirsk State University, 630090 Novosibirsk (Russian Federation); Murdaca, Beatrice; Papa, Alessandro [Dipartimento di Fisica, Università della Calabria, and Istituto Nazionale di Fisica Nucleare, Gruppo collegato di Cosenza, Arcavacata di Rende, I-87036 Cosenza (Italy)

    2015-04-10

    We apply the BLM scale setting procedure directly to the amplitudes (cross sections) of several semihard processes. It is shown that, due to the presence of β{sub 0}-terms in the NLA results for the impact factors, the obtained optimal renormalization scale is not universal, but depends both on the energy and on the process in question. We illustrate this general conclusion by considering the following semihard processes: (i) inclusive production of two forward high-p{sub T} jets separated by a large interval in rapidity (Mueller-Navelet jets); (ii) the high-energy behavior of the total cross section for highly virtual photons; (iii) the forward amplitude of the production of two light vector mesons in the collision of two virtual photons.

  15. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C-derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site scale to the ESM scale requires a spatial scaling approach that either explicitly resolves the relevant processes or, more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced-order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  16. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large-scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept, and the supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  17. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention on networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
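
    A minimal sketch of the two decomposition steps just described, using scikit-learn on synthetic data: affinity propagation partitions the controlled variables into subsystems, then canonical correlation analysis ranks candidate inputs for one subsystem. All variables, sizes, and data are invented.

    ```python
    import numpy as np
    from sklearn.cluster import AffinityPropagation
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    Y = rng.normal(size=(500, 6))     # 6 controlled variables, 500 samples
    X = rng.normal(size=(500, 10))    # 10 candidate input variables

    # Step 1: cluster controlled variables by similarity of their profiles.
    ap = AffinityPropagation(random_state=0).fit(Y.T)
    print("subsystem label per controlled variable:", ap.labels_)

    # Step 2: score each input by canonical correlation with the
    # controlled variables of one subsystem (cluster 0).
    Y_sub = Y[:, ap.labels_ == 0]
    scores = []
    for j in range(X.shape[1]):
        cca = CCA(n_components=1).fit(X[:, [j]], Y_sub)
        u, v = cca.transform(X[:, [j]], Y_sub)
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))

    ranked = np.argsort(scores)[::-1]
    print("inputs ranked by canonical correlation:", ranked)
    ```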

  18. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    A simple but efficient carbon-supplying device was designed and developed, and the associated carbon-supplying technology is described. The absorption characteristics of this device were studied. The carbon-supplying system proved to be economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond, and the gaseous carbon dioxide absorptivity was enhanced to above 78%, which could greatly reduce the production cost.

  19. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    Energy Technology Data Exchange (ETDEWEB)

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.; Izaurralde, Roberto C.; Kim, Seungdo; Dale, Bruce E.

    2013-07-23

    An array of feedstocks is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States, and we have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  20. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  1. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production accounts for the largest share of the computing resources currently in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  2. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedentedly large amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like WorldView-3 also pose major challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, COSMO-SkyMed, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection, using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and large numbers of input features. The algorithms cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation in temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
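
    The first speed-up strategy mentioned, random Fourier features, can be sketched in a few lines. This is a generic illustration on synthetic data, not the authors' implementation: an RBF-kernel Gaussian process posterior mean is approximated by ridge regression on randomized cosine features, cutting the O(n^3) kernel solve to O(n D^2 + D^3) for D features.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, D = 5000, 8, 200             # samples, input dim, random features
    X = rng.normal(size=(n, d))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

    gamma = 0.5                        # RBF kernel k(x,z) = exp(-gamma*||x-z||^2)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)

    def phi(X):
        """Feature map whose inner products approximate the RBF kernel."""
        return np.sqrt(2.0 / D) * np.cos(X @ W + b)

    Phi = phi(X)
    lam = 1e-3                         # ridge term, playing the role of noise variance
    alpha = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ y)

    X_test = rng.normal(size=(10, d))
    y_pred = phi(X_test) @ alpha       # approximate GP posterior mean
    ```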

  3. Biological hydrogen production by dark fermentation: challenges and prospects towards scaled-up production.

    Science.gov (United States)

    Ren, Nanqi; Guo, Wanqian; Liu, Bingfeng; Cao, Guangli; Ding, Jie

    2011-06-01

    Among the different technologies for hydrogen production, bio-hydrogen production exhibits perhaps the greatest potential to replace fossil fuels. Based on recent research on dark fermentative hydrogen production, this article reviews the following aspects of moving towards scaled-up application of this technology: bioreactor development and parameter optimization, process modeling and simulation, exploitation of cheaper raw materials, and combining dark fermentation with photo-fermentation. Bioreactors are necessary for dark-fermentation hydrogen production, so the design of the reactor type and the optimization of parameters are essential. Process modeling and simulation can help engineers design and optimize large-scale systems and operations. Use of cheaper raw materials will surely accelerate the pace of scaled-up production of biological hydrogen. Finally, combining dark fermentation with photo-fermentation holds considerable promise, and has successfully achieved the maximum overall hydrogen yield from a single substrate. Future developments in bio-hydrogen production are also discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  5. Potential for large-scale uses for fission-product Xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-03-01

    Of all fission products in spent, low-enrichment-uranium power-reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the US, radioactive xenon isotopes approach the lowest limits of detection. The separation from accompanying radioactive 85 Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with pico-curie levels of 85 Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission-product krypton during fuel processing. Economic restraints assure that the bulk of this krypton will need to be separated from the much-more-voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for uses at probably very low cost. In contrast with many other fission products which have unique radioactive characteristics which make them useful as sources of heat, gamma and x-rays, and luminescence - as well as for medicinal diagnostics and therapeutics - fission-product xenon differs from naturally occurring xenon only in its isotopic composition which gives it a slightly higher atomic weight, because of the much higher concentrations of the 134 Xe and 136 Xe isotopes. Therefore, fission-product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent light illumination in place of krypton and in human anesthesia

  6. Market competitive Fischer-Tropsch diesel production. Techno-economic and environmental analysis of a thermo-chemical Biorefinery process for large scale biosyngas-derived FT-diesel production

    International Nuclear Information System (INIS)

    Van Ree, R.; Van der Drift, A.; Zwart, R.W.R.; Boerrigter, H.

    2005-08-01

    The contents of the presentation are summarized as follows: an introduction to the Dutch policy framework, biomass availability and contractibility, and biomass transportation fuels (current use and perspectives). The next subject concerns large-scale BioSyngas production: the optimum gasification technology; the slagging entrained-flow (EF) gasifier; identification and modelling of biomass-conversion chains; overall energetic chain efficiencies, economics and environmental characteristics; and a comparison with fossil-derived diesel. Further subjects are the current technological state of the art and the R,D&D trajectory, the pre-design of a 600 MWth demonstration plant, and the conclusions

  7. On BLM scale fixing in exclusive processes

    International Nuclear Information System (INIS)

    Anikin, I.V.; Pire, B.; Szymanowski, L.; Teryaev, O.V.; Wallon, S.

    2005-01-01

    We discuss the BLM scale fixing procedure in exclusive electroproduction processes in the Bjorken regime with rather large x_B. We show that, in the case of vector meson production, dominated here by quark exchange, the usual way to apply the BLM method fails due to singularities present in the equations fixing the BLM scale. We argue that the BLM scale should be extracted from the squared amplitudes, which are directly related to observables. (orig.)

  8. On BLM scale fixing in exclusive processes

    Energy Technology Data Exchange (ETDEWEB)

    Anikin, I.V. [JINR, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Universite Paris-Sud, LPT, Orsay (France); Pire, B. [Ecole Polytechnique, CPHT, Palaiseau (France); Szymanowski, L. [Soltan Institute for Nuclear Studies, Warsaw (Poland); Univ. de Liege, Inst. de Physique, Liege (Belgium); Teryaev, O.V. [JINR, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Wallon, S. [Universite Paris-Sud, LPT, Orsay (France)

    2005-07-01

    We discuss the BLM scale fixing procedure in exclusive electroproduction processes in the Bjorken regime with rather large x{sub B}. We show that, in the case of vector meson production, dominated here by quark exchange, the usual way to apply the BLM method fails due to singularities present in the equations fixing the BLM scale. We argue that the BLM scale should be extracted from the squared amplitudes, which are directly related to observables. (orig.)

  9. Prelude to rational scale-up of penicillin production: a scale-down study.

    Science.gov (United States)

    Wang, Guan; Chu, Ju; Noorman, Henk; Xia, Jianye; Tang, Wenjun; Zhuang, Yingping; Zhang, Siliang

    2014-03-01

    Penicillin is one of the best known pharmaceuticals and is also an important member of the β-lactam antibiotics. Over the years, ambitious yields, titers, productivities, and low costs in the production of the β-lactam antibiotics have been realized stepwise through successive rounds of strain improvement and process optimization. Penicillium chrysogenum has proven to be an ideal cell factory for the production of penicillin, and successful approaches have been exploited to elevate the production titer. However, the industrial production of penicillin faces the serious challenge that environmental gradients, which are caused by insufficient mixing and mass transfer limitations, exert a considerable negative impact on the ultimate productivity and yield. Scale-down studies regarding diverse environmental gradients have been carried out on bacteria, yeasts, and filamentous fungi, as well as animal cells. Accordingly, a variety of scale-down devices combined with fast sampling and quenching protocols have been established to acquire true snapshots of the perturbed cellular conditions. The perturbed metabolome information stemming from scale-down studies has contributed to comprehension of the production process and the identification of improvement approaches. However, little is known about the influence of the flow field and the mechanisms of intracellular metabolism. Consequently, it is still rather difficult to realize a fully rational scale-up. In the future, developing a computational framework to simulate the flow field of large-scale fermenters is highly recommended. Furthermore, a metabolically structured kinetic model directly related to the production of penicillin will be further coupled to the fluid flow dynamics. A mathematical model including information from both computational fluid dynamics and chemical reaction dynamics will then be established for the prediction of detailed information over the entire period of the fermentation process and

  10. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a heavy workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black-market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright

  11. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production accounts for the largest share of the computing resources currently in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  12. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo

    2014-04-01

    The flux performance of different hydrophobic microporous flat-sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench-scale (high δT) and large-scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module, aiming to maximize the flux at different levels of operating parameters, mainly the feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and a boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (at the ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large-scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that under the bench-scale (high δT) operating conditions.

  13. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Nunes, Suzana Pereira; Amy, Gary L.

    2014-01-01

    The flux performance of different hydrophobic microporous flat-sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench-scale (high δT) and large-scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module, aiming to maximize the flux at different levels of operating parameters, mainly the feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and a boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (at the ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large-scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that under the bench-scale (high δT) operating conditions.

  14. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    Quite a few studies have been conducted in order to implement oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective effort towards carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing…, among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the non-gray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609 MW utility boiler is numerically studied, in which… calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also results in higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same…

  15. Production of recombinant antigens and antibodies in Nicotiana benthamiana using 'magnifection' technology: GMP-compliant facilities for small- and large-scale manufacturing.

    Science.gov (United States)

    Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh

    2014-01-01

    This review describes the adaptation of the plant virus-based transient expression system magnICON® for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for the production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest using this system, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.

  16. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large-scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large-scale batch fixed-column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large-scale displacement chromatography to be performed on a continuous basis (CDC). Such large-scale, continuous displacement chromatography separations have not previously been reported in the literature. The process is demonstrated with the ion-exchange separation of a binary lanthanide (Nd/Pr) mixture. It is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed-column chromatography

  17. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing computational requirements in JAERI, the introduction of a high-speed computer with a vector processing facility (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for the pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed for large-scale nuclear codes by examining the following items: 1) The present pattern of computational load in JAERI is analyzed by compiling computer utilization statistics. 2) Vector processing efficiency is estimated for the ten most heavily used nuclear codes by analyzing their dynamic behavior when run on a scalar machine. 3) Vector processing efficiency is measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) The effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics of JAERI jobs. Problems with vector processors are also discussed from the viewpoints of code performance and ease of use. (author)

  18. Large-scale functional networks connect differently for processing words and symbol strings.

    Science.gov (United States)

    Liljeström, Mia; Vartiainen, Johanna; Kujala, Jan; Salmelin, Riitta

    2018-01-01

    Reconfigurations of synchronized large-scale networks are thought to be central neural mechanisms that support cognition and behavior in the human brain. Magnetoencephalography (MEG) recordings together with recent advances in network analysis now allow for sub-second snapshots of such networks. In the present study, we compared frequency-resolved functional connectivity patterns underlying reading of single words and visual recognition of symbol strings. Word reading emphasized coherence in a left-lateralized network with nodes in classical perisylvian language regions, whereas symbol processing recruited a bilateral network, including connections between frontal and parietal regions previously associated with spatial attention and visual working memory. Our results illustrate the flexible nature of functional networks, whereby processing of different form categories, written words vs. symbol strings, leads to the formation of large-scale functional networks that operate at distinct oscillatory frequencies and incorporate task-relevant regions. These results suggest that category-specific processing should be viewed not so much as a local process but as a distributed neural process implemented in signature networks. For words, increased coherence was detected particularly in the alpha (8-13 Hz) and high gamma (60-90 Hz) frequency bands, whereas increased coherence for symbol strings was observed in the high beta (21-29 Hz) and low gamma (30-45 Hz) frequency range. These findings attest to the role of coherence in specific frequency bands as a general mechanism for integrating stimulus-dependent information across brain regions.

  19. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    Science.gov (United States)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted in numerous studies. Among the controlling factors, the gravitational acceleration (g) acting on the scale models was treated as a constant (Earth's gravity) in most analogue model studies, and only a few studies considered larger gravitational accelerations by using a centrifuge (an apparatus generating a large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation driven by density differences, such as salt diapirs, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST) allows scale models with a surface area of up to 70 by 70 cm under a maximum capacity of 240 g-tons. Using this centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of back-arc basins. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
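
    For context, the mechanical rationale for centrifuge loading is the standard stress-similarity argument (textbook background, not stated in the abstract): a model at geometric scale 1/N spun at N times Earth's gravity reproduces prototype stress levels, which is what makes large scale-down mechanically faithful.

    ```latex
    % Stress similarity in centrifuge scale modelling: density \rho,
    % prototype length L, geometric scale factor N, gravity g.
    \[
      \sigma_{\mathrm{model}} = \rho \,(Ng)\,\frac{L}{N} = \rho g L = \sigma_{\mathrm{prototype}}
    \]
    ```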

  20. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by the IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is designed...
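
    The abstract names a parallel Markov decision processes model for proactive action selection. As a toy illustration only, the sketch below runs value iteration on an invented three-state traffic MDP; the states, actions, transition probabilities, and rewards are all made up and are not from the paper.

    ```python
    import numpy as np

    # 3 traffic states: 0 = free flow, 1 = dense, 2 = congested
    # 2 actions: 0 = do nothing, 1 = reroute vehicles proactively
    P = np.array([  # P[a, s, s'] = transition probability
        [[0.8, 0.2, 0.0], [0.1, 0.6, 0.3], [0.0, 0.3, 0.7]],  # do nothing
        [[0.9, 0.1, 0.0], [0.4, 0.5, 0.1], [0.1, 0.5, 0.4]],  # reroute
    ])
    R = np.array([  # immediate reward per (action, state)
        [ 0.0, -1.0, -5.0],
        [-0.5, -1.5, -4.0],   # rerouting costs a little but eases congestion
    ])

    gamma, V = 0.95, np.zeros(3)
    for _ in range(500):
        Q = R + gamma * P @ V           # Bellman backup: Q[a, s]
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < 1e-9:
            break
        V = V_new

    policy = Q.argmax(axis=0)           # best action in each state
    print("state values:", V.round(2), "policy:", policy)
    ```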

  1. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for the optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, with a coordinator at the second level, for which a two-level hierarchical control strategy is designed. Each sub-system at the first level can then be solved separately, using any conventional optimization algorithm. At the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear continuous stirred tank reactor (CSTR), and its solution is compared with the one obtained using the centralized approach. The simulation results show the efficiency and capability of the proposed hierarchical approach in finding the optimal solution, relative to the centralized method.
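
    As an illustration of the two-level idea, the sketch below coordinates two closed-form subproblems with a gradient-type price update driven by the interconnection error. The quadratic costs and the coupling constraint x1 = x2 are invented for the example; this is not the paper's CSTR problem.

    ```python
    # Lagrangian (price) coordination: each subsystem minimizes its own
    # cost for a fixed price p; the coordinator updates p by the error.
    def sub1(p):
        # argmin over x1 of (x1 - 3)^2 + p*x1  (closed form)
        return 3 - p / 2

    def sub2(p):
        # argmin over x2 of (x2 + 1)^2 - p*x2  (closed form)
        return -1 + p / 2

    p, step = 0.0, 0.5
    for it in range(200):
        x1, x2 = sub1(p), sub2(p)
        error = x1 - x2            # interconnection constraint: x1 = x2
        if abs(error) < 1e-9:
            break
        p += step * error          # second-level gradient-type update

    print(f"iterations: {it}, x1 = {x1:.4f}, x2 = {x2:.4f}, price = {p:.4f}")
    ```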

  2. Cell therapy-processing economics: small-scale microfactories as a stepping stone toward large-scale macrofactories.

    Science.gov (United States)

    Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A

    2018-03-01

    Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet the emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights that there is strong pressure to automate processes and to split the quality-control cost burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as an exemplar on which to base manufacturing strategy.

  3. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

    Almost all the solar collectors installed in Taiwan, China were used for the production of hot water for homeowners (residential systems), in which the area of solar collectors is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as having a solar collector area over 100 m{sup 2}) were installed. Their purposes are rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of those large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to obtain feedback from the individual users. It is found that lack of experience in system design and maintenance is the key obstacle to reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  4. Manufacturing and mechanical property test of the large-scale oxide dispersion strengthened martensitic mother tube by hot isostatic pressing and hot extrusion process

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2003-09-01

    Mass production capability of Oxide Dispersion Strengthened (ODS) ferritic steel cladding (9Cr) is evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube is a dominant factor in the total cost of manufacturing ODS ferritic cladding. In this study, a large-scale 9Cr-ODS martensitic mother tube was produced by an overseas supplier with mass-production equipment for commercialized ODS steels. The process of manufacturing the ODS mother tube consists of raw material powder production, mechanical alloying by high-energy ball mill, hot isostatic pressing (HIP), and hot extrusion. The following results were obtained in this study. (1) The microstructure of the ODS steels is equivalent to that of domestic products, and fine oxides are uniformly distributed. Mechanical alloying in a large-capacity (1 ton) ball mill can be carried out satisfactorily. (2) A large-scale mother tube (65 mm OD x 48 mm ID x 10,000 mm L), from which about 60 pieces of 3 m length ODS ferritic cladding can be produced by four passes of cold rolling, has been successfully manufactured through the HIP and hot extrusion process. (3) The rough surface of the mother tubes produced in this study can be improved by selecting a suitable hot extrusion condition. (4) The hardness and tensile strength of the manufactured ODS steels are lower than those of domestic products with the same chemical composition. This is owing to the high aluminum content in the product, and those properties could be improved by decreasing the aluminum content in the raw material powder. (author)

  5. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    Science.gov (United States)

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

    In this paper, we discuss the optimization and implementation of a high-throughput process development (HTPD) tool that utilizes commercially available microliter-sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench scale and the clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), the comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Upscaling from benchtop processing to industrial scale production: More factors to be considered for pulsed electric field food processing

    Science.gov (United States)

    Pulsed electric field (PEF) processing has been intensively studied in benchtop-scale experiments. However, there is still limited information regarding the critical factors to be considered for PEF efficacy in microbial reduction when PEF processing is applied at pilot- or commercial-scale juice production....

  7. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Full Text Available Abstract Background Large-scale protein structure alignment, an indispensable tool for structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure Alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign can take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues of protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs.

  8. Review of AVLIS technology for production-scale LIS systems and construction

    International Nuclear Information System (INIS)

    Davis, J.I.; Moses, E.I.

    1983-12-01

    The use of lasers for uranium and/or plutonium isotope separation is expected to be the first application of lasers utilizing specific atomic processes for large-scale materials processing. Specific accomplishments toward the development of production-scale technology for LIS systems will be presented, along with the status of major construction projects. 24 figures

  9. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up on the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach: first, we estimate the physical land suitability and the inputs necessary to reach certain yields; subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we also introduce certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site-specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large-scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge base rather than promoting new Jatropha investments

  10. Scale up of proteoliposome derived Cochleate production.

    Science.gov (United States)

    Zayas, Caridad; Bracho, Gustavo; Lastre, Miriam; González, Domingo; Gil, Danay; Acevedo, Reinaldo; del Campo, Judith; Taboada, Carlos; Solís, Rosa L; Barberá, Ramón; Pérez, Oliver

    2006-04-12

    Cochleates are highly stable structures with promising immunological features. Cochleate structures are usually obtained from commercial lipids. Proteoliposome-derived cochleates are derived from outer membrane vesicles of Neisseria meningitidis B. Previously, we obtained cochleates using dialysis procedures. In order to scale up the production process, we used a crossflow system (CFS) that allows easy scale-up to obtain large batches in an aseptic environment. The raw materials and solutions used in the production process are already approved for human application. This work demonstrates that the CFS is a very efficient process for obtaining cochleate structures, with a yield of more than 80% and immunogenicity comparable to that obtained by the dialysis procedure.

  11. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

    The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities at the expense of high spatial and temporal resolution of computing. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity for developing a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework for simulating the ductile fracture process zone in microstructural detail. The experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been linked to multiscale microstructures in a realistic 3D numerical model in a direct manner. (author)

  12. The potential for large scale uses for fission product xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-01-01

    Of all fission products in spent, low-enrichment uranium power reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the U.S., radioactive xenon isotopes approach the lowest limits of detection. The separation from accompanying radioactive 85Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of 85Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission product krypton during fuel processing. Economic restraints assure that the bulk of this krypton will need to be separated from the much more voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for uses at probably very low cost. In contrast with many other fission products, which have unique radioactive characteristics that make them useful as sources of heat, gamma and x-rays and luminescence, as well as for medicinal diagnostics and therapeutics, fission product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the 134Xe and 136Xe isotopes. Therefore, fission product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent lighting in place of krypton and in human anesthesia

  13. The use of soil moisture - remote sensing products for large-scale groundwater modeling and assessment

    NARCIS (Netherlands)

    Sutanudjaja, E.H.

    2012-01-01

    In this thesis, the possibilities of using spaceborne remote sensing for large-scale groundwater modeling are explored. We focus on a soil moisture product called European Remote Sensing Soil Water Index (ERS SWI, Wagner et al., 1999) - representing the upper profile soil moisture. As a test-bed, we

  14. Testing of Large-Scale ICV Glasses with Hanford LAW Simulant

    Energy Technology Data Exchange (ETDEWEB)

    Hrma, Pavel R.; Kim, Dong-Sang; Vienna, John D.; Matyas, Josef; Smith, Donald E.; Schweiger, Michael J.; Yeager, John D.

    2005-03-01

    Preliminary glass compositions for immobilizing Hanford low-activity waste (LAW) by the in-container vitrification (ICV) process were initially fabricated at crucible- and engineering-scale, including simulants and actual (radioactive) LAW. Glasses were characterized for vapor hydration test (VHT) and product consistency test (PCT) responses and crystallinity (both quenched and slow-cooled samples). Selected glasses were tested for toxicity characteristic leach procedure (TCLP) responses, viscosity, and electrical conductivity. This testing showed that glasses with LAW loading of 20 mass% can be made readily and meet all product constraints by a wide margin. Glasses with over 22 mass% Na2O can be made to meet all other product quality and process constraints. Large-scale testing was performed at the AMEC, Geomelt Division facility in Richland. Three tests were conducted using simulated LAW with increasing loadings of 12, 17, and 20 mass% Na2O. Glass samples were taken from the test products in a manner to represent the full expected range of product performance. These samples were characterized for composition, density, crystalline and non-crystalline phase assemblage, and durability using the VHT, PCT, and TCLP tests. The results, presented in this report, show that the AMEC ICV product meets all waste form requirements with a large margin. These results provide strong evidence that the Hanford LAW can be successfully vitrified by the ICV technology and can meet all the constraints related to product quality. The economic feasibility of the ICV technology can be further enhanced by subsequent optimization.

  15. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences in deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
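
The spot-market trade-off mentioned here can be made concrete with a back-of-envelope expected-cost comparison (an editorial sketch; rates and interruption overhead are invented assumptions):

```python
# On-demand vs spot pricing, with preemptions reduced to an expected
# fraction of re-run work. Illustrates the 75-90% savings range cited.
def expected_cost(hours, rate, interrupt_frac=0.0):
    """Expected cost when a fraction of work must be redone after preemption."""
    return hours * (1.0 + interrupt_frac) * rate

on_demand = expected_cost(1000, rate=0.40)                    # USD/hour, assumed
spot = expected_cost(1000, rate=0.06, interrupt_frac=0.15)    # ~85% cheaper rate
print(f"on-demand: ${on_demand:.0f}, spot: ${spot:.0f}, "
      f"savings: {100 * (1 - spot / on_demand):.0f}%")
```

Even with a 15% re-run overhead, the illustrative spot run remains roughly 80% cheaper, which is why unpredictable capacity can still be attractive for batch science processing.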

  16. Comparing centralised and decentralised anaerobic digestion of stillage from a large-scale bioethanol plant to animal feed production.

    Science.gov (United States)

    Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R

    2008-01-01

    A comparison of stillage treatment options for large-scale bioethanol plants was based on the data of an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production--the state-of-the-art technology at the plant--was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area would be needed and 600,000 m(3) of storage volume. Secondly, membrane purification of the digestate was investigated, consisting of a decanter, microfiltration, and reverse osmosis. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three mentioned stillage treatment options, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.
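
The stated minimum land requirement can be sanity-checked with a one-line application-rate calculation (an editorial back-of-envelope check, taking the digestate mass to be close to the stillage input):

\[
\frac{1{,}400{,}000\ \mathrm{t/yr}}{36{,}000\ \mathrm{ha}} \approx 39\ \mathrm{t\,ha^{-1}\,yr^{-1}},
\]

i.e. roughly 39 tonnes of digestate spread per hectare per year, a plausible agronomic application rate.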

  17. Large-scale calculations of the beta-decay rates and r-process nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Borzov, I N; Goriely, S [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium)]; Pearson, J M [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Lab. de Physique Nucleaire, Univ. de Montreal, Montreal (Canada)]

    1998-06-01

    An approximation to a self-consistent model of the ground state and β-decay properties of neutron-rich nuclei is outlined. The structure of the β-strength functions in stable and short-lived nuclei is discussed. The results of large-scale calculations of the β-decay rates for spherical and slightly deformed nuclides of relevance to the r-process are analysed and compared with the results of existing global calculations and recent experimental data. (orig.)

  18. Processing Tritiated Water at the Savannah River Site: A Production-Scale Demonstration of a palladium membrane reactor

    International Nuclear Information System (INIS)

    Sessions, K

    2004-01-01

    The Palladium Membrane Reactor (PMR) process was installed in the Tritium Facilities at the Savannah River Site to perform a production-scale demonstration for the recovery of tritium from tritiated water adsorbed on molecular sieve (zeolite). Unlike the current recovery process that utilizes magnesium, the PMR offers a means to process tritiated water in a more cost-effective and environmentally friendly manner. The design and installation of the large-scale PMR process was part of a collaborative effort between the Savannah River Site and Los Alamos National Laboratory. The PMR process operated at the Savannah River Site between May 2001 and April 2003. During the initial phase of operation the PMR processed thirty-four kilograms of tritiated water from the Princeton Plasma Physics Laboratory. The water was processed in fifteen separate batches to yield approximately 34,400 liters (STP) of hydrogen isotopes. Each batch consisted of round-the-clock operations for approximately nine days. In April 2003 the reactor's palladium-silver membrane ruptured, resulting in the shutdown of the PMR process. Reactor performance, process performance and operating experiences have been evaluated and documented. A performance comparison between the PMR and the current magnesium process is also documented
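
The reported gas volume is consistent with simple stoichiometry if the water is essentially fully tritiated (an editorial consistency check: T2O has a molar mass of about 22 g/mol, and each mole of water yields one mole of hydrogen-isotope gas):

\[
\frac{34\,000\ \mathrm{g}}{22\ \mathrm{g/mol}} \times 22.4\ \mathrm{L/mol} \approx 3.5\times 10^{4}\ \mathrm{L\ (STP)},
\]

close to the 34,400 liters reported; ordinary light water (18 g/mol) would instead give about 42,000 liters.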

  19. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10(5) and 10(8) and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied

  20. Large-scale production of Fischer-Tropsch diesel from biomass. Optimal gasification and gas cleaning systems

    International Nuclear Information System (INIS)

    Boerrigter, H.; Van der Drift, A.

    2004-12-01

    The paper is presented in the form of copies of overhead sheets. The contents concern definitions, an overview of Integrated biomass gasification and Fischer Tropsch (FT) systems (state-of-the-art, gas cleaning and biosyngas production, experimental demonstration and conclusions), some aspects of large-scale systems (motivation, biomass import) and an outlook

  1. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  2. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available Key points of operational activities in large-scale, geographically spread software development projects are discussed, and the required structure of QA processes in such projects is examined. Up-to-date methods for integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed, namely sequential, agile and PRINCE2-based, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in a large, geographically spread hybrid software development project are reviewed, and recommendations are given to tackle those challenges. Conclusions are drawn about the choice of the best methodology and its application to a particular project.

  3. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using...

  4. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given. The results obtained so far in that domain are presented. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested. The validity of that method is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  5. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-systems risk management best practices. We analyze their risk management process and investigate the tools they use to support decision-making processes within the company. First, we identify the following challenges in the current risk management practices, which are in line with the literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) the quality of traditional models in such situations is open to debate; (3) the quality of input

  6. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and far exceed the raw instrument outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can productively produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, system software stacks and libraries, and the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.

  7. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
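
The decomposition-plus-envelope analysis described here can be sketched on synthetic data (an editorial reconstruction of the general amplitude-modulation diagnostic, not the authors' exact processing; filter choices and signal parameters are assumptions):

```python
# Sketch of an amplitude-modulation diagnostic: low-pass a velocity signal
# to extract the large scales, take the envelope of the remaining small
# scales, and correlate the two. Synthetic signal, not hot-wire data.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
n, dt = 20000, 1e-3
t = np.arange(n) * dt
large = np.sin(2 * np.pi * 2.0 * t)                    # large-scale motion
small = (1.0 + 0.5 * large) * rng.standard_normal(n)   # modulated small scales
u = large + 0.2 * small                                # fluctuating velocity

kernel = np.ones(101) / 101                            # moving-average low-pass
u_large = np.convolve(u, kernel, mode="same")
u_small = u - u_large
envelope = np.abs(hilbert(u_small))                    # small-scale envelope
env_filt = np.convolve(envelope, kernel, mode="same")

r = np.corrcoef(u_large, env_filt)[0, 1]
print(f"amplitude-modulation coefficient R = {r:.2f}")  # positive -> modulation
```

A clearly positive correlation between the large-scale signal and the filtered small-scale envelope is the signature of the modulating influence described in the record.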

  8. Scale up risk of developing oil shale processing units

    International Nuclear Information System (INIS)

    Oepik, I.

    1991-01-01

    The experiences in oil shale processing in three large countries, China, the U.S.A. and the U.S.S.R., have demonstrated that the relative scale up risk of developing oil shale processing units is related to the scale up factor. Against the background of large programmes for developing the oil shale industry branch, i.e. the $30 billion investments in Colorado and Utah or the 50 million t/year oil shale processing in Estonia and Leningrad Region planned in the late seventies, the absolute scope of the scale up risk of developing single retorting plants seems to be justified. But under the conditions of low crude oil prices, when the large-scale development of the oil shale processing industry is stopped, the absolute scope of the scale up risk is to be divided between a small number of units. Therefore, it is reasonable to build the new commercial oil shale processing plants with a minimum scale up risk. For example, in Estonia a new oil shale processing plant with gas combustion retorts projected to start in the early nineties will be equipped with four units of 1500 t/day enriched oil shale throughput each, designed with scale up factor M=1.5 and with a minimum scale up risk, only r=2.5-4.5%. The oil shale retorting unit for the PAMA plant in Israel [1] is planned to be developed in three steps, also with minimum scale up risk: feasibility studies in Colorado with Israel's shale at the Paraho 250 t/day retort and other tests, a demonstration retort of 700 t/day and M=2.8 in Israel, and commercial retorts in the early nineties with a capacity of about 1000 t/day and M=1.4. The scale up risk of the PAMA project, r=2-4%, is approximately the same as that in Estonia. Knowledge of the scope of the scale up risk of developing oil shale processing retorts assists in the calculation of production costs when erecting new units. (author). 9 refs., 2 tabs

  9. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  10. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA

    Directory of Open Access Journals (Sweden)

    Anirban Nandi

    2014-01-01

    Full Text Available Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  11. Scale-up and optimization of biohydrogen production reactor from laboratory-scale to industrial-scale on the basis of computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xu; Ding, Jie; Guo, Wan-Qian; Ren, Nan-Qi [State Key Laboratory of Urban Water Resource and Environment, Harbin Institute of Technology, 202 Haihe Road, Nangang District, Harbin, Heilongjiang 150090 (China)

    2010-10-15

    The objective of conducting experiments in a laboratory is to gain data that helps in designing and operating large-scale biological processes. However, the scale-up and design of industrial-scale biohydrogen production reactors is still uncertain. In this paper, an established and proven Eulerian-Eulerian computational fluid dynamics (CFD) model was employed to perform hydrodynamics assessments of an industrial-scale continuous stirred-tank reactor (CSTR) for biohydrogen production. The merits of the laboratory-scale CSTR and industrial-scale CSTR were compared and analyzed on the basis of CFD simulation. The outcomes demonstrated that there are many parameters that need to be optimized in the industrial-scale reactor, such as the velocity field and stagnation zone. According to the results of hydrodynamics evaluation, the structure of industrial-scale CSTR was optimized and the results are positive in terms of advancing the industrialization of biohydrogen production. (author)

  12. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.]

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  13. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  14. A pilot scale demonstration of the DWPF process control and product verification strategy

    International Nuclear Information System (INIS)

    Hutson, N.D.; Jantzen, C.M.; Beam, D.C.

    1992-01-01

    The Defense Waste Processing Facility (DWPF) has been designed and constructed to immobilize Savannah River Site high level liquid waste within a durable borosilicate glass matrix for permanent storage. The DWPF will be operated to produce a glass product which must meet a number of product property constraints which are dependent upon the final product composition. During actual operations, the DWPF will control the properties of the glass product by the controlled blending of the waste streams with a glass-forming frit to produce the final melter feed slurry. The DWPF will verify control of the glass product through analysis of vitrified samples of slurry material. In order to demonstrate the DWPF process control and product verification strategy, a pilot-scale vitrification research facility was operated in three discrete batches using simulated DWPF waste streams. All of the DWPF process control methodologies were followed and the glass product from each experiment was leached according to the Product Consistency Test. Results of the campaign are summarized

  15. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-making).

  16. Process and equipment design optimising product properties and attributes

    NARCIS (Netherlands)

    Bongers, P.M.M.; Thullie, J.

    2009-01-01

    Classically, when products have been developed at the bench, process engineers will search for equipment to manufacture the product at large scale. More often than not, this search is constrained to the existing equipment base, or a catalog search for standard equipment. It is then not surprising that

  17. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and fuel-surrounding atmosphere conditions (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  18. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    Science.gov (United States)

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be ≈50% α-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices. PMID:19581598

  19. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  20. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  1. A versatile, steam reforming based small-scale hydrogen production process

    International Nuclear Information System (INIS)

    P C Hulteberg; F A Silversand; B Porter; R Woods

    2006-01-01

    In this paper, a new design methodology and process is proposed for small scale pure hydrogen production capable of serving energy markets ranging from distributed generation to vehicular refuelling. The system was designed for producing 7 Nm(3)/hr pure hydrogen (purity of ≤ 1 ppm CO dry), yielding 10 kWe net power from a fuel cell system with an overall parasitic power loss ≤ 10 %. The discussion of this process includes a detailed description of the design methodology and operational results of the catalytic converter, the hydrogen purification system and the fuel cell system. This paper will discuss the design methodology of the overall system, as well as the specific design of the catalytic converter, the catalysts used within, and the hydrogen purification system. It will also report the system performance including gas purity, recovery rate, overall hydrogen production efficiencies, and electrical efficiencies during fuel cell operation. (authors)
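
These figures imply a plausible fuel-cell electrical efficiency (an editorial estimate, taking the lower heating value of hydrogen as roughly 3.0 kWh per normal cubic meter):

\[
P_{\mathrm{H_2}} \approx 7\ \mathrm{Nm^3/hr} \times 3.0\ \mathrm{kWh/Nm^3} = 21\ \mathrm{kW_{th}},
\qquad
\eta_{\mathrm{el}} \approx \frac{10\ \mathrm{kW_e}}{21\ \mathrm{kW_{th}}} \approx 48\%,
\]

before the stated ≤ 10% parasitic losses are subtracted.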

  2. Large quantity production of carbon and boron nitride nanotubes by mechano-thermal process

    International Nuclear Information System (INIS)

    Chen, Y.; Fitzgerald, J.D.; Chadderton, L.; Williams, J.S.; Campbell, S.J.

    2002-01-01

    Full text: Nanotube materials, including carbon and boron nitride, have excellent properties compared with bulk materials. The seamless graphene cylinders with a high length-to-diameter ratio make them superstrong fibers. A high amount of hydrogen can be stored in nanotubes as a future clean fuel source. These applications require large quantities of nanotube material. However, nanotube production in large quantity, with fully controlled quality and at low cost, remains a challenge for the most popular synthesis methods such as arc discharge, laser heating and catalytic chemical decomposition. Discovery of new synthesis methods is still crucial for future industrial application. The new low-temperature mechano-thermal process discovered by the current author provides an opportunity to develop a commercial method for bulk production. This mechano-thermal process consists of a mechanical ball milling and a thermal annealing process. Using this method, both carbon and boron nitride nanotubes were produced. I will present the mechano-thermal method as the new bulk production technique at the conference. The lecture will summarise the main results obtained. In the case of carbon nanotubes, different nanosized structures including multi-walled nanotubes, nanocells, and nanoparticles have been produced in a graphite sample using a mechano-thermal process consisting of mechanical milling at room temperature for up to 150 hours and subsequent thermal annealing at 1400 deg C. Metal particles have played an important catalytic role in the formation of different tubular structures, while the defect structure of the milled graphite appears to be responsible for the formation of small tubes. It is found that the mechanical treatment of graphite powder produces a disordered and microporous structure, which provides nucleation sites for nanotubes as well as free carbon atoms. Multiwalled carbon nanotubes appear to grow via growth of the (002) layers during thermal annealing. In the case of BN

  3. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

    Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical applications: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and is inspired by the wall of the fungus-like oomycetes, which is reproduced by introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive materials (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  4. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard; Moeglein, William AM; Newby, Deborah T.; Venteris, Erik R.; Wigmosta, Mark S.

    2014-06-19

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion-gallons/year in the southeastern U.S. and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  5. Sustainable multistage process for enhanced productivity of bioplastics from waste remediation through aerobic dynamic feeding strategy: Process integration for up-scaling.

    Science.gov (United States)

    Amulya, K; Jukuri, Srinivas; Venkata Mohan, S

    2015-01-01

    Polyhydroxyalkanoates (PHA) production was evaluated in a multistage operation using food waste as a renewable feedstock. The first step involved the production of bio-hydrogen (bio-H2) via acidogenic fermentation. Volatile fatty acid (VFA) rich effluent from bio-H2 reactor was subsequently used for PHA production, which was carried out in two stages, Stage II (culture enrichment) and Stage III (PHA production). PHA-storing microorganisms were enriched in a sequencing batch reactor (SBR), operated at two different cycle lengths (CL-24; CL-12). Higher polymer recovery as well as VFA removal was achieved in CL-12 operation both in Stage II (16.3% dry cell weight (DCW); VFA removal, 84%) and Stage III (23.7% DCW; VFA removal, 88%). The PHA obtained was a co-polymer [P(3HB-co-3HV)] of PHB and PHV. The results obtained indicate that this integrated multistage process offers new opportunities to further leverage large scale PHA production with simultaneous waste remediation in the framework of biorefinery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. A pilot scale demonstration of the DWPF process control and product verification strategy

    International Nuclear Information System (INIS)

    Hutson, N.D.; Jantzen, C.M.; Beam, D.C.

    1992-01-01

    The Defense Waste Processing Facility (DWPF) has been designed and constructed to immobilize Savannah River Site high level liquid waste within a durable borosilicate glass matrix for permanent storage. The DWPF will be operated to produce a glass product which must meet a number of product property constraints which are dependent upon the final product composition. During actual operations, the DWPF will control the properties of the glass product by the controlled blending of the waste streams with a glass-forming frit to produce the final melter feed slurry. The DWPF will verify control of the glass product through analysis of vitrified samples of slurry material. In order to demonstrate the DWPF process control and product verification strategy, a pilot-scale vitrification research facility was operated in three discrete batches using simulated DWPF waste streams. All of the DWPF process control methodologies were followed and the glass product from each experiment was leached according to the Product Consistency Test. In this paper results of the campaign are summarized

  7. Scaling up the production capacity of U-Mo powder by HMD process

    International Nuclear Information System (INIS)

    Pasqualini, E.E.; Lopez, M.; Helzel Garcia, L.J.; Echenique, P.; Adelfang, P.

    2002-01-01

    The recent discovery that uranium alloys in metastable gamma phase can be hydrided at low temperatures and pressures has allowed the development of a method for comminuting bulk material by milling the hydride to the desired size and then dehydriding the powder. This process is called HMD (hydriding-milling-dehydriding) and needs an initial step of hydrogen incorporation to allow the alloy to be hydrided. This four-step process has been conveniently set up for the production of U-7Mo powder for use in nuclear fuels. Low equipment investment and low manpower are needed for this achievement. The scale-up of the process to one-kilogram batches and a production capacity of 50 kilograms of U-Mo powder per year is being analyzed. (author)

  8. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    Science.gov (United States)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The use of the large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions that are difficult to obtain in the field due to the variable nature of natural rainfall intensities. Stemflow generation and production were examined for three species - Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), Zelkova serrata Thunb. (Japanese zelkova) - under both leafed and leafless conditions at several different rainfall intensities (15, 20, 30, 40, 50, and 100 mm/h) using a large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production and rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. Funding note: This research was supported by JSPS Invitation Fellowship for Research in
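
For reference, the funneling ratio discussed here is conventionally defined in the stemflow literature as (an editorial note; the symbols are the customary ones, not taken from this abstract):

\[
F = \frac{V}{B \cdot P},
\]

where V is the stemflow volume collected from a tree, B its trunk basal area, and P the rainfall depth; F > 1 indicates that the canopy funnels more water to the trunk than would fall directly on the basal area alone.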

  9. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
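
The gauge-versus-grid verification described here typically reduces to a few standard scores; a minimal sketch (an editorial illustration with toy numbers, not data from the study):

```python
# Bias, RMSE and correlation of a gridded product sampled at gauge locations,
# in the spirit of the record's multi-criteria comparison.
import numpy as np

def verify(product, gauges):
    err = product - gauges
    return {"bias": err.mean(),
            "rmse": np.sqrt((err ** 2).mean()),
            "corr": np.corrcoef(product, gauges)[0, 1]}

gauges = np.array([2.0, 0.0, 5.5, 1.2, 3.3])    # observed daily precip, mm (toy)
product = np.array([1.6, 0.1, 4.9, 2.0, 2.8])   # e.g. a product at those gauges
print({k: round(v, 2) for k, v in verify(product, gauges).items()})
```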

  10. Constructing Model of Relationship among Behaviors and Injuries to Products Based on Large Scale Text Data on Injuries

    Science.gov (United States)

    Nomori, Koji; Kitamura, Koji; Motomura, Yoichi; Nishida, Yoshifumi; Yamanaka, Tatsuhiro; Komatsubara, Akinori

    In Japan, childhood injury prevention is an urgent issue. Safety measures based on knowledge created from injury data are essential for preventing childhood injuries. The injury prevention approach based on product modification is especially important. Risk assessment is one of the most fundamental methods for designing safe products. Conventional risk assessment has been carried out subjectively because product makers have poor data on injuries. This paper deals with evidence-based risk assessment, in which artificial intelligence technologies are strongly needed. It describes a new method of foreseeing product usage, which is the first step of evidence-based risk assessment, and presents a retrieval system for injury data. The system enables a product designer to foresee how children use a product and which types of injuries occur due to the product in daily environments. The developed system consists of large-scale injury data, text mining technology and probabilistic modeling technology. Large-scale text data on childhood injuries was collected from medical institutions by an injury surveillance system. Types of behaviors toward a product were derived from the injury text data using text mining technology. The relationships among products, types of behaviors, types of injuries and characteristics of children were modeled with a Bayesian network. The fundamental functions of the developed system and examples of new findings obtained with the system are reported in this paper.
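
The kind of conditional query such a model supports can be illustrated with plain frequency estimation over counted records (an editorial sketch, not the paper's Bayesian-network inference; the tiny dataset is fabricated):

```python
# Estimate P(injury type | product, behaviour) directly from counted records,
# the simplest analogue of the query a Bayesian network over these variables
# would answer.
from collections import Counter

records = [  # (product, behaviour, injury) tuples, hypothetical
    ("chair", "climbing", "fall"), ("chair", "climbing", "fall"),
    ("chair", "sitting", "none"), ("kettle", "touching", "burn"),
    ("kettle", "pulling cord", "burn"), ("kettle", "pulling cord", "bruise"),
]

def p_injury_given(product, behaviour):
    matched = [inj for p, b, inj in records if p == product and b == behaviour]
    counts = Counter(matched)
    total = sum(counts.values())
    return {inj: c / total for inj, c in counts.items()}

print(p_injury_given("kettle", "pulling cord"))  # {'burn': 0.5, 'bruise': 0.5}
```

A full Bayesian network additionally shares statistical strength across sparse product/behaviour combinations, which matters when many combinations have few observed injuries.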

  11. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus the problem of fault detection and isolation for these processes is more concerned with the root cause and fault propagation before applying quantitative methods in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some directions for future work are proposed at the end.
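
The cross-correlation analysis mentioned among the data-driven tools can be sketched as a lagged direction test (an editorial illustration on synthetic signals; the delay and noise level are assumptions):

```python
# Lagged cross-correlation between two process variables: a correlation peak
# at a positive lag suggests x leads y, i.e. a candidate propagation path x -> y.
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 2000, 15
x = rng.standard_normal(n)
y = np.roll(x, true_lag) + 0.5 * rng.standard_normal(n)  # y(t) ~ x(t - 15)

def lead_corr(x, y, k):
    """Correlation of x(t) with y(t + k); a peak at k > 0 suggests x leads y."""
    if k == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-k], y[k:])[0, 1]

best = max(range(51), key=lambda k: lead_corr(x, y, k))
print(f"strongest correlation at lag {best} samples (expected {true_lag}): "
      f"candidate propagation path x -> y")
```

Granger causality and the information-theoretical methods cited in the record refine this idea by conditioning on each variable's own history and on other measured variables.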

  12. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  13. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  14. Performance of mushroom fruiting for large scale commercial production

    International Nuclear Information System (INIS)

    Mat Rosol Awang; Rosnani Abdul Rashid; Hassan Hamdani Mutaat; Mohd Meswan Maskom

    2012-01-01

    This paper describes the determination of mushroom fruiting yield, which is vital to the economics of mushroom production. Consistent mushroom yields enable revenues to be estimated and hence profitability to be predicted. Many growers have reported large variations in mushroom yield over different production periods. To assess such claims, we ran four batches of mushroom fruiting, and the fruiting body production performance is presented. (author)

  15. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolated mode from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A red line in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  16. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there exist two main types of methods for geospatial database updating: direct updating with remote sensing images or field survey materials, and indirect updating with other already-updated data, such as newly updated larger-scale data. The former method is fundamental, because the update data sources of both methods ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating; this is recognized as one of the most promising updating methods, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale, including technical

  17. THE DECAY OF A WEAK LARGE-SCALE MAGNETIC FIELD IN TWO-DIMENSIONAL TURBULENCE

    Energy Technology Data Exchange (ETDEWEB)

    Kondić, Todor; Hughes, David W.; Tobias, Steven M., E-mail: t.kondic@leeds.ac.uk [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2016-06-01

    We investigate the decay of a large-scale magnetic field in the context of incompressible, two-dimensional magnetohydrodynamic turbulence. It is well established that a very weak mean field, of strength significantly below equipartition value, induces a small-scale field strong enough to inhibit the process of turbulent magnetic diffusion. In light of ever-increasing computer power, we revisit this problem to investigate fluid and magnetic Reynolds numbers that were previously inaccessible. Furthermore, by exploiting the relation between the turbulent diffusion of the magnetic potential and that of the magnetic field, we are able to calculate the turbulent magnetic diffusivity extremely accurately through the imposition of a uniform mean magnetic field. We confirm the strong dependence of the turbulent diffusivity on the product of the magnetic Reynolds number and the energy of the large-scale magnetic field. We compare our findings with various theoretical descriptions of this process.
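    For orientation, a schematic mean-field description consistent with the reported dependence is the Cattaneo-Vainshtein-type quenching of the turbulent diffusivity (notation ours, not necessarily the paper's):

```latex
\frac{\partial \langle A \rangle}{\partial t}
  = \left(\eta + \eta_T\right)\nabla^2 \langle A \rangle ,
\qquad
\eta_T \simeq \frac{\eta_{T,0}}{1 + R_m \,\langle B \rangle^2 / B_{\mathrm{eq}}^2} ,
```

so that the decay of the large-scale field is controlled by the product of the magnetic Reynolds number R_m and the mean-field energy.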

  18. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of galaxy distribution in the Universe and principles of mathematical modelling of galaxy distribution. Images of cell structures obtained after computer processing are given. Three hypotheses (vortical, entropic and adiabatic), suggesting various processes for the origin of galaxies and galaxy clusters, are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the turbulence properties at the pre-galaxy stage. The discussion of problems pertaining to studying the hot gas contained in galaxy clusters, and the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology.

  19. Quality Assurance in Large Scale Online Course Production

    Science.gov (United States)

    Holsombach-Ebner, Cinda

    2013-01-01

    The course design and development process (often referred to here as the "production process") at Embry-Riddle Aeronautical University (ERAU-Worldwide) aims to produce turnkey style courses to be taught by a highly-qualified pool of over 800 instructors. Given the high number of online courses and tremendous number of live sections…

  20. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  1. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  2. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

    Concerning the research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other hand. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap, Rheinbraun started a research project supported by the North Rhine-Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project. The latest results of the current project will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  3. Some Examples of Residence-Time Distribution Studies in Large-Scale Chemical Processes by Using Radiotracer Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R. M.; Johnson, P.; Whiston, J. [Imperial Chemical Industries Ltd., Billingham, Co., Durham (United Kingdom)

    1967-06-15

    The application of radiotracers to determine flow patterns in chemical processes is discussed with particular reference to the derivation of design data from model reactors for translation to large-scale units, the study of operating efficiency and design attainment in established plant and the rapid identification of various types of process malfunction. The requirements governing the selection of tracers for various types of media are considered and an example is given of the testing of the behaviour of a typical tracer before use in a particular large-scale process operating at 250 atm and 200°C. Information which may be derived from flow patterns is discussed, including the determination of mixing parameters, gas hold-up in gas/liquid reactions and the detection of channelling and stagnant regions. Practical results and their interpretation are given in relation to an olefin hydroformylation reaction system, a process for the conversion of propylene to isopropanol, a moving-bed catalyst system for the isomerization of xylenes and a three-stage gas-liquid reaction system. The use of mean residence-time data for the detection of leakage between reaction vessels and a heat interchanger system is given as an example of the identification of process malfunction. (author)
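    For reference, the quantities extracted from such tracer experiments follow from standard residence-time-distribution relations; with C(t) the measured tracer concentration at the vessel outlet:

```latex
E(t) = \frac{C(t)}{\int_0^\infty C(t)\,dt}, \qquad
\bar{t} = \int_0^\infty t\,E(t)\,dt, \qquad
\sigma^2 = \int_0^\infty \left(t - \bar{t}\right)^2 E(t)\,dt .
```

Deviations of the measured E(t) from the ideal plug-flow or perfectly mixed forms are what reveal channelling, stagnant zones and leakage.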

  4. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles is introduced. A sequential corrosion-and-detachment process is proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were 6 nm (±2 nm) in diameter and spherical, with high crystallinity and uniformity in size. With this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  5. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover emerges as the two become inconsistent at later times, before a stable state is reached in which Heaps' law still holds while strict Zipf's law has disappeared. These findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results for pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help in understanding the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
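    For readers unfamiliar with the two scalings, their standard forms are (notation ours; in the epidemic setting the "vocabulary" can be read as the set of distinct infected locations):

```latex
f(r) \propto r^{-\alpha} \quad \text{(Zipf: frequency versus rank } r\text{)}, \qquad
N(t) \propto t^{\beta},\; \beta \le 1 \quad \text{(Heaps: number of distinct elements versus time } t\text{)} .
```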

  6. Design of a Production Process to Enhance Optical Performance of 3ω Optics

    International Nuclear Information System (INIS)

    Prasad, R.R.; Bruere, J.R.; Halpin, J.; Lucero, P.; Mills, S.; Bernacil, M.; Hackel, R.P.

    2003-01-01

    Using the Phoenix pre-production conditioning facility we have shown that raster scanning of 3ω optics with a XeF excimer laser, together with mitigation of the resultant damage sites with a CO₂ laser, can enhance their optical damage resistance. Several large-scale (43 cm x 43 cm) optics have been processed in this facility. A production facility capable of processing several large optics a week has been designed based on our experience in the pre-production facility. The facility will be equipped with UV conditioning lasers: 351-nm XeF excimer lasers operating at 100 Hz and 23 ns. The facility will also include a CO₂ laser for damage mitigation, an optics stage for raster scanning large-scale optics, a damage mapping system (DMS) that images large-scale optics and can detect damage sites or precursors as small as ∼15 µm, and two microscopes to image damage sites with ∼5 µm resolution. The optics will be handled in a class 100 cleanroom within the facility, which will be maintained at class 1000.

  7. Large-Scale Selection and Breeding To Generate Industrial Yeasts with Superior Aroma Production

    Science.gov (United States)

    Steensels, Jan; Meersman, Esther; Snoek, Tim; Saels, Veerle

    2014-01-01

    The concentrations and relative ratios of various aroma compounds produced by fermenting yeast cells are essential for the sensory quality of many fermented foods, including beer, bread, wine, and sake. Since the production of these aroma-active compounds varies highly among different yeast strains, careful selection of variants with optimal aromatic profiles is of crucial importance for a high-quality end product. This study evaluates the production of different aroma-active compounds in 301 different Saccharomyces cerevisiae, Saccharomyces paradoxus, and Saccharomyces pastorianus yeast strains. Our results show that the production of key aroma compounds like isoamyl acetate and ethyl acetate varies by an order of magnitude between natural yeasts, with the concentrations of some compounds showing significant positive correlation, whereas others vary independently. Targeted hybridization of some of the best aroma-producing strains yielded 46 intraspecific hybrids, of which some show a distinct heterosis (hybrid vigor) effect and produce up to 45% more isoamyl acetate than the best parental strains while retaining their overall fermentation performance. Together, our results demonstrate the potential of large-scale outbreeding to obtain superior industrial yeasts that are directly applicable for commercial use. PMID:25192996

  8. The Role of Small-Scale Biofuel Production in Brazil: Lessons for Developing Countries

    Directory of Open Access Journals (Sweden)

    Arielle Muniz Kubota

    2017-07-01

    Full Text Available Small-scale biofuel initiatives to produce sugarcane ethanol are claimed to be a sustainable opportunity for ethanol supply, particularly for regions with price-restricted or no access to modern biofuels, such as communities located far from the large ethanol production centers in Brazil and family-farm communities in Sub-Saharan Africa, respectively. However, smallholders often struggle to achieve economic sustainability with ethanol microdistilleries. The aim of this paper is to provide an assessment of the challenges faced by small-scale bioenergy initiatives and discuss the conditions that would potentially make these initiatives economically feasible. Ethanol microdistilleries were assessed through a critical discussion of existing models and through an economic analysis of different sugarcane ethanol production models. The techno-economic analysis showed that the lack of competitiveness against large-scale ethanol distilleries, largely due to both low crop productivity and low process efficiency, makes it unlikely that small-scale distilleries can compete in the national/international ethanol market without governmental policies and subsidies. Nevertheless, small-scale projects intended for local supply and integrated food-fuel systems seem to be an interesting alternative that can potentially make ethanol production on small farms viable, as well as increase food security and project sustainability, particularly for local communities in developing countries.

  9. Modeling of a Large-Scale High Temperature Regenerative Sulfur Removal Process

    DEFF Research Database (Denmark)

    Konttinen, Jukka T.; Johnsson, Jan Erik

    1999-01-01

    Regenerable mixed metal oxide sorbents are prime candidates for the removal of hydrogen sulfide from hot gasifier gas in the simplified integrated gasification combined cycle (IGCC) process. As part of the regenerative sulfur removal process development, reactor models are needed for scale-up: steady-state kinetic reactor models are needed for reactor sizing, and dynamic models can be used for process control design and operator training. The regenerative sulfur removal process studied in this paper consists of two side-by-side fluidized bed reactors operating at temperatures of 400... The model does not account for bed hydrodynamics. The pilot-scale test run results, obtained in test runs of the sulfur removal process with real coal gasifier gas, have been used for parameter estimation. The validity of the reactor model for commercial-scale design applications is discussed.

  10. Satellite Imagery Production and Processing Using Apache Hadoop

    Science.gov (United States)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale, processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations and finally challenges and ongoing activities. It will also present how the LSRD project built a 102 core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
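    As a hedged illustration of the approach (not LSRD's actual code), a Hadoop Streaming job can drive per-scene science processing with a mapper as small as the following; `process_scene` is a hypothetical placeholder for the real calibration/ECV routine.

```python
#!/usr/bin/env python3
# Hadoop Streaming mapper: one Landsat scene identifier per input line.
import sys

def process_scene(scene_id: str) -> str:
    # Placeholder for the real per-scene processing (calibration,
    # atmospheric correction, ECV generation, ...).
    return f"processed {scene_id}"

for line in sys.stdin:
    scene_id = line.strip()
    if scene_id:
        # Emit tab-separated key/value pairs, the Streaming convention.
        print(f"{scene_id}\t{process_scene(scene_id)}")
```

Run with the standard streaming jar, e.g. `hadoop jar hadoop-streaming.jar -input scenes.txt -output results -mapper mapper.py`, and Hadoop handles distribution, retries and data locality.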

  11. Large-scale synthesis of single-crystalline MgO with bone-like nanostructures

    International Nuclear Information System (INIS)

    Niu Haixia; Yang Qing; Tang Kaibin; Xie Yi

    2006-01-01

    Uniform bone-like MgO nanocrystals have been prepared via a solvothermal process using commercial Mg powders as the starting material, in the absence of any catalyst or surfactant, followed by subsequent calcination. Field emission scanning electron microscopy (FE-SEM) and transmission electron microscopy (TEM) measurements indicate that the product consists of a large quantity of bone-like nanocrystals with lengths of 120-200 nm. The widths of these nanocrystals at both ends are in the range of 20-50 nm, which is 3-20 nm wider than the middle parts. X-ray diffraction (XRD) and selected area electron diffraction (SAED) show that the product consists of high-quality cubic single-crystalline nanocrystals. Photoluminescence (PL) measurements show an intense emission centered at 410 nm, indicating that the product has potential application in optical devices. The advantages of our method lie in its high yield, the easy availability of the starting materials, and its suitability for large-scale production at low cost. The growth mechanism is proposed to be related to the solvent's oxidation in the precursor formation process and subsequent nucleation and mass transfer during the decomposition of the precursor.

  12. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    is the idea that the large-scale foodservice such as hospital food service should adopt a buy organic policy due to their large buying volume. But whereas implementation of organic foods has developed quite unproblematically in smaller institutions such as kindergartens and nurseries, introduction of organic...... foods into large-scale foodservice such as that taking place in hospitals and larger homes for the elderly, has proven to be quite difficult. The very complex planning, procurement and processing procedures used in such facilities are among reasons for this. Against this background an evaluation...

  13. Evaluation of enzymatic reactors for large-scale panose production.

    Science.gov (United States)

    Fernandes, Fabiano A N; Rodrigues, Sueli

    2007-07-01

    Panose is a trisaccharide consisting of a maltose molecule bonded to a glucose molecule by an alpha-1,6-glycosidic bond. This trisaccharide has potential to be used in the food industry as a noncariogenic sweetener, as the oral flora does not ferment it. Panose can also be considered a prebiotic, since it stimulates the growth of beneficial microorganisms, such as lactobacilli and bifidobacteria, and inhibits the growth of undesired microorganisms such as E. coli and Salmonella. In this paper, the production of panose by enzymatic synthesis in batch and fed-batch reactors was optimized using a mathematical model developed to simulate the process. Results show that optimum production is obtained in a fed-batch process, with an optimum production rate of 11.23 g/(L·h) of panose, which is 51.5% higher than production in a batch reactor.
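    The paper's model itself is not reproduced here, but a generic fed-batch sketch of the kind such optimizations rest on is shown below; the Michaelis-Menten rate form, parameter values and feed policy are all illustrative assumptions.

```python
# Generic fed-batch enzymatic synthesis model (illustrative, not the authors' model).
from scipy.integrate import solve_ivp

Vmax, Km = 15.0, 20.0        # assumed kinetics: g/(L h) and g/L
F, Sf = 0.05, 300.0          # assumed feed rate (L/h) and feed substrate conc. (g/L)

def fed_batch(t, state):
    S, P, V = state                 # substrate (g/L), product (g/L), volume (L)
    r = Vmax * S / (Km + S)         # Michaelis-Menten synthesis rate
    dS = F / V * (Sf - S) - r       # feed supplies and dilutes substrate
    dP = r - F / V * P              # product formed, diluted by feed
    dV = F
    return [dS, dP, dV]

sol = solve_ivp(fed_batch, (0.0, 48.0), [100.0, 0.0, 1.0])
print(f"final product titre ~ {sol.y[1, -1]:.1f} g/L (illustrative only)")
```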

  14. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement (''the Task'') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  15. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter with recording on a laptop. The recorded surface seismic vibrations from more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented, and the maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out against the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  16. The potential of optimized process design to advance LCA performance of algae production systems

    NARCIS (Netherlands)

    Boxtel, van A.J.B.; Perez-Lopez, P.; Breitmayer, E.; Slegers, P.M.

    2015-01-01

    Environmental impact is an essential aspect for the introduction of algae production systems. As information of large scale algae production is hardly available, process simulation is the only way to evaluate environmental sustainability in an early phase of process design. Simulation results allow

  17. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development, which often focus on a single chemical entity. Therefore, methods employing a few markers or pharmacologically active constituents to assess the quality and authenticity of the complex preparations face a number of severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has also been reported to be applicable to the quality analysis of traditional Chinese medicine. Metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and, by means of chemometrics, classification of samples of diverse biological status, origin or quality. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single-constituent analyses and has considerable potential to become a sharp tool for traditional Chinese medicine quality assessment. Georg Thieme Verlag KG Stuttgart · New York.

  18. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
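    The computational bottleneck comes from the Kronecker structure of the GLAM design matrix (schematic notation, ours):

```latex
X = X_d \otimes \cdots \otimes X_1 , \qquad \eta = X \operatorname{vec}(\Theta) ,
```

so X is never formed explicitly; the linear predictor is instead computed from the small marginal matrices X_j by sequential array operations, which is what a design-matrix-free algorithm exploits.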

  19. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present there is no combustion model which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore the major attention in model development has to be paid to the adoption of existing approaches, or the creation of new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of numerical simulation are presented together with comparisons, critical discussions and conclusions. (authors)
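    For context, a common textbook form of the progress-variable transport equation in such interface-type models is (generic form, not necessarily RDEM's exact formulation; Ξ denotes a flame surface wrinkling factor):

```latex
\frac{\partial (\rho c)}{\partial t} + \nabla \cdot (\rho \mathbf{u} c)
  = \rho_u\, S_L\, \Xi\, \lvert \nabla c \rvert ,
```

where c is the progress variable, ρ_u the unburnt density and S_L the laminar flame speed; the modelling of Ξ is what distinguishes slow deflagration from fast, accelerated regimes.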

  20. Forest landscape models, a tool for understanding the effect of the large-scale and long-term landscape processes

    Science.gov (United States)

    Hong S. He; Robert E. Keane; Louis R. Iverson

    2008-01-01

    Forest landscape models have become important tools for understanding large-scale and long-term landscape (spatial) processes such as climate change, fire, windthrow, seed dispersal, insect outbreak, disease propagation, forest harvest, and fuel treatment, because controlled field experiments designed to study the effects of these processes are often not possible (...

  1. Production of baryons with large transverse momentum

    International Nuclear Information System (INIS)

    Landshoff, P.V.; Polkinghorne, J.C.; Scott, D.M.

    1975-01-01

    The multiple scattering of constituent quarks provides a natural mechanism for fairly copious production of large-transverse-momentum baryons in nucleon-nucleon collisions. The predicted scaling law agrees well with available data, and the mechanism provides a qualitative explanation of nuclear-target effects. In comparison with previous parton models, correlations are predicted to be qualitatively different, and large-p_T baryon production by meson beams is relatively suppressed.
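    The scaling law referred to is of the generic hard-scattering form for inclusive large-p_T production (our notation, with x_T = 2p_T/√s; the specific power N depends on the assumed constituent subprocess):

```latex
E \,\frac{d^3\sigma}{dp^3} \simeq \frac{1}{p_T^{\,N}}\, F(x_T) .
```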

  2. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  3. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  4. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  5. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylemethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 µm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  6. Abnormal binding and disruption in large scale networks involved in human partial seizures

    Directory of Open Access Journals (Sweden)

    Bartolomei Fabrice

    2013-12-01

    Full Text Available There is a marked increase in the amount of electrophysiological and neuroimaging work dealing with the study of large-scale brain connectivity in the epileptic brain. Our view of the epileptogenic process in the brain has evolved considerably over the last twenty years, from the historical concept of an "epileptic focus" to a more complex description of "epileptogenic networks" involved in the genesis and "propagation" of epileptic activities. In particular, a large number of studies have been dedicated to the analysis of intracerebral EEG signals to characterize the dynamics of interactions between brain areas during temporal lobe seizures. These studies have reported that large-scale functional connectivity is dramatically altered during seizures, particularly during temporal lobe seizure genesis and development. Dramatic changes in neural synchrony provoked by epileptic rhythms are also responsible for the production of ictal symptoms or changes in patients' behaviour such as automatisms, emotional changes or consciousness alteration. Besides these studies dedicated to seizures, large-scale network connectivity during the interictal state has also been investigated, not only to define biomarkers of epileptogenicity but also to better understand the cognitive impairments observed between seizures.

  7. Production Supervision Incorporated With Network Technology-A Solution For Controlling In-Process Inventory

    Directory of Open Access Journals (Sweden)

    Suraj Yadav

    2013-06-01

    Full Text Available In the context of manufacturing management on a medium-scale production floor, work-in-process (WIP) management, i.e. the control of the in-process inventory that inevitably results from the production process, has become a vital link in production planning. Growing production requirements and the potential economic benefits of manufacturing process flow have pushed enterprises to integrate work-in-process management with their manufacturing processes; the larger the company, the larger the list of in-process inventory, all of which is typically hard to manage. This paper therefore focuses on integrating sophisticated electronics and networking technologies with WIP management as a native, low-cost solution, especially for medium-scale companies dealing with a large number of products or with customized products, with reference to a study of the present scenario in a multinational company's plant engineering department.

  8. Luminescence property and large-scale production of ZnO nanowires by current heating deposition

    International Nuclear Information System (INIS)

    Singjai, P.; Jintakosol, T.; Singkarat, S.; Choopun, S.

    2007-01-01

    Large-scale production of ZnO nanowires has been demonstrated by current heating deposition. Based on a solid-vapor phase carbothermal sublimation technique, a ZnO-graphite mixed rod was placed between two copper bars and gradually heated by passing current through it under a constant flow of argon gas at atmospheric pressure. The product, seen as white films deposited on the rod surface, was separated for further characterization. The results show mainly comb-like structures of ZnO nanowires with diameters ranging from 50 to 200 nm and lengths up to several tens of micrometers. In optical testing, the ionoluminescence spectra of as-grown and annealed samples showed high green emission intensities centered at 510 nm. In contrast, a small UV peak centered at 390 nm was observed clearly in the as-grown sample but almost disappeared after the annealing treatment.

  9. New technologies for large-scale micropatterning of functional nanocomposite polymers

    Science.gov (United States)

    Khosla, A.; Gray, B. L.

    2012-04-01

    We present a review of different micropatterning technologies for flexible elastomeric functional nanocomposites, with particular emphasis on mold materials and processes for the production of large-size substrates. The functional polymers include electrically conducting and magnetic materials developed at the Micro-instrumentation Laboratory at Simon Fraser University, Canada. We present a chart that compares many of these different conductive and magnetic functional nanocomposites and their measured characteristics. Furthermore, we have previously reported hybrid processes for nanocomposite polymers micromolded against SU-8 photoepoxy masters. However, SU-8 is typically limited to substrate sizes that are compatible with microelectronics processing, as a microelectronics UV-patterning step is typically involved, and de-molding problems are observed. Recently, we have developed new processes that address the problems faced with SU-8 molds. These new technologies for micropatterning nanocomposites involve new substrate materials. A low-cost poly(methyl methacrylate) (PMMA) microfabrication technology has been developed, which involves fabrication of micromolds via either CO2 laser ablation or deep UV exposure. We have previously reported this large-scale patterning technique using laser ablation. Finally, we compare the two processes for producing PMMA micromolds for nanocomposites.

  10. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes, using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For the SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For the LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to the SSD.

  11. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement in search quality warrants the additional computational cost. In this work, we cast the image-ranking problem as the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structure among images. The images found to be "authorities" are chosen as those that answer the image queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular product queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large-scale deployment in commercial search engines.
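    The core computation is an ordinary damped power iteration, run on a visual-similarity matrix instead of a hyperlink matrix; a minimal sketch (illustrative matrix and damping value) is:

```python
# PageRank-style power iteration on a symmetric visual-similarity matrix S.
import numpy as np

def visual_rank(S: np.ndarray, d: float = 0.85, iters: int = 100) -> np.ndarray:
    n = S.shape[0]
    P = S / S.sum(axis=0, keepdims=True)   # column-normalise similarities
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = d * (P @ r) + (1.0 - d) / n    # damped power iteration
    return r                               # high-ranked images act as "authorities"

S = np.array([[0.0, 0.9, 0.1],
              [0.9, 0.0, 0.2],
              [0.1, 0.2, 0.0]])
print(visual_rank(S))
```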

  12. New advances in the integrated management of food processing by-products in Europe: sustainable exploitation of fruit and cereal processing by-products with the production of new food products (NAMASTE EU).

    Science.gov (United States)

    Fava, Fabio; Zanaroli, Giulio; Vannini, Lucia; Guerzoni, Elisabetta; Bordoni, Alessandra; Viaggi, Davide; Robertson, Jim; Waldron, Keith; Bald, Carlos; Esturo, Aintzane; Talens, Clara; Tueros, Itziar; Cebrián, Marta; Sebők, András; Kuti, Tunde; Broeze, Jan; Macias, Marta; Brendle, Hans-Georg

    2013-09-25

    By-products generated every year by the European fruit and cereal processing industry currently exceed several million tons. They are disposed of mainly through landfills and thus are largely unexploited sources of several valuable biobased compounds that could profitably be used in the formulation of novel food products. The opportunity to design novel strategies to turn them into added-value products and food ingredients via novel and sustainable processes is the main target of the recently EC-funded FP7 project NAMASTE-EU, which aims at developing new laboratory-scale protocols and processes for the exploitation of citrus processing by-products and wheat bran surpluses via the production of ingredients useful for the formulation of new beverage and food products. The main results achieved in the first two years of the project include the development and assessment of procedures for the selection, stabilization and physical/biological treatment of citrus and wheat processing by-products, the recovery of some bioactive molecules and ingredients, and the development of procedures for assessing the quality of the obtained ingredients and for their exploitation in the preparation of new food products. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Basic physical phenomena, neutron production and scaling of the dense plasma focus

    International Nuclear Information System (INIS)

    Kaeppeler, H.J.

    This paper presents an attempt at establishing a model theory for the dense plasma focus, in order to give a consistent interpretation of the basic physical phenomena leading to neutron production from both acceleration and thermal processes. To this end, the temporal history of the focus is divided into the compression of the plasma sheath; a quiescent, very dense phase with ensuing expansion; and an unstable phase where the focus plasma is disrupted by instabilities. Finally, the decay of the density, velocity and thermal fields is considered. Under the assumptions that I₀²/σ₀R₀² = const and t₀/T_c = const, scaling laws for plasma focus devices are derived. It is shown that while the neutron yield generally scales with the fourth power of the maximum current, neutron production from thermal processes becomes increasingly important for large devices, while in small devices neutron production from acceleration processes is by far predominant. (orig.)

  14. Centralized manure digestion. Selection of locations and estimation of costs of large-scale manure storage application

    International Nuclear Information System (INIS)

    1995-03-01

    A study to assess the possibilities and the consequences of using existing Dutch large-scale manure silos at centralised anaerobic digestion plants (CAD plants) for manure and energy-rich organic wastes was carried out. Reconstruction of these large-scale manure silos into digesters for a CAD plant is not self-evident, owing to the high height/diameter ratio of these silos and the extra investments that have to be made for additional facilities for roofing, insulation, mixing and heating. From an inventory and selection of large-scale manure silos with a storage capacity above 1,500 m³, it appeared that there are 21 locations in The Netherlands that qualify for realisation of a CAD plant with a processing capacity of 100 m³ of biomass (80% manure, 20% additives) per day. These locations are found in particular in the 'shortage areas' for manure fertilisation in the Dutch provinces of Groningen and Drenthe. Three of these 21 locations with large-scale silos are considered the most suitable for realisation of a large-scale CAD plant. The selection is based on an optimal scale for a CAD plant of 300 m³ of material (80% manure, 20% additives) to be processed per day and the most suitable consuming markets for the biogas produced at the CAD plant. The three locations are at Middelharnis, Veendam, and Klazinaveen. Applying the conditions as used in this study and accounting for all costs for transport of manure, additives and end-product, including the costs for the storage facilities, a break-even operation might be realised at a minimum income for the additives of approximately 50 Dutch guilders per m³ (including TAV). This income price is considerably lower than the prevailing costs for tipping or processing of organic wastes in The Netherlands. This study revealed that a break-even exploitation of a large-scale CAD plant for the processing of manure with energy-rich additives is possible. (Abstract Truncated)

  15. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. There is abundant evidence not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Although no longer occurring with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  16. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  17. Development and scale-up of the production process of NovoCell fuel cells; Desenvolvimento e 'scale-up' do processo de producao de celulas a combustivel NovoCell

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, Dayse Caldas de; Souza, Adler de; Ferreira, Valdemar Stelita [NovoCell Sistemas de Energia S.A., Santa Barbara D' Oeste, SP (Brazil)

    2008-07-01

    Fuel cells have the potential to replace internal combustion engines in vehicles and to supply energy for stationary use. This potential, however, is not yet reflected in their introduction into the market with regular production lines, because of their high cost and the lack of criteria demonstrating their reliability and durability. These issues are the main goals of fuel cell development programs worldwide. NovoCell is a Brazilian company whose objective is to develop and produce hydrogen/air fuel cells for stationary generation. The whole project is guided by the use of technologies, processes and materials that allow large-scale production at a competitive cost, supporting a continuous program of product innovation and development. In this work the technological solutions developed by the company are presented. (author)

  18. Toyota production system beyond large-scale production

    CERN Document Server

    Ohno, Taiichi

    1998-01-01

    In this classic text, Taiichi Ohno--inventor of the Toyota Production System and Lean manufacturing--shares the genius that sets him apart as one of the most disciplined and creative thinkers of our time. Combining his candid insights with a rigorous analysis of Toyota's attempts at Lean production, Ohno's book explains how Lean principles can improve any production endeavor. A historical and philosophical description of just-in-time and Lean manufacturing, this work is a must read for all students of human progress. On a more practical level, it continues to provide inspiration and instruction for those seeking to improve efficiency through the elimination of waste.

  19. Large-scale production of poly(3-hydroxyoctanoic acid) by Pseudomonas putida GPo1 and a simplified downstream process.

    Science.gov (United States)

    Elbahloul, Yasser; Steinbüchel, Alexander

    2009-02-01

    The suitability of Pseudomonas putida GPo1 for large-scale cultivation and production of poly(3-hydroxyoctanoate) (PHO) was investigated in this study. Three fed-batch cultivations of P. putida GPo1 at the 350- or 400-liter scale in a bioreactor with a capacity of 650 liters were done in mineral salts medium initially containing 20 mM sodium octanoate as the carbon source. The feeding solution included ammonium octanoate, which was fed at a relatively low concentration to promote PHO accumulation under nitrogen-limited conditions. During cultivation, the pH was regulated by addition of NaOH, NH₄OH, or octanoic acid, which was used as an additional carbon source. The partial O₂ pressure (pO₂) was adjusted to 20 to 40% by controlling the airflow and stirrer speed. Under the optimized conditions, P. putida GPo1 was able to grow to cell densities as high as 18, 37, and 53 g cells (dry mass) (CDM) per liter, containing 49, 55, and 60% (wt/wt) PHO, respectively. The resulting 40 kg CDM from these three cultivations was used directly for extraction of PHO. Three different methods of extracting PHO were applied; of these, acetone extraction showed the best performance and resulted in 94% recovery of the PHO content of the cells. A novel mixture of precipitation solvents composed of 70% (vol/vol) methanol and 70% (vol/vol) ethanol was identified in this study. A ratio of PHO concentrate to the mixture of 0.2:1 (vol/vol) allowed complete precipitation of PHO as white flakes, while a 1:1 (vol/vol) ratio of the solvent mixture to the PHO concentrate yielded a highly purified PHO. Precipitation yielded a dough-like polymeric material which was cast into thin layers and then shredded into small strips to allow evaporation of the remaining solvents. Gas chromatographic analysis revealed a polymer purity of about 99% ± 0.2% (wt/wt); the polymer consisted mainly of 3-hydroxyoctanoic acid (96 mol%).
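
    A quick back-of-envelope reconstruction of the quoted ~40 kg CDM from the per-batch numbers; the pairing of reactor volumes with cell densities is an assumption:

```python
# Rough reconstruction of the ~40 kg CDM quoted above from the per-batch numbers.
# The pairing of reactor volumes with cell densities is an assumption.
batches = [(350, 18, 0.49), (400, 37, 0.55), (400, 53, 0.60)]  # (L, g CDM/L, PHO fraction)
recovery = 0.94  # acetone extraction recovery quoted in the abstract

cdm_kg = sum(v * x for v, x, _ in batches) / 1000
pho_kg = sum(v * x * f for v, x, f in batches) / 1000 * recovery
print(f"total CDM ~ {cdm_kg:.1f} kg, recoverable PHO ~ {pho_kg:.1f} kg")
```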

  20. Logistics of large scale commercial IVF embryo production.

    Science.gov (United States)

    Blondin, P

    2016-01-01

    The use of IVF in agriculture is growing worldwide. This can be explained by the development of better IVF media and techniques, development of sexed semen and the recent introduction of bovine genomics on farms. Being able to perform IVF on a large scale, with multiple on-farm experts to perform ovum pick-up and IVF laboratories capable of handling large volumes in a consistent and sustainable way, remains a huge challenge. To be successful, there has to be a partnership between veterinarians on farms, embryologists in the laboratory and animal owners. Farmers must understand the limits of what IVF can or cannot do under different conditions; veterinarians must manage expectations of farmers once strategies have been developed regarding potential donors; and embryologists must maintain fluent communication with both groups to make sure that objectives are met within predetermined budgets. The logistics of such operations can be very overwhelming, but the return can be considerable if done right. The present mini review describes how such operations can become a reality, with an emphasis on the different aspects that must be considered by all parties.

  1. The effects of large scale processing on caesium leaching from cemented simulant sodium nitrate waste

    International Nuclear Information System (INIS)

    Lee, D.J.; Brown, D.J.

    1982-01-01

    The effects of large scale processing on the properties of cemented simulant sodium nitrate waste have been investigated. Leach tests have been performed on full-size drums, cores and laboratory samples of cement formulations containing Ordinary Portland Cement (OPC), Sulphate Resisting Portland Cement (SRPC) and a blended cement (90% ground granulated blast furnace slag/10% OPC). In addition, the development of the cement hydration exotherms with time and the temperature distribution in 220 dm³ samples have been followed. (author)

  2. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design, problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  3. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job. The solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the solving efficiency and the solution quality of the sub-problems, a detection method for multi-bottleneck machines based on the critical path is proposed, whereby the unscheduled operations can be divided into bottleneck operations and non-bottleneck operations. Following the principle in the Theory of Constraints (TOC) that the bottleneck leads the performance of the whole manufacturing system, the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency. Findings: In the construction of the sub-problems, some operations from the previously scheduled sub-problem are moved into the successive sub-problem for re-optimization; this strategy improves the solution quality of the algorithm. When solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality. Research limitations/implications: Some assumptions reduce the complexity of the large-scale scheduling problem: the processing route of each job is predetermined, the processing time of each operation is fixed, there are no machine breakdowns, and no preemption of operations is allowed. These assumptions should be reconsidered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
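
    As a rough illustration of the bottleneck idea (not the authors' algorithm, which detects multiple bottlenecks from the critical path and schedules bottleneck operations with a genetic algorithm), the toy dispatcher below treats the most heavily loaded machine as the bottleneck and prioritises its operations, falling back to the shortest-processing-time rule elsewhere; the job data are invented:

```python
# Toy bottleneck-guided dispatching for a job shop. The bottleneck here is
# simply the machine with the largest total workload; all data are made up.
jobs = {  # job -> ordered list of (machine, processing_time)
    "J1": [("M1", 3), ("M2", 2), ("M3", 2)],
    "J2": [("M2", 2), ("M1", 4), ("M3", 1)],
    "J3": [("M1", 2), ("M3", 3), ("M2", 3)],
}

# Bottleneck detection: machine with the highest total processing load.
load = {}
for ops in jobs.values():
    for m, p in ops:
        load[m] = load.get(m, 0) + p
bottleneck = max(load, key=load.get)

job_ready = {j: 0 for j in jobs}      # time the next op of each job may start
machine_free = {m: 0 for m in load}   # time each machine becomes free
next_op = {j: 0 for j in jobs}
schedule = []

while any(next_op[j] < len(jobs[j]) for j in jobs):
    # Among jobs with a pending operation: earliest possible start first,
    # then bottleneck-machine operations, then shortest processing time (SPT).
    candidates = [j for j in jobs if next_op[j] < len(jobs[j])]
    def key(j):
        m, p = jobs[j][next_op[j]]
        return (max(job_ready[j], machine_free[m]), m != bottleneck, p)
    j = min(candidates, key=key)
    m, p = jobs[j][next_op[j]]
    start = max(job_ready[j], machine_free[m])
    schedule.append((j, m, start, start + p))
    job_ready[j] = machine_free[m] = start + p
    next_op[j] += 1

makespan = max(end for *_, end in schedule)
print(f"bottleneck machine: {bottleneck}, makespan: {makespan}")
for record in sorted(schedule, key=lambda r: r[2]):
    print(record)
```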

  4. Large Scale Product Recommendation of Supermarket Ware Based on Customer Behaviour Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Kanavos

    2018-05-01

    Full Text Available In this manuscript, we present a prediction model based on the behaviour of each customer using data mining techniques. The proposed model utilizes a supermarket database and an additional database from Amazon, both containing information about customers' purchases. Our model analyzes these data in order to classify both customers and products, and is trained and validated with real data. The model is targeted towards classifying customers according to their consuming behaviour and consequently proposing new products more likely to be purchased by them. The corresponding prediction model is intended to be utilized as a tool for marketers, providing an analytically targeted and specified view of consumer behaviour. Our algorithmic framework and the subsequent implementation employ cloud infrastructure and the MapReduce programming environment, a model for processing large datasets in a parallel manner with a distributed algorithm on computer clusters, as well as Apache Spark, a newer framework built on the same principles as Hadoop. By applying a MapReduce model at each step of the proposed method, text processing speed and scalability are enhanced in comparison with traditional methods. Our results show that the proposed method predicts supermarket purchases with high accuracy.
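
    A minimal map/reduce-style sketch of the kind of per-segment purchase counting such a recommender relies on; a real deployment would run on Hadoop or Apache Spark, and the purchase records and customer segments below are invented:

```python
# Plain-Python illustration of the map -> shuffle -> reduce stages for counting
# product popularity per customer segment, then recommending the top product.
from collections import defaultdict
from itertools import chain

purchases = [("alice", "milk"), ("alice", "bread"), ("bob", "milk"),
             ("bob", "beer"), ("carol", "bread"), ("carol", "milk")]
segment = {"alice": "family", "bob": "single", "carol": "family"}

def mapper(record):
    customer, product = record
    yield (segment[customer], product), 1

def reducer(key, values):
    return key, sum(values)

# Shuffle: group mapper output by key, then reduce each group.
groups = defaultdict(list)
for k, v in chain.from_iterable(mapper(r) for r in purchases):
    groups[k].append(v)
counts = dict(reducer(k, vs) for k, vs in groups.items())

# Recommend the most popular product within each segment.
best = defaultdict(lambda: ("", 0))
for (seg, product), n in counts.items():
    if n > best[seg][1]:
        best[seg] = (product, n)
print(dict(best))
```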

  5. Metoder for Modellering, Simulering og Regulering af Større Termiske Processer anvendt i Sukkerproduktion. Methods for Modelling, Simulation and Control of Large Scale Thermal Systems Applied in Sugar Production

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    The subject of this Ph.D. thesis is to investigate and develop methods for modelling, simulation and control applicable to large scale thermal industrial plants. An ambition has been to evaluate the results in a physical process; sugar production is well suited for the purpose. In collaboration...... simulator has been developed. The simulator handles the normal working conditions relevant to control engineers. A non-linear dynamic model based on mass and energy balances has been developed, and the model parameters have been adjusted to data measured at a Danish sugar plant. The simulator consists...... of a computer, a data terminal and an electric interface corresponding to the interface at the sugar plant. The simulator operates in real time, and thus a realistic test of controllers is possible. The idiomatic control methodology has been investigated by developing a control concept for the evaporation...

  6. Modelling aggregation on the large scale and regularity on the small scale in spatial point pattern datasets

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper

    We consider a dependent thinning of a regular point process with the aim of obtaining aggregation on the large scale and regularity on the small scale in the resulting target point process of retained points. Various parametric models for the underlying processes are suggested and the properties...
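
    A minimal simulation in the spirit of this construction: a hard-core (regular) pattern is thinned with a spatially varying retention probability, so retained points stay regular at small scales but cluster at large scales. The paper studies dependent thinnings with parametric models; this field-driven thinning is a simplification:

```python
# Regular pattern via simple sequential inhibition, then spatially varying
# thinning driven by a smooth "hotspot" field to induce large-scale clusters.
import numpy as np

rng = np.random.default_rng(1)

# Regular pattern: points in the unit square at least r apart (hard core).
r, points = 0.02, []
while len(points) < 400:
    p = rng.random(2)
    if all(np.hypot(*(p - q)) >= r for q in points):
        points.append(p)
points = np.array(points)

# Smooth retention field: high near a few random hotspots.
hotspots = rng.random((3, 2))
def retention(p):
    d2 = ((hotspots - p) ** 2).sum(axis=1)
    return min(1.0, np.exp(-d2 / 0.02).sum())

keep = np.array([rng.random() < retention(p) for p in points])
print(f"retained {keep.sum()} of {len(points)} points")
```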

  7. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  8. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    OpenAIRE

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been...

  9. Analysis of supply chain, scale factor, and optimum plant capacity for the production of ethanol from corn stover

    International Nuclear Information System (INIS)

    Leboreiro, Jose; Hilaly, Ahmad K.

    2013-01-01

    A detailed model is used to perform a thorough analysis of ethanol production from corn stover via the dilute acid process. The biomass supply chain cost model accounts for all steps needed to source corn stover, including collection, transportation, and storage. The manufacturing cost model is based on work done at NREL; attainable conversions of key process parameters are used to calculate production cost. The choice of capital investment scaling function and scaling parameter has a significant impact on the optimum plant capacity. For the widely used exponential function, the scaling factors are functions of plant capacity: the pre-exponential factor decreases with increasing plant capacity while the exponential factor increases as the plant capacity increases. The use of scaling parameters calculated for small plant capacities leads to falsely large optimum plants; data from a wide range of plant capacities are required to produce accurate results. A mathematical expression to scale capital investment for fermentation-based biorefineries is proposed which accounts for the linear scaling behavior of bio-reactors (such as saccharification vessels and fermentors) as well as the exponential nature of all other plant equipment. Ignoring the linear scaling behavior of bio-reactors leads to artificially large optimum plant capacities. The minimum production cost is found to be in the range of 789–830 $ m⁻³, which is significantly higher than previously reported. Optimum plant capacities are in the range of 5,750–9,850 Mg d⁻¹. The optimum plant capacity and production cost are highly sensitive to farmer participation in biomass harvest at low participation rates. -- Highlights: •A detailed model is used to perform a technoeconomic analysis for the production of ethanol from corn stover. •The capital investment scaling factors were found to be a function of plant capacity. •Bio-reactors (such as saccharification vessels and fermentors) in large size
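
    The proposed scaling behaviour can be caricatured in a few lines: unit capital cost combines a linear (bio-reactor) term with a power-law term for the rest of the plant, while unit transport cost grows roughly with the collection radius, i.e. with the square root of capacity. All coefficients below are invented for illustration, not the paper's values:

```python
# Optimum plant capacity as a trade-off between economies of scale in capital
# and diseconomies of scale in biomass transport. Coefficients are illustrative.
import numpy as np

S = np.linspace(500, 15000, 200)           # plant capacity, Mg/d
capex_unit = 40.0 + 2000.0 * S**0.7 / S    # $/Mg: linear reactors + S^0.7 rest
transport_unit = 1.2 * np.sqrt(S)          # $/Mg: grows with collection radius
total_unit = capex_unit + transport_unit

opt = S[np.argmin(total_unit)]
print(f"optimum capacity ~ {opt:.0f} Mg/d, unit cost ~ {total_unit.min():.1f} $/Mg")
```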

  10. Process simulation of nuclear-based thermochemical hydrogen production with a copper-chlorine cycle

    International Nuclear Information System (INIS)

    Chukwu, C.C.; Naterer, G.F.; Rosen, M.A.

    2008-01-01

    Thermochemical processes for hydrogen production driven by nuclear energy are promising alternatives to existing technologies for large-scale commercial production of hydrogen without fossil fuels. The copper-chlorine (Cu-Cl) cycle, in which water is decomposed into hydrogen and oxygen, is promising for thermochemical hydrogen production in conjunction with a Supercritical Water Cooled Reactor. Here, the cycle efficiency is examined using the Aspen Plus process simulation code. Possible efficiency improvements are discussed. The results are expected to assist the development of a lab-scale cycle demonstration, which is currently being undertaken at University of Ontario Institute of Technology in collaboration with numerous partners. (author)

  11. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research so far shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series

  12. Large-scale distribution of tritium in a commercial product

    International Nuclear Information System (INIS)

    Combs, F.; Doda, R.J.

    1979-01-01

    Tritium enters the environment from various sources including nuclear reactor operations, weapons testing, natural production, and the manufacture, use and ultimate disposal of commercial products containing tritium. A recent commercial application of tritium in the United States of America involves the backlighting of liquid crystal displays (LCD) in digital electronic watches. These watches are distributed through normal commercial channels to the general public. One million curies (MCi) of tritium were distributed in 1977 in this product. This is a significant quantity of tritium compared with power reactor-produced tritium (3 MCi yearly) or with naturally produced tritium (6 MCi yearly), and the single largest commercial application involving tritium to date. The final disposition of tritium from large quantities of this product, after its useful life, must be estimated by considering the means of disposal and the possibility of dispersal of tritium concurrent with disposal. The most likely method of final disposition of this product will be disposal in solid refuse; this includes burial in landfills and incineration. Burial in landfills will probably contain the tritium for its effective lifetime, whereas incineration will release all the tritium gas (as the oxide) to the atmosphere. The use and disposal of this product will be studied as part of an environmental study that is at present being prepared for the U.S. Nuclear Regulatory Commission. (author)

  13. Recycling of mill scale in sintering process

    Directory of Open Access Journals (Sweden)

    El-Hussiny N.A.

    2011-01-01

    Full Text Available This investigation deals with the effect of replacing part of the Baharia high-barite iron ore concentrate with mill scale waste, which is characterized by a high iron oxide content, on the parameters of the sintering process, and with the effect of different amounts of added coke breeze on those parameters when using 5% mill scale waste with 95% iron ore concentrate. The results of this work show that replacing iron ore concentrate with mill scale increases the amount of ready-made sinter, the sinter strength, and the productivity of the sinter machine and of the blast furnace yard. Increasing the coke breeze addition likewise increases the ready-made sinter and the productivity of the sintering machine at the blast furnace yard. Beyond 5% mill scale, the productivity of the sintering machine decreased slightly due to the decrease of vertical velocity.

  14. Biomethanol production from gasification of non-woody plant in South Africa: Optimum scale and economic performance

    International Nuclear Information System (INIS)

    Amigun, Bamikole; Gorgens, Johann; Knoetze, Hansie

    2010-01-01

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MWth. The economics of the methanol production plants depend strongly on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit of biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production depends mostly on the capital investment. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MWth plant to about 6.44 R/l for a 60 MWth plant and 3.95 R/l for a 400 MWth plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MWth, but do continue to decrease, to about 2.89 R/l for a 2000 MWth plant. Long-term cost reduction resides mainly in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes sustainable biomass production into account could be the better choice for economic reasons.
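
    The four quoted cost points suggest a power-law economy of scale; a quick log-log fit (the functional form is our assumption, not the paper's) gives the implied scale exponent:

```python
# Log-log fit of the unit methanol cost figures quoted in the abstract.
import numpy as np

size_mwth = np.array([10, 60, 400, 2000])        # plant size, MWth
unit_cost = np.array([10.66, 6.44, 3.95, 2.89])  # R/l, from the abstract

b, log_a = np.polyfit(np.log(size_mwth), np.log(unit_cost), 1)
print(f"unit cost ~ {np.exp(log_a):.1f} * size^{b:.2f} R/l")
```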

  15. Biomethanol production from gasification of non-woody plant in South Africa: Optimum scale and economic performance

    Energy Technology Data Exchange (ETDEWEB)

    Amigun, Bamikole, E-mail: bamigun@csir.co.z [Sustainable Energy Futures, Natural Resources and the Environment, Council for Scientific and Industrial Research (CSIR), Pretoria (South Africa); Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa); Gorgens, Johann; Knoetze, Hansie [Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa)

    2010-01-15

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MWth. The economics of the methanol production plants depend strongly on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit of biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production depends mostly on the capital investment. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MWth plant to about 6.44 R/l for a 60 MWth plant and 3.95 R/l for a 400 MWth plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MWth, but do continue to decrease, to about 2.89 R/l for a 2000 MWth plant. Long-term cost reduction resides mainly in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes sustainable biomass production into account could be the better choice for economic reasons.

  16. Biomethanol production from gasification of non-woody plant in South Africa. Optimum scale and economic performance

    Energy Technology Data Exchange (ETDEWEB)

    Amigun, Bamikole [Sustainable Energy Futures, Natural Resources and the Environment, Council for Scientific and Industrial Research (CSIR), Pretoria (South Africa); Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa); Gorgens, Johann; Knoetze, Hansie [Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa)

    2010-01-15

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MWth. The economics of the methanol production plants depend strongly on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit of biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production depends mostly on the capital investment. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MWth plant to about 6.44 R/l for a 60 MWth plant and 3.95 R/l for a 400 MWth plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MWth, but do continue to decrease, to about 2.89 R/l for a 2000 MWth plant. Long-term cost reduction resides mainly in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes sustainable biomass production into account could be the better choice for economic reasons. (author)

  17. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    Science.gov (United States)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems currently operate at national and international scales. Most are based on methods derived from the pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to the others to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels. An overall diagram summarized the diverse relationships between satellite EO and agriculture information. There is now a large gap between the current operational large scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large scale demonstrations, preventing the much needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7-Geoland 2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, the vegetation indices and the biophysical variables

  18. Benchmarking of Processes for the Biosynthesis of Natural Products

    DEFF Research Database (Denmark)

    Seita, Catarina Sanches

    putida GS1. (R)-perillic acid is a monoterpenoic acid with antimicrobial properties. It has a strong inhibitory effect on bacteria and fungi, which makes it an attractive compound for use as a preservative, for instance in the cosmetics industry, but on the other hand makes the biosynthesis a complicated..... These biological activities can be of interest for use in different sectors of the chemical industry, in particular the pharmaceutical industry, where several drugs are derived from or inspired by natural product structures. However, the large scale production of natural products is hindered by their relatively poor abundance...... of the process in comparison with other sweeteners. The main benefit of this early-stage evaluation is putting the biosynthesis of natural products into context in relation to the demands of an industrially feasible chemical process. Moreover, it can give very meaningful insight into process development and provides

  19. Scale-up and large-scale production of Tetraselmis sp. CTP4 (Chlorophyta) for CO2 mitigation: from an agar plate to 100-m3 industrial photobioreactors.

    Science.gov (United States)

    Pereira, Hugo; Páramo, Jaime; Silva, Joana; Marques, Ana; Barros, Ana; Maurício, Dinis; Santos, Tamára; Schulze, Peter; Barros, Raúl; Gouveia, Luísa; Barreira, Luísa; Varela, João

    2018-03-23

    Industrial production of novel microalgal isolates is key to improving the current portfolio of available strains that are able to grow in large-scale production systems for different biotechnological applications, including carbon mitigation. In this context, Tetraselmis sp. CTP4 was successfully scaled up from an agar plate to 35- and 100-m³ industrial-scale tubular photobioreactors (PBR). Growth was performed semi-continuously for 60 days in the autumn-winter season (17 October - 14 December). Optimisation of tubular PBR operations showed that improved productivities were obtained at a culture velocity of 0.65-1.35 m s⁻¹ and a pH set-point for CO₂ injection of 8.0. The highest volumetric (0.08 ± 0.01 g L⁻¹ d⁻¹) and areal (20.3 ± 3.2 g m⁻² d⁻¹) biomass productivities were attained in the 100-m³ PBR, compared to those of the 35-m³ PBR (0.05 ± 0.02 g L⁻¹ d⁻¹ and 13.5 ± 4.3 g m⁻² d⁻¹, respectively). Lipid contents were similar in both PBRs (9-10% of ash-free dry weight). CO₂ sequestration was followed in the 100-m³ PBR, revealing a mean CO₂ mitigation efficiency of 65% and a biomass-to-carbon ratio of 1.80. Tetraselmis sp. CTP4 is thus a robust candidate for industrial-scale production, with promising biomass productivities and photosynthetic efficiencies up to 3.5% of total solar irradiance.
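
    A back-of-envelope check of the CO₂ fixation implied by these figures; a biomass-to-carbon ratio of 1.80 means 1 kg of carbon per 1.8 kg of biomass, and CO₂ fixed is carbon times 44/12:

```python
# CO2 fixation implied by the quoted productivity and mitigation efficiency.
# Reactor volume, productivity, carbon ratio and efficiency are from the abstract.
volume_L = 100_000                 # 100-m3 PBR
productivity = 0.08                # g biomass L^-1 d^-1 (volumetric, 100-m3 PBR)
biomass_to_carbon = 1.80
mitigation_efficiency = 0.65       # fraction of injected CO2 actually fixed

biomass_kg_d = volume_L * productivity / 1000
co2_fixed_kg_d = biomass_kg_d / biomass_to_carbon * 44 / 12
co2_injected_kg_d = co2_fixed_kg_d / mitigation_efficiency
print(f"biomass: {biomass_kg_d:.0f} kg/d, CO2 fixed: {co2_fixed_kg_d:.1f} kg/d, "
      f"CO2 injected: {co2_injected_kg_d:.1f} kg/d")
```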

  20. Novel fermentation processes for manufacturing plant natural products.

    Science.gov (United States)

    Zhou, Jingwen; Du, Guocheng; Chen, Jian

    2014-02-01

    Microbial production of plant natural products (PNPs), such as terpenoids and flavonoids, from renewable carbohydrate feedstocks offers a sustainable and economically attractive alternative to their petroleum-based production. The rapid development of metabolic engineering and synthetic biology of microorganisms shows many advantages over the current extraction of these useful, high-price chemicals from plants. Although few of them have actually been applied on a large scale for PNP production, continuing research on these high-price chemicals and their rapidly growing global market point to a promising future for the production of PNPs by microorganisms in a more economical and environmentally friendly way. The introduction of novel pathways and the optimization of native cellular processes by metabolic engineering of microorganisms for PNP production are rapidly expanding the range of cell-factory applications. Here we review recent progress in metabolic engineering of microorganisms for the production of PNPs. In addition, factors restricting yield improvement and the translation of lab-scale achievements into industrial applications are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    International Nuclear Information System (INIS)

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered negligible. Before optimization the dominant source of uncertainty was the tooling design; after optimization the dominant source was thermal expansion of the engine, meaning that no further improvement can be made without measurement in a temperature-controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as on the reliability of these products. (paper)
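
    A minimal sketch of how such uncertainty contributions combine in quadrature, including a thermal-expansion term of the form α·L·u(ΔT); all numbers are illustrative assumptions, not values from the engine study:

```python
# Combining measurement uncertainty contributions in quadrature.
import math

L = 3.0                 # measured length, m (assumed)
alpha = 12e-6           # thermal expansion coefficient of steel, 1/K (typical)
u_temperature = 1.0     # standard uncertainty of temperature, K (assumed)

u_instrument = 0.010e-3     # laser tracker / photogrammetry network, m (assumed)
u_tooling = 0.020e-3        # datum/tooling contribution, m (assumed)
u_thermal = alpha * L * u_temperature

u_combined = math.sqrt(u_instrument**2 + u_tooling**2 + u_thermal**2)
print(f"combined standard uncertainty: {u_combined * 1e3:.3f} mm")
# With these inputs the thermal term dominates, mirroring the paper's finding
# that only a temperature-controlled environment can improve the result further.
```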

  2. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P. [PA Energy, Malling (Denmark)]; Vedde, J. [SiCon. Silicon and PV consulting, Birkeroed (Denmark)]

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers to LPV projects, but as no real LPV projects have been processed so far, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimise the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar elevation, PV installations on flat surfaces have to balance the requirements of physical space, and cost, against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark falls into three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km². In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark of 33-35 TWh is about 300 km². The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
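
    A quick consistency check of the ~300 km² figure: the 10% overall conversion efficiency comes from the abstract, while the Danish horizontal irradiation of ~1000 kWh/m² per year is an assumed typical value:

```python
# Annual PV yield from a given ground area at the quoted overall efficiency.
area_km2 = 300
insolation_kwh_m2_yr = 1000   # assumed typical Danish horizontal irradiation
efficiency = 0.10             # overall conversion efficiency from the abstract

area_m2 = area_km2 * 1e6
annual_twh = area_m2 * insolation_kwh_m2_yr * efficiency / 1e9
print(f"{annual_twh:.0f} TWh/yr")   # ~30 TWh, close to the 33-35 TWh consumption
```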

  3. Scale-up of precipitation processes

    OpenAIRE

    Zauner, R.

    1999-01-01

    This thesis concerns the scale-up of precipitation processes aimed at predicting product particle characteristics. Although precipitation is widely used in the chemical and pharmaceutical industry, successful scale-up is difficult due to the absence of a validated methodology. It is found that none of the conventional scale-up criteria reported in the literature (equal power input per unit mass, equal tip speed, equal stirring rate) is capable of predicting the experimentally o...

  4. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    Full Text Available The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  5. New systems for the large-scale production of male tsetse flies (Diptera: Glossinidae)

    International Nuclear Information System (INIS)

    Opiyo, E.; Luger, D.; Robinson, A.S.

    2000-01-01

    morsitans morsitans Westwood produced a total of 500,000 sterile males. In Burkina Faso, between 1976 and 1984, a colony of 330,000 G. palpalis gambiensis Vanderplank and G. tachinoides Westwood provided 950,000 sterile males for release into an area of 3,000 km² (Clair et al. 1990), while during the Bicot project in Nigeria 1.5 million sterile male G. p. palpalis Robineau-Desvoidy were released in an area of 1,500 km² (Olandunmade et al. 1990). Recently, 8.5 million sterile males produced by a colony of about 600,000 G. austeni Newstead were released on Unguja Island, Zanzibar, the United Republic of Tanzania, in an area of 1,600 km² (Saleh et al. 1997, Kitwika et al. 1997). This led to the eradication of the tsetse population and a massive reduction in disease incidence in cattle (Saleh et al. 1997). Tsetse fly SIT has been applied only on a limited scale because of the inability to provide large numbers of sterile males for release. The present rearing system is labour intensive, and too many quality-sensitive steps in the mass production system are insufficiently standardised to transfer the system directly to large-scale production. Tsetse rearing evolved from feeding on live hosts to an in vitro rearing system where blood is fed to flies through a silicone membrane (Feldmann 1994a). At present, cages are small, hold a small number of flies and have to be manually transferred for feeding and then returned for pupal collection. This limits the number of flies that can be handled at any one time. In order to improve these processes, a Tsetse Production Unit (TPU) was developed and evaluated. During conventional tsetse rearing, flies need to be sexed so that the correct number and sex of flies is obtained, whether for stocking production cages or for the release of males only. This has to be done by hand on an individual fly basis following the immobilisation of adults at low temperature. A procedure is reported in this paper for the self-stocking of production cages (SSPC) which enables flies to

  6. The large-scale process of microbial carbonate precipitation for nickel remediation from an industrial soil.

    Science.gov (United States)

    Zhu, Xuejiao; Li, Weila; Zhan, Lu; Huang, Minsheng; Zhang, Qiuzhuo; Achal, Varenyam

    2016-12-01

    Microbial carbonate precipitation is known as an efficient process for the remediation of heavy metals from contaminated soils. In the present study, a urease-positive bacterial isolate, identified as Bacillus cereus NS4 through 16S rDNA sequencing, was utilized on a large scale to remove nickel from industrial soil contaminated by the battery industry. The soil was highly contaminated, with an initial total nickel concentration of approximately 900 mg kg⁻¹. The soluble-exchangeable fraction was reduced to 38 mg kg⁻¹ after treatment. The primary objective of metal stabilization was achieved by reducing bioavailability through immobilizing the nickel in the urease-driven carbonate precipitation. The nickel removal in the soils was due to the transformation of nickel from mobile species into stable biominerals identified as calcite, vaterite, aragonite and nickelous carbonate when analyzed under XRD. It was proven that during the precipitation of calcite, Ni²⁺, with an ionic radius close to that of Ca²⁺, was incorporated into the CaCO₃ crystal. The biominerals were also characterized using SEM-EDS to observe the crystal shape, and Raman and FTIR spectroscopy to identify the bonding responsible for Ni immobilization during bioremediation. The electronic structure and chemical-state information of the detected elements during the MICP bioremediation process was studied by XPS. This is the first study in which microbial carbonate precipitation was used for the large-scale remediation of metal-contaminated industrial soil. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method which scales well with dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
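
    A sketch of the core data structure: a per-pixel linked list mapping each screen pixel to the pathline segments that project onto it. On the GPU this is typically built with atomic head pointers into a node buffer; the plain-Python version below only illustrates the idea:

```python
# Per-pixel linked list: head pointer per pixel into a flat node buffer.
HEAD_NONE = -1
width, height = 4, 3
head = [[HEAD_NONE] * width for _ in range(height)]  # head pointer per pixel
nodes = []  # flat node buffer: (segment_id, depth, next_index)

def insert_segment(x, y, segment_id, depth):
    """Prepend a pathline segment fragment to the pixel's linked list."""
    nodes.append((segment_id, depth, head[y][x]))
    head[y][x] = len(nodes) - 1

def fragments(x, y):
    """Walk a pixel's list, e.g. to filter or color-code segments at view time."""
    i = head[y][x]
    while i != HEAD_NONE:
        seg, depth, i = nodes[i]
        yield seg, depth

# Two segments cover pixel (1, 2); exploration reads them back without
# touching the original flow-field data.
insert_segment(1, 2, segment_id=7, depth=0.4)
insert_segment(1, 2, segment_id=12, depth=0.1)
print(sorted(fragments(1, 2), key=lambda f: f[1]))
```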

  8. Comparison of biohydrogen production processes

    International Nuclear Information System (INIS)

    Manish, S.; Banerjee, Rangan

    2008-01-01

    For hydrogen to be a viable energy carrier, it is important to develop hydrogen generation routes that are renewable, like biohydrogen. Hydrogen can be produced biologically by biophotolysis (direct and indirect), photo-fermentation and dark-fermentation, or by combinations of these processes (such as the integration of dark- and photo-fermentation (two-stage process), or biocatalyzed electrolysis). However, production of hydrogen by these methods at a commercial level is not reported in the literature, and challenges regarding process scale-up remain. In this scenario, net energy analysis (NEA) can provide a tool for establishing the viability of different methods before scaling up. The analysis can also be used to set targets for various process and design parameters for biohydrogen production. In this paper, four biohydrogen production processes (dark-fermentation, photo-fermentation, the two-stage process and biocatalyzed electrolysis) utilizing sugarcane juice as the carbon source are compared with the base case method, steam methane reforming (SMR), on the basis of net energy ratio, energy efficiency and greenhouse gas (GHG) emissions. It was found that when by-products are not considered, the efficiencies of biological hydrogen processes are lower than that of SMR. However, these processes reduce GHG emissions and non-renewable energy use by 57-73% and 65-79%, respectively, compared to the SMR process. Efficiencies of biohydrogen processes increase significantly when by-products are considered; hence by-product removal and utilization is an important issue in biological hydrogen production. (author)
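
    A sketch of the net-energy-ratio bookkeeping behind such a comparison, with NER defined as energy delivered (hydrogen plus any by-product credit) per unit of non-renewable energy input; the per-process numbers below are placeholders, not the paper's results (only the ~120 MJ/kg lower heating value of hydrogen is a standard figure):

```python
# Net energy ratio: delivered energy per unit of non-renewable input energy.
def net_energy_ratio(h2_energy_mj, byproduct_credit_mj, nonrenewable_input_mj):
    return (h2_energy_mj + byproduct_credit_mj) / nonrenewable_input_mj

processes = {  # hypothetical figures per kg H2: (H2 LHV, by-product credit, fossil input)
    "dark fermentation": (120, 30, 90),
    "two-stage (dark + photo)": (120, 10, 70),
    "steam methane reforming": (120, 0, 190),
}
for name, (h2, credit, fossil) in processes.items():
    print(f"{name}: NER = {net_energy_ratio(h2, credit, fossil):.2f}")
```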

  9. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing, with various algorithms scaled up using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences to create large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  10. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  11. Evaluation of process parameters in the industrial scale production of fish nuggets

    Directory of Open Access Journals (Sweden)

    Adriane da Silva

    2011-06-01

    Full Text Available This work reports the use of experimental design to assess the effects of process parameters on the production of fish nuggets in an industrial-scale environment. The effect of the independent factors on the physicochemical and microbiological parameters was investigated through a full 2⁴ factorial experimental design. The studied factors included the temperature of the fish fillet and pulp in the mixer, the temperature of the added fat, the temperature of the water, and the ratio of protein extraction time to emulsion time. The physicochemical analyses showed that the higher the temperature of the pulp and fish fillet, the lower the protein content in the final product. Microbiological analyses revealed that the counts of coagulase-positive Staphylococcus and of total and thermotolerant coliforms were in accordance with the current legislation.
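
    A full 2⁴ factorial design simply enumerates all 16 combinations of the four factors at low (-1) and high (+1) levels; the factor names follow the abstract, and the actual temperature levels used in the study are not reproduced here:

```python
# Generate the run matrix of a full 2^4 factorial design.
from itertools import product

factors = ["pulp/fillet temp", "fat temp", "water temp", "extraction/emulsion ratio"]
design = list(product([-1, +1], repeat=len(factors)))

print(f"{len(design)} runs")           # 2**4 = 16
for run, levels in enumerate(design, 1):
    print(run, dict(zip(factors, levels)))
```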

  12. Large transverse momentum hadronic processes

    International Nuclear Information System (INIS)

    Darriulat, P.

    1977-01-01

    The possible relations between deep inelastic leptoproduction and large transverse momentum (p_t) processes in hadronic collisions are usually considered in the framework of the quark-parton picture. Experiments observing the structure of the final state in proton-proton collisions producing at least one large transverse momentum particle have led to the following conclusions: a large fraction of the produced particles are unaffected by the large p_t process; the other products are correlated with the large p_t particle. Depending upon the sign of the scalar product with the large p_t particle's momentum, they can be separated into two groups of 'towards-movers' and 'away-movers'. The experimental evidence favouring such a picture is reviewed, and the properties of each of the three groups (underlying normal event, towards-movers and away-movers) are discussed. Some phenomenological interpretations are presented. The exact nature of away- and towards-movers must be further investigated; their apparent jet structure has to be confirmed. Angular correlations between leading away- and towards-movers are very informative. Quantum number flow, both within the set of away- and towards-movers, and between it and the underlying normal event, is predicted to behave very differently in different models
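
    The towards/away split described above reduces to the sign of a scalar product with the momentum of the large-p_t (trigger) particle; the transverse momenta below are invented for illustration:

```python
# Classify secondaries as towards- or away-movers by the sign of their
# transverse scalar product with the trigger particle's momentum.
import numpy as np

trigger = np.array([3.0, 0.0])  # large-p_t trigger particle, transverse plane
secondaries = np.array([[1.2, 0.4], [-0.8, 0.1], [0.5, -1.5], [-2.0, -0.3]])

dots = secondaries @ trigger
towards = secondaries[dots > 0]
away = secondaries[dots < 0]
print(f"{len(towards)} towards-movers, {len(away)} away-movers")
```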

  13. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large scale biogas plants in Europe treating manure is given. 83 plants are in operation at present; of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs, and various transport equipment is used. Most large scale digesters are CSTRs, but serial, contact, two-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%); thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion in 28% of the centralised plants. Mixing of the digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably: in some cases, combined heat and power stations supply the grid and district heating systems; other plants use only the electricity or only the heat. (au)

  14. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  15. Development and scale-up of the production process of NovoCell fuel cells; Desenvolvimento e 'scale-up' do processo de producao de celulas a combustivel NovoCell

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, Dayse Caldas de; Souza, Adler de; Ferreira, Valdemar Stelita [NovoCell Sistemas de Energia S.A., Santa Barbara D' Oeste, SP (Brazil)]. E-mail: dayse.azevedo@novocell.ind.br

    2008-07-01

    Fuel cells have the potential to replace internal combustion engines in vehicles and to supply energy for stationary use. This potential, however, is not yet reflected in their introduction into the market with regular production lines, because of their high cost and the lack of criteria demonstrating their reliability and durability. These issues are the main goals of fuel cell development programs worldwide. NovoCell is a Brazilian company whose objective is to develop and produce hydrogen/air fuel cells for stationary generation. The whole project is guided by the use of technologies, processes and materials that allow large-scale production at a competitive cost, supporting a continuous program of product innovation and development. In this work the technological solutions developed by the company are presented. (author)

  16. Cod Gadus morhua and climate change: processes, productivity and prediction

    DEFF Research Database (Denmark)

    Brander, Keith

    2010-01-01

    Environmental factors act on individual fishes directly and indirectly. The direct effects on rates and behaviour can be studied experimentally and in the field, particularly with the advent of ever smarter tags for tracking fishes and their environment. Indirect effects due to changes in food, predators, parasites and diseases are much more difficult to estimate and predict. Climate can affect all life-history stages through direct and indirect processes, and although the consequences in terms of growth, survival and reproductive output can be monitored, it is often difficult to determine the causes. Investigation of cod Gadus morhua populations across the whole North Atlantic Ocean has shown large-scale patterns of change in productivity due to lower individual growth and condition, caused by large-scale climate forcing. If a population is being heavily exploited then a drop in productivity...

  17. Parallelizing Gene Expression Programming Algorithm in Enabling Large-Scale Classification

    Directory of Open Access Journals (Sweden)

    Lixiong Xu

    2017-01-01

    Full Text Available As one of the most effective function mining algorithms, the Gene Expression Programming (GEP) algorithm has been widely used in classification, pattern recognition, prediction, and other research fields. Based on self-evolution, GEP is able to mine an optimal function for dealing with further complicated tasks. However, in big data research, GEP encounters a low-efficiency issue due to its long mining process. To improve the efficiency of GEP in big data research, especially for processing large-scale classification tasks, this paper presents a parallelized GEP algorithm using the MapReduce computing model. The experimental results show that the presented algorithm is scalable and efficient for processing large-scale classification tasks.
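
    The standard way to parallelize fitness evaluation in evolutionary algorithms such as GEP with MapReduce is to partition the training data, let mappers compute partial fitness for every candidate on their partition, and let a reducer sum the partials. The sketch below uses threads and trivial threshold classifiers as stand-in candidates; it is not a full GEP implementation:

```python
# Map: partial fitness per data partition. Reduce: sum partials per candidate.
from multiprocessing.dummy import Pool

candidates = [lambda x, t=t: x > t for t in (0.2, 0.5, 0.8)]  # toy candidates
data = [(0.1, False), (0.4, False), (0.6, True), (0.9, True)] * 1000

def mapper(partition):
    # Partial fitness: number of correct predictions per candidate.
    return [sum(c(x) == y for x, y in partition) for c in candidates]

def reducer(partials):
    return [sum(p) for p in zip(*partials)]

chunks = [data[i::4] for i in range(4)]          # 4 partitions
with Pool(4) as pool:
    fitness = reducer(pool.map(mapper, chunks))
best = max(range(len(candidates)), key=fitness.__getitem__)
print(f"fitness per candidate: {fitness}, best threshold index: {best}")
```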

  18. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation to approach European hydrology with respect to observed patterns on large scales and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also make it possible to detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale ...

  19. Engineering microbial cell factories for the production of plant natural products: from design principles to industrial-scale production.

    Science.gov (United States)

    Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng

    2017-07-19

    Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with a huge commercial value on the global market. However, most of these PNPs are still extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are produced by microbial cell factories at an industrial scale, and there are still many challenges to their large-scale application. One challenge is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.

  20. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  1. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background. Large-scale molecular evolutionary analyses of protein-coding sequences require a number of preparatory inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods. We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results. We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion. Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.
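
    For context, the codeML step that VESPA automates is configured through a plain-text control file. The sketch below shows one minimal way to generate such a file and invoke PAML's codeml from Python; the option subset, file names and directory layout are illustrative assumptions, and this is not VESPA's own code.

        # Illustrative sketch (not VESPA source code): build a minimal codeml
        # control file and run PAML's codeml on one gene family.
        import subprocess
        from pathlib import Path

        def make_ctl(aln, tree, out):
            lines = [
                f"seqfile = {aln}",
                f"treefile = {tree}",
                f"outfile = {out}",
                "seqtype = 1        * codon sequences",
                "model = 0          * one omega ratio across branches",
                "NSsites = 0 1 2    * site models M0, M1a, M2a",
                "CodonFreq = 2      * F3x4 codon frequencies",
            ]
            return "\n".join(lines) + "\n"

        def run_codeml(aln, tree, workdir="codeml_run"):
            wd = Path(workdir)
            wd.mkdir(exist_ok=True)
            (wd / "codeml.ctl").write_text(make_ctl(aln, tree, "results.txt"))
            subprocess.run(["codeml", "codeml.ctl"], cwd=wd, check=True)
            return wd / "results.txt"

        # Example (assuming the alignment and tree files exist):
        # results = run_codeml("family42.phy", "family42.nwk")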

  2. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10⁵ mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  3. Large-scale production of bioenergy by the side of fuel-peat; Bioenergian suurtuotanto polttoturpeen rinnalla

    Energy Technology Data Exchange (ETDEWEB)

    Heikkilae, K. [Vapo Oy, Jyvaeskylae (Finland)

    1996-12-31

    The objective of the project was to clarify the possibilities for large-scale bioenergy production and the structure of its costs, and to develop operational methods so that smaller volumes of biomass are integrated into the prevailing peat production and delivery, with peat ensuring the quality, price and reliability of the fuel supply. In this way the same organisation, machinery and volumes can be utilized. The operation is designed to be year-round so that profitability can be improved. Another aim is to bring currently unutilized wood wastes into use, which would also serve silvicultural purposes. Utilizable municipal and other wastes and sludges could be included in the biomass and, with proper mixing ratios, biofuels could be made precisely suitable for the customer's purposes. In grain-growing areas straw can be utilized, and at the seaside reed grass.

  4. Understory fern community structure, growth and spore production responses to a large-scale hurricane experiment in a Puerto Rico rainforest

    Science.gov (United States)

    Joanne M. Sharpe; Aaron B. Shiels

    2014-01-01

    Ferns are abundant in most rainforest understories, yet their responses to hurricanes have not been well studied. Fern community structure, growth and spore production were monitored for two years before and five years after a large-scale experiment that simulated two key components of severe hurricane disturbance: canopy openness and debris deposition. The canopy was...

  5. Large-scale bioenergy production from soybeans and switchgrass in Argentina: Part A: Potential and economic feasibility for national and international markets

    NARCIS (Netherlands)

    van Dam, J.; Faaij, A.P.C.; Hilbert, J.; Petruzzi, H.; Turkenburg, W.C.

    2009-01-01

    This study focuses on the economic feasibility for large-scale biomass production from soybeans or switchgrass from a region in Argentina. This is determined, firstly, by estimating whether the potential supply of biomass, when food and feed demand are met, is sufficient under different scenarios to

  6. Continuous downstream processing for high value biological products: A Review.

    Science.gov (United States)

    Zydney, Andrew L

    2016-03-01

    There is growing interest in the possibility of developing truly continuous processes for the large-scale production of high value biological products. Continuous processing has the potential to provide significant reductions in cost and facility size while improving product quality and facilitating the design of flexible multi-product manufacturing facilities. This paper reviews the current state-of-the-art in separations technology suitable for continuous downstream bioprocessing, focusing on unit operations that would be most appropriate for the production of secreted proteins like monoclonal antibodies. This includes cell separation/recycle from the perfusion bioreactor, initial product recovery (capture), product purification (polishing), and formulation. Of particular importance are the available options, and alternatives, for continuous chromatographic separations. Although there are still significant challenges in developing integrated continuous bioprocesses, recent technological advances have provided process developers with a number of attractive options for development of truly continuous bioprocessing operations. © 2015 Wiley Periodicals, Inc.

  7. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large-scale office building to reach ultra-high-efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment selected for use.

  8. Large-Scale Reactive Atomistic Simulation of Shock-induced Initiation Processes in Energetic Materials

    Science.gov (United States)

    Thompson, Aidan

    2013-06-01

    Initiation in energetic materials is fundamentally dependent on the interaction between a host of complex chemical and mechanical processes, occurring on scales ranging from intramolecular vibrations through molecular crystal plasticity up to hydrodynamic phenomena at the mesoscale. A variety of methods (e.g. quantum electronic structure methods (QM), non-reactive classical molecular dynamics (MD), mesoscopic continuum mechanics) exist to study processes occurring on each of these scales in isolation, but cannot describe how these processes interact with each other. In contrast, the ReaxFF reactive force field, implemented in the LAMMPS parallel MD code, allows us to routinely perform multimillion-atom reactive MD simulations of shock-induced initiation in a variety of energetic materials. This is done either by explicitly driving a shock-wave through the structure (NEMD) or by imposing thermodynamic constraints on the collective dynamics of the simulation cell e.g. using the Multiscale Shock Technique (MSST). These MD simulations allow us to directly observe how energy is transferred from the shockwave into other processes, including intramolecular vibrational modes, plastic deformation of the crystal, and hydrodynamic jetting at interfaces. These processes in turn cause thermal excitation of chemical bonds leading to initial chemical reactions, and ultimately to exothermic formation of product species. Results will be presented on the application of this approach to several important energetic materials, including pentaerythritol tetranitrate (PETN) and ammonium nitrate/fuel oil (ANFO). In both cases, we validate the ReaxFF parameterizations against QM and experimental data. For PETN, we observe initiation occurring via different chemical pathways, depending on the shock direction. For PETN containing spherical voids, we observe enhanced sensitivity due to jetting, void collapse, and hotspot formation, with sensitivity increasing with void size. For ANFO, we

  9. Development of large scale production of Nd-doped phosphate glasses for megajoule-scale laser systems

    International Nuclear Information System (INIS)

    Ficini, G.; Campbell, J.H.

    1996-01-01

    Nd-doped phosphate glasses are the preferred gain medium for high-peak-power lasers used for Inertial Confinement Fusion research because they have excellent energy storage and extraction characteristics. In addition, these glasses can be manufactured defect-free in large sizes and at relatively low cost. To meet the requirements of future megajoule-size lasers, advanced laser glass manufacturing methods are being developed that would enable laser glass to be continuously produced at the rate of several thousand large (790 x 440 x 44 mm³) plates of glass per year. This represents more than a 10- to 100-fold improvement in the scale of the present manufacturing technology.

  10. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  11. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability and provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how and to what extent the trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea-level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) at a monthly time-step. This approach ...

  12. Rain forest nutrient cycling and productivity in response to large-scale litter manipulation.

    Science.gov (United States)

    Wood, Tana E; Lawrence, Deborah; Clark, Deborah A; Chazdon, Robin L

    2009-01-01

    Litter-induced pulses of nutrient availability could play an important role in the productivity and nutrient cycling of forested ecosystems, especially tropical forests. Tropical forests experience such pulses as a result of wet-dry seasonality and during major climatic events, such as strong El Niños. We hypothesized that (1) an increase in the quantity and quality of litter inputs would stimulate leaf litter production, woody growth, and leaf litter nutrient cycling, and (2) the timing and magnitude of this response would be influenced by soil fertility and forest age. To test these hypotheses in a Costa Rican wet tropical forest, we established a large-scale litter manipulation experiment in two secondary forest sites and four old-growth forest sites of differing soil fertility. In replicated plots at each site, leaves and twigs were removed from, or added to, the forest floor. We analyzed leaf litter mass, [N] and [P], and N and P inputs for addition, removal, and control plots over a two-year period. We also evaluated the basal area increment of trees in removal and addition plots. There was no response of forest productivity or nutrient cycling to litter removal; however, litter addition significantly increased leaf litter production and N and P inputs 4-5 months following litter application. Litter production increased by as much as 92%, and P and N inputs by as much as 85% and 156%, respectively. In contrast, litter manipulation had no significant effect on woody growth. The increases in leaf litter production and N and P inputs were significantly positively related to the total P applied in litter form. Neither litter treatment nor forest type influenced the temporal pattern of any of the variables measured. Thus, environmental factors such as rainfall drive the temporal variability of litter and nutrient inputs, while nutrient release from decomposing litter influences the magnitude. Seasonal or annual variation in leaf litter mass, such as occurs in strong El Niño events, could positively ...

  13. Scale down of the inactivated polio vaccine production process

    NARCIS (Netherlands)

    Thomassen, Y.E.; Oever, van 't R.; Vinke, C.M.; Spiekstra, A.; Wijffels, R.H.; Pol, van der L.A.; Bakker, W.A.M.

    2013-01-01

    The anticipated increase in the demand for inactivated polio vaccines resulting from the success of the polio eradication program requires an increase in production capacity and a reduction in the cost price of the current inactivated polio vaccine production processes. Improvement of existing production ...

  14. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  15. Inverse problem to constrain the controlling parameters of large-scale heat transport processes: The Tiberias Basin example

    Science.gov (United States)

    Goretzki, Nora; Inbar, Nimrod; Siebert, Christian; Möller, Peter; Rosenthal, Eliyahu; Schneider, Michael; Magri, Fabien

    2015-04-01

    Salty and thermal springs exist along the lakeshore of the Sea of Galilee, which covers most of the Tiberias Basin (TB) in the northern Jordan-Dead Sea Transform, Israel/Jordan. As it is the only freshwater reservoir of the entire area, it is important to study the salinisation processes that pollute the lake. Simulations of thermohaline flow along a 35 km NW-SE profile show that meteoric water and relic brines are flushed by the regional flow from the surrounding heights and by thermally induced groundwater flow within the faults (Magri et al., 2015). Several model runs with trial and error were necessary to calibrate the hydraulic conductivity of both faults and major aquifers in order to fit temperature logs and spring salinity. It turned out that the hydraulic conductivity of the faults ranges between 30 and 140 m/yr, whereas the hydraulic conductivity of the Upper Cenomanian aquifer is as high as 200 m/yr. However, large-scale transport processes also depend on other physical parameters, such as thermal conductivity, porosity and the fluid thermal expansion coefficient, which are hardly known. Here, inverse problems (IP) are solved along the NW-SE profile to better constrain the physical parameters (a) hydraulic conductivity, (b) thermal conductivity and (c) thermal expansion coefficient. The PEST code (Doherty, 2010) is applied via the graphical interface FePEST in FEFLOW (Diersch, 2014). The results show that both thermal and hydraulic conductivity are consistent with the values determined by the trial-and-error calibrations. Besides being an automatic approach that speeds up the calibration process, the IP covers a wide range of parameter values, providing additional solutions not found with the trial-and-error method. Our study shows that geothermal systems like the TB are more comprehensively understood when inverse models are applied to constrain coupled fluid flow processes over large spatial scales. References: Diersch, H.-J.G., 2014. FEFLOW: Finite Element Modeling of Flow, Mass and Heat Transport in Porous and Fractured Media. Springer, Berlin.
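
    The calibration described here amounts to minimizing the misfit between simulated and observed quantities over a few physical parameters. The sketch below shows that loop generically; the analytic forward_model is a toy stand-in for a FEFLOW-type simulation, not an interface to PEST or FEFLOW.

        # Generic parameter-estimation sketch (not PEST or FEFLOW code): fit
        # hydraulic conductivity K, thermal conductivity k_t and the thermal
        # expansion coefficient beta by least squares against observations.
        import numpy as np
        from scipy.optimize import least_squares

        depths = np.linspace(0.0, 2000.0, 20)          # observation depths [m]

        def forward_model(params, z):
            K, k_t, beta = params
            # Toy response: a smooth temperature profile shaped by the parameters.
            return 20.0 + 0.03 * z * k_t - 1e4 * beta * np.tanh(z / (50.0 * K))

        rng = np.random.default_rng(1)
        true_params = np.array([60.0, 2.0, 2e-4])
        observed = forward_model(true_params, depths) + rng.normal(0.0, 0.2, depths.size)

        def residuals(params):
            return forward_model(params, depths) - observed

        fit = least_squares(residuals, x0=[30.0, 1.0, 1e-4],
                            bounds=([1.0, 0.1, 1e-5], [200.0, 5.0, 1e-3]))
        print(fit.x)    # estimated parameters, to compare with true_params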

  16. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    ... A case study was carried out of the change process related to the implementation of organic foods in large-scale foodservice facilities in Greater Copenhagen county, in order to study the effects of such a change. Based on the findings, a set of guidelines has been developed for the successful implementation of organic foods ...

  17. Importance of regional species pools and functional traits in colonization processes: predicting re-colonization after large-scale destruction of ecosystems

    NARCIS (Netherlands)

    Kirmer, A.; Tischew, S.; Ozinga, W.A.; Lampe, von M.; Baasch, A.; Groenendael, van J.M.

    2008-01-01

    Large-scale destruction of ecosystems caused by surface mining provides an opportunity for the study of colonization processes starting with primary succession. Surprisingly, over several decades and without any restoration measures, most of these sites spontaneously developed into valuable biotopes ...

  18. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high-quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large-scale modelling of LWFA scenarios. (paper)

  19. Formation and fate of marine snow: small-scale processes with large- scale implications

    Directory of Open Access Journals (Sweden)

    Thomas Kiørboe

    2001-12-01

    Marine snow aggregates are believed to be the main vehicles for vertical material transport in the ocean. However, aggregates are also sites of elevated heterotrophic activity, which may instead cause enhanced retention of aggregated material in the upper ocean. Small-scale biological-physical interactions govern the formation and fate of marine snow. Aggregates may form by physical coagulation: fluid motion causes collisions between small primary particles (e.g. phytoplankton) that may then stick together to form aggregates with enhanced sinking velocities. Bacteria may subsequently solubilise and remineralise aggregated particles. Because the solubilisation rate exceeds the remineralisation rate, organic solutes leak out of sinking aggregates. The leaking solutes spread by diffusion and advection and form a chemical trail in the wake of the sinking aggregate that may guide small zooplankters to the aggregate. Also, suspended bacteria may enjoy the elevated concentration of organic solutes in the plume. I explore these small-scale formation and degradation processes by means of models, experiments and field observations. The larger-scale implications of export vs. retention of material for the structure and functioning of pelagic food chains are discussed.
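
    The physical coagulation described here is classically modelled by a Smoluchowski-type equation. As a reference point (standard coagulation theory, not an equation quoted from this record), the number concentration n_i of particles in size class i evolves as

        \frac{dn_i}{dt} = \frac{\alpha}{2} \sum_{j+k=i} \beta(j,k)\, n_j\, n_k \;-\; \alpha\, n_i \sum_{j} \beta(i,j)\, n_j ,

    where \beta(i,j) is the collision kernel set by fluid shear, differential settling and Brownian motion, and \alpha is the stickiness, i.e. the probability that a collision leads to attachment.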

  20. Development of small-scale peat production; Pienturvetuotannon kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Erkkilae, A.; Kallio, E. [VTT Energy, Jyvaeskylae (Finland)

    1997-12-01

    The aim of the project is to develop the production conditions, methods and technology of small-scale peat production to such a level that productivity is improved and competitiveness maintained. The aim in 1996 was to survey the present status of small-scale peat production and its research and development needs, and to prepare a development plan for small-scale peat production for a continued project in 1997 and for the longer term. A questionnaire was sent to producers by mail, and its results were complemented by phone interviews. Responses were obtained from 164 producers, i.e. from about 75 - 85 % of small-scale peat producers. The quantity of energy peat produced by these amounted to 3.3 TWh and that of other peat to 265 000 m³. The total production of energy peat (large-scale producers Vapo Oy and Turveruukki Oy included) amounted to 25.0 TWh in 1996 in Finland, of which 91 % (22.8 TWh) was milled peat and 9 % (2.2 TWh) was sod peat. The total production of peat other than energy peat amounted to 1.4 million m³. The proportion of small-scale production was 13 % of energy peat, 11 % of milled peat and 38 % of sod peat. The proportion of small-scale producers was 18 % of other peat production. The results deviate clearly from those obtained in a study of small-scale production in the 1980s. The amount of small-scale production is clearly larger than generally assessed. Small-scale production focuses more on milled peat than on sod peat. The work will be continued in 1997. Based on the development needs identified in the questionnaire, the aim is to reduce environmental impacts and runoff effluents from small-scale production, to increase the efficiency of peat deliveries and to reduce peat production costs by improving the service value of machines through increased co-operative use. (orig.)

  1. Probing high scale physics with top quarks at the Large Hadron Collider

    Science.gov (United States)

    Dong, Zhe

    With the Large Hadron Collider (LHC) running at the TeV scale, we expect to find deviations from the Standard Model in the experiments, and to understand the origin of these deviations. Being the heaviest elementary particle observed so far in experiments, with a mass at the electroweak scale, the top quark is a powerful probe for new phenomena of high-scale physics at the LHC. Therefore, we concentrate on studying high-scale physics phenomena with top-quark pair production or decay at the LHC. In this thesis, we study the discovery potential of string resonances decaying to the t/tbar final state, and examine the possibility of observing baryon-number-violating top-quark production or decay, at the LHC. We point out that string resonances for a string scale below 4 TeV can be detected via the t/tbar channel, by reconstructing center-of-mass frame kinematics of the resonances from either the t/tbar semi-leptonic decay or recent techniques for identifying highly boosted tops. For the study of baryon-number-violating processes, using a model-independent effective approach and focusing on operators with minimal mass dimension, we find that the corresponding effective coefficients could be directly probed at the LHC already with an integrated luminosity of 1 inverse femtobarn at 7 TeV, and further constrained with 30 (100) inverse femtobarns at 7 (14) TeV.

  2. Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process

    Science.gov (United States)

    Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue

    2014-04-01

    The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable towards large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low- and high humidity levels as a result of the highly aligned and overlapping arrangement of nanoplatelets. This work shows that the large-scale ordering of high aspect ratio nanoplatelets is easier to achieve than previously thought and may have implications in the technological applications for similar materials.

  3. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet; more recent ones are the data centers of cloud environments. Management tasks such as traffic monitoring, security and performance optimization therefore place a heavy burden on network administrators. This research studies the different protocols involved, i.e. conventional protocols like the Simple Network Management Protocol and newer gossip-based ...
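
    As a minimal illustration of the gossip idea contrasted here with SNMP-style centralized polling (my own sketch, not drawn from the paper), nodes can disseminate monitoring state by repeatedly averaging with random peers:

        # Toy push-pull gossip averaging: in each round every node pairs with a
        # random peer and both adopt the average of their values. All estimates
        # converge to the global mean, so a network-wide metric can be learned
        # without a central manager polling every node.
        import random

        def gossip_round(values):
            order = list(range(len(values)))
            random.shuffle(order)
            for i in order:
                j = random.randrange(len(values))   # random peer (may be itself)
                avg = (values[i] + values[j]) / 2.0
                values[i] = values[j] = avg

        random.seed(0)
        loads = [random.uniform(0, 100) for _ in range(1000)]  # per-node metric
        true_mean = sum(loads) / len(loads)
        for _ in range(30):
            gossip_round(loads)
        print(f"true mean={true_mean:.3f} node-0 estimate={loads[0]:.3f}")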

  4. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field-effect transistors have been emerging as ultrasensitive biosensors that offer label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly fabrication technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field-effect transistors using a standard microelectronics process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  5. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
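
    Schematically, a model of the kind described combines the two terms as

        \tau_{ij}^{\mathrm{mod}} = -2\, \nu_e\, \bar{S}_{ij} \;+\; c_{nl}\, \delta^2 \left( \bar{S}_{ik}\, \bar{\Omega}_{kj} - \bar{\Omega}_{ik}\, \bar{S}_{kj} \right),

    where the first, eddy-viscosity term is purely dissipative and the second, tensorially nonlinear term can redistribute energy without dissipating it. Here \bar{S}_{ij} and \bar{\Omega}_{ij} are the resolved rate-of-strain and rate-of-rotation tensors, \nu_e is an eddy viscosity, \delta is the filter width and c_{nl} is a model coefficient. This display is a generic illustration of the structure described in the abstract; the paper's specific closure may differ.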

  6. Expanded Large-Scale Forcing Properties Derived from the Multiscale Data Assimilation System and Its Application to Single-Column Models

    Science.gov (United States)

    Feng, S.; Li, Z.; Liu, Y.; Lin, W.; Toto, T.; Vogelmann, A. M.; Fridlind, A. M.

    2013-12-01

    We present an approach to derive the large-scale forcing that is used to drive single-column models (SCMs) and cloud-resolving models (CRMs)/large-eddy simulations (LES) for evaluating fast-physics parameterizations in climate models. The forcing fields are derived by use of a newly developed multi-scale data assimilation (MS-DA) system. This DA system is developed on top of the NCEP Gridpoint Statistical Interpolation (GSI) System and is implemented in the Weather Research and Forecasting (WRF) model at a cloud-resolving resolution of 2 km. This approach has been applied to the generation of large-scale forcing for a set of Intensive Operation Periods (IOPs) over the Atmospheric Radiation Measurement (ARM) Climate Research Facility's Southern Great Plains (SGP) site. The dense ARM in-situ observations and high-resolution satellite data effectively constrain the WRF model. The evaluation shows that the derived forcing displays accuracies comparable to the existing continuous forcing product and, overall, a better dynamic consistency with observed cloud and precipitation. One important application of this approach is to derive large-scale hydrometeor forcing and multiscale forcing, which is not provided in the existing continuous forcing product. It is shown that the hydrometeor forcing has an appreciable impact on cloud and precipitation fields in the single-column model simulations. The large-scale forcing exhibits a significant dependency on the domain size that represents SCM grid sizes. Subgrid processes often contribute a significant component to the large-scale forcing, and this contribution is sensitive to the grid size and cloud regime.

  7. Linking soil DOC production rates and transport processes from landscapes to sub-basin scales

    Science.gov (United States)

    Tian, Y. Q.; Yu, Q.; Li, J.; Ye, C.

    2014-12-01

    Recent research rejects the traditional perspective that the dissolved organic carbon (DOC) component of the global carbon cycle is simply trivial; in fact, evidence demonstrates that lakes likely mediate carbon dynamics on a global scale. Riverine and estuarine carbon fluxes play a critical role in transporting and recycling carbon and nutrients, not only within watersheds but in their receiving waters. However, the underlying mechanisms that drive carbon fluxes, from land to rivers, lakes and oceans, remain poorly understood. This presentation will report research results on the scale-dependent DOC production rate in coastal watersheds and DOC transport processes in estuarine regions. We conducted a series of controlled experiments and field measurements examining the biogeochemical, biological, and geospatial variables that regulate downstream processing of globally relevant carbon fluxes. Results showed that increased temperature and soil moisture accelerate decomposition rates of organic matter, with significant variations between vegetation types. Measurements at the mesoscale-ecosystem level demonstrated a good correlation with the bulk concentration of DOC monitored in receiving waters at the outlets of sub-basins (R2 > 0.65). These field and experimental measurements improved the model of daily carbon exports through below-ground processes as a function of the organic matter content of surface soils, forest litter supply, and temperature. The study demonstrated a potential improvement in modeling the co-variance of CDOM and DOC with their unique terrestrial sources. This improvement shows significant promise for monitoring riverine and estuarine carbon flux from satellite images. The technical innovations include deployments of 1) mini-ecosystems (mesocosms) with soil as replicate controlled experiments for DOC production and leaching rates, and 2) aquatic mesocosms for co-variances of DOC and CDOM endmembers, and an instrumented incubation experiment for ...

  8. Development of Industrial-Scale Fission 99Mo Production Process Using Low Enriched Uranium Target

    Directory of Open Access Journals (Sweden)

    Seung-Kon Lee

    2016-06-01

    Molybdenum-99 (99Mo) is the most important isotope because its daughter isotope, technetium-99m (99mTc), has been the most widely used medical radioisotope for more than 50 years, accounting for > 80% of total nuclear diagnostics worldwide. In this review, radiochemical routes for the production of 99Mo, and the aspects of selecting a suitable process strategy, are discussed from the historical viewpoint of 99Mo technology developments. Most industrial-scale 99Mo processes have been based on the fission of 235U. Recently, important issues have been raised concerning the conversion of fission 99Mo targets from highly enriched uranium to low enriched uranium (LEU). The development of new LEU targets with higher density was requested to compensate for the loss of 99Mo yield caused by the significant reduction of 235U enrichment in the conversion. As a dramatic increase in intermediate-level liquid waste is also expected from the conversion, an effective strategy to reduce waste generation from fission 99Mo production is required. The mitigation of radioxenon emission from medical radioisotope production facilities is discussed in relation to the monitoring of nuclear explosions and the comprehensive nuclear test ban. Lastly, the 99Mo production process paired with the Korea Atomic Energy Research Institute's own LEU target is proposed as one of the most suitable processes for the LEU target.

  9. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    ... The problem is formulated as a centralized large-scale optimization problem, but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary ...
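
    A common way to realize this kind of decomposition is dual decomposition, sketched below under simplifying assumptions of my own (quadratic local costs, a single balance constraint), not as the paper's algorithm: an aggregator iteratively adjusts a price signal, each unit solves its small local problem given the price, and the price update drives total production toward the target.

        # Dual-decomposition sketch for power balancing (illustrative
        # assumptions: quadratic local costs c_i * p_i^2 with capacity limits).
        # The aggregator adjusts a price by subgradient steps until the total
        # production of the units meets the balance target.
        costs = [1.0, 2.0, 0.5, 1.5, 3.0]     # local cost coefficients
        p_max = [10.0, 8.0, 12.0, 6.0, 9.0]   # local capacity limits
        DEMAND = 25.0

        def local_response(lam, c, cap):
            """Each unit maximizes lam*p - c*p**2 locally: p* = lam/(2c), clipped."""
            return min(max(lam / (2.0 * c), 0.0), cap)

        lam, step = 1.0, 0.05
        production = []
        for _ in range(500):
            production = [local_response(lam, c, cap) for c, cap in zip(costs, p_max)]
            imbalance = DEMAND - sum(production)
            if abs(imbalance) < 1e-4:
                break
            lam += step * imbalance           # raise the price if short, lower if long

        print(f"price={lam:.3f} total={sum(production):.3f} per-unit={production}")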

  10. Mesoderm Lineage 3D Tissue Constructs Are Produced at Large-Scale in a 3D Stem Cell Bioprocess.

    Science.gov (United States)

    Cha, Jae Min; Mantalaris, Athanasios; Jung, Sunyoung; Ji, Yurim; Bang, Oh Young; Bae, Hojae

    2017-09-01

    Various studies have presented different approaches to directing pluripotent stem cell differentiation, such as applying defined sets of exogenous biochemical signals and genetic/epigenetic modifications. Although differentiation to target lineages can be successfully regulated, such conventional methods are often complicated, laborious, and not cost-effective enough to be employed in the large-scale production of 3D stem cell-based tissue constructs. A 3D-culture platform that can realize the large-scale production of mesoderm-lineage tissue constructs from embryonic stem cells (ESCs) is developed. ESCs are cultured using our previously established 3D-bioprocess platform, which is amenable to the mass production of 3D ESC-based tissue constructs. Hepatocarcinoma cell line conditioned medium is introduced into the large-scale 3D culture to provide a specific biomolecular microenvironment mimicking the in vivo mesoderm formation process. After a 5-day spontaneous differentiation period, the resulting 3D tissue constructs are composed of multipotent mesodermal progenitor cells, as verified by gene and molecular expression profiles. Subsequently, the optimal time points to trigger terminal differentiation towards cardiomyogenesis or osteogenesis from the mesodermal tissue constructs are found. A simple and affordable 3D ESC bioprocess that can achieve the scalable production of mesoderm-origin tissues with significantly improved corresponding tissue properties is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. In Situ Vitrification preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-01-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic-contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: 1) determine large-scale processing performance and 2) produce a waste form that can be fully evaluated, so that ISV can be assessed as a potential technique for the final disposal of transuranic-contaminated soil sites at Hanford. The test has provided the technical data for that evaluation. The LSRT was completed in June 1987 after 295 hours of operation and 460 MWh of electrical energy dissipated to the molten soil. This resulted in a minimum of a 450-t block of vitrified soil extending to a depth of 7.3 m (24 ft). The primary contaminants vitrified during the demonstration were Pu and Am transuranics, but they also included up to 26,000 ppm fluorides. Preliminary data show that their retention in the vitrified product exceeded predictions, meaning that fewer contaminants needed to be removed from the gaseous effluents by the processing equipment. The gaseous effluents were contained and treated throughout the run; that is, no radioactive or hazardous chemical releases were detected

  12. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods for large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology combined a local optimality search with global optimality determination, together with advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E³-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; an advanced constraint-handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint-handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
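
    To make the EA machinery concrete, the sketch below shows the bare evolutionary loop with a simple penalty-based constraint handler. This is a generic textbook scheme on a toy problem, not the paper's hybrid E³-ISO method.

        # Bare-bones evolutionary algorithm with penalty-based constraint
        # handling: minimize f(x) subject to g(x) <= 0, penalizing infeasibility.
        import random

        def f(x):                   # objective: a simple quadratic bowl
            return sum(v * v for v in x)

        def g(x):                   # constraint x0 + x1 >= 1, i.e. 1 - x0 - x1 <= 0
            return 1.0 - x[0] - x[1]

        def fitness(x, penalty=1e3):
            return f(x) + penalty * max(0.0, g(x)) ** 2

        random.seed(0)
        pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(40)]
        for _ in range(200):
            pop.sort(key=fitness)
            parents = pop[:10]                         # truncation selection
            children = []
            while len(children) < 30:                  # crossover + mutation
                a, b = random.sample(parents, 2)
                children.append([(u + v) / 2 + random.gauss(0, 0.1)
                                 for u, v in zip(a, b)])
            pop = parents + children
        best = min(pop, key=fitness)
        print(best, f(best), g(best))   # optimum is near x = (0.5, 0.5)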

  13. Applications of Neutron Scattering in the Chemical Industry: Proton Dynamics of Highly Dispersed Materials, Characterization of Fuel Cell Catalysts, and Catalysts from Large-Scale Chemical Processes

    Science.gov (United States)

    Albers, Peter W.; Parker, Stewart F.

    The attractiveness of neutron scattering techniques for the detailed characterization of materials of high degrees of dispersity and structural complexity, as encountered in the chemical industry, is discussed. Neutron scattering picks up where other analytical methods leave off, because the absorption behavior toward electromagnetic radiation and the electrical conductivity of finely divided products and materials cause serious problems for those methods. This is demonstrated by presenting typical applications from large-scale production technology and industrial catalysis. These include the determination of the proton-related surface chemistry of advanced materials that are used as reinforcing fillers in the manufacture of tires, where interrelations between surface chemistry, rheological properties, improved safety, and significant reduction of fuel consumption are the focus of recent developments. Neutron scattering allows surface science studies of the dissociative adsorption of hydrogen on nanodispersed, supported precious metal particles of fuel cell catalysts under in situ loading at realistic gas pressures of about 1 bar. Insight into the occupation of catalytically relevant surface sites provides valuable information about the catalyst in the working state and supplies essential scientific input for tailoring better catalysts by technologists. The impact of deactivation phenomena on industrial catalysts, through coke deposition, chemical transformation of carbonaceous deposits, and other processes in catalytic hydrogenation that significantly shorten the time of useful operation in large-scale plants, can often be traced back in detail to surface or bulk properties of the catalysts or of materials of catalytic relevance. A better understanding of avoidable or unavoidable aspects of catalyst deactivation phenomena under certain in-process conditions and the development of effective means for reducing deactivation leads to more energy ...

  14. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogenous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen ...
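
    The breakout mechanism at the heart of DisBO can be illustrated compactly. The sketch below is a generic, centralized rendering of the idea on a toy constraint problem of my own; the actual DisBO runs asynchronously across agents exchanging messages.

        # Breakout-style local search on a toy constraint problem. Constraint
        # weights start at 1 and grow whenever the search is trapped in a local
        # minimum, reshaping the cost landscape so the search can escape.
        import random

        random.seed(3)
        N = 8                                                # variables, domain {0,1,2}
        constraints = [(i, (i + 1) % N) for i in range(N)]   # ring of "not-equal"
        weights = {c: 1 for c in constraints}

        def cost(assign):
            return sum(w for (a, b), w in weights.items() if assign[a] == assign[b])

        assign = [random.randrange(3) for _ in range(N)]
        for _ in range(1000):
            if cost(assign) == 0:
                break
            improved = False
            for i in range(N):               # best single-variable change
                candidates = [assign[:i] + [v] + assign[i + 1:] for v in range(3)]
                best = min(candidates, key=cost)
                if cost(best) < cost(assign):
                    assign, improved = best, True
            if not improved:                 # trapped in a local minimum: breakout
                for (a, b) in constraints:
                    if assign[a] == assign[b]:
                        weights[(a, b)] += 1

        print(assign, cost(assign))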

  15. Lobster processing by-products as valuable bioresource of marine functional ingredients, nutraceuticals, and pharmaceuticals.

    Science.gov (United States)

    Nguyen, Trung T; Barber, Andrew R; Corbin, Kendall; Zhang, Wei

    2017-01-01

    The worldwide annual production of lobster was 165,367 tons, valued at over $3.32 billion, in 2004, but this figure rose to 304,000 tons in 2012. Over half the volume of worldwide lobster production is processed to meet the rising global demand for diversified lobster products. Lobster processing generates a large amount of by-products (heads, shells, livers, and eggs) which account for 50-70% of the starting material. Continued production of these lobster processing by-products (LPBs) without corresponding process development for their efficient utilization has led to disposal issues associated with costs and pollution. This review presents promising opportunities to maximize the utilization of LPBs by the economic recovery of their valuable components to produce high value-added products. More than 50,000 tons of LPBs are generated globally, which costs lobster processing companies upward of $7.5 million/year for disposal. This not only places financial and environmental burdens on the lobster processors but also wastes a valuable bioresource. LPBs are rich in a range of high-value compounds such as proteins, chitin, lipids, minerals, and pigments. Extracts recovered from LPBs have been demonstrated to possess several functionalities and bioactivities, which are useful for numerous applications in water treatment, agriculture, food, nutraceutical and pharmaceutical products, and biomedicine. Although LPBs have been studied for the recovery of valuable components, utilization of these materials in large-scale production is still very limited. Microwave, ultrasonic, and supercritical fluid extraction were found to be promising techniques for recovering lobster components that could be used in large-scale production. LPBs are rich in high-value compounds that are currently being underutilized. These compounds can be extracted for use as functional ingredients, nutraceuticals, and pharmaceuticals in a wide range of commercial applications.

  16. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Mesoscale circulation processes generated by landscape discontinuities (e.g., sea breezes) are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large-scale, mesoscale, and turbulent-scale components, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit mass is used, defined as Ẽ = ½⟨u′ᵢu′ᵢ⟩, where u′ᵢ represents the three Cartesian components of a mesoscale circulation, the angle brackets ⟨ ⟩ denote the grid-scale, horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for Ẽ, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of Ẽ. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes.
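
    In display form, the definition reconstructed above reads (summation over i = 1, 2, 3 implied):

        \tilde{E} = \tfrac{1}{2} \left\langle u_i'\, u_i' \right\rangle ,

    where u_i' are the mesoscale velocity perturbations, \langle \cdot \rangle is the grid-scale horizontal averaging operator of the large-scale model, and the tilde marks the corresponding large-scale mean value.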

  17. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  18. Goethite Bench-scale and Large-scale Preparation Tests

    Energy Technology Data Exchange (ETDEWEB)

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

    ferrous ion, Fe²⁺ (Fe²⁺ is oxidized to Fe³⁺) in the presence of goethite seed particles. Rhenium does not mimic that process; it is not a strong enough reducing agent to duplicate the TcO₄⁻/Fe²⁺ redox reactions. Laboratory tests conducted in parallel with these scaled tests identified modifications to the liquid chemistry necessary to reduce ReO₄⁻ and capture rhenium in the solids at levels similar to those achieved by Um (2010) for inclusion of Tc into goethite. By implementing these changes, Re was incorporated into Fe-rich solids for testing at VSL. The changes also altered the phase of iron in the slurry product: rather than forming goethite (α-FeOOH), the process produced magnetite (Fe₃O₄). Magnetite was considered by Pacific Northwest National Laboratory (PNNL) and VSL to probably be a better product for improving Re retention in the melter because it decomposes at a higher temperature than goethite (1538 °C vs. 136 °C). The feasibility tests at VSL were conducted using Re-rich magnetite. The tests did not indicate improved retention of Re in the glass during vitrification, but they did indicate an improved melting rate (+60%), which could have a significant impact on HLW processing. It is still to be shown whether the Re is in solid solution in the magnetite, as ⁹⁹Tc was determined to be in goethite.

  19. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  20. Multi-scale modeling for sustainable chemical production.

    Science.gov (United States)

    Zhuang, Kai; Bakshi, Bhavik R; Herrgård, Markus J

    2013-09-01

    With recent advances in metabolic engineering, it is now technically possible to produce a wide portfolio of existing petrochemical products from biomass feedstock. In recent years, a number of modeling approaches have been developed to support the engineering and decision-making processes associated with the development and implementation of a sustainable biochemical industry. The temporal and spatial scales of modeling approaches for sustainable chemical production vary greatly, ranging from metabolic models that aid the design of fermentative microbial strains to material and monetary flow models that explore the ecological impacts of all economic activities. Research efforts that attempt to connect the models at different scales have been limited. Here, we review a number of existing modeling approaches and their applications at the scales of metabolism, bioreactor, overall process, chemical industry, economy, and ecosystem. In addition, we propose a multi-scale approach for integrating the existing models into a cohesive framework. The major benefit of this proposed framework is that the design and decision-making at each scale can be informed, guided, and constrained by simulations and predictions at every other scale. In addition, the development of this multi-scale framework would promote cohesive collaborations across multiple traditionally disconnected modeling disciplines to achieve sustainable chemical production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. LHCb: Managing Large Data Productions in LHCb

    CERN Multimedia

    Tsaregorodtsev, A

    2009-01-01

    LHC experiments produce very large volumes of data, either accumulated from the detectors or generated via Monte-Carlo modeling. The data should be processed as quickly as possible to provide users with input for their analysis. Processing multiple hundreds of terabytes of data necessitates the generation, submission, and tracking of a huge number of grid jobs running all over the Computing Grid. Manipulation of these large and complex workloads is impossible without powerful production management tools. In LHCb, the DIRAC Production Management System (PMS) is used to accomplish this task. It enables production managers and end-users to deal with all kinds of data generation, processing, and storage. Application workflow tools allow jobs to be defined as complex sequences of elementary application steps expressed as Directed Acyclic Graphs. Specialized databases and a number of dedicated software agents ensure automated, data-driven job creation and submission. The productions are accomplished by thorough ...
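
    To make the workflow idea concrete, here is a minimal, generic sketch of defining a production as a DAG of steps and deriving a valid submission order. The class and step names are hypothetical illustrations, not the actual DIRAC PMS API:

        from collections import defaultdict, deque

        class Workflow:
            """Toy production workflow: steps plus 'runs-after' dependencies.

            Hypothetical illustration of a DAG-based job definition; this is
            not the DIRAC PMS interface, which the record does not spell out.
            """
            def __init__(self):
                self.deps = defaultdict(set)   # step -> set of prerequisite steps
                self.steps = set()

            def add_step(self, name, after=()):
                self.steps.add(name)
                self.deps[name].update(after)
                self.steps.update(after)

            def submission_order(self):
                """Kahn's algorithm: a topological order in which steps can run."""
                indegree = {s: len(self.deps[s]) for s in self.steps}
                children = defaultdict(set)
                for step, prereqs in self.deps.items():
                    for p in prereqs:
                        children[p].add(step)
                ready = deque(s for s in self.steps if indegree[s] == 0)
                order = []
                while ready:
                    s = ready.popleft()
                    order.append(s)
                    for c in children[s]:
                        indegree[c] -= 1
                        if indegree[c] == 0:
                            ready.append(c)
                if len(order) != len(self.steps):
                    raise ValueError("workflow contains a cycle; not a DAG")
                return order

        wf = Workflow()
        wf.add_step("simulate")
        wf.add_step("digitize", after=["simulate"])
        wf.add_step("reconstruct", after=["digitize"])
        wf.add_step("merge", after=["reconstruct"])
        print(wf.submission_order())  # e.g. ['simulate', 'digitize', 'reconstruct', 'merge']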

  2. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  3. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks that restrict its practical applications: (1) a slow training process, and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning in RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the abundant training samples of large-scale datasets and propose all-features-boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques combined with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.
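
    For orientation, a minimal single-machine sketch of the Discrete AdaBoost loop that DAB-RVM builds on is given below; a decision stump stands in for the RVM weak learner, and the Spark distribution layer is omitted:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier  # stand-in for an RVM weak learner

        def discrete_adaboost(X, y, n_rounds=10):
            """Discrete AdaBoost with labels y in {-1, +1}.

            Sketch of the generic boosting loop only; the paper's DAB-RVM uses
            RVMs as weak learners and runs the training on Spark.
            """
            n = len(y)
            w = np.full(n, 1.0 / n)          # sample weights
            learners, alphas = [], []
            for _ in range(n_rounds):
                stump = DecisionTreeClassifier(max_depth=1)
                stump.fit(X, y, sample_weight=w)
                pred = stump.predict(X)
                err = np.sum(w * (pred != y)) / np.sum(w)
                err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against division by zero
                alpha = 0.5 * np.log((1 - err) / err)  # learner weight
                w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
                w /= w.sum()
                learners.append(stump)
                alphas.append(alpha)
            return learners, alphas

        def adaboost_predict(learners, alphas, X):
            score = sum(a * h.predict(X) for h, a in zip(learners, alphas))
            return np.sign(score)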

  4. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Mining manufacturing data for discovery of high productivity process characteristics.

    Science.gov (United States)

    Charaniya, Salim; Le, Huong; Rangwala, Huzefa; Mills, Keri; Johnson, Kevin; Karypis, George; Hu, Wei-Shou

    2010-06-01

    Modern manufacturing facilities for bioproducts are highly automated, with advanced process monitoring and data archiving systems. The time dynamics of hundreds of process parameters and outcome variables over a large number of production runs are archived in the data warehouse. This vast amount of data is a vital resource for comprehending the complex characteristics of bioprocesses and enhancing production robustness. Cell culture process data from 108 'trains', comprising production as well as inoculum bioreactors, from Genentech's manufacturing facility were investigated. Each run comprises over one hundred on-line and off-line temporal parameters. A kernel-based approach combined with a maximum-margin support vector regression algorithm was used to integrate all the process parameters and develop predictive models for a key cell culture performance parameter. The model was also used to identify and rank process parameters according to their relevance in predicting process outcome. Evaluation of cell culture stage-specific models indicates that production performance can be reliably predicted days prior to harvest. Strong associations between several temporal parameters at various manufacturing stages and the final process outcome were uncovered. This model-based data mining represents an important step forward in establishing process data-driven knowledge discovery in bioprocesses. Implementation of this methodology on the manufacturing floor can facilitate real-time decision making and thereby improve the robustness of large-scale bioprocesses. 2010 Elsevier B.V. All rights reserved.
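
    As a rough illustration of the modeling step, the sketch below fits a kernel support vector regression to summary features of temporal process parameters. The feature construction and data are invented stand-ins, since the record does not give the exact kernel or preprocessing:

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Invented stand-in data: 108 runs x 100 temporal parameters, each
        # summarized here by a single scalar (real work would use richer
        # kernels over whole trajectories).
        n_runs, n_params = 108, 100
        X = rng.normal(size=(n_runs, n_params))
        titer = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_runs)  # outcome variable

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        scores = cross_val_score(model, X, titer, cv=5, scoring="r2")
        print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))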

  6. Sub-surface laser nanostructuring in stratified metal/dielectric media: a versatile platform towards flexible, durable and large-scale plasmonic writing

    International Nuclear Information System (INIS)

    Siozios, A; Bellas, D V; Lidorikis, E; Patsalas, P; Kalfagiannis, N; Cranton, W M; Koutsogeorgis, D C; Bazioti, C; Dimitrakopulos, G P; Vourlias, G

    2015-01-01

    Laser nanostructuring of pure ultrathin metal layers or ceramic/metal composite thin films has emerged as a promising route for the fabrication of plasmonic patterns, with applications in information storage, cryptography, and security tagging. However, the environmental sensitivity of pure Ag layers and the complexity of ceramic/metal composite film growth hinder the implementation of this technology in large-scale production, as well as its combination with flexible substrates. In the present work we investigate an alternative pathway: starting from non-plasmonic metal/dielectric multilayers, whose growth is compatible with large-scale production methods such as in-line sputtering and roll-to-roll deposition, and transforming them into plasmonic templates by single-shot UV laser annealing (LA). This entirely cold, large-scale process leads to a subsurface nanostructure of plasmonic Ag nanoparticles (NPs) embedded in a hard and inert dielectric matrix, on top of both rigid and flexible substrates. The subsurface encapsulation of Ag NPs provides durability and long-term stability, while the cold character of LA suits the use of sensitive flexible substrates. The morphology of the final composite film depends primarily on the nanocrystalline character of the dielectric host and its thermal conductivity. We demonstrate the emergence of a localized surface plasmon resonance and its tunability as a function of the applied fluence and environmental pressure. The results are well explained by theoretical photothermal modeling. Overall, our findings qualify the proposed process as an excellent candidate for versatile, large-scale optical encoding applications. (paper)

  7. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO₂) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO₂ region leading to large-scale convective mixing that can be a significant driver for CO₂ dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO₂. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large- and small-scale effects are essential to predict the role of these processes in the long-term storage security of CO₂ sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO₂ migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO₂ modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO₂ emissions, and explore the sensitivity of CO₂ migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO₂ storage sites. © 2012 Elsevier Ltd.

  8. Iodine oxides in large-scale THAI tests

    International Nuclear Information System (INIS)

    Funke, F.; Langrock, G.; Kanzleiter, T.; Poss, G.; Fischer, K.; Kühnel, A.; Weber, G.; Allelein, H.-J.

    2012-01-01

    Highlights: ► Iodine oxide particles were produced from gaseous iodine and ozone. ► Ozone replaced the effect of ionizing radiation in the large-scale THAI facility. ► The mean diameter of the iodine oxide particles was about 0.35 μm. ► Particle formation was faster than the chemical reaction between iodine and ozone. ► Deposition of iodine oxide particles was slow in the absence of other aerosols. - Abstract: The conversion of gaseous molecular iodine into iodine oxide aerosols has significant relevance for understanding fission product iodine volatility in a LWR containment during severe accidents. In containment, the high radiation field caused by fission products released from the reactor core induces radiolytic oxidation into iodine oxides. To study the characteristics and behaviour of iodine oxides at large scale, two THAI tests, Iod-13 and Iod-14, were performed, simulating the radiolytic oxidation of molecular iodine by the reaction of iodine with ozone, with the ozone injected from an ozone generator. The observed iodine oxides form submicron particles with mean volume-related diameters of about 0.35 μm and show low deposition rates in the THAI tests performed in the absence of other nuclear aerosols. Formation of iodine aerosols from the gaseous precursors iodine and ozone is fast compared to their chemical interaction. The current approach in empirical iodine containment behaviour models for severe accidents, comprising the radiolytic production of I₂-oxidizing agents followed by the I₂ oxidation itself, is confirmed by these THAI tests.

  9. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    Science.gov (United States)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global-coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. As SAR data become ubiquitous, the technological and scientific challenge is to maximize the exploitation of such a huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large-spatial-scale deformation time series in an efficient, automatic, and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performance has been carried out to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the big advantage of

  10. Multi-scale modeling for sustainable chemical production

    DEFF Research Database (Denmark)

    Zhuang, Kai; Bakshi, Bhavik R.; Herrgard, Markus

    2013-01-01

    With recent advances in metabolic engineering, it is now technically possible to produce a wide portfolio of existing petrochemical products from biomass feedstock. In recent years, a number of modeling approaches have been developed to support the engineering and decision-making processes associated with the development and implementation of a sustainable biochemical industry. The temporal and spatial scales of modeling approaches for sustainable chemical production vary greatly, ranging from metabolic models that aid the design of fermentative microbial strains to material and monetary flow models that explore the ecological impacts of all economic activities. Research efforts that attempt to connect the models at different scales have been limited. Here, we review a number of existing modeling approaches and their applications at the scales of metabolism, bioreactor, overall process...

  11. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  12. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50 TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
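
    The core speed-up behind FastBit-style indexing is the bitmap index. The sketch below shows the idea on a single array with equality-encoded bins; it is a simplification of what FastBit actually implements, which also compresses the bitmaps and candidate-checks partially covered bins:

        import numpy as np

        def build_bitmap_index(values, bin_edges):
            """One boolean bitmap per bin; bitmaps[i, j] says whether record j
            falls in bin i. Simplified, uncompressed version of the idea."""
            bins = np.digitize(values, bin_edges)
            return np.stack([(bins == i) for i in range(len(bin_edges) + 1)])

        def range_query(bitmaps, bin_edges, lo, hi):
            """Answer 'lo <= value < hi' for bin-aligned bounds by OR-ing the
            bitmaps of the covered bins; a real system adds a candidate check
            against the raw values for partially covered edge bins."""
            lo_bin = np.digitize([lo], bin_edges)[0]
            hi_bin = np.digitize([hi], bin_edges)[0]
            return np.any(bitmaps[lo_bin:hi_bin], axis=0)

        values = np.array([0.1, 3.7, 2.2, 9.5, 4.4, 2.9])
        edges = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # bin boundaries
        bitmaps = build_bitmap_index(values, edges)
        mask = range_query(bitmaps, edges, 2.0, 5.0)    # records with 2 <= v < 5
        print(values[mask])                             # -> [3.7 2.2 4.4 2.9]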

  13. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large scale integration (LSI) and very large scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI circuits. The main part describes a proposed algorithm and program for the analysis of the fault rate in LSI and VLSI circuits.

  14. Biodiesel production from Jatropha curcas: Integrated process optimization

    International Nuclear Information System (INIS)

    Huerga, Ignacio R.; Zanuttini, María Soledad; Gross, Martín S.; Querini, Carlos A.

    2014-01-01

    Highlights: • The oil obtained from Jatropha curcas fruits has high variability in its properties. • A process for biodiesel production has been developed for small-scale projects. • Oil neutralization with the glycerine phase has important advantages. • The glycerine phase and the meal are suitable for producing biogas. - Abstract: Energy obtained from renewable sources has increased its share of the energy matrix worldwide, and this tendency is expected to continue. At both large and small scales, there have been numerous developments and much research aimed at generating fuels and energy from different raw materials such as alternative crops, algae, and lignocellulosic residues. In this work, a Jatropha curcas plantation in the northwest of Argentina was studied, with the objective of developing integrated processes for small- and medium-size farms. In such cases, glycerine purification and meal detoxification represent a very high cost and usually are not included in the project; consequently, alternative uses for these products are proposed. This study includes the evaluation of the Jatropha curcas crop over two years, assessing yields and oil properties. The solids left after oil extraction were evaluated as solid fuels, the glycerine and the meal were used to generate biogas, and the oil was used to produce biodiesel. The oil pretreatment was carried out with the glycerine phase obtained in the biodiesel production process, thus neutralizing the free fatty acids and decreasing the phosphorus and water content.

  15. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    underestimation of wet-to-dry-season droughts and snow-related droughts. Furthermore, almost no composite droughts were simulated for slowly responding areas, while many multi-year drought events were expected in these systems.

    We conclude that most drought propagation processes are reasonably well reproduced by the ensemble mean of large-scale models in contrasting catchments in Europe. Challenges, however, remain in catchments with cold and semi-arid climates and catchments with large storage in aquifers or lakes. This leads to a high uncertainty in hydrological drought simulation at large scales. Improvement of drought simulation in large-scale models should focus on a better representation of hydrological processes that are important for drought development, such as evapotranspiration, snow accumulation and melt, and especially storage. Besides the more explicit inclusion of storage in large-scale models, the parametrisation of storage processes also requires attention, for example through a global-scale dataset on aquifer characteristics, improved large-scale datasets on other land characteristics (e.g. soils, land cover), and calibration/evaluation of the models against observations of storage (e.g. in snow, groundwater).

  16. Large Deviations for Two-Time-Scale Diffusions, with Delays

    International Nuclear Information System (INIS)

    Kushner, Harold J.

    2010-01-01

    We consider the problem of large deviations for a two-time-scale reflected diffusion process, possibly with delays in the dynamical terms. The Dupuis-Ellis weak convergence approach is used. It is perhaps the most intuitive and simplest for the problems of concern. The results have applications to the problem of approximating optimal controls for two-time-scale systems via use of the averaged equation.

  17. Parametric Evaluation of Large-Scale High-Temperature Electrolysis Hydrogen Production Using Different Advanced Nuclear Reactor Heat Sources

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; McKellar, Michael G.; O'Brien, James E.; Herring, J. Stephen

    2009-01-01

    High Temperature Electrolysis (HTE), when coupled to an advanced nuclear reactor capable of operating at reactor outlet temperatures of 800 °C to 950 °C, has the potential to efficiently produce the large quantities of hydrogen needed to meet future energy and transportation needs. To evaluate the potential benefits of nuclear-driven hydrogen production, the UniSim process analysis software was used to evaluate different reactor concepts coupled to a reference HTE process design concept. The reference HTE concept included an intermediate heat exchanger and intermediate helium loop to separate the reactor primary system from the HTE process loops, plus additional heat exchangers to transfer reactor heat from the intermediate loop to the HTE process loops. The two process loops consisted of the water/steam loop feeding the cathode side of the HTE electrolysis stack, and the sweep-gas loop used to remove oxygen from the anode side. The UniSim model of the process loops included pumps to circulate the working fluids and heat exchangers to recover heat from the oxygen and hydrogen product streams, improving the overall hydrogen production efficiency. The reference HTE process loop model was coupled to separate UniSim models developed for three different advanced reactor concepts (a high-temperature helium-cooled reactor concept and two different supercritical CO2 reactor concepts). Sensitivity studies were then performed to evaluate the effect of reactor outlet temperature on the power cycle efficiency and the overall hydrogen production efficiency for each of the reactor power cycles. The results of these sensitivity studies showed that overall power cycle and hydrogen production efficiencies increased with reactor outlet temperature, but the power cycles producing the highest efficiencies varied depending on the temperature range considered.
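
    As a back-of-the-envelope companion to such sensitivity studies, the sketch below sweeps reactor outlet temperature through an assumed power-cycle efficiency model (a fixed fraction of the Carnot limit) and an assumed electrolyzer efficiency. Both numbers are illustrative placeholders, not values from the report:

        # Illustrative sensitivity sweep; the 0.55-of-Carnot power cycle and the
        # 90% electrolyzer efficiency are assumptions, not figures from the report.
        T_SINK_K = 300.0          # assumed heat-rejection temperature [K]
        CARNOT_FRACTION = 0.55    # assumed fraction of Carnot achieved by the cycle
        ETA_ELECTROLYSIS = 0.90   # assumed electricity-to-hydrogen efficiency

        def overall_h2_efficiency(t_outlet_c):
            t_hot = t_outlet_c + 273.15
            eta_cycle = CARNOT_FRACTION * (1.0 - T_SINK_K / t_hot)
            return eta_cycle * ETA_ELECTROLYSIS

        for t in (800, 850, 900, 950):
            eta = overall_h2_efficiency(t)
            print(f"{t} C: cycle -> {eta / ETA_ELECTROLYSIS:.1%}, overall H2 -> {eta:.1%}")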

  18. Large-Scale Consumption and Zero-Waste Recycling Method of Red Mud in Steel Making Process

    Directory of Open Access Journals (Sweden)

    Guoshan Ning

    2018-03-01

    To relieve the environmental pressure caused by the massive discharge of bauxite residue (red mud), a novel method for recycling red mud in the steel making process was investigated through high-temperature experiments and thermodynamic analysis. The results showed that after reduction roasting of the carbon-bearing red mud pellets at 1100–1200 °C for 12–20 min, metallic pellets were obtained with a metallization ratio of ≥88%. Separation of slag and iron was then achieved from the metallic pellets at 1550 °C, after composition adjustment targeting the primary crystal region of the 12CaO·7Al₂O₃ phase. After iron removal and composition adjustment, the smelting-separation slag had good smelting performance and desulfurization capability, meeting the requirements of a desulfurization flux for the steel making process. The pig iron quality meets the requirements of a high-quality raw material for steel making. In view of the huge scale and output of the steel industry, a large-scale consumption and zero-waste recycling method for red mud is proposed, comprising roasting of the carbon-bearing red mud pellets in a rotary hearth furnace and smelting separation in an electric arc furnace after composition adjustment.

  19. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    Science.gov (United States)

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  20. Fed-batch CHO cell culture for lab-scale antibody production

    DEFF Research Database (Denmark)

    Fan, Yuzhou; Ley, Daniel; Andersen, Mikael Rørdam

    2017-01-01

    Fed-batch culture is the most commonly used upstream process in industry today for recombinant monoclonal antibody production using Chinese hamster ovary (CHO) cells. Developing and optimizing this process in the lab is crucial for establishing process knowledge, which enables rapid and predictable tech-transfer to manufacturing scale. In this chapter, we describe stepwise how to carry out fed-batch CHO cell culture for lab-scale antibody production.
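
    To illustrate the kind of calculation behind fed-batch design, here is the standard exponential-feeding estimate (the feed rate that holds the specific growth rate constant). The parameter values are illustrative assumptions, not the chapter's protocol:

        import math

        # Illustrative parameters (assumed, not taken from the chapter)
        MU = 0.03        # target specific growth rate [1/h]
        YXS = 0.5        # biomass yield on substrate [g cells / g glucose]
        X0 = 2.0         # viable biomass concentration at feed start [g/L]
        V0 = 1.0         # culture volume at feed start [L]
        S_FEED = 400.0   # glucose concentration in the feed [g/L]

        def feed_rate(t_h):
            """Exponential feed F(t) [L/h] supplying just enough substrate to
            sustain growth at MU: F = (MU/YXS) * X0 * V0 * exp(MU*t) / S_feed."""
            return (MU / YXS) * X0 * V0 * math.exp(MU * t_h) / S_FEED

        for day in range(0, 8):
            print(f"day {day}: feed {feed_rate(24.0 * day) * 1000:.1f} mL/h")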

  1. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source of the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana), and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
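
    For readers unfamiliar with the method, a minimal Detrended Fluctuation Analysis routine is sketched below on a generic numeric series; for DNA the sequence would first be mapped to a numeric walk (e.g. GC = +1 / AT = -1), and the windowing and fit order here are simplified choices, not the paper's exact settings:

        import numpy as np

        def dfa_exponent(x, scales):
            """Detrended Fluctuation Analysis: return the log-log slope of the
            fluctuation function F(n) versus window size n."""
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            F = []
            for n in scales:
                n_win = len(y) // n
                sq = []
                for k in range(n_win):
                    seg = y[k * n:(k + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
                    sq.append(np.mean((seg - trend) ** 2))
                F.append(np.sqrt(np.mean(sq)))
            slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return slope

        # Toy check: a GC = +1 / AT = -1 walk from a random "sequence" should
        # give an exponent near 0.5 (no long-range correlation).
        rng = np.random.default_rng(1)
        seq = rng.choice([+1.0, -1.0], size=20000)
        print(dfa_exponent(seq, scales=[16, 32, 64, 128, 256, 512]))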

  2. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Based on experience in operating and developing such a system, the only reasonable way to gain strong management control is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application to large-scale models and data bases.

  3. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  4. Rapid Large Scale Reprocessing of the ODI Archive using the QuickReduce Pipeline

    Science.gov (United States)

    Gopu, A.; Kotulla, R.; Young, M. D.; Hayashi, S.; Harbeck, D.; Liu, W.; Henschel, R.

    2015-09-01

    The traditional model of astronomers collecting their observations as raw instrument data is increasingly being replaced by astronomical observatories serving standard calibrated data products to observers and to the public at large once proprietary restrictions are lifted. For this model to be effective, observatories need the ability to periodically re-calibrate archival data products as improved master calibration products or pipeline improvements become available, and also to allow users to rapidly calibrate their data on the fly. Traditional astronomy pipelines are heavily I/O dependent and do not scale with increasing data volumes. In this paper, we present the One Degree Imager - Portal, Pipeline and Archive (ODI-PPA) calibration pipeline framework, which integrates the efficient and parallelized QuickReduce pipeline to enable a large number of simultaneous, parallel data reduction jobs - initiated by operators and/or users - while also ensuring rapid processing times and full data provenance. Our integrated pipeline system allows re-processing of the entire ODI archive (~15,000 raw science frames, ~3.0 TB compressed) within ~18 hours using twelve 32-core compute nodes on the Big Red II supercomputer. Our flexible, fast, easy-to-operate, and highly scalable framework improves access to ODI data, in particular when data rates double with an upgraded focal plane (scheduled for 2015), and also serves as a template for future data processing infrastructure across the astronomical community and beyond.

  5. Biotechnological Processes in Microbial Amylase Production.

    Science.gov (United States)

    Gopinath, Subash C B; Anbu, Periasamy; Arshad, M K Md; Lakshmipriya, Thangavel; Voon, Chun Hong; Hashim, Uda; Chinni, Suresh V

    2017-01-01

    Amylase is an important and indispensable enzyme that plays a pivotal role in the field of biotechnology. It is produced mainly from microbial sources and is used in many industries. Industrial sectors with top-down and bottom-up approaches are currently focusing on improving microbial amylase production levels by implementing bioengineering technologies. The further support of energy consumption studies, such as those on thermodynamics, pinch technology, and environment-friendly technologies, has hastened the large-scale production of the enzyme. Herein, the importance of microbial (bacteria and fungi) amylase is discussed along with its production methods from the laboratory to industrial scales.

  6. Large-scale freestanding nanometer-thick graphite pellicles for mass production of nanodevices beyond 10 nm.

    Science.gov (United States)

    Kim, Seul-Gi; Shin, Dong-Wook; Kim, Taesung; Kim, Sooyoung; Lee, Jung Hun; Lee, Chang Gu; Yang, Cheol-Woong; Lee, Sungjoo; Cho, Sang Jin; Jeon, Hwan Chul; Kim, Mun Ja; Kim, Byung-Gook; Yoo, Ji-Beom

    2015-09-21

    Extreme ultraviolet lithography (EUVL) has received much attention in the semiconductor industry as a promising candidate to extend dimensional scaling beyond 10 nm. We present a new pellicle material, nanometer-thick graphite film (NGF), which shows an extreme ultraviolet (EUV) transmission of 92% at a thickness of 18 nm. The maximum temperature induced by laser irradiation (λ = 800 nm) at 9.9 W cm⁻² was 267 °C, due to the high thermal conductivity of the NGF. The freestanding NGF was found to be chemically stable during annealing at 500 °C in a hydrogen environment. A 50 × 50 mm large-area freestanding NGF was fabricated using the wet and dry transfer (WaDT) method. The NGF can be used as an EUVL pellicle for the mass production of nanodevices beyond 10 nm.

  7. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale, 120 h⁻¹ Mpc, in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  8. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes installed in large machine rooms, it is easy for operators to confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? This article introduces ELFms for the large-scale cluster system and, furthermore, proposes a way to realize automatic management of such systems. (authors)

  9. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation, or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of formation of large scale structure. (orig.)

  10. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Birds are highly mobile organisms, and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales, both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  11. Estimating GHG emission mitigation supply curves of large-scale biomass use on a country level

    International Nuclear Information System (INIS)

    Dornburg, Veronika; Dam, Jinke van; Faaij, Andre

    2007-01-01

    This study evaluates the possible influence of a large-scale introduction of biomass material and energy systems, and their market volumes, on land, material, and energy market prices, and the feedback of those prices on greenhouse gas (GHG) emission mitigation costs. GHG emission mitigation supply curves for large-scale biomass use were compiled using a methodology that combines a bottom-up analysis of biomass applications, biomass cost supply curves, and market prices of land, biomaterials, and bioenergy carriers. These market prices depend on the scale of biomass use and the market volume of materials and energy carriers, and were estimated using own-price elasticities of demand. The methodology was demonstrated for a case study of Poland in the year 2015, applying different scenarios of economic development and trade in Europe. For the key technologies considered, i.e., medium-density fibreboard, polylactic acid, electricity, and methanol production, GHG emission mitigation costs increase strongly with the scale of biomass production. Large-scale introduction of biomass use decreases the GHG emission reduction potential at costs below 50 Euro/Mg CO₂eq by about 13-70%, depending on the scenario. Biomaterial production accounts for only a small part of this GHG emission reduction potential, due to the relatively small material markets and the consequent strong decrease of biomaterial market prices at large scales of production. GHG emission mitigation costs depend strongly on biomass supply curves, the own-price elasticity of land, and the market volumes of bioenergy carriers. The analysis shows that these influences should be taken into account when developing biomass implementation strategies.
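
    The price-feedback step can be made concrete with the textbook constant-elasticity relation that own-price elasticities imply; the numbers below are invented for illustration, not taken from the study:

        def price_after_demand_shift(p0, q0, q_new, elasticity):
            """Constant own-price elasticity of demand: q/q0 = (p/p0)**elasticity,
            inverted to give the price at which the market absorbs q_new.
            elasticity is negative for ordinary goods."""
            return p0 * (q_new / q0) ** (1.0 / elasticity)

        # Invented illustration: a biomaterial market with own-price elasticity
        # of -0.5. Doubling the marketed volume pushes the price down to a
        # quarter, which is why small material markets saturate so quickly.
        p0, q0 = 100.0, 1.0e6    # initial price [Euro/Mg] and volume [Mg/yr]
        print(price_after_demand_shift(p0, q0, 2.0e6, -0.5))   # -> 25.0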

  12. Decoupling processes and scales of shoreline morphodynamics

    Science.gov (United States)

    Hapke, Cheryl J.; Plant, Nathaniel G.; Henderson, Rachel E.; Schwab, William C.; Nelson, Timothy R.

    2016-01-01

    Behavior of coastal systems on time scales ranging from single storm events to years and decades is controlled by both small-scale sediment transport processes and large-scale geologic, oceanographic, and morphologic processes. Improved understanding of coastal behavior at multiple time scales is required for refining models that predict potential erosion hazards and for coastal management planning and decision-making. Here we investigate the primary controls on shoreline response along a geologically variable barrier island on time scales resolving extreme storms and decadal variations over a period of nearly one century. An empirical orthogonal function analysis is applied to a time series of shoreline positions at Fire Island, NY to identify patterns of shoreline variance along the length of the island. We establish that there are separable patterns of shoreline behavior that represent response to oceanographic forcing, as well as patterns that are not explained by this forcing. The dominant shoreline behavior occurs over large length scales in the form of alternating episodes of shoreline retreat and advance, presumably in response to storm cycles. Two secondary responses are identified: a long-term response that correlates with known geologic variations of the island, and one that reflects geomorphic patterns at medium length scales. Our study also includes the response to Hurricane Sandy and a period of post-storm recovery. It was expected that the impacts from Hurricane Sandy would disrupt long-term trends and spatial patterns. We found that the response to Sandy at Fire Island is not notable or distinguishable from several other large storms of the prior decade.
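
    Empirical orthogonal function analysis is, in practice, a principal-component decomposition of the space-time anomaly matrix; a minimal sketch with synthetic shoreline data is given below (the data and dimensions are invented stand-ins):

        import numpy as np

        # Invented stand-in: shoreline position at 200 alongshore transects
        # sampled at 60 survey dates (rows = time, columns = space).
        rng = np.random.default_rng(2)
        n_t, n_x = 60, 200
        data = rng.normal(size=(n_t, n_x))

        anom = data - data.mean(axis=0)       # remove the time-mean shoreline
        U, s, Vt = np.linalg.svd(anom, full_matrices=False)

        eofs = Vt                             # spatial patterns (EOFs)
        pcs = U * s                           # principal-component time series
        var_frac = s**2 / np.sum(s**2)        # variance explained per mode
        print("leading mode explains %.1f%% of shoreline variance" % (100 * var_frac[0]))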

  13. IS process for thermochemical hydrogen production

    International Nuclear Information System (INIS)

    Onuki, Kaoru; Nakajima, Hayato; Ioka, Ikuo; Futakawa, Masatoshi; Shimizu, Saburo

    1994-11-01

    The state of the art of thermochemical hydrogen production by the IS process is reviewed, including experimental data obtained at JAERI on the chemistry of the Bunsen reaction step and on the corrosion resistance of the structural materials. The present status of the laboratory-scale demonstration at JAERI is also included. Studies on the chemistry of the reactions and the product separations have identified feasible methods for operating the process. The flowsheeting studies revealed that a process thermal efficiency higher than 40% is achievable under efficient process conditions. The corrosion resistance of commercially available structural materials has been clarified under various process conditions. The basic scheme of the process has been realized in a laboratory-scale apparatus. R and D requirements for proceeding to the engineering demonstration coupled with the HTTR are briefly discussed. (author)

  14. Karhunen-Loève (PCA) based detection of multiple oscillations in multiple measurement signals from large-scale process plants

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Wickerhauser, M.V.

    2007-01-01

    In the perspective of optimizing the control and operation of large-scale process plants, it is important to detect and locate oscillations in the plants. This paper presents a scheme for detecting and localizing multiple oscillations in multiple measurements from such a large-scale power plant. The scheme is based on a Karhunen-Loève analysis of the data from the plant. The proposed scheme is subsequently tested on two sets of data: a set of synthetic data and a set of data from a coal-fired power plant. In both cases the scheme detects the beginning of the oscillation within only a few samples. In addition, the oscillation localization has also shown its potential by localizing the oscillations in both data sets.
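
    A Karhunen-Loève (PCA) analysis of multichannel plant data reduces to an eigendecomposition of the measurement covariance. The sketch below flags an emerging oscillation when the energy captured by the leading mode jumps; the thresholds and window lengths are illustrative choices, not the paper's scheme:

        import numpy as np

        def leading_mode_energy(window):
            """Fraction of variance in the leading Karhunen-Loeve (PCA) mode
            of a (samples x channels) data window."""
            anom = window - window.mean(axis=0)
            cov = np.cov(anom, rowvar=False)
            eigvals = np.linalg.eigvalsh(cov)      # ascending order
            return eigvals[-1] / eigvals.sum()

        def detect_oscillation(data, win=100, threshold=0.6):
            """Sliding-window detector: report the first sample index where the
            leading mode dominates, as a shared oscillation makes it do."""
            for start in range(0, len(data) - win, win // 4):
                if leading_mode_energy(data[start:start + win]) > threshold:
                    return start
            return None

        # Synthetic test: noise, with a shared 0.05-cycles/sample oscillation
        # appearing in half the channels after sample 1000.
        rng = np.random.default_rng(3)
        t = np.arange(2000)
        data = rng.normal(scale=1.0, size=(2000, 8))
        osc = 3.0 * np.sin(2 * np.pi * 0.05 * t[1000:])
        data[1000:, :4] += osc[:, None]
        print("oscillation detected near sample", detect_oscillation(data))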

  15. Scale Up of Malonic Acid Fermentation Process: Cooperative Research and Development Final Report, CRADA Number CRD-16-612

    Energy Technology Data Exchange (ETDEWEB)

    Schell, Daniel J [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-16

    The goal of this work is to use the large fermentation vessels in the National Renewable Energy Laboratory's (NREL) Integrated Biorefinery Research Facility (IBRF) to scale up Lygos' biological process for producing malonic acid and to generate performance data. Initially, work at the 1 L scale validated the successful transfer of Lygos' fermentation protocols to NREL using a glucose substrate. Outside the scope of the CRADA with NREL, Lygos tested its process on lignocellulosic sugars produced by NREL at Lawrence Berkeley National Laboratory's (LBNL) Advanced Biofuels Process Development Unit (ABPDU). NREL produced these cellulosic sugar solutions from corn stover using a separate cellulose/hemicellulose process configuration. Finally, NREL performed fermentations using glucose in large fermentors (1,500- and 9,000-L vessels) to produce the intermediate product and to demonstrate successful performance of Lygos' technology at larger scales.

  16. Synthetic Spider Silk Production on a Laboratory Scale

    Science.gov (United States)

    Hsia, Yang; Gnesa, Eric; Pacheco, Ryan; Kohler, Kristin; Jeffery, Felicia; Vierra, Craig

    2012-01-01

    As society progresses and resources become scarcer, it is becoming increasingly important to cultivate new technologies that engineer next generation biomaterials with high performance properties. The development of these new structural materials must be rapid, cost-efficient and involve processing methodologies and products that are environmentally friendly and sustainable. Spiders spin a multitude of different fiber types with diverse mechanical properties, offering a rich source of next generation engineering materials for biomimicry that rival the best manmade and natural materials. Since the collection of large quantities of natural spider silk is impractical, synthetic silk production has the ability to provide scientists with access to an unlimited supply of threads. Therefore, if the spinning process can be streamlined and perfected, artificial spider fibers have the potential use for a broad range of applications ranging from body armor, surgical sutures, ropes and cables, tires, strings for musical instruments, and composites for aviation and aerospace technology. In order to advance the synthetic silk production process and to yield fibers that display low variance in their material properties from spin to spin, we developed a wet-spinning protocol that integrates expression of recombinant spider silk proteins in bacteria, purification and concentration of the proteins, followed by fiber extrusion and a mechanical post-spin treatment. This is the first visual representation that reveals a step-by-step process to spin and analyze artificial silk fibers on a laboratory scale. It also provides details to minimize the introduction of variability among fibers spun from the same spinning dope. Collectively, these methods will propel the process of artificial silk production, leading to higher quality fibers that surpass natural spider silks. PMID:22847722

  17. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large-scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large-scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues …

  18. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ... Final report on environmental impact assessment of large scale.

  19. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  20. Large-scale melting and impact mixing on early-formed asteroids

    DEFF Research Database (Denmark)

    Greenwood, Richard; Barrat, J.-A.; Scott, Edward Robert Dalton

    Large-scale melting of asteroids and planetesimals is now known to have taken place extremely early in solar system history [1]. The first-generation bodies produced by this process would have been subject to rapid collisional reprocessing, leading in most cases to fragmentation and/or accretion ... the relationship between the different groups of achondrites [3, 4]. Here we present new oxygen isotope evidence concerning the role of large-scale melting and subsequent impact mixing in the evolution of three important achondrite groups: the main-group pallasites, mesosiderites and HEDs.

  1. Development of industrial-scale fission {sup 99}Mo production process using low enriched uranium target

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Kon; Lee, Jun Sig [Radioisotope Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Beyer, Gerd J. [Grunicke Strasse 15, Leipzig (Germany)

    2016-06-15

    Molybdenum-99 ({sup 99}Mo) is the most important medical isotope because its daughter isotope, technetium-99m ({sup 99m}Tc), has been the most widely used medical radioisotope for more than 50 years, accounting for > 80% of total nuclear diagnostics worldwide. In this review, radiochemical routes for the production of {sup 99}Mo and the aspects of selecting a suitable process strategy are discussed from the historical viewpoint of {sup 99}Mo technology development. Most industrial-scale {sup 99}Mo processes have been based on the fission of {sup 235}U. Recently, important issues have been raised concerning the conversion of fission {sup 99}Mo targets from highly enriched uranium to low enriched uranium (LEU). The development of new LEU targets with higher density was requested to compensate for the loss of {sup 99}Mo yield caused by the significant reduction of {sup 235}U enrichment upon conversion. As a dramatic increase in intermediate-level liquid waste is also expected from the conversion, an effective strategy to reduce waste generation from fission {sup 99}Mo production is required. The mitigation of radioxenon emissions from medical radioisotope production facilities is discussed in relation to the monitoring of nuclear explosions and the comprehensive nuclear test ban. Lastly, the {sup 99}Mo production process paired with the Korea Atomic Energy Research Institute's own LEU target is proposed as one of the most suitable processes for the LEU target.

  2. Hydrogen production methods

    International Nuclear Information System (INIS)

    Hammerli, M.

    1982-07-01

    Old, present and new processes for producing hydrogen are assessed critically. The emphasis throughout is placed on those processes which could be commercially viable before the turn of the century for large-scale hydrogen manufacture. Electrolysis of water is the only industrial process not dependent on fossil resources for large-scale hydrogen production and is likely to remain so for the next two or three decades. While many new processes, including those utilizing sunlight directly or indirectly, are presently not considered to be commercially viable for large-scale hydrogen production, research and development effort is needed to enhance our understanding of the nature of these processes. Water vapour electrolysis is compared with thermochemical processes: the former has the potential for displacing all other processes for producing hydrogen and oxygen from water.

  3. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Full Text Available Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS was gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  4. Survey of high-voltage pulse technology suitable for large-scale plasma source ion implantation processes

    International Nuclear Information System (INIS)

    Reass, W.A.

    1994-01-01

    Many new plasma process ideas are finding their way from the research lab to the manufacturing plant floor. These require high voltage (HV) pulse power equipment, which must be optimized for application, system efficiency, and reliability. Although no single HV pulse technology is suitable for all plasma processes, various classes of high voltage pulsers may offer greater versatility and economy to the manufacturer. Technology developed for existing radar and particle accelerator modulator power systems can be utilized to develop a modern large-scale plasma source ion implantation (PSII) system. The HV pulse networks can be broadly divided into two classes of systems: those that generate the voltage directly, and those that use some type of pulse forming network and step-up transformer. This article will examine these HV pulse technologies and discuss their applicability to the specific PSII process. Typical systems that will be reviewed include high power solid state systems, hard tube systems such as crossed-field "hollow beam" switch tubes and planar tetrodes, and "soft" tube systems with crossatrons and thyratrons. Results will be tabulated and suggestions provided for a particular PSII process.

  5. A radiation service centre for research and large-scale irradiation

    International Nuclear Information System (INIS)

    Offermann, B.P.; Hofmann, E.G.

    1978-01-01

    In the near future radiation processing of food may change from the present laboratory scale to large industrial application. This step will require large irradiation facilities with high flexibility, a safe dose control system and simple food-handling systems. Some design parameters of such an irradiation facility have already been realized at the AEG-Telefunken Radiation Service Centre in Wedel. This centre came into operation in autumn 1976. It is equipped with one research-type high-power X-ray unit (200 kV/32 mA) and one industrial-type electron accelerator (1500 kV/37.5 kW). Handling systems are available for radiation crosslinking of wire and cable insulation and of plastic films, and for irradiation treatment of components and parts of different types and coatings, as well as of sewage sludge and waste water. Some of these handling systems can be used for food irradiation too. Other handling systems will be added later. As an additional service, the Company's existing material and environmental testing laboratory will be available. The centre is already being used by many interested companies to investigate the effects of radiation on a broad range of organic and inorganic materials, to develop special processing equipment, to process supplied products and to perform R and D work and contracts. The service centre fills an existing gap and will have an impact on the commercialization of radiation processing techniques in Europe. (author)

  6. CHARACTERIZATION AND PROCESSING OF SCALES FROM THE MECHANICAL DESCALING OF CARBON STEELS FOR RECYCLING AS COATING PIGMENTS

    Directory of Open Access Journals (Sweden)

    Anderson de Oliveira Fraga

    2014-10-01

    Full Text Available The large volume of solid waste generated as scale in steel mills accounts for roughly 1% to 2% of total steel production and has led to studies aiming at the recycling of scales, usually resulting in products of low added value. In this study, scales from the mechanical descaling of SAE 1045 steel were characterized by SEM and by quantitative X-ray diffraction (Rietveld method), as well as by differential thermal analysis, aiming to develop a pretreatment for their further use as lamellar pigments in anticorrosive coatings of high added value. Aspect ratios between 1:50 and 1:100 were obtained by the processing of the scales, which allows the replacement of other micaceous iron oxides.

  7. Technology Development of an Advanced Small-scale Microchannel-type Process Heat Exchanger (PHE) for Hydrogen Production in Iodine-sulfur Cycle

    Energy Technology Data Exchange (ETDEWEB)

    Sah, Injin; Kim, Chan Soo; Kim, Yong Wan; Park, Jae-Won; Kim, Eung-Seon; Kim, Min-Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this study, ongoing manufacturing processes for the components employed in an advanced small-scale microchannel-type PHE are presented. The components, such as mechanically machined microchannels and a diffusion-bonded stack, are introduced. Preliminary studies on surface treatment techniques for improving corrosion resistance in the corrosive sulfuric environment are also covered. The ongoing manufacturing process for an advanced small-size microchannel-type PHE at KAERI is presented. Based on preliminary studies for optimizing the diffusion bonding conditions of Hastelloy-X, a diffusion-bonded stack, consisting of primary- and secondary-side layers, is scheduled to be fabricated in a few months. Surface treatment for enhancing corrosion resistance in the sulfuric acid environment is also in progress for the plates with microchannels. Massive production of hydrogen alongside electricity generation is expected from a Process Heat Exchanger (PHE) in a Very High Temperature gas-cooled Reactor (VHTR) system. For the application of hydrogen production, a small-scale gas loop for laboratory-scale feasibility testing has been constructed and operated at the Korea Atomic Energy Research Institute (KAERI) as a precursor to experimental- and pilot-scale gas loops.

  8. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local own consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
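
    The own-consumption comparison described above reduces to simple accounting over matched hourly series: energy used directly on site is the hour-by-hour minimum of production and load. A minimal sketch in Python, with synthetic profiles standing in for the metered data:

        import numpy as np

        def own_consumption(pv_kwh, load_kwh):
            """Return (share of PV used on site, PV share of total consumption)."""
            direct = np.minimum(pv_kwh, load_kwh).sum()   # hour-by-hour overlap
            return direct / pv_kwh.sum(), pv_kwh.sum() / load_kwh.sum()

        # Synthetic hourly profiles for one year (8760 h) stand in for metered data.
        rng = np.random.default_rng(0)
        pv = np.clip(rng.normal(1.0, 0.8, 8760), 0.0, None)    # kWh per hour
        load = np.clip(rng.normal(1.5, 0.5, 8760), 0.1, None)  # kWh per hour
        direct_share, pv_coverage = own_consumption(pv, load)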

  9. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. Processes describing consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  10. A modified indirect mathematical model for evaluation of ethanol production efficiency in industrial-scale continuous fermentation processes.

    Science.gov (United States)

    Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M

    2016-10-01

    To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust since an error in any parameter will only have a minor effect on the fermentation efficiency value. The application of the indirect calculation methodology in order to evaluate the real situation of the process and to reach an optimum fermentation yield for an industrial-scale ethanol production is recommended. Once a high fermentation yield has been reached the traditional method should be used to maintain the control of the process. Upon detection of lower yields in an optimized process the indirect method should be employed as it permits a more accurate diagnosis of causes of yield losses in order to correct the problem rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
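
    For reference, the traditional (direct) method divides the ethanol produced by the sugar consumed times 0.511 g/g, the stoichiometric glucose-to-ethanol yield, so any measurement error propagates one-to-one into the computed efficiency. A sketch with illustrative numbers (the indirect by-product balance is not reproduced here):

        THEORETICAL_YIELD = 0.511  # g ethanol per g glucose (stoichiometric maximum)

        def direct_efficiency(ethanol_g, sugar_consumed_g):
            """Traditional method: fraction of the theoretical conversion yield."""
            return ethanol_g / (sugar_consumed_g * THEORETICAL_YIELD)

        base = direct_efficiency(38.0, 100.0)       # ~0.744, i.e. ~74% efficiency
        perturbed = direct_efficiency(38.0, 105.0)  # ~0.708: a 5% sugar error
                                                    # shifts the result by ~5%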

  11. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  12. Signal formation processes in Micromegas detectors and quality control for large size detector construction for the ATLAS new small wheel

    Energy Technology Data Exchange (ETDEWEB)

    Kuger, Fabian

    2017-07-31

    The Micromegas technology is one of the most successful modern gaseous detector concepts and is widely utilized in nuclear and particle physics experiments. Twenty years of R and D rendered the technology sufficiently mature to be selected as the precision tracking detector for the New Small Wheel (NSW) upgrade of the ATLAS Muon spectrometer. This will be the first large-scale application of Micromegas in one of the major LHC experiments. However, many of the fundamental microscopic processes in these gaseous detectors are still not fully understood, and several detector aspects, such as the micromesh geometry, have never been addressed systematically. The studies on signal formation in Micromegas, presented in the first part of this thesis, focus on the microscopic signal electron loss mechanisms and the amplification processes in electron-gas interaction. Based on a detailed model of detector parameter dependencies, these processes are scrutinized in an iterating comparison between experimental results, theory predictions of the macroscopic observables and process simulation on the microscopic level. Utilizing the specialized detectors developed in the scope of this thesis as well as refined simulation algorithms, an unprecedented level of accuracy in the description of the microscopic processes is reached, deepening the understanding of the fundamental processes in gaseous detectors. The second part is dedicated to the challenges arising with the large-scale Micromegas production for the ATLAS NSW. A selection of technological choices, partially influenced or determined by the studies presented herein, are discussed alongside a final report on two production-related tasks addressing the detectors' core components: for the industrial production of resistive anode PCBs, a detailed quality control (QC) and quality assurance (QA) scheme, as well as the testing tools required for it, have been developed. In parallel, the study on micromesh parameter optimization …

  14. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  15. Saltstone studies using the scaled continuous processing facility

    Energy Technology Data Exchange (ETDEWEB)

    Fowley, M. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Cozzi, A. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hansen, E. K. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-08-01

    The Savannah River National Laboratory (SRNL) has supported the Saltstone Facility since its conception with bench-scale laboratory experiments, mid-scale testing at vendor facilities, and consultations and testing at the Saltstone Facility. There have been minimal opportunities to measure the rheological properties of the grout slurry at the Saltstone Production Facility (SPF); thus, the Scaled Continuous Processing Facility (SCPF), constructed to provide processing data related to mixing, transfer, and other operations conducted in the SPF, provides the most representative process data for determining the expected rheological properties in the SPF. These results can be used to verify the laboratory-scale experiments that support the SPF, using conventional mixing processes that appropriately represent the shear imparted to the slurry in the SPF.

  16. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50% by 2010, and 100% by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only as far as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually, when no specific cost limit is applied. This corresponds to 2-3 Mtoe or 6-9% of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good quality chips. The two guiding principles of the programme are: (1) close cooperation between researchers and practitioners and (2) applying research and development to practical applications and commercialization. As of November …

  17. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insurance …

  18. Biotechnological Processes in Microbial Amylase Production

    Directory of Open Access Journals (Sweden)

    Subash C. B. Gopinath

    2017-01-01

    Full Text Available Amylase is an important and indispensable enzyme that plays a pivotal role in the field of biotechnology. It is produced mainly from microbial sources and is used in many industries. Industrial sectors with top-down and bottom-up approaches are currently focusing on improving microbial amylase production levels by implementing bioengineering technologies. The further support of energy consumption studies, such as those on thermodynamics, pinch technology, and environment-friendly technologies, has hastened the large-scale production of the enzyme. Herein, the importance of microbial (bacterial and fungal) amylase is discussed along with its production methods from the laboratory to industrial scales.

  19. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computation complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming, without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computation complexity of the conventional exhaustive search method increases significantly when large-scale antenna arrays are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
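
    For orientation, MVDR beamforming has the closed form w = R⁻¹a / (aᴴ R⁻¹ a) for array covariance R and steering vector a. The Python sketch below shows only this standard building block, not the paper's joint antenna-selection optimization; the array geometry and toy covariance are illustrative assumptions:

        import numpy as np

        def mvdr_weights(R, a):
            """MVDR beamformer: w = R^{-1} a / (a^H R^{-1} a)."""
            Ri_a = np.linalg.solve(R, a)        # avoids forming R^{-1} explicitly
            return Ri_a / (a.conj() @ Ri_a)

        def steering(n, theta_rad):
            """Steering vector of a half-wavelength-spaced uniform linear array."""
            return np.exp(1j * np.pi * np.arange(n) * np.sin(theta_rad))

        n = 16                                       # antennas kept after selection
        a = steering(n, np.deg2rad(20.0))            # desired look direction
        R = np.eye(n) + 0.1 * np.outer(a, a.conj())  # toy Hermitian covariance
        w = mvdr_weights(R, a)
        assert np.isclose(w.conj() @ a, 1.0)         # distortionless constraint holds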

  20. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. Conformity was first observed among galaxies within the same halos (“one-halo conformity”). One-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other (“two-halo conformity” or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  1. Batch Fermentative Biohydrogen Production Process Using Immobilized Anaerobic Sludge from Organic Solid Waste

    Directory of Open Access Journals (Sweden)

    Patrick T. Sekoai

    2016-12-01

    Full Text Available This study examined the potential of organic solid waste for biohydrogen production using immobilized anaerobic sludge. Biohydrogen was produced in batch mode at a pH of 7.9, a temperature of 30.3 °C and a fermentation time of 90 h. A maximum biohydrogen fraction of 48.67%, corresponding to a biohydrogen yield of 215.39 mL H2/g Total Volatile Solids (TVS), was achieved. Therefore, the utilization of immobilized cells could pave the way for a large-scale biohydrogen production process.

  2. Manufacturing process scale-up of optical grade transparent spinel ceramic at ArmorLine Corporation

    Science.gov (United States)

    Spilman, Joseph; Voyles, John; Nick, Joseph; Shaffer, Lawrence

    2013-06-01

    While transparent Spinel ceramic's mechanical and optical characteristics are ideal for many Ultraviolet (UV), visible, Short-Wave Infrared (SWIR), Mid-Wave Infrared (MWIR), and multispectral sensor window applications, commercial adoption of the material has been hampered because the material has historically been available in relatively small sizes (one square foot per window or less), low volumes, unreliable supply, and with unreliable quality. Recent efforts, most notably by Technology Assessment and Transfer (TA and T), have scaled up manufacturing processes and demonstrated the capability to produce larger windows on the order of two square feet, but with limited output not suitable for production-type programs. ArmorLine Corporation licensed the hot-pressed Spinel manufacturing know-how of TA and T in 2009 with the goal of building the world's first dedicated full-scale Spinel production facility, enabling the supply of a reliable and sufficient volume of large Transparent Armor and Optical Grade Spinel plates. With over $20 million of private investment by J.F. Lehman and Company, ArmorLine has installed and commissioned the largest vacuum hot press in the world, the largest high-temperature/high-pressure hot isostatic press in the world, and supporting manufacturing processes within 75,000 square feet of manufacturing space. ArmorLine's equipment is capable of producing window blanks as large as 50" x 30" and the facility is capable of producing substantial volumes of material with its Lean configuration and 24/7 operation. Initial production capability was achieved in 2012. ArmorLine will discuss the challenges that were encountered during scale-up of the manufacturing processes, ArmorLine Optical Grade Spinel optical performance, and provide an overview of the facility and its capabilities.

  3. Innovative Techniques for Large-Scale Collection, Processing, and Storage of Eelgrass (Zostera marina) Seeds

    National Research Council Canada - National Science Library

    Orth, Robert J; Marion, Scott R

    2007-01-01

    ... Although methods for hand-collecting, processing and storing eelgrass seeds have advanced to match the scale of collections, the number of seeds collected has limited the scale of restoration efforts ...

  4. A process optimization for bio-catalytic production of substituted catechols (3-nitrocatechol and 3-methylcatechol

    Directory of Open Access Journals (Sweden)

    Tiwary Bhupendra N

    2010-06-01

    Full Text Available Abstract Background: Substituted catechols are important precursors for large-scale synthesis of pharmaceuticals and other industrial products. Most of the reported chemical synthesis methods are expensive and insufficient at the industrial level. However, biological processes for the production of substituted catechols could be highly selective and suitable for industrial purposes. Results: We have optimized a process for bio-catalytic production of 3-substituted catechols, viz. 3-nitrocatechol (3-NC) and 3-methylcatechol (3-MC), at pilot scale. Among the screened strains, two strains, viz. a Pseudomonas putida strain (F1) and a recombinant Escherichia coli expression clone (pDTG602) harboring the first two genes of the toluene degradation pathway, were found to accumulate 3-NC and 3-MC respectively. Various parameters such as amount of nutrients, pH, temperature, substrate concentration, aeration, inoculum size, culture volume, toxicity of substrate and product, downstream extraction, and single-step and two-step biotransformation were optimized at laboratory scale to obtain high yields of 3-substituted catechols. Subsequently, pilot-scale studies were performed in a 2.5 liter bioreactor. The rate of product accumulation at pilot scale significantly increased up to ~90-95% with time, and high yields of 3-NC (10 mM) and 3-MC (12 mM) were obtained. Conclusion: The bio-catalytic production of 3-substituted catechols, viz. 3-NC and 3-MC, depends on some crucial parameters to obtain maximum yields of the product at pilot scale. The process optimized for production of 3-substituted catechols using the organisms P. putida (F1) and the recombinant E. coli expression clone (pDTG602) may be useful for industrial application.

  5. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction procedure and the affinity propagation algorithm. A memory-shared architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
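
    For orientation, the serial building blocks being parallelized here — a precomputed similarity matrix followed by affinity propagation — look like this in scikit-learn (a single-machine sketch with toy data, not the authors' parallel implementation):

        import numpy as np
        from sklearn.cluster import AffinityPropagation

        # Toy two-cluster "expression profiles"; real gene data is far larger.
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(4, 1, (50, 20))])

        # Similarity matrix: negative squared Euclidean distance (common choice).
        S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

        ap = AffinityPropagation(affinity="precomputed", random_state=0)
        labels = ap.fit_predict(S)               # one cluster label per sample

    The O(n²) similarity matrix is exactly the part whose construction and message-passing updates the paper distributes across processes.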

  6. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  7. Development of Automated Production Line Processes for Solar Brightfield Modules: Final Annual Technical Progress Report, 1 July 2004 -- 15 October 2005

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore S. B.; Hogan, S. J.

    2006-08-01

    Spire Corporation is addressing the Photovoltaic Manufacturing R&D project goals of improving photovoltaic (PV) manufacturing processes and products while reducing costs and providing a technology foundation that supports significant manufacturing scale-up. To accomplish this, we are focusing our efforts on the design of a large-area utility-scale module and the development of the necessary manufacturing techniques and equipment to manufacture such a module in a high-volume production environment. A three-phase program is under way for developing and demonstrating new automated systems for fabricating very large PV modules ideal for use in multi-megawatt grid-connected applications. We designed a large-area (1.57 m x 3.68 m) 800-W module, and we are developing associated module production equipment that will minimize the total installed system cost for utility-scale PV arrays. Activities in Phase 2 focused on the development of automation for module materials lay-up, cell string busing, and module lamination; enhancements to the cell stringing and lamination processes; and performance testing of large-area modules.

  8. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  9. Large-scale adenovirus and poxvirus-vectored vaccine manufacturing to enable clinical trials.

    Science.gov (United States)

    Kallel, Héla; Kamen, Amine A

    2015-05-01

    Efforts to make vaccines against infectious diseases and immunotherapies for cancer have evolved to utilize a variety of heterologous expression systems such as viral vectors. These vectors are often attenuated or engineered to safely deliver genes encoding antigens of different pathogens. Adenovirus and poxvirus vectors are among the viral vectors that are most frequently used to develop prophylactic vaccines against infectious diseases as well as therapeutic cancer vaccines. This mini-review describes the trends and processes in large-scale production of adenovirus and poxvirus vectors to meet the needs of clinical applications. We briefly describe the general principles for the production and purification of adenovirus and poxvirus viral vectors. Currently, adenovirus and poxvirus vector manufacturing methods rely on well-established cell culture technologies. Several improvements have been evaluated to increase the yield and to reduce the overall manufacturing cost, such as cultivation at high cell densities and continuous downstream processing. Additionally, advancements in vector characterization will greatly facilitate the development of novel vectored vaccine candidates. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transportation between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.
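
    The data-management point — keep the state arrays resident on the GPU across time steps and move only small diagnostics back to the host — can be illustrated in Python with CuPy. The actual model uses OpenACC in compiled code, so this sketch shows only the data-movement idea, with a placeholder update standing in for the finite-volume flux computation:

        import cupy as cp

        h = cp.ones(10_000_000, dtype=cp.float32)   # water depth, resident on GPU
        q = cp.zeros_like(h)                        # discharge, resident on GPU

        def step(h, q, dt):
            """Placeholder update standing in for the finite-volume flux sweep."""
            return h + dt * q, q

        for n in range(1000):
            h, q = step(h, q, 0.01)       # all arithmetic stays on the device
            if n % 100 == 0:
                print(float(h.max()))     # only a scalar crosses the PCIe bus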

  11. How Close We Are to Achieving Commercially Viable Large-Scale Photobiological Hydrogen Production by Cyanobacteria: A Review of the Biological Aspects

    Science.gov (United States)

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2015-01-01

    Photobiological production of H2 by cyanobacteria is considered to be an ideal source of renewable energy because the inputs, water and sunlight, are abundant. The products of photobiological systems are H2 and O2; the H2 can be used as the energy source of fuel cells, etc., which generate electricity at high efficiencies and minimal pollution, as the waste product is H2O. Overall, production of commercially viable algal fuels in any form, including biomass and biodiesel, is challenging, and the very few systems that are operational have yet to be evaluated. In this paper we will: briefly review some of the necessary conditions for economical production, summarize the reports of photobiological H2 production by cyanobacteria, present our schemes for future production, and discuss the necessity for further progress in the research needed to achieve commercially viable large-scale H2 production. PMID:25793279

  12. Quality Assessment of Physical and Organoleptic Instant Corn Rice on Scale-Up Process

    Science.gov (United States)

    Kumalasari, R.; Ekafitri, R.; Indrianti, N.

    2017-12-01

    Development of an instant corn rice product has been successfully conducted on a laboratory scale. Corn is high in carbohydrate but low in fiber. The addition of fiber to instant corn rice is intended to improve the functionality of the product and to replace fiber lost during processing. Scale-up of the instant corn rice process is required to increase production capacity; scale-up is the process of obtaining identical output at a larger scale, based on a predetermined production scale. This study aimed to assess the changes and differences in the quality of instant corn rice during scale-up. The instant corn rice scale-up was done at production capacities of 3 kg, 4 kg and 5 kg. Results showed that scale-up of instant corn rice produced products with rehydration ratios between 514% and 570%, absorption rates between 414% and 470%, swelling rates between 119% and 134%, bulk densities between 0.3661 and 0.4745 g/ml, and porosities between 30% and 37%. The physical quality of instant corn rice on scale-up was stable relative to the laboratory scale for swelling rate, rehydration ratio and absorption rate, but not for bulk density and porosity. Organoleptic qualities were stable at the increased scale compared with the laboratory scale. Bulk density was higher, and porosity lower, than at laboratory scale.

  13. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  14. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  15. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small-scale) and from low-pass-filtered (large-scale) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
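
    The gradient-based correlation described here can be sketched for a two-dimensional field: low-pass filter the velocity to isolate the large scales, take its gradient, and correlate with a local r.m.s. of the small-scale residual. The Python sketch below uses a single random velocity component and a Gaussian filter purely as stand-ins; the field, filter width and use of one component are illustrative assumptions:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(2)
        u = rng.normal(size=(512, 512))            # stand-in velocity component

        u_large = gaussian_filter(u, sigma=16)     # large scales (low-pass)
        dudy_large = np.gradient(u_large, axis=0)  # large-scale gradient

        u_small = u - u_large                      # small-scale residual
        activity = np.sqrt(gaussian_filter(u_small**2, sigma=16))  # local r.m.s.

        r = np.corrcoef(dudy_large.ravel(), activity.ravel())[0, 1]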

  16. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  17. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  18. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  19. Improved processes of molybdenum-99 production

    International Nuclear Information System (INIS)

    Dadachova, K.; La Riviere, K.; Anderon, P.

    1997-01-01

    Two improved processes for Molybdenum-99 production have been developed at ANSTO on the laboratory scale. The first allows Mo of natural isotopic composition to be purified of tungsten impurities, using preferential adsorption of tungsten on hydrated tin(IV) oxide, SnO2·nH2O, before irradiation in the nuclear reactor. Mo-99 obtained via this route can be used for production of 'instant' Tc-99m. As the starting material MoO3 contains considerable amounts of tungsten impurity (W > 60 ppm), 5-7 days of irradiation results in generation of W-188 in amounts sufficient to contaminate the final Tc-99m product with rhenium-188 (Re-188, 16.8 h half-life), the radioactive daughter of W-188. To overcome this problem, a method of purifying MoO3 of W, based on preferential adsorption of W by hydrated tin(IV) oxide, has been developed. The W content of MoO3 purified by this technique was reduced to about 3 ppm. The second process involves retaining Mo-99 on a large alumina column; Mo-99 is stripped off the column with 200 mL of 1M NH4OH, and this solution is then loaded onto the AG 1x8 column. The next steps are different for each version of the separation process

  20. Buffer provisioning for large-scale data-acquisition systems

    CERN Document Server

    The ATLAS collaboration; Garcia Garcia, Pedro Javier; Froening, Holger; Vandelli, Wainer

    2018-01-01

    The data acquisition system of the ATLAS experiment, a major experiment of the Large Hadron Collider (LHC) at CERN, will go through a major upgrade in the next decade. The upgrade is driven by experimental physics requirements, calling for increased data rates on the order of 6 TB/s. By contrast, the data rate of the existing system is 160 GB/s. Among the changes in the upgraded system will be a very large buffer with a projected size on the order of 70 PB. The buffer's role will be to decouple data production from on-line data processing, storing data for periods of up to 24 hours until it can be analyzed by the event processing system. The larger buffer will allow a new data recording strategy, providing additional margins to handle variable data rates. At the same time it will provide sensible trade-offs between buffering space and on-line processing capabilities. This compromise between two resources will be possible since the data production cycle includes time periods where the experiment will not produ...
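
    A back-of-envelope check of these figures, using only the numbers quoted in the record (the interpretation is ours):

    ```python
    # Back-of-envelope buffer arithmetic from the figures quoted in the record.
    TB, PB = 1e12, 1e15
    input_rate = 6 * TB            # bytes/s into the upgraded DAQ
    buffer_size = 70 * PB          # projected buffer capacity
    retention = 24 * 3600          # targeted storage period in seconds

    print(f"fill time at full rate: {buffer_size / input_rate / 3600:.1f} h")
    print(f"rate sustainable for 24 h: {buffer_size / retention / TB:.2f} TB/s")
    ```

    The gap between the 6 TB/s peak input and the roughly 0.8 TB/s that 70 PB can absorb over a full 24-hour window is why the duty cycle of data production matters to the buffering strategy.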

  1. Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.

    Science.gov (United States)

    Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong

    2017-10-11

    The development of silicon (Si) materials during past decades has boosted the prosperity of the modern semiconductor industry. In comparison with bulk Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention for solar energy applications. To achieve practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of high-efficiency energy conversion devices are prerequisites. This review focuses on recent progress in the large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.

  2. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a
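
    As a minimal illustration of the kind of metric such instrumentation feeds (our own toy construction, not the ROSS instrumentation API), rollback efficiency per processing element can be computed as committed over processed events:

    ```python
    # Toy rollback-efficiency metric for optimistic (Time Warp style) PDES;
    # an illustration only, not the ROSS instrumentation API.
    from dataclasses import dataclass

    @dataclass
    class PeStats:
        processed: int     # all events executed, including later-undone ones
        rolled_back: int   # events undone by rollbacks

        @property
        def efficiency(self) -> float:
            """Fraction of executed work that was ultimately committed."""
            return (self.processed - self.rolled_back) / self.processed

    for i, pe in enumerate([PeStats(1_200_000, 90_000), PeStats(1_150_000, 310_000)]):
        print(f"PE {i}: rollback efficiency = {pe.efficiency:.2%}")
    ```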

  3. Evaluation of hollow fiber culture for large-scale production of mouse embryonic stem cell-derived hematopoietic stem cells.

    Science.gov (United States)

    Nakano, Yu; Iwanaga, Shinya; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2018-03-03

    Hematopoietic stem cells (HSCs) have the ability to differentiate into all types of blood cells and can be transplanted to treat blood disorders. However, it is difficult to obtain HSCs in large quantities because of the shortage of donors. Recent efforts have focused on acquiring HSCs by differentiation of pluripotent stem cells. As a conventional differentiation method of pluripotent stem cells, the formation of embryoid bodies (EBs) is often employed. However, the size of EBs is limited by depletion of oxygen and nutrients, which prevents them from being efficient for the production of HSCs. In this study, we developed a large-scale hematopoietic differentiation approach for mouse embryonic stem (ES) cells by applying a hollow fiber (HF)/organoid culture method. Cylindrical organoids, which had the potential for further spontaneous differentiation, were established inside of hollow fibers. Using this method, we improved the proliferation rate of mouse ES cells to produce an increased HSC population and achieved around a 40-fold higher production volume of HSCs in HF culture than in conventional EB culture. Therefore, the HF/organoid culture method may be a new mass culture method to acquire pluripotent stem cell-derived HSCs.

  4. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  5. Development of Automated Production Line Processes for Solar Brightfield Modules: Annual Technical Progress Report, 1 January 2003 -- 30 June 2004

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2005-06-01

    This report describes how Spire Corporation is addressing the PV Manufacturing R&D project goals of improving photovoltaic (PV) manufacturing processes and products while reducing costs and providing a technology foundation that supports significant manufacturing scale-up. To accomplish this, we are focusing our efforts on the design of a large-area utility-scale module and the development of the necessary manufacturing techniques and equipment to manufacture such a module in a high-volume production environment. A three-phase program is under way for developing and demonstrating new automated systems for fabricating very large PV modules ideal for use in multi-megawatt grid-connected applications. We designed a large-area 800 W module and we are developing associated module production equipment that will minimize the total installed system cost for utility-scale PV arrays. Unique features of the module design include a cantilevered glass superstrate to reduce the glass thickness and internally laminated bypass diodes that simplify internal busing and junction-box designs. Other program activities include the development of automation for solar cell string inspections, string busing, materials lay-up, and lamination; enhancements to the lamination process; and performance testing of large-area modules.

  6. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  7. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO2 emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m2 of collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m2. Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants for housing developments in six European countries show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)

  8. Improving the large scale purification of the HIV microbicide, griffithsin.

    Science.gov (United States)

    Fuqua, Joshua L; Wanga, Valentine; Palmer, Kenneth E

    2015-02-22

    Griffithsin is a broad-spectrum antiviral lectin that inhibits viral entry and maturation processes by binding clusters of oligomannose glycans on viral envelope glycoproteins. An efficient, scalable manufacturing process for griffithsin active pharmaceutical ingredient (API) is essential for particularly cost-sensitive products such as griffithsin-based topical microbicides for HIV-1 prevention in resource-poor settings. Our previously published purification method used ceramic filtration followed by two chromatography steps, resulting in a protein recovery of 30%. Our objective was to develop a scalable purification method for griffithsin expressed in Nicotiana benthamiana plants that would increase yield, reduce production costs, and simplify manufacturing techniques. Considering the future need to transfer griffithsin manufacturing technology to resource-poor areas, we chose to focus on modifying the purification process, paying particular attention to introducing simple, low-cost, and scalable procedures such as use of temperature, pH, ion concentration, and filtration to enhance product recovery. We achieved >99% pure griffithsin API by generating the initial green juice extract in pH 4 buffer, heating the extract to 55°C, incubating overnight with a bentonite-MgCl2 mixture, and final purification with Capto™ multimodal chromatography. Griffithsin extracted with this protocol maintains activity comparable to griffithsin purified by the previously published method, and we are able to recover a substantially higher yield: 88 ± 5% of griffithsin from the initial extract. The method was scaled to produce gram quantities of griffithsin with high yields, low endotoxin levels, and low purification costs maintained. The methodology developed to purify griffithsin introduces and develops multiple tools for purification of recombinant proteins from plants at an industrial scale. These tools allow for robust cost-effective production and purification of

  9. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    Science.gov (United States)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for half, and about 78% for 19/20, of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
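
    A single-machine sketch of the boosted-classifier training stage, with scikit-learn's AdaBoost standing in for the paper's distributed trainer and random vectors standing in for the Haar-like features; all sizes and labels below are illustrative assumptions, not the paper's data:

    ```python
    # Stand-in for the boosted vehicle detector: AdaBoost over decision
    # stumps (scikit-learn's default base learner), synthetic features.
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_samples, n_features = 5000, 200
    X = rng.standard_normal((n_samples, n_features))
    y = (X[:, :10].sum(axis=1) > 0).astype(int)    # synthetic vehicle/background labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = AdaBoostClassifier(n_estimators=200).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```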

  10. Large-scale laboratory study of breaking wave hydrodynamics over a fixed bar

    Science.gov (United States)

    van der A, Dominic A.; van der Zanden, Joep; O'Donoghue, Tom; Hurther, David; Cáceres, Iván.; McLelland, Stuart J.; Ribberink, Jan S.

    2017-04-01

    A large-scale wave flume experiment has been carried out involving a T = 4 s regular wave with H = 0.85 m wave height plunging over a fixed barred beach profile. Velocity profiles were measured at 12 locations along the breaker bar using LDA and ADV. A strong undertow is generated reaching magnitudes of 0.8 m/s on the shoreward side of the breaker bar. A circulation pattern occurs between the breaking area and the inner surf zone. Time-averaged turbulent kinetic energy (TKE) is largest in the breaking area on the shoreward side of the bar where the plunging jet penetrates the water column. At this location, and on the bar crest, TKE generated at the water surface in the breaking process reaches the bottom boundary layer. In the breaking area, TKE does not reduce to zero within a wave cycle which leads to a high level of "residual" turbulence and therefore lower temporal variation in TKE compared to previous studies of breaking waves on plane beach slopes. It is argued that this residual turbulence results from the breaker bar-trough geometry, which enables larger length scales and time scales of breaking-generated vortices and which enhances turbulence production within the water column compared to plane beaches. Transport of TKE is dominated by the undertow-related flux, whereas the wave-related and turbulent fluxes are approximately an order of magnitude smaller. Turbulence production and dissipation are largest in the breaker zone and of similar magnitude, but in the shoaling zone and inner surf zone production is negligible and dissipation dominates.

  11. Ocean Acidification Experiments in Large-Scale Mesocosms Reveal Similar Dynamics of Dissolved Organic Matter Production and Biotransformation

    Directory of Open Access Journals (Sweden)

    Maren Zark

    2017-09-01

    Dissolved organic matter (DOM) represents a major reservoir of carbon in the oceans. Environmental stressors such as ocean acidification (OA) potentially affect DOM production and degradation processes, e.g., phytoplankton exudation or microbial uptake and biotransformation of molecules. Resulting changes in the carbon storage capacity of the ocean may thus cause feedbacks on the global carbon cycle. Previous experiments studying OA effects on the DOM pool under natural conditions, however, were mostly conducted in temperate and coastal eutrophic areas. Here, we report on OA effects on the existing and newly produced DOM pool during an experiment in the subtropical North Atlantic Ocean at the Canary Islands, during (1) an oligotrophic phase and (2) after simulated deep-water upwelling, the latter being a frequently occurring event in this region that controls nutrient and phytoplankton dynamics. We manipulated nine large-scale mesocosms with a gradient of pCO2 ranging from ~350 up to ~1,030 μatm and monitored the DOM molecular composition using ultrahigh-resolution Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). An increase of 37 μmol L−1 DOC was observed in all mesocosms during a phytoplankton bloom induced by the simulated upwelling. Indications of enhanced DOC accumulation under elevated CO2 became apparent during a phase of nutrient recycling toward the end of the experiment. The production of DOM was reflected in changes in the molecular DOM composition. Of the 7,212 molecular formulae detected throughout the experiment, ~50% correlated significantly in mass spectrometric signal intensity with cumulative bacterial protein production (BPP) and are likely products of microbial transformation. However, no differences in the produced compounds were found with respect to CO2 levels. Comparing the results of this experiment with a comparable OA experiment in the Swedish Gullmar Fjord reveals

  12. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre-scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The current, successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R&D.

  13. Biofuel Development and Large-Scale Land Deals in Sub-Saharan Africa

    OpenAIRE

    Giorgia Giovannetti; Elisa Ticci

    2013-01-01

    Over the last ten years, Africa's biofuel potential has increasingly attracted foreign investors' attention. We estimate the determinants of foreign investors' land demand for biofuel production in SSA, using Poisson specifications of the gravity model. Our estimates suggest that land availability, abundance of water resources and weak land governance are significant determinants of large-scale land acquisitions for biofuel production. This in turn suggests that this type of investment is mainl...

  14. Two-scale large deviations for chemical reaction kinetics through second quantization path integral

    International Nuclear Information System (INIS)

    Li, Tiejun; Lin, Feng

    2016-01-01

    Motivated by the study of rare events for a typical genetic switching model in systems biology, in this paper we aim to establish general two-scale large deviations for chemical reaction systems. We build a formal approach to explicitly obtain the large deviation rate functionals for the considered two-scale processes, based upon the second quantization path integral technique. We obtain three important types of large deviation results for the cases in which the two underlying timescales are in three different regimes. This is realized by singular perturbation analysis of the rate functionals obtained by the path integral. We find that the three regimes possess the same deterministic mean-field limit but completely different chemical Langevin approximations. The obtained results are natural extensions of the classical large-volume limit for chemical reactions. We also discuss the implications for single-molecule Michaelis–Menten kinetics. Our framework and results can be applied to understand general multi-scale systems, including diffusion processes. (paper)
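
    As a concrete toy version of the kinetics discussed here, the following is a Gillespie stochastic simulation of single-molecule Michaelis-Menten turnover, with fast binding/unbinding and slow catalysis providing the two-timescale structure; all rate constants are assumed illustrative values:

    ```python
    # Gillespie stochastic simulation of E + S <-> ES -> E + P.
    # Fast binding/unbinding vs. slow catalysis gives two timescales.
    import numpy as np

    rng = np.random.default_rng(1)
    k_on, k_off, k_cat = 10.0, 10.0, 0.1     # fast, fast, slow (illustrative)
    E, S, ES, P = 1, 500, 0, 0
    t, t_end = 0.0, 500.0

    while t < t_end:
        rates = np.array([k_on * E * S, k_off * ES, k_cat * ES])
        total = rates.sum()
        if total == 0.0:                     # substrate exhausted
            break
        t += rng.exponential(1.0 / total)    # waiting time to next reaction
        r = rng.choice(3, p=rates / total)   # which reaction fires
        if r == 0:    # E + S -> ES
            E, S, ES = E - 1, S - 1, ES + 1
        elif r == 1:  # ES -> E + S
            E, S, ES = E + 1, S + 1, ES - 1
        else:         # ES -> E + P
            E, ES, P = E + 1, ES - 1, P + 1

    print(f"simulated to t = {t:.1f}; product formed: {P}")
    ```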

  15. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  16. Efficient large-scale protein production of larvae and pupae of silkworm by Bombyx mori nuclear polyhedrosis virus bacmid system

    International Nuclear Information System (INIS)

    Motohashi, Tomoko; Shimojima, Tsukasa; Fukagawa, Tatsuo; Maenaka, Katsumi; Park, Enoch Y.

    2005-01-01

    Silkworm is one of the most attractive hosts for large-scale production of eukaryotic proteins, as well as of recombinant baculoviruses for gene transfer to mammalian cells. The bacmid system of Autographa californica nuclear polyhedrosis virus (AcNPV) has already been established and is widely used. However, AcNPV cannot infect silkworm. We developed the first practical Bombyx mori nuclear polyhedrosis virus bacmid system directly applicable to protein expression in silkworm. Using this system, green fluorescent protein was successfully expressed in silkworm larvae and pupae, not only by infection with its recombinant virus but also by direct injection of its bacmid DNA. This method provides rapid protein production in silkworm in as little as 10 days and is free from biohazard; it will thus be a powerful tool for the future production of recombinant eukaryotic proteins and baculoviruses

  17. Processes with large P_T in quantum chromodynamics

    International Nuclear Information System (INIS)

    Slepchenko, L.A.

    1981-01-01

    Necessary data on deep inelastic processes and hard hadron collision processes, and their interpretation in QCD, are presented. The power-law fall-off of exclusive and inclusive cross sections at large transverse momenta, and the electromagnetic and inelastic (structure-function) form factors of hadrons, are discussed. In searching for a method to take QCD effects into account, scaling violation is considered. It is shown that, at large transverse momenta, deep inelastic lepton-hadron scattering can be represented as scattering off a compound system (the hadron) in the impulse approximation. Under the assumption of a parton model, a hadron cross section calculated through a renormalized parton structure function is obtained. A proof of factorization in the leading logarithmic approximation of QCD is obtained by means of a quark-gluon diagram technique. The cross section of the hadron reaction is calculated in a factorized form analogous to that for lepton-hadron scattering. It is shown that (a) summing the diagrams with gluon emission generates scaling violation in the renormalized structure functions (SF) of quarks and gluons, and a running coupling constant arises simultaneously; (b) the character of the Bjorken scaling violation of the SF is the same as in deep inelastic lepton scattering. QCD problems that cannot be solved within the framework of perturbation theory are discussed. The evolution of the SF describing the bound state of a hadron, and the hadron light cone, are studied. Radiative corrections arising in two-loop and higher approximations are evaluated. QCD corrections to point-like power asymptotics of processes at high energies and momentum transfers are studied for the example of the inclusive production of quark and gluon jets. Quark counting rules for the anomalous dimensions of QCD are obtained. It is concluded that the considered limit of the inclusive cross sections is close to

  18. Extending SME to Handle Large-Scale Cognitive Modeling.

    Science.gov (United States)

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into the SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time, O(n^2 log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.

  19. Direct Down-scale Experiments of Concentration Column Designs for SHINE Process

    Energy Technology Data Exchange (ETDEWEB)

    Youker, Amanda J. [Argonne National Lab. (ANL), Argonne, IL (United States); Stepinski, Dominique C. [Argonne National Lab. (ANL), Argonne, IL (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-01

    Argonne is assisting SHINE Medical Technologies in their efforts to become a domestic Mo-99 producer. The SHINE accelerator-driven process uses a uranyl-sulfate target solution for the production of fission-product Mo-99. Argonne has developed a molybdenum recovery and purification process for this target solution. The process includes an initial Mo recovery column followed by a concentration column to reduce the product volume from 15-25 L to < 1 L prior to entry into the LEU Modified Cintichem (LMC) process for purification [1]. This report discusses direct down-scale experiments of the plant-scale concentration column design, where the effects of loading velocity and temperature were investigated.

  20. Methods for large-scale international studies on ICT in education

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.; Voogt, Joke; Knezek, G.A.

    2008-01-01

    International comparative assessment is a research method applied for describing and analyzing educational processes and outcomes. Such assessments are used to 'describe the status quo' in educational systems from an international comparative perspective. This chapter reviews different large-scale international

  1. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
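
    The tile low-rank idea can be sketched in a few lines of NumPy (an illustration of the compression principle only, not the HiCMA interface): an off-diagonal tile generated by a smooth kernel is replaced by a rank-k factorization, with the rank chosen from the singular-value decay. The kernel and tolerance below are our assumptions.

    ```python
    # Tile low-rank compression in miniature: replace a data-sparse tile
    # by U @ V.T with rank picked from the singular-value decay.
    import numpy as np

    def compress_tile(tile: np.ndarray, tol: float):
        """Return (U, V) with tile ~= U @ V.T; rank set by relative tolerance."""
        U, s, Vt = np.linalg.svd(tile, full_matrices=False)
        k = max(1, int(np.sum(s > tol * s[0])))
        return U[:, :k] * s[:k], Vt[:k, :].T

    n = 512
    x = np.linspace(0.0, 1.0, n)
    y = np.linspace(2.0, 3.0, n)                           # well-separated clusters
    tile = 1.0 / (1.0 + np.abs(x[:, None] - y[None, :]))   # smooth kernel => low rank

    U, V = compress_tile(tile, tol=1e-8)
    err = np.linalg.norm(tile - U @ V.T) / np.linalg.norm(tile)
    print(f"rank {U.shape[1]} of {n}; relative error {err:.1e}; "
          f"storage {U.size + V.size} vs {tile.size} entries")
    ```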

  3. Mass dependence of Higgs boson production at large transverse momentum through a bottom-quark loop

    Science.gov (United States)

    Braaten, Eric; Zhang, Hong; Zhang, Jia-Wei

    2018-05-01

    In the production of the Higgs boson through a bottom-quark loop, the transverse momentum distribution of the Higgs at large P_T is complicated by its dependence on two other important scales: the bottom-quark mass m_b and the Higgs mass m_H. A strategy for simplifying the calculation of the cross section at large P_T is to calculate only the leading terms in its expansion in m_b^2/P_T^2. In this paper, we consider the bottom-quark-loop contribution to the parton process q q̄ → H + g at leading order in α_s. We show that the leading power of 1/P_T^2 can be expressed in the form of a factorization formula that separates the large scale P_T from the scale of the masses. All the dependence on m_b and m_H can be factorized into a distribution amplitude for b b̄ in the Higgs, a distribution amplitude for b b̄ in a real gluon, and an endpoint contribution. The factorization formula can be used to organize the calculation of the leading terms in the expansion in m_b^2/P_T^2 so that every calculation involves at most two scales.

  4. Fed batch fermentation scale up in the production of recombinant streptokinase

    Directory of Open Access Journals (Sweden)

    Salvador Losada-Nerey

    2017-01-01

    Due to the high international demand for the recombinant streptokinase (Skr) produced at the National Center for Bioproducts (BioCen), it was necessary to increase the production capacity of the drug, since the current production volume does not cover the demand. The fermentation process for recombinant streptokinase was scaled up using fed-batch culture, from bench scale to a 300 L fermenter. The scaling criteria used were the intensive variables of the process, the volume ratios of fermentation medium and inoculum, the volumetric oxygen transfer coefficient, and the ratio of air volume to liquid flow, all of which were kept constant. With this scale-up procedure it was possible to reproduce the results obtained at bench scale and to double the biomass production volume with the same equipment, fulfilling all the quality requirements of the product and covering the current market demand. Techno-economic indicators demonstrated the feasibility of this option.
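
    A hedged sketch of the constant-kLa criterion named in the scaling criteria: using the common power-law correlation kLa = C·(P/V)^a·(vs)^b (Van't Riet form), holding power per volume and superficial gas velocity constant preserves kLa across scales. The exponents and the bench-scale operating point below are assumed illustrative values, not BioCen process data.

    ```python
    # Constant-kLa scale-up check with an assumed power-law correlation.
    a, b = 0.4, 0.5                            # typical exponents (assumed)
    bench = {"P_per_V": 2000.0, "vs": 0.005}   # W/m3 and m/s at bench scale (assumed)

    def kla_ratio(P_per_V, vs, ref=bench):
        """kLa(new) / kLa(reference); the prefactor C cancels in the ratio."""
        return (P_per_V / ref["P_per_V"]) ** a * (vs / ref["vs"]) ** b

    # Holding P/V and superficial gas velocity constant keeps kLa constant:
    print("kLa ratio at 300 L:", kla_ratio(P_per_V=2000.0, vs=0.005))   # -> 1.0
    ```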

  5. Biogas Production from Sugarcane Waste: Assessment on Kinetic Challenges for Process Designing

    Science.gov (United States)

    Janke, Leandro; Leite, Athaydes; Nikolausz, Marcell; Schmidt, Thomas; Liebetrau, Jan; Nelles, Michael; Stinner, Walter

    2015-01-01

    Biogas production from sugarcane waste has large potential for energy generation; however, to enable optimization of the anaerobic digestion (AD) process, the characteristics of each substrate should be carefully evaluated. In this study, the kinetic challenges for biogas production from different types of sugarcane waste were assessed. Samples of vinasse, filter cake, bagasse, and straw were analyzed in terms of total and volatile solids, chemical oxygen demand, macronutrients, trace elements, and nutritional value. Biochemical methane potential assays were performed to evaluate the energy potential of the substrates according to different types of sugarcane plants. Methane yields varied considerably (5-181 Nm3 per tonne of fresh matter), mainly due to the different substrate characteristics and sugar and/or ethanol production processes. Therefore, for the optimization of AD on a large scale, continuous stirred-tank reactors with long hydraulic retention times (>35 days) should be used for biogas production from bagasse and straw, coupled with a pre-treatment process to enhance the degradation of the fibrous carbohydrates. Biomass immobilization systems are recommended in case vinasse is used as substrate, due to its low solid content, while filter cake could complement the biogas production from vinasse during the sugarcane off-season, providing higher utilization of the biogas system throughout the year.
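
    The recommended design translates into a simple sizing rule, reactor volume = feed flow × hydraulic retention time; the feed flow below is an assumed illustrative value, only the >35-day HRT comes from the record.

    ```python
    # CSTR sizing from the abstract's guidance: volume = feed flow x HRT.
    feed_m3_per_day = 40.0        # assumed daily substrate feed (illustrative)
    hrt_days = 35.0               # ">35 days" recommended for bagasse and straw
    print(f"minimum reactor volume: {feed_m3_per_day * hrt_days:.0f} m3")
    ```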

  6. Economies of scale in cardiac surgery

    DEFF Research Database (Denmark)

    Lillrank, Paul; Chaudhuri, Atanu; Torkki, Paulus

    2015-01-01

    Objective: The objective of this paper is to investigate the impact of the scale of surgical units on the productivity of patient processes. Methods: The context, intervention, mechanism, output (CIMO) model of evaluation research is used. The scale-performance mechanisms are examined through resource intensity and throughput time per patient. The productivity of Coronary Artery Bypass Graft (CABG) surgery in a very large and a smaller hospital is compared. Results: While the large hospital performed 5.1 times more CABG surgeries per year than the smaller hospital, in terms of total resource consumption per patient it was 13% less productive. The large hospital had a 5% efficiency advantage in Operating Theatres (OTs), but it was 30% less efficient in ward care. Conclusions: Economies of scale are not found at the patient process level. Operating policies seem to assume more importance than scale.

  7. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from tests of the material resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  8. Large-scale manufacture and characterization of a lentiviral vector produced for clinical ex vivo gene therapy application.

    Science.gov (United States)

    Merten, Otto-Wilhelm; Charrier, Sabine; Laroudie, Nicolas; Fauchille, Sylvain; Dugué, Céline; Jenny, Christine; Audit, Muriel; Zanta-Boussif, Maria-Antonietta; Chautard, Hélène; Radrizzani, Marina; Vallanti, Giuliana; Naldini, Luigi; Noguiez-Hellin, Patricia; Galy, Anne

    2011-03-01

    From the perspective of a pilot clinical gene therapy trial for Wiskott-Aldrich syndrome (WAS), we implemented a process to produce a lentiviral vector under good manufacturing practices (GMP). The process is based on the transient transfection of 293T cells in Cell Factory stacks, scaled up to harvest 50 liters of viral stock per batch, followed by purification of the vesicular stomatitis virus glycoprotein-pseudotyped particles through several membrane-based and chromatographic steps. The process leads to a 200-fold volume concentration and an approximately 3-log reduction in protein and DNA contaminants. An average yield of 13% of infectious particles was obtained in six full-scale preparations. The final product contained low levels of contaminants such as simian virus 40 large T antigen or E1A sequences originating from producer cells. Titers as high as 2 × 10^9 infectious particles per milliliter were obtained, generating up to 6 × 10^11 infectious particles per batch. The purified WAS vector was biologically active, efficiently expressing the genetic insert in WAS protein-deficient B cell lines and transducing CD34+ cells. The vector introduced 0.3-1 vector copies per cell on average in CD34+ cells when used at a concentration of 10^8 infectious particles per milliliter, which is comparable to preclinical preparations. There was no evidence of cellular toxicity. These results show the implementation of large-scale GMP production, purification, and control of advanced HIV-1-derived lentiviral technology. Results obtained with the WAS vector provide the initial manufacturing and quality control benchmarking that should be helpful to further development and clinical applications.

  9. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into, and challenges of, distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  10. Optimizing in vitro large scale production of giant reed (Arundo donax L.) by liquid medium culture

    International Nuclear Information System (INIS)

    Cavallaro, Valeria; Patanè, Cristina; Cosentino, Salvatore L.; Di Silvestro, Isabella; Copani, Venera

    2014-01-01

    Tissue culture methods offer the potential for large-scale propagation of giant reed (Arundo donax L.), a promising crop for energy biomass. In previous trials, giant reed proved particularly suitable for in vitro culture. In this paper, with the final goal of enhancing the efficiency of the in vitro production process and reducing costs, the influence of four different culture media (agar- or gellan-gum-solidified medium, and liquid medium in a temporary immersion system (RITA®) or in a stationary state) on in vitro shoot proliferation of giant reed was evaluated. Giant reed exhibited a particular sensitivity to gelling agents during the phase of secondary shoot formation. Gellan gum, compared to agar, improved the efficiency of in vitro culture, giving more shoots with higher mean fresh and dry weight. Moreover, cultivation of this species in a liquid medium, under temporary immersion conditions or in a stationary state, was comparatively as effective as, and cheaper than, cultivation in a gellan gum medium. Increasing 6-benzylaminopurine (BA) up to 4 mg l-1 also resulted in a further enhancement of secondary shoot proliferation. The good adaptability of this species to liquid medium and the high multiplication rates observed indicate the possibility of obtaining, from a single node, at least 1200 plantlets every six multiplication cycles (about 6 months), a number 100-fold higher than that obtained yearly per plant by conventional methods of vegetative multiplication (see the arithmetic sketch after the highlights). In the open field, micropropagated plantlets guaranteed a higher number of surviving plants, secondary stems and above-ground biomass compared to rhizome-derived plants. - Highlights: • In vitro propagation offers the potential for large-scale propagation of giant reed. • The success of an in vitro protocol depends on the rate and mode of shoot proliferation. • Substituting liquid media for solid ones may decrease propagation costs in Arundo donax. • Giant reed showed good proliferation rates in
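
    The arithmetic behind the quoted multiplication figure, using only the values given in the abstract:

    ```python
    # Per-cycle multiplication factor implied by >=1200 plantlets from one
    # node after six cycles, and the projection it supports.
    plantlets, cycles = 1200, 6
    factor = plantlets ** (1 / cycles)
    print(f"implied multiplication factor per cycle: {factor:.2f}")   # ~3.27
    print(f"projection, factor 3.3 over 6 cycles: {round(3.3 ** 6)} plantlets")
    ```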

  11. Advances in ingredient and processing systems for meat and meat products.

    Science.gov (United States)

    Weiss, Jochen; Gibis, Monika; Schuh, Valerie; Salminen, Hanna

    2010-09-01

    Changes in consumer demand for meat products, as well as increased global competition, are causing an unprecedented spur in processing and ingredient system developments within the meat manufacturing sector. Consumers demand healthier meat products that are low in salt, fat, cholesterol, nitrites and calories in general, and that additionally contain health-promoting bioactive components such as carotenoids, unsaturated fatty acids, sterols, and fibers. On the other hand, consumers expect these novel meat products with altered formulations to taste, look and smell the same way as their traditionally formulated and processed counterparts. At the same time, competition is forcing the meat processing industry to use the increasingly expensive raw material "meat" more efficiently and to produce products at lower costs. With these changes in mind, this article presents a review of novel ingredient systems and processing approaches that are emerging to create high-quality, affordable meat products, not only in batch mode but also in large-scale continuous processes. Fat replacers, fat profile modification and cholesterol reduction techniques, new texture modifiers, and alternative antioxidant and antimicrobial systems are discussed. Modern processing equipment for establishing continuously operating product manufacturing lines, allowing new meat product structures to be created and novel ingredients to be effectively utilized (including vacuum fillers, grinders and fine dispersers, and slicers), is reviewed in the context of structure creation in meat products. Finally, trends in future developments of ingredient and processing systems for meat products are highlighted.

  12. Report of the Workshop on Petascale Systems Integration for LargeScale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  13. Most experiments done so far with limited plants. Large-scale testing ...

    Indian Academy of Sciences (India)

    Most experiments done so far with limited plants. Large-scale testing needs to be done with objectives such as: apart from primary transformants, their progenies must be tested; experiments on segregation, production of homozygous lines, analysis of expression levels in ...

  14. Investigation of factors influencing biogas production in a large-scale thermophilic municipal biogas plant

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, Agnes; Jerome, Valerie; Freitag, Ruth [Bayreuth Univ. (Germany). Chair for Process Biotechnology; Burghardt, Diana; Likke, Likke; Peiffer, Stefan [Bayreuth Univ. (Germany). Dept. of Hydrology; Hofstetter, Eugen M. [RVT Process Equipment GmbH, Steinwiesen (Germany); Gabler, Ralf [BKW Biokraftwerke Fuerstenwalde GmbH, Fuerstenwalde (Germany)

    2009-10-15

    A continuously operated, thermophilic, municipal biogas plant was observed over 26 months (sampling twice per month) with regard to a number of physicochemical parameters and biogas production. Biogas yields were correlated with parameters such as the volatile fatty acid concentration, the pH and the ammonium concentration. When the resident microbiota was classified via analysis of the 16S rRNA genes, most bacterial sequences matched unidentified or uncultured bacteria from similar habitats. Of the archaeal sequences, 78.4% were identified as belonging to the genus Methanoculleus, which has not previously been reported for biogas plants but is known to efficiently use the H2 and CO2 produced during the degradation of fatty acids by syntrophic microorganisms. In order to further investigate the influence of varied amounts of ammonia (2-8 g/L) and volatile fatty acids on biogas production and composition (methane/CO2), laboratory-scale satellite experiments were performed in parallel to the technical plant. Finally, ammonia stripping of the process water of the technical plant was implemented, a measure through which the ammonia entering the biogas reactor via the mash could be nearly halved, which increased the energy output of the biogas plant by almost 20%. (orig.)

  15. Development of Best Practices for Large-scale Data Management Infrastructure

    NARCIS (Netherlands)

    S. Stadtmüller; H.F. Mühleisen (Hannes); C. Bizer; M.L. Kersten (Martin); J.A. de Rijke (Arjen); F.E. Groffen (Fabian); Y. Zhang (Ying); G. Ladwig; A. Harth; M Trampus

    2012-01-01

    The amount of data available for processing is constantly increasing and is becoming more diverse. We collect our experiences on deploying large-scale data management tools on local-area clusters or cloud infrastructures and provide guidance to use these computing and storage

  16. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    From physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  17. Process and reactor design for biophotolytic hydrogen production.

    Science.gov (United States)

    Tamburic, Bojan; Dechatiwongse, Pongsathorn; Zemichael, Fessehaye W; Maitland, Geoffrey C; Hellgardt, Klaus

    2013-07-14

    The green alga Chlamydomonas reinhardtii has the ability to produce molecular hydrogen (H2), a clean and renewable fuel, through the biophotolysis of water under sulphur-deprived anaerobic conditions. The aim of this study was to advance the development of a practical and scalable biophotolytic H2 production process. Experiments were carried out using a purpose-built flat-plate photobioreactor, designed to facilitate green algal H2 production at the laboratory scale and equipped with a membrane-inlet mass spectrometry system to accurately measure H2 production rates in real time. The nutrient control method of sulphur deprivation was used to achieve spontaneous H2 production following algal growth. Sulphur dilution and sulphur feed techniques were used to extend algal lifetime in order to increase the duration of H2 production. The sulphur dilution technique proved effective at encouraging cyclic H2 production, resulting in alternating Chlamydomonas reinhardtii recovery and H2 production stages. The sulphur feed technique enabled photobioreactor operation in chemostat mode, resulting in a small improvement in H2 production duration. A conceptual design for a large-scale photobioreactor was proposed based on these experimental results. This photobioreactor has the capacity to enable continuous and economical H2 and biomass production using green algae. The success of these complementary approaches demonstrates that engineering advances can lead to improvements in the scalability and affordability of biophotolytic H2 production, giving increased confidence that H2 can fulfil its potential as a sustainable fuel of the future.

  18. Influence of basin-scale and mesoscale physical processes on biological productivity in the Bay of Bengal during the summer monsoon

    Science.gov (United States)

    Muraleedharan, K. R.; Jasmine, P.; Achuthankutty, C. T.; Revichandran, C.; Dinesh Kumar, P. K.; Anand, P.; Rejomon, G.

    2007-03-01

    Physical forcing plays a major role in determining biological processes in the ocean across the full spectrum of spatial and temporal scales. Variability of biological production in the Bay of Bengal (BoB) based on basin-scale and mesoscale physical processes is presented using hydrographic data collected during the peak summer monsoon in July-August 2003. Three different and spatially varying physical processes were identified in the upper 300 m: (I) an anticyclonic warm gyre offshore in the southern Bay; (II) a cyclonic eddy in the northern Bay; and (III) an upwelling region adjacent to the southern coast. In the warm gyre (>28.8 °C), the low-salinity (33.5) surface waters contained low concentrations of nutrients. These warm surface waters extended below the euphotic zone, which resulted in an oligotrophic environment with low surface chlorophyll a (0.12 mg m-3), low surface primary production (2.55 mg C m-3 day-1) and low zooplankton biovolume (0.14 ml m-3). In the cyclonic eddy, the elevated isopycnals raised the nutricline up to the surface (NO3-N > 8.2 μM, PO4-P > 0.8 μM, SiO4-Si > 3.5 μM). Despite the system being highly eutrophic, the response in biological activity was low. In the upwelling zone, although the nutrient concentrations were lower compared to the cyclonic eddy, the surface phytoplankton biomass and production were high (Chl a: 0.25 mg m-3, PP: 9.23 mg C m-3 day-1), and the mesozooplankton biovolume (1.12 ml m-3) was rich. Normally, in oligotrophic, open-ocean ecosystems, primary production is based on 'regenerated' nutrients, but during episodic events like eddies the production switches over to 'new production'. The switch from 'regenerated production' to 'new production' in the open ocean (cyclonic eddy), and the establishment of a new phytoplankton community, will take longer than in the coastal system (upwelling). Despite the functioning of a cyclonic eddy and upwelling being divergent (transporting of

  19. Exploiting deterministic maintenance opportunity windows created by conservative engineering design rules that result in free time locked into large high-speed coupled production lines with finite buffers

    Directory of Open Access Journals (Sweden)

    Durandt, Casper

    2016-08-01

    Conservative engineering design rules for large serial coupled production processes result in machines having locked-in free time (also called 'critical downtime' or 'maintenance opportunity windows'), which causes idle time if not used. Operators are not able to assess a large production process holistically, and so may not be aware that they form the current bottleneck, or that they have free time available due to interruptions elsewhere. A real-time method is developed to accurately calculate and display free time in location and magnitude, and efficiency improvements are demonstrated in large-scale production runs.
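
    A toy version of the free-time calculation for one machine pair around a finite buffer; the rates and buffer sizes below are assumed illustrative values, not the paper's plant data:

    ```python
    # Locked-in free time around one finite buffer: how long each neighbour
    # can stop before it disturbs the other.
    def free_windows(rate_up, rate_down, cap, level):
        """(minutes downstream may stop before blocking upstream,
            minutes upstream may stop before starving downstream)."""
        block = (cap - level) / rate_up if rate_up > 0 else float("inf")
        starve = level / rate_down if rate_down > 0 else float("inf")
        return block, starve

    # 10 units/min on both sides of a 600-unit buffer holding 150 units:
    block, starve = free_windows(rate_up=10, rate_down=10, cap=600, level=150)
    print(f"downstream window: {block:.0f} min, upstream window: {starve:.0f} min")
    ```

    Displaying such windows in real time is what lets an operator see that a machine can be stopped for maintenance without making the line's bottleneck idle.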

  20. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach; efficiency and cost-effectiveness are thus the driving forces in organizing engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  1. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

    According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale applications in Europe is approximately 8 million m², corresponding to about 4000 MW of thermal power. The 11 plants...... the last 10 years, and the corresponding cost per collector area for the final installed plant has been kept constant even though solar production has increased. Unfortunately, large-scale seasonal storage was not able to keep up with the advances in solar technology, at least for pit water and gravel storage...... of the total 51 plants are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m² of collector area, with new plants planned. The development of these plants and the involved technologies will be presented in this paper, with a focus on the improvements for Danish...

  2. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods due to their instantaneous, localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground because of the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of the CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity for the detection of large-scale gravity waves.
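
    A minimal sketch of the assemble-and-filter idea (the array layout, pixel scale and wavelength cutoff are assumptions; real processing would also involve star removal and projection onto a geographic grid):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def lowpass_mosaic(tiles, km_per_pixel=5.0, cutoff_km=100.0):
            """tiles: list of (image, row, col) already projected to a common grid."""
            nrow = max(r + t.shape[0] for t, r, c in tiles)
            ncol = max(c + t.shape[1] for t, r, c in tiles)
            mosaic = np.full((nrow, ncol), np.nan)
            for t, r, c in tiles:
                mosaic[r:r + t.shape[0], c:c + t.shape[1]] = t  # later tiles win overlaps
            filled = np.nan_to_num(mosaic, nan=np.nanmean(mosaic))  # pad gaps with the mean
            sigma = cutoff_km / (2.0 * np.pi * km_per_pixel)  # crude wavelength-to-sigma map
            return gaussian_filter(filled, sigma=sigma)  # keeps only the large-scale waves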

  3. "Large"- vs small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998, and Phys. Rev. Fluids 2, 62601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular those related to the discovery of (very) large-scale motions. The goals of the paper are as follows: first, to better characterise the physics of the control and assess what external contributions (vortices, forcing, wall motion) are actually needed; then, to investigate the optimal parameters; and, finally, to determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution also addresses the potential effect of the naturally occurring large-scale motions on frictional drag, and gives indications of the physical processes for potential drag reduction possible at all Reynolds numbers.

  4. Large-scale grain growth in the solid-state process: From "Abnormal" to "Normal"

    Science.gov (United States)

    Jiang, Minhong; Han, Shengnan; Zhang, Jingwei; Song, Jiageng; Hao, Chongyan; Deng, Manjiao; Ge, Lingjing; Gu, Zhengfei; Liu, Xinyu

    2018-02-01

    Abnormal grain growth (AGG) has been a common phenomenon in ceramic and metallurgical processing since prehistoric times. However, it has usually been very difficult to grow large single crystals (over centimeter scale) using the AGG method because of its so-called occasionality. Based on AGG, a solid-state crystal growth (SSCG) method was developed. The greatest advantages of the SSCG technology are the simplicity and cost-effectiveness of the technique, but traditional SSCG technology is still uncontrollable. This article first summarizes the history and current status of AGG, then reports recent technical developments from AGG to SSCG, and further introduces a new seed-free, solid-state crystal growth (SFSSCG) technology. This SFSSCG method allows us to repeatedly and controllably fabricate large-scale single crystals of appreciably high quality and relatively stable chemical composition at a relatively low temperature, at least in the (K0.5Na0.5)NbO3 (KNN) and Cu-Al-Mn systems. In this sense, the exaggerated grain growth is no longer 'abnormal' but 'normal', since it can now be artificially controlled and repeated. The article also provides a crystal growth model to qualitatively explain the mechanism of SFSSCG for the KNN system. Compared with the traditional melt and high-temperature solution growth methods, the SFSSCG method has the advantages of low energy consumption, low investment, a simple technique, and compositional homogeneity, overcoming the issues with incongruent melting and high volatility. SFSSCG could be helpful for improving the mechanical and physical properties of single crystals, which should be promising for industrial applications.

  5. Production of High Quality Die Steels from Large ESR Slab Ingots

    Science.gov (United States)

    Geng, Xin; Jiang, Zhou-hua; Li, Hua-bing; Liu, Fu-bin; Li, Xing

    With the rapid development of the manufacturing industry in China, die steels such as P20 and WSM718R are in great demand as large slab ingots of high quality and large tonnage. The solidification structure and size of large slab ingots produced by conventional methods are not satisfactory, whereas large slab ingots manufactured by the ESR process have a good solidification structure and sufficient section size. In the present research, the new slab ESR process was used to produce large die-steel slab ingots with a maximum size of 980 × 2000 × 3200 mm. Compact and sound ingots can be manufactured by the slab ESR process, and ultra-heavy plates with a maximum thickness of 410 mm were obtained after rolling the 49-ton ingots. Because it eliminates the cogging and forging steps, the ESR process for large slab ingots can greatly increase yield and production efficiency and significantly cut product costs.

  6. Innovation-driven efficient development of the Longwangmiao Fm large-scale sulfur gas reservoir in Moxi block, Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Xinhua Ma

    2016-03-01

    Full Text Available The Lower Cambrian Longwangmiao Fm gas reservoir in the Moxi block of the Anyue Gas Field, Sichuan Basin, is the largest single-sandbody integrated carbonate gas reservoir proved so far in China. Notwithstanding this reservoir's advantages, such as large-scale reserves and high single-well productivity, multiple complicated factors restrict its efficient development: a medium hydrogen sulfide content, low porosity and strong heterogeneity of the fracture-cave formation, various modes of gas-water occurrence, and a close relation between overpressure and stress sensitivity. Since only a few large-scale Cambrian carbonate gas reservoirs have ever been developed anywhere in the world, blind spots remain, especially regarding exploration and production behaviour. Moreover, for large-scale sulfur gas reservoirs, exploration and construction are costly and production testing in the early evaluation stage is severely limited, all of which brings great challenges and high potential risks to productivity construction. In this regard, in line with China's strategic demand for strengthening clean energy supply security, the PetroChina Southwest Oil & Gas Field Company has carried out research and field tests aimed at delivering high-production wells, optimizing development design, rapidly constructing high-quality productivity and upgrading HSE security in the Longwangmiao Fm gas reservoir of the Moxi block. Through innovations in technology and management mode over 3 years, this gas reservoir has been built into a modern large-scale gas field with high quality, high efficiency and high benefit; its annual capacity is now over 100 × 10⁸ m³, with production capacity and development indexes as originally anticipated. It has become a new model of efficient development of large-scale gas reservoirs, providing a reference for other types of gas reservoirs in China.

  7. The LHC Cryomagnet Supports in Glass-Fiber Reinforced Epoxy: A Large Scale Industrial Production with High Reproducibility in Performance

    CERN Document Server

    Poncet, A; Trigo, J; Parma, V

    2008-01-01

    The approximately 1700 LHC main-ring superconducting magnets are supported within their cryostats on 4700 column-type supports with low heat in-leak. The supports were designed to ensure precise and stable positioning of the heavy dipole and quadrupole magnets while keeping thermal conduction heat loads within budget. A trade-off between mechanical and thermal properties, as well as cost considerations, led to the choice of glass-fibre-reinforced epoxy (GFRE). Resin Transfer Moulding (RTM), featuring a high level of automation and control, was the manufacturing process retained to ensure reproducible performance of the supports throughout the large production run. The Spanish aerospace company EADS-CASA Espacio developed the specific RTM process and produced the total quantity of supports between 2001 and 2004. This paper describes the development and production of the supports, and presents the production experience and the achieved performance.

  8. THE LHC CRYOMAGNET SUPPORTS IN GLASS-FIBER REINFORCED EPOXY: A LARGE SCALE INDUSTRIAL PRODUCTION WITH HIGH REPRODUCIBILITY IN PERFORMANCE

    International Nuclear Information System (INIS)

    Poncet, A.; Struik, M.; Parma, V.; Trigo, J.

    2008-01-01

    The approximately 1700 LHC main-ring superconducting magnets are supported within their cryostats on 4700 column-type supports with low heat in-leak. The supports were designed to ensure precise and stable positioning of the heavy dipole and quadrupole magnets while keeping thermal conduction heat loads within budget. A trade-off between mechanical and thermal properties, as well as cost considerations, led to the choice of glass-fibre-reinforced epoxy (GFRE). Resin Transfer Moulding (RTM), featuring a high level of automation and control, was the manufacturing process retained to ensure reproducible performance of the supports throughout the large production run. The Spanish aerospace company EADS-CASA Espacio developed the specific RTM process and produced the total quantity of supports between 2001 and 2004. This paper describes the development and production of the supports, and presents the production experience and the achieved performance.

  9. Research Update: Large-area deposition, coating, printing, and processing techniques for the upscaling of perovskite solar cell technology

    Directory of Open Access Journals (Sweden)

    Stefano Razza

    2016-09-01

    Full Text Available To bring perovskite solar cells to the industrial world, performance must be maintained at the photovoltaic module scale. Here we present large-area manufacturing and processing options applicable to large-area cells and modules. Printing and coating techniques such as blade coating, slot-die coating, spray coating, screen printing, inkjet printing, and gravure printing (as alternatives to spin coating), as well as vacuum- or vapor-based deposition and laser patterning techniques, are being developed for an effective scale-up of the technology. The latter also enable the manufacture of solar modules on flexible substrates, an option beneficial for many applications and for roll-to-roll production.

  10. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size, for example in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and the underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  11. Using Agent-Based Models to Optimize Large Scale Network for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for reducing capital expenses. The models used in this paper rely either on computational algorithms or on procedure implementations developed in Matlab to simulate agent-based models, using clusters that provide high-performance computing to run the programs in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  12. A COMBINED REACTION/PRODUCT RECOVERY PROCESS FOR THE CONTINUOUS PRODUCTION OF BIODIESEL

    International Nuclear Information System (INIS)

    Birdwell, J.F. Jr.; McFarlane, J.; Schuh, D.L.; Tsouris, C.; Day, J.N.; Hullette, J.N.

    2009-01-01

    Oak Ridge National Laboratory (ORNL) and Nu-Energie, LLC entered into a Cooperative Research And Development Agreement (CRADA) for the purpose of demonstrating and deploying a novel technology for the continuous synthesis and recovery of biodiesel from the transesterification of triglycerides. The focus of the work was the demonstration of a combination Couette reactor and centrifugal separator - an invention of ORNL researchers - that facilitates both product synthesis and recovery from reaction byproducts in the same apparatus. At present, transesterification of triglycerides to produce biodiesel is performed in batch-type reactors with an excess of a chemical catalyst, which is required to achieve high reactant conversions in reasonable reaction times (e.g., 1 hour). The need for long reactor residence times requires the use of large reactors and ancillary equipment (e.g., feed and product tankage), and correspondingly large facilities, in order to obtain the economy of scale required to make the process economically viable. Hence, the goal of this CRADA was to demonstrate successful, extended operation of a laboratory-scale reactor/separator prototype processing typical industrial reactant materials, and to design, fabricate, and test a production-scale unit for deployment at the biodiesel production site. Because of its ease of operation, rapid attainment of steady state, high mass transfer and phase separation efficiencies, and compact size, a centrifugal contactor was chosen for intensification of the biodiesel production process. The unit was modified to increase the residence time from a few seconds to minutes. For this application, liquid phases were introduced into the reactor as separate streams: one composed of the methanol and base catalyst, the other of the soy oil used in the experiments. Following reaction in the mixing zone, the immiscible glycerine and methyl ester products were separated in the high-speed rotor and collected from separate
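
    The residence-time requirement above reduces to the textbook relation tau = V/Q; the numbers below are illustrative only, not CRADA data:

        # Residence time of a continuous reactor: tau = V / Q.
        def residence_time_s(volume_l, flow_l_per_min):
            return 60.0 * volume_l / flow_l_per_min

        # Stretching a contactor's residence time from seconds to minutes at a
        # fixed flow rate means enlarging its effective mixing volume accordingly:
        print(residence_time_s(volume_l=2.0, flow_l_per_min=0.5))  # 240 s, i.e. 4 min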

  13. A COMBINED REACTION/PRODUCT RECOVERY PROCESS FOR THE CONTINUOUS PRODUCTION OF BIODIESEL

    Energy Technology Data Exchange (ETDEWEB)

    Birdwell, J.F., Jr.; McFarlane, J.; Schuh, D.L.; Tsouris, C; Day, J.N. (Nu-Energie, LLC); Hullette, J.N. (Nu-Energie, LLC)

    2009-09-01

    Oak Ridge National Laboratory (ORNL) and Nu-Energie, LLC entered into a Cooperative Research And Development Agreement (CRADA) for the purpose of demonstrating and deploying a novel technology for the continuous synthesis and recovery of biodiesel from the transesterification of triglycerides. The focus of the work was the demonstration of a combination Couette reactor and centrifugal separator - an invention of ORNL researchers - that facilitates both product synthesis and recovery from reaction byproducts in the same apparatus. At present, transesterification of triglycerides to produce biodiesel is performed in batch-type reactors with an excess of a chemical catalyst, which is required to achieve high reactant conversions in reasonable reaction times (e.g., 1 hour). The need for long reactor residence times requires the use of large reactors and ancillary equipment (e.g., feed and product tankage), and correspondingly large facilities, in order to obtain the economy of scale required to make the process economically viable. Hence, the goal of this CRADA was to demonstrate successful, extended operation of a laboratory-scale reactor/separator prototype processing typical industrial reactant materials, and to design, fabricate, and test a production-scale unit for deployment at the biodiesel production site. Because of its ease of operation, rapid attainment of steady state, high mass transfer and phase separation efficiencies, and compact size, a centrifugal contactor was chosen for intensification of the biodiesel production process. The unit was modified to increase the residence time from a few seconds to minutes. For this application, liquid phases were introduced into the reactor as separate streams: one composed of the methanol and base catalyst, the other of the soy oil used in the experiments. Following reaction in the mixing zone, the immiscible glycerine and methyl ester products were separated in the high-speed rotor and collected from separate

  14. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture and the wheels. The control system design covers hardware and software: the hardware is based mainly on a single-chip microcomputer system, and the software implements the photoelectric autocollimator readout and the automatic data acquisition process. The device can acquire vertical measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  15. Idealised modelling of storm surges in large-scale coastal basins

    NARCIS (Netherlands)

    Chen, Wenlong

    2015-01-01

    Coastal areas around the world are frequently attacked by various types of storms, threatening human life and property. This study aims to understand storm surge processes in large-scale coastal basins, particularly focusing on the influences of geometry, topography and storm characteristics on the

  16. An efficient method based on the uniformity principle for synthesis of large-scale heat exchanger networks

    International Nuclear Information System (INIS)

    Zhang, Chunwei; Cui, Guomin; Chen, Shang

    2016-01-01

    Highlights: • Two dimensionless uniformity factors are presented for heat exchanger networks. • The grouping of process streams reduces the computational complexity of large-scale HENS problems. • The optimal sub-network can be obtained by a Powell particle swarm optimization algorithm. • The method is illustrated by a case study involving 39 process streams, with a better solution. - Abstract: The optimal design of large-scale heat exchanger networks is a difficult task due to their inherent non-linear characteristics and the combinatorial nature of heat exchangers. To solve large-scale heat exchanger network synthesis (HENS) problems, two dimensionless uniformity factors that describe heat exchanger network (HEN) uniformity in terms of the temperature difference and the accuracy of process stream grouping are deduced. Additionally, a novel algorithm that combines deterministic and stochastic optimization to obtain an optimal sub-network with a suitable heat load for a given group of streams is proposed, named Powell particle swarm optimization (PPSO). As a result, the synthesis of large-scale heat exchanger networks is divided into two corresponding sub-parts: the grouping of process streams and the optimization of sub-networks. This approach reduces the computational complexity and increases the efficiency of the proposed method. The robustness and effectiveness of the method are demonstrated by solving a large-scale HENS problem involving 39 process streams, and the results obtained are better than those previously published in the literature.
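
    The paper's exact factor definitions are not reproduced above, so the following is only a hedged illustration of what a dimensionless temperature-difference uniformity factor can look like: one minus the coefficient of variation of the exchanger temperature differences, so that values closer to 1 indicate a more uniform network.

        import statistics

        def temp_uniformity(delta_ts):
            """delta_ts: temperature differences (K) across each exchanger."""
            mean = statistics.fmean(delta_ts)
            return 1.0 - statistics.pstdev(delta_ts) / mean

        print(temp_uniformity([12.0, 15.0, 14.0, 13.0]))  # ~0.92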

  17. Large scale Brownian dynamics of confined suspensions of rigid particles

    Science.gov (United States)

    Sprinkle, Brennan; Balboa Usabiaga, Florencio; Patankar, Neelesh A.; Donev, Aleksandar

    2017-12-01

    We introduce methods for large-scale Brownian dynamics (BD) simulation of many rigid particles of arbitrary shape suspended in a fluctuating fluid. Our method adds Brownian motion to the rigid multiblob method [F. Balboa Usabiaga et al., Commun. Appl. Math. Comput. Sci. 11(2), 217-296 (2016)] at a cost comparable to that of deterministic simulations. We demonstrate that we can efficiently generate deterministic and random displacements for many particles using preconditioned Krylov iterative methods, provided kernel methods to efficiently compute the action of the Rotne-Prager-Yamakawa (RPY) mobility matrix and its "square root" are available for the given boundary conditions. These kernel operations can be computed with near-linear scaling for periodic domains using the positively split Ewald method. Here we study particles partially confined by gravity above a no-slip bottom wall, using a graphics processing unit implementation of the mobility matrix-vector product combined with a preconditioned Lanczos iteration for generating Brownian displacements. We address a major challenge in large-scale BD simulations: capturing the stochastic drift term that arises because of the configuration-dependent mobility. Unlike the widely used Fixman midpoint scheme, our methods utilize random finite differences and do not require the solution of resistance problems or the computation of the action of the inverse square root of the RPY mobility matrix. We construct two temporal schemes which are viable for large-scale simulations, an Euler-Maruyama traction scheme and a trapezoidal slip scheme, which minimize the number of mobility problems to be solved per time step while capturing the required stochastic drift terms. We validate and compare these schemes numerically by modeling suspensions of boomerang-shaped particles sedimented near a bottom wall. Using the trapezoidal scheme, we investigate the steady-state active motion in dense suspensions of confined microrollers, whose
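
    A one-dimensional sketch of an Euler-Maruyama step in which the stochastic drift kT*dM/dx is estimated by random finite differences rather than analytic derivatives; the mobility and force functions and all parameters are illustrative stand-ins, not the rigid-multiblob implementation:

        import numpy as np

        rng = np.random.default_rng(0)

        def em_rfd_step(x, mobility, force, kT=1.0, dt=1e-3, delta=1e-4):
            w = rng.standard_normal()  # random direction for the finite difference
            # E[(kT/delta)*(M(x + delta*w/2) - M(x - delta*w/2))*w] = kT*M'(x)
            drift = (kT / delta) * (mobility(x + 0.5 * delta * w)
                                    - mobility(x - 0.5 * delta * w)) * w
            noise = np.sqrt(2.0 * kT * mobility(x) * dt) * rng.standard_normal()
            return x + dt * (mobility(x) * force(x) + drift) + noise

        # Example: height-dependent (always positive) mobility, weak downward force.
        x = 1.0
        for _ in range(1000):
            x = em_rfd_step(x, mobility=lambda h: 1.0 / (1.0 + np.exp(-h)),
                            force=lambda h: -0.5)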

  18. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max and collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease with decreasing x, the ratio of the mirror plasma temperature to that of the ordinary plasma. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large-scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the more closely mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally to the nonlinear regime, because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  19. Preserving biological diversity in the face of large-scale demands for biofuels

    International Nuclear Information System (INIS)

    Cook, J.J.; Beyea, J.; Keeler, K.H.

    1991-01-01

    Large-scale production and harvesting of biomass to replace fossil fuels could reduce biological diversity by eliminating habitat for native species. Forests would be managed and harvested more intensively, and virtually all arable land unsuitable for high-value agriculture or silviculture might be used to grow crops dedicated to energy. Given the prospect of a potentially large increase in biofuel production, it is time to develop strategies for mitigating the loss of biodiversity that might ensue. Planning at micro to macro scales will be crucial to minimizing the ecological impacts of producing biofuels. In particular, cropping and harvesting systems will need to provide the biological, spatial, and temporal diversity characteristic of natural ecosystems and successional sequences if this technology is to support the environmental health of the world rather than compromise it. Incorporation of these ecological values will be necessary to forestall costly environmental restoration, even at the cost of submaximal biomass productivity; it is therefore doubtful that all managers will take the longer view. Since the costs of biodiversity loss are largely external to economic markets, society cannot rely on the market to protect biodiversity, and some sort of intervention will be necessary. 116 refs., 1 tab

  20. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 kW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in removing toxic organic chemicals from contaminated water and in disinfecting various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. The estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation discusses experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications.
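
    The calorimetric estimate reduces to simple arithmetic: 1 Gy = 1 J/kg, so with the specific heat of water (~4186 J/(kg·K)) the absorbed dose is roughly 4186 Gy per kelvin of temperature rise. A minimal sketch:

        def dose_gy(delta_t_kelvin, cp_j_per_kg_k=4186.0):
            """Absorbed dose in Gy from the beam-induced temperature rise in water."""
            return cp_j_per_kg_k * delta_t_kelvin

        print(dose_gy(0.12))  # a 0.12 K rise corresponds to ~502 Gy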

  1. Impacts of large-scale climatic disturbances on the terrestrial carbon cycle

    Directory of Open Access Journals (Sweden)

    Lucht Wolfgang

    2006-07-01

    Full Text Available Abstract Background The amount of carbon dioxide in the atmosphere steadily increases as a consequence of anthropogenic emissions, but with large interannual variability caused by the terrestrial biosphere. These variations in the CO2 growth rate are caused by large-scale climate anomalies, but the relative contributions of vegetation growth and soil decomposition are uncertain. We use a biogeochemical model of the terrestrial biosphere to differentiate the effects of temperature and precipitation on net primary production (NPP) and heterotrophic respiration (Rh) during the two largest anomalies in atmospheric CO2 increase during the last 25 years. One of these, the smallest atmospheric year-to-year increase (largest land carbon uptake) in that period, was caused by global cooling in 1992/93 after the Pinatubo volcanic eruption. The other, the largest atmospheric increase on record (largest land carbon release), was caused by the strong El Niño event of 1997/98. Results We find that the LPJ model correctly simulates the magnitude of terrestrial modulation of atmospheric carbon anomalies for these two extreme disturbances. The response of soil respiration to changes in temperature and precipitation explains most of the modelled anomalous CO2 flux. Conclusion Observed and modelled NEE anomalies are in good agreement; we therefore suggest that the temporal variability of heterotrophic respiration produced by our model is reasonably realistic. We conclude that during the last 25 years the two largest disturbances of the global carbon cycle were strongly controlled by soil processes rather than by the response of vegetation to these large-scale climatic events.
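
    The attribution above rests on simple flux bookkeeping. With the sign convention (assumed here) that a positive net ecosystem exchange (NEE) denotes carbon release to the atmosphere,

        NEE = R_h - NPP,        ΔNEE = ΔR_h - ΔNPP,

    so an anomaly in the atmospheric CO2 growth rate splits into a respiration-driven part and a production-driven part.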

  2. Re-thinking China's densified biomass fuel policies: Large or small scale?

    International Nuclear Information System (INIS)

    Shan, Ming; Li, Dingkai; Jiang, Yi; Yang, Xudong

    2016-01-01

    Current policies and strategies for the utilization of densified biomass fuel (DBF) in China focus mainly on medium- or large-scale manufacturing modes, which cannot provide feasible solutions to the household energy problems of China's rural areas. To simplify the commercial processes of DBF feedstock collection and of fuel production and utilization, a novel village-scale DBF approach is proposed. Pilot demonstration projects have shown the feasibility and flexibility of this new approach for realizing sustainable development in rural China. Effective utilization of DBF in rural China will yield global, regional, and local gains in energy savings, environmental protection, sustainable development, and related social benefits. It could also help other developing countries make better use of biomass as a viable household energy source. This proposal therefore offers the possibility of reciprocal gains, and as such deserves the attention of policy makers and various stakeholders. - Highlights: •A field survey of Chinese densified biomass fuel (DBF) development is conducted. •The current situation and problems related to China's DBF industry are analyzed. •A novel and viable village-scale DBF utilization mode is proposed. •Further actions are suggested to boost the utilization of DBF in rural China.

  3. Inducing a health-promoting change process within an organization: the effectiveness of a large-scale intervention on social capital, openness, and autonomous motivation toward health.

    Science.gov (United States)

    van Scheppingen, Arjella R; de Vroome, Ernest M M; Ten Have, Kristin C J M; Bos, Ellen H; Zwetsloot, Gerard I J M; van Mechelen, W

    2014-11-01

    To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324) were used to examine the effects on bonding social capital, openness, and autonomous motivation toward health and on employees' lifestyle, health, vitality, and sustainable employability. Also, the sensitivity of the intervention components was examined. Intervention effects were found for bonding social capital, openness toward health, smoking, healthy eating, and sustainable employability. The effects were primarily attributable to the intervention's dialogue component. The change process initiated by the large-scale intervention contributed to a social climate in the workplace that promoted health and ownership toward health. The study confirms the relevance of collective change processes for health promotion.

  4. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and to make nanotechnology widely available for general public use and application in numerous devices. There is therefore an enormous need for new methods or routes for the synthesis of these nanostructures, and plasma technologies for the synthesis of NWs, nanotubes, nanoparticles and other nanostructures may play a key role in the near future. This paper presents the three-dimensional problem of large-scale synthesis, connecting the time, quantity and quality of nanostructures. Four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition and wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  5. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  6. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog-to-digital converter circuits that are based on the sharing of common resources, including those critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other, more costly approaches. The circuits are intended for use in a large-scale processing and digitizing system for high-energy physics detectors such as drift chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining an adequate signal-to-noise ratio. Noise, in both the amplitude and time-jitter senses, is held sufficiently low that conversions with 10-bit charge resolution and 12-bit time resolution are achieved.

  7. Large-scale solvothermal synthesis of fluorescent carbon nanoparticles

    International Nuclear Information System (INIS)

    Ku, Kahoe; Park, Jinwoo; Kim, Nayon; Kim, Woong; Lee, Seung-Wook; Chung, Haegeun; Han, Chi-Hwan

    2014-01-01

    The large-scale production of high-quality carbon nanomaterials is highly desirable for a variety of applications. We demonstrate a novel synthetic route to the production of fluorescent carbon nanoparticles (CNPs) in large quantities via a single-step reaction. The simple heating of a mixture of benzaldehyde, ethanol and graphite oxide (GO) with residual sulfuric acid in an autoclave produced 7 g of CNPs with a quantum yield of 20%. The CNPs can be dispersed in various organic solvents; hence, they are easily incorporated into polymer composites in forms such as nanofibers and thin films. Additionally, we observed that the GO present during the CNP synthesis was reduced. The reduced GO (RGO) was sufficiently conductive (σ ≈ 282 S m⁻¹) that it could be used as an electrode material in a supercapacitor, where it provides excellent capacitive behavior and high-rate capability. This work will contribute greatly to the development of efficient synthetic routes to diverse carbon nanomaterials, including CNPs and RGO, that are suitable for a wide range of applications. (paper)

  8. Formation of Large-scale Coronal Loops Interconnecting Two Active Regions through Gradual Magnetic Reconnection and an Associated Heating Process

    Science.gov (United States)

    Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin

    2018-06-01

    Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and are therefore considered an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″-200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, based on successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXR and EUV through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.

  9. Development of a system for product tracking and data acquisition for the irradiation process in large gamma irradiators

    International Nuclear Information System (INIS)

    Soares, Jose Roberto

    2010-01-01

    The sterilization of medical care products using ionizing radiation is a consolidated technique. In Brazil, gamma irradiators with capacities between 0.37 PBq (10 kCi) and 185 PBq (5 MCi), using the radioisotope 60Co as the radiation source, are in operation. The developed work provides accurate control and data acquisition for the application of Good Manufacturing Practices during all phases of an irradiation process, as required by ANVISA standards, ISO standards and IAEA technical recommendations for the treatment of foods and medical products. All the steps involved in the irradiation treatment are mapped into a process flow (workflow), in which each agent (participant) has its systematized tasks. The data acquisition, monitoring and control are based on a set of tools (under free software licenses) integrated through an efficient communication network, including the use of Web resources. All the development was carried out using the Multipurpose Gamma Irradiator at IPEN/CNEN/USP, to be applied in irradiator facilities operating at industrial scale. The system enables complete traceability of the process, in real time, for any participant, as well as storage of the corresponding records for auditing. (author)

  10. Development of a system for product tracking and data acquisition for the irradiation process in large gamma irradiators

    International Nuclear Information System (INIS)

    Soares, Jose R.; Rela, Paulo R.; Costa, Fabio E.

    2011-01-01

    The sterilization of medical care products using ionizing radiation is a consolidated technique. In Brazil, gamma irradiators with capacities between 0.37 PBq (10 kCi) and 185 PBq (5 MCi), using the radioisotope 60Co as the radiation source, are in operation. The developed work provides accurate control and data acquisition for the application of good manufacturing practices during all phases of an irradiation process, as required by ANVISA standards, ISO standards and IAEA technical recommendations for the treatment of foods and medical products. All the steps involved in the irradiation treatment are mapped into a process flow (workflow), in which each agent (participant) has its systematized tasks. The automatic process data acquisition, using wireless ZigBee technology, together with monitoring and control, is based on a set of tools (under free software licenses) integrated through an efficient communication network, including the use of Web resources. All the development was carried out using the Multipurpose Gamma Irradiator at IPEN/CNEN-SP, to be applied in irradiator facilities operating at industrial scale. The system enables complete traceability of the process, in real time, for any participant, as well as storage of the corresponding records for auditing. (author)

  11. Development of a system for product tracking and data acquisition for the irradiation process in large gamma irradiators

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Jose R., E-mail: joseroberto.soares@mackenzie.br [Universidade Presbiteriana Mackenzie. Escola de Engenharia. Sao Paulo, SP (Brazil); Rela, Paulo R.; Costa, Fabio E., E-mail: prela@ipen.br, E-mail: fecosta@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The sterilization of medical care products using ionizing radiation is a consolidated technique. In Brazil, gamma irradiators with capacities between 0.37 PBq (10 kCi) and 185 PBq (5 MCi), using the radioisotope 60Co as the radiation source, are in operation. The developed work provides accurate control and data acquisition for the application of good manufacturing practices during all phases of an irradiation process, as required by ANVISA standards, ISO standards and IAEA technical recommendations for the treatment of foods and medical products. All the steps involved in the irradiation treatment are mapped into a process flow (workflow), in which each agent (participant) has its systematized tasks. The automatic process data acquisition, using wireless ZigBee technology, together with monitoring and control, is based on a set of tools (under free software licenses) integrated through an efficient communication network, including the use of Web resources. All the development was carried out using the Multipurpose Gamma Irradiator at IPEN/CNEN-SP, to be applied in irradiator facilities operating at industrial scale. The system enables complete traceability of the process, in real time, for any participant, as well as storage of the corresponding records for auditing. (author)

  12. Development of large-scale manufacturing of adipose-derived stromal cells for clinical applications using bioreactors and human platelet lysate.

    Science.gov (United States)

    Haack-Sørensen, Mandana; Juhl, Morten; Follin, Bjarke; Harary Søndergaard, Rebekka; Kirchhoff, Maria; Kastrup, Jens; Ekblond, Annette

    2018-04-17

    In vitro expanded adipose-derived stromal cells (ASCs) are a useful resource for tissue regeneration. Translation of small-scale autologous cell production into a large-scale, allogeneic production process for clinical applications necessitates well-chosen raw materials and a suitable cell culture platform. We compare the use of clinical-grade human platelet lysate (hPL) and fetal bovine serum (FBS) as growth supplements for ASC expansion in the automated, closed hollow-fibre quantum cell expansion system (bioreactor). Stromal vascular fractions (SVF) were isolated from human subcutaneous abdominal fat. On average, 95 × 10⁶ cells were suspended in 10% FBS or 5% hPL medium and loaded into a bioreactor coated with cryoprecipitate. ASCs (P0) were harvested, and 30 × 10⁶ ASCs were reloaded for continued expansion (P1). Feeding rate and time of harvest were guided by metabolic monitoring. Viability, sterility, purity, differentiation capacity, and genomic stability of ASCs at P1 were determined. Cultivation of SVF in hPL medium for on average nine days yielded 546 × 10⁶ ASCs, compared to 111 × 10⁶ ASCs after 17 days in FBS medium. P1 yields averaged 605 × 10⁶ ASCs (PD [population doublings]: 4.65) after six days in hPL medium, compared to 119 × 10⁶ ASCs (PD: 2.45) after 21 days in FBS medium. The ASCs fulfilled ISCT criteria and demonstrated genomic stability and sterility. The use of hPL as a growth supplement for ASC expansion in the quantum cell expansion system provides a more efficient expansion process than FBS, while maintaining cell quality appropriate for clinical use. The described process is an obvious choice for the manufacture of large-scale allogeneic ASC products.
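
    Population doublings follow directly from the seed and harvest counts, PD = log2(N_harvest / N_seed). A quick check with the P1 numbers quoted above (30 × 10⁶ seeded, 605 × 10⁶ harvested) gives ~4.3 PD rather than the reported 4.65, so the published figure presumably reflects a different effective seeding count; the sketch below shows only the arithmetic:

        from math import log2

        def population_doublings(n_seed, n_harvest):
            return log2(n_harvest / n_seed)

        print(population_doublings(30e6, 605e6))  # ~4.33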

  13. Cross-scale modelling of the climate-change mitigation potential of biochar systems: Global implications of nano-scale processes

    Science.gov (United States)

    Woolf, Dominic; Lehmann, Johannes

    2014-05-01

    production, land use, thermochemical conversion (to both biochar and energy products), climate, economics, and also the interactions between these components. Early efforts to model the life-cycle impacts of biochar systems have typically used simple empirical estimates of the strength of various feedback mechanisms, such as the impact of biochar on crop-growth, soil GHG fluxes, and native soil organic carbon. However, an environmental management perspective demands consideration of impacts over a longer time-scale and in broader agroecological situations than can be reliably extrapolated from simple empirical relationships derived from trials and experiments of inevitably limited scope and duration. Therefore, reliable quantification of long-term and large-scale impacts demands an understanding of the fundamental underlying mechanisms. Here, a systems-modelling approach that incorporates mechanistic assumptions will be described, and used to examine how uncertainties in the biogeochemical processes which drive the biochar-plant-soil interactions (particularly those responsible for priming, crop-growth and soil GHG emissions) translate into sensitivities of large scale and long-term impacts. This approach elucidates the aspects of process-level biochar biogeochemistry most critical to determining the large-scale GHG and economic impacts, and thus provides a useful guide to future model-led research.

  14. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological...... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  15. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy-intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN's commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October. [Photo caption: Participants at the energy management for large-scale scientific infrastructures workshop.] The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  16. Downstream Processing of Synechocystis for Biofuel Production

    Science.gov (United States)

    Sheng, Jie

    Lipids and free fatty acids (FFA) from the cyanobacterium Synechocystis can be used for biofuel (e.g., biodiesel or renewable diesel) production. In order to utilize and scale up this technique, downstream processes including culturing and harvest, cell disruption, and extraction were studied. Several solvents/solvent systems were screened for lipid extraction from Synechocystis. The chloroform + methanol-based Folch and Bligh & Dyer methods proved to be the "gold standard" for small-scale analysis due to their highest lipid recoveries, attributable to their penetration of the cell membranes, higher polarity, and stronger interaction with hydrogen bonds. Less toxic solvents, such as methanol and MTBE, or direct transesterification of biomass (without a pre-extraction step), gave only slightly lower lipid-extraction yields and can be considered for large-scale application. Sustained exposure to high- and low-temperature extremes severely lowered biomass and lipid productivity. Temperature stress also triggered changes in lipid quality, such as the degree of unsaturation; it thus affected both the productivity and the quality of Synechocystis-derived biofuel. Pulsed electric field (PEF) treatment was evaluated for cell disruption prior to lipid extraction. A treatment intensity > 35 kWh/m3 caused significant damage to the plasma membrane, cell wall, and thylakoid membrane, and even led to complete disruption of some cells into fragments. Treatment by PEF enhanced the ability of the low-toxicity solvent isopropanol to access lipid molecules during subsequent solvent extraction, leading to lower isopropanol usage for the same extraction efficiency. Other cell-disruption methods were also tested. Distinct disruption effects on the cell envelope, plasma membrane, and thylakoid membranes were observed and related to extraction efficiency. Microwave and ultrasound treatment significantly enhanced lipid extraction. Autoclaving, ultrasound, and French press caused significant

  17. Large-Scale Image Analytics Using Deep Learning

    Science.gov (United States)

    Ganguly, S.; Nemani, R. R.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Votava, P.

    2014-12-01

    High-resolution land cover classification maps are needed to increase the accuracy of current land ecosystem and climate model outputs. Few studies demonstrate the state of the art in deriving very high resolution (VHR) land cover products, and most methods rely heavily on commercial software that is difficult to scale to large regions of study (e.g., continents to the globe). Complexities in present approaches relate to (a) scalability of the algorithm, (b) large image data processing (compute- and memory-intensive), (c) computational cost, (d) massively parallel architecture, and (e) machine learning automation. In addition, VHR satellite datasets are of the order of terabytes, and features extracted from these datasets are of the order of petabytes. In the present study, we acquired the National Agricultural Imaging Program (NAIP) dataset for the continental United States at a spatial resolution of 1 m. These data come as image tiles (a total of a quarter million image scenes with ~60 million pixels) and have a total size of ~100 terabytes for a single acquisition; features extracted from the entire dataset would amount to ~8-10 petabytes. In our proposed approach, we implemented a novel semi-automated machine learning algorithm rooted in the principles of "deep learning" to delineate the percentage of tree cover. In order to perform image analytics in such a granular system, it is mandatory to devise an intelligent archiving and query system for image retrieval, file structuring, metadata processing and filtering of all available image scenes. Using the NASA Earth Exchange (NEX) initiative, a partnership with Amazon Web Services (AWS), we developed an end-to-end architecture for designing the database and the deep belief network (following the DistBelief computing model) to solve the grand challenge of scaling this process across the quarter million NAIP tiles that cover the entire continental United States. The
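
    At this scale the classification is embarrassingly parallel at the tile level; the skeleton below shows only that fan-out (the paths, pre-extracted probability arrays and thresholding are placeholders standing in for the actual deep-belief-network inference):

        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        def percent_tree_cover(tile_path):
            # Placeholder: a real system would run the trained network here and
            # average the per-pixel tree probability over the tile.
            pixels = np.load(tile_path)  # hypothetical pre-computed probability array
            return tile_path, float((pixels > 0.5).mean())

        def run(tile_paths, workers=32):
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return dict(pool.map(percent_tree_cover, tile_paths))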

  18. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international effort, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus to examine the overall system effect, and one using a plate core testing apparatus to test individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  19. Water Resources Implications of Cellulosic Biofuel Production at a Regional Scale

    Science.gov (United States)

    Christopher, S. F.; Schoenholtz, S. H.; Nettles, J. E.

    2011-12-01

    Recent increases in oil prices, a strong national interest in greater energy independence, and concern over the role of fossil fuels in global climate change have led to a dramatic expansion in the use of alternative renewable energy sources in the U.S. The U.S. government has mandated production of 36 billion gallons of renewable fuels by 2022, of which 16 billion gallons are required to be cellulosic biofuels. Production of cellulosic biomass offers a promising alternative to corn-based systems because large-scale production of corn-based ethanol often requires irrigation and is associated with increased erosion, excess sediment export, and enhanced leaching of nitrogen and phosphorus. Although cultivation of switchgrass using standard agricultural practices is one option being considered for production of cellulosic biomass, intercropping cellulosic biofuel crops within managed forests could provide feedstock without primary land use change or the water quality impacts associated with annual crops. Catchlight Energy LLC is examining the feasibility and sustainability of intercropping switchgrass in loblolly pine plantations in the southeastern U.S. Ongoing research is determining efficient operational techniques and the information needed to evaluate the effects of these practices on water resources in small watershed-scale (~25 ha) studies. Three sets of four to five sub-watersheds are fully instrumented and currently collecting calibration data in North Carolina, Alabama, and Mississippi. These watershed studies will provide detailed information for understanding processes and guiding management decisions. However, the environmental implications of cellulosic systems also need to be examined at a regional scale. We used the Soil and Water Assessment Tool (SWAT), a physically based hydrologic model, to examine the water quantity effects of land use change scenarios ranging from switchgrass intercropping on a small percentage of managed pine forest land to conversion of all managed

  20. Guidelines for the scale-up of an aqueous ceramic process: a case study of statistical process control

    OpenAIRE

    Mortara, L.; Alcock, Jeffrey R.

    2011-01-01

    Process scale-up is the change from a feasibility study in a laboratory to a full-scale prototype production process. It is an important issue for the ceramics industry, but has been the subject of relatively little systematic research. This paper shows how certain manufacturing concepts used in a number of industries can be applied to the scale-up of a feasibility-study-level, aqueous tape casting process. In particular, it examines the elements of process standardisa...

  1. Pilot-scale grout production test with a simulated low-level waste

    International Nuclear Information System (INIS)

    Fow, C.L.; Mitchell, D.H.; Treat, R.L.; Hymas, C.R.

    1987-05-01

    Plans are underway at the Hanford Site near Richland, Washington, to convert the low-level fraction of radioactive liquid wastes to a grout form for permanent disposal. Grout is a mixture of liquid waste and grout formers, including portland cement, fly ash, and clays. In the plan, the grout slurry is pumped to subsurface concrete vaults on the Hanford Site, where the grout will solidify into large monoliths, thereby immobilizing the waste. A similar disposal concept is being planned at the Savannah River Laboratory site. The underground disposal of grout was conducted at Oak Ridge National Laboratory between 1966 and 1984. Design and construction of grout processing and disposal facilities are underway. The Transportable Grout Facility (TGF), operated by Rockwell Hanford Operations (Rockwell) for the Department of Energy (DOE), is scheduled to grout Phosphate/Sulfate N Reactor Operations Waste (PSW) in FY 1988. Phosphate/Sulfate Waste is a blend of two low-level waste streams generated at Hanford's N Reactor. Other wastes are scheduled to be grouted in subsequent years. Pacific Northwest Laboratory (PNL) is verifying that Hanford grouts can be safely and efficiently processed. To meet this objective, pilot-scale grout process equipment was installed. On July 29 and 30, 1986, PNL conducted a pilot-scale grout production test for Rockwell. During the test, 16,000 gallons of simulated nonradioactive PSW were mixed with grout formers to produce 22,000 gallons of PSW grout. The grout was pumped at a nominal rate of 15 gpm (about 25% of the nominal production rate planned for the TGF) to a lined and covered trench with a capacity of 30,000 gallons. Emplacement of grout in the trench will permit subsequent evaluation of homogeneity of grout in a large monolith. 12 refs., 34 figs., 5 tabs
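
    A quick arithmetic check of the figures quoted above (illustrative only): pumping 22,000 gallons at 15 gpm accounts for the roughly 24 hours of the two-day test, and the stated 25% ratio implies a TGF nominal rate of about 60 gpm.

        # Rough check of the pilot-scale test figures quoted in the abstract.
        grout_gal = 22_000           # grout produced from 16,000 gal simulated PSW
        pilot_gpm = 15               # nominal pilot pumping rate
        tgf_gpm = pilot_gpm / 0.25   # pilot rate is ~25% of the planned TGF rate

        hours = grout_gal / pilot_gpm / 60
        print(f"pumping time at {pilot_gpm} gpm: {hours:.1f} h")  # ~24 h, i.e. the two-day test
        print(f"implied TGF nominal rate: {tgf_gpm:.0f} gpm")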

  2. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large-scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as "cosmic anomalies", include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies ("bulk flows"), the measurement of inhomogeneous values of the fine structure constant on cosmological scales ("alpha dipole"), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints, and their potential to explain the large-scale cosmic anomalies.

  3. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    Science.gov (United States)

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-02-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications.

  4. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  5. The Proposal of Scaling the Roles in Scrum of Scrums for Distributed Large Projects

    OpenAIRE

    Abeer M. AlMutairi; M. Rizwan Jameel Qureshi

    2015-01-01

    Scrum of Scrums is an approach used to scale the traditional Scrum methodology to fit the development of complex and large projects. However, scaling the roles of Scrum members brings new challenges, especially in distributed and large software projects. This paper describes in detail the roles of each Scrum member in Scrum of Scrums and proposes a solution using a dedicated product owner per team and the inclusion of sub-backlogs. The main goal of the proposed solution i...

  6. Analysis of Utilization of Fecal Resources in Large-scale Livestock and Poultry Breeding in China

    Directory of Open Access Journals (Sweden)

    XUAN Meng

    2018-02-01

    Full Text Available The purpose of this paper is to develop a systematic investigation of the serious problems of livestock and poultry breeding in China and of the technical demand for promoting the utilization of manure. Based on the status quo of large-scale livestock and poultry farming in typical areas of China, statistics and analysis of the modes and proportions of utilization of manure resources were carried out. The statistical method was applied to the country-identified large-scale farms for which the total amount of pollutant reduction was in accordance with the "12th Five-Year Plan" standards. The results showed that there were some differences in the modes of resource utilization of livestock and poultry manure at different scales and for different types: (1) hogs, dairy cattle and beef cattle together accounted for more than 75% of the agricultural manure storage; (2) laying hens and broiler chickens accounted for about 65% of the total production of organic manure produced from feces. It is demonstrated that the major modes of resource utilization of dung and urine were related to the natural characteristics, agricultural production methods, farming scale and economic development level of the area. It was concluded that unreasonable planning, lack of cleaning during breeding, and poor selection of manure utilization modes were the major problems in China's large-scale livestock and poultry fecal resource utilization.

  7. History matching of large scale fractures to production data; Calage de la geometrie des reseaux de fractures aux donnees hydrodynamiques de production d'un champ petrolier

    Energy Technology Data Exchange (ETDEWEB)

    Jenni, S.

    2005-01-01

    Object-based models are very helpful for representing complex geological media such as fractured reservoirs. For building realistic fracture networks, these models have to be constrained to both static data (seismic, geomechanics, geology) and dynamic data (well tests and production history). In this report we present a procedure for the calibration of large-scale fracture networks to production history. The history matching procedure includes realistic geological modeling and a parameterization method that is coherent with the geological model and allows an efficient optimization. Fluid flow modeling is based on a dual-medium approach. The calibration procedure was applied to a semi-synthetic case based on a real fractured reservoir, and calibration to water-cut data was performed. (author)
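
    The calibration to water-cut data amounts to minimizing a production-history mismatch over the fracture-network parameters. The sketch below illustrates that loop with a hypothetical simulate_watercut stand-in for the dual-medium flow simulator; it is not the author's code.

        import numpy as np
        from scipy.optimize import minimize

        t_obs = np.linspace(0, 1000, 50)                        # days
        wc_obs = 1 / (1 + np.exp(-(t_obs - 500) / 80))          # synthetic "observed" water-cut

        def simulate_watercut(params, t):
            # Hypothetical stand-in for the flow simulator: a logistic
            # breakthrough curve parameterized by breakthrough time and width.
            t_break, spread = params
            return 1 / (1 + np.exp(-(t - t_break) / spread))

        def mismatch(params):
            # Least-squares distance between simulated and observed history.
            return np.sum((simulate_watercut(params, t_obs) - wc_obs) ** 2)

        res = minimize(mismatch, x0=[300.0, 50.0], method="Nelder-Mead")
        print(res.x)   # recovers parameters close to [500, 80]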

  8. Large-scale gas dynamical processes affecting the origin and evolution of gaseous galactic halos

    Science.gov (United States)

    Shapiro, Paul R.

    1991-01-01

    Observations of galactic halo gas are consistent with an interpretation in terms of the galactic fountain model, in which supernova-heated gas in the galactic disk escapes into the halo, radiatively cools and forms clouds which fall back to the disk. The results of a new study of several large-scale gas dynamical effects which are expected to occur in such a model for the origin and evolution of galactic halo gas will be summarized, including the following: (1) nonequilibrium absorption line and emission spectrum diagnostics for radiatively cooling halo gas in our own galaxy, as well as the implications of such absorption line diagnostics for the origin of quasar absorption lines in galactic halo clouds of high redshift galaxies; (2) numerical MHD simulations and analytical analysis of large-scale explosions and superbubbles in the galactic disk and halo; (3) numerical MHD simulations of halo cloud formation by thermal instability, with and without magnetic field; and (4) the effect of the galactic fountain on the galactic dynamo.

  9. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in the smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performances of the energy systems and to balance the power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables...
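
    The core of each EMPC step is an economic dispatch: meet forecast consumption at minimum cost subject to unit capacities. A toy single-step version under assumed costs and capacities (not the thesis code) can be written with scipy:

        from scipy.optimize import linprog

        cost = [30.0, 50.0, 5.0]    # EUR/MWh for unit1, unit2, wind (assumed)
        cap = [100.0, 80.0, 40.0]   # MW capacity per unit (wind = forecast)
        demand = 150.0              # MW forecast consumption

        # minimize cost @ x  subject to  sum(x) == demand,  0 <= x_i <= cap_i
        res = linprog(c=cost,
                      A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],
                      bounds=list(zip([0.0] * 3, cap)))
        print(res.x)   # -> [100., 10., 40.]: cheapest units are fully used first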

  10. High performance nanostructured Silicon heterojunction for water splitting on large scales

    KAUST Repository

    Bonifazi, Marcella

    2017-11-02

    In past years the global demand for energy has been increasing steeply, as has the awareness that new sources of clean energy are essential. Photo-electrochemical (PEC) devices for water splitting applications have stirred great interest, and different approaches have been explored to improve the efficiency of these devices and to avoid optical losses at the interfaces with water. These include engineering materials and nanostructuring the device's surfaces [1]-[2]. Despite the promising initial results, there are still many drawbacks that need to be overcome to reach large-scale production with optimized performance [3]. We present a new device that relies on the optimization of a nanostructuring process that exploits suitably disordered surfaces. Additionally, this device could harvest light on both sides to efficiently gain and store the energy to keep the photocatalytic reaction active.

  11. High performance nanostructured Silicon heterojunction for water splitting on large scales

    KAUST Repository

    Bonifazi, Marcella; Fu, Hui-chun; He, Jr-Hau; Fratalocchi, Andrea

    2017-01-01

    In past years the global demand for energy has been increasing steeply, as has the awareness that new sources of clean energy are essential. Photo-electrochemical (PEC) devices for water splitting applications have stirred great interest, and different approaches have been explored to improve the efficiency of these devices and to avoid optical losses at the interfaces with water. These include engineering materials and nanostructuring the device's surfaces [1]-[2]. Despite the promising initial results, there are still many drawbacks that need to be overcome to reach large-scale production with optimized performance [3]. We present a new device that relies on the optimization of a nanostructuring process that exploits suitably disordered surfaces. Additionally, this device could harvest light on both sides to efficiently gain and store the energy to keep the photocatalytic reaction active.

  12. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    Science.gov (United States)

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  13. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    Science.gov (United States)

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  14. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment...

  15. Large scale disposal of waste sulfur: From sulfide fuels to sulfate sequestration

    International Nuclear Information System (INIS)

    Rappold, T.A.; Lackner, K.S.

    2010-01-01

    Petroleum industries produce more byproduct sulfur than the market can absorb. As a consequence, most sulfur mines around the world have closed down, large stocks of yellow sulfur have piled up near remote operations, and growing amounts of toxic H2S are disposed of in the subsurface. Unless sulfur demand drastically increases or thorough disposal practices are developed, byproduct sulfur will persist as a chemical waste problem on the scale of 10^7 tons per year. We review industrial practices, salient sulfur chemistry, and the geochemical cycle to develop sulfur management concepts at the appropriate scale. We contend that the environmentally responsible disposal of sulfur would involve conversion to sulfuric acid followed by chemical neutralization with equivalent amounts of base, which common alkaline rocks can supply cheaply. The resulting sulfate salts are benign and suitable for brine injection underground or release to the ocean, where they would cause minimal disturbance to ecosystems. Sequestration costs can be recouped by taking advantage of the fuel-grade thermal energy released in the process of oxidizing reduced compounds and sequestering the products. Sulfate sequestration can eliminate stockpiles and avert the proliferation of enriched H2S stores underground while providing plenty of carbon-free energy to hydrocarbon processing.
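
    The base requirement implied by this route can be estimated from stoichiometry. Taking limestone as an illustrative alkaline rock, each mole of H2SO4 consumes one mole of CaCO3 (H2SO4 + CaCO3 -> CaSO4 + H2O + CO2), so roughly three tons of limestone are needed per ton of sulfur:

        # Stoichiometric estimate of the alkaline rock requirement implied by
        # the sulfate-sequestration route (limestone chosen for illustration).
        M_S, M_CACO3 = 32.06, 100.09      # g/mol
        sulfur_t_per_yr = 1e7             # ~10^7 t/yr byproduct sulfur (from text)

        limestone_per_t_S = M_CACO3 / M_S # 1:1 molar ratio -> ~3.1 t CaCO3 per t S
        print(f"{limestone_per_t_S:.2f} t CaCO3 per t S")
        print(f"{sulfur_t_per_yr * limestone_per_t_S:.2e} t CaCO3 per year")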

  16. Large-scale climatic anomalies affect marine predator foraging behaviour and demography

    Science.gov (United States)

    Bost, Charles A.; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-01

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently the distances that penguins foraged from the colony and their feeding depths increased and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies affecting a marine predator through changes in its at-sea behaviour and demography, despite lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  17. Visual attention mitigates information loss in small- and large-scale neural codes

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502

  18. The influence of control parameter estimation on large scale geomorphological interpretation of pointclouds

    Science.gov (United States)

    Dorninger, P.; Koma, Z.; Székely, B.

    2012-04-01

    In recent years, laser scanning, also referred to as LiDAR, has proved to be an important tool for topographic data acquisition. Basically, laser scanning acquires a more or less homogeneously distributed point cloud. These points represent natural objects like terrain and vegetation as well as man-made objects such as buildings, streets, powerlines, or other constructions. Due to the enormous amount of data provided by current scanning systems, which capture up to several hundred thousand points per second, the immediate application of such point clouds for large-scale interpretation and analysis is often prohibitive due to restrictions of the hardware and software infrastructure. To overcome this, numerous methods for the determination of derived products exist. Commonly, Digital Terrain Models (DTM) or Digital Surface Models (DSM) are derived to represent the topography using a regular grid as data structure. The obvious advantages are a significant reduction of the amount of data and the introduction of an implicit neighborhood topology enabling the application of efficient post-processing methods. The major disadvantages are the loss of 3D information (i.e. overhangs) as well as the loss of information due to the interpolation approach used. We introduced a segmentation approach enabling the determination of planar structures within a given point cloud. It was originally developed for the purpose of building modeling but has proven to be well suited for large-scale geomorphological analysis as well. The result is an assignment of the original points to a set of planes. Each plane is represented by its plane parameters. Additionally, numerous quality and quantity parameters are determined (e.g. aspect, slope, local roughness, etc.). In this contribution, we investigate the influence of the control parameters required for the plane segmentation on the geomorphological interpretation of the derived product. The respective control parameters may be determined
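
    To make the derived per-plane quantities concrete, the sketch below fits a least-squares plane to a candidate point set and derives slope, aspect and local roughness from it. It illustrates the quantities discussed, not the authors' segmentation algorithm:

        import numpy as np

        def fit_plane(pts):
            """Least-squares plane through Nx3 points; unit normal + rms residual."""
            centroid = pts.mean(axis=0)
            _, _, vt = np.linalg.svd(pts - centroid)
            normal = vt[-1]                           # direction of least variance
            residuals = (pts - centroid) @ normal
            return normal, np.sqrt((residuals ** 2).mean())

        # Synthetic near-planar patch: z = 0.3 x plus noise.
        pts = np.random.rand(200, 3)
        pts[:, 2] = 0.3 * pts[:, 0] + 0.02 * np.random.randn(200)

        n, roughness = fit_plane(pts)
        slope = np.degrees(np.arccos(abs(n[2])))      # angle from horizontal (~16.7 deg)
        aspect = np.degrees(np.arctan2(n[1], n[0]))   # orientation (sign depends on normal)
        print(f"slope {slope:.1f} deg, roughness {roughness:.3f}")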

  19. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    Science.gov (United States)

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study of 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  20. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  1. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-04-30

    We explore visualization and abstraction approaches to represent neuronal data. Neuroscientists acquire electron microscopy volumes to reconstruct a complete wiring diagram of the neurons in the brain, called the connectome. This will be crucial to understanding brains and their development. However, the resulting data is complex and large, posing a big challenge to existing visualization techniques in terms of clarity and scalability. We describe solutions to tackle the problems of scalability and cluttered presentation. We first show how a query-guided interactive approach to visual exploration can reduce the clutter and help neuroscientists explore their data dynamically. We use a knowledge-based query algebra that facilitates the interactive creation of queries. This allows neuroscientists to pose domain-specific questions related to their research. Simple queries can be combined to form complex queries to answer more sophisticated questions. We then show how visual abstractions from 3D to 2D can significantly reduce the visual clutter and add clarity to the visualization so that scientists can focus more on the analysis. We abstract the topology of 3D neurons into a multi-scale, relative distance-preserving subway map visualization that allows scientists to interactively explore the morphological and connectivity features of neuronal cells. We then focus on the process of acquisition, where neuroscientists segment electron microscopy images to reconstruct neurons. The segmentation process of such data is tedious, time-intensive, and usually performed using a diverse set of tools. We present a novel web-based visualization system for tracking the state, progress, and evolution of segmentation data in neuroscience. Our multi-user system seamlessly integrates a diverse set of tools. Our system provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large

  2. Continuous production of fullerenes and other carbon nanomaterials on a semi-industrial scale using plasma technology

    International Nuclear Information System (INIS)

    Gruenberger, T.M.; Gonzalez-Aguilar, J.; Fulcheri, L.; Fabry, F.; Grivei, E.; Probst, N.; Flamant, G.; Charlier, J.-C.

    2002-01-01

    A new production method is presented allowing the production of bulk quantities of fullerenes and other carbon nanomaterials using a 3-phase thermal plasma (260 kW). The main characteristics of this method lie in the independent control of the carbon throughput by injection of a solid carbon feedstock, and in the immediate extraction of the synthesised product from the reactor, allowing production on a continuous basis. The currently investigated plasma facility is of an intermediate scale between lab size and an industrial pilot plant, ready for further upscaling to an industrial size. The influence of a large number of different carbon precursors, plasma gases and operating conditions on the fullerene yield has been studied. At this stage, quantities of up to 1 kg of carbon can be processed per hour, with further scope for increase, leading to production rates for this type of material not achievable with any other technology at present

  3. The synthesis of alternatives for the bioconversion of waste-monoethanolamine from large-scale CO2-removal processes

    Energy Technology Data Exchange (ETDEWEB)

    Ohtaguchi, Kazuhisa; Yokoyama, Takahisa [Tokyo Inst. of Tech. (Japan). Dept. of Chemical Engineering]

    1998-12-31

    The alternatives for the bioconversion of monoethanolamine (MEA), which would appear in large quantities in the industrial effluent of the CO2-removal processes of power companies, have been proposed by investigating the ability of some microorganisms to deaminate MEA. An evaluation of the biotechnology, which includes production from MEA of acetic acid and acetaldehyde with Escherichia coli, and of formic and acetic acids with Clostridium formicoaceticum, confirms and extends our earlier remarks on the availability of ecotechnology for solving the above problem. (Author)

  4. Algorithm of search and track of static and moving large-scale objects

    Directory of Open Access Journals (Sweden)

    Kalyaev Anatoly

    2017-01-01

    Full Text Available We suggest an algorithm for processing an image sequence to search for and track static and moving large-scale objects. A possible software implementation of the algorithm, based on multithreaded CUDA processing, is suggested. An experimental analysis of the suggested implementation is performed.
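
    The abstract does not give the algorithm's internals; as a minimal illustration of one detect-and-track step on an image sequence, a CPU/numpy frame-differencing sketch follows (the paper's implementation is multithreaded CUDA; this only shows the logic, not the performance):

        import numpy as np

        def detect_motion(prev, curr, thresh=25):
            """Difference two frames, threshold, return centroid of changed pixels."""
            diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > thresh
            if not diff.any():
                return None
            ys, xs = np.nonzero(diff)
            return float(xs.mean()), float(ys.mean())

        prev = np.zeros((64, 64), dtype=np.uint8)
        curr = prev.copy()
        curr[30:40, 10:20] = 200            # a "large-scale object" appears
        print(detect_motion(prev, curr))    # -> (14.5, 34.5)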

  5. Economic viability of large-scale fusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Helsley, Charles E., E-mail: cehelsley@fusionpowercorporation.com; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, that can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each with up to 10 to be serviced by one accelerator. At first this seems expensive but that impression has to be tempered by the energy output that is equal to 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system

  6. Economic viability of large-scale fusion systems

    International Nuclear Information System (INIS)

    Helsley, Charles E.; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, that can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each with up to 10 to be serviced by one accelerator. At first this seems expensive but that impression has to be tempered by the energy output that is equal to 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system economically
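
    Using only the figures quoted in the abstract (a $20 billion accelerator, ten ignition chambers at $1.5 billion each, and an output equal to 35 conventional 1-GWe plants), the implied capital cost per unit of capacity can be checked directly; generators and heat exchangers are excluded, so this is illustrative only:

        # Rough capital-cost-per-kW check from the abstract's figures.
        capital_usd = 20e9 + 10 * 1.5e9   # accelerator + 10 ignition chambers
        output_kwe = 35 * 1e6             # 35 GWe expressed in kWe
        print(f"${capital_usd / output_kwe:.0f}/kWe")   # -> $1000/kWe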

  7. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  8. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lb/h); (ii) low cost (<$10/lb) for steel, iron and aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including (i) CAD to PART software, (ii) selection of energy source, (iii

  9. Evaluation of factors controlling global secondary organic aerosol production from cloud processes

    Directory of Open Access Journals (Sweden)

    C. He

    2013-02-01

    Full Text Available Secondary organic aerosols (SOA) exert a significant influence on ambient air quality and regional climate. Recent field, laboratory and modeling studies have confirmed that in-cloud processes contribute a large fraction of SOA production, with large space-time heterogeneity. This study evaluates the key factors that govern the production of cloud-process SOA (SOAcld) on a global scale based on the GFDL coupled chemistry-climate model AM3, in which full cloud chemistry is employed. The association between the SOAcld production rate and six factors (i.e., liquid water content (LWC), total carbon chemical loss rate (TCloss), temperature, VOC/NOx, OH, and O3) is examined. We find that LWC alone determines the spatial pattern of SOAcld production, particularly over the tropical, subtropical and temperate forest regions, and is strongly correlated with SOAcld production. TCloss ranks second and mainly represents the seasonal variability of vegetation growth. Other individual factors are essentially uncorrelated spatiotemporally with SOAcld production. We find that the rate of SOAcld production is simultaneously determined by both LWC and TCloss, but responds linearly to LWC and nonlinearly (or concavely) to TCloss. A parameterization based on LWC and TCloss can capture well the spatial and temporal variability of the process-based SOAcld formation (R2 = 0.5) and can be easily applied to global three-dimensional models to represent the SOA production from cloud processes.
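
    The described parameterization, linear in LWC and concave in TCloss, can be written as rate = a * LWC * TCloss**b with b < 1. A sketch of fitting such a form to synthetic data follows (coefficients are illustrative, not the AM3 values):

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)
        lwc = rng.uniform(0.05, 0.5, 200)       # liquid water content (synthetic)
        tcloss = rng.uniform(0.1, 5.0, 200)     # total carbon chemical loss rate
        rate = 2.0 * lwc * tcloss**0.6 * rng.lognormal(0, 0.1, 200)

        def model(x, a, b):
            # Linear in LWC, concave (b < 1) in TCloss.
            lwc, tcloss = x
            return a * lwc * tcloss**b

        (a, b), _ = curve_fit(model, (lwc, tcloss), rate, p0=[1.0, 1.0])
        print(a, b)   # recovers roughly a ~ 2.0, b ~ 0.6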

  10. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination

    NARCIS (Netherlands)

    Wood, F.; Kowalczuk, J.; Elwyn, G.; Mitchell, C.; Gallacher, J.

    2011-01-01

    BACKGROUND: Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies

  11. The Very Large Array Data Processing Pipeline

    Science.gov (United States)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and the calibration and imaging results produced by the pipeline. Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an
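
    The "context" pattern described above can be illustrated generically: atomic stages consult heuristics, record their decisions, and append results that a weblog could render. The sketch below is a plain-Python illustration, not the CASA/VLA pipeline API; the stage name and parameter are invented:

        class Context:
            """Shared state accumulating heuristic decisions and stage results."""
            def __init__(self):
                self.results, self.decisions = [], []

        class Stage:
            name = "stage"
            def heuristics(self, ctx):
                # Decide execution parameters, possibly from earlier results.
                return {}
            def run(self, ctx):
                params = self.heuristics(ctx)
                ctx.decisions.append((self.name, params))
                ctx.results.append((self.name, f"{self.name} done with {params}"))

        class Flagging(Stage):
            name = "flagging"
            def heuristics(self, ctx):
                return {"edge_channels": 0.05}   # illustrative value only

        ctx = Context()
        for stage in [Flagging()]:
            stage.run(ctx)
        print(ctx.decisions, ctx.results)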

  12. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, an ongoing D&D project for which a firm fixed-price contract had been issued to the D&D contractor. Thus, interference with the baseline D&D project could have had significant financial implications. Other challenges included defining and selecting meaningful technology demonstrations, finding and selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D

  13. Colour dynamics in large p_T hadron production on nuclei

    International Nuclear Information System (INIS)

    Kopeliovich, B.Z.; Niedermayer, F.

    1984-01-01

    The colour dynamics of hadron production with large transverse momentum (p_T) on nuclei is investigated. Retardation by colour forces of coloured objects propagating through nuclear matter leads to considerable shadowing of hard processes inside the nucleus. This explains the weak A dependence of the production cross section for large p_T meson pairs. The small absorption of compressed hadronic configurations inside the nucleus explains the linear A dependence of pp-pair production

  14. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe

    NARCIS (Netherlands)

    Blaas, H.; Kroeze, C.

    2014-01-01

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed.

  15. Algorithm 873: LSTRS: MATLAB Software for Large-Scale Trust-Region Subproblems and Regularization

    DEFF Research Database (Denmark)

    Rojas Larrazabal, Marielba de la Caridad; Santos, Sandra A.; Sorensen, Danny C.

    2008-01-01

    A MATLAB 6.0 implementation of the LSTRS method is presented. LSTRS was described in Rojas, M., Santos, S.A., and Sorensen, D.C., A new matrix-free method for the large-scale trust-region subproblem, SIAM J. Optim., 11(3):611-646, 2000. LSTRS is designed for large-scale quadratic problems with one... at each step. LSTRS relies on matrix-vector products only and has low and fixed storage requirements, features that make it suitable for large-scale computations. In the MATLAB implementation, the Hessian matrix of the quadratic objective function can be specified either explicitly, or in the form... of a matrix-vector multiplication routine. Therefore, the implementation preserves the matrix-free nature of the method. A description of the LSTRS method and of the MATLAB software, version 1.2, is presented. Comparisons with other techniques and applications of the method are also included. A guide...

  16. High-Temperature-Short-Time Annealing Process for High-Performance Large-Area Perovskite Solar Cells.

    Science.gov (United States)

    Kim, Minjin; Kim, Gi-Hwan; Oh, Kyoung Suk; Jo, Yimhyun; Yoon, Hyun; Kim, Ka-Hyun; Lee, Heon; Kim, Jin Young; Kim, Dong Suk

    2017-06-27

    Organic-inorganic hybrid metal halide perovskite solar cells (PSCs) are attracting tremendous research interest due to their high solar-to-electric power conversion efficiency, the high possibility of cost-effective fabrication, and a certified power conversion efficiency now exceeding 22%. Although many effective methods for their application have been developed over the past decade, their practical transition to large-size devices has been restricted by difficulties in achieving high performance. Here we report on the development of a simple and cost-effective production method using high-temperature, short-time annealing to obtain uniform, smooth, and large-grain domains of perovskite films over large areas. With high-temperature, short-time annealing at 400 °C for 4 s, which results in fast solvent evaporation, perovskite films with an average domain size of 1 μm were obtained. Solar cells fabricated using this processing technique had a maximum power conversion efficiency exceeding 20% over a 0.1 cm² active area and 18% over a 1 cm² active area. We believe our approach will enable the realization of highly efficient large-area PSCs for practical development with a very simple and short-time procedure. This simple method should lead the field toward the fabrication of uniform large-scale perovskite films, which are necessary for the production of high-efficiency solar cells that may also be applicable to several other material systems for more widespread practical deployment.

  17. Process improvement of knives production in a small scale industry

    Science.gov (United States)

    Ananto, Gamawan; Muktasim, Irfan

    2017-06-01

    A small-scale industry that produces several kinds of knives needs to increase its capacity due to demand from the market. Qualitatively, this case study consisted of formulating the problems, collecting and analyzing the necessary data, and determining possible recommendations for improvement. While the current capacity is only 9 (nine) units, 20 units of knives are expected to be produced per month. The process sequence is: profiling (a), truing (b), beveling (c), heat treatment (d), polishing (e), assembly (f), sharpening (g) and finishing (h). The first process (a) is held by an out-house vendor company, while the other steps, from (b) to (g), are executed by an in-house vendor. However, there is a high dependency on the highly skilled operator who executes the in-house processes, which are mostly performed manually with several unbalanced successive tasks, where one or two tasks require a longer duration than the others since the operation relies solely on the operator's skill. The idea is the improvement or change of the profiling and beveling processes. Due to the poor surface quality and suboptimal hardness resulting from the laser cutting machine used for profiling, it is considered to substitute this process with wire cutting, which is capable of obtaining good surface quality within certain ranges of roughness. Through simple cutting experiments on samples, it is expected that the generated surface quality is adequate to omit the truing process (b). In addition, cutting experiments on one, two, and four test samples showed that the shortest time was obtained by cutting four pieces in one cut. The technical parameters were set according to the machine's recommended standards with respect to sample conditions such as thickness and path length, which affect the rate of wear. Meanwhile, in order to guarantee the uniformity of the knife angles formed through the beveling process (c), a grinding fixture was created. This kind of tool diminishes the

  18. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.
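
    The coordination roles described above suggest a simple relational layout: studies own tasks, tasks own products, with status fields for roll-up reports. A minimal sqlite3 sketch with invented table and column names (not NASA's actual schema) follows:

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE study (id INTEGER PRIMARY KEY, name TEXT, ground_rules TEXT);
        CREATE TABLE task  (id INTEGER PRIMARY KEY, study_id INTEGER REFERENCES study(id),
                            leader TEXT, status TEXT);
        CREATE TABLE product (id INTEGER PRIMARY KEY, task_id INTEGER REFERENCES task(id),
                              path TEXT, status TEXT);
        """)
        db.execute("INSERT INTO study VALUES (1, 'DAC-1', 'assume 2 launches')")
        db.execute("INSERT INTO task VALUES (1, 1, 'loads lead', 'open')")
        # Status roll-up per study, as a report view might query it.
        for row in db.execute("""SELECT s.name, COUNT(t.id) FROM study s
                                 LEFT JOIN task t ON t.study_id = s.id GROUP BY s.id"""):
            print(row)   # -> ('DAC-1', 1)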

  19. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.
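
    One of the multi-domain trust issues mentioned is whether a relying party can build a certification path from its trust anchor to a CA in another domain. A toy illustration with an invented cross-certification topology (names are hypothetical):

        from collections import deque

        cross_certs = {                      # issuer CA -> CAs it has certified
            "regional-root": ["hospitalA-ca", "hospitalB-ca"],
            "hospitalA-ca":  ["clinicA1-ca"],
            "hospitalB-ca":  [],
        }

        def trust_path(anchor, target):
            """BFS over cross-certifications; returns a path of CAs or None."""
            queue, seen = deque([[anchor]]), {anchor}
            while queue:
                path = queue.popleft()
                if path[-1] == target:
                    return path
                for nxt in cross_certs.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        print(trust_path("regional-root", "clinicA1-ca"))
        # -> ['regional-root', 'hospitalA-ca', 'clinicA1-ca']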

  20. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    Science.gov (United States)

    He, Qing; Li, Hong

    The belt conveyor is one of the most important devices for transporting bulk solid material over long distances. Dynamic analysis is key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. It is very important to study dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable and stable conveyor operation. Dynamic research on and applications of large-scale belt conveyors are discussed, and the main research topics and the state of the art of belt conveyor dynamics are analyzed. Future work should focus on dynamic analysis, modeling and simulation of the main components and the whole system, and on nonlinear modeling, simulation and vibration analysis of large-scale conveyor systems.
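
    A common starting point for such dynamic models is to discretize the belt into lumped masses joined by spring-damper elements and integrate a start-up transient explicitly. The sketch below uses illustrative parameter values only; it is not drawn from the paper:

        import numpy as np

        # segments, mass per segment (kg), stiffness (N/m), damping (N*s/m), timestep (s)
        n, m, k, c, dt = 20, 50.0, 2e5, 400.0, 1e-4
        x = np.zeros(n)   # segment displacements
        v = np.zeros(n)   # segment velocities

        for step in range(20000):                      # 2 s of simulated start-up
            drive = min(1.0, step * dt / 1.0) * 0.5    # drive end ramps to 0.5 m over 1 s
            xl = np.concatenate(([drive], x[:-1]))     # left neighbours (drive at head)
            xr = np.concatenate((x[1:], [x[-1]]))      # right neighbours (free tail)
            vl = np.concatenate(([0.0], v[:-1]))       # drive treated as displacement source
            vr = np.concatenate((v[1:], [v[-1]]))
            a = (k * (xl - 2 * x + xr) + c * (vl - 2 * v + vr)) / m
            x += v * dt
            v += a * dt
        print(f"tail displacement after 2 s: {x[-1]:.3f} m")  # approaches the 0.5 m drive input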