WorldWideScience

Sample records for large-scale production system

  1. Toyota production system beyond large-scale production

    CERN Document Server

    Ohno, Taiichi

    1998-01-01

    In this classic text, Taiichi Ohno--inventor of the Toyota Production System and Lean manufacturing--shares the genius that sets him apart as one of the most disciplined and creative thinkers of our time. Combining his candid insights with a rigorous analysis of Toyota's attempts at Lean production, Ohno's book explains how Lean principles can improve any production endeavor. A historical and philosophical description of just-in-time and Lean manufacturing, this work is a must read for all students of human progress. On a more practical level, it continues to provide inspiration and instruction for those seeking to improve efficiency through the elimination of waste.

  2. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  3. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  4. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

    Full Text Available Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research has demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm² flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation.

  5. Large-scale production of Fischer-Tropsch diesel from biomass. Optimal gasification and gas cleaning systems

    International Nuclear Information System (INIS)

    Boerrigter, H.; Van der Drift, A.

    2004-12-01

    The paper is presented in the form of copies of overhead sheets. The contents concern definitions, an overview of Integrated biomass gasification and Fischer Tropsch (FT) systems (state-of-the-art, gas cleaning and biosyngas production, experimental demonstration and conclusions), some aspects of large-scale systems (motivation, biomass import) and an outlook

  6. New systems for the large-scale production of male tsetse flies (Diptera: Glossinidae)

    International Nuclear Information System (INIS)

    Opiyo, E.; Luger, D.; Robinson, A.S.

    2000-01-01

    morsitans morsitans Westwood produced a total of 500,000 sterile males. In Burkina Faso, between 1976 and 1984, a colony of 330,000 G. palpalis gambiensis Vanderplank and G. tachinoides Westwood provided 950,000 sterile males for release into an area of 3,000 km² (Clair et al. 1990), while during the BICOT project in Nigeria, 1.5 million sterile male G. p. palpalis Robineau-Desvoidy were released in an area of 1,500 km² (Oladunmade et al. 1990). Recently, 8.5 million sterile males, produced by a colony of about 600,000 G. austeni Newstead, were released on Unguja Island, Zanzibar, the United Republic of Tanzania, in an area of 1,600 km² (Saleh et al. 1997, Kitwika et al. 1997). This led to the eradication of the tsetse population and a massive reduction in disease incidence in cattle (Saleh et al. 1997). Tsetse fly SIT has been applied on a limited scale because of the inability to provide large numbers of sterile males for release. The present rearing system is labour intensive, and too many quality-sensitive steps in the mass production system are not sufficiently standardised to transfer the system directly to large-scale production. Tsetse rearing evolved from feeding on live hosts to an in vitro rearing system where blood is fed to flies through a silicone membrane (Feldmann 1994a). At present, cages are small, hold a small number of flies, and have to be manually transferred for feeding and then returned for pupal collection. This limits the number of flies that can be handled at any one time. In order to improve these processes, a Tsetse Production Unit (TPU) was developed and evaluated. During conventional tsetse rearing, flies need to be sorted by sex and counted, whether for stocking production cages or for the release of males only. This has to be done by hand on an individual fly basis following the immobilisation of adults by chilling. A procedure is reported in this paper for the self-stocking of production cages (SSPC) which enables flies to

  7. A 3D Sphere Culture System Containing Functional Polymers for Large-Scale Human Pluripotent Stem Cell Production

    Directory of Open Access Journals (Sweden)

    Tomomi G. Otsuji

    2014-05-01

    Full Text Available Utilizing human pluripotent stem cells (hPSCs in cell-based therapy and drug discovery requires large-scale cell production. However, scaling up conventional adherent cultures presents challenges of maintaining a uniform high quality at low cost. In this regard, suspension cultures are a viable alternative, because they are scalable and do not require adhesion surfaces. 3D culture systems such as bioreactors can be exploited for large-scale production. However, the limitations of current suspension culture methods include spontaneous fusion between cell aggregates and suboptimal passaging methods by dissociation and reaggregation. 3D culture systems that dynamically stir carrier beads or cell aggregates should be refined to reduce shearing forces that damage hPSCs. Here, we report a simple 3D sphere culture system that incorporates mechanical passaging and functional polymers. This setup resolves major problems associated with suspension culture methods and dynamic stirring systems and may be optimal for applications involving large-scale hPSC production.

  8. A Recommender System for an IPTV Service Provider: a Real Large-Scale Production Environment

    Science.gov (United States)

    Bambini, Riccardo; Cremonesi, Paolo; Turrin, Roberto

    In this chapter we describe the integration of a recommender system into the production environment of Fastweb, one of the largest European IP Television (IPTV) providers. The recommender system implements both collaborative and content-based techniques, suitably tailored to the specific requirements of an IPTV architecture, such as the limited screen definition, the reduced navigation capabilities, and the strict time constraints. The algorithms are extensively analyzed by means of off-line and on-line tests, showing the effectiveness of the recommender system: up to 30% of the recommendations are followed by a purchase, with an estimated lift factor (increase in sales) of 15%.
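
    A sketch may help make the collaborative part concrete. The snippet below scores catalogue items for a user with item-item cosine similarity over an implicit-feedback (watched/purchased) matrix; it is a minimal illustration of collaborative filtering in general, not Fastweb's actual algorithm, and every name in it is hypothetical.

      import numpy as np

      def item_similarity(R):
          # R: users x items matrix of implicit feedback (1 = watched/purchased).
          norms = np.linalg.norm(R, axis=0) + 1e-12
          return (R.T @ R) / np.outer(norms, norms)  # item-item cosine similarity

      def recommend(R, S, user, k=3):
          scores = S @ R[user]               # similarity-weighted sum over consumed items
          scores[R[user] > 0] = -np.inf      # never re-recommend already-seen items
          return np.argsort(scores)[::-1][:k]

      R = np.array([[1, 1, 0, 0],            # toy matrix: 3 users x 4 items
                    [0, 1, 1, 0],
                    [1, 0, 1, 1]], dtype=float)
      print(recommend(R, item_similarity(R), user=0, k=2))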

  9. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61 × 0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61 × 0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably because the fire module of the code is a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  10. Efficient large-scale protein production of larvae and pupae of silkworm by Bombyx mori nuclear polyhedrosis virus bacmid system

    International Nuclear Information System (INIS)

    Motohashi, Tomoko; Shimojima, Tsukasa; Fukagawa, Tatsuo; Maenaka, Katsumi; Park, Enoch Y.

    2005-01-01

    Silkworm is one of the most attractive hosts for large-scale production of eukaryotic proteins as well as of recombinant baculoviruses for gene transfer to mammalian cells. The bacmid system of Autographa californica nuclear polyhedrosis virus (AcNPV) has already been established and is widely used. However, AcNPV is unable to infect silkworm. We developed the first practical Bombyx mori nuclear polyhedrosis virus bacmid system directly applicable to protein expression in silkworm. By using this system, the green fluorescent protein was successfully expressed in silkworm larvae and pupae, not only by infection with its recombinant virus but also by direct injection of its bacmid DNA. This method enables rapid protein production in silkworm within 10 days, is free from biohazard concerns, and will thus be a powerful tool for the future production of recombinant eukaryotic proteins and baculoviruses.

  11. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  12. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues...

  13. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources, and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  14. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  15. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  16. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  17. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challenges...

  18. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies...

  19. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data...

  20. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
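
    The report's key design idea, vehicles as autonomous processes that independently re-plan routes as link times change, can be sketched in a few lines. The toy example below (hypothetical names, not the ANL prototype's code) re-runs a shortest-path query against a shared link-time table after a TMC advisory updates one link.

      import heapq

      def shortest_path(links, src, dst):
          # links: {node: [(neighbor, travel_time), ...]}, a shared link-time table.
          dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
          while heap:
              d, u = heapq.heappop(heap)
              if u == dst:
                  break
              if d > dist.get(u, float("inf")):
                  continue                    # stale heap entry
              for v, t in links.get(u, []):
                  if d + t < dist.get(v, float("inf")):
                      dist[v], prev[v] = d + t, u
                      heapq.heappush(heap, (d + t, v))
          path, node = [], dst
          while node != src:                  # walk predecessors back to the source
              path.append(node)
              node = prev[node]
          return [src] + path[::-1]

      links = {"A": [("B", 5), ("C", 2)], "C": [("B", 1)], "B": []}
      print(shortest_path(links, "A", "B"))   # ['A', 'C', 'B']
      links["C"] = [("B", 9)]                 # advisory: congestion on link C->B
      print(shortest_path(links, "A", "B"))   # ['A', 'B']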

  21. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  22. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 W) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 W capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  23. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog-to-digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved.

  24. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
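
    Written out in clean notation, the paradigm Hamiltonian reconstructed above and the closely related Chirikov standard map are:

      H(v, x, t) = \frac{v^2}{2} - M \cos x - P \cos k(x - t)

      I_{n+1} = I_n + K \sin \theta_n, \qquad \theta_{n+1} = \theta_n + I_{n+1}

    Here K is the stochasticity parameter of the standard map; Greene's residue criterion places the breakup of the last KAM torus, and hence the onset of large-scale stochasticity, near K ≈ 0.9716 (a textbook value quoted for orientation, not a figure from this record).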

  25. Nuclear criticality safety practices in digestion systems of the large scale production facility of the Department of Energy at Fernald

    International Nuclear Information System (INIS)

    Dolan, L.C.

    1982-01-01

    Nuclear criticality safety practices used at the Feed Materials Production Center at Fernald, Ohio in conjunction with its metal dissolving and nonmetal, e.g., ash and ore concentrates, digesting operations are reviewed. Operating procedures with several different types of dissolver or digestor systems, i.e., metal dissolver, continuous, drum and safe geometry, are discussed. Calculations performed to verify the criticality safety of the operations are described

  26. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  27. A novel plasmid addiction system for large-scale production of cyanophycin in Escherichia coli using mineral salts medium.

    Science.gov (United States)

    Kroll, Jens; Klinter, Stefan; Steinbüchel, Alexander

    2011-02-01

    Hitherto the production of the biopolymer cyanophycin (CGP) using recombinant Escherichia coli strains and cheap mineral salts medium yielded only trace amounts of CGP. The plasmid addiction system (PAS) presented here rests on two features: (1) disruption of dapE, which inactivates the native succinylase pathway in E. coli, and (2) complementation by the plasmid-encoded artificial aminotransferase pathway mediated by the dapL gene from Synechocystis sp. PCC 6308, which allows the synthesis of the essential lysine precursor L,L-2,6-diaminopimelate. In addition, this plasmid also harbors cphAC595S, an engineered cyanophycin synthetase gene responsible for CGP production. Cultivation experiments in Erlenmeyer flasks and also in bioreactors in mineral salts medium without antibiotics revealed an at least 4.5-fold enhanced production of CGP in comparison to control cultivations without the PAS. Fermentation experiments with culture volumes of up to 400 l yielded a maximum of 18% CGP (w/w) and a final cell density of 15.2 g CDM/l. Lactose was used constantly as an effective inducer and carbon source. Thus, we present a convenient option to produce CGP with E. coli at a technical scale without the need to add antibiotics or amino acids, using the mineral salts medium designed in this study.

  28. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the administrator's workload is heavy, and much time must be spent on management and maintenance. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes placed in big rooms, managers can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? This article introduces ELFms in the large-scale cluster system and, furthermore, proposes how to realize automatic management of the large-scale cluster system. (authors)

  29. Sustainable production of toxin free marine microalgae biomass as fish feed in large scale open system in the Qatari desert.

    Science.gov (United States)

    Das, Probir; Thaher, Mahmoud Ibrahim; Hakim, Mohammed Abdul Quadir Mohd Abdul; Al-Jabri, Hareb Mohammed S J

    2015-09-01

    Mass cultivation of microalgae biomass for feed should be cost-effective and toxin-free. Evaporation loss in Qatar can be as high as 2 cm/d; hence, production of marine microalgae biomass in Qatar also requires mitigating water loss, as there is only a very limited groundwater reserve. To address these issues, a combination of four growth conditions was applied to a 25,000 L raceway pond: a locally isolated microalgae strain was selected which could grow at elevated salinity; the strain did not require silica or vitamins; the volume of the culture was increased over time, keeping a denser inoculum in the beginning; and evaporation water loss was balanced by adding seawater only. A local saline-tolerant Nannochloropsis sp. was selected which did not require silica or vitamins. When the above conditions were combined in the pond, average areal biomass productivity reached 20.37 g/m²/d, and the culture was not contaminated by any toxic microalgae. Copyright © 2015 Elsevier Ltd. All rights reserved.

  30. Performance regression manager for large scale systems

    Science.gov (United States)

    Faraj, Daniel A.

    2017-08-01

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
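
    As a rough illustration of the mechanism being claimed (normalize each run's output into a predefined format, then compare metric values against a baseline file), consider the sketch below; the file names, metric names, and 5% threshold are all hypothetical.

      import json

      def write_metrics(path, metrics):
          # Persist the metrics of one execution instance in a predefined format (JSON here).
          with open(path, "w") as f:
              json.dump(metrics, f, indent=2)

      def compare(current_path, baseline_path, tolerance=0.05):
          # Flag any higher-is-better metric that fell more than `tolerance` below baseline.
          with open(current_path) as f:
              current = json.load(f)
          with open(baseline_path) as f:
              baseline = json.load(f)
          report = {}
          for name, value in current.items():
              base = baseline.get(name)
              if base:                        # skip metrics absent from the baseline
                  report[name] = {"current": value, "baseline": base,
                                  "regressed": value < base * (1 - tolerance)}
          return report

      write_metrics("run1.json", {"bandwidth_GBps": 42.0})
      write_metrics("run2.json", {"bandwidth_GBps": 38.5})
      print(compare("run2.json", "run1.json"))   # flags the ~8% bandwidth regression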

  31. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in a large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in the daily production. In the kitchen...... not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results...... show that Lean can be applied and used to manage the production of meals in the kitchen....

  32. Large-scale modelling of neuronal systems

    International Nuclear Information System (INIS)

    Castellani, G.; Verondini, E.; Giampieri, E.; Bersani, F.; Remondini, D.; Milanesi, L.; Zironi, I.

    2009-01-01

    The brain is, without any doubt, the most complex system of the human body. Its complexity is also due to the extremely high number of neurons, as well as the huge number of synapses connecting them. Each neuron is capable of performing complex tasks, like learning and memorizing a large class of patterns. The simulation of large neuronal systems is challenging for both technological and computational reasons, and can open new perspectives for the comprehension of brain functioning. A well-known and widely accepted model of bidirectional synaptic plasticity, the BCM model, is stated by a differential equation approach based on bistability and selectivity properties. We have modified the BCM model, extending it from a single-neuron to a whole-network model. This new model is capable of generating interesting network topologies starting from a small number of local parameters describing the interaction between incoming and outgoing links of each neuron. We have characterized this model in terms of complex network theory, showing how this learning rule can support network generation.
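
    For reference, the single-neuron BCM rule that the record extends is usually written as follows (textbook form, quoted for orientation):

      \frac{dm_i}{dt} = \phi(y, \theta_M)\, x_i, \qquad
      y = \sum_j m_j x_j, \qquad
      \phi(y, \theta_M) = y\,(y - \theta_M), \qquad
      \theta_M = \mathbb{E}\!\left[y^2\right]

    The sliding threshold \theta_M produces the bistability and selectivity properties mentioned in the abstract; the whole-network extension described above adds local parameters for the interaction between incoming and outgoing links of each neuron.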

  33. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  34. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.

  35. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie-line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...
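
    The balancing task can be summarized in the standard swing-equation view (a textbook formulation, not this paper's exact model): any active power imbalance

      \Delta P(t) = P_{\mathrm{conv}}(t) + P_{\mathrm{wind}}(t) - P_{\mathrm{load}}(t)

    drives the system frequency away from its nominal value f_0 according to

      \frac{d\,\Delta f}{dt} = \frac{f_0}{2H}\, \Delta P_{\mathrm{pu}}(t)

    where H is the aggregated inertia constant and \Delta P_{\mathrm{pu}} is the imbalance in per-unit. The hour-ahead dispatch plan keeps \Delta P small in expectation, and the real-time balancing control corrects the remainder.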

  36. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Roč. 44, č. 3 (2015), s. 247-253 ISSN 1392-124X Institutional support: RVO:67985556 Keywords : Combinatorial linear matrix inequalities * large-scale system * decentralized control Subject RIV: BC - Control Systems Theory Impact factor: 0.633, year: 2015

  37. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  38. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong...

  39. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  40. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  41. Optimization of large-scale industrial systems: an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E³-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.

  42. A Classification Framework for Large-Scale Face Recognition Systems

    OpenAIRE

    Zhou, Ziheng; Deravi, Farzin

    2009-01-01

    This paper presents a generic classification framework for large-scale face recognition systems. Within the framework, a data sampling strategy is proposed to tackle the data imbalance when image pairs are sampled from thousands of face images for preparing a training dataset. A modified kernel Fisher discriminant classifier is proposed to make it computationally feasible to train the kernel-based classification method using tens of thousands of training samples. The framework is tested in an...

  43. Large-scale method for the production and purification of curium

    Science.gov (United States)

    Higgins, G.H.; Crane, W.W.T.

    1959-05-19

    A large-scale process for production and purification of Cm-242 is described. Aluminum slugs containing Am are irradiated and declad in a NaOH-NaNO₃ solution at 85 to 100°C. The resulting slurry is filtered and washed with NaOH, NH₄OH, and H₂O. Recovery of Cm from the filtrate and washings is effected by an Fe(OH)₃ precipitation. The precipitates are then combined and dissolved in HCl, and refractory oxides are centrifuged out. These oxides are then fused with Na₂CO₃ and dissolved in HCl. The solution is evaporated and LiCl solution added. The Cm, rare earths, and anionic impurities are adsorbed on a strong-base anion exchange resin. Impurities are eluted with LiCl-HCl solution; rare earths and Cm are eluted by HCl. Other ion exchange steps further purify the Cm. The Cm is then precipitated as fluoride and used in this form or further purified and processed. (T.R.H.)

  44. Design techniques for large scale linear measurement systems

    International Nuclear Information System (INIS)

    Candy, J.V.

    1979-03-01

    Techniques are described for designing measurement schemes for systems modeled as large-scale linear time-invariant systems, i.e., physical systems modeled by a large number (>5) of ordinary differential equations. The techniques are based on transforming the physical system model to a coordinate system facilitating the design and then transforming back to the original coordinates. An example of a three-stage, four-species extraction column used in the reprocessing of spent nuclear fuel elements is presented. The basic ideas are briefly discussed in the case of noisy measurements. An example using a plutonium nitrate storage vessel (reprocessing) with measurement uncertainty is also presented.

  45. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to...
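
    In formula terms, both strategies target the same Bayesian posterior over model parameters m given data d (a generic statement of the inverse problem; the notation is assumed here rather than taken from the report):

      \pi(m \mid d) \;\propto\; \exp\!\Big(-\tfrac{1}{2}\, \big\| F(m) - d \big\|_{\Sigma^{-1}}^{2}\Big)\, \pi_{\mathrm{pr}}(m)

    "Reduce then sample" replaces the expensive forward map F with a cheap reduced-order surrogate before sampling; "sample then reduce" keeps F but uses adjoint-based gradient and Hessian information of the map m -> F(m) to build better proposals, so far fewer posterior samples are needed.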

  46. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adapted to the visual domain an auditory paradigm developed by Sussman, Ritter, and Vaughan (1998; Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170) and Sussman and Gumenyuk (2005; Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523), presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. Copyright © 2010 Elsevier B.V. All rights reserved.

  47. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and...

  48. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
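
    A minimal greedy variant of grouping "closeby" requests looks as follows; it is a toy baseline for illustration (hypothetical names and thresholds), not the scalable algorithms contributed by the paper.

      def group_requests(requests, radius=1.0, capacity=4):
          # requests: list of (x, y) pickup points. Greedily assign each request to the
          # first open group whose centroid lies within `radius`, else open a new group.
          groups = []  # each group: {"members": [...], "cx": float, "cy": float}
          for x, y in requests:
              for g in groups:
                  near = (g["cx"] - x) ** 2 + (g["cy"] - y) ** 2 <= radius ** 2
                  if near and len(g["members"]) < capacity:
                      g["members"].append((x, y))
                      n = len(g["members"])
                      g["cx"] += (x - g["cx"]) / n   # incremental centroid update
                      g["cy"] += (y - g["cy"]) / n
                      break
              else:
                  groups.append({"members": [(x, y)], "cx": x, "cy": y})
          return [g["members"] for g in groups]

      trips = [(0.0, 0.0), (0.3, 0.2), (5.0, 5.0), (0.1, -0.2)]
      print(group_requests(trips))   # two groups: three close-by requests share one cab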

  49. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  50. Glass badge dosimetry system for large scale personal monitoring

    International Nuclear Information System (INIS)

    Norimichi Juto

    2002-01-01

    Glass Badge, using a silver-activated phosphate glass dosemeter, was specially developed for large-scale personal monitoring, and dosimetry systems such as an automatic reader and a dose equivalent calculation algorithm were developed at the same time to achieve reasonable personal monitoring. In large-scale personal monitoring, both precision of dosimetry and confidence in handling large amounts of personal data become very important. The silver-activated phosphate glass dosemeter has excellent basic characteristics for dosimetry, such as homogeneous and stable sensitivity, negligible fading, and so on. Glass Badge was designed to measure the 10 keV - 10 MeV range of photons, the 300 keV - 3 MeV range of beta radiation, and the 0.025 eV - 15 MeV range of neutrons by an included SSNTD. The developed Glass Badge dosimetry system has not only these basic characteristics but also many features to keep good precision in dosimetry and data handling. In this presentation, features of Glass Badge dosimetry systems and examples of practical personal monitoring systems will be presented. (Author)

  51. A convex optimization approach for solving large scale linear systems

    Directory of Open Access Journals (Sweden)

    Debora Cores

    2017-01-01

    Full Text Available The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for solving large-scale linear systems of equations when the coefficient matrix is symmetric and positive definite. In this work we present and analyze a non-quadratic convex function for solving any large-scale linear system of equations, regardless of the characteristics of the coefficient matrix. For finding the global minimizers of this new convex function, any low-cost iterative optimization technique can be applied. In particular, we propose to use the low-cost, globally convergent Spectral Projected Gradient (SPG) method, which allows us to extend this optimization approach to solving consistent square and rectangular linear systems, as well as linear feasibility problems, with and without convex constraints and with and without preconditioning strategies. Our numerical results indicate that the new scheme outperforms state-of-the-art iterative techniques for solving linear systems when the symmetric part of the coefficient matrix is indefinite, and also for solving linear feasibility problems.
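
    To illustrate the idea (replace the quadratic CG objective with a smooth convex merit of the residual, so that symmetry and definiteness of the coefficient matrix no longer matter), the sketch below minimizes a pseudo-Huber merit with a plain gradient iteration and a safe step length; full SPG adds spectral step sizes and projections for constraints. The merit function is an assumed illustrative choice, not necessarily the one proposed in the paper.

      import numpy as np

      def solve_convex(A, b, iters=2000):
          # Minimize f(x) = sum(sqrt(1 + r_i^2) - 1) with r = A x - b: a smooth convex
          # merit that requires no symmetry or definiteness of A.
          L = np.linalg.norm(A, 2) ** 2           # Lipschitz bound for the gradient of f
          x = np.zeros(A.shape[1])
          for _ in range(iters):
              r = A @ x - b
              x -= (1.0 / L) * (A.T @ (r / np.sqrt(1.0 + r * r)))   # gradient step
          return x

      A = np.array([[3.0, 1.0], [1.0, -2.0], [2.0, 5.0]])   # rectangular, nonsymmetric
      b = A @ np.array([1.0, -1.0])                         # consistent by construction
      print(np.round(solve_convex(A, b), 4))                # ~ [ 1. -1.]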

  52. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

    A large scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program carried out. This project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study its performance in separating hydrogen isotopes at high throughput. Various temperature programming schemes were tested, heart-cutting operation was evaluated, and very large (up to 138 NL/injection) samples were separated in the system. The results of the experiments showed that the specially designed column performed well as a chromatographic column, and good separation could be achieved even when a 138 NL sample was injected.

  53. Engineering large-scale agent-based systems with consensus

    Science.gov (United States)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  54. Scenario analysis of large scale algae production in tubular photobioreactors

    NARCIS (Netherlands)

    Slegers, P.M.; Beveren, van P.J.M.; Wijffels, R.H.; Straten, van G.; Boxtel, van A.J.B.

    2013-01-01

    Microalgae productivity in tubular photobioreactors depends on algae species, location, tube diameter, biomass concentration, distance between tubes and, for vertically stacked systems, the number of horizontal tubes per stack. A simulation model for horizontal and vertically stacked horizontal...

  55. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical at the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
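
    The record does not describe PHM's detection algorithms; purely as an illustrative sketch of the kind of check such a framework might run over a monitored metric, a rolling z-score detector could look as follows (the window size, threshold, and synthetic latency metric are all assumptions):

        import numpy as np

        def flag_anomalies(metric, window=60, threshold=3.0):
            """Flag samples deviating more than `threshold` sigma from a rolling baseline."""
            metric = np.asarray(metric, dtype=float)
            flags = np.zeros(metric.size, dtype=bool)
            for i in range(window, metric.size):
                base = metric[i - window:i]
                sigma = base.std() or 1e-12          # guard against flat signals
                flags[i] = abs(metric[i] - base.mean()) > threshold * sigma
            return flags

        # Synthetic per-second I/O latency samples with one injected fault.
        rng = np.random.default_rng(1)
        latency = rng.normal(5.0, 0.3, 600)
        latency[400] = 12.0
        print(np.nonzero(flag_anomalies(latency))[0])   # -> [400]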

  16. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon Phi processors, represents one of the largest HPC systems for open science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node-level and system-scale optimizations. We highlight multiple large-scale (thousands of atoms) case studies and discuss both absolute application performance and comparisons to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  17. Large-Scale Traveling Weather Systems in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-10-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation, as they serve as agents in the transport of heat and momentum, and of generalized scalar/tracer quantities (e.g., atmospheric dust, water vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer). Compared with their northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone develops in late winter and early spring in the western hemisphere via orographic influences from the Tharsis highlands and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences among the simulations, and these are presented.

  18. Large-Scale Traveling Weather Systems in Mars Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-01-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation, as they serve as agents in the transport of heat and momentum, and of generalized scalar/tracer quantities (e.g., atmospheric dust, water vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer). Compared with their northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone develops in late winter and early spring in the western hemisphere via orographic influences from the Tharsis highlands and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences among the simulations, and these are presented.

  19. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructures, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run in highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by a high degree of volatility of their components and by the need to provide efficient service management and to handle large amounts of data efficiently. This paper describes some of the areas for which adaptation emerges as a key feature, namely the management of computational Grids, the self-management of desktop Grid platforms, and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  20. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In both numerical and real experiments, a real-time visualization and steering system is considered a promising method of data analysis. This approach is suited to analyses performed only once, or to low-cost experiments and simulations. In research on an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is rarely achieved in a single pass. Consequently, output data should be filed so that they can be referenced and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  1. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

    Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasmas by simulation, and to exchange information and carry out joint research with scientists around the world via the internet. The characteristics of SIMON are the following: 1) reduction of the simulation load by a trigger-sending method; 2) visualization of simulation results and a hierarchical structure of analysis; 3) reduction of the number of licenses required, by invoking software from the command line; 4) improved support for networked use of simulation data output through HTML (Hyper Text Markup Language); 5) avoidance of complex built-in work in the client part; and 6) small-sized and portable software. The visualization method for large-scale simulation, the remote collaboration system based on HTML, the trigger-sending method, the hierarchical analysis method, the introduction into a three-dimensional electromagnetic transport code, and the technologies of the SIMON system are explained. (S.Y.)

  2. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment…

  3. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    Full Text Available The concept of “large scale” depends obviously on the phenomenon we are interested in. For example, in the field of the foundations of thermodynamics from microscopic dynamics, the large spatial and time scales are of the order of fractions of a millimetre and microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
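
    For orientation, the standard Hamiltonian-case projection identity that this work generalizes can be written as follows, with P the projection onto the variables of interest, Q = 1 - P, L the Liouville-type evolution operator and rho the phase-space density (a schematic textbook form, not the paper's non-Hamiltonian generalization):

        \frac{\partial}{\partial t} P\rho(t) = P L P \rho(t)
          + \int_{0}^{t} P L\, e^{Q L s}\, Q L\, P\rho(t-s)\, \mathrm{d}s
          + P L\, e^{Q L t}\, Q \rho(0)

    The memory integral and the term containing Q rho(0) are the objects whose operator series must be controlled; in the non-Hamiltonian case this is where the Lie algebra of dissipative differential operators enters.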

  4. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D on distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). Within this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We are now improving PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, developed within this R&D, is an MPI library executable on a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)
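
    STAMPI itself cannot be reproduced here, but the parallel visualization pattern described above can be sketched with standard MPI via mpi4py; the per-rank file names and the "rendering" step below are placeholders, not PST's actual interface:

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # Each rank visualizes (here: merely reduces) its own slice of a large
        # dataset in parallel; rank 0 then assembles the partial results.
        local = np.load(f"field_{rank:03d}.npy")   # hypothetical per-rank data slice
        partial = local.mean(axis=0)               # stand-in for a real rendering step
        gathered = comm.gather(partial, root=0)
        if rank == 0:
            print("assembled", len(gathered), "partial results")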

  5. Magnetic Properties of Large-Scale Nanostructured Graphene Systems

    DEFF Research Database (Denmark)

    Gregersen, Søren Schou

    The ongoing progress in two-dimensional (2D) materials and nanostructure fabrication motivates the study of altered and combined materials. Graphene, the most studied material of the 2D family, displays unique electronic and spintronic properties. Exceptionally high electron mobilities, surpassing those in conventional materials such as silicon, make graphene a very interesting material for high-speed electronics. Simultaneously, long spin-diffusion lengths and spin lifetimes make graphene an eligible spin-transport channel. In this thesis, we explore fundamental features of nanostructured graphene systems using large-scale modeling techniques. Graphene perforations, or antidots, have received substantial interest in the prospect of opening large band gaps in the otherwise gapless graphene. Motivated by recent improvements of fabrication processes, such as forming graphene antidots and layer…

  6. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large in scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Faced with similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector …

  7. An integrated system for large scale scanning of nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Bozza, Cristiano, E-mail: kryss@sa.infn.it [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); D’Ambrosio, Nicola [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); De Lellis, Giovanni [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); De Serio, Marilisa [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Di Capua, Francesco [INFN Napoli, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Crescenzo, Antonia [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Ferdinando, Donato [INFN Bologna, viale B. Pichat 6/2, Bologna 40127 (Italy); Di Marco, Natalia [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); Esposito, Luigi Salvatore [Laboratori Nazionali del Gran Sasso, now at CERN, Geneva (Switzerland); Fini, Rosa Anna [INFN Bari, via E. Orabona 4, Bari 70125 (Italy); Giacomelli, Giorgio [University of Bologna and INFN, viale B. Pichat 6/2, Bologna 40127 (Italy); Grella, Giuseppe [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); Ieva, Michela [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Kose, Umut [INFN Padova, via Marzolo 8, Padova (PD) 35131 (Italy); Longhin, Andrea; Mauri, Nicoletta [INFN Laboratori Nazionali di Frascati, via E. Fermi 40, Frascati (RM) 00044 (Italy); Medinaceli, Eduardo [University of Padova and INFN, via Marzolo 8, Padova (PD) 35131 (Italy); Monacelli, Piero [University of L' Aquila and INFN, via Vetoio Loc. Coppito, L' Aquila (AQ) 67100 (Italy); Muciaccia, Maria Teresa; Pastore, Alessandra [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); and others

    2013-03-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high-level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational database system is the backbone of the whole infrastructure, storing the data themselves and not only catalogues of data files, as is common practice; this is a unique case in high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available through automated processing.

  8. An integrated system for large scale scanning of nuclear emulsions

    International Nuclear Information System (INIS)

    Bozza, Cristiano; D’Ambrosio, Nicola; De Lellis, Giovanni; De Serio, Marilisa; Di Capua, Francesco; Di Crescenzo, Antonia; Di Ferdinando, Donato; Di Marco, Natalia; Esposito, Luigi Salvatore; Fini, Rosa Anna; Giacomelli, Giorgio; Grella, Giuseppe; Ieva, Michela; Kose, Umut; Longhin, Andrea; Mauri, Nicoletta; Medinaceli, Eduardo; Monacelli, Piero; Muciaccia, Maria Teresa; Pastore, Alessandra

    2013-01-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high-level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational database system is the backbone of the whole infrastructure, storing the data themselves and not only catalogues of data files, as is common practice; this is a unique case in high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available through automated processing.

  9. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system, composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state-space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.
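
    Staghorn's own snapshot mechanism, which also captures packets in transit, is not shown in the record; as a loose illustration of the virtual-machine half of the idea only, the standard libvirt Python bindings can checkpoint a set of running KVM guests like this (the connection URI and domain names are hypothetical):

        import libvirt   # generic KVM snapshotting; not the Staghorn tooling itself

        SNAPSHOT_XML = """
        <domainsnapshot>
          <name>freeze-t0</name>
          <description>whole-model checkpoint before forking execution</description>
        </domainsnapshot>
        """

        conn = libvirt.open("qemu:///system")        # local hypervisor (assumed)
        for name in ("node-01", "node-02"):          # hypothetical VM names in the model
            dom = conn.lookupByName(name)
            dom.snapshotCreateXML(SNAPSHOT_XML, 0)   # snapshot of disk and RAM state
        conn.close()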

  10. Logistics of large scale commercial IVF embryo production.

    Science.gov (United States)

    Blondin, P

    2016-01-01

    The use of IVF in agriculture is growing worldwide. This can be explained by the development of better IVF media and techniques, development of sexed semen and the recent introduction of bovine genomics on farms. Being able to perform IVF on a large scale, with multiple on-farm experts to perform ovum pick-up and IVF laboratories capable of handling large volumes in a consistent and sustainable way, remains a huge challenge. To be successful, there has to be a partnership between veterinarians on farms, embryologists in the laboratory and animal owners. Farmers must understand the limits of what IVF can or cannot do under different conditions; veterinarians must manage expectations of farmers once strategies have been developed regarding potential donors; and embryologists must maintain fluent communication with both groups to make sure that objectives are met within predetermined budgets. The logistics of such operations can be very overwhelming, but the return can be considerable if done right. The present mini review describes how such operations can become a reality, with an emphasis on the different aspects that must be considered by all parties.

  11. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, possibly spread across multiple locations, working on large and complex software tasks. This means that neither an individual team member nor an entire team holds all the knowledge about the software being developed, and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise…

  12. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    A simple but efficient carbon-supplying device was designed and developed, and the corresponding carbon-supplying technology is described. The absorption characteristics of this device were studied. The carbon-supplying system proved to be economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond, and the gaseous carbon dioxide absorptivity was raised above 78%, which can greatly reduce the production cost.

  13. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using supercomputers, which generally requires long CPU times and produces large amounts of data, has been extensively studied as a third pillar of various advanced science fields, alongside theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of complex phenomena that can hardly be identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces the idea of update processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method, which transmits requests for update processing from the simulation (client) running on a supercomputer to a workstation (server). That is, the simulation running on the supercomputer actively controls the timing of update processing. The server, having received requests from the ongoing simulation for data transfer, data analyses, visualizations, etc., starts the corresponding operations during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project on laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another.
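
    The record does not give SIMON's actual wire protocol; a minimal sketch of the trigger idea, in which the running simulation (client) pushes update-processing requests to the monitoring workstation (server), might look as follows (host, port, and message format are assumptions):

        import json
        import socket

        def send_trigger(host, port, payload):
            """Push one update-processing request from the simulation to the server."""
            with socket.create_connection((host, port), timeout=10) as sock:
                sock.sendall(json.dumps(payload).encode() + b"\n")

        # Inside a (hypothetical) simulation main loop: the client, not the
        # server, decides when visualization and analysis should run.
        for step in range(1000):
            # ... advance the simulation and write this step's output ...
            if step % 100 == 0:
                send_trigger("monitor.example.org", 5000,
                             {"step": step, "action": "visualize",
                              "file": f"out_{step:05d}.dat"})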

  14. Financing a large-scale picture archival and communication system.

    Science.gov (United States)

    Goldszal, Alberto F; Bleshman, Michael H; Bryan, R Nick

    2004-01-01

    An attempt to finance a large-scale, multi-hospital picture archival and communication system (PACS) solely from cost savings on current film operations is reported. A modified Request for Proposal described the technical requirements, PACS architecture, and performance targets. The Request for Proposal was complemented by a set of desired financial goals, the main one being the ability to use film savings to pay for the implementation and operation of the PACS. Financing of the enterprise-wide PACS was completed through an operating lease agreement including all PACS equipment, implementation, service, and support for an 8-year term, much like a complete outsourcing. Equipment refreshes, both hardware and software, are included. Our agreement also linked the management of the digital imaging operation (PACS) and the traditional film printing, shifting the operational risks of continued printing and costs related to implementation delays to the PACS vendor. An additional optimization step eliminated the negative film-budget variances at the beginning of the project, when PACS costs tend to be higher than film and film-related expenses. An enterprise-wide PACS has been adopted to achieve clinical workflow improvements and cost savings. PACS financing was based solely on film savings, which covered the entire digital solution (PACS) and any residual film printing. These goals were achieved while eliminating any over-budget scenarios, providing a non-negative cash flow in each year of the 8-year term.

  15. Pool fires in a large scale ventilation system

    International Nuclear Information System (INIS)

    Smith, P.R.; Leslie, I.H.; Gregory, W.S.; White, B.

    1991-01-01

    A series of pool fire experiments was carried out in the Large Scale Flow Facility of the Mechanical Engineering Department at New Mexico State University. The experiments burned alcohol, hydraulic cutting oil, kerosene, and a mixture of kerosene and tributylphosphate. Gas temperature and wall temperature measurements were made as a function of time throughout the 23.3 m³ burn compartment and the ducts of the ventilation system. The mass of smoke particulate deposited on the ventilation system's 0.61 m × 0.61 m high-efficiency particulate air filter was measured for the hydraulic oil, kerosene, and kerosene-tributylphosphate fires using an in situ null balance. Significant increases in filter resistance were observed for all three fuels over burning periods ranging from 10 to 30 minutes, and were found to be highly dependent on the initial ventilation system flow rate, fuel type, and flow configuration. The experimental results were compared with simulations from the Los Alamos National Laboratory FIRAC computer code. In general, the experimental and computed results were in reasonable agreement, despite the fact that the fire compartment in the experiments was an insulated steel tank with 0.32 cm walls, while the compartment model FIRIN of FIRAC assumes 0.31 m thick concrete walls. This difference in configuration apparently caused FIRAC to consistently underpredict the measured temperatures in the fire compartment. The predicted soot deposition proved insensitive to ventilation system flow rate, although the measured values showed a flow-rate dependence. However, predicted soot deposition was of the same order of magnitude as measured soot deposition.

  16. Performance of mushroom fruiting for large scale commercial production

    International Nuclear Information System (INIS)

    Mat Rosol Awang; Rosnani Abdul Rashid; Hassan Hamdani Mutaat; Mohd Meswan Maskom

    2012-01-01

    The paper describes the determination of mushroom fruiting yield, which is vital to the economics of mushroom production. Consistency in mushroom yields enables revenues to be estimated and hence profitability to be predicted. Many growers have reported large variations in mushroom yields across different production runs. To assess such claims we ran four batches of mushroom fruiting, and the fruiting-body production performance is presented. (author)

  17. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale … of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data on installed capacity.

  18. Evaluation of enzymatic reactors for large-scale panose production.

    Science.gov (United States)

    Fernandes, Fabiano A N; Rodrigues, Sueli

    2007-07-01

    Panose is a trisaccharide consisting of a maltose molecule bonded to a glucose molecule by an alpha-1,6-glycosidic bond. This trisaccharide has potential for use in the food industry as a noncariogenic sweetener, as the oral flora does not ferment it. Panose can also be considered prebiotic, since it stimulates the growth of beneficial microorganisms such as lactobacilli and bifidobacteria while inhibiting the growth of undesired microorganisms such as E. coli and Salmonella. In this paper, the production of panose by enzymatic synthesis in batch and fed-batch reactors was optimized using a mathematical model developed to simulate the process. Results show that the fed-batch process is optimal, with a productivity of 11.23 g/(L·h) of panose, 51.5% higher than that of the batch reactor.

  19. Large-scale distribution of tritium in a commercial product

    International Nuclear Information System (INIS)

    Combs, F.; Doda, R.J.

    1979-01-01

    Tritium enters the environment from various sources, including nuclear reactor operations, weapons testing, natural production, and the manufacture, use and ultimate disposal of commercial products containing tritium. A recent commercial application of tritium in the United States of America involves the backlighting of liquid crystal displays (LCDs) in digital electronic watches. These watches are distributed through normal commercial channels to the general public. One million curies (1 MCi) of tritium was distributed in this product in 1977. This is a significant quantity compared with power-reactor-produced tritium (3 MCi yearly) or with naturally produced tritium (6 MCi yearly), and it is the single largest commercial application of tritium to date. The final disposition of tritium from large quantities of this product, after its useful life, must be estimated by considering the means of disposal and the possibility of dispersal of tritium concurrent with disposal. The most likely method of final disposition will be disposal in solid refuse; this includes burial in landfills and incineration. Burial in landfills will probably contain the tritium for its effective lifetime, whereas incineration will release all the tritium (as the oxide) to the atmosphere. The use and disposal of this product will be studied as part of an environmental study currently being prepared for the U.S. Nuclear Regulatory Commission. (author)

  20. Report of the Workshop on Petascale Systems Integration for LargeScale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of its useful life. Improving the state of the art in large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many factors inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner, along with the other major thrusts supported by funding agencies, in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  1. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  2. Large scale production of antitumor cucurbitacins from Ecballium ...

    African Journals Online (AJOL)

    ajl6

    2012-08-16

    Department of Plant Biotechnology, National Research Center, Cairo, 12622 Egypt. … Bioreactors play a vital role in the commercial production of secondary metabolites … comparing the peak area with that at the same retention time … air-dried on a rotary evaporator and then extracted using ethanol.

  3. Guide to Large Scale Production of Moringa oleifera (Lam.) for ...

    African Journals Online (AJOL)

    Thus, the use of environmentally safe natural pesticides as an alternative is being embraced in aquaculture, because their toxicity disappears quickly and they are biodegradable. This will advance the principles of sustainable aquaculture production and its management in Nigeria. In Southwestern Nigeria, there is …

  4. Quality Assurance in Large Scale Online Course Production

    Science.gov (United States)

    Holsombach-Ebner, Cinda

    2013-01-01

    The course design and development process (often referred to here as the "production process") at Embry-Riddle Aeronautical University (ERAU-Worldwide) aims to produce turnkey-style courses to be taught by a highly qualified pool of over 800 instructors. Given the high number of online courses and the tremendous number of live sections…

  5. The potential for large scale uses for fission product xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-01-01

    Of all the fission products in spent low-enrichment uranium power reactor fuels, xenon is produced in the highest yield: nearly one cubic metre (STP) per metric ton. In aged fuels that may be considered for processing in the U.S., radioactive xenon isotopes approach the lowest limits of detection. Separation from the accompanying radioactive ⁸⁵Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of ⁸⁵Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission product krypton during fuel processing. Economic constraints ensure that the bulk of this krypton will need to be separated from the much more voluminous xenon fraction of the total fission gas. Xenon may thus be discarded or made available for use at probably very low cost. In contrast with many other fission products, which have unique radioactive characteristics that make them useful as sources of heat, gamma and X-rays, and luminescence, as well as for medical diagnostics and therapeutics, fission product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the ¹³⁴Xe and ¹³⁶Xe isotopes. Therefore, fission product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent lighting in place of krypton, and in human anesthesia.

  6. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R&S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules based on multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) programme of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Science. The aim of the EET project is to reduce the cost of a solar module by 50%, by increasing the conversion efficiency as well as by developing cheap processes for large-scale production.

  7. Large Scale Production of Stem Cells and Their Derivatives

    Science.gov (United States)

    Zweigerdt, Robert

    Stem cells have been envisioned as an unlimited cell source for regenerative medicine. Notably, interest in stem cells extends beyond direct therapeutic applications. They might also provide a previously unavailable source of valuable human cell types for screening platforms, which could facilitate the development of more efficient and safer drugs. The heterogeneity of stem cell types, as well as the numerous areas of application, suggests that different processes are required for their in vitro culture. Many of the envisioned applications would require the production of high numbers of stem cells and their derivatives in a scalable, well-defined and potentially clinically compliant manner under current good manufacturing practice (cGMP). In this review we provide an overview of recent strategies to develop bioprocesses for the expansion, differentiation and enrichment of stem cells and their progeny, presenting examples for adult and embryonic stem cells alike.

  8. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large in scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Faced with similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction.

  9. Ethanol Production from Biomass: Large Scale Facility Design Project

    Energy Technology Data Exchange (ETDEWEB)

    Berson, R. Eric [Univ. of Louisville, KY (United States)

    2009-10-29

    High-solids processing of biomass slurries provides the following benefits: maximized product concentration in the fermentable sugar stream, reduced water usage, and reduced reactor size. However, high-solids processing poses mixing and heat-transfer problems above about 15% solids for pretreated corn stover, due to the high slurry viscosities. Highly viscous slurries also require high power consumption in conventional stirred tanks, since these must be run at high rotational speeds to maintain proper mixing. An 8-liter scraped-surface bioreactor (SSBR) is employed here, designed to efficiently handle high solids loadings for enzymatic saccharification of pretreated corn stover (PCS) while keeping power requirements on the order of those for low-viscosity liquids in conventional stirred tanks. Saccharification of biomass exhibits slow reaction rates and incomplete conversion, which may be attributed to enzyme deactivation and loss of activity through a variety of mechanisms. Enzyme deactivation is classified into two categories here: deactivation due to enzyme-substrate interactions, and deactivation due to all other factors, grouped together and termed "non-specific" deactivation. A study was conducted to investigate the relative extents of non-specific deactivation and deactivation due to enzyme-substrate interactions, and a model was developed that describes the kinetics of cellulose hydrolysis by accounting for the observed deactivation effects. Enzyme-substrate interactions had a much more significant effect on overall deactivation, with a deactivation rate constant about 20 times higher than the non-specific deactivation rate constant (0.35 h⁻¹ vs 0.018 h⁻¹). The model is well validated by the experimental data and predicts complete conversion of cellulose within 30 hours in the absence of enzyme-substrate interactions.
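
    The record reports only the two deactivation rate constants, not the model's functional form; purely as an illustrative sketch, a hydrolysis model with the two deactivation channels could be integrated as below (the bilinear hydrolysis law, its rate constant, the substrate-coupling term, and the initial loadings are all assumptions):

        from scipy.integrate import solve_ivp

        KD_ES = 0.35    # 1/h, enzyme-substrate deactivation (reported above)
        KD_NS = 0.018   # 1/h, non-specific deactivation (reported above)
        K_HYD = 0.04    # L/(g*h), assumed hydrolysis rate constant
        C_SAT = 50.0    # g/L, assumed substrate-coupling constant

        def rates(t, y):
            C, E = y                                      # cellulose, active enzyme (g/L)
            dC = -K_HYD * E * C                           # assumed bilinear hydrolysis law
            dE = -(KD_NS + KD_ES * C / (C + C_SAT)) * E   # deactivation rises with substrate
            return [dC, dE]

        sol = solve_ivp(rates, (0.0, 72.0), [150.0, 10.0])   # assumed loadings
        print(f"cellulose conversion after 72 h: {1 - sol.y[0, -1] / 150.0:.1%}")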

  10. Large Scale System Safety Integration for Human Rated Space Vehicles

    Science.gov (United States)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of lost lives in their attempts to develop human space travel systems. Integrated System Safety is a process, developed through years of experience (since before Apollo and Soyuz), for assessing the risks involved in space travel and preventing such losses. The intent of Integrated System Safety is to look at an entire program and put all the pieces together in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard-control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, once one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common ground between all parties to achieve

  11. Buffer provisioning for large-scale data-acquisition systems

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Froening, Holger; Vandelli, Wainer

    2018-01-01

    The data acquisition system of the ATLAS experiment, a major experiment at the Large Hadron Collider (LHC) at CERN, will go through a major upgrade in the next decade. The upgrade is driven by experimental physics requirements, calling for increased data rates on the order of 6 TB/s. By contrast, the data rate of the existing system is 160 GB/s. Among the changes in the upgraded system will be a very large buffer, with a projected size on the order of 70 PB. The buffer's role will be to decouple data production from online data processing, storing data for periods of up to 24 hours until it can be analyzed by the event processing system. The larger buffer will allow a new data recording strategy, providing additional margins to handle variable data rates. At the same time it will provide sensible trade-offs between buffering space and online processing capabilities. This compromise between the two resources will be possible since the data production cycle includes time periods where the experiment will not produ...
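
    As a rough consistency check (our arithmetic, not stated in the record, and assuming the full 70 PB corresponds to 24 hours of recording at a steady rate):

        \frac{70\ \mathrm{PB}}{24 \times 3600\ \mathrm{s}} \approx 0.8\ \mathrm{TB/s}

    i.e., the buffered stream would run well below the 6 TB/s front-end figure, consistent with the buffer sitting downstream of event selection.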

  12. A large-scale cryoelectronic system for biological sample banking

    Science.gov (United States)

    Shirley, Stephen G.; Durst, Christopher H. P.; Fuchs, Christian C.; Zimmermann, Heiko; Ihmig, Frank R.

    2009-11-01

    We describe a polymorphic electronic infrastructure for managing biological samples stored over liquid nitrogen. As part of this system we have developed new cryocontainers and carrier plates with attached Flash memory chips, so that a redundant and portable set of data travels with each sample. Our experimental investigations show that basic Flash operation and endurance are adequate for the application down to liquid nitrogen temperatures. This identification technology can provide the best possible sample identification, documentation and tracking, bringing added value to each sample. The first application of the system is in worldwide collaborative research towards the production of an AIDS vaccine. The functionality and versatility of the system can lead to an essential optimization of sample and data exchange for global clinical studies.

  13. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  14. Economic viability of large-scale fusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Helsley, Charles E., E-mail: cehelsley@fusionpowercorporation.com; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is needed for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, which can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific to the Fusion Power Corporation (FPC) SPRFD process but could be generalized to any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost at $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost-effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix, but synthetic fuels and water may also need to be incorporated to make the large system

  15. Economic viability of large-scale fusion systems

    International Nuclear Information System (INIS)

    Helsley, Charles E.; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is needed for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, which can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific to the Fusion Power Corporation (FPC) SPRFD process but could be generalized to any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost at $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost-effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix, but synthetic fuels and water may also need to be incorporated to make the large system economically
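
    Reading the quoted figures together (our back-of-envelope arithmetic, not stated in the record, and taking "35 conventional nuclear plants" to mean 35 units of the 1 GWe size cited in the opening sentence):

        \$20\,\mathrm{B} + 10 \times \$1.5\,\mathrm{B} = \$35\,\mathrm{B}
        \qquad\Rightarrow\qquad
        \frac{\$35\,\mathrm{B}}{35\ \mathrm{GW_e}} \approx \$1\ \mathrm{per\ W_e}

    of driver-plus-chamber capital, before generators and heat-exchange equipment are added; this is what underlies the low cost-per-kWh claim.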

  16. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and this structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable (with respect to parameters, disturbances, and states) while others may not. The state estimation process is then carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on the available measurements.
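
    Structural observability depends only on the sparsity patterns of the system matrices, so it can be checked generically with graph algorithms. The following is a minimal Python sketch of that idea, not the toolchain from the paper: it tests the two standard graph-theoretic conditions (output connectivity and full structural rank) for a pair (A, C), using networkx for the bipartite matching. The toy matrices are assumptions for illustration.

    ```python
    import networkx as nx
    import numpy as np

    def structural_rank(pattern):
        """Generic (structural) rank of a sparsity pattern via bipartite matching."""
        m, n = pattern.shape
        G = nx.Graph()
        rows = [("r", i) for i in range(m)]
        G.add_nodes_from(rows)
        G.add_nodes_from(("c", j) for j in range(n))
        G.add_edges_from((("r", i), ("c", j))
                         for i in range(m) for j in range(n) if pattern[i, j])
        matching = nx.bipartite.maximum_matching(G, top_nodes=rows)
        return len(matching) // 2          # dict lists each matched pair twice

    def structurally_observable(A, C):
        """Generic observability of the pair (A, C): every state must reach a
        measured state in the influence digraph, and s-rank([A; C]) must be n."""
        n = A.shape[0]
        G = nx.DiGraph()
        G.add_nodes_from(range(n))
        # x_i influences dx_j/dt when A[j, i] != 0  =>  edge i -> j
        G.add_edges_from((i, j) for j in range(n) for i in range(n) if A[j, i])
        sensed = {j for j in range(n) if np.any(C[:, j])}
        connected = all(any(nx.has_path(G, i, s) for s in sensed)
                        for i in range(n))
        return connected and structural_rank(np.vstack([A, C]) != 0) == n

    # Toy chain x1 -> x2 -> x3 with only x3 measured: structurally observable.
    A = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]])
    C = np.array([[0, 0, 1]])
    print(structurally_observable(A, C))   # True
    ```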

  17. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large-scale integration of wind power in future power systems when 50% of the load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish power system model with large-scale wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts the relevant dynamic features of the power plants and compensates for load-generation imbalances, caused by an inaccurate wind speed forecast, by an appropriate control of the active power production from the power plants.
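
    The control task described here can be illustrated with a one-area toy model: a wind forecast error creates a sudden generation deficit, primary (droop) control arrests the frequency deviation, and secondary control slowly redispatches conventional plants until the error is absorbed. This is a minimal sketch with illustrative parameters, not the Danish system model from the paper.

    ```python
    import numpy as np

    # Minimal one-area active power balance model (illustrative parameters).
    H, D = 5.0, 1.0          # inertia constant [s], load damping [pu/pu]
    R = 0.05                 # governor droop [pu]
    Ki = 0.3                 # secondary (AGC) integral gain
    dt, T = 0.1, 600.0       # time step and horizon [s]

    f_dev, p_sec = 0.0, 0.0  # frequency deviation [pu], secondary dispatch [pu]
    t = np.arange(0.0, T, dt)
    wind_error = 0.10 * (t > 60)   # 10% of load suddenly missing vs. forecast

    f_log = []
    for k in range(len(t)):
        p_primary = -f_dev / R                 # governor (droop) response
        imbalance = p_primary + p_sec - wind_error[k] - D * f_dev
        f_dev += dt * imbalance / (2 * H)      # swing equation
        p_sec += dt * Ki * (-f_dev)            # AGC integrates frequency error
        f_log.append(f_dev)

    print(f"max |df| = {max(abs(x) for x in f_log):.4f} pu")
    print(f"final secondary dispatch = {p_sec:.3f} pu (~ forecast error)")
    ```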

  18. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO2 emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m² of collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m²; systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants for housing developments in six European countries show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)
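
    The quoted investment and yield figures can be combined into rough cost metrics. The sketch below does this arithmetic; the investment/yield pairings, the seasonal-storage yield and the 20-year straight-line amortization (no interest, no O&M) are illustrative assumptions, not numbers from the study.

    ```python
    # Rough cost metrics from the figures quoted above.
    years = 20
    cases = {
        "short-term storage, best":    (300, 550),   # ECU/m², kWh/m² per year
        "short-term storage, worst":   (600, 300),
        "seasonal storage (2x invest)": (1200, 400),
    }
    for name, (invest, gain) in cases.items():
        ratio = invest / gain            # ECU per (kWh per year) of yield
        lcoh = invest / (gain * years)   # ECU per kWh, capital only
        print(f"{name:30s} {ratio:4.2f} ECU/(kWh/a)   {lcoh:.3f} ECU/kWh")
    # The planned long-term-storage plants' ratios of 0.7-1.5 ECU/(kWh/a)
    # compare favourably with the ~3 computed here for existing
    # seasonal-storage systems, i.e. roughly the reported 50% reduction.
    ```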

  19. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    Science.gov (United States)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is the key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. It is therefore very important to study the dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable and stable running of the conveyor. The dynamic research on, and applications of, large-scale belt conveyors are discussed, and the main research topics and the state of the art of dynamic research on belt conveyors are analyzed. Future work will focus on dynamic analysis, modeling and simulation of the main components and the whole system, as well as nonlinear modeling, simulation and vibration analysis of large-scale conveyor systems.
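
    The dynamic models referred to above typically discretize the belt into lumped masses joined by viscoelastic elements, so that longitudinal stress waves during start-up can be simulated. Below is a minimal sketch of such a model; all parameters (segment mass, stiffness, damping, drive force) are illustrative assumptions, not values from any real conveyor.

    ```python
    import numpy as np

    # Minimal lumped-mass belt model: N segments joined by Kelvin-Voigt
    # (spring + damper) elements, step drive force at node 0.
    N = 20
    m, k, c = 50.0, 2.0e5, 500.0   # kg, N/m, N*s/m per segment (assumed)
    F_drive = 5e3                  # N at the drive pulley
    dt, steps = 1e-4, 20000        # explicit integration, 2 s of start-up

    x = np.zeros(N)                # longitudinal displacement
    v = np.zeros(N)
    for _ in range(steps):
        f = np.zeros(N)
        f_int = k * (x[1:] - x[:-1]) + c * (v[1:] - v[:-1])
        f[:-1] += f_int            # element pulls its trailing node forward
        f[1:] -= f_int             # ... and its leading node backward
        f[0] += F_drive
        v += dt * f / m            # semi-implicit Euler
        x += dt * v

    tension = k * (x[1:] - x[:-1])
    print(f"head speed {v[0]:.2f} m/s, tail speed {v[-1]:.2f} m/s")
    print(f"peak element tension {tension.max():.0f} N")
    # Dynamic analysis checks that such start-up stress waves stay within
    # the belt's strength and the drive's capability.
    ```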

  20. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that, in support of its ability to improve the overall economic cost-effectiveness and flexibility of the Danish energy system, the financially feasible integration of large-scale heat pumps (HP) with existing combined heat and power (CHP) plants is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gases as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in The Nordic Power Exchange (Nord Pool) towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit at half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a potentially financially feasible alternative to CHP operation, rather than a supplement to CHP unit operation. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HP with lower than half of the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit's flue gases for both concurrent (CHP unit and HP unit) and independent operation (HP unit only). For independent operation, the recovered heat is required to be stored.
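
    The sensitivity to heat-source temperature works through the coefficient of performance (COP). The sketch below shows the mechanism with a Carnot-based COP and a constant quality factor; the sink temperature, quality factor, source temperatures and electricity price are all illustrative assumptions, not figures from the paper.

    ```python
    # How the heat-source temperature drives heat-pump heat production cost.
    def cop(t_source_c, t_sink_c=75.0, quality=0.5):
        """Carnot COP scaled by a constant quality factor (all assumed)."""
        t_sink = t_sink_c + 273.15
        return quality * t_sink / (t_sink - (t_source_c + 273.15))

    elec_price = 60.0   # EUR/MWh electricity (assumed)

    for name, t_src in [("ground source", 8.0), ("condensed flue gas", 45.0)]:
        c = cop(t_src)
        heat_cost = elec_price / c      # EUR/MWh heat, electricity cost only
        print(f"{name:20s} COP = {c:4.2f}  heat cost ~ {heat_cost:5.1f} EUR/MWh")
    # The warmer flue-gas source roughly doubles the COP relative to ground
    # source, which is the mechanism behind the cost differences reported above.
    ```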

  1. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  2. Large-scale enzymatic production of natural flavour esters in organic solvent with continuous water removal.

    Science.gov (United States)

    Gubicza, L; Kabiri-Badr, A; Keoves, E; Belafi-Bako, K

    2001-11-30

    A new, large-scale process was developed for the enzymatic production of low molecular weight flavour esters in organic solvent. Solutions for the elimination of substrate and product inhibitions are presented. The excess water produced during the process was continuously removed by hetero-azeotropic distillation and esters were produced at yields of over 90%.

  3. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  4. Very-large-scale production of antibodies in plants: The biologization of manufacturing.

    Science.gov (United States)

    Buyel, J F; Twyman, R M; Fischer, R

    2017-07-01

    Gene technology has facilitated the biologization of manufacturing, i.e. the use and production of complex biological molecules and systems at an industrial scale. Monoclonal antibodies (mAbs) are currently the major class of biopharmaceutical products, but they are typically used to treat specific diseases which individually have comparatively low incidences. The therapeutic potential of mAbs could also be used for more prevalent diseases, but this would require a massive increase in production capacity that could not be met by traditional fermenter systems. Here we outline the potential of plants to be used for the very-large-scale (VLS) production of biopharmaceutical proteins such as mAbs. We discuss the potential market sizes and their corresponding production capacities. We then consider available process technologies and scale-down models and how these can be used to develop VLS processes. Finally, we discuss which adaptations will likely be required for VLS production, lessons learned from existing cell culture-based processes and the food industry, and practical requirements for the implementation of a VLS process. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Development of Anti-Insect Microencapsulated Polypropylene Films Using a Large Scale Film Coating System.

    Science.gov (United States)

    Song, Ah Young; Choi, Ha Young; Lee, Eun Song; Han, Jaejoon; Min, Sea C

    2018-04-01

    Films containing microencapsulated cinnamon oil (CO) were developed using a large-scale production system to protect against the Indian meal moth (Plodia interpunctella). CO at concentrations of 0%, 0.8%, or 1.7% (w/w ink mixture) was microencapsulated with polyvinyl alcohol. The microencapsulated CO emulsion was mixed with ink (47% or 59%, w/w) and thinner (20% or 25%, w/w) and coated on polypropylene (PP) films. The PP film was then laminated with a low-density polyethylene (LDPE) film on the coated side. The film with microencapsulated CO at 1.7% repelled P. interpunctella most effectively. Microencapsulation did not negatively affect insect-repelling activity. The release rate of cinnamaldehyde, an active repellent, was lower when CO was microencapsulated than in its absence. Thermogravimetric analysis showed that microencapsulation prevented the volatilization of CO. The tensile strength, percentage elongation at break, elastic modulus, and water vapor permeability of the films indicated that microencapsulation did not affect the tensile and moisture barrier properties (P > 0.05). The results of this study suggest that effective films for the prevention of Indian meal moth invasion can be produced by the microencapsulation of CO using a large-scale film production system. Low-density polyethylene-laminated polypropylene films printed with ink incorporating microencapsulated cinnamon oil using a large-scale film production system effectively repelled Indian meal moth larvae. Without altering the tensile and moisture barrier properties of the film, microencapsulation resulted in the release of an active repellent for extended periods with a high thermal stability of cinnamon oil, enabling commercial film production at high temperatures. This anti-insect film system may have applications to other food-packaging films that use the same ink-printing platform. © 2018 Institute of Food Technologists®.

  6. Review of large-scale cryogenic systems for accelerators

    International Nuclear Information System (INIS)

    Horlitz, G.

    1992-01-01

    High-energy accelerators close to the TeV region or beyond require high magnetic fields with inductions of several tesla in the case of protons, and high electric field gradients in the case of electrons. Both types of fields can be realized only by superconductivity. Problems of producing refrigeration and transporting it over long distances are discussed, and different cooling possibilities are described. Solutions for existing machines as well as for future projects in the stages of construction, planning or proposal are presented. (author) 8 figs.; 2 tabs

  7. Power System Operation with Large Scale Wind Power Integration

    DEFF Research Database (Denmark)

    Suwannarat, A.; Bak-Jensen, B.; Chen, Z.

    2007-01-01

    The Danish power system is starting to face the problems of integrating thousands of megawatts of wind power, which is produced in a stochastic manner due to natural wind fluctuations. With wind power capacities increasing, the Danish Transmission System Operator (TSO) is faced with new challenges related to the uncertain nature of wind power. In this paper, proposed models of generation and control systems are presented, which analyze the deviation of power exchange at the western Danish-German border, taking into account the fluctuating nature of wind power. The performance of the secondary control of the thermal power plants and of the spinning reserve control from the Combined Heat and Power (CHP) units in achieving active power balance with the increased wind power penetration is presented.

  8. Testing of valves and associated systems in large scale experiments

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    The system examples dealt with are selected so that they cover a wide spectrum of technical tasks and limits. The flowing medium therefore varies from pure steam flow, via a mixed flow of steam and water, to pure water flow. The valves concerned include those whose main function is opening as well as those whose main function is secure closing. There is a certain limitation in that the examples are taken from Boiling Water Reactor technology. The main procedure in valve and system testing described here is, of course, not limited to the selected examples, but applies generally in power station and process technology. (orig./HAG)

  9. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
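
    The quoted laboratory figures can be cross-checked with standard constants. The sketch below relates the 15 kW scale to the 5500 L/hr hydrogen rate; treating the 15 kW as the electrical input is an assumption made for this back-of-the-envelope check.

    ```python
    # Consistency check on the 15 kW / 5500 L/hr figures quoted above.
    MOLAR_VOLUME = 22.4        # L/mol at 0 degC, 1 atm (assumed reference)
    HHV = 286e3                # J/mol, higher heating value of H2

    rate = 5500.0 / MOLAR_VOLUME / 3600.0       # mol/s of hydrogen
    specific_energy = 15e3 / rate               # J of electricity per mol

    print(f"H2 rate:           {rate*1000:.1f} mmol/s")
    print(f"electrical input:  {specific_energy/1e3:.0f} kJ/mol")
    print(f"H2 HHV:            {HHV/1e3:.0f} kJ/mol")
    # ~220 kJ/mol of electricity is below the 286 kJ/mol HHV of the product:
    # high-temperature process heat supplies the balance, which is exactly
    # the efficiency advantage of HTE over low-temperature electrolysis.
    ```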

  10. Enabling parallel simulation of large-scale HPC network systems

    International Nuclear Information System (INIS)

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; Carns, Philip

    2016-01-01

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.
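
    For readers unfamiliar with discrete-event network simulation, the toy sketch below shows the core mechanic that frameworks such as CODES/ROSS refine to flit-level detail and parallelize: events ordered in a time-stamped queue, with link contention resolved as packets are dequeued. It is sequential, not optimistic-parallel, and all delays, bandwidths and traffic are illustrative assumptions.

    ```python
    import heapq
    import random

    # Toy sequential discrete-event simulation: packets contending for one link.
    LINK_DELAY = 50e-9      # s per hop (assumed)
    BANDWIDTH = 1e10        # bytes/s (assumed)
    PKT_SIZE = 1024         # bytes

    events = []             # (time, seq, handler, payload) min-heap
    seq = 0
    def schedule(t, handler, payload):
        global seq
        heapq.heappush(events, (t, seq, handler, payload))
        seq += 1

    link_free_at = 0.0
    latencies = []

    def send(t, pkt_id):
        global link_free_at
        start = max(t, link_free_at)              # queue if the link is busy
        link_free_at = start + PKT_SIZE / BANDWIDTH
        schedule(link_free_at + LINK_DELAY, recv, (pkt_id, t))

    def recv(t, payload):
        pkt_id, t_sent = payload
        latencies.append(t - t_sent)

    for i in range(1000):                          # a burst of arrivals
        schedule(random.uniform(0.0, 1e-5), send, i)

    while events:                                  # the event loop
        t, _, handler, payload = heapq.heappop(events)
        handler(t, payload)

    print(f"mean packet latency: {sum(latencies)/len(latencies)*1e9:.0f} ns")
    ```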

  11. Energy Constraints for Building Large-Scale Systems

    Science.gov (United States)

    2016-03-17

    although most systems built to date do not consider these issues as primary constraints. Keywords: Neuromorphic Engineering; Cortical Operation...2Mbyte, 32bit input data, and 1Mbyte, 32bit output data, results in 3.1mW (Vdd = 2.5V) of power, even though one might find a DSP chip computing at...4MMAC(/s)/mW power efficiency [5], close to the power / energy efficiency wall [6]. A memory chip or data source further away requires even higher

  12. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in the smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performance of the energy systems and to balance the power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables.
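
    At its core, each EMPC iteration solves an economic dispatch problem over a receding horizon. The sketch below sets up one such horizon as a linear program (two generators plus a battery meeting a varying load at minimum cost) using scipy; all capacities, prices and the load profile are illustrative assumptions, and a real EMPC would re-solve the problem at every sampling instant with updated forecasts.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # One receding-horizon step of a toy economic dispatch.
    T = 24
    load = 50 + 20 * np.sin(np.linspace(0, 2 * np.pi, T))       # MW
    c_cheap, c_peak = 20.0, 80.0                                # EUR/MWh
    cap_cheap, cap_peak = 55.0, 40.0                            # MW
    bat_p, bat_e, e0 = 10.0, 40.0, 20.0                         # MW, MWh, MWh

    # Variables per step: [p_cheap, p_peak, p_bat]; p_bat > 0 = discharging.
    c = np.tile([c_cheap, c_peak, 0.0], T)
    A_eq = np.zeros((T, 3 * T))
    for t in range(T):
        A_eq[t, 3 * t:3 * t + 3] = 1.0       # power balance at each step
    A_ub, b_ub = [], []
    for t in range(T):                        # battery energy limits
        row = np.zeros(3 * T)
        row[2:3 * t + 3:3] = 1.0              # cumulative battery discharge
        A_ub.append(row)
        b_ub.append(e0)                       # stored energy never below 0
        A_ub.append(-row)
        b_ub.append(bat_e - e0)               # never above capacity
    bounds = [(0, cap_cheap), (0, cap_peak), (-bat_p, bat_p)] * T

    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  A_eq=A_eq, b_eq=load, bounds=bounds, method="highs")
    print(f"horizon cost: {res.fun:.0f} EUR")
    ```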

  13. Program planning for large-scale control system upgrades

    International Nuclear Information System (INIS)

    Madani, M.; Giajnorio, J.; Richard, T.; Ho, D.; Volk, W.; Ertel, A.

    2011-01-01

    Bruce Power has been planning to replace the Bruce A Fuel Handling (FH) computer systems, including the Controller and Protective computers, for many years. This is a complex project requiring an extended FH outage. To minimize operational disruption and fully identify the associated project risks, Bruce Power is executing the project in phases, starting with the Protective computer replacement. GEH-C is collaborating with Bruce Power in a Preliminary Engineering (PE) phase to generate a project plan including specifications, budgetary cost, schedule and risks for the Protective computer replacement project. To assist Bruce Power in its evaluation, GEH-C is using Six Sigma methodologies to identify and rank Critical to Quality (CTQ) requirements in collaboration with the Bruce Power Maintenance, Operations, Plant Design and FH Engineering teams. The PE phase established the project scope, hardware and software specifications and material requirements, and finally concluded with a recommended hardware platform and an approved controls architecture.

  14. Exergy analysis of refrigerators for large scale cooling systems

    Energy Technology Data Exchange (ETDEWEB)

    Loehlein, K [Sulzer Cryogenics, Winterthur (Switzerland); Fukano, T [Nippon Sanso Corp., Kawasaki (Japan)

    1993-01-01

    Facilities with superconducting magnets require cooling capacity at different temperature levels and of different types (refrigeration or liquefaction). The bigger the demand for refrigeration, the more investment for improved efficiency of the refrigeration plant is justified and desired. Refrigeration cycles are built with discrete components like expansion turbines, cold compressors, etc. Therefore the exergetic efficiency of producing refrigeration at a distinct temperature level is significantly dependent on the 'thermodynamic arrangement' of these components. Among a variety of possibilities, limited by the range of applicability of the components, one has to choose the best design for higher efficiency on every level. Some influences are quantified and aspects are given for an optimal integration of the refrigerator into the whole cooling system. (orig.).

  15. Operation Modeling of Power Systems Integrated with Large-Scale New Energy Power Sources

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-10-01

    Full Text Available In most current methods of probabilistic power system production simulation, the output characteristics of new energy power generation (NEPG) have not been comprehensively considered. In this paper, the power output characteristics of wind power generation and photovoltaic power generation are first analyzed based on statistical methods according to their historical operating data. Then the characteristic indexes and the filtering principle of the NEPG historical output scenarios are introduced with the confidence level, and the calculation model of NEPG's credible capacity is proposed. Based on this, taking the minimum production costs or the best energy-saving and emission-reduction effect as the optimization objective, the power system operation model with large-scale integration of NEPG is established, considering the power balance, the electricity balance and the peak balance. Besides, the constraints of the operating characteristics of different power generation types, the maintenance schedule, the load reserve, the emergency reserve, the water abandonment and the transmitting capacity between different areas are also considered. With the proposed power system operation model, operation simulations are carried out based on the actual Northwest power grid of China, which resolve the accommodation of new energy power under different system operating conditions. The simulation results verify the validity of the proposed power system operation model in the accommodation analysis for power systems with large-scale NEPG penetration.
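
    The credible-capacity idea can be illustrated in a few lines: given a history of plant output, the capacity credible at a given confidence level is the output level that was exceeded with that probability. The sketch below uses synthetic data and a simple quantile; the paper's actual indexes and scenario filtering are more elaborate.

    ```python
    import numpy as np

    # Sketch of a confidence-level "credible capacity" (synthetic history).
    rng = np.random.default_rng(0)
    capacity = 100.0                                        # MW installed
    output = capacity * rng.beta(1.2, 3.0, size=8760)       # one year, hourly

    for confidence in (0.90, 0.95, 0.99):
        credible = np.quantile(output, 1.0 - confidence)
        print(f"exceeded {confidence:.0%} of hours: {credible:5.1f} MW "
              f"({credible/capacity:.0%} of nameplate)")
    ```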

  16. Line Capacity Expansion and Transmission Switching in Power Systems With Large-Scale Wind Power

    DEFF Research Database (Denmark)

    Villumsen, Jonas Christoffer; Bronmo, Geir; Philpott, Andy B.

    2013-01-01

    In 2020 electricity production from wind power should constitute nearly 50% of electricity demand in Denmark. In this paper we look at optimal expansion of the transmission network in order to integrate 50% wind power in the system, while minimizing total fixed investment cost and expected cost of power generation. We allow for active switching of transmission elements to reduce congestion effects caused by Kirchhoff's voltage law. Results show that actively switching transmission lines may yield a better utilization of transmission networks with large-scale wind power and increase wind power....

  17. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  18. The use of soil moisture - remote sensing products for large-scale groundwater modeling and assessment

    NARCIS (Netherlands)

    Sutanudjaja, E.H.

    2012-01-01

    In this thesis, the possibilities of using spaceborne remote sensing for large-scale groundwater modeling are explored. We focus on a soil moisture product called European Remote Sensing Soil Water Index (ERS SWI, Wagner et al., 1999) - representing the upper profile soil moisture. As a test-bed, we

  19. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  20. Security and VO management capabilities in a large-scale Grid operating system

    OpenAIRE

    Aziz, Benjamin; Sporea, Ioana

    2014-01-01

    This paper presents a number of security and VO management capabilities in a large-scale distributed Grid operating system. The capabilities formed the basis of the design and implementation of a number of security and VO management services in the system. The main aim of the paper is to provide some idea of the various functionality cases that need to be considered when designing similar large-scale systems in the future.

  1. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  2. A compact to revitalise large-scale irrigation systems: A ‘theory of change’ approach

    Directory of Open Access Journals (Sweden)

    Bruce A. Lankford

    2016-02-01

    Full Text Available In countries with transitional economies such as those found in South Asia, large-scale irrigation systems (LSIS) with a history of public ownership account for about 115 million ha (Mha), or approximately 45% of their total area under irrigation. In terms of the global area of irrigation (320 Mha for all countries), LSIS are estimated at 130 Mha or 40% of irrigated land. These systems can potentially deliver significant local, regional and global benefits in terms of food, water and energy security, employment, economic growth and ecosystem services. For example, primary crop production is conservatively valued at about US$355 billion. However, efforts to enhance these benefits and reform the sector have been costly and outcomes have been underwhelming and short-lived. We propose the application of a 'theory of change' (ToC) as a foundation for promoting transformational change in large-scale irrigation, centred upon a 'global irrigation compact' that promotes new forms of leadership, partnership and ownership (LPO). The compact argues that LSIS can change by switching away from the current channelling of aid finances controlled by government irrigation agencies. Instead it is for irrigators, closely partnered by private, public and NGO advisory and regulatory services, to develop strong leadership models and to find new compensatory partnerships with cities and other river basin neighbours. The paper summarises key assumptions for change in the LSIS sector, including the need to initially test this change via a handful of volunteer systems. Our other key purpose is to demonstrate a ToC template by which large-scale irrigation policy can be better elaborated and discussed.

  3. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 x 10^8 peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  4. The linac control system for the large-scale synchrotron radiation facility (SPring-8)

    Energy Technology Data Exchange (ETDEWEB)

    Sakaki, Hironao; Yoshikawa, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Itoh, Yuichi [Atomic Energy General Services Corporation, Tokai, Ibaraki (Japan); Terashima, Yasushi [Information Technology System Co., Ltd. (ITECS), Tokyo (Japan)

    2000-09-01

    The linac for the large-scale synchrotron radiation facility has been operated since August 1996 and has handled user requests without any major trouble. In this report, the control system development policy, its details, and the operation of the linac are presented. These experiences are also described so that they can be used for the control system of the large-scale proton accelerators to be developed in the High Intensity Proton Accelerator Project. (author)

  5. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in developing a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  6. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large-scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems, the validation limits, a general validation concept and the supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  7. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analysis with focus on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international market for electricity, by locating exports in hours of high prices, are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. The ability is expressed by three factors: the first is the degree of electricity excess production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emissions in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced in CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies for large-scale integration of wind power.

  8. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    OpenAIRE

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been...

  9. Development of large scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program for the development of a large-scale wind energy conversion system. The study on technological development of key components evaluates the performance of, and confirms the reliability and applicability of, hydraulic systems, centered on those equipped with variable pitch mechanisms and the electrohydraulic servo valves that control them. The study on blades conducts fatigue and crack-propagation tests, which show that the blades developed have high strength. The study on speed-increasing gears conducts load tests, confirming the effects of reducing vibration and noise by modification of the gear teeth. The study on the nacelle cover conducts vibration tests to confirm its vibration characteristics, and analyzes three-dimensional vibration by the finite element method. Some components for a 500 kW commercial windmill are fabricated, including rotor heads, variable pitch mechanisms, speed-increasing gears, yaw systems, and hydraulic control systems. The others fabricated include a remote supervisory control system for maintenance, a system to integrate the windmill into a power system, and electrical control devices in which site conditions, such as atmospheric temperature and lightning, are taken into consideration.

  10. Constructing Model of Relationship among Behaviors and Injuries to Products Based on Large Scale Text Data on Injuries

    Science.gov (United States)

    Nomori, Koji; Kitamura, Koji; Motomura, Yoichi; Nishida, Yoshifumi; Yamanaka, Tatsuhiro; Komatsubara, Akinori

    In Japan, childhood injury prevention is an urgent issue. Safety measures based on knowledge created from injury data are essential for preventing childhood injuries, and the injury prevention approach by product modification is especially important. Risk assessment is one of the most fundamental methods for designing safe products, but conventional risk assessment has been carried out subjectively because product makers have little data on injuries. This paper deals with evidence-based risk assessment, for which artificial intelligence technologies are strongly needed. It describes a new method of foreseeing the usage of products, which is the first step of evidence-based risk assessment, and presents a retrieval system for injury data. The system enables a product designer to foresee how children use a product and which types of injuries occur due to the product in the daily environment. The developed system consists of large-scale injury data, text mining technology and probabilistic modeling technology. Large-scale text data on childhood injuries was collected from medical institutions by an injury surveillance system. Types of behaviors toward a product were derived from the injury text data using text mining technology. The relationship among products, types of behaviors, types of injuries and characteristics of children was modeled by a Bayesian network. The fundamental functions of the developed system and examples of new findings obtained with it are reported in this paper.

  11. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing the early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 x 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz, and the corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cultures reached a maximum permittivity value; however, only the permittivity profile of infected cultures reached a second maximum. This effect was correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process gives sufficient time for an additional 18 runs a year, corresponding to an extra production of ~2 x 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
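
    The throughput claim is easy to sanity-check. In the sketch below, the yield per run comes from the abstract, while the assumed 5-day batch cycle (production plus turnaround) is an illustrative figure chosen so that a 24-h saving reproduces the stated 18 extra runs per year:

    ```python
    # Sanity check of the throughput gain claimed above.
    hours_per_year = 365 * 24
    yield_per_run = 1e15            # particles per 40-L run (from the abstract)

    old_cycle = 120                 # h, harvest at 72 hpi + assumed turnaround
    new_cycle = old_cycle - 24      # harvesting at 48 hpi saves 24 h per run

    extra_runs = hours_per_year // new_cycle - hours_per_year // old_cycle
    print(f"extra runs per year: {extra_runs}")                      # ~18
    print(f"extra particles:     {extra_runs * yield_per_run:.1e}")  # ~2e16
    ```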

  12. Backup flexibility classes in emerging large-scale renewable electricity systems

    International Nuclear Information System (INIS)

    Schlachtberger, D.P.; Becker, S.; Schramm, S.; Greiner, M.

    2016-01-01

    Highlights: • Flexible backup demand in a European wind and solar based power system is modelled. • Three flexibility classes are defined based on production and consumption timescales. • Seasonal backup capacities are shown to be only used below 50% renewable penetration. • Large-scale transmission between countries can reduce fast flexible capacities. - Abstract: High shares of intermittent renewable power generation in a European electricity system will require flexible backup power generation on the dominant diurnal, synoptic, and seasonal weather timescales. The same three timescales are already covered by today’s dispatchable electricity generation facilities, which are able to follow the typical load variations on the intra-day, intra-week, and seasonal timescales. This work aims to quantify the changing demand for those three backup flexibility classes in emerging large-scale electricity systems, as they transform from low to high shares of variable renewable power generation. A weather-driven modelling is used, which aggregates eight years of wind and solar power generation data as well as load data over Germany and Europe, and splits the backup system required to cover the residual load into three flexibility classes distinguished by their respective maximum rates of change of power output. This modelling shows that the slowly flexible backup system is dominant at low renewable shares, but its optimized capacity decreases and drops close to zero once the average renewable power generation exceeds 50% of the mean load. The medium flexible backup capacities increase for modest renewable shares, peak at around a 40% renewable share, and then continuously decrease to almost zero once the average renewable power generation becomes larger than 100% of the mean load. The dispatch capacity of the highly flexible backup system becomes dominant for renewable shares beyond 50%, and reaches its maximum around a 70% renewable share. For renewable shares
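
    Since the flexibility classes are defined by timescales of the residual load, the idea can be illustrated by decomposing a residual load series into seasonal, synoptic and diurnal bands and comparing their capacities and ramp rates. The sketch below does this with nested moving averages on synthetic data; it mimics the spirit of the paper's classification, not its actual model.

    ```python
    import numpy as np

    # Toy decomposition of a residual load series into the three timescale
    # bands behind the backup flexibility classes. Synthetic data only.
    rng = np.random.default_rng(1)
    t = np.arange(3 * 8760)                              # hourly, 3 years
    residual = (0.3 * np.sin(2 * np.pi * t / 8760)       # seasonal swing
                + 0.2 * np.sin(2 * np.pi * t / 24)       # diurnal swing
                + 0.3 * rng.standard_normal(t.size))     # weather noise

    def moving_avg(x, window):
        return np.convolve(x, np.ones(window) / window, mode="same")

    seasonal = moving_avg(residual, 30 * 24)             # slower than a month
    synoptic = moving_avg(residual, 36) - seasonal       # days to weeks
    diurnal = residual - seasonal - synoptic             # intra-day remainder

    for name, comp in (("seasonal", seasonal), ("synoptic", synoptic),
                       ("diurnal", diurnal)):
        print(f"{name:8s} capacity {np.abs(comp).max():.2f}  "
              f"max hourly ramp {np.abs(np.diff(comp)).max():.3f}")
    # Slow backup covers the seasonal band, medium the synoptic band, and
    # highly flexible units the fast diurnal/noise remainder.
    ```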

  13. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters, which produce large quantities of sediment and thereby impair the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photos and video will not only help people make appropriate decisions, but also raise the major challenge of processing the data and adding value to it. The study defined basic data formats and standards for the various types of data collected about these reservoirs and then provided a management platform based on these formats and standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system was built with the ability both to provide and to receive information, so that users can operate it on different types of devices. IT technology progresses extremely quickly, and the most modern system might be out of date at any time. In order to provide long-term service, the system therefore reserves the possibility of user-defined data formats and standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology.

  14. Metoder for Modellering, Simulering og Regulering af Større Termiske Processer anvendt i Sukkerproduktion. Methods for Modelling, Simulation and Control of Large Scale Thermal Systems Applied in Sugar Production

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    The subject of this Ph.D. thesis is to investigate and develop methods for modelling, simulation and control applicable to large-scale thermal industrial plants. An ambition has been to evaluate the results in a physical process, and sugar production is well suited for the purpose. In collaboration ... a simulator has been developed. The simulator handles the normal working conditions relevant to control engineers. A non-linear dynamic model based on mass and energy balances has been developed, and the model parameters have been adjusted to data measured at a Danish sugar plant. The simulator consists of a computer, a data terminal and an electric interface corresponding to the interface at the sugar plant. The simulator operates in real time, and thus a realistic test of controllers is possible. The idiomatic control methodology has been investigated by developing a control concept for the evaporation....
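
    The mass and energy balances mentioned above are the core of such a model. As a minimal illustration, the sketch below computes the steady-state balance of a single evaporator effect (dry substance conserved, water boiled off, heating steam set by the latent-heat demand); all numbers are illustrative assumptions, not plant data:

    ```python
    # Steady-state mass/energy balance for a single evaporator effect, the
    # building block of such sugar-plant models. Illustrative numbers only.
    feed = 100.0                 # t/h thin juice into the effect
    x_in, x_out = 0.15, 0.25     # dry-substance mass fractions in/out
    h_vap = 2200.0               # kJ/kg latent heat at effect pressure
    cp, dT = 3.8, 10.0           # kJ/(kg K) juice heat capacity, preheat rise

    solids = feed * x_in               # t/h dry substance (conserved)
    concentrate = solids / x_out       # t/h thick juice leaving
    vapour = feed - concentrate        # t/h water evaporated

    heat = vapour * 1e3 * h_vap + feed * 1e3 * cp * dT     # kJ/h
    steam = heat / (h_vap * 1e3)                           # t/h heating steam
    print(f"vapour {vapour:.1f} t/h, concentrate {concentrate:.1f} t/h, "
          f"steam demand {steam:.1f} t/h")
    ```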

  15. Distributed and hierarchical control techniques for large-scale power plant systems

    International Nuclear Information System (INIS)

    Raju, G.V.S.; Kisner, R.A.

    1985-08-01

    In large-scale systems, integrated and coordinated control functions are required to maximize plant availability, to allow maneuverability through various power levels, and to meet externally imposed regulatory limitations. Nuclear power plants are large-scale systems. Prime subsystems are those that contribute directly to the behavior of the plant's ultimate output. The prime subsystems in a nuclear power plant include reactor, primary and intermediate heat transport, steam generator, turbine generator, and feedwater system. This paper describes and discusses the continuous-variable control system developed to supervise prime plant subsystems for optimal control and coordination

  16. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population growth and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, there are few studies that dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop growth varies for each region and each crop; to deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
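
    For readers unfamiliar with the calibration step, the sketch below shows a minimal Metropolis (MCMC) sampler of the kind used to quantify such region-specific parameters. The forward model here is just a constant-mean stand-in with a Gaussian prior and likelihood (all toy assumptions); in the actual study the forward model is the crop simulator itself:

    ```python
    import numpy as np

    # Minimal Metropolis sampler for one model parameter (toy stand-in).
    rng = np.random.default_rng(42)
    obs = 5.0 + 0.5 * rng.standard_normal(30)      # observed yields, t/ha

    def log_post(theta):
        # prior: theta ~ N(4, 2^2); likelihood: obs ~ N(theta, 0.5^2)
        lp = -0.5 * ((theta - 4.0) / 2.0) ** 2
        ll = -0.5 * np.sum(((obs - theta) / 0.5) ** 2)
        return lp + ll

    theta, samples = 4.0, []
    for _ in range(20000):
        prop = theta + 0.1 * rng.standard_normal()  # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                            # accept
        samples.append(theta)

    burn = np.array(samples[5000:])                 # discard burn-in
    print(f"posterior mean {burn.mean():.2f} +/- {burn.std():.2f} t/ha")
    ```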

  17. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named "Verification of Grid Stabilization with Large-scale PV Power Generation Systems" was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.
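
    A common output control scheme of the kind evaluated in such projects is moving-average smoothing: the battery absorbs the difference between raw PV output and a smoothed grid-injection target, capping ramp rates. The sketch below illustrates this on synthetic data; the plant size, smoothing window and cloud model are assumptions, not project parameters:

    ```python
    import numpy as np

    # Toy moving-average smoothing of PV output with a battery absorbing the
    # difference between raw output and the smoothed grid-injection target.
    rng = np.random.default_rng(7)
    t = np.arange(0, 12 * 3600, 10)                     # 10-s samples, 12 h
    clear = 40 * np.sin(np.pi * t / t[-1]) ** 2         # MW, clear-sky shape
    dips = 1 - 0.5 * (rng.random(t.size) < 0.02)        # random cloud dips
    pv = np.convolve(clear * dips, np.ones(6) / 6, mode="same")  # ~1-min edges

    window = 360                                        # 1-h moving average
    target = np.convolve(pv, np.ones(window) / window, mode="same")
    battery = target - pv                               # + discharge, - charge

    per_min = 6                                         # 10-s steps per minute
    print(f"max PV ramp   {np.abs(np.diff(pv)).max() * per_min:.2f} MW/min")
    print(f"max grid ramp {np.abs(np.diff(target)).max() * per_min:.2f} MW/min")
    print(f"battery power needed: +/-{np.abs(battery).max():.1f} MW")
    ```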

  18. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state-of-the-art survey of the currently available electrolysis modules was made. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers is discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)

  19. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel, high-quality research approaches that relate the quality of software architecture to system requirements, system architecture and enterprise architecture, or software testing. Modern software

  20. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by ongoing environmental degradation through processes such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of the available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although the estimated gains in food production have deliberately been overestimated and the losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  1. LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY

    International Nuclear Information System (INIS)

    SCHULTZ, K.R.; BROWN, L.C.; BESENBRUCH, G.E.; HAMILTON, C.J.

    2003-01-01

    OAK B202 LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY. The ''Hydrogen Economy'' will reduce petroleum imports and greenhouse gas emissions. However, current commercial hydrogen production processes use fossil fuels and release carbon dioxide. Hydrogen produced from nuclear energy could avoid these concerns. The authors have recently completed a three-year project for the US Department of Energy whose objective was to ''define an economically feasible concept for production of hydrogen, by nuclear means, using an advanced high-temperature nuclear reactor as the energy source''. Thermochemical water-splitting, a chemical process that accomplishes the decomposition of water into hydrogen and oxygen, met this objective. The goal of the first phase of this study was to evaluate thermochemical processes which offer the potential for efficient, cost-effective, large-scale production of hydrogen and to select one for further detailed consideration; the authors selected the Sulfur-Iodine cycle. In the second phase, they reviewed all the basic reactor types for suitability to provide the high-temperature heat needed by the selected thermochemical water-splitting cycle and chose the helium gas-cooled reactor. In the third phase they designed the chemical flowsheet for the thermochemical process and estimated the efficiency and cost of the process and the projected cost of producing hydrogen. These results are summarized in this paper.

  2. Dynamic Reactive Power Compensation of Large Scale Wind Integrated Power System

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    Due to progressive displacement of conventional power plants by wind turbines, the dynamic security of large-scale wind-integrated power systems gets significantly compromised. In this paper we first highlight the importance of dynamic reactive power support/voltage security in large-scale wind-integrated power systems with least presence of conventional power plants. Then we propose a mixed-integer dynamic optimization based method for optimal dynamic reactive power allocation in large-scale wind-integrated power systems. One of the important aspects of the proposed methodology is that, unlike … i) wind turbines, especially wind farms with additional grid support functionalities like dynamic support (e.g. dynamic reactive power support), and ii) refurbishment of existing conventional central power plants to synchronous condensers, could be one of the efficient, reliable and cost-effective options …

  3. How to correct long-term system externality of large scale wind power development by a capacity mechanism?

    International Nuclear Information System (INIS)

    Cepeda, Mauricio; Finon, Dominique

    2013-04-01

    This paper deals with the practical problems related to long-term security of supply in electricity markets in the presence of large-scale wind power development. The success of renewable promotion schemes adds a new dimension to ensuring long-term security of supply. It necessitates designing second-best policies to prevent large-scale wind power development from distorting long-run equilibrium prices and investments in conventional generation, in particular in peaking units. We rely upon a long-term simulation model which simulates electricity market players' investment decisions in a market regime and incorporates large-scale wind power development, either in the presence of subsidised wind production or in market-driven development. We test the use of capacity mechanisms to compensate for the long-term effects of large-scale wind power development on system reliability. The first finding is that capacity mechanisms can help to reduce the social cost of large-scale wind power development in terms of reduced loss-of-load probability. The second finding is that, in a market-based wind power deployment without subsidy, wind generators are penalized for insufficient contribution to the long-term reliability of the system. (authors)

  4. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists that are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and
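
    The container workflow that removes this installation barrier can be sketched as follows. The image tag, Dockerfile contents and entrypoint script are hypothetical placeholders, not the actual LIS/SUMMA artifacts; only the docker CLI calls themselves are standard.

```python
# Hedged sketch of a containerized run: build an image once (with OS,
# compilers and libraries pinned in the Dockerfile), then run it anywhere.
import subprocess

IMAGE = "lis-summa:demo"   # hypothetical image tag

# Build the image from a Dockerfile in the current directory.
subprocess.run(["docker", "build", "-t", IMAGE, "."], check=True)

# Run a simulation inside the container, mounting a host directory for output.
subprocess.run(
    ["docker", "run", "--rm",
     "-v", "/data/lis_run:/work",        # host path : container path
     IMAGE, "./run_simulation.sh"],      # hypothetical entrypoint script
    check=True,
)
```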

  5. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme, jointly developed by the Chinese and Danish governments. In the project, Danish know-how on solar heating plants and solar heating test technology was transferred from Denmark to China, large solar heating systems were promoted in China, test capabilities for solar collectors and large-scale solar heating systems were improved in China, and Danish-Chinese cooperation …

  6. Model of large scale man-machine systems with an application to vessel traffic control

    NARCIS (Netherlands)

    Wewerinke, P.H.; van der Ent, W.I.; ten Hove, D.

    1989-01-01

    Mathematical models are discussed to deal with complex large-scale man-machine systems such as vessel (air, road) traffic and process control systems. Only interrelationships between subsystems are assumed. Each subsystem is controlled by a corresponding human operator (HO). Because of the

  7. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of

  8. Compensating active power imbalances in power system with large-scale wind power penetration

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2016-01-01

    Large-scale wind power penetration can affect the supply continuity in the power system. This is a matter of high priority to investigate, as more regulating reserves and specified control strategies for generation control are required in the future power system with even higher wind power penetrat…

  9. A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Lipeng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Feiyi [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Oral, H. Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cao, Qing [Univ. of Tennessee, Knoxville, TN (United States)

    2014-11-01

    High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient in assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and it performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present the preliminary results.
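
    The core of such a simulation framework can be illustrated by a small Monte Carlo estimate of system lifetime under a non-exponential failure model. The Weibull parameters and the k-of-n redundancy group below are illustrative assumptions, not parameters of the ORNL system.

```python
# Minimal Monte Carlo sketch: draw component lifetimes from a Weibull
# distribution and estimate the lifetime of a k-of-n redundancy group.
import numpy as np

rng = np.random.default_rng(2)

def system_lifetime(n=10, k=8, shape=1.5, scale=5.0, trials=100_000):
    """Lifetime of a k-of-n group: fails when fewer than k components survive."""
    lifetimes = scale * rng.weibull(shape, size=(trials, n))   # years
    order = np.sort(lifetimes, axis=1)
    # the group dies at the (n-k+1)-th component failure
    return order[:, n - k]

t = system_lifetime()
print("estimated MTTF: %.2f years, 5th percentile: %.2f years"
      % (t.mean(), np.percentile(t, 5)))
```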

  10. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during run I relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources being in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for run II, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  11. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources being in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  12. System Dynamics Simulation of Large-Scale Generation System for Designing Wind Power Policy in China

    Directory of Open Access Journals (Sweden)

    Linna Hou

    2015-01-01

    This paper focuses on the impacts of renewable energy policy on a large-scale power generation system, including thermal power, hydropower, and wind power generation. As one of the most important clean energy sources, wind energy has developed rapidly around the world. In recent years, however, serious waste of wind power equipment and investment in China has led to many problems in the industry, from wind power planning to grid integration. One way of overcoming the difficulty is to analyze the influence of wind power policy on the generation system. This paper builds a system dynamics (SD) model of energy generation to simulate the outcomes of wind energy generation policies in a complex system, and a scenario analysis method is used to compare the effectiveness and efficiency of these policies. The case study shows that combinations of a lower portfolio goal with a higher benchmark price, and of a higher portfolio goal with a lower benchmark price, differ greatly in both effectiveness and efficiency. On the other hand, combinations of uniformly lower or higher portfolio goals and benchmark prices have similar efficiency but different effectiveness. Finally, an optimal policy combination can be chosen on the basis of policy analysis in the large-scale power system.
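
    The stock-and-flow feedback at the heart of such an SD model can be sketched in a few lines: installed capacity is a stock, and the benchmark price and portfolio goal shape its inflow. All coefficients below are illustrative assumptions, not calibrated values from the paper.

```python
# Toy system-dynamics sketch: a feed-in benchmark price drives investment in
# wind capacity, and the inflow shrinks as the portfolio goal is approached.
portfolio_goal = 200.0      # GW, policy target (illustrative)
benchmark_price = 0.6       # yuan/kWh, feed-in price (illustrative)
capacity = 30.0             # GW, initial installed wind capacity

for year in range(2015, 2031):
    gap = max(portfolio_goal - capacity, 0.0)
    new_builds = 0.12 * benchmark_price * gap        # GW/year (stock inflow)
    retirements = 0.02 * capacity                    # GW/year (stock outflow)
    capacity += new_builds - retirements
    print(year, round(capacity, 1), "GW")
```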

  13. Understanding water delivery performance in a large-scale irrigation system in Peru

    NARCIS (Netherlands)

    Vos, J.M.C.

    2005-01-01

    During a two-year field study, the water delivery performance was evaluated in a large-scale irrigation system on the north coast of Peru. Flow measurements were carried out along the main canals, along two secondary canals, and in two tertiary blocks of the Chancay-Lambayeque irrigation

  14. Model Predictive Control for Flexible Power Consumption of Large-Scale Refrigeration Systems

    DEFF Research Database (Denmark)

    Shafiei, Seyed Ehsan; Stoustrup, Jakob; Rasmussen, Henrik

    2014-01-01

    A model predictive control (MPC) scheme is introduced to directly control the electrical power consumption of large-scale refrigeration systems. Deviations from the baseline consumption correspond to the storing and delivering of thermal energy. By virtue of such correspondence
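
    A minimal finite-horizon version of this idea can be sketched as follows: the controller shifts compressor power around the baseline, and the cumulative deviation plays the role of stored thermal energy. The first-order storage model, tariff and bounds are illustrative assumptions, not the paper's refrigeration model.

```python
# Hedged finite-horizon MPC sketch: follow a cheap-power profile while keeping
# the (thermal) energy stored in the cold mass close to its nominal level.
import numpy as np
from scipy.optimize import minimize

H, dt = 24, 1.0                                  # horizon (hours), step
price = np.array([1.0]*8 + [3.0]*8 + [1.0]*8)    # illustrative tariff
baseline = 10.0                                  # kW nominal compressor power

def cost(u):
    """Electricity cost plus penalty for drifting stored energy from nominal."""
    e = np.cumsum((u - baseline) * dt)           # kWh stored above nominal
    return np.sum(price * u * dt) + 0.005 * np.sum(e**2)

res = minimize(cost, x0=np.full(H, baseline),
               bounds=[(5.0, 15.0)] * H, method="L-BFGS-B")
print("optimal power profile (kW):", np.round(res.x, 1))
```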

  15. A central solar-industrial waste heat heating system with large scale borehole thermal storage

    NARCIS (Netherlands)

    Guo, F.; Yang, X.; Xu, L.; Torrens, I.; Hensen, J.L.M.

    2017-01-01

    In this paper, new research on seasonal thermal storage is introduced. This study aims to maximize the utilization of renewable energy sources and industrial waste heat (IWH) for urban district heating systems in both heating and non-heating seasons through the use of large-scale seasonal thermal

  16. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    Science.gov (United States)

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

    Isoprenoids and alkanes produced and secreted by microorganisms are emerging as alternative biofuels for diesel and jet fuel replacement. As in other bioprocesses comprising an organic liquid phase, the presence of microorganisms, the medium composition, and the process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding at the microscale can give insight into how to improve large-scale processes, and we review the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Luminescence property and large-scale production of ZnO nanowires by current heating deposition

    International Nuclear Information System (INIS)

    Singjai, P.; Jintakosol, T.; Singkarat, S.; Choopun, S.

    2007-01-01

    Large-scale production of ZnO nanowires has been demonstrated by current heating deposition. Based on a solid-vapor phase carbothermal sublimation technique, a ZnO-graphite mixed rod was placed between two copper bars and gradually heated by passing current through it under a constant flow of argon gas at atmospheric pressure. The product, seen as white films deposited on the rod surface, was collected for further characterization. The results show mainly comb-like structures of ZnO nanowires with diameters ranging from 50 to 200 nm and lengths up to several tens of micrometers. In optical testing, ionoluminescence spectra of as-grown and annealed samples showed high green emission intensities centered at 510 nm. In contrast, a small UV peak centered at 390 nm was clearly observed in the as-grown sample but almost disappeared after the annealing treatment.

  18. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  19. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widely used in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system. Based on this method, a virtual robot model is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was implemented to verify the proposed algorithms. First, the hand-eye transformation matrix was solved; then a car body rear was measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear was reconstructed successfully.
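
    The global data-fusion step reduces to composing homogeneous transforms: a point in the scanner frame is carried through the hand-eye transform and the tracker-measured robot pose into the world frame. The numeric transforms below are placeholders; only the composition order reflects the described method.

```python
# Sketch of global data fusion by composing 4x4 homogeneous transforms.
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# hypothetical calibration results
T_world_end = make_T(np.eye(3), np.array([2.0, 0.5, 1.2]))       # tracker pose
T_end_scanner = make_T(np.eye(3), np.array([0.0, 0.1, 0.05]))    # hand-eye

p_scanner = np.array([0.3, -0.2, 0.8, 1.0])        # point in scanner frame
p_world = T_world_end @ T_end_scanner @ p_scanner  # fused into world frame
print("world coordinates:", p_world[:3])
```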

  20. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, common 3D display software such as MeshLab finds it difficult to achieve real-time display of and interaction with large-scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance and then use an out-of-core, view-dependent, multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal/external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
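
    A common ingredient of such view-dependent rendering is selecting, per node, the coarsest LOD whose geometric error projects below a screen-space threshold, as in the hedged sketch below; the error values and projection model are illustrative assumptions.

```python
# Sketch of view-dependent LOD selection by projected screen-space error.
def select_lod(distance, fov_px=1000.0, max_screen_err_px=1.0,
               level_errors=(0.10, 0.05, 0.02, 0.01)):  # metres, coarse->fine
    """Return the index of the coarsest acceptable level of detail."""
    for level, err in enumerate(level_errors):
        screen_err = err / max(distance, 1e-6) * fov_px   # projected error
        if screen_err <= max_screen_err_px:
            return level              # coarser levels come first
    return len(level_errors) - 1      # fall back to the finest level

for d in (2.0, 20.0, 200.0):
    print(f"camera at {d:>5.1f} m -> LOD level {select_lod(d)}")
```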

  1. Thermal System Analysis and Optimization of Large-Scale Compressed Air Energy Storage (CAES)

    Directory of Open Access Journals (Sweden)

    Zhongguang Fu

    2015-08-01

    As an important solution to issues regarding peak load and renewable energy resources on grids, large-scale compressed air energy storage (CAES) power generation technology has recently become a popular research topic in the area of large-scale industrial energy storage. At present, the combination of high-expansion-ratio turbines with advanced gas turbine technology is an important breakthrough in energy storage technology. In this study, a new gas turbine power generation system is coupled with current CAES technology, and the thermodynamic cycle is optimized by calculating the parameters of the thermodynamic system. Results show that the thermal efficiency of the new system increases by at least 5% over that of the existing system.

  2. Large-scale production and properties of human plasma-derived activated Factor VII concentrate.

    Science.gov (United States)

    Tomokiyo, K; Yano, H; Imamura, M; Nakano, Y; Nakagaki, T; Ogata, Y; Terano, T; Miyamoto, S; Funatsu, A

    2003-01-01

    An activated Factor VII (FVIIa) concentrate, prepared from human plasma on a large scale, has to date not been available for clinical use in haemophiliacs with antibodies against FVIII and FIX. In the present study, we attempted to establish a large-scale manufacturing process yielding a plasma-derived FVIIa concentrate with high recovery and safety, and to characterize its biochemical and biological properties. FVII was purified from human cryoprecipitate-poor plasma by a combination of anion-exchange and immunoaffinity chromatography, using a Ca2+-dependent anti-FVII monoclonal antibody. To activate FVII, a FVII preparation that was nanofiltered using a Bemberg Microporous Membrane-15 nm was partially converted to FVIIa by autoactivation on an anion-exchange resin. The residual FVII in the FVII and FVIIa mixture was completely activated by further incubating the mixture in the presence of Ca2+ for 18 h at 10 degrees C, without any additional activators. To prepare the FVIIa concentrate, after dialysis of FVIIa against 20 mM citrate, pH 6.9, containing 13 mM glycine and 240 mM NaCl, the FVIIa preparation was supplemented with 2.5% human albumin (first pasteurized at 60 degrees C for 10 h) and lyophilized in vials. To inactivate viruses contaminating the FVIIa concentrate, the lyophilized product was further heated at 65 degrees C for 96 h in a water bath. Total recovery of FVII from 15 000 l of plasma was approximately 40%, and the FVII preparation was fully converted to FVIIa with trace amounts of degraded products (FVIIa-beta and FVIIa-gamma). The specific activity of the FVIIa was approximately 40 U/microg. Furthermore, virus-spiking tests demonstrated that immunoaffinity chromatography, nanofiltration and dry-heating effectively removed and inactivated the spiked viruses. These results indicated that the FVIIa concentrate had both high specific activity and safety. We established a large-scale manufacturing process for a human plasma-derived FVIIa concentrate.

  3. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large-scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is that of producing uniformly mixed metal oxides on a fine scale. Production of both yttrium- and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas; evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination.

  4. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

    … security limits. Under such a scenario, progressive displacement of conventional generation by wind generation is expected to eventually lead to a complex power system with least presence of central power plants. Consequently, the support from conventional power plants is expected to reach an all-time low … system voltage control responsibility shifts from conventional power plants to wind turbines. With increased wind penetration and displaced conventional central power plants, dynamic voltage security has been identified as one of the challenging issues for large-scale wind integration. To address the dynamic … security issue, a WAMS-based systematic voltage control scheme for large-scale wind-integrated power systems has been proposed. Along with optimal reactive power compensation, the proposed scheme considers voltage support from wind farms (equipped with voltage support functionality) and refurbished …

  5. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements that form the interconnection of large-scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to connect thousands of nodes into large-scale SCI-based processing systems, these nodes must be interconnected by switch elements forming different topologies. A summary of the requirements and key points of interconnection networks and switches is presented, and two models of SCI switch elements are proposed. The authors investigate several examples of systems constructed from 4-switches through simulation and analyze the results. Some issues and enhancements are discussed to provide the ideas behind switch designs that can improve performance and reduce latency. 29 refs., 11 figs., 3 tabs

  6. Large-Scale Selection and Breeding To Generate Industrial Yeasts with Superior Aroma Production

    Science.gov (United States)

    Steensels, Jan; Meersman, Esther; Snoek, Tim; Saels, Veerle

    2014-01-01

    The concentrations and relative ratios of various aroma compounds produced by fermenting yeast cells are essential for the sensory quality of many fermented foods, including beer, bread, wine, and sake. Since the production of these aroma-active compounds varies highly among different yeast strains, careful selection of variants with optimal aromatic profiles is of crucial importance for a high-quality end product. This study evaluates the production of different aroma-active compounds in 301 different Saccharomyces cerevisiae, Saccharomyces paradoxus, and Saccharomyces pastorianus yeast strains. Our results show that the production of key aroma compounds like isoamyl acetate and ethyl acetate varies by an order of magnitude between natural yeasts, with the concentrations of some compounds showing significant positive correlation, whereas others vary independently. Targeted hybridization of some of the best aroma-producing strains yielded 46 intraspecific hybrids, of which some show a distinct heterosis (hybrid vigor) effect and produce up to 45% more isoamyl acetate than the best parental strains while retaining their overall fermentation performance. Together, our results demonstrate the potential of large-scale outbreeding to obtain superior industrial yeasts that are directly applicable for commercial use. PMID:25192996

  7. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at offshore locations. DC grids and the associated power transmission technologies provide opportunities for cost reduction and minimization of grid impact, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing both the risk of the investment and power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  8. Study of multi-functional precision optical measuring system for large scale equipment

    Science.gov (United States)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly offer a single function, require station moving and have other shortcomings. A laser tracker must work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. A total station, mainly used for outdoor surveying and mapping, can hardly achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range multi-point measurement, but its measuring range is limited and the station needs to be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can work both by scanning the measurement path and by tracking a cooperative target. The system is based on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of complex mechanical systems, and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures high-accuracy measurement, and the two-dimensional angle measurement module provides precise angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability for large-scale, high-end equipment.

  9. Incipient multiple fault diagnosis in real time with applications to large-scale systems

    International Nuclear Information System (INIS)

    Chung, H.Y.; Bien, Z.; Park, J.H.; Seon, P.H.

    1994-01-01

    By using a modified signed directed graph (SDG) together with distributed artificial neural networks and a knowledge-based system, a method of incipient multi-fault diagnosis is presented for large-scale physical systems with complex pipes and instrumentation such as valves, actuators, sensors, and controllers. The proposed method is designed to (1) make real-time incipient fault diagnosis possible for large-scale systems, (2) perform fault diagnosis not only in the steady-state case but also in the transient case by using a concept of fault propagation time, newly adopted in the SDG model, (3) provide highly reliable diagnosis results and an explanation capability for diagnosed faults, as in an expert system, and (4) diagnose pipe damage such as leaks, breaks, or throttling. The method is applied to the diagnosis of a pressurizer in the Kori Nuclear Power Plant (NPP) unit 2 in Korea under a transient condition, and the result is reported to show the satisfactory performance of the method for incipient multi-fault diagnosis of such a large-scale system in real time.
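
    The SDG part of the method can be illustrated with a small fault-propagation search: starting from a root deviation, signs multiply along edges and propagation times accumulate. The toy pressurizer graph below is an assumption for illustration, not the Kori plant model.

```python
# Minimal sketch of fault propagation on a signed directed graph (SDG):
# breadth-first search predicts deviation signs and earliest arrival times.
from collections import deque

# (source, target): (sign, propagation_time_s) -- illustrative edges
edges = {
    ("heater_power", "pressurizer_T"): (+1, 5),
    ("pressurizer_T", "pressurizer_P"): (+1, 2),
    ("spray_flow", "pressurizer_P"): (-1, 3),
    ("pressurizer_P", "relief_valve_flow"): (+1, 1),
}

def propagate(root, root_sign=+1):
    """Predict deviation signs and arrival times from a root fault."""
    state = {root: (root_sign, 0)}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        sign, t = state[node]
        for (src, dst), (esign, delay) in edges.items():
            if src == node and dst not in state:
                state[dst] = (sign * esign, t + delay)
                queue.append(dst)
    return state

print(propagate("heater_power"))   # e.g. pressure rises ~7 s after the fault
```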

  10. A Decentralized Multivariable Robust Adaptive Voltage and Speed Regulator for Large-Scale Power Systems

    Science.gov (United States)

    Okou, Francis A.; Akhrif, Ouassima; Dessaint, Louis A.; Bouchard, Derrick

    2013-05-01

    This paper introduces a decentralized multivariable robust adaptive voltage and frequency regulator to ensure the stability of large-scale interconnected generators. Interconnection parameters (i.e., load, line and transformer parameters) are assumed to be unknown. The proposed design approach requires the reformulation of conventional power system models into a multivariable model with generator terminal voltages as state variables, and excitation and turbine valve inputs as control signals. This model, while suitable for the application of modern control methods, introduces problems with regard to current design techniques for large-scale systems: interconnection terms, which are treated as perturbations, do not meet the common matching-condition assumption. A new adaptive method for a certain class of large-scale systems is therefore introduced that does not require the matching condition. The proposed controller consists of nonlinear inputs that cancel some nonlinearities of the model. Auxiliary controls with linear and nonlinear components are used to stabilize the system. They compensate for unknown parameters of the model by updating both the nonlinear component gains and the excitation parameters. The adaptation algorithms involve the sigma-modification approach for the auxiliary control gains and the projection approach for the excitation parameters, to prevent estimation drift. The computation of the matrix gain of the controller's linear component requires the solution of an algebraic Riccati equation and helps to solve the perturbation-mismatching problem. A realistic power system is used to assess the performance of the proposed controller. The results show that both stability and transient performance are considerably improved following a severe contingency.
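
    The linear part of such a design typically reduces to solving an algebraic Riccati equation for the matrix gain. The sketch below does this for a toy two-state system; the matrices and weights are illustrative assumptions, not the paper's generator model.

```python
# Sketch: solve a continuous-time algebraic Riccati equation and form the
# stabilizing state-feedback gain for a small linear system.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-1.0, -0.5]])      # toy open-loop dynamics
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])          # state weighting
R = np.array([[1.0]])             # control weighting

P = solve_continuous_are(A, B, Q, R)       # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)            # matrix gain of the linear component
print("feedback gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```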

  11. Development of large-scale wind energy conversion system. Operational studies on a large-scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu. Ogata furyoku hatsuden system no unten kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program of operational studies on a large-scale wind energy conversion system. A total of 8 domestic and foreign cases are studied for wind energy conversion cost, to clarify the causes of the higher cost of the Japanese system. The wind power systems studied include Japanese (5 units at Tappi Wind Park, plus the same type supplied by company M), US (California Wind Farm, 300 units) and UK (Wales Wind Farm, 103 units) systems. The investment costs are 639, 285 and 189 thousand yen/kW for the Japanese, US and UK systems, respectively. It is also revealed that the power plant itself and assembly costs account for the majority (70 to 88%) of the total investment cost. The higher cost of the Japanese system results from the smaller number of units installed, and the power plant cost could be drastically reduced by mass production. Increasing size also reduces cost greatly.

  12. A Dynamic Optimization Strategy for the Operation of Large Scale Seawater Reverse Osmosis System

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    In this work, an efficient strategy is proposed for solving the dynamic model of a seawater reverse osmosis (SWRO) system. Since the dynamic model is formulated as a set of differential-algebraic equations, simultaneous strategies based on collocation on finite elements are used to transform the dynamic optimization problem into a large-scale nonlinear programming problem, named Opt2. Then, simulation of the RO process and storage tanks is carried out element by element and step by step with fixed control variables, and all the obtained values of these variables are used as the initial point for the optimal solution of the SWRO system. Finally, in order to accelerate computation while keeping sufficient accuracy in the solution of Opt2, a simple but efficient finite-element refinement rule is used to reduce the scale of Opt2. The proposed strategy is applied to a large-scale SWRO system with 8 RO plants and 4 storage tanks as a case study. The computational results show that the proposed strategy is quite effective for optimal operation of the large-scale SWRO system: the optimization problem can be solved successfully within tens of iterations and several minutes, even when the load and other operating parameters fluctuate.
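
    The simultaneous (full-discretization) idea can be sketched on a toy tank-and-plant model: the dynamics are discretized over finite elements (implicit Euler here, a simpler stand-in for orthogonal collocation) and states and controls are optimized together as one NLP. All numbers, including the ~4 kWh per m3 energy figure, are illustrative assumptions, not the paper's 8-plant model.

```python
# Hedged sketch of the simultaneous approach: discretized dynamics become
# equality constraints of a nonlinear program solved over states and controls.
import numpy as np
from scipy.optimize import minimize

N, dt = 12, 2.0                  # finite elements, hours per element
demand = 80.0                    # m3/h drawn from the storage tank
tariff = np.array([1.0]*4 + [3.0]*4 + [1.0]*4)   # electricity price profile

def unpack(z):
    return z[:N], z[N:]          # RO production u (m3/h), tank level x (m3)

def dynamics_residual(z):
    u, x = unpack(z)
    x_prev = np.concatenate(([500.0], x[:-1]))   # initial level 500 m3
    return x - (x_prev + dt * (u - demand))      # implicit Euler residuals

def cost(z):
    u, _ = unpack(z)
    return np.sum(tariff * 4.0 * u * dt)         # ~4 kWh per m3 of permeate

z0 = np.concatenate([np.full(N, demand), np.full(N, 500.0)])
res = minimize(cost, z0, method="SLSQP",
               bounds=[(0.0, 120.0)]*N + [(100.0, 1000.0)]*N,
               constraints={"type": "eq", "fun": dynamics_residual})
u_opt, x_opt = unpack(res.x)
print("production schedule (m3/h):", np.round(u_opt, 1))
```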

  13. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for the production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g/h and an energy consumption of about 10 Wh/g. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39×10^-4 Ω·m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.

  14. Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity

    Science.gov (United States)

    Morris, C. K.; Knighton, J.

    2017-12-01

    Nitrous oxide is produced by the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how the global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that the representation of spatial heterogeneity in inputs, specifically soil moisture, causes inaccuracies in estimating average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is underpredicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5. The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide a wider context for the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
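
    The underprediction follows from Jensen's inequality: for a convex flux response, the flux evaluated at the mean soil moisture lies below the mean of the local fluxes. The sketch below demonstrates this with a hypothetical exponential response curve; it is not the CLM parameterization.

```python
# Tiny demonstration of the averaging bias (Jensen's inequality) for a
# convex N2O flux response to spatially heterogeneous soil moisture.
import numpy as np

rng = np.random.default_rng(3)

def n2o_flux(theta):
    """Hypothetical convex flux response to soil moisture theta in [0, 1]."""
    return np.exp(6.0 * theta) * 1e-3

theta_field = rng.beta(2.0, 2.0, size=10_000)   # heterogeneous soil moisture

flux_of_mean = n2o_flux(theta_field.mean())     # flux at averaged moisture
mean_of_flux = n2o_flux(theta_field).mean()     # true areal-average flux

print(f"flux at mean moisture : {flux_of_mean:.4f}")
print(f"mean of local fluxes  : {mean_of_flux:.4f}  (larger -> underprediction)")
```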

  15. Polymerase-endonuclease amplification reaction (PEAR) for large-scale enzymatic production of antisense oligonucleotides.

    Directory of Open Access Journals (Sweden)

    Xiaolong Wang

    Antisense oligonucleotides targeting microRNAs or their mRNA targets have proved to be powerful tools for molecular biology research and may eventually emerge as new therapeutic agents. Synthetic oligonucleotides are often contaminated with highly homologous failure sequences, and chemical synthesis of an oligonucleotide is difficult to scale up because it requires expensive equipment, hazardous chemicals and a tedious purification process. Here we report a novel thermocyclic reaction, the polymerase-endonuclease amplification reaction (PEAR), for the amplification of oligonucleotides. A target oligonucleotide and a tandem repeated antisense probe are subjected to repeated cycles of denaturing, annealing, elongation and cleaving, in which thermostable DNA polymerase elongation and strand slipping generate duplex tandem repeats, and thermostable endonuclease (PspGI) cleavage releases monomeric duplex oligonucleotides. Each round of PEAR achieves over 100-fold amplification. The product can be used directly in a further round of PEAR, and the process can be repeated. In addition to avoiding dangerous materials and offering improved product purity, this reaction is easy to scale up and amenable to full automation. PEAR has the potential to be a useful tool for large-scale production of antisense oligonucleotide drugs.

  16. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    DEFF Research Database (Denmark)

    Jensen, Tue Vissing; Pinson, Pierre

    2017-01-01

    … we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven … to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation …

  17. TensorFlow: A system for large-scale machine learning

    OpenAIRE

    Abadi, Martín; Barham, Paul; Chen, Jianmin; Chen, Zhifeng; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Irving, Geoffrey; Isard, Michael; Kudlur, Manjunath; Levenberg, Josh; Monga, Rajat; Moore, Sherry; Murray, Derek G.

    2016-01-01

    TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments. TensorFlow uses dataflow graphs to represent computation, shared state, and the operations that mutate that state. It maps the nodes of a dataflow graph across many machines in a cluster, and within a machine across multiple computational devices, including multicore CPUs, general-purpose GPUs, and custom designed ASICs known as Tensor Processing Units (TPUs). This architecture gives flexib...
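
    A minimal example of the dataflow idea follows. Note that the 2016 paper describes graph construction in TensorFlow 1.x; the sketch below uses the current tf.function tracing API to the same effect, with shared mutable state held in tf.Variable objects.

```python
# Minimal TensorFlow sketch: a computation expressed as a dataflow graph
# (traced via tf.function) whose nodes TensorFlow places on available devices.
import tensorflow as tf

W = tf.Variable(tf.random.normal([784, 10]))   # shared, mutable state
b = tf.Variable(tf.zeros([10]))

@tf.function          # traces the Python code into a dataflow graph
def predict(x):
    return tf.nn.softmax(tf.matmul(x, W) + b)

x = tf.random.normal([32, 784])                # a dummy batch
print(predict(x).shape)                        # (32, 10)
```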

  18. Evolutionary Hierarchical Multi-Criteria Metaheuristics for Scheduling in Large-Scale Grid Systems

    CERN Document Server

    Kołodziej, Joanna

    2012-01-01

    One of the most challenging issues in modelling today's large-scale computational systems is to effectively manage highly parametrised distributed environments such as computational grids, clouds, ad hoc networks and P2P networks. Next-generation computational grids must provide a wide range of services and high performance computing infrastructures. Various types of information and data processed in the large-scale dynamic grid environment may be incomplete, imprecise, and fragmented, which complicates the specification of proper evaluation criteria and which affects both the availability of resources and the final collective decisions of users. The complexity of grid architectures and grid management may also contribute towards higher energy consumption. All of these issues necessitate the development of intelligent resource management techniques, which are capable of capturing all of this complexity and optimising meaningful metrics for a wide range of grid applications.   This book covers hot topics in t...

  19. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to improve wind power availability and grid stability, but the integration of wind farms changes the SSO (sub-synchronous oscillation) damping characteristics of the synchronous generator system. Regarding this SSO problem caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, research prospects in this field are explored.

  20. Modeling the impact of large-scale energy conversion systems on global climate

    International Nuclear Information System (INIS)

    Williams, J.

    There are three energy options which could satisfy a projected energy requirement of about 30 TW: the solar, nuclear and (to a lesser extent) coal options. Climate models can be used to assess the impact of large-scale deployment of these options. The impact of waste heat has been assessed using energy balance models and general circulation models (GCMs). Results suggest that the impacts are significant when the heat input is very high, and studies of more realistic scenarios are required. Energy balance models, radiative-convective models and a GCM have been used to study the impact of doubling the atmospheric CO2 concentration. State-of-the-art models estimate a surface temperature increase of 1.5-3.0 °C with large amplification near the poles, but much uncertainty remains. Very few model studies have been made of the impact of particles on global climate; more information on the characteristics of particle input is required. The impact of large-scale deployment of solar energy conversion systems has received little attention, but model studies suggest that large-scale changes in surface characteristics associated with such systems (surface heat balance, roughness, hydrological characteristics and ocean surface temperature) could have significant global climatic effects. (Auth.)
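
    The simplest member of the model hierarchy mentioned above, a zero-dimensional energy balance model, already shows why only very large heat inputs matter. The sketch below uses standard constants plus a crude effective emissivity as the greenhouse parameterization; the 0.06 W/m2 waste-heat figure is 30 TW averaged over Earth's surface area (~5.1×10^14 m2).

```python
# Zero-dimensional energy-balance sketch: absorbed solar radiation plus a
# waste-heat term Q balances outgoing long-wave radiation.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m-2 K-4
S0 = 1361.0          # solar constant, W/m2
ALBEDO = 0.30
EPS = 0.61           # effective emissivity (crude greenhouse parameterization)

def equilibrium_temp(waste_heat=0.0):
    absorbed = S0 * (1 - ALBEDO) / 4.0 + waste_heat
    return (absorbed / (EPS * SIGMA)) ** 0.25

t0 = equilibrium_temp()
t1 = equilibrium_temp(waste_heat=0.06)     # ~30 TW of anthropogenic heat
print(f"baseline {t0:.2f} K, with waste heat {t1:.2f} K, delta {t1 - t0:.3f} K")
```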

  1. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10(5) mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  2. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks

    Directory of Open Access Journals (Sweden)

    Guangwen Fan

    2015-09-01

    Temperature distribution is a critical indicator of the health condition of Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensor networks, high-temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, and local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications.

  3. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks.

    Science.gov (United States)

    Fan, Guangwen; Shen, Yu; Hao, Xiaowei; Yuan, Zongming; Zhou, Zhi

    2015-09-18

    Temperature distribution is a critical indicator of the health condition of Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensor networks, high-temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, and local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications.

  4. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    Science.gov (United States)

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be ≈50% α-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices. PMID:19581598

  5. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
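
    To make the "reduce then sample" idea concrete, the following minimal sketch fits a cheap surrogate to an expensive forward model and then runs random-walk Metropolis sampling against the surrogate. Everything here (the toy forward model, the polynomial surrogate, the noise level) is illustrative and not the project's actual testbed problem.

```python
import numpy as np

# Toy "full" forward model: an expensive map from parameter m to observation.
def forward_full(m):
    return np.sin(3.0 * m) + 0.5 * m  # stands in for a costly PDE solve

# "Reduce then sample": fit a cheap polynomial surrogate on a coarse sweep,
# then run Metropolis-Hastings against the surrogate instead of the full model.
m_train = np.linspace(-2.0, 2.0, 15)
coeffs = np.polyfit(m_train, forward_full(m_train), deg=7)
forward_reduced = lambda m: np.polyval(coeffs, m)

d_obs, sigma = 0.8, 0.1            # synthetic datum and noise level
def log_post(m, fwd):              # Gaussian likelihood, flat prior on [-2, 2]
    if not -2.0 <= m <= 2.0:
        return -np.inf
    return -0.5 * ((fwd(m) - d_obs) / sigma) ** 2

rng = np.random.default_rng(0)
m, chain = 0.0, []
for _ in range(5000):              # random-walk Metropolis on the surrogate
    m_prop = m + 0.2 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(m_prop, forward_reduced) - log_post(m, forward_reduced):
        m = m_prop
    chain.append(m)
print("posterior mean:", np.mean(chain[1000:]))
```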

  6. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO2 region leading to large-scale convective mixing that can be a significant driver for CO2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large and small-scale effects are essential to predict the role of these processes on the long-term storage security of CO2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO2 emissions, and explore the sensitivity of CO2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO2 storage sites. © 2012 Elsevier Ltd.

  7. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems

    OpenAIRE

    Abadi, Martín; Agarwal, Ashish; Barham, Paul; Brevdo, Eugene; Chen, Zhifeng; Citro, Craig; Corrado, Greg S.; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Goodfellow, Ian; Harp, Andrew; Irving, Geoffrey; Isard, Michael

    2016-01-01

    TensorFlow is an interface for expressing machine learning algorithms, and an implementation for executing such algorithms. A computation expressed using TensorFlow can be executed with little or no change on a wide variety of heterogeneous systems, ranging from mobile devices such as phones and tablets up to large-scale distributed systems of hundreds of machines and thousands of computational devices such as GPU cards. The system is flexible and can be used to express a wide variety of algo...
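
    As a minimal illustration of the heterogeneous-execution idea, the sketch below runs the same matrix product on whichever device is available; it uses the public TensorFlow 2 eager API and is not taken from the paper.

```python
import tensorflow as tf

# The same graph of operations can be pinned to different devices;
# TensorFlow handles placement and data movement between them.
with tf.device("/CPU:0"):
    a = tf.random.normal((1024, 1024))
    b = tf.random.normal((1024, 1024))

device = "/GPU:0" if tf.config.list_physical_devices("GPU") else "/CPU:0"
with tf.device(device):
    c = tf.matmul(a, b)            # runs on the GPU if one is available
print(c.device, float(tf.reduce_sum(c)))
```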

  8. A fiber-optic ice detection system for large-scale wind turbine blades

    Science.gov (United States)

    Kim, Dae-gil; Sampath, Umesh; Kim, Hyunjin; Song, Minho

    2017-09-01

    Icing causes substantial problems for the integrity of large-scale wind turbines. In this work, a fiber-optic sensor system for the detection of icing with an arrayed waveguide grating is presented. The sensor system detects Fresnel reflections from the ends of the fibers. The transition in Fresnel reflection due to icing gives distinctive intensity variations, which distinguish ice, water, and air on the wind turbine blades. From the experimental results, the proposed sensor system successfully identified the formation of icing conditions and the thickness of ice in real time.
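
    The physical contrast the sensor exploits is the normal-incidence Fresnel reflectance at the fiber end face, which depends on the refractive index of the covering medium. The sketch below evaluates the textbook formula with approximate handbook index values; these numbers are illustrative, not the paper's calibration.

```python
# Normal-incidence Fresnel reflectance at a cleaved fiber end face:
#   R = ((n_fiber - n_medium) / (n_fiber + n_medium))**2
# The medium covering the fiber tip (air, water, or ice) changes R,
# which is the contrast the sensor system exploits.
n_fiber = 1.45                      # fused silica core (approximate)
media = {"air": 1.00, "water": 1.33, "ice": 1.31}

for name, n in media.items():
    R = ((n_fiber - n) / (n_fiber + n)) ** 2
    print(f"{name:5s}: R = {100 * R:.2f} %")
# Air gives a markedly stronger reflection (~3.4 %) than water or ice
# (~0.2-0.3 %), so transitions in reflected intensity flag icing events.
```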

  9. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of more dense materials. In response to local industry demand, and in support of on-going research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large scale multiple source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open tube microfocus X-ray source and a 450 kV closed tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system currently can accommodate samples up to 0.5 x 0.5 x 0.5 m in size with weight up to 50 kg. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future.

  10. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergistic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development that often focus on a single chemical entity. Therefore, methods employing a few markers or pharmacologically active constituents to assess the quality and authenticity of the complex preparations face a number of severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has also been applied to the quality analysis of traditional Chinese medicine. Metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and classification of samples of diverse biological status, origin, or quality by means of chemometrics. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single-marker analyses and has considerable potential to become a sharp tool for traditional Chinese medicine quality assessment. Georg Thieme Verlag KG Stuttgart · New York.
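
    Chemometric classification of this kind typically starts with an unsupervised projection such as principal component analysis of the metabolite fingerprints. The sketch below shows that usual first step on synthetic data; the feature matrix, group labels, and effect size are all made up for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = herbal samples, columns = metabolite feature intensities
# (e.g., binned LC-MS or NMR signals); values here are synthetic.
rng = np.random.default_rng(1)
batch_a = rng.normal(1.0, 0.1, size=(20, 200))   # e.g., one origin
batch_b = rng.normal(1.2, 0.1, size=(20, 200))   # e.g., a different origin
X = np.vstack([batch_a, batch_b])

# Project the fingerprints onto the first two principal components.
scores = PCA(n_components=2).fit_transform(X)
# Samples of the two origins separate along PC1, the usual first look
# in metabonomic quality assessment.
print("PC1 means:", scores[:20, 0].mean(), scores[20:, 0].mean())
```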

  11. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool to generate mutants on a large scale efficiently, and to discover genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screening in pigs with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the mutation of SOX10 (R109W) in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  12. Rain forest nutrient cycling and productivity in response to large-scale litter manipulation.

    Science.gov (United States)

    Wood, Tana E; Lawrence, Deborah; Clark, Deborah A; Chazdon, Robin L

    2009-01-01

    Litter-induced pulses of nutrient availability could play an important role in the productivity and nutrient cycling of forested ecosystems, especially tropical forests. Tropical forests experience such pulses as a result of wet-dry seasonality and during major climatic events, such as strong El Niños. We hypothesized that (1) an increase in the quantity and quality of litter inputs would stimulate leaf litter production, woody growth, and leaf litter nutrient cycling, and (2) the timing and magnitude of this response would be influenced by soil fertility and forest age. To test these hypotheses in a Costa Rican wet tropical forest, we established a large-scale litter manipulation experiment in two secondary forest sites and four old-growth forest sites of differing soil fertility. In replicated plots at each site, leaves and twigs were removed from the forest floor of removal plots and added to the forest floor of addition plots. We analyzed leaf litter mass, [N] and [P], and N and P inputs for addition, removal, and control plots over a two-year period. We also evaluated basal area increment of trees in removal and addition plots. There was no response of forest productivity or nutrient cycling to litter removal; however, litter addition significantly increased leaf litter production and N and P inputs 4-5 months following litter application. Litter production increased as much as 92%, and P and N inputs as much as 85% and 156%, respectively. In contrast, litter manipulation had no significant effect on woody growth. The increases in leaf litter production and N and P inputs were significantly positively related to the total P that was applied in litter form. Neither litter treatment nor forest type influenced the temporal pattern of any of the variables measured. Thus, environmental factors such as rainfall drive temporal variability in litter and nutrient inputs, while nutrient release from decomposing litter influences the magnitude. Seasonal or annual variation in leaf litter mass, such as occurs in strong El Niño events, could positively

  13. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize large scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on computational algorithms and procedure implementations developed in Matlab to simulate agent-based models, run on clusters that provide the high-performance computing needed to execute the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  14. Large-scale gene function analysis with the PANTHER classification system.

    Science.gov (United States)

    Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D

    2013-08-01

    The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
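
    The statistical tests such a system applies to genome-wide data are typically over-representation tests. The sketch below shows the standard hypergeometric form of such a test with made-up counts; it illustrates the statistics involved, not PANTHER's actual implementation or API.

```python
from scipy.stats import hypergeom

# Classic over-representation test of the kind PANTHER's statistical
# tools perform: is a function class enriched in an uploaded gene list?
M = 20000   # genes in the reference genome
n = 150     # genes annotated to the class (e.g., a GO term)
N = 500     # genes in the experimental list
k = 12      # list genes that carry the annotation

# P(X >= k) under sampling without replacement
p = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value = {p:.3g}")  # compare against an FDR threshold
```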

  15. Data management strategies for multinational large-scale systems biology projects.

    Science.gov (United States)

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. Through the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems proven to have been employed successfully in large-scale projects.

  16. An Axiomatic Analysis Approach for Large-Scale Disaster-Tolerant Systems Modeling

    Directory of Open Access Journals (Sweden)

    Theodore W. Manikas

    2011-02-01

    Full Text Available Disaster tolerance in computing and communications systems refers to the ability to maintain a degree of functionality throughout the occurrence of a disaster. We accomplish the incorporation of disaster tolerance within a system by simulating various threats to the system operation and identifying areas for system redesign. Unfortunately, extremely large systems are not amenable to comprehensive simulation studies due to the large computational complexity requirements. To address this limitation, an axiomatic approach that decomposes a large-scale system into smaller subsystems is developed that allows the subsystems to be independently modeled. This approach is implemented using a data communications network system example. The results indicate that the decomposition approach produces simulation responses that are similar to the full system approach, but with greatly reduced simulation time.

  17. Scalable multi-objective control for large scale water resources systems under uncertainty

    Science.gov (United States)

    Giuliani, Matteo; Quinn, Julianne; Herman, Jonathan; Castelletti, Andrea; Reed, Patrick

    2016-04-01

    The use of mathematical models to support the optimal management of environmental systems has been rapidly expanding over the last years due to advances in scientific knowledge of the natural processes, the efficiency of optimization techniques, and the availability of computational resources. However, ongoing changes in climate and society introduce additional challenges for controlling these systems, ultimately motivating the emergence of complex models to explore key causal relationships and dependencies on uncontrolled sources of variability. In this work, we contribute a novel implementation of the evolutionary multi-objective direct policy search (EMODPS) method for controlling environmental systems under uncertainty. The proposed approach combines direct policy search (DPS) with hierarchical parallelization of multi-objective evolutionary algorithms (MOEAs) and offers a threefold advantage: the DPS simulation-based optimization can be combined with any simulation model and does not add any constraint on modeled information, allowing the use of exogenous information in conditioning the decisions. Moreover, the combination of DPS and MOEAs prompts the generation of Pareto-approximate sets of solutions for up to 10 objectives, thus overcoming the decision biases produced by cognitive myopia, where narrow or restrictive definitions of optimality strongly limit the discovery of decision-relevant alternatives. Finally, the use of large-scale MOEA parallelization improves the ability of the designed solutions in handling the uncertainty due to severe natural variability. The proposed approach is demonstrated on a challenging water resources management problem represented by the optimal control of a network of four multipurpose water reservoirs in the Red River basin (Vietnam). As part of the medium-long term energy and food security national strategy, four large reservoirs have been constructed on the Red River tributaries, which are mainly operated for hydropower

  18. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    ) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress & distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feed stocks, and large metallic components to enhance America’s economic competitiveness.

  19. A fast and optimized dynamic economic load dispatch for large scale power systems

    International Nuclear Information System (INIS)

    Musse Mohamud Ahmed; Mohd Ruddin Ab Ghani; Ismail Hassan

    2000-01-01

    This paper presents a Lagrangian Multipliers (LM) and Linear Programming (LP) based dynamic economic load dispatch (DELD) solution for large-scale power system operations. The objective is to minimize the operation cost of power generation units subject to the considered constraints. After individual generator units are economically loaded and periodically dispatched, fast and optimized DELD is achieved. DELD with period intervals has been taken into consideration. The results obtained from the LM and LP based algorithm appear to be modest in both optimizing the operation cost and achieving fast computation. (author)
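
    At the core of Lagrangian-multiplier dispatch is the equal incremental cost condition: each committed unit runs where its marginal cost equals the system multiplier lambda, chosen so that total output meets demand. The sketch below solves one dispatch interval by bisection on lambda; the three-unit cost data and the 850 MW demand are illustrative textbook-style values, not the paper's test system.

```python
import numpy as np

# Unit cost: C_i(P) = a_i + b_i*P + c_i*P^2, so dC_i/dP = b_i + 2*c_i*P.
# At the optimum, all unconstrained units share the same incremental cost lambda.
b = np.array([7.0, 7.85, 7.97])            # $/MWh
c = np.array([0.00156, 0.00194, 0.00482])  # $/MW^2h
p_min = np.array([150.0, 100.0, 50.0])     # MW
p_max = np.array([600.0, 400.0, 200.0])    # MW
demand = 850.0                             # MW, one dispatch interval

lo, hi = 0.0, 50.0
for _ in range(60):                        # bisection on lambda
    lam = 0.5 * (lo + hi)
    P = np.clip((lam - b) / (2 * c), p_min, p_max)
    if P.sum() > demand:
        hi = lam                           # too much generation: lower lambda
    else:
        lo = lam                           # too little: raise lambda
print("lambda =", round(lam, 3), "$/MWh, dispatch =", P.round(1), "MW")
```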

  20. Research on unit commitment with large-scale wind power connected power system

    Science.gov (United States)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to its stochastic volatility. Unit commitment including wind farms is analyzed in two parts: modeling and solving methods. The structures and characteristics are summarized after classifying the approaches according to their objective functions and constraints. Finally, the issues to be solved and possible directions of research and development in the future are discussed, which can adapt to the requirements of the electricity market, energy-saving power generation dispatching and the smart grid, even providing a reference for the research and practice of researchers and workers in this field.

  1. Event-triggered decentralized robust model predictive control for constrained large-scale interconnected systems

    Directory of Open Access Journals (Sweden)

    Ling Lu

    2016-12-01

    Full Text Available This paper considers the problem of event-triggered decentralized model predictive control (MPC) for constrained large-scale linear systems subject to additive bounded disturbances. The constraint tightening method is utilized to formulate the MPC optimization problem. The local predictive control law for each subsystem is determined aperiodically by a relevant triggering rule, which allows a considerable reduction of the computational load. Then, robust feasibility and closed-loop stability are proved, and it is shown that every subsystem state will be driven into a robust invariant set. Finally, the effectiveness of the proposed approach is illustrated via numerical simulations.

  2. Iterative learning-based decentralized adaptive tracker for large-scale systems: a digital redesign approach.

    Science.gov (United States)

    Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua

    2011-07-01

    In this paper, a digital redesign methodology of the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not be presented by the analytic reference model initially. To overcome the interference of each sub-system and simplify the controller design, the proposed model reference decentralized adaptive control scheme constructs a decoupled well-designed reference model first. Then, according to the well-designed model, this paper develops a digital decentralized adaptive tracker based on the optimal analog control and prediction-based digital redesign technique for the sampled-data large-scale coupling system. In order to enhance the tracking performance of the digital tracker at specified sampling instants, we apply the iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. Besides, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
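
    The trial-to-trial learning idea can be seen in a few lines: a P-type ILC law u_{k+1}(t) = u_k(t) + gamma * e_k(t+1) applied to a toy first-order sampled-data plant makes the tracking error shrink across trials. The plant, gain, and trajectory below are illustrative stand-ins, not the paper's decentralized adaptive scheme.

```python
import numpy as np

# Toy sampled-data plant: x[t+1] = a_d*x[t] + b_d*u[t], output y = x.
a_d, b_d = 0.9, 0.1
T, gamma = 50, 2.0                  # trial length and learning gain
y_ref = np.sin(np.linspace(0, np.pi, T + 1))   # desired trajectory

u = np.zeros(T)
for trial in range(20):
    x, y = 0.0, np.zeros(T + 1)
    for t in range(T):              # run one trial from the same initial state
        x = a_d * x + b_d * u[t]
        y[t + 1] = x
    e = y_ref - y
    u = u + gamma * e[1:]           # P-type ILC: learn from the shifted error
    if trial % 5 == 0:
        print(f"trial {trial:2d}: max |e| = {np.abs(e).max():.4f}")
# Convergence here follows from |1 - gamma*b_d| = 0.8 < 1.
```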

  3. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirement of the files, providing load-balancing, and reducing delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.

  4. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, which was an ongoing D&D project for which a firm fixed price contract had been issued to the D&D contractor. Thus, interferences with the baseline D&D project could have significant financial implications. Other challenges include defining and selecting meaningful technology demonstrations, finding/selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D

  5. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    Science.gov (United States)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interactions are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. The elicitation of requirements for most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during design and development. In an organization structure, the cost and time overrun can occur at any level and iterate back and forth, thus increasing the cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before the rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework which can be used in the design and development of LSCES. It can aid understanding of the system and decision making to minimize the value gap due to requirements creep by eliminating the ambiguity which occurs during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These

  6. RISK MANAGEMENT IN A LARGE-SCALE NEW RAILWAY TRANSPORT SYSTEM PROJECT

    Directory of Open Access Journals (Sweden)

    Sunduck D. SUH, Ph.D., P.E.

    2000-01-01

    Full Text Available Risk management experiences of the Korean Seoul-Pusan high-speed railway (KTX) project since the planning stage are evaluated. One can clearly see the interplay of engineering and construction risks, financial risks and political risks in the development of the KTX project, which is a peculiarity of large-scale new railway system projects. A brief description of the evaluation methodology and an overview of the project are followed by detailed evaluations of key differences in risks between a conventional railway system and a high-speed railway system, social and political risks, engineering and construction risks, and financial risks. Risks involved in the system procurement process, such as proposal solicitation, evaluation, selection, and scope of solicitation, are separated out and evaluated in depth. Detailed events resulting from these issues are discussed along with their possible impact on system risk. Lessons learned and further possible refinements are also discussed.

  7. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    Science.gov (United States)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  8. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.

    Science.gov (United States)

    Jensen, Tue V; Pinson, Pierre

    2017-11-28

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  9. Time-Efficient Cloning Attacks Identification in Large-Scale RFID Systems

    Directory of Open Access Journals (Sweden)

    Ju-min Zhao

    2017-01-01

    Full Text Available Radio Frequency Identification (RFID) is an emerging technology for the electronic labeling of objects for the purpose of automatically identifying, categorizing, locating, and tracking the objects. But in their current form RFID systems are susceptible to cloning attacks that seriously threaten RFID applications but are hard to prevent. Existing protocols aim at detecting whether there are cloning attacks in single-reader RFID systems. In this paper, we investigate cloning attack identification in the multireader scenario and first propose a time-efficient protocol, called the time-efficient Cloning Attacks Identification Protocol (CAIP), to identify all cloned tags in multireader RFID systems. We evaluate the performance of CAIP through extensive simulations. The results show that CAIP can identify all the cloned tags in large-scale RFID systems fairly fast with the required accuracy.

  10. Distributed weighted least-squares estimation with fast convergence for large-scale systems.

    Science.gov (United States)

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performances of the proposed methods.
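
    A centralized reference point helps here: the WLS estimate solves the normal equations (A^T W A) x = A^T W b, and the simplest iterative scheme a network could distribute is a gradient-style (Richardson) update whose step size plays the role of a scaling parameter. The sketch below compares that iteration against the direct solve on random data; it is a conceptual miniature, not the paper's algorithm.

```python
import numpy as np

# Centralized WLS solves (A^T W A) x = A^T W b; the iteration below only
# needs matrix-vector products, which sub-systems could assemble locally.
rng = np.random.default_rng(2)
A = rng.normal(size=(40, 8))                  # stacked local measurement models
W = np.diag(rng.uniform(0.5, 2.0, size=40))   # measurement weights
b = rng.normal(size=40)                       # stacked measurements

H = A.T @ W @ A
g = A.T @ W @ b
alpha = 1.0 / np.linalg.norm(H, 2)            # scaling parameter for convergence

x = np.zeros(8)
for _ in range(2000):
    x = x + alpha * (g - H @ x)               # preconditioned Richardson update

x_direct = np.linalg.solve(H, g)
print("iterative vs direct max diff:", np.abs(x - x_direct).max())
```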

  11. Dynamic state estimation techniques for large-scale electric power systems

    International Nuclear Information System (INIS)

    Rousseaux, P.; Pavella, M.

    1991-01-01

    This paper presents the use of dynamic type state estimators for energy management in electric power systems. Various dynamic type estimators have been developed, but have never been implemented. This is primarily because of dimensionality problems posed by the conjunction of an extended Kalman filter with a large scale power system. This paper precisely focuses on how to circumvent the high dimensionality, especially prohibitive in the filtering step, by using a decomposition-aggregation hierarchical scheme; to appropriately model the power system dynamics, the authors introduce new state variables in the prediction step and rely on a load forecasting method. The combination of these two techniques succeeds in solving the overall dynamic state estimation problem not only in a tractable and realistic way, but also in compliance with real-time computational requirements. Further improvements are also suggested, bound to the specifics of the high voltage electric transmission systems
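
    The filtering step whose dimensionality the paper works around is the standard Kalman predict/update cycle. The sketch below shows that cycle on a toy two-state model; the matrices are illustrative and unrelated to any actual power network.

```python
import numpy as np

# Predict/update cycle of a linear Kalman filter, the core that dynamic
# state estimators extend to full-scale networks.
F = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition (toy 2-state model)
H = np.array([[1.0, 0.0]])               # we measure only the first state
Q = 1e-4 * np.eye(2)                     # process noise covariance
R = np.array([[0.01]])                   # measurement noise covariance

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(3)
truth = np.array([0.0, 0.5])
for _ in range(50):
    truth = F @ truth
    z = H @ truth + rng.normal(0, 0.1, size=1)   # noisy measurement
    # prediction step
    x, P = F @ x, F @ P @ F.T + Q
    # filtering (update) step
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x, P = x + K @ (z - H @ x), (np.eye(2) - K @ H) @ P
print("estimate:", x.round(3), "truth:", truth.round(3))
```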

  12. Tradeoffs between quality-of-control and quality-of-service in large-scale nonlinear networked control systems

    NARCIS (Netherlands)

    Borgers, D. P.; Geiselhart, R.; Heemels, W. P. M. H.

    2017-01-01

    In this paper we study input-to-state stability (ISS) of large-scale networked control systems (NCSs) in which sensors, controllers and actuators are connected via multiple (local) communication networks which operate asynchronously and independently of each other. We model the large-scale NCS as an

  13. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity are uncovered beyond engineering that include communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  14. Real-time graphic display system for ROSA-V Large Scale Test Facility

    International Nuclear Information System (INIS)

    Kondo, Masaya; Anoda, Yoshinari; Osaki, Hideki; Kukita, Yutaka; Takigawa, Yoshio.

    1993-11-01

    A real-time graphic display system was developed for the ROSA-V Large Scale Test Facility (LSTF) experiments simulating accident management measures for prevention of severe core damage in pressurized water reactors (PWRs). The system works on an IBM workstation (Power Station RS/6000 model 560) and accommodates 512 channels out of about 2500 total measurements in the LSTF. It has three major functions: (a) displaying the coolant inventory distribution in the facility primary and secondary systems; (b) displaying the measured quantities at desired locations in the facility; and (c) displaying the time histories of measured quantities. The coolant inventory distribution is derived from differential pressure measurements along vertical sections and gamma-ray densitometer measurements for horizontal legs. The color display indicates liquid subcooling calculated from pressure and temperature at individual locations. (author)
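
    Function (a) rests on a simple hydrostatic relation: over a vertical instrument span, the measured differential pressure mixes the liquid and vapor column weights, from which a collapsed liquid level can be backed out. The sketch below implements that relation; the density values are illustrative saturation figures, not LSTF calibration data.

```python
# Collapsed liquid level inferred from a differential-pressure cell over
# a vertical span:
#   dP = (rho_f*h + rho_g*(L - h)) * g   =>   h = (dP/g - rho_g*L) / (rho_f - rho_g)
def collapsed_level(dp_pa, span_m, rho_f=740.0, rho_g=36.0, g=9.81):
    """Liquid level (m) within a span of length span_m from measured dP (Pa).
    Densities are illustrative saturation values at PWR-like pressure."""
    return (dp_pa / g - rho_g * span_m) / (rho_f - rho_g)

# Example: a 2 m instrument span reading 12 kPa
print(f"collapsed level = {collapsed_level(12e3, 2.0):.2f} m")
```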

  15. Large-scale tests of aqueous scrubber systems for LMFBR vented containment

    International Nuclear Information System (INIS)

    McCormack, J.D.; Hilliard, R.K.; Postma, A.K.

    1980-01-01

    Six large-scale air cleaning tests performed in the Containment Systems Test Facility (CSTF) are described. The test conditions simulated those postulated for hypothetical accidents in an LMFBR involving containment venting to control hydrogen concentration and containment overpressure. Sodium aerosols were generated by continuously spraying sodium into air and adding steam and/or carbon dioxide to create the desired Na2O2, Na2CO3 or NaOH aerosol. Two air cleaning systems were tested: (a) spray quench chamber, eductor venturi scrubber and high efficiency fibrous scrubber in series; and (b) the same except with the spray quench chamber eliminated. The gas flow rates ranged up to 0.8 m3/s (1700 acfm) at temperatures up to 313°C (600°F). Quantities of aerosol removed from the gas stream ranged up to 700 kg per test. The systems performed very satisfactorily, with overall aerosol mass removal efficiencies exceeding 99.9% in each test

  16. Impacts of large-scale offshore wind farm integration on power systems through VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Hongzhi; Chen, Zhe

    2013-01-01

    The potential of offshore wind energy has been commonly recognized and explored globally. Many countries have implemented and planned offshore wind farms to meet their increasing electricity demands and public environmental appeals, especially in Europe. With relatively less space limitation, an offshore wind farm could have a capacity rating of hundreds of MWs or even GWs, large enough to compete with conventional power plants. Thus the impacts of a large offshore wind farm on power system operation and security should be thoroughly studied and understood. This paper investigates the impacts of integrating a large-scale offshore wind farm into the transmission system of a power grid through VSC-HVDC connection. The concerns are focused on steady-state voltage stability, dynamic voltage stability and transient angle stability. Simulation results based on an exemplary power system

  17. The analysis of MAI in large scale MIMO-CDMA system

    Science.gov (United States)

    Berceanu, Madalina-Georgiana; Voicu, Carmen; Halunga, Simona

    2016-12-01

    Recently, technological development has driven a rapid growth in the volume of data carried by cellular services, which also implies the need for higher data rates and lower latency. To meet users' demands, a series of new data processing techniques has been brought into discussion. In this paper, we consider MIMO technology, which uses multiple antennas at the receiver and transmitter ends. To study the performance obtained by this technology, we propose a MIMO-CDMA system in which image transmission is used instead of random data transmission, to take advantage of a larger range of quality indicators. In the simulations we increased the number of antennas and observed how the performance of the system changes; based on that, we were able to compare a conventional MIMO and a Large Scale MIMO system in terms of BER and the MSSIM index, a metric that compares the quality of the image before transmission with that of the received one.
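
    The two quality indicators are straightforward to compute once the transmitted and received images are available as arrays. The sketch below evaluates the mean SSIM with scikit-image and a raw bit error rate over the pixel stream; the additive-noise model here merely stands in for an actual MIMO-CDMA channel.

```python
import numpy as np
from skimage.metrics import structural_similarity

# Mean SSIM (MSSIM) between transmitted and received images, alongside BER;
# the images and channel noise are synthetic.
rng = np.random.default_rng(4)
tx_img = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
rx_img = np.clip(tx_img.astype(int) + rng.normal(0, 8, tx_img.shape),
                 0, 255).astype(np.uint8)

mssim = structural_similarity(tx_img, rx_img, data_range=255)

# Bit error rate over the 8-bit pixel stream
tx_bits = np.unpackbits(tx_img)
rx_bits = np.unpackbits(rx_img)
ber = np.mean(tx_bits != rx_bits)
print(f"MSSIM = {mssim:.3f}, BER = {ber:.4f}")
```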

  18. Rucio - The next generation large scale distributed system for ATLAS Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Lassnig, M; Barisits, M; Vigne, R; Serfon, C; Stewart, G A; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address the ATLAS experiment scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 150 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on new technologies to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads.

  19. Concept of large scale PV-WT-PSH energy sources coupled with the national power system

    Directory of Open Access Journals (Sweden)

    Jurasz Jakub

    2017-01-01

    Full Text Available Intermittent/non-dispatchable energy sources are characterized by significant variation of their energy yield over time. In the majority of cases their role in energy systems is marginalized. However, even in Poland, which is strongly committed to its hard and brown coal fired power plants, wind generation in terms of installed capacity is starting to play a significant role. This paper briefly introduces a concept of wind (WT) and solar (PV) powered pumped storage hydroelectricity (PSH), which seems to be a viable option for solving the problem of the variable nature of PV and WT generation. Additionally, we summarize the results of our research conducted so far on the integration of variable renewable energy sources (VRES) into energy systems and present conclusions which strictly refer to the prospects of large-scale PV-WT-PSH operating as a part of the Polish energy system.

  20. SMES-UPS for large-scaled SC magnet system of LHD

    International Nuclear Information System (INIS)

    Yamada, Shuichi; Mito, T.; Chikaraishi, H.; Nishimura, A.; Kojima, H.; Nakanishi, Y.; Uede, T.; Satow, T.; Motojima, O.

    2003-01-01

    The LHD is an SC experimental fusion device of the heliotron type. Eight sets of helium compressors with a total electric power of 3.5 MW are installed in the cryogenic system. Analytical studies of the SMES-UPS for the compressors under deep voltage sags are reported in this paper. The amplitude and frequency of the voltage decrease gradually due to the regenerating effect of the induction motors. The SMES-UPS system proposed in this report has the following functions: (1) variable frequency control, (2) regulation by ACR and AVR, and (3) rapid isolation of the loads from the grid line and synchronous reconnection to it. We have demonstrated that SMES is useful for the large-scale cryogenic system of the experimental fusion device

  1. Production of microbial biosurfactants: Status quo of rhamnolipid and surfactin towards large-scale production.

    Science.gov (United States)

    Henkel, Marius; Geissler, Mareen; Weggenmann, Fabiola; Hausmann, Rudolf

    2017-07-01

    Surfactants are an important class of industrial chemicals. Nowadays, oleochemical surfactants such as alkyl polyglycosides (APGs) are becoming increasingly important. This trend towards the utilization of renewable resources continues, and consumers increasingly demand environmentally friendly products. Consequently, research on microbial surfactants has increased drastically in recent years. While established industrial processes exist for mannosylerythritol lipids and sophorolipids, an implementation of other microbially derived surfactants has not yet been achieved. Amongst these biosurfactants, rhamnolipids synthesized by Pseudomonas aeruginosa and surfactin produced by Bacillus subtilis are so far the most analyzed biosurfactants due to their exceptional properties and the concomitant possible applications. In this review, a general overview is given regarding the current status of biosurfactants and the benefits attributed to these molecules. Furthermore, the most recent research approaches for both rhamnolipids and surfactin are presented with respect to possible methods for industrial processes and the drawbacks and limitations researchers have to address and overcome. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. LARGE SCALE DISTRIBUTED PARAMETER MODEL OF MAIN MAGNET SYSTEM AND FREQUENCY DECOMPOSITION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    ZHANG,W.; MARNERIS, I.; SANDBERG, J.

    2007-06-25

    A large accelerator main magnet system consists of hundreds, even thousands, of dipole magnets. They are linked together under selected configurations to provide highly uniform dipole fields when powered. Distributed capacitance, insulation resistance, coil resistance, magnet inductance, and the coupling inductance of upper and lower pancakes make each magnet a complex network. When all dipole magnets are chained together in a circle, they become a coupled pair of very high order complex ladder networks. In this study, a network of more than a thousand inductive, capacitive or resistive elements is used to model an actual system. The circuit is a large-scale network whose equivalent polynomial form has a degree of several hundred. Analysis of this high order circuit and simulation of the response of any or all components is often computationally infeasible. We present methods that use a frequency decomposition approach to effectively simulate and analyze magnet configurations and power supply topologies.
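
    One standard way to explore such a ladder numerically is to cascade two-port ABCD (transmission) matrices, one series R-L element and one shunt C per magnet cell, and sweep frequency. The sketch below does this for a chain of identical cells; the element values and cell count are arbitrary illustrations, not the actual magnet parameters.

```python
import numpy as np

# Frequency response of a chain of identical magnet "cells", each modeled
# as a series (R, L) element followed by a shunt C to ground, cascaded as
# ABCD (transmission) matrices -- a miniature of the ladder analysis.
R, L, C, n_cells = 0.01, 5e-3, 2e-9, 100   # illustrative per-magnet values

for f in (10, 100, 1e3, 1e4):
    s = 2j * np.pi * f
    series = np.array([[1, R + s * L], [0, 1]])   # series impedance element
    shunt = np.array([[1, 0], [s * C, 1]])        # shunt admittance element
    M = np.linalg.matrix_power(shunt @ series, n_cells)
    # With an open-circuit output, Vout/Vin = 1/A (the [0, 0] entry).
    print(f"f = {f:8.0f} Hz, |Vout/Vin| = {abs(1 / M[0, 0]):.3e}")
```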

  3. Assessing the value of storage services in large-scale multireservoir systems

    Science.gov (United States)

    Tilmant, A.; Arjoon, D.; Guilherme, G. F.

    2012-12-01

    both countries, the highly contrasted hydrologic regime of the Euphrates river could only be dealt with through storage. However, due to political tensions, those projects were carried out without much cooperation and coordination among riparian countries. The development started in the late 1960s with the construction of the head reservoir in Turkey (Keban dam) and the most downstream reservoir in Syria (Tabqa dam). Thirty years later, five other dams in both countries had been commissioned, changing the economy of this region through the export of hydroelectric power (7812 MW) and agricultural products (cotton and cereals). The operating policies and marginal water values of this multipurpose multireservoir system are determined using Stochastic Dual Dynamic Programming, an optimization algorithm that can handle large-scale reservoir operation problems while keeping an individual representation of the hydraulic infrastructure and the demand sites. The analysis of the simulation results reveals that the average value of storage for the entire cascade of reservoirs is around 420 million US$/a, which is 18% of the annual short-run benefits of the system (2.26 billion US$/a).

  4. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    Science.gov (United States)

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  5. Application of the PSA method to decay heat removal systems in a large scale FBR design

    International Nuclear Information System (INIS)

    Kotake, S.; Satoh, K.; Matsumoto, H.; Sugawara, M.; Sakata, K.; Okabe, A.

    1993-01-01

    The Probabilistic Safety Assessment (PSA) method is applied to a large scale loop-type FBR in its conceptual design stage in order to establish a well-balanced safety. Both the reactor shutdown and decay heat removal systems are designed to be highly reliable, e.g. 10^-7/d. In this paper the results of several reliability analyses concerning the DHRS are discussed, where the effects of the analytical assumptions, design options, and accident management measures on the reliability are examined. The failure probability is evaluated to be small enough, since the DRACS consists of four independent loops with sufficient heat removal capacity and both forced and natural circulation capabilities are designed in. It is found that common mode failures of the active components in the DRACS dominate the reliability. Design diversity for these components can be effective for improvements, and accident management measures on the BOP are also possible by making use of the long grace period in an FBR. (author)

  6. Application of the PSA method to decay heat removal systems in a large scale FBR design

    Energy Technology Data Exchange (ETDEWEB)

    Kotake, S; Satoh, K [Japan Atomic Power Company, Otemachi, Chiyoda-ku, Tokyo (Japan); Matsumoto, H; Sugawara, M [Toshiba Corporation (Japan); Sakata, K [Mitsubishi Atomic Power Industries Inc. (Japan); Okabe, A [Hitachi Engineering Co., Ltd. (Japan)

    1993-02-01

    The Probabilistic Safety Assessment (PSA) method is applied to a large scale loop-type FBR in its conceptual design stage in order to establish a well-balanced safety. Both the reactor shutdown and decay heat removal systems are designed to be highly reliable, e.g. 10^-7/d. In this paper the results of several reliability analyses concerning the DHRS are discussed, where the effects of the analytical assumptions, design options and accident management measures on the reliability are examined. The failure frequency is evaluated to be small enough, since the DRACS consists of four independent loops with sufficient heat removal capacity and both forced and natural circulation capabilities are provided. It is found that common mode failures of the active components in the DRACS dominate the reliability. Design diversity for these components can be effective for improvement, and accident management measures on the BOP are also possible by making use of the long grace period in FBRs. (author)

  7. Requirements and principles for the implementation and construction of large-scale geographic information systems

    Science.gov (United States)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for satisfying them. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
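
    Because the discussion of spatial data structures stays abstract, a concrete (and entirely generic, not taken from the paper) example may help: the point quadtree below supports insertion and rectangular range queries, two primitives a GIS storage layer must serve efficiently. The leaf capacity of 4 and the sample coordinates are arbitrary.

        # Minimal point quadtree: insert points, query axis-aligned rectangles.
        class Quadtree:
            def __init__(self, x0, y0, x1, y1, cap=4):
                self.bounds, self.cap = (x0, y0, x1, y1), cap
                self.points, self.children = [], None

            def insert(self, p):
                x0, y0, x1, y1 = self.bounds
                if not (x0 <= p[0] < x1 and y0 <= p[1] < y1):
                    return False                      # point outside this node
                if self.children is None and len(self.points) < self.cap:
                    self.points.append(p)
                    return True
                if self.children is None:             # leaf is full: split it
                    mx, my = (x0 + x1) / 2, (y0 + y1) / 2
                    self.children = [Quadtree(x0, y0, mx, my), Quadtree(mx, y0, x1, my),
                                     Quadtree(x0, my, mx, y1), Quadtree(mx, my, x1, y1)]
                    for old in self.points:
                        self._push(old)
                    self.points = []
                return self._push(p)

            def _push(self, p):
                return any(c.insert(p) for c in self.children)

            def query(self, x0, y0, x1, y1):
                bx0, by0, bx1, by1 = self.bounds
                if bx1 <= x0 or bx0 >= x1 or by1 <= y0 or by0 >= y1:
                    return []                         # no overlap: prune subtree
                hits = [p for p in self.points if x0 <= p[0] < x1 and y0 <= p[1] < y1]
                for c in self.children or []:
                    hits += c.query(x0, y0, x1, y1)
                return hits

        t = Quadtree(0, 0, 100, 100)
        for p in [(10, 10), (80, 20), (55, 55), (12, 14), (11, 90)]:
            t.insert(p)
        print(t.query(0, 0, 50, 50))                  # -> [(10, 10), (12, 14)]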

  8. LongLine: Visual Analytics System for Large-scale Audit Logs

    Directory of Open Access Journals (Sweden)

    Seunghoon Yoo

    2018-03-01

    Audit logs are different from other software logs in that they record the most primitive events (i.e., system calls) in modern operating systems. Audit logs contain a detailed trace of an operating system, and thus have received great attention from security experts and system administrators. However, the complexity and size of audit logs, which increase in real time, have hindered analysts from understanding and analyzing them. In this paper, we present a novel visual analytics system, LongLine, which enables interactive visual analyses of large-scale audit logs. LongLine lowers the interpretation barrier of audit logs by employing human-understandable representations (e.g., file paths and commands) instead of abstract indicators of operating systems (e.g., file descriptors), as well as revealing the temporal patterns of the logs in a multi-scale fashion with meaningful granularity of time in mind (e.g., hourly, daily, and weekly). LongLine also streamlines comparative analysis between interesting subsets of logs, which is essential in detecting anomalous behaviors of systems. In addition, LongLine allows analysts to monitor the system state in a streaming fashion, keeping the latency between log creation and visualization less than one minute. Finally, we evaluate our system through a case study and a scenario analysis with security experts.

  9. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during its execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and could be distributed in various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve the application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
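
    The event-filtering idea is easiest to see in miniature: consumers subscribe with a predicate, and the monitor forwards only the events that match, so traffic shrinks at the source. The sketch below is a hypothetical single-process illustration with invented event fields; the architecture in the abstract distributes exactly this computation across the system.

        # Subscription-based event filtering in miniature (fields are invented).
        from typing import Callable

        Event = dict
        subscriptions: list[tuple[Callable[[Event], bool], Callable[[Event], None]]] = []

        def subscribe(predicate, handler):
            subscriptions.append((predicate, handler))

        def publish(event: Event):
            for predicate, handler in subscriptions:
                if predicate(event):                   # filter at the source
                    handler(event)

        subscribe(lambda e: e["severity"] >= 3, lambda e: print("debugger:", e))
        subscribe(lambda e: e["node"] == "node7", lambda e: print("tuner:", e))

        publish({"node": "node7", "severity": 1, "kind": "latency"})   # tuner only
        publish({"node": "node2", "severity": 4, "kind": "crash"})     # debugger only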

  10. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    Science.gov (United States)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems are currently operating at national and international scales. Most are based on methods derived from the pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to the others to estimate the risks of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels. An overall diagram summarized the diverse relationships between satellite EO and agriculture information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods depending on the region of the world allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, the vegetation indices and the biophysical variables

  11. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, has led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface, and a web-interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potential on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit both users with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and a comprehensive documentation are freely available.
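
    At its core, a WMS of this kind schedules mutually dependent steps; the sketch below shows only that scheduling kernel, using Python's standard topological sort. The task names mimic an RNA-seq pipeline but are invented, and real Watchdog workflows are built from its own module definitions with replicate handling and error hooks, not from this toy executor.

        # Dependency-ordered execution of workflow tasks (illustrative only).
        from graphlib import TopologicalSorter

        tasks = {                       # task -> set of tasks it depends on
            "trim": set(),
            "align": {"trim"},
            "count": {"align"},
            "diffexp": {"count"},
        }

        def run(name: str) -> None:
            print(f"running {name}")    # a real WMS dispatches to local/cluster hosts

        for task in TopologicalSorter(tasks).static_order():
            run(task)                   # trim, align, count, diffexp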

  12. A high-speed transmission method for large-scale marine seismic prospecting systems

    International Nuclear Information System (INIS)

    KeZhu, Song; Ping, Cao; JunFeng, Yang; FuMing, Ruan

    2012-01-01

    A marine seismic prospecting system is a kind of data acquisition and transmission system with large-scale coverage and synchronous multi-node acquisition. In this kind of system, data transmission is a fundamental and difficult technique. In this paper, a high-speed data-transmission method is proposed, its implications and limitations are discussed, and conclusions are drawn. The method we propose has obvious advantages over traditional techniques with respect to long-distance operation, high speed, and real-time transmission. A marine seismic system with four streamers, each 6000 m long and capable of supporting up to 1920 channels, was designed and built based on this method. The effective transmission baud rate of this system was found to reach up to 240 Mbps, while the minimum sampling interval was as short as 0.25 ms. The system was also found to achieve good synchronization of 83 ns. Laboratory and in situ experiments showed that this marine prospecting system could work correctly and robustly, which verifies the feasibility and validity of the method proposed in this paper. In addition to marine seismic applications, this method can also be used in land seismic applications and certain other transmission applications such as environmental or engineering monitoring systems. (paper)

  13. A high-speed transmission method for large-scale marine seismic prospecting systems

    Science.gov (United States)

    KeZhu, Song; Ping, Cao; JunFeng, Yang; FuMing, Ruan

    2012-12-01

    A marine seismic prospecting system is a kind of data acquisition and transmission system with large-scale coverage and synchronous multi-node acquisition. In this kind of system, data transmission is a fundamental and difficult technique. In this paper, a high-speed data-transmission method is proposed, its implications and limitations are discussed, and conclusions are drawn. The method we propose has obvious advantages over traditional techniques with respect to long-distance operation, high speed, and real-time transmission. A marine seismic system with four streamers, each 6000 m long and capable of supporting up to 1920 channels, was designed and built based on this method. The effective transmission baud rate of this system was found to reach up to 240 Mbps, while the minimum sampling interval was as short as 0.25 ms. The system was also found to achieve good synchronization of 83 ns. Laboratory and in situ experiments showed that this marine prospecting system could work correctly and robustly, which verifies the feasibility and validity of the method proposed in this paper. In addition to marine seismic applications, this method can also be used in land seismic applications and certain other transmission applications such as environmental or engineering monitoring systems.

  14. Static analysis of large-scale multibody system using joint coordinates and spatial algebra operator.

    Science.gov (United States)

    Omar, Mohamed A

    2014-01-01

    Initial transient oscillations exhibited in the dynamic simulation responses of multibody systems can lead to inaccurate results, unrealistic load prediction, or simulation failure. These transients could result from incompatible initial conditions, initial constraint violations, and inadequate kinematic assembly. Performing a static equilibrium analysis before the dynamic simulation can eliminate these transients and lead to a stable simulation. Most existing multibody formulations determine the static equilibrium position by minimizing the system potential energy. This paper presents a new general-purpose approach for solving the static equilibrium in large-scale articulated multibody systems. The proposed approach introduces an energy drainage mechanism based on the Baumgarte constraint stabilization approach to determine the static equilibrium position. The spatial algebra operator is used to express the kinematic and dynamic equations of the closed-loop multibody system. The proposed multibody system formulation utilizes the joint coordinates and modal elastic coordinates as the system generalized coordinates. The recursive nonlinear equations of motion are formulated using the Cartesian coordinates and the joint coordinates to form an augmented set of differential algebraic equations. The system connectivity matrix is then derived from the system topological relations and used to project the Cartesian quantities into the joint subspace, leading to a minimum set of differential equations.
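
    The Baumgarte idea itself fits in a few lines: instead of enforcing a constraint C = 0 exactly, the Lagrange multiplier is chosen so that the violation obeys the damped equation C'' + 2*alpha*C' + beta^2*C = 0 and decays over time. The planar pendulum below (a free point mass plus a length constraint) is a minimal, self-contained illustration; the gains, time step and model are assumptions, not the paper's joint-coordinate formulation.

        # Baumgarte-stabilized constrained dynamics for a planar pendulum.
        # Constraint: C = x^2 + y^2 - L^2 = 0 (unit mass, illustrative gains).
        import numpy as np

        L, g = 1.0, 9.81
        alpha, beta = 5.0, 5.0               # Baumgarte gains
        q = np.array([L, 0.0])               # position, starts on the constraint
        v = np.array([0.0, 0.0])
        dt = 1e-3

        for _ in range(5000):
            C = q @ q - L**2                 # constraint value
            Cd = 2.0 * q @ v                 # constraint rate
            f = np.array([0.0, -g])          # applied force (gravity)
            # qdd = f + lam * dC/dq with dC/dq = 2q; solve for lam so that
            # Cdd = 2 v.v + 2 q.qdd equals -2*alpha*Cd - beta^2*C.
            lam = (-2.0 * alpha * Cd - beta**2 * C
                   - 2.0 * (v @ v) - 2.0 * (q @ f)) / (4.0 * (q @ q))
            qdd = f + 2.0 * lam * q
            v += dt * qdd                    # semi-implicit Euler step
            q += dt * v

        print(abs(q @ q - L**2))             # constraint violation stays small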

  15. Hierarchical fiber-optic-based sensing system: impact damage monitoring of large-scale CFRP structures

    International Nuclear Information System (INIS)

    Minakuchi, Shu; Banshoya, Hidehiko; Takeda, Nobuo; Tsukamoto, Haruka

    2011-01-01

    This study proposes a novel fiber-optic-based hierarchical sensing concept for monitoring randomly induced damage in large-scale composite structures. In a hierarchical system, several kinds of specialized devices are hierarchically combined to form a sensing network. Specifically, numerous three-dimensionally structured sensor devices are distributed throughout the whole structural area and connected with an optical fiber network through transducing mechanisms. The distributed devices detect damage, and the fiber-optic network gathers the damage signals and transmits the information to a measuring instrument. This study began by discussing the basic concept of a hierarchical sensing system through comparison with existing fiber-optic-based systems, and an impact damage detection system was then proposed to validate the new concept. The sensor devices were developed based on comparative vacuum monitoring (CVM), and Brillouin-based distributed strain measurement was utilized to identify damaged areas. Verification tests were conducted step-by-step, beginning with a basic test using a single sensor unit, and, finally, the proposed monitoring system was successfully verified using a carbon fiber reinforced plastic (CFRP) fuselage demonstrator. It was clearly confirmed that the hierarchical system has better repairability, higher robustness, and a wider monitorable area compared to existing systems.

  16. Large-scale recumbent isoclinal folds in the footwall of the West Cycladic Detachment System (Greece)

    Science.gov (United States)

    Rice, A. Hugh N.; Grasemann, Bernhard

    2017-04-01

    The Pindos Zone in the Cyclades underwent Eocene high-pressure metamorphism and syn-orogenic exhumation, overprinted by Miocene low-angled extension. Although this represents a combination of likely high-strain events, structural evidence of large-scale folding is rare. Here potential examples of such folding on Kea and Kythnos, in the Western Cyclades, are evaluated. These islands lie within the Cycladic Blueschist Nappe (lower nappe) of the Pindos Zone and in the footwall of the top-to-SSW West Cycladic Detachment System (WCDS). On Kea, no lithostratigraphy can be established in the 450 m thick greenschist-facies mixed sedimentary-volcanoclastic-marble mylonite/phyllonite succession. On the east side of the island, lensoid marble layers frequently bifurcate, which might reflect early, sheared-out isoclinal folding, although no evidence of folded compositional layering has been found in potential fold-hinge zones and the bifurcation points are not arranged in a way suggestive of a fold axis parallel to the NNE-SSW oriented stretching lineation. However, at two localities, medium-scale recumbent isoclinal folding has been mapped, with NNE-SSW fold axes exposed for up to 250 m and amplitudes of up to 170 m. On Kythnos, stretching lineations in greenschist-facies rocks show a rotation from ENE-WSW in the north to NNE-SSW in the south, taken to represent a reorientation of the Eocene exhumation strain during block rotation coincident with top-to-SSW movement of the WCDS. The distribution of the three marble units that crop out in central/southern Kythnos suggests that large-scale, likely isoclinal folding occurred. (1) Petroussa Lithodeme - a blue-grey calcite (BGC) marble with quartz-calcite-white-mica (QCWM) schists, forming a continuous outcrop around the island, thinning from >16 m in the SE to <8 m thick mylonites in the SW, overlain by grey sericite-albite-graphite schists (Flabouria Lithodeme); (2) Rizou Lithodeme - massive grey to BGC marble, with abundant

  17. Towards a Database System for Large-scale Analytics on Strings

    KAUST Repository

    Sahli, Majed A.

    2015-07-23

    Recent technological advances are causing an explosion in the production of sequential data. Biological sequences, web logs and time series are represented as strings. Currently, strings are stored, managed and queried in an ad-hoc fashion because they lack a standardized data model and query language. String queries are computationally demanding, especially when strings are long and numerous. Existing approaches cannot handle the growing number of strings produced by environmental, healthcare, bioinformatic, and space applications. There is a trade-off between performing analytics efficiently and scaling to thousands of cores to finish in reasonable times. In this thesis, we introduce a data model that unifies the input and output representations of core string operations. We define a declarative query language for strings where operators can be pipelined to form complex queries. A rich set of core string operators is described to support string analytics. We then demonstrate a database system for string analytics based on our model and query language. In particular, we propose the use of a novel data structure augmented by efficient parallel computation to strike a balance between preprocessing overheads and query execution times. Next, we delve into repeated motifs extraction as a core string operation for large-scale string analytics. Motifs are frequent patterns used, for example, to identify biological functionality, periodic trends, or malicious activities. Statistical approaches are fast but inexact while combinatorial methods are sound but slow. We introduce ACME, a combinatorial repeated motifs extractor. We study the spatial and temporal locality of motif extraction and devise a cache-aware search space traversal technique. ACME is the only method that scales to gigabyte-long strings, handles large alphabets, and supports interesting motif types with minimal overhead. While ACME is cache-efficient, it is limited by being serial. We devise a lightweight
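
    To fix what "repeated motif extraction" computes, here is a deliberately brute-force version of the operation: report every length-k substring that occurs at least m times. ACME solves a much richer combinatorial problem (approximate matches, large alphabets, cache-aware traversal); this Counter version is only a hypothetical reference implementation of the input/output contract.

        # Brute-force repeated-motif extraction (illustration, not ACME).
        from collections import Counter

        def repeated_motifs(s: str, k: int, m: int) -> dict[str, int]:
            counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
            return {motif: n for motif, n in counts.items() if n >= m}

        print(repeated_motifs("ACGTACGTACGA", 4, 2))   # motifs of length 4 seen twice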

  18. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; King, Jeff [Colorado School of Mines, Golden, CO (United States)]; Boak, Jeremy [Colorado School of Mines, Golden, CO (United States)]; Washington, Jeremy [Colorado School of Mines, Golden, CO (United States)]; Sharp, Cory [Colorado School of Mines, Golden, CO (United States)]

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  19. Representative elements: A step to large-scale fracture system simulation

    International Nuclear Information System (INIS)

    Clemo, T.M.

    1987-01-01

    Large-scale simulation of flow and transport in fractured media requires the development of a technique to represent the effect of a large number of fractures. Representative elements are used as a tool to model a subset of a fracture system as a single distributed entity. Representative elements are part of a modeling concept called dual permeability. Dual permeability modeling combines discrete fracture simulation of the most important fractures with the distributed modeling of the less important fracture of a fracture system. This study investigates the use of stochastic analysis to determine properties of representative elements. Given an assumption of fully developed laminar flow, the net fracture conductivities and hence flow velocities can be determined from descriptive statistics of fracture spacing, orientation, aperture, and extent. The distribution of physical characteristics about their mean leads to a distribution of the associated conductivities. The variance of hydraulic conductivity induces dispersion into the transport process. Simple fracture systems are treated to demonstrate the usefulness of stochastic analysis. Explicit equations for conductivity of an element are developed and the dispersion characteristics are shown. Explicit formulation of the hydraulic conductivity and transport dispersion reveals the dependence of these important characteristics on the parameters used to describe the fracture system. Understanding these dependencies will help to focus efforts to identify the characteristics of fracture systems. Simulations of stochastically generated fracture sets do not provide this explicit functional dependence on the fracture system parameters. 12 refs., 6 figs

  20. Hierarchical system for autonomous sensing-healing of delamination in large-scale composite structures

    International Nuclear Information System (INIS)

    Minakuchi, Shu; Sun, Denghao; Takeda, Nobuo

    2014-01-01

    This study combines our hierarchical fiber-optic-based delamination detection system with a microvascular self-healing material to develop the first autonomous sensing-healing system applicable to large-scale composite structures. In this combined system, embedded vascular modules are connected through check valves to a surface-mounted supply tube of a pressurized healing agent while fiber-optic-based sensors monitor the internal pressure of these vascular modules. When delamination occurs, the healing agent flows into the vascular modules breached by the delamination and infiltrates the damage for healing. At the same time, the pressure sensors identify the damaged modules by detecting internal pressure changes. This paper begins by describing the basic concept of the combined system and by discussing the advantages that arise from its hierarchical nature. The feasibility of the system is then confirmed through delamination infiltration tests. Finally, the hierarchical system is validated in a plate specimen by focusing on the detection and infiltration of the damage. Its self-diagnostic function is also demonstrated. (paper)

  1. Large-scale module production for the CMS silicon strip tracker

    CERN Document Server

    Cattai, A

    2005-01-01

    The Silicon Strip Tracker (SST) for the CMS experiment at LHC consists of 210 m² of silicon strip detectors grouped into four distinct sub-systems. We present a brief description of the CMS Tracker, the industrialised detector module production methods and the current status of the SST with reference to some problems encountered at the factories and in the construction centres.

  2. Impact of Data Placement on Resilience in Large-Scale Object Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Carns, Philip; Harms, Kevin; Jenkins, John; Mubarak, Misbah; Ross, Robert; Carothers, Christopher

    2016-05-02

    Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
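
    The role a placement algorithm like CRUSH plays can be demonstrated with a simpler cousin: rendezvous (highest-random-weight) hashing, which lets any client compute an object's replica set deterministically, with no central directory to consult or repair. Real CRUSH additionally walks a weighted failure-domain hierarchy (racks, hosts); the flat sketch below, with invented server names, shows only the decentralized-placement idea.

        # Rendezvous hashing: deterministic replica placement without a directory.
        import hashlib

        def place(object_id: str, servers: list[str], replicas: int = 3) -> list[str]:
            def weight(server: str) -> int:
                digest = hashlib.sha256(f"{object_id}:{server}".encode()).hexdigest()
                return int(digest, 16)
            return sorted(servers, key=weight, reverse=True)[:replicas]

        servers = [f"osd{i}" for i in range(16)]
        print(place("object-42", servers))     # same 3 servers every time it is asked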

  3. Development of large scale fusion plasma simulation and storage grid on JAERI Origin3800 system

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Wang, Xin

    2003-01-01

    Under the Numerical EXperiment of Tokamak (NEXT) research project, various fluid, particle, and hybrid codes have been developed. These codes require a computational environment which consists of high performance processors, a high speed storage system, and a high speed parallelized visualization system. In this paper, the performance of the JAERI Origin3800 system is examined with respect to these requirements. In the performance tests, it is shown that the representative particle and fluid codes operate with 15-40% processing efficiency on up to 512 processors. A storage area network (SAN) provides high speed parallel data transfer. A parallel visualization system enables an order of magnitude faster visualization of large-scale simulation data compared with the previous graphic workstations. Accordingly, an extremely advanced simulation environment is realized on the JAERI Origin3800 system. Recently, development of a storage grid is underway in order to improve the computational environment for remote users. The storage grid is constructed by a combination of the SAN and a wavelength division multiplexer (WDM). Preliminary tests show that, compared with existing data transfer methods, it enables dramatically faster data transfer of ∼100 Gbps over a wide area network. (author)

  4. Algorithms for large scale singular value analysis of spatially variant tomography systems

    International Nuclear Information System (INIS)

    Cao-Huu, Tuan; Brownell, G.; Lachiver, G.

    1996-01-01

    The problem of determining the eigenvalues of large matrices occurs often in the design and analysis of modern tomography systems. As there is an interest in solving systems containing an ever-increasing number of variables, current research effort is being made to create more robust solvers which do not depend on some special feature of the matrix for convergence (e.g. block circulant), and to improve the speed of already known and understood solvers so that solving even larger systems in a reasonable time becomes viable. Our standard techniques for singular value analysis are based on sparse matrix factorization and are not applicable when the input matrices are large because the algorithms cause too much fill. Fill refers to the increase of non-zero elements in the LU decomposition of the original matrix A (the system matrix). So we have developed iterative solutions that are based on sparse direct methods. Data motion and preconditioning techniques are critical for performance. This conference paper describes our algorithmic approaches for large-scale singular value analysis of spatially variant imaging systems, and in particular of PCR2, a cylindrical three-dimensional PET imager built at the Massachusetts General Hospital (MGH) in Boston. We recommend the desirable features and challenges for the next generation of parallel machines for optimal performance of our solver.
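
    The contrast drawn between factorization (which suffers fill) and iterative methods can be illustrated with the simplest iterative singular-value solver: power iteration on A^T A, which touches A only through matrix-vector products and therefore never creates new non-zeros. The dense random matrix and fixed iteration count below are placeholders; for a real system matrix the products would be sparse or matrix-free.

        # Matrix-free estimation of the largest singular value by power iteration.
        import numpy as np

        def largest_singular_value(A, iters: int = 200) -> float:
            v = np.random.default_rng(1).standard_normal(A.shape[1])
            for _ in range(iters):
                w = A.T @ (A @ v)          # one forward and one adjoint product
                v = w / np.linalg.norm(w)
            return float(np.linalg.norm(A @ v))

        A = np.random.default_rng(2).standard_normal((200, 100))
        print(largest_singular_value(A), np.linalg.svd(A, compute_uv=False)[0])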

  5. Large scale production of densified hydrogen to the triple point and below

    Science.gov (United States)

    Swanger, A. M.; Notardonato, W. U.; E Fesmire, J.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage technology at NASA Kennedy Space Center led to the production of large quantities of densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels and LH2 temperatures were measured by twenty silicon diode temperature sensors. Overall densification performance of the system is explored, and solid mass fractions are calculated. Experimental data reveal hydrogen temperatures dropped well below the triple point during testing, and were continuing to trend downward prior to system shutdown. Sub-triple point temperatures were seen to evolve in a time dependent manner along the length of the horizontal, cylindrical vessel. The phenomenon, observed at two fill levels, is detailed herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  6. Rucio – The next generation of large scale distributed system for ATLAS data management

    International Nuclear Information System (INIS)

    Garonne, V; Vigne, R; Stewart, G; Barisits, M; Beermann, T B; Lassnig, M; Serfon, C; Goossens, L; Nairz, A

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and 'Big Data' computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, specifically looking at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  7. Fluid-structure interaction in non-rigid pipeline systems - large scale validation experiments

    International Nuclear Information System (INIS)

    Heinsbroek, A.G.T.J.; Kruisbrink, A.C.H.

    1993-01-01

    The fluid-structure interaction computer code FLUSTRIN, developed by DELFT HYDRAULICS, enables the user to determine dynamic fluid pressures, structural stresses and displacements in a liquid-filled pipeline system under transient conditions. As such, the code is a useful tool for process and mechanical engineers in the safe design and operation of pipeline systems in nuclear power plants. To validate FLUSTRIN, experiments have been performed in a large scale 3D test facility. The test facility consists of a flexible pipeline system which is suspended by wires, bearings and anchors. Pressure surges, which excite the system, are generated by a fast acting shut-off valve. Dynamic pressures, structural displacements and strains (in total 70 signals) have been measured under well determined initial and boundary conditions. The experiments have been simulated with FLUSTRIN, which solves the acoustic equations using the method of characteristics (fluid) and the finite element method (structure). The agreement between experiments and simulations is shown to be good: frequencies, amplitudes and wave phenomena are well predicted by the numerical simulations. It is demonstrated that an uncoupled water hammer computation would render unreliable and useless results. (author)

  8. A Self-Organizing Spatial Clustering Approach to Support Large-Scale Network RTK Systems.

    Science.gov (United States)

    Shen, Lili; Guo, Jiming; Wang, Lei

    2018-06-06

    The network real-time kinematic (RTK) technique can provide centimeter-level real-time positioning solutions and plays a key role in geo-spatial infrastructure. With ever-increasing popularity, network RTK systems will face issues in the support of large numbers of concurrent users. In the past, high-precision positioning services were oriented towards professionals and only supported a few concurrent users. Currently, precise positioning provides a spatial foundation for artificial intelligence (AI), and countless smart devices (autonomous cars, unmanned aerial vehicles (UAVs), robotic equipment, etc.) require precise positioning services. Therefore, the development of approaches to support large-scale network RTK systems is urgent. In this study, we proposed a self-organizing spatial clustering (SOSC) approach which automatically clusters online users to reduce the computational load on the network RTK system server side. The experimental results indicate that both the SOSC algorithm and the grid algorithm can reduce the computational load efficiently, while the SOSC algorithm gives a more elastic and adaptive clustering solution with different datasets. The SOSC algorithm determines the cluster number and the mean distance to cluster center (MDTCC) according to the data set, while the grid approaches are all predefined. The side-effects of the clustering algorithms on the user side are analyzed with real global navigation satellite system (GNSS) data sets. The experimental results indicate that 10 km can be safely used as the cluster radius threshold for the SOSC algorithm without significantly reducing the positioning precision and reliability on the user side.
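
    A minimal way to see how clustering caps server load: assign each incoming user to an existing cluster whose center lies within the 10 km radius threshold mentioned above, otherwise open a new cluster, so the server computes one network solution per cluster instead of one per user. This greedy leader-style rule and the sample coordinates are illustrative assumptions, not the SOSC algorithm itself.

        # Radius-threshold spatial clustering of user positions (toy rule).
        import math

        RADIUS_KM = 10.0

        def haversine_km(p, q):
            lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
            a = (math.sin((lat2 - lat1) / 2) ** 2
                 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
            return 6371.0 * 2.0 * math.asin(math.sqrt(a))

        def cluster(users):
            centers, members = [], []
            for u in users:
                d = [(haversine_km(u, c), i) for i, c in enumerate(centers)]
                if d and min(d)[0] <= RADIUS_KM:
                    members[min(d)[1]].append(u)      # join the nearest cluster
                else:
                    centers.append(u)                 # become a new cluster center
                    members.append([u])
            return centers, members

        users = [(30.50, 114.30), (30.52, 114.33), (30.70, 114.90), (30.51, 114.31)]
        print(cluster(users)[0])                      # two centers for these users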

  9. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    Science.gov (United States)

    Garonne, V.; Vigne, R.; Stewart, G.; Barisits, M.; Beermann, T. B.; Lassnig, M.; Serfon, C.; Goossens, L.; Nairz, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, specifically looking at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  10. Integrating large-scale functional genomics data to dissect metabolic networks for hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Harwood, Caroline S

    2012-12-17

    The goal of this project is to identify gene networks that are critical for efficient biohydrogen production by leveraging variation in gene content and gene expression in independently isolated Rhodopseudomonas palustris strains. Coexpression methods were applied to large data sets that we have collected to define probabilistic causal gene networks. To our knowledge this is a first systems-level approach that takes advantage of strain-to-strain variability to computationally define networks critical for a particular bacterial phenotypic trait.

  11. A coordination model for ultra-large scale systems of systems

    Directory of Open Access Journals (Sweden)

    Manuela L. Bujorianu

    2013-11-01

    Ultra-large multi-agent systems are becoming increasingly popular due to the rapid decline of individual production costs and their potential for speeding up the solving of complex problems. Examples include nano-robots, systems of nano-satellites for dangerous meteorite detection, and cultures of stem cells for organ regeneration or nerve repair. The topics associated with these systems are usually dealt with within the theories of intelligent swarms or biologically inspired computation systems. Stochastic models play an important role and they are based on various formulations of statistical mechanics. In these cases, the main assumption is that the swarm elements have a simple behaviour and that some average properties can be deduced for the entire swarm. In contrast, complex systems in areas like aeronautics are formed by elements with sophisticated, even autonomous, behaviour. In situations like this, a new approach to swarm coordination is necessary. We present a stochastic model where the swarm elements are communicating autonomous systems, the coordination is separated from the component autonomous activity and the entire swarm can be abstracted away as a piecewise deterministic Markov process, which constitutes one of the most popular models in stochastic control. Keywords: ultra large multi-agent systems, system of systems, autonomous systems, stochastic hybrid systems.

  12. Derivation of Optimal Operating Rules for Large-scale Reservoir Systems Considering Multiple Trade-off

    Science.gov (United States)

    Zhang, J.; Lei, X.; Liu, P.; Wang, H.; Li, Z.

    2017-12-01

    Flood control operation of multi-reservoir systems such as parallel reservoirs and hybrid reservoirs often suffers from complex interactions and trade-offs among tributaries and the mainstream. The optimization of such systems is computationally intensive due to nonlinear storage curves, numerous constraints and complex hydraulic connections. This paper aims to derive optimal flood control operating rules based on the trade-off among tributaries and the mainstream using a new algorithm known as the weighted non-dominated sorting genetic algorithm II (WNSGA II). WNSGA II can locate the Pareto frontier in the non-dominated region efficiently due to directed searching by weighted crowding distance, and the results are compared with those of conventional operating rules (COR) and a single objective genetic algorithm (GA). The Xijiang river basin in China is selected as a case study, with eight reservoirs and five flood control sections within four tributaries and the mainstream. Furthermore, the effects of inflow uncertainty have been assessed. Results indicate that: (1) WNSGA II could locate the non-dominated solutions faster and provide a better Pareto frontier than the traditional non-dominated sorting genetic algorithm II (NSGA II) due to the weighted crowding distance; (2) WNSGA II outperforms COR and GA on flood control in the whole basin; (3) the multi-objective operating rules from WNSGA II deal with inflow uncertainties better than COR. Therefore, WNSGA II can be used to derive stable operating rules for large-scale reservoir systems effectively and efficiently.
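
    For context, the crowding-distance step that NSGA-II uses to keep the Pareto front well spread can be written compactly; WNSGA II's contribution is to weight this distance so the search is directed. The per-objective weights below are a guessed stand-in for that idea, and the five sample objective vectors are invented.

        # Crowding distance with per-objective weights (stand-in for WNSGA II).
        import numpy as np

        def weighted_crowding_distance(F: np.ndarray, w: np.ndarray) -> np.ndarray:
            n, m = F.shape
            d = np.zeros(n)
            for j in range(m):
                order = np.argsort(F[:, j])
                d[order[0]] = d[order[-1]] = np.inf   # always keep boundary points
                span = F[order[-1], j] - F[order[0], j]
                if span == 0:
                    continue
                gaps = (F[order[2:], j] - F[order[:-2], j]) / span
                d[order[1:-1]] += w[j] * gaps         # weighted neighbour gap
            return d

        F = np.array([[1.0, 9.0], [2.0, 7.0], [4.0, 4.0], [7.0, 2.0], [9.0, 1.0]])
        print(weighted_crowding_distance(F, np.array([0.7, 0.3])))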

  13. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Beermann, T; Goossens, L; Lassnig, M; Nairz, A; Stewart, GA; Vigne, V; Serfon, C

    2013-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will address these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how ATLAS central group and user activities will be managed. The Rucio design, and the technology it employs, is described...

  14. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Beermann, T; Goossens, L; Lassnig, M; Nairz, A; Stewart, GA; Vigne, V; Serfon, C

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will address these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how ATLAS central group and user activities will be managed. The Rucio design, and the technology it employs, is described...

  15. A solution to the economic dispatch using EP based SA algorithm on large scale power system

    Energy Technology Data Exchange (ETDEWEB)

    Christober Asir Rajan, C. [Department of EEE, Pondicherry Engineering College, Pondicherry 605 014 (India)]

    2010-07-15

    This paper develops a new approach for solving the Economic Load Dispatch (ELD) problem using an integrated algorithm based on Evolutionary Programming (EP) and Simulated Annealing (SA) on large scale power systems. Classical methods employed for solving Economic Load Dispatch are calculus-based. For generator units having quadratic fuel cost functions, the classical techniques ignore or flatten out the portions of the incremental fuel cost curves and so may have difficulties in the determination of the global optimum solution for non-differentiable fuel cost functions. To overcome these problems, the intelligent techniques, namely Evolutionary Programming and Simulated Annealing, are employed. These optimization techniques are capable of determining the global or near-global optimum dispatch solutions. The validity and effectiveness of the proposed integrated algorithm have been tested with the 66-bus Indian utility system and the IEEE 5-bus, 30-bus and 118-bus systems, and the test results are compared with the results obtained from other methods. Numerical results show that the proposed integrated algorithm can provide accurate solutions within reasonable time for any type of fuel cost function. (author)
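
    A stripped-down version of the SA half of the hybrid makes the mechanics visible: perturb one unit's output, always accept downhill moves, accept uphill moves with probability exp(-delta/T), and cool geometrically; the demand balance is enforced with a penalty term. The three-unit data, penalty weight and cooling schedule are hypothetical, and the actual paper couples this with evolutionary programming on much larger systems.

        # Simulated annealing for a toy economic load dispatch (3 units).
        import math, random

        units = [(0.008, 7.0, 200.0, 100.0, 500.0),   # (a, b, c, Pmin, Pmax)
                 (0.009, 6.3, 180.0, 100.0, 400.0),
                 (0.007, 6.8, 140.0, 50.0, 300.0)]
        DEMAND = 850.0

        def cost(P):
            fuel = sum(a * p * p + b * p + c for (a, b, c, _, _), p in zip(units, P))
            return fuel + 1e3 * abs(sum(P) - DEMAND)  # penalty for power mismatch

        random.seed(3)
        P = [(lo + hi) / 2 for (_, _, _, lo, hi) in units]
        T = 100.0
        while T > 1e-3:
            i = random.randrange(len(P))
            cand = P[:]
            lo, hi = units[i][3], units[i][4]
            cand[i] = min(hi, max(lo, cand[i] + random.gauss(0.0, 10.0)))
            delta = cost(cand) - cost(P)
            if delta < 0 or random.random() < math.exp(-delta / T):
                P = cand                              # accept downhill or by chance
            T *= 0.999                                # geometric cooling

        print([round(p, 1) for p in P], round(cost(P), 1))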

  16. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It permits defining the behaviour of several components assembled to process a flow of data, using BIT. Test-cases are defined in a way that they are simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.

  17. Assessment of Future Whole-System Value of Large-Scale Pumped Storage Plants in Europe

    Directory of Open Access Journals (Sweden)

    Fei Teng

    2018-01-01

    This paper analyses the impacts and benefits of the pumped storage plant (PSP) and its upgrade to variable speed on generation and transmission capacity requirements, capital costs, system operating costs and carbon emissions in the future European electricity system. The combination of a deterministic system planning tool, the Whole-electricity System Investment Model (WeSIM), and a stochastic system operation optimisation tool, Advanced Stochastic Unit Commitment (ASUC), is used to analyse the whole-system value of PSP technology and to quantify the impact of European balancing market integration and other competing flexible technologies on the value of the PSP. Case studies on the Pan-European system demonstrate that PSPs can reduce the total system cost by up to €13 billion per annum by 2050 in a scenario with a high share of renewables. Upgrading the PSP to variable-speed drive enhances its long-term benefits by 10–20%. On the other hand, balancing market integration across Europe may potentially reduce the overall value of the variable-speed PSP, although the effect can vary across different European regions. The results also suggest that large-scale deployment of demand-side response (DSR) leads to a significant reduction in the value of PSPs, while the value of PSPs increases by circa 18% when the total European interconnection capacity is halved. The benefit of PSPs in reducing emissions is relatively negligible by 2030 but constitutes around 6–10% of total annual carbon emissions from the European power sector by 2050.

  18. H1 Grid production tool for large scale Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lobodzinski, B; Wissing, Ch [DESY, Hamburg (Germany)]; Bystritskaya, E; Vorobiew, M [ITEP, Moscow (Russian Federation)]; Karbach, T M [University of Dortmund (Germany)]; Mitsyn, S [JINR, Moscow (Russian Federation)]; Mudrinic, M, E-mail: bogdan.lobodzinski@desy.de [VINS, Belgrad (Serbia)]

    2010-04-01

    The H1 Collaboration at HERA has entered the period of high precision analyses based on the final data sample. These analyses require a massive production of simulated Monte Carlo (MC) events. The H1 MC framework (H1MC) is a software framework created by the H1 Collaboration for mass MC production on the LCG Grid infrastructure and on a local batch system. The aim of the tool is full automation of the MC production workflow, including management of the MC jobs on the Grid down to copying of the resulting files from the Grid to the H1 mass storage tape device. The H1 MC framework has a modular structure, delegating a specific task to each module, including tasks specific to the H1 experiment: automatic building of steer and input files, simulation of the H1 detector, reconstruction of particle tracks and post-processing calculation. Each module provides data or functionality needed by other modules via a local database. The Grid jobs created for detector simulation and reconstruction from generated MC input files are fully independent and fault-tolerant on 32- and 64-bit LCG Grid architectures, and while running on the Grid they can be continuously monitored using the Relational Grid Monitoring Architecture (R-GMA) service. To monitor the full production chain and detect potential problems, regular checks of the job state are performed using the local database and the Service Availability Monitoring (SAM) framework. The improved stability of the system has resulted in a dramatic increase in the production rate, which exceeded two billion MC events in 2008.

  19. A modular approach to large-scale design optimization of aerospace systems

    Science.gov (United States)

    Hwang, John T.

    Gradient-based optimization and the adjoint method form a synergistic combination that enables the efficient solution of large-scale optimization problems. Though the gradient-based approach struggles with non-smooth or multi-modal problems, the capability to efficiently optimize up to tens of thousands of design variables provides a valuable design tool for exploring complex tradeoffs and finding unintuitive designs. However, the widespread adoption of gradient-based optimization is limited by the implementation challenges for computing derivatives efficiently and accurately, particularly in multidisciplinary and shape design problems. This thesis addresses these difficulties in two ways. First, to deal with the heterogeneity and integration challenges of multidisciplinary problems, this thesis presents a computational modeling framework that solves multidisciplinary systems and computes their derivatives in a semi-automated fashion. This framework is built upon a new mathematical formulation developed in this thesis that expresses any computational model as a system of algebraic equations and unifies all methods for computing derivatives using a single equation. The framework is applied to two engineering problems: the optimization of a nanosatellite with 7 disciplines and over 25,000 design variables; and simultaneous allocation and mission optimization for commercial aircraft involving 330 design variables, 12 of which are integer variables handled using the branch-and-bound method. In both cases, the framework makes large-scale optimization possible by reducing the implementation effort and code complexity. The second half of this thesis presents a differentiable parametrization of aircraft geometries and structures for high-fidelity shape optimization. Existing geometry parametrizations are not differentiable, or they are limited in the types of shape changes they allow. This is addressed by a novel parametrization that smoothly interpolates aircraft

  20. Solving large scale unit dilemma in electricity system by applying commutative law

    Science.gov (United States)

    Legino, Supriadi; Arianto, Rakhmat

    2018-03-01

    The conventional system, pooling resources with large centralized power plants interconnected as a network, provides many advantages compared to isolated systems, including optimized efficiency and reliability. However, such large plants need huge capital, and more problems have emerged to hinder the construction of big power plants as well as their associated transmission lines. By applying the commutative law of mathematics, ab = ba for all a, b ∈ R, the problems associated with the conventional system depicted above can be reduced. The idea of having small units but many power plants, namely "Listrik Kerakyatan", abbreviated as LK, provides both social and environmental benefits that can be capitalized on under proper assumptions. This study compares the cost and benefit of LK to those of the conventional system, using a simulation method to show that LK offers an alternative solution to many problems associated with the large system. The commutative law of algebra can be used as a simple mathematical model to analyze whether the LK system, as an eco-friendly distributed generation, can be applied to solve various problems associated with a large-scale conventional system. The result of the simulation shows that LK provides more value if its plants operate for less than 11 hours as peaker or load-follower power plants to improve the load curve balance of the power system. The result of the simulation also indicates that the investment cost of LK plants should be optimized in order to minimize the plant investment cost. This study indicates that the benefit of the economies-of-scale principle does not always apply to every condition, particularly if the portion of intangible costs and benefits is relatively high.

  1. Large scale rooftop photovoltaics grid connected system at Charoenphol-Rama I green building

    Energy Technology Data Exchange (ETDEWEB)

    Ketjoy, N.; Rakwichian, W. [School of Renewable Energy Technology (SERT) (Thailand)]; Wongchupan, V. [Panya Consultants Co., Ltd (Thailand)]; Sankarat, T. [Tesco Lotus, Ek-Chai Distribution System Co., Ltd. (Thailand)]

    2004-07-01

    This paper presents a technical feasibility study for a large scale rooftop photovoltaic (PV) grid connected system at the Charoenphol-Rama I green building super store of TESCO LOTUS (TL) in Thailand. The objectives of this project are (i) to study the technical feasibility of installing a 350 kWp PV system on the roof of this site and (ii) to determine the energy produced by this system. The technical factors are examined using the computerized PVS 2000 simulation and assessment tool. This super store building is located in Bangkok, at latitude 14 N and longitude 100 E, with the building oriented 16 degrees from north. The building roof area is 14,000 m², with a 3 degree pitch facing east and a 3 degree pitch facing west. Average daily solar energy in this area is approximately 5.0 kWh/m². The study team for this project consists of an educational institution, the School of Renewable Energy Technology (SERT), and a private institution, Panya Consultants (PC). TL is the project owner, PC is responsible for project management, and SERT is a third party responsible for the PV system study, conceptual design and all technical processes. In this feasibility study, SERT identifies the most attractive scenarios of photovoltaic cell technology (mono- or poly-crystalline silicon, or thin-film amorphous), system design concepts for the owner (TL), and the possible energy yield of the system for different module orientations and tilt angles. The result of this study is a guide to help TL decide on the proper rooftop PV system option for this store from a technical point of view; the economic view is not considered in this study. (orig.)

  2. An optimal beam alignment method for large-scale distributed space surveillance radar system

    Science.gov (United States)

    Huang, Jian; Wang, Dongya; Xia, Shuangzhi

    2018-06-01

    Large-scale distributed space surveillance radar is very important ground-based equipment for maintaining a complete catalogue of Low Earth Orbit (LEO) space debris. However, because the sites of the distributed radar system are separated by thousands of kilometers, optimally aligning the narrow Transmitting/Receiving (T/R) beams over such a great distance poses a special and considerable technical challenge in the space surveillance area. Based on the common coordinate transformation model and the radar beam space model, we present a two-dimensional projection algorithm for the T/R beams using direction angles, which can visually describe and assess beam alignment performance. Subsequently, optimal mathematical models for the orientation angle of the antenna array, the site location and the T/R beam coverage are constructed, and the beam alignment parameters are precisely solved. Finally, we conducted optimal beam alignment experiments based on the site parameters of the Air Force Space Surveillance System (AFSSS). The simulation results demonstrate the correctness and effectiveness of our novel method, which can significantly support the construction of LEO space debris surveillance equipment.
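
    A study like this rests on standard geodetic transformations. The sketch below shows the generic computation (not the paper's own code): rotate a target's Earth-centered (ECEF) position into a site's local East-North-Up frame and read off the azimuth/elevation direction angles; the coordinates in the example are hypothetical.

```python
import numpy as np

def ecef_to_enu(target_ecef, site_ecef, lat_deg, lon_deg):
    """Rotate the ECEF offset (target - site) into the site's local ENU frame."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    rot = np.array([
        [-np.sin(lon),                np.cos(lon),               0.0],
        [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
        [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)],
    ])
    return rot @ (np.asarray(target_ecef) - np.asarray(site_ecef))

def direction_angles(enu):
    """Azimuth (clockwise from North) and elevation of an ENU vector, degrees."""
    e, n, u = enu
    az = np.degrees(np.arctan2(e, n)) % 360
    el = np.degrees(np.arctan2(u, np.hypot(e, n)))
    return az, el

# Hypothetical debris position seen from a site at the equator/prime meridian.
enu = ecef_to_enu([7.0e6, 1.0e5, 1.0e6], [6.378e6, 0.0, 0.0], 0.0, 0.0)
az, el = direction_angles(enu)
print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg")
```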

  3. Feasibility Assessment of Using Power Plant Waste Heat in Large Scale Horticulture Facility Energy Supply Systems

    Directory of Open Access Journals (Sweden)

    Min Gyung Yu

    2016-02-01

    Full Text Available Recently, the Korean government has been carrying out projects to construct several large scale horticulture facilities. However, it is difficult for an energy supply to operate stably and economically with only a conventional fossil fuel boiler system. For this reason, several unused energy sources have become attractive, and power plant waste heat was found to have the greatest application potential in this scenario. In this study, we performed a feasibility assessment of power plant waste heat as an energy source for horticulture facilities. As a result, it was confirmed that the energy potential was sufficient to supply the assumed areas. In Dangjin, a horticultural area of 500 ha could be supported by utilizing 20% of the energy reserves. In Hadong, a horticulture facility of 260 ha could be supported with 7.4% of the energy reserves. In Youngdong, an assumed area of 65 ha could be supported using about 19% of the energy reserves. Furthermore, the payback period was calculated in order to evaluate the economic feasibility compared with a conventional system. The initial investment costs can be recovered thanks to an approximately 83% reduction in annual operating costs.
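
    The economics reduce to a simple payback computation. The 83% operating-cost reduction is taken from the abstract; the absolute cost figures below are hypothetical placeholders, since the paper's site-specific values are not quoted here.

```python
# Simple payback period for the waste-heat system vs. a fossil-fuel boiler.
capex_extra = 5_000_000                       # assumed extra investment, $
boiler_opex = 1_200_000                       # assumed annual boiler cost, $
waste_heat_opex = boiler_opex * (1 - 0.83)    # 83% reduction, from the abstract

annual_savings = boiler_opex - waste_heat_opex
print(f"Simple payback: {capex_extra / annual_savings:.1f} years")  # ~5.0 years
```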

  4. Large Scale Production of Densified Hydrogen Using Integrated Refrigeration and Storage

    Science.gov (United States)

    Notardonato, William U.; Swanger, Adam Michael; Jumper, Kevin M.; Fesmire, James E.; Tomsik, Thomas M.; Johnson, Wesley L.

    2017-01-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage (IRAS) technology at NASA Kennedy Space Center led to the production of large quantities of solid densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels, and LH2 temperatures were measured by twenty silicon diode temperature sensors recorded over approximately one month of testing at two different fill levels (33% and 67%). System energy balances and solid mass fractions are calculated. Experimental data reveal that hydrogen temperatures dropped well below the triple point during testing (by up to 1 K) and were continuing to trend downward prior to system shutdown. Sub-triple-point temperatures were seen to evolve in a time-dependent manner along the length of the horizontal, cylindrical vessel. This phenomenon, observed at both fill levels, is described and explained in detail herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  5. Large Scale Integration of Renewable Power Sources into the Vietnamese Power System

    Science.gov (United States)

    Kies, Alexander; Schyska, Bruno; Thanh Viet, Dinh; von Bremen, Lueder; Heinemann, Detlev; Schramm, Stefan

    2017-04-01

    The Vietnamese power system is expected to expand considerably in the upcoming decades. Installed power capacities are projected to grow from 39 GW in 2015 to 129.5 GW by 2030. Installed wind power capacities are expected to grow to 6 GW (0.8 GW in 2015) and solar power capacities to 12 GW (0.85 GW in 2015). This goes hand in hand with an increase of renewable penetration in the power mix from 1.3% from wind and photovoltaics (PV) in 2015 to 5.4% by 2030. The overall potential for wind power in Vietnam is estimated to be around 24 GW. Moreover, the up-scaling of renewable energy sources was formulated as one of the prioritized targets of the Vietnamese government in the National Power Development Plan VII. In this work, we investigate the transition of the Vietnamese power system towards high shares of renewables. For this purpose, we jointly optimise the expansion of renewable generation facilities for wind and PV, and the transmission grid, within renewable build-up pathways until 2030 and beyond. To simulate the Vietnamese power system and its generation from renewable sources, we use highly spatially and temporally resolved historical weather and load data and the open source modelling toolbox Python for Power System Analysis (PyPSA). We show that the highest potential of renewable generation for wind and PV is observed in southern Vietnam and discuss the resulting need for transmission grid extensions depending on the optimal pathway. Furthermore, we show that the smoothing effect of wind power has considerable beneficial effects and that the Vietnamese hydro power potential can be used efficiently to provide balancing opportunities. This work is part of the R&D Project "Analysis of the Large Scale Integration of Renewable Power into the Future Vietnamese Power System" (GIZ, 2016-2018).
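
    PyPSA models of this kind follow a common pattern: build a network, attach loads and extendable generators with time-varying capacity factors, and run a linear optimal power flow. The single-bus sketch below only illustrates that pattern; the capacity factors, costs and topology are invented, and a real study would use many buses, lines and a full weather year. (Recent PyPSA versions expose the solve step as `n.optimize()`; older ones use `n.lopf()`.)

```python
import pandas as pd
import pypsa

snapshots = pd.date_range("2015-01-01", periods=24, freq="h")
wind_cf = pd.Series([0.3 + 0.2 * (h % 2) for h in range(24)], index=snapshots)
solar_cf = pd.Series([max(0.0, 1 - (h - 12) ** 2 / 36) for h in range(24)],
                     index=snapshots)
load_mw = pd.Series(5000.0, index=snapshots)

n = pypsa.Network()
n.set_snapshots(snapshots)
n.add("Bus", "VN")
n.add("Load", "demand", bus="VN", p_set=load_mw)
n.add("Generator", "wind", bus="VN", p_nom_extendable=True,
      capital_cost=1200.0, marginal_cost=0.0, p_max_pu=wind_cf)
n.add("Generator", "solar", bus="VN", p_nom_extendable=True,
      capital_cost=800.0, marginal_cost=0.0, p_max_pu=solar_cf)
n.add("Generator", "gas", bus="VN", p_nom_extendable=True,
      capital_cost=500.0, marginal_cost=50.0)

n.optimize()                      # joint capacity + dispatch optimization
print(n.generators.p_nom_opt)     # optimal installed capacities, MW
```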

  6. Across Space and Time: Social Responses to Large-Scale Biophysical Systems

    Science.gov (United States)

    Macmynowski, Dena P.

    2007-06-01

    The conceptual rubric of ecosystem management has been widely discussed and deliberated in conservation biology, environmental policy, and land/resource management. In this paper, I argue that two critical aspects of the ecosystem management concept require greater attention in policy and practice. First, although emphasis has been placed on the “space” of systems, the “time”—or rates of change—associated with biophysical and social systems has received much less consideration. Second, discussions of ecosystem management have often neglected the temporal disconnects between changes in biophysical systems and the response of social systems to management issues and challenges. The empirical basis of these points is a case study of the “Crown of the Continent Ecosystem,” an international transboundary area of the Rocky Mountains that surrounds Glacier National Park (USA) and Waterton Lakes National Park (Canada). This project assessed the experiences and perspectives of 1) middle- and upper-level government managers responsible for interjurisdictional cooperation, and 2) environmental nongovernment organizations with an international focus. I identify and describe 10 key challenges to increasing the extent and intensity of transboundary cooperation in land/resource management policy and practice. These issues are discussed in terms of their political, institutional, cultural, information-based, and perceptual elements. Analytic techniques include a combination of environmental history, semistructured interviews with 48 actors, and text analysis in a systematic qualitative framework. The central conclusion of this work is that the rates of response of human social systems must be better integrated with the rates of ecological change. This challenge is equal to or greater than the well-recognized need to adapt the spatial scale of human institutions to large-scale ecosystem processes and transboundary wildlife.

  7. Multi-Agent System Supporting Automated Large-Scale Photometric Computations

    Directory of Open Access Journals (Sweden)

    Adam Sȩdziwy

    2016-02-01

    Full Text Available The technologies related to green energy, smart cities and similar areas that have been dynamically developed in recent years frequently face problems of a computational nature rather than a technological one. One example is the difficulty of accurately predicting the weather conditions for PV farms or wind turbines. Another group of issues is related to the complexity of the computations required to obtain an optimal setup of a solution being designed. In this article, we present a case representing the latter group of problems, namely designing large-scale power-saving lighting installations. The term "large-scale" refers to an entire city area, containing tens of thousands of luminaires. Although a simple power reduction for a single street, giving limited savings, is relatively easy, the task becomes infeasible when it covers thousands of luminaires described by precise coordinates (instead of simplified layouts). To overcome this critical issue, we propose introducing a formal representation of the computing problem and applying a multi-agent system to perform design-related computations in parallel. An important measure introduced in the article to indicate optimization progress is entropy. It also allows for terminating the optimization when the solution is satisfactory. The article contains the results of real-life calculations made with the help of the presented approach.
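
    The entropy measure can be read as "how spread out are the agents' current solutions". A minimal sketch of that idea, assuming a simple histogram over solution costs and an arbitrary convergence threshold (the paper's actual formulation may differ):

```python
import math
from collections import Counter

def population_entropy(costs, bin_width=1.0):
    """Shannon entropy (bits) of the histogram of solution costs."""
    bins = Counter(int(c / bin_width) for c in costs)
    total = sum(bins.values())
    return -sum((k / total) * math.log2(k / total) for k in bins.values())

costs = [101.2, 100.9, 101.1, 101.0, 100.8]   # hypothetical agents' best costs
h = population_entropy(costs)
if h < 1.0:                                   # assumed convergence threshold
    print(f"entropy {h:.2f} bits: population converged, stop optimizing")
```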

  8. Polynomial expansion of the precoder for power minimization in large-scale MIMO systems

    KAUST Repository

    Sifaou, Houssem

    2016-07-26

    This work focuses on the downlink of a single-cell large-scale MIMO system in which the base station, equipped with M antennas, serves K single-antenna users. In particular, we are interested in reducing the implementation complexity of the optimal linear precoder (OLP) that minimizes the total power consumption while ensuring target user rates. As with most precoding schemes, a major difficulty in implementing OLP is that it requires fast inversions of large matrices at every new channel realization. To overcome this issue, we aim at designing a linear precoding scheme that provides the same performance as OLP but with lower complexity. This is achieved by applying the truncated polynomial expansion (TPE) concept on a per-user basis. To get a further leap in complexity reduction and allow for closed-form expressions of the per-user weighting coefficients, we resort to the asymptotic regime in which M and K grow large with a bounded ratio. Numerical results show that the proposed TPE precoding scheme achieves the same performance as OLP with significantly lower implementation complexity. © 2016 IEEE.
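
    The core TPE trick is to replace a matrix inverse with a short matrix polynomial (a truncated Neumann series), so only matrix products are needed per channel realization. The sketch below demonstrates it on a plain regularized zero-forcing precoder rather than the paper's per-user weighted OLP, and it computes the eigenvalue extremes explicitly only to pick a safe scaling; the paper instead derives the weighting coefficients in closed form in the large-(M, K) regime.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K, J = 64, 8, 6                        # antennas, users, truncation order
H = (rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M))) / np.sqrt(2)

B = H @ H.conj().T + K * np.eye(K)        # K x K regularized channel matrix
lam = np.linalg.eigvalsh(B)
alpha = 2.0 / (lam.min() + lam.max())     # classic scaling so the series converges

W_exact = H.conj().T @ np.linalg.inv(B)   # exact RZF direction: H^H B^{-1}

# TPE: B^{-1} ~ alpha * sum_{l=0}^{J-1} (I - alpha*B)^l
S = np.zeros((K, K), dtype=complex)
X = np.eye(K, dtype=complex)
for _ in range(J):
    S += X
    X = X @ (np.eye(K) - alpha * B)
W_tpe = H.conj().T @ (alpha * S)

err = np.linalg.norm(W_tpe - W_exact) / np.linalg.norm(W_exact)
print(f"relative error at order J={J}: {err:.2e}")   # shrinks as J grows
```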

  9. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual mapping and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetry methods (such as low-altitude aerial mapping) are an economical and effective way to map wide, regular areas at large scale but do not work well in small areas due to the high cost in manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large scale topographic maps, based on a new Chinese vehicle-borne LIDAR system. It studied how to use this new measurement technology to map large scale topographic maps. After field data capture, maps can be produced in the office from the LIDAR data (point cloud) using software programmed by ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large scale topographic maps that is highly efficient and accurate compared to traditional methods

  10. Linking genes to ecosystem trace gas fluxes in a large-scale model system

    Science.gov (United States)

    Meredith, L. K.; Cueva, A.; Volkmann, T. H. M.; Sengupta, A.; Troch, P. A.

    2017-12-01

    Soil microorganisms mediate biogeochemical cycles through biosphere-atmosphere gas exchange with significant impact on atmospheric trace gas composition. Improving process-based understanding of these microbial populations and linking their genomic potential to the ecosystem-scale is a challenge, particularly in soil systems, which are heterogeneous in biodiversity, chemistry, and structure. In oligotrophic systems, such as the Landscape Evolution Observatory (LEO) at Biosphere 2, atmospheric trace gas scavenging may supply critical metabolic needs to microbial communities, thereby promoting tight linkages between microbial genomics and trace gas utilization. This large-scale model system of three initially homogenous and highly instrumented hillslopes facilitates high temporal resolution characterization of subsurface trace gas fluxes at hundreds of sampling points, making LEO an ideal location to study microbe-mediated trace gas fluxes from the gene to ecosystem scales. Specifically, we focus on the metabolism of ubiquitous atmospheric reduced trace gases hydrogen (H2), carbon monoxide (CO), and methane (CH4), which may have wide-reaching impacts on microbial community establishment, survival, and function. Additionally, microbial activity on LEO may facilitate weathering of the basalt matrix, which can be studied with trace gas measurements of carbonyl sulfide (COS/OCS) and carbon dioxide (O-isotopes in CO2), and presents an additional opportunity for gene to ecosystem study. This work will present initial measurements of this suite of trace gases to characterize soil microbial metabolic activity, as well as links between spatial and temporal variability of microbe-mediated trace gas fluxes in LEO and their relation to genomic-based characterization of microbial community structure (phylogenetic amplicons) and genetic potential (metagenomics). Results from the LEO model system will help build understanding of the importance of atmospheric inputs to

  11. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    Science.gov (United States)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. Emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study the reliability of surface supply in the canal systems of Punjab, Pakistan, using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high-frequency (10-minute) flow measurements over one year. The reliability quantification is based on comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspectives of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when data are examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (which are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
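
    A minimal version of a delivery-reliability metric of this kind compares each period's actual delivery against its entitlement. The numbers and the 10% shortfall tolerance below are invented for illustration; the study works with a decade of 10-daily records.

```python
import numpy as np

entitlement = np.array([120.0, 120.0, 140.0, 140.0, 160.0, 160.0])  # per period
actual      = np.array([110.0, 125.0, 130.0, 100.0, 158.0, 165.0])

ratio = actual / entitlement
reliability = np.mean(ratio >= 0.9)   # share of periods within 10% of entitlement
print(f"mean delivery ratio {ratio.mean():.2f}, reliability {reliability:.2f}")
```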

  12. A Self-Organizing Spatial Clustering Approach to Support Large-Scale Network RTK Systems

    Directory of Open Access Journals (Sweden)

    Lili Shen

    2018-06-01

    Full Text Available The network real-time kinematic (RTK) technique can provide centimeter-level real-time positioning solutions and plays a key role in geo-spatial infrastructure. With ever-increasing popularity, network RTK systems will face issues in supporting large numbers of concurrent users. In the past, high-precision positioning services were oriented towards professionals and only supported a few concurrent users. Currently, precise positioning provides a spatial foundation for artificial intelligence (AI), and countless smart devices (autonomous cars, unmanned aerial vehicles (UAVs), robotic equipment, etc.) require precise positioning services. Therefore, the development of approaches to support large-scale network RTK systems is urgent. In this study, we propose a self-organizing spatial clustering (SOSC) approach which automatically clusters online users to reduce the computational load on the network RTK server side. The experimental results indicate that both the SOSC algorithm and the grid algorithm can reduce the computational load efficiently, while the SOSC algorithm gives a more elastic and adaptive clustering solution for different datasets. The SOSC algorithm determines the cluster number and the mean distance to cluster center (MDTCC) according to the data set, while the grid approaches are all predefined. The side effects of the clustering algorithms on the user side are analyzed with real global navigation satellite system (GNSS) data sets. The experimental results indicate that 10 km can be safely used as the cluster radius threshold for the SOSC algorithm without significantly reducing the positioning precision and reliability on the user side.
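
    The core of any such scheme is radius-constrained online clustering: a new user joins the nearest existing cluster if it lies within the radius threshold, otherwise it seeds a new cluster. The sketch below shows only that core idea with the 10 km threshold from the abstract; the actual SOSC update rules (self-organizing center adjustment, MDTCC tracking) are more elaborate, and the coordinates are hypothetical.

```python
import math

RADIUS_KM = 10.0   # safe cluster radius threshold, from the abstract

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def assign(user, centers):
    """Join the nearest center within RADIUS_KM, else open a new cluster."""
    if centers:
        best = min(range(len(centers)), key=lambda i: haversine_km(user, centers[i]))
        if haversine_km(user, centers[best]) <= RADIUS_KM:
            return best
    centers.append(user)
    return len(centers) - 1

centers = []
for u in [(31.50, 74.30), (31.52, 74.35), (31.70, 74.90)]:  # hypothetical users
    print("user", u, "-> cluster", assign(u, centers))
```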

  13. Combined Cycle Engine Large-Scale Inlet for Mode Transition Experiments: System Identification Rack Hardware Design

    Science.gov (United States)

    Thomas, Randy; Stueber, Thomas J.

    2013-01-01

    The System Identification (SysID) Rack is a real-time hardware-in-the-loop data acquisition (DAQ) and control instrument rack that was designed and built to support inlet testing in the NASA Glenn Research Center 10- by 10-Foot Supersonic Wind Tunnel. This instrument rack is used to support experiments on the Combined-Cycle Engine Large-Scale Inlet for Mode Transition Experiment (CCE-LIMX). The CCE-LIMX is a testbed for an integrated dual flow-path inlet configuration with the two flow paths in an over-and-under arrangement such that the high-speed flow path is located below the low-speed flow path. The CCE-LIMX includes multiple actuators that are designed to redirect airflow from one flow path to the other; this action is referred to as "inlet mode transition." Multiple phases of experiments have been planned to support research that investigates inlet mode transition: inlet characterization (Phase-1) and system identification (Phase-2). The SysID Rack hardware design met the following requirements to support Phase-1 and Phase-2 experiments: safely and effectively move multiple actuators individually or synchronously; sample and save effector control and position sensor feedback signals; automate control of actuator positioning based on a mode transition schedule; sample and save pressure sensor signals; and perform DAQ and control processes operating at 2.5 kHz. This document describes the hardware components used to build the SysID Rack including their function, specifications, and system interface. Furthermore, provided in this document are a SysID Rack effector signal list (signal flow); a system identification experiment setup; illustrations of a typical SysID Rack experiment; and a SysID Rack performance overview for Phase-1 and Phase-2 experiments. The SysID Rack described in this document was a useful tool to meet the project objectives.

  14. Do large-scale hospital- and system-wide interventions improve patient outcomes: a systematic review.

    Science.gov (United States)

    Clay-Williams, Robyn; Nosrati, Hadis; Cunningham, Frances C; Hillman, Kenneth; Braithwaite, Jeffrey

    2014-09-03

    While health care services are beginning to implement system-wide patient safety interventions, evidence on the efficacy of these interventions is sparse. We know that uptake can be variable, but we do not know the factors that affect uptake or how the interventions establish change and, in particular, whether they influence patient outcomes. We conducted a systematic review to identify how organisational and cultural factors mediate or are mediated by hospital-wide interventions, and to assess the effects of those factors on patient outcomes. A systematic review was conducted and reported in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Database searches were conducted using MEDLINE from 1946, CINAHL from 1991, EMBASE from 1947, Web of Science from 1934, PsycINFO from 1967, and Global Health from 1910 to September 2012. The Lancet, JAMA, BMJ, BMJ Quality and Safety, The New England Journal of Medicine and Implementation Science were also hand searched for relevant studies published over the last 5 years. Eligible studies were required to focus on organisational determinants of hospital- and system-wide interventions, and to provide patient outcome data before and after implementation of the intervention. Empirical, peer-reviewed studies reporting randomised and non-randomised controlled trials, observational, and controlled before and after studies were included in the review. Six studies met the inclusion criteria. Improved outcomes were observed for studies where outcomes were measured at least two years after the intervention. Associations between organisational factors, intervention success and patient outcomes were undetermined: organisational culture and patient outcomes were rarely measured together, and measures for culture and outcome were not standardised. Common findings show the difficulty of introducing large-scale interventions, and that effective leadership and clinical champions, adequate

  15. Solution approach for a large scale personnel transport system for a large company in Latin America

    Energy Technology Data Exchange (ETDEWEB)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-07-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP model considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances was increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between both solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of the large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  16. Solution approach for a large scale personnel transport system for a large company in Latin America

    International Nuclear Information System (INIS)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-01-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP model considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances was increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between both solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of the large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  17. Solution approach for a large scale personnel transport system for a large company in Latin America

    Directory of Open Access Journals (Sweden)

    Eduardo-Arturo Garzón-Garnica

    2017-10-01

    Full Text Available Purpose: The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP model considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on sets of service points of different sizes. As the size of the instances was increased, the performance of the heuristic method was assessed against the results of an exact algorithm, with the results remaining very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm still provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between both solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of the large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.
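
    A simple construction heuristic of the kind such studies start from: seed each route at the stop farthest from the destination center (as the abstract describes), then greedily append the nearest unvisited stop until the vehicle is full. This is a generic nearest-neighbour sketch with invented coordinates and capacity, not the authors' algorithm.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_routes(stops, depot, capacity):
    """Greedy route construction: farthest seed, then nearest-neighbour fill."""
    routes, remaining = [], set(stops)
    while remaining:
        current = max(remaining, key=lambda s: dist(s, depot))  # farthest seed
        route = [current]
        remaining.remove(current)
        while len(route) < capacity and remaining:
            current = min(remaining, key=lambda s: dist(s, current))
            route.append(current)
            remaining.remove(current)
        routes.append(route + [depot])  # every route ends at the center
    return routes

stops = [(2, 9), (3, 8), (8, 2), (7, 3), (5, 5)]
for r in build_routes(stops, depot=(0, 0), capacity=3):
    print(r)
```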

  18. Comparing centralised and decentralised anaerobic digestion of stillage from a large-scale bioethanol plant to animal feed production.

    Science.gov (United States)

    Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R

    2008-01-01

    A comparison of stillage treatment options for large-scale bioethanol plants was based on data from an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production--the state-of-the-art technology at the plant--was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area and 600,000 m3 of storage volume would be needed. Secondly, membrane purification of the digestate, consisting of a decanter, microfiltration, and reverse osmosis, was investigated. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three stillage treatment options, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.

  19. Bioprocessing strategies for the large-scale production of human mesenchymal stem cells: a review.

    Science.gov (United States)

    Panchalingam, Krishna M; Jung, Sunghoon; Rosenberg, Lawrence; Behie, Leo A

    2015-11-23

    Human mesenchymal stem cells (hMSCs), also called mesenchymal stromal cells, have been of great interest in regenerative medicine applications because of not only their differentiation potential but also their ability to secrete bioactive factors that can modulate the immune system and promote tissue repair. This potential has initiated many early-phase clinical studies for the treatment of various diseases, disorders, and injuries by using either hMSCs themselves or their secreted products. Currently, hMSCs for clinical use are generated through conventional static adherent cultures in the presence of fetal bovine serum or human-sourced supplements. However, these methods suffer from variable culture conditions (i.e., ill-defined medium components and heterogeneous culture environment) and thus are not ideal procedures to meet the expected future demand of quality-assured hMSCs for human therapeutic use. Optimizing a bioprocess to generate hMSCs or their secreted products (or both) promises to improve the efficacy as well as safety of this stem cell therapy. In this review, current media and methods for hMSC culture are outlined and bioprocess development strategies discussed.

  20. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other targets critical for process technology and implementation at process scale, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Optimizing in vitro large scale production of giant reed (Arundo donax L.) by liquid medium culture

    International Nuclear Information System (INIS)

    Cavallaro, Valeria; Patanè, Cristina; Cosentino, Salvatore L.; Di Silvestro, Isabella; Copani, Venera

    2014-01-01

    Tissue culture methods offer the potential for large-scale propagation of giant reed (Arundo donax L.), a promising energy biomass crop. In previous trials, giant reed proved particularly suitable for in vitro culture. In this paper, with the final goal of enhancing the efficiency of the in vitro production process and reducing costs, the influence of four different culture media (agar- or gellan-gum-solidified medium, and liquid medium either in a temporary immersion system-RITA ® or in a stationary state) on in vitro shoot proliferation of giant reed was evaluated. Giant reed exhibited a particular sensitivity to gelling agents during the phase of secondary shoot formation. Gellan gum, as compared to agar, improved the efficiency of in vitro culture, giving more shoots with higher mean fresh and dry weight. Moreover, cultivation of this species in a liquid medium, under temporary immersion conditions or in a stationary state, was as effective as, and cheaper than, cultivation on a gellan gum medium. Increasing 6-benzylaminopurine (BA) up to 4 mg l −1 also resulted in a further enhancement of secondary shoot proliferation. The good adaptability of this species to liquid medium and the high multiplication rates observed indicate the possibility of obtaining from a single node at least 1200 plantlets every six multiplication cycles (about 6 months), a number 100-fold higher than that obtained yearly per plant by conventional methods of vegetative multiplication. In the open field, micropropagated plantlets gave a higher number of surviving plants, secondary stems and above-ground biomass compared to rhizome-derived ones. - Highlights: • In vitro propagation offers the potential for large-scale propagation of giant reed. • The success of an in vitro protocol depends on the rate and mode of shoot proliferation. • Substituting liquid media for solid ones may decrease propagation costs in Arundo donax. • Giant reed showed good proliferation rates in
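
    As a back-of-envelope reading of the quoted rate (our arithmetic, not a formula from the paper): if each of the six multiplication cycles multiplies the shoot count by a factor r, then

```latex
\[
  r^{6} \ge 1200
  \quad\Longrightarrow\quad
  r \ge 1200^{1/6} \approx 3.3,
\]
```

    i.e. each roughly monthly cycle must at least triple the number of shoots.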

  2. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    Science.gov (United States)

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high-performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high-resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image-based computer-aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
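
    The partition-merge idea can be illustrated outside MapReduce: hash objects into spatial tiles (the map/partition step), then join only within each tile and its neighbours (the merge step). The toy below uses a fixed grid and a distance-based join predicate; real systems build per-tile indexes such as R-trees and handle objects crossing tile boundaries, and the tile size here is an arbitrary assumption.

```python
from collections import defaultdict

TILE = 100.0   # tile edge length, assumed

def tile_of(x, y):
    return (int(x // TILE), int(y // TILE))

def spatial_join(points_a, points_b, eps=5.0):
    """Return pairs (a, b) with |a - b| <= eps, probing tile neighbourhoods."""
    buckets = defaultdict(list)
    for b in points_b:                      # partition step
        buckets[tile_of(*b)].append(b)
    pairs = []
    for a in points_a:                      # merge step, per tile neighbourhood
        tx, ty = tile_of(*a)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for b in buckets.get((tx + dx, ty + dy), []):
                    if (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= eps ** 2:
                        pairs.append((a, b))
    return pairs

print(spatial_join([(10, 10), (250, 40)], [(12, 11), (400, 400)]))
```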

  3. Operational experience with large scale biogas production at the Promest manure processing plant in Helmond, The Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig breeding farms. The Dutch government has set a three-way policy to reduce this manure excess: 1. conversion of animal fodder into a product with fewer and better ingestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large scale processing plants. The first large scale plant for the processing of liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to prove at short notice whether large scale manure processing could contribute to solving the problem of the manure surplus in The Netherlands. This steering committee is a cooperative venture of the national and provincial governments and the agricultural industry. (au)

  4. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach; first, we estimate the physical land suitability and the necessary inputs to reach certain amounts of yields. Subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we introduce also certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for the economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to be able to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments

  5. Computational Modelling of Large Scale Phage Production Using a Two-Stage Batch Process

    Directory of Open Access Journals (Sweden)

    Konrad Krysiak-Baltyn

    2018-04-01

    Full Text Available Cost effective and scalable methods for phage production are required to meet an increasing demand for phage, as an alternative to antibiotics. Computational models can assist the optimization of such production processes. A model is developed here that can simulate the dynamics of phage population growth and production in a two-stage, self-cycling process. The model incorporates variable infection parameters as a function of bacterial growth rate and employs ordinary differential equations, allowing application to a setup with multiple reactors. The model provides simple cost estimates as a function of key operational parameters including substrate concentration, feed volume and cycling times. For the phage and bacteria pairing examined, costs and productivity varied by three orders of magnitude, with the lowest cost found to be most sensitive to the influent substrate concentration and low level setting in the first vessel. An example case study of phage production is also presented, showing how parameter values affect the production costs and estimating production times. The approach presented is flexible and can be used to optimize phage production at laboratory or factory scale by minimizing costs or maximizing productivity.
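
    The dynamics such a model captures can be sketched with a small ODE system: substrate-limited (Monod) growth of uninfected bacteria, mass-action infection by phage, and lysis of infected cells releasing a burst of new phage. All parameter values and units below are illustrative placeholders; the paper's actual model additionally varies infection parameters with growth rate and handles multiple cycled vessels.

```python
from scipy.integrate import solve_ivp

mu_max, Ks, Y = 0.8, 0.5, 5e8         # 1/h, g/L, cells per g substrate (assumed)
k_ads, burst, k_lys = 1e-9, 100, 1.0  # adsorption rate, burst size, lysis rate

def rhs(t, y):
    S, B, I, P = y                    # substrate, uninfected, infected, phage
    mu = mu_max * S / (Ks + S)        # Monod growth rate
    return [-mu * B / Y,                          # substrate consumption
            mu * B - k_ads * B * P,               # growth minus new infections
            k_ads * B * P - k_lys * I,            # infected cells awaiting lysis
            burst * k_lys * I - k_ads * B * P]    # phage release minus adsorption

sol = solve_ivp(rhs, (0.0, 12.0), [5.0, 1e6, 0.0, 1e4], max_step=0.01)
print(f"final phage titre ~ {sol.y[3, -1]:.2e} per mL")
```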

  6. Large-Scale, Continuous-Flow Production of Stressed Biomass (Desulfovibrio vulgaris Hildenborough)

    Energy Technology Data Exchange (ETDEWEB)

    Geller, Jil T.; Borglin, Sharon E.; Fortney, Julian L.; Lam, Bonita R.; Hazen, Terry C.; Biggin, Mark D.

    2010-05-01

    The Protein Complex Analysis Project (PCAP, http://pcap.lbl.gov/) focuses on high-throughput analysis of microbial protein complexes in the anaerobic, sulfate-reducing organism Desulfovibrio vulgaris Hildenborough (DvH). Interest in DvH as a model organism for bioremediation of contaminated groundwater sites arises from its ability to reduce heavy metals. D. vulgaris has been isolated from contaminated groundwater at sites in the DOE complex. To understand the effect of environmental changes on the organism, midlog-phase cultures are exposed to nitrate and salt stresses (at the minimum inhibitory concentration, which reduces growth rates by 50 percent) and compared to control cultures at midlog and stationary phases. Large volumes of culture of consistent quality (up to 100 liters) are needed because of the relatively low cell density of DvH cultures (one order of magnitude lower than E. coli, for example) and PCAP's challenge to characterize low-abundance membrane proteins. Cultures are grown in continuous-flow stirred tank reactors (CFSTRs) to produce consistent cell densities. Stressor is added to the outflow from the CFSTR, and the mixture is pumped through a plug flow reactor (PFR) to provide a stress exposure time of 2 hours. Effluent is chilled and held in large carboys until it is centrifuged. A variety of analyses -- including metabolites, total proteins, cell density and phospholipid fatty acids -- track culture consistency within a production run, and differences due to stress exposure and growth phase for the different conditions used. With our system we are able to produce the requisite 100 L of culture for a given condition within a week.

  7. Finite-Time Stability of Large-Scale Systems with Interval Time-Varying Delay in Interconnection

    Directory of Open Access Journals (Sweden)

    T. La-inchua

    2017-01-01

    Full Text Available We investigate finite-time stability of a class of nonlinear large-scale systems with interval time-varying delays in the interconnections. The time-delay functions are continuous but not necessarily differentiable. Based on Lyapunov stability theory and a new integral bounding technique, finite-time stability criteria for large-scale systems with interval time-varying delays in the interconnections are derived. The criteria are delay-dependent and are given in terms of linear matrix inequalities, which can be solved by various available algorithms. Numerical examples are given to illustrate the effectiveness of the proposed method.
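
    For orientation, the standard notion behind such criteria (stated in common notation, which may differ in detail from the paper's): a delayed system is finite-time stable with respect to given bounds (c1, c2, T), 0 < c1 < c2, if

```latex
\[
  \sup_{-\tau_{\max} \le s \le 0} \| x(s) \|^{2} \le c_{1}
  \quad\Longrightarrow\quad
  \| x(t) \|^{2} \le c_{2}
  \qquad \forall\, t \in [0, T],
\]
```

    i.e. trajectories starting inside the smaller bound remain inside the larger bound over the fixed horizon, without requiring asymptotic convergence.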

  8. Large-Scale Production of Fuel and Feed from Marine Microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Huntley, Mark [Cornell Univ., Ithaca, NY (United States)

    2015-09-30

    In summary, this Consortium has demonstrated a fully integrated process for the production of biofuels and high-value nutritional bioproducts at pre-commercial scale. We have achieved unprecedented yields of algal oil, and converted the oil to viable fuels. We have demonstrated the potential value of the residual product as a viable feed ingredient for many important animals in the global food supply.

  9. Ruling by canal: Governance and system-level design characteristics of large scale irrigation infrastructure in India and Uzbekistan

    NARCIS (Netherlands)

    Mollinga, P.; Veldwisch, G.J.A.

    2016-01-01

    This paper explores the relationship between governance regime and large-scale irrigation system design by investigating three cases: 1) protective irrigation design in post-independent South India; 2) canal irrigation system design in Khorezm Province, Uzbekistan, as implemented in the USSR period,

  10. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO 2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code were described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions for the atmosphere surrounding the fuel (oxidising and reducing), is simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory agreement between ELSA calculations and experimental measurements demonstrates the ability of the analytical models to describe fission product release under severe accident conditions

  11. Analysis of the electricity demand of Greece for optimal planning of a large-scale hybrid renewable energy system

    Science.gov (United States)

    Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos

    2015-04-01

    The Greek electricity system is examined for the period 2002-2014. The demand load data are analysed at various time scales (hourly, daily, seasonal and annual) and are related to the mean daily temperature and the gross domestic product (GDP) of Greece for the same time period. The energy demand prediction, a product of the Greek Independent Power Transmission Operator, is also compared with the demand load. Interesting results are derived about the change in the electricity demand pattern after the year 2010, a change related to the decrease of the GDP during the period 2010-2014. The results of the analysis will be used in the development of an energy forecasting system which will be part of a framework for optimal planning of a large-scale hybrid renewable energy system in which hydropower plays the dominant role. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
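
    One way to relate load to temperature and GDP, in the spirit of the analysis described, is a least-squares fit on heating and cooling degree-days plus an activity index. The sketch below runs on synthetic stand-in data; the degree-day base temperatures and all coefficients are assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 365
temp = 18 + 10 * np.sin(np.linspace(0, 2 * np.pi, n)) + rng.normal(0, 2, n)
gdp_index = np.linspace(1.00, 0.85, n)   # steep synthetic decline, for conditioning

hdd = np.maximum(18 - temp, 0)           # heating degree-days (base 18 C, assumed)
cdd = np.maximum(temp - 24, 0)           # cooling degree-days (base 24 C, assumed)
load = 120 + 2.5 * hdd + 4.0 * cdd + 80 * gdp_index + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), hdd, cdd, gdp_index])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
print("intercept, HDD, CDD, GDP coefficients:", np.round(coef, 2))
```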

  12. Large-scale production of UO2 kernels by sol–gel process at INET

    International Nuclear Information System (INIS)

    Hao, Shaochang; Ma, Jingtao; Zhao, Xingyu; Wang, Yang; Zhou, Xiangwen; Deng, Changsheng

    2014-01-01

    In order to supply elements (300,000 elements per year) for the Chinese pebble bed modular high temperature gas cooled reactor (HTR-PM), it is necessary to scale up the production of UO 2 kernels to 3–6 kgU per batch. The sol–gel process for the preparation of UO 2 kernels has been improved and optimized at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, PR China, and a complete set of facilities was designed and constructed based on the process. This report briefly describes the main steps of the process, the key equipment and the production capacity of each step. Six batches of kernels for scale-up verification and four batches of kernels for fuel elements for in-pile irradiation tests have been successfully produced. The quality of the produced kernels meets the design requirements. The production capacity of the process reaches 3–6 kgU per batch

  13. Grid matching of large-scale wind energy conversion systems, alone and in tandem with large-scale photovoltaic systems: An Israeli case study

    International Nuclear Information System (INIS)

    Solomon, A.A.; Faiman, D.; Meron, G.

    2010-01-01

    This paper presents a grid matching analysis of wind energy conversion systems (WECSs) and photovoltaic (PV)-WECS hybrid systems. The study was carried out using hourly load data of the Israel Electric Corporation (IEC) for the year 2006 and the corresponding simulated hourly performance of large PV and WECS plants in the Negev Desert. Our major objective was to compare the grid-matching capabilities of wind with those of our previously published PV results, and to assess the extent to which the combined employment of WECS and PV can improve the grid matching capability of either technology when used on its own. We find that, due to the differences in diurnal and seasonal output profiles of WECS and PV, their tandem employment significantly improves grid penetration compared to their use individually.

  14. A large scale software system for simulation and design optimization of mechanical systems

    Science.gov (United States)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  15. How Close We Are to Achieving Commercially Viable Large-Scale Photobiological Hydrogen Production by Cyanobacteria: A Review of the Biological Aspects

    Science.gov (United States)

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2015-01-01

    Photobiological production of H2 by cyanobacteria is considered to be an ideal source of renewable energy because the inputs, water and sunlight, are abundant. The products of photobiological systems are H2 and O2; the H2 can be used as the energy source of fuel cells, etc., which generate electricity at high efficiencies and minimal pollution, as the waste product is H2O. Overall, production of commercially viable algal fuels in any form, including biomass and biodiesel, is challenging, and the very few systems that are operational have yet to be evaluated. In this paper we will: briefly review some of the necessary conditions for economical production, summarize the reports of photobiological H2 production by cyanobacteria, present our schemes for future production, and discuss the necessity for further progress in the research needed to achieve commercially viable large-scale H2 production. PMID:25793279

  16. Authentication of Fish Products by Large-Scale Comparison of Tandem Mass Spectra

    DEFF Research Database (Denmark)

    Wulff, Tune; Nielsen, Michael Engelbrecht; Deelder, André M.

    2013-01-01

    Authentication of food is a major concern worldwide to ensure that food products are correctly labeled in terms of which animals are actually processed for consumption. Normally authentication is based on species recognition by comparison of selected sequences of DNA or protein. We here present a new robust, proteome-wide tandem mass spectrometry method for species recognition and food product authentication. The method does not use or require any genome sequences or selection of tandem mass spectra but uses all acquired data. The experimental steps were performed in a simple, standardized

  17. Economics of intermittent renewable energy sources: four essays on large-scale integration into European power systems

    International Nuclear Information System (INIS)

    Henriot, Arthur

    2014-01-01

    This thesis centres on issues of economic efficiency originating from the large-scale development of intermittent renewable energy sources (RES) in Europe. The flexible resources that are necessary to cope with their specificities (variability, low predictability, site specificity) are already known, but adequate signals are required to foster efficient operation of, and investment in, these resources. A first question is to what extent intermittent RES can remain out of the market at times when they are the main driver of investment and operation in power systems. A second question is whether the current market design is adapted to their specificities. These two questions are tackled in four distinct contributions. The first chapter is a critical literature review. This analysis introduces and confronts two (often implicit) paradigms for RES integration. It then identifies and discusses a set of evolutions required to develop a market design adapted to the large-scale development of RES, such as new definitions of the products exchanged and reorganisation of the sequence of electricity markets. In the second chapter, an analytical model is used to assess the potential of intra-day markets as a flexibility provider to intermittent RES with low production predictability. This study highlights and demonstrates how the potential of intra-day markets depends heavily on the evolution of the forecast errors. The third chapter focuses on the benefits of curtailing production from intermittent RES as a tool to smooth out their variability and reduce overall generation costs. Another analytical model is employed to anatomise the relationship between these benefits and a set of pivotal parameters. Special attention is also paid to the allocation of these benefits between the different stakeholders. In the fourth chapter, a numerical simulation is used to evaluate the ability of the European transmission system operators to tackle the investment wave required in order to

  18. Third generation design solar cell module LSA task 5, large scale production

    Science.gov (United States)

    1980-01-01

    A total of twelve (12) preproduction modules were constructed, tested, and delivered. A new concept for the frame assembly was designed and proven to be quite reliable. This frame design, as well as the rest of the assembly, was designed with future high-volume production and the use of automated equipment in mind.

  19. Large-scale production of PWO scintillation elements for CMS ECAL

    International Nuclear Information System (INIS)

    Annenkov, A.; Auffray, E.; Drobychev, G.; Korzhik, M.; Kostylev, V.; Kovalev, O.; Lecoq, P.; Ligoun, V.; Missevitch, O.; Zouevski, R.

    2005-01-01

    JSC Bogoroditsk Technical Chemical Plant, BTCP, has to date produced more than 20,000 lead tungstate scintillation elements for the electromagnetic calorimeter of the CMS Collaboration. Here we report on the status of the crystal production and results of the quality assurance program, which is performed by the Collaboration in cooperation with BTCP to keep crystal properties within specifications

  20. A roadmap for natural product discovery based on large-scale genomics and metabolomics

    Science.gov (United States)

    Actinobacteria encode a wealth of natural product biosynthetic gene clusters, whose systematic study is complicated by numerous repetitive motifs. By combining several metrics we developed a method for global classification of these gene clusters into families (GCFs) and analyzed the biosynthetic ca...

  1. Potential for large-scale uses for fission-product Xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-03-01

    Of all fission products in spent, low-enrichment-uranium power-reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the US, radioactive xenon isotopes approach the lowest limits of detection. The separation from accompanying radioactive 85Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of 85Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission-product krypton during fuel processing. Economic restraints assure that the bulk of this krypton will need to be separated from the much-more-voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for uses at probably very low cost. In contrast with many other fission products which have unique radioactive characteristics that make them useful as sources of heat, gamma and x-rays, and luminescence - as well as for medicinal diagnostics and therapeutics - fission-product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the 134Xe and 136Xe isotopes. Therefore, fission-product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent light illumination in place of krypton and in human anesthesia

  2. Economical evaluation of large-scale photovoltaic systems using Universal Generating Function techniques

    DEFF Research Database (Denmark)

    Ding, Yi; Shen, Weixiang; Levitin, Gregory

    2013-01-01

    Solar energy plays an important role in the global energy framework for the future. Compared with conventional generation systems using fossil fuels, the cost structure of photovoltaic (PV) systems is different: the capital cost is higher while the operation cost is negligible. Reliabilities of the PV ... The reliability models of solar panel arrays, PV inverters and energy production units (EPUs) are represented as the corresponding UGFs. The expected energy production models for different PV system configurations have also been developed. The expected unit cost of electricity has been calculated to provide ...
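
    To make the UGF composition concrete, here is a minimal sketch in Python. It assumes invented two-state distributions for a panel array and an inverter; none of the numbers, names or composition operators below are from the paper, which develops far more detailed models.

        from itertools import product

        def compose(ugf_a, ugf_b, op):
            """Combine two UGFs (dicts {performance: probability}) with operator op."""
            out = {}
            for (ga, pa), (gb, pb) in product(ugf_a.items(), ugf_b.items()):
                g = op(ga, gb)
                out[g] = out.get(g, 0.0) + pa * pb
            return out

        # Hypothetical PV panel array: 0 kW (failed) or 5 kW (working)
        panel = {0.0: 0.05, 5.0: 0.95}
        # Hypothetical inverter: passes through up to a 4 kW rating, or fails
        inverter = {0.0: 0.02, 4.0: 0.98}

        # Energy production unit (EPU) = panel feeding inverter: output limited by min()
        epu = compose(panel, inverter, min)
        # Two EPUs in parallel: outputs add
        system = compose(epu, epu, lambda a, b: a + b)

        expected_power = sum(g * p for g, p in system.items())
        print(expected_power)  # expected system output in kW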

  3. Optimization of Large-Scale Culture Conditions for the Production of Cordycepin with Cordyceps militaris by Liquid Static Culture

    Directory of Open Access Journals (Sweden)

    Chao Kang

    2014-01-01

    Full Text Available Cordycepin is one of the most important bioactive compounds produced by species of Cordyceps sensu lato, but it is hard to produce large amounts of this substance in industrial production. In this work, single factor design, Plackett-Burman design, and central composite design were employed to establish the key factors and identify optimal culture conditions which improved cordycepin production. Using these culture conditions, a maximum cordycepin production of 2008.48 mg/L was reached for a 700 mL working volume in 1000 mL glass jars, and the total content of cordycepin reached 1405.94 mg/bottle. This method provides an effective way of increasing cordycepin production at a large scale. The strategies used in this study could have wide application in other fermentation processes.
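
    The central composite design step above amounts to fitting a quadratic response surface and solving for its stationary point. A minimal sketch of that step in Python follows; the two factors and all data values are invented for illustration and are not from the study.

        import numpy as np

        # Hypothetical design points: two factors (e.g., peptone g/L, inoculum %)
        X = np.array([[10, 2], [10, 6], [20, 2], [20, 6], [15, 4],
                      [15, 4], [8, 4], [22, 4], [15, 1], [15, 7]], dtype=float)
        y = np.array([900, 1100, 1200, 1500, 1900,
                      1850, 1000, 1300, 950, 1200], dtype=float)  # cordycepin, mg/L

        x1, x2 = X[:, 0], X[:, 1]
        # Design matrix for a full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2
        A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Stationary point of the fitted surface: solve grad f = b + H x = 0
        b = coef[1:3]
        H = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
        optimum = np.linalg.solve(H, -b)
        print(optimum)  # candidate optimal factor settings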

  4. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

    Full Text Available In today's world, manufacturing industries have to sustain their development and continuity in an increasingly competitive environment by decreasing their costs. The first step in the lean production transformation process is to analyze the value-adding and non-value-adding activities. This study aims at applying the concepts of Value Stream Mapping (VSM) in a large-scaled tractor company in Sakarya. Waste and process time are identified by mapping the current state of the platform production line. A future state was proposed with improvements for elimination of waste and reduction of lead time, which went from 13.08 to 4.35 days. Analyses of the current and future states support the suggested improvements, and the cycle time of the platform production line was improved by 8%. The results showed that VSM is a good alternative in decision-making for change in the production process.
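
    The headline improvement can be checked directly from the reported figures; a one-line computation in Python, assuming both lead times are in days:

        before, after = 13.08, 4.35
        print(round(100 * (before - after) / before, 1))  # ~66.7% reduction in lead time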

  5. Titius-Bode law and the possibility of recent large-scale evolution in the solar system

    International Nuclear Information System (INIS)

    Nieto, M.M.

    1974-01-01

    Although it is by no means clear that the Titius-Bode law of planetary distances is indeed a "law" (even though there are enticing indications), it is proposed that if one assumes that the law is a "law" and that the planets obey it, then this argues against recent large-scale evolution in the solar system. Put another way: one can believe in the Titius-Bode law or in recent large-scale evolution or in neither of them. But it appears difficult to believe in both of them

  6. Study of the environmental impacts of large scale bioethanol production in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1991-01-01

    The report provides an analysis of the energy balance, the carbon dioxide balance, and other environmental effects. Four crops which might be used as bioethanol feedstock were considered. These were: wheat, sugar beet, sweet sorghum and Jerusalem artichoke. Given the current agricultural capabilities in Europe, wheat and sugar beet could be cultivated immediately for bioethanol production whilst sweet sorghum and Jerusalem artichoke represent crops which are under investigation as potential bioethanol feedstock in the longer term. (author).

  7. Study of the environmental impacts of large scale bioethanol production in Europe

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    The report provides an analysis of the energy balance, the carbon dioxide balance, and other environmental effects. Four crops which might be used as bioethanol feedstock were considered. These were: wheat, sugar beet, sweet sorghum and Jerusalem artichoke. Given the current agricultural capabilities in Europe, wheat and sugar beet could be cultivated immediately for bioethanol production whilst sweet sorghum and Jerusalem artichoke represent crops which are under investigation as potential bioethanol feedstock in the longer term. (author)

  8. Improved decomposition–coordination and discrete differential dynamic programming for optimization of large-scale hydropower system

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Ouyang, Shuo; Ding, Xiaoling; Chen, Lu

    2014-01-01

    Highlights: • Optimization of large-scale hydropower system in the Yangtze River basin. • Improved decomposition–coordination and discrete differential dynamic programming. • Generating initial solution randomly to reduce generation time. • Proposing relative coefficient for more power generation. • Proposing adaptive bias corridor technology to enhance convergence speed. - Abstract: With the construction of major hydro plants, more and more large-scale hydropower systems are taking shape gradually, which poses a challenge for optimizing these systems. Optimization of a large-scale hydropower system (OLHS), which is to determine water discharges or water levels of all hydro plants so as to maximize total power generation subject to many constraints, is a high-dimensional, nonlinear and coupled complex problem. In order to solve the OLHS problem effectively, an improved decomposition–coordination and discrete differential dynamic programming (IDC–DDDP) method is proposed in this paper. A strategy of generating the initial solution randomly is adopted to reduce generation time. Meanwhile, a relative coefficient based on maximum output capacity is proposed for more power generation. Moreover, an adaptive bias corridor technology is proposed to enhance convergence speed. The proposed method is applied to long-term optimal dispatch of the large-scale hydropower system (LHS) in the Yangtze River basin. Compared to other methods, IDC–DDDP has competitive performance in both total power generation and convergence speed, which provides a new method to solve the OLHS problem
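
    The core of any discrete differential dynamic programming variant is the backward recursion over a discretized storage grid, which the corridor then narrows around a trial trajectory. A single-reservoir sketch in Python, with invented inflows, storage grid and generation function (the paper's method additionally coordinates many plants via decomposition):

        import math

        T = 4                        # stages (e.g., months)
        inflow = [40, 60, 50, 30]    # hypothetical inflows per stage
        levels = range(0, 101, 10)   # discretized storage states

        def power(release):          # toy generation function
            return math.sqrt(max(release, 0))

        best = {s: 0.0 for s in levels}          # value at the final stage
        for t in reversed(range(T)):
            new = {}
            for s in levels:
                vals = [power(s + inflow[t] - s_next) + best[s_next]
                        for s_next in levels if s + inflow[t] - s_next >= 0]
                new[s] = max(vals) if vals else float("-inf")
            best = new

        print(best[50])  # max total generation starting from storage level 50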

  9. Research on a Small Signal Stability Region Boundary Model of the Interconnected Power System with Large-Scale Wind Power

    Directory of Open Access Journals (Sweden)

    Wenying Liu

    2015-03-01

    Full Text Available For the interconnected power system with large-scale wind power, the problem of small signal stability has become the bottleneck restricting the sending-out of wind power as well as the security and stability of the whole power system. Around this issue, this paper establishes a small signal stability region boundary model of the interconnected power system with large-scale wind power based on catastrophe theory, providing a new method for analyzing small signal stability. Firstly, we analyzed the typical characteristics and the mathematical model of the interconnected power system with wind power and pointed out that conventional methods cannot directly identify the topological properties of small signal stability region boundaries. Secondly, to address this problem, we adopted catastrophe theory to establish a small signal stability region boundary model of the interconnected power system with large-scale wind power in two-dimensional power injection space and extended it to multiple dimensions to obtain the boundary model in multidimensional power injection space. Thirdly, we qualitatively analyzed the changes in the topological properties of the small signal stability region boundary caused by large-scale wind power integration. Finally, we built simulation models with DIgSILENT/PowerFactory software, and the simulation results verified the correctness and effectiveness of the proposed model.

  10. New method for large scale production of medically applicable Actinium-225 and Radium-223

    International Nuclear Information System (INIS)

    Aliev, R.A.; Vasilyev, A.N.; Ostapenko, V.; Kalmykov, S.N.; Zhuikov, B.L.; Ermolaev, S.V.; Lapshina, E.V.

    2014-01-01

    Alpha-emitters (211At, 212Bi, 213Bi, 223Ra, 225Ac) are promising for targeted radiotherapy of cancer. Only two alpha decays near a cell membrane result in 50% death of cancer cells, and only a single decay inside the cell is required for this. 225Ac may be used either directly or as a mother radionuclide in a 213Bi isotope generator. Production of 225Ac is provided by three main suppliers - the Institute for Transuranium Elements in Germany, Oak Ridge National Laboratory in the USA and the Institute of Physics and Power Engineering in Obninsk, Russia. The current worldwide production of 225Ac is approximately 1.7 Ci per year, which corresponds to only 100-200 patients that could be treated annually. The common approach for 225Ac production is separation from mother 229Th or irradiation of 226Ra with protons in a cyclotron. Both methods have practical limitations for routine application. 225Ac can also be produced by irradiation of natural thorium with medium energy protons. Cumulative cross sections for the formation of 225Ac, 227Ac, 227Th and 228Th have been obtained recently. Thorium targets (1-9 g) were irradiated by a 114-91 MeV proton beam (1-50 μA) at the INR linear accelerator. After dissolution in 8 M HNO3 + 0.004 M HF, thorium was removed by double LLX with HDEHP in toluene (1:1). Ac and REE were pre-concentrated and separated from Ra and most fission products by DGA-Resin (Triskem). After washing out with 0.01 M HNO3, Ac was separated from REE by TRU Resin (Triskem) in 3 M HNO3 media. About 6 mCi of 225Ac was separated in a hot cell with a chemical yield of 85%. The method may be upscaled for production of Ci amounts of the radionuclide. The main impurity is 227Ac (0.1% at the EOB), but it does not hinder 225Ac from being used for medical 225Ac/213Bi generators. (author)

  11. Technical data summary: Uranium(IV) production using a large scale electrochemical cell

    International Nuclear Information System (INIS)

    Hsu, T.C.

    1984-05-01

    This Technical Data Summary outlines an electrochemical process to produce U(IV), in the form of uranous nitrate, from U(VI), as uranyl nitrate. U(IV) with hydrazine could then be used as an alternative plutonium reductant to substantially reduce the waste volume from the Purex solvent extraction process. This TDS is divided into three parts. The first part (Chapters I to IV) generally describes the electrochemical production of U(IV). The second part (Chapters V to VII) describes a pilot-scale U(IV) production facility that was constructed and operated at an engineering semiworks area of SRP, referred to as TNX. The last part (Chapter VIII) describes a preliminary design for a full-scale facility that would meet the projected need for U(IV) as a reductant in SRP's separations processes. The preliminary design was described in a Basic Data Summary for the U(IV) production facility, and a Venture Guidance Appraisal (VGA) was prepared from the Basic Data Summary. The VGA for the U(IV) process showed that, because of the large capital investment required, this approach to waste reduction was not economically competitive with another alternative that required only modifying the ongoing Purex process at no additional capital cost. However, implementing the U(IV) process as part of an overall canyon renovation, presently scheduled for the 1990's, may be economically attractive. The purpose of this TDS is therefore to bring together the information and experience obtained thus far in the U(IV) program so that a useful body of information will be available to support any future development of this process

  12. Breakthrough In Current In Plane Metrology For Monitoring Large Scale MRAM Production

    DEFF Research Database (Denmark)

    Cagliani, Alberto; Østerberg, Frederik Westergaard; Hansen, Ole

    2017-01-01

    The current-in-plane tunneling technique (CIPT) has been a crucial tool in the development of magnetic tunnel junction stacks suitable for Magnetic Random Access Memories (MRAM) for more than a decade. The MRAM development has now reached the maturity to make the transition from R&D to large ... of the Resistance Area product (RA) and the Tunnel Magnetoresistance (TMR) measurements, compared to state-of-the-art CIPT metrology tools dedicated to R&D. On two test wafers, the repeatability of RA and MR was improved by up to 350% and the measurement reproducibility by up to 1700%. We believe that CIPT metrology now ...

  13. Optimal Siting and Sizing of Energy Storage System for Power Systems with Large-scale Wind Power Integration

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Huang, Shaojun

    2015-01-01

    This paper proposes algorithms for optimal siting and sizing of Energy Storage System (ESS) for the operation planning of power systems with large scale wind power integration. The ESS in this study aims to mitigate the wind power fluctuations during the interval between two rolling Economic Dispatches (EDs) in order to maintain generation-load balance. The charging and discharging of ESS is optimized considering operation cost of conventional generators, capital cost of ESS and transmission losses. The statistics from simulated system operations are then coupled to the planning process to determine the ...

  14. Large-scale bioreactor production of the herbicide-degrading Aminobacter sp. strain MSH1

    DEFF Research Database (Denmark)

    Schultz-Jensen, Nadja; Knudsen, Berith Elkær; Frkova, Zuzana

    2014-01-01

    The Aminobacter sp. strain MSH1 has potential for pesticide bioremediation because it degrades the herbicide metabolite 2,6-dichlorobenzamide (BAM). Production of the BAM-degrading bacterium using aerobic bioreactor fermentation was investigated. A mineral salt medium limited for carbon and with an element composition similar to the strain was generated. The optimal pH and temperature for strain growth were determined using shaker flasks and verified in bioreactors. Glucose, fructose, and glycerol were suitable carbon sources for MSH1 (μ = 0.1 h−1); slower growth was observed on succinate and acetic acid (μ = 0.01 h−1). Standard conditions for growth of the MSH1 strain were defined at pH 7 and 25 °C, with glucose as the carbon source. In bioreactors (1 and 5 L), the specific growth rate of MSH1 increased from μ = 0.1 h−1 on traditional mineral salt medium to μ = 0.18 h−1 on the optimized mineral salt ...

  15. Large Scale Product Recommendation of Supermarket Ware Based on Customer Behaviour Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Kanavos

    2018-05-01

    Full Text Available In this manuscript, we present a prediction model based on the behaviour of each customer using data mining techniques. The proposed model utilizes a supermarket database and an additional database from Amazon, both containing information about customers’ purchases. Subsequently, our model analyzes these data in order to classify customers as well as products, being trained and validated with real data. This model is targeted towards classifying customers according to their consuming behaviour and consequently proposing new products more likely to be purchased by them. The corresponding prediction model is intended to be utilized as a tool for marketers, providing analytically targeted insight into consumer behavior. Our algorithmic framework and the subsequent implementation employ the cloud infrastructure and use the MapReduce Programming Environment, a model for processing large data-sets in a parallel manner with a distributed algorithm on computer clusters, as well as Apache Spark, which is a newer framework built on the same principles as Hadoop. Through a MapReduce model application on each step of the proposed method, text processing speed and scalability are enhanced relative to other traditional methods. Our results show that the proposed method predicts the purchases of a supermarket with high accuracy.
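
    The MapReduce pattern the authors rely on can be mimicked in a few lines of plain Python: map purchases to keyed pairs, shuffle by key, reduce to counts, then recommend the most popular product the customer has not yet bought. The data, segment labels and recommendation rule below are invented stand-ins for the paper's classifiers.

        from collections import defaultdict

        purchases = [
            ("alice", "families", "diapers"),
            ("bob",   "families", "cereal"),
            ("carol", "families", "diapers"),
            ("dave",  "students", "noodles"),
        ]

        # map phase: emit ((segment, product), 1) for each purchase
        mapped = [((seg, prod), 1) for _, seg, prod in purchases]

        # shuffle + reduce phase: sum counts per key
        counts = defaultdict(int)
        for key, v in mapped:
            counts[key] += v

        def recommend(segment, already_bought):
            candidates = {p: c for (s, p), c in counts.items()
                          if s == segment and p not in already_bought}
            return max(candidates, key=candidates.get) if candidates else None

        print(recommend("families", {"cereal"}))  # -> 'diapers'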

  16. Experience with LHC Magnets from Prototyping to Large Scale Industrial Production and Integration

    CERN Multimedia

    Rossi, L

    2004-01-01

    The construction of the LHC superconducting magnets is approaching the halfway point. At the end of 2003, main dipole cold masses for more than one octant had been delivered; meanwhile the winding for the second octant was almost completed. The other large magnets, like the main quadrupoles and the insertion quadrupoles, have entered into series production as well. Providing more than 20 km of superconducting magnets, with the quality required for an accelerator like the LHC, is an unprecedented challenge in terms of complexity that has required many steps, from the construction of 1 meter-long magnets in the laboratory to today's production of more than one 15 meter-long magnet per day in industry. The work and its organization are made even more complex by the fact that CERN supplies most of the critical components and part of the main tooling to the magnet manufacturers, both for cost reduction and for quality issues. In this paper the critical aspects of the construction will be reviewed and the actual ...

  17. Investigation of factors influencing biogas production in a large-scale thermophilic municipal biogas plant

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, Agnes; Jerome, Valerie; Freitag, Ruth [Bayreuth Univ. (Germany). Chair for Process Biotechnology; Burghardt, Diana; Likke, Likke; Peiffer, Stefan [Bayreuth Univ. (Germany). Dept. of Hydrology; Hofstetter, Eugen M. [RVT Process Equipment GmbH, Steinwiesen (Germany); Gabler, Ralf [BKW Biokraftwerke Fuerstenwalde GmbH, Fuerstenwalde (Germany)

    2009-10-15

    A continuously operated, thermophilic, municipal biogas plant was observed over 26 months (sampling twice per month) with regard to a number of physicochemical parameters and the biogas production. Biogas yields were correlated with parameters such as the volatile fatty acid concentration, the pH and the ammonium concentration. When the resident microbiota was classified via analysis of the 16S rRNA genes, most bacterial sequences matched unidentified or uncultured bacteria from similar habitats. Of the archaeal sequences, 78.4% were identified as belonging to the genus Methanoculleus, which has not previously been reported for biogas plants but is known to efficiently use the H2 and CO2 produced by the degradation of fatty acids by syntrophic microorganisms. In order to further investigate the influence of varied amounts of ammonia (2-8 g/L) and volatile fatty acids on biogas production and composition (methane/CO2), laboratory-scale satellite experiments were performed in parallel to the technical plant. Finally, ammonia stripping of the process water of the technical plant was accomplished, a measure through which the ammonia entering the biogas reactor via the mash could be nearly halved, increasing the energy output of the biogas plant by almost 20%. (orig.)

  18. Large-scale production of tannase using the yeast Arxula adeninivorans.

    Science.gov (United States)

    Böer, Erik; Breuer, Friederike Sophie; Weniger, Michael; Denter, Sylvia; Piontek, Michael; Kunze, Gotthard

    2011-10-01

    Tannase (tannin acyl hydrolase, EC 3.1.1.20) hydrolyses the ester and depside bonds of gallotannins and gallic acid esters and is an important industrial enzyme. In the present study, transgenic Arxula adeninivorans strains were optimised for tannase production. Various plasmids carrying one or two expression modules for constitutive expression of tannase were constructed. Transformant strains that overexpress the ATAN1 gene from the strong A. adeninivorans TEF1 promoter produce levels of up to 1,642 U L-1 when grown in glucose medium in shake flasks. The effect of fed-batch fermentation on tannase productivity was then investigated in detail. Under these conditions, a transgenic strain containing one ATAN1 expression module produced 51,900 U of tannase activity per litre after 142 h of fermentation at a dry cell weight of 162 g L-1. The highest yield obtained from a transgenic strain with two ATAN1 expression modules was 31,300 U after 232 h at a dry cell weight of 104 g L-1. Interestingly, the maximum achieved yield coefficients [Y(P/X)] for the two strains were essentially identical.
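
    The closing claim can be verified from the reported numbers: the yield coefficient Y(P/X) is activity per unit dry cell weight. Assuming both activity totals are per litre, a quick check in Python:

        y_one_module = 51900 / 162   # 51,900 U/L at 162 g/L dry cell weight
        y_two_modules = 31300 / 104  # 31,300 U/L at 104 g/L dry cell weight
        print(round(y_one_module), round(y_two_modules))  # ~320 vs ~301 U/g, nearly identical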

  19. An automatic device for the quality control of large-scale crystal's production

    CERN Document Server

    Baccaro, S; Castellani, M; Cecilia, A; Dafinei, I; Diemoz, M; Guerra, S; Longo, E; Montecchi, M; Organtini, G; Pellegrini, F

    2001-01-01

    In 1999, the construction of the electromagnetic calorimeter of the Compact Muon Solenoid (CMS) experiment started. Half of the barrel calorimeter made of 61200 lead tungstate (PWO) crystals will be assembled and tested in the Regional Centre of INFN-ENEA in Rome, Italy. Before assembling, all 30600 PWO crystals will be qualified for scintillation and radiation hardness characteristics by a specially built Automatic Crystal Control System. The measuring techniques for crystal qualification and performances of the automatic system will be discussed in this work. (11 refs).

  20. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    Science.gov (United States)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible: a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  1. Large-scale production of paper-based Li-ion cells

    CERN Document Server

    Zolin, Lorenzo

    2017-01-01

    This book describes in detail the use of natural cellulose fibers for the production of innovative, low-cost, and easily recyclable lithium-ion (Li-ion) cells by means of fast and reliable papermaking procedures that employ water as a solvent. In addition, it proposes specific methods to optimize the safety features of these paper-based cells and to improve the electronic conductivity of the electrodes by means of a carbonization process - an interesting novel technology that enables higher current rate capabilities to be achieved. The in-depth descriptions of materials, methods, and techniques are complemented by the inclusion of a general overview of electrochemical devices and, in particular, of different Li-ion battery configurations. Presenting the outcomes of this important research, the work is of wide interest to electrochemical engineers in both research institutions and industry.

  2. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Huiqiang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China); Shi, Zhenyu; Li, Shuang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); Liu, Na, E-mail: Naliujlu@163.com [College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China)

    2016-08-30

    Highlights: • Microwave method for synthesizing g-C3N4 with N2 photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C3N4. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C3N4) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N2 adsorption, UV–vis spectroscopy, SEM, N2-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment can form many irregular pores in the as-prepared g-C3N4, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C3N4. These nitrogen vacancies not only serve as active sites to adsorb and activate N2 molecules but also promote interfacial charge transfer from the catalysts to N2 molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for large-scale production of g-C3N4, which is significantly important for practical application.

  3. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    International Nuclear Information System (INIS)

    Ma, Huiqiang; Shi, Zhenyu; Li, Shuang; Liu, Na

    2016-01-01

    Highlights: • Microwave method for synthesizing g-C3N4 with N2 photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C3N4. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C3N4) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N2 adsorption, UV–vis spectroscopy, SEM, N2-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment can form many irregular pores in the as-prepared g-C3N4, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C3N4. These nitrogen vacancies not only serve as active sites to adsorb and activate N2 molecules but also promote interfacial charge transfer from the catalysts to N2 molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for large-scale production of g-C3N4, which is significantly important for practical application.

  4. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and Netlogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what was their runtime, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.
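
    The kind of per-task statistics the client-side tools report can be sketched with a simple aggregation; the record layout below is a made-up stand-in, not the actual Netlogger schema.

        from collections import defaultdict
        from statistics import mean

        records = [
            {"task": "seismogram", "host": "node12", "runtime": 42.0},
            {"task": "seismogram", "host": "node07", "runtime": 39.5},
            {"task": "peak_calc",  "host": "node12", "runtime": 3.1},
        ]

        by_task = defaultdict(list)
        for r in records:
            by_task[r["task"]].append(r["runtime"])

        for task, times in by_task.items():
            # task type, task count, mean runtime, worst runtime
            print(task, len(times), round(mean(times), 1), max(times))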

  5. Large-Scale Production of Monitored Drift Tube Chambers for the ATLAS Muon Spectrometer

    CERN Document Server

    Bauer, F.; Kortner, O; Kroha, H; Manz, A; Mohrdieck, S; Richter, R; Zhuravlov, V

    2016-01-01

    Precision drift tube chambers with a sense wire positioning accuracy of better than 20 microns are under construction for the ATLAS muon spectrometer. 70% of the 88 large chambers for the outermost layer of the central part of the spectrometer have been assembled. Measurements during chamber construction of the positions of the sense wires and of the sensors for the optical alignment monitoring system demonstrate that the requirements for the mechanical precision of the chambers are fulfilled.

  6. Developing Large-Scale Bayesian Networks by Composition: Fault Diagnosis of Electrical Power Systems in Aircraft and Spacecraft

    Science.gov (United States)

    Mengshoel, Ole Jakob; Poll, Scott; Kurtoglu, Tolga

    2009-01-01

    This CD contains files that support the talk (see CASI ID 20100021404). There are 24 models that relate to the ADAPT system and 1 Excel worksheet. In the paper an investigation into the use of Bayesian networks to construct large-scale diagnostic systems is described. The high-level specifications, Bayesian networks, clique trees, and arithmetic circuits representing 24 different electrical power systems are described in the talk. The data in the CD are the models of the 24 different power systems.
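
    At its smallest, the diagnostic idea is Bayesian updating of a fault hypothesis from sensor evidence. A two-node sketch in Python; the component, the sensor and all probabilities are invented and are not taken from the ADAPT models.

        p_fault = 0.01                  # prior: relay stuck open
        p_low_v_given_fault = 0.95      # sensor reads low voltage if faulty
        p_low_v_given_ok = 0.02         # false-alarm rate

        joint_fault = p_fault * p_low_v_given_fault
        joint_ok = (1 - p_fault) * p_low_v_given_ok
        print(joint_fault / (joint_fault + joint_ok))  # P(fault | low voltage) ~ 0.32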

  7. Coherent Laser Radar Metrology System for Large Scale Optical Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — A new type of laser radar metrology inspection system is proposed that incorporates a novel, dual laser coherent detection scheme capable of eliminating both...

  8. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

    Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several state-of-the-art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, in order to study this temporal network behavior, we need a tool to analyze and correlate numerous sets of multivariate time-series data collected from the Dragonfly's multi-level hierarchies. This paper presents such a tool - a visual analytics system - that uses the Dragonfly network to investigate the temporal behavior and optimize the communication performance of a supercomputer. We coupled interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively helps visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can not only help improve the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics

  9. Improving Large-scale Storage System Performance via Topology-aware and Balanced Data Placement

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feiyi [ORNL; Oral, H Sarp [ORNL; Vazhkudai, Sudharshan S [ORNL

    2014-01-01

    With the advent of big data, the I/O subsystems of large-scale compute clusters are becoming a center of focus, with more applications putting greater demands on end-to-end I/O performance. These subsystems are often complex in design. They comprise multiple hardware and software layers to cope with the increasing capacity, capability and scalability requirements of data intensive applications. The shared nature of storage resources and the intrinsic interactions across these layers make it a great challenge to realize user-level, end-to-end performance gains. We propose a topology-aware resource load balancing strategy to improve per-application I/O performance. We demonstrate the effectiveness of our algorithm on an extreme-scale compute cluster, Titan, at the Oak Ridge Leadership Computing Facility (OLCF). Our experiments with both synthetic benchmarks and a real-world application show that, even under congestion, our proposed algorithm can improve large-scale application I/O performance significantly, resulting in both the reduction of application run times and higher resolution simulation runs.
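
    A greedy sketch of balanced, topology-aware placement: spread a file's stripes over storage targets, preferring the least-loaded target and penalizing targets behind costlier network paths. The target names, loads, path costs and weighting are invented; the actual OLCF algorithm is considerably more elaborate.

        def place_stripes(n_stripes, targets, load, path_cost, alpha=0.5):
            """Pick a target for each stripe, updating loads as we go."""
            placement = []
            for _ in range(n_stripes):
                # score = current load + weighted network path cost
                best = min(targets, key=lambda t: load[t] + alpha * path_cost[t])
                placement.append(best)
                load[best] += 1
            return placement

        targets = ["ost0", "ost1", "ost2"]
        load = {"ost0": 3, "ost1": 0, "ost2": 1}
        path_cost = {"ost0": 1.0, "ost1": 2.0, "ost2": 1.0}
        print(place_stripes(4, targets, load, path_cost))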

  10. Large-scale integration of wind power into different energy systems

    DEFF Research Database (Denmark)

    Lund, Henrik

    2005-01-01

    The paper presents the ability of different energy systems and regulation strategies to integrate wind power. The ability is expressed by the following three factors: the degree of electricity excess production caused by fluctuations in wind and Combined Heat and Power (CHP) heat demands, the ability to utilise wind power to reduce CO2 emission in the system, and the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants into the regulation task of electricity balancing and ancillary grid stability services, and investments in electric heating, heat pumps and heat storage capacity. The results ...

  11. Optimizing the design of large-scale ground-coupled heat pump systems using groundwater and heat transport modeling

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, H.; Itoi, R.; Fujii, J. [Kyushu University, Fukuoka (Japan). Faculty of Engineering, Department of Earth Resources Engineering; Uchida, Y. [Geological Survey of Japan, Tsukuba (Japan)

    2005-06-01

    In order to predict the long-term performance of large-scale ground-coupled heat pump (GCHP) systems, it is necessary to take into consideration well-to-well interference, especially in the presence of groundwater flow. A mass and heat transport model was developed to simulate the behavior of this type of system in the Akita Plain, northern Japan. The model was used to investigate different operational schemes and to maximize the heat extraction rate from the GCHP system. (author)

  12. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA

    Directory of Open Access Journals (Sweden)

    Anirban Nandi

    2014-01-01

    Full Text Available Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D. In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA. It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  13. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  14. Large-scale production of bioenergy by the side of fuel-peat; Bioenergian suurtuotanto polttoturpeen rinnalla

    Energy Technology Data Exchange (ETDEWEB)

    Heikkilae, K. [Vapo Oy, Jyvaeskylae (Finland)

    1996-12-31

    The objective of the project was to clarify the possibilities for large-scale production of bioenergy and the structure of its costs, and to develop operational practices so that smaller volumes of biomass are integrated into the prevailing peat production and delivered together with peat, which ensures the quality of the fuel supply as well as the prices and the reliability of deliveries. In this way the same organisation, machinery and volumes can be utilized. The operation will be designed to run year-round so that profitability can be improved. Another aim is to bring otherwise unusable wood wastes into use, which would also serve silvicultural purposes. Usable municipal and other wastes and sludges could be blended with the biomass in proper mixing ratios to make biofuels precisely suitable for the purposes of the customer. In grain-growing areas it is possible to utilize straw, and at the seaside, reed grass

  15. Reduced Order Modeling for Prediction and Control of Large-Scale Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Kalashnikova, Irina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Mathematics; Arunajatesan, Srinivasan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Aerosciences Dept.; Barone, Matthew Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Aerosciences Dept.; van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Uncertainty Quantification and Optimization Dept.; Fike, Jeffrey A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Component Science and Mechanics Dept.

    2014-05-01

    This report describes work performed from June 2012 through May 2014 as a part of a Sandia Early Career Laboratory Directed Research and Development (LDRD) project led by the first author. The objective of the project is to investigate methods for building stable and efficient proper orthogonal decomposition (POD)/Galerkin reduced order models (ROMs): models derived from a sequence of high-fidelity simulations but having a much lower computational cost. Since they are, by construction, small and fast, ROMs can enable real-time simulations of complex systems for on-the-spot analysis, control and decision-making in the presence of uncertainty. Of particular interest to Sandia is the use of ROMs for the quantification of the compressible captive-carry environment, simulated for the design and qualification of nuclear weapons systems. It is an unfortunate reality that many ROM techniques are computationally intractable or lack an a priori stability guarantee for compressible flows. For this reason, this LDRD project focuses on the development of techniques for building provably stable projection-based ROMs. Model reduction approaches based on continuous as well as discrete projection are considered. In the first part of this report, an approach for building energy-stable Galerkin ROMs for linear hyperbolic or incompletely parabolic systems of partial differential equations (PDEs) using continuous projection is developed. The key idea is to apply a transformation induced by the Lyapunov function for the system, and to build the ROM in the transformed variables. It is shown that, for many PDE systems including the linearized compressible Euler and linearized compressible Navier-Stokes equations, the desired transformation is induced by a special inner product, termed the “symmetry inner product”. Attention is then turned to nonlinear conservation laws. A new transformation and corresponding energy-based inner product for the full nonlinear compressible Navier-Stokes ...
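
    The POD step itself is compact: the reduced basis consists of the leading left singular vectors of a snapshot matrix, and Galerkin projection shrinks the system operator. A minimal sketch with random stand-in snapshots and a hypothetical linear operator (the report's actual contribution, the stability-preserving symmetry inner product, is not shown here):

        import numpy as np

        rng = np.random.default_rng(0)
        snapshots = rng.standard_normal((1000, 50))  # state dim 1000, 50 snapshots

        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        k = int(np.searchsorted(energy, 0.99)) + 1   # keep 99% of snapshot energy
        Phi = U[:, :k]                               # POD basis

        A = -0.1 * np.eye(1000)                      # placeholder full operator, x' = A x
        A_r = Phi.T @ A @ Phi                        # k-by-k reduced operator
        print(Phi.shape, A_r.shape)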

  16. Expanded Large-Scale Forcing Properties Derived from the Multiscale Data Assimilation System and Its Application to Single-Column Models

    Science.gov (United States)

    Feng, S.; Li, Z.; Liu, Y.; Lin, W.; Toto, T.; Vogelmann, A. M.; Fridlind, A. M.

    2013-12-01

    We present an approach to derive large-scale forcing that is used to drive single-column models (SCMs) and cloud resolving models (CRMs)/large eddy simulations (LES) for evaluating fast physics parameterizations in climate models. The forcing fields are derived by use of a newly developed multi-scale data assimilation (MS-DA) system. This DA system is developed on top of the NCEP Gridpoint Statistical Interpolation (GSI) system and is implemented in the Weather Research and Forecasting (WRF) model at a cloud resolving resolution of 2 km. This approach has been applied to the generation of large-scale forcing for a set of Intensive Operation Periods (IOPs) over the Atmospheric Radiation Measurement (ARM) Climate Research Facility's Southern Great Plains (SGP) site. The dense ARM in-situ observations and high-resolution satellite data effectively constrain the WRF model. The evaluation shows that the derived forcing displays accuracies comparable to the existing continuous forcing product and, overall, a better dynamic consistency with observed cloud and precipitation. One important application of this approach is to derive large-scale hydrometeor forcing and multiscale forcing, which are not provided in the existing continuous forcing product. It is shown that the hydrometeor forcing has an appreciable impact on cloud and precipitation fields in the single-column model simulations. The large-scale forcing exhibits a significant dependency on the domain size, which represents the SCM grid size. Subgrid processes often contribute a significant component to the large-scale forcing, and this contribution is sensitive to the grid size and cloud regime.

  17. Impact of large scale wind power on the Nordic electricity system

    International Nuclear Information System (INIS)

    Holttinen, Hannele

    2006-01-01

    Integration costs of wind power depend on how much wind power is installed and where, and on the power system: load, generation flexibility, and interconnections. When wind power is added to a large interconnected power system there is a considerable smoothing effect on the production. The increase in reserve requirements will stay at a low level. A 10 percent penetration of wind power is not a problem in the Nordic countries, as long as wind power is built in all four countries. Increasing the share of wind power will increase the integration costs. A 20 percent penetration would need more flexibility in the system. That will not happen in the near future for Nordel, and the power system will probably also contain more flexible elements at that stage, like producing fuel for vehicles (ml)

  18. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional-scale saltwater intrusion in order to analyse and quantify the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent, and predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale ... parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic ...

  19. ROSA-IV Large Scale Test Facility (LSTF) system description for second simulated fuel assembly

    International Nuclear Information System (INIS)

    1990-10-01

    The ROSA-IV Program's Large Scale Test Facility (LSTF) is a test facility for integral simulation of thermal-hydraulic response of a pressurized water reactor (PWR) during small break loss-of-coolant accidents (LOCAs) and transients. In this facility, the PWR core nuclear fuel rods are simulated using electric heater rods. The simulated fuel assembly which was installed during the facility construction was replaced with a new one in 1988. The first test with this second simulated fuel assembly was conducted in December 1988. This report describes the facility configuration and characteristics as of this date (December 1988) including the new simulated fuel assembly design and the facility changes which were made during the testing with the first assembly as well as during the renewal of the simulated fuel assembly. (author)

  20. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Science.gov (United States)

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  1. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Directory of Open Access Journals (Sweden)

    Xinhua He

    2014-01-01

    Full Text Available This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

  2. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    Science.gov (United States)

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
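
    A bare-bones version of the genetic algorithm used to solve such location-allocation models: a chromosome assigns each demand point to a depot, and fitness is the negated total response time. The distances, population size and rates below are invented placeholders for illustration.

        import random

        random.seed(1)
        n_demands, n_depots = 8, 3
        dist = [[random.uniform(1, 10) for _ in range(n_depots)]
                for _ in range(n_demands)]

        def fitness(chrom):              # higher is better (less travel time)
            return -sum(dist[i][d] for i, d in enumerate(chrom))

        def crossover(a, b):             # one-point crossover of two assignments
            cut = random.randrange(1, n_demands)
            return a[:cut] + b[cut:]

        def mutate(chrom, rate=0.1):     # randomly reassign some demand points
            return [random.randrange(n_depots) if random.random() < rate else g
                    for g in chrom]

        pop = [[random.randrange(n_depots) for _ in range(n_demands)]
               for _ in range(30)]
        for _ in range(100):
            pop.sort(key=fitness, reverse=True)
            elite = pop[:10]
            pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                           for _ in range(20)]

        best = max(pop, key=fitness)
        print(best, -fitness(best))      # assignment and its total response time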

  3. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, undertaken with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests, using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  4. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    Science.gov (United States)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  5. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  6. Large Scale Document Inversion using a Multi-threaded Computing System

    Science.gov (United States)

    Jung, Sungbo; Chang, Dar-Jen; Park, Juw Won

    2018-01-01

    Current microprocessor architecture is moving towards multi-core/multi-threaded systems. This trend has led to a surge of interest in using multi-threaded computing devices, such as the Graphics Processing Unit (GPU), for general-purpose computing. We can utilize the GPU as a massively parallel coprocessor because it consists of multiple cores. The GPU is also an affordable, attractive, and user-programmable commodity. Nowadays, vast amounts of information flood into the digital domain around the world: huge volumes of data, such as digital libraries, social networking services, and e-commerce product data and reviews, are produced or collected every moment, with dramatic growth in size. Although the inverted index is a useful data structure for full-text search and document retrieval, a large number of documents requires a tremendous amount of time to index. The performance of document inversion can be improved by multi-threaded, multi-core GPUs. Our approach is to implement a linear-time, hash-based, single program multiple data (SPMD) document inversion algorithm on the NVIDIA GPU/CUDA programming platform, utilizing the huge computational power of the GPU to develop high-performance solutions for document indexing. Our proposed parallel document inversion system shows 2-3 times faster performance than a sequential system on two different test datasets, PubMed abstracts and e-commerce product reviews. CCS Concepts: • Information systems➝Information retrieval • Computing methodologies➝Massively parallel and high-performance simulations.
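
    For readers unfamiliar with the data structure, here is a tiny sequential sketch of the inverted index that the paper builds in parallel (the GPU version hashes terms and distributes documents across threads; this toy Python version simply keeps postings as (document, position) pairs):

      from collections import defaultdict

      docs = {
          0: "gpu computing enables parallel document inversion",
          1: "an inverted index enables fast full text search",
          2: "parallel gpu indexing of large document collections",
      }

      index = defaultdict(list)              # term -> [(doc_id, position), ...]
      for doc_id, text in docs.items():
          for pos, term in enumerate(text.split()):
              index[term].append((doc_id, pos))

      print(index["gpu"])                    # [(0, 0), (2, 1)]
      print(index["parallel"])               # [(0, 3), (2, 0)]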

  8. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50% by 2010, and 100% by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only as far as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually, when no specific cost limit is applied. This corresponds to 2-3 Mtoe or 6-9% of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good chip quality. The two guiding principles of the programme are: (1) close cooperation between researchers and practitioners and (2) applying research and development to practical applications and commercialization. As of November

  9. Offshore Variability in Critical Weather Conditions in Large-Scale Wind Based Danish Power System

    DEFF Research Database (Denmark)

    Cutululis, Nicolaos Antonio; Litong-Palima, Marisciel; Sørensen, Poul Ejnar

    2013-01-01

    Offshore wind power has a significant development potential, especially in North Europe. The geographical concentration of offshore wind power leads to increased variability and in the case of critical weather conditions it may lead to sudden and considerable loss of production. In this context, the chances of losing several GW of wind power due to critical weather conditions in a very short time period could potentially jeopardize the whole system's reliability and stability. Forecasting such events is not trivial and the results so far are not encouraging. When assessing the impact of the variability for the 2020 Danish power system, one can see that in the worst case, up to 1500 MW of power can be lost in 30 minutes. We present results showing how this issue is partially solved by the new High Wind Storm Controller presented by Siemens in the TWENTIES project.

  10. Production of recombinant antigens and antibodies in Nicotiana benthamiana using 'magnifection' technology: GMP-compliant facilities for small- and large-scale manufacturing.

    Science.gov (United States)

    Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh

    2014-01-01

    This review describes the adaptation of the plant virus-based transient expression system magnICON® for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest using this system, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.

  11. DMPy: a Python package for automated mathematical model construction of large-scale metabolic systems.

    Science.gov (United States)

    Smith, Robert W; van Rosmalen, Rik P; Martins Dos Santos, Vitor A P; Fleck, Christian

    2018-06-19

    Models of metabolism are often used in biotechnology and pharmaceutical research to identify drug targets or increase the direct production of valuable compounds. Due to the complexity of large metabolic systems, a number of conclusions have been drawn using mathematical methods with simplifying assumptions. For example, constraint-based models assume that changes of internal concentrations occur much more quickly than alterations in cell physiology; thus, metabolite concentrations and reaction fluxes are fixed to constant values. This greatly reduces the mathematical complexity, while providing a reasonably good description of the system in steady state. However, without a large number of constraints, many different flux sets can describe the optimal model, and we obtain no information on how metabolite levels dynamically change. Thus, to accurately determine what is taking place within the cell, finer-quality data and more detailed models need to be constructed. In this paper we present a computational framework, DMPy, that uses a network scheme as input to automatically search for kinetic rates and produce a mathematical model that describes temporal changes of metabolite fluxes. The parameter search utilises several online databases to find measured reaction parameters. From this, we take advantage of previous modelling efforts, such as Parameter Balancing, to produce an initial mathematical model of a metabolic pathway. We analyse the effect of parameter uncertainty on model dynamics and test how recent flux-based model reduction techniques alter system properties. To our knowledge this is the first time such analysis has been performed on large models of metabolism. Our results highlight that good estimates of at least 80% of the reaction rates are required to accurately model metabolic systems. Furthermore, reducing the size of the model by grouping reactions together based on fluxes alters the resulting system dynamics. The presented pipeline automates the
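
    To make the contrast with constraint-based models concrete, here is a generic sketch of the kind of artifact such a pipeline produces, with an invented three-species network and rate constants (this is not DMPy's actual API): a reaction list is compiled into an ODE system whose metabolite levels evolve in time.

      import numpy as np
      from scipy.integrate import solve_ivp

      # reaction network: (substrates, products, rate constant), all values invented
      reactions = [
          (["A"], ["B"], 1.0),
          (["B"], ["C"], 0.5),
          (["C"], ["A"], 0.2),
      ]
      species = ["A", "B", "C"]
      idx = {s: i for i, s in enumerate(species)}

      def rhs(t, y):
          # compile mass-action rates into net concentration changes
          dydt = np.zeros_like(y)
          for subs, prods, k in reactions:
              rate = k * np.prod([y[idx[s]] for s in subs])
              for s in subs:
                  dydt[idx[s]] -= rate
              for p in prods:
                  dydt[idx[p]] += rate
          return dydt

      sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0])
      print(dict(zip(species, sol.y[:, -1].round(3))))   # approximate steady state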

  12. Implementing effect of energy efficiency supervision system for government office buildings and large-scale public buildings in China

    International Nuclear Information System (INIS)

    Zhao Jing; Wu Yong; Zhu Neng

    2009-01-01

    The Chinese central government released a document initiating the construction of an energy efficiency supervision system for government office buildings and large-scale public buildings in 2007, which marks the overall start of energy efficiency management of existing buildings in China, with government office buildings and large-scale public buildings as a breakthrough. This paper focuses on the implementation effects in the demonstration regions all over China during the first year: it first introduces the target and path of the energy efficiency supervision system, then describes the achievements and problems during the implementation process in the first demonstration provinces and cities. Data from the energy efficiency public notices in some typical demonstration provinces and cities were analyzed statistically. It can be concluded that buildings with different functions have different energy consumption, and that the average energy consumption of large-scale public buildings in China is too high compared with common public buildings and residential buildings. The obstacles that still need to be overcome are summarized, and prospects for future work are also put forward.

  15. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    Science.gov (United States)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with an accumulated amount of more than 2900 mm within a continuous 3-day event, struck southern Taiwan in August 2009 and has been recognized as a serious natural hazard caused by Typhoon Morakot. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. According to the satellite image processing and monitoring project conducted by the Soil & Water Conservation Bureau after Typhoon Morakot, more than 10,904 landslide sites with a total sliding area of 18,113 ha were identified. Field investigations of all landslide areas were also carried out in this research, covering disaster type, scale and location in relation to topographic conditions, colluvium soil characteristics, bedrock formation and geological structure. The mechanism, characteristics and behavior of these large-scale landslides combined with debris-flow disasters were analyzed and investigated to sort out the interaction of the factors above and to identify the disaster extent of rainfall-induced landslides during the study period. In order to reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy for the hazard mitigation system should be set up as soon as possible, taking into consideration slope land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis and disaster risk assessment. As a result, this 3-year research focused on field investigation using GPS/GIS/RS integration, on the mechanism and behavior of rainfall-induced landslide occurrence, and on the establishment of a disaster database and hazard mitigation system. This project has become an important issue of serious concern to the government and the people of Taiwan. Hopefully, all results from this research can be used as guidance for disaster prevention and

  16. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and the rationality guarantees that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's competence to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organizational form, (3) the theoretical conception of large-scale research and policy consulting. (orig.)

  17. Modeling and Coordinated Control Strategy of Large Scale Grid-Connected Wind/Photovoltaic/Energy Storage Hybrid Energy Conversion System

    Directory of Open Access Journals (Sweden)

    Lingguo Kong

    2015-01-01

    An AC-linked large-scale wind/photovoltaic (PV)/energy storage (ES) hybrid energy conversion system for grid-connected application was proposed in this paper. The wind energy conversion system (WECS) and the PV generation system are the primary power sources of the hybrid system. The ES system, including battery and fuel cell (FC), is used as a backup and a power regulation unit to ensure continuous power supply and to take care of the intermittent nature of wind and photovoltaic resources. A static synchronous compensator (STATCOM) is employed to support the AC-linked bus voltage and improve the low voltage ride through (LVRT) capability of the proposed system. An overall power coordinated control strategy is designed to manage real-power and reactive-power flows among the different energy sources, the storage unit, and the STATCOM system in the hybrid system. A simulation case study of the large-scale hybrid energy conversion system, carried out on the Western System Coordinating Council (WSCC) 3-machine 9-bus test system, has been developed using the DIgSILENT PowerFactory software platform. The hybrid system performance under different scenarios has been verified by simulation studies using practical load demand profiles and real weather data.

  18. A large scale evacuation. Tasks of evacuator. Collapse of medical system at the time of areal indication for large scale indoor refuge and problems for restoration

    International Nuclear Information System (INIS)

    Oikawa, Tomoyoshi

    2012-01-01

    The evacuation following the earthquake/tsunami disaster of March 11, 2011 and the Fukushima Nuclear Power Plant accident (March 12-15) is described here in terms of its social background, its influence on local society and medical care, and the work of medical staff. The disaster immediately cut off means of communication and transportation in Minamisoma City, and the Japanese government ordered indoor refuge for residents within 3 km of the Plant, extending the zone to 20 km on the 12th and to 30 km on the 15th. In the end, about 123 thousand nearby residents had to evacuate: the largest-scale evacuation caused by a nuclear accident in Japan. The author's Minamisoma Citizens' Hospital (MCH) was located 23 km from the Plant. Residents learned of the government orders through various media before their official announcement, and many had already begun to evacuate. MCH accepted more than 100 victims and, from the 12th, measured their contamination as well as the ambient dose using GM counters; the highest dose was 16 μSv/h on the 20th. Following the hydrogen explosion of the No. 3 reactor on the 14th, residents were bewildered by the indoor refuge order, which impacted social activities, halting commerce and bringing about mental conflict and isolation among residents. The transfer of all 107 hospitalized patients to neighboring facilities in Niigata prefecture started on the 18th and was completed on the 20th with help from the Self-Defense Forces. Children, pregnant women and certain patients were prohibited from entering the newly defined emergency evacuation preparation (EEP) zone, within 30 km of the Plant, on April 11, which inhibited the area's restoration and medical care. Now, 1.5 years after the disaster, the number of medical staff near the old EEP zone is quite insufficient. About two-thirds of the pre-disaster population are rebuilding their lives there, most of them elderly, suggesting the need to rebuild medical systems that had effectively collapsed. (T.T.)

  19. Parameter estimation in large-scale systems biology models: a parallel and self-adaptive cooperative strategy.

    Science.gov (United States)

    Penas, David R; González, Patricia; Egea, Jose A; Doallo, Ramón; Banga, Julio R

    2017-01-21

    The development of large-scale kinetic models is one of the current key issues in computational systems biology and bioinformatics. Here we consider the problem of parameter estimation in nonlinear dynamic models. Global optimization methods can be used to solve this type of problems but the associated computational cost is very large. Moreover, many of these methods need the tuning of a number of adjustable search parameters, requiring a number of initial exploratory runs and therefore further increasing the computation times. Here we present a novel parallel method, self-adaptive cooperative enhanced scatter search (saCeSS), to accelerate the solution of this class of problems. The method is based on the scatter search optimization metaheuristic and incorporates several key new mechanisms: (i) asynchronous cooperation between parallel processes, (ii) coarse and fine-grained parallelism, and (iii) self-tuning strategies. The performance and robustness of saCeSS is illustrated by solving a set of challenging parameter estimation problems, including medium and large-scale kinetic models of the bacterium E. coli, baker's yeast S. cerevisiae, the vinegar fly D. melanogaster, Chinese Hamster Ovary cells, and a generic signal transduction network. The results consistently show that saCeSS is a robust and efficient method, allowing very significant reduction of computation times with respect to several previous state of the art methods (from days to minutes, in several cases) even when only a small number of processors is used. The new parallel cooperative method presented here allows the solution of medium and large scale parameter estimation problems in reasonable computation times and with small hardware requirements. Further, the method includes self-tuning mechanisms which facilitate its use by non-experts. We believe that this new method can play a key role in the development of large-scale and even whole-cell dynamic models.
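
    The cooperation mechanism is the heart of the method. The toy illustration below is far simpler than saCeSS (no MPI, no scatter search; synthetic exponential-decay data and a naive stochastic local search), but it shows the pattern of parallel searches periodically adopting a shared incumbent:

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0, 5, 50)
      true_k = np.array([0.8, 2.0])
      data = true_k[1] * np.exp(-true_k[0] * t)           # synthetic observations

      def cost(k):
          return np.sum((k[1] * np.exp(-k[0] * t) - data) ** 2)

      def search(start, shared_best, iters=200, step=0.1):
          x, fx = start.copy(), cost(start)
          for _ in range(iters):                          # naive stochastic descent
              cand = x + rng.normal(0, step, size=2)
              if cost(cand) < fx:
                  x, fx = cand, cost(cand)
          # cooperation: adopt the shared incumbent if it beats the local result
          return shared_best if cost(shared_best) < fx else x

      workers = [rng.uniform(0.1, 3.0, size=2) for _ in range(4)]
      best = workers[0]
      for _ in range(5):                                  # synchronization rounds
          workers = [search(w, best) for w in workers]
          best = min(workers, key=cost)
      print("estimated k:", best.round(3), "true k:", true_k)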

  20. On Modeling Large-Scale Multi-Agent Systems with Parallel, Sequential and Genuinely Asynchronous Cellular Automata

    International Nuclear Information System (INIS)

    Tosic, P.T.

    2011-01-01

    We study certain types of Cellular Automata (CA) viewed as an abstraction of large-scale Multi-Agent Systems (MAS). We argue that the classical CA model needs to be modified in several important respects in order to become a relevant and sufficiently general model for large-scale MAS, so that the generalized model can capture many important MAS properties at the level of agent ensembles and their long-term collective behavior patterns. We specifically focus on the issue of inter-agent communication in CA, and propose sequential cellular automata (SCA) as the first step, and genuinely asynchronous cellular automata (ACA) as the ultimate deterministic CA-based abstract models for large-scale MAS made of simple reactive agents. We first formulate deterministic and nondeterministic versions of sequential CA, and then summarize some interesting configuration space properties (i.e., possible behaviors) of a restricted class of sequential CA. In particular, we compare and contrast those properties of sequential CA with the corresponding properties of the classical (that is, parallel and perfectly synchronous) CA with the same restricted class of update rules. We analytically demonstrate the failure of the studied sequential CA models to simulate all possible behaviors of perfectly synchronous parallel CA, even for a very restricted class of non-linear totalistic node update rules. The lesson learned is that the interleaving semantics of concurrency, when applied to sequential CA, is not refined enough to adequately capture the perfect synchrony of parallel CA updates. Last but not least, we outline what would be an appropriate CA-like abstraction for large-scale distributed computing insofar as the inter-agent communication model is concerned, and in that context we propose genuinely asynchronous CA. (author)
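
    The synchrony/interleaving distinction is easy to see in code. Below, a toy one-dimensional totalistic rule ("become 1 if any cell in the neighborhood is 1") is stepped once in parallel and once sequentially; the rule and configuration are illustrative, not the class analyzed in the paper:

      def rule(left, c, right):
          return 1 if (left + c + right) >= 1 else 0      # "any live neighbor" rule

      def step_parallel(cells):
          n = len(cells)                                  # every cell reads the old state
          return [rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
                  for i in range(n)]

      def step_sequential(cells):
          cells, n = cells[:], len(cells)
          for i in range(n):                              # fixed left-to-right order;
              cells[i] = rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
          return cells                                    # later cells see new values

      start = [0, 0, 0, 1, 0, 0, 0, 0]
      print("parallel:  ", step_parallel(start))          # [0, 0, 1, 1, 1, 0, 0, 0]
      print("sequential:", step_sequential(start))        # [0, 0, 1, 1, 1, 1, 1, 1]

    A single left-to-right sweep lets the 1 propagate around the ring far beyond its neighborhood, reaching a configuration the synchronous automaton cannot produce in one step; this is the interleaving artifact the abstract refers to.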

  1. Co-gasification of municipal solid waste and material recovery in a large-scale gasification and melting system.

    Science.gov (United States)

    Tanigaki, Nobuhiro; Manako, Kazutaka; Osada, Morihiro

    2012-04-01

    This study evaluates the effects of co-gasification of municipal solid waste with and without municipal solid waste bottom ash, using two large-scale commercial operation plants. From the viewpoint of operation data, there is no significant difference between municipal solid waste treatment with and without the bottom ash. The carbon conversion ratios are as high as 91.7% and 95.3%, respectively, and this leads to significantly low PCDD/DF yields via complete syngas combustion. The gross power generation efficiencies are 18.9% with the bottom ash and 23.0% without it. The effects of the equivalence ratio are also evaluated: as the equivalence ratio increases, the carbon monoxide concentration decreases, while the carbon dioxide concentration and the syngas (top gas) temperature increase, and the carbon conversion ratio also increases. These tendencies are seen in both modes. Co-gasification using the gasification and melting system (Direct Melting System) makes it possible to recover materials effectively. More than 90% of chlorine is distributed in fly ash. Low-boiling-point heavy metals, such as lead and zinc, are distributed in fly ash at rates of 95.2% and 92.0%, respectively. Most high-boiling-point heavy metals, such as iron and copper, are distributed in metal. It is also clarified that the slag is stable and contains few harmful heavy metals such as lead. Compared with the conventional waste management framework, an 85% reduction of the final landfill amount is achieved by co-gasification of municipal solid waste with bottom ash and incombustible residues. These results indicate that the combined production of slag with co-gasification of municipal solid waste with the bottom ash constitutes an ideal approach to environmental conservation and resource recycling. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Large scale hydrogen production from wind energy in the Magallanes area for consumption in the central zone of Chile

    International Nuclear Information System (INIS)

    Zolezzi, J.M.; Garay, A.; Reveco, M.

    2010-01-01

    The energy proposal of this research suggests the use of places with abundant wind resources for the production of H2 on a large scale, to be transported and used in the central zone of Chile with the purpose of diversifying the country's energy matrix, decreasing its dependence on fossil fuels, increasing its autonomy, and covering future increases in energy demand. This research showed that the load factor of the proposed wind park reaches 54.5%, demonstrating the excellent wind conditions of the zone. This implies that the electricity produced by the wind park located in the Chilean Patagonia would have a cost of 0.0213 US$/kWh in the year 2030. The low price of the electricity obtained from the park, thanks to economies of scale and the huge wind potential, represents a very attractive scenario for the production of H2 in the future. The study concludes that by the year 2030 the cost of the H2 generated in Magallanes and transported to the port of Quinteros would be 18.36 US$/MBTU, while by that time the cost of oil would be about 17.241 US$/MBTU, a situation that places H2 in a very competitive position as a fuel. (author)

  3. Evaluation of hollow fiber culture for large-scale production of mouse embryonic stem cell-derived hematopoietic stem cells.

    Science.gov (United States)

    Nakano, Yu; Iwanaga, Shinya; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2018-03-03

    Hematopoietic stem cells (HSCs) have the ability to differentiate into all types of blood cells and can be transplanted to treat blood disorders. However, it is difficult to obtain HSCs in large quantities because of the shortage of donors. Recent efforts have focused on acquiring HSCs by differentiation of pluripotent stem cells. As a conventional differentiation method of pluripotent stem cells, the formation of embryoid bodies (EBs) is often employed. However, the size of EBs is limited by depletion of oxygen and nutrients, which prevents them from being efficient for the production of HSCs. In this study, we developed a large-scale hematopoietic differentiation approach for mouse embryonic stem (ES) cells by applying a hollow fiber (HF)/organoid culture method. Cylindrical organoids, which had the potential for further spontaneous differentiation, were established inside of hollow fibers. Using this method, we improved the proliferation rate of mouse ES cells to produce an increased HSC population and achieved around a 40-fold higher production volume of HSCs in HF culture than in conventional EB culture. Therefore, the HF/organoid culture method may be a new mass culture method to acquire pluripotent stem cell-derived HSCs.

  4. Developing Large-Scale Bayesian Networks by Composition: Fault Diagnosis of Electrical Power Systems in Aircraft and Spacecraft

    Science.gov (United States)

    Mengshoel, Ole Jakob; Poll, Scott; Kurtoglu, Tolga

    2009-01-01

    In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale Bayesian networks by composition. This compositional approach reflects how (often redundant) subsystems are architected to form systems such as electrical power systems. We develop high-level specifications, Bayesian networks, clique trees, and arithmetic circuits representing 24 different electrical power systems. The largest among these 24 Bayesian networks contains over 1,000 random variables. Another BN represents the real-world electrical power system ADAPT, which is representative of electrical power systems deployed in aerospace vehicles. In addition to demonstrating the scalability of the compositional approach, we briefly report on experimental results from the diagnostic competition DXC, where the ProADAPT team, using techniques discussed here, obtained the highest scores in both Tier 1 (among 9 international competitors) and Tier 2 (among 6 international competitors) of the industrial track. While we consider diagnosis of power systems specifically, we believe this work is relevant to other system health management problems, in particular in dependable systems such as aircraft and spacecraft. (See CASI ID 20100021910 for supplemental data disk.)
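
    A minimal sketch of the compositional idea using the open-source pgmpy library (assuming its BayesianNetwork/TabularCPD API; the fragment structure, probabilities and branch names are invented and far smaller than ADAPT): each redundant branch is a reusable network fragment, and fragments are merged into one diagnostic model.

      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      def branch(name):
          """One power branch: hidden health state -> observed sensor reading."""
          edges = [(f"{name}_health", f"{name}_sensor")]
          cpds = [
              TabularCPD(f"{name}_health", 2, [[0.99], [0.01]]),   # 0 = ok, 1 = faulty
              TabularCPD(f"{name}_sensor", 2,
                         [[0.95, 0.10],          # P(sensor = 0 | health)
                          [0.05, 0.90]],         # P(sensor = 1 | health)
                         evidence=[f"{name}_health"], evidence_card=[2]),
          ]
          return edges, cpds

      model, all_cpds = BayesianNetwork(), []
      for b in ["branchA", "branchB"]:           # compose two redundant subsystems
          edges, cpds = branch(b)
          model.add_edges_from(edges)
          all_cpds += cpds
      model.add_cpds(*all_cpds)
      assert model.check_model()

      # diagnosis: posterior health of branch A given an alarming sensor reading
      posterior = VariableElimination(model).query(
          variables=["branchA_health"], evidence={"branchA_sensor": 1})
      print(posterior)

    In a real system model the fragments would additionally share nodes (buses, loads), which is exactly what makes the compositional style reflect the redundant architecture described above.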

  5. Towards a Database System for Large-scale Analytics on Strings

    KAUST Repository

    Sahli, Majed

    2015-01-01

    Recent technological advances are causing an explosion in the production of sequential data. Biological sequences, web logs and time series are represented as strings. Currently, strings are stored, managed and queried in an ad-hoc fashion because

  6. An integrated assessment of a large-scale biodiesel production in Italy: Killing several birds with one stone?

    International Nuclear Information System (INIS)

    Russi, Daniela

    2008-01-01

    Biofuels are often presented as a contribution towards the solution of the problems related to our strong dependency on fossil fuels, i.e. greenhouse effect, energy dependency, urban pollution, besides being a way to support rural development. In this paper, an integrated assessment approach is employed to discuss the social desirability of a large-scale biodiesel production in Italy, taking into account social, environmental and economic factors. The conclusion is that the advantages in terms of reduction of greenhouse gas emissions, energy dependency and urban pollution would be very modest. The small benefits would not be enough to offset the huge costs in terms of land requirement: if the target of the European Directive 2003/30/EC were reached (5.75% of the energy used for transport by 2010) the equivalent of about one-third of the Italian agricultural land would be needed. The consequences would be a considerable increase in food imports and large environmental impacts in the agricultural phase. Also, since biodiesel must be de-taxed in order to make it competitive with oil-derived diesel, the Italian energy revenues would be reduced. In the end, rural development remains the only sound reason to promote biodiesel, but even for this objective other strategies look more advisable, like supporting organic agriculture. (author)

  7. Large-scale hydrological model river storage and discharge correction using a satellite altimetry-based discharge product

    Science.gov (United States)

    Emery, Charlotte Marie; Paris, Adrien; Biancamaria, Sylvain; Boone, Aaron; Calmant, Stéphane; Garambois, Pierre-André; Santos da Silva, Joecila

    2018-04-01

    Land surface models (LSMs) are widely used to study the continental part of the water cycle. However, even though their accuracy is increasing, inherent model uncertainties cannot be avoided. In the meantime, remotely sensed observations of continental water cycle variables such as soil moisture, lakes and river elevations are becoming more frequent and accurate. Therefore, those two different types of information can be combined, using data assimilation techniques, to reduce a model's uncertainties in its state variables and/or its input parameters. The objective of this study is to present a data assimilation platform that assimilates into the large-scale ISBA-CTRIP LSM a punctual river discharge product, derived from ENVISAT nadir altimeter water elevation measurements and rating curves, over the whole Amazon basin. To deal with the scale difference between the model and the observation, the study also presents an initial development of a localization treatment that limits the impact of observations to areas close to the observation and in the same hydrological network. This assimilation platform is based on the ensemble Kalman filter and can correct either the CTRIP river water storage or the discharge. The root mean square error (RMSE) compared to gauge discharges is globally reduced by up to 21%, and at Óbidos, near the outlet, the RMSE is reduced by up to 52% compared to the ENVISAT-based discharge.
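
    The analysis step at the core of such a platform is the stochastic ensemble Kalman filter update. Below is a compact sketch with illustrative dimensions and numbers (the real platform works on the full CTRIP state and applies the localization described above, e.g. by masking the gain outside the observed sub-network):

      import numpy as np

      rng = np.random.default_rng(42)
      n_ens, n_state = 20, 5                    # ensemble size, number of river reaches
      X = rng.normal(100.0, 10.0, (n_state, n_ens))      # prior storage ensemble

      H = np.zeros((1, n_state)); H[0, 2] = 0.05         # storage -> discharge at reach 2
      y_obs, r = np.array([6.0]), 0.25                   # altimetry-based discharge, obs var

      Xm = X.mean(axis=1, keepdims=True)
      A = X - Xm                                         # state anomalies
      S = H @ A                                          # observation-space anomalies
      P_yy = S @ S.T / (n_ens - 1) + r                   # innovation covariance
      K = (A @ S.T / (n_ens - 1)) / P_yy                 # Kalman gain (scalar observation)
      # localization would zero the entries of K for reaches far from the
      # observation or outside its hydrological network

      Y = y_obs[:, None] + rng.normal(0, np.sqrt(r), (1, n_ens))   # perturbed obs
      X_post = X + K @ (Y - H @ X)                       # analysis ensemble
      print("prior mean discharge:    ", float((H @ Xm).squeeze()))
      print("posterior mean discharge:", float((H @ X_post.mean(1, keepdims=True)).squeeze()))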

  8. A large-scale RF-based Indoor Localization System Using Low-complexity Gaussian filter and improved Bayesian inference

    Directory of Open Access Journals (Sweden)

    L. Xiao

    2013-04-01

    The growing convergence of mobile computing devices and smart sensors boosts the development of ubiquitous computing and smart spaces, where localization is an essential part of realizing the big vision. General localization methods based on GPS and cellular techniques are not suitable for tracking numerous small, power-limited objects indoors. In this paper, we propose and demonstrate a new localization method: an easy-to-set-up and cost-effective indoor localization system based on off-the-shelf active RFID technology. Our system is not only compatible with future smart spaces and ubiquitous computing systems, but also suitable for large-scale indoor localization. The use of a low-complexity Gaussian Filter (GF), a Wheel Graph Model (WGM) and a Probabilistic Localization Algorithm (PLA) makes the proposed algorithm robust to uncertainty, self-adaptive to varying indoor environments, and suitable for large-scale indoor positioning. Using MATLAB simulation, we study the system performance, especially its dependence on a number of system and environment parameters, and their statistical properties. The simulation results prove that our proposed system is an accurate and cost-effective candidate for indoor localization.
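
    Two of the named ingredients are simple to illustrate. The sketch below applies a Gaussian filter to raw RSSI readings and then performs a Bayesian grid update from the resulting range estimate; the path-loss constants, noise levels and grid are invented, and the paper's WGM/PLA machinery is not reproduced:

      import numpy as np

      rng = np.random.default_rng(3)
      rssi = -60 + rng.normal(0, 4, 50)         # raw readings from one reader (dBm)
      mu, sigma = rssi.mean(), rssi.std()
      kept = rssi[np.abs(rssi - mu) < 1.5 * sigma]       # Gaussian filter: drop outliers
      rssi_hat = kept.mean()

      A, n = -40.0, 2.5                          # invented path-loss model parameters
      d_est = 10 ** ((A - rssi_hat) / (10 * n))  # rssi = A - 10 n log10(d), solved for d

      xs = np.linspace(0, 10, 101)               # Bayesian update on a 10 m x 10 m grid,
      gx, gy = np.meshgrid(xs, xs)               # reader at the origin
      d_grid = np.hypot(gx, gy)
      prior = np.ones_like(d_grid) / d_grid.size
      likelihood = np.exp(-0.5 * ((d_grid - d_est) / 1.0) ** 2)    # 1 m ranging noise
      post = prior * likelihood
      post /= post.sum()
      iy, ix = np.unravel_index(post.argmax(), post.shape)
      # one reader yields a ring of likely cells; multiplying in the likelihoods
      # of several readers would localize the tag uniquely
      print(f"estimated range {d_est:.2f} m; MAP cell at ({xs[ix]:.1f}, {xs[iy]:.1f})")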

  9. Systems and methods for large-scale nanotemplate and nanowire fabrication

    KAUST Repository

    Vidal, Enrique Vilanova; Alfadhel, Ahmed; Ivanov, Iurii; Kosel, Jürgen

    2016-01-01

    Systems and methods for large-scale nanotemplate and nanowire fabrication are provided. The system can include a sample holder and one or more chemical containers fluidly connected to the sample holder. The sample holder can be configured to contain

  10. Model order reduction of large-scale dynamical systems with Jacobi-Davidson style eigensolvers

    NARCIS (Netherlands)

    Benner, P.; Hochstenbach, M.E.; Kürschner, P.

    2011-01-01

    Many applications concerning physical and technical processes employ dynamical systems for simulation purposes. The increasing demand for a more accurate and detailed description of realistic phenomena leads to high dimensional dynamical systems and hence, simulation often yields an increased

  11. Parallel real-time visualization system for large-scale simulation. Application to WSPEEDI

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Kitabata, Hideyuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    2000-01-01

    The real-time visualization system, PATRAS (PArallel TRAcking Steering system) has been developed on parallel computing servers. The system performs almost all of the visualization tasks on a parallel computing server, and uses image data compression technique for efficient communication between the server and the client terminal. Therefore, the system realizes high performance concurrent visualization in an internet computing environment. The experience in applying PATRAS to WSPEEDI (Worldwide version of System for Prediction Environmental Emergency Dose Information) is reported. The application of PATRAS to WSPEEDI enables users to understand behaviours of radioactive tracers from different release points easily and quickly. (author)

  12. Computer model for large-scale offshore wind-power systems

    Energy Technology Data Exchange (ETDEWEB)

    Dambolena, I G [Bucknell Univ., Lewisburg, PA; Rikkers, R F; Kaminsky, F C

    1977-01-01

    A computer-based planning model has been developed to evaluate the cost and simulate the performance of offshore wind-power systems. In these systems, the electricity produced by wind generators either directly satisfies demand or produces hydrogen by water electrolysis. The hydrogen is stored and later used to produce electricity in fuel cells. Using as inputs the basic characteristics of the system and historical or computer-generated time series for wind speed and electricity demand, the model simulates system performance over time. A history of the energy produced and the discounted annual cost of the system are used to evaluate alternatives. The output also contains information which is useful in pointing towards more favorable design alternatives. Use of the model to analyze a specific wind-power system for New England indicates that electric energy could perhaps be generated at a competitive cost.
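
    A skeletal version of the simulation loop just described, with invented capacities, efficiencies and time series: each hour, wind generation first serves demand, any surplus is electrolyzed into hydrogen storage, and deficits are met from storage through fuel cells.

      import numpy as np

      rng = np.random.default_rng(7)
      hours = 24 * 7
      wind = np.clip(rng.normal(60, 30, hours), 0, None)            # MW produced
      demand = 50 + 15 * np.sin(np.arange(hours) * 2 * np.pi / 24)  # MW load

      eta_elec, eta_fc = 0.7, 0.5     # electrolyser / fuel-cell efficiencies
      store, unserved = 0.0, 0.0      # hydrogen store (MWh of hydrogen energy), unmet load
      for w, d in zip(wind, demand):
          if w >= d:
              store += (w - d) * eta_elec          # surplus -> hydrogen
          else:
              need = d - w
              draw = min(need, store * eta_fc)     # hydrogen -> electricity via fuel cell
              store -= draw / eta_fc
              unserved += need - draw

      print(f"final storage {store:.1f} MWh, unserved energy {unserved:.1f} MWh")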

  13. Large-scale integration of wind power into the existing Chinese energy system

    DEFF Research Database (Denmark)

    Liu, Wen; Lund, Henrik; Mathiesen, Brian Vad

    2011-01-01

    This paper presents the ability of the existing Chinese energy system to integrate wind power and explores how the Chinese energy system needs to prepare itself in order to integrate more fluctuating renewable energy in the future. With this purpose in mind, a model of the Chinese energy system has been constructed by using EnergyPLAN based on the year 2007, which has then been used for investigating three issues. Firstly, the accuracy of the model itself has been examined, and then the maximum feasible wind power penetration in the existing energy system has been identified. Finally, barriers to integrating more fluctuating renewable energy have been discussed. Maintaining system stability, the maximum feasible wind power penetration in the existing Chinese energy system is approximately 26% from both technical and economic points of view. A fuel efficiency decrease occurred when increasing wind power penetration in the system, due to its rigid power supply structure and the tasks assigned to its power plants.

  14. Prospects and strategy for large scale utility applications of photovoltaic power systems

    International Nuclear Information System (INIS)

    Vigotti, R.; Lysen, E.; Cole, A.

    1996-01-01

    The status and prospects of photovoltaic (PV) power systems are reviewed. The market diffusion strategy for the application of PV systems by utilities is described, and the mission, objectives and thoughts of the collaboration programme launched among 18 industrialized countries under the framework of the International Energy Agency are highlighted, with particular reference to technology transfer to developing countries. Future sales of PV systems are expected to grow in the short and medium term, mainly in the sector of isolated systems. (R.P.)

  15. The Impact of an Extensive Usage of Controlled Natural Ventilation in the Residential Sector on Large-Scale Energy Systems

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan

    The energy situation in the world is becoming alarming. The demand for electricity continues to grow whereas the means of production remain limited. In addition, electricity generation in the world is mostly based on fossil fuels such as coal, oil and natural gas. Only a small share of the total [...] to the atmosphere. On the other hand, the efficiency of end-use energy consumption is also fundamental to decreasing electricity production and thus lowering the emission of greenhouse gases. Thereby, the building sector is a very important target because it consumes approximately one quarter of the total annual [...] be reflected in the reduction of the electricity production. The objective of the thesis is to show realistic benefits of utilizing natural ventilation in an extensive manner in large-scale scenarios, such as a national scenario, by using a model of natural ventilation developed here. To do so, a building

  16. Literature Review on Reasons and Countermeasures on Large-scale Off-grid of Wind Turbine Generator System

    Directory of Open Access Journals (Sweden)

    Zhu Jun

    2015-01-01

    This paper reviews the present situation of the application of wind turbine generator systems (WTGS) at home and abroad, describes the strategic significance and sustainable-development value of wind power for the country, illustrates the problems of large-scale off-grid events of WTGS, their various causes and the corresponding countermeasures, compares the advantages and disadvantages of the various methods, gives full consideration to the actual demands and characteristics of WTGS engineering, and points out directions for further research.

  17. Steady-state analysis of large scale systems : the successive lumping method

    NARCIS (Netherlands)

    Smit, L.C.

    2016-01-01

    The general area of research of this dissertation concerns large systems with random aspects to their behavior that can be modeled and studied in terms of the stationary distribution of Markov chains. As the state spaces of such systems become large, their behavior gets hard to analyze, either via

  18. Centralized configuration system for a large scale farm of network booted computers

    Science.gov (United States)

    Ballestrero, S.; Brasolin, F.; Dârlea, G.-L.; Dumitru, I.; Scannicchio, D. A.; Twomey, M. S.; Vâlsan, M. L.; Zaytsev, A.

    2012-12-01

    The ATLAS trigger and data acquisition online farm is composed of nearly 3,000 computing nodes, with various configurations, functions and requirements. Maintaining such a cluster is a big challenge from the computer administration point of view, thus various tools have been adopted by the System Administration team to help manage the farm efficiently. In particular, a custom central configuration system, ConfDBv2, was developed for the overall farm management. The majority of the systems are network booted, and are running an operating system image provided by a Local File Server (LFS) via the local area network (LAN). This method guarantees the uniformity of the system and allows, in case of issues, very fast recovery of the local disks which could be used as scratch area. It also provides greater flexibility as the nodes can be reconfigured and restarted with a different operating system in a very timely manner. A user-friendly web interface offers a quick overview of the current farm configuration and status, allowing changes to be applied on selected subsets or on the whole farm in an efficient and consistent manner. Also, various actions that would otherwise be time consuming and error prone can be quickly and safely executed. We describe the design, functionality and performance of this system and its web-based interface, including its integration with other CERN and ATLAS databases and with the monitoring infrastructure.

  19. On-line transient stability assessment of large-scale power systems by using ball vector machines

    International Nuclear Information System (INIS)

    Mohammadi, M.; Gharehpetian, G.B.

    2010-01-01

    In this paper, a ball vector machine (BVM) has been used for on-line transient stability assessment of large-scale power systems. To classify the system transient security status, a BVM has been trained for all contingencies. The proposed BVM-based security assessment algorithm requires very little training time and space in comparison with artificial neural networks (ANN), support vector machines (SVM) and other machine-learning based algorithms. In addition, the proposed algorithm has fewer support vectors (SVs) and is therefore faster than existing algorithms for on-line applications. A key step in applying any machine learning method is feature selection. In this paper, a new decision tree (DT) based feature selection technique is presented. The proposed BVM-based algorithm has been applied to the New England 39-bus power system. The simulation results show the effectiveness and stability of the proposed method for the on-line transient stability assessment of large-scale power systems. The proposed feature selection algorithm has been compared with different feature selection algorithms, and the simulation results demonstrate its effectiveness.
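
    The workflow reduces on-line assessment to a single evaluation of a classifier trained offline. The sketch below shows that pattern with synthetic features and labels; an ordinary support vector machine stands in for the BVM, which has no common off-the-shelf implementation:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n = 1000
      X = rng.normal(size=(n, 6))               # e.g. selected loadings, voltages, angles
      w = np.array([1.5, -2.0, 0.5, 0.0, 1.0, -0.7])
      y = (X @ w + 0.3 * rng.normal(size=n) > 0).astype(int)   # 1 = transiently secure

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)           # trained offline
      print("test accuracy:", round(clf.score(X_te, y_te), 3)) # on-line use: one predict()
      print("support vectors:", clf.n_support_.sum())          # fewer SVs -> faster on-line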

  20. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electrically driven HPs (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers. The analysis was based on using the energy model Balmorel to determine the optimum dispatch of HPs in the system. The potential heat sources in Copenhagen for use in HPs were determined based on data related to temperatures, flows, and hydrography at different locations, while respecting technical constraints

  1. Dynamic Arrest in Charged Colloidal Systems Exhibiting Large-Scale Structural Heterogeneities

    International Nuclear Information System (INIS)

    Haro-Perez, C.; Callejas-Fernandez, J.; Hidalgo-Alvarez, R.; Rojas-Ochoa, L. F.; Castaneda-Priego, R.; Quesada-Perez, M.; Trappe, V.

    2009-01-01

    Suspensions of charged liposomes are found to exhibit typical features of strongly repulsive fluid systems at short length scales, while showing structural heterogeneities at larger length scales that are characteristic of attractive systems. We model the static structure factor of these systems using effective pair interaction potentials composed of a long-range attraction and a shorter-range repulsion. Our modeling of the static structure yields conditions for dynamically arrested states at larger volume fractions, which we find to agree with the experimentally observed dynamics.

  2. Prospects and strategy for large scale utility applications of photovoltaic power systems

    International Nuclear Information System (INIS)

    Cole, A.; Vigotti, R.; Lysen, E.

    1995-01-01

    The paper reviews the status and prospects of photovoltaic power systems and the R&D trends (silicon performance, thin films, balance-of-system components), and describes the market diffusion strategy for the application of PV systems: in the short and medium term, isolated systems for rural electricity supply in IEA member countries and decentralized energy supply (remote users and village power) in developing countries; in the medium and long term, decentralized building integration in urban and rural areas, and power stations for peak power and local grid support. The objectives of the IEA collaboration programme launched among 18 industrialized countries are summarized, with particular reference to technology transfer to developing countries. 4 figs

  3. Performance of large-scale scientific applications on the IBM ASCI Blue-Pacific system

    International Nuclear Information System (INIS)

    Mirin, A.

    1998-01-01

    The IBM ASCI Blue-Pacific System is a scalable, distributed/shared memory architecture designed to reach multi-teraflop performance. The IBM SP pieces together a large number of nodes, each having a modest number of processors. The system is designed to accommodate a mixed programming model as well as a pure message-passing paradigm. We examine a number of applications on this architecture and evaluate their performance and scalability

  4. Performance Analysis of an Updraft Tower System for Dry Cooling in Large-Scale Power Plants

    Directory of Open Access Journals (Sweden)

    Haotian Liu

    2017-11-01

    An updraft tower cooling system is assessed for elimination of the water use associated with power plant heat rejection. Heat rejected from the power plant condenser is used to warm the air at the base of an updraft tower; buoyancy-driven air flows through a recuperative turbine inside the tower. The secondary loop, which couples the power plant condenser to a heat exchanger at the tower base, can be configured either as a constant-pressure pump cycle or as a vapor compression cycle. The novel use of a compressor can elevate the air temperature at the tower base to increase the turbine power recovery and decrease the power plant condensing temperature. The system's feasibility is evaluated by comparing the net power needed to operate the system against alternative dry cooling schemes. A thermodynamic model coupling all system components is developed for parametric studies and system performance evaluation. The model predicts that the constant-pressure pump cycle consumes less power than using a compressor: the extra compression power required for the temperature lift is much larger than the gain in turbine power output. The updraft tower system with a pumped secondary loop can thus allow dry cooling with less power plant efficiency penalty than air-cooled condensers.
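
    The headline comparison reduces to net-power bookkeeping. A back-of-the-envelope sketch in which every number is invented for illustration (the paper's thermodynamic model couples the actual components):

      Q_rejected = 1000.0          # MW of condenser heat routed to the tower base
      eta_turbine = 0.02           # fraction of rejected heat recovered by the turbine
      pump_power = 5.0             # MW to drive the constant-pressure pump cycle
      compressor_power = 60.0      # MW for the vapor-compression temperature lift
      extra_turbine_gain = 30.0    # MW of extra recovery enabled by the hotter air

      net_pump = Q_rejected * eta_turbine - pump_power
      net_compressor = Q_rejected * eta_turbine + extra_turbine_gain - compressor_power
      print(f"net power, pump cycle:       {net_pump:.1f} MW")
      print(f"net power, compressor cycle: {net_compressor:.1f} MW")  # lift costs more than it gains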

  5. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    Science.gov (United States)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to the high-probability region by an implicit step. It optimizes a nonlinear cost function which can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New advanced state estimation tools that will replace the old generation of state estimators should offer a general framework for these complexities, be able to address legacy software and integrate it in a mathematical framework, allowing the power industry the cautious and evolutionary change it needs, in contrast to a completely revolutionary approach, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a state estimation tool for the estimation of the states of a power system and presents the first implicit particle filter application study on power system state estimation. The implicit particle filter is introduced into power systems, and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.
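
    For orientation, a bootstrap particle filter for a scalar state is sketched below with a synthetic model and data; the implicit variant additionally solves, per particle, an optimization that places it in the high-probability region, a step this toy version omits:

      import numpy as np

      rng = np.random.default_rng(5)
      T, N = 50, 500                      # time steps, particles
      q, r = 0.05, 0.1                    # process / measurement noise variances

      truth = 1.0 + np.cumsum(rng.normal(0, np.sqrt(q), T))   # drifting state
      z = truth + rng.normal(0, np.sqrt(r), T)                # noisy measurements

      particles = rng.normal(1.0, 0.5, N)
      estimates = []
      for t in range(T):
          particles = particles + rng.normal(0, np.sqrt(q), N)   # propagate
          w = np.exp(-0.5 * (z[t] - particles) ** 2 / r)         # likelihood weights
          w += 1e-300                                            # guard against degeneracy
          w /= w.sum()
          particles = particles[rng.choice(N, size=N, p=w)]      # resample
          estimates.append(particles.mean())

      rmse = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
      print(f"filter RMSE: {rmse:.3f}")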

  6. Visualization environment of the large-scale data of JAEA's supercomputer system

    Energy Technology Data Exchange (ETDEWEB)

    Sakamoto, Kensaku [Japan Atomic Energy Agency, Center for Computational Science and e-Systems, Tokai, Ibaraki (Japan); Hoshi, Yoshiyuki [Research Organization for Information Science and Technology (RIST), Tokai, Ibaraki (Japan)

    2013-11-15

    In research and development across the various fields of nuclear energy, visualization of calculated data is especially useful for understanding simulation results in an intuitive way. Many researchers who run simulations on the supercomputer at the Japan Atomic Energy Agency (JAEA) are accustomed to transferring calculated data files from the supercomputer to their local PCs for visualization. In recent years, as calculated data have grown larger with improvements in supercomputer performance, reducing visualization processing time and using the JAEA network efficiently have become necessary. As a solution, we introduced a remote visualization system with the ability to utilize parallel processors on the supercomputer and to reduce network usage by transferring only the data of the intermediate visualization process. This paper reports a study on the performance of image processing with the remote visualization system. The visualization processing time is measured and the influence of network speed is evaluated by varying the drawing mode, the size of the visualization data and the number of processors. Based on this study, a guideline is provided showing how the remote visualization system can be used effectively. An upgrade policy for the next system is also presented. (author)

  7. Quadratic partial eigenvalue assignment in large-scale stochastic dynamic systems for resilient and economic design

    International Nuclear Information System (INIS)

    Das, Sonjoy; Goswami, Kundan; Datta, Biswa N.

    2014-01-01

    Failure of structural systems under dynamic loading can be prevented via active vibration control which shifts the damped natural frequencies of the systems away from the dominant range of loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.

  8. Quadratic partial eigenvalue assignment in large-scale stochastic dynamic systems for resilient and economic design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sonjoy; Goswami, Kundan [University at Buffalo, NY (United States); Datta, Biswa N. [Northern Illinois University, IL (United States)

    2014-12-10

    Failure of structural systems under dynamic loading can be prevented via active vibration control which shifts the damped natural frequencies of the systems away from the dominant range of loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.

  9. Power System Operation with Large-Scale Wind Power in Liberalised Environments

    International Nuclear Information System (INIS)

    Ummels, B.C.

    2009-01-01

    The disadvantages of producing electricity from fossil fuels are that their supply is finite and unevenly distributed across the earth. Conventional power stations also emit greenhouse gases. Therefore, sustainable alternatives must be developed, such as wind power. The disadvantages of wind are that it may or may not blow and that it is unpredictable. The generation of electricity must, however, always equal consumption, which makes the integration of wind power in the electricity system more difficult. This thesis investigates the integration of wind power into the existing power system. Simulation models are developed and used to explore the operation of power systems with large amounts of wind power. The simulations provide a picture of the reliability, cost and CO2 emissions of electricity generation, with and without wind power. The research also takes into account electricity exchange on international markets. Possible solutions for integrating wind power, such as flexible power plants and energy storage, are investigated as well.

  10. Solving large-scale sparse eigenvalue problems and linear systems of equations for accelerator modeling

    International Nuclear Information System (INIS)

    Gene Golub; Kwok Ko

    2009-01-01

    The solutions of sparse eigenvalue problems and linear systems constitute one of the key computational kernels in the discretization of partial differential equations for the modeling of linear accelerators. The computational challenges faced by existing techniques for solving those sparse eigenvalue problems and linear systems call for continuing research to improve the algorithms so that the ever-increasing problem sizes required by the physics applications can be tackled. Under the support of this award, the filter algorithm for solving large sparse eigenvalue problems was developed at Stanford to address the computational difficulties of previous methods, with the goal of enabling accelerator simulations on what was then the world's largest unclassified supercomputer at NERSC for this class of problems. Specifically, a new method, the Hermitian/skew-Hermitian splitting method, was proposed and researched as an improved method for solving linear systems with non-Hermitian positive definite and semidefinite matrices.
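
    For reference, the Hermitian/skew-Hermitian splitting (HSS) iteration referred to above is, in its standard published form (Bai, Golub and Ng), a two-step stationary iteration:

    ```latex
    % HSS iteration for A x = b, with A non-Hermitian positive (semi)definite.
    % Split A into its Hermitian and skew-Hermitian parts:
    %   H = (A + A^*)/2,   S = (A - A^*)/2,   A = H + S.
    % For a shift alpha > 0, alternate the two half-steps:
    (\alpha I + H)\,x^{(k+1/2)} = (\alpha I - S)\,x^{(k)} + b,
    \qquad
    (\alpha I + S)\,x^{(k+1)} = (\alpha I - H)\,x^{(k+1/2)} + b.
    ```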

  11. Systems and methods for large-scale nanotemplate and nanowire fabrication

    KAUST Repository

    Vidal, Enrique Vilanova

    2016-03-31

    Systems and methods for large-scale nanotemplate and nanowire fabrication are provided. The system can include a sample holder and one or more chemical containers fluidly connected to the sample holder. The sample holder can be configured to contain a solution and to releasably hold a substrate material within the solution. In other aspects, the system can include a robotic arm including a head configured to releasably hold a substrate material. The methods can include initiating a treatment step by moving a chemical solution from a chemical container to the sample holder to submerge the substrate material for a period of time. The methods can include moving the robotic arm to position the substrate in a chemical container. The treatment steps can be stopped by removing the chemical solution from the sample holder or by moving the robotic arm to remove the substrate from the chemical container. The treatment steps can include degreasing, polishing, rinsing, anodization, and deposition.

  12. SELECTIVE MODAL ANALYSIS OF POWER FLOW OSCILLATION IN LARGE SCALE LONGITUDINAL POWER SYSTEMS

    Directory of Open Access Journals (Sweden)

    Wirindi -

    2009-06-01

    Full Text Available A novel selective modal analysis for the determination of low-frequency power flow oscillation behaviour, based on eigenvalues with corresponding damping ratio, cumulative damping index, and participation factors, is proposed. The power system being investigated consists of three large longitudinally interconnected areas with some weak tie lines. Different modes, such as exciter modes, inter-area modes, and local modes of the dominant poles, are fully studied to determine the significance of system damping and of other factors producing power flow instability. The nature of the energy exchange between areas is determined, and strategic power flow stability improvements are developed and tested.

  13. Data management and analysis systems for large-scale hydrogeochemical reconnaissance

    International Nuclear Information System (INIS)

    Ferguson, R.B.; Maddox, J.H.; Wren, H.F.

    1976-01-01

    The Savannah River Laboratory has developed a versatile, computerized data recording, processing, updating, and retrieval system for handling an expected 150 million bytes of hydrogeochemical data from 150,000 to 200,000 sample sites over the next four years. A sub-system accounts for the movements of samples from initial receipt through final storage. Approximately 6 million sample movements are expected. Two- and three-dimensional plots of sampled geographic areas showing concentrations and locations of individual chemical elements are displayed and reproduced photographically. Pattern recognition techniques enable multivariate data to be categorized into "clusters" which indicate sites favorable for uranium exploration.

  14. Considerations regarding system engineering in large scale projects with heterogeneous contexts

    Science.gov (United States)

    Cremonini, A.; Caiazzo, M.; Hayden, D.; Labate, M. G.; Oulgin, R.; Santander-Vela, J.

    2016-08-01

    In this paper we share some considerations and lessons learned from our direct experience as system engineers on the SKA project, with emphasis on the personal experience of the first author. This is a very wide and ambitious program, involving several stakeholders with a degree of heterogeneity in cultural backgrounds, technological heritage, multidisciplinary interplay, motivations and competences that is without precedent. The role of the lead author is to amalgamate efforts in order to deliver the "MID telescope", and in that role he has often discovered that Systems Engineering means far more than a disciplined set of processes.

  15. Ocean Acidification Experiments in Large-Scale Mesocosms Reveal Similar Dynamics of Dissolved Organic Matter Production and Biotransformation

    Directory of Open Access Journals (Sweden)

    Maren Zark

    2017-09-01

    Full Text Available Dissolved organic matter (DOM) represents a major reservoir of carbon in the oceans. Environmental stressors such as ocean acidification (OA) potentially affect DOM production and degradation processes, e.g., phytoplankton exudation or microbial uptake and biotransformation of molecules. Resulting changes in the carbon storage capacity of the ocean may thus cause feedbacks on the global carbon cycle. Previous experiments studying OA effects on the DOM pool under natural conditions, however, were mostly conducted in temperate and coastal eutrophic areas. Here, we report on OA effects on the existing and newly produced DOM pool during an experiment in the subtropical North Atlantic Ocean at the Canary Islands during (1) an oligotrophic phase and (2) after simulated deep-water upwelling, the latter being a frequently occurring event in this region that controls nutrient and phytoplankton dynamics. We manipulated nine large-scale mesocosms with a gradient of pCO2 ranging from ~350 up to ~1,030 μatm and monitored the DOM molecular composition using ultrahigh-resolution Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). An increase of 37 μmol L−1 DOC was observed in all mesocosms during a phytoplankton bloom induced by the simulated upwelling. Indications of enhanced DOC accumulation under elevated CO2 became apparent during a phase of nutrient recycling toward the end of the experiment. The production of DOM was reflected in changes of the molecular DOM composition. Of the 7,212 molecular formulae detected throughout the experiment, ~50% correlated significantly in mass spectrometric signal intensity with cumulative bacterial protein production (BPP) and are likely a product of microbial transformation. However, no differences in the produced compounds were found with respect to CO2 levels. Comparing the results of this experiment with a comparable OA experiment in the Swedish Gullmar Fjord, reveals

  16. Theory and algorithms for solving large-scale numerical problems. Application to the management of electricity production

    International Nuclear Information System (INIS)

    Chiche, A.

    2012-01-01

    This manuscript deals with large-scale optimization problems, and more specifically with solving the electricity unit commitment problem arising at EDF. First, we focused on the augmented Lagrangian algorithm. The behavior of that algorithm on an infeasible convex quadratic optimization problem is analyzed. It is shown that the algorithm finds a point that satisfies the shifted constraints with the smallest possible shift in the sense of the Euclidean norm, and that it minimizes the objective on the corresponding shifted constrained set. The convergence to such a point is realized at a global linear rate, which depends explicitly on the augmentation parameter. This suggests a rule for determining the augmentation parameter to control the speed of convergence of the shifted constraint norm to zero. This rule has the advantage of generating bounded augmentation parameters even when the problem is infeasible. As a by-product, the algorithm computes the smallest translation in the Euclidean norm that makes the constraints feasible. Furthermore, this work provides solution methods for industrial stochastic optimization problems decomposed on a scenario tree, based on the progressive hedging algorithm introduced by [Rockafellar and Wets, 1991]. We also focus on the convergence of that algorithm. On the one hand, we offer a counter-example showing that the algorithm could diverge if its augmentation parameter is iteratively updated. On the other hand, we show how to recover the multipliers associated with the non-dualized constraints defined on the scenario tree from those associated with the corresponding constraints of the scenario subproblems. Their convergence is also analyzed for convex problems. The practical interest of these solution techniques is corroborated by numerical experiments performed on the electricity production management problem. We apply the progressive hedging algorithm to a realistic industrial problem. More precisely, we solve the French medium

  17. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control systems have been widely used in high-precision, complex industrial control equipment and large-tonnage loading devices. Such systems are often used to analyze the distribution of stress and displacement under complex bearing loads or in the mechanical structure itself. In the ITER GS mock-up with 5 flexible plates, for each load combination it is necessary to detect and measure potential slippage between the central flexible plate and the neighboring spacers, as well as between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controllers and boards, a computer system, a 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, and 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods by which the EDC220 digital controllers achieve synchronization control, and the R and D process of the multi-channel loading control and measurement software. (authors)

  18. Large-scale molecular dynamics simulations of self-assembling systems.

    Science.gov (United States)

    Klein, Michael L; Shinoda, Wataru

    2008-08-08

    Relentless increases in the size and performance of multiprocessor computers, coupled with new algorithms and methods, have led to novel applications of simulations across chemistry. This Perspective focuses on the use of classical molecular dynamics and so-called coarse-grain models to explore phenomena involving self-assembly in complex fluids and biological systems.

  19. Mizan: A system for dynamic load balancing in large-scale graph processing

    KAUST Repository

    Khayyat, Zuhair

    2013-01-01

    Pregel [23] was recently introduced as a scalable graph mining system that can provide significant performance improvements over traditional MapReduce implementations. Existing implementations focus primarily on graph partitioning as a preprocessing step to balance computation across compute nodes. In this paper, we examine the runtime characteristics of a Pregel system. We show that graph partitioning alone is insufficient for minimizing end-to-end computation. Especially where data is very large or the runtime behavior of the algorithm is unknown, an adaptive approach is needed. To this end, we introduce Mizan, a Pregel system that achieves efficient load balancing to better adapt to changes in computing needs. Unlike known implementations of Pregel, Mizan does not assume any a priori knowledge of the structure of the graph or behavior of the algorithm. Instead, it monitors the runtime characteristics of the system. Mizan then performs efficient fine-grained vertex migration to balance computation and communication. We have fully implemented Mizan; using extensive evaluation we show that - especially for highly-dynamic workloads - Mizan provides up to 84% improvement over techniques leveraging static graph pre-partitioning. © 2013 ACM.
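
    As a loose illustration of the idea only (not Mizan's actual algorithm, which also weighs communication and migration costs), the toy sketch below rebalances per-worker load after a superstep by migrating measured-cost vertices from the most loaded worker to the least loaded one; all names and numbers are hypothetical.

    ```python
    # Toy runtime load balancing by vertex migration (illustrative, not Mizan itself).
    def rebalance(workers):
        """workers: dict worker_id -> dict vertex_id -> measured cost (e.g. runtime).
        Greedily migrate vertices from the most to the least loaded worker whenever
        the move strictly reduces the load gap between the two."""
        loads = {w: sum(vs.values()) for w, vs in workers.items()}
        src = max(loads, key=loads.get)   # most loaded worker
        dst = min(loads, key=loads.get)   # least loaded worker
        for v in sorted(workers[src], key=workers[src].get, reverse=True):
            cost = workers[src][v]
            # Skip moves that would merely flip the imbalance the other way.
            if loads[src] - cost < loads[dst] + cost:
                continue
            workers[dst][v] = workers[src].pop(v)
            loads[src] -= cost
            loads[dst] += cost
        return workers

    # Example: worker 0 is overloaded; vertex 'b' migrates to worker 1.
    print(rebalance({0: {"a": 8.0, "b": 2.0}, 1: {"c": 1.0}}))
    ```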

  20. Dynamic classification system in large-scale supervision of energy efficiency in buildings

    International Nuclear Information System (INIS)

    Kiluk, S.

    2014-01-01

    Highlights: • Rough set approximation of classification improves energy efficiency prediction. • Dynamic features of diagnostic classification allow for its precise prediction. • Indiscernibility in a large population enhances identification of process features. • Diagnostic information can be refined by dynamic references to the local neighbourhood. • We introduce data exploration validation based on system dynamics and uncertainty. - Abstract: Data mining and knowledge discovery applied to billing data provide diagnostic instruments for the evaluation of energy use in buildings connected to a district heating network. To ensure the validity of an algorithm-based classification system, the dynamic properties of a sequence of partitions for consecutive detected events were investigated. The information on the dynamic properties of the classification system concerns the similarities between the supervised objects and the migrations that originate from changes in building energy use and loss of similarity to their neighbourhood, and thus represents a refinement of knowledge. In this study, we demonstrate that algorithm-based diagnostic knowledge has dynamic properties that can be exploited with a rough set predictor to evaluate whether the implementation of classification for supervision of energy use aligns with the dynamics of changes in the properties of district heating-supplied buildings. Moreover, we demonstrate the refinement of the current knowledge with the previous findings, and we present the creation of predictive diagnostic systems based on knowledge dynamics with a satisfactory level of classification error, even for non-stationary data.
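
    For context, a rough set describes a target class through lower and upper approximations under an indiscernibility relation. The minimal sketch below (illustrative only, not the paper's predictor; the example attributes are hypothetical) computes both approximations for a target set of objects.

    ```python
    # Minimal rough-set sketch: lower/upper approximation of a target set under the
    # indiscernibility relation induced by identical attribute values.
    from collections import defaultdict

    def approximations(objects, target):
        """objects: dict id -> attribute tuple; target: set of object ids."""
        classes = defaultdict(set)
        for oid, attrs in objects.items():
            classes[attrs].add(oid)          # indiscernibility classes
        lower, upper = set(), set()
        for eq in classes.values():
            if eq <= target:
                lower |= eq                  # certainly members of the class
            if eq & target:
                upper |= eq                  # possibly members of the class
        return lower, upper                  # boundary region = upper - lower

    # Hypothetical example: buildings described by (consumption band, load shape).
    objs = {1: ("high", "flat"), 2: ("high", "flat"), 3: ("low", "peaky")}
    print(approximations(objs, target={1, 3}))   # -> ({3}, {1, 2, 3})
    ```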

  1. LMI-Based Design of a Control of a Large-Scale System

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2008-01-01

    Vol. 2008, No. 3 (2008), pp. 1-3 ISSN 1827-6660 R&D Projects: GA ČR GP102/07/P413 Institutional research plan: CEZ:AV0Z10750506 Keywords: decentralized control * decomposition * linear matrix inequalities Subject RIV: BC - Control Systems Theory

  2. Season: Shelving Interference and Joint Identification in Large-scale RFID Systems

    DEFF Research Database (Denmark)

    Yang, Lei; Han, Jinsong; Qi, Yong

    2011-01-01

    Prior work on anti-collision for Radio Frequency IDentification (RFID) systems usually schedules adjacent readers to interrogate tags exclusively in order to avoid reader collisions. Although such a pattern can effectively deal with collisions, the lack of collaboration between readers wastes considerable time...

  3. Large-scale clinical comparison of the lysis-centrifugation and radiometric systems for blood culture

    International Nuclear Information System (INIS)

    Brannon, P.; Kiehn, T.E.

    1985-01-01

    The Isolator 10 lysis-centrifugation blood culture system (E. I. du Pont de Nemours and Co., Inc., Wilmington, Del.) was compared with the BACTEC radiometric method (Johnston Laboratories, Inc., Towson, Md.) with 6B and 7D broth media for the recovery of bacteria and yeasts. From 11,000 blood cultures, 1,174 clinically significant organisms were isolated. The Isolator system recovered significantly more total organisms, members of the family Enterobacteriaceae, Staphylococcus spp., and yeasts. The BACTEC system recovered significantly more Pseudomonas spp., Streptococcus spp., and anaerobes. Of the Isolator colony counts, 87% measured less than 11 CFU/ml of blood. Organisms, on average, were detected the same day from each of the two culture systems. Only 13 of the 975 BACTEC isolates (1.3%) were recovered by subculture of growth-index-negative bottles, and 12 of the 13 were detected in another broth blood culture taken within 24 h. Contaminants were recovered from 4.8% of the Isolator 10 and 2.3% of the BACTEC cultures.

  4. Large-scale demonstration test plan for digface data acquisition system

    International Nuclear Information System (INIS)

    Roybal, L.G.; Svoboda, J.M.

    1994-11-01

    Digface characterization promotes the use of online site characterization and monitoring during waste retrieval efforts, a need that arises from safety and efficiency considerations during the cleanup of a complex waste site. Information concerning conditions at the active digface can be used by operators as a basis for adjusting retrieval activities to reduce safety risks and to promote an efficient transition between retrieval and downstream operations. Most importantly, workers are given advance warning of upcoming dangerous conditions. In addition, detailed knowledge of digface conditions provides a basis for selecting tools and methods that avoid contamination spread and work stoppages. In FY-94, work began in support of a large-scale demonstration coordinating the various facets of a prototype digface remediation operation including characterization, contaminant suppression, and cold waste retrieval. This test plan describes the activities that will be performed during the winter of FY-95 that are necessary to assess the performance of the data acquisition and display system in its initial integration with hardware developed in the Cooperative Telerobotic Retrieval (CTR) program. The six specific objectives of the test are determining system electrical noise, establishing a dynamic background signature of the gantry crane and associated equipment, determining the resolution of the overall system by scanning over known objects, reporting the general functionality of the overall data acquisition system, evaluating the laser topographic functionality, and monitoring the temperature control features of the electronic package.

  5. Evaluating real-time Java for mission-critical large-scale embedded systems

    Science.gov (United States)

    Sharp, D. C.; Pla, E.; Luecke, K. R.; Hassan, R. J.

    2003-01-01

    This paper describes benchmarking results on an RT JVM. This paper extends previously published results by including additional tests, by being run on a recently available pre-release version of the first commercially supported RTSJ implementation, and by assessing results based on our experience with avionics systems in other languages.

  6. Efficient Key Management System for Large-scale Smart RFID Applications

    Directory of Open Access Journals (Sweden)

    Mohammad Fal Sadikin

    2015-08-01

    Full Text Available Due to its low cost and practicality, the integration of RFID tags with sensor nodes, called smart RFID, has become a prominent solution in various fields, including industrial applications. Nevertheless, the constrained nature of smart RFID systems introduces tremendous security and privacy problems. One of them is key management. Indeed, it is not feasible to recall all RFID tags in order to update their security properties (e.g. to update their private keys). On the other hand, common key management solutions like standard TLS/SSL are too heavyweight and can drain and overload the limited resources. Furthermore, most existing solutions are highly susceptible to various threats, ranging from privacy threats and physical attacks to various techniques of Man-in-the-Middle attack. This paper introduces a novel key management system tailored to the limited resources of smart RFID systems. It proposes lightweight mutual authentication and identity protection to mitigate the existing threats.

  7. Proportional and Integral Thermal Control System for Large Scale Heating Tests

    Science.gov (United States)

    Fleischer, Van Tran

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) Flight Loads Laboratory is a unique national laboratory that supports thermal, mechanical, thermal/mechanical, and structural dynamics research and testing. A Proportional Integral thermal control system was designed and implemented to support thermal tests. A thermal control algorithm supporting a quartz lamp heater was developed based on the Proportional Integral control concept and a linearized heating process. The thermal control equations were derived and expressed in terms of power levels, integral gain, proportional gain, and differences between thermal setpoints and skin temperatures. Besides the derived equations, the user's predefined thermal test information, generated in the form of thermal maps, was used to implement the thermal control system capabilities. Graphite heater closed-loop thermal control and graphite heater open-loop power level control were added later to meet the demand for higher-temperature tests. Verification and validation tests were performed to ensure that the thermal control system requirements were achieved. This thermal control system has successfully supported many milestone thermal and thermal/mechanical tests for almost a decade, with temperatures ranging from 50 F to 3000 F and temperature rise rates from -10 F/s to 70 F/s, for a variety of test articles having unique thermal profiles and test setups.
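
    The control law described (a power command built from proportional and integral terms on the setpoint/skin-temperature error, clamped to the heater's limits) can be sketched as follows; the gains, limits and temperatures are hypothetical placeholders, not the Flight Loads Laboratory values.

    ```python
    # Minimal PI power-command sketch (illustrative values, not the NASA system).
    def pi_power(setpoint, skin_temp, state, kp=0.8, ki=0.05, dt=0.1,
                 p_min=0.0, p_max=100.0):
        """Return a clamped lamp power command (%) from the setpoint/skin-temp error.
        state holds the running integral of the error between calls."""
        error = setpoint - skin_temp
        state["integral"] += error * dt
        power = kp * error + ki * state["integral"]
        return min(max(power, p_min), p_max)   # clamp to lamp power limits

    state = {"integral": 0.0}
    for temp in (70.0, 80.0, 95.0):            # measured skin temperatures (deg F)
        print(pi_power(setpoint=100.0, skin_temp=temp, state=state))
    ```

    A production controller would also need anti-windup handling so the integral term does not keep accumulating while the output is clamped at its limits.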

  8. Hierarchical, decentralized control system for large-scale smart-structures

    International Nuclear Information System (INIS)

    Algermissen, Stephan; Fröhlich, Tim; Monner, Hans Peter

    2014-01-01

    Active control of sound and vibration has gained much attention in all kinds of industries in the past decade. Future prospects for maximizing airline passenger comfort are especially promising. The objectives of recent research projects in this area are the reduction of noise transmission through thin walled structures such as fuselages, linings or interior elements. Besides different external noise sources, such as the turbulent boundary layer, rotor or jet noise, the actuator and sensor placement as well as different control concepts are addressed. Mostly, the work is focused on a single panel or section of the fuselage, neglecting the fact that for effective noise reduction the entire fuselage has to be taken into account. Nevertheless, extending the scope of an active system from a single panel to the entire fuselage increases the effort for control hardware dramatically. This paper presents a control concept for large structures using distributed control nodes. Each node has the capability to execute a vibration or noise controller for a specific part or section of the fuselage. For maintenance, controller tuning or performance measurement, all nodes are connected to a host computer via Universal Serial Bus (USB). This topology allows a partitioning and distributing of tasks. The nodes execute the low-level control functions. High-level tasks like maintenance, system identification and control synthesis are operated by the host using streamed data from the nodes. By choosing low-price nodes, a very cost effective way of implementing an active system for large structures is realized. Besides the system identification and controller synthesis on the host computer, a detailed view on the hardware and software concept for the nodes is given. Finally, the results of an experimental test of a system running a robust vibration controller at an active panel demonstrator are shown. (paper)

  9. Potential for large-scale solar collector system to offset carbon-based heating in the Ontario greenhouse sector

    Science.gov (United States)

    Semple, Lucas M.; Carriveau, Rupp; Ting, David S.-K.

    2018-04-01

    In the Ontario greenhouse sector the misalignment of available solar radiation during the summer months and large heating demand during the winter months makes solar thermal collector systems an unviable option without some form of seasonal energy storage. Information obtained from Ontario greenhouse operators has shown that over 20% of annual natural gas usage occurs during the summer months for greenhouse pre-heating prior to sunrise. A transient model of the greenhouse microclimate and indoor conditioning systems is carried out using TRNSYS software and validated with actual natural gas usage data. A large-scale solar thermal collector system is then incorporated and found to reduce the annual heating energy demand by approximately 35%. The inclusion of the collector system corresponds to a reduction of about 120 tonnes of CO2 equivalent emissions per acre of greenhouse per year. System payback period is discussed considering the benefits of a future Ontario carbon tax.

  10. Adaptive Fuzzy Output-Constrained Fault-Tolerant Control of Nonlinear Stochastic Large-Scale Systems With Actuator Faults.

    Science.gov (United States)

    Li, Yongming; Ma, Zhiyao; Tong, Shaocheng

    2017-09-01

    The problem of adaptive fuzzy output-constrained tracking fault-tolerant control (FTC) is investigated for large-scale stochastic nonlinear systems of pure-feedback form. The nonlinear systems considered in this paper possess unstructured uncertainties, unknown interconnected terms and unknown nonaffine nonlinear faults. Fuzzy logic systems are employed to identify the unknown lumped nonlinear functions so that the problem of unstructured uncertainties can be solved. An adaptive fuzzy state observer is designed to solve the problem of nonmeasurable states. By combining barrier Lyapunov function theory with adaptive decentralized and stochastic control principles, a novel fuzzy adaptive output-constrained FTC approach is constructed. All the signals in the closed-loop system are proved to be bounded in probability and the system outputs are constrained to a given compact set. Finally, the applicability of the proposed controller is demonstrated by a simulation example.

  11. Large-scale infection of the ascidian Ciona intestinalis by the gregarine Lankesteria ascidiae in an inland culture system.

    Science.gov (United States)

    Mita, Kaoru; Kawai, Narudo; Rueckert, Sonja; Sasakura, Yasunori

    2012-11-19

    An important way to keep transgenic and mutant lines of the ascidian Ciona intestinalis, a model system for, e.g., studies of genetic function, in laboratories is via culturing systems. Here we report a disease of C. intestinalis observed in an inland culturing system. The disease, called 'long feces syndrome,' manifests in affected animals through the following characteristic symptoms of the digestive system: (1) excretion of long and thin feces, (2) pale color of the stomach, and (3) congestion of the digestive tube by digested material. Severely diseased animals usually die within a week after the first symptoms occur, implying a high risk of this disease for ascidian culturing systems. The digestive tubes of diseased animals are occupied by the gregarine apicomplexan parasite Lankesteria ascidiae, suggesting that large-scale infection by this parasite is the cause of long feces syndrome.

  12. EXPERIMENTAL BUBBLE FORMATION IN A LARGE SCALE SYSTEM FOR NEWTONIAN AND NONNEWTONIAN FLUIDS

    Energy Technology Data Exchange (ETDEWEB)

    Leishear, R; Michael Restivo, M

    2008-06-26

    The complexities of bubble formation in liquids increase as the system size increases, and a photographic study is presented here to provide some insight into the dynamics of bubble formation for large systems. Air was injected at the bottom of a 28-foot-tall, 30-inch-diameter column. Different fluids were subjected to different air flow rates at different fluid depths. The fluids were water and non-Newtonian, Bingham plastic fluids, which have yield stresses requiring an applied force to initiate movement, or shearing, of the fluid. Tests showed that bubble formation was significantly different in the two types of fluids. In water, a field of numerous, distributed bubbles 1/4 to 3/8 inch in diameter was formed. In the Bingham fluid, large bubbles 6 to 12 inches in diameter were formed, with sizes depending on the air flow rate. This paper provides comprehensive photographic results related to bubble formation in these fluids.

  13. Challenges in Gaining Large Scale Carbon Reductions through Wireless Home Automation Systems

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Rovsing, Poul Ejnar; Toftegaard, Thomas Skjødeberg

    2010-01-01

    Buildings account for more than 35% of the energy consumption in Europe. Therefore a step towards a more sustainable lifestyle is to use home automation to optimize energy consumption “automatically”. This paper reports on the usage and some of the remaining challenges of especially wireless, but also powerline, communication in a home automation setting. For many years, home automation has been visible to many but accessible to only a few, because of inadequate integration of systems. A vast number of both standard and proprietary communication protocols are used, and systems are often difficult to install and configure, so professional assistance is needed. In this paper we report our experience in constructing an open universal home automation framework enabling interoperability of multiple communication protocols. The framework can easily be expanded in order to support...

  14. Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle

    Science.gov (United States)

    Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat

    1993-01-01

    The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, and problems, as well as the methods, tools, and constraints, that defined NASA's development of a 111 grid/16 million point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD-based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch-independent meshing. The resulting grid system has, on average, a four inch resolution along the surface.

  15. The application of sensitivity analysis to models of large scale physiological systems

    Science.gov (United States)

    Leonard, J. I.

    1974-01-01

    A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first order calculations of system behavior is presented.
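
    As a concrete illustration of parameter sensitivity analysis on a simple population model (logistic growth is chosen here for illustration; the record does not specify its model), the sketch below estimates normalized sensitivity coefficients S = (p/y)(dy/dp) by central finite differences. All parameter values are hypothetical.

    ```python
    # Normalized parameter sensitivities of a logistic growth model (illustrative).
    import numpy as np

    def logistic(t, r, K, y0=10.0):
        # Logistic population growth: growth rate r, carrying capacity K.
        return K / (1.0 + (K / y0 - 1.0) * np.exp(-r * t))

    def normalized_sensitivity(f, t, params, name, h=1e-4):
        """Central-difference estimate of S = (p / y) * dy/dp for parameter `name`."""
        up, dn = dict(params), dict(params)
        up[name] *= 1.0 + h                      # relative perturbation +h
        dn[name] *= 1.0 - h                      # relative perturbation -h
        dy_dp = (f(t, **up) - f(t, **dn)) / (2.0 * h * params[name])
        return params[name] * dy_dp / f(t, **params)

    params = {"r": 0.3, "K": 1000.0}
    for name in params:
        print(name, normalized_sensitivity(logistic, t=10.0, params=params, name=name))
    ```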

  16. Investigation of transducers for large-scale cryogenic systems in Italy

    International Nuclear Information System (INIS)

    Pavese, F.

    1984-01-01

    This chapter investigates temperature, pressure (static, absolute), strain and flowrate transducers. A modular cryostat system, which includes a superconducting solenoid, is used for the measurements. The module for pressure transducers allows them to be measured one at a time. Adiabatic conditions for the functional part of the strain-gage module are ensured by sliding thermal anchors. The equipment is driven by three computer-based systems which act separately. Magnetoresistance has been measured up to 6 T. Only foil-type strain gages were investigated; apparent strain was found to show a peculiar trend at liquid-helium temperatures. Four types of uncalibrated transducers specifically designed for low-temperature measurement of static, absolute pressure were tested.

  17. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    Science.gov (United States)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of the system are also discussed.
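
    The advantage the record points to, namely that analytic derivatives avoid the step-size dilemma of finite differences, can be seen on a textbook case: the axial-bar displacement u = PL/(EA) and its exact derivative with respect to the cross-sectional area A. All numbers below are hypothetical.

    ```python
    # Analytic vs. forward finite-difference derivative of a structural response
    # (illustrative textbook example, not the system described in the record).
    def u(P, L, E, A):
        return P * L / (E * A)                    # axial bar tip displacement

    def du_dA_exact(P, L, E, A):
        return -P * L / (E * A**2)                # exact analytic derivative

    def du_dA_fd(P, L, E, A, h):
        return (u(P, L, E, A + h) - u(P, L, E, A)) / h   # forward difference

    P, L, E, A = 1.0e4, 2.0, 2.0e11, 1.0e-4       # N, m, Pa, m^2 (hypothetical)
    exact = du_dA_exact(P, L, E, A)
    for h in (1e-5, 1e-8, 1e-12):                 # accuracy depends on step size
        fd = du_dA_fd(P, L, E, A, h)
        print(f"h={h:.0e}  fd={fd:.6e}  rel.err={(fd - exact) / exact:.2e}")
    ```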

  18. Structures and Techniques For Implementing and Packaging Complex, Large Scale Microelectromechanical Systems Using Foundry Fabrication Processes.

    Science.gov (United States)

    1996-06-01

    [Extraction fragments only; the abstract itself was not recovered. Surviving front matter includes two figure captions (Figure 5-27, "Mechanical interference between 'Pull Spring' devices"; Figure 5-28, "Array of LIGA mechanical relay switches") and glossary entries: DLC, diamond-like coating; DM, direct metal interconnect technique; DMD, Digital Micromirror Device; EDP, ethylene, diamine, pyrocatechol and water (silicon anisotropic etchant); MEMS, microelectromechanical systems; MOSIS, MOS Implementation Service; PGA, pin grid array (an electronic die package); PZT, lead-zirconate-titanate; LIGA, Lithographie... (truncated).]

  19. Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.

    Directory of Open Access Journals (Sweden)

    Sagar Indurkhya

    Full Text Available ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage linear in the number of reactions, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.

  20. Reaction Factoring and Bipartite Update Graphs Accelerate the Gillespie Algorithm for Large-Scale Biochemical Systems

    Science.gov (United States)

    Indurkhya, Sagar; Beal, Jacob

    2010-01-01

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage linear in the number of reactions, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models. PMID:20066048

  1. Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.

    Science.gov (United States)

    Indurkhya, Sagar; Beal, Jacob

    2010-01-06

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage linear in the number of reactions, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
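
    For reference, the baseline these records accelerate is the direct-method Gillespie stochastic simulation algorithm. A minimal sketch for a toy birth-death system follows; the LOLCAT optimizations (propensity factoring and the bipartite update graph) are not shown, and all rate constants are hypothetical.

    ```python
    # Direct-method Gillespie SSA for a toy birth-death system (illustrative).
    import math
    import random

    def gillespie(x, t_end, rates):
        """x: initial molecule count; reactions: birth at rate k1, death at rate k2*x."""
        k1, k2 = rates
        t = 0.0
        while True:
            a1, a2 = k1, k2 * x                        # reaction propensities
            a0 = a1 + a2
            if a0 == 0.0:
                break                                  # no reaction can fire
            t += -math.log(1.0 - random.random()) / a0 # exponential waiting time
            if t >= t_end:
                break                                  # next event lies past the horizon
            x += 1 if random.random() * a0 < a1 else -1  # pick reaction with prob a_i/a0
        return x

    random.seed(1)
    print([gillespie(x=50, t_end=10.0, rates=(5.0, 0.1)) for _ in range(5)])
    ```

    The propensity-update loop above is exactly the cost that factoring and the bipartite dependency graph reduce when thousands of reactions share common species.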

  2. Georeferenced and secure mobile health system for large scale data collection in primary care.

    Science.gov (United States)

    Sa, Joao H G; Rebelo, Marina S; Brentani, Alexandra; Grisi, Sandra J F E; Iwaya, Leonardo H; Simplicio, Marcos A; Carvalho, Tereza C M B; Gutierrez, Marco A

    2016-10-01

    Mobile health consists of applying mobile devices and communication capabilities to expand the coverage and improve the effectiveness of health care programs. The technology is particularly promising for developing countries, in which health authorities can take advantage of the flourishing mobile market to provide adequate health care to underprivileged communities, especially in primary care. In Brazil, the Primary Care Information System (SIAB) receives primary health care data from all regions of the country, creating a rich database for health-related action planning. Family Health Teams (FHTs) collect these data in periodic visits to families enrolled in governmental programs, following an acquisition procedure that involves filling in paper forms. This procedure compromises the quality of the data provided to health care authorities and slows down the decision-making process. The objective was to develop a mobile system (GeoHealth) that addresses and overcomes the aforementioned problems, and to deploy the proposed solution in a wide underprivileged metropolitan area of a major city in Brazil. The proposed solution comprises three main components: (a) an Application Server, with a database containing family health conditions; and two clients, (b) a Web Browser running visualization tools for management tasks, and (c) a data-gathering device (smartphone) to register and georeference the family health data. A data security framework was designed to ensure the security of the data, which were stored locally and transmitted over public networks. The system was successfully deployed at six primary care units in the city of Sao Paulo, where a total of 28,324 families/96,061 inhabitants are regularly followed up under government health policies. The health conditions observed in the covered population were: diabetes in 3.40%, hypertension (age >40) in 23.87% and tuberculosis in 0.06%. This estimated prevalence has enabled FHTs to set clinical appointments proactively, with the aim of

  3. Multi-Sensing system for outdoor thermal monitoring: Application to large scale civil engineering components

    Science.gov (United States)

    Crinière, Antoine; Dumoulin, Jean; Manceau, Jean-Luc; Perez, Laetitia; Bourquin, Frederic

    2014-05-01

    Aging of transport infrastructures combined with traffic and climatic solicitations contributes to the reduction of their performance. Addressing and quantifying the resilience of civil engineering structures requires investigation of robust, fast and efficient methods. Among the research carried out at IFSTTAR, methods for long-term monitoring face increasing demand. Such work benefits from the technological progress made in the ICT domain over the last decade. The present study follows the ISTIMES European project [1], which aimed to demonstrate the ability of different electromagnetic sensing techniques, processing methods and ICT architectures to be used for long-term monitoring of critical transport infrastructures. Within this project, a multi-sensing system able to date and synchronize infrared thermography measurements with various other measurement data (i.e., weather parameters) was designed, developed and implemented on a real site [2]. Among the experiments carried out on real transport infrastructure, it was shown for the "Musmesci" bridge deck (Italy) that, by using infrared thermal image sequences together with weather measurements over several days, analysis methods able to produce qualitative and quantitative data could be developed [3]. In the present study, functionalities were added to the "IrLAW" system in order to reach full autonomy in terms of power supply, very long-term measurement capability (at least one year) and automated database feeding. The surveyed civil engineering structures consist of two concrete beams, each 16 m long and weighing 21 t. One of the two beams was damaged by a high-energy mechanical impact at the IFSTTAR falling-rocks test station located in the French Alps [4]. The system is composed of one uncooled microbolometric IR camera (FLIR SC325) with a 320×240 focal plane array detector in band III, a VAISALA WXT520 weather station, a GPS, a failover power supply

  4. Simulation of large-scale soil water systems using groundwater data and satellite based soil moisture

    Science.gov (United States)

    Kreye, Phillip; Meon, Günter

    2016-04-01

    Complex concepts for the physically correct depiction of dominant processes in the hydrosphere are increasingly at the forefront of hydrological modelling. Many scientific issues in hydrological modelling demand additional system variables besides a simulation of runoff alone, such as groundwater recharge or soil moisture conditions. Models that include soil water simulations are either very simplified or require a high number of parameters. Against this backdrop there is a heightened demand for observations to be used to calibrate the model. A reasonable integration of groundwater data or remote sensing data into calibration procedures, as well as the identifiability of physically plausible sets of parameters, is a subject of research in the field of hydrology. Since such data are often combined with conceptual models, the given interfaces are not suitable for these demands. Furthermore, the application of automated optimisation procedures is generally associated with conceptual models, whose (fast) computing times allow many iterations of the optimisation in an acceptable time frame. One of the main aims of this study is to reduce the discrepancy between scientific and practical applications in the field of hydrological modelling. Therefore, the soil model DYVESOM (DYnamic VEgetation SOil Model) was developed as one of the primary components of the hydrological modelling system PANTA RHEI. DYVESOM's structure provides the required interfaces for calibration against runoff, satellite-based soil moisture and groundwater levels. The model considers spatially and temporally differentiated feedback of the development of the vegetation on the soil system. In addition, small-scale heterogeneities of soil properties (subgrid variability) are parameterized by variation of van Genuchten parameters depending on distribution functions. Different sets of parameters are operated simultaneously while interacting with each other. The developed soil model is innovative regarding concept
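
    For reference, the van Genuchten parameters varied per subgrid cell define the standard soil-water retention curve relating water content to pressure head:

    ```latex
    % van Genuchten retention curve; theta_r and theta_s are the residual and
    % saturated water contents, alpha and n are shape parameters, h is pressure head.
    \theta(h) = \theta_r + \frac{\theta_s - \theta_r}{\bigl[\,1 + (\alpha\,|h|)^{n}\,\bigr]^{m}},
    \qquad m = 1 - \frac{1}{n}.
    ```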

  5. Development and application of a large scale river system model for National Water Accounting in Australia

    Science.gov (United States)

    Dutta, Dushmanta; Vaze, Jai; Kim, Shaun; Hughes, Justin; Yang, Ang; Teng, Jin; Lerat, Julien

    2017-04-01

    Existing global and continental scale river models, mainly designed for integration with global climate models, are of very coarse spatial resolution and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing water accounts, which have become increasingly important for water resources planning and management at regional and national scales. A continental scale river system model called the Australian Water Resource Assessment River System model (AWRA-R) has been developed and implemented for national water accounting in Australia using a node-link architecture. The model includes the major hydrological processes, anthropogenic water utilisation and storage routing that influence streamflow in both regulated and unregulated river systems. Two key components of the model are an irrigation model, to compute water diversion for irrigation use and the associated fluxes and stores, and a storage-based floodplain inundation model, to compute overbank flow from river to floodplain and the associated floodplain fluxes and stores. The results in the Murray-Darling Basin show highly satisfactory performance of the model, with median daily Nash-Sutcliffe Efficiency (NSE) of 0.64 and median annual bias of less than 1% for the calibration period (1970-1991), and median daily NSE of 0.69 and median annual bias of 12% for the validation period (1992-2014). The results have demonstrated that the performance of the model is less satisfactory when key processes such as overbank flow, groundwater seepage and irrigation diversion are switched off. The AWRA-R model, which has been operationalised by the Australian Bureau of Meteorology for continental scale water accounting, has contributed to improvements in the national water account by substantially reducing the unaccounted difference volume (gain/loss).
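
    For reference, the skill scores quoted above are commonly defined as follows (one standard convention; the record does not give its exact bias formula):

    ```latex
    % Nash-Sutcliffe Efficiency (NSE) and percent bias for simulated flows Q_s
    % against observed flows Q_o with observed mean \bar{Q}_o:
    \mathrm{NSE} = 1 - \frac{\sum_{t} \left(Q_{s,t} - Q_{o,t}\right)^{2}}
                           {\sum_{t} \left(Q_{o,t} - \bar{Q}_{o}\right)^{2}},
    \qquad
    \mathrm{Bias}\,(\%) = 100 \cdot \frac{\sum_{t} \left(Q_{s,t} - Q_{o,t}\right)}{\sum_{t} Q_{o,t}}.
    ```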

  6. An assessment of future computer system needs for large-scale computation

    Science.gov (United States)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  7. Demonstration of Regenerable, Large-Scale Ion Exchange System Using WBA Resin in Rialto, CA

    Science.gov (United States)

    2012-12-01

    [Extraction fragments only; the abstract itself was not recovered. Recoverable content: an analyte was measured down to 0.10 ppb using IC/MS/MS; nitrosamines were analyzed using EPA Method 521; NDMA was 2.6 ppt with a detection limit of 2 ppt; the target nitrosamines NDEA, NDMA, NDBA, NDPA, NMEA, NMOR, NPIP, and NPYR each have a reportable limit of 2 ng/L; "NDEA and NPIP were..." (truncated). Abbreviations in the fragments: NDMA, N-nitrosodimethylamine; NPDES, National Pollutant Discharge Elimination System; O&M, Operation and Maintenance.]

  8. Environmental aspects of large-scale wind-power systems in the UK

    Science.gov (United States)

    Robson, A.

    1984-11-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the UK are discussed. Noise, television interference, hazards to bird life, and visual effects are considered. Areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first UK machines to be introduced in a safe and environmentally acceptable manner. Research to establish siting criteria more clearly, and significantly increase the potential wind-energy resource, is mentioned. Studies of the comparative risk of energy systems are shown to be overpessimistic for UK wind turbines.

  9. Evaluating the Reliability of Emergency Response Systems for Large-Scale Incident Operations

    Science.gov (United States)

    2010-01-01

    [Extraction fragments only, including the caption of Figure 4.4, "Simplified Model of an Emergency Response to a Chlorine Release". Recoverable text: "...get them to an environment where they..."; "...of the complexity, picturing the interconnections in this network mapping makes it easier to go beyond what was included in Table 5.2 and, starting..."; "...commercial media to the use of specialized alert systems (e.g., an email push alert network). As with previous steps, the time it takes for a message to be..." (truncated).]

  10. Revisiting the Euganean Geothermal System (NE Italy) - insights from large scale hydrothermal modelling

    Science.gov (United States)

    Pola, Marco; Cacace, Mauro; Fabbri, Paolo; Piccinini, Leonardo; Zampieri, Dario; Dalla Libera, Nico

    2017-04-01

    As one of the largest and most extensively utilized geothermal systems in northern Italy, the Euganean Geothermal System (EGS, Veneto region, NE Italy) has long been the subject of still ongoing studies. Hydrothermal waters feeding the system are of meteoric origin and infiltrate in the Veneto Prealps, to the north of the main geothermal area. The waters circulate for approximately 100 km in the subsurface of the central Veneto, outflowing with temperatures from 65°C to 86°C to the southwest near the cities of Abano Terme and Montegrotto Terme. The naturally emerging waters are mainly used for balneotherapeutic purposes, forming the famous Euganean spa district. This preferential outflow is thought to have a relevant structural component producing a high secondary permeability localized within an area of limited extent (approx. 25 km2). This peculiar structure is associated with a local network of fractures resulting from transtensional tectonics of the regional Schio-Vicenza fault system (SVFS) bounding the Euganean Geothermal Field (EGF). In the present study, a revised conceptual hydrothermal model for the EGS based on the regional hydrogeology and structural geology is proposed. In particular, this work aims to quantify: (1) the role of the regional SVFS, and (2) the impact of the dense local fracture mesh beneath the EGF on the regional-to-local groundwater circulation at depth and its thermal configuration. 3D coupled flow and heat transport numerical simulations inspired by the newly developed conceptual model are carried out to quantify the results of these interactions. Consistent with the observations, the obtained results indicate that temperatures in the EGF reservoir are higher than in the surrounding areas, despite a uniform basal crustal heat inflow. In addition, they point to a structural causative process for the localized outflow, in which deep-seated groundwater is preferentially

  11. Shift of large-scale atmospheric systems over Europe during late MIS 3 and implications for Modern Human dispersal.

    Science.gov (United States)

    Obreht, Igor; Hambach, Ulrich; Veres, Daniel; Zeeden, Christian; Bösken, Janina; Stevens, Thomas; Marković, Slobodan B; Klasen, Nicole; Brill, Dominik; Burow, Christoph; Lehmkuhl, Frank

    2017-07-19

    Understanding the past dynamics of large-scale atmospheric systems is crucial for our knowledge of palaeoclimate conditions in Europe. Southeastern Europe currently lies at the border between the Atlantic, Mediterranean, and continental climate zones. Past changes in the relative influence of the associated atmospheric systems must have been recorded in the region's palaeoarchives. By comparing high-resolution grain-size, environmental magnetic and geochemical data from two loess-palaeosol sequences in the Lower Danube Basin with other Eurasian palaeorecords, we reconstructed past climatic patterns over Southeastern Europe and the related interaction of the prevailing large-scale circulation modes over Europe, especially during late Marine Isotope Stage 3 (40,000-27,000 years ago). We demonstrate that during this time interval, the intensification of the Siberian High had a crucial influence on European climate, causing more continental conditions over major parts of Europe and a southwards shift of the Westerlies. Such a climatic and environmental change, combined with the Campanian Ignimbrite/Y-5 volcanic eruption, may have driven the Anatomically Modern Human dispersal towards Central and Western Europe, pointing to a corridor over the Eastern European Plain as an important pathway in their dispersal.

  12. Congestion management in power systems. Long-term modeling framework and large-scale application

    Energy Technology Data Exchange (ETDEWEB)

    Bertsch, Joachim; Hagspiel, Simeon; Just, Lisa

    2015-06-15

    In liberalized power systems, generation and transmission services are unbundled, but remain tightly interlinked. Congestion management in the transmission network is of crucial importance for the efficiency of these inter-linkages. Different regulatory designs have been suggested, analyzed and followed, such as uniform zonal pricing with redispatch or nodal pricing. However, the literature has either focused on the short-term efficiency of congestion management or on specific issues of investment timing. In contrast, this paper presents a generalized and flexible economic modeling framework based on a decomposed inter-temporal equilibrium model including generation and transmission, as well as their inter-linkages. Short- and long-term effects of different congestion management designs can hence be analyzed. Specifically, we are able to identify and isolate implicit frictions and sources of inefficiency in the different regulatory designs, and to provide a comparative analysis including a benchmark against a first-best welfare-optimal result. To demonstrate the applicability of our framework, we calibrate and numerically solve our model for a detailed representation of the Central Western European (CWE) region, consisting of 70 nodes and 174 power lines. Analyzing six different congestion management designs until 2030, we show that compared to the first-best benchmark, i.e., nodal pricing, inefficiencies of up to 4.6% arise. Inefficiencies are mainly driven by the approach used to determine cross-border capacities as well as the coordination of transmission system operators' activities.
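
    A toy illustration of the nodal-pricing benchmark described above: a two-node dispatch problem in which the duals of the nodal balance constraints yield the nodal prices, and congestion on the single line makes them diverge across nodes. This is a hedged Python sketch with invented costs and limits, not the authors' CWE model.

        from scipy.optimize import linprog

        # Toy nodal-pricing dispatch (DC flow, 2 nodes, 1 line). Illustrative
        # only: costs, capacities and the line limit are invented, not taken
        # from the Bertsch et al. model.
        cost = [20.0, 50.0, 0.0]            # EUR/MWh for g1, g2; flow is free
        # Variables: [g1, g2, f] with nodal balances g1 - f = d1, g2 + f = d2.
        A_eq = [[1.0, 0.0, -1.0],
                [0.0, 1.0,  1.0]]
        b_eq = [0.0, 100.0]                 # all 100 MW of demand sits at node 2
        bounds = [(0.0, 80.0),              # g1 capacity (cheap unit)
                  (0.0, 120.0),             # g2 capacity (expensive unit)
                  (-60.0, 60.0)]            # thermal limit of the single line

        res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
        g1, g2, f = res.x
        print(f"dispatch: g1={g1:.0f} MW, g2={g2:.0f} MW, flow={f:.0f} MW")
        # Duals of the nodal balances are the nodal prices (sign convention per
        # SciPy/HiGHS); with the line congested they differ across nodes --
        # exactly the locational signal that zonal designs with redispatch blur.
        print("nodal prices (EUR/MWh):", res.eqlin.marginals)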

  13. Cattle mammary bioreactor generated by a novel procedure of transgenic cloning for large-scale production of functional human lactoferrin.

    Directory of Open Access Journals (Sweden)

    Penghua Yang

    Full Text Available Large-scale production of biopharmaceuticals by current bioreactor techniques is limited by low transgenic efficiency and low expression of foreign proteins. In general, a bacterial artificial chromosome (BAC) harboring most regulatory elements is capable of overcoming these limitations, but transferring a BAC into donor cells is difficult. We describe here the use of a cattle mammary bioreactor to produce functional recombinant human lactoferrin (rhLF) by a novel procedure of transgenic cloning, which employs microinjection to generate transgenic somatic cells as donor cells. Bovine fibroblast cells were co-microinjected for the first time with a 150-kb BAC carrying the human lactoferrin gene and a marker gene. The resulting transfection efficiency of up to 15.79 x 10(-2) percent was notably higher than that of electroporation and lipofection. Following somatic cell nuclear transfer, we obtained two transgenic cows that secreted rhLF at high levels, 2.5 g/l and 3.4 g/l, respectively. The rhLF had a pattern of glycosylation and proteolytic susceptibility similar to that of its natural human counterpart. Biochemical analysis revealed that the iron-binding and releasing properties of rhLF were identical to those of native hLF. Importantly, an antibacterial experiment further demonstrated that rhLF was functional. Our results indicate that co-microinjection of a BAC and a marker gene into donor cells for somatic cell cloning indeed improves transgenic efficiency. Moreover, cattle mammary bioreactors generated with this novel procedure can produce functional rhLF on an industrial scale.

  14. Large-Scale Urban Projects, Production of Space and Neo-liberal Hegemony: A Comparative Study of Izmir

    Directory of Open Access Journals (Sweden)

    Mehmet PENPECİOĞLU

    2013-04-01

    Full Text Available With the rise of neo-liberalism, large-scale urban projects (LDPs) have become a powerful mechanism of urban policy. Creating spaces of neo-liberal urbanization such as central business districts, tourism centers, gated residences and shopping malls, LDPs play a role not only in the reproduction of capital accumulation relations but also in the shift of urban political priorities towards the construction of neo-liberal hegemony. The construction of neo-liberal hegemony, and the role played by LDPs in this process, cannot be investigated through the analysis of capital accumulation alone. For such an investigation, the roles of state and civil society actors in LDPs and their collaborative and conflictual relationships should be researched, and their functions in hegemony should be revealed. In the case of Izmir's two LDPs, namely the New City Center (NCC) and Inciraltı Tourism Center (ITC) projects, this study analyzes the relationship between the production of space and neo-liberal hegemony. In the NCC project, local governments, investors, local capital organizations and professional chambers collaborated and disseminated a hegemonic discourse, which provided social support for the project. Through these relationships and discourses, the NCC project has become a hegemonic project for producing space and has constructed neo-liberal hegemony over urban political priorities. In contrast to the NCC project, the ITC project saw no collaboration between state and organized civil society actors. The social opposition against the ITC project, initiated by professional chambers, has brought legal action against the ITC development plans in order to prevent their implementation. As a result, the ITC project did not acquire the consent of organized social groups and failed to become a hegemonic project for producing space.

  15. Impact of Wind Shear and Tower Shadow Effects on Power System with Large Scale Wind Power Penetration

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Chen, Zhe

    2011-01-01

    Grid connected wind turbines are fluctuating power sources due to wind speed variations, the wind shear and the tower shadow effects. The fluctuating power may be able to excite the power system oscillation at a frequency close to the natural oscillation frequency of a power system. This paper presents a simulation model of a variable speed wind farm with permanent magnet synchronous generators (PMSGs) and full-scale back-to-back converters in the simulation tool DIgSILENT/PowerFactory. In this paper, the impacts of wind shear and tower shadow effects on the small signal stability of power systems with large scale wind power penetrations are investigated during continuous operation, based on the wind turbine model and the power system model.
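
    The two effects named above have standard textbook forms: a power-law vertical shear profile and a potential-flow dipole deficit near the tower. The hedged Python sketch below, with invented turbine parameters rather than anything from the paper, shows how the two combine into the 3p equivalent-wind ripple that drives the power fluctuation.

        import numpy as np

        # Per-blade wind speed for a 3-blade rotor under power-law wind shear
        # and a potential-flow dipole tower shadow -- common model forms for
        # the two effects, with invented parameters (not code from the paper).
        V_HUB, H, R = 10.0, 80.0, 40.0       # hub wind (m/s), hub height, radius (m)
        ALPHA = 0.2                           # shear exponent
        A_TOW, X_DIST = 2.0, 5.0              # tower radius, rotor overhang (m)

        def blade_wind(theta, r=0.75 * R):
            """Wind seen at radius r on a blade at azimuth theta (0 = up)."""
            z = H + r * np.cos(theta)                       # element height
            v = V_HUB * (z / H) ** ALPHA                    # wind shear term
            y = r * np.sin(theta)                           # lateral offset from tower
            shadow = V_HUB * A_TOW**2 * (y**2 - X_DIST**2) / (y**2 + X_DIST**2) ** 2
            return v + np.where(np.cos(theta) < 0, shadow, 0.0)  # lower half only

        theta = np.linspace(0.0, 2.0 * np.pi, 361)
        # Rotor-average wind: three blades spaced 120 degrees apart. The result
        # ripples three times per revolution -- the 3p fluctuation that can
        # excite the power system oscillations discussed above.
        v_eq = sum(blade_wind(theta + k * 2.0 * np.pi / 3.0) for k in range(3)) / 3.0
        print(f"3p ripple: {v_eq.min():.3f} to {v_eq.max():.3f} m/s")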

  16. Methods of Model Reduction for Large-Scale Biological Systems: A Survey of Current Methods and Trends.

    Science.gov (United States)

    Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J

    2017-07-01

    Complex models of biochemical reaction systems have become increasingly common in the systems biology literature. The complexity of such models can present a number of obstacles for their practical use, often making problems difficult to intuit or computationally intractable. Methods of model reduction can be employed to alleviate the issue of complexity by seeking to eliminate those portions of a reaction network that have little or no effect upon the outcomes of interest, hence yielding simplified systems that retain an accurate predictive capacity. This review paper seeks to provide a brief overview of a range of such methods and their application in the context of biochemical reaction network models. To achieve this, we provide a brief mathematical account of the main methods including timescale exploitation approaches, reduction via sensitivity analysis, optimisation methods, lumping, and singular value decomposition-based approaches. Methods are reviewed in the context of large-scale systems biology type models, and future areas of research are briefly discussed.
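
    Among the surveyed families, the singular value decomposition-based approaches are the simplest to illustrate compactly. The hedged sketch below applies proper orthogonal decomposition to snapshots of an invented 50-species linear reaction chain (not a model from the survey), keeping only the dominant modes.

        import numpy as np

        # POD/SVD model reduction sketch on a toy linear reaction network
        # dx/dt = A x. The 50-species chain is invented for illustration.
        n = 50
        A = -np.eye(n) + 0.4 * np.diag(np.ones(n - 1), -1)   # feed-forward chain

        # Collect snapshots of the full model (explicit Euler for brevity).
        dt, steps = 0.01, 2000
        x = np.zeros(n); x[0] = 1.0
        snaps = []
        for _ in range(steps):
            x = x + dt * A @ x
            snaps.append(x.copy())
        X = np.array(snaps).T                     # n x steps snapshot matrix

        # SVD: keep the r dominant modes capturing 99.9% of the variance.
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
        Ur = U[:, :r]
        Ar = Ur.T @ A @ Ur                        # reduced r x r dynamics
        print(f"reduced {n} states to r={r} modes")

        # Reduced simulation, lifted back to full space for comparison.
        z = Ur.T @ np.eye(n)[:, 0]
        for _ in range(steps):
            z = z + dt * Ar @ z
        print("terminal-state error:", np.linalg.norm(Ur @ z - x))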

  17. Planning alternative organizational frameworks for a large scale educational telecommunications system served by fixed/broadcast satellites

    Science.gov (United States)

    Walkmeyer, J.

    1973-01-01

    This memorandum explores a host of considerations meriting attention from those who are concerned with designing organizational structures for development and control of a large scale educational telecommunications system using satellites. Part of a broader investigation at Washington University into the potential uses of fixed/broadcast satellites in U.S. education, this study lays ground work for a later effort to spell out a small number of hypothetical organizational blueprints for such a system and for assessment of potential short and long term impacts. The memorandum consists of two main parts. Part A deals with subjects of system-wide concern, while Part B deals with matters related to specific system components.

  18. Third generation participatory design in health informatics--making user participation applicable to large-scale information system projects.

    Science.gov (United States)

    Pilemalm, Sofie; Timpka, Toomas

    2008-04-01

    Participatory Design (PD) methods in the field of health informatics have mainly been applied to the development of small-scale systems with homogeneous user groups in local settings. Meanwhile, health service organizations are becoming increasingly large and complex in character, making it necessary to extend the scope of the systems that are used for managing data, information and knowledge. This study reports participatory action research on the development of a PD framework for large-scale system design. The research was conducted in a public health informatics project aimed at developing a system for 175,000 users. A renewed PD framework was developed in response to six major limitations experienced to be associated with the existing methods. The resulting framework preserves the theoretical grounding, but extends the toolbox to suit applications in networked health service organizations. Future research should involve evaluations of the framework in other health service settings where comprehensive HISs are developed.

  19. A concurrent visualization system for large-scale unsteady simulations. Parallel vector performance on an NEC SX-4

    International Nuclear Information System (INIS)

    Takei, Toshifumi; Doi, Shun; Matsumoto, Hideki; Muramatsu, Kazuhiro

    2000-01-01

    We have developed a concurrent visualization system RVSLIB (Real-time Visual Simulation Library). This paper shows the effectiveness of the system when it is applied to large-scale unsteady simulations, for which the conventional post-processing approach may no longer work, on high-performance parallel vector supercomputers. The system performs almost all of the visualization tasks on a computation server and uses compressed visualized image data for efficient communication between the server and the user terminal. We have introduced several techniques, including vectorization and parallelization, into the system to minimize the computational costs of the visualization tools. The performance of RVSLIB was evaluated by using an actual CFD code on an NEC SX-4. The computational time increase due to the concurrent visualization was at most 3% for a smaller (1.6 million) grid and less than 1% for a larger (6.2 million) one. (author)

  20. A cellphone based system for large-scale monitoring of black carbon

    Science.gov (United States)

    Ramanathan, N.; Lukac, M.; Ahmed, T.; Kar, A.; Praveen, P. S.; Honles, T.; Leong, I.; Rehman, I. H.; Schauer, J. J.; Ramanathan, V.

    2011-08-01

    Black carbon aerosols are a major component of soot and are also a major contributor to global and regional climate change. Reliable and cost-effective systems to measure near-surface black carbon (BC) mass concentrations (hereafter denoted as [BC]) globally are necessary to validate air pollution and climate models and to evaluate the effectiveness of BC mitigation actions. Toward this goal we describe a new wireless, low-cost, ultra-low-power, BC cellphone-based monitoring system (BC_CBM). BC_CBM integrates a Miniaturized Aerosol filter Sampler (MAS) with a cellphone for filter image collection, transmission and image analysis for determining [BC] in real time. The BC aerosols in the air accumulate on the MAS quartz filter, resulting in a coloration of the filter. A photograph of the filter is captured by the cellphone camera and transmitted by the cellphone to the analytics component of BC_CBM. The analytics component compares the image with a calibrated reference scale (also included in the photograph) to estimate [BC]. We demonstrate with field data collected from vastly differing environments, ranging from Southern California to rural regions in the Indo-Gangetic plains of Northern India, that the total BC deposited on the filter is directly and uniquely related to the reflectance of the filter in the red wavelength, irrespective of its source or how the particles were deposited. [BC] varied from 0.1 to 1 μg m-3 in Southern California and from 10 to 200 μg m-3 in rural India in our field studies. In spite of the 3 orders of magnitude variation in [BC], the BC_CBM system was able to determine [BC] well within the experimental error of two independent reference instruments for both indoor air and outdoor ambient air. Accurate, global-scale measurements of [BC] in urban and remote rural locations, enabled by the wireless, low-cost, ultra-low-power operation of BC_CBM, will make it possible to better capture the large spatial and temporal variations in
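
    The analytics step described above reduces to inverting a calibration curve relating red-channel filter reflectance to the BC load on the filter, then dividing by the sampled air volume. The sketch below assumes an exponential-decay calibration form with invented constants; the abstract states only that red-wavelength reflectance uniquely determines the BC deposit.

        import numpy as np

        # Sketch of the BC_CBM analytics step: red-channel reflectance -> [BC].
        # The exponential calibration form and its constants are assumptions
        # for illustration, not the published BC_CBM calibration.
        def bc_loading_from_reflectance(r_red, r_blank=0.95, k=0.012):
            """Invert R = R_blank * exp(-k * load) for BC load (ug/cm^2)."""
            return -np.log(np.clip(r_red / r_blank, 1e-6, 1.0)) / k

        def bc_concentration(r_red, filter_area_cm2=0.8, sampled_air_m3=0.5):
            """[BC] in ug/m^3 from reflectance, filter area and sampled volume."""
            load = bc_loading_from_reflectance(r_red)
            return load * filter_area_cm2 / sampled_air_m3

        for r in (0.90, 0.60, 0.20):
            print(f"reflectance {r:.2f} -> [BC] ~ {bc_concentration(r):7.1f} ug/m^3")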

  1. Liquid Methane Testing With a Large-Scale Spray Bar Thermodynamic Vent System

    Science.gov (United States)

    Hastings, L. J.; Bolshinskiy, L. G.; Hedayat, A.; Flachbart, R. H.; Sisco, J. D.; Schnell, A. R.

    2014-01-01

    NASA's Marshall Space Flight Center conducted liquid methane testing in November 2006 using the multipurpose hydrogen test bed outfitted with a spray bar thermodynamic vent system (TVS). The basic objective was to identify any unusual or unique thermodynamic characteristics associated with densified methane that should be considered in the design of space-based TVSs. Thirteen days of testing were performed with total tank heat loads ranging from 720 to 420 W at a fill level of approximately 90%. It was noted that as the fluid passed through the Joule-Thomson expansion, thermodynamic conditions consistent with the pervasive presence of metastability were indicated. This Technical Publication describes the conditions that correspond with metastability and its detrimental effects on TVS performance. The observed conditions were primarily functions of methane densification and helium pressurization; therefore, assurance must be provided that metastable conditions have been circumvented in future applications of thermodynamic venting to in-space methane storage.

  2. Environmental aspects of large-scale wind-power systems in the UK

    Energy Technology Data Exchange (ETDEWEB)

    Robson, A

    1983-12-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the U.K. are discussed. Areas of interest include noise, television interference, hazards to bird life and visual effects. A number of areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first U.K. machines to be introduced in a safe and environmentally acceptable manner. Research currently under way will serve to establish siting criteria more clearly, and could significantly increase the potential wind-energy resource. Certain studies of the comparative risk of energy systems are shown to be overpessimistic for U.K. wind turbines.

  3. A literature review for large-scale health information system project planning, implementation and evaluation.

    Science.gov (United States)

    Sligo, Judith; Gauld, Robin; Roberts, Vaughan; Villa, Luis

    2017-01-01

    Information technology is perceived as a potential panacea for healthcare organisations to manage pressure to improve services in the face of increased demand. However, the implementation and evaluation of health information systems (HIS) is plagued with problems and implementation shortcomings and failures are rife. HIS implementation is complex and relies on organisational, structural, technological, and human factors to be successful. It also requires reflective, nuanced, multidimensional evaluation to provide ongoing feedback to ensure success. This article provides a comprehensive review of the literature about evaluating and implementing HIS, detailing the challenges and recommendations for both evaluators and healthcare organisations. The factors that inhibit or promote successful HIS implementation are identified and effective evaluation strategies are described with the goal of informing teams evaluating complex HIS.

  4. On the network protocol performance evaluation for large scale communication system of nuclear plant

    International Nuclear Information System (INIS)

    Song, K. S.; Lee, T. H.; Kim, H. R.; Kim, D. H.; Ku, I. S.

    1998-01-01

    Computer technology has advanced dramatically, and it is now natural to apply digital network technology in nuclear plants. The communication architecture for a nuclear plant defines the coordination of safety reactor control, balance of plant, subsystem utilities, and plant monitoring functions, how they are connected, and their user interface, so as to guarantee plant performance and safety requirements. Therefore, implementing a digital network for the control and monitoring systems of an advanced nuclear plant requires systematic design and evaluation procedures because of the responsive, hard real-time process characteristics of nuclear plants. In this paper, we evaluate several digital network protocols in terms of network delay and the effects of link failures on hard real-time requirements under full-scale traffic.

  5. A microfluidic platform for generating large-scale nearly identical human microphysiological system arrays

    Science.gov (United States)

    Hsu, Yu-Hsiang; Moya, Monica L.; Hughes, Christopher C.W.; George, Steven C.; Lee, Abraham P.

    2013-01-01

    This paper reports a polydimethylsiloxane microfluidic model system that can develop an array of nearly identical human microtissues with interconnected vascular networks. The microfluidic system design is based on an analogy with an electric circuit, applying resistive circuit concepts to design pressure dividers in serially-connected microtissue chambers. A long microchannel (550, 620 and 775 mm) creates a resistive circuit with a large hydraulic resistance. Two media reservoirs with a large cross-sectional area and of different heights are connected to the entrance and exit of the long microchannel to serve as a pressure source and create a near-constant pressure drop along the long microchannel. The microtissue chambers (0.12 μl) serve as two-terminal resistive components with an input impedance more than 50-fold larger than that of the long microchannel. Connecting each microtissue chamber to two different positions along the long microchannel creates a series of pressure dividers. Each microtissue chamber enables a controlled pressure drop across a segment of the microchannel without altering the hydrodynamic behaviour of the microchannel. The result is a controlled and predictable microphysiological environment within the microchamber. Interstitial flow, a mechanical cue for stimulating vasculogenesis, was verified by finite element simulation and experiments. The simplicity of this design enabled the development of multiple microtissue arrays (5, 12, and 30 microtissues) by co-culturing endothelial cells, stromal cells, and fibrin within the microchambers over two- and three-week periods. This methodology enables the culturing of a large array of microtissues with interconnected vascular networks for biological studies and applications such as drug development. PMID:23723013
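
    The electric-circuit analogy translates directly into Hagen-Poiseuille arithmetic: the long channel is a series resistor carrying a near-constant flow, and each chamber taps a segment of it as a pressure divider. The Python sketch below uses the standard rectangular-channel resistance approximation with invented dimensions; only the 620 mm channel length is taken from the abstract.

        # Electric-circuit analogy for the microtissue-array design above:
        # a long channel acts as a series resistor chain and each chamber
        # taps a segment as a pressure divider. Dimensions are invented.
        MU = 1e-3                 # medium viscosity, Pa*s (water-like, assumed)

        def channel_resistance(length_m, w=100e-6, h=100e-6):
            """Approximate hydraulic resistance of a rectangular microchannel."""
            return 12 * MU * length_m / (w * h**3 * (1 - 0.63 * h / w))

        L_total = 0.62            # one of the reported channel lengths, m
        dP_total = 100.0          # reservoir height difference -> ~100 Pa (assumed)
        R_total = channel_resistance(L_total)
        Q = dP_total / R_total    # near-constant "current" through the channel

        # A chamber tapping a 20 mm segment sees that segment's pressure drop,
        # provided its own impedance is >50x the channel's (as the paper requires).
        dP_chamber = Q * channel_resistance(0.02)
        print(f"flow through channel: {Q * 1e12:.2f} nL/s")
        print(f"pressure across one chamber: {dP_chamber:.1f} Pa "
              f"({dP_chamber / dP_total:.1%} of total)")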

  6. Co-gasification of municipal solid waste and material recovery in a large-scale gasification and melting system

    International Nuclear Information System (INIS)

    Tanigaki, Nobuhiro; Manako, Kazutaka; Osada, Morihiro

    2012-01-01

    Highlights: ► This study evaluates the effects of co-gasification of MSW with MSW bottom ash. ► No significant difference between MSW treatment with and without MSW bottom ash. ► PCDD/DF yields are significantly low because of the high carbon conversion ratio. ► Slag quality is significantly stable and slag contains few hazardous heavy metals. ► The final landfill amount is reduced and materials are recovered by the DMS process. - Abstract: This study evaluates the effects of co-gasification of municipal solid waste with and without municipal solid waste bottom ash using two large-scale commercial operation plants. From the viewpoint of operation data, there is no significant difference between municipal solid waste treatment with and without the bottom ash. The carbon conversion ratios are as high as 91.7% and 95.3%, respectively, and this leads to significantly low PCDD/DF yields via complete syngas combustion. The gross power generation efficiencies are 18.9% with and 23.0% without municipal solid waste bottom ash, respectively. The effects of the equivalence ratio are also evaluated. As the equivalence ratio increases, the carbon monoxide concentration decreases, while carbon dioxide and the syngas temperature (top gas temperature) increase. The carbon conversion ratio is also increased. These tendencies are seen in both modes. Co-gasification using the gasification and melting system (Direct Melting System) has the potential to recover materials effectively. More than 90% of chlorine is distributed in fly ash. Low-boiling-point heavy metals, such as lead and zinc, are distributed in fly ash at rates of 95.2% and 92.0%, respectively. Most high-boiling-point heavy metals, such as iron and copper, are distributed in metal. It is also clarified that the slag is stable and contains few harmful heavy metals such as lead. Compared with the conventional waste management framework, 85% of the final landfill amount reduction is achieved by

  7. EFFECTS OF OXYGEN AND AIR MIXING ON VOID FRACTIONS IN A LARGE SCALE SYSTEM

    International Nuclear Information System (INIS)

    Leishear, R; Hector Guerrero, H; Michael Restivo, M

    2008-01-01

    Oxygen and air mixing with spargers was performed in a 30 foot tall by 30 inch diameter column to investigate mass transfer as air sparged up through the column and removed saturated oxygen from solution. The mixing techniques required to support this research are the focus of this paper. The fluids tested included water, water with an antifoam agent (AFA), and a high-solids-content, Bingham plastic, nuclear waste simulant with AFA, referred to as AZ101 simulant, which is non-radioactive. Mixing of fluids in the column was performed using a recirculation system and an air sparger. The re-circulation system consisted of the column, a re-circulating pump, and associated piping. The air sparger was fabricated from a two inch diameter pipe concentrically installed in the column and open near the bottom of the column. The column contents were slowly re-circulated while fluids were mixed with the air sparger. Samples were rheologically tested to ensure effective mixing, as required. Once the fluids were adequately mixed, oxygen was homogeneously added through the re-circulation loop using a sintered metal oxygen sparger followed by a static mixer. Then the air sparger was re-actuated to remove oxygen from solution as air bubbled up through the solution. To monitor mixing effectiveness, several variables were monitored, including flow rates, oxygen concentration, differential pressures along the column height, fluid levels, and void fractions, which are defined as the percent of dissolved gas divided by the total volume of gas and liquid. Research showed that mixing was uniform for water and water with AFA, but mixing for the AZ101 fluid was far more complex. Although mixing of AZ101 was uniform throughout most of the column, gas entrapment and settling of solids significantly affected test results. The detailed test results presented here provide some insight into the complexities of mixing and void fractions for different fluids and how the mixing process itself

  8. EFFECTS OF OXYGEN AND AIR MIXING ON VOID FRACTIONS IN A LARGE SCALE SYSTEM

    Energy Technology Data Exchange (ETDEWEB)

    Leishear, R; Hector Guerrero, H; Michael Restivo, M

    2008-09-11

    Oxygen and air mixing with spargers was performed in a 30 foot tall by 30 inch diameter column to investigate mass transfer as air sparged up through the column and removed saturated oxygen from solution. The mixing techniques required to support this research are the focus of this paper. The fluids tested included water, water with an antifoam agent (AFA), and a high-solids-content, Bingham plastic, nuclear waste simulant with AFA, referred to as AZ101 simulant, which is non-radioactive. Mixing of fluids in the column was performed using a recirculation system and an air sparger. The re-circulation system consisted of the column, a re-circulating pump, and associated piping. The air sparger was fabricated from a two inch diameter pipe concentrically installed in the column and open near the bottom of the column. The column contents were slowly re-circulated while fluids were mixed with the air sparger. Samples were rheologically tested to ensure effective mixing, as required. Once the fluids were adequately mixed, oxygen was homogeneously added through the re-circulation loop using a sintered metal oxygen sparger followed by a static mixer. Then the air sparger was re-actuated to remove oxygen from solution as air bubbled up through the solution. To monitor mixing effectiveness, several variables were monitored, including flow rates, oxygen concentration, differential pressures along the column height, fluid levels, and void fractions, which are defined as the percent of dissolved gas divided by the total volume of gas and liquid. Research showed that mixing was uniform for water and water with AFA, but mixing for the AZ101 fluid was far more complex. Although mixing of AZ101 was uniform throughout most of the column, gas entrapment and settling of solids significantly affected test results. The detailed test results presented here provide some insight into the complexities of mixing and void fractions for different fluids and how the mixing process itself
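
    The void-fraction measurement described in both records pairs naturally with the differential-pressure instrumentation: in a bubbly column, the hydrostatic pressure difference between two taps falls as gas holdup rises. The sketch below backs a void fraction out of a tap-pair reading under that simple hydrostatic assumption; densities and readings are invented, and wall friction and gas density are neglected.

        # Estimating column void fraction from differential pressure, in the
        # spirit of the tap-pair instrumentation above. Tap spacing, density
        # and readings are invented for illustration.
        G = 9.81          # m/s^2
        RHO_L = 1250.0    # Bingham-plastic simulant density, kg/m^3 (assumed)
        DZ = 1.0          # vertical distance between pressure taps, m

        def void_fraction(dp_measured_pa):
            """Gas holdup between two taps: alpha = 1 - dP/(rho_l * g * dz).
            Neglects gas density, wall friction, and acceleration terms."""
            return 1.0 - dp_measured_pa / (RHO_L * G * DZ)

        for dp in (12260.0, 11030.0, 9810.0):
            print(f"dP = {dp:7.1f} Pa  ->  void fraction = {void_fraction(dp):.1%}")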

  9. Lightning Surge Analysis on a Large Scale Grid-Connected Solar Photovoltaic System

    Directory of Open Access Journals (Sweden)

    Nur Hazirah Zaini

    2017-12-01

    Full Text Available Solar photovoltaic (PV) farms currently play a vital role in the generation of electrical power in different countries, such as Malaysia, which is moving toward the use of renewable energy. Malaysia is one of the countries with abundant sunlight and thus can use solar PV farms as alternative sources for electricity generation. However, lightning strikes frequently occur in the country. Being installed in open and flat areas, solar PV farms, especially their electronic components, are at great risk of damage caused by lightning. In this paper, the effects of lightning currents with different peak currents and waveshapes on grid-connected solar PV farms were determined to approximate the level of transient effect that can damage solar PV modules, inverters and transformers. Depending on the location of the solar PV farm, engineers can obtain information on the peak current and median current of the site from the lightning location system (LLS) and utilise the results obtained in this study to appropriately assign a surge protective device (SPD) to protect the solar panel, the inverter and the main panel connected to the grid. The simulation results therefore serve as the basis for controlling the effects of lightning strikes on electrical equipment and power grids, providing proper justification of where an SPD should be installed and what its rating should be. This judgment will reduce the expensive cost of repairing and replacing electrical equipment damaged by lightning.
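
    Surge studies of this kind commonly parameterize the injected stroke with a Heidler function, which fixes both the peak current and the waveshape; the abstract does not state the paper's exact waveforms, so the 10/350 μs first-stroke parameters below are an assumption. The specific energy computed at the end is a standard input when selecting an SPD rating.

        import numpy as np

        # Heidler-function stroke current, a usual waveshape parameterization
        # in lightning surge studies (assumed here, not taken from the paper).
        # Parameters: common 10/350 us first-stroke values, 100 kA peak.
        def heidler(t, i_peak=100e3, tau1=19e-6, tau2=485e-6, n=10, eta=0.93):
            x = (t / tau1) ** n
            return (i_peak / eta) * x / (1.0 + x) * np.exp(-t / tau2)

        t = np.linspace(0.0, 2e-3, 200001)
        i = heidler(t)
        dt = t[1] - t[0]
        print(f"peak: {i.max() / 1e3:.1f} kA at t = {t[i.argmax()] * 1e6:.0f} us")
        # Specific energy W/R = integral of i^2 dt, a standard quantity when
        # sizing an SPD for a given installation point.
        print(f"specific energy W/R: {np.sum(i**2) * dt / 1e6:.2f} MJ/ohm")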

  10. Large Scale Proteomic Data and Network-Based Systems Biology Approaches to Explore the Plant World.

    Science.gov (United States)

    Di Silvestre, Dario; Bergamaschi, Andrea; Bellini, Edoardo; Mauri, PierLuigi

    2018-06-03

    The investigation of plant organisms by means of data-derived systems biology approaches based on network modeling is mainly characterized by genomic data, while the potential of proteomics is largely unexplored. This delay is mainly caused by the paucity of plant genomic/proteomic sequences and annotations, which are fundamental for mass-spectrometry (MS) data interpretation. However, Next Generation Sequencing (NGS) techniques are contributing to filling this gap, and an increasing number of studies are focusing on plant proteome profiling and protein-protein interaction (PPI) identification. Interesting results were obtained by evaluating the topology of PPI networks in the context of organ-associated biological processes as well as plant-pathogen relationships. These examples foreshadow the benefits that such approaches may provide to plant research. Thus, in addition to providing an overview of the main omic technologies recently used on plant organisms, we will focus on studies that rely on the concepts of module, hub and shortest path, and how they can contribute to plant discovery processes. In this scenario, we will also consider gene co-expression networks, and some examples of integration with metabolomic data and genome-wide association studies (GWAS) to select candidate genes will be mentioned.

  11. The UP modelling system for large scale hydrology: simulation of the Arkansas-Red River basin

    Directory of Open Access Journals (Sweden)

    C. G. Kilsby

    1999-01-01

    Full Text Available The application of the UP (Upscaled Physically-based) hydrological modelling system to the Arkansas-Red River basin (USA) is described. The system is designed for macro-scale simulations of land surface processes, aims for a physical basis, and avoids the use of discharge records in the direct calibration of parameters. This is achieved in a two-stage process: in the first stage, parametrizations are derived from detailed modelling of selected representative small areas; these are then used in a second stage, in which a simple distributed model simulates the dynamic behaviour of the whole basin. The first stage of the process is described in a companion paper (Ewen et al., this issue); the second stage is described here. The model operated at an hourly time-step on 17-km grid squares for a two-year simulation period and represents all the important hydrological processes, including regional aquifer recharge, groundwater discharge, infiltration- and saturation-excess runoff, evapotranspiration, snowmelt, and overland and channel flow. Outputs from the model are discussed, and include river discharge at gauging stations and space-time fields of evaporation and soil moisture. Whilst the model efficiency assessed by comparison of simulated and observed discharge records is not as good as could be achieved with a model calibrated against discharge, there are considerable advantages in retaining a physical basis in applications to ungauged river basins and assessments of the impacts of land use or climate change.

  12. Interacting large-scale magnetic fields and ionized gas in the W50/SS433 system

    Science.gov (United States)

    Farnes, J. S.; Gaensler, B. M.; Purcell, C.; Sun, X. H.; Haverkorn, M.; Lenc, E.; O'Sullivan, S. P.; Akahori, T.

    2017-06-01

    The W50/SS433 system is an unusual Galactic outflow-driven object of debatable origin. We have used the Australia Telescope Compact Array to observe a new 198 pointing mosaic, covering 3° × 2°, and present the highest-sensitivity full-Stokes data of W50 to date using wide-field, wide-band imaging over a 2 GHz bandwidth centred at 2.1 GHz. We also present a complementary Hα mosaic created using the Isaac Newton Telescope Photometric Hα Survey of the Northern Galactic Plane. The magnetic structure of W50 is consistent with the prevailing hypothesis that the nebula is a reanimated shell-like supernova remnant (SNR), which has been re-energized by the jets from SS433. We observe strong depolarization effects that correlate with diffuse Hα emission, likely due to spatially varying Faraday rotation measure (RM) fluctuations of ≥48-61 rad m-2 on scales ≤4.5-6 pc. We also report the discovery of numerous, faint, Hα filaments that are unambiguously associated with the central region of W50. These thin filaments are suggestive of an SNR's shock emission, and almost all have a radio counterpart. Furthermore, an RM-gradient is detected across the central region of W50, which we interpret as a loop magnetic field with a symmetry axis offset by ≈90° to the east-west jet-alignment axis, and implying that the evolutionary processes of both the jets and the SNR must be coupled. A separate RM-gradient is associated with the termination shock in the eastern ear, which we interpret as a ring-like field located where the shock of the jet interacts with the circumstellar medium. Future optical observations will be able to use the new Hα filaments to probe the kinematics of the shell of W50, potentially allowing for a definitive experiment on W50's formation history.

  13. Analysis of large scale tests for AP-600 passive containment cooling system

    International Nuclear Information System (INIS)

    Sha, W.T.; Chien, T.H.; Sun, J.G.; Chao, B.T.

    1997-01-01

    One unique feature of the AP-600 is its passive containment cooling system (PCCS), which is designed to maintain containment pressure below the design limit for 72 hours without action by the reactor operator. During a design-basis accident, i.e., either a loss-of-coolant or a main steam-line break accident, steam escapes and comes in contact with the much cooler containment vessel wall. Heat is transferred to the inside surface of the steel containment wall by convection and condensation of steam, and through the containment steel wall by conduction. Heat is then transferred from the outside containment surface by heating and evaporation of a thin liquid film that is formed by applying water at the top of the containment vessel dome. Air in the annular space is heated by both convection and injection of steam from the evaporating liquid film. The heated air and vapor rise as a result of natural circulation and exit the shield building through the outlets above the containment shell. All of the analytical models that were developed for and used in the COMMIX-ID code for predicting the performance of the PCCS are described. These models cover the governing conservation equations for multicomponent single-phase flow, transport equations for the κ-ε two-equation turbulence model, auxiliary equations, a liquid-film tracking model for both the inside (condensate) and outside (evaporating liquid film) surfaces of the containment vessel wall, thermal coupling between flow domains inside and outside the containment vessel, and heat and mass transfer models. Various key parameters of the COMMIX-ID results and the corresponding AP-600 PCCS experimental data are compared, and the agreement is good. Significant findings from this study are summarized

  14. Valuing physically and financially-induced flexibility in large-scale water resources systems

    Science.gov (United States)

    Tilmant, Amaury; Pina, Jasson; Côté, Pascal

    2017-04-01

    In a world characterized by rapid changes in terms of water demands and supplies, there is a growing and persistent need for institutional reforms that promote cross-sectoral, adaptive management processes and policies. Yet, in many regions throughout the world, the continued expansion of supply-side infrastructure is still perceived as the way to go, despite the rising financial, social and environmental costs. This trend is further compounded by the risks posed by climate change; reservoir storage, for example, is still perceived as a key element of climate change adaptation strategies in many countries. There is a growing concern that such strategies may result in a rigidity trap whereby the physical and institutional infrastructure become inflexible and unable to adapt to changes because they are mutually reinforcing each other. However, several authors have recently advocated adaptive, flexible management techniques involving a more diversified portfolio of measures whose management is regularly updated as new information about supplies and demands becomes available. Despite being conceptually attractive, such a management approach presents several challenges to policy makers. One of them is the sheer amount of information that must be processed each time a management decision must be taken. To address this issue, we propose an optimization framework that can be used to determine the optimal management of a large portfolio of physical and financial assets using various types of hydro-climatic information. This optimization framework is illustrated with the management of a power system in Quebec involving various power stations, reservoirs, power and energy contracts, as well as hydrologic and climatic data. The results can be used to assess the economic value of the flexibility induced by either the physical assets (power stations and reservoirs) or the financial ones (contracts), information we believe is important for highlighting the benefits of adaptive

  15. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  16. Examining Agencies' Satisfaction with Electronic Record Management Systems in e-Government: A Large-Scale Survey Study

    Science.gov (United States)

    Hsu, Fang-Ming; Hu, Paul Jen-Hwa; Chen, Hsinchun; Hu, Han-Fen

    While e-government is advancing and maturing steadily, advanced technological capabilities alone cannot guarantee that agencies realize the full benefits of the enabling computer-based systems. This study analyzes information systems in e-government settings by examining agencies' satisfaction with an electronic record management system (ERMS). Specifically, we investigate key satisfaction determinants that include regulatory compliance, job relevance, and satisfaction with support services for using the ERMS. We test our model and the hypotheses in it using a large-scale survey that involves a total of 1,652 government agencies in Taiwan. Our results show significant effects of regulatory compliance on job relevance and satisfaction with support services, which in turn determine government agencies' satisfaction with the ERMS. Our data exhibit a reasonably good fit to our model, which can explain a significant portion of the variance in agencies' satisfaction with an ERMS. Our findings have several important implications for research and practice, which are also discussed.

  17. Large-scale digitizer system (LSD) for charge and time digitization in high-energy physics experiments

    International Nuclear Information System (INIS)

    Althaus, R.F.; Kirsten, F.A.; Lee, K.L.; Olson, S.R.; Wagner, L.J.; Wolverton, J.M.

    1976-10-01

    A large-scale digitizer (LSD) system for acquiring charge and time-of-arrival particle data from high-energy-physics experiments has been developed at the Lawrence Berkeley Laboratory. The objective of this development was to significantly reduce the cost of instrumenting large detector arrays which, for the 4π-geometry of colliding-beam experiments, are proposed with an order-of-magnitude increase in channel count over previous detectors. In order to achieve the desired economy (approximately $65 per channel), a system was designed in which a number of control signals for conversion, digitization, and readout are shared in common by all the channels in each 128-channel bin. The overall system concept and the distribution of control signals that are critical to the 10-bit charge resolution and to the 12-bit time resolution are described. Also described is the bit-serial transfer scheme, chosen for its low component and cabling costs.

  18. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu

    2011-08-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters, because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application, the Gyrokinetic Toroidal Code (GTC) in magnetic fusion, to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than a 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.

  19. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters, because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application, the Gyrokinetic Toroidal Code (GTC) in magnetic fusion, to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than a 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.
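
    The ingredients named in the abstract, a compute term, a memory-bandwidth contention term anchored to STREAM-measured bandwidth, and a parameterized communication term, can be combined into a predictor as sketched below. The functional form and every constant are assumptions for illustration; this is not the published model.

        # Sketch of a bandwidth-contention performance model in the spirit of
        # the framework above: runtime = compute + memory contention + comm.
        # All constants and the functional form are invented; the abstract
        # only names the ingredients (STREAM bandwidth, parameterized comm).
        def predicted_time(cores_per_node, nodes, work_flops, bytes_moved,
                           peak_flops=8e9, stream_bw_per_node=10e9,
                           msg_latency=5e-6, msg_bytes=1e6, link_bw=1e9, msgs=100):
            compute = work_flops / (peak_flops * cores_per_node * nodes)
            # Contention: cores on a node share the STREAM-sustained bandwidth,
            # so the memory term does not shrink with more cores per node.
            memory = bytes_moved / (stream_bw_per_node * nodes)
            # Parameterized communication: latency + size/bandwidth per message.
            comm = msgs * (msg_latency + msg_bytes / link_bw)
            return compute + memory + comm

        for cores in (1, 4, 8, 16):
            t = predicted_time(cores, nodes=32, work_flops=1e13, bytes_moved=1e12)
            print(f"{cores:2d} cores/node: predicted {t:.3f} s")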

  20. Two-dimensional simulation of the gravitational system dynamics and formation of the large-scale structure of the universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.; Kotok, E.V.; Novikov, I.D.; Polyudov, A.N.; Shandarin, S.F.; Sigov, Y.S.

    1980-01-01

    The results of a numerical experiment are given that describe the non-linear stages of the development of perturbations in gravitating matter density in the expanding Universe. This process simulates the formation of the large-scale structure of the Universe from an initially almost homogeneous medium. In the one- and two-dimensional cases of this numerical experiment, the evolution of a system of 4096 point masses interacting only gravitationally was studied with periodic boundary conditions (simulating infinite space). The initial conditions were chosen to follow from the theory of the evolution of small perturbations in the expanding Universe. The results of the numerical experiments are systematically compared with the approximate analytic theory. The calculations show that for collisionless particles, as in the gas-dynamic case, a cellular structure appears at the non-linear stage for adiabatic perturbations. The greater part of the matter lies in thin layers that separate vast regions of low density. In a Robertson-Walker universe the cellular structure exists for a finite time and then fragments into a few compact objects. In the open Universe the cellular structure also exists if the amplitude of the initial perturbations is large enough, but the subsequent disruption of the cellular structure is more difficult because of the too-rapid expansion of the Universe; the large-scale structure is frozen. (author)
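
    Stripped of periodic boundaries and cosmological expansion (both essential in the original experiment), the computation reduces to integrating self-gravitating point masses. The drastically simplified 2D direct-summation sketch below, with invented softening and units, shows the bare leapfrog scheme such an experiment is built on.

        import numpy as np

        # Drastically simplified 2D self-gravitating N-body sketch (direct
        # summation, leapfrog). The original experiment used 4096 particles
        # with periodic boundaries and expansion, none of which is reproduced
        # here; particle count, softening and units are invented.
        rng = np.random.default_rng(1)
        N, G, SOFT, DT = 256, 1.0, 0.05, 0.005
        pos = rng.uniform(0.0, 1.0, (N, 2))
        vel = np.zeros((N, 2))

        def accel(p):
            d = p[None, :, :] - p[:, None, :]            # pairwise separations
            r2 = (d**2).sum(-1) + SOFT**2                # softened distances
            np.fill_diagonal(r2, np.inf)                 # no self-force
            return G * (d / r2[..., None]**1.5).sum(axis=1) / N  # unit total mass

        a = accel(pos)
        for _ in range(400):                             # kick-drift-kick leapfrog
            vel += 0.5 * DT * a
            pos += DT * vel
            a = accel(pos)
            vel += 0.5 * DT * a
        # Rough clustering diagnostic: the position spread shrinks as
        # overdense sheets and clumps begin to form.
        print("position std:", pos.std(axis=0))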

  1. Large Scale System Defense

    Science.gov (United States)

    2008-10-01

    Only report-documentation fragments of this record survived extraction (performing organization: Columbia University, 1700 Broadway, New York NY 10019-5905; sponsoring agency: AFRL/RIGA, 525 Brooks Rd., Rome NY 13441-4505). A surviving abstract fragment notes that an approach was unappealing because of the need to modify source code and that, since source-level annotations serve as a vestigial policy, the authors articulated a way to augment self...

  2. Computational Techniques for Model Predictive Control of Large-Scale Systems with Continuous-Valued and Discrete-Valued Inputs

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available We propose computational techniques for model predictive control of large-scale systems with both continuous-valued control inputs and discrete-valued control inputs, which are a class of hybrid systems. In the proposed method, we introduce the notion of virtual control inputs, which are obtained by relaxing discrete-valued control inputs to continuous variables. In online computation, first, we find continuous-valued control inputs and virtual control inputs minimizing a cost function. Next, using the obtained virtual control inputs, only discrete-valued control inputs at the current time are computed in each subsystem. In addition, we also discuss the effect of quantization errors. Finally, the effectiveness of the proposed method is shown by a numerical example. The proposed method enables us to reduce and decentralize the computation load.
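
    The virtual-input idea can be sketched in a few lines: relax the discrete inputs to continuous variables over the prediction horizon, optimize, and then round only the current step's virtual input to an admissible discrete level. The toy scalar system, weights, and rounding rule below are invented for illustration and ignore the paper's decentralization and quantization-error analysis.

        import numpy as np
        from scipy.optimize import minimize

        # Sketch of the virtual-control-input idea: relax discrete inputs to
        # continuous "virtual" ones over the horizon, then round only the
        # current step's discrete input. Toy system and weights are invented.
        A, B1, B2 = 0.9, 0.5, 1.0             # x+ = A x + B1*u_cont + B2*u_disc
        U_DISC = np.array([-1.0, 0.0, 1.0])   # admissible discrete levels
        N = 5                                  # prediction horizon

        def mpc_step(x0):
            def cost(z):
                u_c, u_v = z[:N], z[N:]        # continuous + virtual inputs
                x, J = x0, 0.0
                for k in range(N):
                    J += x**2 + 0.1 * u_c[k]**2 + 0.1 * u_v[k]**2
                    x = A * x + B1 * u_c[k] + B2 * u_v[k]
                return J + x**2                # terminal penalty
            bnds = [(-2.0, 2.0)] * N + [(U_DISC.min(), U_DISC.max())] * N
            z = minimize(cost, np.zeros(2 * N), bounds=bnds).x
            # Round only the current virtual input to the nearest level.
            u_disc_now = U_DISC[np.argmin(np.abs(U_DISC - z[N]))]
            return z[0], u_disc_now

        x = 3.0
        for t in range(6):
            u_c, u_d = mpc_step(x)
            x = A * x + B1 * u_c + B2 * u_d
            print(f"t={t}: u_cont={u_c:+.2f}, u_disc={u_d:+.0f}, x={x:+.3f}")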

  3. Energy from the desert. Very large scale photovoltaic systems: socio-economic, financial, technical and environmental aspects. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Kurokawa, K.; Ito, M.; Komoto, K.; Vleuten, P. van der; Faiman, D. (eds.)

    2009-05-15

    This executive summary report for the International Energy Agency (IEA) summarises the objectives and concepts of very large scale photovoltaic power generation (VLS-PV) systems and takes a look at the socio-economic, financial and technical aspects involved as well as the environmental impact of such systems. Potential benefits for desert communities, agricultural development and desalination of water are topics that are looked at. The potential of VLS-PV, its energy payback time and CO{sub 2} emission rates are discussed. Case studies for the Sahara and the Gobi Dessert areas are discussed. A VLS-PV roadmap is proposed and scenarios are discussed. Finally, conclusions are drawn and recommendations are made.

  4. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    Science.gov (United States)

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consisting of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections with each other. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  5. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    Directory of Open Access Journals (Sweden)

    Ali Albattat

    2016-08-01

    Full Text Available The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consisting of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections with each other. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.
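
    The network-utilization saving in such schemes comes from transmitting only when an event-triggering condition fires. The minimal single-module sketch below uses a fixed threshold on the drift between the true state and the controller's last received copy; the system, gain, and threshold are invented, and the paper's adaptive, multi-module architecture is far richer.

        # Minimal event-triggered feedback sketch: the sensor transmits the
        # state over the (notional) network only when it has drifted from the
        # last transmitted value by more than a threshold. System, gain and
        # threshold are invented for illustration.
        A, B, K = 1.02, 0.1, 12.0           # open-loop unstable scalar module
        EPS = 0.05                           # event-triggering threshold

        x, x_sent = 1.0, 1.0
        transmissions = 0
        for k in range(200):
            if abs(x - x_sent) > EPS:        # event: update the controller's copy
                x_sent = x
                transmissions += 1
            u = -K * x_sent                  # control uses last transmitted state
            x = A * x + B * u
        print(f"transmissions: {transmissions}/200, final |x| = {abs(x):.4f}")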

  6. Adaptive Neural Networks Decentralized FTC Design for Nonstrict-Feedback Nonlinear Interconnected Large-Scale Systems Against Actuator Faults.

    Science.gov (United States)

    Li, Yongming; Tong, Shaocheng

    The problem of active fault-tolerant control (FTC) is investigated for the large-scale nonlinear systems in nonstrict-feedback form. The nonstrict-feedback nonlinear systems considered in this paper consist of unstructured uncertainties, unmeasured states, unknown interconnected terms, and actuator faults (e.g., bias fault and gain fault). A state observer is designed to solve the unmeasurable state problem. Neural networks (NNs) are used to identify the unknown lumped nonlinear functions so that the problems of unstructured uncertainties and unknown interconnected terms can be solved. By combining the adaptive backstepping design principle with the combination Nussbaum gain function property, a novel NN adaptive output-feedback FTC approach is developed. The proposed FTC controller can guarantee that all signals in all subsystems are bounded, and the tracking errors for each subsystem converge to a small neighborhood of zero. Finally, numerical results of practical examples are presented to further demonstrate the effectiveness of the proposed control strategy.

  7. Large-scale production of poly(3-hydroxyoctanoic acid) by Pseudomonas putida GPo1 and a simplified downstream process.

    Science.gov (United States)

    Elbahloul, Yasser; Steinbüchel, Alexander

    2009-02-01

    The suitability of Pseudomonas putida GPo1 for large-scale cultivation and production of poly(3-hydroxyoctanoate) (PHO) was investigated in this study. Three fed-batch cultivations of P. putida GPo1 at the 350- or 400-liter scale were carried out in a bioreactor with a capacity of 650 liters, in mineral salts medium initially containing 20 mM sodium octanoate as the carbon source. The feeding solution included ammonium octanoate, which was fed at a relatively low concentration to promote PHO accumulation under nitrogen-limited conditions. During cultivation, the pH was regulated by addition of NaOH, NH(4)OH, or octanoic acid, the latter also serving as an additional carbon source. The partial O(2) pressure (pO(2)) was adjusted to 20 to 40% by controlling the airflow and stirrer speed. Under the optimized conditions, P. putida GPo1 was able to grow to cell densities as high as 18, 37, and 53 g cells (dry mass) (CDM) per liter, containing 49, 55, and 60% (wt/wt) of PHO, respectively. The resulting 40 kg CDM from these three cultivations was used directly for extraction of PHO. Three different methods of PHO extraction were applied; of these, only acetone extraction performed well, resulting in 94% recovery of the PHO content of the cells. A novel mixture of precipitation solvents composed of 70% (vol/vol) methanol and 70% (vol/vol) ethanol was identified in this study. A ratio of PHO concentrate to the mixture of 0.2:1 (vol/vol) allowed complete precipitation of PHO as white flakes, whereas a ratio of 1:1 (vol/vol) of the solvent mixture to PHO concentrate yielded a highly purified PHO. Precipitation yielded a dough-like polymeric material which was cast into thin layers and then shredded into small strips to allow evaporation of the remaining solvents. Gas chromatographic analysis revealed a purity of about 99% +/- 0.2% (wt/wt) of the polymer, which consisted mainly of 3-hydroxyoctanoic acid (96 mol%).

  8. Adoption of innovative energy systems in social housing: Lessons from eight large-scale renovation projects in The Netherlands

    International Nuclear Information System (INIS)

    Hoppe, Thomas

    2012-01-01

    Thanks to new insights into the impacts that dwellings have throughout their life cycles, there has been increased attention to retrofitting innovative energy systems (IES) in existing housing. This paper uses an explorative case study design to gain more knowledge about the governance aspects of this under-researched topic. The central research question is: which factors influence the adoption of innovative energy systems in social housing sites during renovation projects? To answer this question, eight large-scale renovation projects in The Netherlands were investigated. These case studies allowed the identification of barriers, enabling factors and perspectives from three main actors: housing associations, tenants and local authorities. It turns out that the adoption of IES encounters many barriers: lack of trust between project partners, delays in project progress, financial feasibility considerations, lack of support from tenants, lengthy legal permit procedures, over-ambitious project goals, poor experiences in previous projects, and IES ambitions that are not taken seriously by key decision-makers. Furthermore, IES were successfully fitted in only three of the eight projects, and ambitions were lowered as the projects progressed in all the cases investigated. The study calls for further systematic, in-depth comparison of the fitting of IES in large-scale renovation projects in social housing. - Highlights: ► Attention to adoption of innovative energy systems in social housing. ► Several non-technical factors influence adoption. ► In-depth analysis of eight local-level renovation projects. ► Ambitions are lowered as projects progress. ► Barriers: financial feasibility, over-ambitious goals, delay, lack of trust.

  9. The TRIDEC System-of-Systems; Choreography of large-scale concurrent tasks in Natural Crisis Management

    Science.gov (United States)

    Häner, R.; Wächter, J.

    2012-04-01

    The project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme, aims at establishing a network of dedicated, autonomous legacy systems for large-scale concurrent management of natural crises utilising heterogeneous information resources. TRIDEC's architecture reflects the System-of-Systems (SoS) approach, which is based on task-oriented systems cooperatively interacting as a collective in a common environment. The design of the TRIDEC-SoS follows the principles of service-oriented and event-driven architectures (SOA & EDA), with a strong focus on loose coupling of the systems. The SoS approach in combination with SOA and EDA has the distinction of being able to provide novel and coherent behaviours and features resulting from a process of dynamic self-organisation. Self-organisation is a process without the need for a central or external coordinator controlling it through orchestration. It is the result of enacted concurrent tasks in a collaborative environment of geographically distributed systems. Although the individual systems act completely autonomously, their interactions expose emergent structures of an evolving nature. Particularly important is the fact that SoS are inherently able to evolve on all facets of intelligent information management. This includes adaptive properties, e.g. seamless integration of new resource types or the adoption of new fields in natural crisis management. In the case of TRIDEC, with various heterogeneous participants involved, concurrent information processing is of fundamental importance because of the achievable improvements regarding cooperative decision making. Collaboration within TRIDEC will be implemented with choreographies and conversations. Choreographies specify the expected behaviour between two or more participants; conversations describe the message exchange between all participants, emphasising their logical sequence.

  10. Co-evolution of intelligent socio-technical systems modelling and applications in large scale emergency and transport domains

    CERN Document Server

    2013-01-01

    As the interconnectivity between humans through technical devices is becoming ubiquitous, the next step is already in the making: ambient intelligence, i.e. smart (technical) environments, which will eventually play the same active role in communication as the human players, leading to a co-evolution in all domains where real-time communication is essential. This topical volume, based on the findings of the Socionical European research project, gives equal attention to two highly relevant domains of application: transport (specifically traffic dynamics, viewed as a socio-technical interaction) and evacuation scenarios for large-scale emergency situations. Care was taken to investigate the limits of scalability as far as possible and to combine modeling based on complex systems science approaches with relevant data analysis.

  11. Large-scale bioenergy production from soybeans and switchgrass in Argentina: Part A: Potential and economic feasibility for national and international markets

    NARCIS (Netherlands)

    van Dam, J.; Faaij, A.P.C.; Hilbert, J.; Petruzzi, H.; Turkenburg, W.C.

    2009-01-01

    This study focuses on the economic feasibility of large-scale biomass production from soybeans or switchgrass from a region in Argentina. This is determined, firstly, by estimating whether the potential supply of biomass, when food and feed demand are met, is sufficient under different scenarios to supply national and international markets.

  12. Understory fern community structure, growth and spore production responses to a large-scale hurricane experiment in a Puerto Rico rainforest

    Science.gov (United States)

    Joanne M. Sharpe; Aaron B. Shiels

    2014-01-01

    Ferns are abundant in most rainforest understories yet their responses to hurricanes have not been well studied. Fern community structure, growth and spore production were monitored for two years before and five years after a large-scale experiment that simulated two key components of severe hurricane disturbance: canopy openness and debris deposition. The canopy was...

  13. Fault Ride Through Capability Enhancement of a Large-Scale PMSG Wind System with Bridge Type Fault Current Limiters

    Directory of Open Access Journals (Sweden)

    ALAM, M. S.

    2018-02-01

    Full Text Available In this paper, a bridge type fault current limiter (BFCL) is proposed as a potential solution to the fault problems of permanent magnet synchronous generator (PMSG) based large-scale wind energy systems. As the PMSG wind system is highly vulnerable to disturbances, it is essential to guarantee stability during severe disturbances by enhancing the fault ride through capability. A BFCL controller has been designed to insert resistance and inductance at the inception of system disturbances in order to limit the fault current. A constant capacitor voltage has been maintained by the grid voltage source converter (GVSC) controller, while current extraction or injection has been achieved by the machine VSC (MVSC) controller. Symmetrical and unsymmetrical faults have been applied to the system to show the effectiveness of the proposed BFCL solution. The PMSG wind system, BFCL and their controllers have been implemented in a real-time hardware-in-the-loop (RTHIL) setup with a real-time digital simulator (RTDS) and dSPACE. Another significant feature of this work is that the performance of the proposed BFCL is compared with that of a series dynamic braking resistor (SDBR). Comparative RTHIL implementation results show that the proposed BFCL is very efficient in improving the system's fault ride through capability by limiting the fault current, and that it outperforms the SDBR.

  14. Ruling by canal: Governance and system-level design characteristics of large scale irrigation infrastructure in India and Uzbekistan

    Directory of Open Access Journals (Sweden)

    Peter Mollinga

    2016-06-01

    Full Text Available This paper explores the relationship between governance regime and large-scale irrigation system design by investigating three cases: (1) protective irrigation design in post-independence South India; (2) canal irrigation system design in Khorezm Province, Uzbekistan, as implemented in the USSR period; and (3) canal design by the Madras Irrigation and Canal Company, as part of an experiment to carry out canal irrigation development in colonial India on commercial terms in the 1850s-1860s. The mutual shaping of irrigation infrastructure design characteristics on the one hand and management requirements and conditions on the other has been documented primarily at lower, within-system levels of irrigation systems, notably at the level of division structures. Taking a 'social construction of technology' perspective, the paper analyses the relationship between technological structures and management and governance arrangements at the irrigation system level. The paper finds qualitative differences in the infrastructural configuration of the three irrigation systems, expressing and facilitating particular forms of governance and rule; differences that matter for management and use, and for their effects and impacts.

  15. Engineering youth service system infrastructure: Hawaii's continued efforts at large-scale implementation through knowledge management strategies.

    Science.gov (United States)

    Nakamura, Brad J; Mueller, Charles W; Higa-McMillan, Charmaine; Okamura, Kelsie H; Chang, Jaime P; Slavin, Lesley; Shimabukuro, Scott

    2014-01-01

    Hawaii's Child and Adolescent Mental Health Division provides a unique illustration of a youth public mental health system with a long and successful history of large-scale quality improvement initiatives. Many advances are linked to flexibly organizing and applying knowledge gained from the scientific literature, moving beyond installing a limited number of brand-named treatment approaches that might be directly relevant to only a small handful of system youth. This article takes a knowledge-to-action perspective and outlines five knowledge management strategies currently under way in Hawaii. Each strategy represents one component of a larger coordinated effort at engineering a service system focused on delivering both brand-named treatment approaches and complementary strategies informed by the evidence base. The five knowledge management examples are (a) a set of module-based professional training activities for currently practicing therapists, (b) an outreach initiative supporting youth evidence-based practices training at Hawaii's mental health-related professional programs, (c) an effort to increase consumer knowledge of and demand for youth evidence-based practices, (d) a practice and progress agency performance feedback system, and (e) a sampling of system-level research studies focused on understanding treatment as usual. We end by outlining a small set of lessons learned and a longer-term vision for embedding these efforts into the system's infrastructure.

  16. Government management and implementation of national real-time energy monitoring system for China large-scale public building

    International Nuclear Information System (INIS)

    Na Wei; Wu Yong; Song Yan; Dong Zhongcheng

    2009-01-01

    The supervision of energy efficiency in government office buildings and large-scale public buildings (GOBLPB) is the main embodiment of government implementation of public administration in the fields of resource saving and environmental protection. It is significant for the Chinese government to achieve its target of reducing building energy consumption by 11 million tons of standard coal before 2010. In the framework of a national demonstration project concerning the energy management system, Shenzhen Municipality has been selected for the implementation of the system. A data acquisition system and a methodology concerning the energy consumption of the GOBLPB have been developed. This paper summarizes the various features of the system used to identify how buildings consume energy and their energy-saving potential. This paper also defines the methods used to achieve real-time monitoring and diagnosis: meters installed at each building, data transmitted through the internet to a central server, analysis and unification at the central server, and publication through the web. Furthermore, this paper introduces the plans to implement the system and to extend it countrywide. Finally, this paper presents some measures to achieve a community of common benefit in the implementation of the building energy efficiency supervisory system for the GOBLPB in its construction, reconstruction and operation stages.

  17. Solar powered oxygen systems in remote health centers in Papua New Guinea: a large scale implementation effectiveness trial.

    Science.gov (United States)

    Duke, Trevor; Hwaihwanje, Ilomo; Kaupa, Magdalynn; Karubi, Jonah; Panauwe, Doreen; Sa'avu, Martin; Pulsan, Francis; Prasad, Peter; Maru, Freddy; Tenambo, Henry; Kwaramb, Ambrose; Neal, Eleanor; Graham, Hamish; Izadnegahdar, Rasa

    2017-06-01

    Pneumonia is the largest cause of child deaths in Papua New Guinea (PNG). Hypoxaemia is the major complication causing death in childhood pneumonia, and a major factor in deaths from many other common conditions, including bronchiolitis, asthma, sepsis, malaria, trauma, perinatal problems, and obstetric emergencies. A reliable source of oxygen therapy can reduce mortality from pneumonia by up to 35%. However, in low and middle income countries throughout the world, improved oxygen systems have not been implemented at large scale in remote, difficult-to-access health care settings, and oxygen is often unavailable at smaller rural hospitals or district health centers which serve as the first point of referral for childhood illnesses. These hospitals are hampered by a lack of reliable power, staff training and other basic services. We report the methodology of a large implementation effectiveness trial involving sustainable and renewable oxygen and power systems in 36 health facilities in remote rural areas of PNG. The methodology is a before-and-after evaluation involving continuous quality improvement and a health systems approach. We describe this model of implementation, as the considerations and steps involved have wider implications for health systems in other countries. The implementation steps include: defining the criteria for where such an intervention is appropriate, assessment of power supplies and power requirements, the optimal design of a solar power system, specifications for oxygen concentrators and other oxygen equipment that will function in remote environments, installation logistics in remote settings, the role of oxygen analyzers in monitoring oxygen concentrator performance, the engineering capacity required to sustain a program at scale, clinical guidelines and training on oxygen equipment and the treatment of children with severe respiratory infection and other critical illnesses, program costs, and measurement of processes and outcomes.

  18. Modeling and Validating Time, Buffering, and Utilization of a Large-Scale, Real-Time Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    Data acquisition systems for large-scale high-energy physics experiments have to handle hundreds of gigabytes per second of data, and are typically realized as specialized data centers that connect a very large number of front-end electronics devices to an event detection and storage system. The design of such systems is often based on many assumptions, small-scale experiments and a substantial amount of over-provisioning. In this work, we introduce a discrete event-based simulation tool that models the data flow of the current ATLAS data acquisition system, with the main goal of being accurate with regard to its main operational characteristics. We measure buffer occupancy by counting the number of elements in buffers, resource utilization by measuring output bandwidth and counting the number of active processing units, and their time evolution by comparing data over many consecutive and small periods of time. We perform studies on the error of simulation when comparing the results to a large amount of real-world ope...

  19. Modeling and Validating Time, Buffering, and Utilization of a Large-Scale, Real-Time Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    Data acquisition systems for large-scale high-energy physics experiments have to handle hundreds of gigabytes per second of data, and are typically implemented as specialized data centers that connect a very large number of front-end electronics devices to an event detection and storage system. The design of such systems is often based on many assumptions, small-scale experiments and a substantial amount of over-provisioning. In this paper, we introduce a discrete event-based simulation tool that models the dataflow of the current ATLAS data acquisition system, with the main goal of being accurate with regard to its main operational characteristics. We measure buffer occupancy by counting the number of elements in buffers; resource utilization by measuring output bandwidth and counting the number of active processing units; and their time evolution by comparing data over many consecutive and small periods of time. We perform studies on the error in simulation when comparing the results to a large amount of real-world ...
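
    The kind of discrete event simulation described in these two records can be illustrated with a toy single-stage model: events arrive at random into a finite buffer, a processing unit drains it, and buffer occupancy is tracked over time. The arrival rate, service time and capacity below are hypothetical, not ATLAS parameters.

        import heapq, random

        # Toy discrete-event model of one DAQ stage: Poisson arrivals into a
        # finite buffer served FIFO at 1 ms per event; occupancy is tracked.
        random.seed(1)
        rate, service, capacity = 900.0, 0.001, 1000   # hypothetical parameters
        t, queue_len, busy_until, dropped = 0.0, 0, 0.0, 0
        occupancy = []
        events = [(random.expovariate(rate), "arrival")]
        while events and t < 1.0:                      # simulate one second
            t, kind = heapq.heappop(events)
            if kind == "arrival":
                if queue_len < capacity:
                    queue_len += 1
                    busy_until = max(busy_until, t) + service
                    heapq.heappush(events, (busy_until, "done"))
                else:
                    dropped += 1                       # buffer overflow
                heapq.heappush(events, (t + random.expovariate(rate), "arrival"))
            else:
                queue_len -= 1                         # service completed
            occupancy.append((t, queue_len))
        print(f"max occupancy {max(q for _, q in occupancy)}, dropped {dropped}")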

  20. ROSA-V large scale test facility (LSTF) system description for the third and fourth simulated fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Mitsuhiro; Nakamura, Hideo; Ohtsu, Iwao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment] [and others]

    2003-03-01

    The Large Scale Test Facility (LSTF) is a full-height, 1/48 volumetrically scaled test facility of the Japan Atomic Energy Research Institute (JAERI) for system integral experiments simulating the thermal-hydraulic responses, at full-pressure conditions, of a 1100 MWe-class pressurized water reactor (PWR) during small break loss-of-coolant accidents (SBLOCAs) and other transients. The LSTF can also simulate well a next-generation PWR such as the AP600 reactor. In the fifth phase of the Rig-of-Safety Assessment (ROSA-V) Program, eighty-nine experiments were conducted at the LSTF with the third simulated fuel assembly until June 2001, and five experiments were conducted with the newly installed fourth simulated fuel assembly until December 2002. In the ROSA-V program, various system integral experiments have been conducted to certify the effectiveness both of accident management (AM) measures in beyond design basis accidents (BDBAs) and of improved safety systems in next-generation reactors. In addition, various separate-effect tests have been conducted to verify and develop computer codes and analytical models to predict non-homogeneous and multi-dimensional phenomena, such as heat transfer across the steam generator U-tubes in the presence of non-condensable gases, in both current and next-generation reactors. This report presents detailed information on the LSTF system with the third and fourth simulated fuel assemblies to aid experiment planning and the analysis of experiment results. (author)

  1. Systems Perturbation Analysis of a Large-Scale Signal Transduction Model Reveals Potentially Influential Candidates for Cancer Therapeutics

    Science.gov (United States)

    Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš

    2016-01-01

    Dysregulation in signal transduction pathways can lead to a variety of complex disorders, including cancer. Computational approaches such as network analysis are important tools for understanding system dynamics as well as for identifying critical components that could be further explored as therapeutic targets. Here, we performed perturbation analysis of a large-scale signal transduction model in extracellular environments that stimulate cell death, growth, motility, and quiescence. Each of the model's components was perturbed under both loss-of-function and gain-of-function mutations. Using 1,300 simulations under both types of perturbations across various extracellular conditions, we identified the most and least influential components based on the magnitude of their influence on the rest of the system. Based on the premise that the most influential components might serve as better drug targets, we characterized them in terms of biological functions, housekeeping genes, essential genes, and druggable proteins. The most influential components under all environmental conditions were enriched in several biological processes. The inositol pathway was found to be most influential under inactivating perturbations, whereas the kinase and small cell lung cancer pathways were the most influential under activating perturbations. The most influential components were enriched in essential genes and druggable proteins. Moreover, known cancer drug targets were also classified as influential components based on the components they affect in the network. Additionally, the systemic perturbation analysis of the model revealed a network motif of most influential components that affect each other. Furthermore, our analysis predicted novel combinations of cancer drug targets with various effects on other most influential components. We found that the combinatorial perturbation consisting of PI3K inactivation and overactivation of IP3R1 can lead to increased activity levels of apoptosis.
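
    The perturbation protocol itself is simple to sketch: every node of a Boolean network is clamped OFF (loss-of-function) or ON (gain-of-function), and the node's influence is scored by how many other nodes change their long-run state relative to the unperturbed network. The three-node wiring below is hypothetical, standing in for the paper's large-scale model.

        import itertools

        # Toy synchronous Boolean network (hypothetical three-node wiring).
        rules = {
            "A": lambda s: s["C"] and not s["B"],
            "B": lambda s: s["A"] or s["C"],
            "C": lambda s: not s["B"],
        }

        def settle(clamp=None, steps=50):
            # state after a fixed number of synchronous updates, with one node
            # optionally clamped to a fixed value (the perturbation)
            s = {n: False for n in rules}
            for _ in range(steps):
                s = {n: f(s) for n, f in rules.items()}
                if clamp:
                    s[clamp[0]] = clamp[1]
            return s

        base = settle()
        for node, value in itertools.product(rules, (False, True)):
            mutant = settle(clamp=(node, value))
            influence = sum(base[n] != mutant[n] for n in rules if n != node)
            kind = "gain" if value else "loss"
            print(f"{node} {kind}-of-function influence: {influence}")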

  2. Comparison of fracture toughness values from large-scale pipe system tests and C(T) specimens

    International Nuclear Information System (INIS)

    Olson, R.; Scott, P.; Marschall, C.; Wilkowski, G.

    1993-01-01

    Within the International Piping Integrity Research Group (IPIRG) program, pipe system experiments involving dynamic loading of intentionally circumferentially cracked pipe were conducted. The pipe system was fabricated from 406-mm (16-inch) diameter Schedule 100 pipe, and the experiments were conducted at 15.5 MPa (2,250 psi) and 288 C (550 F). The loads consisted of pressure, dead-weight, thermal expansion, inertia, and dynamic anchor motion. Extensive instrumentation was used to allow the material fracture resistance to be calculated from these large-scale experiments. A comparison of the toughness values from the stainless steel base metal pipe experiment with those from standard quasi-static and dynamic C(T) specimen tests showed that the pipe toughness value was significantly lower than that obtained from the C(T) specimens. It is hypothesized that the cyclic loading from inertial stresses in this pipe system experiment caused local degradation of the material toughness. Such effects are not considered in current LBB or pipe flaw evaluation criteria. 4 refs., 14 figs., 1 tab

  3. Engineered catalytic biofilms for continuous large scale production of n-octanol and (S)-styrene oxide.

    Science.gov (United States)

    Gross, Rainer; Buehler, Katja; Schmid, Andreas

    2013-02-01

    This study evaluates the technical feasibility of biofilm-based biotransformations at an industrial scale by theoretically designing a process employing membrane fiber modules as used in the chemical industry and comparing the respective process parameters to classical stirred-tank studies. To our knowledge, catalytic biofilm processes for fine chemicals production have so far not been reported on a technical scale. As model reactions, we applied the previously studied asymmetric styrene epoxidation employing Pseudomonas sp. strain VLB120ΔC biofilms and the here-described selective alkane hydroxylation. Using the non-heme iron containing alkane hydroxylase system (AlkBGT) from P. putida GPo1 in the recombinant P. putida PpS81 pBT10 biofilm, we were able to continuously produce 1-octanol from octane with a maximal productivity of 1.3 g L⁻¹(aq) day⁻¹ in a single-tube micro reactor. For a possible industrial application, a cylindrical membrane fiber module packed with 84,000 polypropylene fibers is proposed. Based on the calculations presented here, 59 membrane fiber modules (of 0.9 m diameter and 2 m length) would suffice to realize a production process of 1,000 tons/year for styrene oxide. Moreover, the product yield on carbon can at least be doubled, and over 400-fold less biomass waste would be generated compared to classical stirred-tank reactor processes. For the octanol process, instead, further intensification of the biological activity and/or enlargement of the membrane surface is required to reach production scale. By taking into consideration challenges such as biomass growth control and maintaining a constant biological activity, this study shows that a biofilm process at an industrial scale for the production of fine chemicals is a sustainable alternative in terms of product yield and biomass waste production. Copyright © 2012 Wiley Periodicals, Inc.

  4. History matching of large scale fractures to production data; Calage de la geometrie des reseaux de fractures aux donnees hydrodynamiques de production d'un champ petrolier

    Energy Technology Data Exchange (ETDEWEB)

    Jenni, S.

    2005-01-01

    Object-based models are very helpful for representing complex geological media such as fractured reservoirs. For building realistic fracture networks, these models have to be constrained to both static data (seismic, geomechanics, geology) and dynamic data (well tests and production history). In this report we present a procedure for the calibration of large-scale fracture networks to production history. The history matching procedure includes realistic geological modeling and a parameterization method that is coherent with the geological model and allows an efficient optimization. Fluid flow modeling is based on a double medium approach. The calibration procedure was applied to a semi-synthetic case based on a real fractured reservoir, and calibration to water-cut data was performed. (author)

  5. A Novel Method for Proximity Detection of Moving Targets Using a Large-Scale Planar Capacitive Sensor System

    Directory of Open Access Journals (Sweden)

    Yong Ye

    2016-05-01

    Full Text Available A novel method for proximity detection of moving targets (with high dielectric constants) using a large-scale (the size of each sensor is 31 cm × 19 cm) planar capacitive sensor system (PCSS) is proposed. The variation of capacitance with distance is derived, and a pair of electrodes in a planar capacitive sensor unit (PCSU) with a spiral shape is found to have better sensitivity distribution homogeneity and dynamic range than three other shapes (comb, rectangular, and circular). A driving excitation circuit with a Clapp oscillator is proposed, and a capacitance measuring circuit with a sensitivity of 0.21 V(p-p)/pF is designed. The results of static and dynamic experiments demonstrate that the voltage curves of the static experiments are similar to those of the dynamic experiments; therefore, the static data can be used to simulate the dynamic curves. The dynamic range of proximity detection for three projectiles is up to 60 cm, and the results of the subsequent static experiments show that a PCSU with four neighboring units has the highest sensitivity (the sensitivities of the other units are at least 4% lower); when the attack angle decreases, the intensity of the sensor signal increases. The proposed method leads to the design of a feasible moving target detector with a simple structure and low cost, which can be applied in interception systems.
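
    For a rough feel of the signal levels involved, the sketch below combines the quoted circuit sensitivity of 0.21 V(p-p)/pF with an idealized inverse-distance capacitance model; the coupling constant is hypothetical, since the real response depends on electrode geometry and target properties.

        # Idealized inverse-distance model: delta-C(d) = K / d, converted to the
        # measuring circuit's output via the quoted 0.21 V(p-p)/pF sensitivity.
        SENSITIVITY = 0.21     # V(p-p) per pF, from the abstract
        K = 6.0                # pF*cm, hypothetical coupling constant

        for d_cm in (10, 20, 30, 40, 50, 60):
            delta_c = K / d_cm                    # capacitance change in pF
            v_out = SENSITIVITY * delta_c         # circuit output in V(p-p)
            print(f"d = {d_cm:2d} cm -> dC = {delta_c:.3f} pF, "
                  f"signal = {1000 * v_out:.0f} mV(p-p)")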

  6. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time warning distances for the safe driving of multiple groups of vehicles in a highway tunnel environment, treated as a large-scale system and based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, multiple vehicles are divided into multiple groups, and the distributed model predictive control approach is used to calculate the information framework of each group. The optimization of each group considers both local optimization and the optimization characteristics of the neighboring subgroups, which ensures global optimization performance. Second, the three time warning distances are studied based on the basic principles used for highway intelligent space (HIS), and the information framework concept is applied to the multiple groups of vehicles. A mathematical model is built to avoid chain collisions between vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles under fog, rain, or snow conditions.
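
    Time-based warning distances of this kind are commonly derived from car-following kinematics, as in the hedged sketch below; the warning times, speed and deceleration are illustrative values, not the paper's calibrated parameters.

        # Standard car-following kinematics: warning distance = distance covered
        # during the warning time plus the braking distance at deceleration a.
        def warning_distance(v_kmh, t_warn_s, decel=4.0):
            v = v_kmh / 3.6                       # convert km/h to m/s
            return v * t_warn_s + v ** 2 / (2.0 * decel)

        for level, t_warn in (("first", 3.0), ("second", 2.0), ("third", 1.0)):
            d = warning_distance(80.0, t_warn)    # assumed 80 km/h tunnel speed
            print(f"{level} warning at t = {t_warn} s: distance {d:.1f} m")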

  7. Development of inspection data collection and evaluation system for large scale MOX fuel fabrication plant safeguards (3)

    International Nuclear Information System (INIS)

    Kumakura, Shinichi; Masuda, Shoichiro; Iso, Shoko; Hisamatsu, Yoshinori; Kurobe, Hiroko; Nakajima, Shinji

    2015-01-01

    The Inspection Data Collection and Evaluation System is a system that stores inspection data and operator declaration data collected from the various measurement equipment installed in the fuel fabrication processes of a large-scale MOX fuel fabrication plant, and that performs safeguards evaluation based on Near Real Time Accountancy (NRTA) using these data. The Nuclear Material Control Center developed a simulator of the fuel fabrication process, the in-process material inventory/flow data and the measurement data; using the simulation results, the adequacy of the approach and its impact on the uncertainty of the material balance have been reviewed for different modes of facility operation and operational statuses. Following the 34th INMM Japan Chapter presentation, a model close to the real nuclear material accountancy of the fuel fabrication process was simulated, and the nuclear material accountancy and its uncertainty (Sigma MUF) were reviewed. Some findings have been obtained, for example regarding evaluation-related indicators for verification under a more realistic accountancy scheme that could be applied by the operator. (author)

  8. Scale-up and large-scale production of Tetraselmis sp. CTP4 (Chlorophyta) for CO2 mitigation: from an agar plate to 100-m3 industrial photobioreactors.

    Science.gov (United States)

    Pereira, Hugo; Páramo, Jaime; Silva, Joana; Marques, Ana; Barros, Ana; Maurício, Dinis; Santos, Tamára; Schulze, Peter; Barros, Raúl; Gouveia, Luísa; Barreira, Luísa; Varela, João

    2018-03-23

    Industrial production of novel microalgal isolates is key to improving the current portfolio of available strains that are able to grow in large-scale production systems for different biotechnological applications, including carbon mitigation. In this context, Tetraselmis sp. CTP4 was successfully scaled up from an agar plate to 35- and 100-m3 industrial-scale tubular photobioreactors (PBR). Growth was performed semi-continuously for 60 days in the autumn-winter season (17th October - 14th December). Optimisation of tubular PBR operations showed that improved productivities were obtained at a culture velocity of 0.65-1.35 m s⁻¹ and a pH set-point for CO2 injection of 8.0. The highest volumetric (0.08 ± 0.01 g L⁻¹ d⁻¹) and areal (20.3 ± 3.2 g m⁻² d⁻¹) biomass productivities were attained in the 100-m3 PBR, compared to those of the 35-m3 PBR (0.05 ± 0.02 g L⁻¹ d⁻¹ and 13.5 ± 4.3 g m⁻² d⁻¹, respectively). Lipid contents were similar in both PBRs (9-10% of ash-free dry weight). CO2 sequestration was followed in the 100-m3 PBR, revealing a mean CO2 mitigation efficiency of 65% and a biomass-to-carbon ratio of 1.80. Tetraselmis sp. CTP4 is thus a robust candidate for industrial-scale production, with promising biomass productivities and photosynthetic efficiencies up to 3.5% of total solar irradiance.
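
    As a quick plausibility check on the reported figures, the arithmetic below converts the areal productivity of the 100-m3 PBR and the biomass-to-carbon ratio of 1.80 into CO2 fixed per unit of culture area.

        # CO2 fixed per unit area, from the areal productivity and the reported
        # biomass-to-carbon ratio; 44/12 is the CO2-to-carbon molar mass ratio.
        areal_productivity = 20.3          # g biomass m^-2 d^-1 (100-m3 PBR)
        biomass_to_carbon = 1.80           # g biomass per g carbon

        carbon_fixed = areal_productivity / biomass_to_carbon
        co2_fixed = carbon_fixed * 44.0 / 12.0
        print(f"about {co2_fixed:.1f} g CO2 fixed per m2 per day")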

  9. Efficient Large-Scale 2D Culture System for Human Induced Pluripotent Stem Cells and Differentiated Cardiomyocytes

    Directory of Open Access Journals (Sweden)

    Shugo Tohyama

    2017-11-01

    Full Text Available Cardiac regenerative therapies utilizing human induced pluripotent stem cells (hiPSCs) are hampered by ineffective large-scale culture. hiPSCs were cultured in multilayer culture plates (CPs) with active gas ventilation (AGV), resulting in stable proliferation and pluripotency. Seeding of 1 × 10⁶ hiPSCs per layer yielded 7.2 × 10⁸ hiPSCs in 4-layer CPs and 1.7 × 10⁹ hiPSCs in 10-layer CPs with pluripotency. hiPSCs were sequentially differentiated into cardiomyocytes (CMs) in a two-dimensional (2D) differentiation protocol. The efficiency of cardiac differentiation using 10-layer CPs with AGV was 66%–87%. Approximately 6.2–7.0 × 10⁸ cells (4-layer) and 1.5–2.8 × 10⁹ cells (10-layer) were obtained with AGV. After metabolic purification with glucose- and glutamine-depleted and lactate-supplemented media, a massive amount of purified CMs was prepared. Here, we present a scalable 2D culture system using multilayer CPs with AGV for hiPSC-derived CMs, which will facilitate clinical applications for severe heart failure in the near future.

  10. A Stream Tilling Approach to Surface Area Estimation for Large Scale Spatial Data in a Shared Memory System

    Directory of Open Access Journals (Sweden)

    Liu Jiping

    2017-12-01

    Full Text Available Surface area estimation is a widely used tool for resource evaluation in the physical world. When processing large-scale spatial data, the input/output (I/O) can easily become the bottleneck in parallelizing the algorithm, due to the limited physical memory resources and the very slow disk transfer rate. In this paper, we propose a stream tilling approach to surface area estimation that first decomposes a spatial data set into tiles with topological expansions. With these tiles, the one-to-one mapping relationship between the input and the computing process is broken. We then realize a streaming framework for the scheduling of the I/O processes and computing units. Each computing unit encapsulates an identical copy of the estimation algorithm, and multiple asynchronous computing units can work individually in parallel. Finally, the performed experiment demonstrates that our stream tilling estimation can efficiently alleviate the heavy pressure from I/O-bound work, and that the measured speedup after optimization greatly outperforms directly parallelized versions on shared memory systems with multi-core processors.
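
    The tile-and-stream idea can be sketched as follows: a raster is split into row bands sharing a one-row overlap (a stand-in for the topological expansion), each band's triangulated surface area is computed by an independent worker, and the partial results are summed. The synthetic grid is illustrative, and in CPython a process pool would be needed for a true parallel speedup; the thread pool below only mirrors the scheduling structure.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def tile_area(z, cell=1.0):
            # triangulated surface area of one tile: two triangles per grid cell
            total = 0.0
            for i in range(z.shape[0] - 1):
                for j in range(z.shape[1] - 1):
                    p00 = np.array([0.0, 0.0, z[i, j]])
                    p10 = np.array([cell, 0.0, z[i, j + 1]])
                    p01 = np.array([0.0, cell, z[i + 1, j]])
                    p11 = np.array([cell, cell, z[i + 1, j + 1]])
                    total += 0.5 * np.linalg.norm(np.cross(p10 - p00, p01 - p00))
                    total += 0.5 * np.linalg.norm(np.cross(p10 - p11, p01 - p11))
            return total

        rng = np.random.default_rng(0)
        dem = rng.random((201, 120))                        # synthetic elevation grid
        tiles = [dem[i:i + 51] for i in range(0, 200, 50)]  # 50-row bands, 1-row overlap
        with ThreadPoolExecutor() as pool:
            area = sum(pool.map(tile_area, tiles))
        print(f"estimated surface area: {area:.1f}")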

  11. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim

    Directory of Open Access Journals (Sweden)

    José Medina Pestana

    Full Text Available The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

  12. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim.

    Science.gov (United States)

    Pestana, José Medina

    2016-10-01

    The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

  13. Implications of environmental regulation and coal plant retirements in systems with large scale penetration of wind power

    International Nuclear Information System (INIS)

    Rahmani, Mohsen; Jaramillo, Paulina; Hug, Gabriela

    2016-01-01

    Over the last decade there have been a growing number of federal and state regulations aimed at controlling air emissions at power plants and/or increasing the penetration of renewable resources in the grid. Environmental Protection Agency regulations will likely lead to the retrofit, retirement, or replacement of coal-fired power plants while the state Renewable Portfolio Standards will continue to drive large-scale deployment of renewable energy sources, primarily wind. Combined, these changes in the generation fleet could have profound implications for the operations of the power system. In this paper, we aim to better understand the interaction between coal plant retirements and increased levels of wind power. We extensively analyze the operations of the PJM electricity system under a broad set of scenarios that include varying levels of wind penetration and coal plant retirements. Not surprisingly, we find that without transmission upgrades, retirement of coal-fired power plants will likely result in considerable transmission congestion and higher energy prices. Increased wind penetration, with high geographic diversity, could mitigate some of the negative effects of coal plant retirement and lead to a significant reduction in air emissions, but wind forecast error might impose operational constraints on the system at times of peak load. - Highlights: •Retirement of coal plants may increase transmission congestion and LMP prices. •EPA rules might lead to significant reductions in emission of air pollutants. •Wind geographical diversity may reduce transmission constraints and air emissions. •At times of high peak load, wind may not reduce system stress caused by retirement. •RPS policies can support and mitigate negative impacts of EPA regulations.

  14. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

    We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota, which provides high-resolution visualizations on the order of 15 million pixels. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. New features in the current version include the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging within the user interface of the web application. We will explain choices made regarding implementation, the overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals and features, including our plans for increasing the scalability of the system and the benefits potentially afforded us by a migration of server-side components to the Google Application Engine (http://code.google.com/appengine/).

  15. Parametric Evaluation of Large-Scale High-Temperature Electrolysis Hydrogen Production Using Different Advanced Nuclear Reactor Heat Sources

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; McKellar, Michael G.; O'Brien, James E.; Herring, J. Stephen

    2009-01-01

    High Temperature Electrolysis (HTE), when coupled to an advanced nuclear reactor capable of operating at reactor outlet temperatures of 800 C to 950 C, has the potential to efficiently produce the large quantities of hydrogen needed to meet future energy and transportation needs. To evaluate the potential benefits of nuclear-driven hydrogen production, the UniSim process analysis software was used to evaluate different reactor concepts coupled to a reference HTE process design concept. The reference HTE concept included an Intermediate Heat Exchanger and intermediate helium loop to separate the reactor primary system from the HTE process loops, and additional heat exchangers to transfer reactor heat from the intermediate loop to the HTE process loops. The two process loops consisted of the water/steam loop feeding the cathode side of an HTE electrolysis stack, and the sweep gas loop used to remove oxygen from the anode side. The UniSim model of the process loops included pumps to circulate the working fluids and heat exchangers to recover heat from the oxygen and hydrogen product streams to improve the overall hydrogen production efficiencies. The reference HTE process loop model was coupled to separate UniSim models developed for three different advanced reactor concepts (a high-temperature helium cooled reactor concept and two different supercritical CO2 reactor concepts). Sensitivity studies were then performed to evaluate the effect of reactor outlet temperature on the power cycle efficiency and overall hydrogen production efficiency for each of the reactor power cycles. The results of these sensitivity studies showed that overall power cycle and hydrogen production efficiencies increased with reactor outlet temperature, but the power cycles producing the highest efficiencies varied depending on the temperature range considered.

  16. DEVELOPMENT AND ADAPTATION OF VORTEX REALIZABLE MEASUREMENT SYSTEM FOR BENCHMARK TEST WITH LARGE SCALE MODEL OF NUCLEAR REACTOR

    Directory of Open Access Journals (Sweden)

    S. M. Dmitriev

    2017-01-01

    Full Text Available Recent decades of development in applied calculation methods for nuclear reactor thermal and hydraulic processes have been marked by the rapid growth of High Performance Computing (HPC), which has contributed to the active introduction of Computational Fluid Dynamics (CFD). The use of such programs to justify technical and economic parameters, and especially the safety of nuclear reactors, requires comprehensive verification of the mathematical models and CFD programs. The aim of this work was the development and adaptation of a measuring system with the characteristics necessary for its application in a verification test (experimental) facility. Its main objective is to study the mixing of coolant flows with different physical properties (for example, different concentrations of dissolved impurities) inside a large-scale reactor model. The basic method used for registration of the spatial concentration field in the mixing area is spatial conductometry. In the course of the work, a measurement complex was created, including spatial conductometric sensors, a system of secondary converters and software. Methods for calibration and normalization of the measurement results were developed. Averaged concentration fields and nonstationary realizations of the measured local conductivity were obtained during the first experimental series, and spectral and statistical analyses of the realizations were carried out. The acquired data were compared with pretest CFD calculations performed in the ANSYS CFX program. A joint analysis of the obtained results made it possible to identify the main regularities of the process under study, and to demonstrate the capability of the designed measuring system to deliver experimental data of the "CFD quality" required for verification. The adaptation of the spatial sensors carried out here allows a more extensive program of experimental tests to be conducted, on the basis of which a databank and the necessary generalizations will be created.

  17. A Mobile System for Measuring Water Surface Velocities Using Unmanned Aerial Vehicle and Large-Scale Particle Image Velocimetry

    Science.gov (United States)

    Chen, Y. L.

    2015-12-01

    Measurement technologies for river flow velocity are divided into intrusive and nonintrusive methods. Intrusive methods require in-field operations; their measuring process is time consuming and likely to cause damage to operator and instrument. Nonintrusive methods require fewer operators and can reduce instrument damage from direct contact with the flow. Nonintrusive measurements may use radar or image velocimetry to measure the velocities at the surface of the water flow. Image velocimetry, such as large scale particle image velocimetry (LSPIV), accesses not only the point velocity but the flow velocities over an area simultaneously. Flow properties over an area hold the promise of providing spatial information about flow fields. This study attempts to construct a mobile system, UAV-LSPIV, by using an unmanned aerial vehicle (UAV) with LSPIV to measure flows in the field. The mobile system consists of a six-rotor UAV helicopter, a Sony nex5T camera, a gimbal, an image transfer device, a ground station and a remote control device. The active gimbal helps maintain the camera lens orthogonal to the water surface and reduces the extent to which images are distorted. The image transfer device can monitor the captured image instantly. The operator controls the UAV with the remote control device through the ground station and can access flight data such as the flying height and GPS coordinates of the UAV. The mobile system was then applied to field experiments. The deviation between velocities measured by UAV-LSPIV in the field experiments and by a handheld Acoustic Doppler Velocimeter (ADV) is under 8%. The results of the field experiments suggest that UAV-LSPIV can be effectively applied to surface flow studies.
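
    The LSPIV principle itself reduces to locating the cross-correlation peak between interrogation windows of two successive frames and scaling the displacement by the frame interval. The sketch below does this for synthetic imagery, with an assumed frame interval and pixel-to-metre scale standing in for the camera calibration.

        import numpy as np

        # Synthetic pair of frames: the second is the first shifted by a known
        # displacement, standing in for tracer motion between video frames.
        rng = np.random.default_rng(42)
        frame1 = rng.random((64, 64))
        frame2 = np.roll(np.roll(frame1, 3, axis=0), 5, axis=1)  # true shift (3, 5)

        win = frame1[16:48, 16:48] - frame1[16:48, 16:48].mean()
        best, shift = -np.inf, (0, 0)
        for sy in range(-8, 9):                   # +/- 8 pixel search window
            for sx in range(-8, 9):
                cand = frame2[16 + sy:48 + sy, 16 + sx:48 + sx]
                cand = cand - cand.mean()
                score = float((win * cand).sum()) # un-normalised cross-correlation
                if score > best:
                    best, shift = score, (sy, sx)

        dt, scale = 0.04, 0.02                    # frame interval (s), m per pixel
        vy, vx = shift[0] * scale / dt, shift[1] * scale / dt
        print(f"estimated shift {shift}, surface velocity ({vx:.2f}, {vy:.2f}) m/s")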

  18. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  19. Evaluation of the Potential Environmental Impacts from Large-Scale Use and Production of Hydrogen in Energy and Transportation Applications

    Energy Technology Data Exchange (ETDEWEB)

    Wuebbles, D.J.; Dubey, M.K., Edmonds, J.; Layzell, D.; Olsen, S.; Rahn, T.; Rocket, A.; Wang, D.; Jia, W.

    2010-06-01

    The purpose of this project is to systematically identify and examine possible near- and long-term ecological and environmental effects of the production of hydrogen from various energy sources, based on the DOE hydrogen production strategy, and of the use of that hydrogen in transportation applications. This project uses state-of-the-art numerical modeling tools of the environment and energy system emissions, in combination with relevant new and prior measurements and other analyses, to assess the understanding of the potential ecological and environmental impacts of hydrogen market penetration. H2 technology options and market penetration scenarios will be evaluated using energy-technology-economics models as well as atmospheric trace gas projections based on the IPCC SRES scenarios, including the decline in halocarbons due to the Montreal Protocol. Specifically, we investigate the impact of hydrogen releases on the oxidative capacity of the atmosphere, the long-term stability of the ozone layer under changes in hydrogen emissions, the impact of hydrogen emissions and resulting concentrations on climate, the impact on microbial ecosystems involved in hydrogen uptake, and criteria pollutants emitted from distributed and centralized hydrogen production pathways and their impacts on human health, air quality, ecosystems, and structures under different penetration scenarios.

  20. Application of Large-Scale, Multi-Resolution Watershed Modeling Framework Using the Hydrologic and Water Quality System (HAWQS)

    Directory of Open Access Journals (Sweden)

    Haw Yen

    2016-04-01

    Full Text Available In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources allocation, sediment transport, and pollution control. Among commonly adopted models, the Soil and Water Assessment Tool (SWAT) has been demonstrated to provide superior performance with a large amount of referencing databases. However, it is cumbersome to perform the tedious initialization steps, such as preparing inputs and developing a model, for each new targeted study area. In this study, the Hydrologic and Water Quality System (HAWQS) is introduced to serve as a national-scale Decision Support System (DSS) for challenging watershed modeling tasks. HAWQS is a web-based DSS developed and maintained by Texas A & M University, and supported by the U.S. Environmental Protection Agency. Three spatial resolutions of Hydrologic Unit Codes (HUC8, HUC10, and HUC12) and three temporal scales (daily, monthly, and annual time steps) are available as alternatives for general users. In addition, users can specify preferred values of model parameters instead of using the pre-defined sets. With the aid of HAWQS, users can generate a preliminarily calibrated SWAT project within a few minutes by providing only the ending HUC number of the targeted watershed and the simulation period. In the case study, HAWQS was implemented on the Illinois River Basin, USA, with graphical demonstrations and associated analytical results. Scientists and/or decision-makers can take advantage of the HAWQS framework when addressing relevant topics or policies in the future.