WorldWideScience

Sample records for integrating large complex

  1. Large erupted complex odontoma

    Directory of Open Access Journals (Sweden)

    Vijeev Vasudevan

    2009-01-01

    Full Text Available Odontomas are a heterogeneous group of jaw bone lesions, classified as odontogenic tumors, which usually include well-diversified dental tissues. The term odontoma was introduced to the literature by Broca in 1867. Trauma, infection and hereditary factors are possible causes of these lesions. Among odontogenic tumors, odontomas constitute about two thirds of cases. These lesions usually develop slowly and asymptomatically, and in most cases they do not cross the bone borders. Two types of odontoma are recognized: compound and complex. Complex odontomas are less common than the compound variety, in a ratio of 1:2.3. Eruption of an odontoma into the oral cavity is rare. We present a case of complex odontoma in which apparent eruption occurred in the right maxillary second molar region.

  2. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
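
    The core idea behind digital normalization can be sketched in a few lines; the following is an illustrative toy version of the median-abundance filter (not the authors' implementation, which uses probabilistic k-mer counting): a read is kept only while the median abundance of its k-mers, counted over the reads kept so far, is below a coverage cutoff, so redundant high-coverage reads are discarded before assembly.

```python
from collections import Counter

def kmers(seq, k):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def digital_normalize(reads, k=4, cutoff=3):
    """Keep a read only while the median abundance of its k-mers
    (counted over the reads kept so far) is below the coverage cutoff."""
    counts = Counter()
    kept = []
    for read in reads:
        abundances = sorted(counts[x] for x in kmers(read, k))
        if abundances[len(abundances) // 2] < cutoff:  # median below cutoff
            kept.append(read)
            counts.update(kmers(read, k))
    return kept

# Ten identical reads collapse to the first three kept copies;
# the rare, previously unseen read survives the filter.
kept = digital_normalize(["ACGTACGT"] * 10 + ["TTTTCCCC"])
```

    This is why the method shrinks data volume while leaving the assembly nearly unchanged: low-coverage (information-bearing) reads pass through untouched.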

  3. An integration strategy for large enterprises

    Directory of Open Access Journals (Sweden)

    Risimić Dejan

    2007-01-01

    Full Text Available Integration is the process of enabling communication between disparate software components. Integration has been a burning issue for large enterprises over the last twenty years, because as much as 70% of the development and deployment budget is spent on integrating complex and heterogeneous back-end and front-end IT systems. Existing applications need to be integrated to support newer, faster, more accurate business processes and to provide meaningful, consistent management information. Historically, integration started with point-to-point approaches and evolved into simpler hub-and-spoke topologies. These topologies were combined with custom remote procedure calls, distributed object technologies and message-oriented middleware (MOM), continued with enterprise application integration (EAI), and used an application server as a primary vehicle for integration. The current phase of the evolution is service-oriented architecture (SOA) combined with an enterprise service bus (ESB). Technical aspects of the comparison between the aforementioned technologies are analyzed and presented. The result of the study is a recommended integration strategy for large enterprises.
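
    The topological advantage of the hub-and-spoke/bus styles over point-to-point can be illustrated with a toy sketch (not any particular ESB product): each of n services maintains a single connection to a shared bus instead of up to n-1 peer connections, and new subscribers attach without touching the publishers.

```python
class ServiceBus:
    """Minimal hub-and-spoke message bus (illustrative sketch, not a real ESB)."""

    def __init__(self):
        self._handlers = {}  # topic -> list of subscriber callbacks

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        # Deliver the message to every subscriber on the topic; collect replies.
        return [handle(message) for handle in self._handlers.get(topic, [])]


# Hypothetical services: billing and shipping both react to the same event
# without knowing about each other.
bus = ServiceBus()
bus.subscribe("order.created", lambda msg: f"billing saw {msg}")
bus.subscribe("order.created", lambda msg: f"shipping saw {msg}")
replies = bus.publish("order.created", "#42")
```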

  4. Development of a descriptive model of an integrated information system to support complex, dynamic, distributed decision making for emergency management in large organisations

    International Nuclear Information System (INIS)

    Andersen, V.; Andersen, H.B.; Axel, E.; Petersen, T.

    1990-01-01

    A short introduction is given to the European (ESPRIT II) project ''IT Support for Emergency Management - ISEM''. The project aims at the development of an integrated information system capable of supporting the complex, dynamic, distributed decision making involved in the management of emergencies. The basic models developed to describe and construct emergency management organisations and their preparedness are illustrated, and it is shown that similarities may be found even among emergency situations of quite different natures. (author)

  5. Reliability of large and complex systems

    CERN Document Server

    Kolowrocki, Krzysztof

    2014-01-01

    Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt

  6. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  7. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  8. Large branched self-assembled DNA complexes

    International Nuclear Information System (INIS)

    Tosch, Paul; Waelti, Christoph; Middelberg, Anton P J; Davies, A Giles

    2007-01-01

    Many biological molecules have been demonstrated to self-assemble into complex structures and networks by using their very efficient and selective molecular recognition processes. The use of biological molecules as scaffolds for the construction of functional devices, by self-assembling nanoscale complexes onto the scaffolds, has recently attracted significant attention, and many different applications in this field have emerged. In particular, DNA, owing to its inherent sophisticated self-organization and molecular recognition properties, has served widely as a scaffold for various nanotechnological self-assembly applications, with metallic and semiconducting nanoparticles, proteins, macromolecular complexes, inter alia, being assembled onto designed DNA scaffolds. Such scaffolds may typically contain multiple branch-points and comprise a number of DNA molecules self-assembled into the desired configuration. Previously, several studies have used synthetic methods to produce the constituent DNA of the scaffolds, but this typically constrains the size of the complexes. For applications that require larger self-assembling DNA complexes, several tens of nanometers or more, other techniques need to be employed. In this article, we discuss a generic technique to generate large branched DNA macromolecular complexes.

  9. Symplectic integration for complex wigglers

    International Nuclear Information System (INIS)

    Forest, E.; Ohmi, K.

    1992-01-01

    Using the example of the helical wiggler proposed for the KEK photon factory, we show how to integrate the equations of motion through the wiggler. The integration is performed in Cartesian coordinates. For the usual expanded Hamiltonian (without the square root), we derive a first-order symplectic integrator for the purpose of tracking through a wiggler in a ring. We also show how to include classical radiation for the computation of the damping decrement.
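
    The wiggler-specific integrator is not reproduced here, but the generic first-order symplectic scheme it belongs to (semi-implicit Euler for a separable Hamiltonian H = p²/2 + V(q)) can be sketched as follows. Its virtue for ring tracking is that the energy error stays bounded over very long runs instead of drifting, as a non-symplectic method's would.

```python
def symplectic_euler_step(q, p, dVdq, dt):
    """One first-order symplectic (semi-implicit Euler) step for H = p^2/2 + V(q)."""
    p = p - dt * dVdq(q)   # kick: update the momentum from the force -dV/dq
    q = q + dt * p         # drift: update the position with the NEW momentum
    return q, p

# Harmonic oscillator V(q) = q^2/2, tracked for many periods:
# the energy oscillates near its initial value 0.5 but does not drift.
q, p, dt = 1.0, 0.0, 0.05
for _ in range(10000):
    q, p = symplectic_euler_step(q, p, lambda x: x, dt)
energy = 0.5 * p * p + 0.5 * q * q
```

    Using the updated momentum in the drift step is exactly what makes the map symplectic; updating q with the old p instead would give the explicit Euler method, whose energy grows without bound.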

  10. Large complex ovarian cyst managed by laparoscopy

    OpenAIRE

    Dipak J. Limbachiya; Ankit Chaudhari; Grishma P. Agrawal

    2017-01-01

    Complex ovarian cyst with secondary infection is a rare disease that hardly responds to the usual antibiotic treatment. Most of the time, it hampers the day-to-day activities of women. It is commonly known to cause pain and fever. To our surprise, in our case the cyst was large enough to compress the ureter, and it was adherent to the surrounding structures. Laparoscopic removal of the cyst was performed and the specimen was sent for histopathological examination.

  11. Vertical integration from the large Hilbert space

    Science.gov (United States)

    Erler, Theodore; Konopka, Sebastian

    2017-12-01

    We develop an alternative description of the procedure of vertical integration based on the observation that amplitudes can be written in BRST exact form in the large Hilbert space. We relate this approach to the description of vertical integration given by Sen and Witten.

  12. Assembly and control of large microtubule complexes

    Science.gov (United States)

    Korolev, Kirill; Ishihara, Keisuke; Mitchison, Timothy

    Motility, division, and other cellular processes require rapid assembly and disassembly of microtubule structures. We report a new mechanism for the formation of asters, radial microtubule complexes found in very large cells. The standard model of aster growth assumes elongation of a fixed number of microtubules originating from the centrosomes. However, aster morphology in this model does not scale with cell size, and we found evidence for microtubule nucleation away from centrosomes. By combining polymerization dynamics and auto-catalytic nucleation of microtubules, we developed a new biophysical model of aster growth. The model predicts an explosive transition from an aster with a steady-state radius to one that expands as a travelling wave. At the transition, microtubule density increases continuously, but the aster growth rate jumps discontinuously to a nonzero value. We tested our model with biochemical perturbations in egg extract and confirmed the main theoretical predictions, including the jump in the growth rate. Our results show that asters can grow even though individual microtubules are short and unstable. The dynamic balance between microtubule collapse and nucleation could be a general framework for the assembly and control of large microtubule complexes. NIH GM39565; Simons Foundation 409704; Honjo International 486 Scholarship Foundation.

  13. Complex integration and Cauchy's theorem

    CERN Document Server

    Watson, GN

    2012-01-01

    This brief monograph by one of the great mathematicians of the early twentieth century offers a single-volume compilation of propositions employed in proofs of Cauchy's theorem. Developing an arithmetical basis that avoids geometrical intuitions, Watson also provides a brief account of the various applications of the theorem to the evaluation of definite integrals. Author G. N. Watson begins by reviewing various propositions of Poincaré's Analysis Situs, upon which proof of the theorem's most general form depends. Subsequent chapters examine the calculus of residues, calculus optimization, the
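
    The two facts the monograph builds on are easy to check numerically; as a hedged sketch (a quadrature experiment, not anything from Watson's text), a trapezoidal approximation of ∮ f(z) dz around the unit circle gives approximately 0 for an entire function, in accordance with Cauchy's theorem, and approximately 2πi for f(z) = 1/z, whose residue at the origin is 1.

```python
import cmath

def contour_integral(f, n=20000):
    """Trapezoidal approximation of the contour integral of f along the
    unit circle z(t) = exp(i*t), t in [0, 2*pi]."""
    total = 0j
    for k in range(n):
        z0 = cmath.exp(1j * (2 * cmath.pi * k / n))
        z1 = cmath.exp(1j * (2 * cmath.pi * (k + 1) / n))
        total += 0.5 * (f(z0) + f(z1)) * (z1 - z0)  # chord-by-chord trapezoid rule
    return total

# f(z) = z is entire, so the integral vanishes; f(z) = 1/z picks up 2*pi*i.
zero = contour_integral(lambda z: z)
residue_term = contour_integral(lambda z: 1 / z)
```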

  14. Guidelines for integrated risk assessment and management in large industrial areas. Inter-Agency programme on the assessment and management of health and environmental risks from energy and other complex industrial systems

    International Nuclear Information System (INIS)

    1998-01-01

    The IAEA, the United Nations Environment Programme (UNEP) within the framework of the Awareness and Preparedness for Emergencies at Local Level (APELL) programme, the United Nations Industrial Development Organization (UNIDO) and the World Health Organization (WHO) decided in 1986 to join forces in order to promote the use of integrated, area-wide approaches to risk management. An Inter-Agency Programme was established, bringing together expertise in health, the environment, industry and energy, all vital for effective risk management. The Inter-Agency Programme on the Assessment and Management of Health and Environmental Risks from Energy and Other Complex Industrial Systems aims at promoting and facilitating the implementation of integrated risk assessment and management for large industrial areas. This initiative includes the compilation of procedures and methods for environmental and public health risk assessment, the transfer of knowledge and experience amongst countries in the application of these procedures, and the implementation of an integrated approach to risk management. The purpose of the Inter-Agency Programme is to develop a broad approach to the identification, prioritization and minimization of industrial hazards in a given geographical area. The UN organizations sponsoring this programme have been involved for several years in activities aimed at the assessment and management of environmental and health risks, the prevention of major accidents, and emergency preparedness. These Guidelines have been developed on the basis of experience from these activities to assist in the planning and conduct of regional risk management projects. They provide a reference framework for undertaking integrated health and environmental risk assessment for large industrial areas and for formulating appropriate risk management strategies.

  15. Guidelines for integrated risk assessment and management in large industrial areas. Inter-Agency programme on the assessment and management of health and environmental risks from energy and other complex industrial systems

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    The IAEA, the United Nations Environment Programme (UNEP) within the framework of the Awareness and Preparedness for Emergencies at Local Level (APELL) programme, the United Nations Industrial Development Organization (UNIDO) and the World Health Organization (WHO) decided in 1986 to join forces in order to promote the use of integrated, area-wide approaches to risk management. An Inter-Agency Programme was established, bringing together expertise in health, the environment, industry and energy, all vital for effective risk management. The Inter-Agency Programme on the Assessment and Management of Health and Environmental Risks from Energy and Other Complex Industrial Systems aims at promoting and facilitating the implementation of integrated risk assessment and management for large industrial areas. This initiative includes the compilation of procedures and methods for environmental and public health risk assessment, the transfer of knowledge and experience amongst countries in the application of these procedures, and the implementation of an integrated approach to risk management. The purpose of the Inter-Agency Programme is to develop a broad approach to the identification, prioritization and minimization of industrial hazards in a given geographical area. The UN organizations sponsoring this programme have been involved for several years in activities aimed at the assessment and management of environmental and health risks, the prevention of major accidents, and emergency preparedness. These Guidelines have been developed on the basis of experience from these activities to assist in the planning and conduct of regional risk management projects. They provide a reference framework for undertaking integrated health and environmental risk assessment for large industrial areas and for formulating appropriate risk management strategies. Refs, figs, tabs.

  16. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the reliability (faultless function) of large-scale integration (LSI) and very large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI circuits. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.
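
    The authors' algorithm is not reproduced in the abstract; as a hedged sketch of the standard reliability arithmetic such evaluations build on (the textbook constant-failure-rate series model, an assumption on our part, not the paper's method): a circuit whose blocks all must work has reliability equal to the product of the blocks' exponential survival probabilities.

```python
import math

def series_reliability(failure_rates, hours):
    """Reliability of a series system of components with constant failure
    rates lambda_i (failures/hour): R(t) = exp(-(sum of lambda_i) * t)."""
    return math.exp(-sum(failure_rates) * hours)

# Hypothetical example: three circuit blocks with failure rates in failures/hour,
# evaluated over a 10,000-hour mission.
r = series_reliability([1e-7, 2e-7, 5e-7], hours=10000)
```

    The additivity of the rates is what makes fault-rate analysis of a large circuit tractable: each block contributes independently to the exponent.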

  17. Surgical Treatment of a Large Complex Odontoma

    Directory of Open Access Journals (Sweden)

    Burak Cezairli

    2017-08-01

    Full Text Available The treatment modalities for odontomas generally depend on the tumor's size. Small and medium lesions can usually be removed easily, allowing preservation of surrounding anatomical structures. In this study, we report the conservative surgical treatment of a large complex odontoma. A 19-year-old woman was referred to our clinic after a lesion was incidentally observed at her right mandibular angle. The patient was symptom-free at the time of the visit. Computed tomography (CT) images showed a mass measuring 3.5 cm x 3 cm x 2 cm. CT sections and three-dimensional images showed partially eroded buccal and lingual cortices. Surgical treatment was indicated with an initial diagnosis of compound odontoma. The lesion was removed after sectioning with a bur; maxillo-mandibular fixation (MMF) was not thought to be necessary, as the buccal and lingual cortices were mostly reliable for preventing a fracture. In our case, the size of the odontoma was suitable for a conservative treatment method, and with this modality we managed to prevent a possible fracture and eliminate the disadvantages of MMF.

  18. Path integral representations on the complex sphere

    Energy Technology Data Exchange (ETDEWEB)

    Grosche, C. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik

    2007-08-15

    In this paper we discuss the path integral representations for the coordinate systems on the complex sphere S3C. The Schroedinger equation, respectively the path integral, separates in exactly 21 orthogonal coordinate systems. We enumerate these coordinate systems and we are able to present the path integral representations explicitly in the majority of the cases. In each solution the expansion into the wave-functions is stated. Also, the kernel and the corresponding Green function can be stated in closed form in terms of the invariant distance on the sphere, respectively on the hyperboloid. (orig.)

  19. Path integral representations on the complex sphere

    International Nuclear Information System (INIS)

    Grosche, C.

    2007-08-01

    In this paper we discuss the path integral representations for the coordinate systems on the complex sphere S3C. The Schroedinger equation, respectively the path integral, separates in exactly 21 orthogonal coordinate systems. We enumerate these coordinate systems and we are able to present the path integral representations explicitly in the majority of the cases. In each solution the expansion into the wave-functions is stated. Also, the kernel and the corresponding Green function can be stated in closed form in terms of the invariant distance on the sphere, respectively on the hyperboloid. (orig.)

  20. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local self-consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
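
    The hour-by-hour comparison of PV production against measured consumption can be caricatured as follows (a sketch with made-up numbers, not the SHP dataset): in each hour the building can absorb at most min(production, consumption), and the self-consumed fraction is the sum of those minima divided by total production.

```python
def self_consumption(production_kwh, consumption_kwh):
    """Fraction of PV output used directly on-site, from matched hourly series."""
    used = sum(min(p, c) for p, c in zip(production_kwh, consumption_kwh))
    total = sum(production_kwh)
    return used / total if total else 0.0

# Hypothetical five-hour profile (kWh), not the measured SHP data:
# PV peaks at midday while the load is flat at 1 kWh per hour.
fraction = self_consumption([0.0, 2.0, 4.0, 2.0, 0.0], [1.0] * 5)
```

    This is why annual totals alone cannot determine self-consumption: the same yearly sums with different hourly overlap give a different fraction.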

  1. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the `forward' problem). This problem can be formulated more generally as a problem of `integration of information'. A probabilistic formulation of data integration is in principle simple: if all available information (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist for solving the data integration problem, either through an analytical description of the combined probability function or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. One type of information being too informative (and hence conflicting) leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case, it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results, and under
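
    When each source of information is quantified as a probability density, their conjunction is, up to normalization, the product of the densities, and that product can be explored with a sampling method such as random-walk Metropolis. The following is a minimal sketch of that idea (not the authors' code), assuming two hypothetical Gaussian information sources, N(0,1) and N(2,1), whose product is again Gaussian with mean 1:

```python
import math
import random

def metropolis_product(logp1, logp2, x0=0.0, steps=20000, scale=1.0):
    """Random-walk Metropolis sampling from the unnormalized product
    p1(x) * p2(x), i.e. the conjunction of two sources of information."""
    x = x0
    logp = logp1(x) + logp2(x)
    samples = []
    for _ in range(steps):
        y = x + random.gauss(0.0, scale)              # propose a nearby state
        logq = logp1(y) + logp2(y)
        if random.random() < math.exp(min(0.0, logq - logp)):
            x, logp = y, logq                         # accept the proposal
        samples.append(x)
    return samples
```

    Conflicting sources show up here directly: if the two densities barely overlap, almost every proposal into the overlap region is needed but rarely made, and the chain mixes poorly, which is the computational cost the abstract describes.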

  2. Integrative structure and functional anatomy of a nuclear pore complex

    Science.gov (United States)

    Kim, Seung Joong; Fernandez-Martinez, Javier; Nudelman, Ilona; Shi, Yi; Zhang, Wenzhu; Raveh, Barak; Herricks, Thurston; Slaughter, Brian D.; Hogan, Joanna A.; Upla, Paula; Chemmama, Ilan E.; Pellarin, Riccardo; Echeverria, Ignacia; Shivaraju, Manjunatha; Chaudhury, Azraa S.; Wang, Junjie; Williams, Rosemary; Unruh, Jay R.; Greenberg, Charles H.; Jacobs, Erica Y.; Yu, Zhiheng; de La Cruz, M. Jason; Mironska, Roxana; Stokes, David L.; Aitchison, John D.; Jarrold, Martin F.; Gerton, Jennifer L.; Ludtke, Steven J.; Akey, Christopher W.; Chait, Brian T.; Sali, Andrej; Rout, Michael P.

    2018-03-01

    Nuclear pore complexes play central roles as gatekeepers of RNA and protein transport between the cytoplasm and nucleoplasm. However, their large size and dynamic nature have impeded a full structural and functional elucidation. Here we determined the structure of the entire 552-protein nuclear pore complex of the yeast Saccharomyces cerevisiae at sub-nanometre precision by satisfying a wide range of data relating to the molecular arrangement of its constituents. The nuclear pore complex incorporates sturdy diagonal columns and connector cables attached to these columns, imbuing the structure with strength and flexibility. These cables also tie together all other elements of the nuclear pore complex, including membrane-interacting regions, outer rings and RNA-processing platforms. Inwardly directed anchors create a high density of transport factor-docking Phe-Gly repeats in the central channel, organized into distinct functional units. This integrative structure enables us to rationalize the architecture, transport mechanism and evolutionary origins of the nuclear pore complex.

  3. Integrative structure and functional anatomy of a nuclear pore complex.

    Science.gov (United States)

    Kim, Seung Joong; Fernandez-Martinez, Javier; Nudelman, Ilona; Shi, Yi; Zhang, Wenzhu; Raveh, Barak; Herricks, Thurston; Slaughter, Brian D; Hogan, Joanna A; Upla, Paula; Chemmama, Ilan E; Pellarin, Riccardo; Echeverria, Ignacia; Shivaraju, Manjunatha; Chaudhury, Azraa S; Wang, Junjie; Williams, Rosemary; Unruh, Jay R; Greenberg, Charles H; Jacobs, Erica Y; Yu, Zhiheng; de la Cruz, M Jason; Mironska, Roxana; Stokes, David L; Aitchison, John D; Jarrold, Martin F; Gerton, Jennifer L; Ludtke, Steven J; Akey, Christopher W; Chait, Brian T; Sali, Andrej; Rout, Michael P

    2018-03-22

    Nuclear pore complexes play central roles as gatekeepers of RNA and protein transport between the cytoplasm and nucleoplasm. However, their large size and dynamic nature have impeded a full structural and functional elucidation. Here we determined the structure of the entire 552-protein nuclear pore complex of the yeast Saccharomyces cerevisiae at sub-nanometre precision by satisfying a wide range of data relating to the molecular arrangement of its constituents. The nuclear pore complex incorporates sturdy diagonal columns and connector cables attached to these columns, imbuing the structure with strength and flexibility. These cables also tie together all other elements of the nuclear pore complex, including membrane-interacting regions, outer rings and RNA-processing platforms. Inwardly directed anchors create a high density of transport factor-docking Phe-Gly repeats in the central channel, organized into distinct functional units. This integrative structure enables us to rationalize the architecture, transport mechanism and evolutionary origins of the nuclear pore complex.

  4. Integration of functional complex oxide nanomaterials on silicon

    Directory of Open Access Journals (Sweden)

    Jose Manuel Vila-Fungueiriño

    2015-06-01

    Full Text Available The combination of standard wafer-scale semiconductor processing with the properties of functional oxides opens the way to innovative and more efficient devices with high-value applications that can be produced at large scale. This review covers the main strategies that are successfully used to monolithically integrate functional complex oxide thin films and nanostructures on silicon: the chemical solution deposition (CSD) approach and advanced physical vapor deposition techniques such as oxide molecular beam epitaxy (MBE). Special emphasis is placed on complex oxide nanostructures epitaxially grown on silicon using the combination of CSD and MBE. Several examples are presented, with particular stress on the control of interfaces and crystallization mechanisms in epitaxial perovskite oxide thin films, nanostructured quartz thin films, and octahedral molecular sieve nanowires. This review highlights the potential of complex oxide nanostructures and the combination of both chemical and physical elaboration techniques for novel oxide-based integrated devices.

  5. Reorganizing Complex Network to Improve Large-Scale Multiagent Teamwork

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2014-01-01

    Full Text Available Large-scale multiagent teamwork has become popular in various domains. As in human society's infrastructure, agents coordinate with only some of the others, forming a peer-to-peer complex network structure. Their organization has been proven to be a key factor influencing their performance. Our analysis identifies three key factors for expediting team performance. First, complex network effects may promote team performance. Second, coordination interactions are routed from their sources toward capable agents; although they can be transferred across the network via different paths, their sources and sinks depend on the intrinsic nature of the team, which is irrelevant to the network connections. Third, the agents involved in the same plan often form a subteam and communicate with each other more frequently. Therefore, if the interactions between agents can be statistically recorded, an integrated network adjustment algorithm can be set up by combining the three key factors. Based on our abstracted teamwork simulations and the coordination statistics, we implemented the adaptive reorganization algorithm. The experimental results support our design: the reorganized network is more capable of coordinating heterogeneous agents.

  6. Unusually large erupted complex odontoma: A rare case report

    Energy Technology Data Exchange (ETDEWEB)

    Bagewadi, Shivanand B.; Kukreja, Rahul; Suma, Gundareddy N.; Yadav, Bhawn; Sharma, Havi [Dept. of Oral Medicine and Radiology, ITS Centre for Dental Studies and Research, Murad Nagar (India)

    2015-03-15

    Odontomas are nonaggressive, hamartomatous developmental malformations composed of mature tooth substances and may be compound or complex depending on the extent of morphodifferentiation or on their resemblance to normal teeth. Among them, complex odontomas are relatively rare tumors. They are usually asymptomatic in nature. Occasionally, these tumors become large, causing bone expansion followed by facial asymmetry. Odontoma eruptions are uncommon, and thus far, very few cases of erupted complex odontomas have been reported in the literature. Here, we report the case of an unusually large, painless, complex odontoma located in the right posterior mandible.

  7. Architectures of adaptive integration in large collaborative projects

    Directory of Open Access Journals (Sweden)

    Lois Wright Morton

    2015-12-01

    Full Text Available Collaborations to address complex societal problems associated with managing human-natural systems often require large teams comprised of scientists from multiple disciplines. For many such problems, large-scale, transdisciplinary projects whose members include scientists, stakeholders, and other professionals are necessary. The success of very large, transdisciplinary projects can be facilitated by attending to the diversity of types of collaboration that inevitably occur within them. As projects progress and evolve, the resulting dynamic collaborative heterogeneity within them constitutes architectures of adaptive integration (AAI). Management that acknowledges this dynamic and fosters and promotes awareness of it within a project can better facilitate the creativity and innovation required to address problems from a systems perspective. In successful large projects, AAI (1) functionally meets objectives and goals, (2) uses disciplinary expertise and concurrently bridges many disciplines, (3) has mechanisms to enable connection, (4) delineates boundaries to keep focus but retain flexibility, (5) continuously monitors and adapts, and (6) encourages project-wide awareness. These principles are illustrated using as case studies three large climate change and agriculture projects funded by the U.S. Department of Agriculture-National Institute of Food and Agriculture.

  8. Path integral in area tensor Regge calculus and complex connections

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2006-01-01

    The Euclidean quantum measure in Regge calculus with independent area tensors is considered using the example of a Regge manifold of simple structure. We go over to integrations along certain contours in the hyperplane of complex connection variables. Discrete connection and curvature on classical solutions of the equations of motion are not, strictly speaking, genuine connection and curvature, but more general quantities; therefore, these do not appear as arguments of a function to be averaged, but are the integration (dummy) variables. We argue that upon integrating out the latter, the resulting measure can be well defined on the physical hypersurface (for the area tensors corresponding to certain edge vectors, i.e. to a certain metric) as positive and having an exponential cutoff at large areas, on condition that we confine ourselves to configurations which do not pass through degenerate metrics.

  9. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which requires innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for the manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  10. Using lanthanoid complexes to phase large macromolecular assemblies

    International Nuclear Information System (INIS)

    Talon, Romain; Kahn, Richard; Durá, M. Asunción; Maury, Olivier; Vellieux, Frédéric M. D.; Franzetti, Bruno; Girard, Eric

    2011-01-01

    A lanthanoid complex, [Eu(DPA)₃]³⁻, was used to obtain experimental phases at 4.0 Å resolution for PhTET1-12s, a large self-compartmentalized homo-dodecameric protease complex of 444 kDa. Lanthanoid ions exhibit extremely large anomalous X-ray scattering at their L III absorption edge. They are thus well suited for anomalous diffraction experiments. A novel class of lanthanoid complexes has been developed that combines the physical properties of lanthanoid atoms with functional chemical groups that allow non-covalent binding to proteins. Two structures of large multimeric proteins have already been determined by using such complexes. Here the use of the luminescent europium tris-dipicolinate complex [Eu(DPA)₃]³⁻ to solve the low-resolution structure of a 444 kDa homo-dodecameric aminopeptidase, called PhTET1-12s, from the archaeon Pyrococcus horikoshii is reported. Surprisingly, considering the low resolution of the data, the experimental electron density map is very well defined. Experimental phases obtained by using the lanthanoid complex lead to maps displaying structural features usually observed only in higher-resolution maps. Such complexes open a new way of solving the structure of large molecular assemblies, even with low-resolution data.

  11. Fuel pin integrity assessment under large scale transients

    International Nuclear Information System (INIS)

    Dutta, B.K.

    2006-01-01

    The integrity of fuel rods under normal, abnormal and accident conditions is an important consideration during the fuel design of advanced nuclear reactors. The fuel matrix and the sheath form the first barrier preventing the release of radioactive materials into the primary coolant. An understanding of fuel and clad behaviour under different reactor conditions, particularly under beyond-design-basis accident scenarios leading to large-scale transients, is always desirable to assess the inherent safety margins in fuel pin design and to plan for the mitigation of the consequences of accidents, if any. Severe accident conditions are typically characterized by energy deposition rates far exceeding the heat removal capability of the reactor coolant system. This may lead to clad failure due to fission gas pressure at high temperature, large-scale pellet-clad interaction and clad melting. Fuel rod performance is affected by many interdependent complex phenomena involving extremely complex material behaviour. The versatile experimental database available in this area has led to the development of powerful analytical tools to characterize fuel under extreme scenarios.

  12. Research and assessment of competitiveness of large engineering complexes

    Directory of Open Access Journals (Sweden)

    Krivorotov V.V.

    2017-01-01

    Full Text Available The urgency of the problem of ensuring the competitiveness of manufacturing and high-tech sectors is shown. The decisive role of large industrial complexes in shaping the results of the national economy is substantiated, and the authors' interpretation of the concept of an "industrial complex" with regard to current economic systems is given. Current approaches to assessing the competitiveness of enterprises and industrial complexes are analyzed, and their main advantages and disadvantages are shown. A scientific-methodological approach to the study and management of the competitiveness of a large industrial complex is provided, and its main units are described. As the central element of this approach, a methodology for assessing the competitiveness of a large industrial complex based on the Pattern method is proposed; a modular system of competitiveness indicators is developed and adapted to large engineering complexes. Using the developed methodology, the competitiveness of one of the largest engineering complexes, the Uralelectrotyazhmash group of companies, comprising leading enterprises of the electrotechnical industry of Russia, is assessed. The evaluation identified the main problems and bottlenecks in the development of these enterprises and compared them with leading competitors. Based on the results of the study, the main conclusions and recommendations are formulated.

  13. Integrating the Differentiated: A Review of the Personal Construct Approach to Cognitive Complexity

    OpenAIRE

    Kovářová, M. (Marie); Filip, M. (Miroslav)

    2015-01-01

    This article reviews personal construct psychology (PCP) research on cognitive complexity. It examines conceptual foundations, measures of cognitive complexity, and a large body of empirical findings. It identifies several ambiguities in the conceptualization of the two components of cognitive complexity: differentiation and integration. These ambiguities lead to inconsistent interpretations of indexes proposed for their measurement and consequently to an inconsistent interpretation of em...

  14. Integrated Modeling of Complex Optomechanical Systems

    Science.gov (United States)

    Andersen, Torben; Enmark, Anita

    2011-09-01

    Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons: first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time, mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top-ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.

  15. Integrated Visualisation and Description of Complex Systems

    National Research Council Canada - National Science Library

    Goodburn, D

    1999-01-01

    ... on system topographies and feature overlays. System information from the domain's information space is filtered and integrated into a Composite Systems Model that provides a basis for consistency and integration between all system views...

  16. How complex can integrated optical circuits become?

    NARCIS (Netherlands)

    Smit, M.K.; Hill, M.T.; Baets, R.G.F.; Bente, E.A.J.M.; Dorren, H.J.S.; Karouta, F.; Koenraad, P.M.; Koonen, A.M.J.; Leijtens, X.J.M.; Nötzel, R.; Oei, Y.S.; Waardt, de H.; Tol, van der J.J.G.M.; Khoe, G.D.

    2007-01-01

    The integration scale in Photonic Integrated Circuits will be pushed to VLSI-level in the coming decade. This will bring major changes in both application and manufacturing. In this paper developments in Photonic Integration are reviewed and the limits for reduction of device dimensions are

  17. RESEARCH ON COMPLEX, LARGE INDUSTRIAL PROJECTS IN TRANSNATIONAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Florin POPESCU

    2016-12-01

    Full Text Available More and more projects from different industrial sectors developed in a transnational environment are characterized as "complex". In recent years, there has been much discussion and controversy about the complexity of projects, and, despite what has been written and said in various papers, journals and professional conferences, more confusion than clarification has been created, with the complexity of projects being interpreted differently from one author to another. Most of the literature studied is based on a linear, analytical and rational approach, focusing on the planning and control dimensions of project management and less on projects that take place and grow in a dynamic socio-human environment in continuous change. This study represents a critical review of existing theoretical models found in the literature, highlighting their limitations. The output of this literature study is an integration of different approaches concerning complexity under one umbrella, to provide a common understanding of the evolution of this concept.

  18. Management of Large Erupting Complex Odontoma in Maxilla

    Directory of Open Access Journals (Sweden)

    Colm Murphy

    2014-01-01

    Full Text Available We present the unusual case of a large complex odontoma erupting in the maxilla. Odontomas are benign developmental tumours of odontogenic origin. They are characterized by slow growth and nonaggressive behaviour. Complex odontomas, which erupt, are rare. They are usually asymptomatic and are identified on routine radiograph but may present with erosion into the oral cavity with subsequent cellulitis and facial asymmetry. This present paper describes the presentation and management of an erupting complex odontoma, occupying the maxillary sinus with extension to the infraorbital rim. We also discuss various surgical approaches used to access this anatomic area.

  19. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  20. Structuring and assessing large and complex decision problems using MCDA

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    This paper presents an approach for the structuring and assessing of large and complex decision problems using multi-criteria decision analysis (MCDA). The MCDA problem is structured in a decision tree and assessed using the REMBRANDT technique featuring a procedure for limiting the number of pair...
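    The REMBRANDT technique referenced above scores pairwise comparisons on a geometric scale and derives priority weights multiplicatively, by row geometric means, rather than by the eigenvector method of AHP. A minimal sketch with hypothetical judgment values (the factor of 2 per scale step is one common convention, not taken from this paper):

```python
import numpy as np

# Hypothetical integer judgments delta[i][j] on REMBRANDT's scale:
# delta_ij > 0 means criterion i is preferred to criterion j.
delta = np.array([
    [ 0,  1,  2],
    [-1,  0,  1],
    [-2, -1,  0],
])

# Multiplicative comparison matrix on a geometric scale (factor 2 per step).
A = 2.0 ** delta

# REMBRANDT aggregates by row geometric means (the logarithmic
# least-squares solution), then normalizes to obtain weights.
w = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = w / w.sum()
print(weights)  # criterion 0 receives the highest weight
```

Because the judgments are perfectly consistent here (delta_ij = delta_ik + delta_kj), the weights come out exactly in ratio 4:2:1.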

  1. Accurate Complex Systems Design: Integrating Serious Games with Petri Nets

    Directory of Open Access Journals (Sweden)

    Kirsten Sinclair

    2016-03-01

    Full Text Available Difficulty understanding the large number of interactions involved in complex systems makes their successful engineering a problem. Petri Nets are one graphical modelling technique used to describe and check proposed designs of complex systems thoroughly. While the automatic analysis capabilities of Petri Nets are useful, their visual form is less so, particularly for communicating the design they represent. In engineering projects, this can lead to a gap in communications between people with different areas of expertise, negatively impacting the achievement of accurate designs. In contrast, although capable of representing a variety of real and imaginary objects effectively, the behaviour of serious games can only be analysed manually through interactive simulation. This paper examines combining the complementary strengths of Petri Nets and serious games. The novel contribution of this work is a serious game prototype of a complex system design that has been checked thoroughly. Underpinned by Petri Net analysis, the serious game can be used as a high-level interface to communicate and refine the design. Improvement of a complex system design is demonstrated by applying the integration to a proof-of-concept case study.
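    The Petri Net analysis this record relies on rests on a simple execution semantics: places hold tokens, and a transition fires by consuming a token from each input place and producing one in each output place. A minimal sketch of that firing rule (place and transition names are illustrative, not from the paper):

```python
# Minimal Petri net: places hold token counts; a transition is enabled
# when every input place holds at least one token, and firing moves
# tokens from inputs to outputs.
marking = {"ready": 1, "resource": 1, "done": 0}

transitions = {
    "work": {"inputs": ["ready", "resource"], "outputs": ["done", "resource"]},
}

def enabled(t):
    return all(marking[p] >= 1 for p in transitions[t]["inputs"])

def fire(t):
    assert enabled(t), f"transition {t} not enabled"
    for p in transitions[t]["inputs"]:
        marking[p] -= 1
    for p in transitions[t]["outputs"]:
        marking[p] += 1

fire("work")
print(marking)  # {'ready': 0, 'resource': 1, 'done': 1}
```

Reachability and deadlock checks, the "automatic analysis capabilities" the abstract mentions, amount to exploring which markings this rule can produce.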

  2. Risk management integration into complex project organizations

    Science.gov (United States)

    Fisher, K.; Greanias, G.; Rose, J.; Dumas, R.

    2002-01-01

    This paper describes the approach used in designing and adapting the SIRTF prototype, discusses some of the lessons learned in developing the SIRTF prototype, and explains the adaptability of the risk management database to varying levels of project complexity.

  3. Harnessing Product Complexity: An Integrative Approach

    OpenAIRE

    Orfi, Nihal Mohamed Sherif

    2011-01-01

    In today's market, companies are faced with pressure to increase variety in product offerings. While increasing variety can help increase market share and sales growth, the costs of doing so can be significant. Ultimately, variety causes complexity in products and processes to soar, which negatively impacts product development, quality, production scheduling, efficiency and more. Product variety is just one common cause of product complexity, a topic that several researchers have tackled with...

  4. Risk Management and Uncertainty in Large Complex Public Projects

    DEFF Research Database (Denmark)

    Neerup Themsen, Tim; Harty, Chris; Tryggestad, Kjell

    Governmental actors worldwide are promoting risk management as a rational approach to manage uncertainty and improve the ability to deliver large complex projects according to budget, time plans, and pre-set project specifications. But what do we know about the effects of risk management on the ability to meet such objectives? Using Callon’s (1998) twin notions of framing and overflowing, we examine the implementation of risk management within the Danish public sector and the effects this generated for the management of two large complex projects. We show how the rational framing of risk management has generated unexpected costly outcomes such as: the undermining of the longer-term value and societal relevance of the built asset, the negligence of the wider range of uncertainties emerging during project processes, and constraining forms of knowledge. We also show how expert accountants play

  5. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

    In the last decade rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been

  6. Integrated pollution control for oil refinery complexes

    Energy Technology Data Exchange (ETDEWEB)

    Kiperstok, A. [Bahia Univ., Salvador, BA (Brazil); Sharratt, P.N. [Manchester Univ. (United Kingdom). Inst. of Science and Technology

    1993-12-31

    Improving environmental performance of oil refineries is a complex task. Emission limits, operating constraints, available technologies, operating techniques, and local environment sensitivity must all be considered. This work describes efforts to build an interactive software to deal with this problem. 8 refs., 5 figs.

  8. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis, (2) Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) Noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  9. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  10. Contribution of Large Region Joint Associations to Complex Traits Genetics

    Science.gov (United States)

    Paré, Guillaume; Asma, Senay; Deng, Wei Q.

    2015-01-01

    A polygenic model of inheritance, whereby hundreds or thousands of weakly associated variants contribute to a trait’s heritability, has been proposed to underlie the genetic architecture of complex traits. However, relatively few genetic variants have been positively identified so far and they collectively explain only a small fraction of the predicted heritability. We hypothesized that joint association of multiple weakly associated variants over large chromosomal regions contributes to complex traits variance. Confirmation of such regional associations can help identify new loci and lead to a better understanding of known ones. To test this hypothesis, we first characterized the ability of commonly used genetic association models to identify large region joint associations. Through theoretical derivation and simulation, we showed that multivariate linear models where multiple SNPs are included as independent predictors have the most favorable association profile. Based on these results, we tested for large region association with height in 3,740 European participants from the Health and Retirement Study (HRS) study. Adjusting for SNPs with known association with height, we demonstrated clustering of weak associations (p = 2×10⁻⁴) in regions extending up to 433.0 Kb from known height loci. The contribution of regional associations to phenotypic variance was estimated at 0.172 (95% CI 0.063-0.279; p < 0.001), which compared favorably to 0.129 explained by known height variants. Conversely, we showed that suggestively associated regions are enriched for known height loci. To extend our findings to other traits, we also tested BMI, HDLc and CRP for large region associations, with consistent results for CRP. Our results demonstrate the presence of large region joint associations and suggest these can be used to pinpoint weakly associated SNPs. PMID:25856144
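    The multivariate linear model the authors favor, with all SNPs in a region entered jointly as independent predictors of the trait, can be sketched on synthetic data. Genotypes, effect sizes, and sample sizes below are simulated for illustration, not taken from the HRS study:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1000, 20                      # individuals, SNPs in the region

# Synthetic genotypes (0/1/2 minor-allele counts) and a trait to which
# each SNP contributes weakly, mimicking a polygenic region.
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)
beta = rng.normal(0, 0.05, size=m)
y = G @ beta + rng.normal(0, 1, size=n)

# Joint (multivariate) model: regress the trait on all SNPs at once,
# via least squares on a design matrix with an intercept column.
X = np.column_stack([np.ones(n), G])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Proportion of trait variance explained jointly by the region.
resid = y - X @ coef
r2 = 1 - resid.var() / y.var()
print(round(r2, 3))
```

A region-level test then compares this joint fit against a null model with the intercept (and any known loci) only, which is how clustering of individually weak signals becomes detectable.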

  11. Privatization of Land Plot Under Integral Real Estate Complex

    Directory of Open Access Journals (Sweden)

    Maruchek A. A.

    2014-10-01

    Full Text Available The article deals with the questions concerning the privatization of a land plot under an integral real estate complex. The authors come to the conclusion that a number of legislative norms relating to the privatization of a land plot do not take into account the construction of an integral real estate complex, which could cause some problems in the realization of the right to privatize the land plot.

  12. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    Full Text Available The concept of “large scale” obviously depends on the phenomenon of interest. For example, in the foundation of thermodynamics from microscopic dynamics, the “large” spatial and time scales are of the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined relative to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics, the spatial and time scales of interest are of the order of thousands of kilometres and many years, respectively, compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth “large-scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundation-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
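    The Zwanzig projection approach described above is conventionally summarized by the Nakajima–Zwanzig equation; in one standard notation (not taken from this abstract, and sign conventions vary between authors), with projector P, complement Q = 1 - P, and Liouville operator L, the projected density evolves as:

```latex
\frac{\partial}{\partial t} P\rho(t)
  = P L P \rho(t)
  + \int_0^t \mathrm{d}s \, P L \, e^{Q L s} \, Q L P \rho(t-s)
  + P L \, e^{Q L t} \, Q \rho(0)
```

The memory integral (second term) generates the series of differential operators the abstract refers to; for Hamiltonian dynamics L is the Poisson-bracket Liouvillian, and the paper's contribution concerns handling non-Hamiltonian L via a Lie algebra of dissipative operators.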

  13. Everyday value conflicts and integrative complexity of thought.

    Science.gov (United States)

    Myyry, Liisa

    2002-12-01

    This study examined the value pluralism model in everyday value conflicts, and the effect of issue context on complexity of thought. According to the cognitive manager model, we hypothesized that respondents would attain a higher level of integrative complexity on personal issues than on professional and general issues. We also explored the relations of integrative complexity to value priorities, measured by the Schwartz Value Survey, and to emotional empathy. The value pluralism model was not supported by the data collected from 126 university students of social science, business and technology. The cognitive manager model was partially confirmed by the data from females but not from males. Concerning value priorities, more complex respondents had a higher regard for self-transcendence values, and less complex respondents for self-enhancement values. Emotional empathy was also significantly related to the complexity score.

  14. Optimizing liquid effluent monitoring at a large nuclear complex.

    Science.gov (United States)

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in annual cost savings of US$223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
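    An exceedance probability like the one-in-a-million figure reported above can be estimated by fitting a distribution to baseline concentration data. A sketch assuming lognormally distributed effluent concentrations (all numbers illustrative, not from the study):

```python
import math
import statistics

# Hypothetical baseline concentrations (mg/L) from routine sampling.
baseline = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.95, 1.05, 0.85]
limit = 5.0  # permit limit, mg/L (illustrative)

# Fit a lognormal by taking moments of the log-concentrations.
logs = [math.log(x) for x in baseline]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

# P(X > limit) for lognormal X via the normal tail at log(limit).
z = (math.log(limit) - mu) / sigma
p_exceed = 0.5 * math.erfc(z / math.sqrt(2))
print(f"P(exceed) ~ {p_exceed:.2e}")
```

With a baseline this far below the limit, the fitted tail probability is vanishingly small, which is the kind of evidence used to justify reduced sampling frequency.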

  15. Large-Eddy Simulations of Flows in Complex Terrain

    Science.gov (United States)

    Kosovic, B.; Lundquist, K. A.

    2011-12-01

    Large-eddy simulation (LES) as a methodology for numerical simulation of turbulent flows was first developed to study turbulent flows in the atmosphere by Lilly (1967). The first LES were carried out by Deardorff (1970), who used these simulations to study atmospheric boundary layers. Ever since, LES has been extensively used to study canonical atmospheric boundary layers, in most cases flat-plate boundary layers under the assumption of horizontal homogeneity. Carefully designed LES of canonical convective and neutrally stratified, and more recently stably stratified, atmospheric boundary layers have contributed significantly to a better understanding of these flows and their parameterizations in large-scale models. These simulations were often carried out using codes specifically designed and developed for large-eddy simulations of horizontally homogeneous flows with periodic lateral boundary conditions. Recent developments in multi-scale numerical simulations of atmospheric flows enable numerical weather prediction (NWP) codes such as ARPS (Chow and Street, 2009), COAMPS (Golaz et al., 2009) and the Weather Research and Forecasting (WRF) model to be used nearly seamlessly across a wide range of atmospheric scales, from synoptic down to turbulent scales in atmospheric boundary layers. Before we can with confidence carry out multi-scale simulations of atmospheric flows, NWP codes must be validated for accurate performance in simulating flows over complex or inhomogeneous terrain. We therefore carry out validation of WRF-LES for simulations of flows over complex terrain using data from the Askervein Hill (Taylor and Teunissen, 1985, 1987) and METCRAX (Whiteman et al., 2008) field experiments. WRF's nesting capability is employed with a one-way nested inner domain that includes complex terrain representation, while the coarser outer nest is used to spin up fully developed atmospheric boundary layer turbulence and thus accurately represent inflow to the inner domain. LES of a

  16. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.
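    The modular organization discussed above is usually quantified with a partition-quality score. As an illustration (using Newman's modularity Q rather than the paper's random-walk method), the sketch below scores a toy graph of two tightly knit groups joined by a single edge:

```python
def modularity(adj, communities):
    """Newman modularity Q of a partition of an undirected, unweighted
    graph given as an adjacency set per node."""
    m = sum(len(nbrs) for nbrs in adj.values()) / 2  # number of edges
    label = {n: c for c, nodes in enumerate(communities) for n in nodes}
    q = 0.0
    for i in adj:
        for j in adj:
            a_ij = 1.0 if j in adj[i] else 0.0
            expected = len(adj[i]) * len(adj[j]) / (2 * m)
            if label[i] == label[j]:
                q += a_ij - expected
    return q / (2 * m)

# Two triangles joined by a single edge: a clearly modular toy graph.
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
q = modularity(adj, [{0, 1, 2}, {3, 4, 5}])  # clearly positive: strong modules
```

    A positive Q well above zero, as here, indicates many more intra-module edges than a degree-matched random graph would produce; voxel-level brain networks are scored the same way, just at vastly larger scale.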

  17. Geophysical mapping of complex glaciogenic large-scale structures

    DEFF Research Database (Denmark)

    Høyer, Anne-Sophie

    2013-01-01

    This thesis presents the main results of a four year PhD study concerning the use of geophysical data in geological mapping. The study is related to the Geocenter project, “KOMPLEKS”, which focuses on the mapping of complex, large-scale geological structures. The study area is approximately 100 km2...... data types and co-interpret them in order to improve our geological understanding. However, in order to perform this successfully, methodological considerations are necessary. For instance, a structure indicated by a reflection in the seismic data is not always apparent in the resistivity data...... information) can be collected. The geophysical data are used together with geological analyses from boreholes and pits to interpret the geological history of the hill-island. The geophysical data reveal that the glaciotectonic structures truncate at the surface. The directions of the structures were mapped...

  18. Complex Formation Control of Large-Scale Intelligent Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Lei

    2012-01-01

    Full Text Available A new formation framework for large-scale intelligent autonomous vehicles is developed, which can realize complex formations while reducing data exchange. Using the proposed hierarchical formation method and the automatic dividing algorithm, vehicles are automatically divided into leaders and followers by exchanging information via wireless network at the initial time. Then, leaders form the formation's geometric shape using global formation information, and followers track their own virtual leaders to form a line formation using local information. The formation control laws of leaders and followers are designed based on consensus algorithms. Moreover, collision-avoidance problems are considered and solved using artificial potential functions. Finally, a simulation example consisting of 25 vehicles shows the effectiveness of the theory.
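    The consensus-based formation laws can be illustrated with a minimal discrete-time sketch. Here each vehicle nudges its state so that its position minus its formation offset agrees with its neighbors; the communication graph, offsets, gain, and initial positions are all hypothetical, and collision avoidance is omitted.

```python
def formation_step(x, neighbors, offsets, gain=0.5):
    """One consensus update: each vehicle steers so that x[i] - offsets[i]
    agrees with its neighbors, which drives the group into formation."""
    new_x = list(x)
    for i, ni in neighbors.items():
        err = sum((x[j] - offsets[j]) - (x[i] - offsets[i]) for j in ni) / len(ni)
        new_x[i] = x[i] + gain * err
    return new_x

# Hypothetical 1D line formation for 3 vehicles, spaced 1.0 apart.
offsets = [0.0, 1.0, 2.0]
neighbors = {0: [1], 1: [0, 2], 2: [1]}
x = [0.3, 0.1, 5.0]  # arbitrary initial positions
for _ in range(100):
    x = formation_step(x, neighbors, offsets)
```

    After the iterations converge, the inter-vehicle spacings match the offset differences (1.0 apart), i.e. the line formation is achieved using only local neighbor information.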

  19. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality......

  20. Complexity of Configurators Relative to Integrations and Field of Application

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Battistello, Loris

    Configurators are applied widely to automate the specification processes at companies. The literature describes the industrial application of configurators supporting both sales and engineering processes, where configurators supporting the engineering processes are described as more challenging...... Moreover, configurators are commonly integrated with various IT systems within companies. The complexity of configurators is an important factor when it comes to performance, development and maintenance of the systems. A direct comparison of the complexity based on the different application...... integrations to other IT systems. The research method adopted in the paper is based on a survey followed by interviews, where the unit of analysis is based on operating configurators within a company.......

  1. Knowledge Sharing Strategies for Large Complex Building Projects.

    Directory of Open Access Journals (Sweden)

    Esra Bektas

    2013-06-01

    Full Text Available The construction industry is a project-based sector with a myriad of actors such as architects, construction companies, consultants, and producers of building materials (Anumba et al., 2005). The interaction between the project partners is often quite limited, which leads to insufficient knowledge sharing during the project and knowledge being unavailable for reuse (Fruchter et al. 2002). The result can be a considerable amount of extra work, delays and cost overruns. Design outcomes that are supposed to function as boundary objects across different disciplines can lead to misinterpretation of requirements, project content and objectives. In this research, knowledge is seen as resulting from social interactions; knowledge resides in communities and it is generated through social relationships (Wenger 1998, Olsson et al. 2008). Knowledge is often tacit, intangible and context-dependent, and it is articulated in the changing responsibilities, roles, attitudes and values that are present in the work environment (Bresnen et al., 2003). In a project environment, knowledge enables individuals to solve problems, take decisions, and apply these decisions to actions. In order to achieve a shared understanding and minimize misunderstanding and misinterpretations among project actors, it is necessary to share knowledge (Fong 2003). Sharing knowledge is particularly crucial in large complex building projects (LCBPs) in order to accelerate the building process, improve architectural quality and prevent mistakes or undesirable results. However, knowledge sharing is often hampered through professional or organizational boundaries or contractual concerns. When knowledge is seen as an organizational asset, there is little willingness among project organizations to share their knowledge. Individual people may recognize the need to promote knowledge sharing throughout the project, but typically there is no deliberate strategy agreed by all project partners to address

  2. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for large-scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation. Models are being developed - validation is essential. Coordination of wind and hydro generation is a key to allowing more wind power capacity in areas with limited transmission corridors. For the case study grid, depending on technology and control, the allowed wind farm size is increased from 50 to 200 MW. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the power system flexibility and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy - combining wind and hydro constitutes a win-win system (ml)

  3. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply...... was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...... integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  4. Data integration, systems approach and multilevel description of complex biosystems

    International Nuclear Information System (INIS)

    Hernández-Lemus, Enrique

    2013-01-01

    Recent years have witnessed the development of new quantitative approaches and theoretical tenets in the biological sciences. The advent of high-throughput experiments in genomics, proteomics and electrophysiology (to cite just a few examples) has provided researchers with unprecedented amounts of data to be analyzed. Large datasets, however, cannot provide the means to achieve a complete understanding of the underlying biological phenomena unless they are supplied with a solid theoretical framework and with proper analytical tools. It is now widely accepted that by using and extending some of the paradigmatic principles of what has been called complex systems theory, some degree of advance in this direction can be attained. We present ways in which data integration techniques (linear, non-linear, combinatorial, graphical), multidimensional multilevel descriptions (multifractal modeling, dimensionality reduction, computational learning), as well as an approach based on systems theory (interaction maps, probabilistic graphical models, non-equilibrium physics), have allowed us to better understand some problems at the interface of Statistical Physics and Computational Biology

  5. Packaging Concerns and Techniques for Large Devices: Challenges for Complex Electronics

    Science.gov (United States)

    LaBel, Kenneth A.; Sampson, Michael J.

    2010-01-01

    NASA is going to have to accept the use of non-hermetic packages for complex devices. There are a large number of packaging options available. Space application subjects the packages to stresses that they were probably not designed for (vacuum, for instance). NASA has to find a way of having assurance in the integrity of the packages. There are manufacturers interested in qualifying non-hermetic packages to MIL-PRF-38535 Class V. Government space users agree that Class V should be for hermetic packages only. NASA is working on a new Class for non-hermetic packages for M38535 Appendix B, "Class Y". Testing for package integrity will be required but can be package-specific, as described by a Package Integrity Test Plan. The plan is developed by the manufacturer and approved by DSCC and government space.

  6. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  7. On the benefits of an integrated nuclear complex for Nevada

    International Nuclear Information System (INIS)

    Blink, J.A.; Halsey, W.G.

    1994-01-01

    An integrated nuclear complex is proposed for location at the Nevada Test Site. In addition to solving the nuclear waste disposal problem, this complex would tremendously enhance the southern Nevada economy, and it would provide low cost electricity to each resident and business in the affected counties. Nuclear industry and the national economy would benefit because the complex would demonstrate the new generation of safer nuclear power plants and revitalize the industry. Many spin-offs of the complex would be possible, including research into nuclear fusion and a world class medical facility for southern Nevada. For such a complex to become a reality, the cycle of distrust between the federal government and the State of Nevada must be broken. The paper concludes with a discussion of implementation through a public process led by state officials and culminating in a voter referendum

  8. Navigating Complexities: An Integrative Approach to English Language Teacher Education

    Science.gov (United States)

    Ryan, Phillip; Glodjo, Tyler; Hobbs, Bethany; Stargel, Victoria; Williams, Thad

    2015-01-01

    This article is an analysis of one undergraduate English language teacher education program's integrative theoretical framework that is structured around three pillars: interdisciplinarity, critical pedagogy, and teacher exploration. First, the authors survey the unique complexities of language teaching and learning. Then, they introduce this…

  9. Integration and test plans for complex manufacturing systems

    NARCIS (Netherlands)

    Boumen, R.

    2007-01-01

    The integration and test phases that are part of the development and manufacturing of complex manufacturing systems are costly and time consuming. As time-to-market is becoming increasingly important, it is crucial to keep these phases as short as possible, while maintaining system quality. This is

  10. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    Science.gov (United States)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.

  11. Estimating large complex projects Estimando proyectos grandes y complejos

    Directory of Open Access Journals (Sweden)

    Cliff Schexnayder

    2007-08-01

    Full Text Available Managing large capital construction projects requires the coordination of a multitude of human, organizational, technical, and natural resources. Quite often, the engineering and construction complexities of such projects are overshadowed by economic, societal, and political challenges. The ramifications and effects, which result from differences between early project cost estimates and the bid price or the final project cost, are significant. Over the time span between the initiation of a project and the completion of construction, many factors influence a project's final costs. This time span is normally several years in duration, but for highly complex and technologically challenging projects, project duration can easily exceed a decade. Over that period, changes to the project scope often occur. The subject here is a presentation of strategies that support realistic cost estimating. Through literature review and interviews with transportation agencies in the U.S. and internationally, the authors developed a database of the factors that are the root causes of cost estimation problems. Gestionar proyectos de construcción de grandes capitales requiere de la coordinación de una multitud de recursos humanos, organizacionales, técnicos y naturales. Frecuentemente, las complejidades del diseño y construcción de esos grandes proyectos son tapadas por sus desafíos económicos, políticos y sociales. Las ramificaciones y efectos que resultan de las diferencias entre la estimación de costo inicial, el costo de la propuesta adjudicada y el costo final del proyecto son significativas. Hay numerosos factores que inciden en el costo final del proyecto entre su inicio y finalización. La duración es generalmente de varios años y puede incluso superar la década para aquellos especialmente complejos y desafiantes. En ese período de tiempo, los alcances del proyecto cambian frecuentemente. El tópico del presente artículo es mostrar

  12. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  13. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  14. Integral criteria for large-scale multiple fingerprint solutions

    Science.gov (United States)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. The explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of a real multiple fingerprint test show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
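    The trade-off the abstract describes, minimizing the FRR while the FAR is predefined, can be sketched directly on fused scores. The scores below are hypothetical; in practice they would be integral (e.g. summed or otherwise combined) similarity scores from several fingerprint matchers.

```python
def frr_at_far(genuine, impostor, max_far):
    """Pick the lowest threshold whose FAR (fraction of impostor scores
    at or above it) does not exceed max_far, then report the FRR
    (fraction of genuine scores below that threshold)."""
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)
        if far <= max_far:
            frr = sum(s < t for s in genuine) / len(genuine)
            return t, frr
    return None

# Hypothetical fused match scores for genuine and impostor comparisons.
genuine = [0.9, 0.8, 0.85, 0.95, 0.7]
impostor = [0.2, 0.3, 0.25, 0.4, 0.75]
t, frr = frr_at_far(genuine, impostor, max_far=0.2)
```

    With real score distributions the same search is done on the fused (integral) score, and a good fusion rule separates the two distributions enough that the FRR at the required FAR stays small.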

  15. New integrable structures in large-N QCD

    International Nuclear Information System (INIS)

    Ferretti, Gabriele; Heise, Rainer; Zarembo, Konstantin

    2004-01-01

    We study the anomalous dimensions of single trace operators composed of field strengths F μν in large-N QCD. The matrix of anomalous dimensions is the Hamiltonian of a compact spin chain with two spin one representations at each vertex corresponding to the self-dual and anti-self-dual components of F μν . Because of the special form of the interaction it is possible to study separately renormalization of purely self-dual components. In this sector the Hamiltonian is integrable and can be exactly solved by Bethe ansatz. Its continuum limit is described by the level two SU(2) Wess-Zumino-Witten model

  16. Large-area smart glass and integrated photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Lampert, C.M. [Star Science, 8730 Water Road, Cotati, CA 94931-4252 (United States)

    2003-04-01

    Several companies throughout the world are developing dynamic glazing and large-area flat panel displays. University and National Laboratory groups are researching new materials and processes to improve these products. The concept of a switchable glazing for building and vehicle application is very attractive. Conventional glazing only offers fixed transmittance and control of energy passing through it. Given the wide range of illumination conditions and glare, a dynamic glazing with adjustable transmittance offers the best solution. Photovoltaics can be integrated as power sources for smart windows. In this way a switchable window could be a completely stand alone smart system. A new range of large-area flat panel display including light-weight and flexible displays are being developed. These displays can be used for banner advertising, dynamic pricing in stores, electronic paper, and electronic books, to name only a few applications. This study covers selected switching technologies including electrochromism, suspended particles, and encapsulated liquid crystals.

  17. Integration of radiation and physical safety in large radiator facilities

    International Nuclear Information System (INIS)

    Lima, P.P.M.; Benedito, A.M.; Lima, C.M.A.; Silva, F.C.A. da

    2017-01-01

    Growing international concern about radioactive sources after the September 11, 2001 events has led to a strengthening of physical safety. There is evidence that the illicit use of radioactive sources is a real possibility and may result in harmful radiological consequences for the population and the environment. In Brazil there are about 2000 medical, industrial and research facilities with radioactive sources, of which 400 are Category 1 and 2 as classified by the International Atomic Energy Agency (IAEA); among these, large irradiators occupy a prominent position due to their very high cobalt-60 activities. Radiological safety is well established in these facilities, due to the intense work of the authorities in the country. In the paper, the main aspects of radiological and physical safety applied in large irradiators are presented, in order to integrate both concepts for the benefit of safety as a whole. The research showed that the items related to radiation safety are well defined, for example, the tests on the access control devices to the irradiation room. On the other hand, items related to physical security, such as effective control of access to the company and use of safety cameras throughout the company, are not yet fully incorporated. Integration of radiological and physical safety is fundamental for total safety. The elaboration of a Brazilian regulation on the subject is of extreme importance

  18. Iterative methods for the solution of very large complex symmetric linear systems of equations in electrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Clemens, M.; Weiland, T. [Technische Hochschule Darmstadt (Germany)

    1996-12-31

    In the field of computational electrodynamics, the discretization of Maxwell's equations using the Finite Integration Theory (FIT) yields very large, sparse, complex symmetric linear systems of equations. For this class of complex non-Hermitian systems, a number of conjugate gradient-type algorithms are considered. The complex version of the biconjugate gradient (BiCG) method by Jacobs can be extended to a whole class of methods for complex-symmetric systems, SCBiCG(T, n), which only require one matrix-vector multiplication per iteration step. In this class, the well-known conjugate orthogonal conjugate gradient (COCG) method for complex-symmetric systems corresponds to the case n = 0. The case n = 1 yields the BiCGCR method, which corresponds to the conjugate residual algorithm for the real-valued case. These methods, in combination with a minimal residual smoothing process, are applied separately to practical 3D electro-quasistatic and eddy-current problems in electrodynamics. The practical performance of the SCBiCG methods is compared with that of other methods such as QMR and TFQMR.
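    A minimal sketch of the COCG iteration mentioned above, which exploits complex symmetry (A equal to its transpose, not Hermitian) by replacing the Hermitian inner product with the unconjugated bilinear form. The dense 2×2 system is purely illustrative; FIT matrices are very large and sparse, and a practical solver would also guard against breakdown.

```python
def cocg(A, b, tol=1e-10, max_iter=200):
    """Conjugate Orthogonal Conjugate Gradient for complex symmetric A
    (A == A^T, not Hermitian). Uses the unconjugated bilinear form
    <x, y> = sum(x_i * y_i) in place of the Hermitian inner product."""
    n = len(b)
    x = [0j] * n
    r = list(b)
    p = list(r)
    rho = sum(ri * ri for ri in r)  # unconjugated <r, r>
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rho / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if max(abs(ri) for ri in r) < tol:
            break
        rho_new = sum(ri * ri for ri in r)
        beta = rho_new / rho
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rho = rho_new
    return x

# Small complex symmetric (non-Hermitian) illustrative system.
A = [[4 + 1j, 1 + 2j], [1 + 2j, 3 - 1j]]
b = [1 + 0j, 2 + 0j]
x = cocg(A, b)
```

    Only one matrix-vector product per iteration is needed, which is the property the abstract highlights for the SCBiCG(T, n) family; COCG is its n = 0 member.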

  19. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Roč. 33, September (2012), s. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords : neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  20. Some thoughts on the management of large, complex international space ventures

    Science.gov (United States)

    Lee, T. J.; Kutzer, Ants; Schneider, W. C.

    1992-01-01

    Management issues relevant to the development and deployment of large international space ventures are discussed with particular attention given to previous experience. Management approaches utilized in the past are labeled as either simple or complex, and signs of efficient management are examined. Simple approaches include those in which experiments and subsystems are developed for integration into spacecraft, and the Apollo-Soyuz Test Project is given as an example of a simple multinational approach. Complex approaches include those for ESA's Spacelab Project and the Space Station Freedom in which functional interfaces cross agency and political boundaries. It is concluded that individual elements of space programs should be managed by individual participating agencies, and overall configuration control is coordinated by level with a program director acting to manage overall objectives and project interfaces.

  1. Integrated circuit devices in control systems of coal mining complexes

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    Systems of automatic monitoring and control of coal mining complexes developed in the 1960s used electromagnetic relays, thyristors, and flip-flops on transistors of varying conductivity. The circuits' designers devoted much attention to ensuring spark safety, lowering power consumption, and raising the noise immunity and repairability of functional devices. The fast development of integrated circuitry led to the use of microelectronic components in most devices of mine automation. An analysis of specifications and experimental research into integrated circuits (IMS) shows that the series K 176 IMS components made with CMOS technology best meet mine conditions of operation. The use of IMS devices under mine conditions has demonstrated their high reliability. Further development of integrated circuitry involves using microprocessors and microcomputers. (SC)

  2. Complexity and network dynamics in physiological adaptation: an integrated view.

    Science.gov (United States)

    Baffy, György; Loscalzo, Joseph

    2014-05-28

    Living organisms constantly interact with their surroundings and sustain internal stability against perturbations. This dynamic process follows three fundamental strategies (restore, explore, and abandon) articulated in historical concepts of physiological adaptation such as homeostasis, allostasis, and the general adaptation syndrome. These strategies correspond to elementary forms of behavior (ordered, chaotic, and static) in complex adaptive systems and invite a network-based analysis of the operational characteristics, allowing us to propose an integrated framework of physiological adaptation from a complex network perspective. Applicability of this concept is illustrated by analyzing molecular and cellular mechanisms of adaptation in response to the pervasive challenge of obesity, a chronic condition resulting from sustained nutrient excess that prompts chaotic exploration for system stability associated with tradeoffs and a risk of adverse outcomes such as diabetes, cardiovascular disease, and cancer. Deconstruction of this complexity holds the promise of gaining novel insights into physiological adaptation in health and disease. Published by Elsevier Inc.

  3. Data mining in large sets of complex data

    CERN Document Server

    Cordeiro, Robson L F; Júnior, Caetano Traina

    2013-01-01

The amount and the complexity of the data gathered by current enterprises are increasing at an exponential rate. Consequently, the analysis of Big Data is nowadays a central challenge in Computer Science, especially for complex data. For example, given a satellite image database containing tens of Terabytes, how can we find regions of native rainforest, deforestation or reforestation? Can it be done automatically? Based on the work discussed in this book, the answers to both questions are a sound "yes", and the results can be obtained in just minutes. In fact, results that

  4. International Requirements for Large Integration of Renewable Energy Sources

    DEFF Research Database (Denmark)

    Molina-Garcia, Angel; Hansen, Anca Daniela; Muljadi, Ed

    2017-01-01

    Most European countries have concerns about the integration of large amounts of renewable energy sources (RES) into electric power systems, and this is currently a topic of growing interest. In January 2008, the European Commission published the 2020 package, which proposes committing the European...... Union to a 20% reduction in greenhouse gas emissions, to achieve a target of deriving 20% of the European Union's final energy consumption from renewable sources, and to achieve 20% improvement in energy efficiency both by the year 2020 [1]. Member states have different individual goals to meet...... these overall objectives, and they each need to provide a detailed roadmap describing how they will meet these legally binding targets [2]. At this time, RES are an indispensable part of the global energy mix, which has been partially motivated by the continuous increases in hydropower as well as the rapid...

  5. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong

    2013-01-01

This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and tip speed ratio, the optimal airfoils are designed based on the local speed ratios. To achieve high power performance at low cost, the airfoils are designed...... with an objective of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated, which forms the basic input to the blade design. The new airfoils are designed based on the previous in-house airfoil family, which was optimized at a Reynolds number of 3...... million. A novel shape perturbation function is introduced to optimize the geometry on the existing airfoils and thus simplify the design procedure. The viscous/inviscid code XFOIL is used as the aerodynamic tool for airfoil optimization, where the Reynolds number is set at 16 million with a free...

  6. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2014-01-01

    This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and a tip speed ratio, optimal airfoils are designed based on the local speed ratios. To achieve a high power performance at low cost, the airfoils are designed...... with the objectives of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated which forms the basic input to the blade design. The new airfoils are designed based on a previous in-house designed airfoil family which was optimized at a Reynolds number...... of 3 million. A novel shape perturbation function is introduced to optimize the geometry based on the existing airfoils which simplifies the design procedure. The viscous/inviscid interactive code XFOIL is used as the aerodynamic tool for airfoil optimization at a Reynolds number of 16 million...
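The idea of optimizing an offset on top of an existing airfoil, as described in the two abstracts above, can be sketched briefly. The paper's own perturbation function is not reproduced here; the classic Hicks-Henne bump functions below are a stand-in, and the baseline coordinates, amplitudes and peak locations are illustrative assumptions.

```python
import math

def hicks_henne(x, peak, width=3.0):
    """Smooth bump on [0, 1] peaking at `peak`, zero at both ends."""
    m = math.log(0.5) / math.log(peak)
    return math.sin(math.pi * x ** m) ** width if 0.0 < x < 1.0 else 0.0

def perturb(xs, ys, amplitudes, peaks):
    """Add a weighted sum of bumps to the baseline surface ys(xs)."""
    return [y + sum(a * hicks_henne(x, p) for a, p in zip(amplitudes, peaks))
            for x, y in zip(xs, ys)]

xs = [i / 20 for i in range(21)]                 # chordwise stations
ys = [0.1 * math.sin(math.pi * x) for x in xs]   # toy baseline surface
# An optimizer would search over the amplitudes; here they are fixed.
new_ys = perturb(xs, ys, amplitudes=[0.01, -0.005], peaks=[0.3, 0.7])
```

Because each bump vanishes at both ends of the chord, the leading and trailing edges stay fixed while the optimizer reshapes only the interior geometry.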

  7. High-level waste program integration within the DOE complex

    International Nuclear Information System (INIS)

    Valentine, J.H.; Malone, K.; Schaus, P.S.

    1998-03-01

Eleven major Department of Energy (DOE) site contractors were chartered by the Assistant Secretary to use a systems engineering approach to develop and evaluate technically defensible cost savings opportunities across the complex. Known as the complex-wide Environmental Management Integration (EMI), this process evaluated all the major DOE waste streams including high level waste (HLW). Across the DOE complex, this waste stream has the highest life cycle cost and is scheduled to take until at least 2035 before all HLW is processed for disposal. Technical contract experts from the four DOE sites that manage high level waste participated in the integration analysis: Hanford, Savannah River Site (SRS), Idaho National Engineering and Environmental Laboratory (INEEL), and West Valley Demonstration Project (WVDP). In addition, subject matter experts from the Yucca Mountain Project and the Tanks Focus Area participated in the analysis. Also, departmental representatives from the US Department of Energy Headquarters (DOE-HQ) monitored the analysis and results. Workouts were held throughout the year to develop recommendations to achieve a complex-wide integrated program. From this effort, the HLW Environmental Management (EM) Team identified a set of programmatic and technical opportunities that could result in potential cost savings and avoidance in excess of $18 billion and an accelerated completion of the HLW mission by seven years. The cost savings, schedule improvements, and volume reduction are attributed to a multifaceted HLW treatment and disposal strategy which involves waste pretreatment, standardized waste matrices, risk-based retrieval, early development and deployment of a shipping system for glass canisters, and reasonable, low-cost tank closure.

  8. ABOUT MODELING COMPLEX ASSEMBLIES IN SOLIDWORKS – LARGE AXIAL BEARING

    Directory of Open Access Journals (Sweden)

    Cătălin IANCU

    2017-12-01

This paper presents the modeling strategy used in SOLIDWORKS for modeling special items such as a large axial bearing, and the steps to be taken in order to obtain a better design. The paper presents the features used for modeling parts, and then the steps that must be taken to obtain the 3D model of a large axial bearing used in bucket-wheel equipment for coal handling.

  9. The Integrated Complex of Marketing of Higher Education Services

    Directory of Open Access Journals (Sweden)

    Zhehus Olena V.

    2017-10-01

The article, on the basis of a generalization of the scientific views of foreign and domestic scientists, substantiates an integrated model of marketing of higher education products and services that takes their specificities into consideration. The obtained result is the «5P + S» model, which includes the newly introduced poly-element «proposition», combining the interrelated and indivisible elements of «product», «people» and «process», as well as the traditional elements of the services marketing complex: «price», «place», «promotion» and «physical evidence». The «social marketing» element has been added to the integrated model on the basis of the high societal importance of educational services. Altogether, the proposed integrated model of the complex of marketing of higher education products and services is a symbiosis of commercial and non-commercial marketing, which will enhance the social and economic efficiency of functioning of higher educational institutions.

  10. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors that offer label-free, portable and rapid detection. Nevertheless, their large scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs constituted by randomly oriented silicon nanowires are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performances open new opportunities for sensing applications.

  11. Quantifying complexity in translational research: an integrated approach.

    Science.gov (United States)

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Generally, the evidence generated was valuable for understanding various components in translational research. In particular, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some may argue that the results are biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no guidance for assessing complexity. The proposed method aims to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
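The consistency ratio used above as a guard against subjective bias has a standard computation in AHP. A minimal sketch, assuming Saaty's random indices and a toy pairwise-comparison matrix (neither taken from the paper):

```python
# Saaty's random consistency indices, indexed by matrix size n.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def consistency_ratio(A):
    """A: reciprocal pairwise-comparison matrix (list of lists)."""
    n = len(A)
    col_sums = [sum(row[j] for row in A) for j in range(n)]
    # Priority vector: average of the normalized columns.
    w = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
    # Principal eigenvalue estimate: mean of (A w)_i / w_i.
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)          # consistency index
    return ci / RI[n] if RI[n] else 0.0   # consistency ratio

# A perfectly consistent 3x3 matrix yields a ratio of 0; in practice a
# ratio below about 0.1 is taken to mean the judgments are acceptable.
A = [[1, 2, 4], [1 / 2, 1, 2], [1 / 4, 1 / 2, 1]]
print(consistency_ratio(A))
```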

  12. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It permits the behaviour of several components assembled to process a flow of data to be defined using BIT. Test cases are defined in a way that makes them simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
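The virtual-component idea can be illustrated with a minimal sketch: several data-flow stages are grouped into one unit whose combined behaviour is checked by test cases shipped with the assembly. The class name, the pipeline stages and the built-in test cases below are all hypothetical, not the authors' API.

```python
class VirtualComponent:
    """Groups chained data-flow components so they can be tested as one unit."""

    def __init__(self, *stages):
        self.stages = stages          # components in flow order

    def process(self, item):
        for stage in self.stages:
            item = stage(item)
            if item is None:          # a stage may drop the datum
                return None
        return item

    def run_built_in_tests(self, cases):
        """cases: (input, expected output) pairs defined with the assembly."""
        return all(self.process(x) == want for x, want in cases)

double = lambda x: x * 2
drop_negative = lambda x: x if x >= 0 else None
vc = VirtualComponent(double, drop_negative)
print(vc.run_built_in_tests([(3, 6), (-1, None)]))   # → True
```

The test cases travel with the composed unit rather than with any single component, which is what lets the integration behaviour of the flow be checked built-in.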

  13. Integrator complex plays an essential role in adipose differentiation

    International Nuclear Information System (INIS)

    Otani, Yuichiro; Nakatsu, Yusuke; Sakoda, Hideyuki; Fukushima, Toshiaki; Fujishiro, Midori; Kushiyama, Akifumi; Okubo, Hirofumi; Tsuchiya, Yoshihiro; Ohno, Haruya; Takahashi, Shin-Ichiro; Nishimura, Fusanori; Kamata, Hideaki; Katagiri, Hideki; Asano, Tomoichiro

    2013-01-01

Highlights: •IntS6 and IntS11 are subunits of the Integrator complex. •Expression levels of IntS6 and IntS11 were very low in 3T3-L1 fibroblasts. •IntS6 and IntS11 were upregulated during adipose differentiation. •Suppression of IntS6 or IntS11 expression inhibited adipose differentiation. -- Abstract: The dynamic process of adipose differentiation involves stepwise expression of transcription factors and proteins specific to the mature fat cell phenotype. In this study, it was revealed that expression levels of IntS6 and IntS11, subunits of the Integrator complex, were increased in 3T3-L1 cells in the period when the cells reached confluence and differentiated into adipocytes, while being reduced to basal levels after the completion of differentiation. Suppression of IntS6 or IntS11 expression using siRNAs in 3T3-L1 preadipocytes markedly inhibited differentiation into mature adipocytes, based on morphological findings as well as mRNA analysis of adipocyte-specific genes such as Glut4, perilipin and Fabp4. Although Pparγ2 protein expression was suppressed in IntS6- or IntS11-siRNA-treated cells, adenoviral forced expression of Pparγ2 failed to restore the capacity for differentiation into mature adipocytes. Taken together, these findings demonstrate that increased expression of Integrator complex subunits is an indispensable event in adipose differentiation. Although further study is necessary to elucidate the underlying mechanism, the processing of U1 and U2 small nuclear RNAs may be involved in the cell differentiation steps

  14. Computer tomography of large dust clouds in complex plasmas

    International Nuclear Information System (INIS)

    Killer, Carsten; Himpel, Michael; Melzer, André

    2014-01-01

    The dust density is a central parameter of a dusty plasma. Here, a tomography setup for the determination of the three-dimensionally resolved density distribution of spatially extended dust clouds is presented. The dust clouds consist of micron-sized particles confined in a radio frequency argon plasma, where they fill almost the entire discharge volume. First, a line-of-sight integrated dust density is obtained from extinction measurements, where the incident light from an LED panel is scattered and absorbed by the dust. Performing these extinction measurements from many different angles allows the reconstruction of the 3D dust density distribution, analogous to a computer tomography in medical applications
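The two steps in the abstract above, extinction measurement and multi-angle reconstruction, can be sketched briefly. The Beer-Lambert relation turns transmitted intensity into a line-integrated density, and an algebraic reconstruction (Kaczmarz/ART) recovers the density field from many rays; the extinction cross-section, the 2x2 grid and the ray geometry are illustrative assumptions, not the paper's setup.

```python
import math

def line_density(I, I0, sigma):
    """Line-of-sight integrated dust density from transmitted intensity,
    via Beer-Lambert: I = I0 * exp(-sigma * integral of n dl)."""
    return -math.log(I / I0) / sigma

def art(W, p, n_cells, sweeps=200):
    """Kaczmarz/ART: rows of W weight the grid cells crossed by each ray,
    p holds the measured line integrals; returns the cell densities."""
    x = [0.0] * n_cells
    for _ in range(sweeps):
        for wi, pi in zip(W, p):
            dot = sum(w * xi for w, xi in zip(wi, x))
            norm = sum(w * w for w in wi)
            lam = (pi - dot) / norm
            x = [xi + lam * w for xi, w in zip(x, wi)]
    return x

# Toy 2x2 density grid [a b; c d]; rays: two rows, two columns, one diagonal.
true_x = [1.0, 2.0, 3.0, 4.0]
W = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 0, 1]]
p = [sum(w * t for w, t in zip(row, true_x)) for row in W]
print(art(W, p, 4))
```

With enough viewing angles the ray system pins down every cell, which is why the extinction measurements from many directions suffice to reconstruct the 3D density.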

  15. Protein complex detection in PPI networks based on data integration and supervised learning method.

    Science.gov (United States)

    Yu, Feng; Yang, Zhi; Hu, Xiao; Sun, Yuan; Lin, Hong; Wang, Jian

    2015-01-01

Revealing protein complexes is important for understanding principles of cellular organization and function. High-throughput experimental techniques have produced a large number of protein interactions, which makes it possible to predict protein complexes from protein-protein interaction (PPI) networks. However, the small amount of known physical interactions may limit protein complex detection. New PPI networks are constructed by integrating PPI datasets with the large and readily available PPI data from the biomedical literature, and the less reliable PPIs are then filtered out based on the semantic similarity and topological similarity of the two proteins involved. Finally, supervised learning protein complex detection (SLPC), which can make full use of the information in available known complexes, is applied to detect protein complexes on the new PPI networks. The experimental results of SLPC on two categories of yeast PPI networks demonstrate the effectiveness of the approach: compared with the original PPI networks, the best average improvements of 4.76, 6.81 and 15.75 percentage units in the F-score, accuracy and maximum matching ratio (MMR) are achieved, respectively; compared with the denoised PPI networks, the best average improvements of 3.91, 4.61 and 12.10 percentage units in the F-score, accuracy and MMR are achieved, respectively; and compared with ClusterONE, the state-of-the-art complex detection method, on the denoised extended PPI networks, average improvements of 26.02 and 22.40 percentage units in the F-score and MMR are achieved, respectively. The experimental results show that the performance of SLPC improves substantially when newly available PPI data from the biomedical literature are integrated into the original and denoised PPI networks. In addition, our protein complex detection method can achieve better performance than ClusterONE.
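The topological-similarity filter mentioned above can be sketched with the Jaccard index of the two proteins' neighbourhoods, one common choice for such denoising; the 0.2 cutoff and the toy network are illustrative assumptions, not values from the paper, and the semantic-similarity term is omitted.

```python
def neighbours(edges):
    """Adjacency sets for an undirected PPI edge list."""
    nb = {}
    for u, v in edges:
        nb.setdefault(u, set()).add(v)
        nb.setdefault(v, set()).add(u)
    return nb

def jaccard(nb, u, v):
    """Jaccard index of the neighbourhoods of u and v, excluding each other."""
    a, b = nb[u] - {v}, nb[v] - {u}
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def denoise(edges, cutoff=0.2):
    """Keep only edges whose endpoints share enough neighbours."""
    nb = neighbours(edges)
    return [(u, v) for u, v in edges if jaccard(nb, u, v) >= cutoff]

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
print(denoise(edges))   # → [('A', 'B'), ('A', 'C'), ('B', 'C')]
```

The dangling edge C-D is dropped because its endpoints share no common partners, mimicking how literature-derived interactions with no topological support are discarded before complex detection.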

  16. Complex Nonlinearity Chaos, Phase Transitions, Topology Change and Path Integrals

    CERN Document Server

    Ivancevic, Vladimir G

    2008-01-01

Complex Nonlinearity: Chaos, Phase Transitions, Topology Change and Path Integrals is a book about the prediction and control of the general nonlinear and chaotic dynamics of high-dimensional complex systems of various physical and non-physical nature, and their underpinning geometro-topological change. The book starts with a textbook-like exposé on nonlinear dynamics, attractors and chaos, both temporal and spatio-temporal, including modern techniques of chaos control. Chapter 2 turns to the edge of chaos, in the form of phase transitions (equilibrium and non-equilibrium, oscillatory, fractal and noise-induced), as well as the related field of synergetics. While the natural stage for linear dynamics comprises flat, Euclidean geometry (with the corresponding calculation tools from linear algebra and analysis), the natural stage for nonlinear dynamics is curved, Riemannian geometry (with the corresponding tools from nonlinear, tensor algebra and analysis). The extreme nonlinearity – chaos – corresponds to th...

  17. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
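The "reduce then sample" idea above can be sketched in miniature: run Metropolis-Hastings against a cheap reduced-order log-posterior instead of the expensive full model. The Gaussian target and the surrogate (exact here, for the toy case) are illustrative assumptions, not the project's actual groundwater model.

```python
import math
import random

def full_log_post(x):
    """Stand-in for an expensive forward-simulation-based log-posterior."""
    return -0.5 * (x - 2.0) ** 2

def reduced_log_post(x):
    """Surrogate fitted offline; coincides with the full model in this toy case."""
    return -0.5 * (x - 2.0) ** 2

def metropolis(log_post, n, step=1.0, x0=0.0, seed=1):
    """Plain Metropolis sampler; every posterior evaluation hits log_post,
    so swapping in the reduced model is where the savings come from."""
    rng = random.Random(seed)
    x, lp, chain = x0, log_post(x0), []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        lq = log_post(y)
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq
        chain.append(x)
    return chain

chain = metropolis(reduced_log_post, 20000)
print(sum(chain) / len(chain))   # close to the posterior mean of 2.0
```

When the surrogate is faithful over the parameter range, the chain's statistics match those of the full model, which is the observation behind the "little effect on the computed posterior distribution" result.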

  18. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates the effects of several challenges that affect electric grid reliability and economic operations: 1. Congestion of transmission lines, 2. Transmission line expansion, 3. Large-scale wind energy integration, and 4. Phasor Measurement Unit (PMU) optimal placement for highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, it is necessary to evaluate expansion of transmission line capacity with respect to methods that ensure optimal electric grid operation. Therefore, the expansion of transmission line capacity must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion in electric grids. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission line systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. Traditional questions requiring answers are "Where" to add, "How much transmission line capacity" to add, and "Which voltage level". Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest is to generate revenue by building new transmission lines. Adding new transmission capacity will help the system to relieve transmission system congestion, create
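Congestion analysis of the kind described above is commonly grounded in a DC power-flow approximation, where line flows follow bus-angle differences and are compared against thermal ratings. A minimal sketch on a toy 3-bus network; the reactances, injections and line limits are illustrative assumptions, not values from the dissertation.

```python
def dc_power_flow(x12, x13, x23, p2, p3):
    """Bus 1 is the slack (theta1 = 0); solve B' theta = P for buses 2 and 3."""
    b12, b13, b23 = 1 / x12, 1 / x13, 1 / x23
    # Reduced susceptance matrix for buses 2 and 3, inverted by hand (2x2).
    a, b = b12 + b23, -b23
    c, d = -b23, b13 + b23
    det = a * d - b * c
    th2 = (d * p2 - b * p3) / det
    th3 = (a * p3 - c * p2) / det
    # Flow on each line is its susceptance times the angle difference.
    return {"1-2": b12 * (0.0 - th2),
            "1-3": b13 * (0.0 - th3),
            "2-3": b23 * (th2 - th3)}

# Loads of 1.0 and 0.5 p.u. at buses 2 and 3; all reactances 0.1 p.u.
flows = dc_power_flow(x12=0.1, x13=0.1, x23=0.1, p2=-1.0, p3=-0.5)
limits = {"1-2": 0.8, "1-3": 0.8, "2-3": 0.8}
congested = [line for line, f in flows.items() if abs(f) > limits[line]]
print(congested)   # → ['1-2']
```

A flagged line like 1-2 here is exactly the kind of candidate the "Where" and "How much capacity" questions target when planning transmission expansion.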

  19. Large Scale System Safety Integration for Human Rated Space Vehicles

    Science.gov (United States)

    Massie, Michael J.

    2005-12-01

Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to take a look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, once one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common ground between all parties to achieve

  20. Large-eddy simulation of atmospheric flow over complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Bechmann, A.

    2006-11-15

The present report describes the development and validation of a turbulence model designed for atmospheric flows based on the concept of Large-Eddy Simulation (LES). The background for the work is the high Reynolds number k - epsilon model, which has been implemented in a finite-volume code for the incompressible Reynolds-averaged Navier-Stokes equations (RANS). The k - epsilon model is traditionally used for RANS computations, but is here developed to also enable LES. LES is able to provide detailed descriptions of a wide range of engineering flows at low Reynolds numbers. For atmospheric flows, however, the high Reynolds numbers and the rough surface of the earth introduce difficulties normally not compatible with LES. Since these issues are most severe near the surface, they are addressed by handling the near-surface region with RANS and only using LES above this region. Using this method, the developed turbulence model is able to handle both engineering and atmospheric flows and can be run in either RANS or LES mode. For LES simulations a time-dependent wind field that accurately represents the turbulent structures of a wind environment must be prescribed at the computational inlet. A method is implemented where the turbulent wind field from a separate LES simulation can be used as inflow. To avoid numerical dissipation of turbulence, special care is paid to the numerical method; e.g., the turbulence model is calibrated with the specific numerical scheme used. This is done by simulating decaying isotropic and homogeneous turbulence. Three atmospheric test cases are investigated in order to validate the behavior of the presented turbulence model. Simulation of the neutral atmospheric boundary layer illustrates the turbulence model's ability to generate and maintain the turbulent structures responsible for boundary layer transport processes. Velocity and turbulence profiles are in good agreement with measurements. Simulation of the flow over the Askervein hill is also

  1. Large-eddy simulation of atmospheric flow over complex terrain

    DEFF Research Database (Denmark)

    Bechmann, Andreas

    2007-01-01

The present report describes the development and validation of a turbulence model designed for atmospheric flows based on the concept of Large-Eddy Simulation (LES). The background for the work is the high Reynolds number k - epsilon model, which has been implemented on a finite-volume code...... turbulence model is able to handle both engineering and atmospheric flows and can be run in both RANS or LES mode. For LES simulations a time-dependent wind field that accurately represents the turbulent structures of a wind environment must be prescribed at the computational inlet. A method is implemented...... where the turbulent wind field from a separate LES simulation can be used as inflow. To avoid numerical dissipation of turbulence special care is paid to the numerical method, e.g. the turbulence model is calibrated with the specific numerical scheme used. This is done by simulating decaying isotropic...

  2. Optimizing Liquid Effluent Monitoring at a Large Nuclear Complex

    International Nuclear Information System (INIS)

    Chou, Charissa J.; Johnson, V.G.; Barnett, Brent B.; Olson, Phillip M.

    2003-01-01

Monitoring data for a centralized effluent treatment and disposal facility at the Hanford Site, a defense nuclear complex undergoing cleanup and decommissioning in southeast Washington State, were evaluated to optimize liquid effluent monitoring efficiency. Wastewater from several facilities is collected and discharged to the ground at a common disposal site. The discharged water infiltrates through 60 m of soil column to the groundwater, which eventually flows into the Columbia River, the second largest river in the contiguous United States. Protection of this important natural resource is the major objective of both cleanup and groundwater and effluent monitoring activities at the Hanford Site. Four years of effluent data were evaluated for this study. More frequent sampling was conducted during the first year of operation to assess temporal variability in analyte concentrations, to determine operational factors contributing to waste stream variability, and to assess the probability of exceeding permit limits. Subsequently, the study was updated to include an evaluation of the sampling and analysis regime. It was concluded that the probability of exceeding permit limits was one in a million under normal operating conditions, that sampling frequency could be reduced, and that several analytes could be eliminated, while indicators could be substituted for more expensive analyses. The findings were used by the state regulatory agency to modify monitoring requirements for a new discharge permit. The primary focus of this paper is on the statistical approaches and rationale that led to the successful permit modification and to a more cost-effective effluent monitoring program.
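An exceedance probability of the kind quoted above is typically estimated by fitting a distribution to the monitoring data and evaluating the tail beyond the permit limit. A minimal sketch assuming approximately lognormal concentrations; the sample values, units and limit are illustrative, not Hanford data.

```python
import math
import statistics

def exceedance_probability(samples, limit):
    """P(concentration > limit) under a lognormal fit to the samples."""
    logs = [math.log(x) for x in samples]
    mu, sd = statistics.fmean(logs), statistics.stdev(logs)
    z = (math.log(limit) - mu) / sd
    # Upper-tail probability 1 - Phi(z), via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

samples = [1.2, 0.9, 1.1, 1.4, 1.0, 1.3, 0.8, 1.1]   # e.g. mg/L, hypothetical
p = exceedance_probability(samples, limit=5.0)
print(f"{p:.2e}")
```

A vanishingly small tail probability like this is the statistical basis for arguing that sampling frequency can be reduced without losing protection against permit violations.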

  3. The Mediator complex: a central integrator of transcription

    Science.gov (United States)

    Allen, Benjamin L.; Taatjes, Dylan J.

    2016-01-01

    The RNA polymerase II (pol II) enzyme transcribes all protein-coding and most non-coding RNA genes and is globally regulated by Mediator, a large, conformationally flexible protein complex with variable subunit composition (for example, a four-subunit CDK8 module can reversibly associate). These biochemical characteristics are fundamentally important for Mediator's ability to control various processes important for transcription, including organization of chromatin architecture and regulation of pol II pre-initiation, initiation, re-initiation, pausing, and elongation. Although Mediator exists in all eukaryotes, a variety of Mediator functions appear to be specific to metazoans, indicative of more diverse regulatory requirements. PMID:25693131

  4. An integrated micromechanical large particle in flow sorter (MILPIS)

    Science.gov (United States)

    Fuad, Nurul M.; Skommer, Joanna; Friedrich, Timo; Kaslin, Jan; Wlodkowic, Donald

    2015-06-01

    At present, the major hurdle to widespread deployment of zebrafish embryos and larvae in large-scale drug development projects is the lack of enabling high-throughput analytical platforms. In order to spearhead drug discovery with the use of zebrafish as a model, platforms need to integrate automated pre-test sorting of organisms (to ensure quality control and standardization) and their in-test positioning (suitable for high-content imaging) with modules for flexible drug delivery. The major obstacle hampering the sorting of millimetre-sized particles such as zebrafish embryos on chip-based devices is their substantial diameter (above one millimetre) and mass (above one milligram), which both lead to rapid gravitationally induced sedimentation and high inertial forces. Manual procedures associated with sorting hundreds of embryos are very monotonous and as such prone to significant analytical errors due to operator fatigue. In this work, we present an innovative design for a micromechanical large particle in-flow sorter (MILPIS) capable of analysing, sorting and dispensing living zebrafish embryos for drug discovery applications. The system consisted of a microfluidic network, a revolving micromechanical receptacle actuated by a robotic servomotor, and an opto-electronic sensing module. The prototypes were fabricated in poly(methyl methacrylate) (PMMA) transparent thermoplastic using infrared laser micromachining. Elements of MILPIS were also fabricated in an optically transparent VisiJet resin using 3D stereolithography (SLA) processes (ProJet 7000HD, 3D Systems). The device operation was based on a rapidly revolving miniaturized mechanical receptacle. 
The function of the latter was to hold and position individual fish embryos for (i) interrogation, (ii) sorting decision-making and (iii) physical sorting. The system was designed to separate fertilized (LIVE) from non-fertilized (DEAD) eggs based on optical transparency, using infrared (IR) emitters and receivers embedded in the system

  5. Integrating economic parameters into genetic selection for Large White pigs.

    Science.gov (United States)

    Dube, Bekezela; Mulugeta, Sendros D; Dzama, Kennedy

    2013-08-01

    The objective of the study was to integrate economic parameters into genetic selection for sow productivity, growth performance and carcass characteristics in South African Large White pigs. Simulations of sow productivity and terminal production systems, based on a hypothetical 100-sow herd, were run to derive economic values for the economically relevant traits. The traits included in the study were number born alive (NBA), 21-day litter size (D21LS), 21-day litter weight (D21LWT), average daily gain (ADG), feed conversion ratio (FCR), age at slaughter (AGES), dressing percentage (DRESS), lean content (LEAN) and backfat thickness (BFAT). Growth of a pig was described by the Gompertz growth function, while feed intake was derived from the nutrient requirements of pigs at the respective ages. Partial budgeting and partial differentiation of the profit function were used to derive economic values, defined as the change in profit per unit genetic change in a given trait. The respective economic values (ZAR) were: 61.26, 38.02, 210.15, 33.34, -21.81, -68.18, 5.78, 4.69 and -1.48. These economic values indicated the direction and emphases of selection, and were sensitive to changes in feed prices and marketing prices for carcasses and maiden gilts. Economic values for NBA, D21LS, DRESS and LEAN decreased with increasing feed prices, suggesting a point at which genetic improvement would become unprofitable if feed prices continued to increase. The economic values for DRESS and LEAN increased as the marketing prices for carcasses increased, while the economic value for BFAT was not sensitive to changes in any of the prices. Reductions in economic values can be counterbalanced by simultaneous increases in the marketing prices of carcasses and maiden gilts. Economic values facilitate genetic improvement by translating it into proportionate profitability. 
Breeders should, however, continually recalculate economic values to place the most appropriate emphases on the respective traits.
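The derivation of an economic value by partial differentiation of the profit function can be illustrated with a toy model. Everything here is hypothetical (the profit function, its coefficients, and the baseline trait values are invented, not the paper's simulation); the point is only that an economic value is the marginal profit per unit genetic change in one trait:

```python
def profit(traits):
    """Toy per-sow profit model (hypothetical coefficients, NOT the paper's
    simulation): revenue from pigs born alive plus a growth premium, minus
    feed cost driven by the feed conversion ratio."""
    revenue = traits["NBA"] * 900.0            # ZAR per pig born alive (assumed)
    growth_bonus = traits["ADG"] * 0.5         # premium on daily gain (assumed)
    feed_cost = traits["FCR"] * 1200.0 * 3.5   # kg gain x FCR x feed price (assumed)
    return revenue + growth_bonus - feed_cost

def economic_value(trait, base, h=1e-4):
    """Economic value = change in profit per unit genetic change in one
    trait, others held fixed: a central finite difference standing in for
    the paper's partial differentiation of the profit function."""
    up, dn = dict(base), dict(base)
    up[trait] += h
    dn[trait] -= h
    return (profit(up) - profit(dn)) / (2.0 * h)

base = {"NBA": 11.0, "ADG": 650.0, "FCR": 2.6}
for trait in base:
    print(trait, round(economic_value(trait, base), 2))
# Positive values reward improvement; the negative FCR value penalizes
# higher feed use, matching the sign pattern reported above.
```

Re-running such a calculation as feed and carcass prices move is exactly the recalculation the abstract recommends to breeders.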

  6. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improve system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  7. Properties Important To Mixing For WTP Large Scale Integrated Testing

    International Nuclear Information System (INIS)

    Koopman, D.; Martino, C.; Poirier, M.

    2012-01-01

    Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation. The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i

  8. PROPERTIES IMPORTANT TO MIXING FOR WTP LARGE SCALE INTEGRATED TESTING

    Energy Technology Data Exchange (ETDEWEB)

    Koopman, D.; Martino, C.; Poirier, M.

    2012-04-26

    Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation. The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i

  9. Narrative persuasion, causality, complex integration, and support for obesity policy.

    Science.gov (United States)

    Niederdeppe, Jeff; Shapiro, Michael A; Kim, Hye Kyung; Bartolo, Danielle; Porticella, Norman

    2014-01-01

    Narrative messages have the potential to convey causal attribution information about complex social issues. This study examined attributions about obesity, an issue characterized by interrelated biological, behavioral, and environmental causes. Participants were randomly assigned to read one of three narratives emphasizing societal causes and solutions for obesity or an unrelated story that served as the control condition. The three narratives varied in the extent to which the character in the story acknowledged personal responsibility (high, moderate, and none) for controlling her weight. Stories that featured no acknowledgment and moderate acknowledgment of personal responsibility, while emphasizing environmental causes and solutions, were successful at increasing societal cause attributions about obesity and, among conservatives, increasing support for obesity-related policies relative to the control group. The extent to which respondents were able to make connections between individual and environmental causes of obesity (complex integration) mediated the relationship between the moderate acknowledgment condition and societal cause attributions. We conclude with a discussion of the implications of this work for narrative persuasion theory and health communication campaigns.

  10. The challenge of integrating large scale wind power

    Energy Technology Data Exchange (ETDEWEB)

    Kryszak, B.

    2007-07-01

    The support of renewable energy sources is one of the key issues in current energy policies. The paper presents aspects of the integration of wind power into the electric power system from the perspective of a Transmission System Operator (TSO). Technical, operational and market aspects related to the integration of more than 8000 MW of installed wind power into the transmission network of Vattenfall Europe Transmission are discussed, and experiences with the transmission of wind power, wind power prediction, balancing of wind power, power production behaviour and fluctuations are reported. Moreover, issues of wind power integration at the European level are discussed against the background of a wind power study. (auth)

  11. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    The power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels), along with large-scale renewable energy targets (100GW solar, 60GW wind, and 10GW biomass energy by 2022), in the INDCs submitted under the Paris Agreement. However, large-scale integration of renewable energy is a complex process that faces challenges such as capital intensity, matching intermittent generation to load with limited storage capacity, and maintaining reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and to analyze the implications for power sector operations. The study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios: a base case (no RE addition), an INDC scenario (100GW solar, 60GW wind, 10GW biomass), and a low RE scenario (50GW solar, 30GW wind), created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the associated investment decisions. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
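As a much-simplified illustration of the dispatch side of such a model (not the TIMES formulation itself; the unit names, marginal costs, and capacities below are invented), a merit-order dispatch fills demand from the cheapest units first, which is why zero-marginal-cost renewables displace thermal generation:

```python
def merit_order_dispatch(demand_mw, units):
    """Fill demand from the cheapest units first.  A deliberately tiny
    stand-in for the dispatch features of a model like TIMES; it ignores
    ramp-rate and minimum-generation constraints.
    units: list of (name, marginal_cost, capacity_mw)."""
    schedule, remaining = {}, demand_mw
    for name, cost, cap in sorted(units, key=lambda u: u[1]):
        gen = min(cap, remaining)
        schedule[name] = gen
        remaining -= gen
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return schedule

units = [("coal", 2.5, 120.0), ("gas", 4.0, 80.0),
         ("solar", 0.0, 60.0), ("wind", 0.0, 40.0)]
print(merit_order_dispatch(200.0, units))
# Zero-marginal-cost renewables run first and displace thermal output.
```

The operational constraints the study emphasizes (ramp rates, minimum generation) are precisely what a greedy merit order like this omits, and what makes large-scale integration a harder optimization problem.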

  12. Megacities and Large Urban Complexes - WMO Role in Addressing Challenges and Opportunities

    Science.gov (United States)

    Terblanche, Deon; Jalkanen, Liisa

    2013-04-01

    The 21st Century could become known, among other things, as the century in which our species evolved from Homo sapiens to Homo urbanus. The urban population has now surpassed the rural population, and urbanization will continue at such a pace that by 2050 urban dwellers could outnumber their rural counterparts by more than two to one. Most of this growth in urban population will occur in developing countries and along coastal areas. Urbanization is to a large extent the outcome of humans seeking a better life through improved opportunities presented by high-density communities. Megacities and large urban complexes provide more job opportunities and social structures, better transport and communication links, and a relative abundance of physical goods and services when compared to most rural areas. Unfortunately, these urban complexes also present numerous social and environmental challenges. Urban areas differ from their surroundings in morphology and population density, with high concentrations of industrial activity, energy consumption and transport. They also pose unique challenges to atmospheric modelling and monitoring and create a multi-disciplinary spectrum of potential threats, including air pollution, which need to be addressed in an integrated way. These areas are also vulnerable to the changing climate and its implications for sea level and extreme events, air quality and related health impacts. Many urban activities are significantly impacted by weather events that would not be considered high-impact in less densely populated areas. For instance, moderate precipitation events can cause flooding and landslides, as modified urban catchments generally have higher run-off to rainfall ratios than their more pristine rural counterparts.

  13. Integrating complexity into data-driven multi-hazard supply chain network strategies

    Science.gov (United States)

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of interconnections, demonstrates non-linear behaviors and emergent properties, and responds to stimuli from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required. Currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build an SCN restoration model, examine the inherent problems associated with these data, and understand the complexity that arises from integrating these data.

  14. Integration of large chemical kinetic mechanisms via exponential methods with Krylov approximations to Jacobian matrix functions

    KAUST Repository

    Bisetti, Fabrizio

    2012-01-01

    To cope with the computational cost associated with the time integration of stiff, large chemical systems, a novel approach is proposed. The approach combines an exponential integrator with Krylov subspace approximations to the exponential function of the Jacobian matrix.
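The core device here, approximating the action of the matrix exponential on a vector in a small Krylov subspace built by Arnoldi iteration, can be sketched as follows. This is a minimal illustrative version, not the paper's integrator, which would add error estimation, adaptive subspace size, and substepping:

```python
import numpy as np
from scipy.linalg import expm

def krylov_expm_v(A, v, m=20):
    """Approximate exp(A) @ v in an m-dimensional Krylov subspace built by
    Arnoldi iteration -- the key idea behind Krylov-based exponential
    integrators for stiff chemical kinetics."""
    n = len(v)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = 1.0
    # exp(A) v  ~  beta * V_m exp(H_m) e1, with exp() of only a small matrix.
    return beta * V[:, :m] @ (expm(H[:m, :m]) @ e1)

# Check against the dense matrix exponential on a small stiff "Jacobian".
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 30))
A = -(M @ M.T) / 30.0 - np.eye(30)             # symmetric, negative definite
v = rng.standard_normal(30)
err = np.linalg.norm(krylov_expm_v(A, v, m=30) - expm(A) @ v)
print(err)  # small residual
```

The payoff is that only matrix-vector products with the large Jacobian are needed, while the dense exponential is computed for a tiny m-by-m Hessenberg matrix.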

  15. Evaluation of integration methods for hybrid simulation of complex structural systems through collapse

    Science.gov (United States)

    Del Carpio R., Maikol; Hashemi, M. Javad; Mosqueda, Gilberto

    2017-10-01

    This study examines the performance of integration methods for hybrid simulation of large and complex structural systems in the context of structural collapse due to seismic excitations. The target application is not necessarily real-time testing, but rather models that involve large-scale physical sub-structures and highly nonlinear numerical models. Four case studies are presented and discussed. In the first case study, the accuracy of integration schemes, including two widely used methods, namely a modified implicit Newmark method with a fixed number of iterations (iterative) and the operator-splitting method (non-iterative), is examined through pure numerical simulations. The second case study presents the results of 10 hybrid simulations repeated with the two aforementioned integration methods, considering various time steps and fixed numbers of iterations for the iterative integration method. The physical sub-structure in these tests consists of a single-degree-of-freedom (SDOF) cantilever column with replaceable steel coupons that provides repeatable, highly nonlinear behavior, including fracture-type strength and stiffness degradation. In case study three, the implicit Newmark method with a fixed number of iterations is applied to hybrid simulations of a 1:2 scale steel moment frame that includes a relatively complex nonlinear numerical substructure. Lastly, a more complex numerical substructure is considered by coupling a nonlinear computational model of a moment frame to a hybrid model of a 1:2 scale steel gravity frame. The last two case studies are conducted on the same prototype structure, and the selection of time steps and fixed numbers of iterations is closely examined in pre-test simulations. The generated unbalanced forces are used as an index to track the equilibrium error and to predict the accuracy and stability of the simulations.
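The fixed-iteration implicit Newmark scheme discussed above can be sketched for a nonlinear SDOF oscillator. This is an illustrative sketch under assumed parameters, not the testing platform's implementation:

```python
import numpy as np

def newmark_sdof(m, c, restoring, dt, n_steps, p,
                 beta=0.25, gamma=0.5, n_iter=3):
    """Implicit Newmark integration of  m*a + c*v + r(u) = p(t)  for a
    nonlinear SDOF oscillator, using a FIXED number of Newton-type
    corrections per step (as in the fixed-iteration scheme above).
    'restoring(u)' returns (force, tangent stiffness)."""
    u = v = 0.0
    r, _ = restoring(u)
    a = (p[0] - c * v - r) / m
    hist = [u]
    for i in range(1, n_steps):
        u_new = u                              # predictor: previous displacement
        for _ in range(n_iter):                # fixed number of iterations
            a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                     - (0.5 / beta - 1.0) * a)
            v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
            r_new, k_new = restoring(u_new)
            residual = p[i] - m * a_new - c * v_new - r_new
            # Effective tangent stiffness of the time-discrete system.
            k_eff = k_new + gamma * c / (beta * dt) + m / (beta * dt**2)
            u_new += residual / k_eff          # Newton correction
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return np.array(hist)

# Linear check: m=1, c=0.5, k=10 under a constant unit load; the response
# should settle at the static deflection p/k = 0.1.
u_hist = newmark_sdof(1.0, 0.5, lambda u: (10.0 * u, 10.0), 0.01, 3000,
                      np.ones(3000))
print(u_hist[-1])  # close to 0.1
```

Capping the iteration count keeps each step's duration predictable, which matters when a physical sub-structure is being loaded in the loop, at the cost of a residual unbalanced force that can be tracked as an error index, as the study does.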

  16. Large scale grid integration of renewable energy sources

    CERN Document Server

    Moreno-Munoz, Antonio

    2017-01-01

    This book presents comprehensive coverage of the means to integrate renewable power, namely wind and solar power. It looks at new approaches to meet the challenges, such as increasing interconnection capacity among geographical areas, hybridisation of different distributed energy resources and building up demand response capabilities.

  17. Large Scale Integration of Carbon Nanotubes in Microsystems

    DEFF Research Database (Denmark)

    Gjerde, Kjetil

    2007-01-01

    Carbon nanotubes have many properties that could be exploited in combination with traditional microsystems, in particular their superior mechanical and electrical properties. In this work, methods for large-scale integration of carbon nanotubes into microsystems are investigated, with a view to their application as mechan...

  18. Integrating social, economic, and ecological values across large landscapes

    Science.gov (United States)

    Jessica E. Halofsky; Megan K. Creutzburg; Miles A. Hemstrom

    2014-01-01

    The Integrated Landscape Assessment Project (ILAP) was a multiyear effort to produce information, maps, and models to help land managers, policymakers, and others conduct mid- to broad-scale (e.g., watersheds to states and larger areas) prioritization of land management actions, perform landscape assessments, and estimate cumulative effects of management actions for...

  19. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  20. Understanding large multiprotein complexes: applying a multiple allosteric networks model to explain the function of the Mediator transcription complex.

    Science.gov (United States)

    Lewis, Brian A

    2010-01-15

    The regulation of transcription and of many other cellular processes involves large multi-subunit protein complexes. In the context of transcription, it is known that these complexes serve as regulatory platforms that connect activator DNA-binding proteins to a target promoter. However, there is still a lack of understanding regarding the function of these complexes. Why do multi-subunit complexes exist? What is the molecular basis of the function of their constituent subunits, and how are these subunits organized within a complex? What is the reason for physical connections between certain subunits and not others? In this article, I address these issues through a model of network allostery and its application to the eukaryotic RNA polymerase II Mediator transcription complex. The multiple allosteric networks model (MANM) suggests that protein complexes such as Mediator exist not only as physical but also as functional networks of interconnected proteins through which information is transferred from subunit to subunit by the propagation of an allosteric state known as conformational spread. Additionally, there are multiple distinct sub-networks within the Mediator complex that can be defined by their connections to different subunits; these sub-networks have discrete functions that are activated when specific subunits interact with other activator proteins.

  1. Large Variability in the Diversity of Physiologically Complex Surgical Procedures Exists Nationwide Among All Hospitals Including Among Large Teaching Hospitals.

    Science.gov (United States)

    Dexter, Franklin; Epstein, Richard H; Thenuwara, Kokila; Lubarsky, David A

    2017-11-22

    Multiple previous studies have shown that having a large diversity of procedures has a substantial impact on quality management of hospital surgical suites. At hospitals with substantial diversity, unless sophisticated statistical methods suitable for rare events are used, anesthesiologists working in surgical suites will have inaccurate predictions of surgical blood usage, case durations, cost accounting and price transparency, times remaining in late running cases, and use of intraoperative equipment. What is unknown is whether large diversity is a feature of only a few unique hospitals nationwide (eg, the largest hospitals in each state or province). The 2013 United States Nationwide Readmissions Database was used to study heterogeneity among 1981 hospitals in their diversities of physiologically complex surgical procedures (ie, the procedure codes). The diversity of surgical procedures performed at each hospital was quantified using a summary measure, the number of different physiologically complex surgical procedures commonly performed at the hospital (ie, 1/Herfindahl). A total of 53.9% of all hospitals commonly performed a large diversity of physiologically complex procedures (ie, >30 commonly performed procedures). Larger hospitals had greater diversity than the small- and medium-sized hospitals. Most large teaching hospitals commonly performed >30 procedures (lower 99% CL, 71.9% of hospitals). However, there was considerable variability among the large teaching hospitals in their diversity (interquartile range of the numbers of commonly performed physiologically complex procedures = 19.3; lower 99% CL, 12.8 procedures). The diversity of procedures represents a substantive differentiator among hospitals. Thus, the usefulness of statistical methods for operating room management should be expected to be heterogeneous among hospitals. Our results also show that "large teaching hospital" alone is an insufficient description for accurate prediction of the extent to which a hospital sustains the
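The study's summary measure, the effective number of commonly performed procedures (1/Herfindahl), is straightforward to compute from a hospital's case log. The case logs below are hypothetical:

```python
from collections import Counter

def procedure_diversity(case_log):
    """Effective number of commonly performed procedures: the inverse
    Herfindahl index (1 / sum of squared shares) of the procedure-code
    mix, the summary measure used in the study."""
    counts = Counter(case_log)
    total = sum(counts.values())
    herfindahl = sum((n / total) ** 2 for n in counts.values())
    return 1.0 / herfindahl

# Hypothetical case logs: a focused hospital vs. an evenly diverse one.
focused = ["CABG"] * 80 + ["valve"] * 20
diverse = [f"proc_{i}" for i in range(50)] * 2      # 50 codes, equal volume
print(procedure_diversity(focused))   # 1/(0.8^2 + 0.2^2) = 1.47...
print(procedure_diversity(diverse))   # 50.0
```

Because the index down-weights rarely performed codes, it captures "commonly performed" diversity rather than a raw count of distinct procedure codes.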

  2. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods for modeling complex power networks containing short circuits are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with limited online memory capacity (M = 4030 for the computer).

  3. Electricity Prices, Large-Scale Renewable Integration, and Policy Implications

    OpenAIRE

    Kyritsis, Evangelos; Andersson, Jonas; Serletis, Apostolos

    2016-01-01

    This paper investigates the effects of intermittent solar and wind power generation on electricity price formation in Germany. We use daily data from 2010 to 2015, a period with profound modifications in the German electricity market, the most notable being the rapid integration of photovoltaic and wind power sources, as well as the phasing out of nuclear energy. In the context of a GARCH-in-Mean model, we show that both solar and wind power Granger cause electricity prices, that solar power ...
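The Granger-causality component of such an analysis can be illustrated with a bare-bones F-test: do lags of one series improve an autoregression of the other? This is a stand-in for the paper's GARCH-in-Mean analysis, which additionally models conditional volatility; the series and coefficients below are simulated, not German market data:

```python
import numpy as np

def granger_f_stat(y, x, lags=2):
    """F-statistic for whether lags of x improve an autoregression of y
    (the classical Granger-causality test): restricted vs. unrestricted
    least-squares fits compared via their residual sums of squares."""
    n = len(y)
    Y = y[lags:]
    X_r = np.column_stack([np.ones(n - lags)] +
                          [y[lags - k:n - k] for k in range(1, lags + 1)])
    X_u = np.column_stack([X_r] +
                          [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    df_den = n - lags - X_u.shape[1]
    return ((rss(X_r) - rss(X_u)) / lags) / (rss(X_u) / df_den)

rng = np.random.default_rng(1)
wind = rng.standard_normal(500)
price = np.zeros(500)
for t in range(1, 500):       # price responds to lagged wind by construction
    price[t] = 0.4 * price[t - 1] - 0.5 * wind[t - 1] + 0.1 * rng.standard_normal()
print(granger_f_stat(price, wind) > granger_f_stat(wind, price))  # True
```

A large F in one direction and a small one in the other is the asymmetry behind statements like "wind power Granger causes electricity prices."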

  4. Integral and measure from rather simple to rather complex

    CERN Document Server

    Mackevicius, Vigirdas

    2014-01-01

    This book is devoted to integration, one of the two main operations in calculus. In Part 1, the definition of the integral of a one-variable function differs (not essentially, but rather methodologically) from the traditional definitions of the Riemann or Lebesgue integrals. Such an approach allows us, on the one hand, to quickly develop the practical skills of integration and, on the other hand, in Part 2, to pass naturally to the more general Lebesgue integral. Based on the latter, in Part 2, the author develops a theory of integration for functions of several variables. In Part 3, within

  5. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    OpenAIRE

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an existing large and complex software-intensive system. This definition approach enables the customization and extension of a set of predefined viewpoints to address the requirements of a specific developme...

  6. Automated X-ray television complex for testing large dynamic objects

    International Nuclear Information System (INIS)

    Gusev, E.A.; Luk'yanenko, Eh.A.; Chelnokov, V.B.; Kuleshov, V.K.; Alkhimov, Yu.V.

    1992-01-01

    An automated X-ray television complex based on a matrix gas-discharge large-area (2.1x1.0 m) converter for testing large cargoes and containers, as well as for industrial article diagnostics, is described. The complex's pulsed operation with the 512-kbyte television digital memory unit provides for testing dynamic objects at minimal doses (20-100 μR)

  7. Large Complex Odontoma of Mandible in a Young Boy: A Rare and Unusual Case Report

    Directory of Open Access Journals (Sweden)

    G. Siva Prasad Reddy

    2014-01-01

    Full Text Available Odontomas are the most common odontogenic tumors. They are broadly classified into compound odontoma and complex odontoma. Of the two, complex odontoma is the rarer tumor. Occasionally this tumor becomes large, causing expansion of bone followed by facial asymmetry. Otherwise these tumors are asymptomatic and are generally diagnosed on radiographic examination. We report a rare case of complex odontoma of the mandible in a young boy. The tumor was treated by surgical excision under general anesthesia.

  8. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
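
The axis-detection approach described here is, at its core, principal components analysis of a standardized biomarker matrix. As a rough illustration (the data, marker names, and loadings below are synthetic inventions for the sketch, not the study's actual 43 markers), the first axis can be extracted with a plain SVD:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a biomarker panel: 500 subjects x 6 markers.
# One latent process drives hemoglobin, calcium and albumin (down) and
# an inflammation marker (up), loosely mimicking "integrated albunemia".
n = 500
latent = rng.normal(size=n)
noise = lambda: rng.normal(scale=0.7, size=n)
X = np.column_stack([
    -1.0 * latent + noise(),   # hemoglobin (anemia -> low)
    -0.8 * latent + noise(),   # calcium
    -0.9 * latent + noise(),   # albumin
     1.0 * latent + noise(),   # inflammation marker
     rng.normal(size=n),       # unrelated marker
     rng.normal(size=n),       # unrelated marker
])

# Standardize, then extract principal axes via SVD of the data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)       # variance fraction per axis
pc1_loadings = Vt[0]                  # structure of the first axis
scores = Z @ Vt[0]                    # per-subject value on the first axis

print("explained variance:", np.round(explained, 2))
print("PC1 loadings:", np.round(pc1_loadings, 2))
```

With real data, the interesting question is the one the study asks: whether the loading structure of the first axis reproduces across independent cohorts and sub-populations.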

  9. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  10. Intensity of anxiety is modified via complex integrative stress circuitries.

    Science.gov (United States)

    Smith, Justin P; Prince, Melissa A; Achua, Justin K; Robertson, James M; Anderson, Raymond T; Ronan, Patrick J; Summers, Cliff H

    2016-01-01

    Escalation of anxious behavior while environmentally and socially relevant contextual events amplify the intensity of emotional response produces a testable gradient of anxiety shaped by integrative circuitries. Apprehension of the Stress-Alternatives Model apparatus (SAM) oval open field (OF) is measured by the active latency to escape, and is delayed by unfamiliarity with the passageway. Familiar OF escape is the least anxious behavior along the continuum, which can be reduced by anxiolytics such as icv neuropeptide S (NPS). Social aggression increases anxiousness in the SAM, reducing the number of mice willing to escape by 50%. The apprehension accompanying escape during social aggression is diminished by anxiolytics, such as exercise and corticotropin releasing-factor receptor 1 (CRF1) antagonism, but exacerbated by anxiogenic treatment, like antagonism of α2-adrenoreceptors. What is more, the anxiolytic CRF1 and anxiogenic α2-adrenoreceptor antagonists also modify behavioral phenotypes, with CRF1 antagonism allowing escape by previously submissive animals, and α2-adrenoreceptor antagonism hindering escape in mice that previously engaged in it. Gene expression of NPS and brain-derived neurotrophic factor (BDNF) in the central amygdala (CeA), as well as corticosterone secretion, increased concomitantly with the escalating anxious content of the mouse-specific anxiety continuum. The general trend of CeA NPS and BDNF expression suggested that NPS production was promoted by increasing anxiousness, and that BDNF synthesis was associated with learning about ever-more anxious conditions. The intensity gradient for anxious behavior resulting from varying contextual conditions may yield an improved conceptualization of the complexity of mechanisms producing the natural continuum of human anxious conditions, and potential therapies that arise therefrom. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches, multiplexers can be built, hence the name of the technology, “microfluidic Very Large-Scale Integration” (mVLSI). A roadblock ... This paper presents the state-of-the-art in the mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design.

  12. Vision for single flux quantum very large scale integrated technology

    International Nuclear Information System (INIS)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-01-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip
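
The quoted figures can be sanity-checked with a back-of-envelope calculation. The on-chip propagation speed (~c/10) and the junction count per gate (~10) used below are assumptions for illustration, not values stated in the article:

```python
# Back-of-envelope check of the quoted SFQ figures. The propagation
# speed and junctions-per-gate are hypothetical assumptions, not
# values from the paper; the clock frequency and junction density are.
import math

f_clock = 200e9            # Hz, clock frequency from the text
period = 1.0 / f_clock     # s, one clock period (5 ps)

v_signal = 3e8 / 10        # m/s, assumed ~c/10 on-chip signal speed
radius_cm = v_signal * period * 100     # distance covered in one period

jj_density = 250e6         # Josephson junctions per cm^2 (from the text)
jj_per_gate = 10           # assumed junctions per logic gate
gate_density = jj_density / jj_per_gate

gates_in_reach = gate_density * math.pi * radius_cm**2
print(f"radius: {radius_cm*10:.2f} mm, gates in reach: {gates_in_reach:,.0f}")
```

Under these assumptions the reachable-gate count lands in the ~20 K range quoted in the abstract, which suggests the authors had a similar geometry in mind.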

  13. Vision for single flux quantum very large scale integrated technology

    Energy Technology Data Exchange (ETDEWEB)

    Silver, Arnold [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Bunyk, Paul [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Kleinsasser, Alan [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Spargo, John [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2006-05-15

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip.

  14. Control Synthesis for the Flow-Based Microfluidic Large-Scale Integration Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2013-01-01

    In this paper we are interested in flow-based microfluidic biochips, which are able to integrate the necessary functions for biochemical analysis on-chip. In these chips, the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units, such as mixers, switches and multiplexers, can be built...

  15. Explicit integration with GPU acceleration for large kinetic networks

    International Nuclear Information System (INIS)

    Brock, Benjamin; Belt, Andrew; Billings, Jay Jay; Guidry, Mike

    2015-01-01

    We demonstrate the first implementation of recently-developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve of order 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. This orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
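
The fast explicit algorithms in question tame stiffness with algebraically stabilized updates rather than implicit solves. A minimal sketch of one such update (the asymptotic approximation) on a single stiff equation, with purely illustrative parameters rather than a real reaction network:

```python
# Toy illustration of the explicit-asymptotic idea behind fast kinetic
# integrators: for dy/dt = F_plus - k*y, replace the unstable explicit
# Euler update with y_new = (y + dt*F_plus) / (1 + dt*k), which stays
# stable even when dt >> 1/k. Parameters are illustrative only.
def euler_step(y, dt, f_plus, k):
    return y + dt * (f_plus - k * y)

def asymptotic_step(y, dt, f_plus, k):
    return (y + dt * f_plus) / (1.0 + dt * k)

f_plus, k = 1.0, 1e6          # stiff: equilibrium y* = f_plus/k = 1e-6
dt = 1e-3                     # time step 1000x larger than 1/k
y_euler = y_asym = 0.0
for _ in range(50):
    y_euler = euler_step(y_euler, dt, f_plus, k)
    y_asym = asymptotic_step(y_asym, dt, f_plus, k)

print(f"explicit Euler: {y_euler:.3e}")   # diverges
print(f"asymptotic:     {y_asym:.3e}")    # relaxes toward f_plus/k
```

Because every update is an explicit algebraic formula, steps for many networks can be evaluated in parallel on a GPU, which is the capability the paper demonstrates.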

  16. Power System Operation with Large Scale Wind Power Integration

    DEFF Research Database (Denmark)

    Suwannarat, A.; Bak-Jensen, B.; Chen, Z.

    2007-01-01

    The Danish power system starts to face problems of integrating thousands of megawatts of wind power, which is produced with a stochastic behavior due to natural wind fluctuations. With wind power capacities increasing, the Danish Transmission System Operator (TSO) is faced with new challenges related to the uncertain nature of wind power. In this paper, proposed models of generations and control system are presented which analyze the deviation of power exchange at the western Danish-German border, taking into account the fluctuating nature of wind power. The performance of the secondary control of the thermal power plants and the spinning reserves control from the Combined Heat and Power (CHP) units to achieve active power balance with the increased wind power penetration is presented.

  17. Evaluation of complex integrated care programmes: the approach in North West London

    Science.gov (United States)

    Greaves, Felix; Pappas, Yannis; Bardsley, Martin; Harris, Matthew; Curry, Natasha; Holder, Holly; Blunt, Ian; Soljak, Michael; Gunn, Laura; Majeed, Azeem; Car, Josip

    2013-01-01

    Background Several local attempts to introduce integrated care in the English National Health Service have been tried, with limited success. The Northwest London Integrated Care Pilot attempts to improve the quality of care of the elderly and people with diabetes by providing a novel integration process across primary, secondary and social care organisations. It involves predictive risk modelling, care planning, multidisciplinary management of complex cases and an information technology tool to support information sharing. This paper sets out the evaluation approach adopted to measure its effect. Study design We present a mixed methods evaluation methodology. It includes a quantitative approach measuring changes in service utilization, costs, clinical outcomes and quality of care using routine primary and secondary data sources. It also contains a qualitative component, involving observations, interviews and focus groups with patients and professionals, to understand participant experiences and to understand the pilot within the national policy context. Theory and discussion This study considers the complexity of evaluating a large, multi-organisational intervention in a changing healthcare economy. We locate the evaluation within the theory of evaluation of complex interventions. We present the specific challenges faced by evaluating an intervention of this sort, and the responses made to mitigate against them. Conclusions We hope this broad, dynamic and responsive evaluation will allow us to clarify the contribution of the pilot, and provide a potential model for evaluation of other similar interventions. Because of the priority given to the integrated agenda by governments internationally, the need to develop and improve strong evaluation methodologies remains strikingly important. PMID:23687478

  18. Evaluation of complex integrated care programmes: the approach in North West London

    Directory of Open Access Journals (Sweden)

    Felix Greaves

    2013-03-01

    Full Text Available Background: Several local attempts to introduce integrated care in the English National Health Service have been tried, with limited success. The Northwest London Integrated Care Pilot attempts to improve the quality of care of the elderly and people with diabetes by providing a novel integration process across primary, secondary and social care organisations. It involves predictive risk modelling, care planning, multidisciplinary management of complex cases and an information technology tool to support information sharing. This paper sets out the evaluation approach adopted to measure its effect. Study design: We present a mixed methods evaluation methodology. It includes a quantitative approach measuring changes in service utilization, costs, clinical outcomes and quality of care using routine primary and secondary data sources. It also contains a qualitative component, involving observations, interviews and focus groups with patients and professionals, to understand participant experiences and to understand the pilot within the national policy context. Theory and discussion: This study considers the complexity of evaluating a large, multi-organisational intervention in a changing healthcare economy. We locate the evaluation within the theory of evaluation of complex interventions. We present the specific challenges faced by evaluating an intervention of this sort, and the responses made to mitigate against them. Conclusions: We hope this broad, dynamic and responsive evaluation will allow us to clarify the contribution of the pilot, and provide a potential model for evaluation of other similar interventions. Because of the priority given to the integrated agenda by governments internationally, the need to develop and improve strong evaluation methodologies remains strikingly important

  20. Integration as the basis of stable and dynamic development of enterprises in agroindustrial complex

    Directory of Open Access Journals (Sweden)

    Petr Ivanovich Ogorodnikov

    2011-12-01

    Full Text Available Formation of market relations in the Russian economy generates an objective need to address a number of problems in the relationship between agroindustrial complex organizations in connection with privatization, liberalization of prices and imbalances in the existing inter-industry production and economic relations that negatively affect the results of their economic activities. Because of the flagrant violations of the replenishment process, a diverse range of connections and relationships between producers and processors was broken. The major direction for lifting the agricultural economy in this situation is the development of cooperatives and agroindustrial integration. In addition, the formation of large integrated complexes demonstrates high efficiency and rapid development, which is the basis of the agroindustrial sector in many developed countries. The increase of competition forces business entities to combine capabilities and mutually beneficial cooperation in the struggle for the strengthening of market positions. Thus, increasing the degree of integration in the agricultural sector helps to get out of the protracted crisis and move more quickly to the innovations.

  1. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    The integration of large-scale wind generation has necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is the investment decision, while the second phase is the production optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...
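
Any multi-objective PSO maintains an archive of non-dominated solutions, so its core primitive is the Pareto-dominance test. A minimal sketch of that primitive (toy objective vectors, assuming minimization of, say, investment cost and operating cost; not the paper's actual formulation):

```python
# Minimal Pareto-dominance utilities of the kind a multi-objective PSO
# (MOPSO) uses to maintain its archive of non-dominated solutions.
# Objectives are minimized, e.g. (investment cost, operating cost).
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a list of objective vectors down to the non-dominated set."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

candidates = [(3, 9), (5, 4), (4, 7), (6, 4), (8, 2), (9, 9)]
front = pareto_front(candidates)
print(front)   # -> [(3, 9), (5, 4), (4, 7), (8, 2)]
```

The diversity-preservation step the abstract mentions then amounts to pruning this archive so the retained points spread evenly along the front.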

  2. From Synergy to Complexity: The Trend Toward Integrated Value Chain and Landscape Governance.

    Science.gov (United States)

    Ros-Tonen, Mirjam A F; Reed, James; Sunderland, Terry

    2018-05-30

    This Editorial introduces a special issue that illustrates a trend toward integrated landscape approaches. Whereas two papers echo older "win-win" strategies based on the trade of non-timber forest products, ten papers reflect a shift from a product to landscape perspective. However, they differ from integrated landscape approaches in that they emanate from sectorial approaches driven primarily by aims such as forest restoration, sustainable commodity sourcing, natural resource management, or carbon emission reduction. The potential of such initiatives for integrated landscape governance and achieving landscape-level outcomes has hitherto been largely unaddressed in the literature on integrated landscape approaches. This special issue addresses this gap, with a focus on actor constellations and institutional arrangements emerging in the transition from sectorial to integrated approaches. This editorial discusses the trends arising from the papers, including the need for a commonly shared concern and sense of urgency; inclusive stakeholder engagement; accommodating and coordinating polycentric governance in landscapes beset with institutional fragmentation and jurisdictional mismatches; alignment with locally embedded initiatives and governance structures; and a framework to assess and monitor the performance of integrated multi-stakeholder approaches. We conclude that, despite a growing tendency toward integrated approaches at the landscape level, inherent landscape complexity renders persistent and significant challenges such as balancing multiple objectives, equitable inclusion of all relevant stakeholders, dealing with power and gender asymmetries, adaptive management based on participatory outcome monitoring, and moving beyond existing administrative, jurisdictional, and sectorial silos. Multi-stakeholder platforms and bridging organizations and individuals are seen as key in overcoming such challenges.

  3. Electricity prices, large-scale renewable integration, and policy implications

    International Nuclear Information System (INIS)

    Kyritsis, Evangelos; Andersson, Jonas; Serletis, Apostolos

    2017-01-01

    This paper investigates the effects of intermittent solar and wind power generation on electricity price formation in Germany. We use daily data from 2010 to 2015, a period with profound modifications in the German electricity market, the most notable being the rapid integration of photovoltaic and wind power sources, as well as the phasing out of nuclear energy. In the context of a GARCH-in-Mean model, we show that both solar and wind power Granger cause electricity prices, that solar power generation reduces the volatility of electricity prices by scaling down the use of peak-load power plants, and that wind power generation increases the volatility of electricity prices by challenging electricity market flexibility. - Highlights: • We model the impact of solar and wind power generation on day-ahead electricity prices. • We discuss the different nature of renewables in relation to market design. • We explore the impact of renewables on the distributional properties of electricity prices. • Solar and wind reduce electricity prices but affect price volatility in the opposite way. • Solar decreases the probability of electricity price spikes, while wind increases it.
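
Granger causality here means that lagged renewable generation improves prediction of prices beyond the prices' own lags. A minimal numpy version of the F-test on synthetic series (the series and coefficients are invented, and the paper's actual model is a GARCH-in-Mean, not this plain OLS sketch):

```python
import numpy as np

def granger_f(y, x, lags=1):
    """F-statistic for 'x Granger-causes y' via two OLS regressions
    (restricted: lags of y only; unrestricted: plus lags of x)."""
    n = len(y)
    Y = y[lags:]
    ylags = np.column_stack([y[lags-i-1:n-i-1] for i in range(lags)])
    xlags = np.column_stack([x[lags-i-1:n-i-1] for i in range(lags)])
    ones = np.ones((n - lags, 1))
    Xr = np.hstack([ones, ylags])            # restricted design
    Xu = np.hstack([ones, ylags, xlags])     # unrestricted design
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0])**2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    q, k = lags, Xu.shape[1]
    return ((rss_r - rss_u) / q) / (rss_u / (n - lags - k))

rng = np.random.default_rng(1)
x = rng.normal(size=2000)                    # stand-in for wind generation
e = rng.normal(size=2000)
y = np.zeros(2000)                           # stand-in for electricity price
for t in range(1, 2000):
    y[t] = 0.5 * y[t-1] + 0.8 * x[t-1] + e[t]

print(f"F (x -> y): {granger_f(y, x):.1f}")  # large: lagged x predicts y
print(f"F (y -> x): {granger_f(x, y):.1f}")  # small: lagged y does not predict x
```

In practice one would use a packaged test (e.g. statsmodels' Granger causality test) and, as the paper does, a volatility model to separate the mean and variance effects.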

  4. Large capacity storage of integrated objects before change blindness.

    Science.gov (United States)

    Landman, Rogier; Spekreijse, Henk; Lamme, Victor A F

    2003-01-01

    Normal people have a strikingly low ability to detect changes in a visual scene. This has been taken as evidence that the brain represents only a few objects at a time, namely those currently in the focus of attention. In the present study, subjects were asked to detect changes in the orientation of rectangular figures in a textured display across a 1600 ms gray interval. In the first experiment, change detection improved when the location of a possible change was cued during the interval. The cue remained effective during the entire interval, but after the interval, it was ineffective, suggesting that an initially large representation was overwritten by the post-change display. To control for an effect of light intensity during the interval on the decay of the representation, we compared performance with a gray or a white interval screen in a second experiment. We found no difference between these conditions. In the third experiment, attention was occasionally misdirected during the interval by first cueing the wrong figure, before cueing the correct figure. This did not compromise performance compared to a single cue, indicating that when an item is attentionally selected, the representation of yet unchosen items remains available. In the fourth experiment, the cue was shown to be effective when changes in figure size and orientation were randomly mixed. At the time the cue appeared, subjects could not know whether size or orientation would change, therefore these results suggest that the representation contains features in their 'bound' state. Together, these findings indicate that change blindness involves overwriting of a large capacity representation by the post-change display.

  5. Symplectic integrators for large scale molecular dynamics simulations: A comparison of several explicit methods

    International Nuclear Information System (INIS)

    Gray, S.K.; Noid, D.W.; Sumpter, B.G.

    1994-01-01

    We test the suitability of a variety of explicit symplectic integrators for molecular dynamics calculations on Hamiltonian systems. These integrators are extremely simple algorithms with low memory requirements, and appear to be well suited for large scale simulations. We first apply all the methods to a simple test case using the ideas of Berendsen and van Gunsteren. We then use the integrators to generate long time trajectories of a 1000 unit polyethylene chain. Calculations are also performed with two popular but nonsymplectic integrators. The most efficient integrators of the set investigated are deduced. We also discuss certain variations on the basic symplectic integration technique
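
The appeal of symplectic integrators is their long-time energy behavior. A minimal comparison on a harmonic oscillator (explicit Euler against the symplectic velocity-Verlet scheme; a toy model, not the polyethylene simulation itself):

```python
# Harmonic oscillator H = p^2/2 + q^2/2: compare the energy drift of
# the symplectic velocity-Verlet (leapfrog) scheme with explicit Euler.
def euler(q, p, dt):
    return q + dt * p, p - dt * q

def verlet(q, p, dt):
    p_half = p - 0.5 * dt * q          # half kick
    q_new = q + dt * p_half            # drift
    p_new = p_half - 0.5 * dt * q_new  # half kick
    return q_new, p_new

energy = lambda q, p: 0.5 * (p * p + q * q)

dt, steps = 0.05, 20000                # thousands of oscillation periods
qe = qv = 1.0
pe = pv = 0.0
for _ in range(steps):
    qe, pe = euler(qe, pe, dt)
    qv, pv = verlet(qv, pv, dt)

print(f"Euler  energy: {energy(qe, pe):.3e}")   # grows without bound
print(f"Verlet energy: {energy(qv, pv):.6f}")   # stays near 0.5
```

The same bounded-energy property, combined with low memory cost, is what makes these explicit symplectic methods attractive for the long polymer trajectories discussed in the abstract.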

  6. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The
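
The Monte Carlo sensitivity step described above can be sketched generically: sample the uncertain inputs, run a surrogate model, and rank inputs by correlation with the prediction. Everything below (the model and the parameter ranges) is illustrative, not UGTA data:

```python
import numpy as np

# Toy Monte Carlo sensitivity analysis: sample uncertain inputs, run a
# simple advective-transport surrogate, and rank the inputs by Spearman
# rank correlation with the predicted travel time. Ranges are invented.
rng = np.random.default_rng(42)
n = 5000

K = 10 ** rng.uniform(-1, 2, n)        # hydraulic conductivity (m/d), log-uniform
porosity = rng.uniform(0.05, 0.35, n)  # effective porosity (-)
gradient = rng.uniform(1e-4, 1e-2, n)  # hydraulic gradient (-)
distance = 5000.0                      # m, fixed transport distance

travel_time = distance * porosity / (K * gradient)   # advective travel time (d)

def spearman(a, b):
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

for name, values in [("K", K), ("porosity", porosity), ("gradient", gradient)]:
    print(f"rank corr with travel time, {name}: {spearman(values, travel_time):+.2f}")
```

Even in this toy setting, the parameter with the widest (log-scale) uncertainty dominates the output ranking, which is the kind of conclusion such screening is used to draw.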

  7. Integrating Collaborative Learning Groups in the Large Enrollment Lecture

    Science.gov (United States)

    Adams, J. P.; Brissenden, G.; Lindell Adrian, R.; Slater, T. F.

    1998-12-01

    Recent reforms for undergraduate education propose that students should work in teams to solve problems that simulate problems that research scientists address. In the context of an innovative large-enrollment course at Montana State University, faculty have developed a series of 15 in-class, collaborative learning group activities that provide students with realistic scenarios to investigate. Focusing on a team approach, the four principal types of activities employed are historical, conceptual, process, and open-ended activities. Examples of these activities include classifying stellar spectra, characterizing galaxies, parallax measurements, estimating stellar radii, and correlating star colors with absolute magnitudes. Summative evaluation results from a combination of attitude surveys, astronomy concept examinations, and focus group interviews strongly suggest that, overall, students are learning more astronomy, believe that the group activities are valuable, enjoy the less-lecture course format, and have significantly higher attendance rates. In addition, class observations of 48 self-formed, collaborative learning groups reveal that female students are more engaged in single-gender learning groups than in mixed gender groups.

  8. An integrated system for large scale scanning of nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Bozza, Cristiano, E-mail: kryss@sa.infn.it [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); D’Ambrosio, Nicola [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); De Lellis, Giovanni [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); De Serio, Marilisa [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Di Capua, Francesco [INFN Napoli, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Crescenzo, Antonia [University of Napoli and INFN, Complesso Universitario di Monte Sant' Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Ferdinando, Donato [INFN Bologna, viale B. Pichat 6/2, Bologna 40127 (Italy); Di Marco, Natalia [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); Esposito, Luigi Salvatore [Laboratori Nazionali del Gran Sasso, now at CERN, Geneva (Switzerland); Fini, Rosa Anna [INFN Bari, via E. Orabona 4, Bari 70125 (Italy); Giacomelli, Giorgio [University of Bologna and INFN, viale B. Pichat 6/2, Bologna 40127 (Italy); Grella, Giuseppe [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); Ieva, Michela [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Kose, Umut [INFN Padova, via Marzolo 8, Padova (PD) 35131 (Italy); Longhin, Andrea; Mauri, Nicoletta [INFN Laboratori Nazionali di Frascati, via E. Fermi 40, Frascati (RM) 00044 (Italy); Medinaceli, Eduardo [University of Padova and INFN, via Marzolo 8, Padova (PD) 35131 (Italy); Monacelli, Piero [University of L' Aquila and INFN, via Vetoio Loc. Coppito, L' Aquila (AQ) 67100 (Italy); Muciaccia, Maria Teresa; Pastore, Alessandra [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); and others

    2013-03-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high-level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational database system is the backbone of the whole infrastructure, storing the data themselves rather than only catalogues of data files, as is common practice, making it a unique case among high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  9. An integrated system for large scale scanning of nuclear emulsions

    International Nuclear Information System (INIS)

    Bozza, Cristiano; D’Ambrosio, Nicola; De Lellis, Giovanni; De Serio, Marilisa; Di Capua, Francesco; Di Crescenzo, Antonia; Di Ferdinando, Donato; Di Marco, Natalia; Esposito, Luigi Salvatore; Fini, Rosa Anna; Giacomelli, Giorgio; Grella, Giuseppe; Ieva, Michela; Kose, Umut; Longhin, Andrea; Mauri, Nicoletta; Medinaceli, Eduardo; Monacelli, Piero; Muciaccia, Maria Teresa; Pastore, Alessandra

    2013-01-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high-level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational database system is the backbone of the whole infrastructure, storing the data themselves rather than only catalogues of data files, as is common practice, making it a unique case among high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  10. Integration of the immune system: a complex adaptive supersystem

    Science.gov (United States)

    Crisman, Mark V.

    2001-10-01

    Immunity to pathogenic organisms is a complex process involving interacting factors within the immune system including circulating cells, tissues and soluble chemical mediators. Both the efficiency and adaptive responses of the immune system in a dynamic, often hostile, environment are essential for maintaining our health and homeostasis. This paper will present a brief review of one of nature's most elegant, complex adaptive systems.

  11. Exact complex integrals in two dimensions for shifted harmonic ...

    Indian Academy of Sciences (India)

    We use the rationalization method to study two-dimensional complex dynamical systems (a shifted harmonic oscillator in the complex plane) on the extended complex phase space (ECPS). The role and scope of the derived invariants in the context of various physical problems are highlighted.

  12. On synchronisation of a class of complex chaotic systems with complex unknown parameters via integral sliding mode control

    Science.gov (United States)

    Tirandaz, Hamed; Karami-Mollaee, Ali

    2018-06-01

    Chaotic systems demonstrate complex behaviour in their state variables and their parameters, which poses challenges for their control and synchronisation. This paper presents a new synchronisation scheme based on the integral sliding mode control (ISMC) method for a class of complex chaotic systems with complex unknown parameters. Synchronisation between corresponding states of a class of complex chaotic systems, and convergence of the errors of the system parameters to zero, are studied. The designed feedback control vector and complex unknown parameter vector are analytically derived based on Lyapunov stability theory. Moreover, the effectiveness of the proposed methodology is verified by synchronisation of the Chen complex system and the Lorenz complex system as the leader and the follower chaotic systems, respectively. In conclusion, some numerical simulations related to the synchronisation methodology are given to illustrate the effectiveness of the theoretical discussions.
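The abstract does not give the control law itself; as a rough illustration of the general idea, the sketch below synchronises two real-valued Lorenz systems with a sliding-mode-style law u = -k*s - eta*sgn(s) on an integral sliding surface s = e + lam*∫e dt. The real-valued (rather than complex) states, the gains, and the Euler integration are all illustrative assumptions, not the authors' design.

```python
import math

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz vector field, standing in for the leader and follower dynamics."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def synchronize(leader, follower, k=200.0, eta=1.0, lam=5.0, dt=1e-3, steps=5000):
    """Drive the follower onto the leader with u = -k*s - eta*sgn(s), where
    s_i = e_i + lam * integral(e_i) is an integral sliding surface and
    e = follower - leader. Both systems are advanced by forward Euler."""
    ie = [0.0, 0.0, 0.0]  # running integral of the error
    for _ in range(steps):
        fl, ff = lorenz(leader), lorenz(follower)
        for i in range(3):
            e = follower[i] - leader[i]
            ie[i] += e * dt
            s = e + lam * ie[i]
            u = -k * s - eta * math.copysign(1.0, s)
            leader[i] += fl[i] * dt
            follower[i] += (ff[i] + u) * dt
    return leader, follower
```

Starting the follower far from the leader, the error norm collapses to the chattering band of the sign term within a fraction of a second of simulated time.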

  13. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated in the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of the electric vehicles are introduced. The possible impacts of large scale integration of electric vehicles on the power systems, especially the advantage to the integration of the renewable energies, are discussed. Finally, the research projects related to the large scale integration of electric vehicles into the power systems are introduced; they will provide reference for large scale integration of electric vehicles into power grids.

  14. An integrated view of complex landscapes: a big data-model integration approach to trans-disciplinary science

    Science.gov (United States)

    The Earth is a complex system comprised of many interacting spatial and temporal scales. Understanding, predicting, and managing for these dynamics requires a trans-disciplinary integrated approach. Although there have been calls for this integration, a general approach is needed. We developed a Tra...

  15. Fabric strain sensor integrated with CNPECs for repeated large deformation

    Science.gov (United States)

    Yi, Weijing

    Flexible and soft strain sensors that can be used in smart textiles for wearable applications are much desired. They should meet the requirements of low modulus, large working range and good fatigue resistance as well as good sensing performances. However, there were no commercial products available, and the objective of the thesis is to investigate fabric strain sensors based on carbon nanoparticle (CNP) filled elastomer composites (CNPECs) for potential wearable applications. Conductive CNPECs were fabricated and investigated. The introduction of silicone oil (SO) significantly decreased the modulus of the composites to less than 1 MPa without affecting their deformability, and they showed good stability after heat treatment. With increasing CNP concentration, a percolation appeared in electrical resistivity and the composites can be divided into three ranges. I-V curves and impedance spectra together with electro-mechanical studies demonstrated a balance between sensitivity and working range for the composites with CNP concentrations in the post-percolation range, and these were preferred for sensing applications only if the fatigue life was improved. Due to the good elasticity and failure resistance of knitted fabric under repeated extension, it was adopted as substrate to increase the fatigue life of the conductive composites. After optimization of processing parameters, the conductive fabric with CNP concentration of 9.0CNP showed linear I-V curves when the voltage is in the range of -1 V/mm to 1 V/mm and negligible capacitive behavior when the frequency is below 10³ Hz, even at a strain of 60%. It showed higher sensitivity due to the combination of the nonlinear resistance-strain behavior of the CNPECs and the non-even strain distribution of knitted fabric under extension. The fatigue life of the conductive fabric was greatly improved. Building on the studies of CNPECs and the coated conductive fabrics, a fabric strain sensor was designed, fabricated and packaged. 
The Young's modulus of

  16. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

    Shielding calculations of advanced nuclear facilities such as accelerator based neutron sources or fusion devices of the tokamak type are complicated due to their complex geometries and their large dimensions, including bulk shields of several meters thickness. While the complexity of the geometry in the shielding calculation can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-Discrete Ordinates transport calculations to handle this kind of shielding problems. The Monte Carlo technique is used to simulate the particle generation and transport in the target region with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT and a newly developed coupling interface program for the mapping process. Test calculations were performed with comparison to MCNP solutions. Satisfactory agreements were obtained between these two approaches. The program system has been chosen to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. The successful application demonstrates that the coupling scheme, as implemented in the program system, is a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)
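As a toy illustration of the coupling idea (not the actual MCNP/TORT interface), the sketch below uses a Monte Carlo step to sample source-particle directions and a deterministic step for uncollided attenuation through the bulk shield; the cross-section and thickness values are hypothetical.

```python
import math
import random

def mc_hemisphere_source(n, rng):
    """Monte Carlo step: direction cosines mu for particles emitted
    isotropically into the forward hemisphere (mu uniform on (0, 1])."""
    return [rng.uniform(1e-9, 1.0) for _ in range(n)]

def deterministic_attenuation(mu, sigma_t, thickness):
    """Deterministic step: uncollided transmission through a slab shield,
    exp(-sigma_t * thickness / mu) along the slant path."""
    return math.exp(-sigma_t * thickness / mu)

def coupled_estimate(n, sigma_t, thickness, seed=1):
    """Couple the two stages: average the deterministic transmission
    over the Monte Carlo source sample."""
    rng = random.Random(seed)
    mus = mc_hemisphere_source(n, rng)
    return sum(deterministic_attenuation(m, sigma_t, thickness) for m in mus) / n
```

For this simple slab case the coupled estimate can be checked against direct numerical quadrature of the same integral, mirroring the paper's comparison of the coupled scheme against pure MCNP solutions.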

  17. Optimal number of coarse-grained sites in different components of large biomolecular complexes.

    Science.gov (United States)

    Sinitskiy, Anton V; Saunders, Marissa G; Voth, Gregory A

    2012-07-26

    The computational study of large biomolecular complexes (molecular machines, cytoskeletal filaments, etc.) is a formidable challenge facing computational biophysics and biology. To achieve biologically relevant length and time scales, coarse-grained (CG) models of such complexes usually must be built and employed. One of the important early stages in this approach is to determine an optimal number of CG sites in different constituents of a complex. This work presents a systematic approach to this problem. First, a universal scaling law is derived and numerically corroborated for the intensity of the intrasite (intradomain) thermal fluctuations as a function of the number of CG sites. Second, this result is used for derivation of the criterion for the optimal number of CG sites in different parts of a large multibiomolecule complex. In the zeroth-order approximation, this approach validates the empirical rule of taking one CG site per fixed number of atoms or residues in each biomolecule, previously widely used for smaller systems (e.g., individual biomolecules). The first-order corrections to this rule are derived and numerically checked by the case studies of the Escherichia coli ribosome and Arp2/3 actin filament junction. In different ribosomal proteins, the optimal number of amino acids per CG site is shown to differ by a factor of 3.5, and an even wider spread may exist in other large biomolecular complexes. Therefore, the method proposed in this paper is valuable for the optimal construction of CG models of such complexes.
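In the zeroth-order approximation described above, mapping a component to CG sites is simple arithmetic; a minimal sketch, with hypothetical component sizes and per-component resolutions (the paper's point being that the optimal residues-per-site can differ severalfold between components):

```python
def cg_sites(n_residues, residues_per_site):
    """Zeroth-order rule: one CG site per fixed number of residues."""
    return max(1, round(n_residues / residues_per_site))

# Hypothetical components of a complex, as (residue count, residues per site);
# the optimal resolution can differ by a factor of 3.5 or more between parts.
components = {"protein_A": (120, 10), "protein_B": (450, 35)}
sites = {name: cg_sites(n, r) for name, (n, r) in components.items()}
```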

  18. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

    EU energy strategy is strongly focused on the large scale integration of renewable energy sources. The most dominant part here is taken by variable sources - wind power plants. Grid integration of intermittent sources along with keeping the system stable and secure is one of the biggest challenges for the TSOs. This part is often neglected by the energy policy makers, so this paper deals with expected future conditions for secure power system operation with large scale wind integration. It gives an overview of expected wind integration development in EU, as well as expected P/f regulation and control needs. The paper is concluded with several recommendations. (author).

  19. Protein complex prediction in large ontology attributed protein-protein interaction networks.

    Science.gov (United States)

    Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Li, Yanpeng; Xu, Bo

    2013-01-01

    Protein complexes are important for unraveling the secrets of cellular organization and function. Many computational approaches have been developed to predict protein complexes in protein-protein interaction (PPI) networks. However, most existing approaches focus mainly on the topological structure of PPI networks, and largely ignore the gene ontology (GO) annotation information. In this paper, we constructed ontology attributed PPI networks with PPI data and GO resource. After constructing ontology attributed networks, we proposed a novel approach called CSO (clustering based on network structure and ontology attribute similarity). Structural information and GO attribute information are complementary in ontology attributed networks. CSO can effectively take advantage of the correlation between frequent GO annotation sets and the dense subgraph for protein complex prediction. Our proposed CSO approach was applied to four different yeast PPI data sets and predicted many well-known protein complexes. The experimental results showed that CSO was valuable in predicting protein complexes and achieved state-of-the-art performance.
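CSO's exact scoring is not given in the abstract; the sketch below shows the general flavour of clustering on an ontology attributed network: greedy seed expansion scored by a weighted mix of structural connectivity and GO-annotation (Jaccard) similarity. The weighting, threshold, and toy data are assumptions for illustration only.

```python
def jaccard(a, b):
    """Similarity of two GO-annotation sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def expand(seed, adj, go, alpha=0.5, tau=0.5):
    """Grow a cluster greedily from `seed`; each candidate is scored by a
    weighted mix of edges into the cluster and annotation similarity."""
    cluster = {seed}
    while True:
        frontier = {v for u in cluster for v in adj[u]} - cluster
        best, best_score = None, tau
        for v in frontier:
            structural = sum(1 for u in cluster if v in adj[u]) / len(cluster)
            attribute = sum(jaccard(go[v], go[u]) for u in cluster) / len(cluster)
            score = alpha * structural + (1 - alpha) * attribute
            if score >= best_score:
                best, best_score = v, score
        if best is None:
            return frozenset(cluster)
        cluster.add(best)

def predict_complexes(adj, go, min_size=3):
    """Expand from every node as a seed; keep deduplicated clusters."""
    clusters = {expand(s, adj, go) for s in adj}
    return {c for c in clusters if len(c) >= min_size}

# Toy ontology attributed network: two dense, consistently annotated cliques
# joined by a single bridge edge.
adj = {
    "A1": {"A2", "A3", "A4", "B1"}, "A2": {"A1", "A3", "A4"},
    "A3": {"A1", "A2", "A4"}, "A4": {"A1", "A2", "A3"},
    "B1": {"B2", "B3", "B4", "A1"}, "B2": {"B1", "B3", "B4"},
    "B3": {"B1", "B2", "B4"}, "B4": {"B1", "B2", "B3"},
}
go = {n: {"GO:0001"} for n in ("A1", "A2", "A3", "A4")}
go.update({n: {"GO:0002"} for n in ("B1", "B2", "B3", "B4")})
complexes = predict_complexes(adj, go)
```

Because the bridge node shares neither density nor annotations with the opposite clique, the two cliques are recovered as separate complexes, illustrating how structure and attributes complement each other.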

  20. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computational complexity of the conventional exhaustive search method increases significantly when large-scale antenna arrays are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
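For intuition about why low-complexity selection rules matter, a drastically simplified MISO version of the problem can be sketched: with maximum-ratio beamforming, capacity depends only on the summed gains of the chosen antennas, so a norm-based rule matches exhaustive search at a fraction of the cost. The model and numbers below are illustrative assumptions; the paper's joint outage-capacity/MVDR formulation is far richer.

```python
import math
from itertools import combinations

def capacity(gains, snr):
    """MISO capacity (bits/s/Hz) with maximum-ratio beamforming over the
    selected antennas: C = log2(1 + SNR * sum(|h_i|^2))."""
    return math.log2(1.0 + snr * sum(h * h for h in gains))

def select_by_norm(h, k):
    """Low-complexity rule: keep the k antennas with the largest |h_i|."""
    return sorted(range(len(h)), key=lambda i: abs(h[i]), reverse=True)[:k]

def select_exhaustive(h, k, snr):
    """Reference: exhaustive search over all C(n, k) antenna subsets."""
    return max(combinations(range(len(h)), k),
               key=lambda s: capacity([h[i] for i in s], snr))
```

In this toy model the norm rule provably coincides with exhaustive search; in a genuine MIMO setting (matrix channels, log-det capacity) no such shortcut is exact, which is why joint optimization methods like the paper's are needed.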

  1. Understanding complex urban systems integrating multidisciplinary data in urban models

    CERN Document Server

    Gebetsroither-Geringer, Ernst; Atun, Funda; Werner, Liss

    2016-01-01

    This book is devoted to the modeling and understanding of complex urban systems. This second volume of Understanding Complex Urban Systems focuses on the challenges of the modeling tools, concerning, e.g., the quality and quantity of data and the selection of an appropriate modeling approach. It is meant to support urban decision-makers—including municipal politicians, spatial planners, and citizen groups—in choosing an appropriate modeling approach for their particular modeling requirements. The contributors to this volume are from different disciplines, but all share the same goal: optimizing the representation of complex urban systems. They present and discuss a variety of approaches for dealing with data-availability problems and finding appropriate modeling approaches—and not only in terms of computer modeling. The selection of articles featured in this volume reflects a broad variety of new and established modeling approaches such as: - An argument for using Big Data methods in conjunction with Age...

  2. Complex researches on substantiation of construction and seismic stability of large dams in seismic region

    International Nuclear Information System (INIS)

    Negmatullaev, S.Kh.; Yasunov, P.A.

    2001-01-01

    This article is devoted to complex researches on the substantiation of construction and seismic stability of large dams (the Nurec hydroelectric power station) in a seismic region. Geological, seismological, model, and engineering investigations are discussed in this work. Rich experience was accumulated during the construction of the Nurec hydroelectric power station; this experience can be used in analogous seismically active regions for the construction of similar hydroelectric power stations.

  3. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    NARCIS (Netherlands)

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an

  4. Par@Graph - a parallel toolbox for the construction and analysis of large complex climate networks

    NARCIS (Netherlands)

    Tantet, A.J.J.

    2015-01-01

    In this paper, we present Par@Graph, a software toolbox to reconstruct and analyze complex climate networks having a large number of nodes (up to at least 10⁶) and edges (up to at least 10¹²). The key innovation is an efficient set of parallel software tools designed to leverage the inherited hybrid

  5. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  6. The magnetic g-tensors for ion complexes with large spin-orbit coupling

    International Nuclear Information System (INIS)

    Chang, P.K.L.; Liu, Y.S.

    1977-01-01

    A nonperturbative method for calculating the magnetic g-tensors is presented and discussed for complexes of transition metal ions of large spin-orbit coupling, in the ground term ²D. A numerical example for CuCl₂·2H₂O is given [pt

  7. An Improved Conceptually-Based Method for Analysis of Communication Network Structure of Large Complex Organizations.

    Science.gov (United States)

    Richards, William D., Jr.

    Previous methods for determining the communication structure of organizations work well for small or simple organizations, but are either inadequate or unwieldy for use with large complex organizations. An improved method uses a number of different measures and a series of successive approximations to order the communication matrix such that…

  8. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design processes, traditional approaches to process design may no longer suffice. The design literature offers quite a few design process models. As

  9. 3D-Printed Disposable Wireless Sensors with Integrated Microelectronics for Large Area Environmental Monitoring

    KAUST Repository

    Farooqui, Muhammad Fahad; Karimi, Muhammad Akram; Salama, Khaled N.; Shamim, Atif

    2017-01-01

    disposable, compact, dispersible 3D-printed wireless sensor nodes with integrated microelectronics which can be dispersed in the environment and work in conjunction with few fixed nodes for large area monitoring applications. As a proof of concept

  10. Economic testing of large integrated switching circuits - a challenge to the test engineer

    International Nuclear Information System (INIS)

    Kreinberg, W.

    1978-01-01

    For large integrated switching circuits, one can use a standard incoming-inspection test programme or one tailored to the customer's switching circuits. The author describes the development of suitable, comprehensive and economical test programmes. (orig.) [de

  11. Communication Network Integration and Group Uniformity in a Complex Organization.

    Science.gov (United States)

    Danowski, James A.; Farace, Richard V.

    This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…

  12. LEGO-NMR spectroscopy: a method to visualize individual subunits in large heteromeric complexes.

    Science.gov (United States)

    Mund, Markus; Overbeck, Jan H; Ullmann, Janina; Sprangers, Remco

    2013-10-18

    Seeing the big picture: Asymmetric macromolecular complexes that are NMR-active in only a subset of their subunits can be prepared, thus decreasing NMR spectral complexity. For the heteroheptameric LSm1-7 and LSm2-8 rings, NMR spectra of the individual subunits of the complete complex are obtained, showing a conserved RNA binding site. This LEGO-NMR technique makes large asymmetric complexes accessible to detailed NMR spectroscopic studies. © 2013 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.

  13. Generating functional analysis of complex formation and dissociation in large protein interaction networks

    International Nuclear Information System (INIS)

    Coolen, A C C; Rabello, S

    2009-01-01

    We analyze large systems of interacting proteins, using techniques from the non-equilibrium statistical mechanics of disordered many-particle systems. Apart from protein production and removal, the most relevant microscopic processes in the proteome are complex formation and dissociation, and the microscopic degrees of freedom are the evolving concentrations of unbound proteins (in multiple post-translational states) and of protein complexes. Here we only include dimer-complexes, for mathematical simplicity, and we draw the network that describes which proteins are reaction partners from an ensemble of random graphs with an arbitrary degree distribution. We show how generating functional analysis methods can be used successfully to derive closed equations for dynamical order parameters, representing an exact macroscopic description of the complex formation and dissociation dynamics in the infinite system limit. We end this paper with a discussion of the possible routes towards solving the nontrivial order parameter equations, either exactly (in specific limits) or approximately.
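The microscopic processes the analysis builds on (dimer formation and dissociation) follow simple mass-action kinetics; a minimal sketch with illustrative rate constants, ignoring the random-graph network structure and post-translational states that the generating functional machinery handles:

```python
def simulate_dimerization(a0, b0, c0, k_on, k_off, dt=1e-3, steps=20000):
    """Euler integration of A + B <-> C mass-action kinetics:
    dC/dt = k_on*A*B - k_off*C, with A and B depleted as C forms."""
    a, b, c = a0, b0, c0
    for _ in range(steps):
        rate = k_on * a * b - k_off * c
        a -= rate * dt
        b -= rate * dt
        c += rate * dt
    return a, b, c
```

With k_on = 1 and k_off = 0.5, starting from A = B = 1 and C = 0, the system relaxes to the equilibrium A = B = C = 0.5, where the formation and dissociation fluxes balance; the paper's order-parameter equations describe the analogous balance across an entire random interaction network.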

  14. Integrating complex functions: coordination of nuclear pore complex assembly and membrane expansion of the nuclear envelope requires a family of integral membrane proteins.

    Science.gov (United States)

    Schneiter, Roger; Cole, Charles N

    2010-01-01

    The nuclear envelope harbors numerous large proteinaceous channels, the nuclear pore complexes (NPCs), through which macromolecular exchange between the cytosol and the nucleoplasm occurs. This double-membrane nuclear envelope is continuous with the endoplasmic reticulum and thus functionally connected to such diverse processes as vesicular transport, protein maturation and lipid synthesis. Recent results obtained from studies in Saccharomyces cerevisiae indicate that assembly of the nuclear pore complex is functionally dependent upon maintenance of lipid homeostasis of the ER membrane. Previous work from one of our laboratories has revealed that an integral membrane protein Apq12 is important for the assembly of functional nuclear pores. Cells lacking APQ12 are viable but cannot grow at low temperatures, have aberrant NPCs and a defect in mRNA export. Remarkably, these defects in NPC assembly can be overcome by supplementing cells with a membrane fluidizing agent, benzyl alcohol, suggesting that Apq12 impacts the flexibility of the nuclear membrane, possibly by adjusting its lipid composition when cells are shifted to a reduced temperature. Our new study now expands these findings and reveals that an essential membrane protein, Brr6, shares at least partially overlapping functions with Apq12 and is also required for assembly of functional NPCs. A third nuclear envelope membrane protein, Brl1, is related to Brr6, and is also required for NPC assembly. Because maintenance of membrane homeostasis is essential for cellular survival, the fact that these three proteins are conserved in fungi that undergo closed mitoses, but are not found in metazoans or plants, may indicate that their functions are performed by proteins unrelated at the primary sequence level to Brr6, Brl1 and Apq12 in cells that disassemble their nuclear envelopes during mitosis.

  15. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  16. Complexity and network dynamics in physiological adaptation: An integrated view

    OpenAIRE

    Baffy, Gyorgy; Loscalzo, Joseph

    2014-01-01

    Living organisms constantly interact with their surroundings and sustain internal stability against perturbations. This dynamic process follows three fundamental strategies (restore, explore, and abandon) articulated in historical concepts of physiological adaptation such as homeostasis, allostasis, and the general adaptation syndrome. These strategies correspond to elementary forms of behavior (ordered, chaotic, and static) in complex adaptive systems and invite a network-based analysis of t...

  17. Integrating technology into complex intervention trial processes: a case study.

    Science.gov (United States)

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. 
A combination of commercially available hardware and a bespoke online database …

  18. Value of spatial planning for large mining and energy complexes. [Yugoslavia]

    Energy Technology Data Exchange (ETDEWEB)

    Matko, Z; Spasic, N

    1982-01-01

    In the example of the Kosovo complex (Socialist Federal Republic of Yugoslavia), an examination is made of the value of developing a spatial plan for the territory of large mining-energy complexes. The goals and expected results of spatial planning are discussed. Open-pit working of lignite, fuel shale and other fossil energy raw material fields at the modern level of technology causes, in addition to large-volume physical interventions in space, considerable structural changes of a functional-economic, socioeconomic and psychological-sociological nature in the direct zone of influence of the mining-energy complex. Improvements in the technology of working a lignite field will not by themselves resolve the problems of developing mining-energy complexes in the near future, so a considerable degradation of space, governed by the existing technology, must still be expected. Under these conditions, detailed planning and regulation of space are especially important when viewed as a component part of a long-term policy for the development of the mining-energy complex and the zones of its influence.

  19. Influence of the large-small split effect on strategy choice in complex subtraction.

    Science.gov (United States)

    Xiang, Yan Hui; Wu, Hao; Shang, Rui Hong; Chao, Xiaomei; Ren, Ting Ting; Zheng, Li Ling; Mo, Lei

    2018-04-01

    Two main theories have been used to explain the arithmetic split effect: decision-making process theory and strategy choice theory. Using the inequality paradigm, previous studies have confirmed that individuals tend to adopt a plausibility-checking strategy and a whole-calculation strategy to solve large and small split problems in complex addition arithmetic, respectively. This supports strategy choice theory, but it is unknown whether this theory also explains performance in solving different split problems in complex subtraction arithmetic. This study used small, intermediate and large split sizes, with each split condition being further divided into problems requiring and not requiring borrowing. The reaction times (RTs) for large and intermediate splits were significantly shorter than those for small splits, while accuracy was significantly higher for large and intermediate splits than for small splits, reflecting no speed-accuracy trade-off. Further, RTs and accuracy differed significantly between the borrow and no-borrow conditions only for small splits. This study indicates that strategy choice theory is suitable to explain the split effect in complex subtraction arithmetic. That is, individuals tend to choose the plausibility-checking strategy or the whole-calculation strategy according to the split size. © 2016 International Union of Psychological Science.

  20. VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration

    Science.gov (United States)

    Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.

    2012-09-01

    VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO development started within the Virtual Observatory framework. VisIVO allows users to create meaningful visualizations of highly complex, large-scale datasets and to produce movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop - a stand-alone application for interactive visualization on standard PCs, VisIVO Server - a platform for high-performance visualization, VisIVO Web - a custom designed web portal, VisIVOSmartphone - an application to exploit the VisIVO Server functionality, and the latest VisIVO feature: the VisIVO Library, which allows a job running on a computational system (grid, HPC, etc.) to produce movies directly from the code's internal data arrays, without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running on a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-Inspire projects. Depending on the structure and size of the datasets under consideration, the data exploration process can take several hours of CPU time for creating customized views, and the production of movies can potentially last several days. For this reason an MPI parallel version of VisIVO could play a fundamental role in increasing performance, e.g. it could be automatically deployed on nodes that are MPI aware.
A central concept in our development is thus to …

  1. How Project Managers Really Manage: An Indepth Look at Some Managers of Large, Complex NASA Projects

    Science.gov (United States)

    Mulenburg, Gerald M.; Impaeilla, Cliff (Technical Monitor)

    2000-01-01

    This paper reports on a research study by the author that examined ten contemporary National Aeronautics and Space Administration (NASA) complex projects. In-depth interviews with the project managers of these projects provided qualitative data about the inner workings of the projects and the methodologies used in establishing and managing them. The inclusion of a variety of space, aeronautics, and ground-based projects from several different NASA research centers helped to reduce potential bias in the findings toward any one type of project or technical discipline. The findings address the participants and their individual approaches. The discussion includes possible implications for project managers of other large, complex projects.

  2. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall...... objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  3. Modulation of chromatin structure by the FACT histone chaperone complex regulates HIV-1 integration.

    Science.gov (United States)

    Matysiak, Julien; Lesbats, Paul; Mauro, Eric; Lapaillerie, Delphine; Dupuy, Jean-William; Lopez, Angelica P; Benleulmi, Mohamed Salah; Calmels, Christina; Andreola, Marie-Line; Ruff, Marc; Llano, Manuel; Delelis, Olivier; Lavigne, Marc; Parissi, Vincent

    2017-07-28

    Insertion of retroviral genome DNA occurs in the chromatin of the host cell. This step is modulated by chromatin structure, as nucleosome compaction was shown to prevent HIV-1 integration and chromatin remodeling has been reported to affect integration efficiency. LEDGF/p75-mediated targeting of the integration complex toward RNA polymerase II (polII) transcribed regions ensures optimal access to dynamic regions that are suitable for integration. Consequently, we have investigated the involvement of polII-associated factors in the regulation of HIV-1 integration. Using a pull-down approach coupled with mass spectrometry, we have selected the FACT (FAcilitates Chromatin Transcription) complex as a new potential cofactor of HIV-1 integration. FACT is a histone chaperone complex associated with the polII transcription machinery and recently shown to bind LEDGF/p75. We report here that a tripartite complex can be formed between HIV-1 integrase, LEDGF/p75 and FACT in vitro and in cells. Biochemical analyses show that FACT-dependent nucleosome disassembly promotes HIV-1 integration into chromatinized templates, and generates highly favored nucleosomal structures in vitro. This effect was found to be amplified by LEDGF/p75. Promotion of this FACT-mediated chromatin remodeling in cells both increases chromatin accessibility and stimulates HIV-1 infectivity and integration. Altogether, our data indicate that FACT regulates HIV-1 integration by inducing local nucleosome dissociation that modulates the functional association between the incoming intasome and the targeted nucleosome.

  4. Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization

    Science.gov (United States)

    Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin

    This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. In order to carry out this study it has been necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for prediction of the fluid-dynamic forces. The process integration makes it possible to compute, for each geometrical configuration, a set of aerodynamic coefficients that are then used in the multibody simulation to compute the lap time. Finally, an automatic optimization procedure is started and the lap time is minimized. The whole process is executed on a Linux cluster, running the CFD simulations in parallel.

  5. An integration of minimum local feature representation methods to recognize large variation of foods

    Science.gov (United States)

    Razali, Mohd Norhisham bin; Manshor, Noridayu; Halin, Alfian Abdul; Mustapha, Norwati; Yaakob, Razali

    2017-10-01

    Local invariant features have shown to be successful in describing object appearances for image classification tasks. Such features are robust towards occlusion and clutter and are also invariant against scale and orientation changes. This makes them suitable for classification tasks with little inter-class similarity and large intra-class difference. In this paper, we propose an integrated representation of the Speeded-Up Robust Feature (SURF) and Scale Invariant Feature Transform (SIFT) descriptors, using late fusion strategy. The proposed representation is used for food recognition from a dataset of food images with complex appearance variations. The Bag of Features (BOF) approach is employed to enhance the discriminative ability of the local features. Firstly, the individual local features are extracted to construct two kinds of visual vocabularies, representing SURF and SIFT. The visual vocabularies are then concatenated and fed into a Linear Support Vector Machine (SVM) to classify the respective food categories. Experimental results demonstrate impressive overall recognition at 82.38% classification accuracy based on the challenging UEC-Food100 dataset.
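The fusion pipeline described above can be sketched in a few lines: a visual vocabulary is built per descriptor channel by clustering local descriptors, each image becomes the concatenation of its two bag-of-features histograms (late fusion), and a linear SVM separates the categories. This is a minimal illustration with random vectors standing in for real SURF and SIFT descriptors; all sizes, counts and parameters are invented for the sketch.

```python
# Sketch of the bag-of-features late-fusion pipeline described above.
# Random vectors stand in for SURF (64-d) and SIFT (128-d) descriptors;
# in practice they would come from a real feature extractor.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def encode(descriptors, vocab):
    """Histogram of nearest visual words (one bag-of-features vector)."""
    words = vocab.predict(descriptors)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()

# Two descriptor channels for 40 images in two "food categories".
n_images, k = 40, 8
labels = rng.integers(0, 2, n_images)
surf = [rng.normal(labels[i], 1.0, (30, 64)) for i in range(n_images)]
sift = [rng.normal(labels[i], 1.0, (30, 128)) for i in range(n_images)]

# Build one visual vocabulary per channel.
vocab_surf = KMeans(n_clusters=k, n_init=5, random_state=0).fit(np.vstack(surf))
vocab_sift = KMeans(n_clusters=k, n_init=5, random_state=0).fit(np.vstack(sift))

# Late fusion: concatenate the two bag-of-features histograms per image.
X = np.array([np.concatenate([encode(s, vocab_surf), encode(t, vocab_sift)])
              for s, t in zip(surf, sift)])

clf = LinearSVC().fit(X, labels)
print(clf.score(X, labels))
```

The concatenation step is what makes this "late" fusion: each channel is quantized against its own vocabulary before the representations are joined.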

  6. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    Full Text Available With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  7. Solving very large scattering problems using a parallel PWTD-enhanced surface integral equation solver

    KAUST Repository

    Liu, Yang

    2013-07-01

    The computational complexity and memory requirements of multilevel plane wave time domain (PWTD)-accelerated marching-on-in-time (MOT)-based surface integral equation (SIE) solvers scale as O(N_t N_s log^2 N_s) and O(N_s^1.5); here N_t and N_s denote the numbers of temporal and spatial basis functions discretizing the current [Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003]. In the past, serial versions of these solvers have been successfully applied to the analysis of scattering from perfect electrically conducting as well as homogeneous penetrable targets involving up to N_s ≈ 0.5 × 10^6 and N_t ≈ 10^3. To solve larger problems, parallel PWTD-enhanced MOT solvers are called for. Even though a simple parallelization strategy was demonstrated in the context of electromagnetic compatibility analysis [M. Lu et al., in Proc. IEEE Int. Symp. AP-S, 4, 4212-4215, 2004], by and large, progress in this area has been slow. The lack of progress can be attributed wholesale to difficulties associated with the construction of a scalable PWTD kernel. © 2013 IEEE.

  8. Microdevelopment of Complex Featural and Spatial Integration with Contextual Support

    Directory of Open Access Journals (Sweden)

    Pamela L. Hirsch

    2015-01-01

    Full Text Available Complex spatial decisions involve the ability to combine featural and spatial information in a scene. In the present work, 4- through 9-year-old children completed a complex map-scene correspondence task under baseline and supported conditions. Children compared a photographed scene with a correct map and with map-foils that made salient an object feature or spatial property. Map-scene matches were analyzed for the effects of age and featural-spatial information on children’s selections. In both conditions children significantly favored maps that highlighted object detail and object perspective rather than color, landmark, and metric elements. Children’s correct performance did not differ by age and was suboptimal, but their ability to choose correct maps improved significantly when contextual support was provided. Strategy variability was prominent for all age groups, but at age 9 with support children were more likely to give up their focus on features and transition to the use of spatial strategies. These findings suggest the possibility of a U-shaped curve for children’s development of geometric knowledge: geometric coding is predominant early on, diminishes for a time in middle childhood in favor of a preference for features, and then reemerges along with the more advanced abilities to combine featural and spatial information.

  9. Complex mineralization at large ore deposits in the Russian Far East

    Science.gov (United States)

    Schneider, A. A.; Malyshev, Yu. F.; Goroshko, M. V.; Romanovsky, N. P.

    2011-04-01

    Genetic and mineralogical features of large deposits with complex Sn, W, and Mo mineralization in the Sikhote-Alin and Amur-Khingan metallogenic provinces are considered, as well as those of rare-metal, rare earth, and uranium deposits in the Aldan-Stanovoi province. The spatiotemporal, geological, and mineralogical attributes of large deposits are set forth, and their geodynamic settings are determined. These attributes are exemplified by the large Tigriny Sn-W greisen-type deposit. The variation of regional tectonic settings and their spatial superposition are the main factors controlling the formation of large deposits. Such variation gives rise to multiple reactivation of the ore-magmatic system and long-term, multistage formation of deposits. Pulsatory mineralogical zoning with telescoped mineral assemblages related to different stages results in the formation of complex ores. The highest-grade zones of mass discharge of hydrothermal solutions are formed at the deposits. The promising greisen-type mineralization with complex Sn-W-Mo ore is suggested to be an additional source of tungsten and molybdenum. The Tigriny, Pravourminsky, and Arsen'evsky deposits, as well as deposits of the Komsomol'sk and Khingan-Olonoi ore districts, are examples. Large and superlarge U, Ta, Nb, Be, and REE deposits are localized in the southeastern Aldan-Stanovoi Shield. The Ulkan and Arbarastakh ore districts attract special attention. The confirmed prospects for new large deposits with Sn, W, Mo, Ta, Nb, Be, REE, and U mineralization in the south of the Russian Far East justify further geological exploration in this territory.

  10. Complex Building Detection Through Integrating LIDAR and Aerial Photos

    Science.gov (United States)

    Zhai, R.

    2015-02-01

    This paper proposes a new approach to digital building detection through the integration of LiDAR data and aerial imagery. Most building rooftops end up represented by several different regions grown from different seed pixels. Following the principles of image segmentation, this paper employs a new region-based technique to segment images, combining the advantages of LiDAR and aerial images. First, multiple seed points are selected automatically by taking several constraints into consideration. Then region growing proceeds by combining the elevation attribute from the LiDAR data, the visibility attribute from the DEM (Digital Elevation Model), and the radiometric attribute from the warped images in the segmentation. Through this combination, pixels with similar height, visibility, and spectral attributes are merged into one region, which is taken to represent the whole building area. The proposed methodology was implemented on real data and competitive results were achieved.
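The multi-attribute region growing step can be sketched as follows: starting from a seed, a pixel joins the region only when both its LiDAR height and its spectral value stay close to the seed's, so the grown region stops at the rooftop edge. The scene, attributes and thresholds below are synthetic stand-ins, not the paper's data (the paper additionally uses a visibility attribute).

```python
# Minimal sketch of multi-attribute region growing: a pixel is merged
# when both its height (LiDAR) and spectral value (image) are close to
# the seed's. Synthetic 6x6 rasters stand in for real co-registered data.
import numpy as np
from collections import deque

def grow_region(height, spectral, seed, dh=0.5, ds=10.0):
    rows, cols = height.shape
    region = np.zeros_like(height, dtype=bool)
    h0, s0 = height[seed], spectral[seed]
    queue = deque([seed])
    region[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not region[nr, nc]:
                if (abs(height[nr, nc] - h0) < dh
                        and abs(spectral[nr, nc] - s0) < ds):
                    region[nr, nc] = True
                    queue.append((nr, nc))
    return region

# A 6x6 scene: a flat 10 m rooftop block in the upper-left, ground elsewhere.
height = np.zeros((6, 6)); height[:3, :3] = 10.0
spectral = np.full((6, 6), 50.0); spectral[:3, :3] = 120.0
roof = grow_region(height, spectral, seed=(0, 0))
print(roof.sum())  # 9 -> the 3x3 rooftop block
```

Because ground pixels fail the height test, the region never leaks off the rooftop even though they are 4-connected neighbours.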

  11. Methods Dealing with Complexity in Selecting Joint Venture Contractors for Large-Scale Infrastructure Projects

    Directory of Open Access Journals (Sweden)

    Ru Liang

    2018-01-01

    Full Text Available The magnitude of business dynamics has increased rapidly due to the increased complexity, uncertainty, and risk of large-scale infrastructure projects. This has made it increasingly difficult for a single contractor to "go it alone". As a consequence, joint venture contractors with diverse strengths and weaknesses cooperatively bid for such projects. Understanding project complexity and deciding on the optimal joint venture contractor is challenging. This paper studies how to select joint venture contractors for undertaking large-scale infrastructure projects based on a multiattribute mathematical model. Two different methods are developed to solve the problem: one based on ideal points and the other based on balanced ideal advantages. Both methods account for individual differences in expert judgment and in contractor attributes. A case study of the Hong Kong-Zhuhai-Macao Bridge (HZMB) project in China demonstrates how to apply the two methods and their advantages.
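The ideal-points idea can be illustrated with a TOPSIS-style closeness score: candidates are ranked by their relative distance to the ideal and anti-ideal points in attribute space. The attributes, scores and equal weighting below are hypothetical, and the paper's actual model (which also aggregates expert judgments) is more elaborate.

```python
# TOPSIS-style illustration of ranking candidates by closeness to an
# ideal point. Candidate names, attributes and scores are invented.
import numpy as np

def ideal_point_rank(scores):
    """Closeness to the ideal point, higher is better.
    scores: candidates x attributes, all benefit-type, equal weights."""
    norm = scores / np.linalg.norm(scores, axis=0)
    ideal, anti = norm.max(axis=0), norm.min(axis=0)
    d_pos = np.linalg.norm(norm - ideal, axis=1)   # distance to ideal
    d_neg = np.linalg.norm(norm - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# Three candidate joint ventures scored on experience, finance, technology.
scores = np.array([[7.0, 9.0, 6.0],
                   [8.0, 6.0, 8.0],
                   [5.0, 5.0, 9.0]])
closeness = ideal_point_rank(scores)
print(closeness.argmax())  # index of the preferred candidate
```

Attribute weights and expert-specific scores would multiply into `norm` before the distances are taken; they are omitted here for brevity.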

  12. Studies on combined model based on functional objectives of large scale complex engineering

    Science.gov (United States)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    Large-scale complex engineering comprises various functions, each of which is realized through the completion of one or more projects, so the combinations of projects that affect each function need to be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio techniques based on the functional objectives of projects were introduced, and principles for applying them were studied and proposed. In addition, the processes for combining projects were constructed. These portfolio techniques based on the functional objectives of projects lay a good foundation for the portfolio management of large-scale complex engineering.

  13. Does reef architectural complexity influence resource availability for a large reef-dwelling invertebrate?

    Science.gov (United States)

    Lozano-Álvarez, Enrique; Luviano-Aparicio, Nelia; Negrete-Soto, Fernando; Barradas-Ortiz, Cecilia; Aguíñiga-García, Sergio; Morillo-Velarde, Piedad S.; Álvarez-Filip, Lorenzo; Briones-Fourzán, Patricia

    2017-10-01

    In coral reefs, loss of architectural complexity and its associated habitat degradation is expected to affect reef specialists in particular due to changes in resource availability. We explored whether these features could potentially affect populations of a large invertebrate, the spotted spiny lobster Panulirus guttatus, which is an obligate Caribbean coral reef-dweller with a limited home range. We selected two separate large coral reef patches in Puerto Morelos (Mexico) that differed significantly in structural complexity and level of degradation, as assessed via the rugosity index, habitat assessment score, and percent cover of various benthic components. On each reef, we estimated density of P. guttatus and sampled lobsters to analyze their stomach contents, three different condition indices, and stable isotopes (δ15N and δ13C) in muscle. Lobster density did not vary with reef, suggesting that available crevices in the less complex patch still provided adequate refuge to these lobsters. Lobsters consumed many food types, dominated by mollusks and crustaceans, but proportionally more crustaceans (herbivorous crabs) in the less complex patch, which had more calcareous macroalgae and algal turf. Lobsters from both reefs had a similar condition (all three indices) and mean δ15N, suggesting a similar quality of diet between reefs related to their opportunistic feeding, but differed in mean δ13C values, reflecting the different carbon sources between reefs and providing indirect evidence of individuals of P. guttatus foraging exclusively over their home reef. Overall, we found no apparent effects of architectural complexity, at least to the degree observed in our less complex patch, on density, condition, or trophic level of P. guttatus.

  14. Impacts of large dams on the complexity of suspended sediment dynamics in the Yangtze River

    Science.gov (United States)

    Wang, Yuankun; Rhoads, Bruce L.; Wang, Dong; Wu, Jichun; Zhang, Xiao

    2018-03-01

    The Yangtze River is one of the largest and most important rivers in the world. Over the past several decades, the natural sediment regime of the Yangtze River has been altered by the construction of dams. This paper uses multi-scale entropy analysis to ascertain the impacts of large dams on the complexity of high-frequency suspended sediment dynamics in the Yangtze River system, especially after impoundment of the Three Gorges Dam (TGD). In this study, the complexity of sediment dynamics is quantified by framing it within the context of entropy analysis of time series. Data on daily sediment loads for four stations located in the mainstem are analyzed for the past 60 years. The results indicate that dam construction has reduced the complexity of short-term (1-30 days) variation in sediment dynamics near the structures, but that complexity has actually increased farther downstream. This spatial pattern seems to reflect a filtering effect of the dams on the temporal pattern of sediment loads as well as decreased longitudinal connectivity of sediment transfer through the river system, resulting in downstream enhancement of the influence of local sediment inputs by tributaries on sediment dynamics. The TGD has had a substantial impact on the complexity of sediment series in the mainstem of the Yangtze River, especially after it became fully operational. This enhanced impact is attributed to the high trapping efficiency of this dam and its associated large reservoir. The sediment dynamics "signal" becomes more spatially variable after dam construction. This study demonstrates the spatial influence of dams on the high-frequency temporal complexity of sediment regimes and provides valuable information that can be used to guide environmental conservation of the Yangtze River.
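The entropy analysis used above can be approximated with a standard multi-scale entropy sketch: sample entropy evaluated on coarse-grained copies of a series. The parameters (m = 2, r = 0.2·std) follow common practice rather than the paper's exact configuration, and the series below are synthetic; a regular signal scores lower entropy (lower complexity) than noise.

```python
# Minimal multi-scale entropy sketch: sample entropy on coarse-grained
# series. Synthetic sine vs. white noise stand in for sediment records.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        # Chebyshev-distance template matches, excluding self-matches.
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return np.sum(d <= r) - len(templates)
    b, a = count(m), count(m + 1)
    return -np.log(a / b)

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(1)
noise = rng.normal(size=600)
sine = np.sin(np.linspace(0, 30 * np.pi, 600))
for scale in (1, 2):
    print(scale,
          sample_entropy(coarse_grain(noise, scale)),
          sample_entropy(coarse_grain(sine, scale)))
```

Plotting sample entropy against scale gives the multi-scale entropy curve; a drop in that curve after dam impoundment is what the paper reads as reduced complexity.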

  15. Universal properties of type IIB and F-theory flux compactifications at large complex structure

    International Nuclear Information System (INIS)

    Marsh, M.C. David; Sousa, Kepa

    2016-01-01

    We consider flux compactifications of type IIB string theory and F-theory in which the respective superpotentials at large complex structure are dominated by cubic or quartic terms in the complex structure moduli. In this limit, the low-energy effective theory exhibits universal properties that are insensitive to the details of the compactification manifold or the flux configuration. Focussing on the complex structure and axio-dilaton sector, we show that there are no vacua in this region and that the spectrum of the Hessian matrix is highly peaked, consisting of only three distinct eigenvalues (0, 2m_{3/2}^2 and 8m_{3/2}^2), independently of the number of moduli. We briefly comment on how the inclusion of Kähler moduli affects these findings. Our results generalise those of Brodie & Marsh http://dx.doi.org/10.1007/JHEP01(2016)037, in which these universal properties were found in a subspace of the large complex structure limit of type IIB compactifications.
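Restating the degenerate Hessian spectrum quoted in the abstract in clean notation (no new content, just the abstract's claim written out):

```latex
\operatorname{spec}(H) \;=\; \bigl\{\, 0,\; 2\,m_{3/2}^{2},\; 8\,m_{3/2}^{2} \,\bigr\},
\qquad \text{independently of the number of complex structure moduli.}
```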

  16. Pilot Study of Topical Copper Chlorophyllin Complex in Subjects With Facial Acne and Large Pores.

    Science.gov (United States)

    Stephens, Thomas J; McCook, John P; Herndon, James H

    2015-06-01

    Acne vulgaris is one of the most common skin diseases treated by dermatologists. Salts of copper chlorophyllin complex are semi-synthetic naturally-derived compounds with antioxidant, anti-inflammatory and wound healing activity that have not been previously tested topically in the treatment of acne-prone skin with enlarged pores. This single-center pilot study was conducted to assess the efficacy and safety of a liposomal dispersion of topically applied sodium copper chlorophyllin complex in subjects with mild-moderate acne and large, visible pores over a course of 3 weeks. Subjects were supplied with the test product, a topical gel containing a liposomal dispersion of sodium copper chlorophyllin complex (0.1%) with directions to apply a small amount to the facial area twice daily. Clinical assessments were performed at screening/baseline and at week 3. VISIA readings were taken and self-assessment questionnaires were conducted. 10 subjects were enrolled and completed the 3-week study. All clinical efficacy parameters showed statistically significant improvements over baseline at week 3. The study product was well tolerated. Subject questionnaires showed the test product was highly rated. In this pilot study, a topical formulation containing a liposomal dispersion of sodium copper chlorophyllin complex was shown to be clinically effective and well tolerated for the treatment of mild-moderate acne and large, visible pores when used for 3 weeks.

  17. The spectra of type IIB flux compactifications at large complex structure

    International Nuclear Information System (INIS)

    Brodie, Callum; Marsh, M.C. David

    2016-01-01

    We compute the spectra of the Hessian matrix, H, and the matrix M that governs the critical point equation of the low-energy effective supergravity, as a function of the complex structure and axio-dilaton moduli space in type IIB flux compactifications at large complex structure. We find both spectra analytically in an (h^{1,2} + 3) real-dimensional subspace of the moduli space, and show that they exhibit a universal structure with highly degenerate eigenvalues, independently of the choice of flux, the details of the compactification geometry, and the number of complex structure moduli. In this subspace, the spectrum of the Hessian matrix contains no tachyons, but there are also no critical points. We show numerically that the spectra of H and M remain highly peaked over a large fraction of the sampled moduli space of explicit Calabi-Yau compactifications with 2 to 5 complex structure moduli. In these models, the scale of the supersymmetric contribution to the scalar masses is strongly linearly correlated with the value of the superpotential over almost the entire moduli space, with particularly strong correlations arising for g_s < 1. We contrast these results with the expectations from the much-used continuous flux approximation, and comment on the applicability of Random Matrix Theory to the statistical modelling of the string theory landscape.

  18. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable --- with respect to parameters, disturbances, and states --- while others may not. The state estimation process is carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
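As a simplified numerical stand-in for the observability analysis described above (the paper works with *structural* observability of nonlinear Modelica models, which this does not reproduce), the classical rank test for a linear system x' = Ax, y = Cx shows why some measurement choices leave states unobservable. The two-state example is invented.

```python
# Observability rank test for a linear system x' = A x, y = C x:
# the state is observable iff [C; CA; ...; CA^(n-1)] has full rank.
import numpy as np

def observability_matrix(A, C):
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

def is_observable(A, C):
    return np.linalg.matrix_rank(observability_matrix(A, C)) == A.shape[0]

# Two-state chain: x1 feeds x2, but not the other way around.
A = np.array([[-1.0, 0.0],
              [ 1.0, -2.0]])
C_last = np.array([[0.0, 1.0]])    # measuring x2 sees x1 through the coupling
C_first = np.array([[1.0, 0.0]])   # measuring x1 alone misses x2

print(is_observable(A, C_last), is_observable(A, C_first))
```

When the test fails, adding a measurement row to C (here, any row sensing x2) restores full rank, which is the linear analogue of the paper's "prescribe additional measurements" step.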

  19. Supply chain integration and performance : the moderating effect of supply complexity

    NARCIS (Netherlands)

    Giménez, C.; van der Vaart, T.; van Donk, D.P.

    2012-01-01

    Purpose - The purpose of this paper is to investigate the effectiveness of supply chain integration in different contexts. More specifically, it aims to show that supply chain integration is only effective in buyer-supplier relationships characterised by high supply complexity.

  20. Optimal design of integrated CHP systems for housing complexes

    International Nuclear Information System (INIS)

    Fuentes-Cortés, Luis Fabián; Ponce-Ortega, José María; Nápoles-Rivera, Fabricio; Serna-González, Medardo; El-Halwagi, Mahmoud M.

    2015-01-01

    Highlights: • An optimization formulation for designing domestic CHP systems is presented. • The operating scheme, prime mover and thermal storage system are optimized. • Weather conditions and behavior demands are considered. • Simultaneously economic and environmental objectives are considered. • Two case studies from Mexico are presented. - Abstract: This paper presents a multi-objective optimization approach for designing residential cogeneration systems based on a new superstructure that allows satisfying the demands of hot water and electricity at the minimum cost and the minimum environmental impact. The optimization involves the selection of technologies, size of required units and operating modes of equipment. Two residential complexes in different cities of the State of Michoacán in Mexico were considered as case studies. One is located on the west coast and the other one is in the mountainous area. The results show that the implementation of the proposed optimization method yields significant economic and environmental benefits due to the simultaneous reduction in the total annual cost and overall greenhouse gas emissions

  1. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    Science.gov (United States)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible. Its components include: a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a ‘socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  2. Survey of large protein complexes in D. vulgaris reveals great structural diversity

    Energy Technology Data Exchange (ETDEWEB)

    Han, B.-G.; Dong, M.; Liu, H.; Camp, L.; Geller, J.; Singer, M.; Hazen, T. C.; Choi, M.; Witkowska, H. E.; Ball, D. A.; Typke, D.; Downing, K. H.; Shatsky, M.; Brenner, S. E.; Chandonia, J.-M.; Biggin, M. D.; Glaeser, R. M.

    2009-08-15

    An unbiased survey has been made of the stable, most abundant multi-protein complexes in Desulfovibrio vulgaris Hildenborough (DvH) that are larger than Mr ≈ 400 k. The quaternary structures for 8 of the 16 complexes purified during this work were determined by single-particle reconstruction of negatively stained specimens, a success rate ≈10 times greater than that of previous 'proteomic' screens. In addition, the subunit compositions and stoichiometries of the remaining complexes were determined by biochemical methods. Our data show that the structures of only two of these large complexes, out of the 13 in this set that have recognizable functions, can be modeled with confidence based on the structures of known homologs. These results indicate that there is significantly greater variability in the way that homologous prokaryotic macromolecular complexes are assembled than has generally been appreciated. As a consequence, we suggest that relying solely on previously determined quaternary structures for homologous proteins may not be sufficient to properly understand their role in another cell of interest.

  3. Developing integrated methods to address complex resource and environmental issues

    Science.gov (United States)

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    Introduction: This circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond. In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some

  4. Complex singularities of the critical potential in the large-N limit

    International Nuclear Information System (INIS)

    Meurice, Y.

    2003-01-01

    We show with two numerical examples that the conventional expansion in powers of the field for the critical potential of 3-dimensional O(N) models in the large-N limit does not converge for values of φ² larger than some critical value. This can be explained by the existence of conjugated branch points in the complex φ² plane. Padé approximants [L+3/L] for the critical potential apparently converge at large φ². This allows high-precision calculation of the fixed point in a more suitable set of coordinates. We argue that the singularities are generic and not an artifact of the large-N limit. We show that ignoring these singularities may lead to inaccurate approximations.
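
The mechanism invoked here — a power series failing beyond a branch point while Padé approximants keep converging — is easy to demonstrate on a toy function. The sketch below (our illustration, using a generic [L/M] construction rather than the paper's [L+3/L] approximants and potential) builds a [3/3] Padé for f(x) = (1+x)^(-1/2), whose Taylor series diverges for |x| > 1:

```python
import numpy as np
from math import comb

def pade(c, L, M):
    """[L/M] Padé approximant p/q from Taylor coefficients c[0..L+M],
    with the denominator normalized so that q[0] = 1."""
    # Denominator: sum_k q_k c_{n-k} = 0 for n = L+1 .. L+M.
    A = np.array([[c[L + i - j] for j in range(1, M + 1)] for i in range(1, M + 1)])
    b = -np.array([c[L + i] for i in range(1, M + 1)])
    q = np.concatenate(([1.0], np.linalg.solve(A, b)))
    # Numerator: match the series through order L.
    p = np.array([sum(q[k] * c[n - k] for k in range(min(n, M) + 1))
                  for n in range(L + 1)])
    return p, q

# Taylor coefficients of (1+x)^(-1/2): c_n = C(2n, n) (-1/4)^n.
c = [comb(2 * n, n) * (-0.25) ** n for n in range(7)]
p, q = pade(c, 3, 3)

x = 3.0  # well outside the series' radius of convergence
approx = np.polyval(p[::-1], x) / np.polyval(q[::-1], x)
print(abs(approx - (1.0 + x) ** -0.5))  # small, though the series diverges here
```

This mirrors the paper's use of Padé approximants to continue the critical potential past the singularities in the complex φ² plane.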

  5. Dynamic Reactive Power Compensation of Large Scale Wind Integrated Power System

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    Due to progressive displacement of conventional power plants by wind turbines, the dynamic security of large-scale wind-integrated power systems gets significantly compromised. In this paper we first highlight the importance of dynamic reactive power support/voltage security in large-scale wind-integrated power systems with little presence of conventional power plants. We argue that i) wind turbines, especially wind farms with additional grid support functionalities like dynamic reactive power support, and ii) refurbishment of existing conventional central power plants into synchronous condensers could be efficient, reliable and cost-effective options. We then propose a mixed-integer dynamic optimization based method for optimal dynamic reactive power allocation in large-scale wind-integrated power systems. One of the important aspects of the proposed methodology is that unlike...

  6. Integrated health management and control of complex dynamical systems

    Science.gov (United States)

    Tolani, Devendra K.

    2005-11-01

    A comprehensive control and health management strategy for human-engineered complex dynamical systems is formulated for achieving high performance and reliability over a wide range of operation. Results from diverse research areas such as Probabilistic Robust Control (PRC), Damage Mitigating/Life Extending Control (DMC), Discrete Event Supervisory (DES) Control, Symbolic Time Series Analysis (STSA) and Health and Usage Monitoring System (HUMS) have been employed to achieve this goal. Continuous-domain control modules at the lower level are synthesized by PRC and DMC theories, whereas the upper-level supervision is based on DES control theory. In the PRC approach, by allowing different levels of risk under different flight conditions, the control system can achieve the desired trade-off between stability robustness and nominal performance. In the DMC approach, component damage is incorporated in the control law to reduce the damage rate for enhanced structural durability. The DES controller monitors the system performance and, based on the mission requirements (e.g., performance metrics and level of damage mitigation), switches among various lower-level controllers. The core idea is to design a framework where the DES controller at the upper level mimics human intelligence and makes appropriate decisions to satisfy mission requirements, enhance system performance and structural durability. Recently developed tools in STSA have been used for anomaly detection and failure prognosis. The DMC deals with the usage monitoring or operational control part of health management, whereas the issue of health monitoring is addressed by the anomaly detection tools. The proposed decision and control architecture has been validated on two test-beds, simulating the operations of rotorcraft dynamics and aircraft propulsion.

  7. When is vertical integration profitable? Focus on a large upstream company in the gas market

    International Nuclear Information System (INIS)

    Hatlebakk, Magnus

    2001-12-01

    This note discusses basic economic mechanisms that may affect the profitability of vertical integration in the European gas industry. It concentrates on reasonable strategies for a large upstream company considering a stronger engagement downstream. The note warns against drawing simplified conclusions about the impact of vertical integration. It applies a simple model of successive oligopolies to discuss double mark-ups, exclusion, barriers to entry, etc.

  8. The Complexity of Integrated Instrument Components Media for Natural Science (IPA) at Elementary Schools

    Directory of Open Access Journals (Sweden)

    Angreni Siska

    2018-01-01

    Full Text Available This research aims at describing the complexity of Integrated Instrument Components (CII) media used in science learning at elementary schools in District Siulak Mukai and in District Siulak. The research applied a descriptive method which included survey forms; the instruments used were observation sheets. The results showed that the CII media for natural science at elementary schools in District Siulak were more complex than those at elementary schools in District Siulak Mukai.

  9. Planning and Building Qualifiable Embedded Systems: Safety and Risk Properties Assessment for a Large and Complex System with Embedded Subsystems

    Science.gov (United States)

    Silva, N.; Lopes, R.; Barbosa, R.

    2012-01-01

    Systems based on embedded components and applications are today used in all markets. They are planned and developed by all types of institutions with different types of background experience, multidisciplinary teams and all types of capability and maturity levels. Organisational/engineering maturity has an impact on all aspects of the engineering of large and complex systems. An embedded system is a specific computer system designed to perform one or more dedicated functions, usually with real-time constraints. It is generally integrated as part of a more complex device typically composed of specific hardware such as sensors and actuators. This article presents a field-tested technique to evaluate the organisation, processes, system and software engineering practices, methods, tools and the planned/produced artefacts themselves, leading towards certification/qualification. The safety and risk assessment of such core and complex systems is explained and described in a step-by-step manner, while presenting the main results and conclusions of the application of the technique to a real case study.

  10. Large system change challenges: addressing complex critical issues in linked physical and social domains

    Science.gov (United States)

    Waddell, Steve; Cornell, Sarah; Hsueh, Joe; Ozer, Ceren; McLachlan, Milla; Birney, Anna

    2015-04-01

    Most action to address contemporary complex challenges, including the urgent issues of global sustainability, occurs piecemeal and without meaningful guidance from leading complex change knowledge and methods. The potential benefit of using such knowledge is greater efficacy of effort and investment. However, this knowledge and its associated tools and methods are under-utilized because understanding about them is low, fragmented between diverse knowledge traditions, and often requires shifts in mindsets and skills from expert-led to participant-based action. We have been engaged in diverse action-oriented research efforts in Large System Change for sustainability. For us, "large" systems can be characterized as large-scale systems - up to global - with many components, of many kinds (physical, biological, institutional, cultural/conceptual), operating at multiple levels, driven by multiple forces, and presenting major challenges for people involved. We see change of such systems as complex challenges, in contrast with simple or complicated problems, or chaotic situations. In other words, issues and sub-systems have unclear boundaries, interact with each other, and are often contradictory; dynamics are non-linear; issues are not "controllable", and "solutions" are "emergent" and often paradoxical. Since choices are opportunity-, power- and value-driven, these social, institutional and cultural factors need to be made explicit in any actionable theory of change. Our emerging network is sharing and building a knowledge base of experience, heuristics, and theories of change from multiple disciplines and practice domains. We will present our views on focal issues for the development of the field of large system change, which include processes of goal-setting and alignment; leverage of systemic transitions and transformation; and the role of choice in influencing critical change processes, when only some sub-systems or levels of the system behave in purposeful ways

  11. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  12. How to integrate divergent integrals: a pure numerical approach to complex loop calculations

    International Nuclear Information System (INIS)

    Caravaglios, F.

    2000-01-01

    Loop calculations involve the evaluation of divergent integrals. Usually [G. 't Hooft, M. Veltman, Nucl. Phys. B 44 (1972) 189] one computes them in a number of dimensions different than four where the integral is convergent and then one performs the analytical continuation and considers the Laurent expansion in powers of ε=n-4. In this paper we discuss a method to extract directly all coefficients of this expansion by means of concrete and well defined integrals in a five-dimensional space. We by-pass the formal and symbolic procedure of analytic continuation; instead we can numerically compute the integrals to extract directly both the coefficient of the pole 1/ε and the finite part
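
The idea of trading formal analytic continuation for concrete numerical evaluations can be illustrated on a toy example. The sketch below (our construction, not the paper's five-dimensional integrals) evaluates a regularized integral at several values of ε > 0 and extracts the coefficient of the 1/ε pole and the finite part by a least-squares Laurent fit:

```python
import numpy as np
from scipy.integrate import quad

# Toy regularized integral: I(eps) = ∫_0^1 x^(eps-1) (1+x) dx
# = 1/eps + 1/(1+eps), so its Laurent expansion is 1/eps + 1 - eps + O(eps^2).
def I(eps):
    val, _ = quad(lambda x: x ** (eps - 1.0) * (1.0 + x), 0.0, 1.0)
    return val

# Evaluate at several eps > 0 and fit c_{-1}/eps + c_0 + c_1*eps by least squares.
eps = np.array([0.05, 0.10, 0.15, 0.20, 0.25])
A = np.column_stack([1.0 / eps, np.ones_like(eps), eps])
c_m1, c_0, c_1 = np.linalg.lstsq(A, np.array([I(e) for e in eps]), rcond=None)[0]
print(f"pole: {c_m1:.3f}  finite part: {c_0:.3f}")  # both should come out close to 1
```

Here the pole and finite part are known analytically, so the fit can be checked; in a realistic loop calculation only the numerical evaluations would be available.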

  13. CRISPR-Cas9 mediated genetic engineering for the purification of the endogenous integrator complex from mammalian cells.

    Science.gov (United States)

    Baillat, David; Russell, William K; Wagner, Eric J

    2016-12-01

    The Integrator Complex (INT) is a large multi-subunit protein complex, containing at least 14 subunits and a host of associated factors. These protein components have been established through pulldowns of overexpressed epitope-tagged subunits or by using antibodies raised against specific subunits. Here, we utilize CRISPR/Cas9 gene editing technology to introduce N-terminal FLAG epitope tags into the endogenous genes that encode Integrator subunits 4 and 11 within HEK293T cells. We provide specific details regarding design, approaches for facile screening, and our observed frequency of successful recombination. Finally, using silver staining, Western blotting and LC-MS/MS, we compare the INT components purified from the CRISPR-derived lines with those from 293T cells overexpressing FLAG-INTS11 to define a highly resolved constituency of mammalian INT. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation, e.g., it is unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between selected information measures for graphs by performing a large-scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure, the topological information content of a graph, and others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
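
A partition-based measure of this kind is straightforward to compute. The sketch below (our illustration, not the paper's measure suite) uses vertex degree classes as a cheap stand-in for the automorphism-orbit partition that defines the classical topological information content — an assumption for simplicity, since computing orbits needs graph-isomorphism machinery:

```python
import math
from collections import Counter
import networkx as nx

def partition_entropy(graph, key=None):
    """Shannon entropy (in bits) of a vertex partition. Degree classes are
    used here as a coarse proxy for the automorphism-orbit partition."""
    key = key or (lambda v: graph.degree(v))
    n = graph.number_of_nodes()
    sizes = Counter(key(v) for v in graph.nodes).values()
    return -sum(s / n * math.log2(s / n) for s in sizes)

star = nx.star_graph(5)   # 6 vertices: one hub, five leaves -> low entropy
path = nx.path_graph(6)   # 6 vertices: two ends, four inner -> higher entropy
print(partition_entropy(star), partition_entropy(path))
```

More symmetric graphs collapse into fewer, larger classes and thus carry less "structural information" under this measure, which is the intuition the paper probes across chemical networks.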

  15. An integrated native mass spectrometry and top-down proteomics method that connects sequence to structure and function of macromolecular complexes

    Science.gov (United States)

    Li, Huilin; Nguyen, Hong Hanh; Ogorzalek Loo, Rachel R.; Campuzano, Iain D. G.; Loo, Joseph A.

    2018-02-01

    Mass spectrometry (MS) has become a crucial technique for the analysis of protein complexes. Native MS has traditionally examined protein subunit arrangements, while proteomics MS has focused on sequence identification. These two techniques are usually performed separately without taking advantage of the synergies between them. Here we describe the development of an integrated native MS and top-down proteomics method using Fourier-transform ion cyclotron resonance (FTICR) to analyse macromolecular protein complexes in a single experiment. We address previous concerns of employing FTICR MS to measure large macromolecular complexes by demonstrating the detection of complexes up to 1.8 MDa, and we demonstrate the efficacy of this technique for direct acquirement of sequence to higher-order structural information with several large complexes. We then summarize the unique functionalities of different activation/dissociation techniques. The platform expands the ability of MS to integrate proteomics and structural biology to provide insights into protein structure, function and regulation.

  16. Sedimentation Velocity Analysis of Large Oligomeric Chromatin Complexes Using Interference Detection.

    Science.gov (United States)

    Rogge, Ryan A; Hansen, Jeffrey C

    2015-01-01

    Sedimentation velocity experiments measure the transport of molecules in solution under centrifugal force. Here, we describe a method for monitoring the sedimentation of very large biological molecular assemblies using the interference optical systems of the analytical ultracentrifuge. The mass, partial-specific volume, and shape of macromolecules in solution affect their sedimentation rates as reflected in the sedimentation coefficient. The sedimentation coefficient is obtained by measuring the solute concentration as a function of radial distance during centrifugation. Monitoring the concentration can be accomplished using interference optics, absorbance optics, or the fluorescence detection system, each with inherent advantages. The interference optical system captures data much faster than these other optical systems, allowing for sedimentation velocity analysis of extremely large macromolecular complexes that sediment rapidly at very low rotor speeds. Supramolecular oligomeric complexes produced by self-association of 12-mer chromatin fibers are used to illustrate the advantages of the interference optics. Using interference optics, we show that chromatin fibers self-associate at physiological divalent salt concentrations to form structures that sediment between 10,000 and 350,000S. The method for characterizing chromatin oligomers described in this chapter will be generally useful for characterization of any biological structures that are too large to be studied by the absorbance optical system. © 2015 Elsevier Inc. All rights reserved.
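
The dependence of the sedimentation coefficient on mass, partial specific volume, and shape mentioned above follows the Svedberg framework. A back-of-envelope sketch for an idealized compact sphere (all parameter values are our illustrative assumptions, not data from the chapter):

```python
import math

# s = m (1 - vbar * rho) / f, with Stokes friction f = 6 * pi * eta * R.
NA = 6.022e23            # Avogadro's number, 1/mol
M = 100_000.0            # molar mass, g/mol (assumed ~100 kDa particle)
vbar = 0.73e-3           # partial specific volume, m^3/kg (typical protein)
rho = 1.000e3            # solvent density, kg/m^3 (water)
eta = 1.002e-3           # solvent viscosity, Pa*s (water, 20 C)

m = M / NA * 1e-3                                   # one particle's mass, kg
R = (3.0 * m * vbar / (4.0 * math.pi)) ** (1 / 3)   # anhydrous sphere radius, m
f = 6.0 * math.pi * eta * R                         # friction coefficient, kg/s
s = m * (1.0 - vbar * rho) / f                      # sedimentation coefficient, s
print(f"s = {s / 1e-13:.1f} S")                     # ~8 S; hydration would lower this
```

Because s grows with particle mass, the supramolecular oligomers in the chapter (10,000–350,000 S) sediment orders of magnitude faster than a typical protein, which is why the fast-scanning interference optics are needed.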

  17. DNA-Directed Assembly of Capture Tools for Constitutional Studies of Large Protein Complexes.

    Science.gov (United States)

    Meyer, Rebecca; Faesen, Alex; Vogel, Katrin; Jeganathan, Sadasivam; Musacchio, Andrea; Niemeyer, Christof M

    2015-06-10

    Large supramolecular protein complexes, such as the molecular machinery involved in gene regulation, cell signaling, or cell division, are key in all fundamental processes of life. Detailed elucidation of structure and dynamics of such complexes can be achieved by reverse-engineering parts of the complexes in order to probe their interactions with distinctive binding partners in vitro. The exploitation of DNA nanostructures to mimic partially assembled supramolecular protein complexes in which the presence and state of two or more proteins are decisive for binding of additional building blocks is reported here. To this end, four-way DNA Holliday junction motifs bearing a fluorescein and a biotin tag, for tracking and affinity capture, respectively, are site-specifically functionalized with centromeric protein (CENP) C and CENP-T. The latter serves as baits for binding of the so-called KMN component, thereby mimicking early stages of the assembly of kinetochores, structures that mediate and control the attachment of microtubules to chromosomes in the spindle apparatus. Results from pull-down experiments are consistent with the hypothesis that CENP-C and CENP-T may bind cooperatively to the KMN network. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Gcn4-Mediator Specificity Is Mediated by a Large and Dynamic Fuzzy Protein-Protein Complex

    Directory of Open Access Journals (Sweden)

    Lisa M. Tuttle

    2018-03-01

    Full Text Available Summary: Transcription activation domains (ADs are inherently disordered proteins that often target multiple coactivator complexes, but the specificity of these interactions is not understood. Efficient transcription activation by yeast Gcn4 requires its tandem ADs and four activator-binding domains (ABDs on its target, the Mediator subunit Med15. Multiple ABDs are a common feature of coactivator complexes. We find that the large Gcn4-Med15 complex is heterogeneous and contains nearly all possible AD-ABD interactions. Gcn4-Med15 forms via a dynamic fuzzy protein-protein interface, where ADs bind the ABDs in multiple orientations via hydrophobic regions that gain helicity. This combinatorial mechanism allows individual low-affinity and specificity interactions to generate a biologically functional, specific, and higher affinity complex despite lacking a defined protein-protein interface. This binding strategy is likely representative of many activators that target multiple coactivators, as it allows great flexibility in combinations of activators that can cooperate to regulate genes with variable coactivator requirements. : Tuttle et al. report a “fuzzy free-for-all” interaction mechanism that explains how seemingly unrelated transcription activators converge on a limited number of coactivator targets. The mechanism provides a rationale for the observation that individually weak and low-specificity interactions can combine to produce biologically critical function without requiring highly ordered structure. Keywords: transcription activation, intrinsically disordered proteins, fuzzy binding

  19. Large eddy simulation modeling of particle-laden flows in complex terrain

    Science.gov (United States)

    Salesky, S.; Giometto, M. G.; Chamecki, M.; Lehning, M.; Parlange, M. B.

    2017-12-01

    The transport, deposition, and erosion of heavy particles over complex terrain in the atmospheric boundary layer is an important process for hydrology, air quality forecasting, biology, and geomorphology. However, in situ observations can be challenging in complex terrain due to spatial heterogeneity. Furthermore, there is a need to develop numerical tools that can accurately represent the physics of these multiphase flows over complex surfaces. We present a new numerical approach to accurately model the transport and deposition of heavy particles in complex terrain using large eddy simulation (LES). Particle transport is represented through solution of the advection-diffusion equation including terms that represent gravitational settling and inertia. The particle conservation equation is discretized in a cut-cell finite volume framework in order to accurately enforce mass conservation. Simulation results will be validated with experimental data, and numerical considerations required to enforce boundary conditions at the surface will be discussed. Applications will be presented in the context of snow deposition and transport, as well as urban dispersion.
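
The advection-diffusion balance with gravitational settling described above can be sketched in one vertical column (our simplified illustration with invented numbers; the paper's LES is three-dimensional with a cut-cell finite-volume scheme over complex terrain):

```python
import numpy as np

# dc/dt = -dF/dz with upward-positive face flux F = -K dc/dz - w_s c,
# discretized finite-volume style with upwind settling.
nz, dz, dt = 50, 1.0, 0.05
K, ws = 1.0, 0.5                      # eddy diffusivity m^2/s, settling speed m/s
z = (np.arange(nz) + 0.5) * dz
c = np.exp(-((z - 25.0) / 5.0) ** 2)  # initial particle plume aloft
deposited = 0.0

for _ in range(2000):                 # integrate to t = 100 s
    F = np.zeros(nz + 1)              # F[i] = flux at the face below cell i
    F[1:nz] = -K * (c[1:] - c[:-1]) / dz - ws * c[1:]  # diffusion + upwind settling
    F[0] = -ws * c[0]                 # bottom face: settling deposits to the ground
    # F[nz] = 0: closed top boundary
    c -= dt / dz * (F[1:] - F[:-1])
    deposited += -F[0] * dt           # mass leaving through the bottom

total0 = (np.exp(-((z - 25.0) / 5.0) ** 2)).sum() * dz
print(deposited / total0)             # most of the initial mass has deposited
```

The finite-volume form makes the mass budget exact by construction: suspended mass plus accumulated deposition stays equal to the initial mass, which is the conservation property the cut-cell discretization in the paper is designed to enforce on irregular surfaces.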

  20. Gcn4-Mediator Specificity Is Mediated by a Large and Dynamic Fuzzy Protein-Protein Complex.

    Science.gov (United States)

    Tuttle, Lisa M; Pacheco, Derek; Warfield, Linda; Luo, Jie; Ranish, Jeff; Hahn, Steven; Klevit, Rachel E

    2018-03-20

    Transcription activation domains (ADs) are inherently disordered proteins that often target multiple coactivator complexes, but the specificity of these interactions is not understood. Efficient transcription activation by yeast Gcn4 requires its tandem ADs and four activator-binding domains (ABDs) on its target, the Mediator subunit Med15. Multiple ABDs are a common feature of coactivator complexes. We find that the large Gcn4-Med15 complex is heterogeneous and contains nearly all possible AD-ABD interactions. Gcn4-Med15 forms via a dynamic fuzzy protein-protein interface, where ADs bind the ABDs in multiple orientations via hydrophobic regions that gain helicity. This combinatorial mechanism allows individual low-affinity and specificity interactions to generate a biologically functional, specific, and higher affinity complex despite lacking a defined protein-protein interface. This binding strategy is likely representative of many activators that target multiple coactivators, as it allows great flexibility in combinations of activators that can cooperate to regulate genes with variable coactivator requirements. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Axiomatic design in large systems complex products, buildings and manufacturing systems

    CERN Document Server

    Suh, Nam

    2016-01-01

    This book provides a synthesis of recent developments in Axiomatic Design theory and its application in large complex systems. Introductory chapters provide concise tutorial materials for graduate students and new practitioners, presenting the fundamentals of Axiomatic Design and relating its key concepts to those of model-based systems engineering. A mathematical exposition of design axioms is also provided. The main body of the book, which represents a concentrated treatment of several applications, is divided into three parts covering work on: complex products; buildings; and manufacturing systems. The book shows how design work in these areas can benefit from the scientific and systematic underpinning provided by Axiomatic Design, and in so doing effectively combines the state of the art in design research with practice. All contributions were written by an international group of leading proponents of Axiomatic Design. The book concludes with a call to action motivating further research into the engineeri...

  2. Engineering a large application software project: the controls of the CERN PS accelerator complex

    International Nuclear Information System (INIS)

    Benincasa, G.P.; Daneels, A.; Heymans, P.; Serre, Ch.

    1985-01-01

    The CERN PS accelerator complex has been progressively converted to full computer controls without interrupting its full-time operation (more than 6000 hours per year with on average not more than 1% of the total down-time due to controls). The application software amounts to 120 man-years and 450'000 instructions: it compares with other large software projects, also outside the accelerator world: e.g. Skylab's ground support software. This paper outlines the application software structure which takes into account technical requirements and constraints (resulting from the complexity of the process and its operation) and economical and managerial ones. It presents the engineering and management techniques used to promote implementation, testing and commissioning within budget, manpower and time constraints and concludes with experience gained

  3. SPECTROSCOPIC STUDY OF THE N159/N160 COMPLEX IN THE LARGE MAGELLANIC CLOUD

    International Nuclear Information System (INIS)

    Farina, Cecilia; Bosch, Guillermo L.; Morrell, Nidia I.; Barba, Rodolfo H.; Walborn, Nolan R.

    2009-01-01

    We present a spectroscopic study of the N159/N160 massive star-forming region south of 30 Doradus in the Large Magellanic Cloud, classifying a total of 189 stars in the field of the complex. Most of them belong to O and early B spectral classes; we have also found some uncommon and very interesting spectra, including members of the Onfp class, a Be P Cygni star, and some possible multiple systems. Using spectral types as broad indicators of evolutionary stages, we considered the evolutionary status of the region as a whole. We infer that massive stars at different evolutionary stages are present throughout the region, favoring the idea of a common time for the origin of recent star formation in the N159/N160 complex as a whole, while sequential star formation at different rates is probably present in several subregions.

  4. Control protocol: large scale implementation at the CERN PS complex - a first assessment

    International Nuclear Information System (INIS)

    Abie, H.; Benincasa, G.; Coudert, G.; Davydenko, Y.; Dehavay, C.; Gavaggio, R.; Gelato, G.; Heinze, W.; Legras, M.; Lustig, H.; Merard, L.; Pearson, T.; Strubin, P.; Tedesco, J.

    1994-01-01

    The Control Protocol is a model-based, uniform access procedure from a control system to accelerator equipment. It was proposed at CERN about 5 years ago and prototypes were developed in the following years. More recently, this procedure has been finalized and implemented at a large scale in the PS Complex. More than 300 pieces of equipment are now using this protocol in normal operation and another 300 are under implementation. These include power converters, vacuum systems, beam instrumentation devices, RF equipment, etc. This paper describes how the single general procedure is applied to the different kinds of equipment. The advantages obtained are also discussed.

  5. Complex Security System for Premises Under Conditions of Large Volume of Passenger Traffic

    Directory of Open Access Journals (Sweden)

    Yakubov Vladimir

    2016-01-01

    Subsystems of a complex security system for premises under conditions of large-volume passenger traffic are considered. These subsystems provide video and thermal-imaging control, radio-wave tomography, and gas analysis. Simultaneous application of all examined variants will essentially increase the probability of timely prevention of dangerous situations while keeping the probability of false alarm as low as possible. Ultimately, this will protect the population and facilitate the work of intelligence services.

  6. Descriptive Study of an Outbreak of Avian Urolithiasis in a Large Commercial Egg Complex in Algeria

    Directory of Open Access Journals (Sweden)

    Hicham SID

    2011-03-01

    Avian urolithiasis is one of the major causes of mortality in poultry; in Algeria, however, this condition had never been described. An outbreak of avian urolithiasis was observed on a large commercial egg complex in the department of Chlef (west of Algeria), and its clinical features are described here. Mortality associated with urolithiasis started at the onset of egg production, estimated at 0.7% per week, and the condition induced an egg drop estimated at 12%. Dead and live layers were both necropsied and examined for kidney lesions. Most of the birds examined presented enlarged ureters, renal atrophy and visceral gout deposition.

  7. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  8. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  9. A note on U(N) integrals in the large N limit

    International Nuclear Information System (INIS)

    O'Brien, K.H.; Zuber, J.B.

    1984-01-01

    The U(N) integral ∫DU exp[N tr(UJ + U*J*)] = exp(N²W) is reconsidered in the large-N limit and the coefficients of the expansion of W in the moments of the eigenvalues of JJ* are explicitly computed.

  10. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  11. Direct evaluation of free energy for large system through structure integration approach.

    Science.gov (United States)

    Takeuchi, Kazuhito; Tanaka, Ryohei; Yuge, Koretaka

    2015-09-30

    We propose a new approach, 'structure integration', enabling direct evaluation of the configurational free energy of large systems. The approach is based on statistical information about the lattice. Through first-principles-based simulation, we find that the method accurately evaluates the configurational free energy in disordered states above the critical temperature.

  12. Management of large complex multi-stakeholders projects: a bibliometric approach

    Directory of Open Access Journals (Sweden)

    Aline Sacchi Homrich

    2017-06-01

    The growing global importance of large infrastructure projects has piqued the interest of many researchers in a variety of issues related to the management of large, multi-stakeholder projects, characterized by their high complexity and intense interaction among numerous stakeholders with distinct levels of responsibility. The objective of this study is to provide an overview of the academic literature focused on the management of these kinds of projects, describing the main themes considered, the lines of research identified and prominent trends. Bibliometric analysis techniques were used, together with network and content analysis. Searches were performed in the scientific databases ISI Web of Knowledge and Scopus. The initial sample consisted of 144 papers published between 1984 and 2014 and was expanded to the references cited in these papers. The models identified in the literature converge on the following key processes: project delivery systems; risk-management models; project cost management; public-private partnerships.

  13. A Variable Stiffness Analysis Model for Large Complex Thin-Walled Guide Rail

    Directory of Open Access Journals (Sweden)

    Wang Xiaolong

    2016-01-01

    Large complex thin-walled guide rails have complicated structures and non-uniform, low rigidity. Traditional cutting simulations are time consuming due to the huge computation involved, especially for large workpieces. To solve these problems, a more efficient variable stiffness analysis model is proposed, which can obtain quantitative stiffness values for the machined surface. By applying simulated cutting forces at sampling points using the finite element analysis software ABAQUS, the single-direction variable stiffness rule can be obtained. A variable stiffness matrix is then proposed by analyzing the multi-directional coupled variable stiffness rule. Combined with the three-direction cutting force values, the reasonableness of existing processing parameters can be verified and optimized cutting parameters can be designed.

  14. A dynamic globalization model for large eddy simulation of complex turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hae Cheon; Park, No Ma; Kim, Jin Seok [Seoul National Univ., Seoul (Korea, Republic of)

    2005-07-01

    A dynamic subgrid-scale model is proposed for large eddy simulation of turbulent flows in complex geometry. The eddy viscosity model by Vreman [Phys. Fluids, 16, 3670 (2004)] is considered as a base model. A priori tests with the original Vreman model show that it predicts the correct profile of subgrid-scale dissipation in turbulent channel flow but the optimal model coefficient is far from universal. Dynamic procedures of determining the model coefficient are proposed based on the 'global equilibrium' between the subgrid-scale dissipation and viscous dissipation. An important feature of the proposed procedures is that the model coefficient determined is globally constant in space but varies only in time. Large eddy simulations with the present dynamic model are conducted for forced isotropic turbulence, turbulent channel flow and flow over a sphere, showing excellent agreements with previous results.

  15. Overview of the ITER Tokamak complex building and integration of plant systems toward construction

    Energy Technology Data Exchange (ETDEWEB)

    Cordier, Jean-Jacques, E-mail: jean-jacques.cordier@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Bak, Joo-Shik [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Baudry, Alain [Engage Consortium, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Benchikhoune, Magali [Fusion For Energy (F4E), c/ Josep Pla, n.2, Torres Diagonal Litoral, E-08019 Barcelona (Spain); Carafa, Leontin; Chiocchio, Stefano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Darbour, Romaric [Fusion For Energy (F4E), c/ Josep Pla, n.2, Torres Diagonal Litoral, E-08019 Barcelona (Spain); Elbez, Joelle; Di Giuseppe, Giovanni; Iwata, Yasuhiro; Jeannoutot, Thomas; Kotamaki, Miikka; Kuehn, Ingo; Lee, Andreas; Levesy, Bruno; Orlandi, Sergio [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Packer, Rachel [Engage Consortium, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Patisson, Laurent; Reich, Jens; Rigoni, Giuliano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); and others

    2015-10-15

    at mid-2014. The paper gives an overview of the final configuration of the ITER nuclear buildings and highlights the large progress made on the final integration of the plant systems in the Tokamak complex. The revised design of the Tokamak machine supporting system is also described.

  16. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data acquisition … platforms (e.g. SkyTEM) have made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored …

  17. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    Science.gov (United States)

    Fekete, Tamás

    2018-05-01

    Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently a new scientific engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which rests on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated in MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well

  18. A large class of solvable multistate Landau–Zener models and quantum integrability

    Science.gov (United States)

    Chernyak, Vladimir Y.; Sinitsyn, Nikolai A.; Sun, Chen

    2018-06-01

    The concept of quantum integrability has been introduced recently for quantum systems with explicitly time-dependent Hamiltonians (Sinitsyn et al 2018 Phys. Rev. Lett. 120 190402). Within the multistate Landau–Zener (MLZ) theory, however, there has been a successful alternative approach to identify and solve complex time-dependent models (Sinitsyn and Chernyak 2017 J. Phys. A: Math. Theor. 50 255203). Here we compare both methods by applying them to a new class of exactly solvable MLZ models. This class contains systems with an arbitrary number of interacting states and shows, as N grows, a quickly increasing number of exact adiabatic energy crossing points, which appear at different moments of time. At each N, transition probabilities in these systems can be found analytically and exactly, but the complexity and variety of solutions in this class also grow quickly with N. We illustrate how common features of solvable MLZ systems arise from quantum integrability and develop an approach to further classification of solvable MLZ problems.

  19. Investigation on the integral output power model of a large-scale wind farm

    Institute of Scientific and Technical Information of China (English)

    BAO Nengsheng; MA Xiuqian; NI Weidou

    2007-01-01

    The integral output power model of a large-scale wind farm is needed when estimating the wind farm's output over a period of time in the future. The actual wind speed power model and calculation method of a wind farm made up of many wind turbine units are discussed. After analyzing the incoming wind flow characteristics and their energy distributions, and after considering the multi-effects among the wind turbine units and certain assumptions, the incoming wind flow model of multi-units is built. The calculation algorithms and steps of the integral output power model of a large-scale wind farm are provided. Finally, an actual power output of the wind farm is calculated and analyzed by using the practical measurement wind speed data. The characteristics of a large-scale wind farm are also discussed.
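As a toy illustration of the integral-output idea described above (summing unit outputs over the incoming wind field), the sketch below uses a generic piecewise power curve. The cut-in/rated/cut-out speeds, the cubic ramp, and the function names are illustrative assumptions, not the paper's actual model.

```python
def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2000.0):
    """Generic piecewise power curve (kW): zero outside [cut-in, cut-out),
    cubic ramp between cut-in and rated speed, flat at rated power above."""
    if v < v_in or v >= v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)

def farm_output(unit_speeds):
    """Integral (total) farm output: sum the unit outputs for the
    per-turbine incoming wind speeds, e.g. wake-reduced downstream."""
    return sum(turbine_power(v) for v in unit_speeds)

# Example: a 4-unit farm with wake-reduced speeds at downstream units
print(farm_output([12.0, 10.5, 9.8, 9.1]))
```

A real integral output model would also account for the multi-unit wake interactions and wind-speed statistics the abstract mentions; this sketch only shows the summation step.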

  20. 3D-Printed Disposable Wireless Sensors with Integrated Microelectronics for Large Area Environmental Monitoring

    KAUST Repository

    Farooqui, Muhammad Fahad

    2017-05-19

    Large area environmental monitoring can play a crucial role in dealing with crisis situations. However, it is challenging, as implementing a fixed sensor network infrastructure over a large remote area is economically unfeasible. This work proposes disposable, compact, dispersible 3D-printed wireless sensor nodes with integrated microelectronics which can be dispersed in the environment and work in conjunction with a few fixed nodes for large area monitoring applications. As a proof of concept, the wireless sensing of temperature, humidity, and H2S levels is shown, which is important for two critical environmental conditions, namely forest fires and industrial leaks. These inkjet-printed sensors and an antenna are realized on the walls of a 3D-printed cubic package which encloses the microelectronics developed on a 3D-printed circuit board. Hence, 3D printing and inkjet printing are uniquely combined in order to realize a low-cost, fully integrated wireless sensor node.

  1. An Integrated Circuit for Radio Astronomy Correlators Supporting Large Arrays of Antennas

    Science.gov (United States)

    D'Addario, Larry R.; Wang, Douglas

    2016-01-01

    Radio telescopes that employ arrays of many antennas are in operation, and ever larger ones are being designed and proposed. Signals from the antennas are combined by cross-correlation. While the cost of most components of the telescope is proportional to the number of antennas N, the cost and power consumption of cross-correlation are proportional to N² and dominate at sufficiently large N. Here we report the design of an integrated circuit (IC) that performs digital cross-correlations for arbitrarily many antennas in a power-efficient way. It uses an intrinsically low-power architecture in which the movement of data between devices is minimized. In a large system, each IC performs correlations for all pairs of antennas but for a portion of the telescope's bandwidth (the so-called "FX" structure). In our design, the correlations are performed in an array of 4096 complex multiply-accumulate (CMAC) units. This is sufficient to perform all correlations in parallel for 64 signals (N = 32 antennas with 2 opposite-polarization signals per antenna). When N is larger, the input data are buffered in an on-chip memory and the CMACs are re-used as many times as needed to compute all correlations. The design has been synthesized and simulated so as to obtain accurate estimates of the IC's size and power consumption. It is intended for fabrication in a 32 nm silicon-on-insulator process, where it will require less than 12 mm² of silicon area and achieve an energy efficiency of 1.76 to 3.3 pJ per CMAC operation, depending on the number of antennas. Operation has been analyzed in detail up to N = 4096. The system-level energy efficiency, including board-level I/O, power supplies, and controls, is expected to be 5 to 7 pJ per CMAC operation. Existing correlators for the JVLA (N = 32) and ALMA (N = 64) telescopes achieve about 5000 pJ and 1000 pJ respectively using application-specific ICs in older technologies. To our knowledge, the largest-N existing correlator is LEDA at N = 256; it
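A minimal sketch of the CMAC operation the IC's array performs: for every pair of signals, multiply one sample stream by the complex conjugate of the other and accumulate over an integration. The function name and the tiny data set are hypothetical; a real FX correlator would first channelize the band and run this per frequency channel.

```python
def correlate_fx(samples):
    """One integration of an FX-style correlator: for each signal pair
    (i, j), accumulate sum over t of x_i[t] * conj(x_j[t]) -- one CMAC
    per pair per sample. Returns the visibility for each of the
    N(N+1)/2 pairs, including autocorrelations."""
    n = len(samples)          # number of input signals
    t_len = len(samples[0])   # samples per integration
    vis = {}
    for i in range(n):
        for j in range(i, n):
            acc = 0 + 0j
            for t in range(t_len):
                acc += samples[i][t] * samples[j][t].conjugate()
            vis[(i, j)] = acc
    return vis

# 3 signals, 4 samples each
x = [[1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j],
     [1 + 0j, 1 + 0j, 1 + 0j, 1 + 0j],
     [0 + 1j, 0 + 1j, 0 + 1j, 0 + 1j]]
v = correlate_fx(x)
print(v[(0, 0)])  # autocorrelation of signal 0 = its total power = (4+0j)
```

The N² scaling the abstract describes is visible directly in the doubly nested pair loop; the IC amortizes it by re-using a fixed array of 4096 CMAC units over buffered data.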

  2. Characterization of Aftershock Sequences from Large Strike-Slip Earthquakes Along Geometrically Complex Faults

    Science.gov (United States)

    Sexton, E.; Thomas, A.; Delbridge, B. G.

    2017-12-01

    Large earthquakes often exhibit complex slip distributions and occur along non-planar fault geometries, resulting in variable stress changes throughout the region of the fault hosting aftershocks. To better discern the role of geometric discontinuities in aftershock sequences, we compare areas of enhanced and reduced Coulomb failure stress and mean stress for systematic differences in the time dependence and productivity of these aftershock sequences. In strike-slip faults, releasing structures, including stepovers and bends, experience an increase in both Coulomb failure stress and mean stress during an earthquake, promoting fluid diffusion into the region and further failure. Conversely, Coulomb failure stress and mean stress decrease in restraining bends and stepovers in strike-slip faults, and fluids diffuse away from these areas, discouraging failure. We examine spatial differences in seismicity patterns along structurally complex strike-slip faults which have hosted large earthquakes, such as the 1992 Mw 7.3 Landers, the 2010 Mw 7.2 El Mayor-Cucapah, the 2014 Mw 6.0 South Napa, and the 2016 Mw 7.0 Kumamoto events. We characterize the behavior of these aftershock sequences with the Epidemic Type Aftershock Sequence (ETAS) model. In this statistical model, the total occurrence rate of aftershocks induced by an earthquake is λ(t) = λ_0 + \sum_{i:t_i<t} κ(M_i) g(t − t_i), where κ(M_i) scales the productivity of each prior event with its magnitude and g is the Omori-type temporal decay kernel.
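A minimal sketch of evaluating the ETAS conditional intensity in its standard parameterization: a background rate plus a magnitude-scaled Omori-law decay term for each prior event. The parameter values below are illustrative placeholders, not fitted values from any of the sequences above.

```python
import math

def etas_rate(t, events, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Standard ETAS conditional intensity:
        lambda(t) = mu + sum over events with t_i < t of
                    K * exp(alpha * (M_i - m0)) / (t - t_i + c)**p
    `events` is a list of (t_i, M_i) pairs; mu is the background rate,
    K/alpha set productivity, and c/p give the Omori temporal decay."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Two prior events: a mainshock (M 6.0 at t=0) and an aftershock (M 4.5 at t=1)
events = [(0.0, 6.0), (1.0, 4.5)]
print(etas_rate(2.0, events))  # elevated well above the background mu
```

The rate decays toward the background level as t moves away from the triggering events, which is what makes sequence productivity and time dependence comparable across fault-geometry settings.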

  3. Wall modeled large eddy simulations of complex high Reynolds number flows with synthetic inlet turbulence

    International Nuclear Information System (INIS)

    Patil, Sunil; Tafti, Danesh

    2012-01-01

    Highlights: large eddy simulation; wall-layer modeling; synthetic inlet turbulence; swirl flows. Abstract: Large eddy simulations of complex high Reynolds number flows are carried out with the near-wall region being modeled with a zonal two-layer model. A novel formulation for solving the turbulent boundary layer equation for the effective tangential velocity in a generalized co-ordinate system is presented and applied in the near-wall zonal treatment. This formulation reduces the computational time in the inner layer significantly compared to the conventional two-layer formulations present in the literature and is most suitable for complex geometries involving body-fitted structured and unstructured meshes. The cost effectiveness and accuracy of the proposed wall model, used with the synthetic eddy method (SEM) to generate inlet turbulence, is investigated in turbulent channel flow, flow over a backward-facing step, and confined swirling flows at moderately high Reynolds numbers. Predictions are compared with available DNS, experimental LDV data, as well as wall-resolved LES. In all cases, there is at least an order of magnitude reduction in computational cost with no significant loss in prediction accuracy.

  4. Neural networks supporting audiovisual integration for speech: A large-scale lesion study.

    Science.gov (United States)

    Hickok, Gregory; Rogalsky, Corianne; Matchin, William; Basilakos, Alexandra; Cai, Julia; Pillay, Sara; Ferrill, Michelle; Mickelsen, Soren; Anderson, Steven W; Love, Tracy; Binder, Jeffrey; Fridriksson, Julius

    2018-06-01

    Auditory and visual speech information are often strongly integrated, resulting in perceptual enhancements for audiovisual (AV) speech over audio alone and sometimes yielding compelling illusory fusion percepts when AV cues are mismatched, the McGurk-MacDonald effect. Previous research has identified three candidate regions thought to be critical for AV speech integration: the posterior superior temporal sulcus (STS), early auditory cortex, and the posterior inferior frontal gyrus. We assess the causal involvement of these regions (and others) in the first large-scale (N = 100) lesion-based study of AV speech integration. Two primary findings emerged. First, behavioral performance and lesion maps for AV enhancement and illusory fusion measures indicate that classic metrics of AV speech integration are not necessarily measuring the same process. Second, lesions involving superior temporal auditory, lateral occipital visual, and multisensory zones in the STS are the most disruptive to AV speech integration. Further, when AV speech integration fails, the nature of the failure (auditory vs visual capture) can be predicted from the location of the lesions. These findings show that AV speech processing is supported by unimodal auditory and visual cortices as well as multimodal regions such as the STS at their boundary. Motor-related frontal regions do not appear to play a role in AV speech integration.

  5. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

    In this work we introduce hypersingular integral equations and analyze a realistic model of gravitational waveguides on a Cantorian space-time. A waveguiding effect is considered with respect to the large-scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε^(∞) Cantorian space-time, where gravitational lensing and waveguiding effects can explain the apparent Universe. In particular, we consider filamentary and planar large-scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. In this vision, supported by three numerical simulations, the Universe appears like a large set of self-similar adaptive mirrors. Consequently, an infinite Universe is just an optical illusion produced by mirroring effects connected with the large-scale structure of a finite and not so large Universe.

  6. Fast and accurate detection of spread source in large complex networks.

    Science.gov (United States)

    Paluch, Robert; Lu, Xiaoyan; Suchecki, Krzysztof; Szymański, Bolesław K; Hołyst, Janusz A

    2018-02-06

    Spread over complex networks is a ubiquitous process with increasingly wide applications. Locating spread sources is often important, e.g. finding patient zero in an epidemic or the source of a rumor spreading in a social network. Pinto, Thiran and Vetterli introduced an algorithm (PTVA) to solve the important case of this problem in which a limited set of nodes act as observers and report the times at which the spread reached them. PTVA uses all observers to find a solution. Here we propose a new approach in which observers with low-quality information (i.e. with large spread encounter times) are ignored and potential sources are selected based on the likelihood gradient from high-quality observers. The original complexity of PTVA is O(N^α), where α ∈ (3,4) depends on the network topology and the number of observers (N denotes the number of nodes in the network). Our Gradient Maximum Likelihood Algorithm (GMLA) reduces this complexity to O(N² log N). Extensive numerical tests performed on synthetic networks and the real Gnutella network, with the limitation that the identities of spreaders are unknown to observers, demonstrate that for scale-free networks with such a limitation GMLA yields higher-quality localization results than PTVA does.
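A simplified sketch of the observer-selection idea: keep only the k₀ observers with the smallest reported arrival times (the "high-quality" ones) and score candidate sources by how well graph distances explain the reported delays. The squared-error score below is a stand-in for the full Gaussian likelihood of PTVA/GMLA, and the function names, unit edge delays, and toy graph are all hypothetical.

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distance from src to every reachable node (unit edge delays assumed)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def locate_source(adj, reports, k0=3):
    """Simplified GMLA-style localization: use only the k0 observers with
    the smallest spread-arrival times, then score each candidate node by
    the squared error between reported delays and graph-distance delays
    (a stand-in for the Gaussian likelihood used by PTVA/GMLA)."""
    best = sorted(reports, key=reports.get)[:k0]   # high-quality observers
    ref = best[0]
    def score(cand):
        d = bfs_dist(adj, cand)
        return sum((reports[o] - reports[ref] - (d[o] - d[ref])) ** 2
                   for o in best)
    return min(adj, key=score)

# Path graph 0-1-2-3-4; the spread actually started at node 1
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
reports = {0: 1.0, 2: 1.0, 4: 3.0}   # observer -> arrival time
print(locate_source(adj, reports))
```

Restricting the likelihood evaluation to the k₀ best observers is what drops the cost from PTVA's O(N^α) toward the O(N² log N) the abstract reports for GMLA.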

  7. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control systems are widely used in high-precision, complex industrial control equipment and large-tonnage loading devices. Such a system is often used to analyze the distribution of stress and displacement under a complex bearing load or in the complex mechanical structure itself. In the ITER GS mock-up with 5 flexible plates, for each load combination it is necessary to detect and measure potential slippage between the central flexible plate and the neighboring spacers, as well as between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controllers and boards, a computer system, a 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, and 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods by which the EDC220 digital controller achieves synchronization control, and the R and D process of the multi-channel loading control and measurement software.

  8. Improved Peak Detection and Deconvolution of Native Electrospray Mass Spectra from Large Protein Complexes.

    Science.gov (United States)

    Lu, Jonathan; Trnka, Michael J; Roh, Soung-Hun; Robinson, Philip J J; Shiau, Carrie; Fujimori, Danica Galonic; Chiu, Wah; Burlingame, Alma L; Guan, Shenheng

    2015-12-01

    Native electrospray-ionization mass spectrometry (native MS) measures biomolecules under conditions that preserve most aspects of protein tertiary and quaternary structure, enabling direct characterization of large intact protein assemblies. However, native spectra derived from these assemblies are often partially obscured by low signal-to-noise as well as broad peak shapes because of residual solvation and adduction after the electrospray process. The wide peak widths together with the fact that sequential charge state series from highly charged ions are closely spaced means that native spectra containing multiple species often suffer from high degrees of peak overlap or else contain highly interleaved charge envelopes. This situation presents a challenge for peak detection, correct charge state and charge envelope assignment, and ultimately extraction of the relevant underlying mass values of the noncovalent assemblages being investigated. In this report, we describe a comprehensive algorithm developed for addressing peak detection, peak overlap, and charge state assignment in native mass spectra, called PeakSeeker. Overlapped peaks are detected by examination of the second derivative of the raw mass spectrum. Charge state distributions of the molecular species are determined by fitting linear combinations of charge envelopes to the overall experimental mass spectrum. This software is capable of deconvoluting heterogeneous, complex, and noisy native mass spectra of large protein assemblies as demonstrated by analysis of (1) synthetic mononucleosomes containing severely overlapping peaks, (2) an RNA polymerase II/α-amanitin complex with many closely interleaved ion signals, and (3) human TriC complex containing high levels of background noise.
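A minimal sketch of the second-derivative trick mentioned in the abstract: a shoulder of an overlapped peak that never forms a local maximum in the raw signal still produces its own local minimum in the second derivative. The threshold value and the toy two-Gaussian trace are illustrative assumptions, not PeakSeeker's actual parameters or preprocessing.

```python
import math

def second_derivative(y):
    """Discrete second derivative by central differences (unit spacing)."""
    return [y[i - 1] - 2 * y[i] + y[i + 1] for i in range(1, len(y) - 1)]

def find_peaks_2nd_deriv(y, threshold=-0.1):
    """Candidate peak apexes = local minima of the second derivative that
    fall below a (hypothetical) negative threshold; curvature is most
    negative at each underlying component's apex, even for shoulders."""
    d2 = second_derivative(y)
    peaks = []
    for i in range(1, len(d2) - 1):
        if d2[i] < threshold and d2[i] <= d2[i - 1] and d2[i] <= d2[i + 1]:
            peaks.append(i + 1)  # shift back into y's indexing
    return peaks

# Two overlapping Gaussian components centered at x = 8 and x = 12
y = [math.exp(-((x - 8) ** 2) / 4) + 0.8 * math.exp(-((x - 12) ** 2) / 4)
     for x in range(21)]
print(find_peaks_2nd_deriv(y))  # recovers both component positions: [8, 12]
```

Real spectra would first need smoothing (the second derivative amplifies noise), which is one reason a production tool fits full charge envelopes rather than relying on curvature alone.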

  9. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    Science.gov (United States)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES; the requirements capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. The requirements elicitation of most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during design and development. In an organization structure, cost and time overruns can occur at any level and iterate back and forth, thus increasing cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework that can be used in the design and development of LSCES. It can aid in understanding the system and in decision making, minimizing the value gap due to requirements creep by eliminating the ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These

  10. Combining eastern and western practices for safe and effective endoscopic resection of large complex colorectal lesions.

    Science.gov (United States)

    Emmanuel, Andrew; Gulati, Shraddha; Burt, Margaret; Hayee, Bu'Hussain; Haji, Amyn

    2018-05-01

    Endoscopic resection of large colorectal polyps is well established. However, significant differences in technique exist between eastern and western interventional endoscopists. We report the results of endoscopic resection of large complex colorectal lesions from a specialist unit that combines eastern and western techniques for assessment and resection. Endoscopic resections of colorectal lesions of at least 2 cm were included. Lesions were assessed using magnification chromoendoscopy supplemented by colonoscopic ultrasound in selected cases. A lesion-specific approach to resection with endoscopic mucosal resection or endoscopic submucosal dissection (ESD) was used. Surveillance endoscopy was performed at 3 (SC1) and 12 (SC2) months. Four hundred and sixty-six large (≥20 mm) colorectal lesions (mean size 54.8 mm) were resected. Three hundred and fifty-six were resected using endoscopic mucosal resection and 110 by ESD or hybrid ESD. Fifty-one percent of lesions had been subjected to previous failed attempts at resection or heavy manipulation (≥6 biopsies). Nevertheless, endoscopic resection was deemed successful after an initial attempt in 98%. Recurrence occurred in 15% and could be treated with endoscopic resection in most. Only two patients required surgery for perforation. Nine patients had postprocedure bleeding; only two required endoscopic clips. Ninety-six percent of patients without invasive cancer were free from recurrence and had avoided surgery at last follow-up. Combining eastern and western practices for assessment and resection results in safe and effective organ-conserving treatment of complex colorectal lesions. Accurate assessment before and after resection using magnification chromoendoscopy and a lesion-specific approach to resection, incorporating ESD where appropriate, are important factors in achieving these results.

  11. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  12. Complex Behavior in an Integrate-and-Fire Neuron Model Based on Small World Networks

    International Nuclear Information System (INIS)

    Lin Min; Chen Tianlun

    2005-01-01

    Based on our previously proposed pulse-coupled integrate-and-fire neuron model on small-world networks, we investigate the complex behavior of electroencephalographic (EEG)-like activities produced by such a model. We find that the EEG-like activities have obvious chaotic characteristics. We also analyze the complex behavior of the EEG-like signals using methods such as spectral analysis, phase-space reconstruction, and the correlation dimension.
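    The model equations are not given in the record; as an assumption-laden sketch only, a pulse-coupled integrate-and-fire network on a Watts-Strogatz-style small-world graph producing an EEG-like mean-field signal could look like the following (all parameter values and the mean-potential readout are invented for illustration):

```python
import random

def small_world(n, k, p, rng):
    """Ring lattice of n nodes, each linked to k nearest neighbours per side,
    with each edge rewired to a random target with probability p."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < p:
                b = rng.randrange(n)
                while b == a or (a, b) in edges:
                    b = rng.randrange(n)
            edges.add((a, b))
    return edges

def simulate(n=50, steps=200, threshold=1.0, leak=0.95, drive=0.06,
             coupling=0.12, seed=1):
    """Pulse-coupled leaky integrate-and-fire dynamics; returns the
    mean membrane potential per step as an EEG-like signal."""
    rng = random.Random(seed)
    neighbours = {i: [] for i in range(n)}
    for a, b in small_world(n, k=2, p=0.1, rng=rng):
        neighbours[a].append(b)
        neighbours[b].append(a)
    v = [rng.random() for _ in range(n)]   # membrane potentials
    activity = []
    for _ in range(steps):
        fired = [i for i in range(n) if v[i] >= threshold]
        for i in fired:
            v[i] = 0.0                     # reset after firing
            for j in neighbours[i]:
                v[j] += coupling           # pulse coupling to neighbours
        v = [x * leak + drive for x in v]  # leak plus constant drive
        activity.append(sum(v) / n)
    return activity

act = simulate()
```

A signal like `act` could then be fed to spectral or phase-space analyses of the kind the abstract mentions.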

  13. DISTILLER: a data integration framework to reveal condition dependency of complex regulons in Escherichia coli.

    Science.gov (United States)

    Lemmens, Karen; De Bie, Tijl; Dhollander, Thomas; De Keersmaecker, Sigrid C; Thijs, Inge M; Schoofs, Geert; De Weerdt, Ami; De Moor, Bart; Vanderleyden, Jos; Collado-Vides, Julio; Engelen, Kristof; Marchal, Kathleen

    2009-01-01

    We present DISTILLER, a data integration framework for the inference of transcriptional module networks. Experimental validation of predicted targets for the well-studied fumarate nitrate reductase regulator showed the effectiveness of our approach in Escherichia coli. In addition, the condition dependency and modularity of the inferred transcriptional network was studied. Surprisingly, the level of regulatory complexity seemed lower than that which would be expected from RegulonDB, indicating that complex regulatory programs tend to decrease the degree of modularity.

  14. AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users

    Science.gov (United States)

    Maiersperger, T.

    2017-12-01

    The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.

  15. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising the UK capabilities in application of the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. Installations of BIPV described include University buildings, commercial centres, and a sports stadium, wildlife park, church hall, and district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  16. “Strategies of Large Systems Integrators in Malaysia to Overcome Lower Margins”

    OpenAIRE

    Omardin, Daniel Yusoff

    2013-01-01

    The Malaysian Information Technology (IT) scene is no longer seen as a market for high returns, as it has reached its peak. There has been tremendous growth in IT-related companies over the past 10 years. This competition is now leading to a price war, making the industry less attractive and harder for the system integrators that have been doing business in this market for the past 20 years. This paper investigates the loss of margins of three large system integrators in Malaysia and i...

  17. A large deviation principle in Hölder norm for multiple fractional integrals

    OpenAIRE

    Sanz-Solé, Marta; Torrecilla-Tarantino, Iván

    2007-01-01

    For a fractional Brownian motion $B^H$ with Hurst parameter $H\in]{1/4},{1/2}[\cup]{1/2},1[$, multiple indefinite integrals on a simplex are constructed and the regularity of their sample paths is studied. Then, it is proved that the family of probability laws of the processes obtained by replacing $B^H$ by $\epsilon^{{1/2}} B^H$ satisfies a large deviation principle in Hölder norm. The definition of the multiple integrals relies upon a representation of the fractional Brownian motion in t...

  18. Evaluation model of project complexity for large-scale construction projects in Iran - A Fuzzy ANP approach

    Directory of Open Access Journals (Sweden)

    Aliyeh Kazemi

    2016-09-01

    Full Text Available Construction projects have always been complex. With the growing trend of this complexity, implementation of large-scale construction becomes harder. Hence, evaluating and understanding these complexities is critical. Correct evaluation of a project's complexity can provide executives and managers with a good resource to draw on. Fuzzy analytic network process (ANP) is a logical and systematic approach to defining, evaluating, and grading. This method allows for analyzing complex systems and determining their complexity. In this study, taking advantage of fuzzy ANP, effective indexes of complexity in large-scale construction projects in Iran have been determined and prioritized. The results show that the socio-political, project system interdependencies, and technological complexity indexes ranked as the top three. Furthermore, in a comparison of three major large-scale projects: a commercial-administrative building, a hospital, and a skyscraper, the hospital project was evaluated as the most complex. This model is beneficial for professionals in managing large-scale projects.
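    The study's fuzzy ANP model is not reproduced in the record; purely as an illustration, the crisp eigenvector prioritisation step that AHP/ANP methods build on can be sketched as follows (the pairwise judgement values and criterion names are invented, not taken from the paper):

```python
def priority_weights(M, iters=100):
    """Principal-eigenvector weights of a reciprocal pairwise comparison
    matrix, computed by power iteration and normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgements over three criteria, e.g. socio-political vs
# system interdependencies vs technological complexity.
M = [[1.0, 2.0, 3.0],
     [0.5, 1.0, 2.0],
     [1.0 / 3.0, 0.5, 1.0]]
w = priority_weights(M)
```

In the fuzzy variant, each crisp judgement would be replaced by a fuzzy number and defuzzified before (or during) this prioritisation step.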

  19. Reframing the challenges to integrated care: a complex-adaptive systems perspective

    Directory of Open Access Journals (Sweden)

    Peter Tsasis

    2012-09-01

    Full Text Available Introduction: Despite over two decades of international experience and research on health systems integration, integrated care has not developed widely. We hypothesized that part of the problem may lie in how we conceptualize the integration process and the complex systems within which integrated care is enacted. This study aims to contribute to discourse regarding the relevance and utility of a complex-adaptive systems (CAS) perspective on integrated care. Methods: In the Canadian province of Ontario, government mandated the development of fourteen Local Health Integration Networks in 2006. Against the backdrop of these efforts to integrate care, we collected focus group data from a diverse sample of healthcare professionals in the Greater Toronto Area using convenience and snowball sampling. A semi-structured interview guide was used to elicit participant views and experiences of health systems integration. We use a CAS framework to describe and analyze the data, and to assess the theoretical fit of a CAS perspective with the dominant themes in participant responses. Results: Our findings indicate that integration is challenged by system complexity, weak ties and poor alignment among professionals and organizations, a lack of funding incentives to support collaborative work, and a bureaucratic environment based on a command and control approach to management. Using a CAS framework, we identified several characteristics of CAS in our data, including diverse, interdependent and semi-autonomous actors; embedded co-evolutionary systems; emergent behaviours and non-linearity; and self-organizing capacity. Discussion and Conclusion: One possible explanation for the lack of systems change towards integration is that we have failed to treat the healthcare system as complex-adaptive. The data suggest that future integration initiatives must be anchored in a CAS perspective, and focus on building the system's capacity to self-organize. We conclude that

  1. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex...... sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study...

  2. Optimizing water resources management in large river basins with integrated surface water-groundwater modeling: A surrogate-based approach

    Science.gov (United States)

    Wu, Bin; Zheng, Yi; Wu, Xin; Tian, Yong; Han, Feng; Liu, Jie; Zheng, Chunmiao

    2015-04-01

    Integrated surface water-groundwater modeling can provide a comprehensive and coherent understanding on basin-scale water cycle, but its high computational cost has impeded its application in real-world management. This study developed a new surrogate-based approach, SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), to incorporate the integrated modeling into water management optimization. Its applicability and advantages were evaluated and validated through an optimization research on the conjunctive use of surface water (SW) and groundwater (GW) for irrigation in a semiarid region in northwest China. GSFLOW, an integrated SW-GW model developed by USGS, was employed. The study results show that, due to the strong and complicated SW-GW interactions, basin-scale water saving could be achieved by spatially optimizing the ratios of groundwater use in different irrigation districts. The water-saving potential essentially stems from the reduction of nonbeneficial evapotranspiration from the aqueduct system and shallow groundwater, and its magnitude largely depends on both water management schemes and hydrological conditions. Important implications for water resources management in general include: first, environmental flow regulation needs to take into account interannual variation of hydrological conditions, as well as spatial complexity of SW-GW interactions; and second, to resolve water use conflicts between upper stream and lower stream, a system approach is highly desired to reflect ecological, economic, and social concerns in water management decisions. Overall, this study highlights that surrogate-based approaches like SOIM represent a promising solution to filling the gap between complex environmental modeling and real-world management decision-making.
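    SOIM and GSFLOW themselves are not shown in the record; as a hedged illustration only, the general surrogate-based optimization pattern (fit a cheap surrogate to a few expensive model runs, then search the surrogate instead of the full model) might be sketched like this, with an invented one-dimensional objective standing in for the integrated SW-GW model:

```python
def expensive_model(x):
    """Stand-in for a costly integrated SW-GW simulation run:
    hypothetical non-beneficial water loss vs. groundwater-use ratio x."""
    return (x - 0.37) ** 2 + 0.05

def quadratic_vertex(p0, p1, p2):
    """Vertex of the quadratic through three (x, y) points,
    via Newton divided differences."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    s0 = (y1 - y0) / (x1 - x0)
    s1 = (y2 - y1) / (x2 - x1)
    a = (s1 - s0) / (x2 - x0)
    return (x0 + x1) / 2.0 - s0 / (2.0 * a)

def surrogate_optimize(f, lo, hi, rounds=8):
    """Repeatedly fit a quadratic surrogate to the three best samples
    and evaluate the full model only at the surrogate's minimizer."""
    pts = [(x, f(x)) for x in (lo, (lo + hi) / 2.0, hi)]
    for _ in range(rounds):
        best3 = sorted(sorted(pts, key=lambda p: p[1])[:3])
        x_new = min(max(quadratic_vertex(*best3), lo), hi)
        if min(abs(x_new - x) for x, _ in pts) < 1e-9:
            break                      # surrogate proposal has converged
        pts.append((x_new, f(x_new)))
    return min(pts, key=lambda p: p[1])[0]

best_ratio = surrogate_optimize(expensive_model, 0.0, 1.0)
```

Real surrogate frameworks replace the quadratic with statistical emulators (Gaussian processes, polynomial chaos, etc.) and work in many dimensions, but the evaluate-fit-propose loop is the same.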

  3. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M

  4. The clinically-integrated randomized trial: proposed novel method for conducting large trials at low cost

    Directory of Open Access Journals (Sweden)

    Scardino Peter T

    2009-03-01

    Full Text Available Abstract Introduction Randomized controlled trials provide the best method of determining which of two comparable treatments is preferable. Unfortunately, contemporary randomized trials have become increasingly expensive, complex and burdened by regulation, so much so that many trials are of doubtful feasibility. Discussion Here we present a proposal for a novel, streamlined approach to randomized trials: the "clinically-integrated randomized trial". The key aspect of our methodology is that the clinical experience of the patient and doctor is virtually indistinguishable whether or not the patient is randomized, primarily because outcome data are obtained from routine clinical data, or from short, web-based questionnaires. Integration of a randomized trial into routine clinical practice also implies that there should be an attempt to randomize every patient, a corollary of which is that eligibility criteria are minimized. The similar clinical experience of patients on- and off-study also entails that the marginal cost of putting an additional patient on trial is negligible. We propose examples of how the clinically-integrated randomized trial might be applied in four distinct areas of medicine: comparisons of surgical techniques, "me too" drugs, rare diseases and lifestyle interventions. Barriers to implementing clinically-integrated randomized trials are discussed. Conclusion The proposed clinically-integrated randomized trial may allow us to enlarge dramatically the number of clinical questions that can be addressed by randomization.

  5. An Exploratory Study into Perceived Task Complexity, Topic Specificity and Usefulness for Integrated Search

    DEFF Research Database (Denmark)

    Ingwersen, Peter; Lioma, Christina; Larsen, Birger

    2012-01-01

    We investigate the relations between user perceptions of work task complexity, topic specificity, and usefulness of retrieved results. 23 academic researchers submitted detailed descriptions of 65 real-life work tasks in the physics domain, and assessed documents retrieved from an integrated...... collection consisting of full text research articles in PDF, abstracts, and bibliographic records [6]. Bibliographic records were found to be more precise than full text PDFs, regardless of task complexity and topic specificity. PDFs were found to be more useful. Overall, for higher task complexity and topic...

  6. Measurements of complex impedance in microwave high power systems with a new Bluetooth integrated circuit.

    Science.gov (United States)

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for Bluetooth applications, it is possible to simplify the method of measuring the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable-level input signals and is less sensitive to both amplitude and frequency fluctuations of industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low-stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented.
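    As an illustrative sketch only (not the authors' calibration procedure), the standard conversion from a gain/phase detector's two readings (magnitude ratio in dB, phase difference in degrees) to a complex coefficient is:

```python
import cmath
import math

def complex_from_gain_phase(gain_db, phase_deg):
    """Convert a magnitude ratio in dB and a phase difference in degrees
    into a complex coefficient. Note: the AD8302 reports only |phase| in
    the 0-180 degree range, so the phase sign must be resolved by the
    measurement setup itself."""
    magnitude = 10.0 ** (gain_db / 20.0)
    return cmath.rect(magnitude, math.radians(phase_deg))

# Example: a ratio of about -6.02 dB at +90 degrees is roughly 0.5j.
gamma = complex_from_gain_phase(-6.0206, 90.0)
```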

  7. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  8. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electric driven HP (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers....... The analysis was based on using the energy model Balmorel to determine the optimum dispatch of HPs in the system. The potential heat sources in Copenhagen for use in HPs were determined based on data related to temperatures, flows, and hydrography at different locations, while respecting technical constraints...

  9. Complex dewetting scenarios of ultrathin silicon films for large-scale nanoarchitectures.

    Science.gov (United States)

    Naffouti, Meher; Backofen, Rainer; Salvalaglio, Marco; Bottein, Thomas; Lodari, Mario; Voigt, Axel; David, Thomas; Benkouider, Abdelmalek; Fraj, Ibtissem; Favre, Luc; Ronda, Antoine; Berbezier, Isabelle; Grosso, David; Abbarchi, Marco; Bollani, Monica

    2017-11-01

    Dewetting is a ubiquitous phenomenon in nature; many different thin films of organic and inorganic substances (such as liquids, polymers, metals, and semiconductors) share this shape instability driven by surface tension and mass transport. Via templated solid-state dewetting, we frame complex nanoarchitectures of monocrystalline silicon on insulator with unprecedented precision and reproducibility over large scales. Phase-field simulations reveal the dominant role of surface diffusion as a driving force for dewetting and provide a predictive tool to further engineer this hybrid top-down/bottom-up self-assembly method. Our results demonstrate that patches of thin monocrystalline films of metals and semiconductors share the same dewetting dynamics. We also prove the potential of our method by fabricating nanotransfer molding of metal oxide xerogels on silicon and glass substrates. This method allows the novel possibility of transferring these Si-based patterns on different materials, which do not usually undergo dewetting, offering great potential also for microfluidic or sensing applications.

  10. Task Phase Recognition for Highly Mobile Workers in Large Building Complexes

    DEFF Research Database (Denmark)

    Stisen, Allan; Mathisen, Andreas; Krogh, Søren

    2016-01-01

    ...-scale indoor work environments, namely from a WiFi infrastructure providing coarse-grained indoor positioning, from inertial sensors in the workers' mobile phones, and from a task management system yielding information about the scheduled tasks' start and end locations. The methods presented have low requirements on the accuracy of the indoor positioning, and thus come with low deployment and maintenance effort in real-world settings. We evaluated the proposed methods in a large hospital complex, where the highly mobile workers were recruited among the non-clinical workforce. The evaluation is based on manually labelled real-world data collected over 4 days of regular work life of the mobile workforce. The collected data yields 83 tasks in total, involving 8 different orderlies from a major university hospital with a building area of 160,000 m2. The results show that the proposed methods can distinguish...

  11. The age calibration of integrated ultraviolet colors and young stellar clusters in the Large Magellanic Cloud

    International Nuclear Information System (INIS)

    Barbero, J.; Brocato, E.; Cassatella, A.; Castellani, V.; Geyer, E.H.

    1990-01-01

    Integrated colors in selected far-UV bands are presented for a large sample of Large Magellanic Cloud (LMC) clusters. Theoretical calculations of these integrated colors are derived and discussed. The location in the two-color diagram C(18-28), C(15-31) is expected to be a sensitive but smooth function of cluster age for ages in the range 5 to 800 million yr. Theoretical results appear in very good agreement with the observed colors of LMC clusters. From this comparison, the gap in the observed colors is suggested to be caused by the lack of LMC clusters in the range of ages between 200 million to one billion yr. The two-color location of old globulars is discussed, also in connection with available data for the M31 clusters. 36 refs
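    The integrated colors such as C(15-31) and C(18-28) are standard magnitude-based colour indices between band fluxes; a minimal sketch of that definition follows (the flux values are placeholders, and the band wavelengths follow the paper's notation only by assumption):

```python
import math

def color_index(flux_a, flux_b):
    """Standard magnitude-based colour index between two bands:
    C = -2.5 * log10(F_a / F_b). More flux in band a gives a more
    negative (bluer) index."""
    return -2.5 * math.log10(flux_a / flux_b)

# Equal fluxes give a zero colour; a 10x flux excess gives -2.5 mag.
c_equal = color_index(1.0, 1.0)
c_blue = color_index(10.0, 1.0)
```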

  12. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large scale components exist widely in advance manufacturing industry,3D profilometry plays a pivotal role for the quality control. This paper proposes a flexible, robust large-scale 3D scanning system by integrating a robot with a binocular structured light scanner and a laser tracker. The measurement principle and system construction of the integrated system are introduced. And a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism is introduced for the establishment of the end coordination system. Based on this method, a virtual robot noumenon is constructed for hand-eye calibration. And then the transformation matrix between end coordination system and world coordination system is solved. Validation experiment is implemented for verifying the proposed algorithms. Firstly, hand-eye transformation matrix is solved. Then a car body rear is measured for 16 times for the global data fusion algorithm verification. And the 3D shape of the rear is reconstructed successfully.

  13. Exploring the dynamic and complex integration of sustainability performance measurement into product development

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Morioka, S.; Pigosso, Daniela Cristina Antelmi

    2016-01-01

    In order to deal with the complex and dynamic nature of sustainability integration into the product development process, this research explore the use of a qualitative System Dynamics approach by using the causal loop diagram (CLD) tool. A literature analysis was followed by a case study, aiming ...

  14. Investigation of the complexity of streamflow fluctuations in a large heterogeneous lake catchment in China

    Science.gov (United States)

    Ye, Xuchun; Xu, Chong-Yu; Li, Xianghu; Zhang, Qi

    2018-05-01

    The occurrence of flood and drought frequency is highly correlated with the temporal fluctuations of streamflow series; understanding these fluctuations is essential for improved modeling and statistical prediction of extreme changes in river basins. In this study, the complexity of daily streamflow fluctuations was investigated using multifractal detrended fluctuation analysis (MF-DFA) in a large heterogeneous lake basin, the Poyang Lake basin in China, and the potential impacts of human activities were also explored. Major results indicate that the multifractality of streamflow fluctuations shows significant regional characteristics. In the study catchment, all the daily streamflow series present strong long-range correlation, with Hurst exponents greater than 0.8. The q-order Hurst exponent h(q) of all the hydrostations can be characterized well by only two parameters: a (0.354 ≤ a ≤ 0.384) and b (0.627 ≤ b ≤ 0.677), with no pronounced differences. Singularity spectrum analysis pointed out that small fluctuations play a dominant role in all daily streamflow series. Our research also revealed that both the correlation properties and the broad probability density function (PDF) of hydrological series can be responsible for the multifractality of streamflow series, which depends on watershed area. In addition, we emphasized the relationship between watershed area and the estimated multifractal parameters, such as the Hurst exponent and the fitted parameters a and b from the q-order Hurst exponent h(q). However, the relationship between the width of the singularity spectrum (Δα) and watershed area is not clear. Further investigation revealed that increasing forest coverage and reservoir storage can effectively enhance the persistence of daily streamflow, decrease the hydrological complexity of large fluctuations, and increase the small fluctuations.
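    At q = 2, MF-DFA reduces to ordinary detrended fluctuation analysis; a self-contained DFA-1 sketch for estimating the Hurst exponent of a series might look like the following (the scales and series length are illustrative, not those of the study):

```python
import math
import random

def dfa_hurst(series, scales=(8, 16, 32, 64)):
    """First-order detrended fluctuation analysis: the slope of
    log F(s) versus log s estimates the Hurst exponent. DFA-1 is the
    q = 2 special case of MF-DFA."""
    n = len(series)
    mean = sum(series) / n
    profile, acc = [], 0.0
    for x in series:            # cumulative sum of the demeaned series
        acc += x - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        sq_resid = []
        for start in range(0, n - s + 1, s):   # non-overlapping windows
            seg = profile[start:start + s]
            # least-squares linear detrend within the window
            mx = (s - 1) / 2.0
            my = sum(seg) / s
            cov = sum((i - mx) * (y - my) for i, y in enumerate(seg))
            var = sum((i - mx) ** 2 for i in range(s))
            b = cov / var
            a = my - b * mx
            sq_resid.append(sum((y - (a + b * i)) ** 2
                                for i, y in enumerate(seg)) / s)
        log_s.append(math.log(s))
        log_f.append(math.log(math.sqrt(sum(sq_resid) / len(sq_resid))))
    # slope of the log-log fluctuation function
    ms = sum(log_s) / len(log_s)
    mf = sum(log_f) / len(log_f)
    num = sum((u - ms) * (v - mf) for u, v in zip(log_s, log_f))
    den = sum((u - ms) ** 2 for u in log_s)
    return num / den

# Usage: Gaussian white noise should give an exponent near 0.5,
# while persistent series like the study's streamflows give values near 1.
rng = random.Random(0)
h = dfa_hurst([rng.gauss(0.0, 1.0) for _ in range(4096)])
```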

  15. A large-eddy simulation based power estimation capability for wind farms over complex terrain

    Science.gov (United States)

    Senocak, I.; Sandusky, M.; Deleon, R.

    2017-12-01

    There has been an increasing interest in predicting wind fields over complex terrain at the micro-scale for resource assessment, turbine siting, and power forecasting. These capabilities are made possible by advancements in computational speed from a new generation of computing hardware, numerical methods and physics modelling. The micro-scale wind prediction model presented in this work is based on the large-eddy simulation paradigm with surface-stress parameterization. The complex terrain is represented using an immersed-boundary method that takes into account the parameterization of the surface stresses. Governing equations of incompressible fluid flow are solved using a projection method with second-order accurate schemes in space and time. We use actuator disk models with rotation to simulate the influence of turbines on the wind field. Data regarding power production from individual turbines are mostly restricted because of the proprietary nature of the wind energy business. Most studies report the percentage drop of power relative to power from the first row. There have been different approaches to predicting power production: some studies simply report the available upstream wind power, some estimate power production using power curves available from turbine manufacturers, and some estimate power as torque multiplied by rotational speed. In the present work, we propose a black-box approach that considers a control volume around a turbine and estimates the power extracted from the turbine based on the conservation of energy principle. We applied our wind power prediction capability to wind farms over flat terrain, such as the wind farm in Mower County, Minnesota, and the Horns Rev offshore wind farm in Denmark. The results from these simulations are in good agreement with published data. We also estimate power production from a hypothetical wind farm in a complex terrain region and identify potential zones suitable for wind power production.
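
The control-volume, conservation-of-energy idea can be illustrated with textbook 1-D actuator-disk (momentum) theory under standard assumptions of uniform inflow and an ideal disk; this is a didactic sketch, not the authors' LES formulation:

```python
import math

def actuator_disk_power(u_inf, rho, area, a):
    """Power extracted by an ideal actuator disk with axial induction
    factor a, from an energy balance over a control volume:
    P = 0.5 * m_dot * (u_inf^2 - u_wake^2)."""
    u_disk = u_inf * (1.0 - a)        # velocity at the disk
    u_wake = u_inf * (1.0 - 2.0 * a)  # far-wake velocity
    m_dot = rho * area * u_disk       # mass flow through the disk
    return 0.5 * m_dot * (u_inf ** 2 - u_wake ** 2)

rho, diameter = 1.225, 80.0           # illustrative air density and rotor size
area = math.pi * (diameter / 2.0) ** 2
p = actuator_disk_power(8.0, rho, area, 1.0 / 3.0)
cp = p / (0.5 * rho * area * 8.0 ** 3)
print(round(cp, 3))  # a = 1/3 recovers the Betz limit, Cp = 16/27 ≈ 0.593
```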

  16. Characterisation of large catastrophic landslides using an integrated field, remote sensing and numerical modelling approach

    OpenAIRE

    Wolter, Andrea Elaine

    2014-01-01

    I apply a forensic, multidisciplinary approach that integrates engineering geology field investigations, engineering geomorphology mapping, long-range terrestrial photogrammetry, and a numerical modelling toolbox to two large rock slope failures to study their causes, initiation, kinematics, and dynamics. I demonstrate the significance of endogenic and exogenic processes, both separately and in concert, in contributing to landscape evolution and conditioning slopes for failure, and use geomor...

  17. Test methods of total dose effects in very large scale integrated circuits

    International Nuclear Information System (INIS)

    He Chaohui; Geng Bin; He Baoping; Yao Yujuan; Li Yonghong; Peng Honglun; Lin Dongsheng; Zhou Hui; Chen Yusheng

    2004-01-01

    A test method for total dose effects (TDE) in very large scale integrated circuits (VLSI) is presented. The consumption current of the devices is measured while the functional parameters of the devices (or circuits) are measured. The relation between data errors and consumption current can then be analyzed, and the mechanism of TDE in VLSI proposed. Experimental results of 60Co γ TDEs are given for SRAMs, EEPROMs, FLASH ROMs and one type of CPU.

  18. Benchmarking of London Dispersion-Accounting Density Functional Theory Methods on Very Large Molecular Complexes.

    Science.gov (United States)

    Risthaus, Tobias; Grimme, Stefan

    2013-03-12

    A new test set (S12L) containing 12 supramolecular noncovalently bound complexes is presented and used to evaluate seven different methods to account for dispersion in DFT (DFT-D3, DFT-D2, DFT-NL, XDM, dDsC, TS-vdW, M06-L) at different basis set levels against experimental, back-corrected reference energies. This allows conclusions about the performance of each method in an explorative research setting on "real-life" problems. Most DFT methods show satisfactory performance but, due to the large size of the complexes, almost always require an explicit correction for the nonadditive Axilrod-Teller-Muto three-body dispersion interaction to get accurate results. The necessity of using a method capable of accounting for dispersion is clearly demonstrated in that the two-body dispersion contributions are on the order of 20-150% of the total interaction energy. MP2 and some variants thereof are shown to be insufficient for this, while a few tested D3-corrected semiempirical MO methods perform reasonably well. Overall, we suggest the use of this benchmark set as a "sanity check" against overfitting to too-small molecular cases.
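
For orientation, the two-body dispersion term that such corrections add has the generic damped pairwise form E ≈ −s6 Σ C6,ij f(r)/r⁶. A schematic D2-style sum with illustrative (not tabulated) coefficients, not the parameterization of any of the benchmarked methods:

```python
import math

def d2_dispersion(coords, c6, r_vdw, s6=1.0, d=20.0):
    """Schematic Grimme-D2-style two-body dispersion energy:
    E = -s6 * sum_{i<j} C6_ij * f_damp(r_ij) / r_ij^6,
    with a Fermi-type damping function and geometric-mean combining rule."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r = math.dist(coords[i], coords[j])
            c6_ij = math.sqrt(c6[i] * c6[j])           # combining rule
            f = 1.0 / (1.0 + math.exp(-d * (r / (r_vdw[i] + r_vdw[j]) - 1.0)))
            e -= s6 * c6_ij * f / r ** 6
    return e

# two atoms with illustrative C6 coefficients and vdW radii (atomic units)
e = d2_dispersion([(0, 0, 0), (0, 0, 6.0)], c6=[30.0, 30.0], r_vdw=[2.7, 2.7])
print(e < 0)  # dispersion is attractive
```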

  19. Integrated optimization on aerodynamics-structure coupling and flight stability of a large airplane in preliminary design

    Directory of Open Access Journals (Sweden)

    Xiaozhe WANG

    2018-06-01

    The preliminary phase is significant during the whole design process of a large airplane because of its enormous potential in enhancing the overall performance. However, classical sequential designs can hardly adapt to modern airplanes, due to their repeated iterations, long periods, and massive computational burdens. Multidisciplinary analysis and optimization demonstrates the capability to tackle such complex design issues. In this paper, an integrated optimization method for the preliminary design of a large airplane is proposed, accounting for aerodynamics, structure, and stability. Aeroelastic responses are computed by a rapid three-dimensional flight load analysis method combining the high-order panel method and the structural elasticity correction. The flow field is determined by the viscous/inviscid iteration method, and the cruise stability is evaluated by the linear small-disturbance theory. Parametric optimization is carried out using a genetic algorithm to seek the minimal weight of a simplified plate-beam wing structure in the cruise trim condition subject to aeroelastic, aerodynamic, and stability constraints, and the optimal wing geometry shape, front/rear spar positions, and structural sizes are obtained simultaneously. To reduce the computational burden of the static aeroelasticity analysis in the optimization process, the Kriging method is employed to predict aerodynamic influence coefficient matrices of different aerodynamic shapes. The multidisciplinary analyses guarantee computational accuracy and efficiency, and the integrated optimization sufficiently considers the coupling effect between different disciplines to improve the overall performance, avoiding the limitations of the currently utilized sequential approaches. Keywords: Aeroelasticity, Integrated optimization, Multidisciplinary analysis, Large airplane, Preliminary design

  20. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to deliver wind power and to improve wind power availability and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Regarding the SSO problem caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and aims to summarize the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.

  1. Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS

    CERN Document Server

    Froidevaux, D

    2011-01-01

    Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS, part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B2: Detectors for Particles and Radiation. Part 2: Systems and Applications'. This document is part of Part 2 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Chapter '5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS' with the content: 5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS 5.1 Introduction 5.1.1 The context 5.1.2 The main initial physics goals of ATLAS and CMS at the LHC 5.1.3 A snapshot of the current status of the ATLAS and CMS experiments 5.2 Overall detector concept and magnet systems 5.2.1 Overall detector concept 5.2.2 Magnet systems 5.2.2.1 Rad...

  2. How to Commission, Operate and Maintain a Large Future Accelerator Complex From Far Remote Sites

    International Nuclear Information System (INIS)

    Phinney, Nan

    2001-01-01

    A study on future large accelerators [1] has considered a facility, which is designed, built and operated by a worldwide collaboration of equal partner institutions, and which is remote from most of these institutions. The full range of operation was considered including commissioning, machine development, maintenance, troubleshooting and repair. Experience from existing accelerators confirms that most of these activities are already performed remotely. The large high-energy physics experiments and astronomy projects already involve international collaborations of distant institutions. Based on this experience, the prospects for a machine operated remotely from far sites are encouraging. Experts from each laboratory would remain at their home institution but continue to participate in the operation of the machine after construction. Experts are required to be on site only during initial commissioning and for particularly difficult problems. Repairs require an on-site non-expert maintenance crew. Most of the interventions can be made without an expert, and many of the rest resolved with remote assistance. There appears to be no technical obstacle to controlling an accelerator from a distance. The major challenge is to solve the complex management and communication problems.

  3. The Detection of Hot Cores and Complex Organic Molecules in the Large Magellanic Cloud

    Science.gov (United States)

    Sewiło, Marta; Indebetouw, Remy; Charnley, Steven B.; Zahorecz, Sarolta; Oliveira, Joana M.; van Loon, Jacco Th.; Ward, Jacob L.; Chen, C.-H. Rosie; Wiseman, Jennifer; Fukui, Yasuo; Kawamura, Akiko; Meixner, Margaret; Onishi, Toshikazu; Schilke, Peter

    2018-02-01

    We report the first extragalactic detection of the complex organic molecules (COMs) dimethyl ether (CH3OCH3) and methyl formate (CH3OCHO) with the Atacama Large Millimeter/submillimeter Array (ALMA). These COMs, together with their parent species methanol (CH3OH), were detected toward two 1.3 mm continuum sources in the N 113 star-forming region in the low-metallicity Large Magellanic Cloud (LMC). Rotational temperatures (T_rot ∼ 130 K) and total column densities (N_rot ∼ 10^16 cm^-2) have been calculated for each source based on multiple transitions of CH3OH. We present the ALMA molecular emission maps for COMs and measured abundances for all detected species. The physical and chemical properties of two sources with COMs detection, and the association with H2O and OH maser emission, indicate that they are hot cores. The fractional abundances of COMs scaled by a factor of 2.5 to account for the lower metallicity in the LMC are comparable to those found at the lower end of the range in Galactic hot cores. Our results have important implications for studies of organic chemistry at higher redshift.

  4. How to Commission, Operate and Maintain a Large Future Accelerator Complex From Far Remote Sites

    Energy Technology Data Exchange (ETDEWEB)

    Phinney, Nan

    2001-12-07

    A study on future large accelerators [1] has considered a facility, which is designed, built and operated by a worldwide collaboration of equal partner institutions, and which is remote from most of these institutions. The full range of operation was considered including commissioning, machine development, maintenance, troubleshooting and repair. Experience from existing accelerators confirms that most of these activities are already performed 'remotely'. The large high-energy physics experiments and astronomy projects, already involve international collaborations of distant institutions. Based on this experience, the prospects for a machine operated remotely from far sites are encouraging. Experts from each laboratory would remain at their home institution but continue to participate in the operation of the machine after construction. Experts are required to be on site only during initial commissioning and for particularly difficult problems. Repairs require an on-site non-expert maintenance crew. Most of the interventions can be made without an expert and many of the rest resolved with remote assistance. There appears to be no technical obstacle to controlling an accelerator from a distance. The major challenge is to solve the complex management and communication problems.

  5. THE GOULD's BELT VERY LARGE ARRAY SURVEY. I. THE OPHIUCHUS COMPLEX

    International Nuclear Information System (INIS)

    Dzib, Sergio A.; Loinard, Laurent; Rodríguez, Luis F.; Ortiz-León, Gisela N.; Pech, Gerardo; Rivera, Juana L.; Mioduszewski, Amy J.; Torres, Rosa M.; Boden, Andrew F.; Hartmann, Lee; Evans, Neal J. II; Briceño, Cesar; Tobin, John

    2013-01-01

    We present large-scale (∼2000 arcmin²), deep (∼20 μJy), high-resolution (∼1'') radio observations of the Ophiuchus star-forming complex obtained with the Karl G. Jansky Very Large Array at λ = 4 and 6 cm. In total, 189 sources were detected, 56 of them associated with known young stellar sources, and 4 with known extragalactic objects; the other 129 remain unclassified, but most of them are most probably background quasars. The vast majority of the young stars detected at radio wavelengths have spectral types K or M, although we also detect four objects of A/F/B types and two brown dwarf candidates. At least half of these young stars are non-thermal (gyrosynchrotron) sources, with active coronas characterized by high levels of variability, negative spectral indices, and (in some cases) significant circular polarization. As expected, there is a clear tendency for the fraction of non-thermal sources to increase from the younger (Class 0/I or flat spectrum) to the more evolved (Class III or weak line T Tauri) stars. The young stars detected both in X-rays and at radio wavelengths broadly follow a Güdel-Benz relation, but with a different normalization than the most radio-active types of stars. Finally, we detect a ∼70 mJy compact extragalactic source near the center of the Ophiuchus core, which should be used as a gain calibrator for any future radio observations of this region.

  6. THE GOULD's BELT VERY LARGE ARRAY SURVEY. I. THE OPHIUCHUS COMPLEX

    Energy Technology Data Exchange (ETDEWEB)

    Dzib, Sergio A.; Loinard, Laurent; Rodríguez, Luis F.; Ortiz-León, Gisela N.; Pech, Gerardo; Rivera, Juana L. [Centro de Radioastronomía y Astrofísica, Universidad Nacional Autónoma de México Apartado Postal 3-72, 58090 Morelia, Michoacán (Mexico); Mioduszewski, Amy J. [National Radio Astronomy Observatory, Domenici Science Operations Center, 1003 Lopezville Road, Socorro, NM 87801 (United States); Torres, Rosa M. [Paul Harris 9065, Las Condes, Santiago (Chile); Boden, Andrew F. [Division of Physics, Math, and Astronomy, California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Hartmann, Lee [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48105 (United States); Evans, Neal J. II [Department of Astronomy, The University of Texas at Austin, 1 University Station, C1400, Austin, TX 78712 (United States); Briceño, Cesar [Centro de Investigaciones de Astronomía, Mérida 5101-A (Venezuela, Bolivarian Republic of); Tobin, John, E-mail: s.dzib@crya.unam.mx [National Radio Astronomy Observatory, Charlottesville, VA 22903 (United States)

    2013-09-20

    We present large-scale (∼2000 arcmin²), deep (∼20 μJy), high-resolution (∼1'') radio observations of the Ophiuchus star-forming complex obtained with the Karl G. Jansky Very Large Array at λ = 4 and 6 cm. In total, 189 sources were detected, 56 of them associated with known young stellar sources, and 4 with known extragalactic objects; the other 129 remain unclassified, but most of them are most probably background quasars. The vast majority of the young stars detected at radio wavelengths have spectral types K or M, although we also detect four objects of A/F/B types and two brown dwarf candidates. At least half of these young stars are non-thermal (gyrosynchrotron) sources, with active coronas characterized by high levels of variability, negative spectral indices, and (in some cases) significant circular polarization. As expected, there is a clear tendency for the fraction of non-thermal sources to increase from the younger (Class 0/I or flat spectrum) to the more evolved (Class III or weak line T Tauri) stars. The young stars detected both in X-rays and at radio wavelengths broadly follow a Güdel-Benz relation, but with a different normalization than the most radio-active types of stars. Finally, we detect a ∼70 mJy compact extragalactic source near the center of the Ophiuchus core, which should be used as a gain calibrator for any future radio observations of this region.

  7. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as diagnostic access, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of W-filament based plasma sources and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
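
Every Modbus RTU frame on an RS485 backbone like the one described above ends with a CRC-16/MODBUS checksum. A sketch of that generic protocol detail (standard algorithm, not the LVPD software):

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: polynomial 0x8005 (reflected as 0xA001),
    initial value 0xFFFF, no final XOR. Appended little-endian
    to every Modbus RTU frame."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001  # shift and apply reflected poly
            else:
                crc >>= 1
    return crc

# standard catalogue check value for the ASCII string "123456789"
print(hex(crc16_modbus(b"123456789")))  # 0x4b37
```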

  8. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate change and increasing demand due to population growth and economic development will intensively affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Irrigation management in response to subseasonal variability in weather and crop response also varies for each region and each crop. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields.
To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
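
The Markov Chain Monte Carlo calibration mentioned in the abstract can be sketched with a minimal random-walk Metropolis sampler on synthetic data; this is illustrative only (the actual crop-model likelihood, priors, and parameters are far more complex and are assumptions here):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(2.0, 1.0, 200)   # synthetic "observations" of one parameter

def log_post(mu):
    # flat prior; Gaussian likelihood with known unit variance
    return -0.5 * np.sum((data - mu) ** 2)

mu, chain = 0.0, []
for _ in range(5000):
    prop = mu + rng.normal(0.0, 0.5)           # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                               # Metropolis accept
    chain.append(mu)

est = np.mean(chain[1000:])   # posterior mean after discarding burn-in
print(round(est, 1))          # close to the sample mean of the data
```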

  9. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by Electron Temperature Gradient (ETG), considered as a major contributor for the plasma loss in the fusion devices. Large size of the device imposes certain difficulties in the operation, such as access of the diagnostics, manual control of subsystems and large number of signals monitoring etc. To achieve integrated operation of the machine, automation is essential for the enhanced performance and operational efficiency. Recently, the machine is undergoing major upgradation for the new physics experiments. The new operation and control system consists of following: (1) PXIe based fast data acquisition system for the equipped diagnostics; (2) Modbus based Process Automation System (PAS) for the subsystem controls and (3) Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, data flow model of the machine’s operation has been developed. As a proof of concept, following two subsystems have been successfully integrated: (1) Filament Power Supply (FPS) for the heating of W- filaments based plasma source and (2) Probe Positioning System (PPS) for control of 12 number of linear probe drives for a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on 4-wire multi-drop serial interface (RS485) using Modbus communication protocol. Software is developed on LabVIEW platform using

  10. Two-scale large deviations for chemical reaction kinetics through second quantization path integral

    International Nuclear Information System (INIS)

    Li, Tiejun; Lin, Feng

    2016-01-01

    Motivated by the study of rare events for a typical genetic switching model in systems biology, in this paper we aim to establish general two-scale large deviations for chemical reaction systems. We build a formal approach to explicitly obtain the large deviation rate functionals for the considered two-scale processes based upon the second quantization path integral technique. We obtain three important types of large deviation results when the underlying two timescales are in three different regimes. This is realized by singular perturbation analysis of the rate functionals obtained by the path integral. We find that the three regimes possess the same deterministic mean-field limit but completely different chemical Langevin approximations. The obtained results are natural extensions of the classical large volume limit for chemical reactions. We also discuss the implications for single-molecule Michaelis–Menten kinetics. Our framework and results can be applied to understand general multi-scale systems including diffusion processes.

  11. Impacts of large-scale offshore wind farm integration on power systems through VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Hongzhi; Chen, Zhe

    2013-01-01

    The potential of offshore wind energy has been commonly recognized and explored globally. Many countries have implemented and planned offshore wind farms to meet their increasing electricity demands and public environmental appeals, especially in Europe. With relatively less space limitation, an offshore wind farm could have a capacity rating of hundreds of MWs or even GWs, large enough to compete with conventional power plants. Thus the impacts of a large offshore wind farm on power system operation and security should be thoroughly studied and understood. This paper investigates the impacts of integrating a large-scale offshore wind farm into the transmission system of a power grid through a VSC-HVDC connection. The concerns are focused on steady-state voltage stability, dynamic voltage stability and transient angle stability. Simulation results based on an exemplary power system...

  12. System Integration and Its Influence on the Quality of Life of Children with Complex Needs

    Directory of Open Access Journals (Sweden)

    Sandy Thurston

    2010-01-01

    Purpose. To explore the interactions between child and parent psychosocial factors and team integration variables that may explain improvements in the physical dimensions of the PedsQL quality of life of children with complex needs after 2 years. Methods. In this 2-year study, parents were identified by the Children's Treatment Network. Families were eligible if the child was aged 0–19 years, had physical limitations, resided in either Simcoe County or the Region of York, Ontario, and there were multiple other family needs. Regression analysis was used to explore associations and interactions; n=110. Results. A child's physical quality of life was affected by interacting factors including the child's behavior, parenting, and integrated care. Statistically significant interactions between team integration, processes of care, and child/parent variables highlight the complexity of the rehabilitation approach in real-life situations. Conclusions. Rehabilitation providers working with children with complex needs and their families should also address problematic child and parent behaviors. When this was the case in highly integrated teams, the child's physical quality of life improved after two years.

  13. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution function method.
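
The two routes compared in the abstract can be illustrated on an ideal gas, where both the RDF route G = 4π∫(g(r)−1)r²dr and the particle-number-fluctuation route give G = 0. A toy sketch (illustrative; the actual simulations use real RDFs and sub-volume counts):

```python
import numpy as np

def kb_from_rdf(r, g):
    """Traditional route: running Kirkwood-Buff integral over the RDF,
    G = 4*pi * int (g(r) - 1) r^2 dr, via trapezoidal quadrature."""
    integrand = (g - 1.0) * r ** 2
    return 4.0 * np.pi * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))

def kb_from_fluctuations(counts, volume):
    """Fluctuation route (single species, open sub-volume):
    G = V * (<N^2> - <N>^2 - <N>) / <N>^2."""
    n = np.asarray(counts, dtype=float)
    return volume * (n.var() - n.mean()) / n.mean() ** 2

r = np.linspace(0.01, 3.0, 300)
g_ideal = np.ones_like(r)             # ideal gas: g(r) = 1 everywhere
print(kb_from_rdf(r, g_ideal))        # exactly 0 for the ideal gas

rng = np.random.default_rng(1)
counts = rng.poisson(50.0, 200_000)   # Poisson counts mimic ideal-gas sub-volumes
print(kb_from_fluctuations(counts, 1.0))  # close to 0
```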

  14. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution function method.

  15. Model-based identification and use of task complexity factors of human integrated systems

    International Nuclear Information System (INIS)

    Ham, Dong-Han; Park, Jinkyun; Jung, Wondea

    2012-01-01

    Task complexity is one of the conceptual constructs that are critical to explain and predict human performance in human integrated systems. A basic approach to evaluating the complexity of tasks is to identify task complexity factors and measure them. Although many task complexity factors have been studied, there is still a lack of conceptual frameworks for identifying and organizing them analytically in a way that can be applied irrespective of the types of domains and tasks. This study proposes a model-based approach to identifying and using task complexity factors, which has two facets: the design aspects of a task and the complexity dimensions. Three levels of design abstraction, namely the functional, behavioral, and structural aspects of a task, characterize the design aspect of a task; the behavioral aspect is further classified into five cognitive processing activity types. The complexity dimensions explain task complexity from different perspectives: size, variety, and order/organization. Twenty-one task complexity factors are identified by combining the attributes of each facet. Identification and evaluation of task complexity factors based on this model is believed to give insights for improving the design quality of tasks. The model can also be used as a referential framework for allocating tasks and designing information aids. The proposed approach is applied to procedure-based tasks of nuclear power plants (NPPs) as a case study to demonstrate its use. Finally, we compare the proposed approach with other studies and suggest some future research directions.
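
    The twenty-one factors arise as a cross product of seven design aspects (functional, five behavioral activity types, structural) and three complexity dimensions. The activity-type labels below are placeholders for illustration, not the paper's exact terms:

```python
from itertools import product

# Hypothetical labels; only the 7 x 3 = 21 structure is taken from the abstract.
design_aspects = (
    ["functional"]
    + [f"behavioral:{a}" for a in (
        "information-collection", "information-analysis",
        "decision-making", "action-execution", "feedback-monitoring")]
    + ["structural"]
)
dimensions = ["size", "variety", "order/organization"]

factors = [f"{aspect} x {dim}" for aspect, dim in product(design_aspects, dimensions)]
print(len(factors))  # 21
```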

  16. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

    Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented at ISTSS 2016. In general, a metamodel quickly interpolates the consequences of few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...
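
    A moving least squares metamodel of the kind mentioned here can be sketched in one dimension; the Gaussian weighting and linear basis are illustrative choices, not necessarily those of the paper:

```python
import numpy as np

def mls_predict(x_train, y_train, x_query, bandwidth=0.1):
    """Moving least squares in 1-D: at each query point, fit a locally
    weighted linear model (Gaussian weights centred on the query) and
    evaluate it there."""
    x_train = np.asarray(x_train, float)
    y_train = np.asarray(y_train, float)
    preds = []
    for xq in np.atleast_1d(np.asarray(x_query, float)):
        # Square roots of the Gaussian weights, applied to the rows of the
        # least-squares system.
        sw = np.exp(-((x_train - xq) ** 2) / (4.0 * bandwidth ** 2))
        A = np.column_stack([np.ones_like(x_train), x_train])  # basis [1, x]
        coef, *_ = np.linalg.lstsq(A * sw[:, None], y_train * sw, rcond=None)
        preds.append(coef[0] + coef[1] * xq)
    return np.array(preds)

# A few "expensive" scenario simulations of a smooth consequence curve...
x_scen = np.linspace(0.0, 1.0, 11)
y_scen = x_scen ** 2
# ...interpolated to an arbitrary scenario in between.
print(mls_predict(x_scen, y_scen, [0.55])[0])  # near the true 0.55**2 = 0.3025
```

    Each prediction solves only a small weighted least-squares problem, which is why a handful of complex fire simulations can be interpolated to a large number of arbitrary scenarios in the risk analysis.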

  17. Integrated complex care coordination for children with medical complexity: A mixed-methods evaluation of tertiary care-community collaboration

    Directory of Open Access Journals (Sweden)

    Cohen Eyal

    2012-10-01

    Full Text Available Abstract Background Primary care medical homes may improve health outcomes for children with special healthcare needs (CSHCN) by improving care coordination. However, community-based primary care practices may be challenged to deliver comprehensive care coordination to complex subsets of CSHCN such as children with medical complexity (CMC). Linking a tertiary care center with the community may achieve cost-effective and high-quality care for CMC. The objective of this study was to evaluate the outcomes of community-based complex care clinics integrated with a tertiary care center. Methods A before- and after-intervention study design with mixed (quantitative/qualitative) methods was utilized. Clinics at two community hospitals distant from tertiary care were staffed by local community pediatricians with the tertiary care center nurse practitioner and linked with primary care providers. Eighty-one children with underlying chronic conditions, fragility, requirement for high-intensity care and/or technology assistance, and involvement of multiple providers participated. Main outcome measures included health care utilization and expenditures, parent reports of parent- and child-quality of life [QOL (SF-36®, CPCHILD©, PedsQL™)], and family-centered care (MPOC-20®). Comparisons were made in equal (up to 1 year) pre- and post-periods, supplemented by qualitative perspectives of families and pediatricians. Results Total health care system costs decreased from a median (IQR) of $244 (981) per patient per month (PPPM) pre-enrolment to $131 (355) PPPM post-enrolment (p=.007), driven primarily by fewer inpatient days in the tertiary care center (p=.006). Parents reported decreased out-of-pocket expenses (p© domains [Health Standardization Section (p=.04); Comfort and Emotions (p=.03)], while total CPCHILD© score decreased between baseline and 1 year (p=.003). Parents and providers reported the ability to receive care close to home as a key benefit. Conclusions Complex

  18. CRISPR-Mediated Integration of Large Gene Cassettes Using AAV Donor Vectors

    Directory of Open Access Journals (Sweden)

    Rasmus O. Bak

    2017-07-01

    Full Text Available The CRISPR/Cas9 system has recently been shown to facilitate high levels of precise genome editing using adeno-associated viral (AAV vectors to serve as donor template DNA during homologous recombination (HR. However, the maximum AAV packaging capacity of ∼4.5 kb limits the donor size. Here, we overcome this constraint by showing that two co-transduced AAV vectors can serve as donors during consecutive HR events for the integration of large transgenes. Importantly, the method involves a single-step procedure applicable to primary cells with relevance to therapeutic genome editing. We use the methodology in primary human T cells and CD34+ hematopoietic stem and progenitor cells to site-specifically integrate an expression cassette that, as a single donor vector, would otherwise amount to a total of 6.5 kb. This approach now provides an efficient way to integrate large transgene cassettes into the genomes of primary human cells using HR-mediated genome editing with AAV vectors.

  19. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool to assess students' knowledge integration ability.

  20. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them as the results of global warming

  1. Examining Food Risk in the Large using a Complex, Networked System-of-systems Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, John [Los Alamos National Laboratory; Newkirk, Ryan [U OF MINNESOTA; Mc Donald, Mark P [VANDERBILT U

    2010-12-03

    The food production infrastructure is a highly complex system of systems. Characterizing the risks of intentional contamination in multi-ingredient manufactured foods is extremely challenging because the risks depend on the vulnerabilities of food processing facilities and on the intricacies of the supply-distribution networks that link them. A pure engineering approach to modeling the system is impractical because of the overall system complexity and paucity of data. A methodology is needed to assess food contamination risk 'in the large', based on current, high-level information about manufacturing facilities, commodities and markets, that will indicate which food categories are most at risk of intentional contamination and warrant deeper analysis. The approach begins by decomposing the system for producing a multi-ingredient food into instances of two subsystem archetypes: (1) the relevant manufacturing and processing facilities, and (2) the networked commodity flows that link them to each other and consumers. Ingredient manufacturing subsystems are modeled as generic systems dynamics models with distributions of key parameters that span the configurations of real facilities. Networks representing the distribution systems are synthesized from general information about food commodities. This is done in a series of steps. First, probability networks representing the aggregated flows of food from manufacturers to wholesalers, retailers, other manufacturers, and direct consumers are inferred from high-level approximate information. This is followed by disaggregation of the general flows into flows connecting 'large' and 'small' categories of manufacturers, wholesalers, retailers, and consumers. Optimization methods are then used to determine the most likely network flows consistent with given data. Vulnerability can be assessed for a potential contamination point using a modified CARVER + Shock model. Once the facility and

  2. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly accounting for internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses such as the probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example for choices among alternative sub-models, or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense, and the paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper reports on the effectiveness and practicality of the methodology with two applications: a complex thermal-hydraulics system code and a complex fire simulation code. In case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)
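
    The forward propagation step of such an analysis (sampling uncertain code inputs and summarizing the output distribution) can be sketched as below. The toy model and the input distributions are invented for illustration; they are not IMTHUA itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def code_output(pressure, heat_transfer_coeff):
    """Stand-in for a single best-estimate code run (hypothetical toy model
    returning, say, a peak cladding temperature in kelvin)."""
    return 300.0 + 0.05 * pressure / heat_transfer_coeff

# Input uncertainties expressed as probability distributions (assumed values).
n = 100_000
pressure = rng.normal(7.0e6, 2.0e5, n)                        # Pa
htc = rng.lognormal(mean=np.log(1000.0), sigma=0.1, size=n)   # W/(m^2 K)

# Monte Carlo propagation: run the "code" on each sampled input vector.
samples = code_output(pressure, htc)
lo, med, hi = np.percentile(samples, [5, 50, 95])
print(round(lo, 1), round(med, 1), round(hi, 1))
```

    A Bayesian treatment would additionally update these input distributions against integral test facility data before propagating them; the sampling step itself is unchanged.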

  3. Integrative analysis for finding genes and networks involved in diabetes and other complex diseases

    DEFF Research Database (Denmark)

    Bergholdt, R.; Størling, Zenia, Marian; Hansen, Kasper Lage

    2007-01-01

    We have developed an integrative analysis method combining genetic interactions, identified using type 1 diabetes genome scan data, and a high-confidence human protein interaction network. Resulting networks were ranked by the significance of the enrichment of proteins from interacting regions. We...... identified a number of new protein network modules and novel candidate genes/proteins for type 1 diabetes. We propose this type of integrative analysis as a general method for the elucidation of genes and networks involved in diabetes and other complex diseases....

  4. Low-cost sensor integrators for measuring the transmissivity of complex canopies to photosynthetically active radiation

    International Nuclear Information System (INIS)

    Newman, S.M.

    1985-01-01

    A system has been designed, tested and evaluated for measuring the transmissivities of complex canopies to photosynthetically active radiation (PAR). The system consists of filtered silicon photocells in cosine-corrected mounts, with outputs integrated by the use of chemical coulometers. The readings accumulated by the coulometers were retrieved electronically by the use of microcomputers. The low-cost sensor integrators, which do not require batteries, performed as expected and proved ideal for the study of agroforestry systems in remote areas. Information on the PAR transmissivity of a temperate agroforestry system in the form of an intercropped orchard is also presented. (author)

  5. Managing Active Learning Processes in Large First Year Physics Classes: The Advantages of an Integrated Approach

    Directory of Open Access Journals (Sweden)

    Michael J. Drinkwater

    2014-09-01

    Full Text Available Turning lectures into interactive, student-led question and answer sessions is known to increase learning, but enabling interaction in a large class seems an insurmountable task. This can discourage adoption of this new approach – who has time to individualize responses, address questions from over 200 students and encourage active participation in class? An approach adopted by a teaching team in large first-year classes at a research-intensive university appears to provide a means to do so. We describe the implementation of active learning strategies in a large first-year undergraduate physics unit of study, replacing traditional, content-heavy lectures with an integrated approach to question-driven learning. A key feature of our approach is that it facilitates intensive in-class discussions by requiring students to engage in preparatory reading and answer short written quizzes before every class. The lecturer uses software to rapidly analyze the student responses and identify the main issues faced by the students before the start of each class. We report the success of the integration of student preparation with this analysis and feedback framework, and the impact on the in-class discussions. We also address some of the difficulties commonly experienced by staff preparing for active learning classes.

  6. SPITZER VIEW OF YOUNG MASSIVE STARS IN THE LARGE MAGELLANIC CLOUD H II COMPLEXES. II. N 159

    International Nuclear Information System (INIS)

    Chen, C.-H. Rosie; Indebetouw, Remy; Chu, You-Hua; Gruendl, Robert A.; Seale, Jonathan P.; Testor, Gerard; Heitsch, Fabian; Meixner, Margaret; Sewilo, Marta

    2010-01-01

    The H II complex N 159 in the Large Magellanic Cloud is used to study massive star formation in different environments, as it contains three giant molecular clouds (GMCs) that have similar sizes and masses but exhibit different intensities of star formation. We identify candidate massive young stellar objects (YSOs) using infrared photometry, and model their spectral energy distributions to constrain mass and evolutionary state. Good fits are obtained for less evolved Type I, I/II, and II sources. Our analysis suggests that there are massive embedded YSOs in N 159B, a maser source, and several ultracompact H II regions. Massive O-type YSOs are found in GMCs N 159-E and N 159-W, which are associated with ionized gas, i.e., where massive stars formed a few Myr ago. The third GMC, N 159-S, has neither O-type YSOs nor evidence of previous massive star formation. This correlation between current and antecedent formation of massive stars suggests that energy feedback is relevant. We present evidence that N 159-W is forming YSOs spontaneously, while collapse in N 159-E may be triggered. Finally, we compare star formation rates determined from YSO counts with those from integrated Hα and 24 μm luminosities and expected from gas surface densities. Detailed dissection of extragalactic GMCs like the one presented here is key to revealing the physics underlying commonly used star formation scaling laws.

  7. Identifying protein complex by integrating characteristic of core-attachment into dynamic PPI network.

    Directory of Open Access Journals (Sweden)

    Xianjun Shen

    Full Text Available How to identify protein complexes is an important and challenging task in proteomics. It would make a great contribution to our knowledge of molecular mechanisms in cell life activities. However, the inherent organization and dynamic characteristics of the cell system have rarely been incorporated into the existing algorithms for detecting protein complexes, because of the limitations of protein-protein interaction (PPI) data produced by high-throughput techniques. The availability of time-course gene expression profiles enables us to uncover the dynamics of molecular networks and improve the detection of protein complexes. In order to achieve this goal, this paper proposes a novel algorithm DCA (Dynamic Core-Attachment). It detects a protein-complex core comprising continually expressed and highly connected proteins in the dynamic PPI network, and the protein complex is then formed by including the attachments with high adhesion into the core. The integration of the core-attachment feature into the dynamic PPI network is responsible for the superiority of our algorithm. DCA has been applied on two different yeast dynamic PPI networks and the experimental results show that it performs significantly better than the state-of-the-art techniques in terms of prediction accuracy, hF-measure and statistical significance in biology. In addition, the identified complexes with strong biological significance provide potential candidate complexes for biologists to validate.
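
    A drastically simplified, static version of the core-attachment idea can be written in a few lines. Real DCA additionally uses time-course expression data to build a dynamic PPI network and a notion of adhesion, which this sketch omits:

```python
def core_attachment(adj, core_min_degree=3, attach_ratio=0.5):
    """Detect one protein complex on a static PPI graph: the core is the set
    of highly connected proteins; attachments are non-core proteins adjacent
    to at least `attach_ratio` of the core."""
    core = {v for v, nbrs in adj.items() if len(nbrs) >= core_min_degree}
    attachments = {
        v for v, nbrs in adj.items()
        if v not in core and core and len(nbrs & core) >= attach_ratio * len(core)
    }
    return core, core | attachments

# Toy PPI graph: a densely connected core A-D plus a peripheral protein E.
adj = {
    "A": {"B", "C", "D"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "D", "E"},
    "D": {"A", "B", "C", "E"},
    "E": {"C", "D"},
}
core, complex_ = core_attachment(adj)
print(sorted(core), sorted(complex_))  # ['A','B','C','D'] and ['A','B','C','D','E']
```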

  8. Inferring genetic architecture of complex traits using Bayesian integrative analysis of genome and transcriptiome data

    DEFF Research Database (Denmark)

    Ehsani, Alireza; Sørensen, Peter; Pomp, Daniel

    2012-01-01

    Background To understand the genetic architecture of complex traits and bridge the genotype-phenotype gap, it is useful to study intermediate -omics data, e.g. the transcriptome. The present study introduces a method for simultaneous quantification of the contributions from single nucleotide......-modal distribution of genomic values collapses, when gene expressions are added to the model Conclusions With increased availability of various -omics data, integrative approaches are promising tools for understanding the genetic architecture of complex traits. Partitioning of explained variances at the chromosome...

  9. Integration in New Product Development: Case Study in a Large Brazilian

    Directory of Open Access Journals (Sweden)

    Daniel Jugend

    2012-02-01

    Full Text Available Proficiency in management activities undertaken in product development processes is regarded as a key competitive advantage for companies, particularly for high-tech industrial firms, which benefit from the important competitiveness factor of launching products with differentiated technological content. This paper's objective was to identify, through a case study, practices for integration between R&D and the other functions involved in product development in a large Brazilian industrial automation company. The results suggest some management practices to improve integration in new product development, such as employing marketing staff with knowledge and experience previously gained in R&D activities, and using a heavyweight product manager to solve synchronization problems between product and technology development.

  10. THE GOULD'S BELT VERY LARGE ARRAY SURVEY. IV. THE TAURUS-AURIGA COMPLEX

    Energy Technology Data Exchange (ETDEWEB)

    Dzib, Sergio A. [Max Planck Institut für Radioastronomie, Auf dem Hügel 69, D-53121 Bonn (Germany); Loinard, Laurent; Rodríguez, Luis F.; Ortiz-León, Gisela N.; Pech, Gerardo; Rivera, Juana L. [Centro de Radioastronomía y Astrofísica, Universidad Nacional Autónoma de México Apartado Postal 3-72, 58090 Morelia, Michoacán (Mexico); Mioduszewski, Amy J. [National Radio Astronomy Observatory, Domenici Science Operations Center, 1003 Lopezville Road, Socorro, NM 87801 (United States); Kounkel, Marina A.; Hartmann, Lee [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48105 (United States); Torres, Rosa M. [Instituto de Astronomía y Meteorología, Universidad de Guadalajara, Avenida Vallarta No. 2602, Col. Arcos Vallarta, CP 44130 Guadalajara, Jalisco, México (Mexico); Boden, Andrew F. [Division of Physics, Math, and Astronomy, California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Evans II, Neal J. [Department of Astronomy, The University of Texas at Austin, 1 University Station, C1400, Austin, TX 78712 (United States); Briceño, Cesar [Cerro Tololo Interamerican Observatory, Casilla 603, La Serena (Chile); Tobin, John, E-mail: sdzib@mpifr-bonn.mpg.de [Leiden Observatory, Leiden University, P.O. Box 9513, 2300 RA Leiden (Netherlands)

    2015-03-10

    We present a multi-epoch radio study of the Taurus-Auriga star-forming complex made with the Karl G. Jansky Very Large Array at frequencies of 4.5 GHz and 7.5 GHz. We detect a total of 610 sources, 59 of which are related to young stellar objects (YSOs) and 18 to field stars. The properties of 56% of the young stars are compatible with non-thermal radio emission. We also show that the radio emission of more evolved YSOs tends to be more non-thermal in origin and, in general, that their radio properties are compatible with those found in other star-forming regions. By comparing our results with previously reported X-ray observations, we notice that YSOs in Taurus-Auriga follow a Güdel-Benz relation with κ = 0.03, as we previously suggested for other regions of star formation. In general, YSOs in Taurus-Auriga and in all the previous studied regions seem to follow this relation with a dispersion of ∼1 dex. Finally, we propose that most of the remaining sources are related with extragalactic objects but provide a list of 46 unidentified radio sources whose radio properties are compatible with a YSO nature.

  11. THE GOULD'S BELT VERY LARGE ARRAY SURVEY. IV. THE TAURUS-AURIGA COMPLEX

    International Nuclear Information System (INIS)

    Dzib, Sergio A.; Loinard, Laurent; Rodríguez, Luis F.; Ortiz-León, Gisela N.; Pech, Gerardo; Rivera, Juana L.; Mioduszewski, Amy J.; Kounkel, Marina A.; Hartmann, Lee; Torres, Rosa M.; Boden, Andrew F.; Evans II, Neal J.; Briceño, Cesar; Tobin, John

    2015-01-01

    We present a multi-epoch radio study of the Taurus-Auriga star-forming complex made with the Karl G. Jansky Very Large Array at frequencies of 4.5 GHz and 7.5 GHz. We detect a total of 610 sources, 59 of which are related to young stellar objects (YSOs) and 18 to field stars. The properties of 56% of the young stars are compatible with non-thermal radio emission. We also show that the radio emission of more evolved YSOs tends to be more non-thermal in origin and, in general, that their radio properties are compatible with those found in other star-forming regions. By comparing our results with previously reported X-ray observations, we notice that YSOs in Taurus-Auriga follow a Güdel-Benz relation with κ = 0.03, as we previously suggested for other regions of star formation. In general, YSOs in Taurus-Auriga and in all the previous studied regions seem to follow this relation with a dispersion of ∼1 dex. Finally, we propose that most of the remaining sources are related with extragalactic objects but provide a list of 46 unidentified radio sources whose radio properties are compatible with a YSO nature

  12. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    Science.gov (United States)

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
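
    Two of the standard diversity statistics such a tool estimates are easy to show directly; the three-sequence alignment below is a made-up example, and under neutrality the two estimators target the same quantity:

```python
from itertools import combinations

def pairwise_diversity(seqs):
    """Nucleotide diversity pi: mean number of pairwise differences
    across all pairs of aligned sequences."""
    pairs = list(combinations(seqs, 2))
    diffs = [sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs]
    return sum(diffs) / len(pairs)

def watterson_theta(seqs):
    """Watterson's estimator: number of segregating sites divided by the
    harmonic number a1 = sum_{i=1}^{n-1} 1/i."""
    n = len(seqs)
    seg_sites = sum(len(set(col)) > 1 for col in zip(*seqs))
    a1 = sum(1.0 / i for i in range(1, n))
    return seg_sites / a1

aln = ["AATG", "AACG", "TACG"]
print(pairwise_diversity(aln), watterson_theta(aln))  # both 4/3 here
```

    Neutrality tests such as Tajima's D compare these two estimates; the Monte Carlo coalescent step then asks how extreme the observed difference is under the chosen null model.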

  13. The rubber hand illusion in complex regional pain syndrome: preserved ability to integrate a rubber hand indicates intact multisensory integration.

    Science.gov (United States)

    Reinersmann, Annika; Landwehrt, Julia; Krumova, Elena K; Peterburs, Jutta; Ocklenburg, Sebastian; Güntürkün, Onur; Maier, Christoph

    2013-09-01

    In patients with complex regional pain syndrome (CRPS) type 1, processing of static tactile stimuli is impaired, whereas more complex sensory integration functions appear preserved. This study investigated higher order multisensory integration of body-relevant stimuli using the rubber hand illusion in CRPS patients. Subjective self-reports and skin conductance responses to watching the rubber hand being harmed were compared among CRPS patients (N=24), patients with upper limb pain of other origin (N=21, clinical control group), and healthy subjects (N=24). Additionally, the influence of body representation (body plasticity [Trinity Assessment of Body Plasticity], neglect-like severity symptoms), and clinical signs of illusion strength were investigated. For statistical analysis, 1-way analysis of variance, t test, Pearson correlation, with α=0.05 were used. CRPS patients did not differ from healthy subjects and the control group with regard to their illusion strength as assessed by subjective reports or skin conductance response values. Stronger left-sided rubber hand illusions were reported by healthy subjects and left-side-affected CRPS patients. Moreover, for this subgroup, illness duration and illusion strength were negatively correlated. Overall, severity of neglect-like symptoms and clinical signs were not related to illusion strength. However, patients with CRPS of the right hand reported significantly stronger neglect-like symptoms and significantly lower illusion strength of the affected hand than patients with CRPS of the left hand. The weaker illusion of CRPS patients with strong neglect-like symptoms on the affected hand supports the role of top-down processes modulating body ownership. Moreover, the intact ability to perceive illusory ownership confirms the notion that, despite impaired processing of proprioceptive or tactile input, higher order multisensory integration is unaffected in CRPS. Copyright © 2013 International Association for the Study

  14. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    Full Text Available The vast majority of research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality, and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. Graphical evaluation review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze the simulation outputs. Finally, a complex product systems R&D project is modeled as an example using the proposed approach, and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. It seems that integrating cost and schedule risk assessment can provide more reliable risk estimation results.
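
    The core of an integrated (as opposed to independent) cost-schedule assessment is that cost is simulated as a function of the realized duration, so the two risks are sampled jointly. A minimal Monte Carlo sketch with invented numbers:

```python
import random

random.seed(1)

def simulate(n=100_000, deadline=14.0, budget=1_500.0):
    """Joint Monte Carlo of schedule and cost for a toy R&D activity.
    Cost depends on the realized duration, so the two risks are correlated."""
    joint = over_time = over_cost = 0
    for _ in range(n):
        duration = random.triangular(10.0, 20.0, 12.0)           # months
        cost = 300.0 + 80.0 * duration + random.gauss(0.0, 50.0)  # k$, duration-driven
        late = duration > deadline
        costly = cost > budget
        over_time += late
        over_cost += costly
        joint += late and costly
    return over_time / n, over_cost / n, joint / n

p_time, p_cost, p_joint = simulate()
print(p_joint > p_time * p_cost)  # True: joint overrun exceeds the product
```

    Because long durations drive high costs, the joint overrun probability is well above the product of the marginals, which is exactly what an independent cost and schedule assessment would miss.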

  15. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  16. Large-scale Wind Power integration in a Hydro-Thermal Power Market

    OpenAIRE

    Trøtscher, Thomas

    2007-01-01

    This master thesis describes a quadratic programming model used to calculate the spot prices in an efficient multi-area power market. The model has been adapted to Northern Europe, with focus on Denmark West and the integration of large quantities of wind power. In the model, demand and supply of electricity are equated, at an hourly time resolution, to find the spot price in each area. Historical load values are used to represent demand which is assumed to be completely inelastic. Supply i...
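The market-clearing idea behind such a model can be sketched in miniature: with perfectly inelastic demand and quadratic generation costs, the spot price is the common marginal cost of the dispatched units. This toy single-area version (illustrative numbers, not the thesis's multi-area QP) solves the first-order conditions in closed form:

```python
def clear_market(gens, demand):
    """Clear a single-area spot market with quadratic generation costs
    C_i(g) = a_i*g + b_i*g^2 and perfectly inelastic demand.
    At the optimum all dispatched units share one marginal cost,
    the spot price lambda: a_i + 2*b_i*g_i = lambda."""
    inv = sum(1.0 / (2 * b) for _, b in gens)
    lam = (demand + sum(a / (2 * b) for a, b in gens)) / inv
    dispatch = [(lam - a) / (2 * b) for a, b in gens]
    return lam, dispatch

# (a, b) cost coefficients for two thermal units (hypothetical)
gens = [(10.0, 0.05), (30.0, 0.10)]
price, g = clear_market(gens, demand=500.0)
# wind enters with near-zero marginal cost: subtract 200 MW from residual demand
price_w, g_w = clear_market(gens, demand=500.0 - 200.0)
print(f"price without wind: {price:.1f}, with wind: {price_w:.1f}")
```

Adding 200 MW of zero-marginal-cost wind shifts residual demand down and lowers the clearing price, the basic mechanism by which large-scale wind integration affects spot prices in such models.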

  17. IoT European Large-Scale Pilots – Integration, Experimentation and Testing

    OpenAIRE

    Guillén, Sergio Gustavo; Sala, Pilar; Fico, Giuseppe; Arredondo, Maria Teresa; Cano, Alicia; Posada, Jorge; Gutierrez, Germán; Palau, Carlos; Votis, Konstantinos; Verdouw, Cor N.; Wolfert, Sjaak; Beers, George; Sundmaeker, Harald; Chatzikostas, Grigoris; Ziegler, Sébastien

    2017-01-01

    The IoT European Large-Scale Pilots Programme includes the innovation consortia that are collaborating to foster the deployment of IoT solutions in Europe through the integration of advanced IoT technologies across the value chain, demonstration of multiple IoT applications at scale and in a usage context, and as close as possible to operational conditions. The programme projects are targeted, goal-driven initiatives that propose IoT approaches to specific real-life industrial/societal challe...

  18. Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor

    Science.gov (United States)

    Acree, C. W., Jr.

    2010-01-01

    Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.

  19. Introduction to Large-sized Test Facility for validating Containment Integrity under Severe Accidents

    International Nuclear Information System (INIS)

    Na, Young Su; Hong, Seongwan; Hong, Seongho; Min, Beongtae

    2014-01-01

    An overall assessment of containment integrity can be conducted properly by examining the hydrogen behavior in the containment building. Under severe accidents, an amount of hydrogen gases can be generated by metal oxidation and corium-concrete interaction. Hydrogen behavior in the containment building strongly depends on complicated thermal hydraulic conditions with mixed gases and steam. The performance of a PAR can be directly affected by the thermal hydraulic conditions, steam contents, gas mixture behavior and aerosol characteristics, as well as the operation of other engineered safety systems such as a spray. The models in computer codes for severe accident assessment can be validated against experiment results from a large-sized test facility. The Korea Atomic Energy Research Institute (KAERI) is now preparing a large-sized test facility to examine in detail the safety issues related to hydrogen, including the performance of safety devices such as a PAR, in various severe accident situations. This paper introduces the KAERI test facility for validating containment integrity under severe accidents. To validate containment integrity, a large-sized test facility is necessary for simulating the complicated phenomena induced by the large amounts of steam and gases, especially hydrogen, released into the containment building under severe accidents. A pressure vessel 9.5 m in height and 3.4 m in diameter was designed at the KAERI test facility for validating containment integrity, based on the THAI test facility, whose experimental safety and reliable measurement systems have been certified over a long period. This large-sized pressure vessel, operated with steam and with iodine as a corrosive agent, was made of stainless steel 316L because of its corrosion resistance over a long operating time, and was installed at KAERI in March 2014. In the future, the control systems for temperature and pressure in the vessel will be constructed, and the measurement system

  20. Set of CAMAC modules on the base of large integrated circuits for an accelerator synchronization system

    International Nuclear Information System (INIS)

    Glejbman, Eh.M.; Pilyar, N.V.

    1986-01-01

    Parameters of functional modules in the CAMAC standard developed for an accelerator synchronization system are presented. They comprise BZN-8K and BZ-8K digital delay circuits, a timing circuit and a pulse selection circuit. Each module uses three large-scale integrated circuits of the KR580VI53 type (programmable timer), circuits interfacing the system bus with the crate bus, data-recording control circuits, two peripheral storage devices, initial-state setting circuits, input and output shapers, and circuits for setting and removing blocking in the channels

  1. Deciphering the clinical effect of drugs through large-scale data integration

    DEFF Research Database (Denmark)

    Kjærulff, Sonny Kim

    . This work demonstrates the power of a strategy that uses clinical data mining in association with chemical biology in order to reduce the search space and aid identification of novel drug actions. The second article described in chapter 3 outlines a high confidence side-effect-drug interaction dataset. We...... demonstrates the importance of using high-confidence drug-side-effect data in deciphering the effect of small molecules in humans. In summary, this thesis presents computational systems chemical biology approaches that can help identify clinical effects of small molecules through large-scale data integration...

  2. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale ''integral'' migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  3. Channel Capacity Calculation at Large SNR and Small Dispersion within Path-Integral Approach

    Science.gov (United States)

    Reznichenko, A. V.; Terekhov, I. S.

    2018-04-01

    We consider the optical fiber channel modelled by the nonlinear Schrödinger equation with additive white Gaussian noise. Using the Feynman path-integral approach for the model with small dispersion, we find the first nonzero corrections to the conditional probability density function and the channel capacity estimations at large signal-to-noise ratio. We demonstrate that the correction to the channel capacity in the small dimensionless dispersion parameter is quadratic and positive, therefore increasing the earlier calculated capacity of a nondispersive nonlinear optical fiber channel in the intermediate power region. Also, for the small-dispersion case we find analytical expressions for simple correlators of the output signals in our noisy channel.
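The channel model named here is the scalar nonlinear Schrödinger equation with additive white Gaussian noise; in one common convention (signs and normalizations vary between authors, so treat this as an assumption rather than the paper's exact notation):

```latex
\partial_z \psi(z,t) = -\,\frac{i\beta_2}{2}\,\partial_t^2 \psi
                       + i\gamma\,|\psi|^2\psi + \eta(z,t),
\qquad
\langle \eta(z,t)\,\bar{\eta}(z',t')\rangle = Q\,\delta(z-z')\,\delta(t-t'),
```

where $\beta_2$ is the group-velocity dispersion coefficient (the small dimensionless dispersion parameter of the abstract is proportional to it), $\gamma$ the Kerr nonlinearity, and $Q$ the noise power per unit length and bandwidth.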

  4. Department of Energy environmental management complex-wide integration using systems engineering

    International Nuclear Information System (INIS)

    Fairbourn, P.

    1997-01-01

    A systems engineering approach was successfully used to recommend changes to environmental management activities across the DOE Complex. A team of technical experts and systems engineers developed alternatives that could save tax payers billions of dollars if the barriers are removed to allow complete implementation. The alternatives are technically-based and defensible, and are being worked through the stakeholder review process. The integration process and implementing project structure are both discussed

  5. Manufacturing of large and integral-type steel forgings for nuclear steam supply system components

    International Nuclear Information System (INIS)

    Kawaguchi, S.; Tsukada, H.; Suzuki, K.; Sato, I.; Onodera, S.

    1986-01-01

    Forgings for the reactor pressure vessel (RPV) of the 700 MWe pressurized heavy water reactor (PHWR), which is composed of seven major parts and nozzles totaling about 965 tons, were successfully developed. These forgings are: 1. Flanges: an outside diameter of 8440 mm and a weight of 238 tons max, requiring an ingot of 570 tons. 2. Shells and torus: an outside diameter of about 8000 mm with large height. 3. Cover dome: a diameter of 6800 mm and a thickness of 460 mm, requiring a blank forging before forming of 8000 mm in diameter and 550 mm thick. The material designation is 20 MnMoNi 5 5 (equivalent to SA508, Class 3). In this paper, the manufacturing and properties of such large and integral forgings are discussed, including an overview of manufacturing processes for ultralarge-sized forgings over the last two decades.

  6. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large scale integration of wind power in future power systems when 50% of load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish...... power system model with large scale of wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load generation...... imbalances, caused by inaccurate wind speed forecast, by an appropriate control of the active power production from power plants....

  7. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    Full Text Available The ever increasing development and availability of power electronic systems is the underpinning technology that enables large scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at off-shore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and electricity grid impact minimization, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing the risk of the investment as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  8. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities

    International Nuclear Information System (INIS)

    Castro, P.; Ardao, J.; Velarde, M.; Sedano, L.; Xiberta, J.

    2013-01-01

    In the context of a prospective new scenario of large-inventory tritium facilities (KATRIN at TLK, CANDUs, ITER, EAST, and others to come), the dosimetric limits prescribed by ICRP-60 for tritium committed doses are under discussion, requiring in parallel that the highly conservative assessments be surmounted by refining dosimetric assessments in many respects. Precise Lagrangian computations of dosimetric cloud evolution after standardized (normal/incidental/SBO) tritium cloud emissions can today be matched numerically to real-time meteorological data and pattern data at diverse scales for prompt/early and chronic tritium dose assessments. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities are under development.
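As a rough illustration of the kind of dispersion calculation such platforms refine, here is a minimal single Gaussian puff sketch (all parameters are hypothetical; real Lagrangian codes track many puffs or particles against real-time meteorology rather than fixed dispersion widths):

```python
import math

def puff_concentration(x, y, z, q, u, t, sigma_y, sigma_z):
    """Air concentration (Bq/m^3) of a single Gaussian puff of q Bq
    released at the origin and advected by wind speed u (m/s) for time t.
    A deliberately simplified stand-in for a Lagrangian transport code;
    sigma_y, sigma_z are the horizontal/vertical dispersion widths (m)."""
    cx = x - u * t                      # puff centre has moved downwind
    norm = q / ((2 * math.pi) ** 1.5 * sigma_y ** 2 * sigma_z)
    # reflecting ground (z = 0) doubles the vertical term for a surface release
    return 2 * norm * math.exp(-(cx ** 2 + y ** 2) / (2 * sigma_y ** 2)) \
                    * math.exp(-(z ** 2) / (2 * sigma_z ** 2))

# centre-line value 1 km downwind, 200 s after a 1 TBq release in 5 m/s wind
c = puff_concentration(x=1000.0, y=0.0, z=0.0, q=1e12, u=5.0, t=200.0,
                       sigma_y=80.0, sigma_z=40.0)
print(f"centre-line concentration: {c:.3e} Bq/m^3")
```

Committed dose would then follow by integrating such concentrations over time and applying an inhalation dose coefficient, the step where the ICRP-60 limits discussed above enter.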

  9. Integration of complex-wide mixed low-level waste activities for program acceleration and optimization

    International Nuclear Information System (INIS)

    McKenney, D.E.

    1998-01-01

    In July 1996, the US Department of Energy (DOE) chartered a contractor-led effort to develop a suite of technically defensible, integrated alternatives which would allow the Environmental Management program to accomplish its mission objectives in an accelerated fashion and at a reduced cost. These alternatives, or opportunities, could then be evaluated by DOE and stakeholders for possible implementation, provided precursor requirements (regulatory changes, etc.) could be met and benefits to the Complex realized. This contractor effort initially focused on six waste types, one of which was Mixed Low-Level Waste (MLLW). Many opportunities were identified by the contractor team for integrating MLLW activities across the DOE Complex. These opportunities were further narrowed to the six that had the most promise for implementation and savings to the DOE Complex: (1) consolidation of individual site analytical services procurement efforts, (2) consolidation of individual site MLLW treatment services procurement efforts, (3) establishment of ''de minimis'' radioactivity levels, (4) standardization of characterization requirements, (5) increased utilization of existing DOE treatment facilities, and (6) use of a combination of DOE and commercial MLLW disposal capacity. The results of the integration effort showed that by managing MLLW activities across the DOE Complex as a cohesive unit rather than as independent site efforts, the DOE could improve the rate of progress toward meeting its objectives and reduce its overall MLLW program costs. If the identified opportunities could be implemented, savings for MLLW could total $224 million or more, and the MLLW ''work off'' schedule across the DOE Complex could be accelerated by five years

  10. HOW TO AVOID GIVING THE RIGHT ANSWERS TO THE WRONG QUESTIONS: THE NEED FOR INTEGRATED ASSESSMENTS OF COMPLEX HEALTH TECHNOLOGIES.

    Science.gov (United States)

    Gerhardus, Ansgar; Oortwijn, Wija; van der Wilt, Gert Jan

    2017-01-01

    Health technologies are becoming increasingly complex and contemporary health technology assessment (HTA) is only partly equipped to address this complexity. The project "Integrated assessments of complex health technologies" (INTEGRATE-HTA), funded by the European Commission, was initiated with the overall objective to develop concepts and methods to enable patient-centered, integrated assessments of the effectiveness, and the economic, social, cultural, and ethical issues of complex technologies that take context and implementation issues into account. The project resulted in a series of guidances that should support the work of HTA scientists and decision makers alike.

  11. Integrating water and agricultural management: collaborative governance for a complex policy problem.

    Science.gov (United States)

    Fish, Rob D; Ioris, Antonio A R; Watson, Nigel M

    2010-11-01

    This paper examines governance requirements for integrating water and agricultural management (IWAM). The institutional arrangements for the agriculture and water sectors are complex and multi-dimensional, and integration cannot therefore be achieved through a simplistic 'additive' policy process. Effective integration requires the development of a new collaborative approach to governance that is designed to cope with scale dependencies and interactions, uncertainty and contested knowledge, and interdependency among diverse and unequal interests. When combined with interdisciplinary research, collaborative governance provides a viable normative model because of its emphasis on reciprocity, relationships, learning and creativity. Ultimately, such an approach could lead to the sorts of system adaptations and transformations that are required for IWAM.

  12. Plastic influence functions for calculating J-integral of complex-cracks in pipe

    International Nuclear Information System (INIS)

    Jeong, Jae-Uk; Choi, Jae-Boong; Kim, Moon-Ki; Huh, Nam-Su; Kim, Yun-Jae

    2016-01-01

    In this study, the plastic influence functions, h_1, for estimating the J-integral of a pipe with a complex crack were newly proposed based on systematic 3-dimensional (3-D) elastic-plastic finite element (FE) analyses using the Ramberg-Osgood (R-O) relation, in which global bending moment, axial tension and internal pressure were considered as loading conditions. Based on the present plastic influence functions, a GE/EPRI-type J-estimation scheme for complex-cracked pipes was suggested, and the results from the proposed J-estimation were compared with FE results using both R-O fit parameters and actual tensile data of SA376 TP304 stainless steel. The comparison demonstrates that although the proposed scheme yields J estimations that are sensitive to the fitting range of the R-O parameters, it shows overall good agreement with the FE results using the R-O relation. Thus, the proposed engineering J prediction method can be utilized to assess the instability of a complex crack in pipes for an R-O material. - Highlights: • New h_1 values of the GE/EPRI method for complex-cracked pipes are proposed. • Plastic limit loads of complex-cracked pipes using the Mises yield criterion are provided. • New J estimates of complex-cracked pipes are proposed based on the GE/EPRI concept. • The proposed estimates for J are validated against 3-D finite element results.
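For reference, a GE/EPRI-type scheme splits J into elastic and fully plastic parts; in its standard generic form for a Ramberg-Osgood material (the notation here is the textbook GE/EPRI one and may differ in detail from the paper's pipe-specific expressions):

```latex
\frac{\varepsilon}{\varepsilon_0} = \frac{\sigma}{\sigma_0}
  + \alpha\left(\frac{\sigma}{\sigma_0}\right)^{n},
\qquad
J = J_e(a_e) + J_p,
\qquad
J_p = \alpha\,\sigma_0\,\varepsilon_0\,c\,
      h_1(\text{geometry},\,n)\left(\frac{P}{P_0}\right)^{n+1},
```

where $\sigma_0$, $\varepsilon_0$, $\alpha$ and $n$ are the R-O fit parameters, $c$ is a characteristic ligament length, $P$ the applied load, $P_0$ the plastic limit load, and $h_1$ the tabulated plastic influence function that this study provides for complex-cracked pipes.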

  13. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    Science.gov (United States)

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in quest of improving human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of these multi-omics data sources. In order to provide a holistic understanding of human health and diseases, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based, which depict subjects as nodes and relationships as edges, and kernel-based, which generate a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measurements of classification accuracy and computation time. Seven different integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated with hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms. Graph-based algorithms have the advantage of being computationally faster. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.
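A minimal sketch of the kernel-based integration idea: compute one kernel per data source and combine them before classification (toy data; real methods such as SDP-SVM learn the combination weights and train an SVM rather than the simple leave-one-out nearest-centroid rule used here):

```python
import math

def rbf_kernel(X, gamma):
    """Dense RBF kernel matrix for a list of feature vectors."""
    n = len(X)
    return [[math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
             for j in range(n)] for i in range(n)]

# Two toy "-omics" views of the same 6 subjects (values invented);
# labels: 0 = control, 1 = case.
view1 = [[0.1, 0.2], [0.0, 0.3], [0.2, 0.1], [1.0, 1.1], [0.9, 1.2], [1.1, 0.9]]
view2 = [[5.0], [5.2], [4.9], [7.1], [6.8], [7.0]]
y = [0, 0, 0, 1, 1, 1]

# Kernel-based integration: combine per-source kernels (unweighted average
# here; SDP-SVM-style methods learn these weights instead).
K1, K2 = rbf_kernel(view1, 1.0), rbf_kernel(view2, 0.5)
K = [[(K1[i][j] + K2[i][j]) / 2 for j in range(6)] for i in range(6)]

def kernel_nearest_centroid(K, y, i):
    """Classify subject i by mean combined-kernel similarity to each class,
    excluding subject i itself (leave-one-out)."""
    sims = {c: sum(K[i][j] for j in range(len(y)) if y[j] == c and j != i)
               / (y.count(c) - (y[i] == c)) for c in set(y)}
    return max(sims, key=sims.get)

preds = [kernel_nearest_centroid(K, y, i) for i in range(6)]
print("leave-one-out predictions:", preds)
```

The graph-based alternatives mentioned in the abstract would instead threshold such similarity matrices into subject-graphs and propagate labels along the edges.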

  14. Large scale integration of intermittent renewable energy sources in the Greek power sector

    International Nuclear Information System (INIS)

    Voumvoulakis, Emmanouil; Asimakopoulou, Georgia; Danchev, Svetoslav; Maniatis, George; Tsakanikas, Aggelos

    2012-01-01

    As a member of the European Union, Greece has committed to achieve ambitious targets for the penetration of renewable energy sources (RES) in gross electricity consumption by 2020. Large scale integration of RES requires a suitable mixture of compatible generation units, in order to deal with the intermittency of wind velocity and solar irradiation. The scope of this paper is to examine the impact of large scale integration of intermittent energy sources, required to meet the 2020 RES target, on the generation expansion plan, the fuel mix and the spinning reserve requirements of the Greek electricity system. We perform hourly simulation of the intermittent RES generation to estimate residual load curves on a monthly basis, which are then inputted in a WASP-IV model of the Greek power system. We find that the decarbonisation effort, with the rapid entry of RES and the abolishment of the grandfathering of CO2 allowances, will radically transform the Greek electricity sector over the next 10 years, which has wide-reaching policy implications. - Highlights: ► Greece needs 8.8 to 9.3 GW additional RES installations by 2020. ► RES capacity credit varies between 12.2% and 15.3%, depending on interconnections. ► Without institutional changes, the reserve requirements will be more than double. ► New CCGT installed capacity will probably exceed the cost-efficient level. ► Competitive pressures should be introduced in segments other than day-ahead market.
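The residual load curve construction described above can be sketched as follows (the hourly figures are invented; the study runs this at hourly resolution for each month before feeding the result to WASP-IV):

```python
# Hourly residual load = demand minus intermittent RES infeed, floored at
# zero (surplus hours would be curtailed or exported).
load  = [5200, 5000, 4800, 5100, 5900, 6800, 7400, 7600]  # MW, 8 sample hours
wind  = [1200, 2500, 3100, 2800, 1900,  900,  400,  600]  # MW
solar = [   0,    0,    0,  200,  900, 1500, 1300,  700]  # MW

residual = [max(0.0, l - w - s) for l, w, s in zip(load, wind, solar)]

# Sorting in descending order yields the residual load duration curve that
# the conventional fleet (and a WASP-IV-type expansion model) must cover.
duration_curve = sorted(residual, reverse=True)
print("peak residual load:", duration_curve[0], "MW")
```

The gap between peak residual load and installed RES capacity is what drives the low capacity-credit figures (12.2-15.3%) reported in the highlights.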

  15. Integration of large chemical kinetic mechanisms via exponential methods with Krylov approximations to Jacobian matrix functions

    KAUST Repository

    Bisetti, Fabrizio

    2012-06-01

    Recent trends in hydrocarbon fuel research indicate that the number of species and reactions in chemical kinetic mechanisms is rapidly increasing in an effort to provide predictive capabilities for fuels of practical interest. In order to cope with the computational cost associated with the time integration of stiff, large chemical systems, a novel approach is proposed. The approach combines an exponential integrator and Krylov subspace approximations to the exponential function of the Jacobian matrix. The components of the approach are described in detail and applied to the ignition of stoichiometric methane-air and iso-octane-air mixtures, here described by two widely adopted chemical kinetic mechanisms. The approach is found to be robust even at relatively large time steps and the global error displays a nominal third-order convergence. The performance of the approach is improved by utilising an adaptive algorithm for the selection of the Krylov subspace size, which guarantees an approximation to the matrix exponential within user-defined error tolerance. The Krylov projection of the Jacobian matrix onto a low-dimensional space is interpreted as a local model reduction with a well-defined error control strategy. Finally, the performance of the approach is discussed with regard to the optimal selection of the parameters governing the accuracy of its individual components. © 2012 Copyright Taylor and Francis Group, LLC.
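A minimal sketch of the core idea, approximating exp(hJ)v in a small Krylov subspace built by Arnoldi iteration (this stands in for the paper's integrator; the toy stiff Jacobian below is not a chemistry mechanism, and a production code would use a robust matrix exponential rather than the eigendecomposition shortcut here):

```python
import numpy as np

def arnoldi(J, v, m):
    """Build an m-dimensional Krylov basis V and Hessenberg matrix H
    with J @ V[:, :m] ≈ V @ H (Arnoldi iteration)."""
    n = v.size
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = J @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:         # happy breakdown: subspace is invariant
            return V[:, :j + 1], H[:j + 1, :j + 1], beta
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :m], H[:m, :m], beta

def expm_small(A):
    """Matrix exponential of a small dense matrix via eigendecomposition."""
    w, S = np.linalg.eig(A)
    return (S @ np.diag(np.exp(w)) @ np.linalg.inv(S)).real

def krylov_expv(J, v, h, m=8):
    """Approximate exp(h*J) @ v in an m-dimensional Krylov subspace."""
    V, H, beta = arnoldi(J, v, m)
    e1 = np.zeros(H.shape[0]); e1[0] = 1.0
    return beta * V @ (expm_small(h * H) @ e1)

# Toy stiff linear system standing in for a chemistry Jacobian
rng = np.random.default_rng(0)
J = -np.diag([1.0, 10.0, 100.0, 1000.0]) + 0.1 * rng.standard_normal((4, 4))
v = np.ones(4)
approx = krylov_expv(J, v, h=0.01, m=4)
exact = expm_small(0.01 * J) @ v
print("error:", np.linalg.norm(approx - exact))
```

With the subspace dimension equal to the system size the projection is exact up to roundoff; the practical gain comes from choosing m far smaller than the number of species, and the adaptive selection of m is what provides the user-defined error control described in the abstract.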

  16. SV40 large T-p53 complex: evidence for the presence of two immunologically distinct forms of p53

    International Nuclear Information System (INIS)

    Milner, J.; Gamble, J.

    1985-01-01

    The transforming protein of SV40 is the large T antigen. Large T binds a cellular protein, p53, which is potentially oncogenic by virtue of its functional involvement in the control of cell proliferation. This raises the possibility that p53 may mediate, in part, the transforming function of SV40 large T. Two immunologically distinct forms of p53 have been identified in normal cells: the forms are cell-cycle dependent, one being restricted to nondividing cells (p53-Go) and the second to dividing cells (p53-G÷). The authors have now dissociated and probed the multimeric complex of SV40 large T-p53 for the presence of immunologically distinct forms of p53. Here they present evidence for the presence of p53-Go and p53-G÷ complexed with SV40 large T

  17. Detection of large numbers of novel sequences in the metatranscriptomes of complex marine microbial communities.

    Science.gov (United States)

    Gilbert, Jack A; Field, Dawn; Huang, Ying; Edwards, Rob; Li, Weizhong; Gilna, Paul; Joint, Ian

    2008-08-22

    Sequencing the expressed genetic information of an ecosystem (metatranscriptome) can provide information about the response of organisms to varying environmental conditions. Until recently, metatranscriptomics has been limited to microarray technology and random cloning methodologies. The application of high-throughput sequencing technology is now enabling access to both known and previously unknown transcripts in natural communities. We present a study of a complex marine metatranscriptome obtained from random whole-community mRNA using the GS-FLX Pyrosequencing technology. Eight samples, four DNA and four mRNA, were processed from two time points in a controlled coastal ocean mesocosm study (Bergen, Norway) involving an induced phytoplankton bloom producing a total of 323,161,989 base pairs. Our study confirms the finding of the first published metatranscriptomic studies of marine and soil environments that metatranscriptomics targets highly expressed sequences which are frequently novel. Our alternative methodology increases the range of experimental options available for conducting such studies and is characterized by an exceptional enrichment of mRNA (99.92%) versus ribosomal RNA. Analysis of corresponding metagenomes confirms much higher levels of assembly in the metatranscriptomic samples and a far higher yield of large gene families with >100 members, approximately 91% of which were novel. This study provides further evidence that metatranscriptomic studies of natural microbial communities are not only feasible, but when paired with metagenomic data sets, offer an unprecedented opportunity to explore both structure and function of microbial communities--if we can overcome the challenges of elucidating the functions of so many never-seen-before gene families.

  18. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analysis with focus on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international market for electricity, by locating exports in hours of high prices, are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the first is the degree of electricity excess production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emission in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced in CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants into the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies of large-scale integration of wind power

  19. Development of an integrated genome informatics, data management and workflow infrastructure: A toolbox for the study of complex disease genetics

    Directory of Open Access Journals (Sweden)

    Burren Oliver S

    2004-01-01

    Full Text Available The genetic dissection of complex disease remains a significant challenge. Sample-tracking and the recording, processing and storage of high-throughput laboratory data with public domain data require integration of databases, genome informatics and genetic analyses in an easily updated and scalable format. To find genes involved in multifactorial diseases such as type 1 diabetes (T1D), chromosome regions are defined based on functional candidate gene content, linkage information from humans and animal model mapping information. For each region, genomic information is extracted from Ensembl, converted and loaded into ACeDB for manual gene annotation. Homology information is examined using ACeDB tools and the gene structure verified. Manually curated genes are extracted from ACeDB and read into the feature database, which holds relevant local genomic feature data and an audit trail of laboratory investigations. Public domain information, manually curated genes, polymorphisms, primers, linkage and association analyses, with links to our genotyping database, are shown in Gbrowse. This system scales to include genetic, statistical, quality control (QC) and biological data such as expression analyses of RNA or protein, all linked from a genomics integrative display. Our system is applicable to any genetic study of complex disease, of either large or small scale.

  20. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme that allows different alternative realizations to be compared. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. The method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighted against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. This means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows competing system designs to be evaluated and compared with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.
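    The weighting-and-aggregation step over the goal-tree dimensions can be sketched as a simple additive model from normative decision theory (illustrative only; the dimension names, scores and weights below are invented, not taken from the method itself):

```python
# Hedged sketch: additive aggregation of per-dimension scores with weights
# normalized to sum to 1, reflecting one enterprise's integration priorities.
def weighted_score(scores, weights):
    """Weighted-sum aggregation of per-dimension scores in [0, 1]."""
    total_w = sum(weights.values())
    return sum(scores[dim] * w / total_w for dim, w in weights.items())

# Hypothetical assessment of one candidate architecture.
scores  = {"communication": 0.8, "interfaces": 0.6, "software": 0.7}
weights = {"communication": 2.0, "interfaces": 1.0, "software": 1.0}
print(weighted_score(scores, weights))
```

Running the same aggregation over each competing design with a shared weight set gives the comparable totals the method calls for.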

  1. European wind integration study (EWIS). Towards a successful integration of large scale wind power into European electricity grids. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Winter, W.

    2010-03-15

    Large capacities of wind generators have already been installed and are operating in Germany (26GW) and Spain (16GW). Installations which are just as significant in proportion to system size are also established in Denmark (3.3GW), the All Island Power System of Ireland and Northern Ireland (1.5GW), and Portugal (3.4GW). Many other countries expect significant growth in wind generation, such that the total currently installed capacity in Europe of 68GW is expected to at least double by 2015. Yet further increases can be expected in order to achieve Europe's 2020 targets for renewable energy. The scale of this development poses major challenges for wind generation developers in terms of obtaining suitable sites, delivering large construction projects, and financing the associated investments from their operations. Such developments also impact the networks, and it was to address the immediate transmission-related challenges that the European Wind Integration Study (EWIS) was initiated by Transmission System Operators (TSOs), with the objective of ensuring the most effective integration of large-scale wind generation into Europe's transmission networks and electricity system. The challenges anticipated and addressed include: 1) How to efficiently accommodate wind generation when markets and transmission access arrangements have evolved for the needs of traditional controllable generation. 2) How to ensure supplies remain secure as wind varies (establishing the required backup/reserves for low-wind days and wind forecast errors, as well as managing network congestion in windy conditions). 3) How to maintain the quality and reliability of supplies given the new generation characteristics. 4) How to achieve efficient network costs by suitable design and operation of network connections, the deeper infrastructure including offshore connections, and cross-border interconnections. EWIS has focused on the immediate network related challenges by analysing detailed

  2. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    Science.gov (United States)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides particles to high-probability regions via an implicit step. It optimizes a nonlinear cost function which can be inherited from legacy assimilation routines. Dynamic state estimation for near real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New state estimation tools intended to replace the older generation of estimators should provide a general mathematical framework that accommodates legacy software, allowing the power industry the cautious, evolutionary change it needs rather than a complete revolution, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a tool for estimating the states of a power system and presents the first implicit particle filter application study on power system state estimation. The filter is introduced into power systems and simulations are presented for a three-node benchmark power system. The performance of the filter on this problem is analyzed and the results are discussed.
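    As a point of reference, a plain bootstrap particle filter on a scalar toy model can be sketched as follows (a minimal illustration, not the implicit variant used in the study; the implicit filter additionally solves an optimization problem per particle to place it in a high-probability region, a step omitted here):

```python
import random, math

# Toy model: x_t = 0.9 * x_{t-1} + process noise, y_t = x_t + obs noise.
# Bootstrap filtering: propagate, weight by likelihood, estimate, resample.
def bootstrap_filter(observations, n=500, seed=1):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]  # prior ensemble
    estimates = []
    for y in observations:
        # propagate particles through the dynamics
        particles = [0.9 * p + rng.gauss(0.0, 0.3) for p in particles]
        # weight by the Gaussian observation likelihood (obs std = 0.5)
        weights = [math.exp(-0.5 * ((y - p) / 0.5) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

print(bootstrap_filter([1.0, 0.8, 0.9]))
```

The estimates track the observations while the dynamics pull them toward the prior, which is the basic behaviour any power-system dynamic state estimator builds on.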

  3. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks

    Directory of Open Access Journals (Sweden)

    Raja Jurdak

    2008-11-01

    Full Text Available Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.
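    A high-level policy of the kind described, one that adapts the resolution of information collected at the sensors, could look like the following sketch (the function, thresholds and bounds are hypothetical, not the system's actual API):

```python
# Illustrative policy: adapt a sensor's sampling interval to battery state
# and to how quickly the monitored value is changing.
def next_interval(current_s, battery_pct, change_rate, min_s=30, max_s=3600):
    """Lengthen the interval when battery is low or readings are stable;
    shorten it when the monitored value changes quickly."""
    if battery_pct < 20:
        current_s *= 2                    # conserve energy
    elif change_rate > 0.1:
        current_s //= 2                   # capture fast dynamics
    else:
        current_s = int(current_s * 1.2)  # relax slowly when stable
    return max(min_s, min(max_s, current_s))

print(next_interval(300, battery_pct=15, change_rate=0.05))
```

The web portal described above would set the thresholds and bounds per application; each gateway then applies the policy locally so the mesh only carries data at the resolution the user asked for.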

  4. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks.

    Science.gov (United States)

    Jurdak, Raja; Nafaa, Abdelhamid; Barbirato, Alessio

    2008-11-24

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  5. Enabling the Integrated Assessment of Large Marine Ecosystems: Informatics to the Forefront of Science-Based Decision Support

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; Beaulieu, S. E.; Maffei, A. R.; West, P.; Hare, J. A.

    2012-12-01

    Integrated assessments of large marine ecosystems require an understanding of the interactions between environmental, ecological, and socio-economic factors that affect the production and utilization of marine natural resources. Assessing the functioning of complex coupled natural-human systems calls for collaboration between natural and social scientists across disciplinary and national boundaries. We are developing a platform to implement and sustain informatics solutions for these applications, providing interoperability among very diverse and heterogeneous data and information sources, as well as multi-disciplinary organizations and people. We have partnered with NOAA NMFS scientists to facilitate the deployment of an integrated ecosystem approach to management in the Northeast U.S. (NES) and California Current Large Marine Ecosystems (LMEs). Our platform will facilitate collaboration and knowledge sharing among NMFS natural and social scientists, promoting community participation in integrating data, models, and knowledge. Here, we present collaborative software tools developed to aid the production of the Ecosystem Status Report (ESR) for the NES LME. The ESR addresses the D-P-S portion of the DPSIR (Driver-Pressure-State-Impact-Response) management framework: reporting data, indicators, and information products for climate drivers, physical and human (fisheries) pressures, and ecosystem state (primary and secondary production and higher trophic levels). We are developing our tools as open-source software, with the main tool based on a web application capable of working with multiple data types from a variety of sources, providing an effective way to share the source code used to generate data products and associated metadata, as well as to track workflow provenance, allowing the reproducibility of a data product. Our platform retrieves data, conducts standard analyses, reports data quality and other standardized metadata, provides iterative

  6. GEOMETRIC COMPLEXITY ANALYSIS IN AN INTEGRATIVE TECHNOLOGY EVALUATION MODEL (ITEM) FOR SELECTIVE LASER MELTING (SLM)

    Directory of Open Access Journals (Sweden)

    S. Merkt

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Selective laser melting (SLM) is becoming an economically viable choice for manufacturing complex serial parts. This paper focuses on a geometric complexity analysis as part of the integrative technology evaluation model (ITEM) presented here. In contrast to conventional evaluation methodologies, the ITEM considers interactions between product and process innovations generated by SLM. The evaluation of manufacturing processes that compete with SLM is the main goal of ITEM. The paper includes a complexity analysis of a test part from Festo AG. The paper closes with a discussion of how the expanded design freedom of SLM can be used to improve company operations, and how the complexity analysis presented here can be seen as a starting point for feature-based complexity analysis.

    AFRIKAANSE OPSOMMING (translated): Selective laser melting is gradually becoming an economically viable choice for the manufacture of complex serial parts. The research focuses on the analysis of geometric complexity as part of an integrative technology evaluation model. Compared with conventional evaluation models, the method considers interactions between the product and process innovations that are generated. The research covers a complexity analysis of a test part from the firm Festo AG. The result shows how complexity analysis can be used as the starting point for feature-based analysis.

  7. Electromagnetic scattering of large structures in layered earths using integral equations

    Science.gov (United States)

    Xiong, Zonghou; Tripp, Alan C.

    1995-07-01

    An electromagnetic scattering algorithm for large conductivity structures in stratified media has been developed, based on the method of system iteration and spatial symmetry reduction using volume electric integral equations. The method of system iteration divides a structure into many substructures and solves the resulting matrix equation using a block iterative method. The block submatrices usually need to be stored on disk in order to save computer core memory. However, this requires a large disk for large structures. If the body is discretized into equal-size cells it is possible to use the spatial symmetry relations of the Green's functions to regenerate the scattering impedance matrix in each iteration, thus avoiding expensive disk storage. Numerical tests show that the system iteration converges much faster than the conventional point-wise Gauss-Seidel iterative method. The number of cells does not significantly affect the rate of convergence. Thus the algorithm effectively reduces the solution of the scattering problem to an order of O(N^2), instead of O(N^3) as with direct solvers.
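    The system-iteration idea, sweeping over blocks of unknowns instead of factorizing the full matrix, can be sketched on a small dense system (a pure-Python toy, not the electromagnetic solver; for simplicity the updates within each block are applied point-wise rather than by a block solve):

```python
# Hedged sketch: solve A x = b by Gauss-Seidel sweeps grouped into blocks
# ("substructures"), iterating until the updates settle.
def block_gauss_seidel(A, b, blocks, iters=50):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for blk in blocks:          # one substructure at a time
            for i in blk:
                # use the latest values of all other unknowns
                s = sum(A[i][j] * x[j] for j in range(n) if j != i)
                x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant test system; its exact solution is x = [1, 1, 1, 1].
A = [[4.0, 1.0, 0.0, 0.0],
     [1.0, 4.0, 1.0, 0.0],
     [0.0, 1.0, 4.0, 1.0],
     [0.0, 0.0, 1.0, 4.0]]
b = [5.0, 6.0, 6.0, 5.0]
print(block_gauss_seidel(A, b, blocks=[[0, 1], [2, 3]]))
```

Because only one block's equations are active at a time, the remaining submatrices can live on disk, or, as in the algorithm above, be regenerated from symmetry each sweep.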

  8. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  9. Large-scale offshore wind energy. Cost analysis and integration in the Dutch electricity market

    International Nuclear Information System (INIS)

    De Noord, M.

    1999-02-01

    The results of an analysis of the construction and integration costs of large-scale offshore wind energy (OWE) farms in 2010 are presented. The integration of these farms (1 and 5 GW) into the Dutch electricity distribution system has been considered against the background of a liberalised electricity market. A first step is taken toward determining the costs involved in solving integration problems. Three different types of foundations are examined: the mono-pile, the jacket and a new type of foundation, the concrete caisson pile, all single-turbine-single-support structures. For real offshore applications (>10 km offshore, at sea depths >20 m), the concrete caisson pile is regarded as the most suitable. The price/power ratios of wind turbines are analysed. It is assumed that in 2010 turbines in the power range of 3-5 MW are available. The main calculations have been conducted for a 3 MW turbine. The main choice in electrical infrastructure is between AC and DC. Calculations show that at distances of 30 km offshore and more, the use of HVDC will result in higher initial costs but lower operating costs. The share of operating and maintenance (O&M) costs in the kWh cost price is approximately 3.3%. To be able to compare the two farms, a base case is derived with a construction time of 10 years for both. The energy yield is calculated for an offshore wind regime with an annual mean wind speed of 9.0 m/s. Per 3 MW turbine this results in an annual energy production of approximately 12 GWh. The total farm efficiency amounts to 82%, resulting in a total farm capacity factor of 38%. With a required internal rate of return of 15%, the kWh cost price amounts to 0.24 DFl and 0.21 DFl for the 1 GW and 5 GW farms respectively in the base case. The required internal rate of return has a large effect on the kWh cost price, followed by the costs of subsystems. O&M costs have little effect on the cost price.
Parameter studies show that a small cost reduction of 5% is possible when
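    The yield and efficiency figures quoted above can be checked in a few lines (rounded numbers from the abstract; 8760 hours per year assumed):

```python
# Capacity factor = actual annual energy / (rated power * hours in a year).
def capacity_factor(energy_gwh, rated_mw, hours=8760):
    return energy_gwh * 1000 / (rated_mw * hours)  # GWh -> MWh

turbine_cf = capacity_factor(12, 3)   # ~12 GWh/yr from a 3 MW turbine
farm_cf = turbine_cf * 0.82           # apply the 82% total farm efficiency
print(round(turbine_cf, 3), round(farm_cf, 3))
```

A single-turbine capacity factor of roughly 46%, reduced by the 82% farm efficiency, lands close to the quoted farm capacity factor of 38%.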

  10. Predicting co-complexed protein pairs using genomic and proteomic data integration

    Directory of Open Access Journals (Sweden)

    King Oliver D

    2004-04-01

    Full Text Available Abstract Background Identifying all protein-protein interactions in an organism is a major objective of proteomics. A related goal is to know which protein pairs are present in the same protein complex. High-throughput methods such as yeast two-hybrid (Y2H) and affinity purification coupled with mass spectrometry (APMS) have been used to detect interacting proteins on a genomic scale. However, both Y2H and APMS methods have substantial false-positive rates. Aside from high-throughput interaction screens, other gene- or protein-pair characteristics may also be informative of physical interaction. Therefore it is desirable to integrate multiple datasets and utilize their different predictive value for more accurate prediction of co-complexed relationship. Results Using a supervised machine learning approach – probabilistic decision tree – we integrated high-throughput protein interaction datasets and other gene- and protein-pair characteristics to predict co-complexed pairs (CCPs) of proteins. Our predictions proved more sensitive and specific than predictions based on Y2H or APMS methods alone or in combination. Among the top predictions not annotated as CCPs in our reference set (obtained from the MIPS complex catalogue), a significant fraction was found to physically interact according to a separate database (YPD, Yeast Proteome Database), and the remaining predictions may potentially represent unknown CCPs. Conclusions We demonstrated that the probabilistic decision tree approach can be successfully used to predict co-complexed protein (CCP) pairs from other characteristics. Our top-scoring CCP predictions provide testable hypotheses for experimental validation.
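    How a probabilistic decision tree turns integrated evidence into a CCP probability can be illustrated with a hand-built toy (the splits, feature names and leaf probabilities below are invented for illustration, not taken from the study; real leaf probabilities come from class frequencies in training data):

```python
# Toy probabilistic decision tree: each leaf stores a probability that a
# protein pair is co-complexed, given the evidence that routed it there.
TREE = {
    "feature": "apms_hit",
    "yes": {"feature": "y2h_hit",
            "yes": {"prob": 0.85},     # supported by both screens
            "no":  {"prob": 0.55}},    # APMS only
    "no":  {"feature": "coexpressed",
            "yes": {"prob": 0.20},     # indirect evidence only
            "no":  {"prob": 0.02}},    # no supporting evidence
}

def ccp_probability(pair, node=TREE):
    """Route a pair's evidence down the tree and return the leaf probability."""
    while "prob" not in node:
        node = node["yes"] if pair[node["feature"]] else node["no"]
    return node["prob"]

pair = {"apms_hit": True, "y2h_hit": False, "coexpressed": True}
print(ccp_probability(pair))
```

Ranking all pairs by this probability is what yields the "top-scoring CCP predictions" offered above as testable hypotheses.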

  11. Systems and complexity thinking in the general practice literature: an integrative, historical narrative review.

    Science.gov (United States)

    Sturmberg, Joachim P; Martin, Carmel M; Katerndahl, David A

    2014-01-01

    Over the past 7 decades, theories in the systems and complexity sciences have had a major influence on academic thinking and research. We assessed the impact of complexity science on general practice/family medicine. We performed a historical integrative review using the following systematic search strategy: medical subject heading [humans] combined in turn with the terms complex adaptive systems, nonlinear dynamics, systems biology, and systems theory, limited to general practice/family medicine and published before December 2010. A total of 16,242 articles were retrieved, of which 49 were published in general practice/family medicine journals. Hand searches and snowballing retrieved another 35. After a full-text review, we included 56 articles dealing specifically with systems sciences and general/family practice. General practice/family medicine engaged with the emerging systems and complexity theories in 4 stages. Before 1995, articles tended to explore common phenomenologic general practice/family medicine experiences. Between 1995 and 2000, articles described the complex adaptive nature of this discipline. Those published between 2000 and 2005 focused on describing the system dynamics of medical practice. After 2005, articles increasingly applied the breadth of complex science theories to health care, health care reform, and the future of medicine. This historical review describes the development of general practice/family medicine in relation to complex adaptive systems theories, and shows how systems sciences more accurately reflect the discipline's philosophy and identity. Analysis suggests that general practice/family medicine first embraced systems theories through conscious reorganization of its boundaries and scope, before applying empirical tools. 
Future research should concentrate on applying nonlinear dynamics and empirical modeling to patient care, and to organizing and developing local practices, engaging in community development, and influencing

  12. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

    Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  13. Convergent Innovation in Emerging Healthcare Technology Ecosystems: Addressing Complexity and Integration

    Directory of Open Access Journals (Sweden)

    Mark A. Phillips

    2017-09-01

    Full Text Available Precision Medicine and Digital Health are emerging areas in healthcare, and they are underpinned by convergent or cross-industry innovation. However, convergence results in greater uncertainty and complexity in terms of technologies, value networks, and organization. There has been limited empirical research on emerging and convergent ecosystems, especially in addressing the issue of integration. This research identifies how organizations innovate in emerging and convergent ecosystems, specifically, how they address the challenge of integration. We base our research on empirical analyses using a series of longitudinal case studies employing a combination of case interviews, field observations, and documents. Our findings identify a need to embrace the complexity by adopting a variety of approaches that balance “credibility-seeking” and “advantage-seeking” behaviours, to navigate, negotiate, and nurture both the innovation and ecosystem, in addition to a combination of “analysis” and “synthesis” actions to manage aspects of integration. We contribute to the convergent innovation agenda and provide practical approaches for innovators in this domain.

  14. Pan-Cancer Mutational and Transcriptional Analysis of the Integrator Complex

    Directory of Open Access Journals (Sweden)

    Antonio Federico

    2017-04-01

    Full Text Available The integrator complex has been recently identified as a key regulator of RNA Polymerase II-mediated transcription, with many functions including the processing of small nuclear RNAs, the pause-release and elongation of polymerase during the transcription of protein coding genes, and the biogenesis of enhancer derived transcripts. Moreover, some of its components also play a role in genome maintenance. Thus, it is reasonable to hypothesize that their functional impairment or altered expression can contribute to malignancies. Indeed, several studies have described the mutations or transcriptional alteration of some Integrator genes in different cancers. Here, to draw a comprehensive pan-cancer picture of the genomic and transcriptomic alterations for the members of the complex, we reanalyzed public data from The Cancer Genome Atlas. Somatic mutations affecting Integrator subunit genes and their transcriptional profiles have been investigated in about 11,000 patients and 31 tumor types. A general heterogeneity in the mutation frequencies was observed, mostly depending on tumor type. Despite the fact that we could not establish them as cancer drivers, INTS7 and INTS8 genes were highly mutated in specific cancers. A transcriptome analysis of paired (normal and tumor) samples revealed that the transcription of INTS7, INTS8, and INTS13 is significantly altered in several cancers. Experimental validation performed on primary tumors confirmed these findings.

  15. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

    Full Text Available Today’s e-learning systems meet the challenge to provide interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Therefore, learning experiences enriched with complex didactic resources - such as virtualized collaborations and serious games - have emerged. In this extension of [1] an integrated model for e-assessment (IMA) is outlined, which incorporates complex learning resources and assessment forms as main components for the development of an enriched learning experience. For a validation the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation led to several refinements of the model, which mainly concern the component forms of assessment and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  16. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    Science.gov (United States)

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  17. Talking about the institutional complexity of the integrated rehabilitation system – the importance of coordination

    Directory of Open Access Journals (Sweden)

    Sari Miettinen

    2013-03-01

    Full Text Available Rehabilitation in Finland is a good example of functions divided among several welfare sectors, such as health services and social services. The rehabilitation system in Finland is a complex one and there have been many efforts to create a coordinated entity. The purpose of this study is to open up a complex welfare system at the upper policy level and to understand the meaning of coordination at the level of service delivery. We shed light in particular on the national rehabilitation policy in Finland and how the policy has tried to overcome the negative effects of institutional complexity. In this study we used qualitative content analysis and frame analysis. As a result we identified four different welfare state frames with distinct features of policy problems, policy alternatives and institutional failure. The rehabilitation policy in Finland seems to be divided into different components, which may cause problems at the level of service delivery and thus in the integration of services. Bringing these components together could, at the policy level, enable a shared view of the rights of different population groups, effective management of integration at the level of service delivery and also an opportunity for change throughout the rehabilitation system.

  18. Talking about the institutional complexity of the integrated rehabilitation system-the importance of coordination.

    Science.gov (United States)

    Miettinen, Sari; Ashorn, Ulla; Lehto, Juhani

    2013-01-01

    Rehabilitation in Finland is a good example of functions divided among several welfare sectors, such as health services and social services. The rehabilitation system in Finland is a complex one and there have been many efforts to create a coordinated entity. The purpose of this study is to open up a complex welfare system at the upper policy level and to understand the meaning of coordination at the level of service delivery. We shed light in particular on the national rehabilitation policy in Finland and how the policy has tried to overcome the negative effects of institutional complexity. In this study we used qualitative content analysis and frame analysis. As a result we identified four different welfare state frames with distinct features of policy problems, policy alternatives and institutional failure. The rehabilitation policy in Finland seems to be divided into different components, which may cause problems at the level of service delivery and thus in the integration of services. Bringing these components together could, at the policy level, enable a shared view of the rights of different population groups, effective management of integration at the level of service delivery and also an opportunity for change throughout the rehabilitation system.

  19. Large scale IRAM 30 m CO-observations in the giant molecular cloud complex W43

    Science.gov (United States)

    Carlhoff, P.; Nguyen Luong, Q.; Schilke, P.; Motte, F.; Schneider, N.; Beuther, H.; Bontemps, S.; Heitsch, F.; Hill, T.; Kramer, C.; Ossenkopf, V.; Schuller, F.; Simon, R.; Wyrowski, F.

    2013-12-01

We aim to fully describe the distribution and location of dense molecular clouds in the giant molecular cloud complex W43. It was previously identified as one of the most massive star-forming regions in our Galaxy. To trace the moderately dense molecular clouds in the W43 region, we initiated W43-HERO, a large program using the IRAM 30 m telescope, which covers a wide dynamic range of scales from 0.3 to 140 pc. We obtained on-the-fly maps in 13CO (2-1) and C18O (2-1) with a high spectral resolution of 0.1 km s^-1 and a spatial resolution of 12''. These maps cover an area of ~1.5 square degrees and include the two main clouds of W43 and the lower density gas surrounding them. A comparison to Galactic models and previous distance calculations confirms the location of W43 near the tangential point of the Scutum arm at approximately 6 kpc from the Sun. The resulting intensity cubes of the observed region are separated into subcubes, which are centered on single clouds and then analyzed in detail. The optical depth, excitation temperature, and H2 column density maps are derived from the 13CO and C18O data. These results are then compared to those derived from Herschel dust maps. The mass of a typical cloud is several 10^4 M⊙, while the total mass in the dense molecular gas (>10^2 cm^-3) in W43 is found to be ~1.9 × 10^6 M⊙. Probability distribution functions obtained from column density maps derived from molecular line data and Herschel imaging show a log-normal distribution for low column densities and a power-law tail for high densities. A flatter slope for the molecular line data probability distribution function may imply that those selectively show the gravitationally collapsing gas. Appendices are available in electronic form at http://www.aanda.org. The final datacubes (13CO and C18O) for the entire survey are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/560/A24
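The log-normal body and power-law tail described above can be sketched numerically (synthetic data only; the column densities, bin choices and fit threshold below are illustrative assumptions, not the survey's pipeline):

```python
import numpy as np

# Synthetic stand-in for an H2 column-density map (cm^-2): a log-normal
# body plus a power-law tail of high-density pixels (invented parameters).
rng = np.random.default_rng(0)
body = rng.lognormal(mean=np.log(1e22), sigma=0.5, size=50_000)
tail = 1e22 * (1.0 + rng.pareto(a=2.0, size=2_000))
n_h2 = np.concatenate([body, tail])

# PDF of the logarithmic column density eta = ln(N / <N>)
eta = np.log(n_h2 / n_h2.mean())
hist, edges = np.histogram(eta, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Fit the high-density tail with a power law: ln p(eta) ~ slope * eta
mask = (centers > 1.0) & (hist > 0)
slope = np.polyfit(centers[mask], np.log(hist[mask]), 1)[0]
print(f"fitted tail slope: {slope:.2f}")
```

A shallower (less negative) fitted slope would correspond to the flatter tail the authors attribute to gravitationally collapsing gas.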

  20. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogeneous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...
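The breakout idea behind DisBO can be illustrated with a centralized toy version on a graph-colouring constraint problem (the instance, weights and move rules below are a simplified sketch, not the thesis's distributed implementation):

```python
import random

def breakout_coloring(edges, n_vars, n_colors=3, max_steps=2000, seed=1):
    """Breakout local search for graph colouring (a simplified, centralized
    stand-in for DisBO): each constraint carries a weight, and weights of
    violated constraints grow at local minima, reshaping the cost surface
    so the search can escape."""
    rng = random.Random(seed)
    colors = [rng.randrange(n_colors) for _ in range(n_vars)]
    weight = {e: 1 for e in edges}

    def cost(assign):
        return sum(weight[e] for e in edges if assign[e[0]] == assign[e[1]])

    for _ in range(max_steps):
        current = cost(colors)
        if current == 0:
            return colors  # all constraints satisfied
        best = (0, None, None)  # (improvement, variable, new color)
        for v in range(n_vars):
            old = colors[v]
            for c in range(n_colors):
                if c == old:
                    continue
                colors[v] = c
                gain = current - cost(colors)
                if gain > best[0]:
                    best = (gain, v, c)
            colors[v] = old
        if best[1] is None:  # local minimum: "break out" by raising weights
            for e in edges:
                if colors[e[0]] == colors[e[1]]:
                    weight[e] += 1
        else:
            colors[best[1]] = best[2]
    return colors

# Toy instance: colour a 5-cycle with 3 colours
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
solution = breakout_coloring(edges, n_vars=5)
print(all(solution[a] != solution[b] for a, b in edges))
```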

  1. An Integrated Approach for Monitoring Contemporary and Recruitable Large Woody Debris

    Directory of Open Access Journals (Sweden)

    Jeffrey J. Richardson

    2016-09-01

    Full Text Available Large woody debris (LWD plays a critical structural role in riparian ecosystems, but it can be difficult and time-consuming to quantify and survey in the field. We demonstrate an automated method for quantifying LWD using aerial LiDAR and object-based image analysis techniques, as well as a manual method for quantifying LWD using image interpretation derived from LiDAR rasters and aerial four-band imagery. In addition, we employ an established method for estimating the number of individual trees within the riparian forest. These methods are compared to field data showing high accuracies for the LWD method and moderate accuracy for the individual tree method. These methods can be integrated to quantify the contemporary and recruitable LWD in a river system.

  2. Large deviations and Lifshitz singularity of the integrated density of states of random Hamiltonians

    International Nuclear Information System (INIS)

    Kirsch, W.; Martinelli, F.

    1983-01-01

    We consider the integrated density of states (IDS) ρ_∞(λ) of the random Hamiltonian H_Λ = −Δ + V_Λ, where V_Λ is a random field on R^d satisfying a mixing condition. We prove that the probability of large fluctuations of the finite-volume IDS |Λ|^(−1) ρ(λ, H_Λ), Λ ⊂ R^d, around the thermodynamic limit ρ_∞(λ) is bounded from above by exp[−k|Λ|], k > 0. In this case ρ_∞(λ) can be recovered from a variational principle. Furthermore, we show the existence of a Lifshitz-type singularity of ρ_∞(λ) as λ → 0^+ in the case where V_Λ is non-negative. More precisely, we prove the bound ρ_∞(λ) ≤ exp[−kλ^(−d/2)] as λ → 0^+, k > 0. This last result is then discussed in some examples. (orig.)

  3. The integration of novel diagnostics techniques for multi-scale monitoring of large civil infrastructures

    Directory of Open Access Journals (Sweden)

    F. Soldovieri

    2008-11-01

    Full Text Available In recent years, structural monitoring of large infrastructures (buildings, dams, bridges or, more generally, man-made structures) has attracted increased attention due to the growing interest in safety and security issues and in risk assessment through early detection. In this framework, the aim of the paper is to introduce a new integrated approach which combines two sensing techniques acting on different spatial and temporal scales. The first one is a distributed optic fiber sensor based on the Brillouin scattering phenomenon, which allows spatially and temporally continuous monitoring of the structure with a "low" spatial resolution (of the order of a meter). The second technique is based on the use of Ground Penetrating Radar (GPR), which can provide detailed images of the inner status of the structure (with a spatial resolution of less than tens of centimeters) but does not allow temporally continuous monitoring. The paper describes the features of these two techniques and provides experimental results concerning preliminary test cases.

  4. Analysis of large electromagnetic pulse simulators using the electric field integral equation method in time domain

    International Nuclear Information System (INIS)

    Jamali, J.; Aghajafari, R.; Moini, R.; Sadeghi, H.

    2002-01-01

    A time-domain approach is presented to calculate electromagnetic fields inside a large Electromagnetic Pulse (EMP) simulator. This type of EMP simulator is used for studying the effect of electromagnetic pulses on electrical apparatus in various structures such as vehicles, aeroplanes, etc. The simulator consists of three planar transmission lines. To solve the problem, we first model the metallic structure of the simulator as a grid of conducting wires. The numerical solution of the governing electric field integral equation is then obtained using the method of moments in the time domain. To demonstrate the accuracy of the model, we consider a typical EMP simulator. The comparison of our results with those obtained experimentally in the literature validates the model introduced in this paper.

  5. 'Take the long way down': Integration of large-scale North Sea wind using HVDC transmission

    International Nuclear Information System (INIS)

    Weigt, Hannes; Jeske, Till; Leuthold, Florian; Hirschhausen, Christian von

    2010-01-01

    We analyze the impact of extensive wind development in Germany for the year 2015, focusing on grid extensions and price signals. We apply the electricity generation and network model ELMOD to compare zonal, nodal, and uniform pricing approaches. In addition to a reference case of network extensions recommended by the German Energy Agency (Dena), we develop a scenario to transmit wind energy to major load centers in Western and Southern Germany via high-voltage direct current (HVDC) connections. From an economic-engineering standpoint, our results indicate that these connections are the most economic way to manage the integration of large-scale offshore wind resources, and that nodal pricing is most likely to determine the locales for future investment to eliminate congestion. We conclude with a description of the model's potential limitations.

  6. The large-scale integration of wind generation: Impacts on price, reliability and dispatchable conventional suppliers

    International Nuclear Information System (INIS)

    MacCormack, John; Hollis, Aidan; Zareipour, Hamidreza; Rosehart, William

    2010-01-01

    This work examines the effects of large-scale integration of wind powered electricity generation in a deregulated energy-only market on loads (in terms of electricity prices and supply reliability) and on dispatchable conventional power suppliers. Hourly models of wind generation time series, load and resultant residual demand are created. From these a non-chronological residual demand duration curve is developed, which is combined with a probabilistic model of dispatchable conventional generator availability, a model of an energy-only market with a price cap, and a model of generator costs and dispatch behavior. A number of simulations are performed to evaluate the effect on electricity prices, the overall reliability of supply, the ability of a dominant supplier acting strategically to profitably withhold supplies, and the fixed cost recovery of dispatchable conventional power suppliers at different levels of wind generation penetration. Medium- and long-term responses of the market and/or the regulator are discussed.
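The non-chronological residual demand duration curve mentioned above is straightforward to construct; a sketch with synthetic hourly data (the load shape and wind model are invented placeholders, not the study's inputs):

```python
import numpy as np

# Synthetic hourly series for one year: a daily load cycle plus noise,
# and a hypothetical wind fleet output drawn from a Weibull distribution.
rng = np.random.default_rng(42)
hours = 8760
load = 8000 + 1500 * np.sin(np.arange(hours) * 2 * np.pi / 24) \
       + rng.normal(0, 300, hours)                  # MW
wind = rng.weibull(2.0, hours) * 1200               # MW

# Residual demand = load minus wind; sorting it in descending order
# yields the non-chronological residual demand duration curve.
residual = load - wind
duration_curve = np.sort(residual)[::-1]

print(f"peak residual demand: {duration_curve[0]:.0f} MW")
print(f"hours above 9000 MW: {(duration_curve > 9000).sum()}")
```

Intersecting such a curve with generator availability and cost models, as the paper does, then gives expected prices and fixed-cost recovery at each wind penetration level.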

  7. Impurity engineering of Czochralski silicon used for ultra large-scale integrated circuits

    Science.gov (United States)

    Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin

    2009-01-01

    Impurities in Czochralski silicon (Cz-Si) used for ultra large-scale integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, a review of recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon is presented. It has been suggested that these impurities enhance oxygen precipitation and create both denser bulk microdefects and a sufficiently wide denuded zone, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism of impurity doping on the internal gettering structure is interpreted and a new concept of 'impurity engineering' for Cz-Si used for ULSI is proposed.

  8. Integration of large amounts of wind power. Markets for trading imbalances

    Energy Technology Data Exchange (ETDEWEB)

    Neimane, Viktoria; Axelsson, Urban [Vattenfall Research and Development AB, Stockholm (Sweden); Gustafsson, Johan; Gustafsson, Kristian [Vattenfall Nordic Generation Management, Stockholm (Sweden); Murray, Robin [Vattenfall Vindkraft AB, Stockholm (Sweden)

    2008-07-01

    The well-known concerns about wind power relate to its intermittent nature and the difficulty of making exact forecasts. The expected increase in balancing and reserve requirements due to wind power has been investigated in several studies. This paper takes the next step in studying the integration of large amounts of wind power in Sweden. The perspective of several wind power producers and their corresponding balance providers is taken, and their imbalance costs are modeled. Larger producers having wind power spread over larger geographical areas will have lower relative costs than producers having their units concentrated within a limited geographical area. Possibilities for wind power producers to reduce imbalance costs by acting on the after-sales market are presented and compared. (orig.)

  9. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    International Nuclear Information System (INIS)

    Taylor, J'Tia Patrice; Shropshire, David E.

    2009-01-01

    This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. 
Economic issues include the cost differential arising due to an integrated system

  10. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    Energy Technology Data Exchange (ETDEWEB)

    J' Tia Patrice Taylor; David E. Shropshire

    2009-09-01

    Abstract This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. 
Economic issues include the cost differential arising due to an integrated

  11. A Hybrid Neuro-Fuzzy Model For Integrating Large Earth-Science Datasets

    Science.gov (United States)

    Porwal, A.; Carranza, J.; Hale, M.

    2004-12-01

    A GIS-based hybrid neuro-fuzzy approach to integration of large earth-science datasets for mineral prospectivity mapping is described. It implements a Takagi-Sugeno type fuzzy inference system in the framework of a four-layered feed-forward adaptive neural network. Each unique combination of the datasets is considered a feature vector whose components are derived by knowledge-based ordinal encoding of the constituent datasets. A subset of feature vectors with a known output target vector (i.e., unique conditions known to be associated with either a mineralized or a barren location) is used for the training of an adaptive neuro-fuzzy inference system. Training involves iterative adjustment of parameters of the adaptive neuro-fuzzy inference system using a hybrid learning procedure for mapping each training vector to its output target vector with minimum sum of squared error. The trained adaptive neuro-fuzzy inference system is used to process all feature vectors. The output for each feature vector is a value that indicates the extent to which a feature vector belongs to the mineralized class or the barren class. These values are used to generate a prospectivity map. The procedure is demonstrated by an application to regional-scale base metal prospectivity mapping in a study area located in the Aravalli metallogenic province (western India). A comparison of the hybrid neuro-fuzzy approach with pure knowledge-driven fuzzy and pure data-driven neural network approaches indicates that the former offers a superior method for integrating large earth-science datasets for predictive spatial mathematical modelling.
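A minimal zero-order Takagi-Sugeno inference step, of the kind such an adaptive network wraps and tunes, might look like this (membership centres and rule outputs are hand-picked placeholders, not learned parameters):

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function centred at c with width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno(x):
    """Zero-order Takagi-Sugeno inference with two rules over one input.
    In an ANFIS the centres, widths and rule outputs below would be
    adjusted iteratively to minimise the sum of squared errors."""
    w1 = gauss(x, c=0.0, s=1.0)   # rule 1: x is "low"  -> output 0.1
    w2 = gauss(x, c=3.0, s=1.0)   # rule 2: x is "high" -> output 0.9
    return (w1 * 0.1 + w2 * 0.9) / (w1 + w2)  # weighted-average defuzzification

print(round(sugeno(0.0), 3))  # near 0.1: the "low" rule dominates
print(round(sugeno(3.0), 3))  # near 0.9: the "high" rule dominates
```

In the prospectivity-mapping setting, the input would be an encoded feature vector and the output the degree of membership in the mineralized class.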

  12. Evaluation of remote delivery of Passive Integrated Transponder (PIT technology to mark large mammals.

    Directory of Open Access Journals (Sweden)

    W David Walter

    Full Text Available Methods to individually mark and identify free-ranging wildlife without trapping and handling would be useful for a variety of research and management purposes. The use of Passive Integrated Transponder technology could be an efficient method for collecting data for mark-recapture analysis and other strategies for assessing characteristics of populations of various wildlife species. Passive Integrated Transponder tags (PIT) have unique numbered frequencies and have been used to successfully mark and identify mammals. We tested for successful injection of PIT, and subsequent functioning of PIT, into gelatin blocks using 4 variations of a prototype dart. We then selected the prototype dart that resulted in the least depth of penetration in the gelatin block to assess the ability of PIT to be successfully implanted into muscle tissue of white-tailed deer (Odocoileus virginianus) post-mortem and long-term in live, captive Rocky Mountain elk (Cervus elaphus). The prototype dart with a 12.7 mm (0.5 inch) needle length and no powder charge resulted in the shallowest mean (± SD) penetration depth into gelatin blocks of 27.0 mm (± 5.6 mm) with the 2.0 psi setting on the Dan-Inject CO2-pressured rifle. Eighty percent of PIT were successfully injected into the muscle mass of white-tailed deer post-mortem with a mean (± SD) penetration depth of 22.2 mm (± 3.8 mm; n = 6). We injected PIT successfully into 13 live, captive elk by remote delivery at about 20 m; the tags remained functional for 7 months. We successfully demonstrated that PIT could be remotely delivered in darts into the muscle mass of large mammals and remain functional for >6 months. Although further research is warranted to fully develop the technique, remote delivery of PIT technology to large mammals is possible using prototype implant darts.

  13. Large-scale modeling of condition-specific gene regulatory networks by information integration and inference.

    Science.gov (United States)

    Ellwanger, Daniel Christian; Leonhardt, Jörn Florian; Mewes, Hans-Werner

    2014-12-01

    Understanding how regulatory networks globally coordinate the response of a cell to changing conditions, such as perturbations by shifting environments, is an elementary challenge in systems biology which has yet to be met. Genome-wide gene expression measurements are high dimensional as these are reflecting the condition-specific interplay of thousands of cellular components. The integration of prior biological knowledge into the modeling process of systems-wide gene regulation enables the large-scale interpretation of gene expression signals in the context of known regulatory relations. We developed COGERE (http://mips.helmholtz-muenchen.de/cogere), a method for the inference of condition-specific gene regulatory networks in human and mouse. We integrated existing knowledge of regulatory interactions from multiple sources to a comprehensive model of prior information. COGERE infers condition-specific regulation by evaluating the mutual dependency between regulator (transcription factor or miRNA) and target gene expression using prior information. This dependency is scored by the non-parametric, nonlinear correlation coefficient η² (eta squared), which is derived from a two-way analysis of variance. We show that COGERE significantly outperforms alternative methods in predicting condition-specific gene regulatory networks on simulated data sets. Furthermore, by inferring the cancer-specific gene regulatory network from the NCI-60 expression study, we demonstrate the utility of COGERE to promote hypothesis-driven clinical research. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
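The correlation ratio η² that scores regulator-target dependency can be sketched as follows (a simplified one-way variant on synthetic data; COGERE itself derives η² from a two-way analysis of variance):

```python
import numpy as np

def eta_squared(x, y, n_bins=4):
    """Correlation ratio eta^2 between a regulator signal x and a target
    expression signal y: the fraction of the variance of y explained by
    grouping samples into quantile bins of x (one-way ANOVA decomposition)."""
    bins = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(x, bins[1:-1]), 0, n_bins - 1)
    grand = y.mean()
    ss_total = ((y - grand) ** 2).sum()
    ss_between = sum(
        (idx == b).sum() * (y[idx == b].mean() - grand) ** 2
        for b in range(n_bins)
    )
    return ss_between / ss_total

rng = np.random.default_rng(7)
x = rng.normal(size=500)                        # regulator expression
y_dep = 2 * x + rng.normal(scale=0.5, size=500) # strongly regulated target
y_ind = rng.normal(size=500)                    # independent target
print(eta_squared(x, y_dep) > eta_squared(x, y_ind))
```

Because η² captures any monotone or nonlinear dependency of the group means, it scores relations that a linear correlation coefficient would miss.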

  14. Potentials and challenges of integration for complex metal oxides in CMOS devices and beyond

    International Nuclear Information System (INIS)

    Kim, Y; Pham, C; Chang, J P

    2015-01-01

    This review focuses on recent accomplishments on complex metal oxide based multifunctional materials and the potential they hold in advancing integrated circuits. It begins with metal oxide based high-κ materials to highlight the success of their integration since 45 nm complementary metal–oxide–semiconductor (CMOS) devices. By simultaneously offering a higher dielectric constant for improved capacitance as well as providing a thicker physical layer to prevent the quantum mechanical tunnelling of electrons, high-κ materials have enabled the continued down-scaling of CMOS based devices. The most recent technology driver has been the demand to lower device power consumption, which requires the design and synthesis of novel materials, such as complex metal oxides that exhibit remarkable tunability in their ferromagnetic, ferroelectric and multiferroic properties. These properties make them suitable for a wide variety of applications such as magnetoelectric random access memory, radio frequency band pass filters, antennae and magnetic sensors. Single-phase multiferroics, while rare, offer unique functionalities which have motivated much scientific and technological research to ascertain the origins of their multiferroicity and their applicability to potential devices. However, due to the weak magnetoelectric coupling for single-phase multiferroics, engineered multiferroic composites based on magnetostrictive ferromagnets interfacing piezoelectrics or ferroelectrics have shown enhanced multiferroic behaviour from effective strain coupling at the interface. In addition, nanostructuring of the ferroic phases has demonstrated further improvement in the coupling effect. Therefore, single-phase and engineered composite multiferroics consisting of complex metal oxides are reviewed in terms of magnetoelectric coupling effects and voltage controlled ferromagnetic properties, followed by a review on the integration challenges that need to be overcome to realize the

  15. Integrated versus fragmented implementation of complex innovations in acute health care

    Science.gov (United States)

    Woiceshyn, Jaana; Blades, Kenneth; Pendharkar, Sachin R.

    2017-01-01

    Background: Increased demand and escalating costs necessitate innovation in health care. The challenge is to implement complex innovations—those that require coordinated use across the adopting organization to have the intended benefits. Purpose: We wanted to understand why and how two of five similar hospitals associated with the same health care authority made more progress with implementing a complex inpatient discharge innovation whereas the other three experienced more difficulties in doing so. Methodology: We conducted a qualitative comparative case study of the implementation process at five comparable urban hospitals adopting the same inpatient discharge innovation mandated by their health care authority. We analyzed documents and conducted 39 interviews of the health care authority and hospital executives and frontline managers across the five sites over a 1-year period while the implementation was ongoing. Findings: In two and a half years, two of the participating hospitals had made significant progress with implementing the innovation and had begun to realize benefits; they exemplified an integrated implementation mode. Three sites had made minimal progress, following a fragmented implementation mode. In the former mode, a semiautonomous health care organization developed a clear overall purpose and chose one umbrella initiative to implement it. The integrative initiative subsumed the rest and guided resource allocation and the practices of hospital executives, frontline managers, and staff who had bought into it. In contrast, in the fragmented implementation mode, the health care authority had several overlapping, competing innovations that overwhelmed the sites and impeded their implementation. Practice Implications: Implementing a complex innovation across hospital sites required (a) early prioritization of one initiative as integrative, (b) the commitment of additional (traded off or new) human resources, (c) deliberate upfront planning and

  16. Integrated ecotechnology approach towards treatment of complex wastewater with simultaneous bioenergy production.

    Science.gov (United States)

    Hemalatha, Manupati; Sravan, J Shanthi; Yeruva, Dileep Kumar; Venkata Mohan, S

    2017-10-01

    Sequential integration of three diverse biological processes was studied, exploiting the advantage of each individual process for enhanced treatment of complex chemical-based wastewater. A successful attempt was made to integrate a sequencing batch reactor (SBR) with bioelectrochemical treatment (BET) and finally with microalgae treatment. The sequential integration showed individual substrate degradation (COD) of 55% in the SBR, 49% in BET and 56% in microalgae, accounting for a consolidated treatment efficiency of 90%. Nitrate removal efficiency of 25% was observed in the SBR, 31% in BET and 44% in microalgae, with a total efficiency of 72%. The SBR-treated effluents fed to BET with the electrode intervention showed TDS removal. BET exhibited relatively higher process performance than the SBR. The integration approach significantly overcame the limitations of the individual processes, along with value addition as biomass (1.75 g/L), carbohydrates (640 mg/g), lipids (15%) and bioelectricity. The study resulted in a strategy of combining the SBR as a pretreatment step to the BET process and finally polishing with microalgae cultivation, achieving the benefits of enhanced wastewater treatment along with value addition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Complex fluid network optimization and control integrative design based on nonlinear dynamic model

    International Nuclear Information System (INIS)

    Sui, Jinxue; Yang, Li; Hu, Yunan

    2016-01-01

    To meet the distribution needs of complex fluid networks, this paper proposes an optimization method for a nonlinear programming mathematical model based on a genetic algorithm. The simulation results show that the overall energy consumption of the optimized fluid network decreases markedly. The control model of the fluid network is established based on nonlinear dynamics. We design the control law based on feedback linearization and take the optimal values obtained by the genetic algorithm as the simulation data, which also yields the branch resistances under the optimal values. These resistances can provide technical support and a reference for fluid network design and construction, realizing the integrated design of complex fluid network optimization and control.

  18. Dependency of γ-secretase complex activity on the structural integrity of the bilayer

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Hua, E-mail: hzhou2@lbl.gov [Life Sciences Division, Lawrence Berkeley National Laboratory, University of California, Berkeley, CA 94720 (United States); Zhou, Shuxia; Walian, Peter J.; Jap, Bing K. [Life Sciences Division, Lawrence Berkeley National Laboratory, University of California, Berkeley, CA 94720 (United States)

    2010-11-12

    Research highlights: → Partial solubilization of membranes with CHAPSO can increase γ-secretase activity. → Completely solubilized γ-secretase is inactive. → Purified γ-secretase regains activity after reconstitution into lipid bilayers. → A broad range of detergents can be used to successfully reconstitute γ-secretase. -- Abstract: γ-secretase is a membrane protein complex associated with the production of Aβ peptides that are pathogenic in Alzheimer's disease. We have characterized the activity of γ-secretase complexes under a variety of detergent solubilization and reconstitution conditions, and the structural state of proteoliposomes by electron microscopy. We found that γ-secretase activity is highly dependent on the physical state or integrity of the membrane bilayer: partial solubilization may increase activity while complete solubilization will abolish it. The activity of well-solubilized γ-secretase can be restored to near native levels when properly reconstituted into a lipid bilayer environment.

  19. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    Science.gov (United States)

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics represented in the original mechanistic model, and they provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.
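A discrete second-order Volterra expansion, the core of such an IO model, can be sketched as follows (the kernels below are invented for illustration; the actual model estimates kernels up to third order from the mechanistic synapse model):

```python
import numpy as np

def volterra_2nd(x, k1, k2):
    """Second-order discrete Volterra expansion of an input train x:
    y[n] = sum_i k1[i] x[n-i] + sum_{i,j} k2[i,j] x[n-i] x[n-j].
    The second-order kernel k2 captures pairwise interactions between
    past inputs that a purely linear (first-order) model cannot."""
    m = len(k1)
    y = np.zeros(len(x))
    for n in range(len(x)):
        past = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(m)])
        y[n] = k1 @ past + past @ k2 @ past
    return y

# Hypothetical decaying first-order kernel and a small facilitating
# second-order kernel acting on a pair of input spikes.
m = 10
k1 = np.exp(-np.arange(m) / 3.0)
k2 = 0.05 * np.outer(k1, k1)
x = np.zeros(30)
x[[5, 8]] = 1.0  # two presynaptic spikes

y = volterra_2nd(x, k1, k2)
print(y.max() > k1.max())  # the paired-pulse interaction adds to the linear response
```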

  20. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  1. Study of integrated optimization design of wind farm in complex terrain

    DEFF Research Database (Denmark)

    Xu, Chang; Chen, Dandan; Han, Xingxing

    2017-01-01

    wind farm design in complex terrain and setting up an integrated optimization mathematical model for micro-site selection, power lines and road maintenance design etc. Based on the existing 1-year wind measurement data in the wind farm area, the genetic algorithm was used to optimize the micro-site selection. On the basis of the location optimization of wind turbines, optimization algorithms such as the single-source shortest path algorithm and the minimum spanning tree algorithm were used to optimize electric lines and maintenance roads. The practice shows that the research results can provide important...
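
    The minimum-spanning-tree step mentioned above can be sketched as follows. This is an illustrative toy, not the study's model: the turbine coordinates are invented, straight-line distance stands in for cable cost, and Prim's algorithm is one standard way to compute the MST.

```python
import math

# hypothetical turbine coordinates in metres; node 0 is the substation
points = [(0, 0), (800, 200), (1500, 400), (900, 1100), (1900, 900)]

def prim_mst(pts):
    """Prim's algorithm on the complete Euclidean graph.
    Returns (total_length, edge_list) of a minimum spanning tree."""
    n = len(pts)
    in_tree = [False] * n
    best = [math.inf] * n          # cheapest connection cost to the tree
    parent = [-1] * n
    best[0] = 0.0
    total, tree_edges = 0.0, []
    for _ in range(n):
        # pick the cheapest node not yet in the tree
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        if parent[u] >= 0:
            tree_edges.append((parent[u], u))
        # relax the connection costs of the remaining nodes
        for v in range(n):
            d = math.dist(pts[u], pts[v])
            if not in_tree[v] and d < best[v]:
                best[v], parent[v] = d, u
    return total, tree_edges

length, edges = prim_mst(points)
print(f"total cable length: {length:.0f} m")
print("edges:", edges)
```

    A real layout optimizer would add terrain-dependent edge costs and routing constraints, but the spanning-tree core is the same.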

  2. Experience with LHC Magnets from Prototyping to Large Scale Industrial Production and Integration

    CERN Multimedia

    Rossi, L

    2004-01-01

    The construction of the LHC superconducting magnets is approaching its halfway point. At the end of 2003, main dipole cold masses for more than one octant were delivered; meanwhile, the winding for the second octant was almost completed. The other large magnets, like the main quadrupoles and the insertion quadrupoles, have entered into series production as well. Providing more than 20 km of superconducting magnets, with the quality required for an accelerator like the LHC, is an unprecedented challenge in terms of complexity that has required many steps from the construction of 1 meter-long magnets in the laboratory to today's production of more than one 15 meter-long magnet per day in industry. The work and its organization are made even more complex by the fact that CERN supplies most of the critical components and part of the main tooling to the magnet manufacturers, both for cost reduction and for quality issues. In this paper the critical aspects of the construction will be reviewed and the actual ...

  3. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches
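
    As a concrete example of the cohort-rebalancing step mentioned in (i) above, simple random oversampling of the minority class can be sketched as follows. This is an assumed illustration; the study's actual rebalancing procedure may differ.

```python
import numpy as np

def oversample_minority(X, y, rng=None):
    """Randomly resample minority-class rows (with replacement)
    until every class matches the largest class: a simple rebalancing step."""
    rng = rng or np.random.default_rng(0)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for c, n in zip(classes, counts):
        members = np.flatnonzero(y == c)
        extra = rng.choice(members, size=n_max - n, replace=True)
        idx.extend(members)
        idx.extend(extra)
    idx = np.array(idx)
    return X[idx], y[idx]

X = np.arange(20).reshape(10, 2)           # toy feature matrix
y = np.array([0] * 8 + [1] * 2)            # 8:2 imbalanced labels
Xb, yb = oversample_minority(X, y)
print(np.bincount(yb))                     # → [8 8]
```

    Synthetic-sample methods such as SMOTE refine this idea by interpolating new minority points rather than duplicating existing ones.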

  4. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  5. MacroBac: New Technologies for Robust and Efficient Large-Scale Production of Recombinant Multiprotein Complexes.

    Science.gov (United States)

    Gradia, Scott D; Ishida, Justin P; Tsai, Miaw-Sheue; Jeans, Chris; Tainer, John A; Fuss, Jill O

    2017-01-01

    Recombinant expression of large, multiprotein complexes is essential and often rate limiting for determining structural, biophysical, and biochemical properties of DNA repair, replication, transcription, and other key cellular processes. Baculovirus-infected insect cell expression systems are especially well suited for producing large, human proteins recombinantly, and multigene baculovirus systems have facilitated studies of multiprotein complexes. In this chapter, we describe a multigene baculovirus system called MacroBac that uses a BioBricks-type assembly method based on restriction and ligation (Series 11) or ligation-independent cloning (Series 438). MacroBac cloning and assembly is efficient and equally well suited for either single subcloning reactions or high-throughput cloning using 96-well plates and liquid handling robotics. MacroBac vectors are polypromoter, with each gene flanked by a strong polyhedrin promoter and an SV40 poly(A) termination signal, which minimizes the gene-order expression-level effects seen in many polycistronic assemblies. Large assemblies are robustly achievable, and we have successfully assembled as many as 10 genes into a single MacroBac vector. Importantly, we have observed significant increases in expression levels and quality of large, multiprotein complexes using a single, multigene, polypromoter virus rather than coinfection with multiple, single-gene viruses. Given the importance of characterizing functional complexes, we believe that MacroBac provides a critical enabling technology that may change the way that structural, biophysical, and biochemical research is done. © 2017 Elsevier Inc. All rights reserved.

  6. Failure of large transformation projects from the viewpoint of complex adaptive systems: Management principles for dealing with project dynamics

    NARCIS (Netherlands)

    Janssen, M.; Voort, H. van der; Veenstra, A.F.E. van

    2015-01-01

    Many large transformation projects do not result in the outcomes desired or envisioned by the stakeholders. This type of project is characterised by dynamics that are both cause and result of uncertainties and unexpected behaviour. In this paper a complex adaptive system (CAS) view was adopted

  7. Large-scale grid-enabled lattice-Boltzmann simulations of complex fluid flow in porous media and under shear

    NARCIS (Netherlands)

    Harting, J.D.R.; Venturoli, M.; Coveney, P.V.

    2004-01-01

    Well-designed lattice Boltzmann codes exploit the essentially embarrassingly parallel features of the algorithm and so can be run with considerable efficiency on modern supercomputers. Such scalable codes permit us to simulate the behaviour of increasingly large quantities of complex condensed

  8. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem; Kammoun, Abla; Sanguinetti, Luca; Debbah, Merouane; Alouini, Mohamed-Slim

    2016-01-01

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). In particular, we aim at reducing the complexity

  9. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases

    NARCIS (Netherlands)

    Heidema, A.G.; Boer, J.M.A.; Nagelkerke, N.; Mariman, E.C.M.; A, van der D.L.; Feskens, E.J.M.

    2006-01-01

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods

  10. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Science.gov (United States)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years, as well as its advantages, illustrated through a specific application. The presented case study shows how the high production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching a slow to moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series data correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
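
    A lagged time-series correlation of the kind described above can be sketched with synthetic data. Everything here is assumed for illustration (the rainfall statistics, the one-week response model, the lag range); it only shows the mechanics of relating precipitation records to displacement rates, not the mine's actual data.

```python
import numpy as np

rng = np.random.default_rng(1)
days = 365
rain = rng.gamma(shape=0.5, scale=8.0, size=days)        # mm/day, synthetic
# toy model: displacement responds to rain averaged over the previous week
kernel = np.ones(7) / 7
response = np.convolve(rain, kernel)[:days]              # causal 7-day mean
disp = 12 + 0.8 * response + rng.normal(0, 1.0, days)    # mm/day

# lagged cross-correlation: which lag best aligns rain with movement?
lags = range(0, 15)
cc = [np.corrcoef(rain[:days - k], disp[k:])[0, 1] for k in lags]
best = max(lags, key=lambda k: cc[k])
print(f"best lag: {best} days, r = {cc[best]:.2f}")
```

    With real monitoring data, one would also deseasonalize both series and test significance before drawing operational conclusions.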

  11. A Doherty Power Amplifier with Large Back-Off Power Range Using Integrated Enhancing Reactance

    Directory of Open Access Journals (Sweden)

    Wa Kong

    2018-01-01

    Full Text Available A symmetric Doherty power amplifier (DPA) based on integrated enhancing reactance (IER) was proposed for large back-off applications. The IER was generated using the peaking amplifier with the help of a desired impedance transformation in the low-power region to enhance the back-off efficiency of the carrier amplifier. To convert the impedances properly, both in the low-power region and at saturation, a two-impedance matching method was employed to design the output matching networks. For verification, a symmetric DPA with a large back-off power range over 2.2-2.5 GHz was designed and fabricated. Measurement results show that the designed DPA has a 9 dB back-off efficiency higher than 45%, while the saturated output power is higher than 44 dBm over the whole operation bandwidth. When driven by a 20 MHz LTE signal, the DPA can achieve a good average efficiency of around 50% with an adjacent channel leakage ratio of about -50 dBc after linearization over the frequency band of interest. The linearity improvement of the DPA for multistandard wireless communication systems was also verified with a dual-band modulated signal.

  12. Integrated biodosimetry in large scale radiological events. Opportunities for civil military co-operation

    International Nuclear Information System (INIS)

    Port, M.; Eder, S.F.; Lamkowski, A.; Majewski, M.; Abend, M.

    2016-01-01

    Radiological events such as large-scale radiological or nuclear accidents, or terrorist attacks with radionuclide dispersal devices, require rapid and precise medical classification ("triage") and medical management of a large number of patients. Estimates of the absorbed dose, and in particular predictions of the radiation-induced health effects, are mandatory for optimized allocation of limited medical resources and initiation of patient-centred treatment. Within the German Armed Forces Medical Services, the Bundeswehr Institute of Radiobiology offers a wide range of tools for the purpose of medical management to cope with different scenarios. The forward-deployable mobile Medical Task Force has access to state-of-the-art methodologies, summarized into approaches such as physical dosimetry (including mobile gamma spectroscopy), clinical "dosimetry" (prodromi, H-Modul) and different means of biological dosimetry (e.g. dicentrics, high-throughput gene expression techniques, gamma-H2AX). The integration of these different approaches enables trained physicians of the Medical Task Force to assess individual health injuries as well as to make prognostic evaluations, considering modern treatment options. To enhance the capacity of single institutions, networking has been recognized as an important emergency response strategy. The capabilities of physical, biological and clinical "dosimetry" approaches, spanning from low up to high radiation exposures, will be discussed. Furthermore, civil-military opportunities for combined efforts will be demonstrated.

  13. Operation Modeling of Power Systems Integrated with Large-Scale New Energy Power Sources

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-10-01

    Full Text Available In most current methods of probabilistic power system production simulation, the output characteristics of new energy power generation (NEPG) have not been comprehensively considered. In this paper, the power output characteristics of wind power generation and photovoltaic power generation are first analyzed based on statistical methods according to their historical operating data. Then the characteristic indexes and the filtering principle of the NEPG historical output scenarios are introduced with the confidence level, and the calculation model of NEPG's credible capacity is proposed. Based on this, taking the minimum production costs or the best energy-saving and emission-reduction effect as the optimization objective, the power system operation model with large-scale integration of new energy power generation (NEPG) is established considering the power balance, the electricity balance and the peak balance. In addition, the constraints of the operating characteristics of different power generation types, the maintenance schedule, the load reservation, the emergency reservation, the water abandonment and the transmitting capacity between different areas are also considered. With the proposed power system operation model, operation simulations are carried out based on the actual Northwest power grid of China, which resolves new energy power accommodation under different system operating conditions. The simulation results verify the validity of the proposed power system operation model in the accommodation analysis for a power system penetrated with large-scale NEPG.

  14. 1 million-Q optomechanical microdisk resonators for sensing with very large scale integration

    Science.gov (United States)

    Hermouet, M.; Sansa, M.; Banniard, L.; Fafin, A.; Gely, M.; Allain, P. E.; Santos, E. Gil; Favero, I.; Alava, T.; Jourdan, G.; Hentz, S.

    2018-02-01

    Cavity optomechanics has become a promising route towards the development of ultrasensitive sensors for a wide range of applications including mass, chemical and biological sensing. In this study, we demonstrate the potential of Very Large Scale Integration (VLSI) with state-of-the-art low-loss silicon optomechanical microdisks for sensing applications. We report microdisks exhibiting optical Whispering Gallery Modes (WGM) with quality factors of 1 million, yielding high displacement sensitivity and strong coupling between optical WGMs and in-plane mechanical Radial Breathing Modes (RBM). Such high-Q microdisks, with mechanical resonance frequencies in the 10² MHz range, were fabricated on 200 mm wafers with Variable Shape Electron Beam lithography. Benefiting from ultrasensitive readout, their Brownian motion could be resolved with a good signal-to-noise ratio at ambient pressure, as well as in liquid, despite high-frequency operation and large fluidic damping: the mechanical quality factor dropped from a few 10³ in air to a few tens in liquid, and the mechanical resonance frequency shifted down by a few percent. Proceeding one step further, we performed all-optical operation of the resonators in air using a pump-probe scheme. Our results show our VLSI process is a viable approach for the next generation of sensors operating in vacuum, gas or liquid phase.

  15. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, 
Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. There are now also a limited number of studies that have investigated molecular genetic candidate gene-environment interactions (G × E), however, so far, thorough replication of findings is rare and G × E research still faces several conceptual and methodological challenges. In this article, we aim to review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  16. Multilevel compression of random walks on networks reveals hierarchical organization in large integrated systems.

    Directory of Open Access Journals (Sweden)

    Martin Rosvall

    Full Text Available To comprehend the hierarchical organization of large integrated systems, we introduce the hierarchical map equation, which reveals multilevel structures in networks. In this information-theoretic approach, we exploit the duality between compression and pattern detection; by compressing a description of a random walker as a proxy for real flow on a network, we find regularities in the network that induce this system-wide flow. Finding the shortest multilevel description of the random walker therefore gives us the best hierarchical clustering of the network (the optimal number of levels and the modular partition at each level) with respect to the dynamics on the network. With a novel search algorithm, we extract and illustrate the rich multilevel organization of several large social and biological networks. For example, from the global air traffic network we uncover countries and continents, and from the pattern of scientific communication we reveal more than 100 scientific fields organized in four major disciplines: life sciences, physical sciences, ecology and earth sciences, and social sciences. In general, we find shallow hierarchical structures in globally interconnected systems, such as neural networks, and rich multilevel organizations in systems with highly separated regions, such as road networks.
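
    The compression principle behind the map equation can be made concrete with a small sketch: a simplified two-level map equation for an undirected, unweighted graph, using degree-proportional visit rates. The barbell graph and all names are assumptions for illustration, not the authors' code (which also handles the multilevel and directed cases).

```python
import math
from collections import defaultdict

def map_equation(edges, partition):
    """Two-level map equation L(M) for an undirected, unweighted graph.
    `partition` maps node -> module id. A smaller L means a random walk
    compresses better, i.e. a better modular description."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    two_e = 2 * len(edges)
    p = {n: d / two_e for n, d in deg.items()}        # node visit rates
    q = defaultdict(float)                            # module exit rates
    for u, v in edges:
        if partition[u] != partition[v]:
            q[partition[u]] += 1 / two_e
            q[partition[v]] += 1 / two_e
    modules = set(partition.values())
    plogp = lambda x: x * math.log2(x) if x > 0 else 0.0
    q_tot = sum(q.values())
    # index codebook: choosing which module to enter
    L = -sum(plogp(q[m] / q_tot) for m in modules) * q_tot if q_tot > 0 else 0.0
    # one codebook per module: exit event plus node visits
    for m in modules:
        nodes = [n for n in partition if partition[n] == m]
        p_m = q[m] + sum(p[n] for n in nodes)
        H = -(plogp(q[m] / p_m) + sum(plogp(p[n] / p_m) for n in nodes))
        L += p_m * H
    return L

# barbell graph: two triangles joined by a single bridge edge
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
one = {n: 0 for n in range(6)}                  # everything in one module
two = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}      # one module per triangle
print(round(map_equation(edges, one), 3), round(map_equation(edges, two), 3))
```

    Splitting the barbell into its two triangles yields a shorter description length than the single-module partition, which is exactly how the method identifies communities.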

  17. Rotation analysis on large complex superconducting cables based on numerical modeling and experiments

    NARCIS (Netherlands)

    Qin, Jinggang; Yue, Donghua; Zhang, Xingyi; Wu, Yu; Liu, Xiaochuan; Liu, Huajun; Jin, Huan; Dai, Chao; Nijhuis, Arend; Zhou, Chao; Devred, Arnaud

    2018-01-01

    The conductors used in large fusion reactors, e.g. ITER, CFETR and DEMO, are made of cable-in-conduit conductor (CICC) with large diameters up to about 50 mm. The superconducting and copper strands are cabled around a central spiral and then wrapped with stainless-steel tape of 0.1 mm thickness. The

  18. Concurrent use of data base and graphics computer workstations to provide graphic access to large, complex data bases for robotics control of nuclear surveillance and maintenance

    International Nuclear Information System (INIS)

    Dalton, G.R.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida is part of a multiuniversity research effort, sponsored by the US Department of Energy, that is under way to develop and deploy an advanced semi-autonomous robotic system for use in nuclear power stations. This paper reports on the development of the computer tools necessary to gain convenient graphic access to the intelligence implicit in a large, complex data base such as that of a nuclear reactor plant. This program is integrated as a man/machine interface within the larger context of the total computerized robotic planning and control system. The portion of the project described here addresses the connection between the three-dimensional displays on an interactive graphic workstation and a data-base computer running a large data-base server program. Programming the two computers to work together to accept graphic queries and return answers on the graphic workstation is a key part of the interactive capability developed.

  19. Speciation on the rocks: integrated systematics of the Heteronotia spelea species complex (Gekkota; Reptilia) from Western and Central Australia.

    Directory of Open Access Journals (Sweden)

    Mitzy Pepper

    Full Text Available The isolated uplands of the Australian arid zone are known to provide mesic refuges in an otherwise xeric landscape, and divergent lineages of largely arid zone taxa have persisted in these regions following the onset of Miocene aridification. Geckos of the genus Heteronotia are one such group, and have been the subject of many genetic studies, including H. spelea, a strongly banded form that occurs in the uplands of the Pilbara and Central Ranges regions of the Australian arid zone. Here we assess the systematics of these geckos based on detailed examination of morphological and genetic variation. The H. spelea species complex is a monophyletic lineage to the exclusion of the H. binoei and H. planiceps species complexes. Within the H. spelea complex, our previous studies based on mtDNA and nine nDNA loci found populations from the Central Ranges to be genetically divergent from Pilbara populations. Here we supplement our published molecular data with additional data gathered from central Australian samples. In the spirit of integrative species delimitation, we combine multi-locus, coalescent-based lineage delimitation with extensive morphological analyses to test species boundaries, and we describe the central populations as a new species, H. fasciolatus sp. nov. In addition, within the Pilbara there is strong genetic evidence for three lineages corresponding to northeastern (type), southern, and a large-bodied melanic population isolated in the northwest. Due to its genetic distinctiveness and extreme morphological divergence from all other Heteronotia, we describe the melanic form as a new species, H. atra sp. nov. The northeastern and southern Pilbara populations are morphologically indistinguishable with the exception of a morpho-type in the southeast that has a banding pattern resembling H. planiceps from the northern monsoonal tropics. Pending more extensive analyses, we therefore treat Pilbara H. spelea as a single species with

  20. Mapping multivalency and differential affinities within large intrinsically disordered protein complexes with segmental motion analysis.

    Science.gov (United States)

    Milles, Sigrid; Lemke, Edward A

    2014-07-07

    Intrinsically disordered proteins (IDPs) can bind to multiple interaction partners. Numerous binding regions in the IDP that act in concert through complex cooperative effects facilitate such interactions, but complicate studying IDP complexes. To address this challenge we developed a combined fluorescence correlation and time-resolved polarization spectroscopy approach to study the binding properties of the IDP nucleoporin153 (Nup153) to nuclear transport receptors (NTRs). The detection of segmental backbone mobility of Nup153 within the unperturbed complex provided a readout of local, region-specific binding properties that are usually masked in measurements of the whole IDP. The binding affinities of functionally and structurally diverse NTRs to distinct regions of Nup153 can differ by orders of magnitude, a result with implications for the diversity of transport routes in nucleocytoplasmic transport. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Complex active regions as the main source of extreme and large solar proton events

    Science.gov (United States)

    Ishkov, V. N.

    2013-12-01

    A study of solar proton sources indicated that solar flare events responsible for ≥2000 pfu proton fluxes mostly occur in complex active regions (CARs), i.e., in transition structures between active regions and activity complexes. Different classes of similar structures and their relation to solar proton events (SPEs) and evolution, depending on the origination conditions, are considered. Arguments in favor of the fact that sunspot groups with extreme dimensions are CARs are presented. An analysis of the flare activity in a CAR resulted in the detection of "physical" boundaries, which separate magnetic structures of the same polarity and are responsible for the independent development of each structure.

  2. Integrated digital control and man-machine interface for complex remote handling systems

    International Nuclear Information System (INIS)

    Rowe, J.C.; Spille, R.F.; Zimmermann, S.D.

    1987-01-01

    The Advanced Integrated Maintenance System (AIMS) is part of a continuing effort within the Consolidated Fuel Reprocessing Program at Oak Ridge National Laboratory to develop and extend the capabilities of remote manipulation and maintenance technology. The AIMS is a totally integrated approach to remote handling in hazardous environments. State-of-the-art computer systems connected through a high-speed communication network provide a real-time distributed control system that supports the flexibility and expandability needed for large integrated maintenance applications. A Man-Machine Interface provides high-level human interaction through a powerful color graphics menu-controlled operator console. An auxiliary control system handles the real-time processing needs for a variety of support hardware. A pair of dedicated fiber-optic-linked master/slave computer systems control the Advanced Servomanipulator master/slave arms using powerful distributed digital processing methods. The FORTH language was used as a real-time operating and development environment for the entire system, and all of these components are integrated into a control room concept that represents the latest advancements in the development of remote maintenance facilities for hazardous environments.

  3. Integrated digital control and man-machine interface for complex remote handling systems

    International Nuclear Information System (INIS)

    Rowe, J.C.; Spille, R.F.; Zimmermann, S.D.

    1986-12-01

    The Advanced Integrated Maintenance System (AIMS) is part of a continuing effort within the Consolidated Fuel Reprocessing Program at Oak Ridge National Laboratory to develop and extend the capabilities of remote manipulation and maintenance technology. The AIMS is a totally integrated approach to remote handling in hazardous environments. State-of-the-art computer systems connected through a high-speed communication network provide a real-time distributed control system that supports the flexibility and expandability needed for large integrated maintenance applications. A Man-Machine Interface provides high-level human interaction through a powerful color graphics menu-controlled operator console. An auxiliary control system handles the real-time processing needs for a variety of support hardware. A pair of dedicated fiber-optic-linked master/slave computer systems control the Advanced Servomanipulator master/slave arms using powerful distributed digital processing methods. The FORTH language was used as a real-time operating and development environment for the entire system, and all of these components are integrated into a control room concept that represents the latest advancements in the development of remote maintenance facilities for hazardous environments.

  4. Large space antenna communications systems: Integrated Langley Research Center/Jet Propulsion Laboratory development activities. 2: Langley Research Center activities

    Science.gov (United States)

    Cambell, T. G.; Bailey, M. C.; Cockrell, C. R.; Beck, F. B.

    1983-01-01

    The electromagnetic analysis activities at the Langley Research Center are resulting in efficient and accurate analytical methods for predicting both far- and near-field radiation characteristics of large offset multiple-beam multiple-aperture mesh reflector antennas. The utilization of aperture integration augmented with Geometrical Theory of Diffraction in analyzing the large reflector antenna system is emphasized.

  5. Multi-attribute integrated measurement of node importance in complex networks.

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measure of node importance in complex networks is central to research on network stability and robustness, and it also helps ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results reflect only certain aspects of the network and lose information. Moreover, because network topologies differ, node importance should be described in a way that incorporates the character of the network topology. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, clustering coefficient, and topology potential, and proposes an integrated method to measure node importance. This method reflects both internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate network and the dolphin network show that the integrated topology-based measure yields a smaller range of measurement results than a single indicator and is more broadly applicable. Experiments also show that attacking the North American power grid and the Internet network using this measure converges faster than other methods.
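    The multi-attribute idea can be sketched in a few lines of code. This is a minimal illustration on a toy graph, assuming equal weights over three of the named indicators (degree centrality, closeness centrality, clustering coefficient); the paper's exact combination, including its topology-potential term and relative closeness variant, is not specified in the abstract.

```python
# Multi-attribute node importance on a small undirected toy graph,
# combining three indicators with equal (assumed) weights.
from collections import deque

graph = {  # adjacency lists of an illustrative undirected network
    'a': ['b', 'c'], 'b': ['a', 'c', 'd'],
    'c': ['a', 'b'], 'd': ['b', 'e'], 'e': ['d'],
}
n = len(graph)

def degree_centrality(v):
    return len(graph[v]) / (n - 1)

def closeness_centrality(v):
    # BFS shortest-path distances from v over the connected graph
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in graph[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return (n - 1) / sum(dist.values())

def clustering_coefficient(v):
    nbrs = graph[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i, x in enumerate(nbrs)
                for y in nbrs[i + 1:] if y in graph[x])
    return 2 * links / (k * (k - 1))

def importance(v, w=(1 / 3, 1 / 3, 1 / 3)):
    return (w[0] * degree_centrality(v)
            + w[1] * closeness_centrality(v)
            + w[2] * clustering_coefficient(v))

scores = {v: importance(v) for v in graph}
```

    With equal weights each score lies in [0, 1]; a hub such as node 'b' scores well above a peripheral leaf such as 'e'.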

  6. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    Full Text Available A literature review and gap analysis identifies key limitations of industry best practice when modelling supply chains. To address these limitations the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD, and SM techniques; the integration of which is achieved via common modelling concepts and decomposition principles. Thereby the methodology facilitates: (i) graphical representation and description of key “processing”, “resourcing” and “work flow” properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery, and return processes; (iii) predictive quantification of the relative performances of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described. Also described are recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case tested to quantify potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry based case study modelling and identifies areas of potential improvement.

  7. Retention of habitat complexity minimizes disassembly of reef fish communities following disturbance: a large-scale natural experiment.

    Directory of Open Access Journals (Sweden)

    Michael J Emslie

    Full Text Available High biodiversity ecosystems are commonly associated with complex habitats. Coral reefs are highly diverse ecosystems, but are under increasing pressure from numerous stressors, many of which reduce live coral cover and habitat complexity with concomitant effects on other organisms such as reef fishes. While previous studies have highlighted the importance of habitat complexity in structuring reef fish communities, they employed gradient or meta-analyses which lacked a controlled experimental design over broad spatial scales to explicitly separate the influence of live coral cover from overall habitat complexity. Here a natural experiment using a long-term (20 year), spatially extensive (∼115,000 km²) dataset from the Great Barrier Reef revealed the fundamental importance of overall habitat complexity for reef fishes. Reductions of both live coral cover and habitat complexity had substantial impacts on fish communities compared to relatively minor impacts after major reductions in coral cover but not habitat complexity. Where habitat complexity was substantially reduced, species abundances broadly declined and a far greater number of fish species were locally extirpated, including economically important fishes. This resulted in decreased species richness and a loss of diversity within functional groups. Our results suggest that the retention of habitat complexity following disturbances can ameliorate the impacts of coral declines on reef fishes, so preserving their capacity to perform important functional roles essential to reef resilience. These results add to a growing body of evidence about the importance of habitat complexity for reef fishes, and represent the first large-scale examination of this question on the Great Barrier Reef.

  8. Integrative shell of the program complex MARS (Version 1.0) radiation transfer in three-dimensional geometries

    International Nuclear Information System (INIS)

    Degtyarev, I.I.; Lokhovitskij, A.E.; Maslov, M.A.; Yazynin, I.A.

    1994-01-01

    The first version of the integrative shell of the program complex MARS has been written for calculating radiation transfer in three-dimensional geometries. The integrative shell allows the user to work conveniently with the MARS complex, create input data files, and obtain graphic visualization of calculated functions. Version 1.0 is adapted for IBM 286/386/486-type personal computers with at least 500K of memory. 5 refs

  9. 76 FR 21256 - Proposed Assessment Rate Adjustment Guidelines for Large and Highly Complex Institutions

    Science.gov (United States)

    2011-04-15

    ... and comment are not required and need not be employed to make future changes to the guidelines. [[Page..., including the materiality of guarantees and franchise value. Commenters on the proposed large bank pricing...

  10. Monolithic Ge-on-Si lasers for large-scale electronic-photonic integration

    Science.gov (United States)

    Liu, Jifeng; Kimerling, Lionel C.; Michel, Jurgen

    2012-09-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic-photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review on the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate the energy difference between the direct and indirect band gap of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminated by recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500-1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  11. Monolithic Ge-on-Si lasers for large-scale electronic–photonic integration

    International Nuclear Information System (INIS)

    Liu, Jifeng; Kimerling, Lionel C; Michel, Jurgen

    2012-01-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic–photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review on the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate the energy difference between the direct and indirect band gap of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminated by recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500–1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  12. Integrated System Design for a Large Wind Turbine Supported on a Moored Semi-Submersible Platform

    Directory of Open Access Journals (Sweden)

    Jinsong Liu

    2018-01-01

    Full Text Available Over the past few decades, wind energy has emerged as an alternative to conventional power generation that is economical, environmentally friendly and, importantly, renewable. Specifically, offshore wind energy is being considered by a number of countries to harness the stronger and more consistent wind resource compared to that over land. To meet the projected “20% energy from wind by 2030” scenario that was announced in 2006, 54 GW of added wind energy capacity need to come from offshore according to a National Renewable Energy Laboratory (NREL study. In this study, we discuss the development of a semi-submersible floating offshore platform with a catenary mooring system to support a very large 13.2-MW wind turbine with 100-m blades. An iterative design process is applied to baseline models with Froude scaling in order to achieve preliminary static stability. Structural dynamic analyses are performed to investigate the performance of the new model using a finite element method approach for the tower and a boundary integral equation (panel method for the platform. The steady-state response of the system under uniform wind and regular waves is first studied to evaluate the performance of the integrated system. Response amplitude operators (RAOs are computed in the time domain using white-noise wave excitation; this serves to highlight nonlinear, as well as dynamic characteristics of the system. Finally, selected design load cases (DLCs and the stochastic dynamic response of the system are studied to assess the global performance for sea states defined by wind fields with turbulence and long-crested irregular waves.

  13. Developing integrated parametric planning models for budgeting and managing complex projects

    Science.gov (United States)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

    The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
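    The trapezoidal-segmentation idea above can be sketched directly: if the cost loading rate is piecewise linear between breakpoints, the integral over each segment is an exact trapezoid area, and the total project cost is the sum over segments. The breakpoint values below are purely illustrative, not taken from the study.

```python
# Trapezoidal segmentation of a piecewise-linear cost loading function:
# each segment's cost is the exact trapezoid area under the loading rate.
def segment_cost(t0, t1, r0, r1):
    """Integral of a linear loading rate from t0 to t1 (trapezoid area)."""
    return 0.5 * (r0 + r1) * (t1 - t0)

# (time in weeks, loading rate in $k/week): ramp up, plateau, ramp down
breakpoints = [(0, 0.0), (4, 10.0), (10, 10.0), (12, 0.0)]

total_cost = sum(
    segment_cost(t0, t1, r0, r1)
    for (t0, r0), (t1, r1) in zip(breakpoints, breakpoints[1:])
)
# ramp up: 0.5*(0+10)*4 = 20; plateau: 10*6 = 60; ramp down: 0.5*10*2 = 10
```

    Because each segment's cost is a closed-form expression in its breakpoints, the whole schedule can be recomputed instantly as target dates or budget constraints change, which is what makes the spreadsheet implementation interactive.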

  14. Integrated complex care model: lessons learned from inter-organizational partnership.

    Science.gov (United States)

    Cohen, Eyal; Bruce-Barrett, Cindy; Kingsnorth, Shauna; Keilty, Krista; Cooper, Anna; Daub, Stacey

    2011-01-01

    Providing integrated care for children with medical complexity in Canada is challenging as these children are, by definition, in need of coordinated care from disparate providers, organizations and funders across the continuum in order to optimize health outcomes. We describe the development of an inter-organizational team constructed as a unique tripartite partnership of an acute care hospital, a children's rehabilitation hospital and a home/community health organization focused on children who frequently use services across these three organizations. Model building and operationalization within the Canadian healthcare system is emphasized. Key challenges identified to date include communication and policy barriers as well as optimizing interactions with families; critical enablers have been alignment with policy trends in healthcare and inter-organizational commitment to integrate at the point of care. Considerations for policy developments supporting full integration across service sectors are raised. Early indicators of success include the enrolment of 34 clients and patients and the securing of funds to evaluate and expand the model to serve more children.

  15. Importance of Mediator complex in the regulation and integration of diverse signaling pathways in plants

    Directory of Open Access Journals (Sweden)

    Subhasis eSamanta

    2015-09-01

    Full Text Available Basic transcriptional machinery in eukaryotes is assisted by a number of cofactors, which either increase or decrease the rate of transcription. Mediator complex is one such cofactor, and recently has drawn a lot of interest because of its integrative power to converge different signaling pathways before channelling the transcription instructions to the RNA polymerase II machinery. Like yeast and metazoans, plants do possess the Mediator complex across the kingdom, and its isolation and subunit analyses have been reported from the model plant, Arabidopsis. Genetic and molecular analyses have unravelled important regulatory roles of Mediator subunits at every stage of plant life cycle starting from flowering to embryo and organ development, to even size determination. It also contributes immensely to the survival of plants against different environmental vagaries by the timely activation of its resistance mechanisms. Here, we have provided an overview of plant Mediator complex starting from its discovery to regulation of stoichiometry of its subunits. We have also reviewed involvement of different Mediator subunits in different processes and pathways including defense response pathways evoked by diverse biotic cues. Wherever possible, attempts have been made to provide mechanistic insight of Mediator’s involvement in these processes.

  16. Assessment of integrated electrical resistivity data on complex aquifer structures in NE Nuba Mountains - Sudan

    Science.gov (United States)

    Mohamed, N. E.; Yaramanci, U.; Kheiralla, K. M.; Abdelgalil, M. Y.

    2011-07-01

    Two geophysical techniques were integrated to map the groundwater aquifers in complex geological settings in the crystalline basement terrain of the northeast Nuba Mountains. The water flow is structurally controlled by the northwest-southeast extensional faults, one of several in-situ deformational patterns attributed to the collision of the Pan-African oceanic assemblage of the Nubian shield against the pre-Pan-African continental crust to the west. The structural lineaments and drainage systems have been enhanced by remote sensing techniques. The geophysical techniques used are vertical electrical soundings (VES) and electrical resistivity tomography (ERT), in addition to hydraulic conductivity measurements. These measurements were designed to overlap in order to improve the reproducibility of the geophysical data and to provide a better interpretation of the hydrogeological setting of the complex aquifer structure. Smooth and block inversion schemes were attempted for the observed ERT data to study their reliability in mapping the different geometries in the complex subsurface. The VES data were collected where the ERT survey was not accessible, inverted smoothly, and merged with the ERT data in the 3D resistivity grid. The hydraulic conductivity was measured for 42 water samples collected from the dug wells distributed across the study area, where extremely high saline zones were recorded and compared to the resistivity values in the 3D model.

  17. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes

  18. Importance of Mediator complex in the regulation and integration of diverse signaling pathways in plants.

    Science.gov (United States)

    Samanta, Subhasis; Thakur, Jitendra K

    2015-01-01

    Basic transcriptional machinery in eukaryotes is assisted by a number of cofactors, which either increase or decrease the rate of transcription. Mediator complex is one such cofactor, and recently has drawn a lot of interest because of its integrative power to converge different signaling pathways before channeling the transcription instructions to the RNA polymerase II machinery. Like yeast and metazoans, plants do possess the Mediator complex across the kingdom, and its isolation and subunit analyses have been reported from the model plant, Arabidopsis. Genetic and molecular analyses have unraveled important regulatory roles of Mediator subunits at every stage of plant life cycle starting from flowering to embryo and organ development, to even size determination. It also contributes immensely to the survival of plants against different environmental vagaries by the timely activation of its resistance mechanisms. Here, we have provided an overview of plant Mediator complex starting from its discovery to regulation of stoichiometry of its subunits. We have also reviewed involvement of different Mediator subunits in different processes and pathways including defense response pathways evoked by diverse biotic cues. Wherever possible, attempts have been made to provide mechanistic insight of Mediator's involvement in these processes.

  19. Proteomic analysis of the dysferlin protein complex unveils its importance for sarcolemmal maintenance and integrity.

    Directory of Open Access Journals (Sweden)

    Antoine de Morrée

    Full Text Available Dysferlin is critical for repair of muscle membranes after damage. Mutations in dysferlin lead to a progressive muscular dystrophy. Recent studies suggest additional roles for dysferlin. We set out to study dysferlin's protein-protein interactions to obtain comprehensive knowledge of dysferlin functionalities in a myogenic context. We developed a robust and reproducible method to isolate dysferlin protein complexes from cells and tissue. We analyzed the composition of these complexes in cultured myoblasts, myotubes and skeletal muscle tissue by mass spectrometry and subsequently inferred potential protein functions through bioinformatics analyses. Our data confirm previously reported interactions and support a function for dysferlin as a vesicle trafficking protein. In addition, novel potential functionalities were uncovered, including phagocytosis and focal adhesion. Our data reveal that the dysferlin protein complex has a dynamic composition as a function of myogenic differentiation. We provide additional experimental evidence and show dysferlin localization to, and interaction with, the focal adhesion protein vinculin at the sarcolemma. Finally, our studies reveal evidence for cross-talk between dysferlin and its protein family member myoferlin. Together our analyses show that dysferlin is not only a membrane repair protein but also important for muscle membrane maintenance and integrity.

  20. Integrating Infrastructure and Institutions for Water Security in Large Urban Areas

    Science.gov (United States)

    Padowski, J.; Jawitz, J. W.; Carrera, L.

    2015-12-01

    Urban growth has forced cities to procure more freshwater to meet demands; however, the relationship between urban water security, water availability and water management is not well understood. This work quantifies the urban water security of 108 large cities in the United States (n=50) and Africa (n=58) based on their hydrologic, hydraulic and institutional settings. Using publicly available data, urban water availability was estimated as the volume of water available from local water resources and those captured via hydraulic infrastructure (e.g. reservoirs, wellfields, aqueducts), while urban water institutions were assessed according to their ability to deliver, supply and regulate water resources to cities. When assessing availability, cities relying on local water resources comprised a minority (37%) of those assessed. The majority of cities (55%) instead rely on captured water to meet urban demands, with African cities reaching farther and accessing a greater number and variety of sources for water supply than US cities. Cities using captured water generally had poorer access to local water resources and maintained significantly more complex strategies for water delivery, supply and regulatory management. Eight cities, all African, are identified in this work as having water insecurity issues. These cities lack sufficient infrastructure and institutional complexity to capture and deliver adequate amounts of water for urban use. Together, these findings highlight the important interconnection between infrastructure investments and management techniques for urban areas with a limited or dwindling natural abundance of water. Addressing water security challenges in the future will require that more attention be placed not only on increasing water availability, but on developing the institutional support to manage captured water supplies.

  1. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    Science.gov (United States)

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
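    The nearest-neighbor feature-vector idea can be illustrated with a toy 2D sketch: each particle is described by the relative positions of its k nearest neighbors, a signature that is preserved under rigid translation, so particles can be matched across frames even when displacements exceed the interparticle spacing. This is only a minimal illustration with invented coordinates; the actual algorithm works in 3D and adds the rigorous outlier removal and iterative deformation warping described above.

```python
# Match particles between two frames by comparing nearest-neighbor
# feature vectors (relative offsets to the k closest particles).
import math

def features(points, k=2):
    feats = []
    for i, p in enumerate(points):
        nearest = sorted(
            (q for j, q in enumerate(points) if j != i),
            key=lambda q: math.dist(p, q),
        )[:k]
        # translation-invariant signature: offsets to k nearest neighbors
        feats.append([(q[0] - p[0], q[1] - p[1]) for q in nearest])
    return feats

def match(frame0, frame1, k=2):
    """For each particle in frame0, index of best-matching particle in frame1."""
    f0, f1 = features(frame0, k), features(frame1, k)
    def feat_dist(a, b):
        return sum(math.dist(u, v) for u, v in zip(a, b))
    return [min(range(len(f1)), key=lambda j: feat_dist(f0[i], f1[j]))
            for i in range(len(f0))]

frame0 = [(0, 0), (1, 0), (0, 1), (5, 5)]
# frame0 translated by (0.5, 0.5) and listed in a different order
frame1 = [(5.5, 5.5), (0.5, 0.5), (1.5, 0.5), (0.5, 1.5)]
pairs = match(frame0, frame1)
```

    Here the translation (0.5, 0.5) is comparable to the interparticle spacing, so matching raw positions would fail, while the neighbor-offset signatures still pair each particle with its translated copy.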

  2. Evaluation of the depth-integration method of measuring water discharge in large rivers

    Science.gov (United States)

    Moody, J.A.; Troutman, B.M.

    1992-01-01

    The depth-integration method for measuring water discharge makes a continuous measurement of the water velocity from the water surface to the bottom at 20 to 40 locations or verticals across a river. It is especially practical for large rivers where river traffic makes it impractical to use boats attached to taglines strung across the river or to use current meters suspended from bridges. This method has the additional advantage over the standard two- and eight-tenths method in that a discharge-weighted suspended-sediment sample can be collected at the same time. When this method is used in large rivers such as the Missouri, Mississippi and Ohio, a microwave navigation system is used to determine the ship's position at each vertical sampling location across the river, and to make accurate velocity corrections to compensate for ship drift. An essential feature is a hydraulic winch that can lower and raise the current meter at a constant transit velocity so that the velocities at all depths are measured for equal lengths of time. Field calibration measurements show that: (1) the mean velocity measured on the upcast (bottom to surface) is within 1% of the standard mean velocity determined by 9-11 point measurements; (2) if the transit velocity is less than 25% of the mean velocity, then the average error in the mean velocity is 4% or less. The major source of bias error is a result of mounting the current meter above a sounding weight and sometimes above a suspended-sediment sampling bottle, which prevents measurement of the velocity all the way to the bottom. The measured mean velocity is slightly larger than the true mean velocity. This bias error in the discharge is largest in shallow water (approximately 8% for the Missouri River at Hermann, MO, where the mean depth was 4.3 m) and smallest in deeper water (approximately 3% for the Mississippi River at Vicksburg, MS, where the mean depth was 14.5 m). The major source of random error in the discharge is the natural
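    The reduction from per-vertical measurements to a total discharge can be sketched with the standard midsection method: each vertical's depth-integrated mean velocity times its depth times the width of the subsection it represents, summed across the river. The station spacings, depths and velocities below are invented for illustration only.

```python
# Midsection-method discharge from per-vertical measurements.
# Each vertical: (station across river in m, depth in m, mean velocity in m/s)
verticals = [(10, 2.0, 0.8), (20, 4.0, 1.2), (30, 3.5, 1.1), (40, 1.5, 0.6)]

def discharge(verticals):
    """Total discharge in m^3/s: sum of v * depth * subsection width."""
    q = 0.0
    for i, (x, d, v) in enumerate(verticals):
        left = verticals[i - 1][0] if i > 0 else x
        right = verticals[i + 1][0] if i < len(verticals) - 1 else x
        width = (right - left) / 2  # half the distance to each neighbor
        q += v * d * width
    return q

q_total = discharge(verticals)  # 8 + 48 + 38.5 + 4.5 = 99 m^3/s
```

    The depth-integration method supplies the mean velocity at each vertical in a single continuous transit of the current meter, rather than from point measurements at two-tenths and eight-tenths depth.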

  3. An investigation of multidisciplinary complex health care interventions - steps towards an integrative treatment model in the rehabilitation of People with Multiple Sclerosis

    Directory of Open Access Journals (Sweden)

    Skovgaard Lasse

    2012-04-01

    Full Text Available Abstract Background The Danish Multiple Sclerosis Society initiated a large-scale bridge-building and integrative treatment project to take place from 2004–2010 at a specialized Multiple Sclerosis (MS) hospital. In this project, a team of five conventional health care practitioners and five alternative practitioners was set up to work together in developing and offering individualized treatments to 200 people with MS. The purpose of this paper is to present results from the six-year treatment collaboration process regarding the development of an integrative treatment model. Discussion The collaborative work towards an integrative treatment model for people with MS involved six steps: (1) working with an initial model; (2) unfolding the different treatment philosophies; (3) discussing the elements of the Intervention-Mechanism-Context-Outcome scheme (the IMCO scheme); (4) phrasing the common assumptions for an integrative MS program theory; (5) developing the integrative MS program theory; (6) building the integrative MS treatment model. The model includes important elements of the different treatment philosophies represented in the team and thereby describes a common understanding of the complexity of the courses of treatment. Summary An integrative team of practitioners has developed an integrative model for combined treatments of people with Multiple Sclerosis. The model unites different treatment philosophies and focuses on process-oriented factors and the strengthening of the patients’ resources and competences on a physical, an emotional and a cognitive level.

  4. The challenges of integrating multiple safeguards systems in a large nuclear facility

    International Nuclear Information System (INIS)

    Lavietes, A.; Liguori, C.; Pickrell, M.; Plenteda, R.; Sweet, M.

    2009-01-01

    Full-text: Implementing safeguards in a cost-effective manner in large nuclear facilities such as fuel conditioning, fuel reprocessing, and fuel fabrication plants requires the extensive use of instrumentation that is operated in unattended mode. The collected data is then periodically reviewed by the inspectors, either on-site at a central location in the facility or remotely in the IAEA offices. A wide variety of instruments are deployed in large facilities, including video surveillance cameras, electronic sealing devices, non-destructive assay systems based on gamma ray and neutron detection, load cells for mass measurement, ID-readers, and other process-specific monitors. Integrating these different measurement instruments into an efficient, reliable, and secure system requires implementing standardization at various levels throughout the design process. This standardization covers the data generator behaviour and interface, networking solutions, and data security approaches, and will provide a wide range of savings, including reduced training for inspectors and technicians, reduced periodic technical maintenance, reduced spare parts inventory, increased system robustness, and more predictable system behaviour. The development of standard building blocks will reduce the number of data generators required and allow implementation of simplified architectures that do not require local collection computers but rather transmit the acquired data directly to a central server via Ethernet connectivity. This approach will result in fewer system components and therefore reduced maintenance effort and improved reliability. This paper discusses in detail the challenges and the subsequent solutions in the various areas that the IAEA Department of Safeguards has committed to pursue as the best sustainable way of maintaining the ability to implement reliable safeguards systems. (author)

  5. Large, dynamic, multi-protein complexes: a challenge for structural biology

    Czech Academy of Sciences Publication Activity Database

    Rozycki, B.; Bouřa, Evžen

    2014-01-01

    Roč. 26, č. 46 (2014), 463103/1-463103/11 ISSN 0953-8984 R&D Projects: GA MŠk LO1302 EU Projects: European Commission(XE) 333916 - STARPI4K Institutional support: RVO:61388963 Keywords : protein structure * multi-protein complexes * hybrid methods of structural biology Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.346, year: 2014

  6. Building flexibility and managing complexity in community mental health: lessons learned in a large urban centre.

    Science.gov (United States)

    Stergiopoulos, Vicky; Saab, Dima; Francombe Pridham, Kate; Aery, Anjana; Nakhost, Arash

    2018-01-24

    Across many jurisdictions, adults with complex mental health and social needs face challenges accessing appropriate supports due to system fragmentation and strict eligibility criteria of existing services. To support this underserviced population, Toronto's local health authority launched two novel community mental health models in 2014, inspired by Flexible Assertive Community Treatment principles. This study explores service user and provider perspectives on the acceptability of these services, and lessons learned during early implementation. We purposively sampled 49 stakeholders (staff, physicians, service users, health systems stakeholders) and conducted 17 semi-structured qualitative interviews and 5 focus groups between October 23, 2014 and March 2, 2015, exploring stakeholder perspectives on the newly launched team-based models, as well as activities and strategies employed to support early implementation. Interviews and focus groups were audio-recorded, transcribed verbatim and analyzed using thematic analysis. Findings revealed wide-ranging endorsement of the two team-based models' success in engaging the target population of adults with complex service needs. Implementation strengths included the broad recognition of existing service gaps, the use of interdisciplinary teams and experienced service providers, broad partnerships and collaboration among various service sectors, and training and team-building activities. Emerging challenges included lack of complementary support services such as suitable housing, organizational contexts reluctant to embrace change and the risk associated with complexity, as well as limited service provider and organizational capacity to deliver evidence-based interventions. Findings identified implementation drivers at the practitioner, program, and system levels, specific to the implementation of community mental health interventions for adults with complex health and social needs. These can inform future efforts to address the health

  7. Large Eddy Simulations of Complex Flows in IC-Engine's Exhaust Manifold and Turbine

    OpenAIRE

    Fjällman, Johan

    2014-01-01

    The thesis deals with the flow in pipe bends and radial turbines geometries that are commonly found in an Internal Combustion Engine (ICE). The development phase of internal combustion engines relies more and more on simulations as an important complement to experiments. This is partly because of the reduction in development cost and the shortening of the development time. This is one of the reasons for the need of more accurate and predictive simulations. By using more complex computational ...

  8. Complexity impact factors on the integration process of ERP and non ERP systems : a basis for an evaluation instrument

    NARCIS (Netherlands)

    Janssens, G.; Hoeijenbos, M.; Kusters, R.J.; Cuaresma, M.J.E.; Shishkov, B.; Cordeiro, J.

    2011-01-01

    This study shows an expert confirmed initial list of factors which influence the complexity of the integration process of ERP systems and non ERP systems. After a thorough search for complexity factors in scientific literature, a survey amongst 8 experts in a leading European long special steel

  9. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between the government, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed that includes agent representatives for the administrative stakeholders of European Member States, insurance and reinsurance markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
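
A minimal sketch of the kind of household-level adaptation rule such an agent-based model iterates is shown below. All names and numbers are hypothetical illustrations (a simple expected-utility threshold with a government subsidy), not the authors' behaviour models.

```python
def step(households, subsidy_rate, flood_probability, horizon=10):
    """One decision round: a household adopts a protective measure if the
    expected avoided damage over its planning horizon exceeds the
    out-of-pocket cost after any government subsidy. Returns the number
    of protected households after the round."""
    for h in households:
        if h["protected"]:
            continue
        expected_avoided = flood_probability * horizon * h["damage"] * h["effectiveness"]
        out_of_pocket = h["measure_cost"] * (1.0 - subsidy_rate)
        if expected_avoided > out_of_pocket:
            h["protected"] = True
    return sum(h["protected"] for h in households)

# Two hypothetical households exposed to the same 1-in-20-year flood
households = [
    {"damage": 50000.0, "effectiveness": 0.4, "measure_cost": 8000.0, "protected": False},
    {"damage": 20000.0, "effectiveness": 0.4, "measure_cost": 8000.0, "protected": False},
]
adopted_baseline = step(households, subsidy_rate=0.0, flood_probability=0.05)
adopted_subsidized = step(households, subsidy_rate=0.6, flood_probability=0.05)
print(adopted_baseline, adopted_subsidized)  # 1 2
```

The second round shows the interaction the abstract describes: the subsidy tips the low-damage household into adopting.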

  10. Integrated calibration of a 3D attitude sensor in large-scale metrology

    International Nuclear Information System (INIS)

    Gao, Yang; Lin, Jiarui; Yang, Linghui; Zhu, Jigui; Muelaner, Jody; Keogh, Patrick

    2017-01-01

    A novel calibration method is presented for a multi-sensor fusion system in large-scale metrology, which improves the calibration efficiency and reliability. The attitude sensor is composed of a pinhole prism, a converging lens, an area-array camera and a biaxial inclinometer. A mathematical model is established to determine its 3D attitude relative to a cooperative total station by using two vector observations from the imaging system and the inclinometer. There are two areas of unknown parameters in the measurement model that should be calibrated: the intrinsic parameters of the imaging model, and the transformation matrix between the camera and the inclinometer. An integrated calibration method using a three-axis rotary table and a total station is proposed. A single mounting position of the attitude sensor on the rotary table is sufficient to solve for all parameters of the measurement model. A correction technique for the reference laser beam of the total station is also presented to remove the need for accurate positioning of the sensor on the rotary table. Experimental verification has proved the practicality and accuracy of this calibration method. Results show that the mean deviations of attitude angles using the proposed method are less than 0.01°. (paper)
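
Determining attitude from two vector observations (here, the imaging system's and the inclinometer's) is a classic two-vector problem. The TRIAD algorithm below is a generic closed-form illustration of that step, not the authors' exact estimator.

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(a):
    n = (a[0] ** 2 + a[1] ** 2 + a[2] ** 2) ** 0.5
    return (a[0] / n, a[1] / n, a[2] / n)

def triad(v1_body, v2_body, v1_ref, v2_ref):
    """TRIAD: build an orthonormal triad from each vector pair and compose
    the attitude matrix R (body -> reference) as R = M_ref * M_body^T."""
    t1b = normalize(v1_body); t2b = normalize(cross(v1_body, v2_body))
    t3b = cross(t1b, t2b)
    t1r = normalize(v1_ref); t2r = normalize(cross(v1_ref, v2_ref))
    t3r = cross(t1r, t2r)
    Tb, Tr = (t1b, t2b, t3b), (t1r, t2r, t3r)
    return [[sum(Tr[k][i] * Tb[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Example: body frame rotated 90 degrees about z relative to the reference;
# v2 plays the role of the (z-aligned) gravity vector, so it is unchanged.
R = triad(v1_body=(1, 0, 0), v2_body=(0, 0, 1),
          v1_ref=(0, 1, 0), v2_ref=(0, 0, 1))
print(R)
```

The first observation is trusted exactly and the second only fixes the rotation about it, which is why the more accurate sensor is conventionally passed as the first pair.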

  11. Large Scale Production of Densified Hydrogen Using Integrated Refrigeration and Storage

    Science.gov (United States)

    Notardonato, William U.; Swanger, Adam Michael; Jumper, Kevin M.; Fesmire, James E.; Tomsik, Thomas M.; Johnson, Wesley L.

    2017-01-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage (IRAS) technology at NASA Kennedy Space Center led to the production of large quantities of solid densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels, and LH2 temperatures were measured by twenty silicon diode temperature sensors. System energy balances and solid mass fractions are calculated. Experimental data reveal hydrogen temperatures dropped well below the triple point during testing (by up to 1 K) and were continuing to trend downward prior to system shutdown. Sub-triple-point temperatures were seen to evolve in a time-dependent manner along the length of the horizontal, cylindrical vessel. Data from the twenty silicon diode temperature sensors were recorded over approximately one month of testing at two different fill levels (33% and 67%). The phenomenon, observed at both fill levels, is described, detailed and explained herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  12. Vedic division methodology for high-speed very large scale integration applications

    Directory of Open Access Journals (Sweden)

    Prabir Saha

    2014-02-01

    Full Text Available Transistor-level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The potentiality of the ‘Dhvajanka (on top of the flag)’ formula was adopted from Vedic mathematics to implement such a divider for practical very large scale integration applications. The division methodology was implemented through half of the divisor bits instead of the actual divisor, plus subtraction and a little multiplication. Propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked, and performance parameters like propagation delay and dynamic power consumption were calculated through SPICE (Spectre) simulations with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16)-bit divider circuitry was only ∼300 ns, and it consumed ∼32.5 mW power for a layout area of 17.39 mm^2. By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations was eliminated, yielding ∼47, ∼38 and ∼34% reductions in delay and ∼34, ∼21 and ∼18% reductions in power compared with the most commonly used architectures (e.g. digit-recurrence, Newton–Raphson, Goldschmidt).
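
For contrast with the stage-reduced Vedic scheme, the conventional baseline is a digit-recurrence divider that iterates once per quotient bit. The sketch below is a generic restoring binary division, shown only to make the iteration count concrete; it is not the Dhvajanka formula itself.

```python
def restoring_divide(dividend, divisor, bits=32):
    """Classic digit-recurrence (restoring) binary division: one shift,
    compare and conditional subtract per quotient bit -- the iterative
    scheme whose iteration count flag-division methods aim to cut.
    Returns (quotient, remainder) for non-negative integers."""
    q, r = 0, 0
    for i in range(bits - 1, -1, -1):
        r = (r << 1) | ((dividend >> i) & 1)  # bring down the next dividend bit
        if r >= divisor:                      # trial subtraction succeeds
            r -= divisor
            q |= 1 << i                       # set this quotient bit
    return q, r

print(restoring_divide(100, 7))  # (14, 2)
```

Each quotient bit costs one full iteration here, which is why reducing stages (as the Letter reports) translates directly into lower delay and power.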

  13. Practical method of calculating time-integrated concentrations at medium and large distances

    International Nuclear Information System (INIS)

    Cagnetti, P.; Ferrara, V.

    1980-01-01

    Previous reports have covered the possibility of calculating time-integrated concentrations (TICs) for a prolonged release, based on concentration estimates for a brief release. This study proposes a simple method of evaluating concentrations in the air at medium and large distances for a brief release. It is known that the stability of the atmospheric layers close to ground level influences diffusion only over short distances. Beyond some tens of kilometers, as the pollutant cloud progressively reaches higher layers, diffusion is affected by factors other than the stability at ground level, such as wind shear at intermediate distances and the divergence and rotational motion of air masses towards the upper limit of the mesoscale and on the synoptic scale. Using the data available in the literature, expressions for σy and σz are proposed for transfer times corresponding to distances of up to several thousand kilometres, for two initial diffusion situations (up to distances of 10 - 20 km), characterized by stable and neutral conditions respectively. Using this method, simple hand calculations can be made for any problem relating to the diffusion of radioactive pollutants over long distances
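
Once σy and σz are fixed for a given travel distance, the hand calculation the abstract mentions reduces, in the standard Gaussian-plume picture, to a single formula for the TIC of a brief ground-level release. The sketch below uses that textbook expression with invented numbers; the release size, wind speed and sigma values are illustrative, not the report's curves.

```python
import math

def tic_ground_level(q_total, u, sigma_y, sigma_z, y=0.0):
    """Time-integrated concentration (e.g. Bq s/m^3) at ground level for a
    brief ground-level release of q_total, mean wind speed u, crosswind
    offset y. Ground reflection is included, hence pi rather than 2*pi."""
    return (q_total / (math.pi * sigma_y * sigma_z * u)) * \
        math.exp(-y ** 2 / (2.0 * sigma_y ** 2))

# Illustrative numbers: 1e12 Bq released, 5 m/s wind, sigmas at some
# far-field travel distance (hypothetical values).
tic_centerline = tic_ground_level(1e12, 5.0, sigma_y=4000.0, sigma_z=800.0)
tic_offaxis = tic_ground_level(1e12, 5.0, sigma_y=4000.0, sigma_z=800.0, y=2000.0)
print(tic_centerline, tic_offaxis)
```

For a prolonged release, the earlier reports' approach amounts to summing such brief-release TICs over successive trajectory segments.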

  14. Integrating Web-Based Teaching Tools into Large University Physics Courses

    Science.gov (United States)

    Toback, David; Mershin, Andreas; Novikova, Irina

    2005-12-01

    Teaching students in our large, introductory, calculus-based physics courses to be good problem-solvers is a difficult task. Not only must students be taught to understand and use the physics concepts in a problem, they must become adept at turning the physical quantities into symbolic variables, translating the problem into equations, and "turning the crank" on the mathematics to find both a closed-form solution and a numerical answer. Physics education research has shown that students' poor math skills and instructors' lack of pen-and-paper homework grading resources, two problems we face at our institution, can have a significant impact on problem-solving skill development.2-4 While Interactive Engagement methods appear to be the preferred mode of instruction,5 for practical reasons we have not been able to widely implement them. In this paper, we describe three Internet-based "teaching-while-quizzing" tools we have developed and how they have been integrated into our traditional lecture course in powerful but easy to incorporate ways.6 These are designed to remediate students' math deficiencies, automate homework grading, and guide study time toward problem solving. Our intent is for instructors who face similar obstacles to adopt these tools, which are available upon request.7

  15. Large scale continuous integration and delivery : Making great software better and faster

    NARCIS (Netherlands)

    Stahl, Daniel

    2017-01-01

    Since the inception of continuous integration, and later continuous delivery, the methods of producing software in the industry have changed dramatically over the last two decades. Automated, rapid and frequent compilation, integration, testing, analysis, packaging and delivery of new software

  16. Spacer capture and integration by a type I-F Cas1-Cas2-3 CRISPR adaptation complex.

    Science.gov (United States)

    Fagerlund, Robert D; Wilkinson, Max E; Klykov, Oleg; Barendregt, Arjan; Pearce, F Grant; Kieper, Sebastian N; Maxwell, Howard W R; Capolupo, Angela; Heck, Albert J R; Krause, Kurt L; Bostina, Mihnea; Scheltema, Richard A; Staals, Raymond H J; Fineran, Peter C

    2017-06-27

    CRISPR-Cas adaptive immune systems capture DNA fragments from invading bacteriophages and plasmids and integrate them as spacers into bacterial CRISPR arrays. In type I-E and II-A CRISPR-Cas systems, this adaptation process is driven by Cas1-Cas2 complexes. Type I-F systems, however, contain a unique fusion of Cas2, with the type I effector helicase and nuclease for invader destruction, Cas3. By using biochemical, structural, and biophysical methods, we present a structural model of the 400-kDa Cas1₄-Cas2-3₂ complex from Pectobacterium atrosepticum with bound protospacer substrate DNA. Two Cas1 dimers assemble on a Cas2 domain dimeric core, which is flanked by two Cas3 domains forming a groove where the protospacer binds to Cas1-Cas2. We developed a sensitive in vitro assay and demonstrated that Cas1-Cas2-3 catalyzed spacer integration into CRISPR arrays. The integrase domain of Cas1 was necessary, whereas integration was independent of the helicase or nuclease activities of Cas3. Integration required at least partially duplex protospacers with free 3'-OH groups, and leader-proximal integration was stimulated by integration host factor. In a coupled capture and integration assay, Cas1-Cas2-3 processed and integrated protospacers independent of Cas3 activity. These results provide insight into the structure of protospacer-bound type I Cas1-Cas2-3 adaptation complexes and their integration mechanism.

  17. Automated NMR fragment based screening identified a novel interface blocker to the LARG/RhoA complex.

    Directory of Open Access Journals (Sweden)

    Jia Gao

    Full Text Available The small GTPase cycles between the inactive GDP form and the activated GTP form, catalyzed by upstream guanine exchange factors. The modulation of this process by small molecules has been proven to be a fruitful route for therapeutic intervention to prevent the over-activation of the small GTPase. The fragment-based approach emerging in the past decade has demonstrated its paramount potential in the discovery of inhibitors targeting such novel and challenging protein-protein interactions. The details regarding the procedure of NMR fragment screening from scratch have rarely been disclosed comprehensively, which restricts its wider application. To achieve a consistent screening applicable to a number of targets, we developed a highly automated protocol to cover every aspect of NMR fragment screening as far as possible, including the construction of a small but diverse library, determination of the aqueous solubility by NMR, grouping compounds with mutual dispersity into a cocktail, and the automated processing and visualization of the ligand-based screening spectra. We exemplified our streamlined screening on RhoA alone and on the complex of the small GTPase RhoA and its upstream guanine exchange factor LARG. Two hits were confirmed from the primary screening in cocktail and the secondary screening over individual hits for the LARG/RhoA complex, while one of them was also identified from the screening of RhoA alone. HSQC titration of the two hits against RhoA and LARG alone, respectively, identified one compound binding to RhoA.GDP with a 0.11 mM affinity, which perturbed residues in the switch II region of RhoA. This hit blocked the formation of the LARG/RhoA complex, as validated by native gel electrophoresis and by titration of RhoA into ¹⁵N-labeled LARG in the absence and presence of the compound, respectively. It therefore provides us a starting point toward a more potent inhibitor of RhoA activation catalyzed by LARG.

  18. Electromagnetic modelling of large complex 3-D structures with LEGO and the eigencurrent expansion method

    NARCIS (Netherlands)

    Lancellotti, V.; Hon, de B.P.; Tijhuis, A.G.

    2009-01-01

    Linear embedding via Green's operators (LEGO) is a computational method in which the multiple scattering between adjacent objects - forming a large composite structure - is determined through the interaction of simple-shaped building domains, whose electromagnetic (EM) behavior is accounted for by

  19. From Collective Knowledge to Intelligence : Pre-Requirements Analysis of Large and Complex Systems

    NARCIS (Netherlands)

    Liang, Peng; Avgeriou, Paris; He, Keqing; Xu, Lai

    2010-01-01

    Requirements engineering is essentially a social collaborative activity in which involved stakeholders have to closely work together to communicate, elicit, negotiate, define, confirm, and finally come up with the requirements for the system to be implemented or upgraded. In the development of large

  20. A constraint logic programming approach to associate 1D and 3D structural components for large protein complexes.

    Science.gov (United States)

    Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang

    2007-01-01

    The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
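
The flavour of the association problem can be miniaturized: assign sequence-predicted helices to helices extracted from the low-resolution density, subject to a length constraint, and minimize the total mismatch. The brute-force toy below stands in for the CLP search (the lengths are invented; the real framework adds relative-position and connectivity constraints plus parallelism).

```python
from itertools import permutations

def best_association(predicted_lengths, observed_lengths, tol=2):
    """Toy 1D-to-3D helix association: find the assignment of
    sequence-predicted helices to density-extracted helices that satisfies
    a per-helix length constraint (|difference| <= tol residues) and
    minimizes the total mismatch. Exhaustive enumeration stands in for
    constraint propagation. Returns (assignment, cost) or (None, None)."""
    best, best_cost = None, None
    for perm in permutations(range(len(observed_lengths))):
        costs = [abs(predicted_lengths[i] - observed_lengths[j])
                 for i, j in enumerate(perm)]
        if any(c > tol for c in costs):
            continue  # constraint violated: prune this assignment
        cost = sum(costs)
        if best_cost is None or cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

# Three predicted helices (residue counts) vs. three observed helix lengths
print(best_association([12, 20, 9], [10, 11, 21]))  # ((1, 2, 0), 3)
```

Constraint solvers gain over this enumeration precisely by pruning before assignments are completed, which is what makes large complexes tractable.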

  1. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  2. Large Differences in the Optical Spectrum Associated with the Same Complex: The Effect of the Anisotropy of the Embedding Lattice

    DEFF Research Database (Denmark)

    Aramburu, José Antonio; García-Fernández, Pablo; García Lastra, Juan Maria

    2017-01-01

    of the electric field created by the rest of lattice ions over the complex. To illustrate this concept we analyze the origin of the surprisingly large differences in the d–d optical transitions of two systems containing square-planar CuF42– complexes, CaCuF4, and center II in Cu2+-doped Ba2ZnF6, even though the Cu2+–F– distance difference is found to be just 1%. Using a minimalist first-principles model we show that the different morphology of the host lattices creates an anisotropic field that red-shifts the in vacuo complex transitions to the 1.25–1.70 eV range in CaCuF4, while it blue-shifts them to the 1...

  3. Direct Observation of Very Large Zero-Field Splitting in a Tetrahedral Ni(II)Se4 Coordination Complex.

    Science.gov (United States)

    Jiang, Shang-Da; Maganas, Dimitrios; Levesanos, Nikolaos; Ferentinos, Eleftherios; Haas, Sabrina; Thirunavukkuarasu, Komalavalli; Krzystek, J; Dressel, Martin; Bogani, Lapo; Neese, Frank; Kyritsis, Panayotis

    2015-10-14

    The high-spin (S = 1) tetrahedral Ni(II) complex [Ni{(i)Pr2P(Se)NP(Se)(i)Pr2}2] was investigated by magnetometry, spectroscopic, and quantum chemical methods. Angle-resolved magnetometry studies revealed the orientation of the magnetization principal axes. The very large zero-field splitting (zfs), D = 45.40(2) cm(-1), E = 1.91(2) cm(-1), of the complex was accurately determined by far-infrared magnetic spectroscopy, directly observing transitions between the spin sublevels of the triplet ground state. These are the largest zfs values ever determined--directly--for a high-spin Ni(II) complex. Ab initio calculations further probed the electronic structure of the system, elucidating the factors controlling the sign and magnitude of D. The latter is dominated by spin-orbit coupling contributions of the Ni ions, whereas the corresponding effects of the Se atoms are remarkably smaller.
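
For context, the quoted D and E enter the standard zero-field-splitting spin Hamiltonian; for S = 1 the triplet sublevels split at zero field so that far-infrared spectroscopy observes transitions near D − E and D + E (≈43.5 and ≈47.3 cm⁻¹ for the values above):

```latex
\hat{H}_{\mathrm{zfs}} = D\left[\hat{S}_z^2 - \tfrac{1}{3}S(S+1)\right]
                        + E\left(\hat{S}_x^2 - \hat{S}_y^2\right),
\qquad S = 1:\quad E_{\pm} - E_0 = D \pm E .
```

With D ≫ what any conventional X-band EPR quantum can bridge, direct far-infrared detection of these two transitions is the natural route to D and E, as the abstract describes.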

  4. Motor imagery and its effect on complex regional pain syndrome: an integrative review

    Directory of Open Access Journals (Sweden)

    Nélio Silva de Souza

    2015-12-01

    Full Text Available Motor imagery (MI) has been proposed as a treatment in complex regional pain syndrome type 1 (CRPS-1), since it seems to promote a brain reorganization effect on sensory-motor areas of pain perception. The aim of this paper is to investigate, through an integrative critical review, the influence of MI on CRPS-1, correlating the evidence to clinical practice. Research in the PEDro, Medline, Bireme and Google Scholar databases was conducted. Nine randomized controlled trials (level 2), 1 non-controlled clinical study (level 3), 1 case study (level 4), 1 systematic review (level 1), 2 review articles and 1 comment (level 5) were found. We can conclude that MI has shown effects on pain and functionality that remain after 6 months of treatment. However, the difference between MI strategies for CRPS-1 remains unknown, as does how the intensity of mental effort influences the painful response or the effect of MI on other peripheral neuropathies.

  5. Integrated approach to knowledge acquisition and safety management of complex plants with emphasis on human factors

    International Nuclear Information System (INIS)

    Kosmowski, K.T.

    1998-01-01

    In this paper an integrated approach to the knowledge acquisition and safety management of complex industrial plants is proposed and outlined. The plant is considered within a man-technology-environment (MTE) system. The knowledge acquisition is aimed at the subsequent reliability evaluation of the human factor and probabilistic modeling of the plant. Properly structured initial knowledge is updated over the lifetime of the plant. The data and knowledge concerning the topology of safety-related systems and their functions are created in a graphical CAD system and are object oriented. Safety-oriented monitoring of the plant includes abnormal situations due to external and internal disturbances, failures of hardware/software components and failures of the human factor. The operation and safety related evidence is accumulated in special databases. Data/knowledge bases are designed in such a way as to support effectively the reliability and safety management of the plant. (author)

  6. Training requirements and responsibilities for the Buried Waste Integrated Demonstration at the Radioactive Waste Management Complex

    International Nuclear Information System (INIS)

    Vega, H.G.; French, S.B.; Rick, D.L.

    1992-09-01

    The Buried Waste Integrated Demonstration (BWID) is scheduled to conduct intrusive (hydropunch screening tests, bore hole installation, soil sampling, etc.) and nonintrusive (geophysical surveys) studies at the Radioactive Waste Management Complex (RWMC). These studies and activities will be limited to specific locations at the RWMC. The duration of these activities will vary, but most tasks are not expected to exceed 90 days. The BWID personnel requested that the Waste Management Operational Support Group establish the training requirements and training responsibilities for BWID personnel and BWID subcontractor personnel. This document specifies these training requirements and responsibilities. While the responsibilities of BWID and the RWMC are, in general, defined in the interface agreement, the training elements are based on regulatory requirements, DOE orders, DOE-ID guidance, state law, and the nature of the work to be performed

  7. An efficient fringe integral equation method for optimizing the antenna location on complex bodies

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Meincke, Peter; Breinbjerg, Olav

    2001-01-01

    The radiation pattern of an antenna mounted nearby, or directly on, a complex three-dimensional (3D) structure can be significantly influenced by this structure. Integral equations combined with the method of moments (MoM) provide an accurate means for calculating the scattering from the structures...... in such applications. The structure is then modelled by triangular or rectangular surface patches with corresponding surface current expansion functions. A MoM matrix which is independent of the antenna location can be obtained by modelling the antenna as an impressed electric or magnetic source, e.g., a slot antenna...... can be modelled by a magnetic Hertzian dipole. For flush-mounted antennas, or antennas mounted in close vicinity of the scattering structure, the nearby impressed source induces a highly peaked surface current on the scattering structure. For the low-order basis functions usually applied...

  8. Methods for the preparation of large quantities of complex single-stranded oligonucleotide libraries.

    Science.gov (United States)

    Murgha, Yusuf E; Rouillard, Jean-Marie; Gulari, Erdogan

    2014-01-01

    Custom-defined oligonucleotide collections have a broad range of applications in the fields of synthetic biology, targeted sequencing, and cytogenetics. They are also used to encode information for technologies like RNA interference, protein engineering and DNA-encoded libraries. High-throughput parallel DNA synthesis technologies developed for the manufacture of DNA microarrays can produce libraries of large numbers of different oligonucleotides, but in very limited amounts. Here, we compare three approaches to prepare large quantities of single-stranded oligonucleotide libraries derived from microarray-synthesized collections. The first approach, alkaline melting of double-stranded PCR-amplified libraries with a biotinylated strand captured on streptavidin-coated magnetic beads, results in little or no non-biotinylated ssDNA. The second method, wherein the phosphorylated strand of PCR-amplified libraries is nucleolytically hydrolyzed, is recommended when small amounts of libraries are needed. The third method, combining in vitro transcription of PCR-amplified libraries with reverse transcription of the RNA product into single-stranded cDNA, is our recommended method to produce large amounts of oligonucleotide libraries. Finally, we propose a method to remove any primer-binding sequences introduced during library amplification.

  9. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

    Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost in optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter–rotator with a very simple fabrication process is presented. Realization of silicon-based light non-reciprocity devices (e.g., an optical isolator, which is very important for transmitters to avoid sensitivity to reflections) is also demonstrated with the help of magneto-optical material via bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low-loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB m−1 with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low-loss Si3N4 optical waveguides, several devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (de)multiplexers, and high-extinction-ratio polarizers.

  10. Analysing a Chinese Regional Integrated Healthcare Organisation Reform Failure using a Complex Adaptive System Approach

    Directory of Open Access Journals (Sweden)

    Wenxi Tang

    2017-06-01

    Introduction: China’s organised health system has remained outdated for decades. Current health systems in many less market-oriented countries still adhere to traditional administrative-based directives and linear planning. Furthermore, they neglect the responsiveness and feedback of institutions and professionals, which often results in reform failure in integrated care. Complex adaptive system (CAS) theory provides a new perspective and methodology for analysing the health system and policy implementation. Methods: We observed the typical case of Qianjiang’s Integrated Health Organization (IHO) reform for 2 years to analyse integrated care reforms using CAS theory. Via questionnaires and interviews, we observed 32 medical institutions and 344 professionals. We compared their cooperative behaviours at both the organisational and inter-professional levels between 2013 and 2015, and further investigated potential reasons why medical institutions and professionals did not form an effective IHO. We discovered how interested parties in the policy implementation process influenced the reform outcome and, by theoretical induction, proposed a new semi-organised system and corresponding policy analysis flowchart that potentially suits the actual realisation of a CAS. Results: The reform did not achieve its desired effect. The Qianjiang IHO was loosely rather than closely integrated, and the levels of cooperation between organisations and professionals were low. This disappointing result was due to low mutual trust among IHO members, with the main contributing factors being insufficient financial incentives and the lack of a common vision. Discussion and Conclusions: The traditional 'organised health system' is old-fashioned. Rather than being completely organised or adaptive, the health system is currently more similar to a 'semi-organised system'. Medical institutions and professionals operate in a middle ground between complete adherence

  11. Exergoeconomic improvement of a complex cogeneration system integrated with a professional process simulator

    International Nuclear Information System (INIS)

    Vieira, Leonardo S.; Donatelli, Joao L.; Cruz, Manuel E.

    2009-01-01

    In this paper, the application of an iterative exergoeconomic methodology for the improvement of thermal systems to a complex combined-cycle cogeneration plant is presented. The methodology integrates exergoeconomics with a professional process simulator and represents an alternative to conventional mathematical optimization techniques, because it substantially reduces the number of variables to be considered in the improvement process. By exploiting the computational power of a simulator, the integrated approach permits the optimization routine to ignore the variables associated with the thermodynamic equations and thus to deal only with the economic equations and the objective function. In addition, the methodology combines recently available exergoeconomic techniques with qualitative and quantitative criteria to identify only those decision variables which matter for the improvement of the system. To demonstrate the strengths of the methodology, it is here applied to a 24-component cogeneration plant, which requires O(10^3) variables for its simulation. The results obtained are compared to those reached using a conventional mathematical optimization procedure, also coupled to the process simulator. It is shown that, for engineering purposes, improvement of the system is often more cost-effective and less time-consuming than optimization of the system.

  12. Petrochemical refinery and integrated petrochemical complexes; Refinaria petroquimica e complexos petroquimicos integrados

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Patricia C. dos; Seidl, Peter R.; Borschiver, Suzana [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Escola de Quimica

    2008-07-01

    Global demand for light olefins points to strong prospects for growth, stimulating investments in overall productive capacity. With propylene demand growing slightly faster than that of ethylene, rising prices and difficulties in the supply of petrochemical feedstocks (mainly naphtha and natural gas), steam crackers alone are not able to fill the light olefins gap, nor do they allow extraordinary margins. As petrochemical market dynamics also influence refining activities, there has been significant progress in the development of technologies for petrochemical refining, such as petrochemical FCC. This petrochemistry-refining integration offers great opportunities for synergy, since both industries share many common challenges, like more severe environmental requirements and optimizing the use of utilities. However, in the case of the valuation of non-conventional oils (which tend to increase in importance in oil markets), to take full advantage of this opportunity to add value to low-cost streams, deep conversion and treatment processes are of great significance in the refining scheme, to ensure enough feedstock for cracking. In this context, a petrochemical refinery seems to be an important alternative source of petrochemicals and may or may not be integrated into a petrochemical complex. (author)

  13. Does company size matter? Validation of an integrative model of safety behavior across small and large construction companies.

    Science.gov (United States)

    Guo, Brian H W; Yiu, Tak Wing; González, Vicente A

    2018-02-01

    Previous safety climate studies primarily focused on either large construction companies or the construction industry as a whole, while little is known about whether company size has significant effects on workers' understanding of safety climate measures and on the relationships between safety climate factors and safety behavior. Thus, this study aims to: (a) test the measurement equivalence (ME) of a safety climate measure across workers from small and large companies; (b) investigate whether company size alters the causal structure of the integrative model developed by Guo, Yiu, and González (2016). Data were collected from 253 construction workers in New Zealand using a safety climate measure. This study used multi-group confirmatory factor analysis (MCFA) to test the measurement equivalence of the safety climate measure and the structural invariance of the integrative model. Results indicate that workers from small and large companies understood the safety climate measure in a similar manner. In addition, it was suggested that company size does not change the causal structure and mediational processes of the integrative model. Both the measurement equivalence of the safety climate measure and the structural invariance of the integrative model were supported by this study. Practical applications: The findings of this study provide strong support for a meaningful use of the safety climate measure across construction companies of different sizes. Safety behavior promotion strategies designed based on the integrative model may be well suited for both large and small companies. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  14. Intermolecular symmetry-adapted perturbation theory study of large organic complexes

    International Nuclear Information System (INIS)

    Heßelmann, Andreas; Korona, Tatiana

    2014-01-01

    Binding energies for the complexes of the S12L database by Grimme [Chem. Eur. J. 18, 9955 (2012)] were calculated using intermolecular symmetry-adapted perturbation theory combined with a density-functional theory description of the interacting molecules. The individual interaction energy decompositions revealed no particular change in the stabilisation pattern compared to smaller dimer systems at equilibrium structures. This demonstrates that, to some extent, the qualitative description of the interaction of small dimer systems may be extrapolated to larger systems, a method that is widely used in force fields in which the total interaction energy is decomposed into atom-atom contributions. A comparison of the binding energies with accurate experimental reference values from Grimme, the latter including thermodynamic corrections from semiempirical calculations, showed fairly good agreement, to within the error range of the reference binding energies.

  15. Development of large scale industrial complex and its pollution. Case study of Kashima area

    Energy Technology Data Exchange (ETDEWEB)

    Nagai, S

    1975-01-01

    The development of the Kashima industrial complex, which embraces three townships, started in 1960 to promote both agricultural and industrial development using the most advanced techniques available for environmental pollution control. The chronological development progress is described with reference to capital investment, gross product, employment and labor supply, population, the status of agricultural land use, the annual revenue and expenditure of the three townships, and township tax. The environmental pollution control policies and measures taken since 1964 are reviewed. Emphasis was placed on preliminary investigations by various means, and emission standards were applied. However, many incidences of pollution damage occurred due to operational errors and accidental causes. The emission quantity of sulfur dioxide is to be reduced from 8,212 N m³/h in 1973 to 4,625 N m³/h in 1976.

  16. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level of the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand, and in support of ongoing research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large-scale multiple-source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available at the Institute. The system consists of a large-area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open-tube microfocus X-ray source and a 450 kV closed-tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high-precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system currently can accommodate samples up to 0.5 x 0.5 x 0.5 m in size and up to 50 kg in weight. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future.

  17. Coping with Complex Environmental and Societal Flood Risk Management Decisions: An Integrated Multi-criteria Framework

    Directory of Open Access Journals (Sweden)

    Love Ekenberg

    2011-08-01

    During recent years, a great deal of attention has been focused on the financial risk management of natural disasters. One reason is that the economic losses from floods, windstorms, earthquakes and other disasters in both developing and developed countries are escalating dramatically. It has become apparent that an integrated water resource management approach would be beneficial, taking the best interests of both society and the environment into consideration. One improvement consists of models capable of handling multiple criteria (conflicting objectives) as well as multiple stakeholders (conflicting interests). A systems approach is applied to cope with complex environmental and societal risk management decisions with respect to flood catastrophe policy formation, wherein the emphasis is on computer-based modeling and simulation techniques combined with methods for evaluating strategies in which numerous stakeholders are incorporated in the process. The resulting framework consists of a simulation model, a decision analytical tool, and a set of suggested policy strategies for policy formulation. The framework will aid decision makers with high-risk, complex environmental decisions subject to significant uncertainties.

  18. Inferior Olive HCN1 Channels Coordinate Synaptic Integration and Complex Spike Timing

    Directory of Open Access Journals (Sweden)

    Derek L.F. Garden

    2018-02-01

    Cerebellar climbing-fiber-mediated complex spikes originate from neurons in the inferior olive (IO), are critical for motor coordination, and are central to theories of cerebellar learning. Hyperpolarization-activated cyclic-nucleotide-gated (HCN) channels expressed by IO neurons have been considered as pacemaker currents important for oscillatory and resonant dynamics. Here, we demonstrate that in vitro, network actions of HCN1 channels enable bidirectional glutamatergic synaptic responses, while local actions of HCN1 channels determine the timing and waveform of synaptically driven action potentials. These roles are distinct from, and may complement, proposed pacemaker functions of HCN channels. We find that in behaving animals HCN1 channels reduce variability in the timing of cerebellar complex spikes, which serve as a readout of IO spiking. Our results suggest that spatially distributed actions of HCN1 channels enable the IO to implement network-wide rules for synaptic integration that modulate the timing of cerebellar climbing fiber signals.

  19. Magnetic storm generation by large-scale complex structure Sheath/ICME

    Science.gov (United States)

    Grigorenko, E. E.; Yermolaev, Y. I.; Lodkina, I. G.; Yermolaev, M. Y.; Riazantseva, M.; Borodkova, N. L.

    2017-12-01

    We study the temporal profiles of interplanetary plasma and magnetic field parameters as well as magnetospheric indices. We use our catalog of large-scale solar wind phenomena for the 1976-2000 interval (see the catalog for 1976-2016 at ftp://ftp.iki.rssi.ru/pub/omni/, prepared on the basis of the OMNI database (Yermolaev et al., 2009)) and the double superposed epoch analysis method (Yermolaev et al., 2010). Our analysis showed (Yermolaev et al., 2015) that the average profiles of the Dst and Dst* indices decrease during the Sheath interval (magnetic storm activity increases) and increase during the ICME interval. This profile coincides with the inverted distribution of storm numbers in the two intervals (Yermolaev et al., 2017). This behavior is explained by the following. (1) The IMF magnitude in the Sheath is higher than in the Ejecta and close to the value in the MC. (2) The Sheath has a 1.5 times higher efficiency of storm generation than the ICME (Nikolaeva et al., 2015). Most so-called CME-induced storms are really Sheath-induced storms, and this fact should be taken into account in Space Weather prediction. The work was supported in part by the Russian Science Foundation, grant 16-12-10062. References. 1. Nikolaeva N.S., Y. I. Yermolaev and I. G. Lodkina (2015), Modeling of the corrected Dst* index temporal profile on the main phase of the magnetic storms generated by different types of solar wind, Cosmic Res., 53(2), 119-127. 2. Yermolaev Yu. I., N. S. Nikolaeva, I. G. Lodkina and M. Yu. Yermolaev (2009), Catalog of Large-Scale Solar Wind Phenomena during 1976-2000, Cosmic Res., 47(2), 81-94. 3. Yermolaev, Y. I., N. S. Nikolaeva, I. G. Lodkina, and M. Y. Yermolaev (2010), Specific interplanetary conditions for CIR-induced, Sheath-induced, and ICME-induced geomagnetic storms obtained by double superposed epoch analysis, Ann. Geophys., 28, 2177-2186. 4. Yermolaev Yu. I., I. G. Lodkina, N. S. Nikolaeva and M. Yu. Yermolaev (2015), Dynamics of large-scale solar wind streams obtained by the double superposed epoch

  20. Large Scale Integration of Renewable Power Sources into the Vietnamese Power System

    Science.gov (United States)

    Kies, Alexander; Schyska, Bruno; Thanh Viet, Dinh; von Bremen, Lueder; Heinemann, Detlev; Schramm, Stefan

    2017-04-01

    The Vietnamese power system is expected to expand considerably in the upcoming decades. Installed power capacities are projected to grow from 39 GW in 2015 to 129.5 GW by 2030. Installed wind power capacities are expected to grow to 6 GW (0.8 GW in 2015) and solar power capacities to 12 GW (0.85 GW in 2015). This goes hand in hand with an increase of the renewable penetration in the power mix from 1.3% from wind and photovoltaics (PV) in 2015 to 5.4% by 2030. The overall potential for wind power in Vietnam is estimated to be around 24 GW. Moreover, the up-scaling of renewable energy sources was formulated as one of the prioritized targets of the Vietnamese government in the National Power Development Plan VII. In this work, we investigate the transition of the Vietnamese power system towards high shares of renewables. For this purpose, we jointly optimise the expansion of renewable generation facilities for wind and PV, and the transmission grid, within renewable build-up pathways until 2030 and beyond. To simulate the Vietnamese power system and its generation from renewable sources, we use highly spatially and temporally resolved historical weather and load data and the open-source modelling toolbox Python for Power System Analysis (PyPSA). We show that the highest potential of renewable generation for wind and PV is found in southern Vietnam and discuss the resulting need for transmission grid extensions in dependence on the optimal pathway. Furthermore, we show that the smoothing effect of wind power has several considerable benefits and that the Vietnamese hydro power potential can be used efficiently to provide balancing opportunities. This work is part of the R&D project "Analysis of the Large Scale Integration of Renewable Power into the Future Vietnamese Power System" (GIZ, 2016-2018).
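    The balancing role the abstract attributes to hydro power can be illustrated with a toy dispatch calculation. This sketch is not taken from the study and does not use PyPSA; all numbers, names and the simple merit order (wind and PV run first, flexible hydro covers the residual load up to an assumed capacity) are invented for illustration.

    ```python
    # Toy merit-order dispatch (illustrative only, not the study's model):
    # variable wind and PV generation is used first; flexible hydro covers
    # the residual load, capped by an assumed hydro capacity.

    def dispatch(load, wind, pv, hydro_cap):
        """Per-hour hydro output and unserved energy."""
        hydro, unserved = [], []
        for L, w, s in zip(load, wind, pv):
            residual = max(L - w - s, 0.0)   # load left after renewables
            h = min(residual, hydro_cap)     # hydro fills it, up to capacity
            hydro.append(h)
            unserved.append(residual - h)
        return hydro, unserved

    # Invented hourly data in GW: a midday PV peak and variable wind.
    load = [10, 12, 14, 13]
    wind = [6, 2, 1, 5]
    pv   = [0, 3, 6, 1]
    hydro, unserved = dispatch(load, wind, pv, hydro_cap=8.0)
    # hydro == [4, 7, 7, 7]; with 8 GW of hydro, no energy is unserved
    ```

    The same structure, scaled up and fed with real weather and load time series, is what a capacity-expansion model optimises over.
    
    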

  1. How do you assign persistent identifiers to extracts from large, complex, dynamic data sets that underpin scholarly publications?

    Science.gov (United States)

    Wyborn, Lesley; Car, Nicholas; Evans, Benjamin; Klump, Jens

    2016-04-01

    Persistent identifiers in the form of a Digital Object Identifier (DOI) are becoming more mainstream, assigned at both the collection and dataset level. For static datasets, this is a relatively straightforward matter. However, many new data collections are dynamic, with new data being appended, models and derivative products being revised with new data, or the data itself revised as processing methods improve. Further, because data collections are becoming accessible as services, researchers can log in and dynamically create user-defined subsets for specific research projects; they can also easily mix and match data from multiple collections, each of which can have a complex history. Inevitably, extracts from such dynamic data sets underpin scholarly publications, and this presents new challenges. The National Computational Infrastructure (NCI) has been experiencing these issues and making progress towards addressing them. The NCI is a large node of the Research Data Services (RDS) initiative of the Australian Government's research infrastructure, which currently makes available over 10 PBytes of priority research collections, ranging from geosciences, geophysics, environment, and climate through to astronomy, bioinformatics, and social sciences. Data are replicated to, or produced at, NCI and then processed there into higher-level data products or analysed directly. Individual datasets range from multi-petabyte computational models and large-volume raster arrays down to gigabyte-size, ultra-high-resolution datasets. To facilitate access, maximise reuse and enable integration across the disciplines, datasets have been organized on a platform called the National Environmental Research Data Interoperability Platform (NERDIP). Combined, the NERDIP data collections form a rich and diverse asset for researchers: their co-location and standardization optimises the value of existing data and forms a new resource to underpin data-intensive science. New publication
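    One way identifiers for extracts of dynamic datasets could be made reproducible (an illustrative assumption, not NCI's documented scheme; all names are hypothetical) is to derive them from the parent collection's persistent identifier, the dataset version, and a canonical form of the subset query, so that the same query against the same version always resolves to the same identifier:

    ```python
    # Hypothetical sketch: derive a stable extract identifier by hashing
    # (parent PID, dataset version, canonicalised subset query).
    import hashlib
    import json

    def extract_identifier(parent_doi, version, query):
        # Canonicalise the query so key order does not change the hash.
        canonical = json.dumps(query, sort_keys=True, separators=(",", ":"))
        digest = hashlib.sha256(
            f"{parent_doi}|{version}|{canonical}".encode()).hexdigest()[:16]
        return f"{parent_doi}/extract-{digest}"

    pid = extract_identifier("10.1234/example-collection", "v2.1",
                             {"var": "tasmax", "bbox": [110, -45, 155, -10]})
    # the same query against the same version always yields the same identifier;
    # a new dataset version yields a different one
    ```

    The point of the design is that the identifier records *which* frozen version the extract came from, so a revision of the underlying collection can never silently change what a publication cites.
    
    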

  2. Evaluating greenhouse gas emissions from hydropower complexes on large rivers in Eastern Washington

    Energy Technology Data Exchange (ETDEWEB)

    Arntzen, Evan V.; Miller, Benjamin L.; O'Toole, Amanda C.; Niehus, Sara E.; Richmond, Marshall C.

    2013-03-15

    Water bodies, such as freshwater lakes, are known to be net emitters of carbon dioxide (CO2) and methane (CH4). In recent years, significant greenhouse gas (GHG) emissions from tropical, boreal, and mid-latitude reservoirs have been reported. At a time when hydropower is increasing worldwide, a better understanding of seasonal and regional variation in GHG emissions is needed in order to develop a predictive understanding of such fluxes within man-made impoundments. We examined power-producing dam complexes at xeric temperate locations in the northwestern United States. Sampling environments on the Snake (Lower Monumental Dam complex) and Columbia Rivers (Priest Rapids Dam complex) included tributary, mainstem, embayment, forebay, and tailrace areas during winter and summer 2012. At each sampling location, GHG measurement pathways included surface gas flux, degassing as water passed through dams during power generation, ebullition within littoral embayments, and direct sampling of hyporheic pore water. Measurements were also carried out in a free-flowing reach of the Columbia River to estimate unaltered conditions. Surface flux resulted in very low emissions, with reservoirs acting as a sink for CO2 (up to –262 mg m-2 d-1, which is within the range previously reported for similarly located reservoirs). Surface flux of methane remained below 1 mg CH4 m-2 d-1, a value well below fluxes reported previously for temperate reservoirs. Water passing through the hydroelectric projects acted as a sink for CO2 during winter and a small source during summer, with mean degassing fluxes of –117 and 4.5 t CO2 d-1, respectively. Degassing of CH4 was minimal, with mean fluxes of 3.1 × 10-6 and –5.6 × 10-4 t CH4 d-1 during winter and summer, respectively. Gas flux due to ebullition was greater in coves located within the reservoirs than in coves within the free-flowing Hanford Reach, and CH4 flux exceeded that of CO2. Methane emissions varied widely across sampling locations

  3. Capturing variations in inundation with satellite remote sensing in a morphologically complex, large lake

    Science.gov (United States)

    Wu, Guiping; Liu, Yuanbo

    2015-04-01

    Poyang Lake is the largest freshwater lake in China, with high morphological complexity from south to north. In recent years, the lake has experienced expansion and shrinkage processes over both short- and long-term scales, resulting in significant hydrological, ecological and economic problems. Exactly how and how rapidly the processes of spatial change have occurred in the lake during the expansion and shrinkage periods is unknown. Such knowledge is of great importance for policymakers as it may help with flood/drought prevention, land use planning and lake ecological conservation. In this study, we investigated the spatial-temporal distribution and changing processes of inundation in Poyang Lake based on Moderate Resolution Imaging Spectroradiometer (MODIS) Level-1B data from 2000 to 2011. A defined water variation rate (WVR) and inundation frequency (IF) indicator revealed the water surface submersion and exposure processes of lake expansion and shrinkage in different zones which were divided according to the lake's hydrological and topographic features. Regional differences and significant seasonality variability were found in the annual and monthly mean IF. The monthly mean IF increased slowly from north to south during January-August but decreased quickly from south to north during September-December. During the lake expansion period, the lake-type water body zone (Zone II) had the fastest expansion rate, with a mean monthly WVR value of 34.47% in February-March, and was followed by the channel-type water body zone (Zone I) in March-May (22.47%). However, during the lake shrinkage period, rapid shrinkage first appeared around the alluvial delta zones in August-October. The sequence of lake surface shrinkage from August to December is exactly opposite to that of lake expansion from February to July. 
These complex inundation characteristics and changing processes were driven by the high temporal variability of the river flows, the morphological diversity of the
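    The two indicators named in the abstract can be sketched from first principles. This is an illustrative reconstruction, not the paper's code: the exact definitions of the water variation rate (WVR) and inundation frequency (IF) used in the study may differ, and all names here are hypothetical. IF is taken as the fraction of observations in which a pixel is classified as water, and WVR as the relative change in inundated area between two dates.

    ```python
    # Illustrative sketch: per-pixel inundation frequency (IF) and a
    # water variation rate (WVR) from boolean water masks (True = water),
    # one 2-D grid per observation date.

    def inundation_frequency(masks):
        """IF per pixel: fraction of observations in which the pixel is water."""
        n_obs = len(masks)
        rows, cols = len(masks[0]), len(masks[0][0])
        return [[sum(m[r][c] for m in masks) / n_obs for c in range(cols)]
                for r in range(rows)]

    def water_variation_rate(area_prev, area_curr):
        """WVR between two dates: relative change in inundated area (%)."""
        return 100.0 * (area_curr - area_prev) / area_prev

    # Toy example: three observations of a 2x2 scene.
    masks = [
        [[True, False], [True, True]],
        [[True, False], [False, True]],
        [[True, True],  [False, True]],
    ]
    IF = inundation_frequency(masks)                     # IF[0][0] == 1.0
    areas = [sum(sum(row) for row in m) for m in masks]  # inundated pixels: 3, 2, 3
    wvr = water_variation_rate(areas[1], areas[2])       # +50.0 % expansion
    ```

    Applied to MODIS-derived water masks, the same per-pixel bookkeeping yields the seasonal IF maps and zone-wise WVR values the abstract reports.
    
    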

  4. Large-scale in silico mapping of complex quantitative traits in inbred mice.

    Directory of Open Access Journals (Sweden)

    Pengyuan Liu

    2007-07-01

    Understanding the genetic basis of common disease and disease-related quantitative traits will aid in the development of diagnostics and therapeutics. The process of gene discovery can be sped up by rapid and effective integration of well-defined mouse genome and phenome data resources. We describe here an in silico gene-discovery strategy based on genome-wide association (GWA) scans in inbred mice with a wide range of genetic variation. We identified 937 quantitative trait loci (QTLs) from a survey of 173 mouse phenotypes, which include models of human disease (atherosclerosis, cardiovascular disease, cancer and obesity) as well as behavioral, hematological, immunological, metabolic, and neurological traits. 67% of the QTLs were refined into genomic regions <0.5 Mb, an approximately 40-fold increase in mapping precision compared with classical linkage analysis. This makes for more efficient identification of the genes that underlie disease. We have identified two QTL genes, Adam12 and Cdh2, as causal genetic variants for atherogenic diet-induced obesity. Our findings demonstrate that GWA analysis in mice has the potential to resolve multiple tightly linked QTLs and achieve single-gene resolution. These high-resolution QTL data can serve as a primary resource for positional cloning and gene identification in the research community.
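    The structure of such a scan can be illustrated with a deliberately minimal sketch. This is not the paper's method: real GWA pipelines correct for population structure and use proper test statistics, whereas this toy simply scores each biallelic marker by the difference in mean phenotype between its two allele groups across strains. All data and names are invented.

    ```python
    # Toy single-marker association scan across inbred strains
    # (illustrative only; not a statistically valid GWA analysis).

    def marker_score(genotypes, phenotypes):
        """Absolute difference in group mean phenotype for a 0/1 marker."""
        g0 = [p for g, p in zip(genotypes, phenotypes) if g == 0]
        g1 = [p for g, p in zip(genotypes, phenotypes) if g == 1]
        if not g0 or not g1:      # monomorphic marker: uninformative
            return 0.0
        return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

    def scan(snp_table, phenotypes):
        """Return (marker, score) pairs sorted by decreasing score."""
        scores = [(name, marker_score(geno, phenotypes))
                  for name, geno in snp_table.items()]
        return sorted(scores, key=lambda t: t[1], reverse=True)

    # Invented data: 6 strains, 3 markers, one quantitative trait.
    snp_table = {
        "rsA": [0, 0, 0, 1, 1, 1],   # co-segregates with the trait
        "rsB": [0, 1, 0, 1, 0, 1],   # unrelated to the trait
        "rsC": [1, 1, 1, 1, 1, 1],   # monomorphic
    }
    weights = [20.1, 21.0, 19.8, 29.5, 30.2, 31.0]
    ranked = scan(snp_table, weights)
    # ranked[0][0] == "rsA": the co-segregating marker tops the scan
    ```

    Scaled to millions of SNPs and hundreds of phenotypes, with an appropriate statistic in place of the mean difference, this is the loop an in silico GWA scan runs.
    
    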

  5. Integration of large wind farms into weak power grids. Emphasis on the Ethiopian interconnected system (ICS)

    Energy Technology Data Exchange (ETDEWEB)

    Bantyirga Gessesse, Belachew

    2013-07-18

    The impact of increased wind power on the steady-state and dynamic behavior of the Ethiopian power system is the main focus of this thesis. The integration of wind power into an existing grid with conventional generators introduces a new set of challenges regarding system security and operational planning, the main cause of the difference being the uncertainty of the primary source of energy and the response time following a disturbance. To incorporate wind turbine models into the overall dynamic model of the system and investigate the effect of wind on its dynamic behavior, models of wind turbine components were first put together by reviewing the current state of the art in wind turbine modeling and control concepts. The theoretical insight thus gained was applied to the Ethiopian power system as a case study. Since the models of the installed turbines were either not available or incomplete, an alternative modeling approach based on generic models was adopted. The generic model, in addition to obviating the need for technology- or manufacturer-specific models, reduces the complexity of the dynamic model. Using this procedure, generic dynamic models for the wind farms in the system were developed. The capability of the dynamic models to reproduce the dynamic response of the system has been verified by comparing simulation results obtained with a detailed and a generic wind farm model. It could be shown that the generic wind turbine model is simple, but accurate enough to represent any wind turbine type or entire wind farms for power system stability analysis. The next task was the study of the effect of increased wind power levels on the general behavior of the Ethiopian system. It is observed that, overall, the impact of wind turbines on the operational indices of the system was, as could be expected, more pronounced in the vicinity of the wind farm. But the power angle oscillation following a disturbance was observed across the whole system. Further, as a

  6. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    intrusion. In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional scale saltwater intrusion in order to analyse and quantify...... the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent and to predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale...... parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic...

  7. APINetworks Java. A Java approach to the efficient treatment of large-scale complex networks

    Science.gov (United States)

    Muñoz-Caro, Camelia; Niño, Alfonso; Reyes, Sebastián; Castillo, Miriam

    2016-10-01

    We present a new version of the core structural package of our Application Programming Interface, APINetworks, for the treatment of complex networks in arbitrary computational environments. The new version is written in Java and has several advantages over the previous C++ version: the portability of the Java code, the ease of object-oriented design implementations, and the simplicity of memory management. In addition, new data structures are introduced for storing the sets of nodes and edges. By resorting to the different garbage collectors currently available in the JVM, the Java version is much more efficient than the C++ one with respect to memory management. In particular, the G1 collector is the most efficient, because G1 executes in parallel with the Java application. Using G1, APINetworks Java outperforms the C++ version and the well-known NetworkX and JGraphT packages in the building and BFS traversal of linear and complete networks. The better memory management of the present version allows for the modeling of much larger networks.
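The benchmark described above, building a network and performing a breadth-first traversal, can be sketched in a few lines. The sketch below uses plain Python dictionaries rather than APINetworks itself, whose Java API is not reproduced here:

```python
from collections import deque

def complete_graph(n):
    # adjacency list of the complete network K_n
    return {v: [u for u in range(n) if u != v] for v in range(n)}

def bfs_order(adj, start):
    # classic queue-based breadth-first traversal
    seen, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return order

adj = complete_graph(5)
print(bfs_order(adj, 0))  # every node is one hop from 0: [0, 1, 2, 3, 4]
```

In a complete network every node is reached at depth one, which makes this a stress test of raw node/edge storage rather than of traversal logic, presumably why the cited comparison exercises memory management so heavily.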

  8. Complex long-distance effects of mutations that confer linezolid resistance in the large ribosomal subunit

    Science.gov (United States)

    Fulle, Simone; Saini, Jagmohan S.; Homeyer, Nadine; Gohlke, Holger

    2015-01-01

    The emergence of multidrug-resistant pathogens will make current antibiotics ineffective. For linezolid, a member of the novel oxazolidinone class of antibiotics, 10 nucleotide mutations in the ribosome have been described that confer resistance. Hypotheses for how these mutations affect antibiotic binding have been derived from comparative crystallographic studies. However, a detailed description at the atomistic level of how remote mutations exert long-distance effects has remained elusive. Here, we show that the G2032A-C2499A double mutation, located > 10 Å away from the antibiotic, confers linezolid resistance by a complex set of effects that percolate to the binding site. By molecular dynamics simulations and free energy calculations, we identify U2504 and C2452 as spearheads among binding site nucleotides that exert the most immediate effect on linezolid binding. Structural reorganizations within the ribosomal subunit due to the mutations are likely associated with mutually compensating changes in the effective energy. Furthermore, we suggest two main routes of information transfer from the mutation sites to U2504 and C2452. Between these, we observe cross-talk, which suggests that the synergistic effects observed for the two mutations arise in an indirect manner. These results should be relevant for the development of oxazolidinone derivatives that are active against linezolid-resistant strains. PMID:26202966

  9. LARGE-SCALE CO MAPS OF THE LUPUS MOLECULAR CLOUD COMPLEX

    International Nuclear Information System (INIS)

    Tothill, N. F. H.; Loehr, A.; Stark, A. A.; Lane, A. P.; Harnett, J. I.; Bourke, T. L.; Myers, P. C.; Parshley, S. C.; Wright, G. A.; Walker, C. K.

    2009-01-01

    Fully sampled degree-scale maps of the ¹³CO 2-1 and CO 4-3 transitions toward three members of the Lupus Molecular Cloud Complex (Lupus I, III, and IV) trace the column density and temperature of the molecular gas. Comparison with IR extinction maps from the c2d project requires most of the gas to have a temperature of 8-10 K. Estimates of the cloud mass from ¹³CO emission are roughly consistent with most previous estimates, while the line widths are higher, around 2 km s⁻¹. CO 4-3 emission is found throughout Lupus I, indicating widespread dense gas, and toward Lupus III and IV. Enhanced line widths at the NW end and along the edge of the B 228 ridge in Lupus I, and a coherent velocity gradient across the ridge, are consistent with interaction between the molecular cloud and an expanding H I shell from the Upper-Scorpius subgroup of the Sco-Cen OB Association. Lupus III is dominated by the effects of two HAe/Be stars, and shows no sign of external influence. Slightly warmer gas around the core of Lupus IV and a low line width suggest heating by the Upper-Centaurus-Lupus subgroup of Sco-Cen, without the effects of an H I shell.

  10. Detecting outliers and learning complex structures with large spectroscopic surveys - a case study with APOGEE stars

    Science.gov (United States)

    Reis, Itamar; Poznanski, Dovi; Baron, Dalya; Zasowski, Gail; Shahaf, Sahar

    2018-05-01

    In this work, we apply and expand on a recently introduced outlier detection algorithm that is based on an unsupervised random forest. We use the algorithm to calculate a similarity measure for stellar spectra from the Apache Point Observatory Galactic Evolution Experiment (APOGEE). We show that the similarity measure traces non-trivial physical properties and contains information about complex structures in the data. We use it for visualization and clustering of the data set, and discuss its ability to find groups of highly similar objects, including spectroscopic twins. Using the similarity matrix to search the data set for objects allows us to find objects that are impossible to find using their best-fitting model parameters. This includes extreme objects for which the models fail, and rare objects that are outside the scope of the model. We use the similarity measure to detect outliers in the data set, and find a number of previously unknown Be-type stars, spectroscopic binaries, carbon-rich stars, young stars, and a few that we cannot interpret. Our work further demonstrates the potential for scientific discovery when combining machine learning methods with modern survey data.
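The similarity measure in this family of methods is typically the random-forest proximity: two objects are similar if they land in the same leaf of many trees, and an outlier is an object with low average proximity to everything else. A minimal sketch of that final step, assuming the per-tree leaf indices have already been computed for each object (the APOGEE-specific step of training the forest against shuffled synthetic spectra is omitted):

```python
def proximity(leaves_a, leaves_b):
    # fraction of trees in which the two objects share a leaf
    same = sum(1 for la, lb in zip(leaves_a, leaves_b) if la == lb)
    return same / len(leaves_a)

def outlier_scores(leaf_matrix):
    # leaf_matrix[i][t] = leaf index of object i in tree t;
    # an outlier has low mean proximity to all other objects
    n = len(leaf_matrix)
    scores = []
    for i in range(n):
        mean_prox = sum(proximity(leaf_matrix[i], leaf_matrix[j])
                        for j in range(n) if j != i) / (n - 1)
        scores.append(1.0 - mean_prox)
    return scores

# three ordinary objects agreeing in most trees, one oddball
leaves = [[1, 2, 2], [1, 2, 2], [1, 2, 3], [9, 7, 5]]
print(outlier_scores(leaves))  # the last object gets the highest score
```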

  11. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem

    2016-12-28

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). Particularly, we aim at reducing the complexity of the linear precoder and receiver that maximize the minimum signal-to-interference-plus-noise ratio subject to a given power constraint. To this end, we consider the asymptotic regime in which M and K grow large with a given ratio. Tools from random matrix theory (RMT) are then used to compute, in closed form, accurate approximations for the parameters of the optimal precoder and receiver, when imperfect channel state information (modeled by the generic Gauss-Markov formulation) is available at the BS. The asymptotic analysis allows us to derive the asymptotically optimal linear precoder and receiver that are characterized by a lower complexity (due to the dependence on the large-scale components of the channel) and, possibly, by a better resilience to imperfect channel state information. However, the implementation of both is still challenging, as it requires fast inversions of large matrices in every coherence period. To overcome this issue, we apply the truncated polynomial expansion (TPE) technique to the precoding and receiving vector of each UE and make use of RMT to determine the optimal weighting coefficients on a per-UE basis that asymptotically solve the max-min SINR problem. Numerical results are used to validate the asymptotic analysis in the finite system regime and to show that the proposed TPE transceivers efficiently mimic the optimal ones, while requiring much lower computational complexity.
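The computational saving from TPE comes from replacing the matrix inverse in the transceiver by a low-order matrix polynomial. A generic illustration of the idea (not the paper's RMT-optimized per-UE coefficients) is the scaled Neumann series for a regularized Gram-matrix inverse; all dimensions below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
M, K, J = 32, 8, 12          # antennas, users, polynomial order
H = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)

A = H.conj().T @ H / M + 0.1 * np.eye(K)   # regularized Gram matrix (Hermitian)
lam = np.linalg.eigvalsh(A)
alpha = 2.0 / (lam.max() + lam.min())      # scaling ensuring convergence

# truncated polynomial expansion: inv(A) ~ alpha * sum_{j<J} (I - alpha*A)^j
approx = np.zeros_like(A)
term = np.eye(K, dtype=complex)
for _ in range(J):
    approx += term
    term = term @ (np.eye(K) - alpha * A)
approx *= alpha

err = np.linalg.norm(approx - np.linalg.inv(A)) / np.linalg.norm(np.linalg.inv(A))
print(f"relative error at order {J}: {err:.3e}")
```

Each polynomial term costs only matrix-vector products when applied to a precoding vector, which is the source of the complexity reduction relative to a full inverse per coherence period.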

  12. Regulating with imagery and the complexity of basic emotions. Comment on "The quartet theory of human emotions: An integrative and neurofunctional model" by S. Koelsch et al.

    Science.gov (United States)

    Meyer, Marcel; Kuchinke, Lars

    2015-06-01

    Literature, music and the arts have long attested to the complexity of human emotions. Hitherto, psychological and biological theories of emotions have largely neglected this rich heritage. In their review, Koelsch and colleagues [1] have embarked upon the pioneering endeavour of integrating the diverse perspectives in emotion research. Noting that prior neurobiological theories rely mainly on animal studies, the authors sought to complement this body of research with a model of complex ("moral") emotions in humans (henceforth: complex emotions). According to this novel framework, there are four main interacting affective centres in the brain. Each centre is associated with a dominant affective function, such as ascending activation (brainstem), pain/pleasure (diencephalon), attachment-related affects (hippocampus) or moral emotions and unconscious cognitive appraisal (orbitofrontal cortex). Furthermore, language is ascribed a key role in (a) the communication of subjective feeling (reconfiguration) and (b) the conscious regulation of emotions (by means of logic and rational thought).

  13. Intraoperative computed tomography with an integrated navigation system in stabilization surgery for complex craniovertebral junction malformation.

    Science.gov (United States)

    Yu, Xinguang; Li, Lianfeng; Wang, Peng; Yin, Yiheng; Bu, Bo; Zhou, Dingbiao

    2014-07-01

    This study was designed to report our preliminary experience with stabilization procedures for complex craniovertebral junction malformation (CVJM) using intraoperative computed tomography (iCT) with an integrated neuronavigation system (NNS), and to evaluate the workflow, feasibility and clinical outcome of stabilization procedures using iCT image-guided navigation for complex CVJM. Stabilization procedures in CVJM are complex because of the area's intricate geometry and bony structures, its critical relationship to neurovascular structures and the intricate biomechanical issues involved. A sliding gantry 40-slice computed tomography scanner was installed in a preexisting operating room. The images were transferred directly from the scanner to the NNS using an automated registration system. On the basis of the analysis of intraoperative computed tomographic images, 23 patients (11 males, 12 females) with complicated CVJM underwent navigated stabilization procedures to allow more control over screw placement. The ages of these patients were 19-52 years (mean: 33.5 y). We performed C1-C2 transarticular screw fixation in 6 patients to produce atlantoaxial arthrodesis with better reliability. Because of a high-riding transverse foramen on at least 1 side of the C2 vertebra and an anomalous vertebral artery position, 7 patients underwent C1 lateral mass and C2 pedicle screw fixation. Ten additional patients were treated with individualized occipitocervical fixation surgery owing to hypoplasia of C1 or constraints due to the C2 bone structure. In total, 108 screws were inserted into the 23 patients using navigational assistance. The screws comprised 20 C1 lateral mass screws, 26 C2, 14 C3, and 4 C4 pedicle screws, 32 occipital screws, and 12 C1-C2 transarticular screws. There were no vascular or neural complications except for pedicle perforations, which were detected in 2 (1.9%) patients and corrected intraoperatively without any persistent nerve or vessel damage.
The overall

  14. CoreFlow: A computational platform for integration, analysis and modeling of complex biological data

    DEFF Research Database (Denmark)

    Pasculescu, Adrian; Schoof, Erwin; Creixell, Pau

    2014-01-01

    A major challenge in mass spectrometry and other large-scale applications is how to handle, integrate, and model the data that is produced. Given the speed at which technology advances and the need to keep pace with biological experiments, we designed a computational platform, CoreFlow, which provides programmers with a framework to manage data in real-time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts ... between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-source software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion ...

  15. Heritability and demographic analyses in the large isolated population of Val Borbera suggest advantages in mapping complex traits genes.

    Directory of Open Access Journals (Sweden)

    Michela Traglia

    2009-10-01

    Isolated populations are a useful resource for mapping complex traits due to a shared stable environment, reduced genetic complexity and extended linkage disequilibrium (LD) compared to the general population. Here we describe a large genetic isolate from the North West Apennines, the mountain range that runs through Italy from the North West Alps to the south. The study involved 1,803 people living in 7 villages of the upper Borbera Valley. For this large population cohort, data from genealogy reconstruction, medical questionnaires, blood samples, and anthropometric and bone status QUS parameters were evaluated. Demographic and epidemiological analyses indicated a substantial genetic component contributing to the variation of each trait, as well as overlapping genetic determinants and family clustering for some traits. The data provide evidence for significant heritability of medically relevant traits that will be important in mapping quantitative traits. We suggest that this population isolate is suitable for identifying rare variants associated with complex phenotypes that may be difficult to study in larger but more heterogeneous populations.

  16. Isotopic shifts in chemical exchange systems. 1. Large isotope effects in the complexation of Na+ isotopes by macrocyclic polyethers

    International Nuclear Information System (INIS)

    Knoechel, A.; Wilken, R.D.

    1981-01-01

    The complexation of ²⁴Na⁺ and ²²Na⁺ by 18 of the most widely used macrocyclic polyethers (crown ethers and monocyclic and bicyclic aminopolyethers) has been investigated in view of possible equilibrium isotope shifts. Solvated salts and polyether complexes were distributed differently into two phases and isotope ratios determined in both phases. Chloroform/water systems were shown to be particularly suitable for the investigations, allowing favorable distribution for Na⁺ and 13 of the 18 polyethers employed. With crown ethers, ²⁴Na⁺ enrichment varied from nonsignificant values (for large crown ethers) up to 3.1 ± 0.4% (18-crown-6). In the case of bicyclic aminopolyethers, ligands with cages of optimum size to accommodate Na⁺ showed ²⁴Na⁺ enrichment between 0 (nonsignificant) (2.2B.2B) and 5.2 ± 1.8% (2.2.1). In contrast, for 2.2.2 and its derivatives, which are too large for Na⁺, ²²Na⁺ enrichment varying from 0 (nonsignificant) (2.2.2.p) up to 5.4 ± 0.5% (2.2.2) has been observed. These values are remarkably high. They are explained by different bonding in the solvate structure and the polyether complex, using the theoretical approach of Bigeleisen.

  17. Parental decision-making for medically complex infants and children: An integrated literature review

    Science.gov (United States)

    Allen, Kimberly A.

    2014-01-01

    Background: Many children with life-threatening conditions who would have died at birth are now surviving months to years longer than previously expected. Understanding how parents make decisions is necessary to prevent parental regret about decision-making, which can lead to psychological distress, decreased physical health, and decreased quality of life for the parents. Objective: The aim of this integrated literature review was to describe possible factors that affect parental decision-making for medically complex children. The critical decisions included continuation or termination of a high-risk pregnancy, initiation of life-sustaining treatments such as resuscitation, complex cardiothoracic surgery, use of experimental treatments, end-of-life care, and limitation of care or withdrawal of support. Design: PubMed, Cumulative Index of Nursing and Allied Health Literature, and PsycINFO were searched using the combined key terms ‘parents and decision-making’ to obtain English language publications from 2000 to June 2013. Results: The findings from each of the 31 articles retained were recorded. The strengths of the empirical research reviewed are that decisions about initiating life support and withdrawing life support have received significant attention. Researchers have explored how many different factors impact decision-making and have used multiple different research designs and data collection methods to explore the decision-making process. These initial studies lay the foundation for future research and have provided insight into parental decision-making during times of crisis. Conclusions: Studies must begin to include both parents and providers so that researchers can evaluate how decisions are made for individual children with complex chronic conditions to understand the dynamics between parents and parent–provider relationships. The majority of studies focused on one homogenous diagnostic group of premature infants and children with complex congenital

  18. Parental decision-making for medically complex infants and children: an integrated literature review.

    Science.gov (United States)

    Allen, Kimberly A

    2014-09-01

    Many children with life-threatening conditions who would have died at birth are now surviving months to years longer than previously expected. Understanding how parents make decisions is necessary to prevent parental regret about decision-making, which can lead to psychological distress, decreased physical health, and decreased quality of life for the parents. The aim of this integrated literature review was to describe possible factors that affect parental decision-making for medically complex children. The critical decisions included continuation or termination of a high-risk pregnancy, initiation of life-sustaining treatments such as resuscitation, complex cardiothoracic surgery, use of experimental treatments, end-of-life care, and limitation of care or withdrawal of support. PubMed, Cumulative Index of Nursing and Allied Health Literature, and PsycINFO were searched using the combined key terms 'parents and decision-making' to obtain English language publications from 2000 to June 2013. The findings from each of the 31 articles retained were recorded. The strengths of the empirical research reviewed are that decisions about initiating life support and withdrawing life support have received significant attention. Researchers have explored how many different factors impact decision-making and have used multiple different research designs and data collection methods to explore the decision-making process. These initial studies lay the foundation for future research and have provided insight into parental decision-making during times of crisis. Studies must begin to include both parents and providers so that researchers can evaluate how decisions are made for individual children with complex chronic conditions to understand the dynamics between parents and parent-provider relationships. The majority of studies focused on one homogenous diagnostic group of premature infants and children with complex congenital heart disease. Thus comparisons across other child

  19. Complexity analysis on public transport networks of 97 large- and medium-sized cities in China

    Science.gov (United States)

    Tian, Zhanwei; Zhang, Zhuo; Wang, Hongfei; Ma, Li

    2018-04-01

    The traffic situation in Chinese urban areas continues to deteriorate. To improve the planning and design of public transport systems, profound research on the structure of urban public transport networks (PTNs) is necessary. We investigate the PTNs of 97 large- and medium-sized cities in China, construct three types of network models (bus stop network, bus transit network and bus line network), and analyze their structural characteristics. It is revealed that the bus stop network is small-world and scale-free, while the bus transit network and bus line network are both small-world. The betweenness centrality of each city's PTN shows a similar distribution pattern, although the networks vary in size. Classifying cities according to the characteristics of their PTNs or their level of economic development yields similar results, indicating a strong correlation between economic development and the transport network: PTNs expand in a predictable pattern as the economy develops.
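The small-world property asserted above amounts to a high clustering coefficient combined with a short average path length. A stdlib-only sketch of both metrics on a toy adjacency list (a real bus stop network would be built from stop-sequence data):

```python
from collections import deque
from itertools import combinations

def clustering(adj, v):
    # fraction of a stop's neighbour pairs that are directly linked
    nbrs = adj[v]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (len(nbrs) * (len(nbrs) - 1))

def avg_path_length(adj):
    # mean BFS distance over all ordered pairs (graph assumed connected)
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    q.append(u)
        total += sum(d for t, d in dist.items() if t != s)
        pairs += len(adj) - 1
    return total / pairs

# toy bus stop network: two tight clusters bridged by one transfer stop
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4],
       4: [3, 5, 6], 5: [4, 6], 6: [4, 5]}
C = sum(clustering(adj, v) for v in adj) / len(adj)
print(C, avg_path_length(adj))
```

A network is called small-world when C is much higher, and the average path length about as short, as in a random graph with the same number of nodes and edges.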

  20. Complex magnetic properties and large magnetocaloric effects in RCoGe (R=Tb, Dy) compounds

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2018-05-01

    Complicated magnetic phase transitions and large magnetocaloric effects (MCEs) in RCoGe (R=Tb, Dy) compounds are reported in this paper. Results show that the TbCoGe compound has a magnetic phase transition from antiferromagnetic to paramagnetic (AFM-PM) at TN ∼ 16 K, which is close to the value reported by neutron diffraction. The DyCoGe compound undergoes complicated phase changes from 2 K up to 300 K. The peak at 10 K indicates a phase transition from antiferromagnetic to ferromagnetic (AFM-FM). In particular, a significant ferromagnetic to paramagnetic (FM-PM) phase transition was found at a temperature as high as 175 K, and the cusp becomes more abrupt as the magnetic field increases from 0.01 T to 0.1 T. The maximum values of the magnetic entropy change of the TbCoGe and DyCoGe compounds reach 14.5 J/kg K and 11.5 J/kg K, respectively, for a field change of 0-5 T. Additionally, considerable refrigerant capacity values of 260 J/kg and 242 J/kg are also obtained, respectively, suggesting that both TbCoGe and DyCoGe compounds could be considered good candidates for low-temperature magnetic refrigerants.
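Magnetic entropy changes like those quoted above are conventionally extracted from isothermal magnetization curves via the Maxwell relation ΔS_M(T, H) = ∫₀ᴴ (∂M/∂T) dH'. A numerical sketch of that step with a synthetic M(T, H) surface (not the measured RCoGe data; the transition is placed near 16 K for illustration):

```python
import numpy as np

# synthetic magnetization surface M(T, H) with a transition near Tc = 16 K
T = np.linspace(4, 40, 37)            # temperature grid, K
H = np.linspace(0, 5, 51)             # applied field grid, T
Tc = 16.0
M = 100.0 / (1.0 + np.exp((T[:, None] - Tc) / 2.0)) * np.tanh(H[None, :] + 0.05)

# Maxwell relation: dS(T) = integral over H of (dM/dT) dH (trapezoid rule)
dMdT = np.gradient(M, T, axis=0)
dH = H[1] - H[0]
dS = dH * (dMdT[:, 1:-1].sum(axis=1) + 0.5 * (dMdT[:, 0] + dMdT[:, -1]))

T_peak = T[np.argmin(dS)]             # -dS is maximal near the transition
print(T_peak)
```

The magnitude of the minimum of dS corresponds to the "maximum magnetic entropy change" figure of merit, and the refrigerant capacity follows by integrating -dS over the full-width-at-half-maximum temperature span.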

  1. Ambient noise forecasting with a large acoustic array in a complex shallow water environment.

    Science.gov (United States)

    Rogers, Jeffrey S; Wales, Stephen C; Means, Steven L

    2017-11-01

    Forecasting ambient noise levels in the ocean can be a useful way of characterizing the detection performance of sonar systems and projecting bounds on performance into the near future. The assertion is that noise forecasting can be improved with a priori knowledge of source positions coupled with the ability to resolve closely separated sources in bearing. One example of such a system is the large-aperture research array located at the South Florida Test Facility. Given radar- and Automatic Identification System-defined source positions and environmental information, transmission loss (TL) is computed from the known source positions to the array. Source levels (SLs) of individual ships are then estimated from the computed TL and the pre-determined beam response of the array using a non-negative least squares algorithm. Ambient noise forecasts are formed by projecting the estimated SLs along known ship tracks. The forecasts are compared to measured beam level data and the mean-squared error is computed. A mean-squared error as low as 3.5 dB is demonstrated for 30 min forecasts when compared to ground truth.
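The source-level step described above is a non-negative least squares fit: measured beam powers b ≈ A·s, where A combines TL and the beam response and s ≥ 0 holds the per-ship source levels. A generic projected-gradient sketch of such a fit on synthetic data (in practice a library routine such as scipy.optimize.nnls would be used):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, size=(40, 5))       # 40 beams x 5 ships (synthetic)
s_true = np.array([4.0, 0.0, 2.5, 1.0, 0.0])  # true levels; two ships silent
b = A @ s_true

# projected gradient descent for min ||A s - b||^2 subject to s >= 0
s = np.zeros(5)
step = 1.0 / np.linalg.norm(A.T @ A, 2)
for _ in range(5000):
    s = np.maximum(0.0, s - step * (A.T @ (A @ s - b)))

print(np.round(s, 3))  # approaches s_true; silent ships stay pinned at zero
```

The non-negativity constraint is what lets the fit zero out ships that contribute no power instead of assigning them small negative levels, which a plain least squares solution would do in noise.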

  2. Coupled Finite Volume and Finite Element Method Analysis of a Complex Large-Span Roof Structure

    Science.gov (United States)

    Szafran, J.; Juszczyk, K.; Kamiński, M.

    2017-12-01

    The main goal of this paper is to present coupled Computational Fluid Dynamics and structural analysis for the precise determination of wind impact on internal forces and deformations of structural elements of a long-span roof structure. The Finite Volume Method (FVM) is used to solve the fluid flow problem and model the air flow around the structure; its results are applied in turn as boundary tractions in the Finite Element Method (FEM) structural solution for linear elastostatics with small deformations. The first part is carried out with the use of the ANSYS 15.0 computer system, whereas the FEM system Robot supports the stress analysis in particular roof members. A comparison of the wind pressure distribution throughout the roof surface shows some differences with respect to that available in engineering design codes like the Eurocode, which deserves separate further numerical studies. Coupling these two separate numerical techniques appears promising in view of future computational models of a stochastic nature in large-scale structural systems via the stochastic perturbation method.

  3. Exploring Integration of Care for Children Living with Complex Care Needs across the European Union and European Economic Area.

    Science.gov (United States)

    Brenner, Maria; O'Shea, Miriam; J Larkin, Philip; Kamionka, Stine Lundstroem; Berry, Jay; Hiscock, Harriet; Rigby, Michael; Blair, Mitch

    2017-04-24

    The aim of this paper is to report on the development of surveys to explore integration of care for children living with complex care needs across the European Union (EU) and European Economic Area (EEA). Each survey consists of a vignette and questions adapted from the Standards for Systems of Care for Children and Youth with Special Health Care Needs and the Eurobarometer Survey. A Country Agent in each country, a local expert in child health services, will obtain data from indigenous sources. We identified 'in-principle' complex problems and adapted the surveys to capture care integration. We expect to obtain rich data to understand perceptions and to inform actions on a number of complex health issues. The study has the potential to make a wide contribution to individual countries of the EU/EEA to understand their own integration of services mapped against responses from other member states. Early results are expected in Spring 2017.

  4. Performance-Oriented Design of Large Passive Solar Roofs : A method for the integration of parametric modelling and genetic algorithms

    NARCIS (Netherlands)

    Turrin, M.; Von Buelow, P.; Stouffs, R.M.F.; Kilian, A.

    2010-01-01

    The paper addresses the design of large roof structures for semi outdoor spaces through an investigation of a type of performance-oriented design, which aims at integrating performance evaluations in the early stages of the design process. Particularly, aiming at improving daylight and thermal

  5. Development of the mechanical engineering complex on the basis of the improvement of large and small businesses relations

    Directory of Open Access Journals (Sweden)

    Sokolova Svetlana

    2017-01-01

    The condition, pace and character of the development of the mechanical engineering complex are in many respects a crucial factor in the social and economic situation of any country. The development of market relations and changes in the conditions of doing business encourage enterprises to search for new managerial methods and to improve their forms of interaction. In this respect, describing the peculiarities of the interaction between large machine engineering enterprises and small businesses in this sphere, and assessing the relationship of their development, is an important and topical issue under modern conditions. The most widespread forms of cooperation between large-scale mechanical engineering enterprises and small businesses in the industry are outsourcing, franchising, leasing, subcontracting, venture financing, and the creation of regional forms of cooperation between large and small firms. However, cooperation between large-scale and small entrepreneurship in Russia is not properly developed. The authors determine the factors hindering the growth of the machine building industry, suggest recommendations for the development of large-scale enterprises and small business in the industry, and substantiate the role of the government in this process. Besides, the mechanism of state support for the development of small business is described.

  6. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    Science.gov (United States)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
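The ordering problem that DeMAID's genetic algorithm addresses can be illustrated on a design structure matrix: given which process feeds which, find a sequence minimizing feedback couplings (outputs that flow to earlier processes and force iteration). A toy GA sketch, not DeMAID's actual encoding or objective weighting:

```python
import random

random.seed(7)
# couplings[i] = processes that consume the output of process i (toy DSM)
couplings = {0: [2], 1: [0, 3], 2: [1, 4], 3: [4], 4: [3]}

def feedbacks(order):
    # count couplings that point backwards in the sequence
    pos = {p: k for k, p in enumerate(order)}
    return sum(1 for src, dsts in couplings.items()
               for d in dsts if pos[d] < pos[src])

def crossover(a, b):
    # order crossover: keep a prefix of a, fill the rest in b's order
    cut = random.randrange(1, len(a))
    head = a[:cut]
    return head + [p for p in b if p not in head]

population = [random.sample(range(5), 5) for _ in range(20)]
for _ in range(40):
    population.sort(key=feedbacks)          # rank by fitness
    parents = population[:10]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(10)]
    for c in children:                      # mutation: swap two positions
        i, j = random.sample(range(5), 2)
        c[i], c[j] = c[j], c[i]
    population = parents + children

best = min(population, key=feedbacks)
print(best, feedbacks(best))
```

With two coupling cycles in this toy matrix, no ordering can achieve fewer than two feedbacks, so the GA's job is to find a sequence that hits that floor rather than eliminate iteration entirely, which mirrors the subcycle-ordering situation described in the abstract.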

  7. Experience of Integrated Safeguards Approach for Large-scale Hot Cell Laboratory

    International Nuclear Information System (INIS)

    Miyaji, N.; Kawakami, Y.; Koizumi, A.; Otsuji, A.; Sasaki, K.

    2010-01-01

    The Japan Atomic Energy Agency (JAEA) has been operating a large-scale hot cell laboratory, the Fuels Monitoring Facility (FMF), located near the experimental fast reactor Joyo at the Oarai Research and Development Center (JNC-2 site). The FMF conducts post-irradiation examinations (PIE) of fuel assemblies irradiated in Joyo. The assemblies are disassembled and non-destructive examinations, such as X-ray computed tomography tests, are carried out. Some of the fuel pins are cut into specimens, and destructive examinations, such as ceramography and X-ray micro analyses, are performed. Following PIE, the tested material, in the form of a pin or segments, is shipped back to a Joyo spent fuel pond. In some cases, after reassembly of the examined irradiated fuel pins is completed, the fuel assemblies are shipped back to Joyo for further irradiation. For the IAEA to apply the integrated safeguards approach (ISA) to the FMF, a new verification system for the material shipping and receiving process between Joyo and the FMF has been established by the IAEA under technical collaboration among the Japan Safeguard Office (JSGO) of MEXT, the Nuclear Material Control Center (NMCC) and the JAEA. The main concept of receipt/shipment verification under the ISA for the JNC-2 site is as follows: under the ISA, the FMF is treated as a Joyo-associated facility in terms of its safeguards system because it deals with the same spent fuels. Verification of the material shipping and receiving process between Joyo and the FMF can only be applied to the declared transport routes and transport casks. The verification of the nuclear material contained in the cask is performed by the gross defect method at the time of short-notice random interim inspections (RIIs), by measuring the surface neutron dose rate of the cask, filled with water to reduce radiation.
The JAEA performed a series of preliminary tests with the IAEA, the JSGO and the NMCC, and confirmed from the standpoint of the operator that this

  8. "Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2011-09-01

    Full Text Available Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to utilize these resources appropriately. Indeed, users of one popular dataset were generally found not to have modeled their analyses to take account of the complex sampling design (Johnson & Elliott, 1998), even when publishing in highly regarded journals. It is well known that failure to model the complex sample appropriately can substantially bias the results of the analysis. Examples presented in this paper highlight the risk of inferential errors and parameter mis-estimation that arises when these data sets are not analyzed appropriately.
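The clustering penalty the paper warns about is commonly summarized by the Kish design effect, DEFF = 1 + (m - 1)·ICC, which deflates the nominal sample size. A minimal sketch (the cluster size and intraclass correlation below are illustrative values, not the paper's data):

```python
# Sketch of the design-effect adjustment for a clustered sample.
# Cluster size and intraclass correlation are illustrative assumptions.

def design_effect(avg_cluster_size: float, icc: float) -> float:
    """Kish approximation: DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (avg_cluster_size - 1.0) * icc

def effective_n(n: int, deff: float) -> float:
    """Effective sample size after accounting for clustering."""
    return n / deff

deff = design_effect(avg_cluster_size=25, icc=0.05)   # DEFF = 2.2
n_eff = effective_n(6000, deff)                       # ~2727 independent cases
print(f"DEFF = {deff:.2f}, effective n = {n_eff:.0f}")
```

Treating the 6000 records as independent would roughly halve every standard error in this example, which is exactly the inference error the paper documents.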

  9. Arabidopsis GCP3-interacting protein 1/MOZART 1 is an integral component of the γ-tubulin-containing microtubule nucleating complex.

    Science.gov (United States)

    Nakamura, Masayoshi; Yagi, Noriyoshi; Kato, Takehide; Fujita, Satoshi; Kawashima, Noriyuki; Ehrhardt, David W; Hashimoto, Takashi

    2012-07-01

    Microtubules in eukaryotic cells are nucleated from ring-shaped complexes that contain γ-tubulin and a family of homologous γ-tubulin complex proteins (GCPs), but the subunit composition of the complexes can vary among fungi, animals and plants. Arabidopsis GCP3-interacting protein 1 (GIP1), a small protein with no homology to the GCP family, interacts with GCP3 in vitro, and is a plant homolog of vertebrate mitotic-spindle organizing protein associated with a ring of γ-tubulin 1 (MOZART1), a recently identified component of the γ-tubulin complex in human cell lines. In this study, we characterized two closely related Arabidopsis GIP1s: GIP1a and GIP1b. Single mutants of gip1a and gip1b were indistinguishable from wild-type plants, but their double mutant was embryonic lethal, and showed impaired development of male gametophytes. Functional fusions of GIP1a with green fluorescent protein (GFP) were used to purify GIP1a-containing complexes from Arabidopsis plants, which contained all the subunits (except NEDD1) previously identified in the Arabidopsis γ-tubulin complexes. GIP1a and GIP1b interacted specifically with Arabidopsis GCP3 in yeast. GFP-GIP1a labeled mitotic microtubule arrays in a pattern largely consistent with, but partly distinct from, the localization of the γ-tubulin complex containing GCP2 or GCP3 in planta. In interphase cortical arrays, the labeled complexes were preferentially recruited to existing microtubules, from which new microtubules were efficiently nucleated. However, in contrast to complexes labeled with tagged GCP2 or GCP3, their recruitment to cortical areas with no microtubules was rarely observed. These results indicate that GIP1/MOZART1 is an integral component of a subset of the Arabidopsis γ-tubulin complexes. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.

  10. Model of complex integrated use of alternative energy sources for highly urbanized areas

    Directory of Open Access Journals (Sweden)

    Ivanova Elena Ivanovna

    2014-04-01

    Full Text Available Population growth and the continuous development of highly urbanized territories pose new challenges to experts in the field of energy-saving technologies. Only a multifunctional and autonomous system of building engineering equipment, formed by the principles of energy efficiency and cost-effectiveness, meets the needs of the modern urban environment. Alternative energy sources that exploit the principle of converting thermal energy into electrical power show a lack of efficiency, so visible progress appears to require skipping this intermediate step. A fuel cell, which converts chemical energy directly into electricity and offers a wide choice of both fuel types and oxidizing agents, gives a strong base for designing a complex integrated system. Based on an analysis and comparison of the main fuel cell types proposed by contemporary scholars, a solid oxide fuel cell (SOFC) proves able to ensure the smooth operation of such a system. While the advantages of this device meet the requirements of engineering equipment for modern civil and, especially, residential architecture, its drawbacks do not conflict with the operating regime of the proposed system. The article introduces a model of a multifunctional system based on a solid oxide fuel cell (SOFC) that not only covers the energy demand of a particular building, but also provides for the proper and economical operation of several additional sub-systems. Air heating and water cooling equipment, ventilating and conditioning devices, and the circuit of water supply and preparation of water discharge for external use (e.g. agricultural needs), included in a closed loop of the integrated system, allow it to be evaluated as a promising model for the further implementation of energy-saving technologies in architectural and building practice. This, consequently, will positively affect both the ecological and economic development of the urban environment.
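The cogeneration logic behind such a system can be illustrated with a first-order energy balance: fuel input splits into electricity, recoverable heat for the thermal sub-systems, and losses. The efficiency figures below are generic assumptions for an SOFC in combined heat-and-power mode, not values from the article:

```python
# First-order energy balance for an SOFC running in combined
# heat-and-power mode. Efficiency figures are illustrative assumptions.

def sofc_chp_balance(fuel_kw: float, eta_el: float = 0.55, eta_th: float = 0.30):
    """Split fuel input (LHV basis) into electricity, recoverable heat, losses."""
    electricity = fuel_kw * eta_el
    heat = fuel_kw * eta_th
    losses = fuel_kw - electricity - heat
    return electricity, heat, losses

el, heat, loss = sofc_chp_balance(100.0)
print(f"electricity {el:.1f} kW, heat {heat:.1f} kW, losses {loss:.1f} kW")
```

The attraction for building services is visible even in this sketch: most of the fuel energy leaves as useful electricity or heat, so the thermal sub-systems run on what would otherwise be waste.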

  11. Maximising the recovery of low grade heat: An integrated heat integration framework incorporating heat pump intervention for simple and complex factories

    International Nuclear Information System (INIS)

    Miah, J.H.; Griffiths, A.; McNeill, R.; Poonaji, I.; Martin, R.; Leiser, A.; Morse, S.; Yang, A.; Sadhukhan, J.

    2015-01-01

    Highlights: • A new practical heat integration framework incorporating heat pump technology for simple and complex food factories. • A decision-making procedure to select process or utility heat integration in complex and diverse factories. • New stream classifications to identify and compare streams linked between process and utility, especially waste heat. • A range of ‘Heat Pump Thresholds’ to identify and compare heat pump configurations against a steam-generating combustion boiler. - Abstract: The recovery of heat has long been a key measure for improving energy efficiency, with Pinch analysis used to maximise the heat recovery of factories. However, most research has been devoted to conventional heat integration, in which low-grade heat is often ignored. Despite this, the sustainability challenges facing the process manufacturing community are turning interest toward low-grade energy recovery systems that further advance energy efficiency through technological interventions such as heat pumps. This paper presents a novel heat integration framework incorporating technological interventions for both simple and complex factories to evaluate all possible heat integration opportunities, including low-grade and waste heat. The key features of the framework include the role of heat pumps in upgrading heat, which can significantly enhance energy efficiency; a selection process for heat pump designs, aided by the development of ‘Heat Pump Thresholds’ for deciding whether heat pump designs are cost-competitive with a steam-generating combustion boiler; a decision-making procedure to select process or utility heat integration in complex and diverse factories; and additional stream classifications to identify and separate streams that can be practically integrated. The application of the framework at a modified confectionery factory has yielded four options capable of delivering a total energy reduction of about 32% with an economic payback
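In its simplest operating-cost form, a heat-pump threshold of the kind described reduces to a break-even coefficient of performance: a heat pump delivers heat at (electricity price / COP) per kWh, a boiler at (fuel price / boiler efficiency). The prices below are illustrative assumptions, not the paper's data:

```python
# Simplest operating-cost form of a heat pump threshold: a heat pump is
# competitive when elec_price / COP < fuel_price / boiler_eff.
# Energy prices (per kWh) and efficiency are illustrative assumptions.

def breakeven_cop(elec_price: float, fuel_price: float, boiler_eff: float) -> float:
    """COP above which heat pump heat is cheaper than boiler heat."""
    return elec_price * boiler_eff / fuel_price

cop_min = breakeven_cop(elec_price=0.12, fuel_price=0.03, boiler_eff=0.85)
print(f"heat pump competitive above COP = {cop_min:.2f}")
```

Because the COP a heat pump can achieve falls as the required temperature lift grows, a threshold like this effectively sets the maximum lift over which upgrading low-grade heat still beats raising steam.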

  12. The application of J integral to measure cohesive laws in materials undergoing large scale yielding

    DEFF Research Database (Denmark)

    Sørensen, Bent F.; Goutianos, Stergios

    2015-01-01

    We explore the possibility of determining cohesive laws by the J-integral approach for materials with non-linear stress-strain behaviour (e.g. polymers and composites), using a DCB sandwich specimen consisting of stiff elastic beams bonded to the non-linear test material and loaded with pure bending moments. For a wide range of parameters of the non-linear material, the plastic unloading during crack extension is small, resulting in J-integral values (fracture resistance) that deviate by at most 15% from the work of the cohesive traction. Thus the method can be used to extract cohesive laws directly from experiments without any presumption about their shape. Finally, the DCB sandwich specimen was also analysed using the I integral to quantify the overestimation of the steady-state fracture resistance obtained with the J-integral-based method.
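The route from measured fracture resistance to a cohesive law rests on a standard identity of the J-integral approach: evaluated around the cohesive zone, the fracture resistance equals the crack-tip toughness plus the work of the cohesive tractions, so the traction follows by differentiation with respect to the end-opening. In standard notation (with $\sigma$ the cohesive traction and $\delta^*$ the opening at the end of the cohesive zone):

```latex
J_R = J_{\mathrm{tip}} + \int_0^{\delta^*} \sigma(\delta)\,\mathrm{d}\delta
\qquad \Longrightarrow \qquad
\sigma(\delta^*) = \frac{\partial J_R}{\partial \delta^*}
```

This identity is what allows the cohesive law to be extracted without presuming its shape: both $J_R$ and $\delta^*$ are measurable during a DCB test.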

  13. Highly Integrated, Reconfigurable, Large-Area, Flexible Radar Antenna Arrays, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Highly-integrated, reconfigurable radar antenna arrays fabricated on flexible substrates offer high functionality in a portable package that can be rolled up and...

  14. Large-Scale Integration of Solid-State Microfluidic Valves With No Moving Parts

    National Research Council Canada - National Science Library

    Mastangelo, Carlos H; Gianchandani, Yogesh B; Frechet, J. M

    2005-01-01

    This research concerns the development of a revolutionary new design: solid-state microvalves that will permit the realization of complex microfluidic systems with arrays of hundreds of flow-control devices...

  15. TradeWind. Integrating wind. Developing Europe's power market for the large-scale integration of wind power. Final report

    Energy Technology Data Exchange (ETDEWEB)

    2009-02-15

    Based on a single European grid and power market system, the TradeWind project explores to what extent large-scale wind power integration challenges could be addressed by reinforcing interconnections between Member States in Europe. Additionally, the project looks at the conditions required for a sound power market design that ensures a cost-effective integration of wind power at EU level. In this way, the study addresses two issues of key importance for the future integration of renewable energy, namely the weak interconnectivity levels between control zones and the inflexibility and fragmented nature of the European power market. Work on critical transmission paths and interconnectors is slow for a variety of reasons including planning and administrative barriers, lack of public acceptance, insufficient economic incentives for TSOs, and the lack of a joint European approach by the key stakeholders. (au)

  16. The HADDOCK2.2 Web Server: User-Friendly Integrative Modeling of Biomolecular Complexes.

    Science.gov (United States)

    van Zundert, G C P; Rodrigues, J P G L M; Trellet, M; Schmitz, C; Kastritis, P L; Karaca, E; Melquiond, A S J; van Dijk, M; de Vries, S J; Bonvin, A M J J

    2016-02-22

    The prediction of the quaternary structure of biomolecular macromolecules is of paramount importance for fundamental understanding of cellular processes and for drug design. In the era of integrative structural biology, one way of increasing the accuracy of modeling methods used to predict the structure of biomolecular complexes is to include as much experimental or predictive information as possible in the process. This has been at the core of our information-driven docking approach HADDOCK. We present here the updated version 2.2 of the HADDOCK portal, which offers new features such as support for mixed molecule types, additional experimental restraints and improved protocols, all of this in a user-friendly interface. With well over 6000 registered users and 108,000 jobs served, an increasing fraction of them run on grid resources, we hope that this timely upgrade will help the community to solve important biological questions and further advance the field. The HADDOCK2.2 Web server is freely accessible to non-profit users at http://haddock.science.uu.nl/services/HADDOCK2.2. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Analysis of a fuel cell on-site integrated energy system for a residential complex

    Science.gov (United States)

    Simons, S. N.; Maag, W. L.

    1979-01-01

    The energy use and costs of an on-site integrated energy system (OS/IES), which provides electric power from an on-site power plant and recovers heat that would normally be rejected to the environment, are compared with those of a conventional system purchasing electricity from a utility, for both diesel-powered and phosphoric acid fuel cell powered plants. The analysis showed that for a 500-unit apartment complex a fuel cell OS/IES would be about 10% more energy-conservative, in terms of total coal consumption, than a diesel OS/IES or a conventional system. The fuel cell OS/IES capital costs could be 30 to 55% greater than the diesel OS/IES capital costs for the same life cycle costs. The life cycle cost of a fuel cell OS/IES would be lower than that of a conventional system as long as the cost of electricity is greater than $0.05 to $0.065/kWh. Several parametric combinations of fuel cell power plant and state-of-the-art energy recovery systems were analyzed, and annual fuel requirements were calculated for four locations. OS/IES component choices were shown to be a major factor in fuel consumption, with the least efficient system using 25% more fuel than the most efficient. Central air conditioning and heat pumps result in minimum fuel consumption while individual air conditioning units increase it, and in general the fuel cell with the highest electrical efficiency has the lowest fuel consumption.
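A break-even electricity price of the kind quoted falls out of equating annual costs: the grid price at which purchased power costs as much per year as owning the on-site plant (annualized capital plus fuel). All inputs below are illustrative assumptions, not the study's figures:

```python
# Hypothetical break-even grid electricity price for an on-site plant:
# the price at which buying from the utility costs the same per year as
# owning the plant. All inputs are illustrative assumptions.

def annualize(capital: float, rate: float, years: int) -> float:
    """Capital recovery factor applied to an up-front cost."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capital * crf

def breakeven_grid_price(capital: float, annual_fuel_cost: float,
                         annual_kwh: float, rate: float = 0.08,
                         years: int = 20) -> float:
    """Grid price ($/kWh) at which annual costs of the two systems match."""
    return (annualize(capital, rate, years) + annual_fuel_cost) / annual_kwh

price = breakeven_grid_price(capital=5e6, annual_fuel_cost=2.5e5,
                             annual_kwh=1.2e7)
print(f"break-even grid price: ${price:.3f}/kWh")
```

If the utility tariff sits above the break-even price, the on-site system wins on life-cycle cost; credit for recovered heat would lower the threshold further.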

  18. An integrated model for simulating nitrogen trading in an agricultural catchment with complex hydrogeology.

    Science.gov (United States)

    Cox, T J; Rutherford, J C; Kerr, S C; Smeaton, D C; Palliser, C C

    2013-09-30

    Nitrogen loads to several New Zealand lakes are dominated by nonpoint runoff from pastoral farmland, which adversely affects lake water quality. A 'cap and trade' scheme is being considered to help meet targets set for nitrogen loads to Lake Rotorua, and a numerical model, NTRADER, has been developed to simulate and compare alternative schemes. NTRADER models both the geophysics of nitrogen generation and transport, including groundwater lag times, and the economics of 'cap and trade' schemes. It integrates the output from several existing models, including a farm-scale nitrogen leaching and abatement model, a farm-scale management economic model, and a catchment-scale nitrogen transport model. This paper details the modeling methods and compares possible trading program design features for the Lake Rotorua catchment. Model simulations demonstrate how a cap and trade program could be used to effectively achieve challenging environmental goals in the targeted catchment. However, results also show that, due to the complex hydrogeology, satisfactory environmental outcomes may not be achieved unless groundwater lag times are incorporated into the regulatory scheme. One way to do this, as demonstrated here, would be to explicitly include lag times in the cap and trade program. The utility of the model is further demonstrated by quantifying relative differences in abatement costs across potential regulatory schemes. Copyright © 2013 Elsevier Ltd. All rights reserved.
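The groundwater lag the paper flags can be represented as a discrete convolution: the nitrogen load arriving at the lake in a given year mixes leaching from earlier years according to a travel-time distribution. The distribution and loads below are made-up illustrations, not NTRADER's data:

```python
# Groundwater lag as a discrete convolution: load reaching the lake in
# year t mixes leaching from earlier years according to a travel-time
# distribution. Values are illustrative assumptions.

def lake_load(leached: list[float], lag_dist: list[float]) -> list[float]:
    """Convolve annual leached N with a travel-time distribution."""
    out = [0.0] * len(leached)
    for t, amount in enumerate(leached):
        for lag, frac in enumerate(lag_dist):
            if t + lag < len(out):
                out[t + lag] += amount * frac
    return out

# A step reduction in leaching only reaches the lake gradually:
leaching = [100.0] * 3 + [60.0] * 7    # cap imposed in year 3
lags = [0.2, 0.3, 0.3, 0.2]            # fractions arriving after 0-3 years
print([round(x, 1) for x in lake_load(leaching, lags)])
```

This is why a cap that ignores lag times can look like a failure for years: the lake keeps receiving pre-cap leaching until the travel-time distribution has flushed through.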

  19. Integration of Microalgae-Based Bioenergy Production into a Petrochemical Complex: Techno-Economic Assessment

    Directory of Open Access Journals (Sweden)

    Ana L. Gonçalves

    2016-03-01

    Full Text Available The rapid development of modern society has resulted in an increased demand for energy, mainly from fossil fuels. The use of this source of energy has led to the accumulation of carbon dioxide (CO2) in the atmosphere. In this context, microalgae culturing may be an effective solution to reduce the CO2 concentration in the atmosphere, since these microorganisms can capture CO2 and simultaneously produce bioenergy. This work consists of a techno-economic assessment of a microalgal production facility integrated into a petrochemical complex, in which established infrastructure allows efficient material and energy transport. Seven different scenarios were considered regarding photosynthetic, lipids extraction and anaerobic digestion efficiencies. This analysis has demonstrated six economically viable scenarios able to: (i) reduce CO2 emissions from a thermoelectric power plant; (ii) treat domestic wastewaters (which were used as culture medium); and (iii) produce lipids and electrical and thermal energy. For a 100-ha facility, considering a photosynthetic efficiency of 3%, a lipids extraction efficiency of 75% and an anaerobic digestion efficiency of 45% (scenario 3), an economically viable process was obtained (net present value of 22.6 million euros), being effective in both CO2 removal (accounting for 1.1 × 10⁴ t per year) and energy production (annual energy produced was 1.6 × 10⁷ kWh and annual lipids productivity was 1.9 × 10³ m³).
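The net-present-value criterion behind the viability claim is standard: discount each year's net cash flow and sum. A minimal sketch with invented cash flows and discount rate (not the study's inputs):

```python
# Minimal NPV calculation of the kind used in techno-economic assessment.
# Cash flows and discount rate are illustrative, not the study's data.

def npv(rate: float, cashflows: list[float]) -> float:
    """Discount a series of cash flows; index 0 is year 0 (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: capital outlay; years 1-15: net revenue from lipids and energy.
flows = [-40e6] + [6e6] * 15
print(f"NPV at 10%: {npv(0.10, flows) / 1e6:.1f} M EUR")
```

A positive NPV, as in six of the seven scenarios reported, means the discounted revenues from lipids, energy, CO2 credits and wastewater treatment outweigh the capital and operating outlays.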

  20. Formulation and integration of constitutive models describing large deformations in thermoplasticity and thermoviscoplasticity

    International Nuclear Information System (INIS)

    Jansohn, W.

    1997-10-01

    This report deals with the formulation and numerical integration of constitutive models in the framework of finite deformation thermomechanics. Based on the concept of dual variables, plasticity and viscoplasticity models exhibiting nonlinear kinematic hardening as well as nonlinear isotropic hardening rules are presented. Care is taken that the evolution equations governing the hardening response fulfill the intrinsic dissipation inequality in every admissible process. In view of the development of an efficient numerical integration procedure, simplified versions of these constitutive models are proposed. In these versions, the thermoelastic strains are assumed to be small and a simplified kinematic hardening rule is considered. Additionally, in view of an implementation into the ABAQUS finite element code, the elasticity law is approximated by a hypoelasticity law. For the simplified constitutive models, an implicit time-integration algorithm is developed. First, in order to obtain a numerically objective integration scheme, use is made of the HUGHES-WINGET algorithm. In the resulting system of ordinary differential equations, three differential operators representing different physical effects can be distinguished. The structure of this system of differential equations allows an operator split scheme to be applied, which leads to an efficient integration scheme for the constitutive equations. By linearizing the integration algorithm the consistent tangent modulus is derived. In this way, the quadratic convergence of Newton's method used to solve the basic finite element equations (i.e. the finite element discretization of the governing thermomechanical field equations) is preserved. The resulting integration scheme is implemented as a user subroutine UMAT in ABAQUS. The properties of the app
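The implicit elastic-predictor/plastic-corrector structure underlying such integration schemes can be illustrated in one dimension: backward Euler with linear isotropic hardening and a Perzyna-type viscosity, where the viscous term simply enters the denominator of the plastic multiplier. This is a generic textbook sketch, not the report's constitutive model or its ABAQUS UMAT; all material parameters are illustrative:

```python
# 1D elastic predictor / viscoplastic corrector (backward Euler) with
# linear isotropic hardening and Perzyna-type viscosity. A generic
# textbook sketch; material parameters are illustrative (MPa units).

def step(sigma, alpha, deps, dt, E=200e3, H=2e3, sig_y=250.0, eta=1e3):
    """Advance stress and hardening variable over one strain increment."""
    sig_tr = sigma + E * deps                     # elastic predictor
    f_tr = abs(sig_tr) - (sig_y + H * alpha)      # trial yield function
    if f_tr <= 0.0:
        return sig_tr, alpha                      # elastic step
    # implicit corrector: f_tr - (E + H) * dgamma = eta * dgamma / dt
    dgamma = f_tr / (E + H + eta / dt)
    sign = 1.0 if sig_tr > 0 else -1.0
    return sig_tr - E * dgamma * sign, alpha + dgamma

sigma, alpha = 0.0, 0.0
for _ in range(100):                              # monotonic loading
    sigma, alpha = step(sigma, alpha, deps=1e-4, dt=1.0)
print(f"stress {sigma:.1f} MPa, accumulated plastic strain {alpha:.4f}")
```

Linearizing this update with respect to the strain increment is what yields the consistent (algorithmic) tangent; using it instead of the continuum tangent is what preserves the quadratic convergence of Newton's method at the finite element level.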