WorldWideScience

Sample records for integrating large complex

  1. An integration strategy for large enterprises

    Directory of Open Access Journals (Sweden)

    Risimić Dejan

    2007-01-01

    Full Text Available Integration is the process of enabling communication between disparate software components. Integration has been a burning issue for large enterprises over the last twenty years, since roughly 70% of the development and deployment budget is spent on integrating complex and heterogeneous back-end and front-end IT systems. Existing applications need to be integrated to support newer, faster, more accurate business processes and to provide meaningful, consistent management information. Historically, integration started with the introduction of point-to-point approaches, which evolved into simpler hub-and-spoke topologies. These topologies were combined with custom remote procedure calls, distributed object technologies and message-oriented middleware (MOM), continued with enterprise application integration (EAI), and used an application server as the primary vehicle for integration. The current phase of the evolution is service-oriented architecture (SOA) combined with an enterprise service bus (ESB). Technical aspects of the comparison between the aforementioned technologies are analyzed and presented. The result of the study is a recommended integration strategy for large enterprises.
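
    The topological shift the abstract traces, from point-to-point links to hub-and-spoke brokering, can be sketched in a few lines. The broker below is a hypothetical minimal illustration, not any particular EAI or ESB product: components publish to named topics instead of calling each other directly, so n components need only n links to the hub rather than up to n(n-1)/2 direct connections.

```python
class Broker:
    """A trivial hub-and-spoke mediator: components publish to topics
    instead of calling each other point-to-point, so adding a new
    subscriber requires no change to existing publishers."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of handler callables

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers.get(topic, []):
            handler(message)

broker = Broker()
received = []
broker.subscribe("orders", received.append)   # e.g. a back-end system
broker.subscribe("orders", lambda m: None)    # e.g. an audit system
broker.publish("orders", {"id": 1})           # one send reaches both
```

    An ESB adds routing, transformation and transport adapters on top of this basic decoupling idea, but the reduction in pairwise wiring is the core of the topology.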

  2. Reliability of large and complex systems

    CERN Document Server

    Kolowrocki, Krzysztof

    2014-01-01

    Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt

  3. Integration of functional complex oxide nanomaterials on silicon

    Directory of Open Access Journals (Sweden)

    Jose Manuel Vila-Fungueiriño

    2015-06-01

    Full Text Available The combination of standard wafer-scale semiconductor processing with the properties of functional oxides opens the way to innovative and more efficient devices with high-value applications that can be produced at large scale. This review covers the main strategies successfully used to monolithically integrate functional complex oxide thin films and nanostructures on silicon: the chemical solution deposition (CSD) approach and advanced physical vapor deposition techniques such as oxide molecular beam epitaxy (MBE). Special emphasis is placed on complex oxide nanostructures epitaxially grown on silicon using the combination of CSD and MBE. Several examples are presented, with particular attention to the control of interfaces and crystallization mechanisms in epitaxial perovskite oxide thin films, nanostructured quartz thin films, and octahedral molecular sieve nanowires. The review highlights the potential of complex oxide nanostructures and of combining chemical and physical elaboration techniques for novel oxide-based integrated devices.

  4. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the fault-free operation of large-scale integration (LSI) and very large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the fault-free operation of integrated circuits, an analysis of existing methods, and a model for evaluating the fault-free operation of LSI and VLSI circuits. The main part describes a proposed algorithm and program for analyzing the fault rate in LSI and VLSI circuits.
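
    The kind of fault-rate evaluation the article describes can be illustrated with a standard reliability identity: if a circuit's N elements fail independently at a constant rate λ, the series-system reliability over time t is R(t) = exp(-N·λ·t). The figures below are invented for illustration and are not taken from the article.

```python
import math

def reliability(n_elements, failure_rate, hours):
    """Series-system reliability R(t) = exp(-N * lambda * t), assuming
    independent elements with a constant failure rate (failures/hour)."""
    return math.exp(-n_elements * failure_rate * hours)

# Hypothetical VLSI circuit: one million elements, each with a
# failure rate of 1e-12 failures/hour, observed over 10,000 hours.
r = reliability(1_000_000, 1e-12, 10_000)
```

    The exponent shows why element count dominates: doubling N halves the mission time that achieves the same reliability.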

  5. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
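
    Technique (2), statistical experimentation with approximation models, can be sketched as a surrogate fit: sample an expensive subsystem analysis at a few design points, fit a cheap quadratic, and explore the design space on the surrogate instead. The analysis function and design points below are stand-ins, not the dissertation's turbofan models.

```python
def expensive_analysis(x):
    """Stand-in for a costly subsystem simulation."""
    return (x - 2.0) ** 2 + 1.0

# Three-point design of experiments (equally spaced)
xs = [0.0, 2.0, 4.0]
ys = [expensive_analysis(x) for x in xs]

# Coefficients of the exact quadratic a + b*x + c*x**2 through the
# three equally spaced samples (finite-difference identities)
x0, x1, x2 = xs
y0, y1, y2 = ys
h = x1 - x0
c = (y0 - 2 * y1 + y2) / (2 * h ** 2)
b = (y2 - y0) / (2 * h) - 2 * c * x1
a = y1 - b * x1 - c * x1 ** 2

def surrogate(x):
    """Cheap approximation explored in place of the real analysis."""
    return a + b * x + c * x ** 2

x_star = -b / (2 * c)   # surrogate's predicted optimal design point
```

    In practice the design would be multi-dimensional (e.g. a central composite design) and the fit a least-squares response surface, but the substitution of cheap model for expensive analysis is the same.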

  6. Integrating the Differentiated: A Review of the Personal Construct Approach to Cognitive Complexity

    OpenAIRE

    Kovářová, M. (Marie); Filip, M. (Miroslav)

    2015-01-01

    This article reviews personal construct psychology (PCP) research on cognitive complexity. It examines conceptual foundations, measures of cognitive complexity, and a large body of empirical findings. It identifies several ambiguities in the conceptualization of the two components of cognitive complexity: differentiation and integration. These ambiguities lead to inconsistent interpretations of indexes proposed for their measurement and consequently to an inconsistent interpretation of em...

  7. Fuel pin integrity assessment under large scale transients

    International Nuclear Information System (INIS)

    Dutta, B.K.

    2006-01-01

    The integrity of fuel rods under normal, abnormal and accident conditions is an important consideration during fuel design of advanced nuclear reactors. The fuel matrix and the sheath form the first barrier to prevent the release of radioactive materials into the primary coolant. An understanding of fuel and clad behaviour under different reactor conditions, particularly under the beyond-design-basis accident scenario leading to large-scale transients, is always desirable to assess the inherent safety margins in fuel pin design and to plan for mitigating the consequences of accidents, if any. Severe accident conditions are typically characterized by energy deposition rates far exceeding the heat removal capability of the reactor coolant system. This may lead to clad failure due to fission gas pressure at high temperature, large-scale pellet-clad interaction and clad melting. Fuel rod performance is affected by many interdependent complex phenomena involving extremely complex material behaviour. The versatile experimental database available in this area has led to the development of powerful analytical tools to characterize fuel under extreme scenarios.

  8. Integrating complexity into data-driven multi-hazard supply chain network strategies

    Science.gov (United States)

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimuli from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required; currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build an SCN restoration model, look at the inherent problems associated with these data, and understand the complexity that arises from the integration of these data.
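
    As a hedged illustration of what a CAS model involves, the toy below steps a handful of supply-chain nodes whose restored capacity depends on their neighbours, so network-wide recovery emerges from purely local interactions. The topology, recovery rate and node names are invented and far simpler than the data-driven models the paper calls for.

```python
import statistics

# Hypothetical post-disaster supply-chain adjacency
edges = {
    "port": ["warehouse"],
    "warehouse": ["port", "store_a", "store_b"],
    "store_a": ["warehouse"],
    "store_b": ["warehouse"],
}
capacity = {node: 0.0 for node in edges}   # everything starts down
capacity["port"] = 1.0                     # relief supplies land here

def step(capacity):
    """Synchronous update: each node recovers halfway toward the mean
    capacity of its neighbours; the port is held up externally."""
    new = {}
    for node, nbrs in edges.items():
        target = statistics.mean(capacity[n] for n in nbrs)
        new[node] = capacity[node] + 0.5 * (target - capacity[node])
    new["port"] = 1.0
    return new

for _ in range(50):                        # recovery diffuses outward
    capacity = step(capacity)
```

    No node "knows" the whole network, yet restoration propagates from the port to the stores: the emergent, interaction-driven behaviour that makes CAS a fit for SCN restoration modeling.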

  9. Architectures of adaptive integration in large collaborative projects

    Directory of Open Access Journals (Sweden)

    Lois Wright Morton

    2015-12-01

    Full Text Available Collaborations to address complex societal problems associated with managing human-natural systems often require large teams comprised of scientists from multiple disciplines. For many such problems, large-scale, transdisciplinary projects whose members include scientists, stakeholders, and other professionals are necessary. The success of very large, transdisciplinary projects can be facilitated by attending to the diversity of types of collaboration that inevitably occur within them. As projects progress and evolve, the resulting dynamic collaborative heterogeneity within them constitutes architectures of adaptive integration (AAI). Management that acknowledges this dynamic and fosters and promotes awareness of it within a project can better facilitate the creativity and innovation required to address problems from a systems perspective. In successful large projects, AAI (1) functionally meets objectives and goals, (2) uses disciplinary expertise and concurrently bridges many disciplines, (3) has mechanisms to enable connection, (4) delineates boundaries to keep focus but retain flexibility, (5) continuously monitors and adapts, and (6) encourages project-wide awareness. These principles are illustrated using as case studies three large climate change and agriculture projects funded by the U.S. Department of Agriculture-National Institute of Food and Agriculture.

  10. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field-effect transistors have been emerging as ultrasensitive biosensors that offer label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here the first integration of silicon nanowire networks, called nanonets, into long-channel field-effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted of randomly oriented silicon nanowires, are also studied. Compatibility with back-end integration on CMOS readout circuitry and promising electrical performance open new opportunities for sensing applications.

  11. Packaging Concerns and Techniques for Large Devices: Challenges for Complex Electronics

    Science.gov (United States)

    LaBel, Kenneth A.; Sampson, Michael J.

    2010-01-01

    NASA is going to have to accept the use of non-hermetic packages for complex devices, and a large number of packaging options are available. Space applications subject packages to stresses they were probably not designed for (vacuum, for instance), so NASA has to find a way to gain assurance in the integrity of the packages. Some manufacturers are interested in qualifying non-hermetic packages to MIL-PRF-38535 Class V, but government space users agree that Class V should be reserved for hermetic packages. NASA is therefore working on a new class for non-hermetic packages under MIL-PRF-38535 Appendix B, "Class Y". Testing for package integrity will be required but can be package-specific, as described by a Package Integrity Test Plan developed by the manufacturer and approved by DSCC and government space users.

  12. Integrative structure and functional anatomy of a nuclear pore complex

    Science.gov (United States)

    Kim, Seung Joong; Fernandez-Martinez, Javier; Nudelman, Ilona; Shi, Yi; Zhang, Wenzhu; Raveh, Barak; Herricks, Thurston; Slaughter, Brian D.; Hogan, Joanna A.; Upla, Paula; Chemmama, Ilan E.; Pellarin, Riccardo; Echeverria, Ignacia; Shivaraju, Manjunatha; Chaudhury, Azraa S.; Wang, Junjie; Williams, Rosemary; Unruh, Jay R.; Greenberg, Charles H.; Jacobs, Erica Y.; Yu, Zhiheng; de La Cruz, M. Jason; Mironska, Roxana; Stokes, David L.; Aitchison, John D.; Jarrold, Martin F.; Gerton, Jennifer L.; Ludtke, Steven J.; Akey, Christopher W.; Chait, Brian T.; Sali, Andrej; Rout, Michael P.

    2018-03-01

    Nuclear pore complexes play central roles as gatekeepers of RNA and protein transport between the cytoplasm and nucleoplasm. However, their large size and dynamic nature have impeded a full structural and functional elucidation. Here we determined the structure of the entire 552-protein nuclear pore complex of the yeast Saccharomyces cerevisiae at sub-nanometre precision by satisfying a wide range of data relating to the molecular arrangement of its constituents. The nuclear pore complex incorporates sturdy diagonal columns and connector cables attached to these columns, imbuing the structure with strength and flexibility. These cables also tie together all other elements of the nuclear pore complex, including membrane-interacting regions, outer rings and RNA-processing platforms. Inwardly directed anchors create a high density of transport factor-docking Phe-Gly repeats in the central channel, organized into distinct functional units. This integrative structure enables us to rationalize the architecture, transport mechanism and evolutionary origins of the nuclear pore complex.

  14. Protein complex detection in PPI networks based on data integration and supervised learning method.

    Science.gov (United States)

    Yu, Feng; Yang, Zhi; Hu, Xiao; Sun, Yuan; Lin, Hong; Wang, Jian

    2015-01-01

    Revealing protein complexes is important for understanding the principles of cellular organization and function. High-throughput experimental techniques have produced a large amount of protein interaction data, which makes it possible to predict protein complexes from protein-protein interaction (PPI) networks. However, the small amount of known physical interactions may limit protein complex detection. New PPI networks are constructed by integrating PPI datasets with the large and readily available PPI data from the biomedical literature, and less reliable PPIs between two proteins are then filtered out based on the semantic similarity and topological similarity of the two proteins. Finally, supervised learning protein complex detection (SLPC), which can make full use of the information in available known complexes, is applied to detect protein complexes on the new PPI networks. The experimental results of SLPC on two different categories of yeast PPI networks demonstrate the effectiveness of the approach: compared with the original PPI networks, the best average improvements of 4.76, 6.81 and 15.75 percentage units in F-score, accuracy and maximum matching ratio (MMR) are achieved, respectively; compared with the denoised PPI networks, the best average improvements of 3.91, 4.61 and 12.10 percentage units in F-score, accuracy and MMR are achieved, respectively; compared with ClusterONE, the state-of-the-art complex detection method, on the denoised extended PPI networks, average improvements of 26.02 and 22.40 percentage units in F-score and MMR are achieved, respectively. The experimental results show that the performance of SLPC improves substantially when newly available PPI data from the biomedical literature are integrated into the original and denoised PPI networks. In addition, our protein complex detection method achieves better performance than ClusterONE.
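
    The topological-similarity filter mentioned above is commonly implemented as a Jaccard index over shared network neighbours; the sketch below uses that interpretation with an invented toy network and threshold (the paper's actual scoring may differ).

```python
from itertools import combinations

# Toy symmetric PPI adjacency (invented proteins)
ppi = {
    "A": {"B", "C", "D", "E"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"A", "B"},
    "E": {"A"},
}

def jaccard(u, v):
    """Topological similarity: shared neighbours over all neighbours,
    excluding the interacting pair itself."""
    nu, nv = ppi[u] - {v}, ppi[v] - {u}
    union = nu | nv
    return len(nu & nv) / len(union) if union else 0.0

# Keep only interactions whose endpoints share enough neighbours
THRESHOLD = 0.5
reliable = [
    (u, v) for u, v in combinations(sorted(ppi), 2)
    if v in ppi[u] and jaccard(u, v) >= THRESHOLD
]
```

    Here the weakly embedded edges A-C, A-D and A-E are discarded while the mutually well-connected core survives, which is the denoising effect the abstract relies on before running complex detection.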

  15. Using lanthanoid complexes to phase large macromolecular assemblies

    International Nuclear Information System (INIS)

    Talon, Romain; Kahn, Richard; Durá, M. Asunción; Maury, Olivier; Vellieux, Frédéric M. D.; Franzetti, Bruno; Girard, Eric

    2011-01-01

    A lanthanoid complex, [Eu(DPA)3]3−, was used to obtain experimental phases at 4.0 Å resolution for PhTET1-12s, a large self-compartmentalized homo-dodecameric protease complex of 444 kDa. Lanthanoid ions exhibit extremely large anomalous X-ray scattering at their LIII absorption edge and are thus well suited to anomalous diffraction experiments. A novel class of lanthanoid complexes has been developed that combines the physical properties of lanthanoid atoms with functional chemical groups that allow non-covalent binding to proteins. Two structures of large multimeric proteins have already been determined using such complexes. Here the use of the luminescent europium tris-dipicolinate complex [Eu(DPA)3]3− to solve the low-resolution structure of a 444 kDa homo-dodecameric aminopeptidase, PhTET1-12s from the archaeon Pyrococcus horikoshii, is reported. Surprisingly, considering the low resolution of the data, the experimental electron-density map is very well defined. Experimental phases obtained using the lanthanoid complex lead to maps displaying structural features usually observed only in higher-resolution maps. Such complexes open a new way to solve the structures of large molecular assemblies, even with low-resolution data.

  16. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  18. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    Science.gov (United States)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope-and-interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope-and-interface method reduces both the amount and the complexity of the information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes that benefit from this approach through reduced development and design cycle time include creation of analysis models for the aerodynamics discipline, vehicle-to-ground interface development, and documentation development for the vehicle assembly.

  19. Path integral in area tensor Regge calculus and complex connections

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2006-01-01

    The Euclidean quantum measure in Regge calculus with independent area tensors is considered using the example of a Regge manifold of simple structure. We go over to integrations along certain contours in the hyperplane of complex connection variables. Discrete connection and curvature on classical solutions of the equations of motion are not, strictly speaking, genuine connection and curvature, but more general quantities; therefore, these do not appear as arguments of a function to be averaged, but as the integration (dummy) variables. We argue that upon integrating out the latter, the resulting measure can be well defined on the physical hypersurface (for the area tensors corresponding to certain edge vectors, i.e. to a certain metric) as positive and having an exponential cutoff at large areas, on condition that we confine ourselves to configurations which do not pass through degenerate metrics.

  20. Some thoughts on the management of large, complex international space ventures

    Science.gov (United States)

    Lee, T. J.; Kutzer, Ants; Schneider, W. C.

    1992-01-01

    Management issues relevant to the development and deployment of large international space ventures are discussed, with particular attention given to previous experience. Management approaches utilized in the past are labeled as either simple or complex, and signs of efficient management are examined. Simple approaches include those in which experiments and subsystems are developed for integration into spacecraft; the Apollo-Soyuz Test Project is given as an example of a simple multinational approach. Complex approaches include those for ESA's Spacelab Project and Space Station Freedom, in which functional interfaces cross agency and political boundaries. It is concluded that individual elements of space programs should be managed by the individual participating agencies, with overall configuration control coordinated by level and a program director acting to manage overall objectives and project interfaces.

  1. Everyday value conflicts and integrative complexity of thought.

    Science.gov (United States)

    Myyry, Liisa

    2002-12-01

    This study examined the value pluralism model in everyday value conflicts and the effect of issue context on complexity of thought. Following the cognitive manager model, we hypothesized that respondents would attain a higher level of integrative complexity on personal issues than on professional and general issues. We also explored the relations of integrative complexity to value priorities, measured by the Schwartz Value Survey, and to emotional empathy. The value pluralism model was not supported by the data, collected from 126 university students of social science, business and technology. The cognitive manager model was partially confirmed by the data from females but not from males. Concerning value priorities, more complex respondents had higher regard for self-transcendence values, and less complex respondents for self-enhancement values. Emotional empathy was also significantly related to the complexity score.

  2. Evaluation of integration methods for hybrid simulation of complex structural systems through collapse

    Science.gov (United States)

    Del Carpio R., Maikol; Hashemi, M. Javad; Mosqueda, Gilberto

    2017-10-01

    This study examines the performance of integration methods for hybrid simulation of large and complex structural systems in the context of structural collapse due to seismic excitations. The target application is not necessarily real-time testing, but rather models that involve large-scale physical sub-structures and highly nonlinear numerical models. Four case studies are presented and discussed. In the first case study, the accuracy of integration schemes, including two widely used methods, namely a modified version of the implicit Newmark method with a fixed number of iterations (iterative) and the operator-splitting method (non-iterative), is examined through pure numerical simulations. The second case study presents the results of 10 hybrid simulations repeated with the two aforementioned integration methods, considering various time steps and fixed numbers of iterations for the iterative integration method. The physical sub-structure in these tests consists of a single-degree-of-freedom (SDOF) cantilever column with replaceable steel coupons that provides repeatable, highly nonlinear behavior including fracture-type strength and stiffness degradation. In case study three, the implicit Newmark method with a fixed number of iterations is applied to hybrid simulations of a 1:2-scale steel moment frame that includes a relatively complex nonlinear numerical substructure. Lastly, a more complex numerical substructure is considered by constructing a nonlinear computational model of a moment frame coupled to a hybrid model of a 1:2-scale steel gravity frame. The last two case studies are conducted on the same prototype structure, and the selection of time steps and fixed numbers of iterations is closely examined in pre-test simulations. The generated unbalance forces are used as an index to track the equilibrium error and predict the accuracy and stability of the simulations.
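
    For reference, the implicit Newmark scheme discussed above can be written down compactly for a linear single-degree-of-freedom system, where the average-acceleration variant (β = 1/4, γ = 1/2) needs no iteration. The oscillator parameters below are illustrative, not those of the test structures.

```python
import math

# Linear SDOF oscillator in free vibration: period T = 1 s, no damping
m, c, k = 1.0, 0.0, 4.0 * math.pi ** 2
dt = 0.01
beta, gamma = 0.25, 0.5            # average-acceleration Newmark

u, v = 1.0, 0.0                    # initial displacement and velocity
a = (-c * v - k * u) / m           # initial acceleration from equilibrium

# Effective stiffness is constant for a linear system
k_eff = m / (beta * dt ** 2) + gamma * c / (beta * dt) + k

for _ in range(100):               # advance exactly one period
    # effective load (external load is zero in free vibration)
    p_eff = (m * (u / (beta * dt ** 2) + v / (beta * dt)
                  + (1.0 / (2.0 * beta) - 1.0) * a)
             + c * (gamma * u / (beta * dt)
                    + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2.0 * beta) - 1.0) * a))
    u_new = p_eff / k_eff
    a_new = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
             - (1.0 / (2.0 * beta) - 1.0) * a)
    v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new
# Average acceleration has no algorithmic damping, so after one
# period the displacement returns very close to its initial value.
```

    For a nonlinear physical substructure the restoring force is measured rather than computed from k·u, which is where the fixed-iteration and operator-splitting variants compared in the study come in.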

  3. Unusually large erupted complex odontoma: A rare case report

    Energy Technology Data Exchange (ETDEWEB)

    Bagewadi, Shivanand B.; Kukreja, Rahul; Suma, Gundareddy N.; Yadav, Bhawn; Sharma, Havi [Dept. of Oral Medicine and Radiology, ITS Centre for Dental Studies and Research, Murad Nagar (India)

    2015-03-15

    Odontomas are nonaggressive, hamartomatous developmental malformations composed of mature tooth substances and may be compound or complex depending on the extent of morphodifferentiation or on their resemblance to normal teeth. Among them, complex odontomas are relatively rare tumors. They are usually asymptomatic in nature. Occasionally, these tumors become large, causing bone expansion followed by facial asymmetry. Odontoma eruptions are uncommon, and thus far, very few cases of erupted complex odontomas have been reported in the literature. Here, we report the case of an unusually large, painless, complex odontoma located in the right posterior mandible.

  4. Research and assessment of competitiveness of large engineering complexes

    Directory of Open Access Journals (Sweden)

    Krivorotov V.V.

    2017-01-01

    Full Text Available The urgency of the problem of ensuring the competitiveness of manufacturing and high-tech sectors is shown. The decisive role of large industrial complexes in shaping national economic results is substantiated, and the authors' interpretation of the concept of an "industrial complex" with regard to current economic systems is given. Current approaches to assessing the competitiveness of enterprises and industrial complexes are analyzed, and their main advantages and disadvantages are shown. A scientific and methodological approach to studying and managing the competitiveness of a large industrial complex is provided, with a description of its main units. As the central element of this approach, a methodology for assessing the competitiveness of a large industrial complex based on the pattern method is proposed; a modular system of competitiveness indicators is developed and adapted to large engineering complexes. Using the developed methodology, the competitiveness of one of the largest engineering complexes, the Uralelectrotyazhmash group of companies, a leading enterprise in the electrotechnical industry of Russia, is assessed. The evaluation identified the main problems and bottlenecks in the development of these enterprises and compared them with leading competitors. The main conclusions and recommendations are formed from the results of the study.

  5. An integrated native mass spectrometry and top-down proteomics method that connects sequence to structure and function of macromolecular complexes

    Science.gov (United States)

    Li, Huilin; Nguyen, Hong Hanh; Ogorzalek Loo, Rachel R.; Campuzano, Iain D. G.; Loo, Joseph A.

    2018-02-01

    Mass spectrometry (MS) has become a crucial technique for the analysis of protein complexes. Native MS has traditionally examined protein subunit arrangements, while proteomics MS has focused on sequence identification. These two techniques are usually performed separately, without taking advantage of the synergies between them. Here we describe the development of an integrated native MS and top-down proteomics method using Fourier-transform ion cyclotron resonance (FTICR) to analyse macromolecular protein complexes in a single experiment. We address previous concerns about employing FTICR MS to measure large macromolecular complexes by demonstrating the detection of complexes up to 1.8 MDa, and we demonstrate the efficacy of this technique for the direct acquisition of sequence-to-higher-order structural information for several large complexes. We then summarize the unique functionalities of different activation/dissociation techniques. The platform expands the ability of MS to integrate proteomics and structural biology to provide insights into protein structure, function and regulation.

  6. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means of ensuring the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined so that they are simple to write and flexible to adapt. We present two implementations of our proposed virtual-component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
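
    As a rough illustration of the virtual-component idea, the sketch below wraps a chain of data-flow components into a single unit that carries its own built-in test cases. All class and function names here are hypothetical; the paper's actual framework is tied to a specific component model:

    ```python
    class VirtualComponent:
        """Hypothetical 'virtual component': a chain of data-flow components
        exposed as one testable unit, loosely following the built-in
        testing (BIT) idea described in the abstract."""

        def __init__(self, *stages):
            self.stages = stages

        def process(self, data):
            # Pass data through each component in flow order
            for stage in self.stages:
                data = stage(data)
            return data

        def run_built_in_tests(self, test_cases):
            """Each test case is (input, expected_output); returns failures."""
            return [(inp, exp, self.process(inp))
                    for inp, exp in test_cases
                    if self.process(inp) != exp]

    # Two illustrative components in a flow: parse a line, then filter values
    parse = lambda line: [int(x) for x in line.split(",")]
    keep_positive = lambda xs: [x for x in xs if x > 0]

    vc = VirtualComponent(parse, keep_positive)
    failures = vc.run_built_in_tests([("1,-2,3", [1, 3]), ("-5,-6", [])])
    ```

    The point of the construction is that the assembled flow, not each component in isolation, is what the test cases exercise.
    
    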

  7. Reorganizing Complex Network to Improve Large-Scale Multiagent Teamwork

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2014-01-01

    Full Text Available Large-scale multiagent teamwork has become popular in various domains. As in human society's infrastructure, agents coordinate with only some of the others, forming a peer-to-peer complex network structure, and their organization has been shown to be a key factor influencing their performance. Our analysis identifies three key factors in expediting team performance. First, complex network effects may promote team performance. Second, coordination interactions must be routed from their sources to capable agents; although they can be transferred across the network via different paths, their sources and sinks depend on the intrinsic nature of the team, which is irrelevant to the network connections. Third, agents involved in the same plan often form a subteam and communicate with each other more frequently. Therefore, if the interactions between agents can be statistically recorded, an integrated network adjustment algorithm can be set up by combining the three key factors. Based on our abstracted teamwork simulations and the coordination statistics, we implemented the adaptive reorganization algorithm. The experimental results support our design: the reorganized network is more capable of coordinating heterogeneous agents.
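
    The core of statistics-driven reorganization can be caricatured in a few lines: record how often each pair of agents actually coordinates, then keep only the most-used links within a connection budget. The sketch below is a deliberately simplified stand-in (function names and the budget parameter are hypothetical; the paper's algorithm combines several factors, not just link frequency):

    ```python
    def reorganize(candidate_links, interaction_counts, budget):
        """Rank candidate links by observed coordination frequency and keep
        the top `budget` links -- a toy sketch of statistics-driven network
        reorganization (names and parameters are hypothetical)."""
        ranked = sorted(candidate_links,
                        key=lambda link: interaction_counts.get(link, 0),
                        reverse=True)
        return set(ranked[:budget])

    # Hypothetical coordination statistics gathered during teamwork simulation
    links = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]
    counts = {("a", "b"): 40, ("b", "c"): 25, ("c", "d"): 3, ("a", "c"): 1}
    kept = reorganize(links, counts, budget=2)
    ```

    Dropping rarely-used links frees each agent's limited connection capacity for the subteam partners it actually coordinates with.
    
    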

  8. Evaluation of complex integrated care programmes: the approach in North West London

    Science.gov (United States)

    Greaves, Felix; Pappas, Yannis; Bardsley, Martin; Harris, Matthew; Curry, Natasha; Holder, Holly; Blunt, Ian; Soljak, Michael; Gunn, Laura; Majeed, Azeem; Car, Josip

    2013-01-01

    Background Several local attempts to introduce integrated care in the English National Health Service have been tried, with limited success. The Northwest London Integrated Care Pilot attempts to improve the quality of care of the elderly and people with diabetes by providing a novel integration process across primary, secondary and social care organisations. It involves predictive risk modelling, care planning, multidisciplinary management of complex cases and an information technology tool to support information sharing. This paper sets out the evaluation approach adopted to measure its effect. Study design We present a mixed methods evaluation methodology. It includes a quantitative approach measuring changes in service utilization, costs, clinical outcomes and quality of care using routine primary and secondary data sources. It also contains a qualitative component, involving observations, interviews and focus groups with patients and professionals, to understand participant experiences and to understand the pilot within the national policy context. Theory and discussion This study considers the complexity of evaluating a large, multi-organisational intervention in a changing healthcare economy. We locate the evaluation within the theory of evaluation of complex interventions. We present the specific challenges faced by evaluating an intervention of this sort, and the responses made to mitigate against them. Conclusions We hope this broad, dynamic and responsive evaluation will allow us to clarify the contribution of the pilot, and provide a potential model for evaluation of other similar interventions. Because of the priority given to the integrated agenda by governments internationally, the need to develop and improve strong evaluation methodologies remains strikingly important. PMID:23687478

  9. Evaluation of complex integrated care programmes: the approach in North West London

    Directory of Open Access Journals (Sweden)

    Felix Greaves

    2013-03-01

    Full Text Available Background: Several local attempts to introduce integrated care in the English National Health Service have been tried, with limited success. The Northwest London Integrated Care Pilot attempts to improve the quality of care of the elderly and people with diabetes by providing a novel integration process across primary, secondary and social care organisations. It involves predictive risk modelling, care planning, multidisciplinary management of complex cases and an information technology tool to support information sharing. This paper sets out the evaluation approach adopted to measure its effect. Study design: We present a mixed methods evaluation methodology. It includes a quantitative approach measuring changes in service utilization, costs, clinical outcomes and quality of care using routine primary and secondary data sources. It also contains a qualitative component, involving observations, interviews and focus groups with patients and professionals, to understand participant experiences and to understand the pilot within the national policy context. Theory and discussion: This study considers the complexity of evaluating a large, multi-organisational intervention in a changing healthcare economy. We locate the evaluation within the theory of evaluation of complex interventions. We present the specific challenges faced by evaluating an intervention of this sort, and the responses made to mitigate against them. Conclusions: We hope this broad, dynamic and responsive evaluation will allow us to clarify the contribution of the pilot, and provide a potential model for evaluation of other similar interventions. Because of the priority given to the integrated agenda by governments internationally, the need to develop and improve strong evaluation methodologies remains strikingly important.

  11. Integration as the basis of stable and dynamic development of enterprises in agroindustrial complex

    Directory of Open Access Journals (Sweden)

    Petr Ivanovich Ogorodnikov

    2011-12-01

    Full Text Available The formation of market relations in the Russian economy generates an objective need to address a number of problems in the relationships between agroindustrial complex organizations in connection with privatization, the liberalization of prices, and imbalances in existing inter-industry production and economic relations that negatively affect the results of their economic activities. Because of flagrant violations of the replenishment process, a diverse range of connections and relationships between producers and processors was broken. In this situation, the major direction for lifting the agricultural economy is the development of cooperatives and agroindustrial integration. In addition, the formation of large integrated complexes demonstrates high efficiency and rapid development, which is the basis of the agroindustrial sector in many developed countries. Increasing competition forces business entities to combine capabilities and cooperate on mutually beneficial terms in the struggle to strengthen market positions. Thus, increasing the degree of integration in the agricultural sector helps it emerge from the protracted crisis and move more quickly toward innovation.

  12. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
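
    Digital normalization can be sketched as a streaming filter: a read is kept only if the median abundance of its k-mers, among the reads kept so far, is below a coverage cutoff, so redundant high-coverage reads are discarded before assembly. The toy version below uses an exact counter for clarity; production implementations use a compact approximate counting structure:

    ```python
    from collections import Counter

    def digital_normalization(reads, k=4, cutoff=3):
        """Keep a read only if the median count of its k-mers (seen so far)
        is below `cutoff` -- a minimal sketch of digital normalization."""
        counts = Counter()
        kept = []
        for read in reads:
            kmers = [read[i:i + k] for i in range(len(read) - k + 1)]
            med = sorted(counts[km] for km in kmers)[len(kmers) // 2]
            if med < cutoff:
                kept.append(read)
                counts.update(kmers)  # only kept reads raise the coverage estimate
        return kept

    # Ten identical high-coverage reads plus one novel read (toy data)
    reads = ["ACGTACGT"] * 10 + ["TTTTGGGG"]
    kept = digital_normalization(reads, k=4, cutoff=3)
    ```

    Most copies of the redundant read are discarded while the novel read survives, which is why the downstream assembly is nearly unchanged despite the smaller input.
    
    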

  13. Vertical integration from the large Hilbert space

    Science.gov (United States)

    Erler, Theodore; Konopka, Sebastian

    2017-12-01

    We develop an alternative description of the procedure of vertical integration based on the observation that amplitudes can be written in BRST exact form in the large Hilbert space. We relate this approach to the description of vertical integration given by Sen and Witten.

  14. Large branched self-assembled DNA complexes

    International Nuclear Information System (INIS)

    Tosch, Paul; Waelti, Christoph; Middelberg, Anton P J; Davies, A Giles

    2007-01-01

    Many biological molecules have been demonstrated to self-assemble into complex structures and networks by using their very efficient and selective molecular recognition processes. The use of biological molecules as scaffolds for the construction of functional devices by self-assembling nanoscale complexes onto the scaffolds has recently attracted significant attention and many different applications in this field have emerged. In particular DNA, owing to its inherent sophisticated self-organization and molecular recognition properties, has served widely as a scaffold for various nanotechnological self-assembly applications, with metallic and semiconducting nanoparticles, proteins, macromolecular complexes, inter alia, being assembled onto designed DNA scaffolds. Such scaffolds may typically contain multiple branch-points and comprise a number of DNA molecules self-assembled into the desired configuration. Previously, several studies have used synthetic methods to produce the constituent DNA of the scaffolds, but this typically constrains the size of the complexes. For applications that require larger self-assembling DNA complexes, several tens of nanometers or more, other techniques need to be employed. In this article, we discuss a generic technique to generate large branched DNA macromolecular complexes

  15. Accurate Complex Systems Design: Integrating Serious Games with Petri Nets

    Directory of Open Access Journals (Sweden)

    Kirsten Sinclair

    2016-03-01

    Full Text Available Difficulty understanding the large number of interactions involved in complex systems makes their successful engineering a problem. Petri Nets are one graphical modelling technique used to describe and check proposed designs of complex systems thoroughly. While automatic analysis capabilities of Petri Nets are useful, their visual form is less so, particularly for communicating the design they represent. In engineering projects, this can lead to a gap in communications between people with different areas of expertise, negatively impacting achieving accurate designs. In contrast, although capable of representing a variety of real and imaginary objects effectively, behaviour of serious games can only be analysed manually through interactive simulation. This paper examines combining the complementary strengths of Petri Nets and serious games. The novel contribution of this work is a serious game prototype of a complex system design that has been checked thoroughly. Underpinned by Petri Net analysis, the serious game can be used as a high-level interface to communicate and refine the design. Improvement of a complex system design is demonstrated by applying the integration to a proof-of-concept case study.
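
    The Petri Net firing semantics that underlie the automatic analysis mentioned above fit in a few lines: a transition is enabled when each of its input places holds enough tokens, and firing consumes input tokens and produces output tokens. A minimal sketch, with hypothetical place and transition names:

    ```python
    def enabled(marking, transition):
        """A transition is enabled when every input place holds enough tokens."""
        return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

    def fire(marking, transition):
        """Consume tokens from input places, produce tokens on output places."""
        m = dict(marking)
        for p, n in transition["in"].items():
            m[p] -= n
        for p, n in transition["out"].items():
            m[p] = m.get(p, 0) + n
        return m

    # Hypothetical two-step workflow: raw -> staged -> done
    t_stage = {"in": {"raw": 1}, "out": {"staged": 1}}
    t_finish = {"in": {"staged": 1}, "out": {"done": 1}}

    m = {"raw": 2}
    m = fire(m, t_stage)            # one item moves to 'staged'
    assert enabled(m, t_finish)     # 'staged' now holds a token
    m = fire(m, t_finish)
    ```

    Exhaustively exploring the markings reachable under this rule is exactly the kind of thorough checking the paper relies on, and it is this state space that a serious-game front end would make easier to communicate.
    
    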

  16. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
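
    The "reduce then sample" idea can be illustrated with a plain random-walk Metropolis sampler whose log-posterior is a cheap surrogate standing in for the expensive forward model. This is a generic sketch under that assumption, not the project's actual algorithm, and the surrogate here is just a one-dimensional Gaussian:

    ```python
    import math
    import random

    def metropolis(logpost, x0, steps=2000, scale=0.5, seed=1):
        """Random-walk Metropolis sampling; `logpost` may be a cheap
        reduced-order surrogate of an expensive forward model."""
        rng = random.Random(seed)
        x, lp = x0, logpost(x0)
        samples = []
        for _ in range(steps):
            cand = x + rng.gauss(0, scale)
            lp_cand = logpost(cand)
            # Accept with probability min(1, exp(lp_cand - lp))
            if math.log(rng.random() + 1e-300) < lp_cand - lp:
                x, lp = cand, lp_cand
            samples.append(x)
        return samples

    # Hypothetical surrogate posterior: Gaussian centred at 1.0, standing in
    # for the reduced-order model of the parameter-to-observation map
    surrogate = lambda x: -0.5 * (x - 1.0) ** 2
    chain = metropolis(surrogate, x0=0.0)
    mean = sum(chain) / len(chain)
    ```

    Each chain step now costs a surrogate evaluation instead of a full forward solve, which is the source of the "very large computational savings" the abstract reports.
    
    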

  17. Control Synthesis for the Flow-Based Microfluidic Large-Scale Integration Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2013-01-01

    In this paper we are interested in flow-based microfluidic biochips, which are able to integrate the necessary functions for biochemical analysis on-chip. In these chips, the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units, such as mi...

  18. Privatization of Land Plot Under Integral Real Estate Complex

    Directory of Open Access Journals (Sweden)

    Maruchek A. A.

    2014-10-01

    Full Text Available The article deals with questions concerning the privatization of a land plot under an integral real estate complex. The authors come to the conclusion that a number of legislative norms relating to privatization of a land plot do not take into account the construction of an integral real estate complex, which could cause some problems in the realization of the right to privatization of the land plot.

  19. Path integral representations on the complex sphere

    International Nuclear Information System (INIS)

    Grosche, C.

    2007-08-01

    In this paper we discuss the path integral representations for the coordinate systems on the complex sphere S_3C. The Schroedinger equation, respectively the path integral, separates in exactly 21 orthogonal coordinate systems. We enumerate these coordinate systems and are able to present the path integral representations explicitly in the majority of the cases. In each solution the expansion into the wave-functions is stated. Also, the kernel and the corresponding Green function can be stated in closed form in terms of the invariant distance on the sphere, respectively on the hyperboloid. (orig.)

  20. Path integral representations on the complex sphere

    Energy Technology Data Exchange (ETDEWEB)

    Grosche, C. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik

    2007-08-15

    In this paper we discuss the path integral representations for the coordinate systems on the complex sphere S{sub 3C}. The Schroedinger equation, respectively the path integral, separates in exactly 21 orthogonal coordinate systems. We enumerate these coordinate systems and we are able to present the path integral representations explicitly in the majority of the cases. In each solution the expansion into the wave-functions is stated. Also, the kernel and the corresponding Green function can be stated in closed form in terms of the invariant distance on the sphere, respectively on the hyperboloid. (orig.)

  1. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated into the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of electric vehicles are introduced. The possible impacts of large scale integration of electric vehicles on the power systems, especially the advantages for the integration of renewable energies, are discussed. Finally, the research projects related to the large scale integration of electric vehicles into the power systems are introduced; they will provide reference for large scale integration of Electric Vehicles into power grids.

  2. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  3. Automated X-ray television complex for testing large dynamic objects

    International Nuclear Information System (INIS)

    Gusev, E.A.; Luk'yanenko, Eh.A.; Chelnokov, V.B.; Kuleshov, V.K.; Alkhimov, Yu.V.

    1992-01-01

    An automated X-ray television complex based on a matrix gas-discharge large-area (2.1x1.0 m) converter for testing large cargoes and containers, as well as for industrial article diagnostics, is described. The complex's pulsed operation with the 512-kbyte digital television memory unit provides for testing dynamic objects under minimal doses (20-100 μR)

  4. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent emission intensity reduction (with respect to 2005 levels) along with large scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted under the Paris agreement. But large scale integration of renewable energy is a complex process which faces a number of problems, such as capital intensiveness, matching intermittent generation with load under limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal and gas fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. A base case scenario (no RE addition), an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass) and a low RE scenario (50 GW solar, 30 GW wind) have been created to analyze the implications of large scale integration of variable renewable energy. The results provide insights on the trade-offs involved in achieving mitigation targets and the associated investment decisions. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
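
    The ramp-rate and minimum-generation constraints mentioned above can be caricatured with a single-period merit-order dispatch. The sketch below is a toy greedy rule under hypothetical unit data; TIMES solves a full optimization, not this heuristic:

    ```python
    def dispatch(demand, units, prev_output):
        """Greedy single-period dispatch honouring minimum-generation and
        ramp-rate limits (toy sketch; all names and values hypothetical)."""
        output, remaining = {}, demand
        # Cheapest units first (merit order)
        for name, u in sorted(units.items(), key=lambda kv: kv[1]["cost"]):
            lo = max(u["pmin"], prev_output[name] - u["ramp"])  # floor this period
            hi = min(u["pmax"], prev_output[name] + u["ramp"])  # ceiling this period
            take = min(max(remaining, lo), hi)  # clamp desired output to [lo, hi]
            output[name] = take
            remaining -= take
        # remaining > 0 means unserved demand; < 0 means forced over-generation
        return output, remaining

    units = {
        "coal": {"pmin": 50, "pmax": 200, "ramp": 40, "cost": 2.0},
        "gas": {"pmin": 10, "pmax": 100, "ramp": 60, "cost": 5.0},
    }
    prev = {"coal": 100, "gas": 20}
    output, unserved = dispatch(180, units, prev)
    ```

    With high shares of variable renewables, the net demand swings grow, and it is exactly these lo/hi bands that determine whether the thermal fleet can follow them.
    
    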

  5. Reframing the challenges to integrated care: a complex-adaptive systems perspective

    Directory of Open Access Journals (Sweden)

    Peter Tsasis

    2012-09-01

    Full Text Available Introduction: Despite over two decades of international experience and research on health systems integration, integrated care has not developed widely. We hypothesized that part of the problem may lie in how we conceptualize the integration process and the complex systems within which integrated care is enacted. This study aims to contribute to discourse regarding the relevance and utility of a complex-adaptive systems (CAS perspective on integrated care.Methods: In the Canadian province of Ontario, government mandated the development of fourteen Local Health Integration Networks in 2006. Against the backdrop of these efforts to integrate care, we collected focus group data from a diverse sample of healthcare professionals in the Greater Toronto Area using convenience and snowball sampling. A semi-structured interview guide was used to elicit participant views and experiences of health systems integration. We use a CAS framework to describe and analyze the data, and to assess the theoretical fit of a CAS perspective with the dominant themes in participant responses.Results: Our findings indicate that integration is challenged by system complexity, weak ties and poor alignment among professionals and organizations, a lack of funding incentives to support collaborative work, and a bureaucratic environment based on a command and control approach to management. Using a CAS framework, we identified several characteristics of CAS in our data, including diverse, interdependent and semi-autonomous actors; embedded co-evolutionary systems; emergent behaviours and non-linearity; and self-organizing capacity. Discussion and Conclusion: One possible explanation for the lack of systems change towards integration is that we have failed to treat the healthcare system as complex-adaptive. The data suggest that future integration initiatives must be anchored in a CAS perspective, and focus on building the system's capacity to self-organize. We conclude that


  7. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  8. Management of Large Erupting Complex Odontoma in Maxilla

    Directory of Open Access Journals (Sweden)

    Colm Murphy

    2014-01-01

    Full Text Available We present the unusual case of a large complex odontoma erupting in the maxilla. Odontomas are benign developmental tumours of odontogenic origin. They are characterized by slow growth and nonaggressive behaviour. Complex odontomas that erupt are rare. They are usually asymptomatic and are identified on routine radiographs, but may present with erosion into the oral cavity with subsequent cellulitis and facial asymmetry. This paper describes the presentation and management of an erupting complex odontoma occupying the maxillary sinus with extension to the infraorbital rim. We also discuss various surgical approaches used to access this anatomic area.

  9. Report of the Workshop on Petascale Systems Integration for Large-Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale system integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  10. Iterative methods for the solution of very large complex symmetric linear systems of equations in electrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Clemens, M.; Weiland, T. [Technische Hochschule Darmstadt (Germany)

    1996-12-31

    In the field of computational electrodynamics the discretization of Maxwell's equations using the Finite Integration Theory (FIT) yields very large, sparse, complex symmetric linear systems of equations. For this class of complex non-Hermitian systems a number of conjugate gradient-type algorithms are considered. The complex version of the biconjugate gradient (BiCG) method by Jacobs can be extended to a whole class of methods for complex-symmetric systems, SCBiCG(T, n), which only require one matrix-vector multiplication per iteration step. In this class the well-known conjugate orthogonal conjugate gradient (COCG) method for complex-symmetric systems corresponds to the case n = 0. The case n = 1 yields the BiCGCR method, which corresponds to the conjugate residual algorithm for the real-valued case. These methods, in combination with a minimal residual smoothing process, are applied separately to practical 3D electro-quasistatic and eddy-current problems in electrodynamics. The practical performance of the SCBiCG methods is compared with other methods such as QMR and TFQMR.
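
    As a concrete illustration of the COCG case (n = 0) mentioned above, the following is a minimal NumPy sketch of the conjugate orthogonal conjugate gradient iteration for a complex symmetric system. It differs from ordinary CG only in using the unconjugated bilinear form; the test matrix, shift and tolerances are illustrative, not taken from the paper.

```python
import numpy as np

def cocg(A, b, tol=1e-10, max_iter=500):
    """Conjugate Orthogonal Conjugate Gradient (COCG) for complex
    symmetric systems (A == A.T but A != A.conj().T).  Identical to
    ordinary CG except that the unconjugated bilinear form r.T @ r
    replaces the Hermitian inner product."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rho = r @ r                      # complex-valued, no conjugation
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rho / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        rho_new = r @ r
        p = r + (rho_new / rho) * p
        rho = rho_new
    return x

# small, diagonally dominant complex symmetric test system
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50)) + 1j * rng.standard_normal((50, 50))
A = (M + M.T) / 2 + 25 * np.eye(50)   # A == A.T, not Hermitian
b = rng.standard_normal(50) + 1j * rng.standard_normal(50)
x = cocg(A, b)
print(np.linalg.norm(A @ x - b))      # tiny residual
```

    One matrix-vector product per iteration, as the abstract notes; the FIT matrices in the paper are sparse, so `A @ p` would be a sparse product in practice.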

  11. The Complexity of Integrated Instrument Components Media for Natural Science (IPA) at Elementary Schools

    Directory of Open Access Journals (Sweden)

    Angreni Siska

    2018-01-01

    Full Text Available This research aims at describing the complexity of Integrated Instrument Components (CII) media in science learning at elementary schools in District Siulak Mukai and in District Siulak. The research applied a descriptive method that included survey forms; the instruments used were observation sheets. The results showed that the CII media for natural science at the elementary schools of District Siulak were more complex than those at the elementary schools of District Siulak Mukai.

  12. Megacities and Large Urban Complexes - WMO Role in Addressing Challenges and Opportunities

    Science.gov (United States)

    Terblanche, Deon; Jalkanen, Liisa

    2013-04-01

    Megacities and Large Urban Complexes - WMO Role in Addressing Challenges and Opportunities Deon E. Terblanche and Liisa Jalkanen dterblanche@wmo.int ljalkanen@wmo.int World Meteorological Organization, Geneva, Switzerland The 21st Century could, amongst others, become known as the century in which our species evolved from Homo sapiens to Homo urbanus. By now the urban population has surpassed the rural population, and the rate of urbanization will continue at such a pace that by 2050 urban dwellers could outnumber their rural counterparts by more than two to one. Most of this growth in urban population will occur in developing countries and along coastal areas. Urbanization is to a large extent the outcome of humans seeking a better life through improved opportunities presented by high-density communities. Megacities and large urban complexes provide more job opportunities and social structures, better transport and communication links, and a relative abundance of physical goods and services when compared to most rural areas. Unfortunately these urban complexes also present numerous social and environmental challenges. Urban areas differ from their surroundings in morphology and population density, and have high concentrations of industrial activity, energy consumption and transport. They also pose unique challenges to atmospheric modelling and monitoring, and create a multi-disciplinary spectrum of potential threats, including air pollution, which need to be addressed in an integrated way. These areas are also vulnerable to the changing climate and its implications for sea level, extreme events, air quality and related health impacts. Many urban activities are significantly impacted by weather events that would not be considered high-impact in less densely populated areas.
For instance, moderate precipitation events can cause flooding and landslides as modified urban catchments generally have higher run-off to rainfall ratios than their more pristine rural

  13. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  14. CRISPR-Cas9-mediated genetic engineering for the purification of the endogenous Integrator complex from mammalian cells.

    Science.gov (United States)

    Baillat, David; Russell, William K; Wagner, Eric J

    2016-12-01

    The Integrator Complex (INT) is a large multi-subunit protein complex, containing at least 14 subunits and a host of associated factors. These protein components have been established through pulldowns of overexpressed epitope-tagged subunits or by using antibodies raised against specific subunits. Here, we utilize CRISPR/Cas9 gene editing technology to introduce N-terminal FLAG epitope tags into the endogenous genes that encode Integrator subunits 4 and 11 within HEK293T cells. We provide specific details regarding design, approaches for facile screening, and our observed frequency of successful recombination. Finally, using silver staining, Western blotting and LC-MS/MS, we compare the components of INT purified from the CRISPR-derived lines and from 293T cells overexpressing FLAG-INTS11 to define a highly resolved constituency of mammalian INT. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. The Integrated Complex of Marketing of Higher Education Services

    Directory of Open Access Journals (Sweden)

    Zhehus Olena V.

    2017-10-01

    Full Text Available The article, on the basis of a generalization of the scientific views of foreign and domestic scientists, substantiates an integrated model of marketing of higher education products and services with consideration of their specificities. The obtained result is the «5P + S» model, which includes the newly introduced poly-element «proposition», combining the interrelated and indivisible elements of «product», «people» and «process», as well as the traditional elements of the services marketing complex: «price», «place», «promotion» and «physical evidence». The «social-marketing» element has been added to the integrated model on the basis of the high societal importance of educational services. Altogether, the proposed integrated model of the complex of marketing of higher education products and services is a symbiosis of commercial and non-commercial marketing, which will enhance the social and economic efficiency of functioning of higher educational institutions.

  16. An integrated view of complex landscapes: a big data-model integration approach to trans-disciplinary science

    Science.gov (United States)

    The Earth is a complex system comprised of many interacting spatial and temporal scales. Understanding, predicting, and managing for these dynamics requires a trans-disciplinary integrated approach. Although there have been calls for this integration, a general approach is needed. We developed a Tra...

  17. High-level waste program integration within the DOE complex

    International Nuclear Information System (INIS)

    Valentine, J.H.; Malone, K.; Schaus, P.S.

    1998-03-01

    Eleven major Department of Energy (DOE) site contractors were chartered by the Assistant Secretary to use a systems engineering approach to develop and evaluate technically defensible cost savings opportunities across the complex. Known as the complex-wide Environmental Management Integration (EMI), this process evaluated all the major DOE waste streams, including high level waste (HLW). Across the DOE complex, this waste stream has the highest life cycle cost and is scheduled to take until at least 2035 before all HLW is processed for disposal. Technical contract experts from the four DOE sites that manage high level waste participated in the integration analysis: Hanford, Savannah River Site (SRS), Idaho National Engineering and Environmental Laboratory (INEEL), and the West Valley Demonstration Project (WVDP). In addition, subject matter experts from the Yucca Mountain Project and the Tanks Focus Area participated in the analysis, and departmental representatives from the US Department of Energy Headquarters (DOE-HQ) monitored the analysis and results. Workshops were held throughout the year to develop recommendations to achieve a complex-wide integrated program. From this effort, the HLW Environmental Management (EM) Team identified a set of programmatic and technical opportunities that could result in potential cost savings and avoidance in excess of $18 billion and an accelerated completion of the HLW mission by seven years. The cost savings, schedule improvements, and volume reduction are attributed to a multifaceted HLW treatment and disposal strategy which involves waste pretreatment, standardized waste matrices, risk-based retrieval, early development and deployment of a shipping system for glass canisters, and reasonable, low cost tank closure.

  18. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all available information (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem, either through an analytical description of the combined probability function or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. One type of information being too informative (and hence conflicting) leads to a difficult sampling problem with unrealistic uncertainty; resolving this conflict prior to data integration leads to an easy data integration problem with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and can lead to biased results, and under
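
    The core operation described above, combining independently quantified sources of information and detecting conflict between them, can be sketched numerically. The following toy Python example (our construction, not the authors' code) forms the conjunction of two Gaussian states of information on a 1-D grid; the normalization constant of the product collapses when the sources conflict, mirroring the "too informative" case in the abstract. All densities and parameters are illustrative.

```python
import numpy as np

# Integration of probabilistic information on a 1-D grid: the combined
# state of information is the normalized product (conjunction) of the
# individual densities, and the normalization constant measures how
# consistent the sources are (near zero => conflicting information).
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def integrate(densities):
    prod = np.prod(densities, axis=0)
    z = np.sum(prod) * dx            # evidence / consistency measure
    return prod / z, z

# two consistent sources of information about the same parameter
_, z_ok = integrate([gaussian(x, 0.0, 1.0), gaussian(x, 0.5, 1.0)])
# two "too informative", mutually conflicting sources
_, z_bad = integrate([gaussian(x, 0.0, 0.2), gaussian(x, 8.0, 0.2)])
print(z_ok > 1e3 * z_bad)            # True: conflict collapses the evidence
```

    In a realistic setting the product would be explored by a sampler rather than a grid, which is where the computational cost discussed in the abstract arises; the collapsing evidence is the grid-scale analogue of the "difficult sampling problem".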

  19. RESEARCH ON COMPLEX, LARGE INDUSTRIAL PROJECTS IN TRANSNATIONAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Florin POPESCU

    2016-12-01

    Full Text Available More and more projects from different industrial sectors developed in a transnational environment are being characterized as "complex". In recent years there has been much discussion and controversy about the complexity of projects and, despite what has been written and said in various papers, journals and professional conferences, more confusion than clarification has been created, the complexity of projects being interpreted differently from one author to another. Most of the literature studied is based on a linear, analytical and rational approach, focusing on project management planning and control, and less on projects that take place and grow within a dynamic socio-human environment in continuous change. This study represents a critical review of existing theoretical models found in the literature, highlighting their limitations. The output of this literature study is an integration of different approaches concerning complexity under one umbrella, to provide a common understanding of the evolution of this concept.

  20. Integration and test plans for complex manufacturing systems

    NARCIS (Netherlands)

    Boumen, R.

    2007-01-01

    The integration and test phases that are part of the development and manufacturing of complex manufacturing systems are costly and time consuming. As time-to-market is becoming increasingly important, it is crucial to keep these phases as short as possible, while maintaining system quality. This is

  1. Guidelines for integrated risk assessment and management in large industrial areas. Inter-Agency programme on the assessment and management of health and environmental risks from energy and other complex industrial systems

    International Nuclear Information System (INIS)

    1998-01-01

    The IAEA, the United Nations Environment Programme (UNEP) within the framework of the Awareness and Preparedness for Emergencies at Local Level (APELL), the United Nations Industrial Development Organization (UNIDO) and the World Health Organization (WHO) decided in 1986 to join forces in order to promote the use of integrated area-wide approaches to risk management. An Inter-Agency Programme, which brings together expertise in health, the environment, industry and energy, all vital for effective risk management, was established. The Inter-Agency Programme on the Assessment and Management of Health and Environmental Risks from Energy and Other Complex Industrial Systems aims at promoting and facilitating the implementation of integrated risk assessment and management for large industrial areas. This initiative includes the compilation of procedures and methods for environmental and public health risk assessment, the transfer of knowledge and experience amongst countries in the application of these procedures, and the implementation of an integrated approach to risk management. The purpose of the Inter-Agency Programme is to develop a broad approach to the identification, prioritization and minimization of industrial hazards in a given geographical area. The UN organizations sponsoring this programme have been involved for several years in activities aimed at the assessment and management of environmental and health risks, the prevention of major accidents and emergency preparedness. These Guidelines have been developed on the basis of experience from these activities to assist in the planning and conduct of regional risk management projects. They provide a reference framework for the undertaking of integrated health and environmental risk assessment for large industrial areas and for the formulation of appropriate risk management strategies

  2. Guidelines for integrated risk assessment and management in large industrial areas. Inter-Agency programme on the assessment and management of health and environmental risks from energy and other complex industrial systems

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    The IAEA, the United Nations Environment Programme (UNEP) within the framework of the Awareness and Preparedness for Emergencies at Local Level (APELL), the United Nations Industrial Development Organization (UNIDO) and the World Health Organization (WHO) decided in 1986 to join forces in order to promote the use of integrated area-wide approaches to risk management. An Inter-Agency Programme, which brings together expertise in health, the environment, industry and energy, all vital for effective risk management, was established. The Inter-Agency Programme on the Assessment and Management of Health and Environmental Risks from Energy and Other Complex Industrial Systems aims at promoting and facilitating the implementation of integrated risk assessment and management for large industrial areas. This initiative includes the compilation of procedures and methods for environmental and public health risk assessment, the transfer of knowledge and experience amongst countries in the application of these procedures, and the implementation of an integrated approach to risk management. The purpose of the Inter-Agency Programme is to develop a broad approach to the identification, prioritization and minimization of industrial hazards in a given geographical area. The UN organizations sponsoring this programme have been involved for several years in activities aimed at the assessment and management of environmental and health risks, the prevention of major accidents and emergency preparedness. These Guidelines have been developed on the basis of experience from these activities to assist in the planning and conduct of regional risk management projects. They provide a reference framework for the undertaking of integrated health and environmental risk assessment for large industrial areas and for the formulation of appropriate risk management strategies. Refs, figs, tabs.

  3. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches and multiplexers can be built, hence the name of the technology, “microfluidic Very Large-Scale Integration” (mVLSI). The paper presents the state-of-the-art in mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design.

  4. Quantifying complexity in translational research: an integrated approach.

    Science.gov (United States)

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research, and a case study in obesity was used to assess usability. Generally, the evidence generated was valuable for understanding various components in translational research. In particular, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some may argue that the results are biased; however, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue for assessing complexity; the proposed method aims to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
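
    The AHP half of the method rests on a standard computation: priority weights from the principal eigenvector of a pairwise-comparison matrix, and a consistency ratio used "as a guide to subjectivity". The following is a minimal NumPy sketch of that computation; the comparison matrix is hypothetical (its criteria are merely named after factors mentioned in the abstract), not data from the study.

```python
import numpy as np

# Saaty's random consistency index, indexed by matrix size n
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_priorities(A):
    """Priority weights and consistency ratio (CR) of a reciprocal
    pairwise-comparison matrix, via the principal eigenvector."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                             # priority vector
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    return w, ci / RI[n]

# hypothetical comparison of three factors named in the abstract
# (collaboration networks, team capacity, community engagement);
# the judgment values themselves are illustrative only
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w, cr = ahp_priorities(A)
print(np.round(w, 3), round(cr, 3))   # CR < 0.1 counts as consistent
```

    A CR above roughly 0.1 would signal that the subjective judgments are too inconsistent to trust, which is exactly the safeguard the abstract describes.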

  5. Complexity of Configurators Relative to Integrations and Field of Application

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Battistello, Loris

    Configurators are applied widely to automate the specification processes at companies. The literature describes the industrial application of configurators supporting both sales and engineering processes, where configurators supporting the engineering processes are described as more challenging. Moreover, configurators are commonly integrated with various IT systems within companies. The complexity of configurators is an important factor when it comes to performance, development and maintenance of the systems. A direct comparison of the complexity is made based on the different fields of application and integrations to other IT systems. The research method adopted in the paper is based on a survey followed by interviews, where the unit of analysis is an operating configurator within a company.

  6. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    OpenAIRE

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an existing large and complex software-intensive system. This definition approach enables the customization and extension of a set of predefined viewpoints to address the requirements of a specific developme...

  7. Supply chain integration and performance : the moderating effect of supply complexity

    NARCIS (Netherlands)

    Giménez, C.; van der Vaart, T.; van Donk, D.P.

    2012-01-01

    Purpose - The purpose of this paper is to investigate the effectiveness of supply chain integration in different contexts. More specifically, it aims to show that supply chain integration is only effective in buyer-supplier relationships characterised by high supply complexity.

  8. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC-based HVDC technology for future large-scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large-scale offshore wind power.

  9. Dynamic Reactive Power Compensation of Large Scale Wind Integrated Power System

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    Due to the progressive displacement of conventional power plants by wind turbines, the dynamic security of large-scale wind-integrated power systems gets significantly compromised. In this paper we first highlight the importance of dynamic reactive power support/voltage security in large-scale wind-integrated power systems with little presence of conventional power plants. We argue that i) wind turbines, especially wind farms with additional grid support functionalities like dynamic support (e.g. dynamic reactive power support), and ii) refurbishment of existing conventional central power plants to synchronous condensers could be among the most efficient, reliable and cost-effective options. Then we propose a mixed-integer dynamic optimization based method for optimal dynamic reactive power allocation in large-scale wind-integrated power systems. One of the important aspects of the proposed methodology is that unlike

  10. Assembly and control of large microtubule complexes

    Science.gov (United States)

    Korolev, Kirill; Ishihara, Keisuke; Mitchison, Timothy

    Motility, division, and other cellular processes require rapid assembly and disassembly of microtubule structures. We report a new mechanism for the formation of asters, radial microtubule complexes found in very large cells. The standard model of aster growth assumes elongation of a fixed number of microtubules originating from the centrosomes. However, aster morphology in this model does not scale with cell size, and we found evidence for microtubule nucleation away from centrosomes. By combining polymerization dynamics and autocatalytic nucleation of microtubules, we developed a new biophysical model of aster growth. The model predicts an explosive transition from an aster with a steady-state radius to one that expands as a travelling wave. At the transition, microtubule density increases continuously, but the aster growth rate discontinuously jumps to a nonzero value. We tested our model with biochemical perturbations in egg extract and confirmed the main theoretical predictions, including the jump in the growth rate. Our results show that asters can grow even though individual microtubules are short and unstable. The dynamic balance between microtubule collapse and nucleation could be a general framework for the assembly and control of large microtubule complexes. NIH GM39565; Simons Foundation 409704; Honjo International Scholarship Foundation.
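
    The predicted transition, a bounded aster below a nucleation threshold and a travelling wave above it, can be illustrated with a deliberately simplified mean-field toy model (our construction, not the authors' equations): plus ends are advected outward at the polymerization speed V, lost at catastrophe rate r, and amplified by autocatalytic nucleation at rate Q per existing plus end. All parameter values are illustrative.

```python
import numpy as np

# Toy 1-D transport model:  dc/dt = -V * dc/dx + (Q - r) * c,
# with a constant plus-end source at the centrosome (x = 0).
# For Q < r the density profile decays and the aster radius saturates;
# for Q > r the profile grows behind an outward-travelling front.
def aster_radius(Q, r=1.0, V=1.0, T=40.0, L=60.0, nx=1200, thresh=1e-3):
    x = np.linspace(0.0, L, nx)
    dx = x[1] - x[0]
    dt = 0.4 * dx / V                   # CFL-stable upwind time step
    c = np.zeros(nx)
    radii = []
    for _ in range(int(T / dt)):
        c[0] = 1.0                      # centrosomal source
        # first-order upwind advection plus net nucleation/catastrophe
        c[1:] += dt * (-V * (c[1:] - c[:-1]) / dx + (Q - r) * c[1:])
        radii.append(x[c > thresh].max())
    return np.array(radii)

r_sub = aster_radius(Q=0.5)   # nucleation below catastrophe: radius saturates
r_sup = aster_radius(Q=1.5)   # above the transition: front keeps travelling
print(round(r_sub[-1], 1), round(r_sup[-1], 1))
```

    In the subcritical run the threshold radius settles near V/(r - Q) times log of the threshold, while in the supercritical run it advances at roughly the polymerization speed, the toy analogue of the explosive transition described above.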

  11. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality

  12. From Synergy to Complexity: The Trend Toward Integrated Value Chain and Landscape Governance.

    Science.gov (United States)

    Ros-Tonen, Mirjam A F; Reed, James; Sunderland, Terry

    2018-05-30

    This Editorial introduces a special issue that illustrates a trend toward integrated landscape approaches. Whereas two papers echo older "win-win" strategies based on the trade of non-timber forest products, ten papers reflect a shift from a product to landscape perspective. However, they differ from integrated landscape approaches in that they emanate from sectorial approaches driven primarily by aims such as forest restoration, sustainable commodity sourcing, natural resource management, or carbon emission reduction. The potential of such initiatives for integrated landscape governance and achieving landscape-level outcomes has hitherto been largely unaddressed in the literature on integrated landscape approaches. This special issue addresses this gap, with a focus on actor constellations and institutional arrangements emerging in the transition from sectorial to integrated approaches. This editorial discusses the trends arising from the papers, including the need for a commonly shared concern and sense of urgency; inclusive stakeholder engagement; accommodating and coordinating polycentric governance in landscapes beset with institutional fragmentation and jurisdictional mismatches; alignment with locally embedded initiatives and governance structures; and a framework to assess and monitor the performance of integrated multi-stakeholder approaches. We conclude that, despite a growing tendency toward integrated approaches at the landscape level, inherent landscape complexity renders persistent and significant challenges such as balancing multiple objectives, equitable inclusion of all relevant stakeholders, dealing with power and gender asymmetries, adaptive management based on participatory outcome monitoring, and moving beyond existing administrative, jurisdictional, and sectorial silos. Multi-stakeholder platforms and bridging organizations and individuals are seen as key in overcoming such challenges.

  13. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal antenna selection at large scale is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming, without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the antenna selection processing time required by the proposed method does not grow with the number of selected antennas, whereas the computational complexity of the conventional exhaustive-search method increases significantly when large-scale antenna arrays are employed. This makes the method particularly useful for antenna selection in large-scale MIMO communication systems.
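The trade-off described above (combinatorial exhaustive search versus a cheap incremental pass) can be illustrated with a toy greedy capacity-based selector. This is an illustrative sketch under assumed channel and SNR models, not the paper's interactive multiple-parameter method:

```python
import numpy as np
from math import comb

def capacity(H, subset, snr=1.0):
    """Shannon capacity (bits/s/Hz) of the channel restricted to the
    selected transmit antennas (columns of H)."""
    Hs = H[:, list(subset)]
    m = Hs.shape[0]
    return float(np.log2(np.linalg.det(np.eye(m) + snr * (Hs @ Hs.conj().T)).real))

def greedy_select(H, k, snr=1.0):
    """Greedy transmit-antenna selection: add one antenna at a time,
    keeping the choice that most increases capacity.  Cost is about
    k * n_tx capacity evaluations instead of comb(n_tx, k) for an
    exhaustive search."""
    n_tx = H.shape[1]
    chosen = []
    for _ in range(k):
        best = max((c for c in range(n_tx) if c not in chosen),
                   key=lambda c: capacity(H, chosen + [c], snr))
        chosen.append(best)
    return chosen

rng = np.random.default_rng(0)
n_rx, n_tx, k = 4, 64, 8
H = (rng.standard_normal((n_rx, n_tx))
     + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)

# Exhaustive search over comb(64, 8) ~ 4.4e9 subsets is infeasible;
# the greedy pass needs only k * n_tx = 512 capacity evaluations.
sel = greedy_select(H, k)
print(sel)
```

Greedy selection is a standard low-complexity baseline for this problem; it is not guaranteed optimal, but its cost grows linearly rather than combinatorially in the array size.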

  14. Modulation of chromatin structure by the FACT histone chaperone complex regulates HIV-1 integration.

    Science.gov (United States)

    Matysiak, Julien; Lesbats, Paul; Mauro, Eric; Lapaillerie, Delphine; Dupuy, Jean-William; Lopez, Angelica P; Benleulmi, Mohamed Salah; Calmels, Christina; Andreola, Marie-Line; Ruff, Marc; Llano, Manuel; Delelis, Olivier; Lavigne, Marc; Parissi, Vincent

    2017-07-28

Insertion of the retroviral genome DNA occurs in the chromatin of the host cell. This step is modulated by chromatin structure, as nucleosome compaction has been shown to prevent HIV-1 integration and chromatin remodeling has been reported to affect integration efficiency. LEDGF/p75-mediated targeting of the integration complex toward RNA polymerase II (polII) transcribed regions ensures optimal access to dynamic regions that are suitable for integration. Consequently, we have investigated the involvement of polII-associated factors in the regulation of HIV-1 integration. Using a pull-down approach coupled with mass spectrometry, we have selected the FACT (FAcilitates Chromatin Transcription) complex as a new potential cofactor of HIV-1 integration. FACT is a histone chaperone complex associated with the polII transcription machinery and recently shown to bind LEDGF/p75. We report here that a tripartite complex can be formed between HIV-1 integrase, LEDGF/p75 and FACT in vitro and in cells. Biochemical analyses show that FACT-dependent nucleosome disassembly promotes HIV-1 integration into chromatinized templates and generates highly favored nucleosomal structures in vitro. This effect was found to be amplified by LEDGF/p75. Promotion of this FACT-mediated chromatin remodeling in cells both increases chromatin accessibility and stimulates HIV-1 infectivity and integration. Altogether, our data indicate that FACT regulates HIV-1 integration by inducing local nucleosome dissociation that modulates the functional association between the incoming intasome and the targeted nucleosome.

  15. Risk Management and Uncertainty in Large Complex Public Projects

    DEFF Research Database (Denmark)

    Neerup Themsen, Tim; Harty, Chris; Tryggestad, Kjell

Governmental actors worldwide are promoting risk management as a rational approach to manage uncertainty and improve the ability to deliver large complex projects according to budget, time plans, and pre-set project specifications: But what do we know about the effects of risk management… on the ability to meet such objectives? Using Callon's (1998) twin notions of framing and overflowing we examine the implementation of risk management within the Danish public sector and the effects this generated for the management of two large complex projects. We show how the rational framing of risk… management has generated unexpected costly outcomes such as: the undermining of the longer-term value and societal relevance of the built asset, the neglect of the wider range of uncertainties emerging during project processes, and constraining forms of knowledge. We also show how expert accountants play…

  16. Large complex ovarian cyst managed by laparoscopy

    OpenAIRE

    Dipak J. Limbachiya; Ankit Chaudhari; Grishma P. Agrawal

    2017-01-01

A complex ovarian cyst with secondary infection is a rare disease that hardly responds to the usual antibiotic treatment. Most of the time, it hampers the day-to-day activities of women, and it is commonly known to cause pain and fever. To our surprise, in our case the cyst was large enough to compress the ureter, and it was adherent to the surrounding structures. Laparoscopic removal of the cyst was performed and the specimen was sent for histopathological examination.

  17. Value of spatial planning for large mining and energy complexes. [Yugoslavia

    Energy Technology Data Exchange (ETDEWEB)

    Matko, Z; Spasic, N

    1982-01-01

Using the example of the Kosovo complex (Socialist Federal Republic of Yugoslavia), the value of developing a spatial plan for the territory of large mining-energy complexes is examined, and the goals and expected results of spatial planning are discussed. At the modern level of technology, open-pit mining of lignite, oil shale, and other fossil energy raw materials causes, in addition to large-volume physical interference in space, considerable structural changes of a functional-economic, socioeconomic, and psychological-sociological nature in the direct zone of influence of the mining-energy complex. Improvements in lignite-mining technology will not, in the near future, resolve the problems of developing mining-energy complexes, so a considerable degradation of space, governed by the existing technology, must be expected. Under these conditions, detailed planning and regulation of space is especially important, viewed as a component part of a long-term policy for development of the mining-energy complex and the zones of its influence.

  18. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

Methods of modeling short circuits in complex power networks are described. The methods are implemented in integrated programs for computing short-circuit currents and network equivalents in electrical networks with a large number of branch points (up to 1000), on a computer with limited online memory capacity (M = 4030).

  19. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan)]; Hideaki, Koike [Advance Soft Corporation (Japan)]

    2003-07-01

The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, designed as a PSE within the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of the integrated platform is its new architecture, called TASK FLOW, which integrates computational resources such as hardware and software on the network and supports complex, large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using this prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and biomechanical systems are planned.

  20. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, designed as a PSE within the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of the integrated platform is its new architecture, called TASK FLOW, which integrates computational resources such as hardware and software on the network and supports complex, large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using this prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and biomechanical systems are planned.

  1. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improve system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence, and even constrain, the engineering effort for large-scale complex systems.

  2. Integrated optimization on aerodynamics-structure coupling and flight stability of a large airplane in preliminary design

    Directory of Open Access Journals (Sweden)

    Xiaozhe WANG

    2018-06-01

Full Text Available The preliminary phase is significant during the whole design process of a large airplane because of its enormous potential for enhancing overall performance. However, classical sequential designs can hardly adapt to modern airplanes, due to their repeated iterations, long periods, and massive computational burdens. Multidisciplinary analysis and optimization demonstrates the capability to tackle such complex design issues. In this paper, an integrated optimization method for the preliminary design of a large airplane is proposed, accounting for aerodynamics, structure, and stability. Aeroelastic responses are computed by a rapid three-dimensional flight load analysis method combining the high-order panel method and a structural elasticity correction. The flow field is determined by the viscous/inviscid iteration method, and cruise stability is evaluated by linear small-disturbance theory. Parametric optimization is carried out using a genetic algorithm to seek the minimal weight of a simplified plate-beam wing structure in the cruise trim condition subject to aeroelastic, aerodynamic, and stability constraints; the optimal wing geometry, front/rear spar positions, and structural sizes are obtained simultaneously. To reduce the computational burden of the static aeroelasticity analysis in the optimization process, the Kriging method is employed to predict the aerodynamic influence coefficient matrices of different aerodynamic shapes. The multidisciplinary analyses guarantee computational accuracy and efficiency, and the integrated optimization sufficiently considers the coupling effects between different disciplines to improve overall performance, avoiding the limitations of the currently utilized sequential approaches. Keywords: Aeroelasticity, Integrated optimization, Multidisciplinary analysis, Large airplane, Preliminary design
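The Kriging surrogate idea mentioned above can be sketched in miniature: a simple-kriging (Gaussian-process) interpolator over one hypothetical shape parameter, with a scalar response standing in for an aerodynamic influence coefficient. The kernel, length scale, and toy function are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def kriging_predict(x_train, y_train, x_query, length=0.2, nugget=1e-10):
    """Minimal simple-kriging interpolator with a squared-exponential
    covariance.  The weights solve K w = y; the prediction applies the
    query-to-training covariances to those weights."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(x_train, x_train) + nugget * np.eye(len(x_train))  # jitter for stability
    w = np.linalg.solve(K, y_train)
    return k(x_query, x_train) @ w

# Toy stand-in: a smooth "influence coefficient" over one shape parameter.
x_train = np.linspace(0.0, 1.0, 9)
y_train = np.sin(2 * np.pi * x_train)
pred = kriging_predict(x_train, y_train, np.array([0.5, 0.25]))
print(pred)
```

Because Kriging interpolates its training data, a handful of expensive aeroelastic evaluations can stand in for the full analysis at intermediate shapes, which is the computational saving the abstract describes.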

  3. Complex mineralization at large ore deposits in the Russian Far East

    Science.gov (United States)

    Schneider, A. A.; Malyshev, Yu. F.; Goroshko, M. V.; Romanovsky, N. P.

    2011-04-01

Genetic and mineralogical features of large deposits with complex Sn, W, and Mo mineralization in the Sikhote-Alin and Amur-Khingan metallogenic provinces are considered, as well as those of rare-metal, rare earth, and uranium deposits in the Aldan-Stanovoi province. The spatiotemporal, geological, and mineralogical attributes of large deposits are set forth, and their geodynamic settings are determined. These attributes are exemplified by the large Tigriny Sn-W greisen-type deposit. The variation of regional tectonic settings and their spatial superposition are the main factors controlling the formation of large deposits. Such variation gives rise to multiple reactivations of the ore-magmatic system and to long-term, multistage formation of deposits. Pulsatory mineralogical zoning, with telescoped mineral assemblages related to different stages, results in the formation of complex ores. The highest-grade zones of mass discharge of hydrothermal solutions are formed at the deposits. Promising greisen-type mineralization with complex Sn-W-Mo ore is suggested to be an additional source of tungsten and molybdenum; the Tigriny, Pravourminsky, and Arsen'evsky deposits, as well as deposits of the Komsomol'sk and Khingan-Olonoi ore districts, are examples. Large and superlarge U, Ta, Nb, Be, and REE deposits are localized in the southeastern Aldan-Stanovoi Shield, where the Ulkan and Arbarastakh ore districts attract special attention. The confirmed prospects of new large deposits with Sn, W, Mo, Ta, Nb, Be, REE, and U mineralization in the south of the Russian Far East justify further geological exploration in this territory.

  4. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

Shielding calculations of advanced nuclear facilities such as accelerator-based neutron sources or fusion devices of the tokamak type are complicated by their complex geometries and large dimensions, including bulk shields several meters thick. While complex geometry can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-discrete ordinates transport calculations to handle this kind of shielding problem. The Monte Carlo technique is used to simulate particle generation and transport in the target region, with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep-penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT, and a newly developed coupling interface program for the mapping process. Test calculations were performed and compared with MCNP solutions; satisfactory agreement was obtained between the two approaches. The program system has been used to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. This successful application demonstrates that the coupling scheme and program system are a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)

  5. Integrator complex plays an essential role in adipose differentiation

    International Nuclear Information System (INIS)

    Otani, Yuichiro; Nakatsu, Yusuke; Sakoda, Hideyuki; Fukushima, Toshiaki; Fujishiro, Midori; Kushiyama, Akifumi; Okubo, Hirofumi; Tsuchiya, Yoshihiro; Ohno, Haruya; Takahashi, Shin-Ichiro; Nishimura, Fusanori; Kamata, Hideaki; Katagiri, Hideki; Asano, Tomoichiro

    2013-01-01

Highlights: •IntS6 and IntS11 are subunits of the Integrator complex. •Expression levels of IntS6 and IntS11 were very low in 3T3-L1 fibroblasts. •IntS6 and IntS11 were upregulated during adipose differentiation. •Suppression of IntS6 or IntS11 expression inhibited adipose differentiation. -- Abstract: The dynamic process of adipose differentiation involves stepwise expression of transcription factors and of proteins specific to the mature fat cell phenotype. In this study, it was revealed that expression levels of IntS6 and IntS11, subunits of the Integrator complex, increased in 3T3-L1 cells in the period when the cells reached confluence and differentiated into adipocytes, and were reduced to basal levels after the completion of differentiation. Suppression of IntS6 or IntS11 expression using siRNAs in 3T3-L1 preadipocytes markedly inhibited differentiation into mature adipocytes, based on morphological findings as well as mRNA analysis of adipocyte-specific genes such as Glut4, perilipin and Fabp4. Although Pparγ2 protein expression was suppressed in IntS6- or IntS11-siRNA-treated cells, adenoviral forced expression of Pparγ2 failed to restore the capacity for differentiation into mature adipocytes. Taken together, these findings demonstrate that increased expression of Integrator complex subunits is an indispensable event in adipose differentiation. Although further study is necessary to elucidate the underlying mechanism, the processing of U1 and U2 small nuclear RNAs may be involved in the differentiation steps.

  6. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
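The core computation in the study above, principal components of standardized biomarkers, can be sketched on synthetic data. The latent-factor model below is an illustrative stand-in, not the study's data or its 43 actual markers:

```python
import numpy as np

# Synthetic stand-in for the biomarker matrix: one latent process loads
# onto every marker, plus independent measurement noise.
rng = np.random.default_rng(1)
n_subjects, n_markers = 200, 43
latent = rng.standard_normal(n_subjects)
loadings = rng.uniform(0.5, 1.0, n_markers)
X = latent[:, None] * loadings + 0.5 * rng.standard_normal((n_subjects, n_markers))

# Standardize each biomarker (z-scores), as is conventional before PCA.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: rows of Vt are the axes; scores are projections onto them.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)   # variance fraction per axis
pc1_scores = Z @ Vt[0]                # each subject's score on the first axis

print(explained[0])
```

When many markers share a common underlying process, the first axis recovers it and explains a large variance fraction, which is the sense in which a single PCA axis can represent a "higher-order" physiological process.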

  7. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  8. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

Full Text Available Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost in optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized, and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter-rotator with a very simple fabrication process is presented. Realization of silicon-based light non-reciprocity devices (e.g., an optical isolator), which is very important for transmitters to avoid sensitivity to reflections, is also demonstrated with the help of magneto-optical material applied by bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low-loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB m−1 with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low-loss Si3N4 optical waveguides, several devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (de)multiplexers, and high-extinction-ratio polarizers.

  9. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

EU energy strategy is strongly focused on the large-scale integration of renewable energy sources. The most dominant part here is taken by variable sources: wind power plants. Grid integration of intermittent sources, while keeping the system stable and secure, is one of the biggest challenges for the TSOs. This part is often neglected by energy policy makers, so this paper deals with the expected future conditions for secure power system operation with large-scale wind integration. It gives an overview of expected wind integration development in the EU, as well as expected P/f regulation and control needs. The paper concludes with several recommendations. (author).

  10. On the benefits of an integrated nuclear complex for Nevada

    International Nuclear Information System (INIS)

    Blink, J.A.; Halsey, W.G.

    1994-01-01

An integrated nuclear complex is proposed for location at the Nevada Test Site. In addition to solving the nuclear waste disposal problem, this complex would greatly enhance the southern Nevada economy, and it would provide low-cost electricity to every resident and business in the affected counties. The nuclear industry and the national economy would benefit because the complex would demonstrate the new generation of safer nuclear power plants and revitalize the industry. Many spin-offs of the complex would be possible, including research into nuclear fusion and a world-class medical facility for southern Nevada. For such a complex to become a reality, the cycle of distrust between the federal government and the State of Nevada must be broken. The paper concludes with a discussion of implementation through a public process led by state officials and culminating in a voter referendum.

  11. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
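The "reduce then sample" idea can be sketched with a toy one-dimensional Bayesian inverse problem: a polynomial surrogate replaces the "expensive" forward model inside a random-walk Metropolis sampler. The forward map, prior, and noise level below are all assumptions for illustration, not the SAGUARO project's models:

```python
import numpy as np

def forward_full(m):
    """Stands in for an expensive forward simulation."""
    return np.sin(m) + 0.1 * m ** 2

# "Reduce": fit a cheap polynomial surrogate to a handful of full-model runs.
m_train = np.linspace(-2.0, 2.0, 7)
coeffs = np.polyfit(m_train, forward_full(m_train), 4)

def forward_reduced(m):
    return np.polyval(coeffs, m)

def log_post(m, fwd, d_obs, noise=0.1, prior_sd=1.0):
    """Gaussian likelihood around the forward prediction, Gaussian prior."""
    return -0.5 * ((fwd(m) - d_obs) / noise) ** 2 - 0.5 * (m / prior_sd) ** 2

rng = np.random.default_rng(2)
d_obs = forward_full(0.8)          # synthetic datum from true parameter 0.8

# "Sample": random-walk Metropolis that only ever calls the surrogate.
m, lp, chain = 0.0, log_post(0.0, forward_reduced, d_obs), []
for _ in range(5000):
    m_new = m + 0.3 * rng.standard_normal()
    lp_new = log_post(m_new, forward_reduced, d_obs)
    if np.log(rng.uniform()) < lp_new - lp:
        m, lp = m_new, lp_new
    chain.append(m)

print(np.mean(chain[1000:]))
```

Every posterior evaluation here costs a polynomial evaluation instead of a simulation, which is the source of the "very large computational savings" the abstract reports; the surrogate's approximation error is what must be controlled for the posterior to remain faithful.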

  12. Data integration, systems approach and multilevel description of complex biosystems

    International Nuclear Information System (INIS)

    Hernández-Lemus, Enrique

    2013-01-01

Recent years have witnessed the development of new quantitative approaches and theoretical tenets in the biological sciences. The advent of high-throughput experiments in genomics, proteomics and electrophysiology (to cite just a few examples) has provided researchers with unprecedented amounts of data to be analyzed. Large datasets, however, cannot provide the means to achieve a complete understanding of the underlying biological phenomena unless they are supplied with a solid theoretical framework and proper analytical tools. It is now widely accepted that, by using and extending some of the paradigmatic principles of what has been called complex systems theory, some degree of advance in this direction can be attained. We present ways in which data integration techniques (linear, non-linear, combinatorial, graphical), multidimensional multilevel descriptions (multifractal modeling, dimensionality reduction, computational learning), and an approach based on systems theory (interaction maps, probabilistic graphical models, non-equilibrium physics) have allowed us to better understand some problems at the interface of Statistical Physics and Computational Biology.

  13. Large-Eddy Simulations of Flows in Complex Terrain

    Science.gov (United States)

    Kosovic, B.; Lundquist, K. A.

    2011-12-01

Large-eddy simulation (LES) as a methodology for the numerical simulation of turbulent flows was first developed for the study of atmospheric turbulence by Lilly (1967). The first LES were carried out by Deardorff (1970), who used these simulations to study atmospheric boundary layers. Ever since, LES has been extensively used to study canonical atmospheric boundary layers, in most cases flat-plate boundary layers under the assumption of horizontal homogeneity. Carefully designed LES of canonical convective, neutrally stratified and, more recently, stably stratified atmospheric boundary layers have contributed significantly to a better understanding of these flows and of their parameterizations in large-scale models. These simulations were often carried out using codes specifically designed and developed for large-eddy simulation of horizontally homogeneous flows with periodic lateral boundary conditions. Recent developments in multi-scale numerical simulation of atmospheric flows enable numerical weather prediction (NWP) codes such as ARPS (Chow and Street, 2009), COAMPS (Golaz et al., 2009) and the Weather Research and Forecasting (WRF) model to be used nearly seamlessly across a wide range of atmospheric scales, from synoptic down to the turbulent scales of atmospheric boundary layers. Before we can carry out multi-scale simulations of atmospheric flows with confidence, NWP codes must be validated for accurate performance in simulating flows over complex or inhomogeneous terrain. We therefore carry out validation of WRF-LES for simulations of flows over complex terrain using data from the Askervein Hill (Taylor and Teunissen, 1985, 1987) and METCRAX (Whiteman et al., 2008) field experiments. WRF's nesting capability is employed, with a one-way nested inner domain that includes the complex terrain representation, while the coarser outer nest is used to spin up fully developed atmospheric boundary layer turbulence and thus represent inflow to the inner domain accurately. LES of a

  14. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable with respect to parameters, disturbances, and states, while others may not. The state estimation process is carried out for the observable subsystems, and the optimal number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
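
The structural observability step described above can be approached with graph methods. The following is a minimal sketch, not the paper's Modelica/Python tooling: it checks only output reachability on an invented influence graph, which is a necessary (not sufficient) condition for structural observability.

```python
from collections import deque

def output_reachable_states(edges, measured):
    """States from which some measured node is reachable in the directed
    influence graph (edge x_i -> x_j means x_i appears in dx_j/dt).
    Output reachability is a necessary condition for structural
    observability of a state."""
    rev = {}
    for src, dst in edges:
        rev.setdefault(dst, []).append(src)
    # Breadth-first search from the measured nodes over reversed edges.
    seen, queue = set(measured), deque(measured)
    while queue:
        node = queue.popleft()
        for pred in rev.get(node, []):
            if pred not in seen:
                seen.add(pred)
                queue.append(pred)
    return seen

# Hypothetical 4-state system: x1 drives x2, x2 drives x3; x3 is measured.
edges = [("x1", "x2"), ("x2", "x3")]
observable = output_reachable_states(edges, ["x3"])
# x4 never influences any measurement, so it needs an extra sensor.
unobservable = {"x1", "x2", "x3", "x4"} - observable
```

A state such as x4 that cannot reach any measured output is flagged, which is where the paper's prescription of additional measurements would apply; a complete structural test would also check a maximum-matching condition, omitted here.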

  15. Maximising the recovery of low grade heat: An integrated heat integration framework incorporating heat pump intervention for simple and complex factories

    International Nuclear Information System (INIS)

    Miah, J.H.; Griffiths, A.; McNeill, R.; Poonaji, I.; Martin, R.; Leiser, A.; Morse, S.; Yang, A.; Sadhukhan, J.

    2015-01-01

    Highlights: • A new practical heat integration framework incorporating heat pump technology for simple and complex food factories. • A decision-making procedure to select process or utility heat integration in complex and diverse factories. • New stream classifications to identify and compare streams linked between process and utility, especially waste heat. • A range of ‘Heat Pump Thresholds’ to identify and compare heat pump configurations with a steam-generating combustion boiler. - Abstract: The recovery of heat has long been a key measure for improving energy efficiency, with Pinch analysis used to maximise the heat recovery of factories. However, a substantial amount of research has been dedicated to conventional heat integration, where low grade heat is often ignored. Despite this, the sustainability challenges facing the process manufacturing community are turning interest towards low grade energy recovery systems that further advance energy efficiency through technological interventions such as heat pumps. This paper presents a novel heat integration framework incorporating technological interventions for both simple and complex factories, evaluating all possible heat integration opportunities including low grade and waste heat. The key features of the framework include the role of heat pumps in upgrading heat, which can significantly enhance energy efficiency; a heat pump selection process aided by the development of ‘Heat Pump Thresholds’ to decide whether heat pump designs are cost-competitive with a steam-generating combustion boiler; a decision-making procedure to select process or utility heat integration in complex and diverse factories; and additional stream classifications to identify and separate streams that can be practically integrated. The application of the framework at a modified confectionery factory has yielded four options capable of delivering a total energy reduction of about 32% with an economic payback
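
For the conventional heat-integration baseline that the framework extends, the minimum hot and cold utility targets come from the classic Pinch problem-table (heat cascade) algorithm. A minimal sketch with invented stream data, not the paper's confectionery case:

```python
def min_utilities(hot, cold, dt_min=10.0):
    """Problem-table heat cascade. hot/cold: (T_supply, T_target, CP)
    with CP in kW/K. Returns (Q_hot_min, Q_cold_min, pinch shifted T)."""
    # Shift hot streams down and cold streams up by dt_min/2.
    shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot")
               for ts, tt, cp in hot]
    shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold")
                for ts, tt, cp in cold]
    bounds = sorted({t for ts, tt, _, _ in shifted for t in (ts, tt)},
                    reverse=True)
    cascade = [0.0]  # cumulative heat flowing down the temperature scale
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            overlap = max(0.0, min(hi, top) - max(lo, bot))
            net += cp * overlap if kind == "hot" else -cp * overlap
        cascade.append(cascade[-1] + net)
    q_hot = -min(cascade)                     # minimum hot utility
    feasible = [c + q_hot for c in cascade]
    q_cold = feasible[-1]                     # minimum cold utility
    pinch = bounds[feasible.index(min(feasible))]
    return q_hot, q_cold, pinch

# Invented streams: one hot 150->50 C (CP=2), one cold 40->100 C (CP=3).
q_hot, q_cold, pinch = min_utilities([(150.0, 50.0, 2.0)],
                                     [(40.0, 100.0, 3.0)])
```

With this toy data the hot stream covers the cold demand, leaving a 20 kW cold-utility target; heat-pump interventions of the kind the paper proposes would then be assessed against such targets.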

  16. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to improve wind power availability and grid stability, but the integration of wind farms changes the SSO (sub-synchronous oscillation) damping characteristics of the synchronous generator system. Regarding this SSO problem caused by the integration of large-scale wind farms, this paper, focusing on doubly fed induction generator (DFIG) based wind farms, aims to summarize the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their areas of applicability. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.

  17. Navigating Complexities: An Integrative Approach to English Language Teacher Education

    Science.gov (United States)

    Ryan, Phillip; Glodjo, Tyler; Hobbs, Bethany; Stargel, Victoria; Williams, Thad

    2015-01-01

    This article is an analysis of one undergraduate English language teacher education program's integrative theoretical framework that is structured around three pillars: interdisciplinarity, critical pedagogy, and teacher exploration. First, the authors survey the unique complexities of language teaching and learning. Then, they introduce this…

  18. Development of a descriptive model of an integrated information system to support complex, dynamic, distributed decision making for emergency management in large organisations

    International Nuclear Information System (INIS)

    Andersen, V.; Andersen, H.B.; Axel, E.; Petersen, T.

    1990-01-01

    A short introduction is given to the European (ESPRIT II) project, ''IT Support for Emergency Management - ISEM''. The project is aimed at the development of an integrated information system capable of supporting the complex, dynamic, distributed decision making in the management of emergencies. The basic models developed to describe and construct emergency management organisations and their preparedness are illustrated, and it is argued that similarities may be found even among emergency situations that originally are of quite different natures. (author)

  19. On synchronisation of a class of complex chaotic systems with complex unknown parameters via integral sliding mode control

    Science.gov (United States)

    Tirandaz, Hamed; Karami-Mollaee, Ali

    2018-06-01

    Chaotic systems demonstrate complex behaviour in their state variables and their parameters, which poses challenges for synchronisation and control. This paper presents a new synchronisation scheme based on the integral sliding mode control (ISMC) method for a class of complex chaotic systems with complex unknown parameters. Synchronisation between corresponding states of a class of complex chaotic systems, as well as convergence of the system parameter errors to zero, is studied. The designed feedback control vector and complex unknown parameter vector are analytically derived based on Lyapunov stability theory. Moreover, the effectiveness of the proposed methodology is verified by synchronisation of the Chen complex system and the Lorenz complex system as the leader and follower chaotic systems, respectively. In conclusion, some numerical simulations related to the synchronisation methodology are given to illustrate the effectiveness of the theoretical discussion.
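
To illustrate the general idea of sliding-mode synchronisation (not the paper's ISMC scheme with complex-valued states and adaptive parameter laws), here is a much-simplified real-valued sketch: two Lorenz systems in a leader-follower configuration, with a linear feedback plus a switching term on each error channel; the gains k and eta are arbitrary choices for this illustration.

```python
import math

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the (real) Lorenz system."""
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def sgn(v):
    return (v > 0) - (v < 0)

def synchronise(steps=20000, dt=0.001, k=200.0, eta=1.0):
    """Leader-follower synchronisation with per-channel feedback plus a
    switching (sliding) term; explicit Euler integration."""
    leader = [1.0, 1.0, 1.0]
    follower = [8.0, -5.0, 20.0]          # different initial condition
    for _ in range(steps):
        fl = lorenz(leader)
        ff = lorenz(follower)
        u = [-k * (follower[i] - leader[i])
             - eta * sgn(follower[i] - leader[i]) for i in range(3)]
        follower = [follower[i] + dt * (ff[i] + u[i]) for i in range(3)]
        leader = [leader[i] + dt * fl[i] for i in range(3)]
    return math.dist(leader, follower)

final_error = synchronise()   # settles into a small chattering band
```

The error norm decays from roughly 21 to the small chattering band typical of switching controllers; the paper's integral sliding surface and Lyapunov-derived adaptation laws would replace the fixed gains used here.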

  20. Integral criteria for large-scale multiple fingerprint solutions

    Science.gov (United States)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. Explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of real multiple fingerprint tests show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
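
The criterion above (fix the FAR, minimise the FRR) can be illustrated with a toy Gaussian score model; the distributions below are invented for illustration, not the empirical ones in the paper. For sum-fused independent scores, means and variances add, so the fused FRR at a fixed FAR can be compared directly with the single-test FRR:

```python
from statistics import NormalDist

def frr_at_far(mu_g, sigma_g, mu_i, sigma_i, far):
    """FRR at the score threshold that fixes FAR, for Gaussian genuine
    and impostor score models."""
    t = NormalDist(mu_i, sigma_i).inv_cdf(1.0 - far)   # P(impostor >= t) = far
    return NormalDist(mu_g, sigma_g).cdf(t)            # FRR = P(genuine < t)

# Single test: genuine ~ N(4, 1), impostor ~ N(0, 1) (invented numbers).
single = frr_at_far(4.0, 1.0, 0.0, 1.0, far=1e-3)
# Sum of two independent fingers: means add, variances add.
fused = frr_at_far(8.0, 2.0 ** 0.5, 0.0, 2.0 ** 0.5, far=1e-3)
```

With these numbers, fusing two fingers cuts the FRR from roughly 18% to about 0.5% at FAR = 0.001, which is the qualitative effect the integral criterion formalises.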

  1. Convergent Innovation in Emerging Healthcare Technology Ecosystems: Addressing Complexity and Integration

    Directory of Open Access Journals (Sweden)

    Mark A. Phillips

    2017-09-01

    Full Text Available Precision Medicine and Digital Health are emerging areas in healthcare, and they are underpinned by convergent or cross-industry innovation. However, convergence results in greater uncertainty and complexity in terms of technologies, value networks, and organization. There has been limited empirical research on emerging and convergent ecosystems, especially in addressing the issue of integration. This research identifies how organizations innovate in emerging and convergent ecosystems, specifically, how they address the challenge of integration. We base our research on empirical analyses using a series of longitudinal case studies employing a combination of case interviews, field observations, and documents. Our findings identify a need to embrace the complexity by adopting a variety of approaches that balance “credibility-seeking” and “advantage-seeking” behaviours, to navigate, negotiate, and nurture both the innovation and ecosystem, in addition to a combination of “analysis” and “synthesis” actions to manage aspects of integration. We contribute to the convergent innovation agenda and provide practical approaches for innovators in this domain.

  2. Optimal number of coarse-grained sites in different components of large biomolecular complexes.

    Science.gov (United States)

    Sinitskiy, Anton V; Saunders, Marissa G; Voth, Gregory A

    2012-07-26

    The computational study of large biomolecular complexes (molecular machines, cytoskeletal filaments, etc.) is a formidable challenge facing computational biophysics and biology. To achieve biologically relevant length and time scales, coarse-grained (CG) models of such complexes usually must be built and employed. One of the important early stages in this approach is to determine an optimal number of CG sites in different constituents of a complex. This work presents a systematic approach to this problem. First, a universal scaling law is derived and numerically corroborated for the intensity of the intrasite (intradomain) thermal fluctuations as a function of the number of CG sites. Second, this result is used for derivation of the criterion for the optimal number of CG sites in different parts of a large multibiomolecule complex. In the zeroth-order approximation, this approach validates the empirical rule of taking one CG site per fixed number of atoms or residues in each biomolecule, previously widely used for smaller systems (e.g., individual biomolecules). The first-order corrections to this rule are derived and numerically checked by the case studies of the Escherichia coli ribosome and Arp2/3 actin filament junction. In different ribosomal proteins, the optimal number of amino acids per CG site is shown to differ by a factor of 3.5, and an even wider spread may exist in other large biomolecular complexes. Therefore, the method proposed in this paper is valuable for the optimal construction of CG models of such complexes.
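
The zeroth-order rule mentioned above (one CG site per fixed number of atoms or residues) can be written down directly; this sketch shows only that mapping step with geometric centres, not the fluctuation-scaling criterion the paper derives for choosing the per-component resolution.

```python
def coarse_grain(coords, residues_per_site):
    """One CG site per `residues_per_site` consecutive residues; each
    site sits at the geometric centre of its group (masses omitted)."""
    sites = []
    for start in range(0, len(coords), residues_per_site):
        group = coords[start:start + residues_per_site]
        # Average each coordinate axis over the residue group.
        sites.append(tuple(sum(axis) / len(group) for axis in zip(*group)))
    return sites

# Six residues on a line, three residues per site -> two CG sites.
coords = [(float(i), 0.0, 0.0) for i in range(6)]
sites = coarse_grain(coords, 3)
```

In the paper's first-order refinement, `residues_per_site` would differ between components (by up to a factor of 3.5 between ribosomal proteins) rather than being one global constant.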

  3. Optimizing water resources management in large river basins with integrated surface water-groundwater modeling: A surrogate-based approach

    Science.gov (United States)

    Wu, Bin; Zheng, Yi; Wu, Xin; Tian, Yong; Han, Feng; Liu, Jie; Zheng, Chunmiao

    2015-04-01

    Integrated surface water-groundwater modeling can provide a comprehensive and coherent understanding on basin-scale water cycle, but its high computational cost has impeded its application in real-world management. This study developed a new surrogate-based approach, SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), to incorporate the integrated modeling into water management optimization. Its applicability and advantages were evaluated and validated through an optimization research on the conjunctive use of surface water (SW) and groundwater (GW) for irrigation in a semiarid region in northwest China. GSFLOW, an integrated SW-GW model developed by USGS, was employed. The study results show that, due to the strong and complicated SW-GW interactions, basin-scale water saving could be achieved by spatially optimizing the ratios of groundwater use in different irrigation districts. The water-saving potential essentially stems from the reduction of nonbeneficial evapotranspiration from the aqueduct system and shallow groundwater, and its magnitude largely depends on both water management schemes and hydrological conditions. Important implications for water resources management in general include: first, environmental flow regulation needs to take into account interannual variation of hydrological conditions, as well as spatial complexity of SW-GW interactions; and second, to resolve water use conflicts between upper stream and lower stream, a system approach is highly desired to reflect ecological, economic, and social concerns in water management decisions. Overall, this study highlights that surrogate-based approaches like SOIM represent a promising solution to filling the gap between complex environmental modeling and real-world management decision-making.
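
The surrogate idea behind SOIM, spending few expensive integrated-model runs by optimising a cheap approximation, can be sketched in one dimension. Everything here is illustrative: a local quadratic surrogate over the three best samples stands in for SOIM's actual surrogate, and the "expensive" function is a stand-in for a GSFLOW run.

```python
def parabola_vertex(pts):
    """Vertex x of the quadratic through three (x, y) points."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    d0 = y0 / ((x0 - x1) * (x0 - x2))
    d1 = y1 / ((x1 - x0) * (x1 - x2))
    d2 = y2 / ((x2 - x0) * (x2 - x1))
    a = d0 + d1 + d2
    b = -(d0 * (x1 + x2) + d1 * (x0 + x2) + d2 * (x0 + x1))
    return -b / (2.0 * a)

def surrogate_minimise(expensive_f, lo, hi, iters=10):
    """Fit a cheap local surrogate to the best samples, evaluate the
    expensive model only at the surrogate's minimiser, repeat."""
    xs = [lo + (hi - lo) * i / 4 for i in range(5)]   # coarse design
    data = [(x, expensive_f(x)) for x in xs]
    for _ in range(iters):
        # Keep one sample per distinct x; take the three best as the
        # support points of the local quadratic surrogate.
        uniq = {round(x, 12): (x, y) for x, y in data}
        best = sorted(uniq.values(), key=lambda p: p[1])[:3]
        if len(best) < 3:
            break
        x_new = min(max(parabola_vertex(sorted(best)), lo), hi)
        data.append((x_new, expensive_f(x_new)))      # one costly run
    return min(data, key=lambda p: p[1])

# Stand-in for an expensive integrated-model response surface.
expensive = lambda x: (x - 1.3) ** 2 + 0.5
x_best, y_best = surrogate_minimise(expensive, 0.0, 4.0)
```

Each loop iteration costs one expensive evaluation at the surrogate's minimiser; degenerate (collinear) fits are not handled in this sketch, and SOIM itself works in many dimensions with a far more capable surrogate.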

  4. Structuring and assessing large and complex decision problems using MCDA

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    This paper presents an approach for the structuring and assessing of large and complex decision problems using multi-criteria decision analysis (MCDA). The MCDA problem is structured in a decision tree and assessed using the REMBRANDT technique featuring a procedure for limiting the number of pair...

  5. Integrated circuit devices in control systems of coal mining complexes

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    Systems of automatic monitoring and control of coal mining complexes developed in the 1960's used electromagnetic relays, thyristors, and flip-flops on transistors of varying conductivity. The circuits' designers devoted much attention to ensuring spark safety, lowering power consumption, and raising the noise immunity and repairability of functional devices. The fast development of integrated circuitry led to the use of microelectronic components in most devices of mine automation. An analysis of specifications and experimental research into integrated circuits (IMS) shows that the series K 176 IMS components, made with CMOS technology, best meet mine operating conditions. The use of IMS devices under mine conditions has demonstrated their high reliability. Further developments in integrated circuitry involve using microprocessors and microcomputers. (SC)

  6. CRISPR-Mediated Integration of Large Gene Cassettes Using AAV Donor Vectors

    Directory of Open Access Journals (Sweden)

    Rasmus O. Bak

    2017-07-01

    Full Text Available The CRISPR/Cas9 system has recently been shown to facilitate high levels of precise genome editing using adeno-associated viral (AAV vectors to serve as donor template DNA during homologous recombination (HR. However, the maximum AAV packaging capacity of ∼4.5 kb limits the donor size. Here, we overcome this constraint by showing that two co-transduced AAV vectors can serve as donors during consecutive HR events for the integration of large transgenes. Importantly, the method involves a single-step procedure applicable to primary cells with relevance to therapeutic genome editing. We use the methodology in primary human T cells and CD34+ hematopoietic stem and progenitor cells to site-specifically integrate an expression cassette that, as a single donor vector, would otherwise amount to a total of 6.5 kb. This approach now provides an efficient way to integrate large transgene cassettes into the genomes of primary human cells using HR-mediated genome editing with AAV vectors.

  7. Planning and Building Qualifiable Embedded Systems: Safety and Risk Properties Assessment for a Large and Complex System with Embedded Subsystems

    Science.gov (United States)

    Silva, N.; Lopes, R.; Barbosa, R.

    2012-01-01

    Systems based on embedded components and applications are today used in all markets. They are planned and developed by all types of institutions with different types of background experience, multidisciplinary teams, and all types of capability and maturity levels. Organisational/engineering maturity has an impact on all aspects of the engineering of large and complex systems. An embedded system is a specific computer system designed to perform one or more dedicated functions, usually with real-time constraints. It is generally integrated as part of a more complex device typically composed of specific hardware such as sensors and actuators. This article presents a field-tested technique to evaluate the organisation, processes, system and software engineering practices, methods, tools and the planned/produced artefacts themselves, leading towards certification/qualification. The safety and risk assessment of such core and complex systems is explained and described in a step-by-step manner, while presenting the main results and conclusions of the application of the technique to a real case study.

  8. Large erupted complex odontoma

    Directory of Open Access Journals (Sweden)

    Vijeev Vasudevan

    2009-01-01

    Full Text Available Odontomas are a heterogeneous group of jaw bone lesions, classified as odontogenic tumors, which usually include well-diversified dental tissues. Odontoma is a term introduced to the literature by Broca in 1867. Trauma, infection and hereditary factors are possible causes of these lesions. Among odontogenic tumors, they constitute about two thirds of cases. These lesions usually develop slowly and asymptomatically, and in most cases they do not cross the bone borders. Two types of odontoma are recognized: compound and complex. Complex odontomas are less common than the compound variety, in a ratio of 1:2.3. Eruption of an odontoma into the oral cavity is rare. We present a case of complex odontoma in which apparent eruption has occurred in the area of the right maxillary second molar region.

  9. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    Full Text Available The concept of “large scale” depends obviously on the phenomenon we are interested in. For example, in the field of the foundations of Thermodynamics from microscopic dynamics, the relevant spatial and time scales are of the order of fractions of millimetres and microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain classes of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many degrees of freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations of Thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach also in these cases, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  10. Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS

    CERN Document Server

    Froidevaux, D

    2011-01-01

    Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS, part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B2: Detectors for Particles and Radiation. Part 2: Systems and Applications'. This document is part of Part 2 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Chapter '5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS' with the content: 5 Integration of Detectors Into a Large Experiment: Examples From ATLAS and CMS 5.1 Introduction 5.1.1 The context 5.1.2 The main initial physics goals of ATLAS and CMS at the LHC 5.1.3 A snapshot of the current status of the ATLAS and CMS experiments 5.2 Overall detector concept and magnet systems 5.2.1 Overall detector concept 5.2.2 Magnet systems 5.2.2.1 Rad...

  11. Evaluation model of project complexity for large-scale construction projects in Iran - A Fuzzy ANP approach

    Directory of Open Access Journals (Sweden)

    Aliyeh Kazemi

    2016-09-01

    Full Text Available Construction projects have always been complex. With the growing trend of this complexity, the implementation of large-scale construction projects becomes harder. Hence, evaluating and understanding these complexities is critical. Correct evaluation of a project's complexity can provide executives and managers with a sound basis for decisions. The fuzzy analytic network process (ANP) is a logical and systematic approach to definition, evaluation, and grading. This method allows for analyzing complex systems and determining their complexity. In this study, by taking advantage of fuzzy ANP, effective indexes of complexity in large-scale construction projects in Iran have been determined and prioritized. The results show that the socio-political, project system interdependency, and technological complexity indexes ranked as the top three. Furthermore, in a comparison of three major projects (commercial-administrative, hospital, and skyscraper), the hospital project was evaluated as the most complex. This model is beneficial for professionals managing large-scale projects.

  12. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal-hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses, such as probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal-hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example choices among alternative sub-models or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense; the paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper reports the effectiveness and practicality of the methodology with two applications, to a complex thermal-hydraulics system code and to a complex fire simulation code. In the case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)
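
The post-hoc Bayesian updating of code output against integral-test data can be sketched with a grid posterior over a single multiplicative bias factor; the model form, numbers, and uniform prior below are illustrative assumptions, far simpler than IMTHUA's treatment.

```python
import math

def posterior_bias(code_pred, measurements, sigma_meas, grid):
    """Grid posterior for a multiplicative bias factor b, with
    measurement ~ Normal(b * code_pred, sigma_meas) and a uniform
    prior over `grid`."""
    loglik = [sum(-0.5 * ((m - b * code_pred) / sigma_meas) ** 2
                  for m in measurements) for b in grid]
    # Normalise in a numerically safe way (subtract the max log-likelihood).
    mx = max(loglik)
    w = [math.exp(l - mx) for l in loglik]
    total = sum(w)
    probs = [wi / total for wi in w]
    mean_b = sum(b * p for b, p in zip(grid, probs))
    return mean_b, probs

# Code predicts 100 K; hypothetical integral-test data cluster near 110 K.
grid = [0.8 + 0.01 * i for i in range(41)]      # b in [0.8, 1.2]
mean_b, probs = posterior_bias(100.0, [109.0, 111.0, 110.0], 5.0, grid)
```

The posterior mean pulls the bias factor toward about 1.1, i.e. the code under-predicts by roughly 10% relative to these invented measurements; IMTHUA applies this kind of inference across many sub-models and propagates the result through the safety calculation.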

  13. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  14. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply...... was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...... integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  15. The spectra of type IIB flux compactifications at large complex structure

    International Nuclear Information System (INIS)

    Brodie, Callum; Marsh, M.C. David

    2016-01-01

    We compute the spectra of the Hessian matrix, H, and of the matrix M that governs the critical point equation of the low-energy effective supergravity, as a function of the complex structure and axio-dilaton moduli space in type IIB flux compactifications at large complex structure. We find both spectra analytically in an (h^{1,2}+3) real-dimensional subspace of the moduli space, and show that they exhibit a universal structure with highly degenerate eigenvalues, independently of the choice of flux, the details of the compactification geometry, and the number of complex structure moduli. In this subspace, the spectrum of the Hessian matrix contains no tachyons, but there are also no critical points. We show numerically that the spectra of H and M remain highly peaked over a large fraction of the sampled moduli space of explicit Calabi-Yau compactifications with 2 to 5 complex structure moduli. In these models, the scale of the supersymmetric contribution to the scalar masses is strongly linearly correlated with the value of the superpotential over almost the entire moduli space, with particularly strong correlations arising for g_s < 1. We contrast these results with the expectations from the much-used continuous flux approximation, and comment on the applicability of Random Matrix Theory to the statistical modelling of the string theory landscape.

  16. Large-area smart glass and integrated photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Lampert, C.M. [Star Science, 8730 Water Road, Cotati, CA 94931-4252 (United States)

    2003-04-01

    Several companies throughout the world are developing dynamic glazing and large-area flat panel displays. University and National Laboratory groups are researching new materials and processes to improve these products. The concept of a switchable glazing for building and vehicle applications is very attractive. Conventional glazing offers only fixed transmittance and control of the energy passing through it. Given the wide range of illumination conditions and glare, a dynamic glazing with adjustable transmittance offers the best solution. Photovoltaics can be integrated as power sources for smart windows. In this way a switchable window could be a completely stand-alone smart system. A new range of large-area flat panel displays, including lightweight and flexible displays, is being developed. These displays can be used for banner advertising, dynamic pricing in stores, electronic paper, and electronic books, to name only a few applications. This study covers selected switching technologies, including electrochromism, suspended particles, and encapsulated liquid crystals.

  17. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for the large-scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation. Models are being developed - validation is essential. Coordination of wind and hydro generation is a key to allowing more wind power capacity in areas with limited transmission corridors. For the case-study grid, depending on technology and control, the allowed wind farm size is increased from 50 to 200 MW. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the power system flexibility and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy - combining wind and hydro constitutes a win-win system (ml)

  18. Integrating complex business processes for knowledge-driven clinical decision support systems.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  19. Pan-Cancer Mutational and Transcriptional Analysis of the Integrator Complex

    Directory of Open Access Journals (Sweden)

    Antonio Federico

    2017-04-01

    Full Text Available The integrator complex has been recently identified as a key regulator of RNA Polymerase II-mediated transcription, with many functions including the processing of small nuclear RNAs, the pause-release and elongation of polymerase during the transcription of protein coding genes, and the biogenesis of enhancer derived transcripts. Moreover, some of its components also play a role in genome maintenance. Thus, it is reasonable to hypothesize that their functional impairment or altered expression can contribute to malignancies. Indeed, several studies have described the mutations or transcriptional alteration of some Integrator genes in different cancers. Here, to draw a comprehensive pan-cancer picture of the genomic and transcriptomic alterations for the members of the complex, we reanalyzed public data from The Cancer Genome Atlas. Somatic mutations affecting Integrator subunit genes and their transcriptional profiles have been investigated in about 11,000 patients and 31 tumor types. A general heterogeneity in the mutation frequencies was observed, mostly depending on tumor type. Despite the fact that we could not establish them as cancer drivers, INTS7 and INTS8 genes were highly mutated in specific cancers. A transcriptome analysis of paired (normal and tumor) samples revealed that the transcription of INTS7, INTS8, and INTS13 is significantly altered in several cancers. Experimental validation performed on primary tumors confirmed these findings.

  20. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel, high-quality research approaches that relate the quality of software architecture to system requirements, system architecture, enterprise architecture, or software testing. Modern software

  1. Protein complex prediction in large ontology attributed protein-protein interaction networks.

    Science.gov (United States)

    Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Li, Yanpeng; Xu, Bo

    2013-01-01

    Protein complexes are important for unraveling the secrets of cellular organization and function. Many computational approaches have been developed to predict protein complexes in protein-protein interaction (PPI) networks. However, most existing approaches focus mainly on the topological structure of PPI networks, and largely ignore the gene ontology (GO) annotation information. In this paper, we constructed ontology-attributed PPI networks from PPI data and the GO resource. After constructing ontology-attributed networks, we proposed a novel approach called CSO (clustering based on network structure and ontology attribute similarity). Structural information and GO attribute information are complementary in ontology-attributed networks. CSO can effectively take advantage of the correlation between frequent GO annotation sets and dense subgraphs for protein complex prediction. Our proposed CSO approach was applied to four different yeast PPI data sets and predicted many well-known protein complexes. The experimental results showed that CSO was valuable in predicting protein complexes and achieved state-of-the-art performance.
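
    As a toy illustration of the idea the abstract describes, combining topological similarity with GO-annotation similarity for a protein pair: the weighting scheme, data, and function names below are hypothetical, not CSO's actual scoring.

    ```python
    # Hypothetical sketch: blend shared-neighbor topology with shared GO terms.
    # Not the CSO algorithm itself; weights and data are made up.

    def jaccard(a, b):
        """Jaccard similarity between two sets (0.0 for two empty sets)."""
        if not a and not b:
            return 0.0
        return len(a & b) / len(a | b)

    def pair_similarity(neighbors, go_annotations, p1, p2, alpha=0.5):
        """Weighted mix of topological and ontology-attribute similarity."""
        topo = jaccard(neighbors[p1], neighbors[p2])
        onto = jaccard(go_annotations[p1], go_annotations[p2])
        return alpha * topo + (1 - alpha) * onto

    neighbors = {"A": {"B", "C"}, "B": {"A", "C"}}
    go_annotations = {"A": {"GO:1", "GO:2"}, "B": {"GO:2", "GO:3"}}
    print(pair_similarity(neighbors, go_annotations, "A", "B"))
    ```

    With alpha one can trade off how much the clustering trusts network structure versus annotation overlap.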

  2. Influence of the large-small split effect on strategy choice in complex subtraction.

    Science.gov (United States)

    Xiang, Yan Hui; Wu, Hao; Shang, Rui Hong; Chao, Xiaomei; Ren, Ting Ting; Zheng, Li Ling; Mo, Lei

    2018-04-01

    Two main theories have been used to explain the arithmetic split effect: decision-making process theory and strategy choice theory. Using the inequality paradigm, previous studies have confirmed that individuals tend to adopt a plausibility-checking strategy and a whole-calculation strategy to solve large and small split problems in complex addition arithmetic, respectively. This supports strategy choice theory, but it is unknown whether this theory also explains performance in solving different split problems in complex subtraction arithmetic. This study used small, intermediate and large split sizes, with each split condition being further divided into problems requiring and not requiring borrowing. The reaction times (RTs) for large and intermediate splits were significantly shorter than those for small splits, while accuracy was significantly higher for large and intermediate splits than for small splits, reflecting no speed-accuracy trade-off. Further, RTs and accuracy differed significantly between the borrow and no-borrow conditions only for small splits. This study indicates that strategy choice theory is suitable to explain the split effect in complex subtraction arithmetic. That is, individuals tend to choose the plausibility-checking strategy or the whole-calculation strategy according to the split size. © 2016 International Union of Psychological Science.

  3. Predicting co-complexed protein pairs using genomic and proteomic data integration

    Directory of Open Access Journals (Sweden)

    King Oliver D

    2004-04-01

    Full Text Available Abstract Background Identifying all protein-protein interactions in an organism is a major objective of proteomics. A related goal is to know which protein pairs are present in the same protein complex. High-throughput methods such as yeast two-hybrid (Y2H) and affinity purification coupled with mass spectrometry (APMS) have been used to detect interacting proteins on a genomic scale. However, both Y2H and APMS methods have substantial false-positive rates. Aside from high-throughput interaction screens, other gene- or protein-pair characteristics may also be informative of physical interaction. Therefore it is desirable to integrate multiple datasets and utilize their different predictive values for more accurate prediction of co-complexed relationships. Results Using a supervised machine learning approach, a probabilistic decision tree, we integrated high-throughput protein interaction datasets and other gene- and protein-pair characteristics to predict co-complexed pairs (CCPs) of proteins. Our predictions proved more sensitive and specific than predictions based on Y2H or APMS methods alone or in combination. Among the top predictions not annotated as CCPs in our reference set (obtained from the MIPS complex catalogue), a significant fraction was found to physically interact according to a separate database (YPD, Yeast Proteome Database), and the remaining predictions may potentially represent unknown CCPs. Conclusions We demonstrated that the probabilistic decision tree approach can be successfully used to predict co-complexed protein (CCP) pairs from other characteristics. Our top-scoring CCP predictions provide testable hypotheses for experimental validation.

  4. Contribution of Large Region Joint Associations to Complex Traits Genetics

    Science.gov (United States)

    Paré, Guillaume; Asma, Senay; Deng, Wei Q.

    2015-01-01

    A polygenic model of inheritance, whereby hundreds or thousands of weakly associated variants contribute to a trait’s heritability, has been proposed to underlie the genetic architecture of complex traits. However, relatively few genetic variants have been positively identified so far and they collectively explain only a small fraction of the predicted heritability. We hypothesized that joint association of multiple weakly associated variants over large chromosomal regions contributes to complex trait variance. Confirmation of such regional associations can help identify new loci and lead to a better understanding of known ones. To test this hypothesis, we first characterized the ability of commonly used genetic association models to identify large region joint associations. Through theoretical derivation and simulation, we showed that multivariate linear models in which multiple SNPs are included as independent predictors have the most favorable association profile. Based on these results, we tested for large region association with height in 3,740 European participants from the Health and Retirement Study (HRS). Adjusting for SNPs with known association with height, we demonstrated clustering of weak associations (p = 2×10^-4) in regions extending up to 433.0 Kb from known height loci. The contribution of regional associations to phenotypic variance was estimated at 0.172 (95% CI 0.063-0.279; p < 0.001), which compared favorably to 0.129 explained by known height variants. Conversely, we showed that suggestively associated regions are enriched for known height loci. To extend our findings to other traits, we also tested BMI, HDLc and CRP for large region associations, with consistent results for CRP. Our results demonstrate the presence of large region joint associations and suggest these can be used to pinpoint weakly associated SNPs. PMID:25856144
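
    The association model the abstract favors, a multivariate linear model with multiple SNPs entered jointly as independent predictors, can be sketched on simulated data. All sample sizes, effect sizes, and genotype counts below are made up for illustration; this is not the study's code.

    ```python
    # Hypothetical sketch: fit one joint linear model over all SNPs in a region
    # and report the fraction of phenotypic variance the region explains.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 500, 10                       # individuals, SNPs in the region
    genotypes = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 allele counts
    beta = np.zeros(m)
    beta[:3] = 0.2                       # three weakly associated SNPs
    height = genotypes @ beta + rng.normal(size=n)

    X = np.column_stack([np.ones(n), genotypes])  # intercept + all SNPs jointly
    coef, *_ = np.linalg.lstsq(X, height, rcond=None)
    resid = height - X @ coef
    r2 = 1 - resid.var() / height.var()
    print(round(r2, 3))                  # joint regional contribution to variance
    ```

    Because all SNPs enter as independent predictors, weak signals spread across the region accumulate into one joint test rather than being assessed one SNP at a time.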

  5. Integrating complex functions: coordination of nuclear pore complex assembly and membrane expansion of the nuclear envelope requires a family of integral membrane proteins.

    Science.gov (United States)

    Schneiter, Roger; Cole, Charles N

    2010-01-01

    The nuclear envelope harbors numerous large proteinaceous channels, the nuclear pore complexes (NPCs), through which macromolecular exchange between the cytosol and the nucleoplasm occurs. This double-membrane nuclear envelope is continuous with the endoplasmic reticulum and thus functionally connected to such diverse processes as vesicular transport, protein maturation and lipid synthesis. Recent results obtained from studies in Saccharomyces cerevisiae indicate that assembly of the nuclear pore complex is functionally dependent upon maintenance of lipid homeostasis of the ER membrane. Previous work from one of our laboratories has revealed that an integral membrane protein Apq12 is important for the assembly of functional nuclear pores. Cells lacking APQ12 are viable but cannot grow at low temperatures, have aberrant NPCs and a defect in mRNA export. Remarkably, these defects in NPC assembly can be overcome by supplementing cells with a membrane fluidizing agent, benzyl alcohol, suggesting that Apq12 impacts the flexibility of the nuclear membrane, possibly by adjusting its lipid composition when cells are shifted to a reduced temperature. Our new study now expands these findings and reveals that an essential membrane protein, Brr6, shares at least partially overlapping functions with Apq12 and is also required for assembly of functional NPCs. A third nuclear envelope membrane protein, Brl1, is related to Brr6, and is also required for NPC assembly. Because maintenance of membrane homeostasis is essential for cellular survival, the fact that these three proteins are conserved in fungi that undergo closed mitoses, but are not found in metazoans or plants, may indicate that their functions are performed by proteins unrelated at the primary sequence level to Brr6, Brl1 and Apq12 in cells that disassemble their nuclear envelopes during mitosis.

  6. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric was redesigned to capture subtle nuances of students' open-ended responses did open-ended items become a valid and reliable tool to assess students' knowledge integration ability.

  7. System Integration and Its Influence on the Quality of Life of Children with Complex Needs

    Directory of Open Access Journals (Sweden)

    Sandy Thurston

    2010-01-01

    Full Text Available Purpose. To explore the interactions between child and parent psychosocial factors and team integration variables that may explain improvements in the physical dimensions of the PedsQL quality of life of children with complex needs after 2 years. Methods. In this 2-year study, parents were identified by the Children's Treatment Network. Families were eligible if the child was aged 0–19 years, had physical limitations, resided in either Simcoe County or the Region of York, Ontario, and there were multiple other family needs. Regression analysis was used to explore associations and interactions (n = 110). Results. A child's physical quality of life was affected by interacting factors including the child's behavior, parenting, and integrated care. Statistically significant interactions between team integration, processes of care, and child/parent variables highlight the complexity of the rehabilitation approach in real-life situations. Conclusions. Rehabilitation providers working with children with complex needs and their families should also address child and parent problematic behaviors. When this was the case in highly integrated teams, the child's physical quality of life improved after two years.

  8. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
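
    For background on the Modbus-over-RS485 backbone the abstract describes (this is standard protocol framing, not code from the paper): a Modbus RTU request is a short binary frame with a CRC-16 appended, simple enough to sketch with the standard library. The unit id and register addresses below are hypothetical.

    ```python
    # Hypothetical sketch of a Modbus RTU "read holding registers" request,
    # the kind of frame an RS485 Modbus backbone carries. Not the LVPD code.
    import struct

    def crc16_modbus(data: bytes) -> int:
        """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001."""
        crc = 0xFFFF
        for byte in data:
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return crc

    def read_holding_registers(unit_id: int, start_addr: int, count: int) -> bytes:
        """Build a Modbus RTU request for function 0x03 (read holding registers)."""
        body = struct.pack(">BBHH", unit_id, 0x03, start_addr, count)
        return body + struct.pack("<H", crc16_modbus(body))  # CRC appended little-endian

    frame = read_holding_registers(0x11, 0x006B, 0x0003)
    print(frame.hex())  # 8-byte request; recomputing the CRC over it yields 0
    ```

    The zero-residual property of the appended CRC is what the receiving slave uses to validate the frame before acting on it.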

  10. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

    Full Text Available Today’s e-learning systems meet the challenge to provide interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Therefore, learning experiences enriched with complex didactic resources - such as virtualized collaborations and serious games - have emerged. In this extension of [1] an integrated model for e-assessment (IMA) is outlined, which incorporates complex learning resources and assessment forms as main components for the development of an enriched learning experience. For a validation the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation led to several refinements of the model, which mainly concern the component forms of assessment and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  11. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  12. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    International Nuclear Information System (INIS)

    Taylor, J'Tia Patrice; Shropshire, David E.

    2009-01-01

    This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in the reduction of carbon emissions by reducing demands for foreign-derived fossil fuels. The paper begins with an overview of nuclear reactors and process applications for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications, such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near-term high-temperature, and far-term high-temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time-dependent dynamic analysis is required. The paper identifies critical issues arising from the dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. Economic issues include the cost differential arising due to an integrated system

  13. New integrable structures in large-N QCD

    International Nuclear Information System (INIS)

    Ferretti, Gabriele; Heise, Rainer; Zarembo, Konstantin

    2004-01-01

    We study the anomalous dimensions of single trace operators composed of field strengths F_{μν} in large-N QCD. The matrix of anomalous dimensions is the Hamiltonian of a compact spin chain with two spin one representations at each vertex corresponding to the self-dual and anti-self-dual components of F_{μν}. Because of the special form of the interaction it is possible to study separately renormalization of purely self-dual components. In this sector the Hamiltonian is integrable and can be exactly solved by Bethe ansatz. Its continuum limit is described by the level two SU(2) Wess-Zumino-Witten model

  14. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  15. Large Variability in the Diversity of Physiologically Complex Surgical Procedures Exists Nationwide Among All Hospitals Including Among Large Teaching Hospitals.

    Science.gov (United States)

    Dexter, Franklin; Epstein, Richard H; Thenuwara, Kokila; Lubarsky, David A

    2017-11-22

    Multiple previous studies have shown that having a large diversity of procedures has a substantial impact on quality management of hospital surgical suites. At hospitals with substantial diversity, unless sophisticated statistical methods suitable for rare events are used, anesthesiologists working in surgical suites will have inaccurate predictions of surgical blood usage, case durations, cost accounting and price transparency, times remaining in late-running cases, and use of intraoperative equipment. What is unknown is whether large diversity is a feature of only a few very unique hospitals nationwide (eg, the largest hospitals in each state or province). The 2013 United States Nationwide Readmissions Database was used to study heterogeneity among 1981 hospitals in their diversities of physiologically complex surgical procedures (ie, the procedure codes). The diversity of surgical procedures performed at each hospital was quantified using a summary measure, the number of different physiologically complex surgical procedures commonly performed at the hospital (ie, 1/Herfindahl). A total of 53.9% of all hospitals had a 3-fold larger diversity (ie, >30 commonly performed physiologically complex procedures). Larger hospitals had greater diversity than the small- and medium-sized hospitals, and most large teaching hospitals commonly performed >30 procedures (lower 99% CL, 71.9% of hospitals). However, there was considerable variability among the large teaching hospitals in their diversity (interquartile range of the numbers of commonly performed physiologically complex procedures = 19.3; lower 99% CL, 12.8 procedures). The diversity of procedures represents a substantive differentiator among hospitals. Thus, the usefulness of statistical methods for operating room management should be expected to be heterogeneous among hospitals. Our results also show that "large teaching hospital" alone is an insufficient description for accurate prediction of the extent to which a hospital sustains the
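
    The 1/Herfindahl summary measure used in the abstract, the effective number of commonly performed procedure codes, is simple to illustrate. The case-count data below are made up, not the study's data.

    ```python
    # Illustrative computation of diversity as 1/Herfindahl
    # (inverse sum of squared case-count proportions).

    def effective_number_of_procedures(case_counts):
        """Return 1/Herfindahl: the effective number of procedure codes."""
        total = sum(case_counts)
        herfindahl = sum((c / total) ** 2 for c in case_counts)
        return 1 / herfindahl

    # A hospital doing four procedures equally often has diversity 4.0 ...
    print(effective_number_of_procedures([25, 25, 25, 25]))  # → 4.0
    # ... while concentration on one dominant procedure pulls diversity down.
    print(effective_number_of_procedures([85, 5, 5, 5]))
    ```

    The measure discounts rarely performed codes, which is why it suits the "commonly performed procedures" framing of the study.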

  16. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex...... sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study...

  17. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Roč. 33, September (2012), s. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords : neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  18. Survey of large protein complexes in D. vulgaris reveals great structural diversity

    Energy Technology Data Exchange (ETDEWEB)

    Han, B.-G.; Dong, M.; Liu, H.; Camp, L.; Geller, J.; Singer, M.; Hazen, T. C.; Choi, M.; Witkowska, H. E.; Ball, D. A.; Typke, D.; Downing, K. H.; Shatsky, M.; Brenner, S. E.; Chandonia, J.-M.; Biggin, M. D.; Glaeser, R. M.

    2009-08-15

    An unbiased survey has been made of the stable, most abundant multi-protein complexes in Desulfovibrio vulgaris Hildenborough (DvH) that are larger than Mr ~ 400 k. The quaternary structures for 8 of the 16 complexes purified during this work were determined by single-particle reconstruction of negatively stained specimens, a success rate ~10 times greater than that of previous 'proteomic' screens. In addition, the subunit compositions and stoichiometries of the remaining complexes were determined by biochemical methods. Our data show that the structures of only two of these large complexes, out of the 13 in this set that have recognizable functions, can be modeled with confidence based on the structures of known homologs. These results indicate that there is significantly greater variability in the way that homologous prokaryotic macromolecular complexes are assembled than has generally been appreciated. As a consequence, we suggest that relying solely on previously determined quaternary structures for homologous proteins may not be sufficient to properly understand their role in another cell of interest.

  19. A large class of solvable multistate Landau–Zener models and quantum integrability

    Science.gov (United States)

    Chernyak, Vladimir Y.; Sinitsyn, Nikolai A.; Sun, Chen

    2018-06-01

    The concept of quantum integrability has been introduced recently for quantum systems with explicitly time-dependent Hamiltonians (Sinitsyn et al 2018 Phys. Rev. Lett. 120 190402). Within multistate Landau–Zener (MLZ) theory, however, there has been a successful alternative approach to identifying and solving complex time-dependent models (Sinitsyn and Chernyak 2017 J. Phys. A: Math. Theor. 50 255203). Here we compare both methods by applying them to a new class of exactly solvable MLZ models. This class contains systems with an arbitrary number N of interacting states, in which the number of exact adiabatic energy crossing points, appearing at different moments of time, grows quickly with N. At each N, transition probabilities in these systems can be found analytically and exactly, but the complexity and variety of solutions in this class also grow quickly with N. We illustrate how common features of solvable MLZ systems emerge from quantum integrability and develop an approach to further classification of solvable MLZ problems.
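
    For orientation (standard textbook background, not a result of this paper): in the two-state limit that MLZ theory generalizes, the probability of remaining on a diabatic level after the sweep is given by the classic Landau–Zener formula

    ```latex
    P_{\mathrm{LZ}} = \exp\!\left(-\frac{2\pi g^{2}}{\hbar\,\lvert\dot{\varepsilon}\rvert}\right)
    ```

    where g is the constant coupling between the two diabatic levels and ε̇ is the sweep rate of their energy difference. Solvable MLZ models provide comparable closed-form transition probabilities for arbitrarily many interacting states.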

  20. Management systems for high reliability organizations. Integration and effectiveness; Managementsysteme fuer Hochzuverlaessigkeitsorganisationen. Integration und Wirksamkeit

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, Michael

    2015-03-09

    The scope of the thesis is the development of a method for improving efficient integrated management systems for high reliability organizations (HRO). A comprehensive analysis of severe accident prevention is performed. Severe accident management, mitigation measures and business continuity management are not included. High reliability organizations are complex and potentially dynamic organizational forms that can be inherently dangerous, such as nuclear power plants, offshore platforms, chemical facilities, large ships or large aircraft. A recursive generic management system model (RGM) was developed based on the following factors: systemic and cybernetic aspects; integration of different management fields; high decision quality; integration of efficient methods of safety and risk analysis; integration of human reliability aspects; and effectiveness evaluation and improvement.

  1. Multi-attribute integrated measurement of node importance in complex networks.

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measure of node importance in complex networks is very important to research on network stability and robustness; it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results reflect only certain aspects of a network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described in combination with the topological character of the network. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method to measure node importance. This method reflects both the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate and dolphin networks show that the integrated topology-based measure yields a narrower range of results than a single indicator and is more universally applicable. Experiments also show that attacking the North American power grid and the Internet with this method converges faster than with other methods.
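As a rough illustration of the idea of an integrated, multi-attribute importance score, the sketch below combines two of the indicators named above (degree centrality and the clustering coefficient) with assumed equal weights; the paper's actual weighting scheme, and its use of relative closeness centrality and topology potential, are not reproduced here.

```python
# Hypothetical sketch: combine per-node indicators into one integrated
# importance score. The equal weighting is an assumption for illustration.

def degree_centrality(adj):
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def clustering_coefficient(adj):
    coeff = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeff[v] = 0.0
            continue
        # Count edges among the neighbors of v
        links = sum(1 for i in nbrs for j in nbrs if i < j and j in adj[i])
        coeff[v] = 2 * links / (k * (k - 1))
    return coeff

def integrated_importance(adj, weights=(0.5, 0.5)):
    dc, cc = degree_centrality(adj), clustering_coefficient(adj)
    w_dc, w_cc = weights
    return {v: w_dc * dc[v] + w_cc * cc[v] for v in adj}

# Tiny undirected example graph (adjacency sets)
adj = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1},
    3: {0},
}
scores = integrated_importance(adj)
print(round(scores[1], 3))  # → 0.833
```

A real study would normalize each indicator over the whole network before mixing them, so that no single attribute dominates the combined score.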

  2. Defining Execution Viewpoints for a Large and Complex Software-Intensive System

    NARCIS (Netherlands)

    Callo Arias, Trosky B.; America, Pierre; Avgeriou, Paris

    2009-01-01

    An execution view is an important asset for developing large and complex systems. An execution view helps practitioners to describe, analyze, and communicate what a software system does at runtime and how it does it. In this paper, we present an approach to define execution viewpoints for an

  3. Large eddy simulation modeling of particle-laden flows in complex terrain

    Science.gov (United States)

    Salesky, S.; Giometto, M. G.; Chamecki, M.; Lehning, M.; Parlange, M. B.

    2017-12-01

    The transport, deposition, and erosion of heavy particles over complex terrain in the atmospheric boundary layer is an important process for hydrology, air quality forecasting, biology, and geomorphology. However, in situ observations can be challenging in complex terrain due to spatial heterogeneity. Furthermore, there is a need to develop numerical tools that can accurately represent the physics of these multiphase flows over complex surfaces. We present a new numerical approach to accurately model the transport and deposition of heavy particles in complex terrain using large eddy simulation (LES). Particle transport is represented through solution of the advection-diffusion equation including terms that represent gravitational settling and inertia. The particle conservation equation is discretized in a cut-cell finite volume framework in order to accurately enforce mass conservation. Simulation results will be validated with experimental data, and numerical considerations required to enforce boundary conditions at the surface will be discussed. Applications will be presented in the context of snow deposition and transport, as well as urban dispersion.
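The particle transport described above can be sketched with a generic filtered advection-diffusion equation including a gravitational settling term (a standard LES form; the paper's exact equation, including its inertia terms and cut-cell discretization, may differ):

```latex
\frac{\partial \bar{C}}{\partial t}
  + \frac{\partial (\bar{u}_j \bar{C})}{\partial x_j}
  - w_s \frac{\partial \bar{C}}{\partial x_3}
  = \frac{\partial}{\partial x_j}\!\left( \frac{\nu_t}{Sc_t}\,
      \frac{\partial \bar{C}}{\partial x_j} \right)
```

where \(\bar{C}\) is the filtered particle concentration, \(w_s\) the settling velocity (positive downward, acting along the vertical \(x_3\)), and \(\nu_t/Sc_t\) a subgrid eddy diffusivity.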

  4. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    Full Text Available The vast majority of research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality, and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. The graphical evaluation and review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze simulation outputs. Finally, a complex product systems R&D project is modeled by the proposed approach as an example, and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. Integrating cost and schedule risk assessment appears to provide more reliable risk estimation results.
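A minimal Monte Carlo sketch of the core idea, sampling cost and schedule jointly rather than independently, is shown below; the activity network, distributions, cost rates, and targets are invented for illustration and are not the paper's GERT model.

```python
import random

# Hedged sketch: activity durations are sampled, and cost is modeled as
# partly duration-dependent, so cost risk and schedule risk are coupled.
# All distributions and parameter values are illustrative assumptions.

random.seed(42)

SCHEDULE_TARGET = 35.0    # days
COST_TARGET = 570.0       # monetary units

def simulate_project(n_runs=10_000):
    joint_overruns = 0
    for _ in range(n_runs):
        # Three sequential activities with triangular(low, high, mode) durations
        durations = [random.triangular(8, 16, 10),
                     random.triangular(5, 12, 7),
                     random.triangular(10, 20, 14)]
        schedule = sum(durations)
        # Cost couples to duration: fixed part + rate * duration + noise
        cost = sum(50 + 12 * d + random.gauss(0, 10) for d in durations)
        if schedule > SCHEDULE_TARGET and cost > COST_TARGET:
            joint_overruns += 1
    return joint_overruns / n_runs

p_joint = simulate_project()
print(f"P(joint cost & schedule overrun) = {p_joint:.3f}")
```

Because cost is a function of the sampled durations, the estimated joint overrun probability differs from the product of the two marginal overrun probabilities, which is the point of assessing the risks together.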

  5. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  6. Complex singularities of the critical potential in the large-N limit

    International Nuclear Information System (INIS)

    Meurice, Y.

    2003-01-01

    We show with two numerical examples that the conventional expansion in powers of the field for the critical potential of 3-dimensional O(N) models in the large-N limit does not converge for values of φ² larger than some critical value. This can be explained by the existence of conjugated branch points in the complex φ² plane. Padé approximants [L+3/L] for the critical potential apparently converge at large φ². This allows high-precision calculation of the fixed point in a more suitable set of coordinates. We argue that the singularities are generic and not an artifact of the large-N limit. We show that ignoring these singularities may lead to inaccurate approximations.

  7. Sedimentation Velocity Analysis of Large Oligomeric Chromatin Complexes Using Interference Detection.

    Science.gov (United States)

    Rogge, Ryan A; Hansen, Jeffrey C

    2015-01-01

    Sedimentation velocity experiments measure the transport of molecules in solution under centrifugal force. Here, we describe a method for monitoring the sedimentation of very large biological molecular assemblies using the interference optical systems of the analytical ultracentrifuge. The mass, partial specific volume, and shape of macromolecules in solution affect their sedimentation rates as reflected in the sedimentation coefficient. The sedimentation coefficient is obtained by measuring the solute concentration as a function of radial distance during centrifugation. Monitoring the concentration can be accomplished using interference optics, absorbance optics, or the fluorescence detection system, each with inherent advantages. The interference optical system captures data much faster than these other optical systems, allowing for sedimentation velocity analysis of extremely large macromolecular complexes that sediment rapidly at very low rotor speeds. Supramolecular oligomeric complexes produced by self-association of 12-mer chromatin fibers are used to illustrate the advantages of the interference optics. Using interference optics, we show that chromatin fibers self-associate at physiological divalent salt concentrations to form structures that sediment between 10,000 and 350,000 S. The method for characterizing chromatin oligomers described in this chapter will be generally useful for characterization of any biological structures that are too large to be studied by the absorbance optical system. © 2015 Elsevier Inc. All rights reserved.
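The relation between boundary position and the sedimentation coefficient can be sketched as s = [d(ln r)/dt] / ω²; the snippet below applies it to invented boundary positions (not data from the chapter) at a low rotor speed typical for very large complexes.

```python
import math

# Minimal sketch of extracting a sedimentation coefficient from two
# boundary positions recorded during a velocity run:
#   s = ln(r2/r1) / (omega^2 * (t2 - t1))
# All numerical values below are illustrative assumptions.

def sedimentation_coefficient(r1_cm, r2_cm, t1_s, t2_s, rpm):
    omega = 2 * math.pi * rpm / 60.0          # angular velocity, rad/s
    return math.log(r2_cm / r1_cm) / (omega ** 2 * (t2_s - t1_s))

# Boundary moves from 6.2 to 6.5 cm over 10 minutes at 3,000 rpm
s_seconds = sedimentation_coefficient(6.2, 6.5, 0.0, 600.0, 3000)
print(f"{s_seconds / 1e-13:.0f} S")  # 1 Svedberg (S) = 1e-13 s; prints "7980 S"
```

Values in the thousands of Svedbergs at low rotor speed are exactly the regime where the fast data capture of the interference optics matters.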

  8. A note on U(N) integrals in the large N limit

    International Nuclear Information System (INIS)

    O'Brien, K.H.; Zuber, J.B.

    1984-01-01

    The U(N) integral ∫DU exp[N tr(UJ + U*J*)] = exp(N²W) is reconsidered in the large N limit and the coefficients of the expansion of W in the moments of the eigenvalues of (JJ*) explicitly computed. (orig.)

  9. HOW TO AVOID GIVING THE RIGHT ANSWERS TO THE WRONG QUESTIONS: THE NEED FOR INTEGRATED ASSESSMENTS OF COMPLEX HEALTH TECHNOLOGIES.

    Science.gov (United States)

    Gerhardus, Ansgar; Oortwijn, Wija; van der Wilt, Gert Jan

    2017-01-01

    Health technologies are becoming increasingly complex and contemporary health technology assessment (HTA) is only partly equipped to address this complexity. The project "Integrated assessments of complex health technologies" (INTEGRATE-HTA), funded by the European Commission, was initiated with the overall objective to develop concepts and methods to enable patient-centered, integrated assessments of the effectiveness, and the economic, social, cultural, and ethical issues of complex technologies that take context and implementation issues into account. The project resulted in a series of guidances that should support the work of HTA scientists and decision makers alike.

  10. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  11. Universal properties of type IIB and F-theory flux compactifications at large complex structure

    International Nuclear Information System (INIS)

    Marsh, M.C. David; Sousa, Kepa

    2016-01-01

    We consider flux compactifications of type IIB string theory and F-theory in which the respective superpotentials at large complex structure are dominated by cubic or quartic terms in the complex structure moduli. In this limit, the low-energy effective theory exhibits universal properties that are insensitive to the details of the compactification manifold or the flux configuration. Focussing on the complex structure and axio-dilaton sector, we show that there are no vacua in this region and that the spectrum of the Hessian matrix is highly peaked, consisting of only three distinct eigenvalues (0, 2m_{3/2}^2 and 8m_{3/2}^2), independently of the number of moduli. We briefly comment on how the inclusion of Kähler moduli affects these findings. Our results generalise those of Brodie & Marsh http://dx.doi.org/10.1007/JHEP01(2016)037, in which these universal properties were found in a subspace of the large complex structure limit of type IIB compactifications.

  12. Investigation on the integral output power model of a large-scale wind farm

    Institute of Scientific and Technical Information of China (English)

    BAO Nengsheng; MA Xiuqian; NI Weidou

    2007-01-01

    The integral output power model of a large-scale wind farm is needed when estimating the wind farm's output over a period of time in the future. The actual wind speed power model and calculation method of a wind farm made up of many wind turbine units are discussed. After analyzing the incoming wind flow characteristics and their energy distributions, and after considering the multi-unit effects among the wind turbines under certain assumptions, the incoming wind flow model for multiple units is built. The calculation algorithms and steps of the integral output power model of a large-scale wind farm are provided. Finally, an actual power output of the wind farm is calculated and analyzed using practical wind speed measurement data. The characteristics of a large-scale wind farm are also discussed.
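The idea of aggregating unit outputs into an integral farm output can be sketched as below; the cubic power-curve shape, the uniform wake-deficit factor standing in for the multi-unit interaction effects, and all parameter values are assumptions, not the paper's model.

```python
# Illustrative sketch: farm output as the sum of unit outputs, with a
# crude uniform wake deficit representing turbine-turbine interaction.

def turbine_power(v, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, p_rated=2.0):
    """Power (MW) of one turbine at hub-height wind speed v (m/s)."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    # Assumed cubic ramp between cut-in and rated speed
    return p_rated * ((v - v_cut_in) / (v_rated - v_cut_in)) ** 3

def farm_output(v_free, n_turbines=50, wake_deficit=0.08):
    """Total farm power (MW), derating all units by a uniform wake factor."""
    v_eff = v_free * (1.0 - wake_deficit)
    return n_turbines * turbine_power(v_eff)

print(f"{farm_output(10.0):.1f} MW")
```

A faithful model would replace the uniform deficit with a layout-dependent wake model and integrate the output over a measured wind-speed time series, as the paper does with practical measurement data.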

  13. Integrative shell of the program complex MARS (Version 1.0) radiation transfer in three-dimensional geometries

    International Nuclear Information System (INIS)

    Degtyarev, I.I.; Lokhovitskij, A.E.; Maslov, M.A.; Yazynin, I.A.

    1994-01-01

    The first version of the integrative shell of the program complex MARS has been written for calculating radiation transfer in three-dimensional geometries. The integrative shell allows the user to work conveniently with the MARS complex, create input data files, and obtain graphic visualization of calculated functions. Version 1.0 is adapted for IBM-286, 386 and 486 personal computers with at least 500K of main memory. 5 refs

  14. Integrated complex care model: lessons learned from inter-organizational partnership.

    Science.gov (United States)

    Cohen, Eyal; Bruce-Barrett, Cindy; Kingsnorth, Shauna; Keilty, Krista; Cooper, Anna; Daub, Stacey

    2011-01-01

    Providing integrated care for children with medical complexity in Canada is challenging as these children are, by definition, in need of coordinated care from disparate providers, organizations and funders across the continuum in order to optimize health outcomes. We describe the development of an inter-organizational team constructed as a unique tripartite partnership of an acute care hospital, a children's rehabilitation hospital and a home/community health organization focused on children who frequently use services across these three organizations. Model building and operationalization within the Canadian healthcare system is emphasized. Key challenges identified to date include communication and policy barriers as well as optimizing interactions with families; critical enablers have been alignment with policy trends in healthcare and inter-organizational commitment to integrate at the point of care. Considerations for policy developments supporting full integration across service sectors are raised. Early indicators of success include the enrolment of 34 clients and patients and the securing of funds to evaluate and expand the model to serve more children.

  15. Studies on combined model based on functional objectives of large scale complex engineering

    Science.gov (United States)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    As large scale complex engineering includes various functions, and each function is accomplished through the completion of one or more projects, the combined projects affecting each function should be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio-projects techniques based on functional objectives were introduced; we then studied and proposed principles for applying these techniques. In addition, the processes of combined projects were also constructed. With the help of portfolio-projects techniques based on the functional objectives of projects, our research findings lay a good foundation for portfolio management of large scale complex engineering.

  16. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical- and medicinal chemistry including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
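As a toy illustration of a partition-based measure in the spirit of the topological information content discussed above, the snippet below computes the Shannon entropy of a vertex partition; it partitions vertices by degree as a simple stand-in for the orbit partition used in the classical definition.

```python
import math
from collections import Counter

# Sketch of a partition-based graph complexity measure: the entropy of
# a vertex partition. Partitioning by degree is a simplification; the
# classical topological information content uses vertex orbits.

def partition_entropy(adj):
    n = len(adj)
    classes = Counter(len(nbrs) for nbrs in adj.values())
    return -sum((k / n) * math.log2(k / n) for k in classes.values())

# Path graph P4: degrees are 1, 2, 2, 1 -> two classes of size 2
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(partition_entropy(path4))  # → 1.0 (bits)
```

A highly symmetric graph (all vertices in one class) scores 0, while a graph with all vertices structurally distinct scores log2(n), which is one way such measures discriminate chemical structures.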

  17. Two-scale large deviations for chemical reaction kinetics through second quantization path integral

    International Nuclear Information System (INIS)

    Li, Tiejun; Lin, Feng

    2016-01-01

    Motivated by the study of rare events for a typical genetic switching model in systems biology, in this paper we aim to establish the general two-scale large deviations for chemical reaction systems. We build a formal approach to explicitly obtain the large deviation rate functionals for the considered two-scale processes based upon the second quantization path integral technique. We get three important types of large deviation results when the underlying two timescales are in three different regimes. This is realized by singular perturbation analysis to the rate functionals obtained by the path integral. We find that the three regimes possess the same deterministic mean-field limit but completely different chemical Langevin approximations. The obtained results are natural extensions of the classical large volume limit for chemical reactions. We also discuss its implication on the single-molecule Michaelis–Menten kinetics. Our framework and results can be applied to understand general multi-scale systems including diffusion processes. (paper)

  18. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    Energy Technology Data Exchange (ETDEWEB)

    J' Tia Patrice Taylor; David E. Shropshire

    2009-09-01

    Abstract This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. 
Economic issues include the cost differential arising due to an integrated

  19. Spacer capture and integration by a type I-F Cas1-Cas2-3 CRISPR adaptation complex.

    Science.gov (United States)

    Fagerlund, Robert D; Wilkinson, Max E; Klykov, Oleg; Barendregt, Arjan; Pearce, F Grant; Kieper, Sebastian N; Maxwell, Howard W R; Capolupo, Angela; Heck, Albert J R; Krause, Kurt L; Bostina, Mihnea; Scheltema, Richard A; Staals, Raymond H J; Fineran, Peter C

    2017-06-27

    CRISPR-Cas adaptive immune systems capture DNA fragments from invading bacteriophages and plasmids and integrate them as spacers into bacterial CRISPR arrays. In type I-E and II-A CRISPR-Cas systems, this adaptation process is driven by Cas1-Cas2 complexes. Type I-F systems, however, contain a unique fusion of Cas2, with the type I effector helicase and nuclease for invader destruction, Cas3. By using biochemical, structural, and biophysical methods, we present a structural model of the 400-kDa Cas1₄-Cas2-3₂ complex from Pectobacterium atrosepticum with bound protospacer substrate DNA. Two Cas1 dimers assemble on a Cas2 domain dimeric core, which is flanked by two Cas3 domains forming a groove where the protospacer binds to Cas1-Cas2. We developed a sensitive in vitro assay and demonstrated that Cas1-Cas2-3 catalyzed spacer integration into CRISPR arrays. The integrase domain of Cas1 was necessary, whereas integration was independent of the helicase or nuclease activities of Cas3. Integration required at least partially duplex protospacers with free 3'-OH groups, and leader-proximal integration was stimulated by integration host factor. In a coupled capture and integration assay, Cas1-Cas2-3 processed and integrated protospacers independent of Cas3 activity. These results provide insight into the structure of protospacer-bound type I Cas1-Cas2-3 adaptation complexes and their integration mechanism.

  20. VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration

    Science.gov (United States)

    Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.

    2012-09-01

    VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO's development started in the Virtual Observatory framework. VisIVO allows users to create meaningful visualizations of highly complex, large-scale datasets and movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop, a stand-alone application for interactive visualization on standard PCs; VisIVO Server, a platform for high-performance visualization; VisIVO Web, a custom-designed web portal; VisIVOSmartphone, an application that exploits the VisIVO Server functionality; and the latest VisIVO feature: the VisIVO Library, which allows a job running on a computational system (grid, HPC, etc.) to produce movies directly from the code's internal data arrays without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running on a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-Inspire projects. Depending on the structure and size of the datasets under consideration, the data exploration process could take several hours of CPU time for creating customized views, and the production of movies could potentially last several days. For this reason an MPI parallel version of VisIVO could play a fundamental role in increasing performance, e.g. it could be automatically deployed on nodes that are MPI aware.
A central concept in our development is thus to

  1. On-chip multi-wavelength laser sources fabricated using generic photonic integration technology

    NARCIS (Netherlands)

    Latkowski, S.; Williams, K.A.; Bente, E.A.J.M.

    Generic photonic integration technology platforms allow for the design and fabrication of highly complex, application-specific photonic integrated circuits. Monolithic active-passive integration on an indium phosphide substrate naturally enables reliable co-integration of optical gain elements and

  2. Integrated versus fragmented implementation of complex innovations in acute health care

    Science.gov (United States)

    Woiceshyn, Jaana; Blades, Kenneth; Pendharkar, Sachin R.

    2017-01-01

    Background: Increased demand and escalating costs necessitate innovation in health care. The challenge is to implement complex innovations—those that require coordinated use across the adopting organization to have the intended benefits. Purpose: We wanted to understand why and how two of five similar hospitals associated with the same health care authority made more progress with implementing a complex inpatient discharge innovation whereas the other three experienced more difficulties in doing so. Methodology: We conducted a qualitative comparative case study of the implementation process at five comparable urban hospitals adopting the same inpatient discharge innovation mandated by their health care authority. We analyzed documents and conducted 39 interviews of the health care authority and hospital executives and frontline managers across the five sites over a 1-year period while the implementation was ongoing. Findings: In two and a half years, two of the participating hospitals had made significant progress with implementing the innovation and had begun to realize benefits; they exemplified an integrated implementation mode. Three sites had made minimal progress, following a fragmented implementation mode. In the former mode, a semiautonomous health care organization developed a clear overall purpose and chose one umbrella initiative to implement it. The integrative initiative subsumed the rest and guided resource allocation and the practices of hospital executives, frontline managers, and staff who had bought into it. In contrast, in the fragmented implementation mode, the health care authority had several overlapping, competing innovations that overwhelmed the sites and impeded their implementation. Practice Implications: Implementing a complex innovation across hospital sites required (a) early prioritization of one initiative as integrative, (b) the commitment of additional (traded off or new) human resources, (c) deliberate upfront planning and

  3. Complexity and network dynamics in physiological adaptation: an integrated view.

    Science.gov (United States)

    Baffy, György; Loscalzo, Joseph

    2014-05-28

    Living organisms constantly interact with their surroundings and sustain internal stability against perturbations. This dynamic process follows three fundamental strategies (restore, explore, and abandon) articulated in historical concepts of physiological adaptation such as homeostasis, allostasis, and the general adaptation syndrome. These strategies correspond to elementary forms of behavior (ordered, chaotic, and static) in complex adaptive systems and invite a network-based analysis of the operational characteristics, allowing us to propose an integrated framework of physiological adaptation from a complex network perspective. Applicability of this concept is illustrated by analyzing molecular and cellular mechanisms of adaptation in response to the pervasive challenge of obesity, a chronic condition resulting from sustained nutrient excess that prompts chaotic exploration for system stability associated with tradeoffs and a risk of adverse outcomes such as diabetes, cardiovascular disease, and cancer. Deconstruction of this complexity holds the promise of gaining novel insights into physiological adaptation in health and disease. Published by Elsevier Inc.

  4. Integration of complex-wide mixed low-level waste activities for program acceleration and optimization

    International Nuclear Information System (INIS)

    McKenney, D.E.

    1998-01-01

    In July 1996, the US Department of Energy (DOE) chartered a contractor-led effort to develop a suite of technically defensible, integrated alternatives which would allow the Environmental Management program to accomplish its mission objectives in an accelerated fashion and at a reduced cost. These alternatives, or opportunities, could then be evaluated by DOE and stakeholders for possible implementation, given that precursor requirements (regulatory changes, etc.) could be met and benefits to the Complex realized. This contractor effort initially focused on six waste types, one of which was Mixed Low-Level Waste (MLLW). Many opportunities were identified by the contractor team for integrating MLLW activities across the DOE Complex. These opportunities were further narrowed to six that had the most promise for implementation and savings to the DOE Complex. The opportunities include six items: (1) the consolidation of individual site analytical services procurement efforts, (2) the consolidation of individual site MLLW treatment services procurement efforts, (3) establishment of "de minimis" radioactivity levels, (4) standardization of characterization requirements, (5) increased utilization of existing DOE treatment facilities, and (6) using a combination of DOE and commercial MLLW disposal capacity. The results of the integration effort showed that by managing MLLW activities across the DOE Complex as a cohesive unit rather than as independent site efforts, the DOE could improve the rate of progress toward meeting its objectives and reduce its overall MLLW program costs. Savings potential for MLLW, if the identified opportunities could be implemented, could total $224 million or more. Implementation of the opportunities also could result in the acceleration of the MLLW "work-off schedule" across the DOE Complex by five years.

  5. Symplectic integrators for large scale molecular dynamics simulations: A comparison of several explicit methods

    International Nuclear Information System (INIS)

    Gray, S.K.; Noid, D.W.; Sumpter, B.G.

    1994-01-01

    We test the suitability of a variety of explicit symplectic integrators for molecular dynamics calculations on Hamiltonian systems. These integrators are extremely simple algorithms with low memory requirements, and appear to be well suited for large scale simulations. We first apply all the methods to a simple test case using the ideas of Berendsen and van Gunsteren. We then use the integrators to generate long time trajectories of a 1000 unit polyethylene chain. Calculations are also performed with two popular but nonsymplectic integrators. The most efficient integrators of the set investigated are deduced. We also discuss certain variations on the basic symplectic integration technique
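As a minimal illustration of the class of methods this record compares (not one of the paper's specific integrators), the sketch below propagates a harmonic oscillator with the velocity Verlet scheme, the simplest widely used explicit symplectic integrator; all names and parameters are illustrative.

```python
import numpy as np

def velocity_verlet(q, p, force, dt, n_steps, mass=1.0):
    """Propagate (q, p) with the symplectic velocity Verlet scheme."""
    traj = [(q, p)]
    f = force(q)
    for _ in range(n_steps):
        p_half = p + 0.5 * dt * f       # half kick
        q = q + dt * p_half / mass      # drift
        f = force(q)
        p = p_half + 0.5 * dt * f       # half kick
        traj.append((q, p))
    return np.array(traj)

# Harmonic oscillator, H = p^2/2 + q^2/2: a symplectic scheme keeps the
# energy error bounded over long trajectories instead of drifting secularly.
traj = velocity_verlet(q=1.0, p=0.0, force=lambda q: -q, dt=0.01, n_steps=10000)
energy = 0.5 * traj[:, 1] ** 2 + 0.5 * traj[:, 0] ** 2
```

The bounded energy error, with no secular drift, is the property that makes such low-memory explicit schemes attractive for the long polyethylene-chain trajectories discussed in the abstract.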

  6. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    Science.gov (United States)

    Fekete, Tamás

    2018-05-01

Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific engineering paradigm, structural integrity, has been developing; it is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is due to the fact that already existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim to review and evaluate current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well

  7. Understanding large multiprotein complexes: applying a multiple allosteric networks model to explain the function of the Mediator transcription complex.

    Science.gov (United States)

    Lewis, Brian A

    2010-01-15

    The regulation of transcription and of many other cellular processes involves large multi-subunit protein complexes. In the context of transcription, it is known that these complexes serve as regulatory platforms that connect activator DNA-binding proteins to a target promoter. However, there is still a lack of understanding regarding the function of these complexes. Why do multi-subunit complexes exist? What is the molecular basis of the function of their constituent subunits, and how are these subunits organized within a complex? What is the reason for physical connections between certain subunits and not others? In this article, I address these issues through a model of network allostery and its application to the eukaryotic RNA polymerase II Mediator transcription complex. The multiple allosteric networks model (MANM) suggests that protein complexes such as Mediator exist not only as physical but also as functional networks of interconnected proteins through which information is transferred from subunit to subunit by the propagation of an allosteric state known as conformational spread. Additionally, there are multiple distinct sub-networks within the Mediator complex that can be defined by their connections to different subunits; these sub-networks have discrete functions that are activated when specific subunits interact with other activator proteins.

  8. The magnetic g-tensors for ion complexes with large spin-orbit coupling

    International Nuclear Information System (INIS)

    Chang, P.K.L.; Liu, Y.S.

    1977-01-01

A nonperturbative method for calculating the magnetic g-tensors is presented and discussed for complexes of transition metal ions of large spin-orbit coupling, in the ground term ²D. A numerical example for CuCl₂·2H₂O is given.

  9. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design-processes traditional approaches to process design may no longer suffice. The design literature gives quite some design process models. As

  10. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  11. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    Science.gov (United States)

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  12. Managing complexity insights, concepts, applications

    CERN Document Server

    Helbing, Dirk

    2007-01-01

    Each chapter in Managing Complexity focuses on analyzing real-world complex systems and transferring knowledge from the complex-systems sciences to applications in business, industry and society. The interdisciplinary contributions range from markets and production through logistics, traffic control, and critical infrastructures, up to network design, information systems, social conflicts and building consensus. They serve to raise readers' awareness concerning the often counter-intuitive behavior of complex systems and to help them integrate insights gained in complexity research into everyday planning, decision making, strategic optimization, and policy. Intended for a broad readership, the contributions have been kept largely non-technical and address a general, scientifically literate audience involved in corporate, academic, and public institutions.

  13. MacroBac: New Technologies for Robust and Efficient Large-Scale Production of Recombinant Multiprotein Complexes.

    Science.gov (United States)

    Gradia, Scott D; Ishida, Justin P; Tsai, Miaw-Sheue; Jeans, Chris; Tainer, John A; Fuss, Jill O

    2017-01-01

    Recombinant expression of large, multiprotein complexes is essential and often rate limiting for determining structural, biophysical, and biochemical properties of DNA repair, replication, transcription, and other key cellular processes. Baculovirus-infected insect cell expression systems are especially well suited for producing large, human proteins recombinantly, and multigene baculovirus systems have facilitated studies of multiprotein complexes. In this chapter, we describe a multigene baculovirus system called MacroBac that uses a Biobricks-type assembly method based on restriction and ligation (Series 11) or ligation-independent cloning (Series 438). MacroBac cloning and assembly is efficient and equally well suited for either single subcloning reactions or high-throughput cloning using 96-well plates and liquid handling robotics. MacroBac vectors are polypromoter with each gene flanked by a strong polyhedrin promoter and an SV40 poly(A) termination signal that minimize gene order expression level effects seen in many polycistronic assemblies. Large assemblies are robustly achievable, and we have successfully assembled as many as 10 genes into a single MacroBac vector. Importantly, we have observed significant increases in expression levels and quality of large, multiprotein complexes using a single, multigene, polypromoter virus rather than coinfection with multiple, single-gene viruses. Given the importance of characterizing functional complexes, we believe that MacroBac provides a critical enabling technology that may change the way that structural, biophysical, and biochemical research is done. © 2017 Elsevier Inc. All rights reserved.

  14. Integrating water and agricultural management: collaborative governance for a complex policy problem.

    Science.gov (United States)

    Fish, Rob D; Ioris, Antonio A R; Watson, Nigel M

    2010-11-01

    This paper examines governance requirements for integrating water and agricultural management (IWAM). The institutional arrangements for the agriculture and water sectors are complex and multi-dimensional, and integration cannot therefore be achieved through a simplistic 'additive' policy process. Effective integration requires the development of a new collaborative approach to governance that is designed to cope with scale dependencies and interactions, uncertainty and contested knowledge, and interdependency among diverse and unequal interests. When combined with interdisciplinary research, collaborative governance provides a viable normative model because of its emphasis on reciprocity, relationships, learning and creativity. Ultimately, such an approach could lead to the sorts of system adaptations and transformations that are required for IWAM. Copyright © 2009 Elsevier B.V. All rights reserved.

  15. Does reef architectural complexity influence resource availability for a large reef-dwelling invertebrate?

    Science.gov (United States)

    Lozano-Álvarez, Enrique; Luviano-Aparicio, Nelia; Negrete-Soto, Fernando; Barradas-Ortiz, Cecilia; Aguíñiga-García, Sergio; Morillo-Velarde, Piedad S.; Álvarez-Filip, Lorenzo; Briones-Fourzán, Patricia

    2017-10-01

    In coral reefs, loss of architectural complexity and its associated habitat degradation is expected to affect reef specialists in particular due to changes in resource availability. We explored whether these features could potentially affect populations of a large invertebrate, the spotted spiny lobster Panulirus guttatus, which is an obligate Caribbean coral reef-dweller with a limited home range. We selected two separate large coral reef patches in Puerto Morelos (Mexico) that differed significantly in structural complexity and level of degradation, as assessed via the rugosity index, habitat assessment score, and percent cover of various benthic components. On each reef, we estimated density of P. guttatus and sampled lobsters to analyze their stomach contents, three different condition indices, and stable isotopes (δ15N and δ13C) in muscle. Lobster density did not vary with reef, suggesting that available crevices in the less complex patch still provided adequate refuge to these lobsters. Lobsters consumed many food types, dominated by mollusks and crustaceans, but proportionally more crustaceans (herbivore crabs) in the less complex patch, which had more calcareous macroalgae and algal turf. Lobsters from both reefs had a similar condition (all three indices) and mean δ15N, suggesting a similar quality of diet between reefs related to their opportunistic feeding, but differed in mean δ13C values, reflecting the different carbon sources between reefs and providing indirect evidence of individuals of P. guttatus foraging exclusively over their home reef. Overall, we found no apparent effects of architectural complexity, at least to the degree observed in our less complex patch, on density, condition, or trophic level of P. guttatus.

  16. Complex Behavior in an Integrate-and-Fire Neuron Model Based on Small World Networks

    International Nuclear Information System (INIS)

    Lin Min; Chen Tianlun

    2005-01-01

Based on our previous pulse-coupled integrate-and-fire neuron model on small-world networks, we investigate the complex behavior of electroencephalographic (EEG)-like activities produced by such a model. We find that EEG-like activities have obvious chaotic characteristics. We also analyze the complex behaviors of EEG-like signals, such as spectral analysis, reconstruction of the phase space, the correlation dimension, and so on.
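The model in this record is a pulse-coupled network on a small-world graph; as a much simpler sketch of its single-neuron building block only, the snippet below simulates one leaky integrate-and-fire unit with parameters chosen purely for illustration.

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + I(t).
    The membrane potential is reset (a 'spike') whenever it crosses threshold."""
    v = v_rest
    voltages, spike_times = [], []
    for t, i in enumerate(i_ext):
        v += dt / tau * (-(v - v_rest) + i)  # forward-Euler membrane update
        if v >= v_thresh:
            spike_times.append(t)
            v = v_reset
        voltages.append(v)
    return np.array(voltages), spike_times

# A constant suprathreshold drive produces regular spiking.
current = np.full(1000, 1.5)
voltage, spike_times = simulate_lif(current)
```

In the network setting studied in the paper, each spike would additionally deliver a pulse to the neuron's neighbors on the small-world graph, which is where the chaotic EEG-like dynamics arise.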

  17. A large deviation principle in Hölder norm for multiple fractional integrals

    OpenAIRE

    Sanz-Solé, Marta; Torrecilla-Tarantino, Iván

    2007-01-01

For a fractional Brownian motion $B^H$ with Hurst parameter $H \in (1/4, 1/2) \cup (1/2, 1)$, multiple indefinite integrals on a simplex are constructed and the regularity of their sample paths is studied. Then, it is proved that the family of probability laws of the processes obtained by replacing $B^H$ by $\epsilon^{1/2} B^H$ satisfies a large deviation principle in Hölder norm. The definition of the multiple integrals relies upon a representation of the fractional Brownian motion in t...

  18. LEGO-NMR spectroscopy: a method to visualize individual subunits in large heteromeric complexes.

    Science.gov (United States)

    Mund, Markus; Overbeck, Jan H; Ullmann, Janina; Sprangers, Remco

    2013-10-18

Seeing the big picture: Asymmetric macromolecular complexes that are NMR active in only a subset of their subunits can be prepared, thus decreasing NMR spectral complexity. For the hetero-heptameric LSm1-7 and LSm2-8 rings, NMR spectra of the individual subunits of the complete complex are obtained, showing a conserved RNA binding site. This LEGO-NMR technique makes large asymmetric complexes accessible to detailed NMR spectroscopic studies. © 2013 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.

  19. An Exploratory Study into Perceived Task Complexity, Topic Specificity and Usefulness for Integrated Search

    DEFF Research Database (Denmark)

    Ingwersen, Peter; Lioma, Christina; Larsen, Birger

    2012-01-01

We investigate the relations between user perceptions of work task complexity, topic specificity, and usefulness of retrieved results. 23 academic researchers submitted detailed descriptions of 65 real-life work tasks in the physics domain, and assessed documents retrieved from an integrated collection consisting of full text research articles in PDF, abstracts, and bibliographic records [6]. Bibliographic records were found to be more precise than full text PDFs, regardless of task complexity and topic specificity. PDFs were found to be more useful. Overall, for higher task complexity and topic...

  20. Overview of the ITER Tokamak complex building and integration of plant systems toward construction

    Energy Technology Data Exchange (ETDEWEB)

    Cordier, Jean-Jacques, E-mail: jean-jacques.cordier@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Bak, Joo-Shik [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Baudry, Alain [Engage Consortium, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Benchikhoune, Magali [Fusion For Energy (F4E), c/ Josep Pla, n.2, Torres Diagonal Litoral, E-08019 Barcelona (Spain); Carafa, Leontin; Chiocchio, Stefano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Darbour, Romaric [Fusion For Energy (F4E), c/ Josep Pla, n.2, Torres Diagonal Litoral, E-08019 Barcelona (Spain); Elbez, Joelle; Di Giuseppe, Giovanni; Iwata, Yasuhiro; Jeannoutot, Thomas; Kotamaki, Miikka; Kuehn, Ingo; Lee, Andreas; Levesy, Bruno; Orlandi, Sergio [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Packer, Rachel [Engage Consortium, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Patisson, Laurent; Reich, Jens; Rigoni, Giuliano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); and others

    2015-10-15

    at mid-2014. The paper gives an overview of the final configuration of the ITER nuclear buildings and highlights the large progress made on the final integration of the plant systems in the Tokamak complex. The revised design of the Tokamak machine supporting system is also described.

  1. Solving very large scattering problems using a parallel PWTD-enhanced surface integral equation solver

    KAUST Repository

    Liu, Yang

    2013-07-01

The computational complexity and memory requirements of multilevel plane wave time domain (PWTD)-accelerated marching-on-in-time (MOT)-based surface integral equation (SIE) solvers scale as O(N_t N_s log² N_s) and O(N_s^1.5); here N_t and N_s denote the numbers of temporal and spatial basis functions discretizing the current [Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003]. In the past, serial versions of these solvers have been successfully applied to the analysis of scattering from perfect electrically conducting as well as homogeneous penetrable targets involving up to N_s ≈ 0.5 × 10⁶ and N_t ≈ 10³. To solve larger problems, parallel PWTD-enhanced MOT solvers are called for. Even though a simple parallelization strategy was demonstrated in the context of electromagnetic compatibility analysis [M. Lu et al., in Proc. IEEE Int. Symp. AP-S, 4, 4212-4215, 2004], by and large, progress in this area has been slow. The lack of progress can be attributed wholesale to difficulties associated with the construction of a scalable PWTD kernel. © 2013 IEEE.

  2. An integration of minimum local feature representation methods to recognize large variation of foods

    Science.gov (United States)

    Razali, Mohd Norhisham bin; Manshor, Noridayu; Halin, Alfian Abdul; Mustapha, Norwati; Yaakob, Razali

    2017-10-01

    Local invariant features have shown to be successful in describing object appearances for image classification tasks. Such features are robust towards occlusion and clutter and are also invariant against scale and orientation changes. This makes them suitable for classification tasks with little inter-class similarity and large intra-class difference. In this paper, we propose an integrated representation of the Speeded-Up Robust Feature (SURF) and Scale Invariant Feature Transform (SIFT) descriptors, using late fusion strategy. The proposed representation is used for food recognition from a dataset of food images with complex appearance variations. The Bag of Features (BOF) approach is employed to enhance the discriminative ability of the local features. Firstly, the individual local features are extracted to construct two kinds of visual vocabularies, representing SURF and SIFT. The visual vocabularies are then concatenated and fed into a Linear Support Vector Machine (SVM) to classify the respective food categories. Experimental results demonstrate impressive overall recognition at 82.38% classification accuracy based on the challenging UEC-Food100 dataset.
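As a rough sketch of the late-fusion Bag-of-Features idea described above (not the authors' pipeline), the snippet below quantizes two synthetic descriptor sets against separate codebooks and concatenates the resulting histograms into one feature vector; real SURF/SIFT extraction, vocabulary learning, and SVM training are omitted, and all sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(descriptors, codebook):
    """Map each local descriptor to its nearest visual word; return an L1-normalized histogram."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Synthetic stand-ins for SURF (64-d) and SIFT (128-d) descriptors of one image.
surf_codebook = rng.normal(size=(50, 64))
sift_codebook = rng.normal(size=(80, 128))
surf_desc = rng.normal(size=(300, 64))
sift_desc = rng.normal(size=(400, 128))

# Late fusion: one histogram per descriptor type, concatenated into a single
# feature vector that a linear SVM could then classify into food categories.
features = np.concatenate([quantize(surf_desc, surf_codebook),
                           quantize(sift_desc, sift_codebook)])
```

Concatenating per-vocabulary histograms (rather than mixing descriptors into a single vocabulary) is what makes this a late-fusion design.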

  3. “Strategies of Large Systems Integrators in Malaysia to Overcome Lower Margins”

    OpenAIRE

    Omardin, Daniel Yusoff

    2013-01-01

The Malaysian Information Technology (IT) scene is no longer seen as a market for high returns, as it has reached its peak. There has been tremendous growth of IT-related companies in the past 10 years. All this competition is now leading to a price war, making the industry less attractive and harder for the system integrators that have been doing business in this market for the past 20 years. This paper investigates the loss of margins of three large system integrators in Malaysia and i...

  4. Knowledge Sharing Strategies for Large Complex Building Projects.

    Directory of Open Access Journals (Sweden)

    Esra Bektas

    2013-06-01

The construction industry is a project-based sector with a myriad of actors such as architects, construction companies, consultants, and producers of building materials (Anumba et al., 2005). The interaction between the project partners is often quite limited, which leads to insufficient knowledge sharing during the project and knowledge being unavailable for reuse (Fruchter et al., 2002). The result can be a considerable amount of extra work, delays and cost overruns. Design outcomes that are supposed to function as boundary objects across different disciplines can lead to misinterpretation of requirements, project content and objectives. In this research, knowledge is seen as resulting from social interactions; knowledge resides in communities and it is generated through social relationships (Wenger, 1998; Olsson et al., 2008). Knowledge is often tacit, intangible and context-dependent, and it is articulated in the changing responsibilities, roles, attitudes and values that are present in the work environment (Bresnen et al., 2003). In a project environment, knowledge enables individuals to solve problems, take decisions, and apply these decisions to actions. In order to achieve a shared understanding and minimize misunderstanding and misinterpretation among project actors, it is necessary to share knowledge (Fong, 2003). Sharing knowledge is particularly crucial in large complex building projects (LCBPs) in order to accelerate the building process, improve architectural quality and prevent mistakes or undesirable results. However, knowledge sharing is often hampered by professional or organizational boundaries or contractual concerns. When knowledge is seen as an organizational asset, there is little willingness among project organizations to share their knowledge. Individual people may recognize the need to promote knowledge sharing throughout the project, but typically there is no deliberate strategy agreed by all project partners to address

  5. Department of Energy environmental management complex-wide integration using systems engineering

    International Nuclear Information System (INIS)

    Fairbourn, P.

    1997-01-01

A systems engineering approach was successfully used to recommend changes to environmental management activities across the DOE Complex. A team of technical experts and systems engineers developed alternatives that could save taxpayers billions of dollars if the barriers to complete implementation are removed. The alternatives are technically based and defensible, and are being worked through the stakeholder review process. The integration process and the implementing project structure are both discussed.

  6. Symplectic integration for complex wigglers

    International Nuclear Information System (INIS)

    Forest, E.; Ohmi, K.

    1992-01-01

    Using the example of the helical wiggler proposed for the KEK photon factory, we show how to integrate the equation of motion through the wiggler. The integration is performed in cartesian coordinates. For the usual expanded Hamiltonian (without square root), we derive a first order symplectic integrator for the purpose of tracking through a wiggler in a ring. We also show how to include classical radiation for the computation of the damping decrement

  7. Development of an integrated genome informatics, data management and workflow infrastructure: A toolbox for the study of complex disease genetics

    Directory of Open Access Journals (Sweden)

    Burren Oliver S

    2004-01-01

The genetic dissection of complex disease remains a significant challenge. Sample-tracking and the recording, processing and storage of high-throughput laboratory data, together with public domain data, require integration of databases, genome informatics and genetic analyses in an easily updated and scalable format. To find genes involved in multifactorial diseases such as type 1 diabetes (T1D), chromosome regions are defined based on functional candidate gene content, linkage information from humans and animal model mapping information. For each region, genomic information is extracted from Ensembl, converted and loaded into ACeDB for manual gene annotation. Homology information is examined using ACeDB tools and the gene structure verified. Manually curated genes are extracted from ACeDB and read into the feature database, which holds relevant local genomic feature data and an audit trail of laboratory investigations. Public domain information, manually curated genes, polymorphisms, primers, linkage and association analyses, with links to our genotyping database, are shown in Gbrowse. This system scales to include genetic, statistical, quality control (QC) and biological data such as expression analyses of RNA or protein, all linked from a genomics integrative display. Our system is applicable to any genetic study of complex disease, of either large or small scale.

  8. Large Complex Odontoma of Mandible in a Young Boy: A Rare and Unusual Case Report

    Directory of Open Access Journals (Sweden)

    G. Siva Prasad Reddy

    2014-01-01

Odontomas are the most common odontogenic tumors. They are broadly classified into compound odontomas and complex odontomas. Among them, the complex odontoma is a rare tumor. Occasionally this tumor becomes large, causing expansion of bone followed by facial asymmetry. Otherwise these tumors are asymptomatic and are generally diagnosed on radiographic examination. We report a rare case of a complex odontoma of the mandible in a young boy. The tumor was treated by surgical excision under general anesthesia.

  9. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    Science.gov (United States)

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
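The record above describes a topology-based feature-vector tracking algorithm; as a far simpler baseline sketch (not the authors' method), the code below links particles across two frames by nearest-neighbor matching with a displacement cutoff for outlier rejection, using synthetic 3D data.

```python
import numpy as np

def link_frames(pos0, pos1, max_disp):
    """Match each particle in frame 0 to its nearest neighbor in frame 1;
    links with displacement above max_disp are rejected as outliers."""
    d = np.linalg.norm(pos0[:, None, :] - pos1[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)
    keep = d[np.arange(len(pos0)), nearest] <= max_disp
    return [(i, int(j)) for i, j in enumerate(nearest) if keep[i]]

rng = np.random.default_rng(1)
frame0 = rng.uniform(0.0, 100.0, size=(200, 3))               # 200 particles in a 3D volume
frame1 = frame0 + rng.normal(scale=0.2, size=frame0.shape)    # small random displacements

links = link_frames(frame0, frame1, max_disp=1.0)             # list of (index0, index1) pairs
```

This naive approach fails exactly in the regime the paper targets, where displacements exceed the interparticle spacing; encoding each particle's neighborhood topology, as the authors do, is what makes such dense, large-displacement fields tractable.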

  10. Economic testing of large integrated switching circuits - a challenge to the test engineer

    International Nuclear Information System (INIS)

    Kreinberg, W.

    1978-01-01

With reference to large integrated switching circuits, one can use a standard incoming test programme or the customer's switching circuits. The author describes the development of suitable, extensive and economical test programmes. (orig.)

  11. Exploring the dynamic and complex integration of sustainability performance measurement into product development

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Morioka, S.; Pigosso, Daniela Cristina Antelmi

    2016-01-01

In order to deal with the complex and dynamic nature of sustainability integration into the product development process, this research explores the use of a qualitative System Dynamics approach by using the causal loop diagram (CLD) tool. A literature analysis was followed by a case study, aiming ...

  12. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored

  13. Geophysical mapping of complex glaciogenic large-scale structures

    DEFF Research Database (Denmark)

    Høyer, Anne-Sophie

    2013-01-01

    This thesis presents the main results of a four year PhD study concerning the use of geophysical data in geological mapping. The study is related to the Geocenter project, “KOMPLEKS”, which focuses on the mapping of complex, large-scale geological structures. The study area is approximately 100 km2...... data types and co-interpret them in order to improve our geological understanding. However, in order to perform this successfully, methodological considerations are necessary. For instance, a structure indicated by a reflection in the seismic data is not always apparent in the resistivity data...... information) can be collected. The geophysical data are used together with geological analyses from boreholes and pits to interpret the geological history of the hill-island. The geophysical data reveal that the glaciotectonic structures truncate at the surface. The directions of the structures were mapped...

  14. Surgical Treatment of a Large Complex Odontoma

    Directory of Open Access Journals (Sweden)

    Burak Cezairli

    2017-08-01

Full Text Available The treatment modalities for odontomas generally depend on the tumor's size. Small and medium lesions can usually be removed easily, allowing preservation of the surrounding anatomical structures. In our study, we report the conservative surgical treatment of a large complex odontoma. A 19-year-old woman was referred to our clinic after a lesion was incidentally observed at her right mandibular angle. The patient was symptom-free at the time of the visit. Computed tomography (CT) images showed a mass measuring 3.5 cm x 3 cm x 2 cm. CT sections and three-dimensional images showed partially eroded buccal and lingual cortices. Surgical treatment was indicated with an initial diagnosis of compound odontoma. The lesion was removed after sectioning with a bur; maxillo-mandibular fixation (MMF) was not thought to be necessary, as the buccal and lingual cortices were judged mostly sufficient for preventing a fracture. In our case, the size of the odontoma was suitable for a conservative treatment method, and with this modality we managed to prevent a possible fracture and eliminate the disadvantages of MMF.

  15. Complex Nonlinearity Chaos, Phase Transitions, Topology Change and Path Integrals

    CERN Document Server

    Ivancevic, Vladimir G

    2008-01-01

Complex Nonlinearity: Chaos, Phase Transitions, Topology Change and Path Integrals is a book about the prediction & control of general nonlinear and chaotic dynamics of high-dimensional complex systems of various physical and non-physical nature, and their underpinning geometro-topological change. The book starts with a textbook-like exposé on nonlinear dynamics, attractors and chaos, both temporal and spatio-temporal, including modern techniques of chaos control. Chapter 2 turns to the edge of chaos, in the form of phase transitions (equilibrium and non-equilibrium, oscillatory, fractal and noise-induced), as well as the related field of synergetics. While the natural stage for linear dynamics is flat, Euclidean geometry (with the corresponding calculation tools from linear algebra and analysis), the natural stage for nonlinear dynamics is curved, Riemannian geometry (with the corresponding tools from nonlinear tensor algebra and analysis). The extreme nonlinearity – chaos – corresponds to th...

  16. Integration of large chemical kinetic mechanisms via exponential methods with Krylov approximations to Jacobian matrix functions

    KAUST Repository

    Bisetti, Fabrizio

    2012-01-01

    with the computational cost associated with the time integration of stiff, large chemical systems, a novel approach is proposed. The approach combines an exponential integrator and Krylov subspace approximations to the exponential function of the Jacobian matrix
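The key kernel of such exponential integrators is the action of the Jacobian's matrix exponential on a vector, approximated in a small Krylov subspace built by Arnoldi iteration. The following is a minimal sketch under our own assumptions, not the paper's implementation; in particular the Taylor-series exponential used for the small projected matrix is a simplification:

```python
import numpy as np

def expm_small(A, terms=25):
    # Taylor-series matrix exponential; adequate for the small, well-scaled
    # projected matrix that Arnoldi produces (a simplification of Pade/expm).
    out, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

def krylov_expv(J, v, m=20, dt=1.0):
    # Approximate exp(dt*J) @ v in an m-dimensional Krylov subspace:
    # exp(dt*J) v ~= beta * V_m exp(H_m) e_1, the kernel of exponential
    # integrators for stiff chemical kinetics.
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = dt * (J @ V[:, j])                 # H absorbs the dt scaling
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = w @ V[:, i]
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown: subspace exact
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    E = expm_small(H[:m, :m])
    return beta * V[:, :m] @ E[:, 0]
```

Only matrix-vector products with the Jacobian are needed, which is what makes the approach attractive for the large sparse Jacobians of chemical mechanisms.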

  17. Parallel time domain solvers for electrically large transient scattering problems

    KAUST Repository

    Liu, Yang

    2014-09-26

Marching on in time (MOT)-based integral equation solvers represent an increasingly appealing avenue for analyzing transient electromagnetic interactions with large and complex structures. MOT integral equation solvers for analyzing electromagnetic scattering from perfectly electrically conducting objects are obtained by enforcing electric field boundary conditions and implicitly time-advancing electric surface current densities by iteratively solving sparse systems of equations at all time steps. Contrary to their finite difference and finite element competitors, these solvers apply to nonlinear and multi-scale structures comprising geometrically intricate and deep sub-wavelength features residing atop electrically large platforms. Moreover, they are high-order accurate, stable in the low- and high-frequency limits, and applicable to conducting and penetrable structures represented by highly irregular meshes. This presentation reviews some recent advances in parallel implementations of time domain integral equation solvers, specifically those that leverage the multilevel plane-wave time-domain (PWTD) algorithm on modern manycore computer architectures, including graphics processing units (GPUs) and distributed memory supercomputers. The GPU-based implementation achieves at least one order of magnitude speedup compared to serial implementations, while the distributed parallel implementation is highly scalable to thousands of compute nodes. A distributed parallel PWTD kernel has been adopted to solve time domain surface/volume integral equations (TDSIE/TDVIE) for analyzing transient scattering from large and complex-shaped perfectly electrically conducting (PEC)/dielectric objects involving ten million/tens of millions of spatial unknowns.

  18. Complex Formation Control of Large-Scale Intelligent Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Lei

    2012-01-01

Full Text Available A new formation framework for large-scale intelligent autonomous vehicles is developed, which can realize complex formations while reducing data exchange. Using the proposed hierarchical formation method and the automatic dividing algorithm, vehicles are automatically divided into leaders and followers by exchanging information via a wireless network at the initial time. Then, leaders form the formation's geometric shape using global formation information, and followers track their own virtual leaders to form a line formation using local information. The formation control laws of leaders and followers are designed based on consensus algorithms. Moreover, collision-avoidance problems are considered and solved using artificial potential functions. Finally, a simulation example consisting of 25 vehicles shows the effectiveness of the theory.
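A toy version of the follower tracking law (each follower steering toward its own virtual leader with a first-order, consensus-style proportional term) can be sketched as follows. This is a hypothetical simplification under our own naming: it omits the leader dynamics and the artificial-potential collision-avoidance terms described in the abstract.

```python
import numpy as np

def simulate_followers(leader, offsets, steps=200, dt=0.05, k=1.0):
    # Each follower i tracks its virtual leader at position leader + offsets[i].
    # First-order kinematics: x_dot = -k * (x - target), integrated by Euler.
    rng = np.random.default_rng(1)
    x = rng.random((len(offsets), 2)) * 10.0   # random initial positions
    targets = leader + offsets
    for _ in range(steps):
        x += dt * (-k * (x - targets))         # proportional tracking term
    return x
```

With a line of offsets along one axis, the followers converge to a line formation anchored at the leader; a full implementation would add inter-vehicle repulsion potentials for collision avoidance.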

  19. Integration of radiation safety and physical security in large irradiator facilities

    International Nuclear Information System (INIS)

    Lima, P.P.M.; Benedito, A.M.; Lima, C.M.A.; Silva, F.C.A. da

    2017-01-01

Growing international concern about radioactive sources after the September 11, 2001 events has led to a strengthening of physical security. There is evidence that the illicit use of radioactive sources is a real possibility and may result in harmful radiological consequences for the population and the environment. In Brazil there are about 2000 medical, industrial and research facilities with radioactive sources, of which 400 are in Categories 1 and 2 as classified by the International Atomic Energy Agency (IAEA); among these, large irradiators occupy a prominent position due to their very high cobalt-60 activities. Radiological safety is well established in these facilities, due to the intense work of the authorities in the country. This paper presents the main aspects of radiation safety and physical security applied in large irradiators, in order to integrate both concepts for the benefit of safety as a whole. The research showed that items related to radiation safety are well defined, for example the tests on the access control devices of the irradiation room. On the other hand, items related to physical security, such as effective control of access to the company and the use of security cameras throughout the company, are not yet fully incorporated. Integration of radiation safety and physical security is fundamental for overall safety. The elaboration of a Brazilian regulation on the subject is therefore of great importance.

  20. The Mediator complex: a central integrator of transcription

    Science.gov (United States)

    Allen, Benjamin L.; Taatjes, Dylan J.

    2016-01-01

    The RNA polymerase II (pol II) enzyme transcribes all protein-coding and most non-coding RNA genes and is globally regulated by Mediator, a large, conformationally flexible protein complex with variable subunit composition (for example, a four-subunit CDK8 module can reversibly associate). These biochemical characteristics are fundamentally important for Mediator's ability to control various processes important for transcription, including organization of chromatin architecture and regulation of pol II pre-initiation, initiation, re-initiation, pausing, and elongation. Although Mediator exists in all eukaryotes, a variety of Mediator functions appear to be specific to metazoans, indicative of more diverse regulatory requirements. PMID:25693131

  1. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial, focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising UK capabilities in applying the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost and status; and the evaluation criteria. The BIPV installations described include university buildings, commercial centres, a sports stadium, a wildlife park, a church hall, and a district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  2. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    Science.gov (United States)

    Sikkandar Basha, Nazareen

The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements capture the preferences of the stakeholders for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. Requirements elicitation in most large-scale system projects is subject to creep in time and cost, due to the uncertainty and ambiguity of requirements during design and development. In an organizational structure, cost and time overruns can occur at any level and iterate back and forth, thus increasing cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before these rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making, minimizing the value gap due to requirements creep by eliminating the ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep, in terms of cost and time, using the Cynefin framework. These

  3. Neural networks supporting audiovisual integration for speech: A large-scale lesion study.

    Science.gov (United States)

    Hickok, Gregory; Rogalsky, Corianne; Matchin, William; Basilakos, Alexandra; Cai, Julia; Pillay, Sara; Ferrill, Michelle; Mickelsen, Soren; Anderson, Steven W; Love, Tracy; Binder, Jeffrey; Fridriksson, Julius

    2018-06-01

Auditory and visual speech information are often strongly integrated, resulting in perceptual enhancements for audiovisual (AV) speech over audio alone and sometimes yielding compelling illusory fusion percepts when AV cues are mismatched (the McGurk-MacDonald effect). Previous research has identified three candidate regions thought to be critical for AV speech integration: the posterior superior temporal sulcus (STS), early auditory cortex, and the posterior inferior frontal gyrus. We assess the causal involvement of these regions (and others) in the first large-scale (N = 100) lesion-based study of AV speech integration. Two primary findings emerged. First, behavioral performance and lesion maps for AV enhancement and illusory fusion measures indicate that classic metrics of AV speech integration are not necessarily measuring the same process. Second, lesions involving superior temporal auditory, lateral occipital visual, and multisensory zones in the STS are the most disruptive to AV speech integration. Further, when AV speech integration fails, the nature of the failure (auditory vs visual capture) can be predicted from the location of the lesions. These findings show that AV speech processing is supported by unimodal auditory and visual cortices as well as by multimodal regions such as the STS at their boundary. Motor-related frontal regions do not appear to play a role in AV speech integration. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Intracytoplasmic maturation of the human immunodeficiency virus type 1 reverse transcription complexes determines their capacity to integrate into chromatin

    Directory of Open Access Journals (Sweden)

    Kashanchi Fatah

    2006-01-01

Full Text Available Abstract Background The early events of the HIV-1 life cycle include entry of the viral core into the target cell, assembly of the reverse transcription complexes (RTCs) that perform reverse transcription, their transformation into integration-competent complexes called pre-integration complexes (PICs), trafficking of the complexes into the nucleus, and finally integration of the viral DNA into chromatin. The molecular details and temporal organization of these processes remain among the least investigated and most controversial problems in the biology of HIV. Results To quantitatively evaluate maturation and nuclear translocation of the HIV-1 RTCs, nucleoprotein complexes isolated from the nucleus (nRTC) and cytoplasm (cRTC) of HeLa cells infected with MLV Env-pseudotyped HIV-1 were analyzed by real-time PCR. While most complexes completed reverse transcription in the cytoplasm, some entered the nucleus before completing DNA synthesis. The HIV-specific RNA complexes could enter the nucleus when reverse transcription was blocked by a reverse transcriptase inhibitor, although nuclear import of RNA complexes was less efficient than that of DNA-containing RTCs. Analysis of RTC nuclear import in synchronized cells infected in the G2/M phase of the cell cycle showed enrichment in the nuclei of RTCs containing incomplete HIV-1 DNA compared to non-synchronized cells, where RTCs with complete reverse transcripts prevailed. Immunoprecipitation assays identified the viral proteins IN, Vpr and MA and the cellular proteins Ini1 and PML associated with both cRTCs and nRTCs, whereas CA was detected only in cRTCs and RT was diminished in nRTCs. Cytoplasmic maturation of the complexes was associated with increased immunoreactivity with anti-Vpr and anti-IN antibodies, and decreased reactivity with antibodies to RT. Both cRTCs and nRTCs carried out the endogenous reverse transcription reaction in vitro. In contrast to cRTCs, in vitro completion of reverse transcription in nRTCs did not increase their

  5. Direct evaluation of free energy for large system through structure integration approach.

    Science.gov (United States)

    Takeuchi, Kazuhito; Tanaka, Ryohei; Yuge, Koretaka

    2015-09-30

We propose a new approach, 'structure integration', enabling direct evaluation of the configurational free energy of large systems. The approach is based on the statistical information of the lattice. Through first-principles-based simulation, we find that the present method accurately evaluates the configurational free energy of disordered states above the critical temperature.

  6. Developing integrated parametric planning models for budgeting and managing complex projects

    Science.gov (United States)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and the software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs with project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
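The trapezoidal segmentation described above (total cost as the sum of integrals of a linearly segmented cost loading function) has a closed-form per-segment integral: the trapezoid area. A small illustrative sketch with our own function names (the original prototype was a Lotus-123 spreadsheet, not code):

```python
def segment_cost(t0, t1, r0, r1):
    # Integral of a linearly varying cost-loading rate over [t0, t1]:
    # exact trapezoid area, since the loading function is linear per segment.
    return 0.5 * (r0 + r1) * (t1 - t0)

def project_cost(breakpoints, rates):
    # Total project cost: sum of the per-segment integrals, where
    # breakpoints[i] are segment boundaries and rates[i] the loading
    # rate (cost per unit time) at each boundary.
    return sum(segment_cost(breakpoints[i], breakpoints[i + 1],
                            rates[i], rates[i + 1])
               for i in range(len(breakpoints) - 1))
```

For example, a loading rate ramping from 0 to 10 over two time units and back to 0 over four more integrates to a total cost of 30, which is easy to verify by hand.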

  7. Integration in New Product Development: Case Study in a Large Brazilian

    Directory of Open Access Journals (Sweden)

    Daniel Jugend

    2012-02-01

Full Text Available Proficiency in the management activities undertaken in product development processes is regarded as a key competitive advantage for companies, particularly for high-tech industrial firms, which benefit from the important competitiveness factor of launching products with differentiated technological content. The objective of this paper was to identify, through a case study, practices for integration between R&D and the other functions involved in product development at a large Brazilian industrial automation company. The results suggest management practices that improve integration in new product development, such as staffing marketing positions with employees who have knowledge and experience previously gained in R&D activities, and using a heavyweight product manager to solve synchronization problems between product and technology development.

  8. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

Full Text Available A literature review and gap analysis identifies key limitations of industry best practice when modelling supply chains. To address these limitations, the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD and SM techniques, whose integration is achieved via common modelling concepts and decomposition principles. The methodology thereby facilitates: (i) graphical representation and description of key "processing", "resourcing" and "work flow" properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery and return processes; (iii) predictive quantification of the relative performance of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described, along with recommended data collection methods and expected modelling outcomes for each step. The methodology is being extensively case-tested to quantify potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  9. Management of large complex multi-stakeholders projects: a bibliometric approach

    Directory of Open Access Journals (Sweden)

    Aline Sacchi Homrich

    2017-06-01

Full Text Available The growing global importance of large infrastructure projects has piqued the interest of many researchers in a variety of issues related to the management of large, multi-stakeholder projects, characterized by their high complexity and intense interaction among numerous stakeholders with distinct levels of responsibility. The objective of this study is to provide an overview of the academic literature focused on the management of these kinds of projects, describing the main themes considered, the lines of research identified and prominent trends. Bibliometric analysis techniques were used, as well as network and content analysis. Information was retrieved from the scientific databases ISI Web of Knowledge and Scopus. The initial sample consisted of 144 papers published between 1984 and 2014 and was expanded with the references cited in these papers. The models identified in the literature converge on the following key processes: project delivery systems; risk-management models; project cost management; public-private partnership.

  10. Hierarchical and Matrix Structures in a Large Organizational Email Network: Visualization and Modeling Approaches

    OpenAIRE

    Sims, Benjamin H.; Sinitsyn, Nikolai; Eidenbenz, Stephan J.

    2014-01-01

    This paper presents findings from a study of the email network of a large scientific research organization, focusing on methods for visualizing and modeling organizational hierarchies within large, complex network datasets. In the first part of the paper, we find that visualization and interpretation of complex organizational network data is facilitated by integration of network data with information on formal organizational divisions and levels. By aggregating and visualizing email traffic b...

  11. 3D-Printed Disposable Wireless Sensors with Integrated Microelectronics for Large Area Environmental Monitoring

    KAUST Repository

    Farooqui, Muhammad Fahad

    2017-05-19

Large area environmental monitoring can play a crucial role in dealing with crisis situations. However, it is challenging, as implementing a fixed sensor network infrastructure over a large remote area is economically unfeasible. This work proposes disposable, compact, dispersible 3D-printed wireless sensor nodes with integrated microelectronics, which can be dispersed in the environment and work in conjunction with a few fixed nodes for large area monitoring applications. As a proof of concept, the wireless sensing of temperature, humidity and H2S levels is demonstrated; these quantities are important indicators for two critical environmental conditions, namely forest fires and industrial leaks. The inkjet-printed sensors and an antenna are realized on the walls of a 3D-printed cubic package which encloses the microelectronics developed on a 3D-printed circuit board. Hence, 3D printing and inkjet printing are uniquely combined in order to realize a low-cost, fully integrated wireless sensor node.

  12. Integrated complex care coordination for children with medical complexity: A mixed-methods evaluation of tertiary care-community collaboration

    Directory of Open Access Journals (Sweden)

    Cohen Eyal

    2012-10-01

Full Text Available Abstract Background Primary care medical homes may improve health outcomes for children with special healthcare needs (CSHCN) by improving care coordination. However, community-based primary care practices may be challenged to deliver comprehensive care coordination to complex subsets of CSHCN such as children with medical complexity (CMC). Linking a tertiary care center with the community may achieve cost-effective and high-quality care for CMC. The objective of this study was to evaluate the outcomes of community-based complex care clinics integrated with a tertiary care center. Methods A before-and-after intervention study design with mixed (quantitative/qualitative) methods was utilized. Clinics at two community hospitals distant from tertiary care were staffed by local community pediatricians together with the tertiary care center nurse practitioner and linked with primary care providers. Eighty-one children with underlying chronic conditions, fragility, requirement for high-intensity care and/or technology assistance, and involvement of multiple providers participated. Main outcome measures included health care utilization and expenditures, parent reports of parent and child quality of life [QOL (SF-36®, CPCHILD©, PedsQL™)], and family-centered care (MPOC-20®). Comparisons were made in equal (up to 1 year) pre- and post-periods, supplemented by qualitative perspectives of families and pediatricians. Results Total health care system costs decreased from a median (IQR) of $244 (981) per patient per month (PPPM) pre-enrolment to $131 (355) PPPM post-enrolment (p=.007), driven primarily by fewer inpatient days in the tertiary care center (p=.006). Parents reported decreased out-of-pocket expenses. Scores improved in two CPCHILD© domains [Health Standardization Section (p=.04); Comfort and Emotions (p=.03)], while the total CPCHILD© score decreased between baseline and 1 year (p=.003). Parents and providers reported the ability to receive care close to home as a key benefit. Conclusions Complex

  13. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

Highlights: ► We implement photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large-scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building users' electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart, Germany was used to calculate the photovoltaic energy potential and to evaluate the on-site consumption of the energy produced. For most buildings of the district, only annual electrical consumption data was available, and only selected buildings have electronic metering equipment. The available roof area of one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter's total electricity consumption, and half of this generated electricity is directly used within the buildings.
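The hourly comparison of PV production with measured consumption reduces to an element-wise minimum: in each hour, the directly used share of production is whatever the building load can absorb. A hypothetical sketch of that bookkeeping (our own function, not the study's simulation code):

```python
import numpy as np

def self_consumption(production, consumption):
    # Hourly PV production vs building load (same-length arrays, kWh/hour).
    # Energy used on site each hour is min(production, consumption); the rest
    # of the production is exported, the rest of the load is imported.
    used = np.minimum(production, consumption)
    self_consumed_fraction = used.sum() / production.sum()   # share of PV used on site
    coverage_fraction = used.sum() / consumption.sum()       # share of load met by PV
    return self_consumed_fraction, coverage_fraction
```

On annual hourly series, the second ratio corresponds to figures like the 35% coverage reported for the district.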

  14. Generating functional analysis of complex formation and dissociation in large protein interaction networks

    International Nuclear Information System (INIS)

    Coolen, A C C; Rabello, S

    2009-01-01

    We analyze large systems of interacting proteins, using techniques from the non-equilibrium statistical mechanics of disordered many-particle systems. Apart from protein production and removal, the most relevant microscopic processes in the proteome are complex formation and dissociation, and the microscopic degrees of freedom are the evolving concentrations of unbound proteins (in multiple post-translational states) and of protein complexes. Here we only include dimer-complexes, for mathematical simplicity, and we draw the network that describes which proteins are reaction partners from an ensemble of random graphs with an arbitrary degree distribution. We show how generating functional analysis methods can be used successfully to derive closed equations for dynamical order parameters, representing an exact macroscopic description of the complex formation and dissociation dynamics in the infinite system limit. We end this paper with a discussion of the possible routes towards solving the nontrivial order parameter equations, either exactly (in specific limits) or approximately.

  15. Talking about the institutional complexity of the integrated rehabilitation system-the importance of coordination.

    Science.gov (United States)

    Miettinen, Sari; Ashorn, Ulla; Lehto, Juhani

    2013-01-01

Rehabilitation in Finland is a good example of functions divided among several welfare sectors, such as health services and social services. The rehabilitation system in Finland is complex, and there have been many efforts to create a coordinated entity. The purpose of this study is to open up a complex welfare system at the upper policy level and to understand the meaning of coordination at the level of service delivery. In particular, we shed light on national rehabilitation policy in Finland and on how the policy has tried to overcome the negative effects of institutional complexity. In this study we used qualitative content analysis and frame analysis. As a result, we identified four different welfare state frames with distinct features of policy problems, policy alternatives and institutional failure. Rehabilitation policy in Finland seems to be divided into different components, which may cause problems at the level of service delivery and thus in the integration of services. Bringing these components together could, at the policy level, enable a shared view of the rights of different population groups, effective management of integration at the level of service delivery, and an opportunity for change throughout the rehabilitation system.

  16. Fast and accurate detection of spread source in large complex networks.

    Science.gov (United States)

    Paluch, Robert; Lu, Xiaoyan; Suchecki, Krzysztof; Szymański, Bolesław K; Hołyst, Janusz A

    2018-02-06

Spread over complex networks is a ubiquitous process with increasingly wide applications. Locating spread sources is often important, e.g. finding patient zero in an epidemic, or the source of a rumor spreading in a social network. Pinto, Thiran and Vetterli introduced an algorithm (PTVA) to solve the important case of this problem in which a limited set of nodes act as observers and report the times at which the spread reached them. PTVA uses all observers to find a solution. Here we propose a new approach in which observers with low-quality information (i.e. with large spread encounter times) are ignored and potential sources are selected based on the likelihood gradient from high-quality observers. The original complexity of PTVA is O(N^α), where α ∈ (3,4) depends on the network topology and the number of observers (N denotes the number of nodes in the network). Our Gradient Maximum Likelihood Algorithm (GMLA) reduces this complexity to O(N^2 log N). Extensive numerical tests performed on synthetic networks and on a real Gnutella network, with the limitation that the identities of spreaders are unknown to observers, demonstrate that for scale-free networks under this limitation GMLA yields higher-quality localization results than PTVA does.
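A drastically simplified version of observer-based source localization can be sketched as follows. This is not GMLA itself: as a stand-in for its likelihood-gradient selection, we keep only the earliest (highest-quality) observers and score each candidate node by the variance of its arrival-time residuals, which is zero when arrival times grow exactly with distance from the candidate.

```python
import numpy as np
from collections import deque

def bfs_dist(adj, src):
    # Hop distances from src in an unweighted graph given as adjacency lists.
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def locate_source(adj, observer_times):
    # observer_times: {node: arrival time}. Keep the earliest observers
    # (the high-quality ones), then pick the candidate whose distances best
    # explain those arrival times (minimum residual variance).
    obs = sorted(observer_times.items(), key=lambda kv: kv[1])
    obs = obs[:max(3, len(obs) // 2)]
    best, best_score = None, np.inf
    for cand in adj:
        d = bfs_dist(adj, cand)
        residuals = [t - d[o] for o, t in obs]
        score = np.var(residuals)
        if score < best_score:
            best, best_score = cand, score
    return best
```

On a path graph 0-1-2-3-4 with the spread starting at node 2, observers reporting times equal to their hop distance from node 2 make node 2 the unique zero-variance candidate.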

  17. An investigation of multidisciplinary complex health care interventions - steps towards an integrative treatment model in the rehabilitation of People with Multiple Sclerosis

    Directory of Open Access Journals (Sweden)

    Skovgaard Lasse

    2012-04-01

    Full Text Available Background: The Danish Multiple Sclerosis Society initiated a large-scale bridge-building and integrative treatment project, carried out from 2004 to 2010 at a specialized Multiple Sclerosis (MS) hospital. In this project, a team of five conventional health care practitioners and five alternative practitioners was set up to work together in developing and offering individualized treatments to 200 people with MS. The purpose of this paper is to present results from the six-year treatment collaboration process regarding the development of an integrative treatment model. Discussion: The collaborative work towards an integrative treatment model for people with MS involved six steps: (1) working with an initial model, (2) unfolding the different treatment philosophies, (3) discussing the elements of the Intervention-Mechanism-Context-Outcome (IMCO) scheme, (4) phrasing the common assumptions for an integrative MS program theory, (5) developing the integrative MS program theory, and (6) building the integrative MS treatment model. The model includes important elements of the different treatment philosophies represented in the team and thereby describes a common understanding of the complexity of the courses of treatment. Summary: An integrative team of practitioners has developed an integrative model for combined treatments of people with Multiple Sclerosis. The model unites different treatment philosophies and focuses on process-oriented factors and the strengthening of the patients' resources and competences on a physical, an emotional and a cognitive level.

  18. Heuristic Relative Entropy Principles with Complex Measures: Large-Degree Asymptotics of a Family of Multi-variate Normal Random Polynomials

    Science.gov (United States)

    Kiessling, Michael Karl-Heinz

    2017-10-01

    Let z\\in C, let σ ^2>0 be a variance, and for N\\in N define the integrals E_N^{}(z;σ ) := {1/σ } \\int _R\\ (x^2+z^2) e^{-{1/2σ^2 x^2}}{√{2π }}/dx \\quad if N=1, {1/σ } \\int _{R^N} \\prod \\prod \\limits _{1≤ k1. These are expected values of the polynomials P_N^{}(z)=\\prod _{1≤ n≤ N}(X_n^2+z^2) whose 2 N zeros ± i X_k^{}_{k=1,\\ldots ,N} are generated by N identically distributed multi-variate mean-zero normal random variables {X_k}N_{k=1} with co-variance {Cov}_N^{}(X_k,X_l)=(1+σ ^2-1/N)δ _{k,l}+σ ^2-1/N(1-δ _{k,l}). The E_N^{}(z;σ ) are polynomials in z^2, explicitly computable for arbitrary N, yet a list of the first three E_N^{}(z;σ ) shows that the expressions become unwieldy already for moderate N—unless σ = 1, in which case E_N^{}(z;1) = (1+z^2)^N for all z\\in C and N\\in N. (Incidentally, commonly available computer algebra evaluates the integrals E_N^{}(z;σ ) only for N up to a dozen, due to memory constraints). Asymptotic evaluations are needed for the large- N regime. For general complex z these have traditionally been limited to analytic expansion techniques; several rigorous results are proved for complex z near 0. Yet if z\\in R one can also compute this "infinite-degree" limit with the help of the familiar relative entropy principle for probability measures; a rigorous proof of this fact is supplied. Computer algebra-generated evidence is presented in support of a conjecture that a generalization of the relative entropy principle to signed or complex measures governs the N→ ∞ asymptotics of the regime iz\\in R. Potential generalizations, in particular to point vortex ensembles and the prescribed Gauss curvature problem, and to random matrix ensembles, are emphasized.

  19. Integrative analysis for finding genes and networks involved in diabetes and other complex diseases

    DEFF Research Database (Denmark)

    Bergholdt, R.; Størling, Zenia, Marian; Hansen, Kasper Lage

    2007-01-01

    We have developed an integrative analysis method combining genetic interactions, identified using type 1 diabetes genome scan data, and a high-confidence human protein interaction network. Resulting networks were ranked by the significance of the enrichment of proteins from interacting regions. We identified a number of new protein network modules and novel candidate genes/proteins for type 1 diabetes. We propose this type of integrative analysis as a general method for the elucidation of genes and networks involved in diabetes and other complex diseases.

  20. Plastic influence functions for calculating J-integral of complex-cracks in pipe

    International Nuclear Information System (INIS)

    Jeong, Jae-Uk; Choi, Jae-Boong; Kim, Moon-Ki; Huh, Nam-Su; Kim, Yun-Jae

    2016-01-01

    In this study, the plastic influence functions, h_1, for estimating the J-integral of a pipe with a complex crack were newly proposed based on systematic 3-dimensional (3-D) elastic-plastic finite element (FE) analyses using the Ramberg-Osgood (R-O) relation, in which global bending moment, axial tension and internal pressure were considered as loading conditions. Based on the present plastic influence functions, a GE/EPRI-type J-estimation scheme for complex-cracked pipes was suggested, and the results from the proposed J-estimation were compared with FE results using both R-O fit parameters and actual tensile data of SA376 TP304 stainless steel. The comparison demonstrates that although the proposed scheme gives J estimates that are sensitive to the fitting ranges of the R-O parameters, it shows overall good agreement with the FE results using the R-O relation. Thus, the proposed engineering J prediction method can be used to assess the instability of a complex crack in pipes of R-O material. - Highlights: • New h_1 values of the GE/EPRI method for complex-cracked pipes are proposed. • The plastic limit loads of complex-cracked pipes using the Mises yield criterion are provided. • New J estimates for complex-cracked pipes are proposed based on the GE/EPRI concept. • The proposed J estimates are validated against 3-D finite element results.

  1. Thinking in complexity the complex dynamics of matter, mind, and mankind

    CERN Document Server

    Mainzer, Klaus

    1994-01-01

    The theory of nonlinear complex systems has become a successful and widely used problem-solving approach in the natural sciences - from laser physics, quantum chaos and meteorology to molecular modeling in chemistry and computer simulations of cell growth in biology. In recent times it has been recognized that many of the social, ecological and political problems of mankind are also of a global, complex and nonlinear nature. And one of the most exciting topics of present scientific and public interest is the idea that even the human mind is governed largely by the nonlinear dynamics of complex systems. In this wide-ranging but concise treatment Prof. Mainzer discusses, in nontechnical language, the common framework behind these endeavours. Special emphasis is given to the evolution of new structures in natural and cultural systems, and it is seen clearly how the new integrative approach of complexity theory can give new insights that were not available using traditional reductionistic methods.

  2. Large system change challenges: addressing complex critical issues in linked physical and social domains

    Science.gov (United States)

    Waddell, Steve; Cornell, Sarah; Hsueh, Joe; Ozer, Ceren; McLachlan, Milla; Birney, Anna

    2015-04-01

    Most action to address contemporary complex challenges, including the urgent issues of global sustainability, occurs piecemeal and without meaningful guidance from leading complex change knowledge and methods. The potential benefit of using such knowledge is greater efficacy of effort and investment. However, this knowledge and its associated tools and methods are under-utilized because understanding of them is low, fragmented between diverse knowledge traditions, and often requires shifts in mindsets and skills from expert-led to participant-based action. We have been engaged in diverse action-oriented research efforts in Large System Change for sustainability. For us, "large" systems can be characterized as large-scale systems - up to global - with many components, of many kinds (physical, biological, institutional, cultural/conceptual), operating at multiple levels, driven by multiple forces, and presenting major challenges for the people involved. We see change of such systems as a complex challenge, in contrast with simple or complicated problems, or chaotic situations. In other words, issues and sub-systems have unclear boundaries, interact with each other, and are often contradictory; dynamics are non-linear; issues are not "controllable"; and "solutions" are "emergent" and often paradoxical. Since choices are opportunity-, power- and value-driven, these social, institutional and cultural factors need to be made explicit in any actionable theory of change. Our emerging network is sharing and building a knowledge base of experience, heuristics, and theories of change from multiple disciplines and practice domains. We will present our views on focal issues for the development of the field of large system change, which include processes of goal-setting and alignment; leverage of systemic transitions and transformation; and the role of choice in influencing critical change processes, when only some sub-systems or levels of the system behave in purposeful ways.

  3. Axiomatic design in large systems complex products, buildings and manufacturing systems

    CERN Document Server

    Suh, Nam

    2016-01-01

    This book provides a synthesis of recent developments in Axiomatic Design theory and its application in large complex systems. Introductory chapters provide concise tutorial materials for graduate students and new practitioners, presenting the fundamentals of Axiomatic Design and relating its key concepts to those of model-based systems engineering. A mathematical exposition of design axioms is also provided. The main body of the book, which represents a concentrated treatment of several applications, is divided into three parts covering work on: complex products; buildings; and manufacturing systems. The book shows how design work in these areas can benefit from the scientific and systematic underpinning provided by Axiomatic Design, and in so doing effectively combines the state of the art in design research with practice. All contributions were written by an international group of leading proponents of Axiomatic Design. The book concludes with a call to action motivating further research into the engineeri...

  4. A Variable Stiffness Analysis Model for Large Complex Thin-Walled Guide Rail

    Directory of Open Access Journals (Sweden)

    Wang Xiaolong

    2016-01-01

    Full Text Available Large complex thin-walled guide rails have complicated structures and non-uniform, low rigidity. Traditional cutting simulations are time-consuming because of the huge computation involved, especially for large workpieces. To solve these problems, a more efficient variable stiffness analysis model is proposed, which can obtain quantitative stiffness values for the machined surface. By applying simulated cutting forces at sampling points using the finite element analysis software ABAQUS, the single-direction variable stiffness rule can be obtained. A variable stiffness matrix is then proposed by analyzing the multi-direction coupled variable stiffness rules. Combined with the cutting force values in the three directions, the reasonability of existing processing parameters can be verified and optimized cutting parameters can be designed.
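
    As a much-simplified illustration of assembling a coupled stiffness matrix from per-point FE results, the sketch below builds a 3x3 compliance matrix from three unit-load cases at one sampling point and inverts it. The function names and the load-case data layout are assumptions for the example, not the paper's actual ABAQUS workflow.

```python
def invert3(m):
    """Gauss-Jordan inversion of a 3x3 matrix given as lists of lists."""
    n = 3
    a = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))  # partial pivoting
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [row[n:] for row in a]

def stiffness_from_fe(load_cases):
    """load_cases: three (force_vector, displacement_vector) pairs from
    independent FE runs at one sampling point, one load per axis.
    Builds the compliance matrix column by column and inverts it to get
    the coupled 3x3 stiffness matrix K with F = K d."""
    # Compliance column j = displacement response per unit force along axis j.
    C = [[load_cases[j][1][i] / load_cases[j][0][j] for j in range(3)]
         for i in range(3)]
    return invert3(C)
```

Off-diagonal terms of K capture the cross-direction coupling the abstract refers to; for a point with purely uncoupled response, K comes out diagonal.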

  5. Integrated ecotechnology approach towards treatment of complex wastewater with simultaneous bioenergy production.

    Science.gov (United States)

    Hemalatha, Manupati; Sravan, J Shanthi; Yeruva, Dileep Kumar; Venkata Mohan, S

    2017-10-01

    Sequential integration of three diverse biological treatment stages was studied, exploiting the advantages of each individual process towards enhanced treatment of complex chemical-based wastewater. A successful attempt was made to integrate a sequencing batch reactor (SBR) with bioelectrochemical treatment (BET) and finally with microalgae treatment. The sequential integration showed individual substrate degradation (COD) of 55% in the SBR, 49% in BET and 56% in microalgae, accounting for a consolidated treatment efficiency of 90%. Nitrate removal efficiencies of 25% in the SBR, 31% in BET and 44% in microalgae were observed, for a total efficiency of 72%. The SBR-treated effluent fed to BET showed TDS removal with the electrode intervention. BET exhibited relatively higher process performance than the SBR. The integration approach significantly overcame the limitations of the individual processes, along with value addition as biomass (1.75 g/L), carbohydrates (640 mg/g), lipids (15%) and bioelectricity. The study thus provides a strategy of combining the SBR as a pretreatment step for the BET process, with final polishing by microalgae cultivation, achieving enhanced wastewater treatment along with value addition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Arabidopsis GCP3-interacting protein 1/MOZART 1 is an integral component of the γ-tubulin-containing microtubule nucleating complex.

    Science.gov (United States)

    Nakamura, Masayoshi; Yagi, Noriyoshi; Kato, Takehide; Fujita, Satoshi; Kawashima, Noriyuki; Ehrhardt, David W; Hashimoto, Takashi

    2012-07-01

    Microtubules in eukaryotic cells are nucleated from ring-shaped complexes that contain γ-tubulin and a family of homologous γ-tubulin complex proteins (GCPs), but the subunit composition of the complexes can vary among fungi, animals and plants. Arabidopsis GCP3-interacting protein 1 (GIP1), a small protein with no homology to the GCP family, interacts with GCP3 in vitro, and is a plant homolog of vertebrate mitotic-spindle organizing protein associated with a ring of γ-tubulin 1 (MOZART1), a recently identified component of the γ-tubulin complex in human cell lines. In this study, we characterized two closely related Arabidopsis GIP1s: GIP1a and GIP1b. Single mutants of gip1a and gip1b were indistinguishable from wild-type plants, but their double mutant was embryonic lethal, and showed impaired development of male gametophytes. Functional fusions of GIP1a with green fluorescent protein (GFP) were used to purify GIP1a-containing complexes from Arabidopsis plants, which contained all the subunits (except NEDD1) previously identified in the Arabidopsis γ-tubulin complexes. GIP1a and GIP1b interacted specifically with Arabidopsis GCP3 in yeast. GFP-GIP1a labeled mitotic microtubule arrays in a pattern largely consistent with, but partly distinct from, the localization of the γ-tubulin complex containing GCP2 or GCP3 in planta. In interphase cortical arrays, the labeled complexes were preferentially recruited to existing microtubules, from which new microtubules were efficiently nucleated. However, in contrast to complexes labeled with tagged GCP2 or GCP3, their recruitment to cortical areas with no microtubules was rarely observed. These results indicate that GIP1/MOZART1 is an integral component of a subset of the Arabidopsis γ-tubulin complexes. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.

  7. Optimizing liquid effluent monitoring at a large nuclear complex.

    Science.gov (United States)

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US$223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
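
    The exceedance-probability assessment in objective (3) can be sketched as follows. The normality assumption and the function name are simplifications for illustration; the study's actual statistical methodology is not reproduced here, and effluent data are often better described by a lognormal fit, in which case log-transformed values would be passed instead.

```python
import math
import statistics

def exceedance_probability(baseline, limit):
    """Probability that a future measurement exceeds the permit limit,
    assuming baseline concentrations are (approximately) normal.
    Uses the upper-tail normal probability P(X > limit)."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    z = (limit - mu) / sigma
    # Upper tail of the standard normal via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))
```

With a permit limit far above the baseline mean, the computed probability becomes vanishingly small, which is the kind of evidence used above to justify reduced sampling frequency.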

  8. Model-based identification and use of task complexity factors of human integrated systems

    International Nuclear Information System (INIS)

    Ham, Dong-Han; Park, Jinkyun; Jung, Wondea

    2012-01-01

    Task complexity is one of the conceptual constructs that are critical to explaining and predicting human performance in human integrated systems. A basic approach to evaluating the complexity of tasks is to identify task complexity factors and measure them. Although a great many task complexity factors have been studied, there is still a lack of conceptual frameworks for identifying and organizing them analytically that can be used generally, irrespective of the types of domains and tasks. This study proposes a model-based approach to identifying and using task complexity factors, which has two facets: the design aspects of a task and the complexity dimensions. Three levels of design abstraction, namely the functional, behavioral, and structural aspects of a task, characterize the design aspect. The behavioral aspect is further classified into five cognitive processing activity types. The complexity dimensions explain task complexity from different perspectives: size, variety, and order/organization. Twenty-one task complexity factors are identified by combining the attributes of each facet. Identification and evaluation of task complexity factors based on this model is believed to give insights for improving the design quality of tasks. The model can also be used as a referential framework for allocating tasks and designing information aids. The proposed approach is applied to procedure-based tasks of nuclear power plants (NPPs) as a case study to demonstrate its use. Lastly, we compare the proposed approach with other studies and suggest some future research directions.
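
    The count of 21 factors follows directly from crossing the seven design-aspect attributes (the functional level, five behavioral activity types, and the structural level) with the three complexity dimensions. A quick enumeration, using placeholder names for the five activity types (the paper's own labels are not given here):

```python
from itertools import product

# Seven design-aspect attributes: functional level, five cognitive
# processing activity types at the behavioral level, structural level.
design_aspects = (["functional"]
                  + [f"behavioral:activity_{i}" for i in range(1, 6)]
                  + ["structural"])

# Three complexity dimensions.
dimensions = ["size", "variety", "order/organization"]

# Each (aspect, dimension) pair is one task complexity factor: 7 x 3 = 21.
factors = list(product(design_aspects, dimensions))
```

This makes the model's structure explicit: each factor is one cell of a 7-by-3 grid, which is also a convenient layout for scoring a concrete task against the framework.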

  9. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analyses focusing on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international electricity market, by locating exports in hours of high prices, are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the degree of excess electricity production caused by fluctuations in wind and CHP heat demand; the ability to utilise wind power to reduce CO2 emissions in the system; and the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed over a range of wind power inputs from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced by CHP, a number of future energy systems with CO2 reduction potential are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation tasks of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare the short-term and long-term potentials of different strategies for large-scale integration of wind power.
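
    The first of the three factors, excess electricity production, can be illustrated with a toy hourly balance. This is a deliberately simplified sketch under assumed inputs (inflexible CHP as a fixed share of demand, wind scaled to a target share), not the energy-system model used in the projects:

```python
def excess_fraction(demand, wind_profile, wind_share, chp_fraction=0.5):
    """Fraction of total electricity demand that ends up as excess
    production when wind covering `wind_share` of annual demand is added
    on top of inflexible CHP covering `chp_fraction` of each hour's load.

    demand       -- hourly electricity demand series
    wind_profile -- hourly wind resource series (arbitrary units)
    """
    total = sum(demand)
    # Scale the wind profile so its annual energy equals wind_share * demand.
    scale = wind_share * total / sum(wind_profile)
    excess = 0.0
    for d, w in zip(demand, wind_profile):
        production = chp_fraction * d + scale * w
        excess += max(0.0, production - d)  # production beyond demand is excess
    return excess / total
```

Sweeping `wind_share` from 0 to 1 with such a balance reproduces the qualitative effect studied in the paper: the more inflexible CHP in the system, the earlier excess production appears as the wind share grows.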

  10. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    Science.gov (United States)

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in the quest to improve human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of multi-omics data sources. In order to provide a holistic understanding of human health and diseases, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based algorithms, which represent subjects as nodes and relationships as edges, and kernel-based algorithms, which generate a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measurements of classification accuracy and computation time. Seven different integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated on hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms. Graph-based algorithms have the advantage of being computationally faster. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.
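
    A minimal sketch of the kernel-based integration idea: compute one kernel per omics source and combine them as a weighted sum. The nearest-neighbour classifier below is a stand-in chosen for brevity; the methods compared in the paper (e.g. SDP-SVM, RVM) would consume the combined kernel instead. The RBF kernel, equal weights and toy data are all assumptions for the example.

```python
import math

def rbf(u, v, gamma=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def combined_kernel(x, y, weights=(0.5, 0.5), gamma=1.0):
    """Weighted sum of one RBF kernel per data source.
    x and y are tuples of per-source feature vectors for the same subject."""
    return sum(w * rbf(xv, yv, gamma)
               for w, (xv, yv) in zip(weights, zip(x, y)))

def predict(train, labels, query, **kw):
    """Toy kernel classifier: label of the training subject most similar
    to the query under the combined kernel."""
    best = max(range(len(train)),
               key=lambda i: combined_kernel(train[i], query, **kw))
    return labels[best]
```

The per-source weights are where integration methods differ: fixed weights here, but learned (e.g. by semi-definite programming in SDP-SVM) in the algorithms the paper compares.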

  11. DISTILLER: a data integration framework to reveal condition dependency of complex regulons in Escherichia coli.

    Science.gov (United States)

    Lemmens, Karen; De Bie, Tijl; Dhollander, Thomas; De Keersmaecker, Sigrid C; Thijs, Inge M; Schoofs, Geert; De Weerdt, Ami; De Moor, Bart; Vanderleyden, Jos; Collado-Vides, Julio; Engelen, Kristof; Marchal, Kathleen

    2009-01-01

    We present DISTILLER, a data integration framework for the inference of transcriptional module networks. Experimental validation of predicted targets for the well-studied fumarate nitrate reductase regulator showed the effectiveness of our approach in Escherichia coli. In addition, the condition dependency and modularity of the inferred transcriptional network was studied. Surprisingly, the level of regulatory complexity seemed lower than that which would be expected from RegulonDB, indicating that complex regulatory programs tend to decrease the degree of modularity.

  12. 3D-Printed Disposable Wireless Sensors with Integrated Microelectronics for Large Area Environmental Monitoring

    KAUST Repository

    Farooqui, Muhammad Fahad; Karimi, Muhammad Akram; Salama, Khaled N.; Shamim, Atif

    2017-01-01

    disposable, compact, dispersible 3D-printed wireless sensor nodes with integrated microelectronics which can be dispersed in the environment and work in conjunction with few fixed nodes for large area monitoring applications. As a proof of concept

  13. Potentials and challenges of integration for complex metal oxides in CMOS devices and beyond

    International Nuclear Information System (INIS)

    Kim, Y; Pham, C; Chang, J P

    2015-01-01

    This review focuses on recent accomplishments on complex metal oxide based multifunctional materials and the potential they hold in advancing integrated circuits. It begins with metal oxide based high-κ materials to highlight the success of their integration since 45 nm complementary metal–oxide–semiconductor (CMOS) devices. By simultaneously offering a higher dielectric constant for improved capacitance as well as providing a thicker physical layer to prevent the quantum mechanical tunnelling of electrons, high-κ materials have enabled the continued down-scaling of CMOS based devices. The most recent technology driver has been the demand to lower device power consumption, which requires the design and synthesis of novel materials, such as complex metal oxides that exhibit remarkable tunability in their ferromagnetic, ferroelectric and multiferroic properties. These properties make them suitable for a wide variety of applications such as magnetoelectric random access memory, radio frequency band pass filters, antennae and magnetic sensors. Single-phase multiferroics, while rare, offer unique functionalities which have motivated much scientific and technological research to ascertain the origins of their multiferroicity and their applicability to potential devices. However, due to the weak magnetoelectric coupling for single-phase multiferroics, engineered multiferroic composites based on magnetostrictive ferromagnets interfacing piezoelectrics or ferroelectrics have shown enhanced multiferroic behaviour from effective strain coupling at the interface. In addition, nanostructuring of the ferroic phases has demonstrated further improvement in the coupling effect. Therefore, single-phase and engineered composite multiferroics consisting of complex metal oxides are reviewed in terms of magnetoelectric coupling effects and voltage controlled ferromagnetic properties, followed by a review on the integration challenges that need to be overcome to realize the

  14. IoT European Large-Scale Pilots – Integration, Experimentation and Testing

    OpenAIRE

    Guillén, Sergio Gustavo; Sala, Pilar; Fico, Giuseppe; Arredondo, Maria Teresa; Cano, Alicia; Posada, Jorge; Gutierrez, Germán; Palau, Carlos; Votis, Konstantinos; Verdouw, Cor N.; Wolfert, Sjaak; Beers, George; Sundmaeker, Harald; Chatzikostas, Grigoris; Ziegler, Sébastien

    2017-01-01

    The IoT European Large-Scale Pilots Programme includes the innovation consortia that are collaborating to foster the deployment of IoT solutions in Europe through the integration of advanced IoT technologies across the value chain, demonstration of multiple IoT applications at scale and in a usage context, and as close as possible to operational conditions. The programme projects are targeted, goal-driven initiatives that propose IoT approaches to specific real-life industrial/societal challe...

  15. Narrative persuasion, causality, complex integration, and support for obesity policy.

    Science.gov (United States)

    Niederdeppe, Jeff; Shapiro, Michael A; Kim, Hye Kyung; Bartolo, Danielle; Porticella, Norman

    2014-01-01

    Narrative messages have the potential to convey causal attribution information about complex social issues. This study examined attributions about obesity, an issue characterized by interrelated biological, behavioral, and environmental causes. Participants were randomly assigned to read one of three narratives emphasizing societal causes and solutions for obesity or an unrelated story that served as the control condition. The three narratives varied in the extent to which the character in the story acknowledged personal responsibility (high, moderate, and none) for controlling her weight. Stories that featured no acknowledgment and moderate acknowledgment of personal responsibility, while emphasizing environmental causes and solutions, were successful at increasing societal cause attributions about obesity and, among conservatives, increasing support for obesity-related policies relative to the control group. The extent to which respondents were able to make connections between individual and environmental causes of obesity (complex integration) mediated the relationship between the moderate acknowledgment condition and societal cause attributions. We conclude with a discussion of the implications of this work for narrative persuasion theory and health communication campaigns.

  16. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities

    International Nuclear Information System (INIS)

    Castro, P.; Ardao, J.; Velarde, M.; Sedano, L.; Xiberta, J.

    2013-01-01

    In connection with a prospective new scenario of large-inventory tritium facilities [KATRIN at TLK, CANDUs, ITER, EAST, and others to come], the dosimetric limits prescribed by ICRP-60 for tritium committed doses are under discussion, requiring in parallel that the highly conservative assessments be surmounted by refining dosimetric assessments in many respects. Precise Lagrangian computations of dosimetric cloud evolution after standardized (normal/incidental/SBO) tritium cloud emissions can today be numerically matched to real-time meteorological data, and to pattern data at diverse scales, for prompt/early and chronic tritium dose assessments. Integrated numerical platforms for environmental dose assessment of large-inventory tritium facilities are under development along these lines.

  17. Identifying protein complex by integrating characteristic of core-attachment into dynamic PPI network.

    Directory of Open Access Journals (Sweden)

    Xianjun Shen

    Full Text Available How to identify protein complexes is an important and challenging task in proteomics. It would make a great contribution to our knowledge of the molecular mechanisms of cellular life activities. However, the inherent organization and dynamic characteristics of the cell system have rarely been incorporated into existing algorithms for detecting protein complexes, because of the limitations of protein-protein interaction (PPI) data produced by high-throughput techniques. The availability of time-course gene expression profiles enables us to uncover the dynamics of molecular networks and improve the detection of protein complexes. To achieve this goal, this paper proposes a novel algorithm, DCA (Dynamic Core-Attachment). It detects protein-complex cores comprising continually expressed and highly connected proteins in a dynamic PPI network; a protein complex is then formed by including in the core the attachments with high adhesion. The integration of the core-attachment feature into the dynamic PPI network is responsible for the superiority of our algorithm. DCA has been applied to two different yeast dynamic PPI networks, and the experimental results show that it performs significantly better than state-of-the-art techniques in terms of prediction accuracy, hF-measure and statistical significance in biology. In addition, the identified complexes with strong biological significance provide potential candidate complexes for biologists to validate.
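
    The core-attachment idea can be sketched as follows. This toy version takes the set of continually expressed proteins as given (rather than deriving it from time-course expression data) and uses a simple edge-fraction adhesion ratio, so it illustrates the concept rather than the DCA algorithm itself; all thresholds and names are assumptions.

```python
def detect_complex(graph, expressed, min_core_degree=2, adhesion=0.5):
    """Toy core-attachment detection on a PPI graph (adjacency dict).

    Core: continually expressed proteins with at least `min_core_degree`
    interactions among themselves (a stand-in for 'highly connected').
    Attachments: other proteins whose fraction of edges into the core
    meets the `adhesion` threshold.  Returns (core, full complex)."""
    core = {n for n in expressed
            if sum(1 for nb in graph[n] if nb in expressed) >= min_core_degree}
    members = set(core)
    for n in graph:
        if n in core:
            continue
        nbs = graph[n]
        if nbs and sum(1 for nb in nbs if nb in core) / len(nbs) >= adhesion:
            members.add(n)  # high-adhesion attachment joins the complex
    return core, members
```

A dynamic variant would recompute the expressed set per time point and intersect the resulting cores, which is roughly what makes a core "continually expressed" in the full method.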

  18. Introduction to Large-sized Test Facility for validating Containment Integrity under Severe Accidents

    International Nuclear Information System (INIS)

    Na, Young Su; Hong, Seongwan; Hong, Seongho; Min, Beongtae

    2014-01-01

    An overall assessment of containment integrity can be conducted properly by examining the hydrogen behavior in the containment building. Under severe accidents, a large amount of hydrogen gas can be generated by metal oxidation and corium-concrete interaction. Hydrogen behavior in the containment building strongly depends on complicated thermal-hydraulic conditions with mixed gases and steam. The performance of a PAR can be directly affected by the thermal-hydraulic conditions, steam content, gas mixture behavior and aerosol characteristics, as well as by the operation of other engineered safety systems such as a spray. The models in computer codes for severe accident assessment can be validated against experimental results from a large-sized test facility. The Korea Atomic Energy Research Institute (KAERI) is now preparing a large-sized test facility to examine in detail the safety issues related to hydrogen, including the performance of safety devices such as a PAR, in various severe accident situations. This paper introduces the KAERI test facility for validating containment integrity under severe accidents. To validate containment integrity, a large-sized test facility is necessary for simulating the complicated phenomena induced by the large amounts of steam and gases, especially hydrogen, released into the containment building under severe accidents. A pressure vessel 9.5 m in height and 3.4 m in diameter was designed for the KAERI test facility for validating containment integrity, based on the THAI test facility, whose experimental safety and reliable measurement systems have been certified over a long period. This large-sized pressure vessel, operated with steam and iodine as a corrosive agent, was made of stainless steel 316L for corrosion resistance over a long operating time, and the vessel was installed at KAERI in March 2014. 
In the future, the control systems for temperature and pressure in a vessel will be constructed, and the measurement system

  19. Methods Dealing with Complexity in Selecting Joint Venture Contractors for Large-Scale Infrastructure Projects

    Directory of Open Access Journals (Sweden)

    Ru Liang

    2018-01-01

    Full Text Available The magnitude of business dynamics has increased rapidly due to the increased complexity, uncertainty, and risk of large-scale infrastructure projects. This has made it increasingly difficult for a single contractor to "go it alone." As a consequence, joint venture contractors with diverse strengths and weaknesses bid cooperatively. Understanding project complexity and deciding on the optimal joint venture contractor are challenging. This paper studies how to select joint venture contractors for undertaking large-scale infrastructure projects based on a multiattribute mathematical model. Two different methods are developed to solve the problem: one based on ideal points and the other on balanced ideal advantages. Both methods consider individual differences in expert judgment and contractor attributes. A case study of the Hong Kong-Zhuhai-Macao Bridge (HZMB) project in China demonstrates how to apply these two methods and their advantages.
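An ideal-point method of the kind the abstract mentions can be sketched as follows. This is a generic TOPSIS-style illustration with invented attribute scores, not the paper's exact model.

```python
# Minimal ideal-point ranking sketch for joint-venture contractor selection:
# each candidate is scored by its closeness to the ideal point (best value
# on every attribute) relative to the anti-ideal point.
# (Hypothetical candidates and benefit-type attributes: cost score,
# experience, safety record; not the paper's data.)

def rank_by_ideal_point(candidates):
    """candidates: dict name -> list of benefit-type attribute scores."""
    n = len(next(iter(candidates.values())))
    ideal = [max(v[i] for v in candidates.values()) for i in range(n)]
    anti = [min(v[i] for v in candidates.values()) for i in range(n)]

    def closeness(v):
        d_pos = sum((a - b) ** 2 for a, b in zip(v, ideal)) ** 0.5
        d_neg = sum((a - b) ** 2 for a, b in zip(v, anti)) ** 0.5
        return d_neg / (d_pos + d_neg)  # 1.0 means "at the ideal point"

    return sorted(candidates, key=lambda c: closeness(candidates[c]), reverse=True)

jv_bids = {
    "JV-A": [0.8, 0.9, 0.7],
    "JV-B": [0.6, 0.7, 0.9],
    "JV-C": [0.5, 0.6, 0.6],
}
ranking = rank_by_ideal_point(jv_bids)  # best candidate first
```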

  20. Study of integrated optimization design of wind farm in complex terrain

    DEFF Research Database (Denmark)

    Xu, Chang; Chen, Dandan; Han, Xingxing

    2017-01-01

    wind farm design in complex terrain and setting up an integrated optimization mathematical model for micro-site selection, power lines and road maintenance design etc.. Based on the existing 1-year wind measurement data in the wind farm area, the genetic algorithm was used to optimize the micro-site selection. On the basis of the location optimization of wind turbines, optimization algorithms such as the single-source shortest path algorithm and the minimum spanning tree algorithm were used to optimize electric lines and maintenance roads. The practice shows that the research results can provide important...
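The minimum-spanning-tree step mentioned in the abstract can be illustrated with a small Prim's-algorithm sketch; the turbine coordinates are made up for the example, and real layouts would use terrain-aware costs rather than straight-line distance.

```python
# Prim's-algorithm sketch of the MST step for laying out electric lines or
# maintenance roads between optimized turbine positions.
# (Hypothetical coordinates; Euclidean distance stands in for real
# terrain-dependent construction cost.)

import math

def mst_edges(points):
    """Prim's algorithm over the complete Euclidean graph.
    Returns a list of (length, i, j) edges forming a spanning tree."""
    in_tree = {0}
    edges = []
    while len(in_tree) < len(points):
        best = None
        for i in in_tree:
            for j in range(len(points)):
                if j in in_tree:
                    continue
                d = math.dist(points[i], points[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

turbines = [(0, 0), (1, 0), (1, 1), (5, 5)]  # hypothetical layout
edges = mst_edges(turbines)
total_length = sum(e[0] for e in edges)      # total cable/road length
```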

  1. Does company size matter? Validation of an integrative model of safety behavior across small and large construction companies.

    Science.gov (United States)

    Guo, Brian H W; Yiu, Tak Wing; González, Vicente A

    2018-02-01

    Previous safety climate studies primarily focused on either large construction companies or the construction industry as a whole, while little is known about whether company size has significant effects on workers' understanding of safety climate measures and relationships between safety climate factors and safety behavior. Thus, this study aims to: (a) test the measurement equivalence (ME) of a safety climate measure across workers from small and large companies; (b) investigate if company size alters the causal structure of the integrative model developed by Guo, Yiu, and González (2016). Data were collected from 253 construction workers in New Zealand using a safety climate measure. This study used multi-group confirmatory factor analyses (MCFA) to test the measurement equivalence of the safety climate measure and structure invariance of the integrative model. Results indicate that workers from small and large companies understood the safety climate measure in a similar manner. In addition, it was suggested that company size does not change the causal structure and mediational processes of the integrative model. Both measurement equivalence of the safety climate measure and structural invariance of the integrative model were supported by this study. Practical applications: Findings of this study provided strong support for a meaningful use of the safety climate measure across construction companies in different sizes. Safety behavior promotion strategies designed based on the integrative model may be well suited for both large and small companies. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  2. Exploring Integration of Care for Children Living with Complex Care Needs across the European Union and European Economic Area.

    Science.gov (United States)

    Brenner, Maria; O'Shea, Miriam; J Larkin, Philip; Kamionka, Stine Lundstroem; Berry, Jay; Hiscock, Harriet; Rigby, Michael; Blair, Mitch

    2017-04-24

    The aim of this paper is to report on the development of surveys to explore integration of care for children living with complex care needs across the European Union (EU) and European Economic Area (EEA). Each survey consists of a vignette and questions adapted from the Standards for Systems of Care for Children and Youth with Special Health Care Needs and the Eurobarometer Survey . A Country Agent in each country, a local expert in child health services, will obtain data from indigenous sources. We identified 'in-principle' complex problems and adapted surveys to capture care integration. We expect to get rich data to understand perceptions and to inform actions for a number of complex health issues. The study has the potential to make a wide contribution to individual countries of the EU/EEA to understand their own integration of services mapped against responses from other member states. Early results are expected in Spring 2017.

  3. Measurements of complex impedance in microwave high power systems with a new bluetooth integrated circuit.

    Science.gov (United States)

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for Bluetooth applications, it is possible to simplify the method of measuring the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable-level input signals and is less sensitive to both amplitude and frequency fluctuations of industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low-stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented.
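A detector of this type outputs a magnitude ratio (in dB) and a relative phase; turning such readings into a complex reflection coefficient and a load impedance is a short calculation. The sketch below is a generic illustration of that conversion, not the paper's calibration procedure, and the numeric readings are invented.

```python
# Hedged sketch: converting gain/phase readings (magnitude ratio in dB plus
# relative phase in degrees, as a gain/phase detector provides) into a
# complex reflection coefficient, and then into a load impedance on a line
# of characteristic impedance Z0 via the standard transmission-line relation.

import cmath

def reflection_coefficient(gain_db, phase_deg):
    magnitude = 10 ** (gain_db / 20)               # dB ratio -> linear voltage ratio
    return cmath.rect(magnitude, cmath.pi * phase_deg / 180)

def load_impedance(gamma, z0=50.0):
    return z0 * (1 + gamma) / (1 - gamma)          # Z = Z0 (1+Γ)/(1-Γ)

gamma = reflection_coefficient(-6.0, 0.0)          # hypothetical reading: -6 dB, 0°
z = load_impedance(gamma)                          # mostly resistive, ~3x Z0
```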

  4. Perspectives on integrating the US radioactive waste disposal system

    International Nuclear Information System (INIS)

    Culler, F.L.; Croff, A.G.

    1990-01-01

    The waste management system being developed and deployed by the DOE Office of Civilian Radioactive Waste Management (OCRWM) is large, complex, decentralized, and long term. As a result, a systems integration approach has been implemented by OCRWM. The fundamentals of systems integration and its application are examined in the context of the OCRWM program. This application is commendable, and some additional systems integration features are suggested to enhance its benefits. 6 refs., 1 fig

  5. When is vertical integration profitable? Focus on a large upstream company in the gas market

    International Nuclear Information System (INIS)

    Hatlebakk, Magnus

    2001-12-01

    This note discusses basic economic mechanisms that may affect the profitability of vertical integration in the European gas industry. It concentrates on reasonable strategies for a large upstream company that is considering a stronger engagement downstream. The note warns against drawing simplified conclusions about the impact of vertical integration. It applies a simple model of successive oligopolies to discuss double mark-ups, exclusions, barriers to entry, etc.

  6. Large-scale Wind Power integration in a Hydro-Thermal Power Market

    OpenAIRE

    Trøtscher, Thomas

    2007-01-01

    This master thesis describes a quadratic programming model used to calculate the spot prices in an efficient multi-area power market. The model has been adapted to Northern Europe, with focus on Denmark West and the integration of large quantities of wind power. In the model, demand and supply of electricity are equated, at an hourly time resolution, to find the spot price in each area. Historical load values are used to represent demand, which is assumed to be completely inelastic. Supply i...
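The idea of equating inelastic demand with supply to find a spot price can be shown with a drastically simplified single-area, single-hour merit-order sketch; the actual thesis model is a multi-area quadratic program, and the offer numbers below are invented.

```python
# Simplified single-hour market clearing: inelastic demand is stacked
# against supply offers in merit order (cheapest first); the marginal
# offer sets the spot price. (Illustrative figures, not the thesis model.)

def clear_market(demand_mw, offers):
    """offers: list of (price_eur_per_mwh, capacity_mw), in any order."""
    remaining = demand_mw
    price = 0.0
    for offer_price, cap in sorted(offers):    # merit order
        if remaining <= 0:
            break
        taken = min(cap, remaining)
        remaining -= taken
        price = offer_price                    # marginal unit sets the price
    if remaining > 0:
        raise ValueError("demand exceeds total supply")
    return price

# Hypothetical offers: wind (zero marginal cost), hydro, thermal.
offers = [(0.0, 800), (25.0, 1000), (60.0, 500)]
spot = clear_market(1500, offers)  # hydro is marginal, so the price is 25
```

Note how adding cheap wind capacity shifts the supply stack to the right, which is why large-scale wind integration tends to depress the spot price in hours with high wind output.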

  7. Pilot Study of Topical Copper Chlorophyllin Complex in Subjects With Facial Acne and Large Pores.

    Science.gov (United States)

    Stephens, Thomas J; McCook, John P; Herndon, James H

    2015-06-01

    Acne vulgaris is one of the most common skin diseases treated by dermatologists. Salts of copper chlorophyllin complex are semi-synthetic naturally-derived compounds with antioxidant, anti-inflammatory and wound healing activity that have not been previously tested topically in the treatment of acne-prone skin with enlarged pores. This single-center pilot study was conducted to assess the efficacy and safety of a liposomal dispersion of topically applied sodium copper chlorophyllin complex in subjects with mild-moderate acne and large, visible pores over a course of 3 weeks. Subjects were supplied with the test product, a topical gel containing a liposomal dispersion of sodium copper chlorophyllin complex (0.1%) with directions to apply a small amount to the facial area twice daily. Clinical assessments were performed at screening/baseline and at week 3. VISIA readings were taken and self-assessment questionnaires were conducted. 10 subjects were enrolled and completed the 3-week study. All clinical efficacy parameters showed statistically significant improvements over baseline at week 3. The study product was well tolerated. Subject questionnaires showed the test product was highly rated. In this pilot study, a topical formulation containing a liposomal dispersion of sodium copper chlorophyllin complex was shown to be clinically effective and well tolerated for the treatment of mild-moderate acne and large, visible pores when used for 3 weeks.

  8. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    Full Text Available With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  9. Integrative Bioengineering Institute

    Energy Technology Data Exchange (ETDEWEB)

    Eddington, David; Magin,L,Richard; Hetling, John; Cho, Michael

    2009-01-09

    Microfabrication enables many exciting experimental possibilities for medicine and biology that are not attainable through traditional methods. However, in order for microfabricated devices to have an impact they must not only provide a robust solution to a current unmet need, but also be simple enough to seamlessly integrate into standard protocols. Broad dissemination of bioMEMS has been stymied by the common aim of replacing established and well accepted protocols with equally or more complex devices, methods, or materials. The marriage of a complex, difficult to fabricate bioMEMS device with a highly variable biological system is rarely successful. Instead, the design philosophy of my lab aims to leverage a beneficial microscale phenomena (e.g. fast diffusion at the microscale) within a bioMEMS device and adapt to established methods (e.g. multiwell plate cell culture) and demonstrate a new paradigm for the field (adapt instead of replace). In order for the field of bioMEMS to mature beyond novel proof-of-concept demonstrations, researchers must focus on developing systems leveraging these phenomena and integrating into standard labs, which have largely been ignored. Towards this aim, the Integrative Bioengineering Institute has been established.

  10. Integrated digital control and man-machine interface for complex remote handing systems

    International Nuclear Information System (INIS)

    Rowe, J.C.; Spille, R.F.; Zimmermann, S.D.

    1987-01-01

    The Advanced Integrated Maintenance System (AIMS) is part of a continuing effort within the Consolidated Fuel Reprocessing Program at Oak Ridge National Laboratory to develop and extend the capabilities of remote manipulation and maintenance technology. The AIMS is a totally integrated approach to remote handling in hazardous environments. State-of-the-art computer systems connected through a high-speed communication network provide a real-time distributed control system that supports the flexibility and expandability needed for large integrated maintenance applications. A Man-Machine Interface provides high-level human interaction through a powerful color graphics menu-controlled operator console. An auxiliary control system handles the real-time processing needs for a variety of support hardware. A pair of dedicated fiber-optic-linked master/slave computer systems control the Advanced Servomanipulator master/slave arms using powerful distributed digital processing methods. The FORTH language was used as a real-time operating and development environment for the entire system, and all of these components are integrated into a control room concept that represents the latest advancements in the development of remote maintenance facilities for hazardous environments

  11. Integrated digital control and man-machine interface for complex remote handling systems

    International Nuclear Information System (INIS)

    Rowe, J.C.; Spille, R.F.; Zimmermann, S.D.

    1986-12-01

    The Advanced Integrated Maintenance System (AIMS) is part of a continuing effort within the Consolidated Fuel Reprocessing Program at Oak Ridge National Laboratory to develop and extend the capabilities of remote manipulation and maintenance technology. The AIMS is a totally integrated approach to remote handling in hazardous environments. State-of-the-art computer systems connected through a high-speed communication network provide a real-time distributed control system that supports the flexibility and expandability needed for large integrated maintenance applications. A Man-Machine Interface provides high-level human interaction through a powerful color graphics menu-controlled operator console. An auxiliary control system handles the real-time processing needs for a variety of support hardware. A pair of dedicated fiber-optic-linked master/slave computer systems control the Advanced Servomanipulator master/slave arms using powerful distributed digital processing methods. The FORTH language was used as a real-time operating and development environment for the entire system, and all of these components are integrated into a control room concept that represents the latest advancements in the development of remote maintenance facilities for hazardous environments

  12. Complex fluid network optimization and control integrative design based on nonlinear dynamic model

    International Nuclear Information System (INIS)

    Sui, Jinxue; Yang, Li; Hu, Yunan

    2016-01-01

    To distribute flow according to a complex fluid network's needs, this paper proposes an optimization computation method for a nonlinear programming mathematical model based on a genetic algorithm. The simulation results show that the overall energy consumption of the optimized fluid network decreases appreciably. The control model of the fluid network is established based on nonlinear dynamics. We design the control law based on feedback linearization, take the optimal values found by the genetic algorithm as the simulation data, and can also solve for the branch resistances under the optimal values. These resistances can provide technical support and reference for fluid network design and construction, thereby realizing integrated optimization and control design of complex fluid networks.
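The genetic-algorithm optimization step can be sketched generically. The objective function below is a stand-in with a known minimum, not the paper's energy-consumption model, and all parameters are illustrative.

```python
# Toy genetic-algorithm sketch of the optimization step: search for branch
# settings that minimize a (stand-in) network energy-consumption function.
# (Hypothetical objective and bounds; not the paper's fluid-network model.)

import random

def energy(x):                      # stand-in objective, minimum at x = (2, 3)
    return (x[0] - 2) ** 2 + (x[1] - 3) ** 2

def genetic_minimize(obj, bounds, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=obj)
        parents = pop[: pop_size // 2]                     # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]    # midpoint crossover
            i = rng.randrange(len(child))                  # small Gaussian mutation
            child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return min(pop, key=obj)

best = genetic_minimize(energy, bounds=[(0, 10), (0, 10)])  # converges near (2, 3)
```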

  13. TradeWind. Integrating wind. Developing Europe's power market for the large-scale integration of wind power. Final report

    Energy Technology Data Exchange (ETDEWEB)

    2009-02-15

    Based on a single European grid and power market system, the TradeWind project explores to what extent large-scale wind power integration challenges could be addressed by reinforcing interconnections between Member States in Europe. Additionally, the project looks at the conditions required for a sound power market design that ensures a cost-effective integration of wind power at EU level. In this way, the study addresses two issues of key importance for the future integration of renewable energy, namely the weak interconnectivity levels between control zones and the inflexibility and fragmented nature of the European power market. Work on critical transmission paths and interconnectors is slow for a variety of reasons including planning and administrative barriers, lack of public acceptance, insufficient economic incentives for TSOs, and the lack of a joint European approach by the key stakeholders. (au)

  14. DNA-Directed Assembly of Capture Tools for Constitutional Studies of Large Protein Complexes.

    Science.gov (United States)

    Meyer, Rebecca; Faesen, Alex; Vogel, Katrin; Jeganathan, Sadasivam; Musacchio, Andrea; Niemeyer, Christof M

    2015-06-10

    Large supramolecular protein complexes, such as the molecular machinery involved in gene regulation, cell signaling, or cell division, are key in all fundamental processes of life. Detailed elucidation of structure and dynamics of such complexes can be achieved by reverse-engineering parts of the complexes in order to probe their interactions with distinctive binding partners in vitro. The exploitation of DNA nanostructures to mimic partially assembled supramolecular protein complexes in which the presence and state of two or more proteins are decisive for binding of additional building blocks is reported here. To this end, four-way DNA Holliday junction motifs bearing a fluorescein and a biotin tag, for tracking and affinity capture, respectively, are site-specifically functionalized with centromeric protein (CENP) C and CENP-T. The latter serves as baits for binding of the so-called KMN component, thereby mimicking early stages of the assembly of kinetochores, structures that mediate and control the attachment of microtubules to chromosomes in the spindle apparatus. Results from pull-down experiments are consistent with the hypothesis that CENP-C and CENP-T may bind cooperatively to the KMN network. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  16. Electromagnetic scattering of large structures in layered earths using integral equations

    Science.gov (United States)

    Xiong, Zonghou; Tripp, Alan C.

    1995-07-01

    An electromagnetic scattering algorithm for large conductivity structures in stratified media has been developed and is based on the method of system iteration and spatial symmetry reduction using volume electric integral equations. The method of system iteration divides a structure into many substructures and solves the resulting matrix equation using a block iterative method. The block submatrices usually need to be stored on disk in order to save computer core memory. However, this requires a large disk for large structures. If the body is discretized into equal-size cells it is possible to use the spatial symmetry relations of the Green's functions to regenerate the scattering impedance matrix in each iteration, thus avoiding expensive disk storage. Numerical tests show that the system iteration converges much faster than the conventional point-wise Gauss-Seidel iterative method. The number of cells does not significantly affect the rate of convergence. Thus the algorithm effectively reduces the solution of the scattering problem to an order of O(N^2), instead of O(N^3) as with direct solvers.
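The "system iteration" has the structure of a block Gauss-Seidel sweep: partition the unknowns into substructure blocks and relax them block by block. The sketch below shows that structure on a tiny dense linear system; it is a generic illustration (for brevity each block is relaxed point-wise), not the paper's integral-equation solver.

```python
# Pure-Python sketch of block Gauss-Seidel for A x = b: the unknowns are
# partitioned into blocks (substructures) and swept block by block.
# (Generic illustration; each block is relaxed point-wise for brevity.)

def block_gauss_seidel(A, b, block, iters=50):
    n = len(b)
    x = [0.0] * n
    blocks = [range(s, min(s + block, n)) for s in range(0, n, block)]
    for _ in range(iters):
        for blk in blocks:
            for i in blk:
                s = sum(A[i][j] * x[j] for j in range(n) if j != i)
                x[i] = (b[i] - s) / A[i][i]
    return x

# Small diagonally dominant test system; exact solution is x = (1, 1, 1).
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = block_gauss_seidel(A, b, block=2)
```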

  17. Impacts of large dams on the complexity of suspended sediment dynamics in the Yangtze River

    Science.gov (United States)

    Wang, Yuankun; Rhoads, Bruce L.; Wang, Dong; Wu, Jichun; Zhang, Xiao

    2018-03-01

    The Yangtze River is one of the largest and most important rivers in the world. Over the past several decades, the natural sediment regime of the Yangtze River has been altered by the construction of dams. This paper uses multi-scale entropy analysis to ascertain the impacts of large dams on the complexity of high-frequency suspended sediment dynamics in the Yangtze River system, especially after impoundment of the Three Gorges Dam (TGD). In this study, the complexity of sediment dynamics is quantified by framing it within the context of entropy analysis of time series. Data on daily sediment loads for four stations located in the mainstem are analyzed for the past 60 years. The results indicate that dam construction has reduced the complexity of short-term (1-30 days) variation in sediment dynamics near the structures, but that complexity has actually increased farther downstream. This spatial pattern seems to reflect a filtering effect of the dams on the temporal pattern of sediment loads as well as decreased longitudinal connectivity of sediment transfer through the river system, resulting in downstream enhancement of the influence of local sediment inputs by tributaries on sediment dynamics. The TGD has had a substantial impact on the complexity of sediment series in the mainstem of the Yangtze River, especially after it became fully operational. This enhanced impact is attributed to the high trapping efficiency of this dam and its associated large reservoir. The sediment dynamics "signal" becomes more spatially variable after dam construction. This study demonstrates the spatial influence of dams on the high-frequency temporal complexity of sediment regimes and provides valuable information that can be used to guide environmental conservation of the Yangtze River.
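Multi-scale entropy analysis of the kind used here combines two steps: coarse-graining the series at increasing scales and computing sample entropy at each scale. The sketch below is a minimal generic implementation with synthetic noise standing in for sediment data; the parameters are illustrative, not the study's configuration.

```python
# Minimal multi-scale entropy sketch: coarse-grain a series at several
# scales and compute sample entropy of each coarse-grained series.
# (Synthetic white noise stands in for daily sediment loads; m and r
# are common defaults, not the authors' exact settings.)

import math, random

def coarse_grain(series, scale):
    """Non-overlapping averages of `scale` consecutive points."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def sample_entropy(series, m=2, r_factor=0.2):
    mean = sum(series) / len(series)
    sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
    r = r_factor * sd                      # tolerance for template matching

    def matches(mm):
        templates = [series[i:i + mm] for i in range(len(series) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return math.log(b / a) if a else float("inf")

rng = random.Random(0)
noise = [rng.random() for _ in range(300)]
mse = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
```

A dam's smoothing effect would show up as lower sample entropy at short scales downstream of the structure, which is the signature the study quantifies.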

  18. Inferring genetic architecture of complex traits using Bayesian integrative analysis of genome and transcriptome data

    DEFF Research Database (Denmark)

    Ehsani, Alireza; Sørensen, Peter; Pomp, Daniel

    2012-01-01

    Background To understand the genetic architecture of complex traits and bridge the genotype-phenotype gap, it is useful to study intermediate -omics data, e.g. the transcriptome. The present study introduces a method for simultaneous quantification of the contributions from single nucleotide......-modal distribution of genomic values collapses when gene expressions are added to the model. Conclusions: With increased availability of various -omics data, integrative approaches are promising tools for understanding the genetic architecture of complex traits. Partitioning of explained variances at the chromosome...

  19. Speciation on the rocks: integrated systematics of the Heteronotia spelea species complex (Gekkota; Reptilia) from Western and Central Australia.

    Directory of Open Access Journals (Sweden)

    Mitzy Pepper

    Full Text Available The isolated uplands of the Australian arid zone are known to provide mesic refuges in an otherwise xeric landscape, and divergent lineages of largely arid zone taxa have persisted in these regions following the onset of Miocene aridification. Geckos of the genus Heteronotia are one such group, and have been the subject of many genetic studies, including H. spelea, a strongly banded form that occurs in the uplands of the Pilbara and Central Ranges regions of the Australian arid zone. Here we assess the systematics of these geckos based on detailed examination of morphological and genetic variation. The H. spelea species complex is a monophyletic lineage to the exclusion of the H. binoei and H. planiceps species complexes. Within the H. spelea complex, our previous studies based on mtDNA and nine nDNA loci found populations from the Central Ranges to be genetically divergent from Pilbara populations. Here we supplement our published molecular data with additional data gathered from central Australian samples. In the spirit of integrative species delimitation, we combine multi-locus, coalescent-based lineage delimitation with extensive morphological analyses to test species boundaries, and we describe the central populations as a new species, H. fasciolatus sp. nov. In addition, within the Pilbara there is strong genetic evidence for three lineages corresponding to northeastern (type, southern, and a large-bodied melanic population isolated in the northwest. Due to its genetic distinctiveness and extreme morphological divergence from all other Heteronotia, we describe the melanic form as a new species, H. atra sp. nov. The northeastern and southern Pilbara populations are morphologically indistinguishable with the exception of a morpho-type in the southeast that has a banding pattern resembling H. planiceps from the northern monsoonal tropics. Pending more extensive analyses, we therefore treat Pilbara H. spelea as a single species with

  20. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

    Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented on ISTSS 2016. In general, a metamodel quickly interpolates the consequences of few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...
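The metamodel idea, interpolating the consequences of a few expensive simulations to many arbitrary scenarios, can be shown with a one-dimensional moving-least-squares sketch. The data are a toy stand-in, not tunnel-fire CFD results, and the Gaussian weight and bandwidth are illustrative choices.

```python
# One-dimensional moving-least-squares sketch of the metamodel idea: fit a
# local weighted linear model around each query point to interpolate the
# consequences of a few simulated scenarios.
# (Toy data; real inputs would be CFD/evacuation simulation outputs.)

import math

def mls_predict(xs, ys, x, bandwidth=1.0):
    """Locally weighted linear least squares evaluated at query point x."""
    w = [math.exp(-((xi - x) / bandwidth) ** 2) for xi in xs]  # Gaussian weights
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw            # weighted means
    my = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
    slope = sxy / sxx if sxx else 0.0
    return my + slope * (x - mx)

# Few "expensive" scenario results (e.g. fire size -> consequence metric);
# linear ground truth so the interpolation is exactly checkable.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 2.0, 3.0, 4.0]
y_hat = mls_predict(xs, ys, 2.5)   # interpolated consequence at a new scenario
```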

  1. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    Science.gov (United States)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible. Its components include: a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a ‘socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex Systems Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) providing new scalable ICT-based methods for very low cost scientific education, (ii) creating new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) providing a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  2. General support for integrated assessment research. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, Hadi

    2001-03-01

    The climate change problem spans an extraordinarily large number of disciplines from earth sciences to social and political sciences. The interaction of processes described by these different fields is why climate change is such a complex issue. Keeping track of these interactions and bringing coherence to the assumptions underlying each disciplinary insight on the climate problem is a massive undertaking. Integrated assessment is an interdisciplinary approach designed to provide systematic evaluations of technically complex problems such as the analysis of environmental change challenges facing humanity. Ph.D. theses stemming from this application are summarized. Then some aspects of Integrated Climate Assessment Models are described.

  3. SV40 large T-p53 complex: evidence for the presence of two immunologically distinct forms of p53

    International Nuclear Information System (INIS)

    Milner, J.; Gamble, J.

    1985-01-01

    The transforming protein of SV40 is the large T antigen. Large T binds a cellular protein, p53, which is potentially oncogenic by virtue of its functional involvement in the control of cell proliferation. This raises the possibility that p53 may mediate, in part, the transforming function of SV40 large T. Two immunologically distinct forms of p53 have been identified in normal cells: the forms are cell-cycle dependent, one being restricted to nondividing cells (p53-Go) and the second to dividing cells (p53-G÷). The authors have now dissociated and probed the multimeric complex of SV40 large T-p53 for the presence of immunologically distinct forms of p53. Here they present evidence for the presence of p53-Go and p53-G÷ complexed with SV40 large T.

  4. Automated NMR fragment based screening identified a novel interface blocker to the LARG/RhoA complex.

    Directory of Open Access Journals (Sweden)

    Jia Gao

    Full Text Available Small GTPases cycle between an inactive GDP-bound form and an activated GTP-bound form, a transition catalyzed by upstream guanine nucleotide exchange factors. Modulating this process with small molecules has proven a fruitful route for therapeutic intervention to prevent over-activation of small GTPases. The fragment-based approach that has emerged in the past decade has demonstrated its potential in the discovery of inhibitors targeting such novel and challenging protein-protein interactions. The details of building an NMR fragment screening procedure from scratch have rarely been disclosed comprehensively, which restricts its wider application. To achieve a consistent screening protocol applicable to a number of targets, we developed a highly automated protocol covering every aspect of NMR fragment screening, including the construction of a small but diverse library, determination of aqueous solubility by NMR, grouping of compounds with mutual dispersity into cocktails, and automated processing and visualization of the ligand-based screening spectra. We exemplified our streamlined screening on RhoA alone and on the complex of the small GTPase RhoA with its upstream guanine nucleotide exchange factor LARG. Two hits were confirmed from the primary screening in cocktails and the secondary screening over individual hits for the LARG/RhoA complex, and one of them was also identified in the screening against RhoA alone. HSQC titration of the two hits against RhoA and LARG separately identified one compound binding RhoA.GDP with 0.11 mM affinity and perturbing residues in the switch II region of RhoA. This hit blocked formation of the LARG/RhoA complex, as validated by native gel electrophoresis and by titration of RhoA into ¹⁵N-labeled LARG in the absence and presence of the compound. It therefore provides a starting point toward a more potent inhibitor of RhoA activation catalyzed by LARG.

  5. Talking about the institutional complexity of the integrated rehabilitation system – the importance of coordination

    Directory of Open Access Journals (Sweden)

    Sari Miettinen

    2013-03-01

    Full Text Available Rehabilitation in Finland is a good example of functions divided among several welfare sectors, such as health services and social services. The rehabilitation system in Finland is a complex one, and there have been many efforts to create a coordinated entity. The purpose of this study is to open up a complex welfare system at the upper policy level and to understand the meaning of coordination at the level of service delivery. In particular, we shed light on the national rehabilitation policy in Finland and on how the policy has tried to overcome the negative effects of institutional complexity. In this study we used qualitative content analysis and frame analysis. As a result, we identified four different welfare state frames with distinct features of policy problems, policy alternatives and institutional failure. The rehabilitation policy in Finland seems to be divided into different components, which may cause problems at the level of service delivery and thus in the integration of services. Bringing these components together could, at the policy level, enable a shared view of the rights of different population groups, effective management of integration at the level of service delivery, and also an opportunity for change throughout the rehabilitation system.

  6. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks

    Directory of Open Access Journals (Sweden)

    Raja Jurdak

    2008-11-01

    Full Text Available Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  7. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks.

    Science.gov (United States)

    Jurdak, Raja; Nafaa, Abdelhamid; Barbirato, Alessio

    2008-11-24

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  8. Test methods of total dose effects in very large scale integrated circuits

    International Nuclear Information System (INIS)

    He Chaohui; Geng Bin; He Baoping; Yao Yujuan; Li Yonghong; Peng Honglun; Lin Dongsheng; Zhou Hui; Chen Yusheng

    2004-01-01

    A test method for total dose effects (TDE) in very large scale integrated circuits (VLSI) is presented. The consumption current of the devices is measured while the functional parameters of the devices (or circuits) are measured. The relation between data errors and consumption current can then be analyzed, and the mechanism of TDEs in VLSI proposed. Experimental results of ⁶⁰Co γ total dose tests are given for SRAMs, EEPROMs, FLASH ROMs and a kind of CPU.

  9. Real and complex analysis

    CERN Document Server

    Apelian, Christopher; Taft, Earl; Nashed, Zuhair

    2009-01-01

    The Spaces R, Rk, and CThe Real Numbers RThe Real Spaces RkThe Complex Numbers CPoint-Set Topology Bounded SetsClassification of Points Open and Closed SetsNested Intervals and the Bolzano-Weierstrass Theorem Compactness and Connectedness Limits and Convergence Definitions and First Properties Convergence Results for SequencesTopological Results for Sequences Properties of Infinite SeriesManipulations of Series in RFunctions: Definitions and Limits DefinitionsFunctions as MappingsSome Elementary Complex FunctionsLimits of FunctionsFunctions: Continuity and Convergence Continuity Uniform Continuity Sequences and Series of FunctionsThe DerivativeThe Derivative for f: D1 → RThe Derivative for f: Dk → RThe Derivative for f: Dk → RpThe Derivative for f: D → CThe Inverse and Implicit Function TheoremsReal IntegrationThe Integral of f: [a, b] → RProperties of the Riemann Integral Further Development of Integration TheoryVector-Valued and Line IntegralsComplex IntegrationIntroduction to Complex Integrals Fu...

  10. Deciphering the clinical effect of drugs through large-scale data integration

    DEFF Research Database (Denmark)

    Kjærulff, Sonny Kim

    . This work demonstrates the power of a strategy that uses clinical data mining in association with chemical biology in order to reduce the search space and aid identification of novel drug actions. The second article described in chapter 3 outlines a high confidence side-effect-drug interaction dataset. We...... demonstrates the importance of using high-confidence drug-side-effect data in deciphering the effect of small molecules in humans. In summary, this thesis presents computational systems chemical biology approaches that can help identify clinical effects of small molecules through large-scale data integration...

  11. A dynamic globalization model for large eddy simulation of complex turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hae Cheon; Park, No Ma; Kim, Jin Seok [Seoul National Univ., Seoul (Korea, Republic of)

    2005-07-01

    A dynamic subgrid-scale model is proposed for large eddy simulation of turbulent flows in complex geometry. The eddy viscosity model by Vreman [Phys. Fluids, 16, 3670 (2004)] is considered as a base model. A priori tests with the original Vreman model show that it predicts the correct profile of subgrid-scale dissipation in turbulent channel flow but the optimal model coefficient is far from universal. Dynamic procedures of determining the model coefficient are proposed based on the 'global equilibrium' between the subgrid-scale dissipation and viscous dissipation. An important feature of the proposed procedures is that the model coefficient determined is globally constant in space but varies only in time. Large eddy simulations with the present dynamic model are conducted for forced isotropic turbulence, turbulent channel flow and flow over a sphere, showing excellent agreements with previous results.
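
    As a sketch of the base model the abstract builds on, the Vreman (2004) eddy viscosity can be evaluated from the velocity-gradient tensor at a single grid point, assuming a uniform filter width; the coefficient value and function name below are illustrative, and the paper's dynamic procedure would instead determine the coefficient globally in time from the dissipation balance.

```python
import numpy as np

def vreman_eddy_viscosity(grad_u, delta, c=0.07):
    """Vreman (2004) eddy viscosity at one grid point (illustrative sketch).

    grad_u : 3x3 array with alpha[i, j] = du_j / dx_i
    delta  : filter width (assumed uniform here)
    c      : model coefficient; constant in the base model, while the
             paper's dynamic procedure computes it globally in time
    """
    alpha = np.asarray(grad_u, dtype=float)
    aa = np.sum(alpha * alpha)            # alpha_ij alpha_ij
    if aa < 1e-12:                        # no gradient -> no SGS viscosity
        return 0.0
    beta = delta**2 * alpha.T @ alpha     # beta_ij = delta^2 alpha_mi alpha_mj
    b_beta = (beta[0, 0] * beta[1, 1] - beta[0, 1]**2
              + beta[0, 0] * beta[2, 2] - beta[0, 2]**2
              + beta[1, 1] * beta[2, 2] - beta[1, 2]**2)
    return c * np.sqrt(max(b_beta, 0.0) / aa)
```

A known property of this model is that it returns zero eddy viscosity in simple shear, which the sketch reproduces.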

  12. Complexity impact factors on the integration process of ERP and non ERP systems : a basis for an evaluation instrument

    NARCIS (Netherlands)

    Janssens, G.; Hoeijenbos, M.; Kusters, R.J.; Cuaresma, M.J.E.; Shishkov, B.; Cordeiro, J.

    2011-01-01

    This study presents an expert-confirmed initial list of factors that influence the complexity of the process of integrating ERP systems and non-ERP systems. After a thorough search for complexity factors in the scientific literature, a survey amongst 8 experts in a leading European long special steel

  13. Priority Directions of The Regional Food Complex Effectiveness Increase

    Directory of Open Access Journals (Sweden)

    Dmitry Andreyevich Karkh

    2015-06-01

    Full Text Available The article considers modern trends in management integration as a solution to food and sectoral problems. On the basis of national and foreign experience in the development of integration, a classification of integration in economic systems is developed, and from this classification a concept of integration is derived. Much attention is given to the solution of the food problem, which depends largely on an agro-industrial complex based on meta-integration. The leading place of the food complex within the agro-industrial complex follows from the role of food and food raw materials in the life of the country's population. The sectors participating in the production of food and consumer goods make up the sectoral structure of the agro-industrial complex. Under the resource constraints on production and food delivery to the domestic market, the role of trade will increase not only in the agro-industrial complex but throughout the national economy. The article also studies the phenomenon of clusters. Attention to clusters as innovation systems reflects a rising interest of economic science in the functioning of economies at the regional level and an understanding of the role of specific local resources in stimulating the innovative capacity and competitiveness of small and medium business. In creating a cluster, participants develop a spatially and organizationally integrated structure in which the interacting legal entities retain their status, while cooperation provides competitive advantages over other business entities. The role of the state in the integration of cluster formations is more significant in the Russian Federation than in any other country: the state, represented by regional authorities, actively participates in the decision-making of businesses located in its catchment area, not least through membership in the governing bodies of large joint-stock companies.

  14. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  15. Characterization of Aftershock Sequences from Large Strike-Slip Earthquakes Along Geometrically Complex Faults

    Science.gov (United States)

    Sexton, E.; Thomas, A.; Delbridge, B. G.

    2017-12-01

    Large earthquakes often exhibit complex slip distributions and occur along non-planar fault geometries, resulting in variable stress changes throughout the region of the fault hosting aftershocks. To better discern the role of geometric discontinuities on aftershock sequences, we compare areas of enhanced and reduced Coulomb failure stress and mean stress for systematic differences in the time dependence and productivity of these aftershock sequences. In strike-slip faults, releasing structures, including stepovers and bends, experience an increase in both Coulomb failure stress and mean stress during an earthquake, promoting fluid diffusion into the region and further failure. Conversely, Coulomb failure stress and mean stress decrease in restraining bends and stepovers in strike-slip faults, and fluids diffuse away from these areas, discouraging failure. We examine spatial differences in seismicity patterns along structurally complex strike-slip faults which have hosted large earthquakes, such as the 1992 Mw 7.3 Landers, the 2010 Mw 7.2 El-Mayor Cucapah, the 2014 Mw 6.0 South Napa, and the 2016 Mw 7.0 Kumamoto events. We characterize the behavior of these aftershock sequences with the Epidemic Type Aftershock-Sequence Model (ETAS). In this statistical model, the total occurrence rate of aftershocks induced by an earthquake is λ(t) = λ_0 + Σ_{i: t_i < t} K_i/(t − t_i + c)^p
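
    The ETAS occurrence rate above can be sketched directly; the parameter values and the exponential magnitude-scaling term below are illustrative defaults of a basic ETAS formulation, not values from the study.

```python
import math

def etas_rate(t, background, events, K=0.05, c=0.01, p=1.1, alpha=1.0, m_ref=4.0):
    """Total aftershock occurrence rate at time t under a basic ETAS model.

    background : constant background rate (lambda_0)
    events     : list of (t_i, m_i) pairs for earthquakes before time t
    K, c, p    : modified-Omori productivity and decay parameters
    alpha, m_ref : magnitude scaling (illustrative, not from the abstract)
    """
    rate = background
    for t_i, m_i in events:
        if t_i < t:  # only past events contribute to the rate
            rate += K * math.exp(alpha * (m_i - m_ref)) / (t - t_i + c) ** p
    return rate
```

With no triggering events the rate reduces to the background term, and the contribution of each event decays with elapsed time, as the modified-Omori kernel requires.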

  16. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control system was widely used in high precision, complex industrial control equipment and large-tonnage loading device. The measurement and control system is often used to analyze the distribution of stress and displacement in the complex bearing load or the complex nature of the mechanical structure itself. In ITER GS mock-up with 5 flexible plates, for each load combination, detect and measure potential slippage between the central flexible plate and the neighboring spacers is necessary as well as the potential slippage between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controller and board, computer system, 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods of EDC220 digital controller to achieve synchronization control, and R and D process of multi-channel loading control software and measurement software. (authors)

  17. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  18. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem

    2016-12-28

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). Particularly, we aim at reducing the complexity of the linear precoder and receiver that maximize the minimum signal-to-interference-plus-noise ratio subject to a given power constraint. To this end, we consider the asymptotic regime in which M and K grow large with a given ratio. Tools from random matrix theory (RMT) are then used to compute, in closed form, accurate approximations for the parameters of the optimal precoder and receiver, when imperfect channel state information (modeled by the generic Gauss-Markov formulation form) is available at the BS. The asymptotic analysis allows us to derive the asymptotically optimal linear precoder and receiver that are characterized by a lower complexity (due to the dependence on the large scale components of the channel) and, possibly, by a better resilience to imperfect channel state information. However, the implementation of both is still challenging as it requires fast inversions of large matrices in every coherence period. To overcome this issue, we apply the truncated polynomial expansion (TPE) technique to the precoding and receiving vector of each UE and make use of RMT to determine the optimal weighting coefficients on a per- UE basis that asymptotically solve the max-min SINR problem. Numerical results are used to validate the asymptotic analysis in the finite system regime and to show that the proposed TPE transceivers efficiently mimic the optimal ones, while requiring much lower computational complexity.
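
    The core TPE idea, replacing a matrix inverse by a low-degree matrix polynomial, can be sketched in a toy form; the plain Neumann-series weights below are a stand-in for the RMT-optimized per-UE coefficients derived in the paper.

```python
import numpy as np

def tpe_solve(A, b, degree=8):
    """Approximate A^{-1} b by a truncated polynomial in A (TPE sketch).

    Uses nu^{-1} * sum_l (I - A/nu)^l b with nu chosen so the series
    converges for symmetric positive-definite A; the paper instead
    optimizes the polynomial weights via random matrix theory.
    """
    nu = np.linalg.norm(A, 2) * 1.01      # scaling that ensures convergence
    x = np.zeros_like(b)
    term = b.copy()
    for _ in range(degree + 1):
        x += term / nu                    # accumulate nu^{-1} (I - A/nu)^l b
        term = term - (A @ term) / nu     # next power of (I - A/nu)
    return x
```

Each iteration costs only a matrix-vector product, which is the complexity advantage over inverting the full matrix in every coherence period.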

  19. DISEÑO CURRICULAR: DE LA INTEGRACIÓN A LA COMPLEJIDAD (CURRICULUM DESIGN: FROM INTEGRATION TO COMPLEXITY)

    Directory of Open Access Journals (Sweden)

    Badilla Saxe Eleonora

    2009-08-01

    Full Text Available This essay traces a path that curriculum design can follow from integration to complexity. It starts from James Beane's ideas on Curriculum Integration and John Dewey's on Occupational Activities, passes through the so-called Transversal Axes and the Project Approach, and aims to evolve toward a Pedagogy of Complexity, building on Edgar Morin's ideas for promoting complex thinking. As a conclusion, a change is proposed in the metaphor with which curricula and study plans are designed.

  20. Kernel methods for large-scale genomic data analysis

    Science.gov (United States)

    Xing, Eric P.; Schaid, Daniel J.

    2015-01-01

    Machine learning, particularly kernel methods, has been demonstrated as a promising new tool to tackle the challenges imposed by today’s explosive data growth in genomics. Kernel methods provide a practical and principled approach to learning how a large number of genetic variants are associated with complex phenotypes, helping to reveal the complexity in the relationship between the genetic markers and the outcome of interest. In this review, we highlight the potential key role they will have in modern genomic data processing, especially with regard to integration with classical methods for gene prioritizing, prediction and data fusion. PMID:25053743
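
    A minimal sketch of the kind of kernel method discussed here is kernel ridge regression with an RBF kernel; the function names, parameter values, and the 0/1/2 genotype coding mentioned in the comments are illustrative assumptions, not the review's specific methods.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.1):
    """RBF (Gaussian) kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit_predict(X, y, X_new, lam=1.0, gamma=0.1):
    """Kernel ridge regression: alpha = (K + lam I)^{-1} y, f(x) = k(x)^T alpha.

    X could be a (samples x variants) genotype matrix coded 0/1/2;
    here it is treated simply as a numeric feature matrix.
    """
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return rbf_kernel(X_new, X, gamma) @ alpha
```

The kernel trick lets the number of variants enter only through pairwise sample similarities, which is what makes such methods attractive for wide genomic data.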

  1. How complex can integrated optical circuits become?

    NARCIS (Netherlands)

    Smit, M.K.; Hill, M.T.; Baets, R.G.F.; Bente, E.A.J.M.; Dorren, H.J.S.; Karouta, F.; Koenraad, P.M.; Koonen, A.M.J.; Leijtens, X.J.M.; Nötzel, R.; Oei, Y.S.; Waardt, de H.; Tol, van der J.J.G.M.; Khoe, G.D.

    2007-01-01

    The integration scale in Photonic Integrated Circuits will be pushed to VLSI-level in the coming decade. This will bring major changes in both application and manufacturing. In this paper developments in Photonic Integration are reviewed and the limits for reduction of device dimensions are

  2. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
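
    The Monte Carlo treatment of parameter uncertainty described above can be sketched generically; the normal sampling and every name below are illustrative placeholders, not the UGTA transport models or their parameter distributions.

```python
import random
import statistics

def monte_carlo_uncertainty(model, param_dists, n=1000, seed=0):
    """Propagate parameter uncertainty through a model by Monte Carlo.

    model       : function taking a dict of parameter values
    param_dists : dict of name -> (mean, stdev) for normal sampling
                  (the distribution family is an illustrative choice)
    Returns the mean and standard deviation of the model output.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        params = {name: rng.gauss(mu, sd) for name, (mu, sd) in param_dists.items()}
        outputs.append(model(params))
    return statistics.mean(outputs), statistics.stdev(outputs)
```

For a linear toy model the sampled output statistics can be checked against the analytic mean and variance, which is a useful sanity test before applying the same machinery to an expensive transport code.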

  3. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electric driven HP (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers....... The analysis was based on using the energy model Balmorel to determine the optimum dispatch of HPs in the system. The potential heat sources in Copenhagen for use in HPs were determined based on data related to temperatures, flows, and hydrography at different locations, while respecting technical constraints...

  4. Role for ribosome-associated complex and stress-seventy subfamily B (RAC-Ssb) in integral membrane protein translation.

    Science.gov (United States)

    Acosta-Sampson, Ligia; Döring, Kristina; Lin, Yuping; Yu, Vivian Y; Bukau, Bernd; Kramer, Günter; Cate, Jamie H D

    2017-12-01

    Targeting of most integral membrane proteins to the endoplasmic reticulum is controlled by the signal recognition particle, which recognizes a hydrophobic signal sequence near the protein N terminus. Proper folding of these proteins is monitored by the unfolded protein response and involves protein degradation pathways to ensure quality control. Here, we identify a new pathway for quality control of major facilitator superfamily transporters that occurs before the first transmembrane helix, the signal sequence recognized by the signal recognition particle, is made by the ribosome. Increased rates of translation elongation of the N-terminal sequence of these integral membrane proteins can divert the nascent protein chains to the ribosome-associated complex and stress-seventy subfamily B chaperones. We also show that quality control of integral membrane proteins by ribosome-associated complex-stress-seventy subfamily B couples translation rate to the unfolded protein response, which has implications for understanding mechanisms underlying human disease and protein production in biotechnology. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Full Text Available Abstract Background Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description We have developed an Integrated Genotyping System (IGS to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion IGS can model complex phenotypes and contain genotypes from whole genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  6. Obsessive-compulsive disorder, which genes? Which functions? Which pathways? An integrated holistic view regarding OCD and its complex genetic etiology.

    Science.gov (United States)

    Bozorgmehr, Ali; Ghadirivasfi, Mohammad; Shahsavand Ananloo, Esmaeil

    2017-09-01

    Obsessive-compulsive disorder (OCD) is characterized by recurrent intrusive and repetitive acts, typically occurring following anxiety. In the last two decades, studies of gene sequences, large-scale and point mutations, and gene-gene, gene-environment and gene-drug interactions have led to the discovery of hundreds of genes associated with OCD. Although each gene in turn is a part of the etiology of this disorder, OCD, like other mental disorders, is complex, and a comprehensive and integrated view is necessary to understand its genetic basis. In this study, through an extensive review of published studies, all genes associated with OCD were identified. To integrate the results, all the interactions between these genes were explored and represented as an interactive genetic network, which was then analyzed. GRIN2A, GRIN2B and GRIA2 were found to be the most central nodes in the network. Functional and pathway enrichment analysis showed that glutamate-related pathways are the main deficient systems in patients with OCD. By studying genes shared between OCD and other diseases, it became clear that OCD, epilepsy and some types of cancer share the largest numbers of genes. The results of this study, in addition to reviewing the available findings in a comprehensive and integrated manner, provide new hypotheses for future studies.
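
    The centrality step behind the network analysis can be sketched with plain adjacency sets; the edges in the test below are illustrative, not curated gene interactions.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Degree centrality of each node in an undirected interaction network.

    edges : iterable of (node_a, node_b) pairs
    Returns a dict mapping each node to degree / (n - 1).
    """
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)  # number of distinct nodes seen in the edge list
    return {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}
```

Ranking nodes by this score is the simplest way to surface hubs such as the glutamate-receptor genes named in the abstract; richer measures (betweenness, eigenvector centrality) follow the same pattern.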

  7. Knowledge-based inspection:modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for knowledge needed to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is a matter of ever greater importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify as the technology grows. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also contributes to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. The meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized, enhancing

  8. Low-cost sensor integrators for measuring the transmissivity of complex canopies to photosynthetically active radiation

    International Nuclear Information System (INIS)

    Newman, S.M.

    1985-01-01

    A system has been designed, tested and evaluated for measuring the transmissivities of complex canopies to photosynthetically active radiation (PAR). The system consists of filtered silicon photocells in cosine-corrected mounts, with outputs integrated by chemical coulometers. The readings accumulated by the coulometers were retrieved electronically using microcomputers. The low-cost sensor integrators, which do not require batteries, performed as expected and proved ideal for the study of agroforestry systems in remote areas. Information on the PAR transmissivity of a temperate agroforestry system in the form of an intercropped orchard is also presented. (author)

  9. Enabling the Integrated Assessment of Large Marine Ecosystems: Informatics to the Forefront of Science-Based Decision Support

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; Beaulieu, S. E.; Maffei, A. R.; West, P.; Hare, J. A.

    2012-12-01

    Integrated assessments of large marine ecosystems require an understanding of the interactions between environmental, ecological, and socio-economic factors that affect the production and utilization of marine natural resources. Assessing the functioning of complex coupled natural-human systems calls for collaboration between natural and social scientists across disciplinary and national boundaries. We are developing a platform to implement and sustain informatics solutions for these applications, providing interoperability among very diverse and heterogeneous data and information sources, as well as multi-disciplinary organizations and people. We have partnered with NOAA NMFS scientists to facilitate the deployment of an integrated ecosystem approach to management in the Northeast U.S. (NES) and California Current Large Marine Ecosystems (LMEs). Our platform will facilitate collaboration and knowledge sharing among NMFS natural and social scientists, promoting community participation in integrating data, models, and knowledge. Here, we present collaborative software tools developed to aid the production of the Ecosystem Status Report (ESR) for the NES LME. The ESR addresses the D-P-S portion of the DPSIR (Driver-Pressure-State-Impact-Response) management framework: reporting data, indicators, and information products for climate drivers, physical and human (fisheries) pressures, and ecosystem state (primary and secondary production and higher trophic levels). We are developing our tools in open-source software, with the main tool based on a web application capable of working with multiple data types from a variety of sources, providing an effective way to share the source code used to generate data products and associated metadata, as well as to track workflow provenance to support the reproducibility of a data product. Our platform retrieves data, conducts standard analyses, reports data quality and other standardized metadata, provides iterative

  10. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste (HLW) complex at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that yield significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes.
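    The event-space/time-space mapping can be illustrated with a minimal sketch: batch recipe steps are recorded as an ordered event sequence, then projected onto the time axis. The recipe steps and durations below are invented for illustration and are not ProdMod data.

```python
# Batch recipe modelled in event-space: an ordered list of (step_name, duration_h).
recipe = [("fill", 4.0), ("heat", 2.5), ("react", 8.0), ("transfer", 1.5)]

def events_to_timespace(recipe, start=0.0):
    """Map an ordered event sequence onto the time axis as (step, t_start, t_end)."""
    schedule, t = [], start
    for step, duration in recipe:
        schedule.append((step, t, t + duration))
        t += duration
    return schedule

for step, t0, t1 in events_to_timespace(recipe):
    print(f"{step:>8}: {t0:5.1f} -> {t1:5.1f} h")
```

    Discontinuities (a step ending, the next beginning) are natural in event-space and only become explicit times in the projection, which is the gist of the coupling the abstract describes.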

  11. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solutes, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle-number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution
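    The two routes to the Kirkwood-Buff integrals compared in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: a trapezoidal running integral stands in for the traditional RDF route, and a simple covariance estimator stands in for the finite-size-scaling particle-fluctuation route.

```python
import math

def kb_running_integral(r, g):
    """Traditional route: G = 4*pi * integral of (g(r) - 1) * r^2 dr
    over the radial distribution function, via the trapezoidal rule."""
    total = 0.0
    for i in range(1, len(r)):
        f0 = (g[i - 1] - 1.0) * r[i - 1] ** 2
        f1 = (g[i] - 1.0) * r[i] ** 2
        total += 0.5 * (f0 + f1) * (r[i] - r[i - 1])
    return 4.0 * math.pi * total

def kb_from_fluctuations(counts_i, counts_j, volume, same_species=False):
    """Fluctuation route: estimate G_ij from particle-number fluctuations in an
    open sub-volume, G_ij = V * cov(N_i, N_j) / (<N_i><N_j>) - delta_ij * V / <N_i>."""
    n = len(counts_i)
    mi = sum(counts_i) / n
    mj = sum(counts_j) / n
    cov = sum(a * b for a, b in zip(counts_i, counts_j)) / n - mi * mj
    g = volume * cov / (mi * mj)
    if same_species:
        g -= volume / mi
    return g

# For an uncorrelated (ideal) system g(r) = 1 everywhere, so G must vanish.
r = [0.01 * i for i in range(1, 1001)]
print(kb_running_integral(r, [1.0] * len(r)))  # → 0.0
```

    In the finite-size-scaling method the sub-volume counts would come from many small open regions embedded in the simulation cell, extrapolated to the thermodynamic limit.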

  12. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solutes, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle-number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution

  13. International Requirements for Large Integration of Renewable Energy Sources

    DEFF Research Database (Denmark)

    Molina-Garcia, Angel; Hansen, Anca Daniela; Muljadi, Ed

    2017-01-01

    Most European countries have concerns about the integration of large amounts of renewable energy sources (RES) into electric power systems, and this is currently a topic of growing interest. In January 2008, the European Commission published the 2020 package, which proposes committing the European Union to a 20% reduction in greenhouse gas emissions, to achieve a target of deriving 20% of the European Union's final energy consumption from renewable sources, and to achieve 20% improvement in energy efficiency, all by the year 2020 [1]. Member states have different individual goals to meet these overall objectives, and they each need to provide a detailed roadmap describing how they will meet these legally binding targets [2]. At this time, RES are an indispensable part of the global energy mix, which has been partially motivated by the continuous increases in hydropower as well as the rapid...

  14. Nuclear trafficking of the HIV-1 pre-integration complex depends on the ADAM10 intracellular domain

    International Nuclear Information System (INIS)

    Endsley, Mark A.; Somasunderam, Anoma D.; Li, Guangyu; Oezguen, Numan; Thiviyanathan, Varatharasa; Murray, James L.; Rubin, Donald H.; Hodge, Thomas W.

    2014-01-01

    Previously, we showed that ADAM10 is necessary for HIV-1 replication in primary human macrophages and immortalized cell lines. Silencing ADAM10 expression interrupted the HIV-1 life cycle prior to nuclear translocation of viral cDNA. Furthermore, our data indicated that HIV-1 replication depends on the expression of ADAM15 and γ-secretase, which proteolytically processes ADAM10. Silencing ADAM15 or γ-secretase expression inhibits HIV-1 replication between reverse transcription and nuclear entry. Here, we show that ADAM10 expression also supports replication in CD4+ T lymphocytes. The intracellular domain (ICD) of ADAM10 associates with the HIV-1 pre-integration complex (PIC) in the cytoplasm and immunoprecipitates and co-localizes with HIV-1 integrase, a key component of PIC. Taken together, our data support a model whereby ADAM15/γ-secretase processing of ADAM10 releases the ICD, which then incorporates into HIV-1 PIC to facilitate nuclear trafficking. Thus, these studies suggest ADAM10 as a novel therapeutic target for inhibiting HIV-1 prior to nuclear entry. - Highlights: • Nuclear trafficking of the HIV-1 pre-integration complex depends on ADAM10. • ADAM10 associates with HIV-1 integrase in the pre-integration complex. • HIV-1 replication depends on the expression of ADAM15 and γ-secretase. • Silencing ADAM15 or γ-secretase expression inhibits nuclear import of viral cDNA. • ADAM10 is important for HIV-1 replication in human macrophages and CD4+ T lymphocytes

  15. Nuclear trafficking of the HIV-1 pre-integration complex depends on the ADAM10 intracellular domain

    Energy Technology Data Exchange (ETDEWEB)

    Endsley, Mark A., E-mail: maendsle@utmb.edu [Department Internal Medicine, Division of Infectious Diseases, University of Texas Medical Branch, 301 University Blvd, Galveston, TX 77555 (United States); Somasunderam, Anoma D., E-mail: asomasun@utmb.edu [Department Internal Medicine, Division of Infectious Diseases, University of Texas Medical Branch, 301 University Blvd, Galveston, TX 77555 (United States); Li, Guangyu, E-mail: LIG001@mail.etsu.edu [Department of Internal Medicine, Quillen College of Medicine, East Tennessee State University, Johnson City, TN 37614 (United States); Oezguen, Numan, E-mail: numan.oezguen@bcm.edu [Department of Pathology and Immunology, Microbiome Center, Texas Children's Hospital, Houston, TX 77030 (United States); Thiviyanathan, Varatharasa, E-mail: Varatharasa.Thiviyanathan@uth.tmc.edu [Institute of Molecular Medicine, University of Texas Health Science Center, Houston, TX 77030 (United States); Murray, James L., E-mail: jmurray100@yahoo.com [GeneTAG Technology, Inc., 3155 Northwoods Place, Norcross, GA 30071 (United States); Rubin, Donald H., E-mail: don.h.rubin@vanderbilt.edu [Research Medicine, VA Tennessee Valley Healthcare System, 1310 24th Ave. South, Nashville, TN 37212 (United States); Departments of Medicine, Pathology, Microbiology and Immunology, Vanderbilt University School of Medicine, 1161 21st Ave South, Nashville, TN 37232 (United States); Hodge, Thomas W., E-mail: twhodge3@gmail.com [Pre-clinical and Antiviral Research, Tamir Biotechnology, Inc., 12625 High Bluff Dr., Suite 113, San Diego, CA 92130 (United States); and others

    2014-04-15

    Previously, we showed that ADAM10 is necessary for HIV-1 replication in primary human macrophages and immortalized cell lines. Silencing ADAM10 expression interrupted the HIV-1 life cycle prior to nuclear translocation of viral cDNA. Furthermore, our data indicated that HIV-1 replication depends on the expression of ADAM15 and γ-secretase, which proteolytically processes ADAM10. Silencing ADAM15 or γ-secretase expression inhibits HIV-1 replication between reverse transcription and nuclear entry. Here, we show that ADAM10 expression also supports replication in CD4+ T lymphocytes. The intracellular domain (ICD) of ADAM10 associates with the HIV-1 pre-integration complex (PIC) in the cytoplasm and immunoprecipitates and co-localizes with HIV-1 integrase, a key component of PIC. Taken together, our data support a model whereby ADAM15/γ-secretase processing of ADAM10 releases the ICD, which then incorporates into HIV-1 PIC to facilitate nuclear trafficking. Thus, these studies suggest ADAM10 as a novel therapeutic target for inhibiting HIV-1 prior to nuclear entry. - Highlights: • Nuclear trafficking of the HIV-1 pre-integration complex depends on ADAM10. • ADAM10 associates with HIV-1 integrase in the pre-integration complex. • HIV-1 replication depends on the expression of ADAM15 and γ-secretase. • Silencing ADAM15 or γ-secretase expression inhibits nuclear import of viral cDNA. • ADAM10 is important for HIV-1 replication in human macrophages and CD4+ T lymphocytes.

  16. Audio-Tactile Integration and the Influence of Musical Training

    OpenAIRE

    Kuchenbuch, Anja; Paraskevopoulos, Evangelos; Herholz, Sibylle C.; Pantev, Christo

    2014-01-01

    Perception of our environment is a multisensory experience; information from different sensory systems, such as the auditory, visual and tactile systems, is constantly integrated. Complex tasks that require high temporal and spatial precision of multisensory integration put strong demands on the underlying networks, but it is largely unknown how task experience shapes multisensory processing. Long-term musical training is an excellent model for brain plasticity because it shapes the human brain at function...

  17. A Hybrid Neuro-Fuzzy Model For Integrating Large Earth-Science Datasets

    Science.gov (United States)

    Porwal, A.; Carranza, J.; Hale, M.

    2004-12-01

    A GIS-based hybrid neuro-fuzzy approach to integration of large earth-science datasets for mineral prospectivity mapping is described. It implements a Takagi-Sugeno type fuzzy inference system in the framework of a four-layered feed-forward adaptive neural network. Each unique combination of the datasets is considered a feature vector whose components are derived by knowledge-based ordinal encoding of the constituent datasets. A subset of feature vectors with a known output target vector (i.e., unique conditions known to be associated with either a mineralized or a barren location) is used for the training of an adaptive neuro-fuzzy inference system. Training involves iterative adjustment of parameters of the adaptive neuro-fuzzy inference system using a hybrid learning procedure for mapping each training vector to its output target vector with minimum sum of squared error. The trained adaptive neuro-fuzzy inference system is used to process all feature vectors. The output for each feature vector is a value that indicates the extent to which a feature vector belongs to the mineralized class or the barren class. These values are used to generate a prospectivity map. The procedure is demonstrated by an application to regional-scale base metal prospectivity mapping in a study area located in the Aravalli metallogenic province (western India). A comparison of the hybrid neuro-fuzzy approach with pure knowledge-driven fuzzy and pure data-driven neural network approaches indicates that the former offers a superior method for integrating large earth-science datasets for predictive spatial mathematical modelling.
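    A single Takagi-Sugeno inference step of the kind embedded in such a neuro-fuzzy system can be sketched as below. The two rules, Gaussian membership parameters and linear consequents are invented for illustration; in the described approach these parameters would be tuned by the hybrid learning procedure against training vectors.

```python
import math

def gauss(x, centre, sigma):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

# Each rule: antecedent centres for two encoded evidence layers, a shared sigma,
# and a linear consequent y = p*x1 + q*x2 + r (all values invented).
RULES = [
    (0.2, 0.3, 0.25, (0.1, 0.2, 0.0)),  # "barren-like" evidence rule
    (0.8, 0.7, 0.25, (0.6, 0.5, 0.1)),  # "mineralized-like" evidence rule
]

def takagi_sugeno(x1, x2):
    """First-order Takagi-Sugeno inference: firing-strength-weighted
    average of the linear rule consequents."""
    weights, outputs = [], []
    for c1, c2, sigma, (p, q, r) in RULES:
        weights.append(gauss(x1, c1, sigma) * gauss(x2, c2, sigma))
        outputs.append(p * x1 + q * x2 + r)
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

# Strong evidence on both layers should score higher than weak evidence.
print(takagi_sugeno(0.9, 0.8), takagi_sugeno(0.1, 0.2))
```

    In ANFIS the same computation is laid out as network layers (membership, firing strength, normalization, consequent), which is what makes the parameters trainable by backpropagation plus least squares.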

  18. Integration and Implementation Sciences: Building a New Specialization

    Directory of Open Access Journals (Sweden)

    Gabriele Bammer

    2005-12-01

    Full Text Available Developing a new specialization - Integration and Implementation Sciences - may be an effective way to draw together and significantly strengthen the theory and methods necessary to tackle complex societal issues and problems. This paper presents an argument for such a specialization, beginning with a brief review of calls for new research approaches that combine disciplines and interact more closely with policy and practice. It posits that the core elements of Integration and Implementation Sciences already exist, but that the field is currently characterized by fragmentation and marginalization. The paper then outlines three sets of characteristics that will delineate Integration and Implementation Sciences. First, the specialization will aim to find better ways to deal with the defining elements of many current societal issues and problems: namely complexity, uncertainty, change, and imperfection. Second, there will be three theoretical and methodological pillars for doing this: (1) systems thinking and complexity science, (2) participatory methods, and (3) knowledge management, exchange, and implementation. Third, operationally, Integration and Implementation Sciences will be grounded in practical application, and generally involve large-scale collaboration. The paper concludes by examining where Integration and Implementation Sciences would sit in universities, and outlines a program for further development of the field. An appendix provides examples of Integration and Implementation Sciences in action.

  19. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

    In this work we introduce the hypersingular integral equations and analyze a realistic model of gravitational waveguides in a Cantorian space-time. A waveguiding effect is considered with respect to the large-scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε(∞) Cantorian space-time, where gravitational lensing and waveguiding effects can explain the apparent Universe. In particular, we consider filamentary and planar large-scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. From this vision, supported by three numerical simulations, the Universe appears like a large set of self-similar adaptive mirrors. Consequently, an infinite Universe is just an optical illusion produced by mirroring effects connected with the large-scale structure of a finite, and not large, Universe

  20. An Improved Conceptually-Based Method for Analysis of Communication Network Structure of Large Complex Organizations.

    Science.gov (United States)

    Richards, William D., Jr.

    Previous methods for determining the communication structure of organizations work well for small or simple organizations, but are either inadequate or unwieldy for use with large complex organizations. An improved method uses a number of different measures and a series of successive approximations to order the communication matrix such that…

  1. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by the IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is designed...

  2. Optimal integrated sizing and planning of hubs with midsize/large CHP units considering reliability of supply

    International Nuclear Information System (INIS)

    Moradi, Saeed; Ghaffarpour, Reza; Ranjbar, Ali Mohammad; Mozaffari, Babak

    2017-01-01

    Highlights: • New hub planning formulation is proposed to exploit assets of midsize/large CHPs. • Linearization approaches are proposed for the two-variable nonlinear CHP fuel function. • Efficient operation of the addressed CHPs & hub devices at contingencies is considered. • Reliability-embedded integrated planning & sizing is formulated as one single MILP. • Noticeable results for costs & reliability-embedded planning due to mid/large CHPs. - Abstract: Use of multi-carrier energy systems and the energy hub concept has recently been a widespread trend worldwide. However, most of the related research specializes in CHP systems with constant electricity/heat ratios and linear operating characteristics. In this paper, integrated energy hub planning and sizing is developed for energy systems with mid-scale and large-scale CHP units, by taking their wide operating range into consideration. The proposed formulation aims to make the best use of the beneficial degrees of freedom associated with these units to decrease total costs and increase reliability. High-accuracy piecewise linearization techniques with approximation errors of about 1% are introduced for the nonlinear two-dimensional CHP input-output function, making it possible to successfully integrate the CHP sizing. Efficient operation of the CHP and the hub at contingencies is captured via a new formulation, which is developed to be incorporated into the planning and sizing problem. Optimal operation, planning, sizing and contingency operation of hub components are integrated and formulated as a single comprehensive MILP problem. Results on a case study with midsize CHPs reveal a 33% reduction in total costs, and it is demonstrated that the proposed formulation eliminates the need for additional components/capacities for increasing reliability of supply.
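    The piecewise-linearization idea can be illustrated in one dimension (the paper treats a two-variable CHP function, which needs a 2-D triangulation or a two-stage scheme). The quadratic fuel curve and breakpoints below are hypothetical; the sketch just checks that a modest number of segments keeps the relative approximation error around the ~1% level quoted in the abstract.

```python
def piecewise_linear(breakpoints, f, x):
    """Evaluate the piecewise-linear interpolant of f at x, as the
    lambda-formulation of an MILP would on the active segment."""
    for a, b in zip(breakpoints, breakpoints[1:]):
        if a <= x <= b:
            lam = (x - a) / (b - a)
            return (1 - lam) * f(a) + lam * f(b)
    raise ValueError("x outside breakpoint range")

# Hypothetical quadratic fuel curve (fuel input vs. electric output; not real CHP data).
def fuel(p):
    return 10.0 + 0.5 * p + 0.002 * p ** 2

bps = [50.0 + 10.0 * i for i in range(16)]          # 50..200 MW, 15 segments
worst = max(
    abs(piecewise_linear(bps, fuel, p) - fuel(p)) / fuel(p)
    for p in [50.0 + 0.5 * k for k in range(301)]
)
print(f"max relative error: {worst:.4%}")
```

    For a quadratic curve the interpolation error per segment is |f''| h²/8, so halving the segment width quarters the error; in the MILP the segment choice is encoded with SOS2 or binary variables rather than evaluated directly.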

  3. Integrated logistic support studies using behavioral Monte Carlo simulation, supported by Generalized Stochastic Petri Nets

    International Nuclear Information System (INIS)

    Garnier, Robert; Chevalier, Marcel

    2000-01-01

    Studying large and complex industrial sites requires more and more accuracy in modeling. In particular, when considering spares, maintenance and repair/replacement processes, determining optimal Integrated Logistic Support policies requires a high-level modeling formalism, in order to make the model as close as possible to the real processes considered. Generally, numerical methods are used for this kind of study. In this paper, we propose an alternative way to determine optimal Integrated Logistic Support policies when dealing with large, complex and distributed multi-policy industrial sites. This method is based on the use of behavioral Monte Carlo simulation, supported by Generalized Stochastic Petri Nets. (author)
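    A behavioral Monte Carlo treatment of a single spares policy can be sketched as follows. This toy model (Poisson failures drawn against a fixed spares stock) is far simpler than a Petri-net-driven site model, and the failure rate, horizon and stock levels are invented for illustration.

```python
import random

def stockout_probability(failure_rate, horizon, spares, runs=20000, seed=7):
    """Monte Carlo estimate of P(failures over the horizon exceed the spares stock).
    Failures arrive as a Poisson process (exponential inter-arrival times)."""
    rng = random.Random(seed)
    short = 0
    for _ in range(runs):
        t, failures = 0.0, 0
        while True:
            t += rng.expovariate(failure_rate)
            if t > horizon:
                break
            failures += 1
        if failures > spares:
            short += 1
    return short / runs

# Stock-out risk drops sharply with each added spare (rate 0.1/yr over 10 yr).
for s in (0, 2, 4):
    print(s, stockout_probability(failure_rate=0.1, horizon=10.0, spares=s))
```

    In a Petri-net-supported model, the same sampling loop would instead fire transitions of the net (failure, repair, resupply), letting one policy simulator cover many interacting sites.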

  4. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem; Kammoun, Abla; Sanguinetti, Luca; Debbah, Merouane; Alouini, Mohamed-Slim

    2016-01-01

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). Particularly, we aim at reducing the complexity

  5. The rubber hand illusion in complex regional pain syndrome: preserved ability to integrate a rubber hand indicates intact multisensory integration.

    Science.gov (United States)

    Reinersmann, Annika; Landwehrt, Julia; Krumova, Elena K; Peterburs, Jutta; Ocklenburg, Sebastian; Güntürkün, Onur; Maier, Christoph

    2013-09-01

    In patients with complex regional pain syndrome (CRPS) type 1, processing of static tactile stimuli is impaired, whereas more complex sensory integration functions appear preserved. This study investigated higher order multisensory integration of body-relevant stimuli using the rubber hand illusion in CRPS patients. Subjective self-reports and skin conductance responses to watching the rubber hand being harmed were compared among CRPS patients (N=24), patients with upper limb pain of other origin (N=21, clinical control group), and healthy subjects (N=24). Additionally, the influence of body representation (body plasticity [Trinity Assessment of Body Plasticity], neglect-like severity symptoms), and clinical signs of illusion strength were investigated. For statistical analysis, 1-way analysis of variance, t test, Pearson correlation, with α=0.05 were used. CRPS patients did not differ from healthy subjects and the control group with regard to their illusion strength as assessed by subjective reports or skin conductance response values. Stronger left-sided rubber hand illusions were reported by healthy subjects and left-side-affected CRPS patients. Moreover, for this subgroup, illness duration and illusion strength were negatively correlated. Overall, severity of neglect-like symptoms and clinical signs were not related to illusion strength. However, patients with CRPS of the right hand reported significantly stronger neglect-like symptoms and significantly lower illusion strength of the affected hand than patients with CRPS of the left hand. The weaker illusion of CRPS patients with strong neglect-like symptoms on the affected hand supports the role of top-down processes modulating body ownership. Moreover, the intact ability to perceive illusory ownership confirms the notion that, despite impaired processing of proprioceptive or tactile input, higher order multisensory integration is unaffected in CRPS. Copyright © 2013 International Association for the Study

  6. Inferior Olive HCN1 Channels Coordinate Synaptic Integration and Complex Spike Timing

    Directory of Open Access Journals (Sweden)

    Derek L.F. Garden

    2018-02-01

    Full Text Available Cerebellar climbing-fiber-mediated complex spikes originate from neurons in the inferior olive (IO, are critical for motor coordination, and are central to theories of cerebellar learning. Hyperpolarization-activated cyclic-nucleotide-gated (HCN channels expressed by IO neurons have been considered as pacemaker currents important for oscillatory and resonant dynamics. Here, we demonstrate that in vitro, network actions of HCN1 channels enable bidirectional glutamatergic synaptic responses, while local actions of HCN1 channels determine the timing and waveform of synaptically driven action potentials. These roles are distinct from, and may complement, proposed pacemaker functions of HCN channels. We find that in behaving animals HCN1 channels reduce variability in the timing of cerebellar complex spikes, which serve as a readout of IO spiking. Our results suggest that spatially distributed actions of HCN1 channels enable the IO to implement network-wide rules for synaptic integration that modulate the timing of cerebellar climbing fiber signals.

  7. Systems engineering for very large systems

    Science.gov (United States)

    Lewkowicz, Paul E.

    Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  8. The age calibration of integrated ultraviolet colors and young stellar clusters in the Large Magellanic Cloud

    International Nuclear Information System (INIS)

    Barbero, J.; Brocato, E.; Cassatella, A.; Castellani, V.; Geyer, E.H.

    1990-01-01

    Integrated colors in selected far-UV bands are presented for a large sample of Large Magellanic Cloud (LMC) clusters. Theoretical calculations of these integrated colors are derived and discussed. The location in the two-color diagram C(18-28), C(15-31) is expected to be a sensitive but smooth function of cluster age for ages in the range 5 to 800 million yr. Theoretical results appear in very good agreement with the observed colors of LMC clusters. From this comparison, the gap in the observed colors is suggested to be caused by the lack of LMC clusters in the range of ages between 200 million and one billion yr. The two-color location of old globulars is discussed, also in connection with available data for the M31 clusters. 36 refs

  9. Par@Graph - a parallel toolbox for the construction and analysis of large complex climate networks

    NARCIS (Netherlands)

    Tantet, A.J.J.

    2015-01-01

    In this paper, we present Par@Graph, a software toolbox to reconstruct and analyze complex climate networks having a large number of nodes (up to at least 10^6) and edges (up to at least 10^12). The key innovation is an efficient set of parallel software tools designed to leverage the inherent hybrid

  10. Regulating with imagery and the complexity of basic emotions. Comment on "The quartet theory of human emotions: An integrative and neurofunctional model" by S. Koelsch et al.

    Science.gov (United States)

    Meyer, Marcel; Kuchinke, Lars

    2015-06-01

    Literature, music and the arts have long attested to the complexity of human emotions. Hitherto, psychological and biological theories of emotions have largely neglected this rich heritage. In their review Koelsch and colleagues [1] have embarked upon the pioneering endeavour of integrating the diverse perspectives in emotion research. Noting that the focus of prior neurobiological theories relies mainly on animal studies, the authors sought to complement this body of research with a model of complex ("moral") emotions in humans (henceforth: complex emotions). According to this novel framework, there are four main interacting affective centres in the brain. Each centre is associated with a dominant affective function, such as ascending activation (brainstem), pain/pleasure (diencephalon), attachment-related affects (hippocampus) or moral emotions and unconscious cognitive appraisal (orbitofrontal cortex). Furthermore, language is ascribed a key role in (a) the communication of subjective feeling (reconfiguration) and (b) in the conscious regulation of emotions (by means of logic and rational thought).

  11. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate change and increasing demand driven by population growth and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, how irrigation is managed in response to subseasonal variability in weather and crop growth differs among regions and crops. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model is input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed and accurately reproduced the trends and interannual variations of crop yields. 
To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of

  12. Improved Peak Detection and Deconvolution of Native Electrospray Mass Spectra from Large Protein Complexes.

    Science.gov (United States)

    Lu, Jonathan; Trnka, Michael J; Roh, Soung-Hun; Robinson, Philip J J; Shiau, Carrie; Fujimori, Danica Galonic; Chiu, Wah; Burlingame, Alma L; Guan, Shenheng

    2015-12-01

    Native electrospray-ionization mass spectrometry (native MS) measures biomolecules under conditions that preserve most aspects of protein tertiary and quaternary structure, enabling direct characterization of large intact protein assemblies. However, native spectra derived from these assemblies are often partially obscured by low signal-to-noise as well as broad peak shapes because of residual solvation and adduction after the electrospray process. The wide peak widths, together with the fact that sequential charge state series from highly charged ions are closely spaced, mean that native spectra containing multiple species often suffer from high degrees of peak overlap or else contain highly interleaved charge envelopes. This situation presents a challenge for peak detection, correct charge state and charge envelope assignment, and ultimately extraction of the relevant underlying mass values of the noncovalent assemblages being investigated. In this report, we describe a comprehensive algorithm, called PeakSeeker, developed to address peak detection, peak overlap, and charge state assignment in native mass spectra. Overlapped peaks are detected by examination of the second derivative of the raw mass spectrum. Charge state distributions of the molecular species are determined by fitting linear combinations of charge envelopes to the overall experimental mass spectrum. This software is capable of deconvoluting heterogeneous, complex, and noisy native mass spectra of large protein assemblies, as demonstrated by analysis of (1) synthetic mononucleosomes containing severely overlapping peaks, (2) an RNA polymerase II/α-amanitin complex with many closely interleaved ion signals, and (3) the human TRiC complex containing high levels of background noise.
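
    The second-derivative test at the heart of this peak-detection step can be illustrated in a few lines. The following is a minimal sketch of the idea, not the PeakSeeker implementation: the spectrum is a synthetic sum of two overlapping Gaussians, and the sampling grid and threshold are invented for illustration.

```python
import math

def second_derivative(y):
    """Central-difference second derivative of a uniformly sampled signal."""
    return [y[i - 1] - 2 * y[i] + y[i + 1] for i in range(1, len(y) - 1)]

def find_peaks_2nd_deriv(y, threshold=-0.001):
    """Return indices where the second derivative has a local minimum below
    threshold. A shoulder on an overlapped peak still produces such a minimum
    even when the raw signal shows no separate maximum."""
    d2 = second_derivative(y)
    peaks = []
    for i in range(1, len(d2) - 1):
        if d2[i] < threshold and d2[i] <= d2[i - 1] and d2[i] <= d2[i + 1]:
            peaks.append(i + 1)  # shift back to an index into y
    return peaks

# Two Gaussians close enough that the raw signal shows only one maximum.
xs = [i * 0.1 for i in range(200)]
y = [math.exp(-((x - 9.0) ** 2) / 2.0) + 0.7 * math.exp(-((x - 11.0) ** 2) / 2.0)
     for x in xs]
print(find_peaks_2nd_deriv(y))  # two indices, one near each hidden component
```

    The raw sum has a single apparent maximum with a shoulder, yet the second derivative resolves both components — which is why overlapped native-MS peaks become detectable in this representation.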

  13. Engineering a large application software project: the controls of the CERN PS accelerator complex

    International Nuclear Information System (INIS)

    Benincasa, G.P.; Daneels, A.; Heymans, P.; Serre, Ch.

    1985-01-01

    The CERN PS accelerator complex has been progressively converted to full computer controls without interrupting its full-time operation (more than 6000 hours per year with on average not more than 1% of the total down-time due to controls). The application software amounts to 120 man-years and 450'000 instructions; it compares in size with other large software projects, including some outside the accelerator world, e.g. Skylab's ground support software. This paper outlines the application software structure which takes into account technical requirements and constraints (resulting from the complexity of the process and its operation) as well as economic and managerial ones. It presents the engineering and management techniques used to promote implementation, testing and commissioning within budget, manpower and time constraints, and concludes with the experience gained.

  14. A density-based clustering model for community detection in complex networks

    Science.gov (United States)

    Zhao, Xiang; Li, Yantao; Qu, Zehui

    2018-04-01

    Network clustering (or graph partitioning) is an important technique for uncovering the underlying community structures in complex networks, and it has been widely applied in various fields including astronomy, bioinformatics, sociology, and bibliometrics. In this paper, we propose a density-based clustering model for community detection in complex networks (DCCN). The key idea is to find group centers with a higher density than their neighbors and a relatively large integrated distance from nodes of higher density. The experimental results indicate that our approach is efficient and effective for community detection in complex networks.
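
    The center-selection rule can be sketched directly on a toy graph. This is a hedged illustration of the idea only (degree as local density, BFS hops as distance), not the authors' DCCN implementation; the graph and scoring choices are invented.

```python
from collections import deque

def bfs_distances(adj, source):
    """Hop distances from source to every reachable node."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def find_centers(adj, k=2):
    """Score each node by density times distance to the nearest denser node;
    the k best-scoring nodes are taken as group centers."""
    density = {u: len(adj[u]) for u in adj}  # degree as a simple local density
    score = {}
    for u in adj:
        dist = bfs_distances(adj, u)
        higher = [dist[v] for v in adj if v in dist and density[v] > density[u]]
        # Global density peaks have no denser node; give them the largest distance.
        delta = min(higher) if higher else max(dist.values())
        score[u] = density[u] * delta
    return sorted(adj, key=lambda u: -score[u])[:k]

# Toy network: two 4-cliques (0-3 and 5-8) joined through a low-degree node 4.
adj = {i: set() for i in range(9)}
for group in ([0, 1, 2, 3], [5, 6, 7, 8]):
    for a in group:
        for b in group:
            if a != b:
                adj[a].add(b)
adj[3].add(4); adj[4].add(3); adj[4].add(5); adj[5].add(4)
print(sorted(find_centers(adj, k=2)))  # the densest node of each clique: [3, 5]
```

    The two selected centers fall one per clique: each is denser than its neighbors, and each is far (in hops) from any node of equal or higher density in the other community.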

  15. Development of a Deterministic Optimization Model for Design of an Integrated Utility and Hydrogen Supply Network

    International Nuclear Information System (INIS)

    Hwangbo, Soonho; Lee, In-Beum; Han, Jeehoon

    2014-01-01

    Many networks are constructed within a large-scale industrial complex. Each network meets its demands through the production or transportation of the materials needed by the companies in that network. A network either produces materials directly to satisfy a company's demands or purchases them from outside, depending on demand uncertainty, financial factors, and so on. Utility networks and hydrogen networks are typical major networks in a large-scale industrial complex. Many studies have focused on minimizing the total cost or optimizing the structure of a single network, but little research has attempted an integrated model connecting the utility network and the hydrogen network. In this study, a deterministic mixed-integer linear programming model is developed to integrate the two. The steam methane reforming (SMR) process is the link between them: hydrogen produced by SMR, whose raw material is steam vented from the utility network, enters the hydrogen network to fulfill its demands. The proposed model suggests an optimized configuration and blueprint for the integrated network and calculates the optimal total cost. The capability of the proposed model is tested by applying it to the Yeosu industrial complex in Korea, which hosts one of the biggest petrochemical complexes and for which data are available from various published studies. In the case study, the integrated network model yields better solutions than previous results obtained by studying the utility network and the hydrogen network individually.
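
    The make-or-buy tradeoff that such an integrated model optimizes can be illustrated with a toy example, solved here by brute force over the binary decisions. All numbers (costs, vented-steam availability, demands) are invented for illustration; the paper's actual model is a full MILP solved with a dedicated solver.

```python
from itertools import product

steam_vent = [30, 10, 25]   # steam vented by the utility network, per period
h2_demand = [20, 20, 20]    # hydrogen network demand, per period
smr_yield = 0.8             # hydrogen produced per unit of steam fed to SMR
smr_cost = 1.0              # cost per unit of H2 via steam methane reforming
buy_cost = 2.5              # cost per unit of H2 purchased externally

best = None
for choice in product([0, 1], repeat=len(h2_demand)):  # 1 = run SMR that period
    cost = 0.0
    for t, use_smr in enumerate(choice):
        made = min(h2_demand[t], smr_yield * steam_vent[t]) if use_smr else 0.0
        cost += smr_cost * made + buy_cost * (h2_demand[t] - made)
    if best is None or cost < best[0]:
        best = (cost, choice)
print(best)  # (78.0, (1, 1, 1)): running SMR every period is cheapest here
```

    In this toy, each period independently picks the cheaper source, which exhaustive search recovers; a real MILP adds coupling constraints (capacities, storage, network topology) that make the periods interdependent and the binary search space far larger.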

  16. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widespread in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and system construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system. Based on this method, a virtual robot noumenon is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment is implemented to verify the proposed algorithms. First, the hand-eye transformation matrix is solved; then a car body rear is measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully.
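
    The global data fusion step chains homogeneous transforms: a point measured in the scanner frame is mapped through the hand-eye transform into the robot end frame, then through the tracked robot pose into the world frame. A minimal sketch follows; the matrices below are illustrative placeholders, not calibration results from the paper.

```python
import math

def matmul(a, b):
    """Product of two 4x4 homogeneous transform matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = [p[0], p[1], p[2], 1.0]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

def transform(rz_deg, tx, ty, tz):
    """Rotation about z followed by translation, as a 4x4 homogeneous matrix."""
    c, s = math.cos(math.radians(rz_deg)), math.sin(math.radians(rz_deg))
    return [[c, -s, 0, tx], [s, c, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

T_world_end = transform(90, 100, 0, 50)  # robot end pose seen from the tracker
T_end_scanner = transform(0, 0, 0, 10)   # hand-eye result (assumed, offset only)
T_world_scanner = matmul(T_world_end, T_end_scanner)

print(apply(T_world_scanner, (1.0, 2.0, 3.0)))  # scanner point in world frame
```

    Once `T_world_scanner` is known for each robot pose, every local scan is mapped into the common world frame, which is exactly what allows the 16 partial measurements of the car body rear to be fused into one model.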

  17. Gcn4-Mediator Specificity Is Mediated by a Large and Dynamic Fuzzy Protein-Protein Complex.

    Science.gov (United States)

    Tuttle, Lisa M; Pacheco, Derek; Warfield, Linda; Luo, Jie; Ranish, Jeff; Hahn, Steven; Klevit, Rachel E

    2018-03-20

    Transcription activation domains (ADs) are inherently disordered proteins that often target multiple coactivator complexes, but the specificity of these interactions is not understood. Efficient transcription activation by yeast Gcn4 requires its tandem ADs and four activator-binding domains (ABDs) on its target, the Mediator subunit Med15. Multiple ABDs are a common feature of coactivator complexes. We find that the large Gcn4-Med15 complex is heterogeneous and contains nearly all possible AD-ABD interactions. Gcn4-Med15 forms via a dynamic fuzzy protein-protein interface, where ADs bind the ABDs in multiple orientations via hydrophobic regions that gain helicity. This combinatorial mechanism allows individual low-affinity and specificity interactions to generate a biologically functional, specific, and higher affinity complex despite lacking a defined protein-protein interface. This binding strategy is likely representative of many activators that target multiple coactivators, as it allows great flexibility in combinations of activators that can cooperate to regulate genes with variable coactivator requirements. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Concurrent use of data base and graphics computer workstations to provide graphic access to large, complex data bases for robotics control of nuclear surveillance and maintenance

    International Nuclear Information System (INIS)

    Dalton, G.R.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida is part of a multiuniversity research effort, sponsored by the US Department of Energy, that is under way to develop and deploy an advanced semi-autonomous robotic system for use in nuclear power stations. This paper reports on the development of the computer tools necessary to gain convenient graphic access to the intelligence implicit in a large complex data base such as that in a nuclear reactor plant. This program is integrated as a man/machine interface within the larger context of the total computerized robotic planning and control system. The portion of the project described here addresses the connection between the three-dimensional displays on an interactive graphic workstation and a data-base computer running a large data-base server program. Programming the two computers to work together to accept graphic queries and return answers on the graphic workstation is a key part of the interactive capability developed.

  19. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses on a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioner decompose a large design into sub-components and feed them to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. The various simulation and verification tools comprising IDV are evaluated, and an example is used to illustrate the overall validation process, which successfully validates the example to a high coverage rate within a short time. The experimental results show that our approach is a very promising design validation method.
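
    The dispatch idea behind this flow — partition the design, then route each sub-component to formal verification or simulation based on a complexity estimate — can be sketched as follows. The threshold and component figures are invented for illustration and are not from the paper.

```python
def route(components, formal_limit=1000):
    """Route each sub-component by a simple complexity estimate: small state
    spaces go to (exhaustive) formal verification, large ones fall back to
    simulation. Real flows would also weigh coverage feedback."""
    plan = {}
    for name, states in components.items():
        plan[name] = "formal" if states <= formal_limit else "simulation"
    return plan

# Hypothetical sub-components with rough state-space size estimates.
design = {"alu": 512, "fifo_ctrl": 64, "dram_ctrl": 50_000, "cpu_core": 2_000_000}
print(route(design))
```

    A coverage-analysis step would then measure how much of each sub-component's behavior was exercised and, as the abstract notes, trigger further partitioning where coverage stays low.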

  20. The search for Pleiades in trait constellations: functional integration and phenotypic selection in the complex flowers of Morrenia brachystephana (Apocynaceae).

    Science.gov (United States)

    Baranzelli, M C; Sérsic, A N; Cocucci, A A

    2014-04-01

    Pollinator-mediated natural selection on single traits, such as corolla tube or spur length, has been well documented. However, flower phenotypes are usually complex, and selection is expected to act on several traits that functionally interact rather than on a single isolated trait. Despite the fact that selection on complex phenotypes is expectedly widespread, multivariate selection modelling on such phenotypes still remains under-explored in plants. Species of the subfamily Asclepiadoideae (Apocynaceae) provide an opportunity to study such complex flower contrivances integrated by fine-scaled organs from disparate developmental origin. We studied the correlation structure among linear floral traits (i) by testing a priori morphological, functional or developmental hypotheses among traits and (ii) by exploring the organization of flower covariation, considering alternative expectations of modular organization or whole flower integration through conditional dependence analysis (CDA) and integration matrices. The phenotypic selection approach was applied to determine whether floral traits involved in the functioning of the pollination mechanism were affected by natural selection. Floral integration was low, suggesting that flowers are organized in more than just one correlation pleiad; our hypothetical functional correlation matrix was significantly correlated with the empirical matrix, and the CDA revealed three putative modules. Analyses of phenotypic selection showed significant linear and correlational gradients, lending support to expectations of functional interactions between floral traits. Significant correlational selection gradients found involved traits of different floral whorls, providing evidence for the existence of functional integration across developmental domains. © 2014 The Authors. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.

  1. The clinically-integrated randomized trial: proposed novel method for conducting large trials at low cost

    Directory of Open Access Journals (Sweden)

    Scardino Peter T

    2009-03-01

    Full Text Available Abstract Introduction Randomized controlled trials provide the best method of determining which of two comparable treatments is preferable. Unfortunately, contemporary randomized trials have become increasingly expensive, complex and burdened by regulation, so much so that many trials are of doubtful feasibility. Discussion Here we present a proposal for a novel, streamlined approach to randomized trials: the "clinically-integrated randomized trial". The key aspect of our methodology is that the clinical experience of the patient and doctor is virtually indistinguishable whether or not the patient is randomized, primarily because outcome data are obtained from routine clinical data, or from short, web-based questionnaires. Integration of a randomized trial into routine clinical practice also implies that there should be an attempt to randomize every patient, a corollary of which is that eligibility criteria are minimized. The similar clinical experience of patients on- and off-study also entails that the marginal cost of putting an additional patient on trial is negligible. We propose examples of how the clinically-integrated randomized trial might be applied in four distinct areas of medicine: comparisons of surgical techniques, "me too" drugs, rare diseases and lifestyle interventions. Barriers to implementing clinically-integrated randomized trials are discussed. Conclusion The proposed clinically-integrated randomized trial may allow us to enlarge dramatically the number of clinical questions that can be addressed by randomization.

  2. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  3. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  4. Complex Security System for Premises Under Conditions of Large Volume of Passenger Traffic

    Directory of Open Access Journals (Sweden)

    Yakubov Vladimir

    2016-01-01

    Full Text Available Subsystems of a complex security system designed for premises with a large volume of passenger traffic are considered. These subsystems provide video and thermal imaging control, radio wave tomography, and gas analysis. Simultaneous application of all the examined variants will substantially increase the probability of timely prevention of dangerous situations while keeping the probability of false alarms as low as possible. Ultimately, this will protect the population and facilitate the work of the intelligence services.

  5. Complex and extensive post-transcriptional regulation revealed by integrative proteomic and transcriptomic analysis of metabolite stress response in Clostridium acetobutylicum.

    Science.gov (United States)

    Venkataramanan, Keerthi P; Min, Lie; Hou, Shuyu; Jones, Shawn W; Ralston, Matthew T; Lee, Kelvin H; Papoutsakis, E Terry

    2015-01-01

    Clostridium acetobutylicum is a model organism for both clostridial biology and solvent production. The organism is exposed to its own toxic metabolites butyrate and butanol, which trigger an adaptive stress response. Integrative analysis of proteomic and RNAseq data may provide novel insights into post-transcriptional regulation. The identified iTRAQ-based quantitative stress proteome is made up of 616 proteins with a 15 % genome coverage. The differentially expressed proteome correlated poorly with the corresponding differential RNAseq transcriptome. Up to 31 % of the differentially expressed proteins under stress displayed patterns opposite to those of the transcriptome, thus suggesting significant post-transcriptional regulation. The differential proteome of the translation machinery suggests that cells employ a different subset of ribosomal proteins under stress. Several highly upregulated proteins but with low mRNA levels possessed mRNAs with long 5'UTRs and strong RBS scores, thus supporting the argument that regulatory elements on the long 5'UTRs control their translation. For example, the oxidative stress response rubrerythrin was upregulated only at the protein level up to 40-fold without significant mRNA changes. We also identified many leaderless transcripts, several displaying different transcriptional start sites, thus suggesting mRNA-trimming mechanisms under stress. Downregulation of Rho and partner proteins pointed to changes in transcriptional elongation and termination under stress. The integrative proteomic-transcriptomic analysis demonstrated complex expression patterns of a large fraction of the proteome. Such patterns could not have been detected with one or the other omic analyses. Our analysis proposes the involvement of specific molecular mechanisms of post-transcriptional regulation to explain the observed complex stress response.

  6. Integrated Modeling of Complex Optomechanical Systems

    Science.gov (United States)

    Andersen, Torben; Enmark, Anita

    2011-09-01

    Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons; first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top-ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.

  7. Integrated Visualisation and Description of Complex Systems

    National Research Council Canada - National Science Library

    Goodburn, D

    1999-01-01

    ... on system topographies and feature overlays. System information from the domain's information space is filtered and integrated into a Composite Systems Model that provides a basis for consistency and integration between all system views...

  8. Lithography requirements in complex VLSI device fabrication

    International Nuclear Information System (INIS)

    Wilson, A.D.

    1985-01-01

    Fabrication of complex very large scale integration (VLSI) circuits requires continual advances in lithography to satisfy: decreasing minimum linewidths, larger chip sizes, tighter linewidth and overlay control, increasing topography to linewidth ratios, higher yield demands, increased throughput, harsher device processing, lower lithography cost, and a larger part number set with quick turn-around time. Where optical, electron beam, x-ray, and ion beam lithography can be applied to judiciously satisfy the complex VLSI circuit fabrication requirements is discussed and those areas that are in need of major further advances are addressed. Emphasis will be placed on advanced electron beam and storage ring x-ray lithography

  9. Dependency of γ-secretase complex activity on the structural integrity of the bilayer

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Hua, E-mail: hzhou2@lbl.gov [Life Sciences Division, Lawrence Berkeley National Laboratory, University of California, Berkeley, CA 94720 (United States); Zhou, Shuxia; Walian, Peter J.; Jap, Bing K. [Life Sciences Division, Lawrence Berkeley National Laboratory, University of California, Berkeley, CA 94720 (United States)

    2010-11-12

    Research highlights: • Partial solubilization of membranes with CHAPSO can increase γ-secretase activity. • Completely solubilized γ-secretase is inactive. • Purified γ-secretase regains activity after reconstitution into lipid bilayers. • A broad range of detergents can be used to successfully reconstitute γ-secretase. -- Abstract: γ-secretase is a membrane protein complex associated with the production of Aβ peptides that are pathogenic in Alzheimer's disease. We have characterized the activity of γ-secretase complexes under a variety of detergent solubilization and reconstitution conditions, and the structural state of proteoliposomes by electron microscopy. We found that γ-secretase activity is highly dependent on the physical state or integrity of the membrane bilayer - partial solubilization may increase activity while complete solubilization will abolish it. The activity of well-solubilized γ-secretase can be restored to near native levels when properly reconstituted into a lipid bilayer environment.

  10. Visualizing the complex functions and mechanisms of the anaphase promoting complex/cyclosome (APC/C)

    Science.gov (United States)

    Alfieri, Claudio; Zhang, Suyang

    2017-01-01

    The anaphase promoting complex or cyclosome (APC/C) is a large multi-subunit E3 ubiquitin ligase that orchestrates cell cycle progression by mediating the degradation of important cell cycle regulators. During the two decades since its discovery, much has been learnt concerning its role in recognizing and ubiquitinating specific proteins in a cell-cycle-dependent manner, the mechanisms governing substrate specificity, the catalytic process of assembling polyubiquitin chains on its target proteins, and its regulation by phosphorylation and the spindle assembly checkpoint. The past few years have witnessed significant progress in understanding the quantitative mechanisms underlying these varied APC/C functions. This review integrates the overall functions and properties of the APC/C with mechanistic insights gained from recent cryo-electron microscopy (cryo-EM) studies of reconstituted human APC/C complexes. PMID:29167309

  11. Siemens: Smart Technologies for Large Control Systems

    CERN Multimedia

    CERN. Geneva; BAKANY, Elisabeth

    2015-01-01

    The CERN Large Hadron Collider (LHC) is known to be one of the most complex scientific machines ever built by mankind. Its correct functioning relies on the integration of a multitude of interdependent industrial control systems, which provide different and essential services to run and protect the accelerators and experiments. These systems have to deal with several millions of data points (e.g. sensors, actuators, configuration parameters, etc…) which need to be acquired, processed, archived and analysed. For more than 20 years, CERN and Siemens have developed a strong collaboration to deal with the challenges of these large systems. The presentation will cover the current work on the SCADA (Supervisory Control and Data Acquisition) systems and Data Analytics Frameworks.

  12. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time-consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation to their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models trained on a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M
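
    The workflow described above (pre-calculate full-physics runs, fit a fast data-driven surrogate, couple it to transport) can be sketched in a few lines. The quadratic "chemistry", the polynomial surrogate, and the one-dimensional advection loop below are invented stand-ins for illustration, not the authors' R-based models:

```python
import numpy as np

# Sketch of the surrogate workflow: (1) pre-calculate "full-physics" runs,
# (2) fit a fast data-driven surrogate, (3) couple it to advective transport.
# The quadratic "chemistry" and cubic-polynomial surrogate are invented.

def full_physics_chemistry(c):
    # placeholder for an expensive geochemical solver
    return c - 0.1 * c**2            # simple nonlinear sink

# (1) ensemble of pre-calculated runs over the expected concentration range
train_c = np.linspace(0.0, 1.0, 50)
train_out = full_physics_chemistry(train_c)

# (2) surrogate: cubic polynomial fitted to the ensemble
coeffs = np.polyfit(train_c, train_out, deg=3)
surrogate = lambda c: np.polyval(coeffs, c)

# (3) operator splitting: advect one cell per step, then react
def simulate(chem, n_cells=20, n_steps=40):
    c = np.zeros(n_cells)
    for _ in range(n_steps):
        c = np.roll(c, 1)            # pure advection at CFL = 1
        c[0] = 1.0                   # inflow boundary condition
        c = chem(c)                  # reaction step
    return c

c_surr = simulate(surrogate)
c_full = simulate(full_physics_chemistry)
max_err = np.max(np.abs(c_surr - c_full))  # surrogate vs full physics
```

    In this toy setup the surrogate reproduces the full-physics profile almost exactly; in practice the precision loss is traded against orders-of-magnitude speed-ups of the reaction step.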

  13. Complex researches on substantiation of construction and seismic stability of large dams in seismic region

    International Nuclear Information System (INIS)

    Negmatullaev, S.Kh.; Yasunov, P.A.

    2001-01-01

    This article is devoted to complex research substantiating the construction and seismic stability of large dams (the Nurec hydroelectric power station) in a seismic region. Geological, seismological, model, and engineering investigations are discussed in this work. Rich experience was accumulated during the construction of the Nurec hydroelectric power station; it can be applied in analogous seismically active regions when constructing similar hydroelectric power stations.

  14. The SMC5/6 complex is involved in crucial processes during human spermatogenesis

    NARCIS (Netherlands)

    Verver, Dideke E.; Langedijk, Nathalia S. M.; Jordan, Philip W.; Repping, Sjoerd; Hamer, Geert

    2014-01-01

    Genome integrity is crucial for safe reproduction. Therefore, chromatin structure and dynamics should be tightly regulated during germ cell development. Chromatin structure and function are in large part determined by the structural maintenance of chromosomes (SMC) protein complexes, of which SMC5/6

  15. Impacts of large-scale offshore wind farm integration on power systems through VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Hongzhi; Chen, Zhe

    2013-01-01

    The potential of offshore wind energy has been commonly recognized and explored globally. Many countries have implemented and planned offshore wind farms to meet their increasing electricity demands and public environmental appeals, especially in Europe. With relatively less space limitation......, an offshore wind farm could have a capacity rating to hundreds of MWs or even GWs that is large enough to compete with conventional power plants. Thus the impacts of a large offshore wind farm on power system operation and security should be thoroughly studied and understood. This paper investigates...... the impacts of integrating a large-scale offshore wind farm into the transmission system of a power grid through VSC-HVDC connection. The concerns are focused on steady-state voltage stability, dynamic voltage stability and transient angle stability. Simulation results based on an exemplary power system...

  16. Large-scale grid-enabled lattice-Boltzmann simulations of complex fluid flow in porous media and under shear

    NARCIS (Netherlands)

    Harting, J.D.R.; Venturoli, M.; Coveney, P.V.

    2004-01-01

    Well-designed lattice Boltzmann codes exploit the essentially embarrassingly parallel features of the algorithm and so can be run with considerable efficiency on modern supercomputers. Such scalable codes permit us to simulate the behaviour of increasingly large quantities of complex condensed

  17. Harnessing Product Complexity: An Integrative Approach

    OpenAIRE

    Orfi, Nihal Mohamed Sherif

    2011-01-01

    In today's market, companies are faced with pressure to increase variety in product offerings. While increasing variety can help increase market share and sales growth, the costs of doing so can be significant. Ultimately, variety causes complexity in products and processes to soar, which negatively impacts product development, quality, production scheduling, efficiency and more. Product variety is just one common cause of product complexity, a topic that several researchers have tackled with...

  18. Integration of large amounts of wind power. Markets for trading imbalances

    Energy Technology Data Exchange (ETDEWEB)

    Neimane, Viktoria; Axelsson, Urban [Vattenfall Research and Development AB, Stockholm (Sweden); Gustafsson, Johan; Gustafsson, Kristian [Vattenfall Nordic Generation Management, Stockholm (Sweden); Murray, Robin [Vattenfall Vindkraft AB, Stockholm (Sweden)

    2008-07-01

    The well-known concerns about wind power are related to its intermittent nature and the difficulty of making exact forecasts. The expected increase in balancing and reserve requirements due to wind power has been investigated in several studies. This paper takes the next step in studying the integration of large amounts of wind power in Sweden. The perspective of several wind power producers and their corresponding balance providers is taken and their imbalance costs are modeled. Larger producers with wind power spread over larger geographical areas will have lower relative costs than producers with their units concentrated within a limited geographical area. The possibilities for wind power producers to reduce imbalance costs by acting on the after-sales market are exposed and compared. (orig.)
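
    The geographic-smoothing effect described above (larger, spread-out producers see lower relative imbalances) follows from partial cancellation of independent forecast errors. A toy calculation, with invented capacities and error levels rather than the paper's Swedish data:

```python
import numpy as np

# Illustrative only: site count, capacities and error level are invented.
rng = np.random.default_rng(42)
n_sites, n_hours = 25, 10_000
site_capacity = 10.0                      # MW per site

# independent hourly forecast errors per site, std = 10 % of site capacity
errors = rng.normal(0.0, 0.1 * site_capacity, size=(n_hours, n_sites))

# producer A: all capacity concentrated at one site (errors fully correlated)
concentrated = n_sites * errors[:, 0]
# producer B: same total capacity spread over n_sites sites (errors partly cancel)
spread = errors.sum(axis=1)

total_capacity = n_sites * site_capacity
rel_conc = concentrated.std() / total_capacity    # ~ 0.10
rel_spread = spread.std() / total_capacity        # ~ 0.10 / sqrt(25) = 0.02
```

    With fully independent errors the relative imbalance of the spread portfolio falls by 1/sqrt(n_sites); real wind forecast errors are spatially correlated, so the actual reduction is smaller.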

  19. Complex saddle points and the sign problem in complex Langevin simulation

    International Nuclear Information System (INIS)

    Hayata, Tomoya; Hidaka, Yoshimasa; Tanizaki, Yuya

    2016-01-01

    We show that complex Langevin simulation converges to a wrong result within the semiclassical analysis, by relating it to the Lefschetz-thimble path integral, when the path-integral weight has different phases among dominant complex saddle points. Equilibrium solution of the complex Langevin equation forms local distributions around complex saddle points. Its ensemble average approximately becomes a direct sum of the average in each local distribution, where relative phases among them are dropped. We propose that by taking these phases into account through reweighting, we can solve the wrong convergence problem. However, this prescription may lead to a recurrence of the sign problem in the complex Langevin method for quantum many-body systems.
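
    As a minimal illustration of the method itself (not of the multi-saddle pathology the paper analyses), complex Langevin evolution for a single Gaussian mode S(z) = sigma*z^2/2 with complex sigma is known to converge to the correct result <z^2> = 1/sigma:

```python
import numpy as np

# Toy check of complex Langevin on a single Gaussian mode S(z) = sigma*z**2/2
# with complex sigma (Re sigma > 0). Here the method converges correctly;
# the wrong convergence discussed in the abstract requires several
# competing complex saddle points with different phases.
rng = np.random.default_rng(0)
sigma = 1.0 + 1.0j
n_walkers, dt, n_steps = 10_000, 2e-3, 5_000   # total time 10 >> 1/Re(sigma)

z = np.zeros(n_walkers, dtype=complex)
for _ in range(n_steps):
    # drift -dS/dz = -sigma*z is complexified; the noise stays real
    z += -sigma * z * dt + np.sqrt(2.0 * dt) * rng.normal(size=n_walkers)

z2_langevin = (z * z).mean()   # ensemble average over equilibrated walkers
z2_exact = 1.0 / sigma         # Gaussian integral: <z^2> = 1/sigma
```

    The ensemble of walkers here plays the role of the equilibrium distribution; in the multi-saddle case each walker cluster would carry a relative phase that this naive average drops.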

  20. Wall modeled large eddy simulations of complex high Reynolds number flows with synthetic inlet turbulence

    International Nuclear Information System (INIS)

    Patil, Sunil; Tafti, Danesh

    2012-01-01

    Highlights: ► Large eddy simulation. ► Wall layer modeling. ► Synthetic inlet turbulence. ► Swirl flows. - Abstract: Large eddy simulations of complex high Reynolds number flows are carried out with the near wall region being modeled with a zonal two layer model. A novel formulation for solving the turbulent boundary layer equation for the effective tangential velocity in a generalized co-ordinate system is presented and applied in the near wall zonal treatment. This formulation reduces the computational time in the inner layer significantly compared to the conventional two layer formulations present in the literature and is most suitable for complex geometries involving body fitted structured and unstructured meshes. The cost effectiveness and accuracy of the proposed wall model, used with the synthetic eddy method (SEM) to generate inlet turbulence, is investigated in turbulent channel flow, flow over a backward facing step, and confined swirling flows at moderately high Reynolds numbers. Predictions are compared with available DNS, experimental LDV data, as well as wall resolved LES. In all cases, there is at least an order of magnitude reduction in computational cost with no significant loss in prediction accuracy.
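
    The zonal two-layer model integrates a turbulent boundary-layer equation for the effective tangential velocity; a much simpler stand-in that conveys what a wall model does is an equilibrium log-law wall function, solved for the friction velocity by Newton iteration. The constants are the standard log-law values and the test values are manufactured:

```python
import numpy as np

# Equilibrium wall function: given the LES velocity U at the first off-wall
# point y, solve the log law  U/u_tau = ln(y*u_tau/nu)/kappa + B  for the
# friction velocity u_tau. This is a simplified stand-in for the paper's
# zonal two-layer model, not its actual formulation.
KAPPA, B = 0.41, 5.2

def friction_velocity(U, y, nu, tol=1e-12, max_iter=50):
    u_tau = max(1e-6, 0.05 * U)          # crude initial guess
    for _ in range(max_iter):
        log_term = np.log(y * u_tau / nu) / KAPPA + B
        f = u_tau * log_term - U         # residual of the log law
        df = log_term + 1.0 / KAPPA      # d f / d u_tau
        step = f / df
        u_tau -= step
        if abs(step) < tol * u_tau:
            break
    return u_tau

# consistency check: manufacture U from a known u_tau, then recover it
nu, y, u_tau_true = 1e-5, 0.01, 0.05     # y+ = y*u_tau/nu = 50 (log layer)
U = u_tau_true * (np.log(y * u_tau_true / nu) / KAPPA + B)
u_tau = friction_velocity(U, y, nu)
```

    The recovered u_tau sets the wall shear stress fed back to the outer LES, which is the essential coupling in any wall-layer model.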

  1. Integrating over Higgs branches

    International Nuclear Information System (INIS)

    Moore, G.; Shatashvili, S.

    2000-01-01

    We develop some useful techniques for integrating over Higgs branches in supersymmetric theories with 4 and 8 supercharges. In particular, we define a regularized volume for hyperkaehler quotients. We evaluate this volume for certain ALE and ALF spaces in terms of the hyperkaehler periods. We also reduce these volumes for a large class of hyperkaehler quotients to simpler integrals. These quotients include complex coadjoint orbits, instanton moduli spaces on R^4 and ALE manifolds, Hitchin spaces, and moduli spaces of (parabolic) Higgs bundles on Riemann surfaces. In the case of Hitchin spaces the evaluation of the volume reduces to a summation over solutions of Bethe ansatz equations for the non-linear Schroedinger system. We discuss some applications of our results. (orig.)

  2. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases

    NARCIS (Netherlands)

    Heidema, A.G.; Boer, J.M.A.; Nagelkerke, N.; Mariman, E.C.M.; A, van der D.L.; Feskens, E.J.M.

    2006-01-01

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods

  3. Energy optimization and prediction of complex petrochemical industries using an improved artificial neural network approach integrating data envelopment analysis

    International Nuclear Information System (INIS)

    Han, Yong-Ming; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-01-01

    Graphical abstract: This paper proposes an energy optimization and prediction of complex petrochemical industries based on a DEA-integrated ANN approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA-ANN prediction model is effectively verified by executing a linear comparison between all DMUs and the effective DMUs through the standard data source from the UCI (University of California at Irvine) repository. Finally, the proposed model is validated through an application in a complex ethylene production system of the Chinese petrochemical industry. Meanwhile, the optimization result and the prediction value are obtained to reduce energy consumption of the ethylene production system, guide ethylene production and improve energy efficiency. - Highlights: • The DEA-integrated ANN approach is proposed. • The DEA-ANN prediction model is effectively verified through the standard data source from the UCI repository. • The energy optimization and prediction framework of complex petrochemical industries based on the proposed method is obtained. • The proposed method is valid and efficient in improving energy efficiency in complex petrochemical plants. - Abstract: Because complex petrochemical data are high-dimensional, uncertain and noisy, it is difficult to accurately optimize and predict the energy usage of complex petrochemical systems. Therefore, this paper proposes a data envelopment analysis (DEA) integrated artificial neural network (ANN) approach (DEA-ANN). The proposed approach utilizes the DEA model with slack variables for sensitivity analysis to determine the effective decision making units (DMUs) and indicate the optimized direction of the ineffective DMUs. Compared with the traditional ANN approach, the DEA
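
    In the single-input, single-output special case, CCR-type DEA efficiency reduces to a closed form (the general multi-input/multi-output model requires solving a linear program per DMU). A sketch with invented DMU data:

```python
import numpy as np

# CCR efficiency in the single-input / single-output special case, where the
# DEA linear program reduces to a ratio against the best-practice frontier.
# DMU data below are invented: energy input x vs ethylene output y.
def ccr_efficiency(x, y):
    productivity = y / x
    return productivity / productivity.max()

x = np.array([100.0, 120.0, 150.0, 200.0])   # energy consumption
y = np.array([50.0, 60.0, 60.0, 70.0])       # ethylene yield
theta = ccr_efficiency(x, y)                 # -> approx [1.0, 1.0, 0.8, 0.7]
# effective DMUs lie on the frontier (theta == 1); for the others, theta
# indicates the optimized direction: shrink inputs by the factor theta
efficient = theta >= 1.0 - 1e-9
```

    The paper's DEA model with slack variables generalizes this ratio to many inputs and outputs; the effective DMUs then serve as the training targets for the ANN.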

  4. Geo-Semantic Framework for Integrating Long-Tail Data and Model Resources for Advancing Earth System Science

    Science.gov (United States)

    Elag, M.; Kumar, P.

    2014-12-01

    Scientists and small research groups often collect data that target specific issues and have limited geographic or temporal range. A large number of such collections together constitute a large database that is of immense value to Earth Science studies. The complexity of integrating these data includes heterogeneity in dimensions, coordinate systems, scales, variables, providers, users and contexts. They have been defined as long-tail data. Similarly, we use "long-tail models" to characterize a heterogeneous collection of models and/or modules developed for targeted problems by individuals and small groups, which together provide a large valuable collection. The complexity of integrating across these models includes differing variable names and units for the same concept, model runs at different time steps and spatial resolution, use of differing naming and reference conventions, etc. The ability to "integrate long-tail models and data" will provide an opportunity for the interoperability and reusability of communities' resources, where not only can models be combined in a workflow, but each model will be able to discover and (re)use data in an application-specific context of space, time and questions. This capability is essential to represent, understand, predict, and manage heterogeneous and interconnected processes and activities by harnessing the complex, heterogeneous, and extensive set of distributed resources. Because of the staggering production rate of long-tail models and data resulting from advances in computational, sensing, and information technologies, an important challenge arises: how can geoinformatics bring together these resources seamlessly, given the inherent complexity among model and data resources that span various domains? We will present a semantic-based framework to support integration of "long-tail" models and data.
This builds on existing technologies including: (i) SEAD (Sustainable Environmental Actionable Data) which supports curation
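
    The core integration problem named above (the same concept hiding behind different variable names and units across long-tail models) can be illustrated with a minimal concept-mapping sketch. All names, units and conversion factors below are hypothetical, not an actual SEAD vocabulary:

```python
# Minimal semantic-mediation sketch: map model-local (name, unit) pairs onto
# shared concepts with scale factors to SI units. All entries are invented
# illustrations of the idea, not a real controlled vocabulary.
CONCEPT_MAP = {
    ("Q_out", "m3/s"): ("discharge", 1.0),
    ("flow", "L/s"): ("discharge", 1e-3),
    ("precip_mm_hr", "mm/h"): ("precipitation_rate", 1.0 / 3.6e6),  # -> m/s
}

def mediate(name, unit, value):
    """Translate a model-local variable into (shared concept, SI value)."""
    concept, factor = CONCEPT_MAP[(name, unit)]
    return concept, value * factor

# two "long-tail" models reporting the same quantity in different conventions
c1, v1 = mediate("Q_out", "m3/s", 2.5)
c2, v2 = mediate("flow", "L/s", 2500.0)   # same discharge, different units
```

    Once both models' outputs resolve to the same concept and SI value, they can be discovered and chained in a workflow regardless of each model's local conventions.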

  5. GEOMETRIC COMPLEXITY ANALYSIS IN AN INTEGRATIVE TECHNOLOGY EVALUATION MODEL (ITEM) FOR SELECTIVE LASER MELTING (SLM)

    Directory of Open Access Journals (Sweden)

    S. Merkt

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Selective laser melting (SLM) is becoming an economically viable choice for manufacturing complex serial parts. This paper focuses on a geometric complexity analysis as part of the integrative technology evaluation model (ITEM) presented here. In contrast to conventional evaluation methodologies, the ITEM considers interactions between product and process innovations generated by SLM. The evaluation of manufacturing processes that compete with SLM is the main goal of ITEM. The paper includes a complexity analysis of a test part from Festo AG. The paper closes with a discussion of how the expanded design freedom of SLM can be used to improve company operations, and how the complexity analysis presented here can be seen as a starting point for feature-based complexity analysis.

    AFRIKAANSE OPSOMMING (translated): Selective laser melting is gradually becoming an economically viable choice for the manufacture of complex series parts. The research focuses on the analysis of geometric complexity as part of an integrative technology evaluation model. Measured against conventional evaluation models, the described method treats interactions between generated product and process innovations. The research covers a complexity analysis of a test part from the firm FESTO AG. The result shows how complexity analysis can be used as the starting point for feature-based analysis.

  6. Heritability and demographic analyses in the large isolated population of Val Borbera suggest advantages in mapping complex traits genes.

    Directory of Open Access Journals (Sweden)

    Michela Traglia

    2009-10-01

    Full Text Available Isolated populations are a useful resource for mapping complex traits due to shared stable environment, reduced genetic complexity and extended Linkage Disequilibrium (LD) compared to the general population. Here we describe a large genetic isolate from the North West Apennines, the mountain range that runs through Italy from the North West Alps to the South. The study involved 1,803 people living in 7 villages of the upper Borbera Valley. For this large population cohort, data from genealogy reconstruction, medical questionnaires, blood, anthropometric and bone status QUS parameters were evaluated. Demographic and epidemiological analyses indicated a substantial genetic component contributing to each trait variation as well as overlapping genetic determinants and family clustering for some traits. The data provide evidence for significant heritability of medically relevant traits that will be important in mapping quantitative traits. We suggest that this population isolate is suitable to identify rare variants associated with complex phenotypes that may be difficult to study in larger but more heterogeneous populations.

  7. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  8. The system of computer simulation and organizational management of large enterprises activity

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available The construction of integrated technical support is studied using the example of organizational information systems for the administrative and economic management of large organizations. Within a management information system, the integrated technical support is related to the other parts of the system, above all to the information database management system, which covers all types of information required for planning and management, and to the algorithms for processing this information. This means not only that the control system determines the required set of technical means, but also that its features significantly affect the composition and organization of the management information system database. A feature of integrated logistics is the variety of hardware functions: a large number of device types, different modes of interaction between operator and equipment, and the possibility of different line-ups and aggregations of devices. The complex of technical means of an information management system has all the features of a complex system: versatility, feedback, multiple criteria, a hierarchical structure, and the presence of interconnected parts with complex interactions, whose uncertain behavior results from the finite reliability of technical means and the influence of environmental disturbances. For this reason, the tasks associated with creating the integrated logistics of a management information system should be solved with a systems approach. To maximize the efficiency of the information management system, the technological complex must be constructed with minimal installation and operating costs, which leads to the need to choose the optimal variant of technical means from the number of possible ones.
    The solution of the main objectives of integrated logistics can be reduced to the construction of a joint set of languages - character sets or alphabets describing the input

  9. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

    In the last decade rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein...... expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been...

  10. How Project Managers Really Manage: An Indepth Look at Some Managers of Large, Complex NASA Projects

    Science.gov (United States)

    Mulenburg, Gerald M.; Impaeilla, Cliff (Technical Monitor)

    2000-01-01

    This paper reports on a research study by the author that examined ten contemporary National Aeronautics and Space Administration (NASA) complex projects. In-depth interviews with the project managers of these projects provided qualitative data about the inner workings of the projects and the methodologies used in establishing and managing them. The inclusion of a variety of space, aeronautics, and ground-based projects from several different NASA research centers helped to reduce potential bias in the findings toward any one type of project or technical discipline. The findings address the participants and their individual approaches. The discussion includes possible implications for project managers of other large, complex projects.

  11. Control of Genome Integrity by RFC Complexes; Conductors of PCNA Loading onto and Unloading from Chromatin during DNA Replication

    Directory of Open Access Journals (Sweden)

    Yasushi Shiomi

    2017-01-01

    Full Text Available During cell division, genome integrity is maintained by faithful DNA replication during S phase, followed by accurate segregation in mitosis. Many DNA metabolic events linked with DNA replication are also regulated throughout the cell cycle. In eukaryotes, the DNA sliding clamp, proliferating cell nuclear antigen (PCNA), acts on chromatin as a processivity factor for DNA polymerases. Since its discovery, many other PCNA binding partners have been identified that function during DNA replication, repair, recombination, chromatin remodeling, cohesion, and proteolysis in cell-cycle progression. PCNA not only recruits the proteins involved in such events, but it also actively controls their function as chromatin assembles. Therefore, control of PCNA-loading onto chromatin is fundamental for various replication-coupled reactions. PCNA is loaded onto chromatin by PCNA-loading replication factor C (RFC) complexes. Both RFC1-RFC and Ctf18-RFC fundamentally function as PCNA loaders. On the other hand, after DNA synthesis, PCNA must be removed from chromatin by Elg1-RFC. Functional defects in RFC complexes lead to chromosomal abnormalities. In this review, we summarize the structural and functional relationships among RFC complexes, and describe how the regulation of PCNA loading/unloading by RFC complexes contributes to maintaining genome integrity.

  12. Systems and complexity thinking in the general practice literature: an integrative, historical narrative review.

    Science.gov (United States)

    Sturmberg, Joachim P; Martin, Carmel M; Katerndahl, David A

    2014-01-01

    Over the past 7 decades, theories in the systems and complexity sciences have had a major influence on academic thinking and research. We assessed the impact of complexity science on general practice/family medicine. We performed a historical integrative review using the following systematic search strategy: medical subject heading [humans] combined in turn with the terms complex adaptive systems, nonlinear dynamics, systems biology, and systems theory, limited to general practice/family medicine and published before December 2010. A total of 16,242 articles were retrieved, of which 49 were published in general practice/family medicine journals. Hand searches and snowballing retrieved another 35. After a full-text review, we included 56 articles dealing specifically with systems sciences and general/family practice. General practice/family medicine engaged with the emerging systems and complexity theories in 4 stages. Before 1995, articles tended to explore common phenomenologic general practice/family medicine experiences. Between 1995 and 2000, articles described the complex adaptive nature of this discipline. Those published between 2000 and 2005 focused on describing the system dynamics of medical practice. After 2005, articles increasingly applied the breadth of complex science theories to health care, health care reform, and the future of medicine. This historical review describes the development of general practice/family medicine in relation to complex adaptive systems theories, and shows how systems sciences more accurately reflect the discipline's philosophy and identity. Analysis suggests that general practice/family medicine first embraced systems theories through conscious reorganization of its boundaries and scope, before applying empirical tools. 
Future research should concentrate on applying nonlinear dynamics and empirical modeling to patient care, and to organizing and developing local practices, engaging in community development, and influencing

  13. Hybrid Integration of Solid-State Quantum Emitters on a Silicon Photonic Chip.

    Science.gov (United States)

    Kim, Je-Hyung; Aghaeimeibodi, Shahriar; Richardson, Christopher J K; Leavitt, Richard P; Englund, Dirk; Waks, Edo

    2017-12-13

    Scalable quantum photonic systems require efficient single photon sources coupled to integrated photonic devices. Solid-state quantum emitters can generate single photons with high efficiency, while silicon photonic circuits can manipulate them in an integrated device structure. Combining these two material platforms could, therefore, significantly increase the complexity of integrated quantum photonic devices. Here, we demonstrate hybrid integration of solid-state quantum emitters to a silicon photonic device. We develop a pick-and-place technique that can position epitaxially grown InAs/InP quantum dots emitting at telecom wavelengths on a silicon photonic chip deterministically with nanoscale precision. We employ an adiabatic tapering approach to transfer the emission from the quantum dots to the waveguide with high efficiency. We also incorporate an on-chip silicon-photonic beamsplitter to perform a Hanbury-Brown and Twiss measurement. Our approach could enable integration of precharacterized III-V quantum photonic devices into large-scale photonic structures to enable complex devices composed of many emitters and photons.

  14. An Integrated Approach for Monitoring Contemporary and Recruitable Large Woody Debris

    Directory of Open Access Journals (Sweden)

    Jeffrey J. Richardson

    2016-09-01

    Full Text Available Large woody debris (LWD) plays a critical structural role in riparian ecosystems, but it can be difficult and time-consuming to quantify and survey in the field. We demonstrate an automated method for quantifying LWD using aerial LiDAR and object-based image analysis techniques, as well as a manual method for quantifying LWD using image interpretation derived from LiDAR rasters and aerial four-band imagery. In addition, we employ an established method for estimating the number of individual trees within the riparian forest. These methods are compared to field data showing high accuracies for the LWD method and moderate accuracy for the individual tree method. These methods can be integrated to quantify the contemporary and recruitable LWD in a river system.
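
    The object-based detection step can be illustrated with a threshold-and-label sketch on a synthetic height raster. The threshold, size rule and raster below are invented; the paper's actual pipeline works on aerial LiDAR with object-based image analysis software:

```python
import numpy as np
from collections import deque

def label_objects(mask):
    """4-connected component labeling of a boolean raster (simple BFS)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                q = deque([(i, j)])
                labels[i, j] = current
                while q:
                    a, b = q.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = a + da, b + db
                        if (0 <= u < mask.shape[0] and 0 <= v < mask.shape[1]
                                and mask[u, v] and labels[u, v] == 0):
                            labels[u, v] = current
                            q.append((u, v))
    return labels, current

# synthetic height raster: two objects protruding above the water surface
chm = np.zeros((20, 20))
chm[5, 2:12] = 0.8        # elongated, log-like object -> candidate LWD
chm[14:16, 6:8] = 0.6     # small blob -> rejected by the size rule below
mask = chm > 0.3          # height threshold above the water surface
labels, n = label_objects(mask)
sizes = np.bincount(labels.ravel())[1:]    # cells per labeled object
lwd_count = int((sizes >= 5).sum())        # keep objects of >= 5 cells
```

    A real object-based analysis would add shape rules (e.g. elongation) on top of the size filter to separate logs from boulders or vegetation clumps.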

  15. Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice.

    Science.gov (United States)

    Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M

    2012-10-01

    To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Reconceptualizing children's complex discharge with health systems theory: novel integrative review with embedded expert consultation and theory development.

    Science.gov (United States)

    Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh

    2014-05-01

    To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Novel mixed-method integrative review informed by health systems theory. DATA SOURCES: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. REVIEW METHODS: Informed by consultation with experts. English language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals and purposively selected policies/guidelines from 2002-December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.

  17. An Integrated Circuit for Radio Astronomy Correlators Supporting Large Arrays of Antennas

    Science.gov (United States)

    D'Addario, Larry R.; Wang, Douglas

    2016-01-01

    Radio telescopes that employ arrays of many antennas are in operation, and ever larger ones are being designed and proposed. Signals from the antennas are combined by cross-correlation. While the cost of most components of the telescope is proportional to the number of antennas N, the cost and power consumption of cross-correlation are proportional to N² and dominate at sufficiently large N. Here we report the design of an integrated circuit (IC) that performs digital cross-correlations for arbitrarily many antennas in a power-efficient way. It uses an intrinsically low-power architecture in which the movement of data between devices is minimized. In a large system, each IC performs correlations for all pairs of antennas but for a portion of the telescope's bandwidth (the so-called "FX" structure). In our design, the correlations are performed in an array of 4096 complex multiply-accumulate (CMAC) units. This is sufficient to perform all correlations in parallel for 64 signals (N=32 antennas with 2 opposite-polarization signals per antenna). When N is larger, the input data are buffered in an on-chip memory and the CMACs are re-used as many times as needed to compute all correlations. The design has been synthesized and simulated so as to obtain accurate estimates of the IC's size and power consumption. It is intended for fabrication in a 32 nm silicon-on-insulator process, where it will require less than 12 mm² of silicon area and achieve an energy efficiency of 1.76 to 3.3 pJ per CMAC operation, depending on the number of antennas. Operation has been analyzed in detail up to N = 4096. The system-level energy efficiency, including board-level I/O, power supplies, and controls, is expected to be 5 to 7 pJ per CMAC operation. Existing correlators for the JVLA (N = 32) and ALMA (N = 64) telescopes achieve about 5000 pJ and 1000 pJ respectively using application-specific ICs in older technologies. To our knowledge, the largest-N existing correlator is LEDA at N = 256; it
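
    As an illustration of the FX structure described in the abstract, where each processing unit handles all antenna pairs for a slice of the bandwidth, the following sketch (a toy model, not the IC's actual design) accumulates the full visibility matrix with complex multiply-accumulate operations re-used across time frames:

```python
import numpy as np

def fx_correlate(samples):
    """Toy FX correlator core. `samples` has shape (n_signals, n_frames,
    n_channels) of complex channelized voltages; returns the accumulated
    cross-correlation (visibility) matrix for each frequency channel."""
    n_sig, n_frames, n_chan = samples.shape
    vis = np.zeros((n_chan, n_sig, n_sig), dtype=complex)
    for t in range(n_frames):              # CMAC units re-used each frame
        for f in range(n_chan):
            x = samples[:, t, f]
            vis[f] += np.outer(x, x.conj())  # complex multiply-accumulate
    return vis

rng = np.random.default_rng(0)
data = rng.normal(size=(4, 100, 8)) + 1j * rng.normal(size=(4, 100, 8))
v = fx_correlate(data)
assert v.shape == (8, 4, 4)                       # one matrix per channel
assert np.allclose(v[:, 0, 0].imag, 0)            # autocorrelations are real
assert np.allclose(v, np.conj(np.transpose(v, (0, 2, 1))))  # Hermitian
```

    The quadratic cost in N is visible directly: each frame and channel requires all N(N+1)/2 distinct antenna pairs, which is why correlation dominates at large N.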

  18. Integrating transition theory and bioecological theory: a theoretical perspective for nurses supporting the transition to adulthood for young people with medical complexity.

    Science.gov (United States)

    Joly, Elizabeth

    2016-06-01

    To present a discussion of a theoretical perspective developed through integrating Meleis' Transition Theory and Bronfenbrenner's Bioecological Theory of Human Development to inform nursing and advanced nursing practice supporting the transition to adulthood for young people with medical complexity. Theoretical perspectives to inform nursing practice in supporting successful transition are limited, yet nurses frequently encounter young people with medical complexity during the transition to adulthood. Discussion paper. A literature search of CINAHL and Medline was conducted in 2014 and included articles from 2003-2014; informal discussions with families; the author's experiences in a transition program. The integrated theoretical perspective described in this paper can inform nurses and advanced practice nurses on contextual influences, program and intervention development across spheres of influence and outcomes for the transition to adulthood for young people with medical complexity. Young people and their families require effective reciprocal interactions with individuals and services across sectors to successfully transition to adulthood and become situated in the adult world. Intervention must also extend beyond the young person to include providers, services and health and social policy. Nurses can take a leadership role in supporting the transition to adulthood for young people with medical complexity through direct care, case management, education and research. It is integral that nurses holistically consider developmental processes, complexity and contextual conditions that promote positive outcomes during and beyond the transition to adulthood. © 2016 John Wiley & Sons Ltd.

  19. SPECTROSCOPIC STUDY OF THE N159/N160 COMPLEX IN THE LARGE MAGELLANIC CLOUD

    International Nuclear Information System (INIS)

    Farina, Cecilia; Bosch, Guillermo L.; Morrell, Nidia I.; Barba, Rodolfo H.; Walborn, Nolan R.

    2009-01-01

    We present a spectroscopic study of the N159/N160 massive star-forming region south of 30 Doradus in the Large Magellanic Cloud, classifying a total of 189 stars in the field of the complex. Most of them belong to O and early B spectral classes; we have also found some uncommon and very interesting spectra, including members of the Onfp class, a Be P Cygni star, and some possible multiple systems. Using spectral types as broad indicators of evolutionary stages, we considered the evolutionary status of the region as a whole. We infer that massive stars at different evolutionary stages are present throughout the region, favoring the idea of a common time for the origin of recent star formation in the N159/N160 complex as a whole, while sequential star formation at different rates is probably present in several subregions.

  20. Petrochemical refinery and integrated petrochemical complexes; Refinaria petroquimica e complexos petroquimicos integrados

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Patricia C. dos; Seidl, Peter R.; Borschiver, Suzana [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Escola de Quimica

    2008-07-01

    Global demand for light olefins points to strong prospects for growth, stimulating investments in overall productive capacity. With propylene demand growing slightly faster than that of ethylene, rising prices and difficulties in supplies of petrochemical feedstocks (mainly naphtha and natural gas), steam crackers alone are not able to fill the light olefins gap nor do they allow extraordinary margins. As petrochemical market dynamics also influence refining activities, there has been significant progress in the development of technologies for petrochemical refining, such as Petrochemical FCC. This petrochemistry-refining integration offers great opportunities for synergism since both industries share many common challenges, like more severe environmental requirements and optimizing the use of utilities. However, in the case of valuation of non-conventional oils (which tend to increase in importance in oil markets), to take full advantage of this opportunity to add value to low cost streams, deep conversion and treatment processes are of great significance in the refining scheme to have enough feedstock for cracking. In this context, a petrochemical refinery seems to be an important alternative source of petrochemicals and may be integrated or not to a petrochemical complex. (author)

  1. Managing Active Learning Processes in Large First Year Physics Classes: The Advantages of an Integrated Approach

    Directory of Open Access Journals (Sweden)

    Michael J. Drinkwater

    2014-09-01

    Full Text Available Turning lectures into interactive, student-led question and answer sessions is known to increase learning, but enabling interaction in a large class seems an insurmountable task. This can discourage adoption of this new approach – who has time to individualize responses, address questions from over 200 students and encourage active participation in class? An approach adopted by a teaching team in large first-year classes at a research-intensive university appears to provide a means to do so. We describe the implementation of active learning strategies in a large first-year undergraduate physics unit of study, replacing traditional, content-heavy lectures with an integrated approach to question-driven learning. A key feature of our approach is that it facilitates intensive in-class discussions by requiring students to engage in preparatory reading and answer short written quizzes before every class. The lecturer uses software to rapidly analyze the student responses and identify the main issues faced by the students before the start of each class. We report the success of the integration of student preparation with this analysis and feedback framework, and the impact on the in-class discussions. We also address some of the difficulties commonly experienced by staff preparing for active learning classes.

  2. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong

    2013-01-01

    This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and tip speed ratio, the optimal airfoils are designed based on the local speed ratios. To achieve high power performance at low cost, the airfoils are designed...... with an objective of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated which forms the basic input to the blade design. The new airfoils are designed based on the previous in-house airfoil family which was optimized at a Reynolds number of 3...... million. A novel shape perturbation function is introduced to optimize the geometry on the existing airfoils and thus simplify the design procedure. The viscous/inviscid code XFOIL is used as the aerodynamic tool for airfoil optimization where the Reynolds number is set at 16 million with a free...

  3. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2014-01-01

    This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and a tip speed ratio, optimal airfoils are designed based on the local speed ratios. To achieve a high power performance at low cost, the airfoils are designed...... with the objectives of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated which forms the basic input to the blade design. The new airfoils are designed based on a previous in-house designed airfoil family which was optimized at a Reynolds number...... of 3 million. A novel shape perturbation function is introduced to optimize the geometry based on the existing airfoils which simplifies the design procedure. The viscous/inviscid interactive code XFOIL is used as the aerodynamic tool for airfoil optimization at a Reynolds number of 16 million...
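
    The two abstracts above do not specify the shape perturbation function. A common choice in airfoil shape optimization, shown here purely as an illustrative sketch (not necessarily the authors' function), is the Hicks-Henne bump: it vanishes at both ends of the chord, so the leading and trailing edges stay fixed, and peaks at a chosen chordwise location:

```python
import numpy as np

def hicks_henne_bump(x, peak_loc, width=3.0):
    """Unit-amplitude Hicks-Henne bump on x in [0, 1] (chordwise coordinate):
    zero at both endpoints, maximum of 1 at x = peak_loc."""
    m = np.log(0.5) / np.log(peak_loc)      # places sin's peak at peak_loc
    return np.sin(np.pi * x**m) ** width

def perturb_surface(x, y, amplitudes, peak_locs):
    """Add a weighted sum of bumps to an existing airfoil surface y(x)."""
    dy = sum(a * hicks_henne_bump(x, h) for a, h in zip(amplitudes, peak_locs))
    return y + dy

x = np.linspace(0.0, 1.0, 101)
y = 0.1 * np.sin(np.pi * x)                 # stand-in baseline surface
y_new = perturb_surface(x, y, [0.01, -0.005], [0.3, 0.7])
assert abs(hicks_henne_bump(np.array([0.3]), 0.3)[0] - 1.0) < 1e-12  # peak
assert abs(y_new[0] - y[0]) < 1e-12 and abs(y_new[-1] - y[-1]) < 1e-10
```

    Optimizing only the bump amplitudes, rather than every surface coordinate, is what makes this kind of parameterization simplify the design procedure.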

  4. Large scale integration of intermittent renewable energy sources in the Greek power sector

    International Nuclear Information System (INIS)

    Voumvoulakis, Emmanouil; Asimakopoulou, Georgia; Danchev, Svetoslav; Maniatis, George; Tsakanikas, Aggelos

    2012-01-01

    As a member of the European Union, Greece has committed to achieve ambitious targets for the penetration of renewable energy sources (RES) in gross electricity consumption by 2020. Large scale integration of RES requires a suitable mixture of compatible generation units, in order to deal with the intermittency of wind velocity and solar irradiation. The scope of this paper is to examine the impact of large scale integration of intermittent energy sources, required to meet the 2020 RES target, on the generation expansion plan, the fuel mix and the spinning reserve requirements of the Greek electricity system. We perform hourly simulation of the intermittent RES generation to estimate residual load curves on a monthly basis, which are then input into a WASP-IV model of the Greek power system. We find that the decarbonisation effort, with the rapid entry of RES and the abolishment of the grandfathering of CO₂ allowances, will radically transform the Greek electricity sector over the next 10 years, which has wide-reaching policy implications. - Highlights: ► Greece needs 8.8 to 9.3 GW additional RES installations by 2020. ► RES capacity credit varies between 12.2% and 15.3%, depending on interconnections. ► Without institutional changes, the reserve requirements will be more than double. ► New CCGT installed capacity will probably exceed the cost-efficient level. ► Competitive pressures should be introduced in segments other than day-ahead market.
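
    The residual load construction described in the abstract, subtracting simulated intermittent RES output from hourly load and summarizing the result as a duration curve, can be sketched as follows. The profiles are invented toy numbers, not Greek system data:

```python
import numpy as np

def residual_load_duration(load, wind, solar):
    """Hourly residual load (demand minus intermittent RES output),
    sorted in descending order to form a load-duration curve."""
    residual = np.asarray(load) - np.asarray(wind) - np.asarray(solar)
    return np.sort(residual)[::-1]

hours = np.arange(24)
load  = 6000 + 1500 * np.sin(np.pi * (hours - 6) / 12)       # toy profile, MW
wind  = 800 * np.ones(24)                                    # flat toy output
solar = np.where((hours >= 7) & (hours <= 18), 1200.0, 0.0)  # daylight only
curve = residual_load_duration(load, wind, solar)
assert curve[0] >= curve[-1]        # non-increasing by construction
assert np.isclose(curve.sum(), (load - wind - solar).sum())  # energy preserved
```

    The conventional thermal fleet in a WASP-style expansion model is then planned against this residual curve rather than the raw load, which is how the intermittency enters the optimization.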

  5. Threshold corrections, generalised prepotentials and Eichler integrals

    CERN Document Server

    Angelantonj, Carlo; Pioline, Boris

    2015-06-12

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in $\mathcal{N}=2$ heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur-Poincaré series in the complex structure modulus. The closure of Niebur-Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials $f_n$, generalising the familiar prepotential of $\mathcal{N}=2$ supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involv...

  6. Large-scale quantum photonic circuits in silicon

    Directory of Open Access Journals (Sweden)

    Harris Nicholas C.

    2016-08-01

    Full Text Available Quantum information science offers inherently more powerful methods for communication, computation, and precision measurement that take advantage of quantum superposition and entanglement. In recent years, theoretical and experimental advances in quantum computing and simulation with photons have spurred great interest in developing large photonic entangled states that challenge today’s classical computers. As experiments have increased in complexity, there has been an increasing need to transition bulk optics experiments to integrated photonics platforms to control more spatial modes with higher fidelity and phase stability. The silicon-on-insulator (SOI) nanophotonics platform offers new possibilities for quantum optics, including the integration of bright, nonclassical light sources, based on the large third-order nonlinearity (χ(3)) of silicon, alongside quantum state manipulation circuits with thousands of optical elements, all on a single phase-stable chip. How large do these photonic systems need to be? Recent theoretical work on Boson Sampling suggests that even the problem of sampling from ∼30 identical photons, having passed through an interferometer of hundreds of modes, becomes challenging for classical computers. While experiments of this size are still challenging, the SOI platform has the required component density to enable low-loss and programmable interferometers for manipulating hundreds of spatial modes.
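
    The classical hardness referenced in the abstract comes from the fact that Boson Sampling amplitudes are matrix permanents, for which the best known general algorithms scale exponentially. A minimal sketch of Ryser's O(2ⁿ·n²) formula illustrates why ∼30 photons already strains classical computation:

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula:
    perm(A) = (-1)^n * sum over non-empty column subsets S of
              (-1)^{|S|} * prod_i (sum_{j in S} a[i][j]).
    Exponential in n, which is the source of Boson Sampling's hardness."""
    n = len(a)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

assert permanent([[1, 2], [3, 4]]) == 10               # 1*4 + 2*3
assert permanent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]) == 1
```

    Unlike the determinant, no polynomial-time algorithm for the permanent is known, so each added photon roughly doubles the classical simulation cost.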

  7. Integration of metabolomics data into metabolic networks.

    Science.gov (United States)

    Töpfer, Nadine; Kleessen, Sabrina; Nikoloski, Zoran

    2015-01-01

    Metabolite levels together with their corresponding metabolic fluxes are integrative outcomes of biochemical transformations and regulatory processes and they can be used to characterize the response of biological systems to genetic and/or environmental changes. However, while changes in transcript or to some extent protein levels can usually be traced back to one or several responsible genes, changes in fluxes and particularly changes in metabolite levels do not follow such rationale and are often the outcome of complex interactions of several components. The increasing quality and coverage of metabolomics technologies have fostered the development of computational approaches for integrating metabolic read-outs with large-scale models to predict the physiological state of a system. Constraint-based approaches, relying on the stoichiometry of the considered reactions, provide a modeling framework amenable to analyses of large-scale systems and to the integration of high-throughput data. Here we review the existing approaches that integrate metabolomics data in variants of constraint-based approaches to refine model reconstructions, to constrain flux predictions in metabolic models, and to relate network structural properties to metabolite levels. Finally, we discuss the challenges and perspectives in the developments of constraint-based modeling approaches driven by metabolomics data.
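
    The constraint-based idea reviewed above can be reduced to a tiny flux balance analysis: impose steady state (S·v = 0) from reaction stoichiometry, add a measurement-informed flux bound, and optimize an objective flux. The network and numbers here are hypothetical, for illustration only:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# v1: uptake -> A, v2: A -> B, v3: B -> export).
S = np.array([[1, -1,  0],
              [0,  1, -1]])
# Steady state requires S v = 0. A hypothetical metabolomics-derived
# constraint caps the uptake flux v1 at 10 flux units.
bounds = [(0, 10), (0, None), (0, None)]
# Maximize the export flux v3 (linprog minimizes, so negate the objective).
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds,
              method="highs")
assert res.success
assert np.isclose(res.x[2], 10.0)   # export is limited by the measured uptake
```

    Tightening or relaxing the data-derived bounds is exactly how metabolomics read-outs "constrain flux predictions" in the approaches the review surveys.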

  8. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale ''integral'' migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  9. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  10. Retention of habitat complexity minimizes disassembly of reef fish communities following disturbance: a large-scale natural experiment.

    Directory of Open Access Journals (Sweden)

    Michael J Emslie

    Full Text Available High biodiversity ecosystems are commonly associated with complex habitats. Coral reefs are highly diverse ecosystems, but are under increasing pressure from numerous stressors, many of which reduce live coral cover and habitat complexity with concomitant effects on other organisms such as reef fishes. While previous studies have highlighted the importance of habitat complexity in structuring reef fish communities, they employed gradient or meta-analyses which lacked a controlled experimental design over broad spatial scales to explicitly separate the influence of live coral cover from overall habitat complexity. Here a natural experiment using a long-term (20-year), spatially extensive (∼115,000 km²) dataset from the Great Barrier Reef revealed the fundamental importance of overall habitat complexity for reef fishes. Reductions of both live coral cover and habitat complexity had substantial impacts on fish communities compared to relatively minor impacts after major reductions in coral cover but not habitat complexity. Where habitat complexity was substantially reduced, species abundances broadly declined and a far greater number of fish species were locally extirpated, including economically important fishes. This resulted in decreased species richness and a loss of diversity within functional groups. Our results suggest that the retention of habitat complexity following disturbances can ameliorate the impacts of coral declines on reef fishes, so preserving their capacity to perform important functional roles essential to reef resilience. These results add to a growing body of evidence about the importance of habitat complexity for reef fishes, and represent the first large-scale examination of this question on the Great Barrier Reef.

  11. Integral and measure from rather simple to rather complex

    CERN Document Server

    Mackevicius, Vigirdas

    2014-01-01

    This book is devoted to integration, one of the two main operations in calculus. In Part 1, the definition of the integral of a one-variable function is different (not essentially, but rather methodically) from traditional definitions of Riemann or Lebesgue integrals. Such an approach allows us, on the one hand, to quickly develop the practical skills of integration as well as, on the other hand, in Part 2, to pass naturally to the more general Lebesgue integral. Based on the latter, in Part 2, the author develops a theory of integration for functions of several variables. In Part 3, within

  12. The challenge of integrating large scale wind power

    Energy Technology Data Exchange (ETDEWEB)

    Kryszak, B.

    2007-07-01

    The support of renewable energy sources is one of the key issues in current energy policies. The paper presents aspects of the integration of wind power in the electric power system from the perspective of a Transmission System Operator (TSO). Technical, operational and market aspects related to the integration of more than 8000 MW of installed wind power into the Transmission Network of Vattenfall Europe Transmission are discussed, and experiences with the transmission of wind power, wind power prediction, balancing of wind power, power production behaviour and fluctuations are reported. Moreover, issues for wind power integration on a European level will be discussed against the background of a wind power study. (auth)

  13. AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users

    Science.gov (United States)

    Maiersperger, T.

    2017-12-01

    The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.

  14. Task Phase Recognition for Highly Mobile Workers in Large Building Complexes

    DEFF Research Database (Denmark)

    Stisen, Allan; Mathisen, Andreas; Krogh, Søren

    2016-01-01

    …-scale indoor work environments, namely from a WiFi infrastructure providing coarse grained indoor positioning, from inertial sensors in the workers’ mobile phones, and from a task management system yielding information about the scheduled tasks’ start and end locations. The methods presented have low requirements on the accuracy of the indoor positioning, and thus come with low deployment and maintenance effort in real-world settings. We evaluated the proposed methods in a large hospital complex, where the highly mobile workers were recruited among the non-clinical workforce. The evaluation is based on manually labelled real-world data collected over 4 days of regular work life of the mobile workforce. The collected data yields 83 tasks in total involving 8 different orderlies from a major university hospital with a building area of 160,000 m². The results show that the proposed methods can distinguish...

  15. Large space antenna communications systems: Integrated Langley Research Center/Jet Propulsion Laboratory development activities. 2: Langley Research Center activities

    Science.gov (United States)

    Cambell, T. G.; Bailey, M. C.; Cockrell, C. R.; Beck, F. B.

    1983-01-01

    The electromagnetic analysis activities at the Langley Research Center are resulting in efficient and accurate analytical methods for predicting both far- and near-field radiation characteristics of large offset multiple-beam multiple-aperture mesh reflector antennas. The utilization of aperture integration augmented with Geometrical Theory of Diffraction in analyzing the large reflector antenna system is emphasized.

  16. Characterisation of large catastrophic landslides using an integrated field, remote sensing and numerical modelling approach

    OpenAIRE

    Wolter, Andrea Elaine

    2014-01-01

    I apply a forensic, multidisciplinary approach that integrates engineering geology field investigations, engineering geomorphology mapping, long-range terrestrial photogrammetry, and a numerical modelling toolbox to two large rock slope failures to study their causes, initiation, kinematics, and dynamics. I demonstrate the significance of endogenic and exogenic processes, both separately and in concert, in contributing to landscape evolution and conditioning slopes for failure, and use geomor...

  17. How to integrate divergent integrals: a pure numerical approach to complex loop calculations

    International Nuclear Information System (INIS)

    Caravaglios, F.

    2000-01-01

    Loop calculations involve the evaluation of divergent integrals. Usually [G. 't Hooft, M. Veltman, Nucl. Phys. B 44 (1972) 189] one computes them in a number of dimensions different from four where the integral is convergent and then one performs the analytical continuation and considers the Laurent expansion in powers of ε=n-4. In this paper we discuss a method to extract directly all coefficients of this expansion by means of concrete and well defined integrals in a five-dimensional space. We bypass the formal and symbolic procedure of analytic continuation; instead we can numerically compute the integrals to extract directly both the coefficient of the pole 1/ε and the finite part
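
    The idea of extracting Laurent coefficients from concrete numerical evaluations, rather than symbolic analytic continuation, can be caricatured in one variable: evaluate a regularized quantity at two small values of ε and solve for the pole residue and finite part. Here Γ(ε) = 1/ε − γ_E + O(ε) stands in for a dimensionally regularized integral; this toy is not the paper's five-dimensional construction:

```python
import math

def laurent_pole_and_finite(f, eps1=1e-2, eps2=5e-3):
    """Given f(eps) ~ a/eps + b + O(eps), estimate the pole residue a and
    finite part b from two numerical evaluations at small eps."""
    f1, f2 = f(eps1), f(eps2)
    a = (f1 - f2) / (1.0 / eps1 - 1.0 / eps2)   # solve the 2x2 linear system
    b = f1 - a / eps1
    return a, b

# Gamma(eps) has residue 1 at eps = 0 and finite part -gamma_E ~ -0.5772.
a, b = laurent_pole_and_finite(math.gamma)
assert abs(a - 1.0) < 1e-2
assert abs(b + 0.5772156649) < 5e-2
```

    The leftover O(ε) error shrinks as the sample points approach zero; with more evaluation points one can fit higher Laurent coefficients the same way.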

  18. Control protocol: large scale implementation at the CERN PS complex - a first assessment

    International Nuclear Information System (INIS)

    Abie, H.; Benincasa, G.; Coudert, G.; Davydenko, Y.; Dehavay, C.; Gavaggio, R.; Gelato, G.; Heinze, W.; Legras, M.; Lustig, H.; Merard, L.; Pearson, T.; Strubin, P.; Tedesco, J.

    1994-01-01

    The Control Protocol is a model-based, uniform access procedure from a control system to accelerator equipment. It was proposed at CERN about 5 years ago and prototypes were developed in the following years. More recently, this procedure has been finalized and implemented at a large scale in the PS Complex. More than 300 pieces of equipment are now using this protocol in normal operation and another 300 are under implementation. These include power converters, vacuum systems, beam instrumentation devices, RF equipment, etc. This paper describes how the single general procedure is applied to the different kinds of equipment. The advantages obtained are also discussed. ((orig.))

  19. Group actions, non-Kähler complex manifolds and SKT structures

    Directory of Open Access Journals (Sweden)

    Poddar Mainak

    2018-02-01

    Full Text Available We give a construction of integrable complex structures on the total space of a smooth principal bundle over a complex manifold, with an even dimensional compact Lie group as structure group, under certain conditions. This generalizes the constructions of complex structure on compact Lie groups by Samelson and Wang, and on principal torus bundles by Calabi-Eckmann and others. It also yields large classes of new examples of non-Kähler compact complex manifolds. Moreover, under suitable restrictions on the base manifold, the structure group, and characteristic classes, the total space of the principal bundle admits SKT metrics. This generalizes recent results of Grantcharov et al. We study the Picard group and the algebraic dimension of the total space in some cases. We also use a slightly generalized version of the construction to obtain (non-Kähler complex structures on tangential frame bundles of complex orbifolds.

  20. Simulations of photochemical smog formation in complex urban areas

    Science.gov (United States)

    Muilwijk, C.; Schrijvers, P. J. C.; Wuerz, S.; Kenjereš, S.

    2016-12-01

In the present study we numerically investigated the dispersion of photochemical reactive pollutants in complex urban areas by applying an integrated Computational Fluid Dynamics (CFD) and Computational Reaction Dynamics (CRD) approach. To model chemical reactions involved in smog generation, the Generic Reaction Set (GRS) approach is used. The GRS model was selected since it does not require detailed modeling of a large set of reactive components. Smog formation is modeled first in the case of an intensive traffic emission, subjected to low to moderate wind conditions in an idealized two-dimensional street canyon with a building aspect ratio (height/width) of one. It is found that Reactive Organic Components (ROC) play an important role in the chemistry of smog formation. In contrast to the NOx/O3 photochemical steady state model that predicts a depletion of the (ground level) ozone, the GRS model predicts generation of ozone. Secondly, the effect of direct sunlight and shadow within the street canyon on the chemical reaction dynamics is investigated for three characteristic solar angles (morning, midday and afternoon). Large differences of up to one order of magnitude are found in the ozone production for different solar angles. As a proof of concept for real urban areas, the integrated CFD/CRD approach is applied for a real-scale (1 × 1 km²) complex urban area (a district of the city of Rotterdam, The Netherlands) with high traffic emissions. The predicted pollutant concentration levels give realistic values that correspond to moderate to heavy smog. It is concluded that the integrated CFD/CRD method with the GRS model of chemical reactions is both accurate and numerically robust, and can be used for modeling of smog formation in complex urban areas.
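The NOx/O3 photochemical steady state that the authors contrast with the GRS results can be sketched numerically. The rate values below are illustrative placeholders, not parameters from the paper:

```python
# Photostationary-state ozone estimate for the NO/NO2/O3 cycle, the simple
# model the GRS approach improves upon. At steady state, NO2 photolysis
# balances the NO + O3 titration reaction:
#   [O3] = j_NO2 [NO2] / (k [NO])
# j_no2: NO2 photolysis rate (1/s); k: NO + O3 rate constant (ppb^-1 s^-1).
# The numbers below are typical daytime magnitudes, chosen for illustration.

def photostationary_o3(j_no2, k, no2_ppb, no_ppb):
    """Ozone mixing ratio (ppb) at the NOx/O3 photostationary state."""
    return j_no2 * no2_ppb / (k * no_ppb)

o3 = photostationary_o3(j_no2=8e-3, k=4.4e-4, no2_ppb=40.0, no_ppb=20.0)
print(round(o3, 1))  # ozone mixing ratio in ppb
```

This closed NOx/O3 cycle can only redistribute ozone; it is the reactive organic compounds added by the GRS mechanism that allow net ozone production.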

  1. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operation: 1. Congestion of transmission lines, 2. Transmission line expansion, 3. Large-scale wind energy integration, and 4. Optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, expansion of transmission line capacity must be evaluated with methods that ensure optimal electric grid operation: the expansion must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are where to add it, how much capacity to add, and at which voltage level. Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest is to generate revenue, to build new transmission lines. Adding new transmission capacity will help the system to relieve the transmission system congestion, create
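A congestion screen of the kind described above is commonly built on a DC power-flow approximation. The following is a minimal sketch on an invented 3-bus network; the network data, injections, limits, and method are illustrative assumptions, not the dissertation's model:

```python
import numpy as np

# Minimal DC power-flow congestion check on a hypothetical 3-bus network.
lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.2)]  # (from, to, reactance p.u.)
p_inj = np.array([1.0, -0.4, -0.6])  # bus injections (bus 0 is the slack)
limits = [0.5, 0.3, 0.5]             # thermal limits per line (p.u.)

# Assemble the nodal susceptance matrix B.
n = 3
B = np.zeros((n, n))
for f, t, x in lines:
    b = 1.0 / x
    B[f, f] += b; B[t, t] += b
    B[f, t] -= b; B[t, f] -= b

# Solve the reduced system for bus angles (slack angle fixed at 0).
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], p_inj[1:])

# Line flows and the set of congested (limit-violating) lines.
flows = [(theta[f] - theta[t]) / x for f, t, x in lines]
congested = [i for i, (fl, lim) in enumerate(zip(flows, limits))
             if abs(fl) > lim]
print(congested)  # line 0 exceeds its limit in this toy case
```

Scanning such flow solutions over projected injection scenarios is one simple way to decide where added line capacity would relieve congestion.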

  2. Data mining in large sets of complex data

    CERN Document Server

    Cordeiro, Robson L F; Júnior, Caetano Traina

    2013-01-01

    The amount and the complexity of the data gathered by current enterprises are increasing at an exponential rate. Consequently, the analysis of Big Data is nowadays a central challenge in Computer Science, especially for complex data. For example, given a satellite image database containing tens of Terabytes, how can we find regions aiming at identifying native rainforests, deforestation or reforestation? Can it be made automatically? Based on the work discussed in this book, the answers to both questions are a sound "yes", and the results can be obtained in just minutes. In fact, results that

  3. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  4. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  5. ETANA-DL: Managing Complex Information Applications - an Archaeology Digital Library

    OpenAIRE

    Ravindranathan, Unni; Shen, Rao; Goncalves, Marcos A.; Fan, Weiguo; Fox, Edward A.; Flanagan, James

    2004-01-01

    Archaeological research results in the generation of large quantities of heterogeneous information managed by different projects using custom information systems. We will demonstrate a prototype Digital Library (DL) for integrating and managing archaeological data and providing services useful to various user communities. ETANA-DL is a model-based, componentized, extensible, archaeological DL that manages complex information sources using the client-server paradigm of the Open Archives Initia...

  6. Structured approaches to large-scale systems: Variational integrators for interconnected Lagrange-Dirac systems and structured model reduction on Lie groups

    Science.gov (United States)

    Parks, Helen Frances

This dissertation presents two projects related to the structured integration of large-scale mechanical systems. Structured integration uses the considerable differential geometric structure inherent in mechanical motion to inform the design of numerical integration schemes. This process improves the qualitative properties of simulations and becomes especially valuable as a measure of accuracy over long time simulations in which traditional Gronwall accuracy estimates lose their meaning. Often, structured integration schemes replicate continuous symmetries and their associated conservation laws at the discrete level. Such is the case for variational integrators, which discretely replicate the process of deriving equations of motion from variational principles. This results in the conservation of momenta associated to symmetries in the discrete system and conservation of a symplectic form when applicable. In the case of Lagrange-Dirac systems, variational integrators preserve a discrete analogue of the Dirac structure preserved in the continuous flow. In the first project of this thesis, we extend Dirac variational integrators to accommodate interconnected systems. We hope this work will find use in the fields of control, where a controlled system can be thought of as a "plant" system joined to its controller, and in the treatment of very large systems, where modular modeling may prove easier than monolithically modeling the entire system. The second project of the thesis considers a different approach to large systems. Given a detailed model of the full system, can we reduce it to a more computationally efficient model without losing essential geometric structures in the system? Asked without the reference to structure, this is the essential question of the field of model reduction. The answer there has been a resounding yes, with Proper Orthogonal Decomposition (POD) with snapshots rising as one of the most successful methods. Our project builds on previous work
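The snapshot-based POD mentioned above can be sketched in a few lines: stack state snapshots as columns of a matrix and take the leading left singular vectors as the reduced basis. This is a generic illustration of the plain method, not the structure-preserving variant developed in the thesis:

```python
import numpy as np

# Proper Orthogonal Decomposition via the SVD of a snapshot matrix.
rng = np.random.default_rng(0)

# Synthetic snapshots that genuinely live in a 3-dimensional subspace of R^100.
basis_true = rng.standard_normal((100, 3))
snapshots = basis_true @ rng.standard_normal((3, 40))  # 40 snapshots

# Left singular vectors, ordered by decreasing singular value.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 3
Phi = U[:, :r]                       # POD basis (reduced-order modes)
reduced = Phi.T @ snapshots          # reduced coordinates
reconstructed = Phi @ reduced        # lift back to the full space

err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
print(err < 1e-10)  # True: rank-3 data is captured exactly by 3 modes
```

For genuinely high-dimensional dynamics, the singular value decay guides the choice of r, trading accuracy against the size of the reduced model.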

  7. Numerical methods for engine-airframe integration

    International Nuclear Information System (INIS)

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: the scientific computing environment for the 1980s, an overview of the prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integration, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel method to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic and supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.

  8. Gcn4-Mediator Specificity Is Mediated by a Large and Dynamic Fuzzy Protein-Protein Complex

    Directory of Open Access Journals (Sweden)

    Lisa M. Tuttle

    2018-03-01

Full Text Available Summary: Transcription activation domains (ADs) are inherently disordered proteins that often target multiple coactivator complexes, but the specificity of these interactions is not understood. Efficient transcription activation by yeast Gcn4 requires its tandem ADs and four activator-binding domains (ABDs) on its target, the Mediator subunit Med15. Multiple ABDs are a common feature of coactivator complexes. We find that the large Gcn4-Med15 complex is heterogeneous and contains nearly all possible AD-ABD interactions. Gcn4-Med15 forms via a dynamic fuzzy protein-protein interface, where ADs bind the ABDs in multiple orientations via hydrophobic regions that gain helicity. This combinatorial mechanism allows individual low-affinity and specificity interactions to generate a biologically functional, specific, and higher affinity complex despite lacking a defined protein-protein interface. This binding strategy is likely representative of many activators that target multiple coactivators, as it allows great flexibility in combinations of activators that can cooperate to regulate genes with variable coactivator requirements. : Tuttle et al. report a “fuzzy free-for-all” interaction mechanism that explains how seemingly unrelated transcription activators converge on a limited number of coactivator targets. The mechanism provides a rationale for the observation that individually weak and low-specificity interactions can combine to produce biologically critical function without requiring highly ordered structure. Keywords: transcription activation, intrinsically disordered proteins, fuzzy binding

  9. Xeno-Free and Defined Human Embryonic Stem Cell-Derived Retinal Pigment Epithelial Cells Functionally Integrate in a Large-Eyed Preclinical Model

    Directory of Open Access Journals (Sweden)

    Alvaro Plaza Reyes

    2016-01-01

Full Text Available Human embryonic stem cell (hESC)-derived retinal pigment epithelial (RPE) cells could replace lost tissue in geographic atrophy (GA), but efficacy has yet to be demonstrated in a large-eyed model. Also, production of hESC-RPE has not yet been achieved in a xeno-free and defined manner, which is critical for clinical compliance and reduced immunogenicity. Here we describe an effective differentiation methodology using human laminin-521 matrix with xeno-free and defined medium. Differentiated cells exhibited characteristics of native RPE including morphology, pigmentation, marker expression, monolayer integrity, and polarization together with phagocytic activity. Furthermore, we established a large-eyed GA model that allowed in vivo imaging of hESC-RPE and host retina. Cells transplanted in suspension showed long-term integration and formed polarized monolayers exhibiting phagocytic and photoreceptor rescue capacity. We have developed a xeno-free and defined hESC-RPE differentiation method and present evidence of functional integration of clinically compliant hESC-RPE in a large-eyed disease model.

  10. The Effectiveness of an Electronic Security Management System in a Privately Owned Apartment Complex

    Science.gov (United States)

    Greenberg, David F.; Roush, Jeffrey B.

    2009-01-01

    Poisson and negative binomial regression methods are used to analyze the monthly time series data to determine the effects of introducing an integrated security management system including closed-circuit television (CCTV), door alarm monitoring, proximity card access, and emergency call boxes to a large privately-owned complex of apartment…
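An interrupted time-series comparison of monthly counts like the one described can be illustrated with a plain Poisson rate comparison. The counts below are invented, and the study's actual regressions (Poisson and negative binomial, with covariates) are richer than this sketch:

```python
import math

# Before/after Poisson rate comparison for monthly incident counts.
# The Poisson MLE for a constant rate is simply the sample mean.
before = [14, 11, 15, 12, 13, 16]   # hypothetical pre-installation months
after = [9, 7, 10, 8, 6, 8]         # hypothetical post-installation months

rate_before = sum(before) / len(before)
rate_after = sum(after) / len(after)
rate_ratio = rate_after / rate_before

# Wald 95% CI for the rate ratio: var(log RR) ≈ 1/sum(after) + 1/sum(before).
se = math.sqrt(1 / sum(after) + 1 / sum(before))
lo = math.exp(math.log(rate_ratio) - 1.96 * se)
hi = math.exp(math.log(rate_ratio) + 1.96 * se)
print(round(rate_ratio, 2), round(lo, 2), round(hi, 2))  # 0.59 0.41 0.85
```

A confidence interval entirely below 1 would indicate a drop in the incident rate after the intervention; a negative binomial model relaxes the Poisson assumption that the variance equals the mean.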

  11. THE GOULD's BELT VERY LARGE ARRAY SURVEY. I. THE OPHIUCHUS COMPLEX

    International Nuclear Information System (INIS)

    Dzib, Sergio A.; Loinard, Laurent; Rodríguez, Luis F.; Ortiz-León, Gisela N.; Pech, Gerardo; Rivera, Juana L.; Mioduszewski, Amy J.; Torres, Rosa M.; Boden, Andrew F.; Hartmann, Lee; Evans, Neal J. II; Briceño, Cesar; Tobin, John

    2013-01-01

We present large-scale (∼2000 arcmin²), deep (∼20 μJy), high-resolution (∼1'') radio observations of the Ophiuchus star-forming complex obtained with the Karl G. Jansky Very Large Array at λ = 4 and 6 cm. In total, 189 sources were detected, 56 of them associated with known young stellar sources, and 4 with known extragalactic objects; the other 129 remain unclassified, but most of them are most probably background quasars. The vast majority of the young stars detected at radio wavelengths have spectral types K or M, although we also detect four objects of A/F/B types and two brown dwarf candidates. At least half of these young stars are non-thermal (gyrosynchrotron) sources, with active coronas characterized by high levels of variability, negative spectral indices, and (in some cases) significant circular polarization. As expected, there is a clear tendency for the fraction of non-thermal sources to increase from the younger (Class 0/I or flat spectrum) to the more evolved (Class III or weak-line T Tauri) stars. The young stars detected both in X-rays and at radio wavelengths broadly follow a Güdel-Benz relation, but with a different normalization than the most radio-active types of stars. Finally, we detect a ∼70 mJy compact extragalactic source near the center of the Ophiuchus core, which should be used as a gain calibrator for any future radio observations of this region.

  12. Complex dewetting scenarios of ultrathin silicon films for large-scale nanoarchitectures.

    Science.gov (United States)

    Naffouti, Meher; Backofen, Rainer; Salvalaglio, Marco; Bottein, Thomas; Lodari, Mario; Voigt, Axel; David, Thomas; Benkouider, Abdelmalek; Fraj, Ibtissem; Favre, Luc; Ronda, Antoine; Berbezier, Isabelle; Grosso, David; Abbarchi, Marco; Bollani, Monica

    2017-11-01

    Dewetting is a ubiquitous phenomenon in nature; many different thin films of organic and inorganic substances (such as liquids, polymers, metals, and semiconductors) share this shape instability driven by surface tension and mass transport. Via templated solid-state dewetting, we frame complex nanoarchitectures of monocrystalline silicon on insulator with unprecedented precision and reproducibility over large scales. Phase-field simulations reveal the dominant role of surface diffusion as a driving force for dewetting and provide a predictive tool to further engineer this hybrid top-down/bottom-up self-assembly method. Our results demonstrate that patches of thin monocrystalline films of metals and semiconductors share the same dewetting dynamics. We also prove the potential of our method by fabricating nanotransfer molding of metal oxide xerogels on silicon and glass substrates. This method allows the novel possibility of transferring these Si-based patterns on different materials, which do not usually undergo dewetting, offering great potential also for microfluidic or sensing applications.

  13. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network.

    Directory of Open Access Journals (Sweden)

    Sumithra Alagarsamy

Full Text Available Certificateless-based signcryption overcomes inherent shortcomings in traditional Public Key Infrastructure (PKI) and Key Escrow problem. It imparts efficient methods to design PKIs with public verifiability and cipher text authenticity with minimum dependency. As a classic primitive in public key cryptography, signcryption performs validity of cipher text without decryption by combining authentication, confidentiality, public verifiability and cipher text authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption called, Complex Conjugate Differential Integrated Factor (CC-DIF) scheme by introducing complex conjugates through introduction of the security parameter and improving secured message distribution rate. However, both partial private key and secret value changes with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed by setting the private key through Differential (Diff) Equation using an Integration Factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is therefore said to be proven secure (i.e. improving the secured message distributing rate) against certificateless access control and signcryption-based scheme. In addition, compared with the three other existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile network.

  14. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network.

    Science.gov (United States)

    Alagarsamy, Sumithra; Rajagopalan, S P

    2017-01-01

    Certificateless-based signcryption overcomes inherent shortcomings in traditional Public Key Infrastructure (PKI) and Key Escrow problem. It imparts efficient methods to design PKIs with public verifiability and cipher text authenticity with minimum dependency. As a classic primitive in public key cryptography, signcryption performs validity of cipher text without decryption by combining authentication, confidentiality, public verifiability and cipher text authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption called, Complex Conjugate Differential Integrated Factor (CC-DIF) scheme by introducing complex conjugates through introduction of the security parameter and improving secured message distribution rate. However, both partial private key and secret value changes with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed by setting the private key through Differential (Diff) Equation using an Integration Factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is therefore said to be proven secure (i.e. improving the secured message distributing rate) against certificateless access control and signcryption-based scheme. In addition, compared with the three other existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile network.

  15. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Science.gov (United States)

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  16. Large current MOSFET on photonic silicon-on-insulator wafers and its monolithic integration with a thermo-optic 2 × 2 Mach-Zehnder switch.

    Science.gov (United States)

    Cong, G W; Matsukawa, T; Chiba, T; Tadokoro, H; Yanagihara, M; Ohno, M; Kawashima, H; Kuwatsuka, H; Igarashi, Y; Masahara, M; Ishikawa, H

    2013-03-25

    n-channel body-tied partially depleted metal-oxide-semiconductor field-effect transistors (MOSFETs) were fabricated for large current applications on a silicon-on-insulator wafer with photonics-oriented specifications. The MOSFET can drive an electrical current as large as 20 mA. We monolithically integrated this MOSFET with a 2 × 2 Mach-Zehnder interferometer optical switch having thermo-optic phase shifters. The static and dynamic performances of the integrated device are experimentally evaluated.
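A thermo-optic phase shifter of the kind driven by such a MOSFET follows a standard Mach-Zehnder relation: the heater raises the arm temperature, shifting the phase by Δφ = (2π/λ)(dn/dT)·ΔT·L, and one output port's power varies as sin²(Δφ/2) (the complementary port as cos²). The parameter values below are typical silicon-photonics magnitudes assumed for illustration, not taken from the paper:

```python
import math

# Back-of-the-envelope thermo-optic Mach-Zehnder switch model.
dn_dT = 1.8e-4        # silicon thermo-optic coefficient (1/K), typical value
length = 200e-6       # heated arm length (m), assumed
wavelength = 1.55e-6  # operating wavelength (m), assumed

def phase_shift(delta_T):
    """Phase difference between the arms for a temperature rise delta_T."""
    return 2 * math.pi / wavelength * dn_dT * delta_T * length

def port_power(delta_T):
    """Normalized power at one output port (complementary port is cos^2)."""
    return math.sin(phase_shift(delta_T) / 2) ** 2

# Temperature rise needed to fully switch ports (a pi phase shift):
delta_T_pi = wavelength / (2 * dn_dT * length)
print(round(delta_T_pi, 1))  # ≈ 21.5 K for these assumed values
```

The required heater power then follows from the thermal resistance of the phase-shifter structure, which sets the current the driving transistor must supply.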

  17. The OXL format for the exchange of integrated datasets

    Directory of Open Access Journals (Sweden)

    Taubert Jan

    2007-12-01

Full Text Available A prerequisite for systems biology is the integration and analysis of heterogeneous experimental data stored in hundreds of life-science databases and millions of scientific publications. Several standardised formats for the exchange of specific kinds of biological information exist. Such exchange languages facilitate the integration process; however they are not designed to transport integrated datasets. A format for exchanging integrated datasets needs to (i) cover data from a broad range of application domains, (ii) be flexible and extensible to combine many different complex data structures, (iii) include metadata and semantic definitions, (iv) include inferred information, (v) identify the original data source for integrated entities and (vi) transport large integrated datasets. Unfortunately, none of the exchange formats from the biological domain (e.g. BioPAX, MAGE-ML, PSI-MI, SBML) or the generic approaches (RDF, OWL) fulfil these requirements in a systematic way.
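Requirement (v), carrying provenance for each integrated entity, can be illustrated with a toy XML record. The element and attribute names below are invented for illustration and are not the actual OXL schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical integrated-dataset record: a concept annotated with its
# original data source and an imported-vs-inferred evidence flag, in the
# spirit of requirements (iii)-(v). Names here are illustrative only.
graph = ET.Element("graph")
concept = ET.SubElement(graph, "concept", id="c1", elementOf="UniProt")
ET.SubElement(concept, "pid").text = "P12345"      # identifier in the source
ET.SubElement(concept, "evidence").text = "imported"  # as opposed to "inferred"

xml_text = ET.tostring(graph, encoding="unicode")
print(xml_text)
```

Keeping the source database (`elementOf`) and evidence type on every entity is what lets a consumer of an integrated dataset trace each statement back to its origin.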

  18. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    Abstract The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  19. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. 
The implications of e-automated processes can extend

  20. Abelian scalar theory at large global charge

    Energy Technology Data Exchange (ETDEWEB)

    Loukas, Orestis [Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, University of Bern (Switzerland)

    2017-09-15

We elaborate on Abelian complex scalar models, which are dictated by natural actions (all couplings are of order one), at fixed and large global U(1) charge in an arbitrary number of dimensions. The ground state |v⟩ is coherently constructed by the zero modes and the appearance of a centrifugal potential is quantum mechanically verified. Using the path integral formulation we systematically analyze the quantum fluctuations around |v⟩ in order to derive an effective action for the Goldstone mode, which becomes perturbatively meaningful when the charge is large. In this regime we explicitly show, by computing the first few loop corrections, that the whole construction is stable against quantum effects, in the sense that any higher derivative couplings to Goldstone's tree-level action are suppressed by appropriate powers of the large charge. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  1. Direct Observation of Very Large Zero-Field Splitting in a Tetrahedral Ni(II)Se4 Coordination Complex.

    Science.gov (United States)

    Jiang, Shang-Da; Maganas, Dimitrios; Levesanos, Nikolaos; Ferentinos, Eleftherios; Haas, Sabrina; Thirunavukkuarasu, Komalavalli; Krzystek, J; Dressel, Martin; Bogani, Lapo; Neese, Frank; Kyritsis, Panayotis

    2015-10-14

The high-spin (S = 1) tetrahedral Ni(II) complex [Ni{iPr2P(Se)NP(Se)iPr2}2] was investigated by magnetometry, spectroscopic, and quantum chemical methods. Angle-resolved magnetometry studies revealed the orientation of the magnetization principal axes. The very large zero-field splitting (zfs), D = 45.40(2) cm⁻¹, E = 1.91(2) cm⁻¹, of the complex was accurately determined by far-infrared magnetic spectroscopy, directly observing transitions between the spin sublevels of the triplet ground state. These are the largest zfs values ever determined directly for a high-spin Ni(II) complex. Ab initio calculations further probed the electronic structure of the system, elucidating the factors controlling the sign and magnitude of D. The latter is dominated by spin-orbit coupling contributions of the Ni ions, whereas the corresponding effects of the Se atoms are remarkably smaller.
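For context, D and E parameterize the conventional S = 1 zero-field-splitting spin Hamiltonian (standard notation, not reproduced from the paper); the far-infrared transitions observed correspond to the zero-field sublevel gaps:

```latex
% Standard zero-field-splitting spin Hamiltonian for a triplet (S = 1):
H_{\mathrm{zfs}} = D\left[\hat{S}_z^{\,2} - \tfrac{1}{3}S(S+1)\right]
                 + E\left(\hat{S}_x^{\,2} - \hat{S}_y^{\,2}\right)
% In zero field the triplet splits into sublevels at 0, D - E, and D + E,
% so with the reported values the transitions fall near
D - E \approx 43.5\ \mathrm{cm^{-1}}, \qquad D + E \approx 47.3\ \mathrm{cm^{-1}}.
```

Observing both transitions directly is what fixes D and E without relying on fits to powder magnetometry data.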

  2. Tele-operation of the electric power generation and transmission in large complex; Teleoperacao da geracao e transmissao de energia eletrica em complexo de grande porte

    Energy Technology Data Exchange (ETDEWEB)

    Martini, Jose Sidnei Colombo

    1992-07-01

This work presents outstanding aspects of the specification and development of a large supervisory and control power system. The engineering project, named SSCH (Hierarchical Supervision and Control System), was carried out between 1982 and 1992 in Sao Paulo State and is one of the most significant cases of industrial power-system automation realized in that decade. Beyond the technical details of this complex development, important management aspects of executing a large project are discussed. The SSCH's special characteristics are its size, its complexity, and its real-time network analysis resources, applied to large power systems. (author)

  3. SPITZER VIEW OF YOUNG MASSIVE STARS IN THE LARGE MAGELLANIC CLOUD H II COMPLEXES. II. N 159

    International Nuclear Information System (INIS)

    Chen, C.-H. Rosie; Indebetouw, Remy; Chu, You-Hua; Gruendl, Robert A.; Seale, Jonathan P.; Testor, Gerard; Heitsch, Fabian; Meixner, Margaret; Sewilo, Marta

    2010-01-01

    The H II complex N 159 in the Large Magellanic Cloud is used to study massive star formation in different environments, as it contains three giant molecular clouds (GMCs) that have similar sizes and masses but exhibit different intensities of star formation. We identify candidate massive young stellar objects (YSOs) using infrared photometry, and model their spectral energy distributions to constrain mass and evolutionary state. Good fits are obtained for less evolved Type I, I/II, and II sources. Our analysis suggests that there are massive embedded YSOs in N 159B, a maser source, and several ultracompact H II regions. Massive O-type YSOs are found in GMCs N 159-E and N 159-W, which are associated with ionized gas, i.e., where massive stars formed a few Myr ago. The third GMC, N 159-S, has neither O-type YSOs nor evidence of previous massive star formation. This correlation between current and antecedent formation of massive stars suggests that energy feedback is relevant. We present evidence that N 159-W is forming YSOs spontaneously, while collapse in N 159-E may be triggered. Finally, we compare star formation rates determined from YSO counts with those from integrated Hα and 24 μm luminosities and expected from gas surface densities. Detailed dissection of extragalactic GMCs like the one presented here is key to revealing the physics underlying commonly used star formation scaling laws.

  4. Isotopic shifts in chemical exchange systems. 1. Large isotope effects in the complexation of Na+ isotopes by macrocyclic polyethers

    International Nuclear Information System (INIS)

    Knoechel, A.; Wilken, R.D.

    1981-01-01

    The complexation of 24Na+ and 22Na+ by 18 of the most widely used macrocyclic polyethers (crown ethers and monocyclic and bicyclic aminopolyethers) has been investigated in view of possible equilibrium isotope shifts. Solvated salts and polyether complexes were distributed differently into two phases and isotope ratios determined in both phases. Chloroform/water systems were shown to be particularly suitable for the investigations, allowing favorable distribution for Na+ and 13 of the 18 polyethers employed. With crown ethers, 24Na+ enrichment varied from nonsignificant values (for large crown ethers) up to 3.1 ± 0.4% (18-crown-6). In the case of bicyclic aminopolyethers, ligands with cages of optimum size to accommodate Na+ showed 24Na+ enrichment between 0 (nonsignificant) (2.2B.2B) and 5.2 ± 1.8% (2.2.1). In contrast, for 2.2.2. and its derivatives, being too large for Na+, 22Na+ enrichment varying from 0 (nonsignificant) (2.2.2.p) up to 5.4 ± 0.5% (2.2.2.) has been observed. These values are remarkably high. They are explained by different bonding in the solvate structure and the polyether complex, using the theoretical approach of Bigeleisen

  5. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation....

  6. Deterministic patterned growth of high-mobility large-crystal graphene: a path towards wafer scale integration

    Science.gov (United States)

    Miseikis, Vaidotas; Bianco, Federica; David, Jérémy; Gemmi, Mauro; Pellegrini, Vittorio; Romagnoli, Marco; Coletti, Camilla

    2017-06-01

    We demonstrate rapid deterministic (seeded) growth of large single-crystals of graphene by chemical vapour deposition (CVD) utilising pre-patterned copper substrates with chromium nucleation sites. Arrays of graphene single-crystals as large as several hundred microns are grown with a periodicity of up to 1 mm. The graphene is transferred to target substrates using aligned and contamination-free semi-dry transfer. The high quality of the synthesised graphene is confirmed by Raman spectroscopy and transport measurements, demonstrating room-temperature carrier mobility of 21 000 cm2 V-1 s-1 when transferred on top of hexagonal boron nitride. By tailoring the nucleation of large single-crystals according to the desired device geometry, it will be possible to produce complex device architectures based on single-crystal graphene, thus paving the way to the adoption of CVD graphene in wafer-scale fabrication.

  7. Integrated Transceivers for Millimeter Wave and Cellular Communication

    OpenAIRE

    TIRED, TOBIAS

    2016-01-01

    Abstract: This doctoral thesis addresses two topics in integrated circuit design: multiband direct conversion receivers for cellular frequencies and beam steering transmitters for millimeter wave communication in the cellular backhaul. The trend towards cellular terminals supporting ever more frequency bands has resulted in complex radio frontends with a large number of RF inputs. Common receivers have, for performance reasons, in the past used differential RF inputs. Ho...

  8. Exergoeconomic improvement of a complex cogeneration system integrated with a professional process simulator

    International Nuclear Information System (INIS)

    Vieira, Leonardo S.; Donatelli, Joao L.; Cruz, Manuel E.

    2009-01-01

    In this paper, the application of an iterative exergoeconomic methodology for improvement of thermal systems to a complex combined-cycle cogeneration plant is presented. The methodology integrates exergoeconomics with a professional process simulator, and represents an alternative to conventional mathematical optimization techniques, because it reduces substantially the number of variables to be considered in the improvement process. By exploiting the computational power of a simulator, the integrated approach permits the optimization routine to ignore the variables associated with the thermodynamic equations, and thus to deal only with the economic equations and objective function. In addition, the methodology combines recently available exergoeconomic techniques with qualitative and quantitative criteria to identify only those decision variables which matter for the improvement of the system. To demonstrate the strengths of the methodology, it is here applied to a 24-component cogeneration plant, which requires O(10^3) variables for its simulation. The results obtained are compared to those reached using a conventional mathematical optimization procedure, also coupled to the process simulator. It is shown that, for engineering purposes, improvement of the system is often more cost effective and less time consuming than optimization of the system.
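The iterative loop this methodology describes (call the simulator, perturb only a short list of pre-screened decision variables, keep cost-reducing changes) can be caricatured as a greedy coordinate search. Everything below — the stand-in cost function, the variable names, the step size — is an invented illustration of the idea, not the paper's plant model or simulator interface:

```python
# Illustrative-only sketch: a simulator stand-in evaluates total cost,
# and only the pre-screened decision variables are perturbed, keeping
# each change that lowers cost.  All names and numbers are hypothetical.

def simulate_cost(x):
    """Stand-in for the process simulator: total cost for a design
    point x given as a dict of decision variables."""
    return (x["pressure"] - 20) ** 2 + 2 * (x["T_ratio"] - 1.5) ** 2 + 100

def improve(x, step=0.1, rounds=50):
    cost = simulate_cost(x)
    for _ in range(rounds):
        improved = False
        for var in x:
            for delta in (step, -step):
                trial = dict(x, **{var: x[var] + delta})
                c = simulate_cost(trial)
                if c < cost:
                    x, cost, improved = trial, c, True
        if not improved:          # no variable helped: stop iterating
            break
    return x, cost

best, cost = improve({"pressure": 19.0, "T_ratio": 1.2})
print(best, round(cost, 2))
```

The point of the sketch is structural: the thermodynamic state never appears as a free variable — the simulator resolves it internally, and the search loop only ever touches the handful of economic decision variables.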

  9. ‘Integrative Physiology 2.0’: integration of systems biology into physiology and its application to cardiovascular homeostasis

    Science.gov (United States)

    Kuster, Diederik W D; Merkus, Daphne; van der Velden, Jolanda; Verhoeven, Adrie J M; Duncker, Dirk J

    2011-01-01

    Since the completion of the Human Genome Project and the advent of the large-scale, unbiased ‘-omics’ techniques, the field of systems biology has emerged. Systems biology aims to move away from the traditional reductionist molecular approach, which focused on understanding the role of single genes or proteins, towards a more holistic approach by studying networks and interactions between individual components of networks. From a conceptual standpoint, systems biology elicits a ‘back to the future’ experience for any integrative physiologist. However, many of the new techniques and modalities employed by systems biologists yield tremendous potential for integrative physiologists to expand their tool arsenal to (quantitatively) study complex biological processes, such as cardiac remodelling and heart failure, in a truly holistic fashion. We therefore advocate that systems biology should not become/stay a separate discipline with ‘-omics’ as its playing field, but should be integrated into physiology to create ‘Integrative Physiology 2.0’. PMID:21224228

  10. Impurity engineering of Czochralski silicon used for ultra-large-scale integrated circuits

    Science.gov (United States)

    Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin

    2009-01-01

    Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, a review of recent progress in our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon is presented. It has been suggested that these impurities enhance oxygen precipitation, creating both denser bulk microdefects and a sufficiently wide denuded zone, which benefits the internal gettering of metal contamination. Based on these experimental facts, a potential mechanism for the effect of impurity doping on the internal gettering structure is proposed, and a new concept of 'impurity engineering' for Cz-Si used in ULSI is introduced.

  11. Combining complex networks and data mining: Why and how

    Science.gov (United States)

    Zanin, M.; Papo, D.; Sousa, P. A.; Menasalvas, E.; Nicchi, A.; Kubik, E.; Boccaletti, S.

    2016-05-01

    The increasing power of computer technology does not dispense with the need to extract meaningful information out of data sets of ever growing size, and indeed typically exacerbates the complexity of this task. To tackle this general problem, two methods have emerged, at chronologically different times, that are now commonly used in the scientific community: data mining and complex network theory. Not only do complex network analysis and data mining share the same general goal, that of extracting information from complex systems to ultimately create a new compact quantifiable representation, but they also often address similar problems. In the face of that, a surprisingly low number of researchers turn out to resort to both methodologies. One may then be tempted to conclude that these two fields are either largely redundant or totally antithetic. The starting point of this review is that this state of affairs should be put down to contingent rather than conceptual differences, and that these two fields can in fact advantageously be used in a synergistic manner. An overview of both fields is first provided, some fundamental concepts of which are illustrated. A variety of contexts in which complex network theory and data mining have been used in a synergistic manner are then presented. Contexts in which the appropriate integration of complex network metrics can lead to improved classification rates with respect to classical data mining algorithms and, conversely, contexts in which data mining can be used to tackle important issues in complex network theory applications are illustrated. Finally, ways to achieve a tighter integration between complex networks and data mining, and open lines of research are discussed.
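One concrete form of the synergy described above is using network metrics as features for a data-mining step. The toy sketch below (graphs, feature choices and names are all invented for illustration, not taken from the review) computes degree and clustering features per graph, which any classifier could then consume:

```python
# Toy illustration: complex-network metrics become feature vectors for a
# downstream data-mining step.  Graphs are {node: set(neighbors)} dicts.

def degree_stats(adj):
    """Mean and max degree of the graph."""
    degs = [len(nbrs) for nbrs in adj.values()]
    return sum(degs) / len(degs), max(degs)

def clustering(adj):
    """Average local clustering coefficient."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # coefficient undefined for degree < 2; count as 0
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def features(adj):
    """Feature vector summarizing one graph."""
    mean_d, max_d = degree_stats(adj)
    return (mean_d, max_d, clustering(adj))

# Two tiny labelled graphs: a triangle (clustered) and a path (not).
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
path     = {1: {2}, 2: {1, 3}, 3: {2}}

print(features(triangle))  # (2.0, 2, 1.0)
print(features(path))
```

In a real pipeline these per-graph vectors would be fed to a standard classifier — which is exactly the "network metrics improving classification rates" scenario the review discusses.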

  12. Importance of Mediator complex in the regulation and integration of diverse signaling pathways in plants

    Directory of Open Access Journals (Sweden)

    Subhasis Samanta

    2015-09-01

    Full Text Available Basic transcriptional machinery in eukaryotes is assisted by a number of cofactors, which either increase or decrease the rate of transcription. Mediator complex is one such cofactor, and recently has drawn a lot of interest because of its integrative power to converge different signaling pathways before channelling the transcription instructions to the RNA polymerase II machinery. Like yeast and metazoans, plants do possess the Mediator complex across the kingdom, and its isolation and subunit analyses have been reported from the model plant, Arabidopsis. Genetic and molecular analyses have unravelled important regulatory roles of Mediator subunits at every stage of plant life cycle starting from flowering to embryo and organ development, to even size determination. It also contributes immensely to the survival of plants against different environmental vagaries by the timely activation of its resistance mechanisms. Here, we have provided an overview of plant Mediator complex starting from its discovery to regulation of stoichiometry of its subunits. We have also reviewed involvement of different Mediator subunits in different processes and pathways including defense response pathways evoked by diverse biotic cues. Wherever possible, attempts have been made to provide mechanistic insight of Mediator’s involvement in these processes.

  13. Importance of Mediator complex in the regulation and integration of diverse signaling pathways in plants.

    Science.gov (United States)

    Samanta, Subhasis; Thakur, Jitendra K

    2015-01-01

    Basic transcriptional machinery in eukaryotes is assisted by a number of cofactors, which either increase or decrease the rate of transcription. Mediator complex is one such cofactor, and recently has drawn a lot of interest because of its integrative power to converge different signaling pathways before channeling the transcription instructions to the RNA polymerase II machinery. Like yeast and metazoans, plants do possess the Mediator complex across the kingdom, and its isolation and subunit analyses have been reported from the model plant, Arabidopsis. Genetic and molecular analyses have unraveled important regulatory roles of Mediator subunits at every stage of plant life cycle starting from flowering to embryo and organ development, to even size determination. It also contributes immensely to the survival of plants against different environmental vagaries by the timely activation of its resistance mechanisms. Here, we have provided an overview of plant Mediator complex starting from its discovery to regulation of stoichiometry of its subunits. We have also reviewed involvement of different Mediator subunits in different processes and pathways including defense response pathways evoked by diverse biotic cues. Wherever possible, attempts have been made to provide mechanistic insight of Mediator's involvement in these processes.

  14. Proteomic analysis of the dysferlin protein complex unveils its importance for sarcolemmal maintenance and integrity.

    Directory of Open Access Journals (Sweden)

    Antoine de Morrée

    Full Text Available Dysferlin is critical for repair of muscle membranes after damage. Mutations in dysferlin lead to a progressive muscular dystrophy. Recent studies suggest additional roles for dysferlin. We set out to study dysferlin's protein-protein interactions to obtain comprehensive knowledge of dysferlin functionalities in a myogenic context. We developed a robust and reproducible method to isolate dysferlin protein complexes from cells and tissue. We analyzed the composition of these complexes in cultured myoblasts, myotubes and skeletal muscle tissue by mass spectrometry and subsequently inferred potential protein functions through bioinformatics analyses. Our data confirm previously reported interactions and support a function for dysferlin as a vesicle trafficking protein. In addition, novel potential functionalities were uncovered, including phagocytosis and focal adhesion. Our data reveal that the dysferlin protein complex has a dynamic composition as a function of myogenic differentiation. We provide additional experimental evidence and show dysferlin localization to, and interaction with, the focal adhesion protein vinculin at the sarcolemma. Finally, our studies reveal evidence for cross-talk between dysferlin and its protein family member myoferlin. Together our analyses show that dysferlin is not only a membrane repair protein but also important for muscle membrane maintenance and integrity.

  15. Exploring integration of care for children living with complex care needs across the European Union and European Economic Area

    DEFF Research Database (Denmark)

    Brenner, Maria; O’Shea, Miriam; Larkin, Philip J.

    2017-01-01

    Introduction: The aim of this paper is to report on the development of surveys to explore integration of care for children living with complex care needs across the European Union (EU) and European Economic Area (EEA). Theory and methods: Each survey consists of a vignette and questions adapted...... from the Standards for Systems of Care for Children and Youth with Special Health Care Needs and the Eurobarometer Survey. A Country Agent in each country, a local expert in child health services, will obtain data from indigenous sources. Results: We identified ‘in-principle’ complex problems...

  16. Spatio-temporal databases complex motion pattern queries

    CERN Document Server

    Vieira, Marcos R

    2013-01-01

    This brief presents several new query processing techniques, called complex motion pattern queries, specifically designed for very large spatio-temporal databases of moving objects. The brief begins with the definition of flexible pattern queries, which are powerful because of the integration of variables and motion patterns. This is followed by a summary of the expressive power of patterns and flexibility of pattern queries. The brief then presents the Spatio-Temporal Pattern System (STPS) and density-based pattern queries. STPS databases contain millions of records with information about mobi
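A minimal sketch of what such a motion-pattern query evaluates — whether a trajectory passes through a sequence of regions in order. This is an assumption-laden simplification for illustration, not the STPS engine: regions are plain rectangles and the pattern has no variables.

```python
# Hypothetical ordered-region pattern match over a 2D trajectory.

def in_region(point, region):
    """Axis-aligned rectangle test; region = (xmin, ymin, xmax, ymax)."""
    x, y = point
    xmin, ymin, xmax, ymax = region
    return xmin <= x <= xmax and ymin <= y <= ymax

def matches_pattern(trajectory, pattern):
    """True if the trajectory visits each region of the pattern in order
    (other points may occur in between).  The shared iterator makes this
    a subsequence match in a single pass."""
    it = iter(trajectory)
    return all(any(in_region(p, region) for p in it) for region in pattern)

home   = (0, 0, 1, 1)
office = (5, 5, 6, 6)
track  = [(0.5, 0.5), (2, 3), (5.5, 5.5), (7, 7)]

print(matches_pattern(track, [home, office]))  # True
print(matches_pattern(track, [office, home]))  # False
```

Real flexible pattern queries add wildcards and variables bound across pattern slots; the single-pass subsequence structure above is the common core.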

  17. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    Science.gov (United States)

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, BLAST homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  18. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C-derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  19. "Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2011-09-01

    Full Text Available Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found not to have modeled the analyses to take account of the complex sample (Johnson & Elliott, 1998), even when publishing in highly-regarded journals. It is well known that failure to appropriately model the complex sample can substantially bias the results of the analysis. Examples presented in this paper highlight the risk of errors of inference and mis-estimation of parameters that result from failure to analyze these data sets appropriately.
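The effect the paper warns about is easy to demonstrate with invented numbers: ignoring the weights of an oversampled stratum biases the point estimate, and unequal weights inflate variance — commonly summarized by Kish's approximate design effect. The data below are made up for illustration:

```python
# Why weights matter in complex samples: a biased naive mean, and the
# Kish approximation to the design effect from unequal weighting.

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def design_effect(weights):
    """Kish's approximate DEFF from unequal weighting:
    deff = n * sum(w^2) / (sum(w))^2  (equals 1 for equal weights)."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

# Stratum A oversampled (small weights), stratum B undersampled (large).
values  = [10, 10, 10, 10, 20, 20]
weights = [1, 1, 1, 1, 4, 4]

print(sum(values) / len(values))        # naive mean: 13.33...
print(weighted_mean(values, weights))   # weighted mean: 16.66...
print(design_effect(weights))           # 1.5: effective n shrinks by a third
```

The design effect feeds directly into standard-error corrections: analyses that ignore it report confidence intervals that are too narrow, which is exactly the inference error the paper documents.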

  20. Channel Capacity Calculation at Large SNR and Small Dispersion within Path-Integral Approach

    Science.gov (United States)

    Reznichenko, A. V.; Terekhov, I. S.

    2018-04-01

    We consider the optical fiber channel modelled by the nonlinear Schrödinger equation with additive white Gaussian noise. Using the Feynman path-integral approach for the model with small dispersion, we find the first nonzero corrections to the conditional probability density function and the channel capacity estimations at large signal-to-noise ratio. We demonstrate that the correction to the channel capacity in the small dimensionless dispersion parameter is quadratic and positive, therefore increasing the earlier calculated capacity for a nondispersive nonlinear optical fiber channel in the intermediate power region. For the small-dispersion case we also find analytical expressions for simple correlators of the output signals in our noisy channel.
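Schematically, and with notation assumed here rather than taken from the paper (writing the small dimensionless dispersion parameter as $\tilde{\beta}$), the stated result has the form

```latex
C(\tilde{\beta}) \;=\; C_{0} \;+\; c_{2}\,\tilde{\beta}^{2} \;+\; O(\tilde{\beta}^{4}),
\qquad c_{2} > 0,
```

where $C_{0}$ is the previously calculated capacity of the nondispersive nonlinear channel at large signal-to-noise ratio; the absence of a linear term and the positivity of $c_{2}$ are what make the small-dispersion correction an increase of capacity in the intermediate power region.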

  1. Complex integration and Cauchy's theorem

    CERN Document Server

    Watson, GN

    2012-01-01

    This brief monograph by one of the great mathematicians of the early twentieth century offers a single-volume compilation of propositions employed in proofs of Cauchy's theorem. Developing an arithmetical basis that avoids geometrical intuitions, Watson also provides a brief account of the various applications of the theorem to the evaluation of definite integrals. Author G. N. Watson begins by reviewing various propositions of Poincaré's Analysis Situs, upon which proof of the theorem's most general form depends. Subsequent chapters examine the calculus of residues, calculus optimization, the

  2. Experience with LHC Magnets from Prototyping to Large Scale Industrial Production and Integration

    CERN Multimedia

    Rossi, L

    2004-01-01

    The construction of the LHC superconducting magnets is approaching the halfway point. At the end of 2003, main dipole cold masses for more than one octant were delivered; meanwhile the winding for the second octant was almost completed. The other large magnets, like the main quadrupoles and the insertion quadrupoles, have entered into series production as well. Providing more than 20 km of superconducting magnets, with the quality required for an accelerator like the LHC, is an unprecedented challenge in terms of complexity that has required many steps, from the construction of 1-meter-long magnets in the laboratory to today’s production of more than one 15-meter-long magnet per day in industry. The work and its organization are made even more complex by the fact that CERN supplies most of the critical components and part of the main tooling to the magnet manufacturers, both for cost reduction and for quality issues. In this paper the critical aspects of the construction will be reviewed and the actual ...

  3. Large-scale offshore wind energy. Cost analysis and integration in the Dutch electricity market

    International Nuclear Information System (INIS)

    De Noord, M.

    1999-02-01

    The results of analysis of the construction and integration costs of large-scale offshore wind energy (OWE) farms in 2010 are presented. The integration of these farms (1 and 5 GW) in the Dutch electricity distribution system has been regarded against the background of a liberalised electricity market. A first step is taken towards determining the costs involved in solving integration problems. Three different types of foundations are examined: the mono-pile, the jacket and a new type of foundation, the concrete caisson pile, all single-turbine-single-support structures. For real offshore applications (>10 km offshore, at sea depths >20 m), the concrete caisson pile is regarded as the most suitable. The price/power ratios of wind turbines are analysed. It is assumed that in 2010 turbines in the power range of 3-5 MW are available. The main calculations have been conducted for a 3 MW turbine. The main choice in electrical infrastructure is between AC and DC. Calculations show that at distances of 30 km offshore and more, the use of HVDC will result in higher initial costs but lower operating costs. The share of operating and maintenance (O&M) costs in the kWh cost price is approximately 3.3%. To be able to compare the two farms, a base case is derived with a construction time of 10 years for both. The energy yield is calculated for an offshore wind regime with an annual mean wind speed of 9.0 m/s. Per 3 MW turbine this results in an annual energy production of approximately 12 GWh. The total farm efficiency amounts to 82%, resulting in a total farm capacity factor of 38%. With a required internal rate of return of 15%, the kWh cost price amounts to 0.24 DFl and 0.21 DFl for the 1 GW and 5 GW farms respectively in the base case. The required internal rate of return has a large effect on the kWh cost price, followed by the costs of subsystems. O&M costs have little effect on the cost price. Parameter studies show that a small cost reduction of 5% is possible when
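The quoted figures (12 GWh/yr from a 3 MW turbine, 82% total farm efficiency, 38% farm capacity factor) are internally consistent, as a quick back-of-envelope check shows; the 8760-hour year is the only added assumption:

```python
# Capacity factor = actual annual energy / energy at continuous rated power.

def capacity_factor(energy_gwh, rated_mw, hours=8760):
    """Annual capacity factor; energy in GWh, rated power in MW."""
    return energy_gwh * 1000 / (rated_mw * hours)

turbine_cf = capacity_factor(12, 3)   # ~0.46 for a single turbine
farm_cf = turbine_cf * 0.82           # ~0.37, close to the quoted 38%
print(round(turbine_cf, 2), round(farm_cf, 2))
```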

  4. Sub-10 nm colloidal lithography for circuit-integrated spin-photo-electronic devices

    Directory of Open Access Journals (Sweden)

    Adrian Iovan

    2012-12-01

    Full Text Available Patterning of materials at sub-10 nm dimensions is at the forefront of nanotechnology and employs techniques of various complexity, efficiency, areal scale, and cost. Colloid-based patterning is known to be capable of producing individual sub-10 nm objects. However, ordered, large-area nano-arrays fully integrated into photonic or electronic devices have remained a challenging task. In this work, we extend the practice of colloidal lithography to producing large-area sub-10 nm point-contact arrays and demonstrate their circuit integration into spin-photo-electronic devices. The reported nanofabrication method should have broad application areas in nanotechnology as it allows ballistic-injection devices, even for metallic materials with relatively short characteristic relaxation lengths.

  5. Manufacturing of large and integral-type steel forgings for nuclear steam supply system components

    International Nuclear Information System (INIS)

    Kawaguchi, S.; Tsukada, H.; Suzuki, K.; Sato, I.; Onodera, S.

    1986-01-01

    Forgings for the reactor pressure vessel (RPV) of the pressurized heavy water reactor (PHWR) 700 MWe, which is composed of seven major parts and nozzles totaling about 965 tons, were successfully developed. These forgings are: 1. Flanges: an outside diameter of 8440 mm and a weight of 238 tons max, requiring an ingot of 570 tons. 2. Shells and torus: an outside diameter of about 8000 mm with large height. 3. Cover dome: a diameter of 6800 mm and a thickness of 460 mm, requiring a blank forging, before forming, of 8000 mm in diameter and 550 mm in thickness. The material designation is 20MnMoNi5-5 (equivalent to SA508, Class 3). In this paper, the manufacturing and properties of such large, integral forgings are discussed, including an overview of manufacturing processes for ultralarge-sized forgings over the last two decades

  6. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them on the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighted against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
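The weighting-and-aggregation step described above reduces, in its simplest form, to a normalized weighted sum over the goal-tree dimensions. The dimension names, weights and scores below are invented for illustration, not taken from the assessment method:

```python
# Hypothetical weighted goal-tree aggregation: each integration-capability
# dimension gets a score in [0, 1] and a stakeholder weight; alternative
# architectures are ranked by the normalized weighted sum.

def weighted_score(scores, weights):
    """Normalize the weights to sum to 1, then aggregate the scores."""
    total = sum(weights[d] for d in scores)
    return sum(scores[d] * weights[d] / total for d in scores)

# Stakeholder priorities for this (invented) integration effort.
weights = {"communication": 3, "interfaces": 2, "software": 1}

arch_a = {"communication": 0.9, "interfaces": 0.5, "software": 0.4}
arch_b = {"communication": 0.6, "interfaces": 0.8, "software": 0.9}

ranked = sorted([("A", weighted_score(arch_a, weights)),
                 ("B", weighted_score(arch_b, weights))],
                key=lambda kv: kv[1], reverse=True)
print(ranked)  # architecture "B" scores higher with these weights
```

The interesting design question is the one the abstract raises: the ranking is only as meaningful as the weights, which is why they must be elicited per integration effort rather than fixed once.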

  7. Building effective workforce management practices through shared governance and technology systems integration.

    Science.gov (United States)

    Krive, Jacob

    2013-01-01

    In integrated delivery networks (IDNs) with complex management structures, shared governance in nursing is a proven model for health care delivery. After Advocate Health Care, the largest IDN in Illinois, implemented shared governance in its nursing, clinical, and non-clinical departments and restructured the organization's technology use, it benefited greatly from a new, shared decision-making process. After listening to business consultants, clinical professionals, and information technology experts, hospitals should take the blended, or comprehensive, approach to new projects. They can succeed by promoting communication supported by an integrated computer platform that helps nursing and business executives reach a consensus. Traditional modes of operation, in which individual administrative, clinical, and technology departments separately introduce innovation, do not deliver an advantage. However, models that incorporate open communication, integration, and knowledge sharing will help large IDNs and other complex health care organizations make the best possible use of their resources and investments.

  8. Merkel cell polyomavirus recruits MYCL to the EP400 complex to promote oncogenesis.

    Directory of Open Access Journals (Sweden)

    Jingwei Cheng

    2017-10-01

    Full Text Available Merkel cell carcinoma (MCC) frequently contains integrated copies of Merkel cell polyomavirus DNA that express a truncated form of Large T antigen (LT) and an intact Small T antigen (ST). While LT binds RB and inactivates its tumor suppressor function, it is less clear how ST contributes to MCC tumorigenesis. Here we show that ST binds specifically to the MYC homolog MYCL (L-MYC) and recruits it to the 15-component EP400 histone acetyltransferase and chromatin remodeling complex. We performed a large-scale immunoprecipitation for ST and identified co-precipitating proteins by mass spectrometry. In addition to protein phosphatase 2A (PP2A) subunits, we identified MYCL and its heterodimeric partner MAX plus the EP400 complex. Immunoprecipitation for MAX and EP400 complex components confirmed their association with ST. We determined that the ST-MYCL-EP400 complex binds together to specific gene promoters and activates their expression by integrating chromatin immunoprecipitation with sequencing (ChIP-seq) and RNA-seq. MYCL and EP400 were required for maintenance of cell viability and cooperated with ST to promote gene expression in MCC cell lines. A genome-wide CRISPR-Cas9 screen confirmed the requirement for MYCL and EP400 in MCPyV-positive MCC cell lines. We demonstrate that ST can activate gene expression in an EP400- and MYCL-dependent manner and this activity contributes to cellular transformation and generation of induced pluripotent stem cells.

  9. Merkel cell polyomavirus recruits MYCL to the EP400 complex to promote oncogenesis.

    Science.gov (United States)

    Cheng, Jingwei; Park, Donglim Esther; Berrios, Christian; White, Elizabeth A; Arora, Reety; Yoon, Rosa; Branigan, Timothy; Xiao, Tengfei; Westerling, Thomas; Federation, Alexander; Zeid, Rhamy; Strober, Benjamin; Swanson, Selene K; Florens, Laurence; Bradner, James E; Brown, Myles; Howley, Peter M; Padi, Megha; Washburn, Michael P; DeCaprio, James A

    2017-10-01

    Merkel cell carcinoma (MCC) frequently contains integrated copies of Merkel cell polyomavirus DNA that express a truncated form of Large T antigen (LT) and an intact Small T antigen (ST). While LT binds RB and inactivates its tumor suppressor function, it is less clear how ST contributes to MCC tumorigenesis. Here we show that ST binds specifically to the MYC homolog MYCL (L-MYC) and recruits it to the 15-component EP400 histone acetyltransferase and chromatin remodeling complex. We performed a large-scale immunoprecipitation for ST and identified co-precipitating proteins by mass spectrometry. In addition to protein phosphatase 2A (PP2A) subunits, we identified MYCL and its heterodimeric partner MAX plus the EP400 complex. Immunoprecipitation for MAX and EP400 complex components confirmed their association with ST. We determined that the ST-MYCL-EP400 complex binds together to specific gene promoters and activates their expression by integrating chromatin immunoprecipitation with sequencing (ChIP-seq) and RNA-seq. MYCL and EP400 were required for maintenance of cell viability and cooperated with ST to promote gene expression in MCC cell lines. A genome-wide CRISPR-Cas9 screen confirmed the requirement for MYCL and EP400 in MCPyV-positive MCC cell lines. We demonstrate that ST can activate gene expression in an EP400- and MYCL-dependent manner and this activity contributes to cellular transformation and generation of induced pluripotent stem cells.

  10. Combining eastern and western practices for safe and effective endoscopic resection of large complex colorectal lesions.

    Science.gov (United States)

    Emmanuel, Andrew; Gulati, Shraddha; Burt, Margaret; Hayee, Bu'Hussain; Haji, Amyn

    2018-05-01

    Endoscopic resection of large colorectal polyps is well established. However, significant differences in technique exist between eastern and western interventional endoscopists. We report the results of endoscopic resection of large complex colorectal lesions from a specialist unit that combines eastern and western techniques for assessment and resection. Endoscopic resections of colorectal lesions of at least 2 cm were included. Lesions were assessed using magnification chromoendoscopy supplemented by colonoscopic ultrasound in selected cases. A lesion-specific approach to resection with endoscopic mucosal resection or endoscopic submucosal dissection (ESD) was used. Surveillance endoscopy was performed at 3 (SC1) and 12 (SC2) months. Four hundred and sixty-six large (≥20 mm) colorectal lesions (mean size 54.8 mm) were resected. Three hundred and fifty-six were resected using endoscopic mucosal resection and 110 by ESD or hybrid ESD. Fifty-one percent of lesions had been subjected to previous failed attempts at resection or heavy manipulation (≥6 biopsies). Nevertheless, endoscopic resection was deemed successful after an initial attempt in 98%. Recurrence occurred in 15% and could be treated with endoscopic resection in most. Only two patients required surgery for perforation. Nine patients had postprocedure bleeding; only two required endoscopic clips. Ninety-six percent of patients without invasive cancer were free from recurrence and had avoided surgery at last follow-up. Combining eastern and western practices for assessment and resection results in safe and effective organ-conserving treatment of complex colorectal lesions. Accurate assessment before and after resection using magnification chromoendoscopy and a lesion-specific approach to resection, incorporating ESD where appropriate, are important factors in achieving these results.

  11. GHz modulation enabled using large extinction ratio waveguide-modulator integrated with 404 nm GaN laser diode

    KAUST Repository

    Shen, Chao

    2017-01-30

    A 404-nm emitting InGaN-based laser diode with integrated-waveguide-modulator showing a large extinction ratio of 11.3 dB was demonstrated on semipolar (2021) plane GaN substrate. The device shows a low modulation voltage of −2.5 V and ∼ GHz −3 dB bandwidth, enabling 1.7 Gbps data transmission.

  12. GHz modulation enabled using large extinction ratio waveguide-modulator integrated with 404 nm GaN laser diode

    KAUST Repository

    Shen, Chao; Lee, Changmin; Ng, Tien Khee; Speck, James S.; Nakamura, Shuji; DenBaars, Steven P.; Alyamani, Ahmed Y.; Eldesouki, Munir M.; Ooi, Boon S.

    2017-01-01

    A 404-nm emitting InGaN-based laser diode with integrated-waveguide-modulator showing a large extinction ratio of 11.3 dB was demonstrated on semipolar (2021) plane GaN substrate. The device shows a low modulation voltage of −2.5 V and ∼ GHz −3 dB bandwidth, enabling 1.7 Gbps data transmission.

  13. Estimating large complex projects Estimando proyectos grandes y complejos

    Directory of Open Access Journals (Sweden)

    Cliff Schexnayder

    2007-08-01

    Full Text Available Managing large capital construction projects requires the coordination of a multitude of human, organizational, technical, and natural resources. Quite often, the engineering and construction complexities of such projects are overshadowed by economic, societal, and political challenges. The ramifications and effects that result from differences between early project cost estimates and the bid price or the final project cost are significant. Over the time span between the initiation of a project and the completion of construction, many factors influence a project's final costs. This time span is normally several years in duration, but for highly complex and technologically challenging projects, project duration can easily exceed a decade. Over that period, changes to the project scope often occur. The subject here is a presentation of strategies that support realistic cost estimating. Through literature review and interviews with transportation agencies in the U.S. and internationally, the authors developed a database of the factors that are the root causes of cost estimation problems.

  14. Status and Future Developments in Large Accelerator Control Systems

    International Nuclear Information System (INIS)

    Karen S. White

    2006-01-01

    Over the years, accelerator control systems have evolved from small hardwired systems to complex computer controlled systems with many types of graphical user interfaces and electronic data processing. Today's control systems often include multiple software layers, hundreds of distributed processors, and hundreds of thousands of lines of code. While it is clear that the next generation of accelerators will require much bigger control systems, they will also need better systems. Advances in technology will be needed to ensure the network bandwidth and CPU power can provide reasonable update rates and support the requisite timing systems. Beyond the scaling problem, next generation systems face additional challenges due to growing cyber security threats and the likelihood that some degree of remote development and operation will be required. With a large number of components, the need for high reliability increases and commercial solutions can play a key role towards this goal. Future control systems will operate more complex machines and need to present a well integrated, interoperable set of tools with a high degree of automation. Consistency of data presentation and exception handling will contribute to efficient operations. From the development perspective, engineers will need to provide integrated data management in the beginning of the project and build adaptive software components around a central data repository. This will make the system maintainable and ensure consistency throughout the inevitable changes during the machine lifetime. Additionally, such a large project will require professional project management and disciplined use of well-defined engineering processes. Distributed project teams will make the use of standards, formal requirements and design and configuration control vital. Success in building the control system of the future may hinge on how well we integrate commercial components and learn from best practices used in other industries

  15. Threshold corrections, generalised prepotentials and Eichler integrals

    Directory of Open Access Journals (Sweden)

    Carlo Angelantonj

    2015-08-01

    Full Text Available We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials fn, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involving the Γ0(N) Hauptmodul, a full characterisation of holomorphic prepotentials including their quantum monodromies, as well as concrete formulæ for holomorphic Yukawa couplings.

  16. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    Science.gov (United States)

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken up the challenge of identifying genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation of large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN), and several non-parametric methods, which include the set association approach, the combinatorial partitioning method (CPM), the restricted partitioning method (RPM), the multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. They are therefore less useful than the non-parametric methods for association studies with large numbers of predictor variables. GPNN, on the other hand, may be a useful approach to select and model important predictors, but its ability to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and the random forests approach can handle a large number of predictors and are useful in reducing these predictors to a subset with an important contribution to disease. The combinatorial methods give more insight into combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association
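    Of the methods surveyed above, logistic regression is the most familiar. As a hedged illustration (synthetic data, hypothetical effect sizes, no external libraries), the pure-Python sketch below fits an additive logistic model to simulated SNP genotypes and recovers the single causal marker; it is not taken from the paper:

```python
import math
import random

random.seed(42)

def simulate_genotypes(n_subjects, n_snps, causal_idx=0, beta=1.5):
    """Simulate additive genotypes (0/1/2 minor-allele counts) where one
    SNP (causal_idx) raises disease risk on the logit scale."""
    X, y = [], []
    for _ in range(n_subjects):
        g = [random.choice([0, 1, 2]) for _ in range(n_snps)]
        logit = -1.0 + beta * g[causal_idx]
        p = 1.0 / (1.0 + math.exp(-logit))
        y.append(1 if random.random() < p else 0)
        X.append(g)
    return X, y

def fit_logistic_sgd(X, y, lr=0.05, epochs=100):
    """Plain stochastic-gradient logistic regression; w[0] is the intercept."""
    n_snps = len(X[0])
    w = [0.0] * (n_snps + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - 1.0 / (1.0 + math.exp(-z))
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

X, y = simulate_genotypes(500, 5)
w = fit_logistic_sgd(X, y)
# The causal SNP's coefficient should dominate the null SNPs'.
print([round(wi, 2) for wi in w])
```

    With five predictors this works; the commentary's point is that the same approach degrades as the predictor count approaches the number of observations, which is where the non-parametric methods come in.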

  17. A constraint logic programming approach to associate 1D and 3D structural components for large protein complexes.

    Science.gov (United States)

    Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang

    2007-01-01

    The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.

  18. Failure of large transformation projects from the viewpoint of complex adaptive systems: Management principles for dealing with project dynamics

    NARCIS (Netherlands)

    Janssen, M.; Voort, H. van der; Veenstra, A.F.E. van

    2015-01-01

    Many large transformation projects do not result in the outcomes desired or envisioned by the stakeholders. This type of project is characterised by dynamics which are both caused by and result of uncertainties and unexpected behaviour. In this paper a complex adaptive system (CAS) view was adopted

  19. An integrated micromechanical large particle in flow sorter (MILPIS)

    Science.gov (United States)

    Fuad, Nurul M.; Skommer, Joanna; Friedrich, Timo; Kaslin, Jan; Wlodkowic, Donald

    2015-06-01

    At present, the major hurdle to widespread deployment of zebrafish embryos and larvae in large-scale drug development projects is the lack of enabling high-throughput analytical platforms. In order to spearhead drug discovery with the use of zebrafish as a model, platforms need to integrate automated pre-test sorting of organisms (to ensure quality control and standardization) and their in-test positioning (suitable for high-content imaging) with modules for flexible drug delivery. The major obstacle hampering the sorting of millimetre-sized particles such as zebrafish embryos on chip-based devices is their substantial diameter (above one millimetre) and mass (above one milligram), both of which lead to rapid gravity-induced sedimentation and high inertial forces. Manual procedures associated with sorting hundreds of embryos are very monotonous and as such prone to significant analytical errors due to operator fatigue. In this work, we present an innovative design of a micromechanical large particle in-flow sorter (MILPIS) capable of analysing, sorting and dispensing living zebrafish embryos for drug discovery applications. The system consisted of a microfluidic network, a revolving micromechanical receptacle actuated by a robotic servomotor and an opto-electronic sensing module. The prototypes were fabricated in poly(methyl methacrylate) (PMMA) transparent thermoplastic using infrared laser micromachining. Elements of MILPIS were also fabricated in an optically transparent VisiJet resin using 3D stereolithography (SLA) processes (ProJet 7000HD, 3D Systems). The device operation was based on a rapidly revolving miniaturized mechanical receptacle, whose function was to hold and position individual fish embryos for (i) interrogation, (ii) sorting decision-making and (iii) physical sorting. The system was designed to separate fertilized (LIVE) from non-fertilized (DEAD) eggs, based on optical transparency, using infrared (IR) emitters and receivers embedded in the system

  20. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low...... of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences...

  1. Human-Systems Integration (HSI) and the Network Integration Evaluations (NIEs), Part 2: A Deeper Dive into Mission Command Complexity and Cognitive Load

    Science.gov (United States)

    2015-03-01

    …system or process more likely to break down or fail when faced with unusual or ambiguous situations. These cautions from the flight management arena… The task itself is intrinsically complex and demanding. However, a work setting with a large number of design-related "rough edges" will give the impression of…

  2. Singularities of n-fold integrals of the Ising class and the theory of elliptic curves

    International Nuclear Information System (INIS)

    Boukraa, S; Hassani, S; Maillard, J-M; Zenine, N

    2007-01-01

    We introduce some multiple integrals that are expected to have the same singularities as the singularities of the n-particle contributions χ(n) to the susceptibility of the square lattice Ising model. We find the Fuchsian linear differential equation satisfied by these multiple integrals for n = 1, 2, 3, 4 and only modulo some primes for n = 5 and 6, thus providing a large set of (possible) new singularities of χ(n). We discuss the singularity structure for these multiple integrals by solving the Landau conditions. We find that the singularities of the associated ODEs identify (up to n = 6) with the leading pinch Landau singularities. The second remarkable obtained feature is that the singularities of the ODEs associated with the multiple integrals reduce to the singularities of the ODEs associated with a finite number of one-dimensional integrals. Among the singularities found, we underline the fact that the quadratic polynomial condition 1 + 3w + 4w² = 0, that occurs in the linear differential equation of χ(3), actually corresponds to a remarkable property of selected elliptic curves, namely the occurrence of complex multiplication. The interpretation of complex multiplication for elliptic curves as complex fixed points of the selected generators of the renormalization group, namely isogenies of elliptic curves, is sketched. Most of the other singularities occurring in our multiple integrals are not related to complex multiplication situations, suggesting an interpretation in terms of (motivic) mathematical structures beyond the theory of elliptic curves

  3. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit, SMI++, combining two approaches (finite state machines and rule-based programming), allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
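    The combination of finite state machines and rule-based programming that SMI++ provides can be gestured at with a minimal sketch. The Python below is not the SMI++ API; it only illustrates, under assumed state names (READY/ERROR/MIXED) and a two-level hierarchy, how decentralized entities derive their state from their children and propagate commands downward:

```python
# Leaf devices hold a state; control nodes derive theirs from children
# via simple rules and propagate commands down the hierarchy.

class Device:
    def __init__(self, name):
        self.name = name
        self.state = "READY"

    def handle(self, command):
        if command == "RESET":          # illustrative recovery rule
            self.state = "READY"

class ControlNode:
    def __init__(self, name, children):
        self.name = name
        self.children = children

    @property
    def state(self):
        # Rule: any child in ERROR wins; READY only if all children are.
        states = [c.state for c in self.children]
        if "ERROR" in states:
            return "ERROR"
        return "READY" if all(s == "READY" for s in states) else "MIXED"

    def handle(self, command):
        for c in self.children:         # commands propagate downward
            c.handle(command)

magnet, tracker = Device("magnet"), Device("tracker")
detector = ControlNode("detector", [magnet, tracker])
top = ControlNode("experiment", [detector])

tracker.state = "ERROR"                 # a fault surfaces up the hierarchy
assert top.state == "ERROR"
top.handle("RESET")                     # a recovery command propagates down
assert top.state == "READY"
```

    Real SMI++ adds rule triggers, asynchronous updates and distribution over many machines; the sketch only shows the hierarchical state-derivation idea.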

  4. Toward a Comprehensive Framework for Evaluating the Core Integration Features of Enterprise Integration Middleware Technologies

    Directory of Open Access Journals (Sweden)

    Hossein Moradi

    2013-01-01

    Full Text Available To achieve greater automation of their business processes, organizations face the challenge of integrating disparate systems. In attempting to overcome this problem, organizations are turning to different kinds of enterprise integration. Implementing enterprise integration is a complex task involving both technological and business challenges and requires appropriate middleware technologies. Different enterprise integration solutions provide various functions and features, which complicates their evaluation. To overcome this complexity, appropriate tools for evaluating the core integration features of enterprise integration solutions are required. This paper proposes a new comprehensive framework for evaluating the core integration features of the enabling technologies of both intra-enterprise and inter-enterprise integration, which simplifies the process of evaluating the requirements met by enterprise integration middleware technologies. The proposed framework was enhanced using the structural and conceptual aspects of previous frameworks. It offers a new schema in which various enterprise integration middleware technologies are categorized into different classifications and evaluated based on their level of support for the core integration features' criteria. These criteria include functional and supporting features. The proposed framework, a revised version of our previous framework in this area, develops the scope, structure and content of that framework.

  5. Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor

    Science.gov (United States)

    Acree, C. W., Jr.

    2010-01-01

    Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.

  6. Benchmarking of London Dispersion-Accounting Density Functional Theory Methods on Very Large Molecular Complexes.

    Science.gov (United States)

    Risthaus, Tobias; Grimme, Stefan

    2013-03-12

    A new test set (S12L) containing 12 supramolecular noncovalently bound complexes is presented and used to evaluate seven different methods to account for dispersion in DFT (DFT-D3, DFT-D2, DFT-NL, XDM, dDsC, TS-vdW, M06-L) at different basis set levels against experimental, back-corrected reference energies. This allows conclusions about the performance of each method in an explorative research setting on "real-life" problems. Most DFT methods show satisfactory performance but, due to the largeness of the complexes, almost always require an explicit correction for the nonadditive Axilrod-Teller-Muto three-body dispersion interaction to get accurate results. The necessity of using a method capable of accounting for dispersion is clearly demonstrated in that the two-body dispersion contributions are on the order of 20-150% of the total interaction energy. MP2 and some variants thereof are shown to be insufficient for this while a few tested D3-corrected semiempirical MO methods perform reasonably well. Overall, we suggest the use of this benchmark set as a "sanity check" against overfitting to too small molecular cases.

  7. Large field radiotherapy

    International Nuclear Information System (INIS)

    Vanasek, J.; Chvojka, Z.; Zouhar, M.

    1984-01-01

    Calculations may prove that irradiation procedures, commonly used in radiotherapy and represented by large-capacity irradiation techniques, do not exceed certain limits of integral doses with favourable radiobiological action on the organism. On the other hand integral doses in supralethal whole-body irradiation, used in the therapy of acute leukemia, represent radiobiological values which without extreme and exceptional further interventions and teamwork are not compatible with life, and the radiotherapeutist cannot use such high doses without the backing of a large team. (author)

  8. An integrated approach to site selection for nuclear power plants

    International Nuclear Information System (INIS)

    Hassan, E.M.A.

    1975-01-01

    A method of analysing and evaluating the large number of factors influencing site selection is proposed, which can interrelate these factors and associated problems in an integrated way and at the same time establish a technique for site evaluation. The objective is to develop an integrated programme that illustrates the complexity and dynamic interrelationships of the various factors to develop an improved understanding of the functions and objectives of siting nuclear power plants and would aim finally at the development of an effective procedure and technique for site evaluation and/or comparative evaluation for making rational site-selection decisions. (author)

  9. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    Science.gov (United States)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to high-probability regions via an implicit step. It optimizes a nonlinear cost function which can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New advanced state estimation tools that will replace the older generation of state estimators should, in addition to providing a general framework for these complexities, be able to accommodate legacy software within a common mathematical framework, meeting the power industry's need for cautious, evolutionary change rather than a complete revolutionary approach, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a tool for estimating the states of a power system and presents the first study applying the implicit particle filter to power system state estimation. The implicit particle filter is introduced into power systems and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.
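    The implicit step that distinguishes the implicit particle filter solves an optimization problem per particle and is beyond a short sketch, but the baseline it builds on, a bootstrap (sequential importance resampling) particle filter, can be shown on a scalar surrogate model. All dynamics, noise levels and names below are illustrative assumptions, not the paper's three-node benchmark:

```python
import math
import random

random.seed(7)

# Scalar surrogate "power system" state: x_k = A*x_{k-1} + B + process noise,
# observed as y_k = x_k + measurement noise. Values are illustrative only.
A, B = 0.9, 1.0
Q, R = 0.1, 0.5          # process / measurement noise standard deviations
N = 1000                 # number of particles

def likelihood_weight(y, x):
    # Gaussian measurement likelihood (unnormalised is fine for resampling)
    return math.exp(-0.5 * ((y - x) / R) ** 2)

particles = [random.gauss(0, 1) for _ in range(N)]
x_true = 0.0
for _ in range(30):
    x_true = A * x_true + B + random.gauss(0, Q)
    y = x_true + random.gauss(0, R)
    # Bootstrap filter: propagate, weight by likelihood, resample.
    particles = [A * p + B + random.gauss(0, Q) for p in particles]
    weights = [likelihood_weight(y, p) for p in particles]
    particles = random.choices(particles, weights=weights, k=N)

estimate = sum(particles) / N
print(round(x_true, 2), round(estimate, 2))   # estimate tracks the truth
```

    The implicit variant replaces the blind propagation step with a per-particle optimization that places samples where the posterior is large, which matters when observations are informative and the state is high-dimensional.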

  10. Quantum complex rotation and uniform semiclassical calculations of complex energy eigenvalues

    International Nuclear Information System (INIS)

    Connor, J.N.L.; Smith, A.D.

    1983-01-01

    Quantum and semiclassical calculations of complex energy eigenvalues have been carried out for an exponential potential of the form V₀r²exp(−r) and a Lennard-Jones (12,6) potential. A straightforward method, based on the complex coordinate rotation technique, is described for the quantum calculation of complex eigenenergies. For singular potentials, the method involves an inward and an outward integration of the radial Schroedinger equation, followed by matching of the logarithmic derivatives of the wave functions at an intermediate point. For regular potentials, the method is simpler, as only an inward integration is required. Attention is drawn to the World War II research of Hartree and co-workers, who anticipated later quantum mechanical work on the complex rotation method. Complex eigenenergies are also calculated from a uniform semiclassical three-turning-point quantization formula, which allows for the proximity of the outer pair of complex turning points. Limiting cases of this formula, which are valid for very narrow or very broad widths, are also used in the calculations. We obtain good agreement between the semiclassical and quantum results. For the Lennard-Jones (12,6) potential, we compare resonance energies and widths from the complex energy definition of a resonance with those obtained from the time delay definition
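    The complex-energy eigenvalue search is too involved for a short example, but the underlying integrate-and-match idea reduces, for a real bound state, to a familiar shooting method. The sketch below (plain Python, assumed units ħ = m = ω = 1, not the paper's potentials) locates the harmonic-oscillator ground state E = 1/2 of u″ = (x² − 2E)u by bisecting on the sign of u at a far boundary:

```python
# u'' = (x^2 - 2E) u with even initial data u(0)=1, u'(0)=0.
# Below an eigenvalue u(L) diverges to +inf, above it to -inf, so
# bisection on sign(u(L)) pins down the numerical eigenvalue.

def u_at_boundary(E, L=5.0, h=1e-3):
    f = lambda x: x * x - 2.0 * E
    u_prev = 1.0                                  # u(0)
    u = u_prev + 0.5 * h * h * f(0.0) * u_prev    # u(h) from u'(0)=0
    x = h
    while x < L:
        # Three-term recurrence: u(x+h) = 2u(x) - u(x-h) + h^2 f(x) u(x)
        u_prev, u = u, 2.0 * u - u_prev + h * h * f(x) * u
        x += h
    return u

lo, hi = 0.3, 0.7            # bracket known to contain the ground state
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if u_at_boundary(mid) > 0:
        lo = mid             # E too low: solution diverges upward
    else:
        hi = mid             # E too high: solution has crossed zero
E0 = 0.5 * (lo + hi)
print(round(E0, 4))          # ground-state energy; exact value is 1/2
```

    The paper's method generalizes this in two directions: matching logarithmic derivatives of inward and outward solutions at an interior point (needed for singular potentials), and rotating the coordinate into the complex plane so the same matching yields complex resonance energies.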

  11. Automated Derivation of Complex System Constraints from User Requirements

    Science.gov (United States)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing them with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is time-consuming and tedious. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.

  12. Complexity: Outline of the NWO strategic theme Dynamics of complex systems

    NARCIS (Netherlands)

    Burgers, G.; Doelman, A.; Frenken, K.; Hogeweg, P.; Hommes, C.; van der Maas, H.; Mulder, B.; Stam, K.; van Steen, M.; Zandee, L.

    2008-01-01

    Dynamics of complex systems is one of the program 5 themes in the NWO (Netherlands Organisation for Scientific Research) strategy for the years 2007-2011. The ambition of the current proposal is to initiate integrated activities in the field of complex systems within the Netherlands, to provide

  13. Complexity : outline of the NWO strategic theme dynamics of complex systems

    NARCIS (Netherlands)

    Burgers, G.; Doelman, A.; Frenken, K.; Hogeweg, P.; Hommes, C.; Maas, van der H.; Mulder, B.; Stam, K.; Steen, van M.; Zandee, L.

    2008-01-01

    Dynamics of complex systems is one of the program 5 themes in the NWO (Netherlands Organisation for Scientific Research) strategy for the years 2007-2011. The ambition of the current proposal is to initiate integrated activities in the field of complex systems within the Netherlands, to provide

  14. Exploring the dynamics of formal and informal networks in complex multi-team development projects

    DEFF Research Database (Denmark)

    Kratzer, J.; Gemuenden, H. G.; Lettl, Christopher

    2007-01-01

    The increasing number of complex multi-team projects and the scarcity of knowledge about how to run them successfully create a need for systematic empirical studies. We attempt to lessen this empirical gap by examining the overlap and structure of formally ascribed design interfaces and informal communication networks between participating teams in two complex multi-team projects in the space industry. We study the two projects longitudinally throughout the design and integration phases of product development. There are three major findings. First, formally ascribed design interfaces and informal communication networks overlap only marginally. Second, the structure of informal communication remains largely stable in the transition from the design to the integration phase. The third and most intriguing finding is that the weak overlap between formally ascribed design interfaces and the informal...
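The overlap the authors measure between formal design interfaces and informal communication networks can be quantified in several ways; one simple sketch (my choice of metric, not necessarily the authors') is the Jaccard index of the two edge sets:

```python
def edge_jaccard(edges_a, edges_b):
    """Jaccard overlap of two undirected edge sets: |A ∩ B| / |A ∪ B|."""
    a = {frozenset(e) for e in edges_a}
    b = {frozenset(e) for e in edges_b}
    return len(a & b) / len(a | b)

# hypothetical formal design interfaces vs. observed informal communication
formal = [(1, 2), (2, 3), (3, 4)]
informal = [(2, 1), (3, 5)]
overlap = edge_jaccard(formal, informal)  # → 0.25, i.e. weak overlap
```

A value near 0 reproduces the paper's "marginal overlap" finding; a value near 1 would mean informal communication mirrors the formal interfaces.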

  15. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph

    2016-01-01

    Background A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  16. 8th international workshop on large-scale integration of wind power into power systems as well as on transmission networks for offshore wind farms. Proceedings

    International Nuclear Information System (INIS)

    Betancourt, Uta; Ackermann, Thomas

    2009-01-01

    Within the 8th International Workshop on Large-Scale Integration of Wind Power into Power Systems as well as on Transmission Networks for Offshore Wind Farms at 14th to 15th October, 2009 in Bremen (Federal Republic of Germany), lectures and posters were presented to the following sessions: (1) Keynote session and panel; (2) Grid integration studies and experience: Europe; (3) Connection of offshore wind farms; (4) Wind forecast; (5) High voltage direct current (HVDC); (6) German grid code issues; (7) Offshore grid connection; (8) Grid integration studies and experience: North America; (9) SUPWIND - Decision support tools for large scale integration of wind; (10) Windgrid - Wind on the grid: An integrated approach; (11) IEA Task 25; (12) Grid code issues; (13) Market Issues; (14) Offshore Grid; (15) Modelling; (16) Wind power and storage; (17) Power system balancing; (18) Wind turbine performance; (19) Modelling and offshore transformer.

  17. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    Science.gov (United States)

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.

  18. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large scale integration of wind power in future power systems when 50% of load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish power system model with large-scale wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load generation imbalances, caused by inaccurate wind speed forecast, by an appropriate control of the active power production from power plants.

  19. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at off-shore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and minimization of electricity-grid impact, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing the risk of the investment as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  20. Stability of integral membrane proteins under high hydrostatic pressure: the LH2 and LH3 antenna pigment-protein complexes from photosynthetic bacteria.

    Science.gov (United States)

    Kangur, Liina; Timpmann, Kõu; Freiberg, Arvi

    2008-07-03

    The bacteriochlorophyll a-containing LH2 and LH3 antenna complexes are integral membrane proteins that catalyze the photosynthetic process in purple photosynthetic bacteria. The LH2 complex from Rhodobacter sphaeroides shows characteristic strong absorbance at 800 and 850 nm due to the pigment molecules confined in two separate areas of the protein. In the LH3 complex from Rhodopseudomonas acidophila the corresponding bands peak at 800 and 820 nm. Using the bacteriochlorophyll a cofactors as intrinsic probes to monitor local changes in the protein structure, we investigate spectral responses of the antenna complexes to very high hydrostatic pressures, up to 2.5 GPa, when embedded in the natural membrane environment or extracted with detergent. We first demonstrate that high pressure does induce significant alterations to the tertiary structure of the proteins, not only in the proximity of the 800 nm-absorbing bacteriochlorophyll a molecules, as known previously (Gall, A.; et al. Biochemistry 2003, 42, 13019), but also of the 850 nm- and 820 nm-absorbing molecules, including breakage of the hydrogen bond they are involved in. The membrane-protected complexes appear more resilient to the damaging effects of compression than the complexes extracted into a mixed detergent-buffer environment. Increased resistance of the isolated complexes is observed at high protein concentrations, which result in aggregation, as well as when a cosolvent (glycerol) is added to the solution. These stability variations correlate with the ability of the surrounding polar solvent (water) to penetrate the hydrophobic protein interiors, which is thus the principal cause of the pressure-induced denaturation of the proteins. Considerable variability in the elastic properties of the isolated complexes was also observed, tentatively assigned to heterogeneous protein packing in detergent micelles. While a number of the isolated complexes release most of their bacteriochlorophyll a content under high pressure

  1. Performance-Oriented Design of Large Passive Solar Roofs : A method for the integration of parametric modelling and genetic algorithms

    NARCIS (Netherlands)

    Turrin, M.; Von Buelow, P.; Stouffs, R.M.F.; Kilian, A.

    2010-01-01

    The paper addresses the design of large roof structures for semi outdoor spaces through an investigation of a type of performance-oriented design, which aims at integrating performance evaluations in the early stages of the design process. Particularly, aiming at improving daylight and thermal

  2. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogenous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource constraint project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  3. Mitofilin complexes: conserved organizers of mitochondrial membrane architecture.

    Science.gov (United States)

    Zerbes, Ralf M; van der Klei, Ida J; Veenhuis, Marten; Pfanner, Nikolaus; van der Laan, Martin; Bohnert, Maria

    2012-11-01

    Mitofilin proteins are crucial organizers of mitochondrial architecture. They are located in the inner mitochondrial membrane and interact with several protein complexes of the outer membrane, thereby generating contact sites between the two membrane systems of mitochondria. Within the inner membrane, mitofilins are part of hetero-oligomeric protein complexes that have been termed the mitochondrial inner membrane organizing system (MINOS). MINOS integrity is required for the maintenance of the characteristic morphology of the inner mitochondrial membrane, with an inner boundary region closely apposed to the outer membrane and cristae membranes, which form large tubular invaginations that protrude into the mitochondrial matrix and harbor the enzyme complexes of the oxidative phosphorylation machinery. MINOS deficiency comes along with a loss of crista junction structures and the detachment of cristae from the inner boundary membrane. MINOS has been conserved in evolution from unicellular eukaryotes to humans, where alterations of MINOS subunits are associated with multiple pathological conditions.

  4. Complex Networks

    CERN Document Server

    Evsukoff, Alexandre; González, Marta

    2013-01-01

    In the last decade we have seen the emergence of a new inter-disciplinary field focusing on the understanding of networks which are dynamic, large, open, and have a structure sometimes called random-biased. The field of Complex Networks is helping us better understand many complex phenomena such as the spread of diseases, protein interactions, and social relationships, to name but a few. Studies in Complex Networks are gaining attention due to some major scientific breakthroughs proposed by network scientists helping us understand and model interactions contained in large datasets. In fact, if we could point to one event that led to the widespread use of complex network analysis, it would be the availability of online databases. Theories of Random Graphs from Erdős and Rényi from the late 1950s led us to believe that most networks had random characteristics. The work on large online datasets told us otherwise. Starting with the work of Barabási and Albert as well as Watts and Strogatz in the late 1990s, we now know th...

  5. On generalized de Rham-Hodge complexes, the related characteristic Chern classes and some applications to integrable multi-dimensional differential systems on Riemannian manifolds

    International Nuclear Information System (INIS)

    Bogolubov, Nikolai N. Jr.; Prykarpatsky, Anatoliy K.

    2006-12-01

    The differential-geometric aspects of generalized de Rham-Hodge complexes naturally related with integrable multi-dimensional differential systems of M. Gromov type, as well as the geometric structure of Chern characteristic classes are studied. Special differential invariants of the Chern type are constructed, their importance for the integrability of multi-dimensional nonlinear differential systems on Riemannian manifolds is discussed. An example of the three-dimensional Davey-Stewartson type nonlinear strongly integrable differential system is considered, its Cartan type connection mapping and related Chern type differential invariants are analyzed. (author)

  6. Large-scale complementary macroelectronics using hybrid integration of carbon nanotubes and IGZO thin-film transistors.

    Science.gov (United States)

    Chen, Haitian; Cao, Yu; Zhang, Jialu; Zhou, Chongwu

    2014-06-13

    Carbon nanotubes and metal oxide semiconductors have emerged as important materials for p-type and n-type thin-film transistors, respectively; however, realizing sophisticated macroelectronics operating in complementary mode has been challenging due to the difficulty in making n-type carbon nanotube transistors and p-type metal oxide transistors. Here we report a hybrid integration of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors to achieve large-scale (>1,000 transistors for 501-stage ring oscillators) complementary macroelectronic circuits on both rigid and flexible substrates. This approach of hybrid integration allows us to combine the strength of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors, and offers high device yield and low device variation. Based on this approach, we report the successful demonstration of various logic gates (inverter, NAND and NOR gates), ring oscillators (from 51 stages to 501 stages) and dynamic logic circuits (dynamic inverter, NAND and NOR gates).

  7. Integrated and spatially explicit modelling of the economic value of complex environmental change and its indirect effects

    OpenAIRE

    Bateman, Ian; Binner, Amy; Coombes, Emma; Day, Brett; Ferrini, Silvia; Fezzi, Carlo; Hutchins, Michael; Posen, Paulette

    2012-01-01

    Arguably the greatest challenge to contemporary research is to capture the inter-relatedness and complexity of the real world environment within models so as to better inform decision makers of the accurate and complete consequences of differing options. The paper presents an integrated model of the consequences of climate change upon land use and the secondary and subsequent effects that arise from it. The model predicts the shift in land use which climate change is likely to induce and the...

  8. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems should be increased significantly, to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  9. Sutherland models for complex reflection groups

    International Nuclear Information System (INIS)

    Crampe, N.; Young, C.A.S.

    2008-01-01

    There are known to be integrable Sutherland models associated to every real root system, or, what is almost equivalent, to every real reflection group. Real reflection groups are special cases of complex reflection groups. In this paper we associate certain integrable Sutherland models to the classical family of complex reflection groups. Internal degrees of freedom are introduced, defining dynamical spin chains, and the freezing limit is taken to obtain static chains of Haldane-Shastry type. By considering the relation of these models to the usual BC_N case, we are led to systems with both real and complex reflection groups as symmetries. We demonstrate their integrability by means of new Dunkl operators associated to wreath products of dihedral groups.

  10. Parental decision-making for medically complex infants and children: An integrated literature review

    Science.gov (United States)

    Allen, Kimberly A.

    2014-01-01

    Background. Many children with life-threatening conditions who would have died at birth are now surviving months to years longer than previously expected. Understanding how parents make decisions is necessary to prevent parental regret about decision-making, which can lead to psychological distress, decreased physical health, and decreased quality of life for the parents. Objective. The aim of this integrated literature review was to describe possible factors that affect parental decision-making for medically complex children. The critical decisions included continuation or termination of a high-risk pregnancy, initiation of life-sustaining treatments such as resuscitation, complex cardiothoracic surgery, use of experimental treatments, end-of-life care, and limitation of care or withdrawal of support. Design. PubMed, Cumulative Index of Nursing and Allied Health Literature, and PsycINFO were searched using the combined key terms ‘parents and decision-making’ to obtain English language publications from 2000 to June 2013. Results. The findings from each of the 31 articles retained were recorded. The strengths of the empirical research reviewed are that decisions about initiating life support and withdrawing life support have received significant attention. Researchers have explored how many different factors impact decision-making and have used multiple different research designs and data collection methods to explore the decision-making process. These initial studies lay the foundation for future research and have provided insight into parental decision-making during times of crisis. Conclusions. Studies must begin to include both parents and providers so that researchers can evaluate how decisions are made for individual children with complex chronic conditions to understand the dynamics between parents and parent–provider relationships. The majority of studies focused on one homogeneous diagnostic group of premature infants and children with complex congenital

  11. Parental decision-making for medically complex infants and children: an integrated literature review.

    Science.gov (United States)

    Allen, Kimberly A

    2014-09-01

    Many children with life-threatening conditions who would have died at birth are now surviving months to years longer than previously expected. Understanding how parents make decisions is necessary to prevent parental regret about decision-making, which can lead to psychological distress, decreased physical health, and decreased quality of life for the parents. The aim of this integrated literature review was to describe possible factors that affect parental decision-making for medically complex children. The critical decisions included continuation or termination of a high-risk pregnancy, initiation of life-sustaining treatments such as resuscitation, complex cardiothoracic surgery, use of experimental treatments, end-of-life care, and limitation of care or withdrawal of support. PubMed, Cumulative Index of Nursing and Allied Health Literature, and PsycINFO were searched using the combined key terms 'parents and decision-making' to obtain English language publications from 2000 to June 2013. The findings from each of the 31 articles retained were recorded. The strengths of the empirical research reviewed are that decisions about initiating life support and withdrawing life support have received significant attention. Researchers have explored how many different factors impact decision-making and have used multiple different research designs and data collection methods to explore the decision-making process. These initial studies lay the foundation for future research and have provided insight into parental decision-making during times of crisis. Studies must begin to include both parents and providers so that researchers can evaluate how decisions are made for individual children with complex chronic conditions to understand the dynamics between parents and parent-provider relationships. The majority of studies focused on one homogeneous diagnostic group of premature infants and children with complex congenital heart disease. Thus comparisons across other child

  12. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing.
We anticipate more of them, as the results of global warming

  13. INTEGRATED QUANTITATIVE ASSESSMENT OF CHANGES IN NEURO-ENDOCRINE-IMMUNE COMPLEX AND METABOLISM IN RATS EXPOSED TO ACUTE COLD-IMMOBILIZATION STRESS

    Directory of Open Access Journals (Sweden)

    Sydoruk O Sydoruk

    2016-09-01

    Background. It is known that the reactions of the neuroendocrine-immune complex to acute and chronic stress are different, and that there are sex differences in stress reactions. Previously we carried out an integrated quantitative estimation of neuroendocrine and immune responses to chronic restraint stress in male rats. The purpose of this study was to carry out an integrated quantitative estimation of neuroendocrine, immune and metabolic responses to acute stress in male and female rats. Material and research methods. The experiment used 58 white Wistar rats (28 male and 30 female) weighing 170-280 g (Mean=220 g; SD=28 g). The day after acute (water immersion restraint) stress, HRV, endocrine, immune and metabolic parameters as well as gastric mucosa injuries were determined and compared with the parameters of intact animals. Results. Acute cold-immobilization stress caused moderate injuries to the stomach mucosa in the form of erosions and ulcers. Among the metabolic parameters, increased activity of acid phosphatase, asparagine and alanine aminotransferase as well as creatine phosphokinase was revealed. Plasma testosterone as well as serum potassium and phosphate were also found to be reduced, probably due to increased parathyrine and mineralocorticoid activity and a sympathotonic shift of the sympatho-vagal balance. The integrated quantitative measure of the manifestations of acute stress, taken as the mean of the modules of the Z-scores, is 0.75±0.10 σ for the 10 metabolic parameters and 0.40±0.07 σ for the 8 neuro-endocrine parameters. Among the immune parameters, some proved resistant to acute stress factors, while 10 were significantly suppressed and 12 activated. The integrated quantitative measure of post-stress changes is 0.73±0.08 σ. Significant differences were found between the integrated status of intact males and females, whereas after stress the differences are insignificant. Conclusion. The approach to integrated quantitative assessment of the neuroendocrine-immune complex and metabolism may be useful for testing the
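The integrated measure used above, the mean of the modules (absolute values) of the Z-scores in units of σ, is straightforward to compute; the sketch below uses invented control-group statistics for illustration:

```python
import numpy as np

def integrated_stress_measure(values, control_mean, control_sd):
    """Mean of the absolute Z-scores of a panel of parameters, each
    standardized against the intact (control) group's mean and SD."""
    z = (np.asarray(values, float) - np.asarray(control_mean, float)) \
        / np.asarray(control_sd, float)
    return float(np.mean(np.abs(z)))

# invented example: two parameters, each exactly one control SD off the mean
m = integrated_stress_measure([12.0, 8.0], [10.0, 10.0], [2.0, 2.0])  # → 1.0 σ
```

Taking absolute values before averaging means suppressed and activated parameters both contribute positively, so the measure captures the overall magnitude of post-stress deviation.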

  14. Large Differences in the Optical Spectrum Associated with the Same Complex: The Effect of the Anisotropy of the Embedding Lattice

    DEFF Research Database (Denmark)

    Aramburu, José Antonio; García-Fernández, Pablo; García Lastra, Juan Maria

    2017-01-01

    of the electric field created by the rest of the lattice ions over the complex. To illustrate this concept we analyze the origin of the surprisingly large differences in the d–d optical transitions of two systems containing square-planar CuF4^2- complexes, CaCuF4 and center II in Cu2+-doped Ba2ZnF6, even though the Cu2+–F- distance difference is found to be just 1%. Using a minimalist first-principles model we show that the different morphology of the host lattices creates an anisotropic field that red-shifts the in vacuo complex transitions to the 1.25–1.70 eV range in CaCuF4, while it blue-shifts them to the 1...

  15. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    Science.gov (United States)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that assembles research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy.
Appropriate physical and statistical prediction models are also developed for
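As an illustration of the statistical branch that such projects benchmark against, a minimal short-term forecaster can be sketched. The persistence model and the AR(1) fit below are standard reference baselines, not ANEMOS models; the data are hypothetical normalized power readings.

```python
import numpy as np

def persistence_forecast(power, horizon):
    """Reference baseline: the forecast for every lead time is the last observation."""
    return np.full(horizon, power[-1])

def ar1_forecast(power, horizon):
    """Least-squares AR(1) model p[t+1] = a*p[t] + b, iterated over the horizon."""
    x, y = power[:-1], power[1:]
    xc = x - x.mean()
    a = np.dot(xc, y - y.mean()) / np.dot(xc, xc)
    b = y.mean() - a * x.mean()
    preds, last = [], power[-1]
    for _ in range(horizon):
        last = a * last + b
        preds.append(last)
    return np.array(preds)
```

Any real forecasting system would add meteorological inputs (NWP wind speed, direction) as regressors; the value of such simple baselines is that advanced models must beat them to justify their complexity.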

  16. The Detection of Hot Cores and Complex Organic Molecules in the Large Magellanic Cloud

    Science.gov (United States)

    Sewiło, Marta; Indebetouw, Remy; Charnley, Steven B.; Zahorecz, Sarolta; Oliveira, Joana M.; van Loon, Jacco Th.; Ward, Jacob L.; Chen, C.-H. Rosie; Wiseman, Jennifer; Fukui, Yasuo; Kawamura, Akiko; Meixner, Margaret; Onishi, Toshikazu; Schilke, Peter

    2018-02-01

We report the first extragalactic detection of the complex organic molecules (COMs) dimethyl ether (CH3OCH3) and methyl formate (CH3OCHO) with the Atacama Large Millimeter/submillimeter Array (ALMA). These COMs, together with their parent species methanol (CH3OH), were detected toward two 1.3 mm continuum sources in the N 113 star-forming region in the low-metallicity Large Magellanic Cloud (LMC). Rotational temperatures (T_rot ∼ 130 K) and total column densities (N_rot ∼ 10^16 cm^-2) have been calculated for each source based on multiple transitions of CH3OH. We present the ALMA molecular emission maps for COMs and measured abundances for all detected species. The physical and chemical properties of the two sources with COM detections, and their association with H2O and OH maser emission, indicate that they are hot cores. The fractional abundances of COMs, scaled by a factor of 2.5 to account for the lower metallicity in the LMC, are comparable to those found at the lower end of the range in Galactic hot cores. Our results have important implications for studies of organic chemistry at higher redshift.
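Rotational temperatures and column densities of this kind are typically derived with the rotational-diagram method: for optically thin lines in LTE, ln(N_u/g_u) falls on a straight line in the upper-level energy E_u (in kelvin) with slope -1/T_rot. A minimal sketch of that fit (the transition values are synthetic, not the N 113 data):

```python
import numpy as np

def rotation_diagram(E_u, ln_Nu_gu):
    """Least-squares fit of ln(N_u/g_u) = intercept - E_u / T_rot,
    with E_u expressed in kelvin; returns (T_rot, intercept)."""
    slope, intercept = np.polyfit(E_u, ln_Nu_gu, 1)
    return -1.0 / slope, intercept
```

The intercept gives ln(N_tot/Q(T_rot)), from which the total column density follows once the partition function Q is evaluated at the fitted temperature.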

  17. Flexible and low-voltage integrated circuits constructed from high-performance nanocrystal transistors.

    Science.gov (United States)

    Kim, David K; Lai, Yuming; Diroll, Benjamin T; Murray, Christopher B; Kagan, Cherie R

    2012-01-01

Colloidal semiconductor nanocrystals are emerging as a new class of solution-processable materials for low-cost, flexible, thin-film electronics. Although these colloidal inks have been shown to form single, thin-film field-effect transistors with impressive characteristics, the use of multiple high-performance nanocrystal field-effect transistors in large-area integrated circuits has not been shown. This is needed to understand and demonstrate the applicability of these discrete nanocrystal field-effect transistors for advanced electronic technologies. Here we report solution-deposited nanocrystal integrated circuits, showing nanocrystal integrated circuit inverters, amplifiers and ring oscillators, constructed from high-performance, low-voltage, low-hysteresis CdSe nanocrystal field-effect transistors with electron mobilities of up to 22 cm² V⁻¹ s⁻¹, current modulation >10⁶ and subthreshold swing of 0.28 V dec⁻¹. We fabricated the nanocrystal field-effect transistors and nanocrystal integrated circuits from colloidal inks on flexible plastic substrates and scaled the devices to operate at low voltages. We demonstrate that colloidal nanocrystal field-effect transistors can be used as building blocks to construct complex integrated circuits, promising a viable material for low-cost, flexible, large-area electronics.

  18. Radar, geologic, airborne gamma ray and Landsat TM digital data integration for geological mapping of the Estrela granite complex (Para State)

    International Nuclear Information System (INIS)

    Cunha, Edson Ricardo Soares Pereira da

    2002-01-01

This work is focused on the geotectonic context of the Carajas Mineral Province, Amazon Craton, which represents the most important Brazilian mineral province and hosts iron, copper, gold, manganese and nickel deposits. At the end of the Archean, during the tectono-metamorphic evolution, moderately alkaline granitoids were generated, such as the Estrela Granite Complex (EGC). This work used digital integration products to study the granite suite, its host rock and the surrounding area. The digitally integrated data were gamma-ray and geological data with satellite images (SAR-SAREX and TM-Landsat). The geophysical data, originally in 32-bit grid format, were interpolated and converted to 8-bit images. The geological data (facies map) were digitized and converted to raster format. The remote sensing images were geometrically corrected to guarantee accuracy in the geological mapping. In the data processing phase, SAR images were digitally integrated with the gamma-ray data, the TM-Landsat image and the raster facies map. The IHS transformation was used as the technique to integrate the multi-source data. In the photogeological interpretation, SAR data were extremely important in permitting the extraction of the main tectonic lineaments, which occur along the following directions: +/- N45W, +/- N70W, +/- NS, +/- N20E, +/- N45E and +/- N75E. This procedure was done both in analog and automatic form, the automatic process being more useful to complement information in the extraction process. Among the digital products generated, the SAR/gamma products (uranium, thorium and total count) were the ones that gave the most important contribution. The interpretation of the SAR/gamma products, added to the field campaign, allowed mapping the limits of the units that occur in the region, and four facies of the Estrela Granite Complex were detected. The origin of the granite suite might be related to magmatic differentiation or to distinct intrusion pulses. The use of the

  19. Integration of large chemical kinetic mechanisms via exponential methods with Krylov approximations to Jacobian matrix functions

    KAUST Repository

    Bisetti, Fabrizio

    2012-06-01

    Recent trends in hydrocarbon fuel research indicate that the number of species and reactions in chemical kinetic mechanisms is rapidly increasing in an effort to provide predictive capabilities for fuels of practical interest. In order to cope with the computational cost associated with the time integration of stiff, large chemical systems, a novel approach is proposed. The approach combines an exponential integrator and Krylov subspace approximations to the exponential function of the Jacobian matrix. The components of the approach are described in detail and applied to the ignition of stoichiometric methane-air and iso-octane-air mixtures, here described by two widely adopted chemical kinetic mechanisms. The approach is found to be robust even at relatively large time steps and the global error displays a nominal third-order convergence. The performance of the approach is improved by utilising an adaptive algorithm for the selection of the Krylov subspace size, which guarantees an approximation to the matrix exponential within user-defined error tolerance. The Krylov projection of the Jacobian matrix onto a low-dimensional space is interpreted as a local model reduction with a well-defined error control strategy. Finally, the performance of the approach is discussed with regard to the optimal selection of the parameters governing the accuracy of its individual components. © 2012 Copyright Taylor and Francis Group, LLC.
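The core building block of this approach, approximating exp(A)v on a low-dimensional Krylov subspace via the Arnoldi process, can be sketched as follows. This is a generic textbook version, not the authors' implementation; the Taylor-based `small_expm` helper and the test values are illustrative.

```python
import numpy as np

def small_expm(M, terms=20, squarings=8):
    """Scaling-and-squaring Taylor series; adequate for the small projected matrix."""
    S = M / (2.0 ** squarings)
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ S / k
        E = E + term
    for _ in range(squarings):
        E = E @ E
    return E

def expm_krylov(A, v, m=20):
    """Approximate exp(A) @ v on an m-dimensional Krylov subspace (Arnoldi):
    exp(A) v ~ ||v|| * V_m * exp(H_m) * e1, with H_m = V_m^T A V_m."""
    n = len(v)
    m = min(m, n)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m))
    H = np.zeros((m, m))
    V[:, 0] = v / beta
    for j in range(m - 1):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:      # happy breakdown: subspace is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    w = A @ V[:, m - 1]              # fill the last column of the projection
    for i in range(m):
        H[i, m - 1] = V[:, i] @ w
    return beta * V[:, :m] @ small_expm(H[:m, :m])[:, 0]
```

In a stiff chemistry setting A would be the (large, sparse) Jacobian scaled by the time step, and m would be chosen adaptively against an error estimate, as the abstract describes.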

  20. Coping with Complex Environmental and Societal Flood Risk Management Decisions: An Integrated Multi-criteria Framework

    Directory of Open Access Journals (Sweden)

    Love Ekenberg

    2011-08-01

Full Text Available During recent years, a great deal of attention has been focused on the financial risk management of natural disasters. One reason is that the economic losses from floods, windstorms, earthquakes and other disasters in both the developing and developed countries are escalating dramatically. It has become apparent that an integrated water resource management approach would be beneficial in order to take both the best interests of society and of the environment into consideration. One improvement consists of models capable of handling multiple criteria (conflicting objectives as well as multiple stakeholders (conflicting interests. A systems approach is applied for coping with complex environmental and societal risk management decisions with respect to flood catastrophe policy formation, wherein the emphasis is on computer-based modeling and simulation techniques combined with methods for evaluating strategies where numerous stakeholders are incorporated in the process. The resulting framework consists of a simulation model, a decision analytical tool, and a set of suggested policy strategies for policy formulation. The framework will aid decision makers with high-risk, complex environmental decisions subject to significant uncertainties.

  1. Evaluation of the depth-integration method of measuring water discharge in large rivers

    Science.gov (United States)

    Moody, J.A.; Troutman, B.M.

    1992-01-01

The depth-integration method for measuring water discharge makes a continuous measurement of the water velocity from the water surface to the bottom at 20 to 40 locations or verticals across a river. It is especially practical for large rivers where river traffic makes it impractical to use boats attached to taglines strung across the river or to use current meters suspended from bridges. This method has the additional advantage over the standard two- and eight-tenths method in that a discharge-weighted suspended-sediment sample can be collected at the same time. When this method is used in large rivers such as the Missouri, Mississippi and Ohio, a microwave navigation system is used to determine the ship's position at each vertical sampling location across the river, and to make accurate velocity corrections to compensate for ship drift. An essential feature is a hydraulic winch that can lower and raise the current meter at a constant transit velocity so that the velocities at all depths are measured for equal lengths of time. Field calibration measurements show that: (1) the mean velocity measured on the upcast (bottom to surface) is within 1% of the standard mean velocity determined by 9-11 point measurements; (2) if the transit velocity is less than 25% of the mean velocity, then the average error in the mean velocity is 4% or less. The major source of bias error is a result of mounting the current meter above a sounding weight and sometimes above a suspended-sediment sampling bottle, which prevents measurement of the velocity all the way to the bottom. The measured mean velocity is slightly larger than the true mean velocity. This bias error in the discharge is largest in shallow water (approximately 8% for the Missouri River at Hermann, MO, where the mean depth was 4.3 m) and smallest in deeper water (approximately 3% for the Mississippi River at Vicksburg, MS, where the mean depth was 14.5 m). The major source of random error in the discharge is the natural
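Once a mean velocity and depth are known at each vertical, total discharge follows by summing panel contributions across the section. A mid-section sketch (a standard gauging convention, not this paper's exact procedure; station positions in metres, velocities in m/s are hypothetical):

```python
def total_discharge(stations, velocities, depths):
    """Mid-section method: vertical i represents a panel reaching halfway
    to each neighbouring vertical; Q = sum of v_i * d_i * w_i (m^3/s)."""
    n = len(stations)
    Q = 0.0
    for i in range(n):
        left = stations[i] if i == 0 else 0.5 * (stations[i - 1] + stations[i])
        right = stations[i] if i == n - 1 else 0.5 * (stations[i] + stations[i + 1])
        Q += velocities[i] * depths[i] * (right - left)
    return Q
```

With 20 to 40 verticals, as in the abstract, the panel widths become small relative to the channel and the summation closely approximates the cross-sectional integral of v*d.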

  2. Assessment of integrated electrical resistivity data on complex aquifer structures in NE Nuba Mountains - Sudan

    Science.gov (United States)

    Mohamed, N. E.; Yaramanci, U.; Kheiralla, K. M.; Abdelgalil, M. Y.

    2011-07-01

Two geophysical techniques were integrated to map the groundwater aquifers in complex geological settings in the crystalline basement terrain of the northeast Nuba Mountains. The water flow is structurally controlled by the northwest-southeast extensional faults, one of several in-situ deformational patterns attributed to the collision of the Pan-African oceanic assemblage of the Nubian shield against the pre-Pan-African continental crust to the west. The structural lineaments and drainage systems were enhanced by remote sensing techniques. The geophysical techniques used are vertical electrical soundings (VES) and electrical resistivity tomography (ERT), in addition to hydraulic conductivity measurements. These measurements were designed to overlap in order to improve the reproducibility of the geophysical data and to provide a better interpretation of the hydrogeological setting of the aquifer's complex structure. Smooth and block inversion schemes were attempted for the observed ERT data to study their reliability in mapping the different geometries of the complex subsurface. The VES survey was conducted where the ERT survey was not accessible; the data were inverted smoothly and merged with the ERT data in the 3D resistivity grid. The hydraulic conductivity was measured for 42 water samples collected from dug wells distributed across the study area, where extremely high-salinity zones were recorded and compared to the resistivity values in the 3D model.

  3. Atypical language laterality is associated with large-scale disruption of network integration in children with intractable focal epilepsy.

    Science.gov (United States)

    Ibrahim, George M; Morgan, Benjamin R; Doesburg, Sam M; Taylor, Margot J; Pang, Elizabeth W; Donner, Elizabeth; Go, Cristina Y; Rutka, James T; Snead, O Carter

    2015-04-01

Epilepsy is associated with disruption of integration in distributed networks, together with altered localization for functions such as expressive language. The relation between atypical network connectivity and altered localization is unknown. In the current study we tested whether atypical expressive language laterality was associated with the alteration of large-scale network integration in children with medically-intractable localization-related epilepsy (LRE). Twenty-three right-handed children (age range 8-17) with medically-intractable LRE performed a verb generation task in fMRI. Language network activation was identified and the laterality index (LI) was calculated within the pars triangularis and pars opercularis. Resting-state data from the same cohort were subjected to independent component analysis. Dual regression was used to identify associations between resting-state integration and LI values. Higher positive values of the LI, indicating typical language localization, were associated with stronger functional integration of various networks including the default mode network (DMN). The normally symmetric resting-state networks showed a pattern of lateralized connectivity mirroring that of language function. The association between atypical language localization and network integration implies a widespread disruption of neural network development. These findings may inform the interpretation of localization studies by providing novel insights into reorganization of neural networks in epilepsy. Copyright © 2015 Elsevier Ltd. All rights reserved.
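The laterality index used in such studies is conventionally LI = (L - R) / (L + R), computed over some activation measure (e.g. counts of supra-threshold voxels) in homologous left and right regions of interest. A minimal sketch; the voxel counts below are hypothetical, not data from this cohort:

```python
def laterality_index(left_activation, right_activation):
    """LI = (L - R) / (L + R): +1 is fully left-lateralized, -1 fully right.
    Positive values indicate typical (left-dominant) language localization."""
    total = left_activation + right_activation
    if total == 0:
        return 0.0
    return (left_activation - right_activation) / total
```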

  4. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    Science.gov (United States)

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  5. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit, SMI++, combining two approaches, finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
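The hierarchy of decentralized deciding entities described above can be caricatured in a few lines: commands propagate down the tree to leaf "devices", and each parent derives its own state from its children's states. This toy sketch only mimics the SMI++ idea; the node names, states and commands are invented for illustration.

```python
class Node:
    """Toy hierarchical finite-state-machine node: commands flow down,
    summarized states flow back up the control tree."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "NOT_READY"

    def command(self, cmd):
        for child in self.children:
            child.command(cmd)
        if not self.children:
            # leaf: a stand-in for real device behaviour
            self.state = "READY" if cmd == "configure" else "NOT_READY"
        else:
            # parent: a deciding entity that summarizes its children
            states = {c.state for c in self.children}
            self.state = states.pop() if len(states) == 1 else "MIXED"
```

In a real system each node would also carry rules ("when any child is in ERROR, issue recover"), which is what makes the hierarchy self-recovering rather than merely reporting.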

  6. Large Scale Production of Densified Hydrogen Using Integrated Refrigeration and Storage

    Science.gov (United States)

    Notardonato, William U.; Swanger, Adam Michael; Jumper, Kevin M.; Fesmire, James E.; Tomsik, Thomas M.; Johnson, Wesley L.

    2017-01-01

Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage (IRAS) technology at NASA Kennedy Space Center led to the production of large quantities of solid densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels, and LH2 temperatures were measured by twenty silicon diode temperature sensors. System energy balances and solid mass fractions are calculated. Experimental data reveal that hydrogen temperatures dropped well below the triple point during testing (by up to 1 K) and were continuing to trend downward prior to system shutdown. Sub-triple-point temperatures were seen to evolve in a time-dependent manner along the length of the horizontal, cylindrical vessel. Temperatures were recorded over approximately one month of testing at two different fill levels (33% and 67%). The phenomenon, observed at both fill levels, is described, detailed and explained herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  7. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

security limits. Under such a scenario, progressive displacement of conventional generation by wind generation is expected to eventually lead to a complex power system with the least presence of central power plants. Consequently the support from conventional power plants is expected to reach its all-time low...... system voltage control responsibility from conventional power plants to wind turbines. With increased wind penetration and displaced conventional central power plants, dynamic voltage security has been identified as one of the challenging issues for large-scale wind integration. To address the dynamic...... security issue, a WAMS based systematic voltage control scheme for large-scale wind integrated power systems has been proposed. Along with the optimal reactive power compensation, the proposed scheme considers voltage support from wind farms (equipped with voltage support functionality) and refurbished...

  8. Supervised maximum-likelihood weighting of composite protein networks for complex prediction

    Directory of Open Access Journals (Sweden)

    Yong Chern Han

    2012-12-01

    Full Text Available Abstract Background Protein complexes participate in many important cellular functions, so finding the set of existent complexes is essential for understanding the organization and regulation of processes in the cell. With the availability of large amounts of high-throughput protein-protein interaction (PPI data, many algorithms have been proposed to discover protein complexes from PPI networks. However, such approaches are hindered by the high rate of noise in high-throughput PPI data, including spurious and missing interactions. Furthermore, many transient interactions are detected between proteins that are not from the same complex, while not all proteins from the same complex may actually interact. As a result, predicted complexes often do not match true complexes well, and many true complexes go undetected. Results We address these challenges by integrating PPI data with other heterogeneous data sources to construct a composite protein network, and using a supervised maximum-likelihood approach to weight each edge based on its posterior probability of belonging to a complex. We then use six different clustering algorithms, and an aggregative clustering strategy, to discover complexes in the weighted network. We test our method on Saccharomyces cerevisiae and Homo sapiens, and show that complex discovery is improved: compared to previously proposed supervised and unsupervised weighting approaches, our method recalls more known complexes, achieves higher precision at all recall levels, and generates novel complexes of greater functional similarity. Furthermore, our maximum-likelihood approach allows learned parameters to be used to visualize and evaluate the evidence of novel predictions, aiding human judgment of their credibility. Conclusions Our approach integrates multiple data sources with supervised learning to create a weighted composite protein network, and uses six clustering algorithms with an aggregative clustering strategy to
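The supervised weighting step described above assigns each edge a posterior probability of being co-complex given its evidence features. A minimal logistic-style sketch of such a weighting; the feature values and learned weights are hypothetical, not the paper's trained parameters:

```python
import math

def edge_posterior(features, weights, bias):
    """Posterior probability that an edge belongs to a complex, modelled as a
    logistic function of its evidence-source features (PPI score, co-expression,
    shared annotation, ...). Weights would be learned from known complexes."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```

Clustering algorithms are then run on the weighted composite network, so edges supported by several independent sources dominate complex discovery while noisy single-source edges are down-weighted.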

  9. Integrating Infrastructure and Institutions for Water Security in Large Urban Areas

    Science.gov (United States)

    Padowski, J.; Jawitz, J. W.; Carrera, L.

    2015-12-01

    Urban growth has forced cities to procure more freshwater to meet demands; however the relationship between urban water security, water availability and water management is not well understood. This work quantifies the urban water security of 108 large cities in the United States (n=50) and Africa (n=58) based on their hydrologic, hydraulic and institutional settings. Using publicly available data, urban water availability was estimated as the volume of water available from local water resources and those captured via hydraulic infrastructure (e.g. reservoirs, wellfields, aqueducts) while urban water institutions were assessed according to their ability to deliver, supply and regulate water resources to cities. When assessing availability, cities relying on local water resources comprised a minority (37%) of those assessed. The majority of cities (55%) instead rely on captured water to meet urban demands, with African cities reaching farther and accessing a greater number and variety of sources for water supply than US cities. Cities using captured water generally had poorer access to local water resources and maintained significantly more complex strategies for water delivery, supply and regulatory management. Eight cities, all African, are identified in this work as having water insecurity issues. These cities lack sufficient infrastructure and institutional complexity to capture and deliver adequate amounts of water for urban use. Together, these findings highlight the important interconnection between infrastructure investments and management techniques for urban areas with a limited or dwindling natural abundance of water. Addressing water security challenges in the future will require that more attention be placed not only on increasing water availability, but on developing the institutional support to manage captured water supplies.

  10. Integrable spin chains and scattering amplitudes

    Energy Technology Data Exchange (ETDEWEB)

    Bartels, J.; Prygarin, A. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Lipatov, L.N. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Petersburg Nuclear Physics Institute (Russian Federation); Sankt-Peterburgskij Univ., St. Petersburg (Russian Federation)

    2011-04-15

    In this review we show that the multi-particle scattering amplitudes in N=4 SYM at large N{sub c} and in the multi-Regge kinematics for some physical regions have the high energy behavior appearing from the contribution of the Mandelstam cuts in the complex angular momentum plane of the corresponding t-channel partial waves. These Mandelstam cuts or Regge cuts are resulting from gluon composite states in the adjoint representation of the gauge group SU(N{sub c}). In the leading logarithmic approximation (LLA) their contribution to the six point amplitude is in full agreement with the known two-loop result. The Hamiltonian for the Mandelstam states constructed from n gluons in LLA coincides with the local Hamiltonian of an integrable open spin chain. We construct the corresponding wave functions using the integrals of motion and the Baxter-Sklyanin approach. (orig.)

  11. Real time information management for improving productivity in metallurgical complexes

    International Nuclear Information System (INIS)

    Bascur, O.A.; Kennedy, J.P.

    1999-01-01

Applying the latest information technologies in industrial plants has become a serious challenge for management and technical teams. The availability of real-time and historical operations information to identify the most critical parts of the processing system in terms of mechanical integrity is a must for global plant optimization. Expanded use of plant information on the desktop is a standard tool for revenue improvement, cost reduction, and adherence to production constraints. The industrial component desktop supports access to information for process troubleshooting, continuous improvement and innovation by plant and staff personnel. Collaboration between groups enables the implementation of an overall process effectiveness index based on losses due to equipment availability, production and product quality. The key design choice is to use the Internet-based technologies created by Microsoft for its marketplace: office automation and the Web. Time-derived variables are used for process analysis, troubleshooting and performance assessment. Connectivity between metallurgical complexes, research centers and their business systems has become a reality. Two case studies of large integrated mining/metallurgical complexes are highlighted. (author)

  12. Extreme disorder in an ultrahigh-affinity protein complex

    DEFF Research Database (Denmark)

    Borgia, Alessandro; Borgia, Madeleine B; Bugge, Katrine

    2018-01-01

    Molecular communication in biology is mediated by protein interactions. According to the current paradigm, the specificity and affinity required for these interactions are encoded in the precise complementarity of binding interfaces. Even proteins that are disordered under physiological conditions...... with picomolar affinity, but fully retain their structural disorder, long-range flexibility and highly dynamic character. On the basis of closely integrated experiments and molecular simulations, we show that the interaction can be explained by the large opposite net charge of the two proteins, without requiring...... or that contain large unstructured regions commonly interact with well-structured binding sites on other biomolecules. Here we demonstrate the existence of an unexpected interaction mechanism: the two intrinsically disordered human proteins histone H1 and its nuclear chaperone prothymosin-α associate in a complex...

  13. Equivalent Method of Integrated Power Generation System of Wind, Photovoltaic and Energy Storage in Power Flow Calculation and Transient Simulation

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

The integrated power generation system of wind, photovoltaic (PV) and energy storage is composed of several wind turbines, PV units and energy storage units. The detailed model of integrated generation is not suitable for large-scale power system simulation because of the model's complexity and long computation time. An equivalent method for power flow calculation and transient simulation of the integrated generation system is proposed based on actual projects, so as to establish the foundation of such integrated system simulation and analysis.
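A common first step in such equivalencing is to collapse n identical parallel units into a single machine whose rated capacity is the sum of the units and whose per-unit impedance is divided by n (parallel combination). A toy sketch under that simplifying assumption; the numbers are illustrative, not from the referenced projects:

```python
def aggregate_units(n_units, rated_mw_each, unit_impedance_pu):
    """Single-machine equivalent of n identical parallel units:
    rated capacities add; per-unit impedances combine in parallel (Z/n)."""
    return {
        "rated_mw": n_units * rated_mw_each,
        "impedance_pu": unit_impedance_pu / n_units,
    }
```

Real equivalencing must also preserve the aggregate dynamic response (converter controls, storage state of charge), which is why dedicated methods are needed for transient simulation rather than simple capacity summation alone.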

  14. Integrating Web-Based Teaching Tools into Large University Physics Courses

    Science.gov (United States)

    Toback, David; Mershin, Andreas; Novikova, Irina

    2005-12-01

    Teaching students in our large, introductory, calculus-based physics courses to be good problem-solvers is a difficult task. Not only must students be taught to understand and use the physics concepts in a problem, they must become adept at turning the physical quantities into symbolic variables, translating the problem into equations, and "turning the crank" on the mathematics to find both a closed-form solution and a numerical answer. Physics education research has shown that students' poor math skills and instructors' lack of pen-and-paper homework grading resources, two problems we face at our institution, can have a significant impact on problem-solving skill development.2-4 While Interactive Engagement methods appear to be the preferred mode of instruction,5 for practical reasons we have not been able to widely implement them. In this paper, we describe three Internet-based "teaching-while-quizzing" tools we have developed and how they have been integrated into our traditional lecture course in powerful but easy to incorporate ways.6 These are designed to remediate students' math deficiencies, automate homework grading, and guide study time toward problem solving. Our intent is for instructors who face similar obstacles to adopt these tools, which are available upon request.7

  15. Large-scale integration of optimal combinations of PV, wind and wave power into the electricity supply

    DEFF Research Database (Denmark)

    Lund, Henrik

    2006-01-01

This article presents the results of analyses of large-scale integration of wind power, photovoltaic (PV) and wave power into a Danish reference energy system. The possibility of integrating Renewable Energy Sources (RES) into the electricity supply is expressed in terms of the ability to avoid...... ancillary services are needed in order to secure the electricity supply system. The idea is to benefit from the different patterns in the fluctuations of different renewable sources, and the purpose is to identify optimal mixtures from a technical point of view. The optimal mixture seems to be when onshore...... wind power produces approximately 50% of the total electricity production from RES. Meanwhile, the mixture between PV and wave power seems to depend on the total amount of electricity production from RES. When the total RES input is below 20% of demand, PV should cover 40% and wave power only 10%. When...

  16. Analysis of integrated video and radiation data

    International Nuclear Information System (INIS)

    Howell, J.A.; Menlove, H.O.; Rodriguez, C.A.; Beddingfield, D.; Vasil, A.

    1995-01-01

    We have developed prototype software for a facility-monitoring application that will detect anomalous activity in a nuclear facility. The software, which forms the basis of a simple model, automatically reviews and analyzes integrated safeguards data from continuous unattended monitoring systems. This technology, based on pattern recognition by neural networks, provides significant capability to analyze complex data and has the ability to learn and adapt to changing situations. It is well suited for large automated facilities, reactors, spent-fuel storage facilities, reprocessing plants, and nuclear material storage vaults

  17. Monolithic Ge-on-Si lasers for large-scale electronic-photonic integration

    Science.gov (United States)

    Liu, Jifeng; Kimerling, Lionel C.; Michel, Jurgen

    2012-09-01

A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic-photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review on the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate the energy difference between the direct and indirect band gap of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminating in recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500-1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  18. Monolithic Ge-on-Si lasers for large-scale electronic–photonic integration

    International Nuclear Information System (INIS)

    Liu, Jifeng; Kimerling, Lionel C; Michel, Jurgen

    2012-01-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic–photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review of the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate for the energy difference between the direct and indirect band gap of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminating in recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500–1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously
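    The band-engineering idea in this abstract (tensile strain narrows the Γ–L gap difference, then n-type doping fills the L valleys so injected electrons populate the Γ valley) can be sketched numerically. The bulk-Ge gap values below (0.800 eV direct, 0.664 eV indirect) are standard, but the linear strain coefficients are illustrative assumptions, not the paper's fitted deformation potentials:

    ```python
    # Illustrative sketch of the Gamma-L gap compensation in band-engineered Ge.
    # Bulk gap values are standard for Ge; the per-strain slopes are assumed
    # placeholders, not the deformation potentials used in the reviewed work.

    E_GAMMA_0 = 0.800   # eV, direct (Gamma) gap of unstrained bulk Ge
    E_L_0     = 0.664   # eV, indirect (L) gap of unstrained bulk Ge

    # Biaxial tensile strain narrows both gaps, the direct gap faster,
    # so their difference shrinks with strain.
    D_GAMMA_PER_PCT = -0.121  # eV per % strain (assumed slope)
    D_L_PER_PCT     = -0.087  # eV per % strain (assumed slope)

    def gap_difference(strain_pct: float) -> float:
        """Residual Gamma-L gap difference (eV) at a given tensile strain (%)."""
        e_gamma = E_GAMMA_0 + D_GAMMA_PER_PCT * strain_pct
        e_l = E_L_0 + D_L_PER_PCT * strain_pct
        return e_gamma - e_l

    # The residual difference left after strain is what n-type doping must
    # compensate by filling L-valley states up to the Gamma-valley edge.
    print(f"unstrained:   {gap_difference(0.0) * 1000:.1f} meV")
    print(f"0.25% strain: {gap_difference(0.25) * 1000:.1f} meV")
    ```

    With representative numbers like these, modest thermally induced strain removes only part of the ~136 meV difference, which is why the approach pairs strain with heavy n-type doping rather than relying on either alone.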

  19. Integration of borehole and seismic data to unravel complex stratigraphy: Case studies from the Mannville Group, Western Canada

    Science.gov (United States)

    Sarzalejo Silva, Sabrina Ester

    Understanding the stratigraphic architecture of geologically complex reservoirs, such as the heavy oil deposits of Western Canada, is essential to achieve an efficient hydrocarbon recovery. Borehole and 3-D seismic data were integrated to define the stratigraphic architecture and generate 3-dimensional geological models of the Mannville Group in Saskatchewan. The Mannville is a stratigraphically complex unit formed of fluvial to marine deposits. Two areas in west-central and southern Saskatchewan were examined in this study. In west-central Saskatchewan, the area corresponds to a stratigraphically controlled heavy oil reservoir with production from the undifferentiated Dina-Cummings Members of the Lower Cretaceous Mannville Group. The southern area, although non-prospective for hydrocarbons, shares many similarities with time-equivalent strata in areas of heavy oil production. Seismic sequence stratigraphic principles together with log signatures permitted the subdivision of the Mannville into different packages. An initial geological model was generated integrating seismic and well-log data. Multiattribute analysis and neural networks were used to generate a pseudo-lithology or gamma-ray volume. The incorporation of borehole core data to the model and the subsequent integration with the lithological prediction were crucial to capture the distribution of reservoir and non-reservoir deposits in the study area. The ability to visualize the 3-D seismic data in a variety of ways, including arbitrary lines and stratal or horizon slicing techniques, helped define stratigraphic features such as channels and scroll bars that affect fluid flow in hydrocarbon producing areas. Small-scale heterogeneities in the reservoir were not resolved due to the resolution of the seismic data. Although not undertaken in this study, the resulting stratigraphic framework could be used to help construct a static reservoir model. Because of the small size of the 3-D seismic surveys
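    The multiattribute workflow this abstract describes, predicting a pseudo-lithology (gamma-ray) volume from seismic attributes calibrated at well locations, can be sketched with wholly synthetic data. For brevity this uses a linear least-squares transform rather than a neural network; the attribute choices, weights, and noise level are all illustrative assumptions:

    ```python
    import numpy as np

    # Illustrative sketch of multiattribute analysis: fit a linear transform
    # from seismic attributes to a gamma-ray log at borehole sample points,
    # then apply it to predict pseudo-lithology away from wells. All data
    # here is synthetic; real workflows calibrate against measured logs.

    rng = np.random.default_rng(0)
    n_samples = 200

    # Columns: e.g. amplitude, instantaneous frequency, acoustic impedance
    # (standardized). These attribute choices are assumptions.
    attributes = rng.normal(size=(n_samples, 3))
    true_weights = np.array([12.0, -5.0, 8.0])  # synthetic ground truth
    gamma_ray = attributes @ true_weights + 60.0 \
        + rng.normal(scale=2.0, size=n_samples)  # API units, with noise

    # Fit the multiattribute transform (with intercept) by least squares.
    design = np.column_stack([attributes, np.ones(n_samples)])
    coeffs, *_ = np.linalg.lstsq(design, gamma_ray, rcond=None)

    predicted = design @ coeffs
    residual_rms = float(np.sqrt(np.mean((gamma_ray - predicted) ** 2)))
    print("fitted weights + intercept:", np.round(coeffs, 1))
    print("residual RMS (API units):", round(residual_rms, 2))
    ```

    In practice the calibrated transform is applied trace by trace across the seismic volume, which is what produces the gamma-ray volume the study then merges with core data.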

  20. The Challenge of Integrating Care in Dual Diagnosis; Anti-NMDA-Receptor Encephalitis; Presentation And Outcome In 3 Cases Referred For Complex Specialist Rehabilitation Services

    LENUS (Irish Health Repository)

    Carroll, A

    2018-03-01

    The successful implementation of an integrated care pathway (ICP) for any given condition is a challenge. Even more challenging is successful ICP implementation for individuals who have multiple co-morbidities. This is further compounded when there are dual mental health and physical disabilities that require integrated working across multiple disciplines, specialties, institutions and organisations. Anti-NMDA-Receptor encephalitis (aNMDARe) is a relatively new diagnostic entity with patients typically presenting with significant psychiatric symptoms followed by progressive neurological deterioration. In this case series, we describe 3 cases of females with aNMDARe who were referred for complex specialist rehabilitation (CSR) to The National Rehabilitation Hospital. CSR is the total active care of patients with a disabling condition, and their families, by a multi-professional team who have undergone recognised specialist training in rehabilitation, led/supported by a consultant trained and accredited in rehabilitation medicine (RM). These services provide for patients with highly complex rehabilitation needs that are beyond the scope of local services. In these cases, referral to CSR resulted in the construction of a bespoke integrated care pathway (ICP) that transcended the barriers between primary, secondary and tertiary care and across the boundaries of physical and mental health. A care pathway is a complex intervention for the mutual decision-making and organisation of care processes. Rehabilitation services acted as the coordinator of services in these cases to ensure implementation of the care plan and to ensure successful transitions of care, and supported local specialist and general teams in the management of these complex cases.