WorldWideScience

Sample records for replica management architecture

  1. File-based replica management

    CERN Document Server

    Kunszt, Peter Z; Stockinger, Heinz; Stockinger, Kurt

    2005-01-01

    Data replication is one of the best known strategies to achieve high levels of availability and fault tolerance, as well as minimal access times for large, distributed user communities using a world-wide Data Grid. In certain scientific application domains, the data volume can reach the order of several petabytes; in these domains, data replication and access optimization play an important role in the manageability and usability of the Grid. In this paper, we present the design and implementation of a replica management Grid middleware that was developed within the EDG project [European Data Grid Project (EDG), http://www.eu-egee.org] and is designed to be extensible so that user communities can adjust its detailed behavior according to their QoS requirements.
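
    A minimal sketch of the replica-management operations such middleware typically exposes: a replica catalogue mapping logical file names to physical copies, a copy-and-register call, and a pluggable replica-selection policy that a user community could swap for its own QoS rule. Class and method names are illustrative assumptions, not the EDG API.

        # Illustrative sketch of a replica manager; names and the policy hook
        # are assumptions, not the EDG middleware interface.
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class ReplicaCatalog:
            """Maps logical file names (LFNs) to physical file names (PFNs)."""
            entries: Dict[str, List[str]] = field(default_factory=dict)

            def register(self, lfn: str, pfn: str) -> None:
                self.entries.setdefault(lfn, []).append(pfn)

            def list_replicas(self, lfn: str) -> List[str]:
                return self.entries.get(lfn, [])

        class ReplicaManager:
            def __init__(self, catalog: ReplicaCatalog,
                         select_policy: Callable[[List[str]], str] = lambda pfns: pfns[0]):
                self.catalog = catalog
                self.select_policy = select_policy  # user-supplied QoS policy

            def copy_and_register(self, lfn: str, source_pfn: str, dest_se: str) -> str:
                # A real Grid deployment would trigger a third-party transfer here.
                dest_pfn = f"{dest_se}/{lfn.rsplit('/', 1)[-1]}"
                print(f"transferring {source_pfn} -> {dest_pfn}")
                self.catalog.register(lfn, dest_pfn)
                return dest_pfn

            def best_replica(self, lfn: str) -> str:
                return self.select_policy(self.catalog.list_replicas(lfn))

        # Usage: register one replica, create a second, pick one by policy.
        cat = ReplicaCatalog()
        cat.register("lfn:/grid/run42.root", "se1.cern.ch/run42.root")
        rm = ReplicaManager(cat)
        rm.copy_and_register("lfn:/grid/run42.root", "se1.cern.ch/run42.root", "se2.infn.it")
        print(rm.best_replica("lfn:/grid/run42.root"))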

  2. Beyond Virtual Replicas: 3D Modeling and Maltese Prehistoric Architecture

    Directory of Open Access Journals (Sweden)

    Filippo Stanco

    2013-01-01

    Full Text Available In the past decade, computer graphics have become strategic for the development of projects aimed at the interpretation of archaeological evidence and the dissemination of scientific results to the public. Among all the solutions available, the use of 3D models is particularly relevant for the reconstruction of poorly preserved sites and monuments destroyed by natural causes or human actions. These digital replicas are, at the same time, a virtual environment that can be used as a tool for the interpretative hypotheses of archaeologists and as an effective medium for a visual description of the cultural heritage. In this paper, the innovative methodology, aims, and outcomes of a virtual reconstruction of the Borg in-Nadur megalithic temple, carried out by the Archeomatica Project of the University of Catania, are offered as a case study for a virtual archaeology of prehistoric Malta.

  3. Architecture for Data Management

    OpenAIRE

    Vukolic, Marko

    2015-01-01

    In this document we present the preliminary architecture of the SUPERCLOUD data management and storage. We start by defining the design requirements of the architecture, motivated by use cases, and then review the state of the art. We survey security and dependability technologies and discuss designs for the overall unifying architecture for data management that serves as an umbrella for different security and dependability data management features. Specifically, the document lays out the archi...

  4. Enterprise architecture management

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Gøtze, John; Møller, Charles

    2017-01-01

    Despite the growing interest in enterprise architecture management, researchers and practitioners lack a shared understanding of its applications in organizations. Building on findings from a literature review and eight case studies, we develop a taxonomy that categorizes applications of enterprise architecture management based on three classes of enterprise architecture scope. Organizations may adopt enterprise architecture management to help form, plan, and implement IT strategies; help plan and implement business strategies; or to further complement the business strategy-formation process. The findings challenge the traditional IT-centric view of enterprise architecture management application and suggest enterprise architecture management as an approach that could support the consistent design and evolution of an organization as a whole.

  6. Layered Fault Management Architecture

    National Research Council Canada - National Science Library

    Sztipanovits, Janos

    2004-01-01

    ... UAVs or Organic Air Vehicles. The approach of this effort was to analyze fault management requirements of formation flight for fleets of UAVs, and develop a layered fault management architecture which demonstrates significant...

  7. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
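
    A toy sketch of one consistency model the abstract alludes to: a single master copy accepts all updates and lazily propagates a versioned state to secondary replicas. Names and the propagation rule are illustrative assumptions, not the Replica Consistency Service interface.

        # Illustrative single-master, lazy-propagation consistency model; not
        # the EDG Replica Consistency Service API.
        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class Replica:
            site: str
            version: int = 0
            data: Dict[str, str] = field(default_factory=dict)

        class ConsistencyService:
            def __init__(self, master: Replica, secondaries: List[Replica]):
                self.master = master
                self.secondaries = secondaries

            def update(self, key: str, value: str) -> None:
                """All writes go to the master, which bumps the version."""
                self.master.data[key] = value
                self.master.version += 1

            def propagate(self) -> None:
                """Lazily push the master state to stale secondaries."""
                for rep in self.secondaries:
                    if rep.version < self.master.version:
                        rep.data = dict(self.master.data)
                        rep.version = self.master.version

            def is_consistent(self) -> bool:
                return all(r.version == self.master.version for r in self.secondaries)

        svc = ConsistencyService(Replica("CERN"), [Replica("CNAF"), Replica("RAL")])
        svc.update("catalogue/run42", "se2.infn.it/run42.root")
        print(svc.is_consistent())   # False until propagation runs
        svc.propagate()
        print(svc.is_consistent())   # True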

  8. Pervasive Application Rights Management Architecture

    OpenAIRE

    Dusparic, Ivana

    2005-01-01

    This dissertation describes an application rights management architecture that combines license management with digital rights management to provide an integrated platform for the specification, generation, delivery and management of application usage rights for pervasive computing environments. A new rights expression language is developed, extended from the existing language, ODRL, which allows the expression of mobile application usage rights and supports fine-grained usage ...

  9. IT Service Management Architectures

    DEFF Research Database (Denmark)

    Tambo, Torben; Filtenborg, Jacob

    2018-01-01

    IT service providers tend to view their services as quasi-embedded in the client organisations' infrastructure. Therefore, IT service providers lack a full picture of being an organisation with its own enterprise architecture. By systematically developing an enterprise architecture using the unification operating model, IT service providers can much more efficiently develop relevant service catalogues with connected reporting services related to SLAs and KPIs based on ITIL and newer frameworks like SIAM.

  10. Democratic management and architecture school

    Directory of Open Access Journals (Sweden)

    Silvana Aparecida de Souza

    2011-10-01

    Full Text Available This is conceptual and theoretical research on school organization and its democratization, focusing on one aspect of an objective nature: its architecture. The study was based on the academic literature on democratization and on the theoretical contribution of Michel Foucault with regard to the analysis of space as a resource for control, surveillance and training, and includes a historical review of the construction models of school buildings in Brazil. It is therefore a sociological analysis of the school environment in relation to the democratization process of basic education, understood as ensuring the conditions of access and permanence to a universal quality education, conceived and gestated from the collective interests of its users. We conclude that the architecture of public schools in Brazil does not provide for democratic management, whether because of the controlling format of the buildings constructed in the republican period or because of the current economic priorities for the construction of public school buildings, which leave little or no space for collective activities. The character of the buildings remains controlling, no longer through their architecture but through technological development, which allows monitoring by video cameras, carried out with the permission and support of the community.

  11. A Novel Buffer Management Architecture for Epidemic Routing in Delay Tolerant Networks (DTNs)

    KAUST Repository

    Elwhishi, Ahmed; Ho, Pin-Han; Naik, K.; Shihada, Basem

    2010-01-01

    Delay tolerant networks (DTNs) are wireless networks in which an end-to-end path for a given node pair may not exist for extended periods. Launching multiple message replicas has been reported as a viable approach to increase the message delivery ratio and reduce message delivery delay. This advantage, nonetheless, comes at the expense of taking more buffer space at each node, and the combination of custody and replication entails high buffer and bandwidth overhead. This paper investigates a new buffer management architecture for epidemic routing in DTNs, which helps each node decide which message should be forwarded or dropped. The proposed buffer management architecture is characterized by a suite of novel functional modules, including the Summary Vector Exchange Module (SVEM), the Networks State Estimation Module (NSEM), and the Utility Calculation Module (UCM). Extensive simulation results show that the proposed buffer management architecture achieves superior performance against its counterparts in terms of delivery ratio and delivery delay.
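
    A minimal sketch of the kind of drop decision such a buffer manager must make on overflow, using per-message state of the sort the SVEM/NSEM modules would supply; the utility formula here is a placeholder assumption, not the paper's UCM definition.

        # Sketch of utility-based buffer management for epidemic routing.
        # The utility function below is a placeholder, not the paper's UCM formula.
        from dataclasses import dataclass
        import time

        @dataclass
        class Message:
            msg_id: str
            size: int            # bytes
            created: float       # seconds since epoch
            ttl: float           # seconds
            copies_seen: int     # estimate from summary-vector exchanges

        def utility(m: Message, now: float) -> float:
            remaining = max(m.ttl - (now - m.created), 0.0)
            # Keep messages that still have lifetime left and few copies in the
            # network; heavily replicated or expiring messages are dropped first.
            return remaining / (1 + m.copies_seen)

        def enqueue(buffer: list, capacity: int, incoming: Message) -> list:
            buffer = buffer + [incoming]
            now = time.time()
            while sum(m.size for m in buffer) > capacity:
                buffer.remove(min(buffer, key=lambda m: utility(m, now)))
            return buffer

        buf, now = [], time.time()
        buf = enqueue(buf, 3000, Message("a", 1500, now, 600, copies_seen=5))
        buf = enqueue(buf, 3000, Message("b", 1500, now, 600, copies_seen=1))
        buf = enqueue(buf, 3000, Message("c", 1500, now, 300, copies_seen=1))
        print([m.msg_id for m in buf])  # the most-replicated message is evicted first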

  13. A DISTRIBUTED PROGNOSTIC HEALTH MANAGEMENT ARCHITECTURE

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper introduces a generic distributed prognostic health management (PHM) architecture with specific application to the electrical power systems domain. Current...

  14. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...

  15. Maritime Domain Awareness Architecture Management Hub Strategy

    National Research Council Canada - National Science Library

    2008-01-01

    This document provides an initial high-level strategy for carrying out the responsibilities of the national Maritime Domain Awareness Architecture Management Hub to deliver a standards-based service...

  16. An Architectural Modelfor Intelligent Network Management

    Institute of Scientific and Technical Information of China (English)

    罗军舟; 顾冠群; 费翔

    2000-01-01

    The traditional network management approach involves managing each vendor's equipment and network segment in isolation through its own proprietary element management system. It is necessary to set up a new network management architecture that calls for operation consolidation across vendor and technology boundaries. In this paper, an architectural model for Intelligent Network Management (INM) is presented. The INM system includes a manager system, which controls all subsystems and coordinates different management tasks; an expert system, which is responsible for handling particularly difficult problems; and intelligent agents, which bring the management closer to applications and user requirements by spreading intelligent agents through network segments or domains. Within the proposed expert system model, an intelligent fault management system is presented in particular. The architectural model is intended to guide the building of an INM system that meets the needs of managing modern network systems.
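
    A toy sketch of the division of labour described above: per-segment agents clear routine faults locally, and the manager escalates harder ones to the expert system. Component names and the escalation rule are illustrative assumptions.

        # Toy sketch of the INM split between manager, agents and expert system;
        # the severity threshold and rule table are illustrative assumptions.
        class SegmentAgent:
            def __init__(self, segment: str):
                self.segment = segment

            def handle(self, fault: dict) -> bool:
                # Agents resolve routine, local faults close to the application.
                if fault["severity"] <= 2:
                    print(f"[{self.segment}] agent cleared {fault['kind']}")
                    return True
                return False

        class ExpertSystem:
            RULES = {"link-flap": "reroute traffic", "config-drift": "restore baseline"}

            def diagnose(self, fault: dict) -> str:
                return self.RULES.get(fault["kind"], "escalate to operator")

        class Manager:
            def __init__(self, agents, expert):
                self.agents = {a.segment: a for a in agents}
                self.expert = expert

            def dispatch(self, fault: dict) -> None:
                agent = self.agents[fault["segment"]]
                if not agent.handle(fault):
                    print(f"[manager] {fault['kind']}: {self.expert.diagnose(fault)}")

        mgr = Manager([SegmentAgent("campus"), SegmentAgent("backbone")], ExpertSystem())
        mgr.dispatch({"segment": "campus", "kind": "port-down", "severity": 1})
        mgr.dispatch({"segment": "backbone", "kind": "link-flap", "severity": 4})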

  17. Control architectures for IT management

    International Nuclear Information System (INIS)

    Wang Ting

    2003-01-01

    This paper summarises the three financial control architectures for an IT department in an enterprise or organization, namely the unallocated cost center, the allocated cost center, and the profit center; analyses their characteristics; and concludes with detailed suggestions for choosing among these control architectures. (authors)

  18. Implementation and performance analysis of the LHCb LFC replica using Oracle streams technology

    CERN Document Server

    Düllmann, D; Martelli, B; Peco, G; Bonifazzi, F; Da Fonte Perez, E; Baranowski, Z; Vagnoni, V

    2007-01-01

    The presentation will describe the architecture and the deployment of the LHCb read-only File Catalogue for the LHC Computing Grid (LFC) replica implemented at the Italian INFN National Centre for Telematics and Informatics (CNAF), and evaluate a series of tests on the LFC with the replica. The LHCb computing model foresees the replication of the central LFC database in every Tier-1, in order to assure more scalability and fault tolerance to LHCb applications. Scientific data-intensive applications use a large collection of files for storing data. In particular, as regards the HEP community, data generated by large detectors will be managed and stored using databases. The intensive access to information stored in databases by Grid computing applications requires distributed database replication in order to guarantee scalability and, in case of failure, redundancy. Besides, the results of the tests will be an important reference for all Grid users. This talk will describe the replica implementation of L...

  19. Enterprise Architecture in the Company Management Framework

    Directory of Open Access Journals (Sweden)

    Bojinov Bojidar Violinov

    2016-11-01

    Full Text Available The study aims to explore the role and importance of the concept of enterprise architecture in modern company management. For this purpose it clarifies the nature, scope, and components of the enterprise architecture, and the relationships within it, using the Zachman model. Based on a critical analysis of works by leading scientists, a definition of enterprise architecture is presented as a general description of all elements of strategic management of the company, combined with a description of its organizational, functional and operational structure, including the relationships between all tangible and intangible resources essential for its normal functioning and development. This in turn enables IT enterprise architecture to be defined as a set of corporate IT resources (hardware, software and technology), their interconnection and integration within the overall architecture of the company, as well as their formal description, methods and tools for their modeling and management in order to achieve the strategic business goals of the organization. In conclusion the article summarizes the significance and role of enterprise architecture for strategic management of the company in today's digital economy. The study underlines the importance of an integrated multidisciplinary approach to the work of a contemporary company, and the need for adequate matching and alignment of IT with the business priorities and objectives of the company.

  20. A Reference Architecture for Space Information Management

    Science.gov (United States)

    Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.

    2006-01-01

    We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.

  1. A resource management architecture for metacomputing systems.

    Energy Technology Data Exchange (ETDEWEB)

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.
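
    A brief sketch of the broker and co-allocator roles described above: a declarative resource request is matched against the resources advertised by site-local managers, either at a single site or split across several. The request syntax is an illustrative assumption, not the Globus resource specification language.

        # Sketch of broker-style matching of a resource request against
        # site-local managers; the request fields are illustrative, not RSL.
        from typing import Dict, List

        SITES: List[Dict] = [
            {"name": "siteA", "cpus": 128, "memory_gb": 256,  "arch": "x86_64"},
            {"name": "siteB", "cpus": 64,  "memory_gb": 512,  "arch": "x86_64"},
            {"name": "siteC", "cpus": 512, "memory_gb": 1024, "arch": "aarch64"},
        ]

        def broker(request: Dict, sites: List[Dict]) -> List[Dict]:
            """Return every site that can satisfy the request on its own."""
            return [s for s in sites
                    if s["cpus"] >= request["cpus"]
                    and s["memory_gb"] >= request["memory_gb"]
                    and s["arch"] == request["arch"]]

        def co_allocate(request: Dict, sites: List[Dict]) -> List[Dict]:
            """Split the CPU requirement across several sites of the right arch."""
            chosen, still_needed = [], request["cpus"]
            for s in sorted(sites, key=lambda s: -s["cpus"]):
                if s["arch"] == request["arch"] and still_needed > 0:
                    chosen.append(s)
                    still_needed -= s["cpus"]
            return chosen if still_needed <= 0 else []

        req = {"cpus": 160, "memory_gb": 128, "arch": "x86_64"}
        print([s["name"] for s in broker(req, SITES)])       # no single site suffices -> []
        print([s["name"] for s in co_allocate(req, SITES)])  # ['siteA', 'siteB']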

  2. Architectural mismatch issues in identity management deployment

    DEFF Research Database (Denmark)

    Andersen, Mads Schaarup

    2010-01-01

    Integrating Commercial Off-The-Shelf products in a company's software product portfolio offers business value, but introduces challenges from a software architecture perspective. In this paper, the research challenges in relation to identity management in the Danish municipality administration system called Opus are outlined. Opus BRS is the identity management part of Opus. Opus integrates SAP, legacy mainframe systems, and other third party systems of the individual municipality. Each of these systems define their own software architecture and access control model, leading to architectural mismatch with an impact on security, usability, and maintainability. The research project is discussed and access control and identity provisioning are recognized as the major areas of interest in relation to the mismatch challenges. The project is carried out in close cooperation with KMD, one...

  3. An Architecture for Open Learning Management Systems

    NARCIS (Netherlands)

    Avgeriou, Paris; Retalis, Simos; Skordalakis, Manolis

    2003-01-01

    There exists an urgent demand on defining architectures for Learning Management Systems, so that high-level frameworks for understanding these systems can be discovered, and quality attributes like portability, interoperability, reusability and modifiability can be achieved. In this paper we propose

  4. Re-engineering Nascom's network management architecture

    Science.gov (United States)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 Kbs) were developed following existing standards; but, there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as, X-windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. The MSS, CAP, MACS, and Ecom projects have indicated

  5. Business architecture management architecting the business for consistency and alignment

    CERN Document Server

    Simon, Daniel

    2015-01-01

    This book presents a comprehensive overview of enterprise architecture management with a specific focus on the business aspects. While recent approaches to enterprise architecture management have dealt mainly with aspects of information technology, this book covers all areas of business architecture from business motivation and models to business execution. The book provides examples of how architectural thinking can be applied in these areas, thus combining different perspectives into a consistent whole. In-depth experiences from end-user organizations help readers to understand the abstract concepts of business architecture management and to form blueprints for their own professional approach. Business architecture professionals, researchers, and others working in the field of strategic business management will benefit from this comprehensive volume and its hands-on examples of successful business architecture management practices.

  6. The Architecture of Financial Risk Management Systems

    Directory of Open Access Journals (Sweden)

    Iosif ZIMAN

    2013-01-01

    Full Text Available The architecture of systems dedicated to risk management is probably one of the more complex tasks to tackle in the world of finance. Financial risk has been at the center of attention since the explosive growth of financial markets and even more so after the 2008 financial crisis. At multiple levels, financial companies, financial regulatory bodies, governments and cross-national regulatory bodies have all put the subject of financial risk, and in particular the way it is calculated, managed, reported and monitored, under intense scrutiny. As a result, the technology underpinnings which support the implementation of financial risk systems have evolved considerably and have become one of the most complex areas involving systems and technology in the context of the financial industry. We present the main paradigms, requirements and design considerations when undertaking the implementation of a risk system and give examples of user requirements, sample product coverage and performance parameters.

  7. Replica methods for loopy sparse random graphs

    International Nuclear Information System (INIS)

    Coolen, ACC

    2016-01-01

    I report on the development of a novel statistical mechanical formalism for the analysis of random graphs with many short loops, and of processes on such graphs. The graphs are defined via maximum entropy ensembles, in which both the degrees (via hard constraints) and the adjacency matrix spectrum (via a soft constraint) are prescribed. The sum over graphs can be done analytically, using a replica formalism with complex replica dimensions. All known results for tree-like graphs are recovered in a suitable limit. For loopy graphs, the emerging theory has an appealing and intuitive structure, suggests how message passing algorithms should be adapted, and indicates the structure of theories describing spin systems on loopy architectures. However, the formalism is still largely untested, and may require further adjustment and refinement. (paper)
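
    A hedged LaTeX sketch of the kind of ensemble the abstract describes: a maximum entropy distribution over graphs in which every degree is imposed as a hard constraint and the adjacency spectrum enters through a soft Lagrange-multiplier term. The notation is illustrative and not taken from the paper.

        % Maximum-entropy graph ensemble with hard degree constraints and a
        % soft constraint on the adjacency spectrum (illustrative notation).
        \begin{align}
          p(\mathbf{A}) &= \frac{1}{Z}\,
            \Big[\prod_{i=1}^{N}\delta_{k_i,\,\sum_{j}A_{ij}}\Big]
            \exp\!\Big[N\!\int\!\mathrm{d}\mu\;\hat{\phi}(\mu)\,
              \varrho(\mu\mid\mathbf{A})\Big],\\
          \varrho(\mu\mid\mathbf{A}) &= \frac{1}{N}\sum_{i=1}^{N}
            \delta\big(\mu-\mu_i(\mathbf{A})\big),
        \end{align}
        % Here the mu_i(A) are the eigenvalues of the adjacency matrix A and the
        % multiplier function phi-hat(mu) is chosen so that the ensemble-averaged
        % spectral density matches a prescribed target. Averages over p(A) are
        % then computed with a replica representation in which the replica
        % dimension is continued to complex values.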

  8. Architectural Debt Management in Value-oriented Architecting

    NARCIS (Netherlands)

    Li, Z.; Liang, P.; Avgeriou, P.

    2014-01-01

    Architectural technical debt (ATD) may be incurred when making architecture decisions. In most cases, ATD is not effectively managed in the architecting process: It is not made explicit, and architecture decision making does not consider the ATD incurred by the different design options. This chapter

  9. Hybrid Power Management-Based Vehicle Architecture

    Science.gov (United States)

    Eichenberg, Dennis J.

    2011-01-01

    Hybrid Power Management (HPM) is the integration of diverse, state-of-the-art power devices in an optimal configuration for space and terrestrial applications (see figure). The appropriate application and control of the various power devices significantly improves overall system performance and efficiency. The basic vehicle architecture consists of a primary power source, and possibly other power sources, that provides all power to a common energy storage system that is used to power the drive motors and vehicle accessory systems. This architecture also provides power as an emergency power system. Each component is independent, permitting it to be optimized for its intended purpose. The key element of HPM is the energy storage system. All generated power is sent to the energy storage system, and all loads derive their power from that system. This can significantly reduce the power requirement of the primary power source, while increasing the vehicle reliability. Ultracapacitors are ideal for an HPM-based energy storage system due to their exceptionally long cycle life, high reliability, high efficiency, high power density, and excellent low-temperature performance. Multiple power sources and multiple loads are easily incorporated into an HPM-based vehicle. A gas turbine is a good primary power source because of its high efficiency, high power density, long life, high reliability, and ability to operate on a wide range of fuels. An HPM controller maintains optimal control over each vehicle component. This flexible operating system can be applied to all vehicles to considerably improve vehicle efficiency, reliability, safety, security, and performance. The HPM-based vehicle architecture has many advantages over conventional vehicle architectures. Ultracapacitors have a much longer cycle life than batteries, which greatly improves system reliability, reduces life-of-system costs, and reduces environmental impact as ultracapacitors will probably never need to be
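
    A toy energy-bookkeeping sketch of the HPM principle described above: every source charges a common store and every load draws from it, with a failed draw signalling the emergency path. Component values are arbitrary illustrations, not NASA figures.

        # Toy energy bookkeeping for an HPM-style vehicle: all sources charge a
        # common store (e.g. an ultracapacitor bank), all loads draw from it.
        # The numbers below are arbitrary illustrations.
        class EnergyStore:
            def __init__(self, capacity_wh: float):
                self.capacity_wh = capacity_wh
                self.level_wh = 0.0

            def charge(self, power_w: float, hours: float) -> None:
                self.level_wh = min(self.capacity_wh, self.level_wh + power_w * hours)

            def discharge(self, power_w: float, hours: float) -> bool:
                needed = power_w * hours
                if needed > self.level_wh:
                    return False          # would trigger load shedding / emergency mode
                self.level_wh -= needed
                return True

        store = EnergyStore(capacity_wh=500)
        store.charge(power_w=2000, hours=0.2)                  # primary source
        ok_drive = store.discharge(power_w=1500, hours=0.2)    # drive motors
        ok_aux = store.discharge(power_w=300, hours=0.2)       # accessory systems
        print(round(store.level_wh, 1), ok_drive, ok_aux)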

  10. Architecture of a software quench management system

    International Nuclear Information System (INIS)

    Jerzy M. Nogiec et al.

    2001-01-01

    Testing superconducting accelerator magnets is inherently coupled with the proper handling of quenches; i.e., protecting the magnet and characterizing the quench process. Therefore, software implementations must include elements of both data acquisition and real-time controls. The architecture of the quench management software developed at Fermilab's Magnet Test Facility is described. This system consists of quench detection, quench protection, and quench characterization components that execute concurrently in a distributed system. Collaboration between the elements of quench detection, quench characterization and current control are discussed, together with a schema of distributed saving of various quench-related data. Solutions to synchronization and reliability in such a distributed quench system are also presented

  11. Space Station data management system architecture

    Science.gov (United States)

    Mallary, William E.; Whitelaw, Virginia A.

    1987-01-01

    Within the Space Station program, the Data Management System (DMS) functions in a dual role. First, it provides the hardware resources and software services which support the data processing, data communications, and data storage functions of the onboard subsystems and payloads. Second, it functions as an integrating entity which provides a common operating environment and human-machine interface for the operation and control of the orbiting Space Station systems and payloads by both the crew and the ground operators. This paper discusses the evolution and derivation of the requirements and issues which have had significant effect on the design of the Space Station DMS, describes the DMS components and services which support system and payload operations, and presents the current architectural view of the system as it exists in October 1986; one-and-a-half years into the Space Station Phase B Definition and Preliminary Design Study.

  12. Effectively Managing the Air Force Enterprise Architecture

    National Research Council Canada - National Science Library

    Sharkey, Jamie P

    2005-01-01

    The Air Force is developing and implementing an enterprise architecture to meet the Clinger-Cohen Act's requirement that all federal agencies use an architecture to guide their information technology (IT) investments...

  13. Hyper-V Replica essentials

    CERN Document Server

    Krstevski, Vangel

    2013-01-01

    …in various deployment scenarios. Hyper-V Replica Essentials is for Windows Server administrators who want to improve their system availability and speed up disaster recovery. You will need experience in Hyper-V deployment because Hyper-V Replica is built into the Hyper-V platform.

  14. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization's architectural capabilities. To support the architecting process within our industrial partner Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  15. Architecture-based Model for Preventive and Operative Crisis Management

    National Research Council Canada - National Science Library

    Jungert, Erland; Derefeldt, Gunilla; Hallberg, Jonas; Hallberg, Niklas; Hunstad, Amund; Thuren, Ronny

    2004-01-01

    .... A system that should support activities of this type must not only have a high capacity, with respect to the dataflow, but also have suitable tools for decision support. To overcome these problems, an architecture for preventive and operative crisis management is proposed. The architecture is based on models for command and control, but also for risk analysis.

  16. Creating technical heritage object replicas in a virtual environment

    Science.gov (United States)

    Egorova, Olga; Shcherbinin, Dmitry

    2016-03-01

    The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. By performing 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects, learning is supported while the replicas are preserved for future generations. Modern approaches based on the wide usage of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.

  17. Architecture for Integrated System Health Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Managing the health of vehicle, crew, and habitat systems is a primary function of flight controllers today. We propose to develop an architecture for automating...

  18. Towards a framework for managing enterprise architecture acceptance / Sonja Gilliland

    OpenAIRE

    Gilliland, Sonja

    2014-01-01

    An enterprise is a complex and changing entity, which is managed and maintained by humans. Enterprise architecture has been identified as an organisational strategy designed to assist enterprises with the understanding of complexity and the management of change. Acceptance, implementation and maintenance of enterprise architecture in organisations are complex and time-consuming. Work roles, responsibilities, common vocabulary, and buy-in are some of the cooperative human factors of stakeholde...

  19. Reference architecture of application services for personal wellbeing information management.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha

    2011-01-01

    Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.

  20. Managing the complexity of collective architectural designing

    NARCIS (Netherlands)

    Sebastian, R.

    2006-01-01

    This paper addresses the complexity of architectural designing whereby multiple designers work in close collaboration to conceive the design of an integrated building project in an urban context. Collaborative design conception has only recently been observed as a phenomenon of the built environment

  1. DOD Business Systems Modernization: Military Departments Need to Strengthen Management of Enterprise Architecture Programs

    National Research Council Canada - National Science Library

    Hite, Randolph C; Johnson, Tonia; Eagle, Timothy; Epps, Elena; Holland, Michael; Lakhmani, Neela; LaPaza, Rebecca; Le, Anh; Paintsil, Freda

    2008-01-01

    .... Our framework for managing and evaluating the status of architecture programs consists of 31 core elements related to architecture governance, content, use, and measurement that are associated...

  2. ARCHITECTURE SOFTWARE SOLUTION TO SUPPORT AND DOCUMENT MANAGEMENT QUALITY SYSTEM

    Directory of Open Access Journals (Sweden)

    Milan Eric

    2010-12-01

    Full Text Available One of the bases of the JUS ISO 9000 series of standards is quality system documentation. The architecture of the quality system documentation depends on the complexity of the business system. Establishing efficient management of quality system documentation is of great importance for the business system, both in the phase of introducing the quality system and in the further stages of its improvement. The study describes the architecture and capabilities of software solutions to support and manage quality system documentation in accordance with the requirements of the standards ISO 9001:2001, ISO 14001:2005, HACCP, etc.

  3. Experiences with Architectural Software Configuration Management in Ragnarok

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1998-01-01

    This paper describes a model, denoted architectural software configuration management, that minimises the gap between software design and configuration management by allowing developers to do configuration- and version control of the abstractions and hierarchy in a software architecture. The model emphasises traceability and reproducibility by unifying the concepts version and bound configuration. Experiences with such a model, implemented in a prototype “Ragnarok”, from three real-life, small- to medium-sized, software development projects are reported. The conclusion is that the presented model

  4. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    Full Text Available Enterprise architecture is considered as a certain object of management, providing in business a general view of the enterprise and the mutual alignment of the parts of this enterprise into a single whole, and as the discipline that arose based on this object. The architectural approach to the modeling and design of the enterprise originally arose in the field of information technology and was used to design information systems and technical infrastructure, as well as to formalize business requirements. Since the early 2000s, enterprise architecture has been increasingly used in organizational development and business transformation projects, especially if information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In the context of this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in the digital economy, where business is strongly dependent on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinarity of this subject, its generalized nature and close connection with practical experience. In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant only for individual situations. The paper proposes a simplified methodology for enterprise architecture management, which on the one hand will be comprehensible to students, and on the other hand will allow students to apply

  5. Information Architecture for Quality Management Support in Hospitals.

    Science.gov (United States)

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging, and cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a hospital, which allowed us to develop and implement the QUALITUS computer application (QUALITUS being the name of the application developed to support quality management in a hospital unit). This solution translated into significant gains for the hospital unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in, and information errors, amongst others.

  6. Fault Management Architectures and the Challenges of Providing Software Assurance

    Science.gov (United States)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    Satellite system Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most is system complexity due to a need to establish a multi-dimensional structure across hardware, software and operations. This structure is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. These architecture, implementation and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed these issues in terms of V&V for a representative set of architectures. NASA's IV&V is funded by NASA's Software Assurance Research Program (SARP) in partnership with NASA's Jet Propulsion Laboratory (JPL) to extend the work performed at the Workshop session. NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set for robustness, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This work focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures, visibility, and associated V&V/IV&V techniques provides a data set that can enable higher assurance that a satellite system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook providing dissemination across NASA, other agencies and the satellite community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the

  7. Power Management for A Distributed Wireless Health Management Architecture

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed wireless architectures for prognostics are an important enabling step in prognostic research in order to achieve feasible real-time system health...

  8. Architectural Decision Management for Digital Transformation of Products and Services

    Directory of Open Access Journals (Sweden)

    Alfred Zimmermann

    2016-04-01

    Full Text Available The digitization of our society changes the way we live, work, learn, communicate, and collaborate. The Internet of Things, Enterprise Social Networks, Adaptive Case Management, Mobility systems, Analytics for Big Data, and Cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and service-oriented enterprise architectures. Our aim is to support flexibility and agile transformations for both business domains and related information technology. The present research paper investigates mechanisms for decision analytics in the context of multi-perspective explorations of enterprise services and their digital enterprise architectures, by extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and collaborative architectural decision support. The paper's context focuses on digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on the example domain – the Internet of Things.

  9. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies have both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  10. Power-managed smart lighting using a semantic interoperability architecture

    NARCIS (Netherlands)

    Bhardwaj, S.; Syed, Aly; Ozcelebi, T.; Lukkien, J.J.

    2011-01-01

    This paper presents a power-managed smart lighting system that allows collaboration of lighting consumer electronics (CE) devices and corresponding system architectures provided by different CE suppliers. In the example scenario, the rooms of a building are categorized as low and high priority, each

  11. Power-managed smart lighting using a semantic interoperability architecture

    NARCIS (Netherlands)

    Bhardwaj, S.; Syed, Aly; Ozcelebi, T.; Lukkien, J.J.

    2011-01-01

    We present a power-managed smart lighting system that allows collaboration of Consumer Electronics (CE) lighting devices and corresponding system architectures provided by different CE suppliers. In the example scenario, the rooms of a building are categorized as low- and high-priority, each category

  12. An emergency management demonstrator using the high level architecture

    International Nuclear Information System (INIS)

    Williams, R.J.

    1996-12-01

    This paper addresses the issues of simulation interoperability within the emergency management training context. A prototype implementation in Java of a subset of the High Level Architecture (HLA) is described. The use of Web Browsers to provide graphical user interfaces to HLA is also investigated. (au)

  13. Case study B. Architectural design management using a project web

    NARCIS (Netherlands)

    DeClerck, F.; Pels, H.J.; Otter, den A.F.H.J.; Emmitt, S.; Prins, M.; Otter, den A.F.

    2009-01-01

    In this chapter, the use of a project website, and the organization of that use, are described for the design and realization of a construction project. The case concerns a complicated project with a high number of different parties involved, managed by an architectural office and having an internationally

  14. Design methods and design theory for architectural design management

    NARCIS (Netherlands)

    Achten, H.H.; Otter, den A.F.H.J.; Achten, H.H.; Pels, H.J.

    2008-01-01

    Most parties that an architectural design manager meets in daily practice are engaged to some degree with design. What these parties are actually doing in a project is contingent with the concrete design project. Additionally, each party has some stake, and may employ different strategies to solve

  15. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  16. Practitioner Perspectives on a Disaster Management Architecture

    Science.gov (United States)

    Moe, K.; Evans, J. D.

    2012-12-01

    The Committee on Earth Observing Satellites (CEOS) Working Group on Information Systems and Services (WGISS) is constructing a high-level reference model for the use of satellites, sensors, models, and associated data products from many different global data and service providers in disaster response and risk assessment. To help streamline broad, effective access to satellite information, the reference model provides structured, shared, holistic views of distributed systems and services - in effect, a common vocabulary describing the system-of-systems building blocks and how they are composed for disaster management. These views are being inferred from real-world experience, by documenting and analyzing how practitioners have gone about using or providing satellite data to manage real disaster events or to assess or mitigate hazard risks. Crucial findings and insights come from case studies of three kinds of experience: - Disaster response and recovery (such as the 2008 Sichuan/Wenchuan earthquake in China; and the 2011 Tohoku earthquake and tsunami in Japan); - Technology pilot projects (such as NASA's Flood Sensor Web pilot in Namibia, or the interagency Virtual Mission Operation Center); - Information brokers (such as the International Charter: Space and Major Disasters, or the U.K.-based Disaster Management Constellation). Each of these experiences sheds light on the scope and stakeholders of disaster management; the information requirements for various disaster types and phases; and the services needed for effective access to information by a variety of users. They also highlight needs and gaps in the supply of satellite information for disaster management. One need stands out: rapid and effective access to complex data from multiple sources, across inter-organizational boundaries. This is the near-real-time challenge writ large: gaining access to satellite data resources from multiple organizationally distant and geographically disperse sources, to meet an

  17. Intelligent web data management software architectures and emerging technologies

    CERN Document Server

    Ma, Kun; Yang, Bo; Sun, Runyuan

    2016-01-01

    This book presents some of the emerging techniques and technologies used to handle Web data management. The authors present novel software architectures and emerging technologies and then validate them using experimental data and real-world applications. The contents of this book are focused on four popular thematic categories of intelligent Web data management: cloud computing, social networking, monitoring and literature management. The volume will be a valuable reference for researchers, students and practitioners in the field of Web data management, cloud computing, and social networks using advanced intelligence tools.

  18. Loss Database Architecture for Disaster Risk Management

    OpenAIRE

    RIOS DIAZ FRANCISCO; MARIN FERRER MONTSERRAT

    2018-01-01

    The reformed Union civil protection legislation (Decision on a Union Civil Protection Mechanism), which entered into force on 1 January 2014, is paving the way for more resilient communities by including key actions related to disaster prevention such as developing national risk assessments and the refinement of risk management planning. Under the Decision, Member States agreed to “develop risk assessments at national or appropriate sub-national level and make available to the Commission a s...

  19. An architecture model for multiple disease management information systems.

    Science.gov (United States)

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support their daily practice and to optimize an inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management, and applications have been developed; however, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances the reusability and time saving of information system development. The proposed architecture model was successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow was improved. The overall user response is positive, and the perceived support during the daily workflow is high. The system empowers the case managers with better information and leads to better decision making.
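
    A minimal sketch of the reuse idea behind such an architecture model: a program-independent core (enrolment, recorded measures, follow-up) is parameterised by a disease-specific configuration, so a new program does not require a new system. The configuration fields are illustrative assumptions, not the paper's design.

        # Sketch of a reusable disease-management core parameterised per program;
        # the configuration fields and follow-up rule are illustrative.
        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class ProgramConfig:
            disease: str
            tracked_measures: List[str]   # values the case manager follows
            follow_up_days: int           # default recall interval

        @dataclass
        class DiseaseManagementProgram:
            config: ProgramConfig
            patients: Dict[str, Dict] = field(default_factory=dict)

            def enroll(self, patient_id: str) -> None:
                self.patients[patient_id] = {m: None for m in self.config.tracked_measures}

            def record(self, patient_id: str, measure: str, value: float) -> None:
                if measure not in self.config.tracked_measures:
                    raise ValueError(f"{measure} not tracked for {self.config.disease}")
                self.patients[patient_id][measure] = value

            def due_for_follow_up(self) -> List[str]:
                # Placeholder rule: anyone with a missing measure is due.
                return [p for p, m in self.patients.items() if None in m.values()]

        # Two programs share the same core; only the configuration differs.
        diabetes = DiseaseManagementProgram(ProgramConfig("diabetes", ["HbA1c", "LDL"], 90))
        ckd = DiseaseManagementProgram(ProgramConfig("chronic kidney disease", ["eGFR"], 60))
        diabetes.enroll("p001")
        diabetes.record("p001", "HbA1c", 7.2)
        print(diabetes.due_for_follow_up())   # ['p001'] because LDL is still missing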

  20. An Architecture for Cross-Cloud System Management

    Science.gov (United States)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources, utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
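
    A short sketch of the homogenising idea described above: one management interface, with per-provider adapters hiding the interface differences. The provider names and methods are illustrative; the adapters only print and do not call any real cloud SDK.

        # Sketch of a uniform management interface over heterogeneous cloud
        # providers; the adapters below only print, they do not call real SDKs.
        from abc import ABC, abstractmethod

        class ComputeAdapter(ABC):
            @abstractmethod
            def start_instance(self, image: str) -> str: ...
            @abstractmethod
            def stop_instance(self, instance_id: str) -> None: ...

        class ProviderAAdapter(ComputeAdapter):
            def start_instance(self, image: str) -> str:
                print(f"[provider-A] booting {image}")
                return "a-123"
            def stop_instance(self, instance_id: str) -> None:
                print(f"[provider-A] terminating {instance_id}")

        class ProviderBAdapter(ComputeAdapter):
            def start_instance(self, image: str) -> str:
                print(f"[provider-B] launch request for {image}")
                return "b-987"
            def stop_instance(self, instance_id: str) -> None:
                print(f"[provider-B] shutdown of {instance_id}")

        class CrossCloudManager:
            def __init__(self, adapters: dict):
                self.adapters = adapters
            def start(self, provider: str, image: str) -> str:
                return self.adapters[provider].start_instance(image)
            def stop(self, provider: str, instance_id: str) -> None:
                self.adapters[provider].stop_instance(instance_id)

        mgr = CrossCloudManager({"A": ProviderAAdapter(), "B": ProviderBAdapter()})
        iid = mgr.start("A", "ubuntu-22.04")
        mgr.stop("A", iid)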

  1. Architecture

    OpenAIRE

    Clear, Nic

    2014-01-01

    When discussing science fiction’s relationship with architecture, the usual practice is to look at the architecture “in” science fiction—in particular, the architecture in SF films (see Kuhn 75-143) since the spaces of literary SF present obvious difficulties as they have to be imagined. In this essay, that relationship will be reversed: I will instead discuss science fiction “in” architecture, mapping out a number of architectural movements and projects that can be viewed explicitly as scien...

  2. Conceptual Architecture of Building Energy Management Open Source Software (BEMOSS)

    Energy Technology Data Exchange (ETDEWEB)

    Khamphanchai, Warodom; Saha, Avijit; Rathinavel, Kruthika; Kuzlu, Murat; Pipattanasomporn, Manisa; Rahman, Saifur; Akyol, Bora A.; Haack, Jereme N.

    2014-12-01

    The objective of this paper is to present a conceptual architecture of a Building Energy Management Open Source Software (BEMOSS) platform. The proposed BEMOSS platform is expected to improve sensing and control of equipment in small- and medium-sized buildings, reduce energy consumption and help implement demand response (DR). It aims to offer: scalability, robustness, plug and play, open protocol, interoperability, cost-effectiveness, as well as local and remote monitoring. In this paper, four essential layers of BEMOSS software architecture -- namely User Interface, Application and Data Management, Operating System and Framework, and Connectivity layers -- are presented. A laboratory test bed to demonstrate the functionality of BEMOSS located at the Advanced Research Institute of Virginia Tech is also briefly described.

  3. Proposal of a referential Enterprise Architecture management framework for companies

    Directory of Open Access Journals (Sweden)

    César Esquetini Cáceres

    2014-12-01

    Full Text Available (Received: 2014/11/26 - Accepted: 2014/12/17) Enterprise Architecture (EA) is conceived nowadays as an essential management activity to visualize and evaluate the future direction of a company. The objective of this paper is to make a literature review on EA to evaluate its role as a management tool. It is also explained how EA can fulfill two fundamental purposes: first, as a tool for assessing the current situation (self-assessment) of an organization; second, as a tool to model and simulate future scenarios that allow better decision making for restructuring and the development of improvement plans. Furthermore, an analysis is made of the integration possibilities of EA with other business management methodologies, such as the balanced scorecard (BSC) and the model of the European Foundation for Quality Management (EFQM). As a result, a management framework is presented, which includes the required elements to achieve excellence and quality standards in organizations.

  4. The Architecture Improvement Method: cost management and systematic learning about strategic product architectures

    NARCIS (Netherlands)

    de Weerd-Nederhof, Petronella C.; Wouters, Marc; Teuns, Steven J.A.; Hissel, Paul H.

    2007-01-01

    The architecture improvement method (AIM) is a method for multidisciplinary product architecture improvement, addressing uncertainty and complexity and incorporating feedback loops, facilitating trade-off decision making during the architecture creation process. The research reported in this paper

  5. Architecture Framework for Fault Management Assessment and Design (AFFMAD), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Architecture Framework for Fault Management Assessment And Design(AFFMAD) provides Fault Management (FM) trade space exploration and rigorous performance constraint...

  6. A Hybrid Power Management (HPM) Based Vehicle Architecture

    Science.gov (United States)

    Eichenberg, Dennis J.

    2011-01-01

    Society desires vehicles with reduced fuel consumption and reduced emissions. This presents a challenge and an opportunity for industry and the government. The NASA John H. Glenn Research Center (GRC) has developed a Hybrid Power Management (HPM) based vehicle architecture for space and terrestrial vehicles. GRC's Electrical and Electromagnetics Branch of the Avionics and Electrical Systems Division initiated the HPM Program for the GRC Technology Transfer and Partnership Office. HPM is the innovative integration of diverse, state-of-the-art power devices in an optimal configuration for space and terrestrial applications. The appropriate application and control of the various power devices significantly improves overall system performance and efficiency. The basic vehicle architecture consists of a primary power source, and possibly other power sources, providing all power to a common energy storage system, which is used to power the drive motors and vehicle accessory systems, as well as provide power as an emergency power system. Each component is independent, permitting it to be optimized for its intended purpose. This flexible vehicle architecture can be applied to all vehicles to considerably improve system efficiency, reliability, safety, security, and performance. This unique vehicle architecture has the potential to alleviate global energy concerns, improve the environment, stimulate the economy, and enable new missions.

  7. The architecture of the management system of complex steganographic information

    Science.gov (United States)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic support of the system, classic methods of steganography are used to embed information. Methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide its service as a web service over the Internet. It is meant to process streams of multimedia data coming from many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; prevention of information leakage caused by insiders.
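
    The abstract mentions that classic steganographic methods are used for embedding. As a minimal, self-contained illustration of one such classic method (least-significant-bit embedding; not necessarily the algorithms used in the described system):

```python
def embed_lsb(pixels: list[int], payload: bytes) -> list[int]:
    """Hide payload bits in the least significant bit of each pixel value."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = pixels[:]
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego


def extract_lsb(pixels: list[int], n_bytes: int) -> bytes:
    """Recover n_bytes hidden by embed_lsb."""
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)


if __name__ == "__main__":
    cover = list(range(64))            # stand-in for grayscale pixel values
    stego = embed_lsb(cover, b"key")
    assert extract_lsb(stego, 3) == b"key"
```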

  8. System Architecture and Mobility Management for Mobile Immersive Communications

    Directory of Open Access Journals (Sweden)

    Mehran Dowlatshahi

    2007-01-01

    Full Text Available We propose a system design for delivery of immersive communications to mobile wireless devices based on a distributed proxy model. It is demonstrated that this architecture addresses key technical challenges for the delivery of these services, that is, constraints on link capacity and power consumption in mobile devices. However, additional complexity is introduced with respect to application layer mobility management. The paper proposes three possible methods for updating proxy assignments in response to mobility management and compares the performance of these methods.

  9. An Agile Enterprise Regulation Architecture for Health Information Security Management

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-01-01

    Abstract Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  10. An agile enterprise regulation architecture for health information security management.

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital.

  11. Enhanced risk management by an emerging multi-agent architecture

    Science.gov (United States)

    Lin, Sin-Jin; Hsu, Ming-Fu

    2014-07-01

    Classification in imbalanced datasets has attracted much attention from researchers in the field of machine learning. Most existing techniques tend not to perform well on minority class instances when the dataset is highly skewed because they focus on minimising the forecasting error without considering the relative distribution of each class. This investigation proposes an emerging multi-agent architecture, grounded in cooperative learning, to solve the class-imbalanced classification problem. Additionally, this study deals further with the obscure nature of the multi-agent architecture and expresses comprehensive rules for auditors. The results from this study indicate that the presented model performs satisfactorily in risk management and is able to tackle a highly class-imbalanced dataset comparatively well. Furthermore, the knowledge-visualisation process, supported by real examples, can assist both internal and external auditors who must allocate limited detection resources; they can take the rules as roadmaps to modify the auditing programme.
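
    As a much-simplified sketch of cooperating learners tackling class imbalance (an ensemble of "agents", each trained on a balanced resample, then voting), and explicitly not the paper's actual multi-agent architecture:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def train_balanced_agents(X, y, n_agents=5, seed=0):
    """Each 'agent' learns on a balanced resample of a skewed dataset."""
    rng = np.random.default_rng(seed)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    agents = []
    for _ in range(n_agents):
        picked = rng.choice(majority, size=len(minority), replace=False)
        idx = np.concatenate([minority, picked])
        agents.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))
    return agents


def vote(agents, X):
    """Simple majority vote across agents."""
    preds = np.stack([a.predict(X) for a in agents])
    return (preds.mean(axis=0) >= 0.5).astype(int)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 4))
    y = (rng.random(500) < 0.05).astype(int)   # roughly 5% minority class
    X[y == 1] += 1.5                           # shift the minority cluster
    agents = train_balanced_agents(X, y)
    print("predicted minority fraction:", vote(agents, X).mean())
```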

  12. Managing Innovation and Business Development Using Enterprise Architecture

    DEFF Research Database (Denmark)

    Tambo, Torben; Bækgaard, Lars

    2011-01-01

    ... of well planned technological changes. Management of technology (MOT) addresses identification, selection, (long term) planning, designing, implementation and operation of technology based business development. Information Technology (IT) is a key enabler for a vast range of contemporary corporate technologies. In management of IT, it has become increasingly popular to use Enterprise Architecture (EA) as a method supported by a series of formal frameworks. EA maps artifacts and motives against the business strategy. In this paper, MOT and EA are reviewed for their mutual potentials and issues. Two case studies illustrate how enterprises can make major changes derived from business strategy observing both MOT and EA. This suggests a combined view inspired from the IT business' dedication to EA, using EA's formalisms and the management orientation of MOT to improve understanding of technological...

  13. A Risk Management Architecture for Emergency Integrated Aircraft Control

    Science.gov (United States)

    McGlynn, Gregory E.; Litt, Jonathan S.; Lemon, Kimberly A.; Csank, Jeffrey T.

    2011-01-01

    Enhanced engine operation--operation that is beyond normal limits--has the potential to improve the adaptability and safety of aircraft in emergency situations. Intelligent use of enhanced engine operation to improve the handling qualities of the aircraft requires sophisticated risk estimation techniques and a risk management system that spans the flight and propulsion controllers. In this paper, an architecture that weighs the risks of the emergency and of possible engine performance enhancements to reduce overall risk to the aircraft is described. Two examples of emergency situations are presented to demonstrate the interaction between the flight and propulsion controllers to facilitate the enhanced operation.

  14. System Architecture Modeling for Technology Portfolio Management using ATLAS

    Science.gov (United States)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system level requirements. Second, the modeler identifies technologies of interest and the impact they may have on the SEA. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.

  15. Data Sets Replicas Placements Strategy from Cost-Effective View in the Cloud

    Directory of Open Access Journals (Sweden)

    Xiuguo Wu

    2016-01-01

    Full Text Available Replication technology is commonly used to improve data availability and reduce data access latency in the cloud storage system by providing users with different replicas of the same service. Most current approaches largely focus on system performance improvement, neglecting management cost in deciding the number of replicas and their storage locations, which causes a great financial burden for cloud users because the cost of replica storage and consistency maintenance may lead to high overhead as the number of new replicas increases in a pay-as-you-go paradigm. In this paper, towards achieving the approximate minimum data sets management cost benchmark in a practical manner, we propose a replicas placement strategy from a cost-effective view with the premise that system performance meets requirements. Firstly, we design data sets management cost models, including storage cost and transfer cost. Secondly, we use the access frequency and the average response time to decide which data set should be replicated. Then, a method of calculating the number of replicas and their storage locations with minimum management cost is proposed based on a location problem graph. Both the theoretical analysis and simulations have shown that the proposed strategy offers the benefits of lower management cost with fewer replicas.
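
    A toy version of the cost trade-off described above, storage cost for every replica against transfer cost for remote reads, can be written as follows; the prices and the locality assumption are illustrative placeholders, not the paper's cost benchmark:

```python
def management_cost(n_replicas: int, size_gb: float, reads_per_month: float,
                    storage_price=0.02, transfer_price=0.09):
    """Monthly cost model: storage for every replica plus transfer for remote reads.

    Prices are illustrative placeholders, not values from the paper.
    """
    storage = n_replicas * size_gb * storage_price
    # Crude assumption: more replicas means more reads are served locally.
    remote_fraction = max(0.0, 1.0 - 0.3 * n_replicas)
    transfer = reads_per_month * size_gb * remote_fraction * transfer_price
    return storage + transfer


def choose_replica_count(size_gb, reads_per_month, max_replicas=5):
    """Pick the replica count with the lowest modelled management cost."""
    costs = {n: management_cost(n, size_gb, reads_per_month)
             for n in range(1, max_replicas + 1)}
    return min(costs, key=costs.get), costs


if __name__ == "__main__":
    best, costs = choose_replica_count(size_gb=50, reads_per_month=40)
    print("cheapest replica count:", best, costs)
```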

  16. Novel pervasive scenarios for home management: the Butlers architecture.

    Science.gov (United States)

    Denti, Enrico

    2014-01-01

    Many efforts today aim at energy saving, promoting the user's awareness and virtuous behavior from a sustainability perspective. Our houses, appliances, energy meters and devices are becoming smarter and connected, domotics is increasing possibilities in house automation and control, and ambient intelligence and assisted living are bringing attention onto people's needs from different viewpoints. Our assumption is that considering these aspects together allows for novel intriguing possibilities. To this end, in this paper we combine home energy management with domotics, coordination technologies, intelligent agents, ambient intelligence, ubiquitous technologies and gamification to devise novel scenarios, where energy monitoring and management is just the basic brick of a much wider and comprehensive home management system. The aim is to control home appliances well beyond energy consumption, combining home comfort, appliance scheduling, safety constraints, etc. with dynamically-changeable users' preferences, goals and priorities. At the same time, usability and attractiveness are seen as key success factors: so, the intriguing technologies available in most houses and smart devices are exploited to make the system configuration and use simpler, entertaining and attractive for users. These aspects are also integrated with ubiquitous and pervasive technologies, geo-localization, social networks and communities to provide enhanced functionalities and support smarter application scenarios, thereby further strengthening technology acceptance and diffusion. Accordingly, we first analyse the system requirements and define a reference multi-layer architectural model - the Butlers architecture - that specifies seven layers of functionalities, correlating the requirements, the corresponding technologies and the consequent value-added for users in each layer. Then, we outline a set of notable scenarios of increasing functionalities and complexity, discuss the structure of the

  17. Validation of the replica trick for simple models

    Science.gov (United States)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
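
    For reference, the replica trick the abstract examines rests on the standard identity (textbook form, not a result specific to this paper):

```latex
% Quenched averages of ln Z are obtained from integer moments of Z
% by analytic continuation of the replica number n to zero.
\[
  \ln Z \;=\; \lim_{n \to 0} \frac{Z^{n} - 1}{n},
  \qquad
  \bigl[\ln Z\bigr]_{\mathrm{dis}}
  \;=\; \lim_{n \to 0} \frac{\bigl[Z^{n}\bigr]_{\mathrm{dis}} - 1}{n}.
\]
```

    Here $[\cdot]_{\mathrm{dis}}$ denotes the average over the disorder; the paper's question is precisely when the continuation from integer $n$ to $n \to 0$ is legitimate.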

  18. Replica Fourier Transform: Properties and applications

    International Nuclear Information System (INIS)

    Crisanti, A.; De Dominicis, C.

    2015-01-01

    The Replica Fourier Transform is the generalization of the discrete Fourier Transform to quantities defined on an ultrametric tree. It finds use in conjunction with the replica method used to study the thermodynamic properties of disordered systems such as spin glasses. Its definition is presented in a systematic and simple form and its use illustrated with some representative examples. In particular we give a detailed discussion of the diagonalization in the Replica Fourier Space of the Hessian matrix of the Gaussian fluctuations about the mean field saddle point of spin glass theory. The general results are finally discussed for a generic spherical spin glass model, where the Hessian can be computed analytically

  19. Design of management information system for nuclear industry architectural project costs

    International Nuclear Information System (INIS)

    Zhang Xingzhi; Li Wei

    1996-01-01

    Management Information System (MIS) for nuclear industry architectural projects is analysed and designed in detail based on the quota management and engineering budget management of the nuclear industry, with respect to the practice of the Qinshan Second Phase 2 x 600 MW Project

  20. The GENiC architecture for integrated data centre energy management

    NARCIS (Netherlands)

    Pesch, D.; McGibney, A.; Sobonski, P.; Rea, S.; Scherer, Th.; Chen, L.; Engbersen, T.; Mehta, D.; O'Sullivan, B.; Pages, E.; Townley, J.; Kasinathan, Dh.; Torrens, J.I.; Zavrel, V.; Hensen, J.L.M.

    2015-01-01

    We present an architecture for integrated data centre energy management developed in the EC funded GENiC project. The architecture was devised to create a platform that can integrate functions for workload management, cooling, power management and control of heat recovery for future, highly

  1. Distributed Prognostics and Health Management with a Wireless Network Architecture

    Science.gov (United States)

    Goebel, Kai; Saha, Sankalita; Sha, Bhaskar

    2013-01-01

    A heterogeneous set of system components monitored by a varied suite of sensors and a particle-filtering (PF) framework, with the power and the flexibility to adapt to the different diagnostic and prognostic needs, has been developed. Both the diagnostic and prognostic tasks are formulated as a particle-filtering problem in order to explicitly represent and manage uncertainties in state estimation and remaining life estimation. Current state-of-the-art prognostic health management (PHM) systems are mostly centralized in nature, where all the processing is reliant on a single processor. This can lead to a loss in functionality in case of a crash of the central processor or monitor. Furthermore, with increases in the volume of sensor data as well as the complexity of algorithms, traditional centralized systems become for a number of reasons somewhat ungainly for successful deployment, and efficient distributed architectures can be more beneficial. The distributed health management architecture is comprised of a network of smart sensor devices. These devices monitor the health of various subsystems or modules. They perform diagnostics operations and trigger prognostics operations based on user-defined thresholds and rules. The sensor devices, called computing elements (CEs), consist of a sensor, or set of sensors, and a communication device (i.e., a wireless transceiver beside an embedded processing element). The CE runs in either a diagnostic or prognostic operating mode. The diagnostic mode is the default mode where a CE monitors a given subsystem or component through a low-weight diagnostic algorithm. If a CE detects a critical condition during monitoring, it raises a flag. Depending on availability of resources, a networked local cluster of CEs is formed that then carries out prognostics and fault mitigation by efficient distribution of the tasks. It should be noted that the CEs are expected not to suspend their previous tasks in the prognostic mode. When the
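
    A bare-bones sketch of a computing element (CE) that runs in diagnostic mode and switches to prognostic mode when a user-defined threshold rule fires, with a toy extrapolation standing in for the particle filter described above (names and rules are illustrative, not taken from the reported system):

```python
from dataclasses import dataclass, field


@dataclass
class ComputingElement:
    """Minimal sketch of a smart sensor node with two operating modes."""
    name: str
    threshold: float                 # user-defined rule for raising a flag
    mode: str = "diagnostic"
    history: list = field(default_factory=list)

    def ingest(self, reading: float) -> None:
        self.history.append(reading)
        if self.mode == "diagnostic" and reading > self.threshold:
            self.mode = "prognostic"   # trigger deeper analysis on this node
            print(f"{self.name}: critical condition flagged, entering prognostic mode")

    def remaining_life_estimate(self) -> float:
        """Toy linear extrapolation standing in for the particle filter."""
        if len(self.history) < 2 or self.history[-1] <= self.history[-2]:
            return float("inf")
        slope = self.history[-1] - self.history[-2]
        return (2 * self.threshold - self.history[-1]) / slope


if __name__ == "__main__":
    ce = ComputingElement("bearing-temp", threshold=80.0)
    for reading in [70, 75, 81, 84]:
        ce.ingest(reading)
    print("cycles to failure (rough):", ce.remaining_life_estimate())
```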

  2. An Open Distributed Architecture for Sensor Networks for Risk Management

    Directory of Open Access Journals (Sweden)

    Ralf Denzer

    2008-03-01

    Full Text Available Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word 'sensors' covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth's surface to instruments situated in deep boreholes and on the sea floor, providing highly-detailed point-based information from single sites. Data from such sensors is used in all stages of risk management, from hazard, vulnerability and risk assessment in the pre-event phase, information to provide on-site help during the crisis phase, through to data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in many ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, their data is often difficult to access due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. Therefore, the current situation leads to a lack of efficiency and limited use of the available data that has an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects: 'Open Architecture and Spatial Data

  3. Enterprise Architecture : Management tool and blueprint for the organization

    NARCIS (Netherlands)

    Jonkers, Henk; Lankhorst, Marc M.; ter Doest, Hugo W.L.; Arbab, Farhad; Bosma, Hans; Wieringa, Roelf J.

    2006-01-01

    This is an editorial to a special issue of ISF on enterprise architecture. We define the concept of enterprise architecture, motivate its importance, and then introduce the papers in this special issue.

  4. Smart Traffic Management Protocol Based on VANET architecture

    Directory of Open Access Journals (Sweden)

    Amilcare Francesco Santamaria

    2014-01-01

    Full Text Available Nowadays one of the hottest themes in wireless environments research is the application of the newest technologies to road safety problems and traffic management, exploiting the VANET architecture. In this work, a novel protocol that aims to achieve better traffic management is proposed. The overall system is able to reduce the traffic level inside the city by exploiting inter-communication among vehicles and support infrastructures, also known as V2V and V2I communications. We design a network protocol called STMP that takes advantage of the IEEE 802.11p standard. On each road several sensor systems are placed and they are responsible for monitoring. Gathered data are spread in the network exploiting ad-hoc protocol messages. The increasing knowledge about environment conditions makes it possible to take preventive actions. Moreover, having real-time monitoring of the lanes makes it possible to reveal road and city-block congestion in a shorter time. An entity external to the VANET is responsible for managing traffic and rearranging traffic along the lanes of the city, avoiding huge traffic levels.
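
    To make the idea concrete, a hypothetical congestion report of the kind such a protocol might broadcast, together with a threshold rule for flagging congestion, could be sketched as follows (field names and thresholds are invented for illustration; this is not the actual STMP message format):

```python
from dataclasses import dataclass
import json


@dataclass
class CongestionReport:
    """Hypothetical STMP-style message carried over IEEE 802.11p broadcasts."""
    road_id: str
    vehicles_per_km: float
    avg_speed_kmh: float
    timestamp: int

    def to_payload(self) -> bytes:
        return json.dumps(self.__dict__).encode()


def is_congested(report: CongestionReport,
                 density_limit: float = 60.0, speed_floor: float = 15.0) -> bool:
    """Threshold rule a roadside unit might apply before alerting the traffic manager."""
    return (report.vehicles_per_km > density_limit
            and report.avg_speed_kmh < speed_floor)


if __name__ == "__main__":
    report = CongestionReport("lane-42", vehicles_per_km=75.0,
                              avg_speed_kmh=9.5, timestamp=1700000000)
    print(is_congested(report), report.to_payload())
```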

  5. Application of Data Architecture Model in Enterprise Management

    Directory of Open Access Journals (Sweden)

    Shi Song

    2017-01-01

    Full Text Available Today, in an era of rapid information development and high-speed expansion of data volume, it is difficult for previous systems to support communication, sharing and integration. In order to integrate data resources, eliminate “information islands” and build enterprise development blueprints, people gradually realize the importance of top-level design. Many enterprises have established their own top-level enterprise architecture design for their own development, and the data architecture model at its core is also reflected differently in different industries according to their development. This paper mainly studies the data architecture model and expounds the role of the data architecture model and its relationships.

  6. Concepts and diagram elements for architectural knowledge management

    NARCIS (Netherlands)

    Orlic, B.; Mak, R.H.; David, I.; Lukkien, J.J.

    2011-01-01

    Capturing architectural knowledge is very important for the evolution of software products. There is increasing awareness that an essential part of this knowledge is in fact the very process of architectural reasoning and decision making, and not just its end results. Therefore, a conceptual

  7. Using Enterprise Architecture for the Alignment of Information Systems in Supply Chain Management

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    Using information systems in supply chain management (SCM) has become commonplace, and therefore architectural issues are part of the agenda for this domain. This article uses three perspectives on enterprise architecture (EA) in the supply chain: The "correlation view," the "remote view...

  8. A Formally Verified Decentralized Key Management Architecture for Wireless Sensor Networks

    NARCIS (Netherlands)

    Law, Y.W.; Corin, R.J.; Etalle, Sandro; Hartel, Pieter H.

    We present a decentralized key management architecture for wireless sensor networks, covering the aspects of key deployment, key refreshment and key establishment. Our architecture is based on a clear set of assumptions and guidelines. Balance between security and energy consumption is achieved by

  9. IN QUEST OF TOTAL QUALITY MANAGEMENT PRINCIPLES IN ARCHITECTURAL DESIGN SERVICES: EVIDENCE FROM TURKEY

    Directory of Open Access Journals (Sweden)

    Umut Durmus

    2010-12-01

    Full Text Available Proposal: Architectural design companies increasingly recognize that time spent on management is not at the expense of their production and there are always better ways to organize business. Although architects have long placed a traditional emphasis on quality, quality management is still a new concept for the majority of architectural design companies, which have to organize relatively more complicated operations nowadays to meet their clients' expectations. This study aims to understand how architectural design companies define quality and explores the extent to which Total Quality Management (TQM) principles like continual improvement, employee involvement, customer satisfaction and others can be pertinent in these companies. Adopting a qualitative research strategy, the authors interviewed the owner-managers of 10 widely-recognized architectural design companies of different sizes in Istanbul. The results from the content analysis of semi-structured interview data suggest that (i) TQM principles cannot be directly applied in architectural design companies without an appropriate translation; (ii) special characteristics of design services are important to explain quality-related perceptions of owner-managers; (iii) the owner-managers feel the pressure from the changing internal and external environmental conditions, however, few of them adopt a systematic and documented approach to quality management. Architectural design offices which aim to establish a quality management system can benefit from this study to understand potential problem areas on their road.

  10. Cost management and cross-functional communication through product architectures

    NARCIS (Netherlands)

    Zwerink, Ruud; Wouters, Marc; Hissel, Paul; Kerssens-van Drongelen, I.C.

    2007-01-01

    Product architecture decisions regarding, for example, product modularity, component commonality, and design re-use, are important for balancing costs, responsiveness, quality, and other important business objectives. Firms are challenged with complex tradeoffs between competing design priorities,

  11. Developing intelligent transportation systems using the national ITS architecture: an executive edition for senior transportation managers

    Science.gov (United States)

    1998-02-01

    This document has been produced to provide senior transportation managers of state and local departments of transportation with practical guidance for deploying Intelligent Transportation Systems (ITS) consistent with the National ITS Architecture. T...

  12. Fiber-wireless convergence in next-generation communication networks systems, architectures, and management

    CERN Document Server

    Chang, Gee-Kung; Ellinas, Georgios

    2017-01-01

    This book investigates new enabling technologies for Fi-Wi convergence. The editors discuss Fi-Wi technologies at the three major network levels involved in the path towards convergence: system level, network architecture level, and network management level. The main topics will be: a. At system level: Radio over Fiber (digitalized vs. analogic, standardization, E-band and beyond) and 5G wireless technologies; b. Network architecture level: NGPON, WDM-PON, BBU Hotelling, Cloud Radio Access Networks (C-RANs), HetNets. c. Network management level: SDN for convergence, Next-generation Point-of-Presence, Wi-Fi LTE Handover, Cooperative MultiPoint. • Addresses the Fi-Wi convergence issues at three different levels, namely at the system level, network architecture level, and network management level • Provides approaches in communication systems, network architecture, and management that are expected to steer the evolution towards fiber-wireless convergence • Contributions from leading experts in the field of...

  13. An ODMG-compatible testbed architecture for scalable management and analysis of physics data

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.

    1997-01-01

    This paper describes a testbed architecture for the investigation and development of scalable approaches to the management and analysis of massive amounts of high energy physics data. The architecture has two components: an interface layer that is compliant with a substantial subset of the ODMG-93 Version 1.2 specification, and a lightweight object persistence manager that provides flexible storage and retrieval services on a variety of single- and multi-level storage architectures, and on a range of parallel and distributed computing platforms

  14. A Layered Software Architecture for the Management of a Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Domenico CONSOLI

    2011-01-01

    Full Text Available In this paper we describe a layered software architecture for the management of a manufacturing company that intensively uses computer technology. Application tools, both new and legacy (after updating), operate in the context of an open, web-oriented architecture. The software architecture enables the integration and interoperability among all tools that support business processes. Manufacturing Execution System and Text Mining tools are excellent interfaces, the former both for internal production and management processes and the latter for external processes coming from the market. In this way, it is possible to implement a computer-integrated factory, flexible and agile, that immediately responds to customer requirements.

  15. The architectural evaluation of buildings’ indices in explosion crisis management

    Directory of Open Access Journals (Sweden)

    Mahdi Bitarafan

    2016-12-01

    Full Text Available Identifying probable damage plays an important role in preparing for encountering and resisting the negative effects of martial attacks on urban areas. The ultimate goal of this study was to identify facilities and solutions for immunizing buildings against martial attacks and resisting explosion effects. An explosion and its ensuing waves, caused by bombardment, will damage buildings and cause difficulties. So, defining indices to identify the architectural vulnerability of buildings in an explosion is needed. The basic indices for evaluating blast-resistant architectural spaces were identified in this study using library resources. The proposed indices were extracted through interviews with architectural and explosives experts. This study also applied a group decision-making method based on a pairwise comparison model, and then the necessity degree of each index was calculated. Finally, the preferences and ultimate weights of the indices were determined.

  16. Dynamic information architecture system (DIAS) : multiple model simulation management

    International Nuclear Information System (INIS)

    Simunich, K. L.; Sydelko, P.; Dolph, J.; Christiansen, J.

    2002-01-01

    Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application contexts. The modeling domain of a specific DIAS-based simulation is determined by (1) software Entity (domain-specific) objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. In DIAS, models communicate only with Entity objects, never with each other. Each Entity object has a number of Parameter and Aspect (of behavior) objects associated with it. The Parameter objects contain the state properties of the Entity object. The Aspect objects represent the behaviors of the Entity object and how it interacts with other objects. DIAS extends the "Object" paradigm by abstraction of the object's dynamic behaviors, separating the "WHAT" from the "HOW." DIAS object class definitions contain an abstract description of the various aspects of the object's behavior (the WHAT), but no implementation details (the HOW). Separate DIAS models/applications carry the implementation of object behaviors (the HOW). Any model deemed appropriate, including existing legacy-type models written in other languages, can drive entity object behavior. The DIAS design promotes plug-and-play of alternative models, with minimal recoding of existing applications. The DIAS Context Builder object builds a construct or scenario for the simulation, based on developer specification and user inputs. Because DIAS is a discrete event simulation system, there is a Simulation Manager object with which all events are processed. Any class that registers to receive events must implement an event handler (method) to process the event during execution. Event handlers can schedule other events; create or remove Entities from the
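
    The WHAT/HOW separation described above can be illustrated with a toy sketch: an Entity holds Parameters and exposes named Aspects, while interchangeable models supply the implementation. This is only a schematic analogy; DIAS itself defines its own object framework and class hierarchy:

```python
class Entity:
    """Domain object: holds state (parameters) and named aspects of behavior."""

    def __init__(self, name, parameters):
        self.name = name
        self.parameters = dict(parameters)   # the WHAT of the entity's state
        self._aspects = {}                   # aspect name -> model implementing it

    def register_model(self, aspect, model):
        """Plug in any model (legacy or new) that implements this aspect's HOW."""
        self._aspects[aspect] = model

    def behave(self, aspect):
        self._aspects[aspect](self)          # models talk only to the Entity


# Two interchangeable "models" for the same aspect, mimicking plug-and-play.
def simple_evaporation(entity):
    entity.parameters["water_mm"] *= 0.95


def calibrated_evaporation(entity):
    entity.parameters["water_mm"] -= min(2.0, entity.parameters["water_mm"])


if __name__ == "__main__":
    watershed = Entity("watershed", {"water_mm": 40.0})
    watershed.register_model("evaporation", simple_evaporation)
    watershed.behave("evaporation")
    watershed.register_model("evaporation", calibrated_evaporation)  # swap the HOW
    watershed.behave("evaporation")
    print(watershed.parameters)
```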

  17. Dynamic information architecture system (DIAS) : multiple model simulation management.

    Energy Technology Data Exchange (ETDEWEB)

    Simunich, K. L.; Sydelko, P.; Dolph, J.; Christiansen, J.

    2002-05-13

    Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application contexts. The modeling domain of a specific DIAS-based simulation is determined by (1) software Entity (domain-specific) objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. In DIAS, models communicate only with Entity objects, never with each other. Each Entity object has a number of Parameter and Aspect (of behavior) objects associated with it. The Parameter objects contain the state properties of the Entity object. The Aspect objects represent the behaviors of the Entity object and how it interacts with other objects. DIAS extends the "Object" paradigm by abstraction of the object's dynamic behaviors, separating the "WHAT" from the "HOW." DIAS object class definitions contain an abstract description of the various aspects of the object's behavior (the WHAT), but no implementation details (the HOW). Separate DIAS models/applications carry the implementation of object behaviors (the HOW). Any model deemed appropriate, including existing legacy-type models written in other languages, can drive entity object behavior. The DIAS design promotes plug-and-play of alternative models, with minimal recoding of existing applications. The DIAS Context Builder object builds a construct or scenario for the simulation, based on developer specification and user inputs. Because DIAS is a discrete event simulation system, there is a Simulation Manager object with which all events are processed. Any class that registers to receive events must implement an event handler (method) to process the event during execution. Event handlers

  18. Fast Optimal Replica Placement with Exhaustive Search Using Dynamically Reconfigurable Processor

    Directory of Open Access Journals (Sweden)

    Hidetoshi Takeshita

    2011-01-01

    Full Text Available This paper proposes a new replica placement algorithm that expands the exhaustive search limit with reasonable calculation time. It combines a new type of parallel data-flow processor with an architecture tuned for fast calculation. The replica placement problem is to find a replica-server set satisfying service constraints in a content delivery network (CDN). It is derived from the set cover problem, which is known to be NP-hard. It is impractical to use exhaustive search to obtain optimal replica placement in large-scale networks, because calculation time increases with the number of combinations. To reduce calculation time, heuristic algorithms have been proposed, but it is known that no heuristic algorithm is assured of finding the optimal solution. The proposed algorithm suits parallel processing and pipeline execution and is implemented on DAPDNA-2, a dynamically reconfigurable processor. Experiments show that the proposed algorithm expands the exhaustive search limit by a factor of 18.8 compared to the conventional algorithm running on a Neumann-type processor.
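
    The set-cover formulation mentioned above, solved by exhaustive search, can be stated in a few lines of Python; this shows the combinatorial core of the problem (and why its cost explodes with network size), not the DAPDNA-2 implementation:

```python
from itertools import combinations


def optimal_placement(candidate_servers, clients_covered_by, all_clients):
    """Exhaustive search for the smallest replica-server set covering every client.

    Set-cover formulation of replica placement; exponential in the number of
    candidate servers, which is why the paper offloads it to a parallel
    data-flow processor.
    """
    for k in range(1, len(candidate_servers) + 1):
        for subset in combinations(candidate_servers, k):
            covered = set().union(*(clients_covered_by[s] for s in subset))
            if covered >= all_clients:
                return subset
    return None


if __name__ == "__main__":
    coverage = {"s1": {"a", "b"}, "s2": {"b", "c"}, "s3": {"c", "d"}, "s4": {"a", "d"}}
    print(optimal_placement(list(coverage), coverage, {"a", "b", "c", "d"}))
```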

  19. A System Architecture for Autonomous Demand Side Load Management in Smart Buildings

    DEFF Research Database (Denmark)

    Costanzo, Giuseppe Tommaso; Zhu, Guchuan; Anjos, Miguel F.

    2012-01-01

    This paper presents a system architecture for load management in smart buildings which enables autonomous demand side load management in the smart grid. Being of a layered structure composed of three main modules for admission control, load balancing, and demand response management...... in multiple time-scales and allows seamless integration of diverse techniques for online operation control, optimal scheduling, and dynamic pricing. The design of a home energy manager based on this architecture is illustrated and the simulation results with Matlab/Simulink confirm the viability...

  20. Trillo NPP full scope replica simulator project: The last great NPP simulation challenge in Spain

    International Nuclear Information System (INIS)

    Rivero, N.; Abascal, A.

    2006-01-01

    In the year 2000, Trillo NPP (a Spanish PWR-KWU design nuclear power plant) and Tecnatom came to the agreement of developing a Trillo plant-specific simulator, having as scope all the plant systems operated either from the main control room or from the emergency panels. The simulator operation should be carried out both through a control room replica and a graphical user interface, the latter based on plant schematics and the softpanels concept. The Trillo simulator is to be primarily utilized as a pedagogical tool for the training of the Trillo operational staff. Because of the engineering grade of the mathematical models, it will also have additional uses, such as: - Operation engineering (POE's validation, new computerized operator support systems validation, etc.); - Emergency drills; - Plant design modifications assessment. This project has become the largest simulation task Tecnatom has ever undertaken, being structured in three different subprojects, namely: - Simulator manufacture, - Simulator acceptance and - Training material production. The most relevant technological innovations the project brings are: highest accuracy in the Nuclear Island models, an advanced configuration management system, an open software architecture, a new human-machine interface design, a latest-design I/O system and an Instructor Station with extended functionality. The Trillo simulator 'Ready for Training' event is due in September 2003, with the Factory Acceptance Tests having started in Autumn 2002. (author)

  1. Traceability of Requirements and Software Architecture for Change Management

    NARCIS (Netherlands)

    Göknil, Arda

    2011-01-01

    At the present day, software systems get more and more complex. The requirements of software systems change continuously and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations to the architecture and source code of the system

  2. Kinetics from Replica Exchange Molecular Dynamics Simulations.

    Science.gov (United States)

    Stelzl, Lukas S; Hummer, Gerhard

    2017-08-08

    Transitions between metastable states govern many fundamental processes in physics, chemistry and biology, from nucleation events in phase transitions to the folding of proteins. The free energy surfaces underlying these processes can be obtained from simulations using enhanced sampling methods. However, their altered dynamics makes kinetic and mechanistic information difficult or impossible to extract. Here, we show that, with replica exchange molecular dynamics (REMD), one can not only sample equilibrium properties but also extract kinetic information. For systems that strictly obey first-order kinetics, the procedure to extract rates is rigorous. For actual molecular systems whose long-time dynamics are captured by kinetic rate models, accurate rate coefficients can be determined from the statistics of the transitions between the metastable states at each replica temperature. We demonstrate the practical applicability of the procedure by constructing master equation (Markov state) models of peptide and RNA folding from REMD simulations.
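
    The abstract describes estimating rate coefficients from the statistics of transitions between metastable states. A minimal counting estimator of that kind (ignoring the replica-exchange bookkeeping the authors handle) is sketched below:

```python
import numpy as np


def rate_matrix(state_traj, dt):
    """Estimate first-order rate coefficients from a discretized trajectory.

    k[i, j] ~ (number of i -> j transitions) / (total time spent in state i),
    the simple counting estimator behind master-equation (Markov state) models.
    """
    states = np.unique(state_traj)
    n = len(states)
    index = {s: i for i, s in enumerate(states)}
    counts = np.zeros((n, n))
    dwell = np.zeros(n)
    for a, b in zip(state_traj[:-1], state_traj[1:]):
        dwell[index[a]] += dt
        if a != b:
            counts[index[a], index[b]] += 1
    k = counts / dwell[:, None]
    np.fill_diagonal(k, -k.sum(axis=1))   # rows of a rate (generator) matrix sum to 0
    return k


if __name__ == "__main__":
    traj = np.array([0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0])   # toy two-state trajectory
    print(rate_matrix(traj, dt=0.1))
```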

  3. Homeowner's Architectural Responses to Crime in Dar Es Salaan : Its impacts and implications to urban architecture, urban design and urban management

    OpenAIRE

    Bulamile, Ludigija Boniface

    2009-01-01

    This study is about homeowners' architectural responses to crime in Dar es Salaam, Tanzania: its impacts and implications for urban architecture, urban design and urban management. The study explores and examines the processes through which homeowners respond to crimes of burglary, home robbery and fear of it using architectural or physical elements. The processes are explored and examined using case study methodology in three cases in Dar es Salaam. The cases are residentia...

  4. 3D printed replicas for endodontic education.

    Science.gov (United States)

    Reymus, M; Fotiadou, C; Kessler, A; Heck, K; Hickel, R; Diegritz, C

    2018-06-14

    To assess the feasibility of producing artificial teeth for endodontic training using 3D printing technology, to analyse the accuracy of the printing process, and to have students evaluate the teeth when used during training. Sound extracted human teeth were selected, digitalized by cone beam computed tomography (CBCT) and appropriate software, and finally reproduced by a stereolithographic printer. The printed teeth were scanned and compared with the original ones (trueness) and to one another (precision). Undergraduate dental students in the third and fourth years performed root canal treatment on printed molars and were subsequently asked to evaluate their experience with these compared to real teeth. The workflow was feasible for manufacturing 3D printed tooth replicas. The absolute deviation after printing (trueness) ranged from 50.9 μm to 104.3 μm. The values for precision ranged from 43.5 μm to 68.2 μm. Students reported great benefits in the use of the replicated teeth for training purposes. The presented workflow is feasible for any dental educational institution that has access to a CBCT unit and a stereolithographic printer. The accuracy of the printing process is suitable for the production of tooth replicas for endodontic training. Undergraduate students favoured the availability of these replicas and the fairness they ensured in training due to standardization.

  5. A critical inventory of preoperative skull replicas.

    Science.gov (United States)

    Fasel, J H D; Beinemann, J; Schaller, K; Gailloud, P

    2013-09-01

    Physical replicas of organs are used increasingly for preoperative planning. The quality of these models is generally accepted by surgeons. In view of the strong trend towards minimally invasive and personalised surgery, however, the aim of this investigation was to assess qualitatively the accuracy of such replicas, using skull models as an example. Skull imaging was acquired for three cadavers by computed tomography using clinical routine parameters. After digital three-dimensional (3D) reconstruction, physical replicas were produced by 3D printing. The facsimilia were analysed systematically and compared with the best gold standard possible: the macerated skull itself. The skull models were far from anatomically accurate. Non-conforming rendering was observed in particular for foramina, sutures, notches, fissures, grooves, channels, tuberosities, thin-walled structures, sharp peaks and crests, and teeth. Surgeons should be aware that preoperative models may not yet render the exact anatomy of the patient under consideration and are advised to continue relying, in specific conditions, on their own analysis of the native computed tomography or magnetic resonance imaging.

  6. Building Quality into Learning Management Systems – An Architecture-Centric Approach

    OpenAIRE

    Avgeriou, P.; Retalis, Simos; Skordalakis, Manolis

    2003-01-01

    The design and development of contemporary Learning Management Systems (LMS) is largely focused on satisfying functional requirements, rather than quality requirements, thus resulting in inefficient systems of poor software and business quality. In order to remedy this problem there is a research trend into specifying and evaluating software architectures for LMS, since quality attributes in a system depend profoundly on its architecture. This paper presents a case study of appraising the s...

  7. Inter organizational System Management for integrated service delivery: an Enterprise Architecture Perspective

    OpenAIRE

    Elmir, Abir; Elmir, Badr; Bounabat, Bouchaib

    2015-01-01

    Service sharing is a prominent operating model to support business. Many large inter-organizational networks have implemented some form of value-added integrated services in order to reach efficiency and to reduce costs sustainably. Coupling service orientation with the enterprise architecture paradigm is very important for improving organizational performance through business process optimization. Indeed, enterprise architecture management is increasingly discussed because of information system r...

  8. Information management architecture for an integrated computing environment for the Environmental Restoration Program. Environmental Restoration Program, Volume 3, Interim technical architecture

    International Nuclear Information System (INIS)

    1994-09-01

    This third volume of the Information Management Architecture for an Integrated Computing Environment for the Environmental Restoration Program--the Interim Technical Architecture (TA) (referred to throughout the remainder of this document as the ER TA)--represents a key milestone in establishing a coordinated information management environment in which information initiatives can be pursued with the confidence that redundancy and inconsistencies will be held to a minimum. This architecture is intended to be used as a reference by anyone whose responsibilities include the acquisition or development of information technology for use by the ER Program. The interim ER TA provides technical guidance at three levels. At the highest level, the technical architecture provides an overall computing philosophy or direction. At this level, the guidance does not address specific technologies or products but addresses more general concepts, such as the use of open systems, modular architectures, graphical user interfaces, and architecture-based development. At the next level, the technical architecture provides specific information technology recommendations regarding a wide variety of specific technologies. These technologies include computing hardware, operating systems, communications software, database management software, application development software, and personal productivity software, among others. These recommendations range from the adoption of specific industry or Martin Marietta Energy Systems, Inc. (Energy Systems) standards to the specification of individual products. At the third level, the architecture provides guidance regarding implementation strategies for the recommended technologies that can be applied to individual projects and to the ER Program as a whole

  9. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  10. A simple security architecture for smart water management system

    CSIR Research Space (South Africa)

    Ntuli, N

    2016-05-01

    Full Text Available Secure booting prevents installation of malicious code onto the device. By making sure that the booting process is secured, we can establish securely the root of trust for the device. Public key cryptography is utilized at this stage. During... 3.2. Secure Communication: While public key cryptography can be used in the first step (secure booting), it would be too heavy to use during...
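
    As an illustration of the signature check that underlies secure booting, the following sketch uses the Python cryptography package; the paper targets constrained devices and does not prescribe this library, so treat it purely as a reference-level example:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa


def verify_boot_image(public_key, image: bytes, signature: bytes) -> bool:
    """Accept the firmware image only if its signature checks out (root of trust)."""
    try:
        public_key.verify(
            signature, image,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # Stand-in for the vendor's signing key; on a real device only the public
    # half would be provisioned into the bootloader.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    image = b"firmware v1.2"
    sig = key.sign(image,
                   padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                               salt_length=padding.PSS.MAX_LENGTH),
                   hashes.SHA256())
    print(verify_boot_image(key.public_key(), image, sig))          # True
    print(verify_boot_image(key.public_key(), b"tampered", sig))    # False
```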

  11. Managing changes in the enterprise architecture modelling context

    Science.gov (United States)

    Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya

    2016-07-01

    Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has been however limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.
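
    The article's own approach uses a dedicated EA description language and Alloy consistency rules; as a far simpler illustration of change impact analysis over an EA dependency graph, a breadth-first ripple-effect estimate looks like this:

```python
from collections import deque


def impact_set(depends_on, changed):
    """Breadth-first ripple-effect estimate over an EA dependency graph.

    depends_on maps each element to the elements it depends on; anything that
    (transitively) depends on a changed element is potentially impacted.
    """
    # Invert the edges: who depends on me?
    dependents = {}
    for node, deps in depends_on.items():
        for d in deps:
            dependents.setdefault(d, set()).add(node)

    impacted, queue = set(), deque(changed)
    while queue:
        current = queue.popleft()
        for node in dependents.get(current, ()):
            if node not in impacted:
                impacted.add(node)
                queue.append(node)
    return impacted


if __name__ == "__main__":
    ea_model = {
        "OrderProcess": {"CRM-App"},
        "CRM-App": {"CustomerDB", "AppServer"},
        "ReportingApp": {"CustomerDB"},
        "AppServer": set(),
        "CustomerDB": set(),
    }
    print(impact_set(ea_model, changed={"CustomerDB"}))
    # -> {'CRM-App', 'ReportingApp', 'OrderProcess'}
```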

  12. A Holistic Management Architecture for Large-Scale Adaptive Networks

    National Research Council Canada - National Science Library

    Clement, Michael R

    2007-01-01

    This thesis extends the traditional notion of network management as an indicator of resource availability and utilization into a systemic model of resource requirements, capabilities, and adaptable...

  13. Right of way real property asset management : prototype data architecture.

    Science.gov (United States)

    2009-02-01

    The Texas Department of Transportation (TxDOT) is responsible for managing 1.1 million acres of land that provide right of way for approximately 80,000 centerline miles of state-maintained roads. Management of the huge right of way asset involves...

  14. Tomosynthesis-detected Architectural Distortion: Management Algorithm with Radiologic-Pathologic Correlation.

    Science.gov (United States)

    Durand, Melissa A; Wang, Steven; Hooley, Regina J; Raghu, Madhavi; Philpotts, Liane E

    2016-01-01

    As use of digital breast tomosynthesis becomes increasingly widespread, new management challenges are inevitable because tomosynthesis may reveal suspicious lesions not visible at conventional two-dimensional (2D) full-field digital mammography. Architectural distortion is a mammographic finding associated with a high positive predictive value for malignancy. It is detected more frequently at tomosynthesis than at 2D digital mammography and may even be occult at conventional 2D imaging. Few studies have focused on tomosynthesis-detected architectural distortions to date, and optimal management of these distortions has yet to be well defined. Since implementing tomosynthesis at our institution in 2011, we have learned some practical ways to assess architectural distortion. Because distortions may be subtle, tomosynthesis localization tools plus improved visualization of adjacent landmarks are crucial elements in guiding mammographic identification of elusive distortions. These same tools can guide more focused ultrasonography (US) of the breast, which facilitates detection and permits US-guided tissue sampling. Some distortions may be sonographically occult, in which case magnetic resonance imaging may be a reasonable option, both to increase diagnostic confidence and to provide a means for image-guided biopsy. As an alternative, tomosynthesis-guided biopsy, conventional stereotactic biopsy (when possible), or tomosynthesis-guided needle localization may be used to achieve tissue diagnosis. Practical uses for tomosynthesis in evaluation of architectural distortion are highlighted, potential complications are identified, and a working algorithm for management of tomosynthesis-detected architectural distortion is proposed. (©)RSNA, 2016.

  15. Generalized Information Architecture for Managing Requirements in IBM's Rational DOORS® Application.

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner; Shannon, Sharon A.

    2014-12-01

    When a requirements engineering effort fails to meet expectations, the requirements management tool is often blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool.

  16. Program information architecture/document hierarchy. [Information Management Systems, it's components and rationale

    Energy Technology Data Exchange (ETDEWEB)

    Woods, T.W.

    1991-09-01

    The Nuclear Waste Management System (NWMS) Management Systems Improvement Strategy (MSIS) (DOE 1990) requires that the information within the computer program and information management system be ordered into a precedence hierarchy for consistency. Therefore, the US Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM), requested Westinghouse Hanford Company to develop a plan for NWMS program information, which the MSIS calls a document hierarchy. This report provides the results of that effort and describes the management system as a "program information architecture." 3 refs., 3 figs.

  17. Managing the Internet of Things architectures, theories, and applications

    CERN Document Server

    Hua, Kun

    2016-01-01

    The implementation and deployment of the Internet of Things (IoT) brings with it management challenges around seamless integration, heterogeneity, scalability, mobility, security, and many other issues. This comprehensive book explores these challenges and looks at possible solutions.

  18. An Architecture to Manage Incoming Traffic of Inter-Domain Routing Using OpenFlow Networks

    Directory of Open Access Journals (Sweden)

    Walber José Adriano Silva

    2018-04-01

    Full Text Available The Border Gateway Protocol (BGP) is the current state of the art for inter-domain routing between Autonomous Systems (ASes). Although BGP has different mechanisms to manage outbound traffic in an AS domain, it lacks an efficient tool for inbound traffic control from transit ASes such as Internet Service Providers (ISPs). For inter-domain routing, the BGP's destination-based forwarding paradigm limits the granularity of distributing the network traffic among the multiple paths of the current Internet topology. Thus, this work offers a new architecture to manage incoming traffic in the inter-domain using OpenFlow networks. The architecture explores direct inter-domain communication to exchange control information and the functionalities of the OpenFlow protocol. Based on the results achieved for the size of the exchanged messages, the proposed architecture is not only scalable, but also capable of performing load balancing for inbound traffic using different strategies.

  19. Smart Building: Decision Making Architecture for Thermal Energy Management.

    Science.gov (United States)

    Uribe, Oscar Hernández; Martin, Juan Pablo San; Garcia-Alegre, María C; Santos, Matilde; Guinea, Domingo

    2015-10-30

    Smart applications of the Internet of Things are improving the performance of buildings, reducing energy demand. Local and smart networks, soft computing methodologies, machine intelligence algorithms and pervasive sensors are some of the basics of energy optimization strategies developed for the benefit of environmental sustainability and user comfort. This work presents a distributed sensor-processor-communication decision-making architecture to improve the acquisition, storage and transfer of thermal energy in buildings. The developed system is implemented in a near Zero-Energy Building (nZEB) prototype equipped with a built-in thermal solar collector, where optical properties are analysed; a low enthalpy geothermal accumulation system, segmented in different temperature zones; and an envelope that includes a dynamic thermal barrier. An intelligent control of this dynamic thermal barrier is applied to reduce the thermal energy demand (heating and cooling) caused by daily and seasonal weather variations. Simulations and experimental results are presented to highlight the nZEB thermal energy reduction.

  20. Smart Building: Decision Making Architecture for Thermal Energy Management

    Directory of Open Access Journals (Sweden)

    Oscar Hernández Uribe

    2015-10-01

    Full Text Available Smart applications of the Internet of Things are improving the performance of buildings, reducing energy demand. Local and smart networks, soft computing methodologies, machine intelligence algorithms and pervasive sensors are some of the basics of energy optimization strategies developed for the benefit of environmental sustainability and user comfort. This work presents a distributed sensor-processor-communication decision-making architecture to improve the acquisition, storage and transfer of thermal energy in buildings. The developed system is implemented in a near Zero-Energy Building (nZEB) prototype equipped with a built-in thermal solar collector, where optical properties are analysed; a low enthalpy geothermal accumulation system, segmented in different temperature zones; and an envelope that includes a dynamic thermal barrier. An intelligent control of this dynamic thermal barrier is applied to reduce the thermal energy demand (heating and cooling) caused by daily and seasonal weather variations. Simulations and experimental results are presented to highlight the nZEB thermal energy reduction.

  1. The Efficacy of Epidemic Algorithms on Detecting Node Replicas in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Narasimha Shashidhar

    2015-12-01

    Full Text Available A node replication attack against a wireless sensor network involves surreptitious efforts by an adversary to insert duplicate sensor nodes into the network while avoiding detection. Due to the lack of tamper-resistant hardware and the low cost of sensor nodes, launching replication attacks takes little effort to carry out. Naturally, detecting these replica nodes is a very important task and has been studied extensively. In this paper, we propose a novel distributed, randomized sensor duplicate detection algorithm called Discard to detect node replicas in group-deployed wireless sensor networks. Our protocol is an epidemic, self-organizing duplicate detection scheme, which exhibits emergent properties. Epidemic schemes have found diverse applications in distributed computing: load balancing, topology management, audio and video streaming, computing aggregate functions, failure detection, network and resource monitoring, to name a few. To the best of our knowledge, our algorithm is the first attempt at exploring the potential of this paradigm to detect replicas in a wireless sensor network. Through analysis and simulation, we show that our scheme achieves robust replica detection with substantially lower communication, computational and storage requirements than prior schemes in the literature.
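    To illustrate the epidemic (gossip) style of detection in general terms — this is not the Discard protocol itself, and all node identifiers and locations below are hypothetical — the following Python sketch lets nodes exchange (node ID, location claim) tables with random partners and raise an alarm when the same ID is claimed at two different locations.

```python
import random

class Node:
    def __init__(self, node_id, location):
        self.node_id = node_id
        self.location = location
        self.claims = {node_id: location}   # node ID -> claimed location
        self.alarms = set()                 # IDs flagged as possible replicas

    def gossip_with(self, other):
        # Exchange claim tables in both directions, as in an epidemic round.
        for table_from, receiver in ((other.claims, self), (self.claims, other)):
            for node_id, loc in list(table_from.items()):
                known = receiver.claims.get(node_id)
                if known is not None and known != loc:
                    receiver.alarms.add(node_id)   # conflicting claims => replica
                else:
                    receiver.claims[node_id] = loc

# Toy network: the ID "n3" has been cloned at a second location.
nodes = [Node("n1", (0, 0)), Node("n2", (1, 0)),
         Node("n3", (2, 0)), Node("n3", (9, 9))]

for _ in range(10):                        # a few random gossip rounds
    a, b = random.sample(nodes, 2)
    a.gossip_with(b)

# Nodes that observed conflicting claims for "n3" have raised an alarm.
print({n.location: n.alarms for n in nodes})
```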

  2. Energy Management Systems and tertiary regulation in hierarchical control architectures for islanded micro-grids

    DEFF Research Database (Denmark)

    Sanseverino, Eleonora Riva; Di Silvestre, Maria Luisa; Quang, Ninh Nguyen

    2015-01-01

    In this paper, the structure of the highest level of a hierarchical control architecture for micro-grids is proposed. The structure includes two sub-levels: the Energy Management System, EMS, and the tertiary regulation. The first is devoted to energy resource allocation in each time slot based

  3. Educational strategies for architectural design management : the design of a new curriculum

    NARCIS (Netherlands)

    Prins, M.; Halman, J.I.M.

    1996-01-01

    This paper is about the design of a new curriculum on Architectural Design Management Systems. This curriculum is embedded in the Stan Ackermans Institute (SAI). The SAI is a school for continuing postgraduate education on technological design. First some recent developments in the building industry

  4. Critiques, replicas and proposals for the New Urbanism Vision

    Directory of Open Access Journals (Sweden)

    Alaide Retana

    2014-03-01

    Full Text Available The new urbanism (NU) is a vision of planning and urban design that emerged in 1993 and finds its basis in the design of traditional communities. This trend has attracted various criticisms and replicas, which were reviewed here in relation to urban sprawl, transportation, re-densification, mix of land uses, design, gentrification, pedestrianization and safety, and analyzed in the neighborhood of Santa Barbara in Toluca, Mexico. This area was chosen for being traditional and forming part of the historical center of the city; even though it was not designed under the guidelines of the NU, it has the traditional quality from which the NU would theoretically have taken its essence. The objective of this analysis is to establish whether the NU has the essence of a traditional Mexican neighborhood, as well as to check whether the criticisms of the NU hold when applied to a space belonging to a Mexican historic center that has been abandoned because of problems of insecurity and degradation. The general conclusion is that traditional neighborhoods have provided design elements to the NU, which refutes some of the criticisms; however, proposals for the NU in neighborhoods of historic centers have to be based on the community, the architecture and the existing urbanism, since these are the elements that give identity.

  5. A Validation Study of the Impression Replica Technique.

    Science.gov (United States)

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

    To validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated by the measuring instrument. Calculations were made for the exact values of three gaps between the male and female. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values received from measuring the specimen were then compared with the values received from the impression replicas, and the technique was thereby validated. The impression replica technique overvalued all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  6. CogWnet: A Resource Management Architecture for Cognitive Wireless Networks

    KAUST Repository

    Alqerm, Ismail

    2013-07-01

    With the increasing adoption of wireless communication technologies, there is a need to improve management of existing radio resources. Cognitive radio is a promising technology to improve the utilization of wireless spectrum. Its operating principle is based on building an integrated hardware and software architecture that configures the radio to meet application requirements within the constraints of spectrum policy regulations. However, such an architecture must be able to cope with radio environment heterogeneity. In this paper, we propose a cognitive resource management architecture, called CogWnet, that allocates channels, re-configures radio transmission parameters to meet QoS requirements, ensures reliability, and mitigates interference. The architecture consists of three main layers: Communication Layer, which includes generic interfaces to facilitate the communication between the cognitive architecture and TCP/IP stack layers; Decision-Making Layer, which classifies the stack layers input parameters and runs decision-making optimization algorithms to output optimal transmission parameters; and Policy Layer to enforce policy regulations on the selected part of the spectrum. The efficiency of CogWnet is demonstrated through a testbed implementation and evaluation.

  7. A Customer Service Management Architecture for the Internet

    NARCIS (Netherlands)

    Sprenkels, Ron; Pras, Aiko; van Beijnum, Bernhard J.F.; de Goede, Leo; Ambler, Anthony; Calo, Seraphin B.; Kar, Gautam

    2000-01-01

    Managing services on the Internet is becoming more and more complex and time consuming for service providers since services are increasing both in number and complexity. Also the number of users per service is going up. A solution to this problem is to allow the service users themselves to partly

  8. Quality Risk Management. Modernising the Architecture of Quality Assurance

    Science.gov (United States)

    Raban, Colin; Turner, Liz

    2006-01-01

    Although the world is changing, quality management remains an area of relative calm. Many institutions continue to use elaborated versions of a model that was developed by the Council of Academic Awards and conceived at a time when higher education was not so exposed to market forces, when the policy and regulatory environment was relatively…

  9. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  10. Small wind power systems: market, applications, architectures and energy management

    International Nuclear Information System (INIS)

    Roboam, X.

    2005-01-01

    The context and stakes of small wind power systems are described in this paper, situating both supply and demand as well as the main fields of application. Technical issues are then addressed in terms of system structure, energy management and network connection. (author)

  11. Dialogue management in a home machine environment : linguistic components over an agent architecture

    OpenAIRE

    Quesada Moreno, José Francisco; García, Federico; Sena Pichardo, María Esther; Bernal Bermejo, José Ángel; Amores Carredano, José Gabriel de

    2001-01-01

    This paper presents the main characteristics of an Agent-based Architecture for the design and implementation of a Spoken Dialogue System. From a theoretical point of view, the system is based on the Information State Update approach, in particular, the system aims at the management of Natural Command Language Dialogue Moves in a Home Machine Environment. Specifically, the paper is focused on the Natural Language Understanding and Dialogue Management Agents...

  12. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management

    OpenAIRE

    Qiang Luo; Qiang Luo; Yina Ma; Yina Ma; Meghana A. Bhatt; Meghana A. Bhatt; P. Read Montague; P. Read Montague; P. Read Montague; Jianfeng Feng; Jianfeng Feng; Jianfeng Feng; Jianfeng Feng; Jianfeng Feng

    2017-01-01

    Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent...

  13. Multi-agent based distributed control architecture for microgrid energy management and optimization

    International Nuclear Information System (INIS)

    Basir Khan, M. Reyasudin; Jidin, Razali; Pasupuleti, Jagadeesh

    2016-01-01

    Highlights: • A new multi-agent based distributed control architecture for energy management. • Multi-agent coordination based on non-cooperative game theory. • A microgrid model comprised of renewable energy generation systems. • Performance comparison of distributed with conventional centralized control. - Abstract: Most energy management systems are based on a centralized controller that is difficult to satisfy criteria such as fault tolerance and adaptability. Therefore, a new multi-agent based distributed energy management system architecture is proposed in this paper. The distributed generation system is composed of several distributed energy resources and a group of loads. A multi-agent system based decentralized control architecture was developed in order to provide control for the complex energy management of the distributed generation system. Then, non-cooperative game theory was used for the multi-agent coordination in the system. The distributed generation system was assessed by simulation under renewable resource fluctuations, seasonal load demand and grid disturbances. The simulation results show that the implementation of the new energy management system proved to provide more robust and high performance controls than conventional centralized energy management systems.

  14. A healthcare management system for Turkey based on a service-oriented architecture.

    Science.gov (United States)

    Herand, Deniz; Gürder, Filiz; Taşkin, Harun; Yuksel, Emre Nuri

    2013-09-01

    The current Turkish healthcare management system has a structure that is extremely inordinate, cumbersome and inflexible. Furthermore, this structure has no common point of view and thus has no interoperability and responds slowly to innovations. The purpose of this study is to show by which methods the Turkish healthcare management system can be given a structure that is more modern, more flexible and quicker to respond to innovations and changes, taking advantage of the benefits offered by a service-oriented architecture (SOA). In this paper, the Turkish healthcare management system is chosen for examination since Turkey is considered one of the Third World countries and the information architecture of its existing healthcare management system has not yet been configured with SOA, which is a contemporary innovative approach and should provide the base architecture of the new solution. The innovation of this study is the symbiosis of two main integration approaches, SOA and Health Level 7 (HL7), for integrating divergent healthcare information systems. A model is developed which is based on SOA and enables obtaining a healthcare management system that conforms to the SSF standards (HSSP Service Specification Framework) developed within the framework of the HSSP (Healthcare Services Specification Project) under the leadership of HL7 and the Object Management Group.

  15. Architecture for Customer Relationship Management Approaches in Financial Services

    OpenAIRE

    Geib, Malte; Reichold, Annette; Kolbe, Lutz; Brenner, Walter

    2005-01-01

    The majority of financial services companies in Germany and Switzerland have, with varying success, conducted customer relationship management (CRM) implementation projects. Nonetheless, the implementation of a specific CRM strategy that is aligned with company profitability and uses integrated information systems for performance measurement as well as for the control of marketing, sales, and service processes has been realized in very few companies. In this paper we present a framework for th...

  16. Semantic document architecture for desktop data integration and management

    OpenAIRE

    Nesic, Sasa; Jazayeri, Mehdi

    2011-01-01

    Over the last decade, personal desktops have faced the problem of information overload due to increasing computational power, easy access to the Web and cheap data storage. Moreover, an increasing number of diverse end-user desktop applications have led to the problem of information fragmentation. Each desktop application has its own data, unaware of related and relevant data in other applications. In other words, personal desktops face a lack of interoperability of data managed by differ...

  17. An Architecture for Context-Aware Knowledge Flow Management Systems

    OpenAIRE

    Jarrahi, Ali; Kangavari, Mohammad Reza

    2012-01-01

    The organizational knowledge is one of the most important and valuable assets of organizations. In such environment, organizations with broad, specialized and up-to-date knowledge, adequately using knowledge resources, will be more successful than their competitors. For effective use of knowledge, dynamic knowledge flow from the sources to destinations is essential. In this regard, a novel complex concept in knowledge management is the analysis, design and implementation of knowledge flow man...

  18. Used fuel management system architecture and interface analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nutt, Mark [Argonne National Laboratory, Argonne, IL (United States); Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Carter, Joe; Delley, Alexcia [Savannah River National Laboratory, Aiken, SC (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories, Albuquerque NM (United States); Cotton, Thomas [Complex Systems LLC, Washington, DC (United States)

    2013-07-01

    between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  19. Used fuel management system architecture and interface analyses

    International Nuclear Information System (INIS)

    Nutt, Mark; Howard, Robert; Busch, Ingrid; Carter, Joe; Delley, Alexcia; Hardin, Ernest; Kalinina, Elena; Cotton, Thomas

    2013-01-01

    between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  20. Popularity framework to process dataset traces and its application on dynamic replica reduction in the ATLAS experiment

    International Nuclear Information System (INIS)

    Molfetas, Angelos; Megino, Fernando Barreiro; Tykhonov, Andrii; Lassnig, Mario; Garonne, Vincent; Barisits, Martin; Campana, Simone; Dimitrov, Gancho; Jezequel, Stephane; Ueda, Ikuo; Viegas, Florbela Tique Aires

    2011-01-01

    The ATLAS experiment's data management system is constantly tracing file movement operations that occur on the Worldwide LHC Computing Grid (WLCG). Due to the large scale of the WLCG, statistical analysis of the traces is infeasible in real-time. Factors that contribute to the scalability problems include the capability for users to initiate on-demand queries, high dimensionality of tracer entries combined with very low cardinality parameters, and the large size of the namespace. These scalability issues are alleviated through the adoption of an incremental model that aggregates data for all combinations occurring in selected tracer fields on a daily basis. Using this model it is possible to query on-demand relevant statistics about system usage. We present an implementation of this popularity model in the experiment's distributed data management system, DQ2, and describe a direct application example of the popularity framework, an automated cleaning system, which uses the statistics to dynamically detect and reduce unpopular replicas from grid sites. This paper describes the architecture employed by the cleaning system and reports on the results collected from a prototype during the first months of the ATLAS collision data taking.
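    The incremental popularity model described above can be pictured as a daily roll-up over combinations of selected tracer fields, against which on-demand queries are run. The Python sketch below is a simplified illustration with hypothetical field names and trace entries; it is not the DQ2 implementation.

```python
from collections import Counter
from datetime import date

# Hypothetical tracer entries; real traces carry many more fields.
traces = [
    {"day": date(2011, 3, 1), "dataset": "data11.A", "site": "CERN", "user": "u1"},
    {"day": date(2011, 3, 1), "dataset": "data11.A", "site": "CERN", "user": "u2"},
    {"day": date(2011, 3, 1), "dataset": "data11.B", "site": "BNL",  "user": "u1"},
]

# Daily aggregation: count accesses for every combination of the selected fields.
popularity = Counter((t["day"], t["dataset"], t["site"]) for t in traces)

def accesses(dataset, day):
    """On-demand query: accesses of a dataset on a given day, summed over sites."""
    return sum(count for (d, ds, _site), count in popularity.items()
               if ds == dataset and d == day)

print(accesses("data11.A", date(2011, 3, 1)))  # 2
```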

  1. Research on fine management and visualization of ancient architectures based on integration of 2D and 3D GIS technology

    International Nuclear Information System (INIS)

    Jun, Yan; Shaohua, Wang; Jiayuan, Li; Qingwu, Hu

    2014-01-01

    Aimed at ancient architectures which own the characteristics of huge data quantity, fine-grained and high-precise, a 3D fine management and visualization method for ancient architectures based on the integration of 2D and 3D GIS is proposed. Firstly, after analysing various data types and characters of digital ancient architectures, main problems and key technologies existing in the 2D and 3D data management are discussed. Secondly, data storage and indexing model of digital ancient architecture based on 2D and 3D GIS integration were designed and the integrative storage and management of 2D and 3D data were achieved. Then, through the study of data retrieval method based on the space-time indexing and hierarchical object model of ancient architecture, 2D and 3D interaction of fine-grained ancient architectures 3D models was achieved. Finally, take the fine database of Liangyi Temple belonging to Wudang Mountain as an example, fine management and visualization prototype of 2D and 3D integrative digital ancient buildings of Liangyi Temple was built and achieved. The integrated management and visual analysis of 10GB fine-grained model of the ancient architecture was realized and a new implementation method for the store, browse, reconstruction, and architectural art research of ancient architecture model was provided

  2. A Virtual Power Plant Architecture for the Demand-Side Management of Smart Prosumers

    Directory of Open Access Journals (Sweden)

    Marco Pasetti

    2018-03-01

    Full Text Available In this paper, we present a conceptual study on a Virtual Power Plant (VPP) architecture for the optimal management of Distributed Energy Resources (DERs) owned by prosumers participating in Demand-Side Management (DSM) programs. Compared to classical VPP architectures, which aim to aggregate several DERs dispersed throughout the electrical grid, in the proposed VPP architecture the supervised physical domain is limited to single users, i.e., to single Points of Delivery (PODs) of the distribution network. The VPP architecture is based on a service-oriented approach, where multiple agents cooperate to implement the optimal management of the prosumer’s assets, by also considering different forms of Demand Response (DR) requests. The considered DR schemes range from Price-Based DRs to Event-Based DRs, covering both the normal operating functions and the emergency control requests applied in modern distribution networks. With respect to centralized approaches, in this study the control perspective is moved from the system level to the single prosumer’s level, who is allowed to independently provide flexible power profiles through the aggregation of multiple DERs. A generalized optimization model, formulated as a Mixed-Integer Linear Programming (MILP) problem, is also introduced. Such a model is able to compute the optimal scheduling of a prosumer’s assets by considering both DR requests and end-users’ requirements in terms of comfort levels while minimizing the costs.
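    As a rough sketch of what such a scheduling MILP can look like — the notation below is illustrative and not taken from the paper — a prosumer with an inflexible load L_t, local generation g_t, and one deferrable appliance of rated power P^flex that must run for K slots could minimize its energy cost under a Demand Response power cap:

```latex
\begin{align*}
\min_{u_t \in \{0,1\}} \quad & \sum_{t=1}^{T} c_t \,\bigl(L_t + u_t\,P^{\mathrm{flex}} - g_t\bigr)\,\Delta t \\
\text{s.t.} \quad & \sum_{t=1}^{T} u_t = K
  && \text{(deferrable appliance runs for exactly } K \text{ slots)} \\
& L_t + u_t\,P^{\mathrm{flex}} - g_t \le \bar{P}_t
  && \text{for } t \in \mathcal{T}_{\mathrm{DR}} \text{ (power cap during a DR event)}
\end{align*}
```

    Here c_t is the energy price in slot t, Δt the slot length, and T_DR the set of DR-event slots; the binary variables u_t are what make the problem mixed-integer, and richer models add storage, comfort bounds and event-based DR signals as further constraints.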

  3. Management of Distributed and Extendible Heterogeneous Radio Architectures

    DEFF Research Database (Denmark)

    Ramkumar, Venkata; Mihovska, Albena D.; Prasad, Neeli R.

    2009-01-01

    Wireless communication systems are dynamic by nature, which comes from several factors, namely: radio propagation impairments, traffic changes, interference conditions, user mobility, etc. In a heterogeneous environment, the dynamic network behavior calls for a dynamic management of the radio resources; a process that associates a large number of parameters and quality/performance indicators that need to be set, measured, analyzed, and optimized. Radio-over-fiber (RoF) technology involves the use of optical fiber links to distribute radio frequency (RF) signals from a central location to remote

  4. MASM: a market architecture for sensor management in distributed sensor networks

    Science.gov (United States)

    Viswanath, Avasarala; Mullen, Tracy; Hall, David; Garga, Amulya

    2005-03-01

    Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide bandwidth communications and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques do not have the ability to simultaneously consider the wildly non-commensurate measures involved in sensor management in a single optimization routine. Market-oriented programming provides a valuable and principled paradigm to designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms have been found not to be directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for deciding task allocations to the consumers and their corresponding budgets and the sensor manager is responsible for resource allocation to the various consumers. In addition to having a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information theoretic sensor manager in meeting the mission objectives in the simulation experiments.
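    Winner determination in a combinatorial auction — deciding which bundles of sensor dwells and communication channels to sell — is NP-hard in general. The Python fragment below is only a hedged illustration of the idea using a simple greedy rule (value per requested item) and hypothetical bids; it is not MASM's modified winner determination algorithm.

```python
# Hypothetical bids: (consumer, set of resource items requested, bid value).
bids = [
    ("tracker_A", {"sensor1_dwell", "chan1"}, 9.0),
    ("tracker_B", {"sensor1_dwell"},          6.0),
    ("imager_C",  {"sensor2_dwell", "chan1"}, 5.0),
]

def greedy_winners(bids):
    """Allocate bundles greedily by value per item; each item is sold at most once."""
    allocated, winners = set(), []
    for consumer, items, value in sorted(
            bids, key=lambda b: b[2] / len(b[1]), reverse=True):
        if items.isdisjoint(allocated):      # is the whole bundle still available?
            winners.append((consumer, value))
            allocated |= items
    return winners

print(greedy_winners(bids))   # [('tracker_B', 6.0), ('imager_C', 5.0)]
```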

  5. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context.

    Science.gov (United States)

    Cerchecci, Matteo; Luti, Francesco; Mecocci, Alessandro; Parrino, Stefano; Peruzzi, Giacomo; Pozzebon, Alessandro

    2018-04-21

    This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented.
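    The filling-level measurement itself reduces to an ultrasonic time-of-flight calculation, and the energy policy to deciding when waking the LoRa radio is worthwhile. The Python sketch below is a hedged, host-side illustration with assumed constants (speed of sound, bin depth, reporting threshold); it is not the node's firmware.

```python
SPEED_OF_SOUND = 343.0   # m/s at roughly 20 °C (assumed)
BIN_DEPTH = 1.10         # m, sensor-to-bottom distance of an empty bin (hypothetical)

def fill_level(echo_time_s):
    """Convert an ultrasonic echo round-trip time into a filling level (0..1)."""
    distance_to_waste = SPEED_OF_SOUND * echo_time_s / 2.0
    level = 1.0 - min(distance_to_waste, BIN_DEPTH) / BIN_DEPTH
    return max(0.0, level)

def should_transmit(level, last_sent_level, threshold=0.05):
    """Duty-cycling policy: wake the radio only when the level changed enough."""
    return abs(level - last_sent_level) >= threshold

print(round(fill_level(0.004), 2))   # echo after 4 ms -> about 0.38 (38% full)
```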

  6. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context

    Directory of Open Access Journals (Sweden)

    Matteo Cerchecci

    2018-04-01

    Full Text Available This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented.

  7. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context

    Science.gov (United States)

    Cerchecci, Matteo; Luti, Francesco; Mecocci, Alessandro; Parrino, Stefano; Peruzzi, Giacomo

    2018-01-01

    This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented. PMID:29690552

  8. An Empirical Investigation of Architectural Heritage Management Implications for Tourism: The Case of Portugal

    Directory of Open Access Journals (Sweden)

    Shahrbanoo Gholitabar

    2018-01-01

    Full Text Available The aims of this study are manifold. First, to investigate the potentials of architectural heritage in the context of tourism destination development, as well as examine public sector policies and make plans toward the preservation of these resources. Secondly, to appraise the outcome of preservation and its implications for tourism. The study is an effort to explore and understand the interrelationships between tourism and architectural heritage sites through tourist image and perception. For the purposes of this research, numerous heritage sites were sampled in Portugal. A mixed research method was utilized to gauge tourists’ image/perception of heritage resources, and impact (quantitative approach). A qualitative approach was utilized to assess the priority of tourists in their visits and public-sector policies toward heritage resource management and planning. The fuzzy logic method was used to assess the architectural value and the tourist and preservation potential of historical buildings in Porto/Aveiro. The contribution and implications of the study are also explained. The results revealed that architectural heritage resources have the most appeal to tourists. The study to date demonstrates the architectural value and tourist and preservation potential of the buildings observed via evaluation by fuzzy logic methods.

  9. Modeling Vocal Fold Intravascular Flow using Synthetic Replicas

    Science.gov (United States)

    Terry, Aaron D.; Ricks, Matthew T.; Thomson, Scott L.

    2017-11-01

    Vocal fold vibration that is induced by air flowing from the lungs is believed to decrease blood flow through the vocal folds. This is important due to the critical role of blood flow in maintaining tissue health. However, the precise mechanical relationships between vocal fold vibration and blood perfusion remain understudied. A platform for studying liquid perfusion in a synthetic, life-size, self-oscillating vocal fold replica has recently been developed. The replicas are fabricated using molded silicone with material properties comparable to those of human vocal fold tissues and include embedded microchannels through which liquid is perfused. The replicas are mounted on an air flow supply tube to initiate flow-induced vibration. A liquid reservoir is attached to the microchannel to cause liquid to perfuse through the replica in the anterior-posterior direction. As replica vibration is initiated and amplitude increases, the perfusion flow rate decreases. In this presentation, the replica design will be presented, along with data quantifying the relationships between parameters such as replica vibration amplitude, stiffness, microchannel diameter, and perfusion flow rate. This work was supported by Grant NIDCD R01DC005788 from the National Institutes of Health.

  10. IO Management Controller for Time and Space Partitioning Architectures

    Science.gov (United States)

    Lachaize, Jerome; Deredempt, Marie-Helene; Galizzi, Julien

    2015-09-01

    Integrated Modular Avionics (IMA) has been industrialized in the aeronautical domain to enable the independent qualification of different application software from different suppliers on the same generic computer, this computer being a single terminal in a deterministic network. This concept allows the different applications to be distributed efficiently and transparently across the network, and the hardware equipment embedded on the aircraft to be sized accurately, through the configuration of the virtual computers and the virtual network. The concept has been studied for the space domain and requirements have been issued [D04],[D05]. Experiments in the space domain have been carried out, at the computer level, through ESA and CNES initiatives [D02] [D03]. One possible IMA implementation may use Time and Space Partitioning (TSP) technology. Studies on Time and Space Partitioning [D02] for controlling access to resources such as the CPU and memories, and studies on hardware/software interface standardization [D01], showed that for space domain technologies, where I/O components (or IPs) do not cover advanced features such as buffering, descriptors or virtualization, the CPU performance overhead is mainly due to shared interface management in the execution platform and to the high frequency of I/O accesses, the latter leading to a large number of context switches. This paper presents a solution to reduce this execution overhead with an open, modular and configurable controller.

  11. Reusable Rocket Engine Advanced Health Management System. Architecture and Technology Evaluation: Summary

    Science.gov (United States)

    Pettit, C. D.; Barkhoudarian, S.; Daumann, A. G., Jr.; Provan, G. M.; ElFattah, Y. M.; Glover, D. E.

    1999-01-01

    In this study, we proposed an Advanced Health Management System (AHMS) functional architecture and conducted a technology assessment for liquid propellant rocket engine lifecycle health management. The purpose of the AHMS is to improve reusable rocket engine safety and to reduce between-flight maintenance. During the study, past and current reusable rocket engine health management-related projects were reviewed, data structures and health management processes of current rocket engine programs were assessed, and in-depth interviews with rocket engine lifecycle and system experts were conducted. A generic AHMS functional architecture, with primary focus on real-time health monitoring, was developed. Fourteen categories of technology tasks and development needs for implementation of the AHMS were identified, based on the functional architecture and our assessment of current rocket engine programs. Five key technology areas were recommended for immediate development, which (1) would provide immediate benefits to current engine programs, and (2) could be implemented with minimal impact on the current Space Shuttle Main Engine (SSME) and Reusable Launch Vehicle (RLV) engine controllers.

  12. Architecture of a micro grid energy manager; Arquitectura de un gestor energetico de microrredes

    Energy Technology Data Exchange (ETDEWEB)

    Jimeno-Huarte, J.; Anduaga-Muniozgueren, J.; Oyarzabal-Moreno, J.

    2009-07-01

    Micro grids are defined as a set of aggregated micro generators and loads operating as a single system. Micro grids need energy management systems in order to coordinate the actions of the elements that compose them. This way, micro grids provide useful services to connected users as well as to the electrical system. This paper presents the architecture of a micro grid energy manager applying multi-agent-based technologies and communication standards. An application of this architecture to the secondary regulation function has been performed using TECNALIA's micro grid as a validation platform. The implementation of the secondary regulation takes into account economic criteria while the technical restrictions of the controlled equipment are fulfilled. (Author) 14 refs.

  13. Peer-To-Peer Architectures in Distributed Data Management Systems for Large Hadron Collider Experiments

    CERN Document Server

    Lo Presti, Giuseppe; Lo Re, G; Orsini, L

    2005-01-01

    The main goal of the presented research is to investigate Peer-to-Peer architectures and to leverage distributed services to support networked autonomous systems. The research work focuses on development and demonstration of technologies suitable for providing autonomy and flexibility in the context of distributed network management and distributed data acquisition. A network management system enables the network administrator to monitor a computer network and properly handle any failure that can arise within the network. An online data acquisition (DAQ) system for high-energy physics experiments has to collect, combine, filter, and store for later analysis a huge amount of data, describing subatomic particles collision events. Both domains have tight constraints which are discussed and tackled in this work. New emerging paradigms have been investigated to design novel middleware architectures for such distributed systems, particularly the Active Networks paradigm and the Peer-to-Peer paradigm. A network man...

  14. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    Science.gov (United States)

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  15. A Global Navigation Management Architecture Applied to Autonomous Robots in Urban Environments

    OpenAIRE

    Kenmogne , Ide-Flore; Alves De Lima , Danilo; Corrêa Victorino , Alessandro

    2015-01-01

    This paper presents a global behavioral architecture used as a coordinator for the global navigation of an autonomous vehicle in an urban context including traffic laws and other features. As an extension of our previous work, the approach presented here focuses on how this manager uses perceived information (from low cost cameras and laser scanners) combined with digital road-map data to make decisions. This decision consists in retrieving the car's state regarding th...

  16. Seed-a distributed data base architecture for global management of steam-generator inspection data

    International Nuclear Information System (INIS)

    Soon Ju Kang; Yu Rak Choi; Hee Gon Woo; Seong Su Choi

    1996-01-01

    This paper deals with a data management system called SEED (Steam-generator Eddy-current Expert Database) for the global handling of SG (steam generator) tube inspection data in nuclear power plants. SEED integrates all stages of the SG tube inspection process and supports all data such as raw eddy current data, inspection history data, SG tube information, etc. SEED is implemented under a client/server computing architecture, supporting LAN/WAN-based graphical user interface facilities using WWW programming tools. (author)

  17. Management practices and influences on IT architecture decisions: a case study in a telecom company

    OpenAIRE

    Hsing, Chen Wen; Souza, Cesar Alexandre de

    2012-01-01

    The study aims to analyze the IT architecture management practices associated with their degree of maturity and the influence of institutional and strategic factors on the decisions involved through a case study in a large telecom organization. The case study allowed us to identify practices that led the company to its current stage of maturity and identify practices that can lead the company to the next stage. The strategic influence was mentioned by most respondents and the institutional in...

  18. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  19. Gamma-ray dosimetry measurements of the Little Boy replica

    International Nuclear Information System (INIS)

    Plassmann, E.A.; Pederson, R.A.

    1984-01-01

    We present the current status of our gamma-ray dosimetry results for the Little Boy replica. Both Geiger-Mueller and thermoluminescent detectors were used in the measurements. Future work is needed to test assumptions made in data analysis

  20. SRF Cavity Surface Topography Characterization Using Replica Techniques

    Energy Technology Data Exchange (ETDEWEB)

    C. Xu, M.J. Kelley, C.E. Reece

    2012-07-01

    To better understand the role of topography in SRF cavity performance, we seek to obtain detailed topographic information from the curved practical cavity surfaces. Replicas taken from a cavity interior surface provide internal surface molds for fine Atomic Force Microscopy (AFM) and stylus profilometry. In this study, we confirm the replica resolution both on local surface defects, such as grain boundaries and etching pits, and compare the uniform surface roughness with the aid of the Power Spectral Density (PSD), from which we can statistically obtain roughness parameters at different scales. A series of sampling locations at the same magnetic field is chosen at the same latitude on a single-cell cavity to confirm the uniformity. Another series of sampling locations at different magnetic field amplitudes is chosen for this replica on the same cavity for later power loss calculation. We also show that application of the replica followed by rinsing does not adversely affect the cavity performance.

  1. The Platform Architecture and Key Technology of Cloud Service that Support Wisdom City Management

    Directory of Open Access Journals (Sweden)

    Liang Xiao

    2013-05-01

    Full Text Available According to the new requirement of constructing a “resource sharing and service on demand” wisdom city system, this paper puts forward a cloud service platform architecture for wisdom city management that supports the three service models IaaS, PaaS and SaaS. The architecture is based on research into the operation mode of the wisdom city under a cloud computing environment and into key technologies such as mass storage of cloud data, construction of the cloud resource pool, scheduling management methods and monitoring of cloud resources, and security management and control of the cloud platform. The platform enables the wisdom city system to optimize the scheduling and management of business and resources and to manage large-scale hardware and software in a unified and efficient way, with the characteristics of cross-domain resource scheduling, cross-domain data sharing, cross-domain facilities integration and cross-domain service integration.

  2. Patrol Detection for Replica Attacks on Wireless Sensor Networks

    OpenAIRE

    Wang, Liang-Min; Shi, Yang

    2011-01-01

    Replica attack is a critical concern in the security of wireless sensor networks. We employ mobile nodes as patrollers to detect replicas distributed in different zones in a network, in which a basic patrol detection protocol and two detection algorithms for stationary and mobile modes are presented. Then we perform security analysis to discuss the defense strategies against the possible attacks on the proposed detection protocol. Moreover, we show the advantages of the proposed protocol by d...

  3. Accuracy of three-dimensional printing for manufacturing replica teeth

    OpenAIRE

    Lee, Keun-Young; Cho, Jin-Woo; Chang, Na-Young; Chae, Jong-Moon; Kang, Kyung-Hwa; Kim, Sang-Cheol; Cho, Jin-Hyoung

    2015-01-01

    Objective Three-dimensional (3D) printing is a recent technological development that may play a significant role in orthodontic diagnosis and treatment. It can be used to fabricate skull models or study models, as well as to make replica teeth in autotransplantation or tooth impaction cases. The aim of this study was to evaluate the accuracy of fabrication of replica teeth made by two types of 3D printing technologies. Methods Fifty extracted molar teeth were selected as samples. They were scanned to generate high-resolution 3D surface model stereolithography files.

  4. Bayesian ensemble refinement by replica simulations and reweighting

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
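
    As a purely illustrative aside, the replica-restraint idea above (a bias on the replica-averaged observable whose strength grows linearly with the number of replicas) can be sketched in a few lines. This is a minimal sketch, not the authors' implementation; the force constant, target value and observables below are placeholders.

```python
import numpy as np

def replica_restraint_energy(observables, target, k0):
    """Harmonic restraint on the replica-averaged observable.

    observables : instantaneous observable value in each replica
    target      : experimental (ensemble-averaged) value
    k0          : base force constant; the effective strength scales
                  linearly with the number of replicas, as required for
                  convergence to the optimal Bayesian ensemble.
    """
    n = len(observables)
    mean_obs = np.mean(observables)
    return 0.5 * n * k0 * (mean_obs - target) ** 2

def replica_restraint_forces(observables, target, k0):
    """Gradient of the restraint with respect to each replica's observable."""
    n = len(observables)
    # d/dO_i [ 0.5*n*k0*(mean - target)^2 ] = k0*(mean - target)
    return np.full(n, k0 * (np.mean(observables) - target))

# Toy usage: 8 replicas reporting a distance-like observable, restrained to 5.0.
obs = np.array([4.2, 5.1, 6.3, 4.8, 5.5, 5.9, 4.4, 5.2])
print(replica_restraint_energy(obs, target=5.0, k0=10.0))
print(replica_restraint_forces(obs, target=5.0, k0=10.0))
```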

  5. Dynamic tables: an architecture for managing evolving, heterogeneous biomedical data in relational database management systems.

    Science.gov (United States)

    Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
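
    To make the contrast concrete, the toy sketch below compares an entity-attribute-value layout with a wide, sparse table using sqlite3 as a stand-in engine. It is only an illustration of the two modeling styles discussed above, not the authors' column-store engine or their benchmark; table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# (1) Entity-attribute-value layout: one row per (patient, attribute) pair.
cur.execute("CREATE TABLE eav (patient_id INTEGER, attribute TEXT, value TEXT)")
cur.executemany(
    "INSERT INTO eav VALUES (?, ?, ?)",
    [(1, "heart_rate", "72"), (1, "genotype_rs123", "AG"), (2, "heart_rate", "80")],
)

# (2) Wide, sparse layout: one column per attribute, NULL where data are absent.
cur.execute(
    "CREATE TABLE wide (patient_id INTEGER PRIMARY KEY, heart_rate INTEGER, genotype_rs123 TEXT)"
)
cur.executemany("INSERT INTO wide VALUES (?, ?, ?)", [(1, 72, "AG"), (2, 80, None)])

# The same question ("heart rate of patient 1") in both layouts:
print(cur.execute(
    "SELECT value FROM eav WHERE patient_id = 1 AND attribute = 'heart_rate'"
).fetchone())
print(cur.execute("SELECT heart_rate FROM wide WHERE patient_id = 1").fetchone())

# Schema evolution: EAV needs no DDL, while the wide table needs ALTER TABLE.
cur.execute("INSERT INTO eav VALUES (1, 'temperature', '36.8')")
cur.execute("ALTER TABLE wide ADD COLUMN temperature REAL")
conn.commit()
```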

  6. ENTERPRISE ARCHITECTURE: AN INTERFACE CONCEPT BETWEEN THE ECONOMICS AND THE MANAGEMENT OF THE FIRM

    Directory of Open Access Journals (Sweden)

    José Carlos Cavalcanti

    2010-01-01

    This paper broadly discusses a subject intended to serve as an interface between the economics and the management of the firm: Enterprise Architecture. The concept is viewed here as the most appropriate means to understand the impact of information content, information systems, and information and communication technologies (ICTs) on the firm's internal technological and organizational choices. The argument proceeds in three main steps. Initially, a brief review of the main (economic and management) theories of the firm highlights their contributions, caveats and convergences. The paper then builds on the notion of the firm as an “engine of information” and on a concept from computing science and engineering, Enterprise Architecture, to argue that these concepts contribute to a more consistent interpretation of what the firm is, and of how it is organized today, when it is practically impossible for a firm to exist without modern information tools. Finally, an innovative methodology is presented, in analogy to the Structure-Conduct-Performance paradigm traditionally used in empirical market analysis, which characterizes the firm along three linearly connected approaches: its architecture, its governance, and its growth strategy.

  8. Fully distributed monitoring architecture supporting multiple trackees and trackers in indoor mobile asset management application.

    Science.gov (United States)

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2014-03-21

    A tracking service like asset management is essential in a dynamic hospital environment consisting of numerous mobile assets (e.g., wheelchairs or infusion pumps) that are continuously relocated throughout a hospital. The tracking service is accomplished based on the key technologies of an indoor location-based service (LBS), such as locating and monitoring multiple mobile targets inside a building in real time. An indoor LBS such as a tracking service entails numerous resource lookups being requested concurrently and frequently from several locations, as well as a network infrastructure requiring support for high scalability in indoor environments. A traditional centralized architecture needs to maintain a geographic map of the entire building or complex in its central server, which can cause low scalability and traffic congestion. This paper presents a self-organizing and fully distributed indoor mobile asset management (MAM) platform, and proposes a real-time architecture for multiple trackees (such as mobile assets) and trackers based on the proposed distributed platform. In order to verify the suggested platform, scalability performance with increasing numbers of concurrent lookups was evaluated in a real test bed. Tracking latency and the traffic load ratio in the proposed tracking architecture were also evaluated.

  9. Replica Fourier Tansforms on Ultrametric Trees, and Block-Diagonalizing Multi-Replica Matrices

    Science.gov (United States)

    de Dominicis, C.; Carlucci, D. M.; Temesvári, T.

    1997-01-01

    The analysis of objects living on ultrametric trees, in particular the block-diagonalization of 4-replica matrices M^{αβ;γδ}, is shown to be dramatically simplified through the introduction of properly chosen operations on those objects: the Replica Fourier Transforms on ultrametric trees. These transformations are defined and used in the present work.

  10. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (QR Code) as a less complex and cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture covers tag generation, image acquisition and pre-processing, product inventory and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.
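
    For orientation, a minimal sketch of the tag-generation and scan-tracking steps of such an architecture is given below. It assumes the third-party `qrcode` Python package for tag generation; the payload fields, file names and in-memory event log are illustrative, and image acquisition and decoding are out of scope here.

```python
import json
from datetime import datetime, timezone

import qrcode  # third-party package: pip install qrcode[pil]

def make_tag(product_id, batch, destination):
    """Encode a small JSON payload into a QR tag image for a shipping unit."""
    payload = json.dumps({"product_id": product_id, "batch": batch, "dest": destination})
    img = qrcode.make(payload)
    img.save(f"tag_{product_id}.png")
    return payload

# In-memory stand-in for the tracking database: one scan event per checkpoint.
scan_events = []

def record_scan(decoded_payload, checkpoint):
    """Called after a reader decodes a tag at a warehouse or checkpoint."""
    event = {
        "tag": json.loads(decoded_payload),
        "checkpoint": checkpoint,
        "time": datetime.now(timezone.utc).isoformat(),
    }
    scan_events.append(event)
    return event

payload = make_tag("SKU-0042", batch="B17", destination="DC-Sao-Paulo")
record_scan(payload, checkpoint="factory-gate")
record_scan(payload, checkpoint="regional-warehouse")
print([e["checkpoint"] for e in scan_events])
```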

  11. Architectural management in the digital arena : proceedings of the CIB-W096 conference Vienna 2011, Vienna University of Technology, Austria, 13-14 October 2011

    NARCIS (Netherlands)

    Otter, den A.F.H.J.; Emmitt, S.; Achammer, Ch.

    2011-01-01

    Research into architectural design management is led by the CIB’s working committee W096, Architectural Management. CIB-W096 was officially established in 1993, following a conference on ‘Architectural Management’ at the University of Nottingham in the UK. Since this time the commission has been

  12. Connecting Architecture and Implementation

    Science.gov (United States)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  13. Application of a Multimedia Service and Resource Management Architecture for Fault Diagnosis.

    Science.gov (United States)

    Castro, Alfonso; Sedano, Andrés A; García, Fco Javier; Villoslada, Eduardo; Villagrá, Víctor A

    2017-12-28

    Nowadays, the complexity of global video products has substantially increased. They are composed of several associated services whose functionalities need to adapt across heterogeneous networks with different technologies and administrative domains. Each of these domains has different operational procedures; therefore, the comprehensive management of multi-domain services presents serious challenges. This paper discusses an approach to service management that links the fault diagnosis system and business processes for Telefónica's global video service. The main contribution of this paper is the proposal of an extended service management architecture based on Multi Agent Systems able to integrate fault diagnosis with other service management functionalities. This architecture includes a distributed set of agents able to coordinate their actions under the umbrella of a Shared Knowledge Plane, inferring and sharing their knowledge with semantic techniques and three types of automatic reasoning: heterogeneous, ontology-based and Bayesian reasoning. This proposal has been deployed and validated in a real scenario in the video service offered by Telefónica Latam.

  14. A cognitive decision agent architecture for optimal energy management of microgrids

    International Nuclear Information System (INIS)

    Velik, Rosemarie; Nicolay, Pascal

    2014-01-01

    Highlights: • We propose an optimization approach for energy management in microgrids. • The optimizer emulates processes involved in human decision making. • Optimization objectives are energy self-consumption and financial gain maximization. • We gain improved optimization results in significantly reduced computation time. - Abstract: Via the integration of renewable energy and storage technologies, buildings have started to change from passive (electricity) consumers to active prosumer microgrids. Along with this development comes a shift from centralized to distributed production and consumption models as well as discussions about the introduction of variable demand–supply-driven grid electricity prices. Together with upcoming ICT and automation technologies, these developments open up space for a wide range of novel energy management and energy trading possibilities to optimally use available energy resources. However, what is considered an optimal energy management and trading strategy heavily depends on the individual objectives and needs of a microgrid operator. Accordingly, elaborating the most suitable strategy for each particular system configuration and operator need can become quite a complex and time-consuming task, which can massively benefit from computational support. In this article, we introduce a bio-inspired cognitive decision agent architecture for optimized, goal-specific energy management in (interconnected) microgrids, which are additionally connected to the main electricity grid. For evaluating the performance of the architecture, a number of test cases are specified targeting objectives like local photovoltaic energy consumption maximization and financial gain maximization. Obtained outcomes are compared against a modified simulated annealing optimization approach in terms of objective achievement and computational effort. Results demonstrate that the cognitive decision agent architecture yields improved optimization results in significantly reduced computation time.

  15. Neurovascular Modeling: Small-Batch Manufacturing of Silicone Vascular Replicas

    Science.gov (United States)

    Chueh, J.Y.; Wakhloo, A.K.; Gounis, M.J.

    2009-01-01

    BACKGROUND AND PURPOSE Realistic, population based cerebrovascular replicas are required for the development of neuroendovascular devices. The objective of this work was to develop an efficient methodology for manufacturing realistic cerebrovascular replicas. MATERIALS AND METHODS Brain MR angiography data from 20 patients were acquired. The centerline of the vasculature was calculated, and geometric parameters were measured to describe quantitatively the internal carotid artery (ICA) siphon. A representative model was created on the basis of the quantitative measurements. Using this virtual model, we designed a mold with core-shell structure and converted it into a physical object by fused-deposit manufacturing. Vascular replicas were created by injection molding of different silicones. Mechanical properties, including the stiffness and luminal coefficient of friction, were measured. RESULTS The average diameter, length, and curvature of the ICA siphon were 4.15 ± 0.09 mm, 22.60 ± 0.79 mm, and 0.34 ± 0.02 mm⁻¹ (average ± standard error of the mean), respectively. From these image datasets, we created a median virtual model, which was transformed into a physical replica by an efficient batch-manufacturing process. The coefficient of friction of the luminal surface of the replica was reduced by up to 55% by using liquid silicone rubber coatings. The modulus ranged from 0.67 to 1.15 MPa compared with 0.42 MPa from human postmortem studies, depending on the material used to make the replica. CONCLUSIONS Population-representative, smooth, and true-to-scale silicone arterial replicas with uniform wall thickness were successfully built for in vitro neurointerventional device-testing by using a batch-manufacturing process. PMID:19321626

  16. Management of cyber physical objects in the future Internet of Things methods, architectures and applications

    CERN Document Server

    Loscri, Valeria; Rovella, Anna; Fortino, Giancarlo

    2016-01-01

    This book focuses on new methods, architectures, and applications for the management of Cyber Physical Objects (CPOs) in the context of the Internet of Things (IoT). It covers a wide range of topics related to CPOs, such as resource management, hardware platforms, communication and control, and control and estimation over networks. It also discusses decentralized, distributed, and cooperative optimization as well as effective discovery, management, and querying of CPOs. Other chapters outline the applications of control, real-time aspects, and software for CPOs and introduce readers to agent-oriented CPOs, communication support for CPOs, real-world deployment of CPOs, and CPOs in Complex Systems. There is a focus on the importance of application of IoT technologies for Smart Cities.

  17. Analysis of control and management plane for hybrid fiber radio architectures

    DEFF Research Database (Denmark)

    Kardaras, Georgios; Pham, Tien Thang; Soler, José

    2010-01-01

    This paper presents the existing Radio over Fiber (RoF) architectures and focuses on the control and management plane of the Remote Antenna Unit (RAU). Broadband wireless standards, such as WiMAX and LTE, incorporate optical technologies following the distributed base station concept. The control and management of the RAU becomes a critical task, since it can facilitate allocation of resources, configuration and upgrade of the remote unit and constant monitoring of its performance. In the case of baseband over fiber, two protocols (OBSAI and CPRI) introduce a well-defined control and management plane. In the case of intermediate/radio frequency over fiber, this paper presents a simple approach, which can provide configurability and real-time monitoring of the RAU over the same optical link. This is realized by multiplexing high frequency user data with baseband frequency control data at the Central Office.

  18. Communications System Architecture Development for Air Traffic Management and Aviation Weather Information Dissemination

    Science.gov (United States)

    Gallagher, Seana; Olson, Matt; Blythe, Doug; Heletz, Jacob; Hamilton, Griff; Kolb, Bill; Homans, Al; Zemrowski, Ken; Decker, Steve; Tegge, Cindy

    2000-01-01

    This document is the NASA AATT Task Order 24 Final Report. NASA Research Task Order 24 calls for the development of eleven distinct task reports. Each task was a necessary exercise in the development of comprehensive communications systems architecture (CSA) for air traffic management and aviation weather information dissemination for 2015, the definition of the interim architecture for 2007, and the transition plan to achieve the desired End State. The eleven tasks are summarized along with the associated Task Order reference. The output of each task was an individual task report. The task reports that make up the main body of this document include Task 5, Task 6, Task 7, Task 8, Task 10, and Task 11. The other tasks provide the supporting detail used in the development of the architecture. These reports are included in the appendices. The detailed user needs, functional communications requirements and engineering requirements associated with Tasks 1, 2, and 3 have been put into a relational database and are provided electronically.

  19. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    Science.gov (United States)

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
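
    As a toy illustration in the spirit of the comparison above, the sketch below stores a few dbSNP-like records relationally (sqlite3 as a stand-in for MySQL/PostgreSQL) and as documents in MongoDB via pymongo. It is not the authors' benchmark; the collection, table and sample records are illustrative, and the MongoDB part assumes a server reachable on localhost.

```python
import sqlite3
from pymongo import MongoClient  # requires a running MongoDB instance

records = [
    {"rsid": "rs6025", "chrom": "1", "pos": 169519049, "gene": "F5"},
    {"rsid": "rs1801133", "chrom": "1", "pos": 11856378, "gene": "MTHFR"},
]

# Relational storage: fixed schema, declarative SQL query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE annotations (rsid TEXT PRIMARY KEY, chrom TEXT, pos INTEGER, gene TEXT)")
cur.executemany("INSERT INTO annotations VALUES (:rsid, :chrom, :pos, :gene)", records)
print(cur.execute("SELECT gene FROM annotations WHERE rsid = 'rs6025'").fetchone())

# Document-oriented storage: schemaless documents, secondary index on rsid.
client = MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)
col = client["genomics"]["annotations"]
col.insert_many([dict(r) for r in records])  # copy so local dicts stay unchanged
col.create_index("rsid")
print(col.find_one({"rsid": "rs6025"}, {"_id": 0, "gene": 1}))
```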

  20. The architecture of psychological management: the Irish asylums (1801-1922).

    Science.gov (United States)

    Reuber, M

    1996-11-01

    This analysis examines some of the psychological, philosophical and sociological motives behind the development of pauper lunatic asylum architecture in Ireland during the time of the Anglo-Irish union (1801-1922). Ground plans and structural features are used to define five psycho-architectonic generations. While isolation and classification were the prime objectives in the first public asylum in Ireland (1810-1814), a combination of the ideas of a psychological, 'moral', management and 'panoptic' architecture led to a radial institutional design during the next phase of construction (1817-1835). The asylums of the third generation (1845-1855) lacked 'panoptic' features but they were still intended to allow a proper 'moral' management of the inmates, and to create a therapeutic family environment. By the time the institutions of the fourth epoch were erected (1862-1869) the 'moral' treatment approach had been given up, and asylums were built to allow a psychological management by 'association'. The last institutions (1894-1922) built before Ireland's acquisition of Dominion status (1922) were intended to foster the development of a curative society.

  1. Every Second Counts: Integrating Edge Computing and Service Oriented Architecture for Automatic Emergency Management

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2018-01-01

    Emergency management has long been recognized as a social challenge due to the criticality of the response time. In emergency situations such as severe traffic accidents, minimizing the response time, which requires close collaboration between all stakeholders involved and distributed intelligence support, leads to a greater survival chance for the injured. However, the current response system is far from efficient, despite the rapid development of information and communication technologies. This paper presents an automated collaboration framework for emergency management that coordinates all stakeholders within the emergency response system and fully automates the rescue process. Applying the concept of multi-access edge computing, as well as the choreography style of service oriented architecture, the system allows seamless coordination between multiple organizations in a distributed way through standard web services. A service choreography is designed to globally model the emergency management process from the time an accident occurs until the rescue is finished. The choreography can be synthesized to generate a detailed specification of the peer-to-peer interaction logic, and this specification can then be enacted and deployed on cloud infrastructures.

  2. Meme media and meme market architectures knowledge media for editing distributing and managing intellectual resources

    CERN Document Server

    Tanaka, Y

    2003-01-01

    "In this book, Yuzuru Tanaka proposes a powerful new paradigm: that knowledge media, or "memes," operate in a way that closely resembles the biological function of genes, with their network publishing repository working as a gene pool to accelerate the evolution of knowledge shared in our societies. In Meme Media and Meme Market Architectures: Knowledge Media for Editing, Distributing, and Managing Intellectual Resources, Tanaka outlines a ready-to-use knowledge media system, supplemented with sample media objects, which allows readers to experience the knowledge media paradigm."--Jacket.

  3. Communications technologies for demand side management, DSM, and European utility communications architecture, EurUCA

    Energy Technology Data Exchange (ETDEWEB)

    Uuspaeae, P. [VTT Energy, Espoo (Finland)

    1996-12-31

    The scope of this research is data communications for electric utilities. Demand Side Management (DSM) calls for communication between the Electric Utility and the Customer. The communication capacity needed will depend on the functions that are chosen for DSM, and on the number of customers. Some functions may be handled with one-way communications, some functions require two-way communication. Utility Communication Architecture looks for an overall view of the communications needs and communication systems in an electric utility. The objective is to define and specify suitable and compatible communications procedures within the Utility and also to outside parties. (27 refs.)

  4. The Management of Manufacturing-Oriented Informatics Systems Using Efficient and Flexible Architectures

    Directory of Open Access Journals (Sweden)

    Constantin Daniel AVRAM

    2011-01-01

    Industry, and in particular the manufacturing-oriented sector, has continuously been the subject of research and innovation as a result of technological progress and the diversification and differentiation of consumer demands. A company that provides its customers with products matching their demands perfectly, at competitive prices, has a great advantage over its competitors. Manufacturing-oriented information systems are becoming more flexible and configurable, and they require integration with the entire organization. This can be achieved using efficient software architectures that allow commercial solutions and open source components to coexist while sharing computing resources organized in grid infrastructures under the governance of powerful management tools.

  5. Multilevel and Hybrid Architecture for Device Abstraction and Context Information Management in Smart Home Environments

    Science.gov (United States)

    Peláez, Víctor; González, Roberto; San Martín, Luis Ángel; Campos, Antonio; Lobato, Vanesa

    Hardware device management and context information acquisition and abstraction are key factors in developing the ambient intelligence paradigm in smart homes. This work presents an architecture that addresses these two problems and provides a usable framework to develop applications easily. In contrast to other proposals, this work addresses performance issues specifically. Results show that the execution performance of the developed prototype is suitable for deployment in a real environment. In addition, the modular design of the system allows the user to develop applications using different techniques and different levels of abstraction.

  6. MANAGEMENT PRACTICES AND INFLUENCES ON IT ARCHITECTURE DECISIONS: A CASE STUDY IN A TELECOM COMPANY

    Directory of Open Access Journals (Sweden)

    Chen Wen Hsing

    2012-12-01

    The study aims to analyze IT architecture management practices, their degree of maturity, and the influence of institutional and strategic factors on the decisions involved, through a case study in a large telecom organization. The case study allowed us to identify practices that led the company to its current stage of maturity and practices that can lead the company to the next stage. Strategic influence was mentioned by most respondents, and institutional influence was present in decisions related to innovation and those dealing with a higher level of uncertainty.

  7. Communications technologies for demand side management, DSM, and European utility communications architecture, EurUCA

    Energy Technology Data Exchange (ETDEWEB)

    Uuspaeae, P [VTT Energy, Espoo (Finland)

    1997-12-31

    The scope of this research is data communications for electric utilities. Demand Side Management (DSM) calls for communication between the Electric Utility and the Customer. The communication capacity needed will depend on the functions that are chosen for DSM, and on the number of customers. Some functions may be handled with one-way communications, some functions require two-way communication. Utility Communication Architecture looks for an overall view of the communications needs and communication systems in an electric utility. The objective is to define and specify suitable and compatible communications procedures within the Utility and also to outside parties. (27 refs.)

  8. Accuracy of three-dimensional printing for manufacturing replica teeth.

    Science.gov (United States)

    Lee, Keun-Young; Cho, Jin-Woo; Chang, Na-Young; Chae, Jong-Moon; Kang, Kyung-Hwa; Kim, Sang-Cheol; Cho, Jin-Hyoung

    2015-09-01

    Three-dimensional (3D) printing is a recent technological development that may play a significant role in orthodontic diagnosis and treatment. It can be used to fabricate skull models or study models, as well as to make replica teeth in autotransplantation or tooth impaction cases. The aim of this study was to evaluate the accuracy of fabrication of replica teeth made by two types of 3D printing technologies. Fifty extracted molar teeth were selected as samples. They were scanned to generate high-resolution 3D surface model stereolithography files. These files were converted into physical models using two types of 3D printing technologies: Fused deposition modeling (FDM) and PolyJet technology. All replica teeth were scanned and 3D images generated. Computer software compared the replica teeth to the original teeth with linear measurements, volumetric measurements, and mean deviation measurements with best-fit alignment. Paired t-tests were used to statistically analyze the measurements. Most measurements of teeth formed using FDM tended to be slightly smaller, while those of the PolyJet replicas tended to be slightly larger, than those of the extracted teeth. Mean deviation measurements with best-fit alignment of FDM and PolyJet group were 0.047 mm and 0.038 mm, respectively. Although there were statistically significant differences, they were regarded as clinically insignificant. This study confirms that FDM and PolyJet technologies are accurate enough to be usable in orthodontic diagnosis and treatment.
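
    The paired comparison used in this study (original tooth vs. replica measurements of the same specimens) can be illustrated with scipy as below. The numbers are synthetic placeholders for illustration only, not the study's data.

```python
import numpy as np
from scipy import stats

# Synthetic mesiodistal widths (mm) for five teeth: original scan vs. FDM replica.
original = np.array([10.42, 9.87, 11.03, 10.15, 9.64])
replica_fdm = np.array([10.38, 9.81, 10.97, 10.11, 9.60])

t_stat, p_value = stats.ttest_rel(original, replica_fdm)
mean_dev = np.mean(np.abs(original - replica_fdm))

print(f"paired t = {t_stat:.3f}, p = {p_value:.4f}")
print(f"mean absolute deviation = {mean_dev:.3f} mm")
# A statistically significant p-value does not by itself imply a clinically
# relevant difference; the study judges deviations of a few hundredths of a
# millimetre to be clinically insignificant.
```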

  9. 2005 dossier: clay. Tome: architecture and management of the geologic disposal facility

    International Nuclear Information System (INIS)

    2005-01-01

    This document reviews the status of the research carried out by the French national agency for radioactive waste management (ANDRA) on the design of a geologic disposal facility for high-level and long-lived radioactive wastes in argilite formations. Content: 1 - approach of the study: goal, main steps of the design study, iterative approach, content; 2 - general description: high-level and long-lived radioactive wastes, purposes of a reversible disposal, geologic context of the Meuse/Haute-Marne site - the Callovo-Oxfordian formation, design principles of the disposal facility architecture, role of the different disposal components; 3 - high-level and long-lived wastes: production scenarios, description of primary containers, inventory model, hypotheses about receipt fluxes of primary containers; 4 - disposal containers: B-type waste containers, C-type waste containers, spent fuel disposal containers; 5 - disposal modules: B-type waste disposal modules, C-type waste disposal modules, spent-fuel disposal modules; 6 - overall underground architecture: main safety questions, overall design, dimensioning factors, construction logic and overall exploitation of the facility, dimensioning of galleries, underground architecture adaptation to different scenarios; 7 - boreholes and galleries: general needs, design principles retained, boreholes description, galleries description, building up of boreholes and galleries, durability of facilities, backfilling and sealing up of boreholes and galleries; 8 - surface facilities: general organization, nuclear area, industrial and administrative area, tailings area; 9 - nuclear exploitation means of the facility: receipt of primary containers and preparation of disposal containers, transfer of disposal containers from the surface to the disposal alveoles, setting up of containers inside alveoles; 10 - reversible management of the disposal: step by step disposal process, mastery of disposal behaviour and action capacity, observation and

  10. An integration architecture for knowledge management system and business process management system

    NARCIS (Netherlands)

    Jung, J.; Choi, I.; Song, M.S.

    2007-01-01

    Recently, interest in the notion of process-oriented knowledge management (PKM) from academia and industry has increased significantly. Comprehensive research and development requirements along with a cogent framework, however, have not been proposed for integrating knowledge management (KM) and business process management (BPM).

  11. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Tie-Jiun [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Gao, Jun [INPAC, Shanghai Key Laboratory for Particle Physics and Cosmology,Department of Physics and Astronomy, Shanghai Jiao-Tong University, Shanghai 200240 (China); High Energy Physics Division, Argonne National Laboratory,Argonne, Illinois, 60439 (United States); Huston, Joey [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Nadolsky, Pavel [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Schmidt, Carl; Stump, Daniel [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Wang, Bo-Ting; Xie, Ke Ping [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Dulat, Sayipjamal [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); School of Physics Science and Technology, Xinjiang University,Urumqi, Xinjiang 830046 (China); Center for Theoretical Physics, Xinjiang University,Urumqi, Xinjiang 830046 (China); Pumplin, Jon; Yuan, C.P. [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States)

    2017-03-20

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A correction is proposed to address a bias in asymmetric uncertainties introduced by the Taylor series approximation. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
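
    For orientation, the basic Hessian-to-Monte-Carlo conversion can be sketched with the simplest (symmetric, normally distributed) sampling over eigenvector directions, as below. The paper's refinements — asymmetry- and positivity-preserving sampling and the Taylor-series bias correction — are not reproduced here, and the arrays are placeholders, not CT14 values.

```python
import numpy as np

def hessian_to_mc_replicas(f0, f_plus, f_minus, n_replicas, seed=0):
    """Convert Hessian PDF error sets into Monte-Carlo replicas.

    f0      : central PDF values on some grid, shape (n_points,)
    f_plus  : "+" eigenvector-direction sets, shape (n_eig, n_points)
    f_minus : "-" eigenvector-direction sets, shape (n_eig, n_points)

    Each replica is  f0 + sum_i r_i * (f_i^+ - f_i^-) / 2  with r_i ~ N(0, 1),
    the simplest symmetric normal sampling; asymmetric or log-normal variants
    change how r_i weights the two directions.
    """
    rng = np.random.default_rng(seed)
    half_diff = 0.5 * (f_plus - f_minus)           # shape (n_eig, n_points)
    r = rng.standard_normal((n_replicas, half_diff.shape[0]))
    return f0 + r @ half_diff                       # shape (n_replicas, n_points)

# Placeholder numbers only: 3 eigenvector directions, 4 grid points.
f0 = np.array([1.00, 0.80, 0.50, 0.20])
f_plus = f0 + 0.05 * np.arange(1, 13).reshape(3, 4) / 12
f_minus = f0 - 0.04 * np.arange(1, 13).reshape(3, 4) / 12
replicas = hessian_to_mc_replicas(f0, f_plus, f_minus, n_replicas=1000)
print(replicas.mean(axis=0), replicas.std(axis=0))
```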

  12. Standard practice for production and evaluation of field metallographic replicas

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2001-01-01

    1.1 This practice covers recognized methods for the preparation and evaluation of cellulose acetate or plastic film replicas which have been obtained from metallographically prepared surfaces. It is designed for the evaluation of replicas to ensure that all significant features of a metallographically prepared surface have been duplicated and preserved on the replica with sufficient detail to permit both LM and SEM examination with optimum resolution and sensitivity. 1.2 This practice may be used as a controlling document in commercial situations. 1.3 The values stated in SI units are to be regarded as the standard. Inch-pound units given in parentheses are for information only. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  13. Procurement of Architectural and Engineering Services for Sustainable Buildings: A Guide for Federal Project Managers

    Energy Technology Data Exchange (ETDEWEB)

    2004-06-01

    This guide was prepared to be a resource for federal construction project managers and others who want to integrate the principles of sustainable design into the procurement of professional building design and consulting services. To economize on energy costs and improve the safety, comfort, and health of building occupants, building design teams can incorporate daylighting, energy efficiency, renewable energy, and passive solar design into all projects in which these elements are technically and economically feasible. The information presented here will help project leaders begin the process and manage the inclusion of sustainable design in the procurement process. The section on establishing selection criteria contains key elements to consider before selecting an architectural and engineering (A/E) firm. The section on preparing the statement of work discusses the broad spectrum of sustainable design services that an A/E firm can provide. Several helpful checklists are included.

  14. An Enhanced System Architecture for Optimized Demand Side Management in Smart Grid

    Directory of Open Access Journals (Sweden)

    Anzar Mahmood

    2016-04-01

    Demand Side Management (DSM) through optimization of home energy consumption in the smart grid environment is now one of the well-known research areas. Appliance scheduling has been done through many different algorithms to reduce peak load and, consequently, the Peak to Average Ratio (PAR). This paper presents a Comprehensive Home Energy Management Architecture (CHEMA) with integration of multiple appliance scheduling options and enhanced load categorization in a smart grid environment. The CHEMA model consists of six layers and has been modeled in Simulink with embedded MATLAB code. A single knapsack optimization technique is used for scheduling, and four different cases of cost reduction are modeled at the second layer of CHEMA. Fault identification and electricity theft control have also been added in CHEMA. Furthermore, carbon footprint calculations have been incorporated in order to make users aware of environmental concerns. Simulation results prove the effectiveness of the proposed model.
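
    To illustrate the knapsack-style scheduling layer in general terms, the sketch below picks the subset of deferrable appliances that maximizes cost savings within a per-slot power budget using standard 0/1 knapsack dynamic programming. The appliance names, powers and savings are illustrative; this is not CHEMA's Simulink model.

```python
def knapsack_schedule(appliances, power_budget):
    """0/1 knapsack: maximise cost saving under a per-slot power budget (watts)."""
    # best[w] = (saving, chosen appliance names) achievable within budget w
    best = [(0.0, []) for _ in range(power_budget + 1)]
    for name, power, saving in appliances:
        for w in range(power_budget, power - 1, -1):
            candidate = best[w - power][0] + saving
            if candidate > best[w][0]:
                best[w] = (candidate, best[w - power][1] + [name])
    return best[power_budget]

# (name, rated power in W, cost saving if shifted into this off-peak slot)
deferrable = [
    ("washing_machine", 500, 1.20),
    ("dishwasher", 1200, 2.10),
    ("water_heater", 2000, 3.50),
    ("ev_charger", 3300, 6.00),
]
saving, chosen = knapsack_schedule(deferrable, power_budget=4000)
print(f"schedule {chosen} for a saving of {saving:.2f}")
```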

  15. Patrol Detection for Replica Attacks on Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2011-02-01

    Replica attack is a critical concern in the security of wireless sensor networks. We employ mobile nodes as patrollers to detect replicas distributed in different zones in a network, in which a basic patrol detection protocol and two detection algorithms for stationary and mobile modes are presented. Then we perform security analysis to discuss the defense strategies against the possible attacks on the proposed detection protocol. Moreover, we show the advantages of the proposed protocol by discussing and comparing the communication cost and detection probability with some existing methods.
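
    A simplified illustration of the kind of conflict check a patroller can perform is sketched below: location claims gathered while moving through zones are compared, and a node ID is flagged as replicated when two claims imply an impossible speed. The thresholds and claim format are invented for the example, and signature verification and zone logic from the paper's protocol are omitted.

```python
import math

MAX_SPEED = 5.0  # m/s; faster apparent movement of a single ID suggests a clone

def is_conflict(claim_a, claim_b, max_speed=MAX_SPEED):
    """Two claims for the same node ID conflict if the implied speed exceeds
    what a single stationary or slowly mobile sensor could achieve."""
    (xa, ya, ta), (xb, yb, tb) = claim_a, claim_b
    dt = abs(tb - ta)
    dist = math.hypot(xb - xa, yb - ya)
    return (dt == 0 and dist > 0) or (dt > 0 and dist / dt > max_speed)

def patrol_detect(claims):
    """claims: iterable of (node_id, x, y, t) collected by the patroller."""
    seen = {}            # node_id -> list of (x, y, t)
    replicas = set()
    for node_id, x, y, t in claims:
        for old in seen.setdefault(node_id, []):
            if is_conflict(old, (x, y, t)):
                replicas.add(node_id)
        seen[node_id].append((x, y, t))
    return replicas

claims = [
    ("n7", 10.0, 12.0, 100.0),    # observed in zone A
    ("n9", 40.0, 5.0, 130.0),
    ("n7", 480.0, 300.0, 160.0),  # zone C: ~550 m in 60 s -> cloned ID
]
print(patrol_detect(claims))      # {'n7'}
```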

  16. An agent based architecture for high-risk neonate management at neonatal intensive care unit.

    Science.gov (United States)

    Malak, Jaleh Shoshtarian; Safdari, Reza; Zeraati, Hojjat; Nayeri, Fatemeh Sadat; Mohammadzadeh, Niloofar; Farajollah, Seide Sedighe Seied

    2018-01-01

    In recent years, the use of new tools and technologies has decreased the neonatal mortality rate. Despite the positive effect of using these technologies, the decisions are complex and uncertain in critical conditions when the neonate is preterm or has a low birth weight or malformations. There is a need to automate the high-risk neonate management process by creating real-time and more precise decision support tools. The aim was to create a collaborative and real-time environment to manage neonates with critical conditions at the NICU (Neonatal Intensive Care Unit) and to overcome high-risk neonate management weaknesses by applying a multi-agent-based analysis and design methodology as a new solution for NICU management. This study was basic research in medical informatics method development, carried out in 2017. The requirement analysis was done by reviewing articles on NICU Decision Support Systems. PubMed, Science Direct, and IEEE databases were searched. Only English articles published after 1990 were included; also, a needs assessment was done by reviewing the extracted features and current processes in the NICU environment where the research was conducted. We analyzed the requirements and identified the main system roles (agents) and interactions by a comparative study of existing NICU decision support systems. The Universal Multi Agent Platform (UMAP) was applied to implement a prototype of our multi-agent-based high-risk neonate management architecture. Local environment agents interacted inside a container and each container interacted with external resources, including other NICU systems and consultation centers. In the NICU container, the main identified agents were reception, monitoring, NICU registry, and outcome prediction, which interacted with human agents including nurses and physicians. Managing patients at the NICU requires online data collection, real-time collaboration, and management of many components. Multi agent systems are applied as

  17. EVALUATION OF UTILIZING SERVICE ORIENTED ARCHITECTURE AS A SUITABLE SOLUTION TO ALIGN UNIVERSITY MANAGEMENT INFORMATION SYSTEMS AND LEARNING MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. M. RIAD

    2009-10-01

    To help universities achieve their goals, it is important to align managerial functionalities side by side with educational aspects. Universities use University Management Information Systems (UMIS) to handle managerial aspects, just as they use Learning Management Systems (LMS) to achieve learning objectives. UMIS predates LMS by decades and has reached a stable and mature level of consistency. Compared to UMIS, LMS is the newly acquired solution in universities, and adopting an LMS can be approached via three different deployment strategies. The first approach believes in the ability of the LMS to replace the UMIS and perform its functionalities. The second approach presents the idea of extending the UMIS to include LMS functionalities. The third approach arises from the shortcomings of the two previous approaches and presents integration between both as the appropriate deployment approach. Service Oriented Architecture (SOA) is a design pattern that can be used as a suitable architectural solution to align UMIS and LMS. SOA can be utilized in universities to overcome some information system challenges, such as the integration between UMIS and LMS. This paper presents the current situation at Mansoura University, Egypt, presents integration as the most suitable solution, and evaluates three different implementation techniques: dynamic query, stored procedure, and Web services. The evaluation concludes that, although SOA enhanced many different aspects of both UMIS and LMS, and consequently of the university overall, it is not recommended to adopt SOA via Web services as the building unit of the system, but rather as the interdisciplinary interface between systems.

  18. The Essential Leadership Role of Senior Management in Adopting Architectural Management and Modular Strategies (AMMS), with Perspectives on Experiences of European Automotive Firms

    DEFF Research Database (Denmark)

    Sanchez, Ron

    2015-01-01

    The potential benefits of architectural approaches to developing new products and of using modular architectures as the basis for new kinds of product strategies have been recognized since the 1990s and elaborated at some length in management research. Relatively little attention has been paid, however, to the fundamental changes in management and organizational processes a firm must undergo in order to implement architectural management and modular strategies ("AMMS") successfully. A common misperception among some senior managers is that implementing AMMS involves primarily some technical... through the critical organizational and managerial changes required to implement and use AMMS effectively. This paper also suggests that there are two fundamentally different management approaches to leading the organizational change process needed to implement AMMS. We characterize...

  19. Design management in the architectural engineering and construction sector : proceedings of the joint CIB W096 Architectural Management and CIB TG49. Architectural Engineering Conference held in conjunction with the 8th Brazilian Workshop on Building Design Management, University of Sao Paulo, 4-8 December 2008

    NARCIS (Netherlands)

    Melhado, S.; Prins, M.; Emmitt, S.; Bouchlaghem, D.; Otter, den A.F.H.J.

    2008-01-01

    Following the Denmark meeting, held in Lyngby 2005, the CIB W096 commission on Architectural Management merged its own meetings with two large events, the Adaptables Conference in Eindhoven 2006, and the CIB world Conference in Cape Town in 2007. Papers were invited under the theme Design Management

  20. Knowledge Management Practice in Two Australian Architecture-Engineering-Construction (AEC Companies

    Directory of Open Access Journals (Sweden)

    Patrick Zou

    2012-11-01

    Knowledge management (KM) could be described as a management system that supports the creation, sharing and retrieval of valued information, expertise and insight within and across communities of people and related organizations using information and communication technologies; hence, it is a combination of the effective application of information technology and the management of human resources. KM is becoming a core competitive factor in construction operations. This paper presents the results of two case studies of KM practices in large AEC (architecture, engineering and construction) companies, carried out through desk-top study and semi-structured interviews. The results indicate that implementing KM in AEC companies leads to competitive advantages and improved decision-making, problem solving and business performance. The results also indicate that while technology plays an important role, top management commitment, total employee involvement, performance assessment and a culture of knowledge learning and sharing must be considered when implementing KM. It is therefore suggested that the implementation of KM should incorporate the company's vision, work processes, technology and culture, to improve the ability of knowledge creating, capturing, sharing and retrieving and, ultimately, to improve the company's competitive advantage, decision making, problem solving and innovation.

  1. Replica-Based High-Performance Tuple Space Computing

    DEFF Research Database (Denmark)

    Andric, Marina; De Nicola, Rocco; Lluch Lafuente, Alberto

    2015-01-01

    of concurrency and data access. We investigate issues related to replica consistency, provide an operational semantics that guides the implementation of the language, and discuss the main synchronization mechanisms of our prototypical run-time framework. Finally, we provide a performance analysis, which includes...

  2. Replica scaling studies of hard missile impacts on reinforced concrete

    International Nuclear Information System (INIS)

    Barr, P.; Carter, P.G.; Howe, W.D.; Neilson, A.J.

    1982-01-01

    Missile and target combinations at three different linear scales have been used in an experimental assessment of the applicability of replica scaling to the dynamic behaviour of reinforced concrete structures impacted by rigid missiles. Experimental results are presented for models with relative linear scales of 1, 0.37 and 0.12. (orig.)

  3. Multiscale implementation of infinite-swap replica exchange molecular dynamics.

    Science.gov (United States)

    Yu, Tang-Qing; Lu, Jianfeng; Abrams, Cameron F; Vanden-Eijnden, Eric

    2016-10-18

    Replica exchange molecular dynamics (REMD) is a popular method to accelerate conformational sampling of complex molecular systems. The idea is to run several replicas of the system in parallel at different temperatures that are swapped periodically. These swaps are typically attempted every few MD steps and accepted or rejected according to a Metropolis-Hastings criterion. This guarantees that the joint distribution of the composite system of replicas is the normalized sum of the symmetrized product of the canonical distributions of these replicas at the different temperatures. Here we propose a different implementation of REMD in which (i) the swaps obey a continuous-time Markov jump process implemented via Gillespie's stochastic simulation algorithm (SSA), which also samples exactly the aforementioned joint distribution and has the advantage of being rejection free, and (ii) this REMD-SSA is combined with the heterogeneous multiscale method to accelerate the rate of the swaps and reach the so-called infinite-swap limit that is known to optimize sampling efficiency. The method is easy to implement and can be trivially parallelized. Here we illustrate its accuracy and efficiency on the examples of alanine dipeptide in vacuum and C-terminal β-hairpin of protein G in explicit solvent. In this latter example, our results indicate that the landscape of the protein is a triple funnel with two folded structures and one misfolded structure that are stabilized by H-bonds.
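
    For context, the conventional Metropolis-Hastings swap criterion that the authors replace with a rejection-free, rate-based (Gillespie/SSA) scheme is sketched below: replicas i and j exchange configurations with probability min(1, exp[(β_i − β_j)(E_i − E_j)]). The temperature ladder and energies are illustrative, and the infinite-swap machinery itself is not reproduced here.

```python
import numpy as np

K_B = 0.0019872  # Boltzmann constant in kcal/(mol K), MD-style units

def attempt_swap(energies, temps, i, j, rng):
    """Standard Metropolis-Hastings swap test between replicas i and j.

    Accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
    The SSA / infinite-swap formulation replaces this accept-reject step with
    continuous-time, rejection-free jumps between replica permutations.
    """
    beta_i, beta_j = 1.0 / (K_B * temps[i]), 1.0 / (K_B * temps[j])
    delta = (beta_i - beta_j) * (energies[i] - energies[j])
    if delta >= 0 or rng.random() < np.exp(delta):
        # Exchange the configurations held at the two temperature slots
        # (represented here only by their instantaneous potential energies).
        energies[i], energies[j] = energies[j], energies[i]
        return True
    return False

rng = np.random.default_rng(1)
temps = np.array([300.0, 330.0, 363.0, 400.0])        # temperature ladder (K)
energies = np.array([-120.0, -112.0, -103.0, -95.0])  # illustrative energies
for step in range(5):
    i = rng.integers(0, len(temps) - 1)               # pick a neighbouring pair
    print(step, i, attempt_swap(energies, temps, i, i + 1, rng))
```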

  4. Enterprise-wide PACS: beyond radiology, an architecture to manage all medical images.

    Science.gov (United States)

    Bandon, David; Lovis, Christian; Geissbühler, Antoine; Vallée, Jean-Paul

    2005-08-01

    Picture archiving and communication systems (PACS) have the vocation to manage all medical images acquired within the hospital. To address the various situations encountered in the imaging specialties, the traditional architecture used for the radiology department has to evolve. We present our preliminary results toward an enterprise-wide PACS intended to support all kinds of image production in medicine, from biomolecular images to whole-body pictures. Our solution is based on an existing radiologic PACS system from which images are distributed through an electronic patient record to all care facilities. This platform is enriched with a flexible integration framework supporting Digital Imaging and Communications in Medicine (DICOM) and DICOM-XML formats. In addition, a generic, highly customizable workflow engine is used to drive work processes. Echocardiology; hematology; ear, nose, and throat; and dermatology, including wound follow-up, are the first extensions implemented outside of radiology. We also propose a global strategy for further developments based on three possible architectures for an enterprise-wide PACS.

  5. Architecture and Patterns for IT Service Management, Resource Planning, and Governance Making Shoes for the Cobbler's Children

    CERN Document Server

    Betz, Charles T

    2011-01-01

    Information technology supports efficient operations, enterprise integration, and seamless value delivery, yet itself is too often inefficient, un-integrated, and of unclear value. This completely rewritten version of the bestselling Architecture and Patterns for IT Service Management, Resource Planning and Governance retains the original (and still unique) approach: apply the discipline of enterprise architecture to the business of large scale IT management itself. Author Charles Betz applies his deep practitioner experience to a critical reading of ITIL 2011, COBIT version 4, the CMMI suite

  6. A Novel Architecture of Metadata Management System Based on Intelligent Cache

    Institute of Scientific and Technical Information of China (English)

    SONG Baoyan; ZHAO Hongwei; WANG Yan; GAO Nan; XU Jin

    2006-01-01

    This paper introduces a novel architecture for a metadata management system based on an intelligent cache, called the Metadata Intelligent Cache Controller (MICC). By using an intelligent cache to control the metadata system, MICC can deal with different scenarios, such as splitting and merging queries into sub-queries for metadata sets available locally, in order to reduce the access time of remote queries. An application can obtain partial results from the local cache while the remaining portion of the metadata is fetched from remote locations. Using the existing metadata, MICC can not only enhance the fault tolerance and load balancing of the system effectively, but also improve access efficiency while ensuring access quality.
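
    A minimal sketch of this split-and-merge behaviour is given below: a metadata query over several keys is answered partly from the local cache, and only the missing keys are forwarded as a remote sub-query. The class, function and key names are illustrative, not MICC's actual interfaces.

```python
class MetadataCache:
    """Answers what it can locally and forwards only the missing keys."""

    def __init__(self, fetch_remote):
        self._store = {}                 # key -> metadata record
        self._fetch_remote = fetch_remote

    def query(self, keys):
        local_hits = {k: self._store[k] for k in keys if k in self._store}
        missing = [k for k in keys if k not in self._store]
        remote_hits = self._fetch_remote(missing) if missing else {}
        self._store.update(remote_hits)  # populate the cache for next time
        return {**local_hits, **remote_hits}

# Stand-in for the remote metadata catalogue.
REMOTE_CATALOGUE = {"f1": {"size": 10}, "f2": {"size": 20}, "f3": {"size": 30}}

def fetch_remote(keys):
    print(f"remote sub-query for {keys}")
    return {k: REMOTE_CATALOGUE[k] for k in keys}

cache = MetadataCache(fetch_remote)
cache.query(["f1", "f2"])   # remote sub-query for ['f1', 'f2']
cache.query(["f1", "f3"])   # remote sub-query for ['f3'] only; f1 served locally
```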

  7. Hybrid Three-Phase/Single-Phase Microgrid Architecture with Power Management Capabilities

    DEFF Research Database (Denmark)

    Sun, Qiuye; Zhou, Jianguo; Guerrero, Josep M.

    2015-01-01

    With the fast proliferation of single-phase distributed generation (DG) units and loads integrated into residential microgrids, independent power sharing per phase and full use of the energy generated by DGs have become crucial. To address these issues, this paper proposes a hybrid microgrid architecture and its power management strategy. In this microgrid structure, a power sharing unit (PSU), composed of three single-phase back-to-back (SPBTB) converters, is proposed to be installed at the point of common coupling (PCC). The aim of the PSU is mainly to realize the power exchange and coordinated control of load power sharing among phases, as well as to allow full utilization of the energy generated by DGs. Meanwhile, the method combining the modified adaptive backstepping-sliding mode control approach and droop control is also proposed to design the SPBTB system controllers. With the application...
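
    As background, the conventional droop relations typically combined with such converter-level controllers can be illustrated in a few lines: each phase's frequency and voltage references droop in proportion to its active and reactive load. The coefficients, set-points and per-phase loads below are illustrative; this is not the paper's backstepping-sliding-mode design.

```python
def droop_setpoints(p_kw, q_kvar, f0=50.0, v0=230.0, m=0.02, n=0.5):
    """Conventional P-f / Q-V droop:  f = f0 - m*P,  V = V0 - n*Q."""
    return f0 - m * p_kw, v0 - n * q_kvar

# Unequal single-phase loads on phases a, b, c of a residential microgrid.
loads = {"a": (3.0, 0.8), "b": (1.5, 0.2), "c": (4.2, 1.1)}  # (kW, kvar)
for phase, (p, q) in loads.items():
    f_ref, v_ref = droop_setpoints(p, q)
    print(f"phase {phase}: f_ref = {f_ref:.3f} Hz, V_ref = {v_ref:.1f} V")
# The more heavily loaded phase droops more; a power-sharing unit between
# phases can then shift power to re-balance the references.
```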

  8. A block chain based architecture for asset management in coalition operations

    Science.gov (United States)

    Verma, Dinesh; Desai, Nirmit; Preece, Alun; Taylor, Ian

    2017-05-01

    To support dynamic communities of interests in coalition operations, new architectures for efficient sharing of ISR assets are needed. The use of blockchain technology in wired business environments, such as digital currency systems, offers an interesting solution by creating a way to maintain a distributed shared ledger without requiring a single trusted authority. In this paper, we discuss how a blockchain-based system can be modified to provide a solution for dynamic asset sharing amongst coalition members, enabling the creation of a logically centralized asset management system by a seamless policy-compliant federation of different coalition systems. We discuss the use of blockchain for three different types of assets in a coalition context, showing how blockchain can offer a suitable solution for sharing assets in those environments. We also discuss the limitations in the current implementations of blockchain which need to be overcome for the technology to become more effective in a decentralized tactical edge environment.
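
    As a rough illustration of the distributed-ledger idea invoked here, the sketch below chains asset-sharing records by hashing each entry together with the previous entry's hash. It is a toy example, not the coalition system described in the paper, and it omits consensus, policy checks, and distribution entirely.

```python
# Toy append-only ledger: each block commits to the previous one via its hash.
# This only illustrates tamper-evidence, not the full blockchain machinery.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, record):
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev_hash, "record": record})
    return chain

def verify(chain):
    # A modified block breaks the hash link to its successor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"asset": "UAV-7", "lent_to": "coalition-partner-A"})
append_block(ledger, {"asset": "sensor-3", "lent_to": "coalition-partner-B"})
print(verify(ledger))  # True until someone edits an earlier record
```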

  9. Architecture of a consent management suite and integration into IHE-based Regional Health Information Networks.

    Science.gov (United States)

    Heinze, Oliver; Birkle, Markus; Köster, Lennart; Bergh, Björn

    2011-10-04

    The University Hospital Heidelberg is implementing a Regional Health Information Network (RHIN) in the Rhine-Neckar-Region in order to establish a shared-care environment, which is based on established Health IT standards and in particular Integrating the Healthcare Enterprise (IHE). Similar to all other Electronic Health Record (EHR) and Personal Health Record (PHR) approaches, the chosen Personal Electronic Health Record (PEHR) architecture relies on the patient's consent in order to share documents and medical data with other care delivery organizations, with the additional requirement that the German legislation explicitly demands a patient's opt-in and does not allow opt-out solutions. This creates two issues: firstly, the current IHE consent profile does not address this approach properly, and secondly, none of the employed intra- and inter-institutional information systems, like almost all systems on the market, offers consent management solutions at all. Hence, the objective of our work is to develop and introduce an extensible architecture for creating, managing and querying patient consents in an IHE-based environment. Based on the features offered by the IHE profile Basic Patient Privacy Consent (BPPC) and literature, the functionalities and components to meet the requirements of a centralized opt-in consent management solution compliant with German legislation have been analyzed. Two services have been developed and integrated into the Heidelberg PEHR. The standards-based Consent Management Suite consists of two services. The Consent Management Service is able to receive and store consent documents. It can receive queries concerning a dedicated patient consent, process them and return an answer. It represents a centralized policy enforcement point. The Consent Creator Service allows patients to create their consents electronically. Interfaces to a Master Patient Index (MPI) and a provider index allow the dynamic generation of XACML-based policies, which are
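
    The opt-in requirement described above essentially means that, absent an explicit consent record, access must be denied. The sketch below shows that policy-enforcement logic in miniature; the function and record layout are invented for illustration and are not the actual Consent Management Service or IHE/BPPC interfaces.

```python
# Minimal illustration of an opt-in policy decision: no stored consent => deny.
# The data layout and function names are hypothetical, not the IHE/BPPC interfaces.
from datetime import date

consents = {
    # patient_id -> consent records created via a (hypothetical) consent creator
    "patient-42": [{"purpose": "treatment", "organisation": "clinic-A",
                    "valid_until": date(2026, 12, 31)}],
}

def is_access_permitted(patient_id, organisation, purpose, today=None):
    today = today or date.today()
    for consent in consents.get(patient_id, []):   # opt-in: the default is deny
        if (consent["organisation"] == organisation
                and consent["purpose"] == purpose
                and consent["valid_until"] >= today):
            return True
    return False

print(is_access_permitted("patient-42", "clinic-A", "treatment"))  # True
print(is_access_permitted("patient-42", "clinic-B", "research"))   # False
```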

  10. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management.

    Science.gov (United States)

    Luo, Qiang; Ma, Yina; Bhatt, Meghana A; Montague, P Read; Feng, Jianfeng

    2017-01-01

    Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent noise (SDN), which captured the directional connectivity underlying the impression management during the bargaining game. We found that the sophisticated strategists engaged stronger directional connectivity from both dorsal anterior cingulate cortex and retrosplenial cortex to rostral prefrontal cortex, and the strengths of these directional influences were associated with higher levels of deception during the game. Using the directional connectivity as a neural signature, we identified strategic deception with 80% accuracy using a machine-learning classifier. These results suggest that different social strategies are supported by distinct patterns of directional connectivity among key brain regions for social cognition.
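
    To give a concrete sense of how directional-connectivity strengths can serve as features for classifying strategy types, the sketch below trains a generic scikit-learn classifier on synthetic feature vectors. It uses random data and a logistic-regression model purely for illustration; it does not reproduce the study's SDN Granger-causality pipeline or its 80% result.

```python
# Illustrative only: classify "strategic deceiver" vs. "other" from synthetic
# connectivity-strength features (e.g., dACC->rPFC, RSC->rPFC). Not the paper's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 80
# Two hypothetical directional-connectivity strengths per participant.
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.7, size=n) > 0).astype(int)

clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```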

  11. Architecture of security management unit for safe hosting of multiple agents

    Science.gov (United States)

    Gilmont, Tanguy; Legat, Jean-Didier; Quisquater, Jean-Jacques

    1999-04-01

    In such growing areas as remote applications in large public networks, electronic commerce, digital signature, intellectual property and copyright protection, and even operating system extensibility, the hardware security level offered by existing processors is insufficient. They lack protection mechanisms that prevent the user from tampering with critical data owned by those applications. Some devices are exceptions, but lack the processing power and memory to stand up to such applications (e.g., smart cards). This paper proposes an architecture for a secure processor, in which the classical memory management unit is extended into a new security management unit. It allows ciphered code execution and ciphered data processing. An internal permanent memory can store cipher keys and critical data for several client agents simultaneously. The ordinary supervisor privilege scheme is replaced by a privilege inheritance mechanism that is more suited to operating system extensibility. The result is a secure processor that has hardware support for extensible multitask operating systems, and can be used for both general applications and critical applications needing strong protection. The security management unit and the internal permanent memory can be added to an existing CPU core without loss of performance, and do not require it to be modified.

  12. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management

    Directory of Open Access Journals (Sweden)

    Qiang Luo

    2017-11-01

    Full Text Available Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent noise (SDN), which captured the directional connectivity underlying the impression management during the bargaining game. We found that the sophisticated strategists engaged stronger directional connectivity from both dorsal anterior cingulate cortex and retrosplenial cortex to rostral prefrontal cortex, and the strengths of these directional influences were associated with higher levels of deception during the game. Using the directional connectivity as a neural signature, we identified strategic deception with 80% accuracy using a machine-learning classifier. These results suggest that different social strategies are supported by distinct patterns of directional connectivity among key brain regions for social cognition.

  13. A Hybrid Three Layer Architecture for Fire Agent Management in Rescue Simulation Environment

    Directory of Open Access Journals (Sweden)

    Alborz Geramifard

    2008-11-01

    Full Text Available This paper presents a new architecture called FAIS for implementing intelligent agents cooperating in a special Multi Agent environment, namely the RoboCup Rescue Simulation System. This is a layered architecture which is customized for solving the fire extinguishing problem. Structural decision making algorithms are combined with heuristic ones in this model, so it is a hybrid architecture.

  14. A Hybrid Three Layer Architecture for Fire Agent Management in Rescue Simulation Environment

    Directory of Open Access Journals (Sweden)

    Alborz Geramifard

    2005-06-01

    Full Text Available This paper presents a new architecture called FAIS for implementing intelligent agents cooperating in a special Multi Agent environment, namely the RoboCup Rescue Simulation System. This is a layered architecture which is customized for solving the fire extinguishing problem. Structural decision making algorithms are combined with heuristic ones in this model, so it is a hybrid architecture.

  15. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    Science.gov (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
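
    For readers unfamiliar with the RE baseline that STDR and VREX build on, the snippet below implements the standard Metropolis criterion for swapping configurations between two temperature replicas. This is the textbook RE acceptance rule, not the STDR or VREX update schemes introduced in the paper.

```python
# Standard temperature replica-exchange (parallel tempering) swap criterion.
# This is the generic RE rule, not the STDR/VREX algorithms of the paper.
import math
import random

K_B = 0.0083145  # Boltzmann constant in kJ/(mol*K)

def accept_swap(energy_i, temp_i, energy_j, temp_j):
    """Metropolis acceptance for exchanging configurations of two replicas."""
    beta_i, beta_j = 1.0 / (K_B * temp_i), 1.0 / (K_B * temp_j)
    # Accept with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]).
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0 or random.random() < math.exp(delta)

# Example: neighbouring replicas at 300 K and 310 K.
print(accept_swap(energy_i=-500.0, temp_i=300.0, energy_j=-495.0, temp_j=310.0))
```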

  16. Design, Analysis and User Acceptance of Architectural Design Education in Learning System Based on Knowledge Management Theory

    Science.gov (United States)

    Wu, Yun-Wu; Lin, Yu-An; Wen, Ming-Hui; Perng, Yeng-Hong; Hsu, I-Ting

    2016-01-01

    The major purpose of this study is to develop an architectural design knowledge management learning system with corresponding learning activities to help the students have meaningful learning and improve their design capability in their learning process. Firstly, the system can help the students to obtain and share useful knowledge. Secondly,…

  17. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
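
    A minimal data-structure sketch of the idea follows, with field names chosen for illustration (the patent's actual index format is not specified here): each index entry holds a list of storage locations for the file and its replicas plus a checksum used to validate whichever copy is read.

```python
# Illustrative list-based index entry for a file and its replicas.
# Field names are invented; they only mirror the concepts in the abstract.
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

index = {
    "/experiment/output.dat": {
        # list of storage locations: primary copy first, replicas after it
        "locations": ["node03:/pfs/output.dat", "node17:/pfs/output.dat.r1"],
        "checksum": checksum(b"example file contents"),
    }
}

def read_with_validation(path, fetch):
    """Try each listed location; return the first copy whose checksum matches."""
    entry = index[path]
    for location in entry["locations"]:
        data = fetch(location)
        if checksum(data) == entry["checksum"]:
            return data
    raise IOError(f"no valid replica found for {path}")

# Example fetch function that always returns the original bytes.
print(read_with_validation("/experiment/output.dat",
                           fetch=lambda loc: b"example file contents"))
```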

  18. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    OpenAIRE

    Alsahli, Abdulaziz; Khan, Hameed; Alyahya, Sultan

    2016-01-01

    Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintain changed requirements, enabling reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The o...

  19. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) at Oak Ridge National Laboratory (ORNL).

  20. Physical replicas and the Bose glass in cold atomic gases

    International Nuclear Information System (INIS)

    Morrison, S; Kantian, A; Daley, A J; Zoller, P; Katzgraber, H G; Lewenstein, M; Buechler, H P

    2008-01-01

    We study cold atomic gases in a disorder potential and analyse the correlations between different systems subjected to the same disorder landscape. Such independent copies with the same disorder landscape are known as replicas. While, in general, these are not accessible experimentally in condensed matter systems, they can be realized using standard tools for controlling cold atomic gases in an optical lattice. Of special interest is the overlap function which represents a natural order parameter for disordered systems and is a correlation function between the atoms of two independent replicas with the same disorder. We demonstrate an efficient measurement scheme for the determination of this disorder-induced correlation function. As an application, we focus on the disordered Bose-Hubbard model and determine the overlap function within the perturbation theory and a numerical analysis. We find that the measurement of the overlap function allows for the identification of the Bose-glass phase in certain parameter regimes

  1. Physical replicas and the Bose glass in cold atomic gases

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, S; Kantian, A; Daley, A J; Zoller, P [Institute for Theoretical Physics, University of Innsbruck, Technikerstr. 25, A-6020 Innsbruck (Austria); Katzgraber, H G [Theoretische Physik, ETH Zurich, CH-8093 Zuerich (Switzerland); Lewenstein, M [ICAO-Institut de Ciencies Fotoniques, Parc Mediterrani de la Tecnologia, E-08860 Castelldefels, Barcelona (Spain); Buechler, H P [Institute for Theoretical Physics III, University of Stuttgart, Pfaffenwaldring 57, 70550 Stuttgart (Germany)], E-mail: sarah.morrison@uibk.ac.at

    2008-07-15

    We study cold atomic gases in a disorder potential and analyse the correlations between different systems subjected to the same disorder landscape. Such independent copies with the same disorder landscape are known as replicas. While, in general, these are not accessible experimentally in condensed matter systems, they can be realized using standard tools for controlling cold atomic gases in an optical lattice. Of special interest is the overlap function which represents a natural order parameter for disordered systems and is a correlation function between the atoms of two independent replicas with the same disorder. We demonstrate an efficient measurement scheme for the determination of this disorder-induced correlation function. As an application, we focus on the disordered Bose-Hubbard model and determine the overlap function within the perturbation theory and a numerical analysis. We find that the measurement of the overlap function allows for the identification of the Bose-glass phase in certain parameter regimes.

  2. Replica approach to mean-variance portfolio optimization

    Science.gov (United States)

    Varga-Haszonits, Istvan; Caccioli, Fabio; Kondor, Imre

    2016-12-01

    We consider the problem of mean-variance portfolio optimization for a generic covariance matrix subject to the budget constraint and the constraint for the expected return, with the application of the replica method borrowed from the statistical physics of disordered systems. We find that the replica symmetry of the solution does not need to be assumed, but emerges as the unique solution of the optimization problem. We also check the stability of this solution and find that the eigenvalues of the Hessian are positive for r = N/T < 1, where N is the number of assets and T is the length of the time series used to estimate the covariance matrix. The optimal in-sample variance is found to vanish at the critical point, inversely proportional to the divergent estimation error.
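
    For reference, the underlying problem is the classical mean-variance (Markowitz) optimization; a generic statement in the notation common to this replica literature (the specific normalization below is an assumption, not taken from the paper) is:

```latex
% Mean-variance portfolio optimization with budget and expected-return constraints.
\begin{align}
\min_{\mathbf{w}} \quad & \tfrac{1}{2}\,\mathbf{w}^{\mathsf T} C\,\mathbf{w} \\
\text{subject to} \quad & \sum_{i=1}^{N} w_i = N, \qquad \sum_{i=1}^{N} w_i \mu_i = N\rho ,
\end{align}
```

    where C is the covariance matrix estimated from T observations, μ_i are the expected returns, and ρ is the required return per asset.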

  3. Platinum replica electron microscopy: Imaging the cytoskeleton globally and locally.

    Science.gov (United States)

    Svitkina, Tatyana M

    2017-05-01

    Structural studies reveal how smaller components of a system work together as a whole. However, combining high resolution of details with full coverage of the whole is challenging. In cell biology, light microscopy can image many cells in their entirety, but at a lower resolution, whereas electron microscopy affords very high resolution, but usually at the expense of the sample size and coverage. Structural analyses of the cytoskeleton are especially demanding, because cytoskeletal networks are unresolvable by light microscopy due to their density and intricacy, whereas their proper preservation is a challenge for electron microscopy. Platinum replica electron microscopy can uniquely bridge the gap between the "comfort zones" of light and electron microscopy by allowing high resolution imaging of the cytoskeleton throughout the entire cell and in many cells in the population. This review describes the principles and applications of platinum replica electron microscopy for studies of the cytoskeleton. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Difficult Sudoku Puzzles Created by Replica Exchange Monte Carlo Method

    OpenAIRE

    Watanabe, Hiroshi

    2013-01-01

    An algorithm to create difficult Sudoku puzzles is proposed. An Ising spin-glass-like Hamiltonian describing the difficulty of puzzles is defined, and difficult puzzles are created by minimizing the energy of the Hamiltonian. We adopt the replica exchange Monte Carlo method with simultaneous temperature adjustments to search for lower-energy states efficiently, and we succeed in creating a puzzle which is, to the best of our knowledge, the hardest ever created under our definition. (Added on Mar. 11, the ...

  5. Replica treatment of the Calogero-Sutherland model

    International Nuclear Information System (INIS)

    Gangardt, Dimitry M.; Kamenev, Alex

    2001-01-01

    Employing the Forrester-Ha method of Jack polynomials, we derive an integral identity connecting a certain N-fold coordinate average of the Calogero-Sutherland model with an n-fold replica integral. Subsequent analytical continuation in n leads to asymptotic expressions for the (static and dynamic) density-density correlation function of the model as well as the Green's function for an arbitrary coupling constant λ

  6. Grid tied PV/battery system architecture and power management for fast electric vehicle charging

    Science.gov (United States)

    Badawy, Mohamed O.

    The prospective spread of electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs) raises the need for fast charging rates. Higher charging rates lead to high power demands, which cannot always be supported by the grid. Thus, the use of on-site sources alongside the electrical grid for EV charging is a growing area of interest. In this dissertation, a photovoltaic (PV) source is used to support high-power EV charging. However, the PV output power is intermittent and depends on weather conditions. Thus, battery storage is combined with the PV in a grid-tied system, providing a steady source for on-site EV charging in a renewable-energy-based fast charging station. Such stations should be cost effective, efficient, and reliable to increase the penetration of EVs in the automotive market. This dissertation therefore proposes a novel power flow management topology aimed at decreasing the running cost, along with innovative hardware solutions and control structures for the developed architecture. The developed power flow management topology operates the hybrid system at minimum operating cost while extending battery lifetime. An optimization problem is formulated, and two stages of optimization, i.e., online and offline stages, are adopted to optimize the battery state-of-charge (SOC) scheduling and continuously compensate for forecasting errors. The proposed power flow management topology is validated and tested with two metering systems, i.e., unified and dual metering systems. The results suggest that minimal power flow is anticipated from the battery storage to the grid in the dual metering system. Thus, the power electronic interfacing system is designed accordingly. Interconnecting bi-directional DC/DC converters are analyzed, and a cascaded buck-boost (CBB) converter is chosen and tested under 80 kW power flow rates. The need to perform power factor correction (PFC) on
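
    As a toy illustration of the kind of per-step rule a cost-minimizing power-flow manager might apply (serve the EV load from PV first, then battery, then grid), consider the sketch below. The thresholds, efficiencies, and function names are invented, and the dissertation's actual two-stage optimization is far more elaborate than this greedy rule.

```python
# Toy per-time-step dispatch rule for a PV/battery/grid fast-charging station.
# All numbers and names are illustrative, not from the dissertation.

def dispatch(ev_demand_kw, pv_kw, soc_kwh, soc_min_kwh, battery_max_kw, dt_h=0.25):
    """Return (battery_kw, grid_kw, new_soc_kwh) for one interval of length dt_h."""
    residual = max(ev_demand_kw - pv_kw, 0.0)          # PV covers demand first
    battery_kw = min(residual, battery_max_kw,
                     max(soc_kwh - soc_min_kwh, 0.0) / dt_h)  # respect SOC floor
    grid_kw = residual - battery_kw                     # grid supplies the rest
    surplus_pv = max(pv_kw - ev_demand_kw, 0.0)         # spare PV recharges battery
    new_soc = soc_kwh - battery_kw * dt_h + surplus_pv * dt_h  # capacity cap omitted
    return battery_kw, grid_kw, new_soc

print(dispatch(ev_demand_kw=120.0, pv_kw=60.0, soc_kwh=40.0,
               soc_min_kwh=10.0, battery_max_kw=50.0))
```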

  7. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    Directory of Open Access Journals (Sweden)

    Abdulaziz Alsahli

    2016-01-01

    Full Text Available Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintain changed requirements, enabling reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The objective of this research is to introduce an innovative approach for handling requirements and architecture changes simultaneously during global software development. The approach makes use of Case-Based Reasoning (CBR) and agile practices. Agile practices make our approach iterative, whereas CBR stores requirements and makes them reusable. Twin Peaks is our base model, meaning that requirements and architecture are handled simultaneously. For this research, grounded theory has been applied; similarly, interviews with domain experts were conducted. Interview and literature transcripts formed the basis of data collection in grounded theory. Physical saturation of the theory has been achieved through a published case study and a developed tool. Expert reviews and statistical analysis have been used for evaluation. The proposed approach resulted in effective change management of requirements and architecture simultaneously during global software development.
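
    The CBR element of the approach amounts to retrieving previously handled requirement changes that resemble a new change request. The sketch below shows one naive way to do that, using word overlap as the similarity measure; it is only an illustration of case retrieval, not the tool developed in the paper.

```python
# Naive case-based retrieval: find past requirement-change cases similar to a new one.
# Similarity = word overlap (Jaccard); purely illustrative.

case_base = [
    {"change": "add two-factor authentication to login",
     "impacted_components": ["auth-service", "web-ui"]},
    {"change": "support login via corporate SSO provider",
     "impacted_components": ["auth-service", "identity-gateway"]},
    {"change": "increase report export size limit",
     "impacted_components": ["reporting-service"]},
]

def similarity(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def retrieve(new_change: str, k: int = 2):
    ranked = sorted(case_base, key=lambda c: similarity(new_change, c["change"]),
                    reverse=True)
    return ranked[:k]

for case in retrieve("change login to use single sign-on"):
    print(case["change"], "->", case["impacted_components"])
```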

  8. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    Science.gov (United States)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be facilitated for other plate observing systems (e.g., the European Plate

  9. Space Station needs, attributes and architectural options. Volume 2, book 2, part 2, Task 2: Information management system

    Science.gov (United States)

    1983-01-01

    Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.

  10. Migration of the Japanese healthcare enterprise from a financial to integrated management: strategy and architecture.

    Science.gov (United States)

    Akiyama, M

    2001-01-01

    The Hospital Information System (HIS) has been positioned as the hub of the healthcare information management architecture. In Japan, the billing system assigns "insurance disease names" to performed exams based on the diagnosis type. Departmental systems provide localized, departmental services, such as order receipt and diagnostic reporting, but do not provide patient demographic information. The system above has many problems. The departmental system's terminals and the HIS's terminals are not integrated. Duplicate data entry introduces errors and increases workloads. Order and exam data managed by the HIS can be sent to the billing system, but departmental data cannot usually be entered. Additionally, billing systems usually keep departmental data for only a short time before it is deleted. The billing system provides payment based on what is entered. The billing system is oriented towards diagnoses. Most importantly, the system is geared towards generating billing reports rather than providing high-quality patient care. The role of the application server is that of a mediator between system components. Data and events generated by system components are sent to the application server, which routes them to appropriate destinations. It also records all system events, including state changes to clinical data, access of clinical data and so on. Finally, the Resource Management System identifies all system resources available to the enterprise. The departmental systems are responsible for managing data and clinical processes at a departmental level. The client interacts with the system via the application server, which provides a general set of system-level functions. The system is implemented using the current technologies CORBA and HTTP. System data is collected by the application server and assembled into XML documents for delivery to clients. Clients can access these URLs using standard HTTP clients, since each department provides an HTTP-compliant web

  11. Replica exchange with solute tempering: A method for sampling biological systems in explicit water

    Science.gov (United States)

    Liu, Pu; Kim, Byungchan; Friesner, Richard A.; Berne, B. J.

    2005-09-01

    An innovative replica exchange (parallel tempering) method called replica exchange with solute tempering (REST) for the efficient sampling of aqueous protein solutions is presented here. The method bypasses the poor scaling with system size of standard replica exchange and thus reduces the number of replicas (parallel processes) that must be used. This reduction is accomplished by deforming the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. For proof of concept, REST is compared with standard replica exchange for an alanine dipeptide molecule in water. The comparisons confirm that REST greatly reduces the number of CPUs required by regular replica exchange and increases the sampling efficiency. This method reduces the CPU time required for calculating thermodynamic averages and for the ab initio folding of proteins in explicit water. Author contributions: B.J.B. designed research; P.L. and B.K. performed research; P.L. and B.K. analyzed data; and P.L., B.K., R.A.F., and B.J.B. wrote the paper.Abbreviations: REST, replica exchange with solute tempering; REM, replica exchange method; MD, molecular dynamics.*P.L. and B.K. contributed equally to this work.
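
    One common way to write the scaled potential used by REST for replica m (pp = protein-protein, pw = protein-water, ww = water-water interactions) is given below; the exact prefactors are reproduced here as a sketch of the published formulation and should be verified against the paper.

```latex
% REST-scaled potential energy of replica m; beta_0 is the target inverse temperature.
E_m(X) \;=\; E_{pp}(X)
       \;+\; \frac{\beta_0 + \beta_m}{2\beta_m}\, E_{pw}(X)
       \;+\; \frac{\beta_0}{\beta_m}\, E_{ww}(X)
```

    With this scaling, the water-water term carries the same coefficient β₀ in β_m E_m for every replica and therefore cancels in the exchange acceptance ratio, which is why the required number of replicas no longer grows with the number of explicit water molecules.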

  12. Supporting self-management of obesity using a novel game architecture.

    Science.gov (United States)

    Giabbanelli, Philippe J; Crutzen, Rik

    2015-09-01

    Obesity has commonly been addressed using a 'one size fits all' approach centred on a combination of diet and exercise. This has not succeeded in halting the obesity epidemic, as two-thirds of American adults are now obese or overweight. Practitioners are increasingly highlighting that one's weight is shaped by myriad factors, suggesting that interventions should be tailored to the specific needs of individuals. Health games have the potential to provide such a tailored approach. However, they currently tend to focus on communicating and/or reinforcing knowledge, in order to stimulate learning in the participants. We argue that it would be equally valuable, if not more so, for games to learn from participants using recommender systems. This would allow treatments to be comprehensive, as games can deduce from participants' behaviour which factors seem to be most relevant to each participant's weight and focus on them. We introduce a novel game architecture and discuss its implications for facilitating the self-management of obesity. © The Author(s) 2014.

  13. Next Generation RFID-Based Medical Service Management System Architecture in Wireless Sensor Network

    Science.gov (United States)

    Tolentino, Randy S.; Lee, Kijeong; Kim, Yong-Tae; Park, Gil-Cheol

    Radio Frequency Identification (RFID) and Wireless Sensor Networks (WSN) are two important wireless technologies that have a wide variety of applications and offer great future potential, most especially in healthcare systems. RFID is used to detect the presence and location of objects, while WSN is used to sense and monitor the environment. Integrating RFID with WSN not only provides the identity and location of an object but also provides information regarding the condition of the object carrying the sensor-enabled RFID tag. However, there is no flexible and robust communication infrastructure to integrate these devices into an emergency care setting. An efficient wireless communication substrate for medical devices that addresses ad hoc or fixed network formation, naming and discovery, transmission efficiency of data, data security and authentication, as well as filtration and aggregation of vital-sign data, needs to be studied and analyzed. This paper proposes an efficient next-generation architecture for an RFID-based medical service management system in WSN that possesses the essential elements of future medical applications integrated with existing medical practices and technologies: real-time remote monitoring, medication administration, and patient status tracking, assisted by embedded wearable wireless sensors integrated in a wireless sensor network.

  14. Service oriented network architecture for control and management of home appliances

    Science.gov (United States)

    Hayakawa, Hiroshi; Koita, Takahiro; Sato, Kenya

    2005-12-01

    Recent advances in multimedia network systems and mechatronics have led to the development of a new generation of applications that associate the use of various multimedia objects with the behavior of multiple robotic actors. The connection of audio and video devices through high-speed multimedia networks is expected to make such systems more convenient to use. For example, many home appliances, such as video cameras, display monitors, video recorders and audio systems, will be equipped with a communication interface in the near future. Recently, some platforms (e.g., UPnP, HAVi) have been proposed for constructing home networks; however, several issues remain to be solved to realize various services by connecting different equipment via a pervasive peer-to-peer network. UPnP offers network connectivity for PCs and intelligent home appliances, but in practice it requires a PC in the network to control the other devices. Meanwhile, HAVi has been developed for intelligent AV equipment with sophisticated functions requiring high CPU power and large memory. Considering that the targets of home appliances are embedded systems, this situation raises issues of software and hardware complexity, cost, power consumption and so on. In this study, we have proposed and developed a service-oriented network architecture for control and management of home appliances, named SONICA (Service Oriented Network Interoperability for Component Adaptation), to address the issues described above.

  15. Enhancing Architecture-Implementation Conformance with Change Management and Support for Behavioral Mapping

    Science.gov (United States)

    Zheng, Yongjie

    2012-01-01

    Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…

  16. ANALYSIS OF THE KEY ACTIVITIES OF THE LIFE CYCLE OF KNOWLEDGE MANAGEMENT IN THE UNIVERSITY AND DEVELOPMENT OF THE CONCEPTUAL ARCHITECTURE OF THE KNOWLEDGE MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Eugene N. Tcheremsina

    2013-01-01

    Full Text Available This article analyses the key activities of the knowledge management life cycle in terms of the features of knowledge management in higher education. Based on this analysis, we propose a model of the conceptual architecture of the virtual knowledge space of a university. The proposed model is the basis for the development of the kernel of an intercollegiate virtual knowledge space based on cloud technology.

  17. C3PO - A dynamic data placement agent for ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00346910; The ATLAS collaboration; Lassnig, Mario; Barisits, Martin-Stefan; Serfon, Cedric; Garonne, Vincent

    2017-01-01

    This paper introduces a new dynamic data placement agent for the ATLAS distributed data management system. This agent is designed to pre-place potentially popular data to make it more widely available. It therefore incorporates information from a variety of sources. Those include input datasets and site workload information from the ATLAS workload management system, network metrics from different sources like FTS and PerfSonar, historical popularity data collected through a tracer mechanism, and more. With this data it decides if, when and where to place new replicas that can then be used by the WMS to distribute the workload more evenly over the available computing resources and ultimately reduce job waiting times. This paper gives an overview of the architecture and the final implementation of this new agent. The paper also includes an evaluation of the placement algorithm by comparing transfer times and the usage of the new replicas.
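
    To make the "if, when and where" decision concrete, the sketch below scores candidate sites with a weighted combination of free storage, network quality, and current workload, and only places a replica when a popularity forecast crosses a threshold. The weights, field names, and threshold are invented for illustration; they do not describe C3PO's actual algorithm.

```python
# Illustrative placement decision: weights, thresholds and fields are invented,
# not C3PO's actual algorithm.

sites = [
    {"name": "SITE_A", "free_tb": 120, "network_score": 0.9, "queued_jobs": 350},
    {"name": "SITE_B", "free_tb": 40,  "network_score": 0.7, "queued_jobs": 90},
    {"name": "SITE_C", "free_tb": 300, "network_score": 0.5, "queued_jobs": 600},
]

def place_replica(predicted_popularity, sites, threshold=0.6):
    """Return the best site for a new replica, or None if not popular enough."""
    if predicted_popularity < threshold:
        return None                      # not worth pre-placing
    def score(site):
        return (0.5 * site["network_score"]
                + 0.3 * min(site["free_tb"] / 100.0, 1.0)
                - 0.2 * min(site["queued_jobs"] / 1000.0, 1.0))
    return max(sites, key=score)["name"]

print(place_replica(predicted_popularity=0.8, sites=sites))  # SITE_A for this toy data
```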

  18. QoS Management and Control for an All-IP WiMAX Network Architecture: Design, Implementation and Evaluation

    Directory of Open Access Journals (Sweden)

    Thomas Michael Bohnert

    2008-01-01

    Full Text Available The IEEE 802.16 standard provides a specification for a fixed and mobile broadband wireless access system, offering high data rate transmission of multimedia services with different Quality-of-Service (QoS) requirements through the air interface. The WiMAX Forum, going beyond the air interface, defined an end-to-end WiMAX network architecture based on an all-IP platform in order to complete the standards required for a commercial rollout of WiMAX as a broadband wireless access solution. As the WiMAX network architecture is only a functional specification, this paper focuses on an innovative end-to-end WiMAX network architecture offering compliant with the WiMAX Forum specification. To the best of our knowledge, this is the first WiMAX architecture built globally by a research consortium, and it was developed within the framework of the European IST project WEIRD (WiMAX Extension to Isolated Research Data networks). One of the principal features of our architecture is support for end-to-end QoS, achieved by the integration of resource control in the WiMAX wireless link and resource management in the wired domains of the network core. In this paper we present the architectural design of these QoS features in the overall WiMAX all-IP framework and their functional as well as performance evaluation. The presented results can safely be considered as unique and timely for any WiMAX system integrator.

  19. Replica field theory for a polymer in random media

    International Nuclear Information System (INIS)

    Goldschmidt, Yadin Y.

    2000-01-01

    In this paper we revisit the problem of a (non-self-avoiding) polymer chain in a random medium which was previously investigated by Edwards and Muthukumar (EM) [J. Chem. Phys. 89, 2435 (1988)]. As noticed by Cates and Ball (CB) [J. Phys. (France) 49, 2009 (1988)] there is a discrepancy between the predictions of the replica calculation of EM and the expectation that in an infinite medium the quenched and annealed results should coincide (for a chain that is free to move) and a long polymer should always collapse. CB argued that only in a finite volume one might see a ''localization transition'' (or crossover) from a stretched to a collapsed chain in three spatial dimensions. Here we carry out the replica calculation in the presence of an additional confining harmonic potential that mimics the effect of a finite volume. Using a variational scheme with five variational parameters we derive analytically, for d < 4, the result R ∼ (g ln V)^{-1/(4-d)}, where R is the radius of gyration, g is the strength of the disorder, μ is the spring constant associated with the confining potential, and V is the associated effective volume of the system. Thus the EM result is recovered with their constant replaced by ln V, as argued by CB. We see that in the strict infinite volume limit the polymer always collapses, but for finite volume a transition from a stretched to a collapsed form might be observed as a function of the strength of the disorder. For d < 2 and V ≫ V' ∼ exp(g^{2/(2-d)} L^{(4-d)/(2-d)}), the annealed results are recovered and R ∼ (Lg)^{1/(d-2)}, where L is the length of the polymer. Hence the polymer also collapses in the large L limit. The one-step replica symmetry breaking solution is crucial for obtaining the above results. (c) 2000 The American Physical Society

  20. Replica Approach for Minimal Investment Risk with Cost

    Science.gov (United States)

    Shinzato, Takashi

    2018-06-01

    In the present work, the optimal portfolio minimizing the investment risk with cost is discussed analytically, where an objective function is constructed in terms of two negative aspects of investment, the risk and cost. We note the mathematical similarity between the Hamiltonian in the mean-variance model and the Hamiltonians in the Hopfield model and the Sherrington-Kirkpatrick model, show that we can analyze this portfolio optimization problem by using replica analysis, and derive the minimal investment risk with cost and the investment concentration of the optimal portfolio. Furthermore, we validate our proposed method through numerical simulations.

  1. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
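
    For readers less familiar with the setup, the single-factor description of correlated return rates referred to above is conventionally written as follows (a generic statement of the model under the usual independence and zero-mean assumptions; the normalization is not taken from the paper):

```latex
% Single-factor model: asset i's return in period mu is driven by a common factor
% f_mu plus an idiosyncratic term, which induces correlations between assets.
x_{i\mu} \;=\; b_i f_\mu + \varepsilon_{i\mu},
\qquad
\mathrm{Cov}\!\left(x_{i\mu}, x_{j\mu}\right)
  \;=\; b_i b_j \operatorname{Var}(f_\mu)
  \;+\; \delta_{ij}\operatorname{Var}(\varepsilon_{i\mu}),
```

    so that setting all b_i = 0 recovers the independent-return case used for comparison.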

  2. Application of extraction replicas and analytical electron microscopy to precipitate phase studies

    International Nuclear Information System (INIS)

    Kenik, E.A.; Maziasz, P.J.

    1984-01-01

    Extraction replicas provide a powerful extension of AEM techniques for analysis of fine precipitates. In many cases, replicas allow more accurate analyses to be performed and, in some cases, allow unique analyses which cannot be performed in-foil. However, there are limitations to the use of extraction replicas in AEM, of which the analyst must be aware. Many can be eliminated by careful preparation. Often, combined AEM studies of precipitates in-foil and on extraction replicas provide complementary and corroborative information for the fullest analysis of precipitate phases

  3. Replica calibration artefacts for optical 3D scanning of micro parts

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Carmignato, S.; Cantatore, Angela

    2009-01-01

    This work deals with the development of calibration artefacts produced using hard replica materials, achieving high-quality geometrical reproduction of suitable reference artefacts, high stability, and high surface cooperativeness. An investigation was carried out using a replica material for dental... applications to reproduce the geometry of a step artefact, a miniature step gauge, and a curve standard for optical measuring machines. The replica artefacts were calibrated using a tactile coordinate measuring machine and measured on two different optical scanners. Replication quality and applicability... of the artefacts to verify the accuracy of optical measurements as well as the thermal expansion coefficient and stability of the replica artefacts over time were documented....

  4. Coastal Ocean Observing Network - Open Source Architecture for Data Management and Web-Based Data Services

    Science.gov (United States)

    Pattabhi Rama Rao, E.; Venkat Shesu, R.; Udaya Bhaskar, T. V. S.

    2012-07-01

    Observations from the oceans are the backbone of any kind of operational service, viz. potential fishing zone advisories, ocean state forecasts, storm surges, cyclones, monsoon variability, tsunamis, etc. Though it is important to monitor the open ocean, it is equally important to acquire sufficient data in the coastal ocean through coastal ocean observing systems: for re-analysis, analysis and forecasting of the coastal ocean by assimilating different ocean variables, especially sub-surface information; for validation of remote sensing data and ocean and atmosphere models/analyses; and to understand the processes related to air-sea interaction and ocean physics. Accurate information on and forecasts of the state of the coastal ocean at different time scales are vital for the well-being of the coastal population as well as for the socio-economic development of the country through shipping, offshore oil and energy, etc. Considering the importance of ocean observations for understanding our ocean environment and utilizing them for operational oceanography, a large number of platforms, including coastal observatories, were deployed in the Indian Ocean to acquire data on ocean variables in and around the Indian Seas. The coastal observation network includes HF radars, wave rider buoys, sea level gauges, etc. The surface meteorological and oceanographic data generated by these observing networks are being translated into ocean information services through analysis and modelling. A centralized data management system is a critical component in providing timely delivery of ocean information and advisory services. In this paper, we describe the development of an open-source architecture for real-time data reception from the coastal observation network, processing, quality control, database generation and web-based data services that include on-line data visualization and data downloads by various means.
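
    Quality control is one of the processing steps named above. As a simple illustration, the function below applies range and spike checks to a sea-level time series and attaches a flag to each sample; the thresholds and flag values are placeholders, not the centre's operational QC criteria.

```python
# Simple range + spike quality-control flags for a sea-level series (metres).
# Thresholds and flag codes are placeholders, not operational criteria.

GOOD, SUSPECT, BAD = 1, 3, 4

def qc_flags(series, valid_range=(-5.0, 5.0), spike_threshold=0.5):
    flags = []
    for i, value in enumerate(series):
        if not (valid_range[0] <= value <= valid_range[1]):
            flags.append(BAD)                       # outside plausible range
        elif 0 < i < len(series) - 1 and abs(
                value - 0.5 * (series[i - 1] + series[i + 1])) > spike_threshold:
            flags.append(SUSPECT)                   # deviates sharply from neighbours
        else:
            flags.append(GOOD)
    return flags

# [1, 1, 1, 3, 4]: the last value is out of range; its neighbour looks suspicious.
print(qc_flags([1.02, 1.05, 1.10, 1.07, 9.99]))
```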

  5. Collating and Curating Neuroanatomical Nomenclatures: Principles and Use of the Brain Architecture Knowledge Management System (BAMS).

    Science.gov (United States)

    Bota, Mihail; Swanson, Larry W

    2010-01-01

    Terms used to describe nervous system parts and their interconnections are rife with synonyms, partial correspondences, and even homonyms, making effective scientific communication unnecessarily difficult. To address this problem a new Topological Relations schema for the Relations module of BAMS (Brain Architecture Knowledge Management System) was created. It includes a representation of the qualitative spatial relations between nervous system parts defined in different neuroanatomical nomenclatures or atlases and is general enough to record data and metadata from the literature, regardless of description level or species. Based on this foundation a Projections Translations inference engine was developed for the BAMS interface that automatically translates neuroanatomical projection (axonal inputs and outputs) reports across nomenclatures from translated information. To make BAMS more useful to the neuroscience community three things were done. First, we implemented a simple schema for validation of the translated neuroanatomical projections. Second, more than 1,000 topological relations between brain gray matter regions for the rat were inserted, along with associated details. Finally, a case study was performed to enter all historical or legacy published information about terminology related to one relatively complex gray matter region of the rat. The bed nuclei of the stria terminalis (BST) were chosen and 21 different nomenclatures from 1923 to present were collated, along with 284 terms for parts (gray matter differentiations), 360 qualitative topological relations between parts, and more than 7,000 details about spatial relations between parts, all of which was annotated with appropriate metadata. This information was used to construct a graphical "knowledge map" of relations used in the literature to describe subdivisions of the rat BST.
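
    The projection-translation idea can be sketched as follows: given qualitative spatial relations stating which terms in one nomenclature are identical to, or parts of, terms in another, a projection reported against a source-nomenclature term is rewritten in terms of the target nomenclature. The relation table and function below are invented toy data, not BAMS content.

```python
# Toy translation of projection reports across nomenclatures using qualitative
# spatial relations. The relations and region names are invented, not BAMS data.

# (source_term, target_term, relation) with relation in {"identical", "part_of"}
relations = [
    ("BSTal (atlas X)", "BST anterolateral area (atlas Y)", "identical"),
    ("BSTov (atlas X)", "BST anterolateral area (atlas Y)", "part_of"),
]

def translate_projection(report, relations):
    """Rewrite a projection report's target region into another nomenclature."""
    translated = []
    for src, dst, rel in relations:
        if report["target"] == src:
            qualifier = "" if rel == "identical" else " (partial: source is a subregion)"
            translated.append({"source": report["source"],
                               "target": dst + qualifier,
                               "strength": report["strength"]})
    return translated

report = {"source": "central amygdalar nucleus", "target": "BSTov (atlas X)",
          "strength": "moderate"}
print(translate_projection(report, relations))
```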

  6. Execution Management in the Virtual Ship Architecture Issue 1.00

    National Research Council Canada - National Science Library

    Cramp, Anthony

    2000-01-01

    The Virtual Ship is an application of the High Level Architecture (HLA) in which simulation models that represent the components of a warship are brought together in a distributed manner to create a virtual representation of a warship...

  7. SecureCore Software Architecture: Trusted Management Layer (TML) Kernel Extension Module Integration Guide

    National Research Council Canada - National Science Library

    Shifflett, David J; Clark, Paul C; Irvine, Cynthia E; Nguyen, Thuy D; Vidas, Timothy M; Levin, Timothy E

    2007-01-01

    .... The purpose of the SecureCore research project is to investigate fundamental architectural features required for the trusted operation of mobile computing devices such as smart cards, embedded...

  8. SecureCore Software Architecture: Trusted Management Layer (TML) Kernel Extension Module Interface Specification

    National Research Council Canada - National Science Library

    Shifflett, David J; Clark, Paul C; Irvine, Cynthia E; Nguyen, Thuy D; Vidas, Timothy M; Levin, Timothy E

    2008-01-01

    .... The purpose of the SecureCore research project is to investigate fundamental architectural features required for the trusted operation of mobile computing devices such as smart cards, embedded...

  9. The added value of the replica simulators in the exploitation of nuclear power plants; El valor añadido de los simuladores réplica en la explotación de las centrales nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Diaz Giron, P. A.; Ortega, F.; Rivero, N.

    2011-07-01

    Nuclear power plant full-scope replica simulators were in the past designed solely according to operational personnel training criteria. Nevertheless, these simulators not only feature a high-fidelity replica of the control room but also provide an accurate process response. Control room replica simulators are presently based on complex technological platforms permitting the highest physical and functional fidelity, allowing them to be used as versatile, value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article intends to identify the simulators' use in training and other applications beyond training. (Author)

  10. Study on Information Management for the Conservation of Traditional Chinese Architectural Heritage - 3d Modelling and Metadata Representation

    Science.gov (United States)

    Yen, Y. N.; Weng, K. H.; Huang, H. Y.

    2013-07-01

    After over 30 years of practice and development, Taiwan's architectural conservation field is moving rapidly into digitalization and its applications. Compared to modern buildings, traditional Chinese architecture has considerably more complex elements and forms. To document and digitize these unique heritages over their conservation lifecycle is a new and important issue. This article takes the caisson ceiling of the Taipei Confucius Temple, octagonal with 333 elements in 8 types, as a case study for digitization practice. The application of metadata representation and 3D modelling are the two key issues discussed. Both Revit and SketchUp were applied in this research to compare their effectiveness for metadata representation. Due to limitations of the Revit database, the final 3D models were built with SketchUp. The research found that, firstly, cultural heritage databases must convey that while many elements are similar in appearance, they are unique in value; although 3D simulations help the general understanding of architectural heritage, software such as Revit and SketchUp, at this stage, could only be used to model basic visual representations, and is ineffective in documenting additional critical data on individually unique elements. Secondly, when establishing conservation lifecycle information for application in management systems, a full and detailed presentation of the metadata must also be implemented; the existing applications of BIM in managing conservation lifecycles are still insufficient. Results of the research recommend SketchUp as a tool for present modelling needs, and BIM for sharing data between users, but the implementation of metadata representation is of the utmost importance.

  11. PICNIC Architecture.

    Science.gov (United States)

    Saranummi, Niilo

    2005-01-01

    The PICNIC architecture aims at supporting inter-enterprise integration and the facilitation of collaboration between healthcare organisations. The concept of a Regional Health Economy (RHE) is introduced to illustrate the varying nature of inter-enterprise collaboration between healthcare organisations collaborating in providing health services to citizens and patients in a regional setting. The PICNIC architecture comprises a number of PICNIC IT Services and the interfaces between them, and presents a way to assemble these into a functioning Regional Health Care Network meeting the needs and concerns of its stakeholders. The PICNIC architecture is presented through a number of views relevant to different stakeholder groups. The stakeholders of the first view are national and regional health authorities and policy makers. The view describes how the architecture enables the implementation of national and regional health policies, strategies and organisational structures. The stakeholders of the second view, the service viewpoint, are the care providers, health professionals, patients and citizens. The view describes how the architecture supports and enables regional care delivery and process management, including continuity of care (shared care) and citizen-centred health services. The stakeholders of the third view, the engineering view, are those that design, build and implement the RHCN. The view comprises four sub-views: software engineering, IT services engineering, security and data. The proposed architecture is grounded in the mainstream evolution of distributed computing environments. The architecture is realised using the web services approach. A number of well-established technology platforms and generic standards exist that can be used to implement the software components. The software components that are specified in PICNIC are implemented as open source software.

  12. 2005 dossier: granite. Tome: architecture and management of the geologic disposal

    International Nuclear Information System (INIS)

    2005-01-01

    This document makes a status of the researches carried out by the French national agency of radioactive wastes (ANDRA) about the geologic disposal of high-level and long-lived radioactive wastes in granite formations. Content: 1 - Approach of the study: main steps since the December 30, 1991 law, ANDRA's research program on disposal in granitic formations; 2 - high-level and long-lived (HLLL) wastes: production scenarios, waste categories, inventory model; 3 - disposal facility design in granitic environment: definition of the geologic disposal functions, the granitic material, general facility design options; 4 - general architecture of a disposal facility in granitic environment: surface facilities, underground facilities, disposal process, operational safety; 5 - B-type wastes disposal area: primary containers of B-type wastes, safety options, concrete containers, disposal alveoles, architecture of the B-type wastes disposal area, disposal process and feasibility aspects, functions of disposal components with time; 6 - C-type wastes disposal area: C-type wastes primary containers, safety options, super-containers, disposal alveoles, architecture of the C-type wastes disposal area, disposal process in a reversibility logics, functions of disposal components with time; 7 - spent fuels disposal area: spent fuel assemblies, safety options, spent fuel containers, disposal alveoles, architecture of the spent fuel disposal area, disposal process in a reversibility logics, functions of disposal components with time; 8 - conclusions: suitability of the architecture with various types of French granites, strong design, reversibility taken into consideration. (J.S.)

  13. Broken symmetry in the mean field theory of the ising spin glass: replica way and no replica way

    International Nuclear Information System (INIS)

    De Dominicis, C.

    1983-06-01

    We review the type of symmetry breaking involved in the solution discovered by Parisi and in the static derivation of the solution first introduced via dynamics by Sompolinsky. We then turn to a formulation of the problem due to Thouless, Anderson and Palmer (TAP) that puts forward a set of equations for the magnetizations, from which a probability law for the magnetization is built. We consider two cases: (i) a canonical distribution, which is shown to give identical results to the Hamiltonian formulation under a weak and physical assumption, and (ii) a white distribution characterized by two matrices and a response. We show what symmetry breaking is necessary to recover Sompolinsky's free energy. In section III we supplement the replica indices of the Hamiltonian approach with ''time'' indices and show in particular that the analytic continuation involved in Sompolinsky's equilibrium derivation tries to mimic a translational symmetry breaking in ''time'' that incorporates Sompolinsky's ansatz of a sequence of long time scales. In section IV we apply the same treatment to the white-average approach and show that replicas can be discarded altogether and replaced by ''time''. Finally, we briefly discuss the attribution of distinct answers for the standard spin glass order parameter depending on the physical situation: equilibrium or non-equilibrium, associated with canonical or white (non-canonical) initial conditions and density matrices.
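
    For reference, the TAP formulation mentioned above centres on self-consistent equations for the local magnetizations. A standard textbook form for the Sherrington-Kirkpatrick model (generic notation, not reproduced from this paper) is

        % TAP equations with Onsager reaction term (illustrative notation)
        m_i = \tanh\!\Big[\beta\Big(h_i + \sum_{j} J_{ij} m_j
              - \beta\, m_i \sum_{j} J_{ij}^{2}\,\big(1 - m_j^{2}\big)\Big)\Big],
        \qquad i = 1,\dots,N,

    where the last term is the Onsager reaction correction; the probability law over magnetizations discussed in the abstract is built on solutions of such equations.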

  14. Managing the Evolution of an Enterprise Architecture using a MAS-Product-Line Approach

    Science.gov (United States)

    Pena, Joaquin; Hinchey, Michael G.; Resinas, manuel; Sterritt, Roy; Rash, James L.

    2006-01-01

    We view an evolutionary system as being a software product line. The core architecture is the unchanging part of the system, and each version of the system may be viewed as a product from the product line. Each "product" may be described as the core architecture with some agent-based additions. The result is a multiagent system software product line. We describe such a Software Product Line-based approach using the MaCMAS Agent-Oriented methodology. The approach scales to enterprise architectures, as a multiagent system is an appropriate means of representing a changing enterprise architecture and the interaction between components in it.

  15. Parallel replica dynamics method for bistable stochastic reaction networks: Simulation and sensitivity analysis

    Science.gov (United States)

    Wang, Ting; Plecháč, Petr

    2017-12-01

    Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
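
    To make the parallel replica idea concrete, the sketch below runs several independent Gillespie trajectories confined to one metastable region of a birth-death chain and reports the escape time as the simulated time summed over all replicas. It is only a minimal illustration: the lumped propensities and their constants are placeholders (loosely Schlögl-like, not tuned for bistability and chosen so the demo terminates quickly), the region boundary is arbitrary, and the dephasing/decorrelation stages of the actual parallel replica method are omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        def propensities(x):
            """Lumped birth/death propensities of an illustrative 1-D chain
            (placeholder constants, not a calibrated Schlogl model)."""
            birth = 0.09 * x * (x - 1) + 2.2
            death = 0.00025 * x * (x - 1) * (x - 2) + 3.75 * x
            return birth, death

        def ssa_step(x):
            """One Gillespie (SSA) step; returns (new_state, waiting_time)."""
            b, d = propensities(x)
            total = b + d
            tau = rng.exponential(1.0 / total)
            return (x + 1 if rng.random() < b / total else x - 1), tau

        def in_region(x, boundary=250):
            return x < boundary

        def parallel_replica_escape(x0, n_replicas=8):
            """Crude ParRep escape: replicas evolve independently inside the
            region; the reported escape time is the total simulated time
            accumulated over all replicas when the first one leaves."""
            states = [x0] * n_replicas
            clocks = [0.0] * n_replicas
            while True:
                for k in range(n_replicas):
                    states[k], dt = ssa_step(states[k])
                    clocks[k] += dt
                    if not in_region(states[k]):
                        return sum(clocks), states[k]

        t_escape, x_exit = parallel_replica_escape(x0=80)
        print(f"escape-time sample: {t_escape:.2f}, exit state: {x_exit}")

    In the full method each escape sample is followed by re-entry into the other metastable region, and the collected samples feed the stationary-distribution and sensitivity estimates described in the abstract.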

  16. Parallel replica dynamics method for bistable stochastic reaction networks: Simulation and sensitivity analysis.

    Science.gov (United States)

    Wang, Ting; Plecháč, Petr

    2017-12-21

    Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.

  17. Fabrication of free-standing replicas of fragile, laminar, chitinous biotemplates

    International Nuclear Information System (INIS)

    Lakhtakia, Akhlesh; Motyka, Michael A; MartIn-Palma, Raul J; Pantano, Carlo G

    2009-01-01

    The conformal-evaporated-film-by-rotation technique, followed by the dissolution of chitin in an aqueous solution of orthophosphoric acid, can be used to fabricate free-standing replicas of fragile, laminar, chitinous biotemplates. This novel approach was demonstrated using butterfly wings as biotemplates and GeSeSb chalcogenide glass for replicas. (communication)

  18. Fabrication of free-standing replicas of fragile, laminar, chitinous biotemplates

    Energy Technology Data Exchange (ETDEWEB)

    Lakhtakia, Akhlesh; Motyka, Michael A [Materials Research Institute and Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, PA 16802 (United States); MartIn-Palma, Raul J; Pantano, Carlo G [Materials Research Institute and Department of Materials Science and Engineering, Pennsylvania State University, University Park, PA 16802 (United States)], E-mail: akhlesh@psu.edu

    2009-09-01

    The conformal-evaporated-film-by-rotation technique, followed by the dissolution of chitin in an aqueous solution of orthophosphoric acid, can be used to fabricate free-standing replicas of fragile, laminar, chitinous biotemplates. This novel approach was demonstrated using butterfly wings as biotemplates and GeSeSb chalcogenide glass for replicas. (communication)

  19. A Novel General Chemistry Laboratory: Creation of Biomimetic Superhydrophobic Surfaces through Replica Molding

    Science.gov (United States)

    Verbanic, Samuel; Brady, Owen; Sanda, Ahmed; Gustafson, Carolina; Donhauser, Zachary J.

    2014-01-01

    Biomimetic replicas of superhydrophobic lotus and taro leaf surfaces can be made using polydimethylsiloxane. These replicas faithfully reproduce the microstructures of the leaves' surface and can be analyzed using contact angle goniometry, self-cleaning experiments, and optical microscopy. These simple and adaptable experiments were used to…

  20. Strategies for the architectural firm : risk- and flexibility management at Wiegerinck Architecten

    NARCIS (Netherlands)

    Berning, C.C.X.; Steeneken, J.; Goossens, T.J.P.; Kamminga, A.B.G.

    2006-01-01

    Wiegerinck Architecten (WA) is an integral architectural office, specialized in hospital buildings. WA wants to have a strategic competitive advantage over their competitors in the hospital building sector. They want to create a competitive strategic advantage through improving both their knowledge

  1. A MultiAgent Architecture for Collaborative Serious Game applied to Crisis Management Training: Improving Adaptability of Non Player Characters

    Directory of Open Access Journals (Sweden)

    M’hammed Ali Oulhaci

    2014-05-01

    Full Text Available Serious Games (SG) are more and more used for training, as in the crisis management domain, where several hundred stakeholders can be involved, causing various organizational difficulties on field exercises. SGs' specific benefits include player immersion and detailed tracking of players' actions during a virtual exercise. Moreover, Non Player Characters (NPC) can adapt the crisis management exercise perimeter to the available stakeholders or to specific training objectives. In this paper we present a Multi-Agent System architecture supporting behavioural simulation as well as monitoring and assessment of human players. A NPC is enacted by a Game Agent which reproduces the behaviour of a human actor, based on a deliberative model (Belief Desire Intention). To facilitate the scenario design, an Agent editor allows a designer to configure agents' behaviours. The behaviour simulation was implemented within the pre-existing SIMFOR project, a serious game for training in crisis management.

  2. SET: Session Layer-Assisted Efficient TCP Management Architecture for 6LoWPAN with Multiple Gateways

    Directory of Open Access Journals (Sweden)

    Akbar AliHammad

    2010-01-01

    Full Text Available 6LoWPAN (IPv6-based Low-Power Personal Area Network) is a protocol specification that facilitates communication of IPv6 packets on top of IEEE 802.15.4 so that Internet and wireless sensor networks can be inter-connected. This interconnection is especially required in commercial and enterprise applications of sensor networks where reliable and timely data transfers such as multiple code updates are needed from Internet nodes to sensor nodes. For this type of inbound traffic which is mostly bulk, TCP as transport layer protocol is essential, resulting in end-to-end TCP session through a default gateway. In this scenario, a single gateway tends to become the bottleneck because of non-uniform connectivity to all the sensor nodes besides being vulnerable to buffer overflow. We propose SET, a management architecture for multiple split-TCP sessions across a number of serving gateways. SET implements striping and multiple TCP session management through a shim at session layer. Through analytical modeling and ns2 simulations, we show that our proposed architecture optimizes communication for ingress bulk data transfer while providing associated load balancing services. We conclude that multiple split-TCP sessions managed in parallel across a number of gateways result in reduced latency for bulk data transfer and provide robustness against gateway failures.
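
    As a rough illustration of the striping idea behind SET (this is not the authors' implementation; the chunk size, gateway count, and in-memory per-gateway lists are assumptions made to keep the sketch self-contained and runnable), a session-layer shim could split a bulk payload across several split-TCP sessions and reassemble it at the receiver as follows:

        from collections import defaultdict

        CHUNK = 64        # bytes per stripe unit (placeholder value)
        N_GATEWAYS = 3    # number of serving gateways (placeholder value)

        def stripe(payload: bytes, n_gateways: int = N_GATEWAYS, chunk: int = CHUNK):
            """Split a bulk payload round-robin into per-gateway chunk lists.
            Each chunk carries a sequence number so the receiver can reorder."""
            sessions = defaultdict(list)
            for seq, start in enumerate(range(0, len(payload), chunk)):
                sessions[seq % n_gateways].append((seq, payload[start:start + chunk]))
            return sessions

        def reassemble(sessions) -> bytes:
            """Merge chunks collected from all gateway sessions back in order."""
            chunks = [c for per_gw in sessions.values() for c in per_gw]
            return b"".join(data for _, data in sorted(chunks))

        payload = bytes(range(256)) * 10          # stand-in for a bulk code update
        per_gateway = stripe(payload)
        assert reassemble(per_gateway) == payload
        print({gw: len(chunks) for gw, chunks in per_gateway.items()})

    The load balancing and gateway-failure robustness evaluated in the paper would additionally require monitoring per-gateway conditions and re-striping, which this sketch does not attempt.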

  3. Replica analysis for the duality of the portfolio optimization problem.

    Science.gov (United States)

    Shinzato, Takashi

    2016-11-01

    In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimal problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.
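
    In generic mean-variance notation (a sketch of the primal-dual pair described above, not necessarily Shinzato's exact scaling conventions), the two problems can be written as

        \text{(primal)}\quad \min_{\vec{w}}\ \tfrac{1}{2}\,\vec{w}^{\mathsf T} C \vec{w}
        \quad\text{s.t.}\quad \vec{w}^{\mathsf T}\vec{e} = N,\ \ \vec{w}^{\mathsf T}\vec{r} = N R,

        \text{(dual)}\quad \max_{\vec{w}}\ \vec{w}^{\mathsf T}\vec{r}
        \quad\text{s.t.}\quad \vec{w}^{\mathsf T}\vec{e} = N,\ \ \tfrac{1}{2}\,\vec{w}^{\mathsf T} C \vec{w} = N\varepsilon,

    where \vec{w} is the portfolio over N assets, C the return covariance matrix, \vec{r} the expected returns, \vec{e} the all-ones vector, R the required return per asset and \varepsilon the permitted risk per asset. Replica analysis then evaluates the quenched average of the optimal values over the random returns and confirms the primal-dual correspondence noted in the abstract.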

  4. Replica analysis for the duality of the portfolio optimization problem

    Science.gov (United States)

    Shinzato, Takashi

    2016-11-01

    In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimal problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.

  5. Architectural prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders...

  6. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...... correlation between the study of existing architectures and the training of competences to design for present-day realities.......This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand...

  7. Management of microbial community composition, architecture and performance in autotrophic nitrogen removing bioreactors through aeration regimes

    DEFF Research Database (Denmark)

    Mutlu, A. Gizem

    to describe aggregation and architectural evolution in nitritation/anammox reactors, incorporating the possible influences of intermediates formed with intermittent aeration. Community analysis revealed an abundant fraction of heterotrophic types despite the absence of organic carbon in the feed. The aerobic...... and anaerobic ammonia oxidizing guilds were dominated by fast-growing Nitrosomonas spp. and Ca. Brocadia spp., while the nitrite oxidizing guild was dominated by high affinity Nitrospira spp. Emission of nitrous oxide (N2O) was evaluated from both reactors under dynamic aeration regimes. Contrary to the widely...... impacts could be isolated, increasing process understanding. It was demonstrated that aeration strategy can be used as a powerful tool to manipulate the microbial community composition, its architecture and reactor performance. We suggest operation via intermittent aeration with short aerated periods...

  8. Trust-Management, Intrusion-Tolerance, Accountability, and Reconstitution Architecture (TIARA)

    Science.gov (United States)

    2009-12-01

    Tainting, tagged, metadata, architecture, hardware, processor, microkernel, zero-kernel, co-design... microkernels (e.g., [27]) embraced the idea that it was beneficial to reduce the kernel, separating out services as separate processes isolated from... limited adoption. More recently Tanenbaum [72] notes the security virtues of microkernels and suggests the modern importance of security makes it

  9. An Object-Oriented Architecture for User Interface Management in Distributed Applications

    OpenAIRE

    Denzer, Ralf

    2017-01-01

    User interfaces for large distributed applications have to handle specific problems: the complexity of the application itself and the integration of online-data into the user interface. A main task of the user interface architecture is to provide powerful tools to design and augment the end-user system easily, hence giving the designer more time to focus on user requirements. Our experiences developing a user interface system for a process control room showed that a lot of time during the dev...

  10. Architecture and robustness tradeoffs in speed-scaled queues with application to energy management

    Science.gov (United States)

    Dinh, Tuan V.; Andrew, Lachlan L. H.; Nazarathy, Yoni

    2014-08-01

    We consider single-pass, lossless, queueing systems at steady-state subject to Poisson job arrivals at an unknown rate. Service rates are allowed to depend on the number of jobs in the system, up to a fixed maximum, and power consumption is an increasing function of speed. The goal is to control the state dependent service rates such that both energy consumption and delay are kept low. We consider a linear combination of the mean job delay and energy consumption as the performance measure. We examine both the 'architecture' of the system, which we define as a specification of the number of speeds that the system can choose from, and the 'design' of the system, which we define as the actual speeds available. Previous work has illustrated that when the arrival rate is precisely known, there is little benefit in introducing complex (multi-speed) architectures, yet in view of parameter uncertainty, allowing a variable number of speeds improves robustness. We quantify the tradeoffs of architecture specification with respect to robustness, analysing both global robustness and a newly defined measure which we call local robustness.
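
    A small numerical sketch of the performance measure (my own illustration: the arrival rate, speed ladders, cubic power function and truncation level are placeholders, not values from the paper) computes the weighted sum of mean delay and mean power for a birth-death queue whose service speed depends on the number of jobs and saturates at a maximum speed:

        import numpy as np

        def cost(arrival_rate, speeds, weight=1.0, alpha=3.0, n_cut=2000):
            """Mean delay (via Little's law) plus weight * mean power E[s^alpha]
            for an M/M-type queue with state-dependent speeds; speeds[k] is the
            service rate with k+1 jobs, the last entry applying beyond the list."""
            s_max = speeds[-1]
            if arrival_rate >= s_max:
                return np.inf                      # unstable regime
            pi = [1.0]                             # unnormalized stationary probabilities
            for n in range(1, n_cut + 1):
                mu_n = speeds[n - 1] if n <= len(speeds) else s_max
                pi.append(pi[-1] * arrival_rate / mu_n)
            pi = np.array(pi)
            pi /= pi.sum()
            n = np.arange(n_cut + 1)
            mean_delay = (n * pi).sum() / arrival_rate
            speed_of = np.array([0.0] + [speeds[min(k, len(speeds)) - 1]
                                         for k in range(1, n_cut + 1)])
            mean_power = ((speed_of ** alpha) * pi).sum()
            return mean_delay + weight * mean_power

        # compare a single-speed and a two-speed "architecture" at the same load
        print(cost(arrival_rate=0.8, speeds=[1.2]))
        print(cost(arrival_rate=0.8, speeds=[0.9, 1.2]))

    Sweeping arrival_rate over a range around the nominal value gives the kind of robustness comparison between single-speed and multi-speed architectures that the paper formalizes.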

  11. Internal structure analysis of particle-double network gels used in a gel organ replica

    Science.gov (United States)

    Abe, Mei; Arai, Masanori; Saito, Azusa; Sakai, Kazuyuki; Kawakami, Masaru; Furukawa, Hidemitsu

    2016-04-01

    In recent years, the fabrication of patient organ replicas using 3D printers has been attracting a great deal of attention in medical fields. However, the cost of these organ replicas is very high, as it is necessary to employ very expensive 3D printers and printing materials. Here we present a new gel organ replica, of a human kidney, fabricated with a conventional molding technique using a particle-double network hydrogel (P-DN gel). The replica is transparent and has the feel of a real kidney. It is expected that gel organ replicas produced this way will be a useful tool for the education of trainee surgeons and clinical ultrasonography technologists. In addition to developing a gel organ replica, the internal structure of the P-DN gel used is also discussed. Because the P-DN gel has a complex structure comprised of two different types of network, it has not previously been possible to investigate its interior in detail. Gels have an inhomogeneous network structure; if a more uniform structure could be obtained, this would be expected to lead to higher gel strength. In the present study we investigate the internal structure of the P-DN gel, using the gel organ replica, by Scanning Microscopic Light Scattering (SMILS), a non-contacting and non-destructive technique.

  12. Fabrication of the replica templated from butterfly wing scales with complex light trapping structures

    Science.gov (United States)

    Han, Zhiwu; Li, Bo; Mu, Zhengzhi; Yang, Meng; Niu, Shichao; Zhang, Junqiu; Ren, Luquan

    2015-11-01

    The polydimethylsiloxane (PDMS) positive replica templated twice from the excellent light trapping surface of butterfly Trogonoptera brookiana wing scales was fabricated by a simple and promising route. The exact SiO2 negative replica was fabricated by using a synthesis method combining a sol-gel process and subsequent selective etching. Afterwards, a vacuum-aided process was introduced to make the PDMS gel fill the SiO2 negative replica, and the PDMS gel was solidified in an oven. Then, the SiO2 negative replica was used as a secondary template and the structures on its surface were transcribed onto the surface of the PDMS. Finally, the PDMS positive replica was obtained. After comparing the PDMS positive replica and the original bio-template in terms of morphology, dimensions, reflectance spectra and so on, it is evident that the excellent light trapping structures of butterfly wing scales were inherited by the PDMS positive replica faithfully. This bio-inspired route could facilitate the preparation of complex light trapping nanostructure surfaces without any assistance from other power-wasting and expensive nanofabrication technologies.

  13. BIM and architectural heritage: towards an operational methodology for the knowledge and the management of Cultural Heritage

    Directory of Open Access Journals (Sweden)

    Laura Inzerillo

    2016-06-01

    Full Text Available The study aims to answer the growing need to organize effectively the informational apparatuses related to Cultural Heritage. We propose a methodology that integrates multidisciplinary processes of interaction with information aimed at the survey, documentation, management, knowledge and enhancement of historic artifacts. There is a need to review and update the procedures for instrumental data acquisition, for the standardization and structuring of the acquired data in a three-dimensional semantic model, and for the subsequent representability and accessibility of the model and the related database. While the use of Building Information Modeling has in recent years seen a consolidation of procedures and the identification of standard methods in the design process, in the field of architectural heritage the challenge of identifying operational methodologies for conservation, management and process enhancement is still open.

  14. The added value of the replica simulators in the exploitation of nuclear power plants

    International Nuclear Information System (INIS)

    Diaz Giron, P. a.; Ortega, F.; Rivero, N.

    2011-01-01

    Nuclear power plant full-scope replica simulators were in the past designed solely according to operational personnel training criteria. Nevertheless, these simulators not only feature a faithful replica of the control room but also provide an accurate process response. Control room replica simulators are presently based on complex technological platforms permitting the highest physical and functional fidelity, allowing them to be used as versatile and value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article intends to identify the uses of simulators in training and in other applications beyond training. (Author)

  15. 2005 dossier: granite. Tome: architecture and management of the geologic disposal; Dossier 2005: granite. Tome architecture et gestion du stockage geologique

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This document makes a status of the researches carried out by the French national agency of radioactive wastes (ANDRA) about the geologic disposal of high-level and long-lived radioactive wastes in granite formations. Content: 1 - Approach of the study: main steps since the December 30, 1991 law, ANDRA's research program on disposal in granitic formations; 2 - high-level and long-lived (HLLL) wastes: production scenarios, waste categories, inventory model; 3 - disposal facility design in granitic environment: definition of the geologic disposal functions, the granitic material, general facility design options; 4 - general architecture of a disposal facility in granitic environment: surface facilities, underground facilities, disposal process, operational safety; 5 - B-type wastes disposal area: primary containers of B-type wastes, safety options, concrete containers, disposal alveoles, architecture of the B-type wastes disposal area, disposal process and feasibility aspects, functions of disposal components with time; 6 - C-type wastes disposal area: C-type wastes primary containers, safety options, super-containers, disposal alveoles, architecture of the C-type wastes disposal area, disposal process in a reversibility logics, functions of disposal components with time; 7 - spent fuels disposal area: spent fuel assemblies, safety options, spent fuel containers, disposal alveoles, architecture of the spent fuel disposal area, disposal process in a reversibility logics, functions of disposal components with time; 8 - conclusions: suitability of the architecture with various types of French granites, strong design, reversibility taken into consideration. (J.S.)

  16. A PRIVACY MANAGEMENT ARCHITECTURE FOR PATIENT-CONTROLLED PERSONAL HEALTH RECORD SYSTEM

    Directory of Open Access Journals (Sweden)

    MD. NURUL HUDA

    2009-06-01

    Full Text Available Patient-controlled personal health record systems can help make health care safer, cheaper, and more convenient by enabling patients to (1) grant any care provider access to their complete personal health records anytime from anywhere, (2) avoid repeated tests, and (3) control their privacy transparently. In this paper, we present the architecture of our Privacy-aware Patient-controlled Personal Health Record (P3HR) system, through which a patient can view her integrated health history and share her health information transparently with others (e.g., healthcare providers). Access to the health information of a particular patient is completely controlled by that patient. We also carry out an intuitive security and privacy analysis of the P3HR system architecture considering different types of security attacks. Finally, we describe a prototype implementation of the P3HR system that we developed reflecting the special view of Japanese society. The most important advantage of the P3HR system over other existing systems is that it most likely provides complete privacy protection without losing data accuracy. Unlike traditional partially anonymous health records (e.g., using k-anonymity or l-diversity), the health records in P3HR are closer to complete anonymity, and yet preserve data accuracy. Our approach makes it very unlikely that patients could be identified by an attacker from their anonymous health records in the P3HR system.

  17. Multiscale transparent electrode architecture for efficient light management and carrier collection in solar cells.

    Science.gov (United States)

    Boccard, Mathieu; Battaglia, Corsin; Hänni, Simon; Söderström, Karin; Escarré, Jordi; Nicolay, Sylvain; Meillaud, Fanny; Despeisse, Matthieu; Ballif, Christophe

    2012-03-14

    The challenge for all photovoltaic technologies is to maximize light absorption, to convert photons with minimal losses into electric charges, and to efficiently extract them to the electrical circuit. For thin-film solar cells, all these tasks rely heavily on the transparent front electrode. Here we present a multiscale electrode architecture that allows us to achieve efficiencies as high as 14.1% with a thin-film silicon tandem solar cell employing only 3 μm of silicon. Our approach combines the versatility of nanoimprint lithography, the unusually high carrier mobility of hydrogenated indium oxide (over 100 cm(2)/V/s), and the unequaled light-scattering properties of self-textured zinc oxide. A multiscale texture provides light trapping over a broad wavelength range while ensuring an optimum morphology for the growth of high-quality silicon layers. A conductive bilayer stack guarantees carrier extraction while minimizing parasitic absorption losses. The tunability accessible through such multiscale electrode architecture offers unprecedented possibilities to address the trade-off between cell optical and electrical performance. © 2012 American Chemical Society

  18. Architecture Governance: The Importance of Architecture Governance for Achieving Operationally Responsive Ground Systems

    Science.gov (United States)

    Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik

    2011-01-01

    Topics covered (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance (3) Strategic Elements (3a) Architectural Principles (3b) Architecture Board (3c) Architecture Compliance (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level

  19. Human skulls with turquoise inlays: pre hispanic origin or replicas?

    International Nuclear Information System (INIS)

    Silva V, Y.; Castillo M, M.T.; Bautista M, J.P.; Arenas A, J.

    2006-01-01

    The lack of an archaeological context for determining whether two human skulls adorned with turquoise inlays are of pre-Columbian origin or are replicas led to further studies. Under these conditions, besides the orthodox methodology commonly used to assign chronology and cultural aspects such as form, style, decoration, iconography, etc., it was necessary to obtain additional results based on the use of characterization techniques. The techniques employed were Scanning Electron Microscopy (SEM), X-Ray Energy Dispersive Spectroscopy (EDS), Transmission Electron Microscopy (TEM) and Fourier Transform Infrared Spectroscopy (FTIR), in order to determine the manufacturing techniques and the chemical composition of the materials used for the cementant. SEM analysis showed the presence of zones composed of Ca, O, C and Al. In some cases Mg, Cl, Fe and Pb were identified. A high concentration of Cu was present in all samples, due to residues of the turquoise inlays (CuAl6(PO4)4(OH)8(H2O)4) with which the skulls were decorated. In the cementant, Ca was identified as the base element, as well as particles < 100 nm with irregular morphology and other amorphous zones. FTIR spectra indicated the presence of organic substances that could have been used as an agglutinant in the cementant. The current work shows progress in identifying the techniques involved in the manufacture of two human skulls with turquoise inlays. (Author)

  20. Parylene C coating for high-performance replica molding.

    Science.gov (United States)

    Heyries, Kevin A; Hansen, Carl L

    2011-12-07

    This paper presents an improvement to the soft lithography fabrication process that uses chemical vapor deposition of poly(chloro-p-xylylene) (parylene C) to protect microfabricated masters and to improve the release of polymer devices following replica molding. Chemical vapor deposition creates nanometre thick conformal coatings of parylene C on silicon wafers having arrays of 30 μm high SU8 pillars with densities ranging from 278 to 10,040 features per mm(2) and aspect ratios (height : width) from 1 : 1 to 6 : 1. A single coating of parylene C was sufficient to permanently promote poly(dimethyl)siloxane (PDMS) mold release and to protect masters for an indefinite number of molding cycles. We also show that the improved release properties of parylene treated masters allow for fabrication with hard polymers, such as poly(urethane), that would otherwise not be compatible with SU8 on silicon masters. Parylene C provides a robust and high performance mold release coating for soft lithography microfabrication that extends the life of microfabricated masters and improves the achievable density and aspect ratio of replicated features.

  1. Foundations and latest advances in replica exchange transition interface sampling

    Science.gov (United States)

    Cabriolu, Raffaela; Skjelbred Refsnes, Kristin M.; Bolhuis, Peter G.; van Erp, Titus S.

    2017-10-01

    Nearly 20 years ago, transition path sampling (TPS) emerged as an alternative method to free energy based approaches for the study of rare events such as nucleation, protein folding, chemical reactions, and phase transitions. TPS effectively performs Monte Carlo simulations with relatively short molecular dynamics trajectories, with the advantage of not having to alter the actual potential energy surface nor the underlying physical dynamics. Although the TPS approach also introduced a methodology to compute reaction rates, this approach was for a long time considered theoretically attractive, providing the exact same results as extensively long molecular dynamics simulations, but still expensive for most relevant applications. With the increase of computer power and improvements in the algorithmic methodology, quantitative path sampling is finding applications in more and more areas of research. In particular, the transition interface sampling (TIS) and the replica exchange TIS (RETIS) algorithms have, in turn, improved the efficiency of quantitative path sampling significantly, while maintaining the exact nature of the approach. Also, open-source software packages are making these methods, for which implementation is not straightforward, now available for a wider group of users. In addition, a blooming development takes place regarding both applications and algorithmic refinements. Therefore, it is timely to explore the wide panorama of the new developments in this field. This is the aim of this article, which focuses on the most efficient exact path sampling approach, RETIS, as well as its recent applications, extensions, and variations.
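
    The quantitative core of TIS/RETIS referred to above is the factorization of the rate constant over a set of interfaces; in standard TIS notation (paraphrased, not quoted from this article),

        k_{AB} = \Phi_{A,0} \prod_{i=0}^{n-1} P_A\!\left(\lambda_{i+1} \,\middle|\, \lambda_i\right),

    where \Phi_{A,0} is the flux out of state A through the first interface \lambda_0 and P_A(\lambda_{i+1}|\lambda_i) is the conditional probability that a path crossing \lambda_i reaches \lambda_{i+1} before returning to A. RETIS adds replica-exchange (swap) moves between the path ensembles attached to different interfaces, which is what improves the sampling efficiency discussed in the abstract.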

  2. Preserving the Boltzmann ensemble in replica-exchange molecular dynamics.

    Science.gov (United States)

    Cooke, Ben; Schmidler, Scott C

    2008-10-28

    We consider the convergence behavior of replica-exchange molecular dynamics (REMD) [Sugita and Okamoto, Chem. Phys. Lett. 314, 141 (1999)] based on properties of the numerical integrators in the underlying isothermal molecular dynamics (MD) simulations. We show that a variety of deterministic algorithms favored by molecular dynamics practitioners for constant-temperature simulation of biomolecules fail either to be measure invariant or irreducible, and are therefore not ergodic. We then show that REMD using these algorithms also fails to be ergodic. As a result, the entire configuration space may not be explored even in an infinitely long simulation, and the simulation may not converge to the desired equilibrium Boltzmann ensemble. Moreover, our analysis shows that for initial configurations with unfavorable energy, it may be impossible for the system to reach a region surrounding the minimum energy configuration. We demonstrate these failures of REMD algorithms for three small systems: a Gaussian distribution (simple harmonic oscillator dynamics), a bimodal mixture of Gaussians distribution, and the alanine dipeptide. Examination of the resulting phase plots and equilibrium configuration densities indicates significant errors in the ensemble generated by REMD simulation. We describe a simple modification to address these failures based on a stochastic hybrid Monte Carlo correction, and prove that this is ergodic.
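
    For context, the replica-exchange layer that sits on top of the isothermal MD integrators discussed above accepts a swap between two replicas with the standard Metropolis criterion; a minimal, code-agnostic sketch is

        import math
        import random

        def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random.random):
            """Metropolis test for exchanging configurations between replicas at
            inverse temperatures beta_i and beta_j with potential energies E_i, E_j:
            accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
            delta = (beta_i - beta_j) * (energy_i - energy_j)
            return delta >= 0.0 or rng() < math.exp(delta)

        # example call with assumed (arbitrary) reduced units
        print(attempt_swap(beta_i=1.0 / 2.0, beta_j=1.0 / 2.5,
                           energy_i=-120.0, energy_j=-95.0))

    The ergodicity failures analyzed in the paper arise in the constant-temperature dynamics performed between such swap attempts, not in the swap criterion itself.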

  3. Replica Exchange Simulations of the Thermodynamics of Aβ Fibril Growth

    Science.gov (United States)

    Takeda, Takako; Klimov, Dmitri K.

    2009-01-01

    Abstract Replica exchange molecular dynamics and an all-atom implicit solvent model are used to probe the thermodynamics of deposition of Alzheimer's Aβ monomers on preformed amyloid fibrils. Consistent with the experiments, two deposition stages have been identified. The docking stage occurs over a wide temperature range, starting with the formation of the first peptide-fibril interactions at 500 K. Docking is completed when a peptide fully adsorbs on the fibril edge at the temperature of 380 K. The docking transition appears to be continuous, and occurs without free energy barriers or intermediates. During docking, incoming Aβ monomer adopts a disordered structure on the fibril edge. The locking stage occurs at the temperature of ≈360 K and is characterized by the rugged free energy landscape. Locking takes place when incoming Aβ peptide forms a parallel β-sheet structure on the fibril edge. Because the β-sheets formed by locked Aβ peptides are typically off-registry, the structure of the locked phase differs from the structure of the fibril interior. The study also reports that binding affinities of two distinct fibril edges with respect to incoming Aβ peptides are different. The peptides bound to the concave edge have significantly lower free energy compared to those bound on the convex edge. Comparison with the available experimental data is discussed. PMID:19167295

  4. Stability and replica symmetry in the ising spin glass: a toy model

    International Nuclear Information System (INIS)

    De Dominicis, C.; Mottishaw, P.

    1986-01-01

    Searching for possible replica symmetric solutions in an Ising spin glass (in the tree approximation), we investigate a toy model whose bond distribution has two non-vanishing cumulants (instead of only one, as in a Gaussian distribution).

  5. Toward Measures for Software Architectures

    National Research Council Canada - National Science Library

    Chastek, Gary; Ferguson, Robert

    2006-01-01

    .... Defining these architectural measures is very difficult. The software architecture deeply affects subsequent development and project management decisions, such as the breakdown of the coding tasks and the definition of the development increments...

  6. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) so that they can share data and control each other to improve existing processes and make people's lives better. IoT aims to connect all physical devices, such as fridges, cars, utilities, buildings and cities, so that they can take advantage of the small pieces of information collected by each of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of varying vendor support, connectivity options and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other's capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and management in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  7. Designed-walk replica-exchange method for simulations of complex systems

    OpenAIRE

    Urano, Ryo; Okamoto, Yuko

    2015-01-01

    We propose a new implementation of the replica-exchange method (REM) in which replicas follow a pre-planned route in temperature space instead of a random walk. Our method satisfies the detailed balance condition in the proposed route. The method forces tunneling events between the highest and lowest temperatures to happen with an almost constant period. The number of tunneling counts is proportional to that of the random-walk REM multiplied by the square root of moving distance in temperatur...

  8. Utility of replica techniques for x-ray microanalysis of second phase particles

    International Nuclear Information System (INIS)

    Bentley, J.

    1984-01-01

    X-ray microanalysis of second phase particles in ion-milled or electropolished thin foils is often complicated by the presence of the matrix nearby. Extraction replica techniques provide a means to avoid many of the complications of thin-foil analyses. In this paper, three examples of the analysis of second phase particles are described and illustrate the improvement obtained by the use of extraction replicas for qualitative analysis, quantitative analysis, and analysis of radioactive specimens

  9. Development of innovative architecture of the organizational and economic mechanism for the nature protection management

    Science.gov (United States)

    Mikhailov, V. G.; Kiseleva, T. V.; Karasev, V. A.; Mikhailov, G. S.; Skukin, V. A.

    2017-05-01

    The problems of the efficient functioning of environmental and economic systems of various levels on the basis of the adequate organizational and economic management mechanism are considered in the article. The purpose of the study is the analysis and development of theoretical provisions for the formation of a modern, innovative organizational and economic mechanism of the nature protection management. The compliance matrix of the innovative elements presented in the structure of the organizational and economic mechanism of the nature protection management is developed. The main result of the study is the improvement of the existing management mechanism to minimize the negative impact on the environment, including through the incentive system, and to improve the financial performance of the economic entity. The practical component of the study conducted can be recommended to municipal, regional and federal authorities, as well as the industrial enterprises, to support the adoption of the effective, environmentally sound management decisions that are consistent with the global concept of sustainable development.

  10. Calculation of absolute protein-ligand binding free energy using distributed replica sampling.

    Science.gov (United States)

    Rodinger, Tomas; Howell, P Lynne; Pomès, Régis

    2008-10-21

    Distributed replica sampling [T. Rodinger et al., J. Chem. Theory Comput. 2, 725 (2006)] is a simple and general scheme for Boltzmann sampling of conformational space by computer simulation in which multiple replicas of the system undergo a random walk in reaction coordinate or temperature space. Individual replicas are linked through a generalized Hamiltonian containing an extra potential energy term or bias which depends on the distribution of all replicas, thus enforcing the desired sampling distribution along the coordinate or parameter of interest regardless of free energy barriers. In contrast to replica exchange methods, efficient implementation of the algorithm does not require synchronicity of the individual simulations. The algorithm is inherently suited for large-scale simulations using shared or heterogeneous computing platforms such as a distributed network. In this work, we build on our original algorithm by introducing Boltzmann-weighted jumping, which allows moves of a larger magnitude and thus enhances sampling efficiency along the reaction coordinate. The approach is demonstrated using a realistic and biologically relevant application; we calculate the standard binding free energy of benzene to the L99A mutant of T4 lysozyme. Distributed replica sampling is used in conjunction with thermodynamic integration to compute the potential of mean force for extracting the ligand from protein and solvent along a nonphysical spatial coordinate. Dynamic treatment of the reaction coordinate leads to faster statistical convergence of the potential of mean force than a conventional static coordinate, which suffers from slow transitions on a rugged potential energy surface.
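
    A conceptual sketch of a distributed-replica move is given below. It is only an illustration of the general idea (one replica at a time proposes a jump along the parameter ladder and is accepted with a Metropolis test that includes a bias depending on where all replicas currently sit); the quadratic occupation penalty used here is a placeholder of my own, not the distributed replica potential energy defined by Rodinger et al., and the temperature ladder and random energies are arbitrary.

        import math
        import random

        LEVELS = [300.0, 320.0, 340.0, 360.0, 380.0, 400.0]  # parameter ladder (placeholder)
        N_REPLICAS = 12

        def bias(counts, c=2.0):
            """Placeholder bias: quadratic penalty on deviation of the per-level
            replica counts from a uniform occupation of the ladder."""
            target = N_REPLICAS / len(LEVELS)
            return c * sum((n - target) ** 2 for n in counts)

        def attempt_move(level_of, energies, r, rng=random):
            """Propose moving replica r to a neighbouring level; accept with a
            Metropolis test on the change in beta*E plus the occupation bias."""
            old, new = level_of[r], level_of[r] + rng.choice([-1, 1])
            if not 0 <= new < len(LEVELS):
                return False
            counts = [level_of.count(k) for k in range(len(LEVELS))]
            new_counts = counts[:]
            new_counts[old] -= 1
            new_counts[new] += 1
            delta = (1.0 / LEVELS[new] - 1.0 / LEVELS[old]) * energies[r] \
                    + bias(new_counts) - bias(counts)
            if delta <= 0.0 or rng.random() < math.exp(-delta):
                level_of[r] = new
                return True
            return False

        random.seed(1)
        levels = [i % len(LEVELS) for i in range(N_REPLICAS)]
        energies = [random.uniform(-50.0, -10.0) for _ in range(N_REPLICAS)]
        for _ in range(1000):
            attempt_move(levels, energies, r=random.randrange(N_REPLICAS))
        print(levels)

    Because each move involves only one replica plus a snapshot of the others' positions, no synchronization of the underlying simulations is required, which is the property the abstract emphasizes for distributed and heterogeneous computing platforms.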

  11. Conformational sampling enhancement of replica exchange molecular dynamics simulations using swarm particle intelligence

    International Nuclear Information System (INIS)

    Kamberaj, Hiqmet

    2015-01-01

    In this paper, we present a new method based on swarm particle social intelligence for use in replica exchange molecular dynamics simulations. In this method, the replicas (representing the different system configurations) are allowed to communicate with each other through individual and social knowledge, in addition to being treated as a collection of real particles interacting through Newtonian forces. The new method is based on a modification of the equations of motion such that the replicas are driven towards the global energy minimum. The method was tested on Lennard-Jones clusters of N = 4, 5, and 6 atoms. Our results show that the new method is more efficient than the conventional replica exchange method under the same practical conditions. In particular, the new method performs better at optimizing the distribution of the replicas among the thermostats over time and, in addition, ergodic convergence is observed to be faster. We also introduce a weighted histogram analysis method that allows the data from simulations to be analyzed by combining data from all of the replicas and rigorously removing the inserted bias.
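
    The "individual and social knowledge" coupling referred to above is the hallmark of particle swarm optimization; its generic velocity update (shown here for orientation only, the paper's actual modification of the Newtonian equations of motion is not reproduced) reads

        \vec{v}_i \leftarrow w\,\vec{v}_i
            + c_1 r_1\,\big(\vec{p}_i^{\;\mathrm{best}} - \vec{x}_i\big)
            + c_2 r_2\,\big(\vec{g}^{\;\mathrm{best}} - \vec{x}_i\big),

    where \vec{p}_i^{\;\mathrm{best}} is the best configuration visited by replica i (individual knowledge), \vec{g}^{\;\mathrm{best}} the best configuration found by any replica (social knowledge), r_1 and r_2 are uniform random numbers, and w, c_1, c_2 are weighting constants.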

  12. Information architecture. Volume 3: Guidance

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The purpose of this document, as presented in Volume 1, The Foundations, is to assist the Department of Energy (DOE) in developing and promulgating information architecture guidance. This guidance is aimed at increasing the development of information architecture as a Departmentwide management best practice. This document describes departmental information architecture principles and minimum design characteristics for systems and infrastructures within the DOE Information Architecture Conceptual Model, and establishes a Departmentwide standards-based architecture program. The publication of this document fulfills the commitment to address guiding principles, promote standard architectural practices, and provide technical guidance. This document guides the transition from the baseline or defacto Departmental architecture through approved information management program plans and budgets to the future vision architecture. This document also represents another major step toward establishing a well-organized, logical foundation for the DOE information architecture.

  13. Reference system architecture for trade promotion management: leveraging business intelligence technologies and decision support systems

    NARCIS (Netherlands)

    Balmus, Andra Bianca; Iacob, Maria Eugenia; van Sinderen, Marten J.; van Busschbach, Murk

    Working towards gaining competitive advantage and establishing stable relationships with their supply chain intermediaries, fast moving consumer goods companies are currently focusing their attention on intelligent, goal-based funds investment. Traditional trade promotion management systems (TPMS),

  14. CogWnet: A Resource Management Architecture for Cognitive Wireless Networks

    KAUST Repository

    Alqerm, Ismail; Shihada, Basem; Shin, Kang G.

    2013-01-01

    With the increasing adoption of wireless communication technologies, there is a need to improve management of existing radio resources. Cognitive radio is a promising technology to improve the utilization of wireless spectrum. Its operating

  15. Information Architecture: Looking Ahead.

    Science.gov (United States)

    Rosenfeld, Louis

    2002-01-01

    Considers the future of the field of information architecture. Highlights include a comparison with the growth of the field of professional management; the design of information systems since the Web; more demanding users; the need for an interdisciplinary approach; and how to define information architecture. (LRW)

  16. Managing of the sun in the architecture; Manejo del sol en la arquitectura

    Energy Technology Data Exchange (ETDEWEB)

    Gomez Azpeitia, Luis Gabriel [Facultad de Arquitectura y Diseno, Universidad de Colima (Mexico)

    2005-07-01

    Solar energy is the source responsible for most of the climatic processes that occur on the planet. The daily and seasonal fluctuations of temperature, and therefore the changes in the atmosphere's humidity content as well as the pressure differentials that in turn generate wind currents, depend on its magnitude but mainly on its angle of incidence on the atmosphere and on the surface of the oceans and the Earth. Architecture, as an integral part of the Earth's surface, is interrelated with the sun, since the relative position of the sun with respect to the buildings is fundamental for achieving energy efficiency, comfort for the inhabitants and adjustment to the surroundings. Hence the requirement that architects formulate their projects in accordance with the sun's path across the different seasons of the year, either to avoid its radiation or to take advantage of it. This presentation makes a quick review of the available support tools, recommends the practical use of some of them, and demonstrates with a practical example their utility in the process of architectural design.

  17. Architectural slicing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2013-01-01

    Architectural prototyping is a widely used practice, concerned with taking architectural decisions through experiments with lightweight implementations. However, many architectural decisions are only taken when systems are already (partially) implemented. This is problematic in the context...... of architectural prototyping since experiments with full systems are complex and expensive and thus architectural learning is hindered. In this paper, we propose a novel technique for harvesting architectural prototypes from existing systems, "architectural slicing", based on dynamic program slicing. Given...... a system and a slicing criterion, architectural slicing produces an architectural prototype that contains the elements in the architecture that are dependent on the elements in the slicing criterion. Furthermore, we present an initial design and implementation of an architectural slicer for Java....

  18. A Semantic Middleware Architecture Focused on Data and Heterogeneity Management within the Smart Grid

    Directory of Open Access Journals (Sweden)

    Rubén de Diego

    2014-09-01

    Full Text Available There is an increasing tendency of turning the current power grid, essentially unaware of variations in electricity demand and scattered energy sources, into something capable of bringing a degree of intelligence by using tools strongly related to information and communication technologies, thus turning into the so-called Smart Grid. In fact, it could be considered that the Smart Grid is an extensive smart system that spreads throughout any area where power is required, providing a significant optimization in energy generation, storage and consumption. However, the information that must be treated to accomplish these tasks is challenging both in terms of complexity (semantic features, distributed systems, suitable hardware) and quantity (consumption data, generation data, forecasting functionalities, service reporting), since the different energy beneficiaries are prone to be heterogeneous, as the nature of their own activities is. This paper presents a proposal on how to deal with these issues by using a semantic middleware architecture that integrates different components focused on specific tasks, and how it is used to handle information at every level and satisfy end user requests.

  19. Managing Pan-European mammography images and data using a service oriented architecture

    CERN Document Server

    Amendolia, S R; McClatchey, R; Rogulin, D; Solomonides, T

    2004-01-01

    Medical conditions such as breast cancer, and mammograms as images, are extremely complex with many degrees of variability across the population. An effective solution for the management of disparate mammogram data sources that provides sufficient statistics for complex epidemiological study is a federation of autonomous multi-centre sites which transcends national boundaries. Grid-based technologies are emerging as open-source standards-based solutions for managing and collaborating distributed resources. In the light of these new computing solutions, the MammoGrid project, as one example of a HealthGrid, is developing a Grid-aware medical application which manages a European-wide database of mammograms. The MammoGrid solution utilizes the grid technologies in seamlessly integrating distributed data sets and is investigating the potential of the Grid to support effective co-working among mammogram analysts throughout the EU.

  20. Process-based allometry describes the influence of management on orchard tree aboveground architecture

    Directory of Open Access Journals (Sweden)

    Zachary T. Brym

    2018-06-01

    Full Text Available We evaluated allometric relationships in length, diameter, and mass of branches for two variably managed orchard tree species (tart cherry, Prunus cerasus; apple, Malus spp.). The empirically estimated allometric exponents (a) of the orchard trees were described in the context of two process-based allometry models that make predictions for a: the West, Brown and Enquist fractal branching model (WBE) and the recently introduced Flow Similarity model (FS). These allometric models make predictions about relationships in plant morphology (e.g., branch mass, diameter, length, volume, surface area) based on constraints imposed on plant growth by physical and physiological processes. We compared our empirical estimates of a to the model predictions to interpret the physiological implications of pruning and management in orchard systems. Our study found strong allometric relationships among the species and individuals studied, with limited agreement with the expectations of either model. The 8/3-power law prediction of the mass ∼ diameter relationship by the WBE model, indicative of biomechanical limitations, was marginally supported by this study. Length-including allometric relationships deviated from the predictions of both models, but shifted toward the expectation of flow similarity. In this way, managed orchard trees deviated from strict adherence to the idealized expectations of the models, but still fell within the range of model expectations in many cases despite intensive management.
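
    As a minimal sketch of how such an allometric exponent can be estimated and compared with the model expectations (synthetic data and ordinary least squares on log-transformed values are illustrative assumptions, not the authors' fitting protocol):

        import numpy as np

        rng = np.random.default_rng(42)

        # synthetic branch data: mass ~ diameter^(8/3) with multiplicative noise
        diameter = rng.uniform(0.5, 8.0, size=200)
        mass = 0.12 * diameter ** (8.0 / 3.0) * np.exp(rng.normal(0.0, 0.15, size=200))

        # estimate a in  mass = b * diameter^a  from the log-log relationship
        a_hat, log_b = np.polyfit(np.log(diameter), np.log(mass), deg=1)

        print(f"estimated exponent a = {a_hat:.2f}")   # close to 2.67 for this synthetic set
        print(f"WBE expectation      = {8.0 / 3.0:.2f}")

    Comparing such fitted exponents against the WBE and flow-similarity expectations, branch class by branch class, is the kind of test summarized in the abstract.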

  1. Building Quality into Learning Management Systems – An Architecture-Centric Approach

    NARCIS (Netherlands)

    Avgeriou, P.; Retalis, Simos; Skordalakis, Manolis

    2003-01-01

    The design and development of contemporary Learning Management Systems (LMS) are largely focused on satisfying functional requirements rather than quality requirements, thus resulting in inefficient systems of poor software and business quality. In order to remedy this problem there is a research…

  2. Architecturally Significant Requirements Identification, Classification and Change Management for Multi-tenant Cloud-Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Probst, Christian W.

    2017-01-01

    We have presented a framework for requirements classification and change management focusing on distributed Platform as a Service (PaaS) and Software as a Service (SaaS) systems, as well as complex software ecosystems that are built using PaaS and SaaS, such as Tools as a Service (TaaS). We have demonstrated…

  3. DGSim : comparing grid resource management architectures through trace-based simulation

    NARCIS (Netherlands)

    Iosup, A.; Sonmez, O.O.; Epema, D.H.J.; Luque, E.; Margalef, T.; Benítez, D.

    2008-01-01

    Many advances in grid resource management are still required to realize the grid computing vision of the integration of a world-wide computing infrastructure for scientific use. The pressure for advances is increased by the fast evolution of single, large clusters, which are the primary

  4. A systematic literature review on the architecture of business process management systems

    NARCIS (Netherlands)

    Pourmirza, S.; Peters, S.P.F.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Due to the high complexity of modern-day business, organizations are forced to quickly adapt to a wide range of cutting-edge developments. These developments influence the structure and behavior of the business processes that represent the work and of the Business Process Management Systems (BPMS)

  5. A Knowledge Management Technology Architecture for Educational Research Organisations: Scaffolding Research Projects and Workflow Processing

    Science.gov (United States)

    Muthukumar; Hedberg, John G.

    2005-01-01

    There is growing recognition that the economic climate of the world is shifting towards a knowledge-based economy where knowledge will be cherished as the most prized asset. In this regard, technology can be leveraged as a useful tool in effectually managing the knowledge capital of an organisation. Although several research studies have advanced…

  6. Notification Event Architecture for Traveler Screening: Predictive Traveler Screening Using Event Driven Business Process Management

    Science.gov (United States)

    Lynch, John Kenneth

    2013-01-01

    Using an exploratory model of the 9/11 terrorists, this research investigates the linkages between Event Driven Business Process Management (edBPM) and decision making. Although the literature on the role of technology in efficient and effective decision making is extensive, research has yet to quantify the benefit of using edBPM to aid the…

  7. Risk management in architectural design control of uncertainty over building use and maintenance

    CERN Document Server

    Martani, Claudio

    2015-01-01

    This book analyzes the risk management process in relation to building design and operation and on this basis proposes a method and a set of tools that will improve the planning and evaluation of design solutions in order to control risks in the operation and management phase. Particular attention is paid to the relationship between design choices and the long-term performance of buildings in meeting requirements expressing user and client needs. A risk dashboard is presented as a risk measurement framework that identifies and addresses areas of uncertainty surrounding the satisfaction of particularly relevant requirements over time. This risk dashboard will assist both designers and clients. It will support designers by enabling them to improve the maintainability of project performance and will aid clients both in devising a brief that emphasizes the most relevant aspects of maintainability and in evaluating project proposals according to long-term risks. The results of assessment of the proposed method and...

  8. A Review of Kenya´s Post-Conflict Peace Building and Conflict Management Architecture

    DEFF Research Database (Denmark)

    Owiso, Michael

    2018-01-01

    Conflict management and peacebuilding demand a deep understanding and analysis of the conflict and the circumstances surrounding it. This is because the causes may be complex, nuanced and may involve both short-term and long-term issues. Kenya is characterized by different forms of conflict, ranging from election-related violence, inter-communal rivalries, a history of marginalization, as well as gender-related violence, among others. This chapter is a critical analysis of Kenya's response to conflict, focusing on the country's infrastructure for peace. The infrastructure is anchored on the National Policy for Peacebuilding and Conflict Management adopted in July 2014 by parliament. The policy, developed through a process of multi-agency consultations, articulates the country's vision and strategy for responding to conflict. Although the policy is still nascent, the paper seeks to evaluate its…

  9. Design, architecture and implementation of a residential energy box management tool in a SmartGrid

    International Nuclear Information System (INIS)

    Ioakimidis, Christos S.; Oliveira, Luís J.; Genikomsakis, Konstantinos N.; Dallas, Panagiotis I.

    2014-01-01

    This paper presents the EB (energy box) concept in the context of the V2G (vehicle-to-grid) technology to address the energy management needs of a modern residence, considering that the available infrastructure includes micro-renewable energy sources in the form of solar and wind power, the electricity loads consist of “smart” and conventional household appliances, while the battery of an EV (electric vehicle) plays the role of local storage. The problem is formulated as a multi-objective DSP (dynamic stochastic programming) model in order to maximize comfort and lifestyle preferences and minimize cost. Combining the DSP model that controls the EB operation with a neural network based approach for simulating the thermal model of a building, a set of scenarios are examined to exemplify the applicability of the proposed energy management tool. The EB is capable of working under real-time tariff and placing bids in electricity markets both as a stand-alone option and integrated in a SmartGrid paradigm, where a number of EBs are managed by an aggregator. The results obtained for the Portuguese tertiary electricity market indicate that this approach has the potential to compete as an ancillary service and sustain business with benefits for both the microgrid and residence occupants. - Highlights: • The energy box is a residential energy management tool in the context of V2G (vehicle-to-grid). • Multi-objective dynamic stochastic programming is used to model the energy box. • The energy box is working under real-time electricity pricing. • The proposed implementation is capable of placing bids in electricity markets. • The results indicate its potential to compete in the Portuguese tertiary market
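
    As a rough illustration of the cost-versus-comfort trade-off such an energy box resolves (and not the multi-objective dynamic stochastic programme of the cited work), the toy rule below charges an EV battery in the cheapest hours of an hourly tariff while still securing the energy required before departure; the tariff values, charger rating and function names are invented for the example.

      # Toy illustration only: greedy charging under an hourly tariff.
      def plan_charging(tariff_eur_per_kwh, required_kwh, charger_kw=3.0):
          """Return the hours (indices) in which to charge, cheapest first."""
          hours_needed = int(-(-required_kwh // charger_kw))  # ceiling division
          cheapest = sorted(range(len(tariff_eur_per_kwh)),
                            key=lambda h: tariff_eur_per_kwh[h])[:hours_needed]
          return sorted(cheapest)

      tariff = [0.22, 0.18, 0.12, 0.10, 0.11, 0.19, 0.25, 0.30]  # EUR/kWh for h0..h7
      print(plan_charging(tariff, required_kwh=12))  # -> [1, 2, 3, 4]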

  10. A Proposed Architecture for Implementing a Knowledge Management System in the Brazilian National Cancer Institute

    Directory of Open Access Journals (Sweden)

    José Geraldo Pereira Barbosa

    2009-07-01

    Full Text Available Because their services are based decisively on the collection, analysis and exchange of clinical information or knowledge, within and across organizational boundaries, knowledge management has exceptional application and importance to health care organizations. This article proposes a conceptual framework for a knowledge management system, which is expected to support both hospitals and the oncology network in Brazil. Under this holistic single-case study, triangulation of multiple sources of data collection was used by means of archival records, documents and participant observation, as two of the authors were serving as INCA staff members, thus gaining access to the event and its documentation and being able to perceive reality from an insider point of view. The benefits derived from the present status of the ongoing implementation, so far, are: (i) speediness of cancer diagnosis and enhanced quality of both diagnosis and data used in epidemiological studies; (ii) reduction in treatment costs; (iii) relief of INCA's labor shortage; (iv) improved management performance; (v) better use of installed capacity; (vi) easiness of massive (explicit) knowledge transference among the members of the network; and (vii) increase in organizational capacity of knowledge retention (institutionalization of procedures).

  11. Synthesis and properties of ZnFe2O4 replica with biological hierarchical structure

    International Nuclear Information System (INIS)

    Liu, Hongyan; Guo, Yiping; Zhang, Yangyang; Wu, Fen; Liu, Yun; Zhang, Di

    2013-01-01

    Highlights: • ZFO replica with hierarchical structure was synthesized from butterfly wings. • Biotemplate has a significant impact on the properties of ZFO material. • Our method opens up new avenues for the synthesis of spinel ferrites. -- Abstract: ZnFe2O4 (ZFO) replica with biological hierarchical structure was synthesized from Papilio paris by a sol–gel method followed by calcination. The crystallographic structure and morphology of the obtained samples were characterized by X-ray diffraction, field-emission scanning electron microscopy, and transmission electron microscopy. The results showed that the hierarchical structures were retained in the ZFO replica of spinel structure. The magnetic behavior of such novel products was measured by a vibrating sample magnetometer. A superparamagnetism-like behavior was observed due to nanostructuration size effects. In addition, the ZFO replica with "quasi-honeycomb-like structure" showed a much higher specific capacitance of 279.4 F g−1 at 10 mV s−1 in comparison with ZFO powder at 137.3 F g−1, attributed to the significantly increased surface area. These results demonstrate that the ZFO replica is a promising candidate for novel magnetic devices and supercapacitors.

  12. A distance-aware replica adaptive data gathering protocol for Delay Tolerant Mobile Sensor Networks.

    Science.gov (United States)

    Feng, Yong; Gong, Haigang; Fan, Mingyu; Liu, Ming; Wang, Xiaomin

    2011-01-01

    In Delay Tolerant Mobile Sensor Networks (DTMSNs), which have the inherent features of intermittent connectivity and frequently changing network topology, it is reasonable to utilize multi-replica schemes to improve the data gathering performance. However, most existing multi-replica approaches inject a large number of message copies into the network to increase the probability of message delivery, which may drain each mobile node's limited battery supply faster and result in too much contention for the restricted resources of the DTMSN, so a proper data gathering scheme needs a trade-off between the number of replica messages and network performance. In this paper, we propose a new data gathering protocol called DRADG (for Distance-aware Replica Adaptive Data Gathering protocol), which economizes network resource consumption by using a self-adapting algorithm to cut down the number of redundant replicas of messages, and achieves good network performance by leveraging the delivery probabilities of the mobile sensors as the main routing metrics. Simulation results have shown that the proposed DRADG protocol achieves comparable or higher message delivery ratios at a much lower transmission overhead than several current DTMSN data gathering schemes.
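
    A rough sketch of the underlying idea follows: a node adapts how many copies of a message it may still spawn to its estimated delivery probability and forwards only to better-placed neighbours. The linear budget rule and the function names are assumptions for illustration, not the actual DRADG algorithm.

      # Illustrative sketch only -- not the DRADG algorithm from the cited paper.
      def replica_budget(delivery_prob: float, max_copies: int = 8) -> int:
          """Copies this node may still distribute; fewer when delivery is likely."""
          if not 0.0 <= delivery_prob <= 1.0:
              raise ValueError("delivery probability must lie in [0, 1]")
          return max(1, round(max_copies * (1.0 - delivery_prob)))

      def should_forward(my_prob: float, neighbour_prob: float) -> bool:
          """Hand a copy only to neighbours more likely to reach the sink."""
          return neighbour_prob > my_prob

      print(replica_budget(0.2), replica_budget(0.9), should_forward(0.2, 0.6))  # 6 1 True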

  13. Information Architecture Used to Manage Multi-Domain Data Analysis in Intensively Managed Landscape - Critical Zone Observatory

    Science.gov (United States)

    Kooper, R.; Angelo, B.; Marini, L.; Kumar, P.; Muste, M.

    2016-12-01

    The Intensively Managed Landscapes-Critical Zone Observatory (IML-CZO) is a multi-agency partnership that aims to understand the co-evolutionary dynamics of change in the context of the landscape, soil, and biota. The Data Management aspect of IML-CZO provides data preservation and analysis for each of the scientific domains as they pursue environmental monitoring throughout the midwestern United States. Data Management is facilitated via data ingestion and storage through Clowder, an open-source, scalable data repository for organizing and analyzing data, and Geodashboard, a web application that supports exploring, querying, visualizing and downloading the data ingested into Clowder. The data collected cover many domains including geology, hydrology, and bioengineering. The data across these domains vary greatly, from real-time streams of environmental measurements to individual soil samples that are sent through a series of laboratories for analysis. All data can be uploaded to Clowder, where metadata can be extracted or dynamically calculated based on the nature of the information. Geodashboard was created to provide scientists with a tool to explore data across these varying domains and to visualize the data extracted by Clowder. Once Clowder has extracted the data, they are available for querying through a REST API for standardized and streamlined access. Users are able to explore the data on multiple axes and can download data across multiple domains in a standardized format for further analysis and research. IML-CZO's Clowder has over 60 users and over 180 datasets. There are over 1.1 million extracted data points that date back to 1992, and the collection is continually growing.
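
    A minimal sketch of what such standardized REST access can look like from a client script is given below; the base URL, endpoint path and "key" parameter are assumptions made for the example, not the documented IML-CZO/Clowder interface.

      import requests

      BASE_URL = "https://data.example.org/clowder/api"  # hypothetical deployment
      API_KEY = "replace-with-your-key"                  # placeholder credential

      def list_datasets(limit=20):
          """Fetch dataset descriptions as JSON and return their names."""
          resp = requests.get(f"{BASE_URL}/datasets",
                              params={"limit": limit, "key": API_KEY},
                              timeout=30)
          resp.raise_for_status()
          return [d.get("name", "<untitled>") for d in resp.json()]

      if __name__ == "__main__":
          for name in list_datasets():
              print(name)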

  14. Multilevel QoS-policy-based routing management architecture appropriate for heterogeneous network environments

    Science.gov (United States)

    Chatzaki, Magda; Sartzetakis, Stelios

    1998-09-01

    As telecom providers introduce new and more sophisticated services, the necessity of a global, unified view of the network infrastructure becomes demanding. Today, heterogeneous backbone networks are interconnected in order to provide global connectivity. Due to technological impairments, the cost of network operation, the maintenance complexity and the overuse of resources are extremely high under the goal of supporting diverging customer requirements. We propose a scheme for ATM QoS support in such a heterogeneous, multi-domain, multi-technology network environment. The objective is to optimize users' and networks' profits by giving them the opportunity to satisfy their requirements. Our approach introduces a manager able to take routing decisions supporting quality-of-service guarantees for the customers, while making efficient use of network resources.

  15. Communications technologies for demand side management, DSM, and European utility communications architecture, EurUCA

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S.; Kekkonen, V. [VTT Energy, Espoo (Finland); Rissanen, P. [Tietosavo Oy (Finland)

    1996-12-31

    In this project the main target is to develop and assess methods for DSM (Demand Side Management) and distribution automation planning from the utility's point of view. The final goal is to integrate these methods into the strategic planning of electric utilities. In practice the project is divided into four main parts: the development and assessment of DSM/IRP planning methods and cost/benefit analysis as a part of international co-operation (IEA DSM Agreement: Annex IV, European Cost/Benefit analysis of DSM, EUBC, and the Finnish SAVE project started in 1995 in co-operation with SRC International and six electric utilities in Finland); development of PC-based DSM planning and assessment tools at VTT; development of a decision support system for distribution network planning including DSM options at Tietosavo Oy; and integration of DSM planning and network planning tools in co-operation with VTT Energy and Tietosavo Oy.

  17. Microlens fabrication by replica molding of frozen laser-printed droplets

    Science.gov (United States)

    Surdo, Salvatore; Diaspro, Alberto; Duocastella, Martí

    2017-10-01

    In this work, we synergistically combine laser-induced forward transfer (LIFT) and replica molding for the fabrication of microlenses with control of their geometry and size independent of the material or substrate used. Our approach is based on a multistep process in which liquid microdroplets of an aqueous solution are first printed on a substrate by LIFT. Following a freezing step, the microdroplets are used as a master to fabricate a polydimethylsiloxane (PDMS) mold. A subsequent replica molding step enables the creation of microlenses and microlens arrays on arbitrarily selected substrates and with different curable polymers. Thus, our method combines the rapid fabrication capabilities of LIFT and the perfectly smooth surface quality of the generated microdroplets with the advantages of replica molding in terms of parallelization and materials flexibility. We demonstrate our strategy by generating microlenses of different photocurable polymers and by characterizing their optical and morphological properties.

  18. Surgery planning and navigation by laser lithography plastic replica. Features, clinical applications, and advantages

    International Nuclear Information System (INIS)

    Kihara, Tomohiko; Tanaka, Yuuko; Furuhata, Kentaro

    1995-01-01

    The use of three-dimensional replicas created using laser lithography has recently become popular for surgical planning and intraoperative navigation in plastic surgery and oral maxillofacial surgery. In this study, we investigated the many clinical applications in which we have been involved regarding the production of three-dimensional replicas. We have also analyzed the features, application classes, and advantages of this method. As a result, clinical applications are categorized into three classes: 'three-dimensional shape recognition', 'simulated surgery', and 'template'. The distinct features of three-dimensional replicas are 'direct recognition', 'fast manipulation', and 'free availability'. Meeting the requirements of surgical planning and intraoperative navigation, they have produced satisfactory results in clinical applications. (author)

  19. A new Geo-Information Architecture for Risk Management in the Alps

    Science.gov (United States)

    Baruffini, Mi.; Thuering, M.

    2009-04-01

    During the last decades land-use increased significantly in the Swiss (and European) mountain regions. Due to the scarceness of areas suitable for development, anthropic activities were extended into areas prone to natural hazards such as avalanches, debris flows and rockfalls (Smith 2001). Furthermore, the transalpine transport system's necessity to develop effective links in an important area collides with the need to ensure the safety of travelers and the health of the population. Consequently, an increase in losses due to hazards can be observed. To mitigate these associated losses, both traditional protective measures and land-use planning policies are to be developed and implemented to optimize future investments. Efficient protection alternatives can be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. As part of the Swiss National Science Foundation Project 54 "Evaluation of the optimal resilience for vulnerable infrastructure networks - An interdisciplinary pilot study on the transalpine transportation corridors" we study the vulnerability of infrastructures due to natural hazards. The project aims to study various natural hazards (and later, even man-made ones) and to obtain an evaluation of the resilience according to an interdisciplinary approach, considering the possible damage by means of risk criteria and pointing out the feasibility of conceivable measures to reduce potential damage. The project consists of a geoscientific part and an application. The first part consists of studying

  20. Acceleration of Lateral Equilibration in Mixed Lipid Bilayers Using Replica Exchange with Solute Tempering.

    Science.gov (United States)

    Huang, Kun; García, Angel E

    2014-10-14

    The lateral heterogeneity of cellular membranes plays an important role in many biological functions such as signaling and regulating membrane proteins. This heterogeneity can result from preferential interactions between membrane components or interactions with membrane proteins. One major difficulty in molecular dynamics simulations aimed at studying the membrane heterogeneity is that lipids diffuse slowly and collectively in bilayers, and therefore, it is difficult to reach equilibrium in lateral organization in bilayer mixtures. Here, we propose the use of the replica exchange with solute tempering (REST) approach to accelerate lateral relaxation in heterogeneous bilayers. REST is based on the replica exchange method but tempers only the solute, leaving the temperature of the solvent fixed. Since the number of replicas in REST scales approximately only with the degrees of freedom in the solute, REST enables us to enhance the configuration sampling of lipid bilayers with fewer replicas, in comparison with the temperature replica exchange molecular dynamics simulation (T-REMD) where the number of replicas scales with the degrees of freedom of the entire system. We apply the REST method to a cholesterol and 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) bilayer mixture and find that the lateral distribution functions of all molecular pair types converge much faster than in the standard MD simulation. The relative diffusion rate between molecules in REST is, on average, an order of magnitude faster than in the standard MD simulation. Although REST was initially proposed to study protein folding and its efficiency in protein folding is still under debate, we find a unique application of REST to accelerate lateral equilibration in mixed lipid membranes and suggest a promising way to probe membrane lateral heterogeneity through molecular dynamics simulation.
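
    The flavour of the exchange step can be conveyed with the standard Metropolis swap test between two temperature replicas; REST differs in scaling only the solute energy terms, and the unit constant and energies below are invented for the example.

      import math
      import random

      K_B = 0.0019872041  # kcal/(mol*K); assumed unit system for this example

      def accept_swap(energy_i, energy_j, temp_i, temp_j):
          """Metropolis criterion for swapping configurations of replicas i and j."""
          beta_i, beta_j = 1.0 / (K_B * temp_i), 1.0 / (K_B * temp_j)
          delta = (beta_i - beta_j) * (energy_i - energy_j)
          return delta >= 0.0 or random.random() < math.exp(delta)

      # A swap that lowers the energy of the colder replica is always accepted.
      print(accept_swap(-1190.0, -1205.0, temp_i=300.0, temp_j=330.0))  # True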

  1. Robotic architectures

    CSIR Research Space (South Africa)

    Mtshali, M

    2010-01-01

    Full Text Available In the development of mobile robotic systems, a robotic architecture plays a crucial role in interconnecting all the sub-systems and controlling the system. The design of robotic architectures for mobile autonomous robots is a challenging...

  2. On the Performance Evaluation of a MIMO-WCDMA Transmission Architecture for Building Management Systems.

    Science.gov (United States)

    Tsampasis, Eleftherios; Gkonis, Panagiotis K; Trakadas, Panagiotis; Zahariadis, Theodore

    2018-01-08

    The goal of this study was to investigate the performance of a realistic wireless sensor node deployment in order to support modern building management systems (BMSs). A three-floor building orientation is taken into account, where each node is equipped with a multi-antenna system while a central base station (BS) collects and processes all received information. The BS is also equipped with multiple antennas; hence, a multiple input-multiple output (MIMO) system is formulated. Due to the multiple reflections during transmission inside the building, a wideband code division multiple access (WCDMA) physical layer protocol has been considered, which has already been adopted for third-generation (3G) mobile networks. Results are presented for various MIMO orientations, where the mean transmission power per node is considered as an output metric for a specific signal-to-noise ratio (SNR) requirement and number of resolvable multipath components. In the first set of presented results, the effects of multiple access interference on overall transmission power are highlighted. As the number of mobile nodes per floor or the requested transmission rate increases, MIMO systems of a higher order should be deployed in order to maintain transmission power at adequate levels. In the second set of results, a comparison is performed between transmission in diversity combining mode and spatial multiplexing mode, which clearly indicates that the first case is the most appropriate solution for indoor communications.

  3. Web 2.0 systems supporting childhood chronic disease management: a pattern language representation of a general architecture.

    Science.gov (United States)

    Timpka, Toomas; Eriksson, Henrik; Ludvigsson, Johnny; Ekberg, Joakim; Nordfeldt, Sam; Hanberger, Lena

    2008-11-28

    Chronic disease management is a global health concern. By the time they reach adolescence, 10-15% of all children live with a chronic disease. The role of educational interventions in facilitating adaptation to chronic disease is receiving growing recognition, and current care policies advocate greater involvement of patients in self-care. Web 2.0 is an umbrella term for new collaborative Internet services characterized by user participation in developing and managing content. Key elements include Really Simple Syndication (RSS) to rapidly disseminate awareness of new information; weblogs (blogs) to describe new trends, wikis to share knowledge, and podcasts to make information available on personal media players. This study addresses the potential to develop Web 2.0 services for young persons with a chronic disease. It is acknowledged that the management of childhood chronic disease is based on interplay between initiatives and resources on the part of patients, relatives, and health care professionals, and where the balance shifts over time to the patients and their families. Participatory action research was used to stepwise define a design specification in the form of a pattern language. Support for children diagnosed with diabetes Type 1 was used as the example area. Each individual design pattern was determined graphically using card sorting methods, and textually in the form Title, Context, Problem, Solution, Examples and References. Application references were included at the lowest level in the graphical overview in the pattern language but not specified in detail in the textual descriptions. The design patterns are divided into functional and non-functional design elements, and formulated at the levels of organizational, system, and application design. The design elements specify access to materials for development of the competences needed for chronic disease management in specific community settings, endorsement of self-learning through online peer

  4. Web 2.0 systems supporting childhood chronic disease management: A pattern language representation of a general architecture

    Directory of Open Access Journals (Sweden)

    Ekberg Joakim

    2008-11-01

    Full Text Available Abstract Background Chronic disease management is a global health concern. By the time they reach adolescence, 10–15% of all children live with a chronic disease. The role of educational interventions in facilitating adaptation to chronic disease is receiving growing recognition, and current care policies advocate greater involvement of patients in self-care. Web 2.0 is an umbrella term for new collaborative Internet services characterized by user participation in developing and managing content. Key elements include Really Simple Syndication (RSS) to rapidly disseminate awareness of new information; weblogs (blogs) to describe new trends, wikis to share knowledge, and podcasts to make information available on personal media players. This study addresses the potential to develop Web 2.0 services for young persons with a chronic disease. It is acknowledged that the management of childhood chronic disease is based on interplay between initiatives and resources on the part of patients, relatives, and health care professionals, and where the balance shifts over time to the patients and their families. Methods Participatory action research was used to stepwise define a design specification in the form of a pattern language. Support for children diagnosed with diabetes Type 1 was used as the example area. Each individual design pattern was determined graphically using card sorting methods, and textually in the form Title, Context, Problem, Solution, Examples and References. Application references were included at the lowest level in the graphical overview in the pattern language but not specified in detail in the textual descriptions. Results The design patterns are divided into functional and non-functional design elements, and formulated at the levels of organizational, system, and application design. The design elements specify access to materials for development of the competences needed for chronic disease management in specific community

  5. Ensuring the Quality of Data Packages in the LTER Network Provenance Aware Synthesis Tracking Architecture Data Management System and Archive

    Science.gov (United States)

    Servilla, M. S.; O'Brien, M.; Costa, D.

    2013-12-01

    Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data type errors may occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity and the data are structurally congruent to the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by their 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deploying into PASTA. A total of 32 quality checks have been deployed to date
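
    A minimal sketch of the conditional-check idea described above (valid/warning/error outcomes, with only errors blocking upload) is shown below; the check, its thresholds and names are illustrative assumptions, not code from the PASTA Data Manager Library.

      from enum import Enum

      class CheckStatus(Enum):
          VALID = "valid"
          WARNING = "warning"
          ERROR = "error"

      def check_column_count(metadata_columns: int, data_columns: int) -> CheckStatus:
          """Toy metadata-data congruence check (not the PASTA implementation)."""
          if metadata_columns == data_columns:
              return CheckStatus.VALID
          if abs(metadata_columns - data_columns) == 1:
              return CheckStatus.WARNING  # flag a near-miss but do not block
          return CheckStatus.ERROR        # structure disagrees; block the upload

      def may_upload(statuses):
          return all(s is not CheckStatus.ERROR for s in statuses)

      print(may_upload([check_column_count(12, 12), check_column_count(12, 11)]))  # True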

  6. Replica analysis of partition-function zeros in spin-glass models

    International Nuclear Information System (INIS)

    Takahashi, Kazutaka

    2011-01-01

    We study the partition-function zeros in mean-field spin-glass models. We show that the replica method is useful to find the locations of zeros in a complex parameter plane. For the random energy model, we obtain the phase diagram in the plane and find that there are two types of distributions of zeros: two-dimensional distribution within a phase and one-dimensional one on a phase boundary. Phases with a two-dimensional distribution are characterized by a novel order parameter defined in the present replica analysis. We also discuss possible patterns of distributions by studying several systems.

  7. Architecture & Environment

    Science.gov (United States)

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  8. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in a cloud-based video surveillance system, replicas occupy a considerable amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behaviors of the users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
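
    The kind of rule such a mechanism implies can be sketched as below: a security level maps to a replica count, and streams from cameras near the one currently watched are pre-cached. The levels, counts and distance threshold are invented for the example, not the scheme of the cited paper.

      # Illustrative parameters only.
      REPLICAS_BY_LEVEL = {"low": 1, "medium": 2, "high": 3}

      def replica_count(security_level):
          return REPLICAS_BY_LEVEL.get(security_level, 2)  # default to 2 copies

      def cameras_to_precache(cameras, watched_id, max_distance=50.0):
          """Ids of cameras within max_distance metres of the watched camera."""
          wx, wy = cameras[watched_id]
          return [cid for cid, (x, y) in cameras.items()
                  if cid != watched_id
                  and ((x - wx) ** 2 + (y - wy) ** 2) ** 0.5 <= max_distance]

      cams = {"c1": (0.0, 0.0), "c2": (30.0, 10.0), "c3": (200.0, 0.0)}
      print(replica_count("high"), cameras_to_precache(cams, "c1"))  # 3 ['c2']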

  9. Three-Dimensional Interpretation of Sculptural Heritage with Digital and Tangible 3D Printed Replicas

    Science.gov (United States)

    Saorin, José Luis; Carbonell-Carrera, Carlos; Cantero, Jorge de la Torre; Meier, Cecile; Aleman, Drago Diaz

    2017-01-01

    Spatial interpretation features as a skill to acquire in the educational curricula. The visualization and interpretation of three-dimensional objects in tactile devices and the possibility of digital manufacturing with 3D printers, offers an opportunity to include replicas of sculptures in teaching and, thus, facilitate the 3D interpretation of…

  10. Evaluation of creep damage development by the replica method; Utvaerdering av krypskadeutveckling med replikmetoden

    Energy Technology Data Exchange (ETDEWEB)

    Storesund, Jan [Det Norske Veritas AB, Stockholm (Sweden); Roennholm, Markku [Fortum (Sweden)

    2002-04-01

    Creep damage development in high-temperature components can be monitored by the replica method. Damage is classified, and an experience-based time period for safe operation, after which a re-inspection should be conducted, is recommended. The original recommendations are still commonly used, but more recently developed ones, which are mostly less conservative, are also available. In the present project, a database of more than 6000 replicas, collected from welded components in Swedish and Finnish power plants, has been evaluated with respect to damage development. The results are in general in good agreement with the existing developed recommendations for re-inspections. Important factors that should be considered when using the recommendations are highlighted: service history; material, welding and heat treatment; measurement of pressure and temperature; system stresses; geometrical stress concentrations and stress distributions; design of components and welds; creep crack growth; starts and stops; and the extent and performance of the replica method. These factors have been analysed with respect to the evaluated data, resulting in comments on the existing recommendations. In addition, recommendations and conditions for high reliability of the replica method are described. The comments and recommendations can be read in separate sections at the end of the report.

  11. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images

    International Nuclear Information System (INIS)

    Laak, Jeroen A.W.M. van der; Dijkman, Henry B.P.M.; Pahlplatz, Martin M.M.

    2006-01-01

    The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000x to 200,000x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy
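
    The core of such a Fourier-based calibration can be sketched as locating the dominant peak of the power spectrum of an intensity profile across the line replica; the synthetic profile, nominal pixel size and line spacing below are assumptions for the example, not parameters of the published procedure.

      import numpy as np

      # Synthetic 1-D profile across a 2160 lines/mm replica (spacing ~463 nm).
      n, pixel_size_nm = 4096, 5.0           # image width and nominal nm per pixel
      true_spacing_nm = 463.0
      x = np.arange(n) * pixel_size_nm
      profile = 1.0 + np.cos(2 * np.pi * x / true_spacing_nm)

      # The dominant peak of the power spectrum gives the line frequency.
      power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
      freqs = np.fft.rfftfreq(n, d=pixel_size_nm)          # cycles per nm
      measured_spacing_nm = 1.0 / freqs[np.argmax(power)]

      # The ratio of certified to measured spacing calibrates the magnification.
      print(f"measured spacing: {measured_spacing_nm:.1f} nm")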

  12. Systematic expansion in the order parameter for replica theory of the dynamical glass transition.

    Science.gov (United States)

    Jacquin, Hugo; Zamponi, Francesco

    2013-03-28

    It has been shown recently that predictions from mode-coupling theory for the glass transition of hard-spheres become increasingly bad when dimensionality increases, whereas replica theory predicts a correct scaling. Nevertheless if one focuses on the regime around the dynamical transition in three dimensions, mode-coupling results are far more convincing than replica theory predictions. It seems thus necessary to reconcile the two theoretic approaches in order to obtain a theory that interpolates between low-dimensional, mode-coupling results, and "mean-field" results from replica theory. Even though quantitative results for the dynamical transition issued from replica theory are not accurate in low dimensions, two different approximation schemes--small cage expansion and replicated hyper-netted-chain (RHNC)--provide the correct qualitative picture for the transition, namely, a discontinuous jump of a static order parameter from zero to a finite value. The purpose of this work is to develop a systematic expansion around the RHNC result in powers of the static order parameter, and to calculate the first correction in this expansion. Interestingly, this correction involves the static three-body correlations of the liquid. More importantly, we separately demonstrate that higher order terms in the expansion are quantitatively relevant at the transition, and that the usual mode-coupling kernel, involving two-body direct correlation functions of the liquid, cannot be recovered from static computations.

  13. Comparison of pulsed versus continuous oxygen delivery using realistic adult nasal airway replicas

    Directory of Open Access Journals (Sweden)

    Chen JZ

    2017-08-01

    Full Text Available John Z Chen,1 Ira M Katz,2 Marine Pichelin,2 Kaixian Zhu,3 Georges Caillibotte,2 Michelle L Noga,4 Warren H Finlay,1 Andrew R Martin1 1Department of Mechanical Engineering, University of Alberta, Edmonton, AB, Canada; 2Medical R&D, Air Liquide Santé International, Centre de Recherche Paris-Saclay, Les Loges-en-Josas, 3Centre Explor!, Air Liquide Healthcare, Gentilly, France; 4Radiology and Diagnostic Imaging, University of Alberta, Edmonton, AB, Canada Background: Portable oxygen concentrators (POCs typically include pulse flow (PF modes to conserve oxygen. The primary aims of this study were to develop a predictive in vitro model for inhaled oxygen delivery using a set of realistic airway replicas, and to compare PF for a commercial POC with steady flow (SF from a compressed oxygen cylinder. Methods: Experiments were carried out using a stationary compressed oxygen cylinder, a POC, and 15 adult nasal airway replicas based on airway geometries derived from medical images. Oxygen delivery via nasal cannula was tested at PF settings of 2.0 and 6.0, and SF rates of 2.0 and 6.0 L/min. A test lung simulated three breathing patterns representative of a chronic obstructive pulmonary disease patient at rest, during exercise, and while asleep. Volume-averaged fraction of inhaled oxygen (FiO2 was calculated by analyzing oxygen concentrations sampled at the exit of each replica and inhalation flow rates over time. POC pulse volumes were also measured using a commercial O2 conserver test system to attempt to predict FiO2 for PF. Results: Relative volume-averaged FiO2 using PF ranged from 68% to 94% of SF values, increasing with breathing frequency and tidal volume. Three of 15 replicas failed to trigger the POC when used with the sleep breathing pattern at the 2.0 setting, and four of 15 replicas failed to trigger at the 6.0 setting. FiO2 values estimated from POC pulse characteristics followed similar trends but were lower than those derived from
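
    The volume-averaging step described in the methods can be sketched as integrating the sampled oxygen fraction weighted by the inhalation flow over one breath; the waveforms below are invented for the example and are not study data.

      import numpy as np

      t = np.linspace(0.0, 1.5, 151)                             # s, one inhalation
      flow_lps = np.maximum(0.0, 0.5 * np.sin(np.pi * t / 1.5))  # inhalation flow, L/s
      o2_fraction = 0.21 + 0.30 * np.exp(-3.0 * t)               # O2 fraction at replica exit

      # Volume-averaged FiO2 = integral(C*Q dt) / integral(Q dt) over the inhalation.
      inhaled_volume = np.trapz(flow_lps, t)
      fio2 = np.trapz(o2_fraction * flow_lps, t) / inhaled_volume
      print(f"volume-averaged FiO2 = {fio2:.3f} over {inhaled_volume:.2f} L inhaled")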

  14. Collaborative architectural design management

    NARCIS (Netherlands)

    Sebastian, R.; Prins, M.

    2008-01-01

    A building project goes through a long process from project conception to realisation, handover, and operation, and involves a large number of people and organisations. As a project becomes more complex, more teams of specialists are required to combine their effort with considerable enthusiasm and

  15. Effect of cation nature of zeolite on carbon replicas and their electrochemical capacitance

    International Nuclear Information System (INIS)

    Zhou, Jin; Li, Wen; Zhang, Zhongshen; Wu, Xiaozhong; Xing, Wei; Zhuo, Shuping

    2013-01-01

    Graphical abstract: Cation nature of zeolite influences the porosity and surface chemical properties of carbon replicas of zeolite, resulting in different electrochemical capacitance. Highlights: ► The porosity of carbon replica strongly depends on zeolite's effective pore size. ► The surface chemical properties are influenced by the cation nature of zeolite. ► The N-doping introduces large pseudo-capacitance. ► The HYC800 carbon showed a high capacitance of up to 312 F g−1 in 1 M H2SO4. ► The prepared carbons show good durability of galvanostatic cycling. -- Abstract: N-doped carbon replicas of zeolite Y are prepared, and the effect of the cation nature of the zeolite (H+ or Na+) on the carbon replicas is studied. The morphology, structure and surface properties of the carbon materials are investigated by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), N2 adsorption, X-ray photoelectron spectroscopy (XPS) and Fourier transform infrared spectroscopy (FT-IR). The pore regularity, pore parameters and surface chemical properties of the carbons may strongly depend on the cation nature of the zeolite Y. The carbon replicas of zeolite HY (H-form of zeolite Y) possess higher pore regularity and much larger surface area than those of zeolite NaY (Na-form of zeolite Y), while the latter carbons seem to possess higher carbonization degrees. Electrochemical measurements show a large faradaic capacitance related to the N- or O-containing groups in the prepared carbons. Owing to the large specific surface area, high pore regularity and heteroatom doping, the HYC800 sample derived from zeolite HY presents a very high gravimetric capacitance, up to 312.4 F g−1 in H2SO4 electrolyte, and this carbon can operate at 1.2 V with a good retention ratio in the range of 0.25 to 10 A g−1.

  16. Coulomb replica-exchange method: handling electrostatic attractive and repulsive forces for biomolecules.

    Science.gov (United States)

    Itoh, Satoru G; Okumura, Hisashi

    2013-03-30

    We propose a new type of the Hamiltonian replica-exchange method (REM) for molecular dynamics (MD) and Monte Carlo simulations, which we refer to as the Coulomb REM (CREM). In this method, electrostatic charge parameters in the Coulomb interactions are exchanged among replicas while temperatures are exchanged in the usual REM. By varying the atom charges, the CREM overcomes free-energy barriers and realizes more efficient sampling in the conformational space than the REM. Furthermore, this method requires only a smaller number of replicas because only the atom charges of solute molecules are used as exchanged parameters. We performed Coulomb replica-exchange MD simulations of an alanine dipeptide in explicit water solvent and compared the results with those of the conventional canonical, replica exchange, and van der Waals REMs. Two force fields of AMBER parm99 and AMBER parm99SB were used. As a result, the CREM sampled all local-minimum free-energy states more frequently than the other methods for both force fields. Moreover, the Coulomb, van der Waals, and usual REMs were applied to a fragment of an amyloid-β peptide (Aβ) in explicit water solvent to compare the sampling efficiency of these methods for a larger system. The CREM sampled structures of the Aβ fragment more efficiently than the other methods. We obtained β-helix, α-helix, 3(10)-helix, β-hairpin, and β-sheet structures as stable structures and deduced pathways of conformational transitions among these structures from a free-energy landscape. Copyright © 2012 Wiley Periodicals, Inc.

  17. Efficient Round-Trip Time Optimization for Replica-Exchange Enveloping Distribution Sampling (RE-EDS).

    Science.gov (United States)

    Sidler, Dominik; Cristòfol-Clough, Michael; Riniker, Sereina

    2017-06-13

    Replica-exchange enveloping distribution sampling (RE-EDS) allows the efficient estimation of free-energy differences between multiple end-states from a single molecular dynamics (MD) simulation. In EDS, a reference state is sampled, which can be tuned by two types of parameters, i.e., smoothness parameters(s) and energy offsets, such that all end-states are sufficiently sampled. However, the choice of these parameters is not trivial. Replica exchange (RE) or parallel tempering is a widely applied technique to enhance sampling. By combining EDS with the RE technique, the parameter choice problem could be simplified and the challenge shifted toward an optimal distribution of the replicas in the smoothness-parameter space. The choice of a certain replica distribution can alter the sampling efficiency significantly. In this work, global round-trip time optimization (GRTO) algorithms are tested for the use in RE-EDS simulations. In addition, a local round-trip time optimization (LRTO) algorithm is proposed for systems with slowly adapting environments, where a reliable estimate for the round-trip time is challenging to obtain. The optimization algorithms were applied to RE-EDS simulations of a system of nine small-molecule inhibitors of phenylethanolamine N-methyltransferase (PNMT). The energy offsets were determined using our recently proposed parallel energy-offset (PEOE) estimation scheme. While the multistate GRTO algorithm yielded the best replica distribution for the ligands in water, the multistate LRTO algorithm was found to be the method of choice for the ligands in complex with PNMT. With this, the 36 alchemical free-energy differences between the nine ligands were calculated successfully from a single RE-EDS simulation 10 ns in length. Thus, RE-EDS presents an efficient method for the estimation of relative binding free energies.

  18. Designing an architectural style for dynamic medical Cross-Organizational Workflow management system: an approach based on agents and web services.

    Science.gov (United States)

    Bouzguenda, Lotfi; Turki, Manel

    2014-04-01

    This paper shows how the combined use of agent and web services technologies can help to design an architectural style for dynamic medical Cross-Organizational Workflow (COW) management system. Medical COW aims at supporting the collaboration between several autonomous and possibly heterogeneous medical processes, distributed over different organizations (Hospitals, Clinic or laboratories). Dynamic medical COW refers to occasional cooperation between these health organizations, free of structural constraints, where the medical partners involved and their number are not pre-defined. More precisely, this paper proposes a new architecture style based on agents and web services technologies to deal with two key coordination issues of dynamic COW: medical partners finding and negotiation between them. It also proposes how the proposed architecture for dynamic medical COW management system can connect to a multi-agent system coupling the Clinical Decision Support System (CDSS) with Computerized Prescriber Order Entry (CPOE). The idea is to assist the health professionals such as doctors, nurses and pharmacists with decision making tasks, as determining diagnosis or patient data analysis without stopping their clinical processes in order to act in a coherent way and to give care to the patient.

  19. Architectural Contestation

    NARCIS (Netherlands)

    Merle, J.

    2012-01-01

    This dissertation addresses the reductive reading of Georges Bataille's work done within the field of architectural criticism and theory which tends to set aside the fundamental ‘broken’ totality of Bataille's oeuvre and also to narrowly interpret it as a mere critique of architectural form,

  20. Architecture Sustainability

    NARCIS (Netherlands)

    Avgeriou, Paris; Stal, Michael; Hilliard, Rich

    2013-01-01

    Software architecture is the foundation of software system development, encompassing a system's architects' and stakeholders' strategic decisions. A special issue of IEEE Software is intended to raise awareness of architecture sustainability issues and increase interest and work in the area. The

  1. Memory architecture

    NARCIS (Netherlands)

    2012-01-01

    A memory architecture is presented. The memory architecture comprises a first memory and a second memory. The first memory has at least a bank with a first width addressable by a single address. The second memory has a plurality of banks of a second width, said banks being addressable by components

  2. Zeolite-templated carbon replica: a Grand Canonical Monte-Carlo simulation study

    International Nuclear Information System (INIS)

    Thomas Roussel; Roland J M Pellenq; Christophe Bichara; Roger Gadiou; Antoine Didion; Cathie Vix Guterl; Fabrice Gaslain; Julien Parmentier; Valentin Valtchev; Joel Patarin

    2005-01-01

    Microporous carbon materials are interesting for several applications such as hydrogen storage, catalysis or electrical double layer capacitors. The development of the negative templating method to obtain carbon replicas from ordered templates has led to the synthesis of several new materials which have interesting textural properties, attractive for energy storage. Among the possible templates, zeolites can be used to obtain highly microporous carbon materials. Nevertheless, the phenomena involved in the replica synthesis are not fully understood, and the relationships between the structure of the template, the carbon precursor and the resulting carbon material need to be investigated. Experimental results for carbon zeolite-templated nano-structures can be found in a series of papers; see for instance ref. [1], in which Wang et al describe a route to ultra-small Single Wall Carbon Nano-tubes (SWNTs) using the porosity of zeolite AlPO4-5. After matrix removal, the resulting structure is a free-standing bundle of nano-tubes 4 Angstroms in diameter. However, it is highly desirable to obtain an ordered porous carbon structure that forms a real 3D network to be used for instance in gas storage applications. Carbon replicas of faujasite and EMT zeolites can have these properties since these zeolites have a 3D porous network made of 10 Angstrom cages connected to each other through 7 Angstrom windows. The first step of this study was to generate theoretical carbon replica structures of various zeolites (faujasite, EMT, AlPO4-5, silicalite). For this purpose, we used the Grand Canonical Monte-Carlo (GCMC) technique, in which the carbon-carbon interactions were described within the frame of a newly developed Tight Binding approach and the carbon-zeolite interactions were assumed to be characteristic of physisorption. The intrinsic stability of the subsequent carbon nano-structures was then investigated after mimicking the removal of the inorganic phase by switching
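
    For orientation, the elementary move behind such simulations can be sketched with the textbook grand-canonical acceptance rule for inserting one particle; the chemical potential, thermal wavelength and energies below are placeholders in reduced units, and this is not the tight-binding GCMC code of the cited work.

      import math
      import random

      def accept_insertion(n_atoms, volume, beta, mu, delta_u, thermal_wavelength=1.0):
          """Standard GCMC insertion test:
          acc = min(1, V / (lambda^3 * (N+1)) * exp(beta * (mu - dU)))."""
          lam3 = thermal_wavelength ** 3
          ratio = volume / (lam3 * (n_atoms + 1)) * math.exp(beta * (mu - delta_u))
          return random.random() < min(1.0, ratio)

      # Example call with made-up reduced-unit values.
      print(accept_insertion(n_atoms=500, volume=1000.0, beta=1.0, mu=-3.0, delta_u=-1.0))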

  3. Architectural Narratives

    DEFF Research Database (Denmark)

    Kiib, Hans

    2010-01-01

    In this essay, I focus on the combination of programs and the architecture of cultural projects that have emerged within the last few years. These projects are characterized as "hybrid cultural projects," because they intend to combine experience with entertainment, play, and learning. This essay … a functional framework for these concepts, but tries increasingly to endow the main idea of the cultural project with a spatially aesthetic expression - a shift towards "experience architecture." A great number of these projects typically recycle and reinterpret narratives related to historical buildings and architectural heritage; another group tries to embed new performative technologies in expressive architectural representation. Finally, this essay provides a theoretical framework for the analysis of the political rationales of these projects and for the architectural representation bridges the gap between…

  4. Evaluation of generalized degrees of freedom for sparse estimation by replica method

    Science.gov (United States)

    Sakata, A.

    2016-12-01

    We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
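
    For reference, the quantity being analysed is commonly defined (in Ye's formulation; the expression below is the standard definition, not reproduced from the cited paper) as the total sensitivity of the fit to the observations, which for Gaussian observation noise equals a covariance form by Stein's identity:

      \mathrm{GDF}(\hat{y}) \;=\; \sum_{i=1}^{n} \frac{\partial \hat{y}_i}{\partial y_i}
      \;=\; \frac{1}{\sigma^{2}} \sum_{i=1}^{n} \operatorname{Cov}\!\left(\hat{y}_i,\, y_i\right).

    Under the replica-symmetric picture described above, this sum is interpreted in terms of the effective fraction of non-zero components retained by the sparse estimator.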

  5. Neutron and gamma dose and spectra measurements on the Little Boy replica

    International Nuclear Information System (INIS)

    Hoots, S.; Wadsworth, D.

    1984-01-01

    The radiation-measurement team of the Weapons Engineering Division at Lawrence Livermore National Laboratory (LLNL) measured neutron and gamma dose and spectra on the Little Boy replica at Los Alamos National Laboratory (LANL) in April 1983. This assembly is a replica of the gun-type atomic bomb exploded over Hiroshima in 1945. These measurements support the National Academy of Sciences Program to reassess the radiation doses due to atomic bomb explosions in Japan. Specifically, the following types of information were important: neutron spectra as a function of geometry, gamma to neutron dose ratios out to 1.5 km, and neutron attenuation in the atmosphere. We measured neutron and gamma dose/fission from close-in to a kilometer out, and neutron and gamma spectra at 90° and 30° close-in. This paper describes these measurements and the results. 12 references, 13 figures, 5 tables

  6. Proactive replica checking to assure reliability of data in cloud storage with minimum replication

    Science.gov (United States)

    Murarka, Damini; Maheswari, G. Uma

    2017-11-01

    The two major issues for cloud storage systems are data reliability and storage costs. For data reliability protection, the multi-replica replication strategy mostly used in current clouds incurs huge storage consumption, leading to a large storage cost for applications within the cloud. This paper presents a cost-efficient data reliability mechanism named PRCR to cut back cloud storage consumption. PRCR ensures the reliability of large cloud data with a minimum replication, which can also serve as a cost-effective benchmark for replication. The evaluation shows that, compared to the conventional three-replica approach, PRCR can reduce the consumed cloud storage from one-third of the total down to a small fraction of it, hence considerably minimizing the cloud storage cost.
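
    As a rough illustration of proactive replica checking (a hypothetical sketch only, not the PRCR implementation; the class name, the exponential failure model and all thresholds are our assumptions), the snippet below keeps a minimal number of replicas per file and periodically scans the metadata, creating an extra copy only when the predicted reliability over the next scan interval drops below the target.

    import math

    class ProactiveReplicaChecker:
        """Minimal sketch: keep few replicas, add one only when predicted reliability is too low."""
        def __init__(self, target_reliability=0.9999, failure_rate=1e-6, scan_interval=3600):
            self.target = target_reliability
            self.rate = failure_rate          # assumed Poisson failure rate per replica per second
            self.interval = scan_interval     # seconds between proactive scans
            self.files = {}                   # file_id -> current number of replicas

        def add_file(self, file_id, replicas=1):
            self.files[file_id] = replicas

        def predicted_reliability(self, replicas):
            # probability that at least one replica survives the next scan interval
            p_loss_one = 1.0 - math.exp(-self.rate * self.interval)
            return 1.0 - p_loss_one ** replicas

        def scan(self):
            for file_id, replicas in self.files.items():
                while self.predicted_reliability(replicas) < self.target:
                    replicas += 1             # a real system would trigger replica creation here
                    print(f"{file_id}: creating replica #{replicas}")
                self.files[file_id] = replicas

    checker = ProactiveReplicaChecker()
    checker.add_file("dataset-001", replicas=1)
    checker.scan()                            # a real agent would repeat this every scan interval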

  7. Fabrication of micropillar substrates using replicas of alpha-particle irradiated and chemically etched PADC films

    International Nuclear Information System (INIS)

    Ng, C.K.M.; Chong, E.Y.W.; Roy, V.A.L.; Cheung, K.M.C.; Yeung, K.W.K.; Yu, K.N.

    2012-01-01

    We proposed a simple method to fabricate micropillar substrates. Polyallyldiglycol carbonate (PADC) films were irradiated by alpha particles and then chemically etched to form a cast with micron-scale spherical pores. A polydimethylsiloxane (PDMS) replica of this PADC film gave a micropillar substrate with micron-scale spherical pillars. HeLa cells cultured on such a micropillar substrate showed a significantly larger percentage of cells entering S-phase, higher attached cell numbers and larger cell spreading areas. - Highlights: ► We proposed a simple method to fabricate micropillar substrates. ► Polyallyldiglycol carbonate films were irradiated and etched to form casts. ► Polydimethylsiloxane replica then formed the micropillar substrates. ► Attachment and proliferation of HeLa cells were enhanced on these substrates.

  8. Fabrication of micropillar substrates using replicas of alpha-particle irradiated and chemically etched PADC films

    Energy Technology Data Exchange (ETDEWEB)

    Ng, C.K.M. [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong); Chong, E.Y.W. [Department of Orthopaedics and Traumatology, University of Hong Kong (Hong Kong); Roy, V.A.L. [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong); Cheung, K.M.C.; Yeung, K.W.K. [Department of Orthopaedics and Traumatology, University of Hong Kong (Hong Kong); Yu, K.N., E-mail: appetery@cityu.edu.hk [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong)

    2012-07-15

    We proposed a simple method to fabricate micropillar substrates. Polyallyldiglycol carbonate (PADC) films were irradiated by alpha particles and then chemically etched to form a cast with micron-scale spherical pores. A polydimethylsiloxane (PDMS) replica of this PADC film gave a micropillar substrate with micron-scale spherical pillars. HeLa cells cultured on such a micropillar substrate showed a significantly larger percentage of cells entering S-phase, higher attached cell numbers and larger cell spreading areas. - Highlights: ► We proposed a simple method to fabricate micropillar substrates. ► Polyallyldiglycol carbonate films were irradiated and etched to form casts. ► Polydimethylsiloxane replica then formed the micropillar substrates. ► Attachment and proliferation of HeLa cells were enhanced on these substrates.

  9. Neutron and gamma-ray dose-rates from the Little Boy replica

    International Nuclear Information System (INIS)

    Plassmann, E.A.; Pederson, R.A.

    1984-01-01

    We report dose-rate information obtained at many locations in the near vicinity of, and at distances out to 0.64 km from, the Little Boy replica while it was operated as a critical assembly. The measurements were made with modified conventional dosimetry instruments that used an Anderson-Braun detector for neutrons and a Geiger-Mueller tube for gamma rays with suitable electronic modules to count particle-induced pulses. Thermoluminescent dosimetry methods provide corroborative data. Our analysis gives estimates of both neutron and gamma-ray relaxation lengths in air for comparison with earlier calculations. We also show the neutron-to-gamma-ray dose ratio as a function of distance from the replica. Current experiments and further data analysis will refine these results. 7 references, 8 figures

  10. Impact of Channel Estimation Errors on Multiuser Detection via the Replica Method

    Directory of Open Access Journals (Sweden)

    Li Husheng

    2005-01-01

    Full Text Available For practical wireless DS-CDMA systems, channel estimation is imperfect due to noise and interference. In this paper, the impact of channel estimation errors on multiuser detection (MUD) is analyzed under the framework of the replica method. System performance is obtained in the large system limit for optimal MUD, linear MUD, and turbo MUD, and is validated by numerical results for finite systems.

  11. Architectural technology

    DEFF Research Database (Denmark)

    2005-01-01

    The booklet offers an overall introduction to the Institute of Architectural Technology and its projects and activities, and an invitation to the reader to contact the institute or the individual researcher for further information. The research, which takes place at the Institute of Architectural...... Technology at the Royal Danish Academy of Fine Arts, School of Architecture, reflects a spread between strategic, goal-oriented pilot projects, commissioned by a ministry, a fund or a private company, and on the other hand projects which originate from strong personal interests and enthusiasm of individual...

  12. Systemic Architecture

    DEFF Research Database (Denmark)

    Poletto, Marco; Pasquero, Claudia

    -up or tactical design, behavioural space and the boundary of the natural and the artificial realms within the city and architecture. A new kind of "real-time world-city" is illustrated in the form of an operational design manual for the assemblage of proto-architectures, the incubation of proto-gardens...... and the coding of proto-interfaces. These prototypes of machinic architecture materialize as synthetic hybrids embedded with biological life (proto-gardens), computational power, behavioural responsiveness (cyber-gardens), spatial articulation (coMachines and fibrous structures), remote sensing (FUNclouds...

  13. Humanizing Architecture

    DEFF Research Database (Denmark)

    Toft, Tanya Søndergaard

    2015-01-01

    The article proposes the urban digital gallery as an opportunity to explore the relationship between ‘human’ and ‘technology,’ through the programming of media architecture. It takes a curatorial perspective when proposing an ontological shift from considering media facades as visual spectacles...... agency and a sense of being by way of dematerializing architecture. This is achieved by way of programming the symbolic to provide new emotional realizations and situations of enlightenment in the public audience. This reflects a greater potential to humanize the digital in media architecture....

  14. Replica Exchange Gaussian Accelerated Molecular Dynamics: Improved Enhanced Sampling and Free Energy Calculation.

    Science.gov (United States)

    Huang, Yu-Ming M; McCammon, J Andrew; Miao, Yinglong

    2018-04-10

    Through adding a harmonic boost potential to smooth the system potential energy surface, Gaussian accelerated molecular dynamics (GaMD) provides enhanced sampling and free energy calculation of biomolecules without the need of predefined reaction coordinates. This work continues to improve the acceleration power and energy reweighting of the GaMD by combining the GaMD with replica exchange algorithms. Two versions of replica exchange GaMD (rex-GaMD) are presented: force constant rex-GaMD and threshold energy rex-GaMD. During simulations of force constant rex-GaMD, the boost potential can be exchanged between replicas of different harmonic force constants with fixed threshold energy. However, the algorithm of threshold energy rex-GaMD tends to switch the threshold energy between lower and upper bounds for generating different levels of boost potential. Testing simulations on three model systems, including the alanine dipeptide, chignolin, and HIV protease, demonstrate that through continuous exchanges of the boost potential, the rex-GaMD simulations not only enhance the conformational transitions of the systems but also narrow down the distribution width of the applied boost potential for accurate energetic reweighting to recover biomolecular free energy profiles.
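
    To make the exchange step concrete, the following schematic (our own simplification, not the authors' rex-GaMD code) applies a Metropolis-style swap between two replicas that carry GaMD-like boosts with different harmonic force constants k1 and k2 at a shared threshold energy E. The boost form 0.5*k*(E - U)^2 for U < E is the standard GaMD expression, but the acceptance rule below is just the generic detailed-balance swap of Hamiltonian replica exchange and should be read as an illustration only.

    import math, random

    def gamd_boost(U, k, E):
        """GaMD-style harmonic boost, applied only below the threshold energy E."""
        return 0.5 * k * (E - U) ** 2 if U < E else 0.0

    def attempt_swap(U1, U2, k1, k2, E, beta):
        """Generic Hamiltonian replica-exchange acceptance for two replicas that share the
        threshold energy E but use different boost force constants k1 and k2."""
        before = gamd_boost(U1, k1, E) + gamd_boost(U2, k2, E)
        after = gamd_boost(U2, k1, E) + gamd_boost(U1, k2, E)
        delta = beta * (after - before)
        return delta <= 0.0 or random.random() < math.exp(-delta)

    beta = 1.0 / (0.0019872041 * 300.0)   # 1/(kB*T) in kcal/mol at 300 K
    print(attempt_swap(U1=-120.0, U2=-118.5, k1=0.002, k2=0.004, E=-110.0, beta=beta))

    In force constant rex-GaMD the swap is between force constants at a fixed threshold energy, as sketched here; the threshold-energy variant would instead switch E between its lower and upper bounds.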

  15. Biosynthesis of cathodoluminescent zinc oxide replicas using butterfly (Papilio paris) wing scales as templates

    International Nuclear Information System (INIS)

    Zhang Wang; Zhang Di; Fan Tongxiang; Ding Jian; Gu Jiajun; Guo Qixin; Ogawa, Hiroshi

    2009-01-01

    Papilio paris butterflies have an iridescent blue color patch on their hind wings which is visible over a wide viewing angle. Optical and scanning electron microscopy observations of scales from the wings show that the blue color scales have very different microstructure to the matt black ones which also populate the wings. Scanning electron micrographs of the blue scales show that their surfaces comprise a regular two-dimensional array of concavities. By contrast the matt black scales have fine, sponge-like structure, between the ridges and the cross ribs in the scales. Using both types of scale as bio-templates, we obtain zinc oxide (ZnO) replicas of the microstructures of the original scales. Room temperature (T = 300 K) cathodoluminescence spectra of these ZnO replicas have also been studied. Both spectra show a similar sharp near-band-edge emission, but have different green emission, which we associate with the different microstructures of the ZnO replicas

  16. Biosynthesis of cathodoluminescent zinc oxide replicas using butterfly (Papilio paris) wing scales as templates

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Wang [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China); Zhang Di [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China)], E-mail: zhangdi@sjtu.edu.cn; Fan Tongxiang; Ding Jian; Gu Jiajun [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China); Guo Qixin; Ogawa, Hiroshi [Department of Electrical and Electronic Engineering, Saga University, Saga 840-8502 (Japan)

    2009-01-01

    Papilio paris butterflies have an iridescent blue color patch on their hind wings which is visible over a wide viewing angle. Optical and scanning electron microscopy observations of scales from the wings show that the blue color scales have very different microstructure to the matt black ones which also populate the wings. Scanning electron micrographs of the blue scales show that their surfaces comprise a regular two-dimensional array of concavities. By contrast the matt black scales have fine, sponge-like structure, between the ridges and the cross ribs in the scales. Using both types of scale as bio-templates, we obtain zinc oxide (ZnO) replicas of the microstructures of the original scales. Room temperature (T = 300 K) cathodoluminescence spectra of these ZnO replicas have also been studied. Both spectra show a similar sharp near-band-edge emission, but have different green emission, which we associate with the different microstructures of the ZnO replicas.

  17. Replica sizing strategy for aortic valve replacement improves haemodynamic outcome of the epic supra valve.

    Science.gov (United States)

    Gonzalez-Lopez, David; Faerber, Gloria; Diab, Mahmoud; Amorim, Paulo; Zeynalov, Natig; Doenst, Torsten

    2017-10-01

    Current sizing strategies suggest valve selection based on annulus diameter despite supra-annular placement of biological prostheses potentially allowing placement of a larger size. We assessed the frequency of selecting a larger prosthesis if prosthesis size was selected using a replica (upsizing) and evaluated its impact on haemodynamics. We analysed all discharge echocardiograms between June 2012 and June 2014, where a replica sizer was used for isolated aortic valve replacement (Epic Supra: 266 patients, Trifecta: 49 patients). Upsizing was possible in 71% of the Epic Supra valves (by 1 size: 168, by 2 sizes: 20) and in 59% of the Trifectas (by 1 size: 26, by 2 sizes: 3). Patients for whom upsizing was possible had the lowest pressure gradients within their annulus size groups. The difference was significant in annulus diameters of 21-22 or 25-26 mm (Epic Supra) and 23-24 mm (Trifecta). Trifecta gradients were the lowest. However, the ability to upsize the Epic Supra by 2 sizes eliminated the differences between Epic Supra and Trifecta. Upsizing did not cause intraoperative complications. Using replica sizers for aortic prosthesis size selection allows the implantation of bigger prostheses than recommended in most cases and reduces postoperative gradients, specifically for Epic Supra. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  18. Characterization of Nb SRF cavity materials by white light interferometry and replica techniques

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Chen [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); The Applied Science Department, The College of William and Mary, Williamsburg, VA 23185 (United States); Reece, Charles [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); Kelley, Michael, E-mail: mkelley@jlab.org [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); The Applied Science Department, The College of William and Mary, Williamsburg, VA 23185 (United States)

    2013-06-01

    Much work has shown that the topography of the interior surface is an important contributor to the performance of Nb superconducting radiofrequency (SRF) accelerator cavities. Micron-scale topography is implicated in non-linear loss mechanisms that limit the useful accelerating gradient range and impact cryogenic cost. Aggressive final chemical treatments in cavity production seek to reliably obtain “smoothest” surfaces with superior performance. Process development suffers because the cavity interior surface cannot be viewed directly without cutting out pieces, rendering the cavities unavailable for further study. Here we explore replica techniques as an alternative, providing imprints of cavity internal surface that can be readily examined. A second matter is the topography measurement technique used. Atomic force microscopy (AFM) has proven successful, but too time intensive for routine use in this application. We therefore introduce white light interferometry (WLI) as an alternative approach. We examined real surfaces and their replicas, using AFM and WLI. We find that the replica/WLI is promising to provide the large majority of the desired information, recognizing that a trade-off is being made between best lateral resolution (AFM) and the opportunity to examine much more surface area (WLI).

  19. Tunable hydrodynamic characteristics in microchannels with biomimetic superhydrophobic (lotus leaf replica) walls.

    Science.gov (United States)

    Dey, Ranabir; Raj M, Kiran; Bhandaru, Nandini; Mukherjee, Rabibrata; Chakraborty, Suman

    2014-05-21

    The present work comprehensively addresses the hydrodynamic characteristics of flow through microchannels with lotus leaf replica (exhibiting low adhesion and superhydrophobic properties) walls. The lotus leaf replica is fabricated following an efficient, two-step, soft-molding process and is then integrated with rectangular microchannels. The inherent biomimetic, superhydrophobic surface-liquid interfacial hydrodynamics, and the consequential bulk flow characteristics, are critically analyzed by the micro-particle image velocimetry technique. It is observed that the lotus leaf replica-mediated microscale hydrodynamics comprise two distinct flow regimes even within the low Reynolds number paradigm, unlike the commonly perceived solely apparent slip-stick dominated flows over superhydrophobic surfaces. While the first flow regime is characterized by an apparent slip-stick flow culminating in an enhanced bulk throughput rate, the second flow regime exhibits a complete breakdown of the aforementioned laminar and uni-axial flow model, leading to a predominantly no-slip flow. Interestingly, the critical flow condition dictating the transition between the two hydrodynamic regimes is intrinsically dependent on the micro-confinement effect. In this regard, an energetically consistent theoretical model is also proposed to predict the alterations in the critical flow condition with varying microchannel configurations, by addressing the underlying biomimetic surface-liquid interfacial conditions. Hence, the present research endeavour provides a new design-guiding paradigm for developing multi-functional microfluidic devices involving biomimetic, superhydrophobic surfaces, by judicious exploitation of the tunable hydrodynamic characteristics in the two regimes.

  20. Effect of roughness and material strength on the mechanical properties of fracture replicas

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.

    1995-08-01

    This report presents the results of 11 rotary shear tests conducted on replicas of three hollow cylinders of natural fractures with JRC values of 7.7, 9.4 and 12.0. The JRC values were determined from the results of laser profilometer measurements. The replicas were created from gypsum cement. By varying the water-to-gypsum cement ratio from 30 to 45%, fracture replicas with different values of compressive strength (JCS) were created. The rotary shear experiments were performed under constant normal (nominal) stresses ranging between 0.2 and 1.6 MPa. In this report, the shear test results are compared with predictions using Barton's empirical peak shear strength equation. Observations during the experiments indicate that only certain parts of the fracture profiles influence fracture shear strength and dilatancy. Under relatively low applied normal stresses, the JCS does not seem to have a significant effect on shear behavior. As an alternative, a new procedure for predicting the shear behavior of fractures was developed. The approach is based on basic fracture properties such as fracture surface profile data and the compressive strength, modulus of elasticity, and Poisson's ratio of the fracture walls. Comparison between predictions and actual shear test results shows that the alternative procedure is a reliable method.
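
    Since the record compares measured shear strengths with Barton's empirical criterion, its standard textbook form may be worth recalling (symbols follow common rock-mechanics usage and are not quoted from the report itself):

        \tau_p = \sigma_n \tan\!\left( \mathrm{JRC}\,\log_{10}\!\frac{\mathrm{JCS}}{\sigma_n} + \varphi_b \right)

    where \tau_p is the peak shear strength, \sigma_n the effective normal stress, JRC the joint roughness coefficient, JCS the joint wall compressive strength, and \varphi_b the basic friction angle of the fracture walls.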

  1. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen

    environments and a knowledge gap therefore exists in present hospital designs. Consequently, the purpose of this thesis has been to investigate if any research-based knowledge exists supporting the hypothesis that the interior architectural qualities of eating environments influence patient food intake, health...... and well-being, as well as outline a set of basic design principles ‘predicting’ the future interior architectural qualities of patient eating environments. Methodologically, the thesis is based on an explorative study employing an abductive approach and hermeneutic-interpretative strategy utilizing tactics...... and food intake, as well as a series of references exists linking the interior architectural qualities of healthcare environments with the health and wellbeing of patients. On the basis of these findings, the thesis presents the concept of Architectural Theatricality as well as a set of design principles...

  2. A Replica Detection Scheme Based on the Deviation in Distance Traveled Sliding Window for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Alekha Kumar Mishra

    2017-01-01

    Full Text Available Node replication attack possesses a high level of threat in wireless sensor networks (WSNs) and it is severe when the sensors are mobile. A limited number of replica detection schemes in mobile WSNs (MWSNs) have been reported to date, where most of them are centralized in nature. The centralized detection schemes use time-location claims, and the base station (BS) is solely responsible for detecting replicas. Therefore, these schemes are prone to a single point of failure. There is also additional communication overhead associated with sending time-location claims to the BS. A distributed detection mechanism is always a preferred solution to the above kind of problems due to significantly lower communication overhead than their counterparts. In this paper, we propose a distributed replica detection scheme for MWSNs. In this scheme, the deviation in the distance traveled by a node and its replica is recorded by the observer nodes. Every node is an observer node for some nodes in the network. Observers are responsible for maintaining a sliding window of recent time-distance broadcasts of the nodes. A replica is detected by an observer based on the degree of violation computed from the deviations recorded using the time-distance sliding window. The analysis and simulation results show that the proposed scheme is able to achieve higher detection probability compared to distributed replica detection schemes such as Efficient Distributed Detection (EDD) and Multi-Time-Location Storage and Diffusion (MTLSD).
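
    A minimal sketch of the observer-side logic described above is given below (hypothetical code, not the authors' scheme; the window length, speed bound and violation threshold are invented parameters). Each observer stores recent time-location broadcasts of a watched identity in a sliding window and flags a replica when the implied travel speed between claims is physically impossible too often.

    from collections import deque
    import math

    class Observer:
        """Sliding-window check of time-location claims for one watched node identity."""
        def __init__(self, max_speed=10.0, window=8, violation_threshold=3):
            self.max_speed = max_speed            # maximum plausible node speed (m/s), assumed
            self.claims = deque(maxlen=window)    # recent (time, x, y) broadcasts
            self.threshold = violation_threshold

        def receive_claim(self, t, x, y):
            """Return True if the accumulated deviations indicate a replica."""
            self.claims.append((t, x, y))
            claims = list(self.claims)
            violations = 0
            for (t0, x0, y0), (t1, x1, y1) in zip(claims, claims[1:]):
                dt = t1 - t0
                dist = math.hypot(x1 - x0, y1 - y0)
                if dt > 0 and dist / dt > self.max_speed:
                    violations += 1               # more distance than a single node could cover
            return violations >= self.threshold

    obs = Observer()
    # two replicas broadcasting the same identity from far-apart positions
    for t, x, y in [(0, 0, 0), (1, 500, 0), (2, 5, 5), (3, 510, 2), (4, 10, 8), (5, 505, 4)]:
        if obs.receive_claim(t, x, y):
            print(f"replica suspected at t={t}")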

  3. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have...... proportions, to organize the process on site choosing either one-room wall components or several-room wall components – either horizontally or vertically. Combined with the seamless joint, playing with these possibilities lets the new industrialized architecture deliver variations in choice of solutions...... for retrofit design. If we add the question of the installations e.g. ventilation to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer storytelling about new and smart system-based thinking behind architectural expression.

  4. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have...... expression in the specific housing area. It is the aim of this article to expand the different design strategies which architects can use – to give the individual project attitudes and designs with architectural quality. Through the customized component production it is possible to choose different...... for retrofit design. If we add the question of the installations e.g. ventilation to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer storytelling about new and smart system-based thinking behind architectural expression.

  5. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    Architectural freedom and industrialized architecture. Inge Vestergaard, Associate Professor, Cand. Arch. Aarhus School of Architecture, Denmark Noerreport 20, 8000 Aarhus C Telephone +45 89 36 0000 E-mail inge.vestergaard@aarch.dk Based on the repetitive architecture from the "building boom" 1960...... customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have in Denmark been focusing on a more sustainable and low energy building technique, which also include...... to the building physics problems a new industrialized period has started based on lightweight elements basically made of wooden structures, faced with different suitable materials meant for individual expression for the specific housing area. It is the purpose of this article to widen up the different design...

  6. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to the building physics problems a new industrialized period has started based on lightweight elements basically made of wooden structures, faced with different suitable materials meant for individual expression for the specific housing area. It is the purpose of this article to widen up the different design...... to this systematic thinking of the building technique we get a diverse and functional architecture, creating a new and clearer storytelling about new and smart system-based thinking behind the architectural expression.

  7. Architectural geometry

    KAUST Repository

    Pottmann, Helmut; Eigensatz, Michael; Vaxman, Amir; Wallner, Johannes

    2014-01-01

    Around 2005 it became apparent in the geometry processing community that freeform architecture contains many problems of a geometric nature to be solved, and many opportunities for optimization which however require geometric understanding. This area of research, which has been called architectural geometry, meanwhile contains a great wealth of individual contributions which are relevant in various fields. For mathematicians, the relation to discrete differential geometry is significant, in particular the integrable system viewpoint. Besides, new application contexts have become available for quite some old-established concepts. Regarding graphics and geometry processing, architectural geometry yields interesting new questions but also new objects, e.g. replacing meshes by other combinatorial arrangements. Numerical optimization plays a major role but in itself would be powerless without geometric understanding. Summing up, architectural geometry has become a rewarding field of study. We here survey the main directions which have been pursued, we show real projects where geometric considerations have played a role, and we outline open problems which we think are significant for the future development of both theory and practice of architectural geometry.

  8. Architectural geometry

    KAUST Repository

    Pottmann, Helmut

    2014-11-26

    Around 2005 it became apparent in the geometry processing community that freeform architecture contains many problems of a geometric nature to be solved, and many opportunities for optimization which however require geometric understanding. This area of research, which has been called architectural geometry, meanwhile contains a great wealth of individual contributions which are relevant in various fields. For mathematicians, the relation to discrete differential geometry is significant, in particular the integrable system viewpoint. Besides, new application contexts have become available for quite some old-established concepts. Regarding graphics and geometry processing, architectural geometry yields interesting new questions but also new objects, e.g. replacing meshes by other combinatorial arrangements. Numerical optimization plays a major role but in itself would be powerless without geometric understanding. Summing up, architectural geometry has become a rewarding field of study. We here survey the main directions which have been pursued, we show real projects where geometric considerations have played a role, and we outline open problems which we think are significant for the future development of both theory and practice of architectural geometry.

  9. Information Management Architecture for an Integrated Computing Environment for the Environmental Restoration Program. Volume 2, Interim business systems guidance

    International Nuclear Information System (INIS)

    1994-09-01

    As part of the Environmental Restoration Program at Martin Marietta, IEM (Information Engineering Methodology) was developed to provide a complete and integrated approach to the progressive development and subsequent maintenance of automated data sharing systems. This approach is centered around the organization's objectives, inherent data relationships, and business practices. IEM provides the Information Systems community with a tool kit of disciplined techniques supported by automated tools. It includes seven stages: Information Strategy Planning; Business Area Analysis; Business System Design; Technical Design; Construction; Transition; Production. This document focuses on the Business Systems Architecture

  10. Architecture of Environmental Engineering

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Alting, Leo

    2006-01-01

    An architecture of Environmental Engineering has been developed comprising the various disciplines and tools involved. It identifies industry as the major actor and target group, and it builds on the concept of Eco-efficiency. To improve Eco-efficiency, there is a limited number of intervention......-efficiency is the aim of Environmental Engineering, the discipline of synthesis – design and creation of solutions – will form a core pillar of the architecture. Other disciplines of Environmental Engineering exist forming the necessary background and frame for the synthesis. Environmental Engineering, thus, in essence...... comprise the disciplines of: management, system description & inventory, analysis & assessment, prioritisation, synthesis, and communication, each existing at all levels of intervention. The developed architecture of Environmental Engineering, thus, consists of thirty individual disciplines, within each...

  11. Architecture of Environmental Engineering

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Alting, Leo

    2004-01-01

    An architecture of Environmental Engineering has been developed comprising the various disciplines and tools involved. It identifies industry as the major actor and target group, and it builds on the concept of Eco-efficiency. To improve Eco-efficiency, there is a limited number of intervention...... of Eco-efficiency is the aim of Environmental Engineering, the discipline of synthesis – design and creation of solutions – will form a core pillar of the architecture. Other disciplines of Environmental Engineering exist forming the necessary background and frame for the synthesis. Environmental...... Engineering, thus, in essence comprise the disciplines of: management, system description & inventory, analysis & assessment, prioritisation, synthesis, and communication, each existing at all levels of intervention. The developed architecture of Environmental Engineering, thus, consists of thirty individual...

  12. Relational Architecture

    DEFF Research Database (Denmark)

    Reeh, Henrik

    2018-01-01

    in a scholarly institution (element #3), as well as the certified PhD scholar (element #4) and the architectural profession, notably its labour market (element #5). This first layer outlines the contemporary context which allows architectural research to take place in a dynamic relationship to doctoral education...... a human and institutional development going on since around 1990 when the present PhD institution was first implemented in Denmark. To be sure, the model is centred around the PhD dissertation (element #1). But it involves four more components: the PhD candidate (element #2), his or her supervisor...... and interrelated fields in which history, place, and sound come to emphasize architecture’s relational qualities rather than the apparent three-dimensional solidity of constructed space. A third layer of relational architecture is at stake in the professional experiences after the defence of the authors...

  13. Architectural Anthropology

    DEFF Research Database (Denmark)

    Stender, Marie

    Architecture and anthropology have always had a common focus on dwelling, housing, urban life and spatial organisation. Current developments in both disciplines make it even more relevant to explore their boundaries and overlaps. Architects are inspired by anthropological insights and methods......, while recent material and spatial turns in anthropology have also brought an increasing interest in design, architecture and the built environment. Understanding the relationship between the social and the physical is at the heart of both disciplines, and they can obviously benefit from further...... collaboration: How can qualitative anthropological approaches contribute to contemporary architecture? And just as importantly: What can anthropologists learn from architects’ understanding of spatial and material surroundings? Recent theoretical developments in anthropology stress the role of materials...

  14. Architectural Engineers

    DEFF Research Database (Denmark)

    Petersen, Rikke Premer

    engineering is addressed from two perspectives – as an educational response and an occupational constellation. Architecture and engineering are two of the traditional design professions and they frequently meet in the occupational setting, but at educational institutions they remain largely estranged...... The paper builds on a multi-sited study of an architectural engineering program at the Technical University of Denmark and an architectural engineering team within an international engineering consultancy based in Denmark. They are both responding to new tendencies within the building industry where...... the roles of engineers and architects increasingly overlap during the design process, but their approaches reflect different perceptions of the consequences. The paper discusses some of the challenges that design education, not only within engineering, is facing today: young designers must be equipped...

  15. Data architecture from zen to reality

    CERN Document Server

    Tupper, Charles D

    2011-01-01

    Data Architecture: From Zen to Reality explains the principles underlying data architecture, how data evolves with organizations, and the challenges organizations face in structuring and managing their data. It also discusses proven methods and technologies to solve the complex issues dealing with data. The book uses a holistic approach to the field of data architecture by covering the various applied areas of data, including data modelling and data model management, data quality, data governance, enterprise information management, database design, data warehousing, and warehouse design. This book is a core resource for anyone emplacing, customizing or aligning data management systems, taking the Zen-like idea of data architecture to an attainable reality.

  16. Enterprise architecture evaluation using architecture framework and UML stereotypes

    Directory of Open Access Journals (Sweden)

    Narges Shahi

    2014-08-01

    Full Text Available There is an increasing need for enterprise architecture in organizations with complicated systems, varied processes, growing support for information technology, and organizational units whose elements maintain complex relationships. Enterprise architecture is considered so effective that not using it is regarded as an institutional inability to manage information technology efficiently. The enterprise architecture process generally consists of three phases: strategic programming of information technology, enterprise architecture programming, and enterprise architecture implementation. Each phase must be implemented sequentially, and a single flaw in any phase may result in a flaw in the whole architecture and, consequently, in extra costs and time. If a model is built in the second phase and evaluated before enterprise architecture implementation, possible flaws in the implementation process can be prevented. In this study, the processes of enterprise architecture are illustrated through UML diagrams, and the architecture is evaluated in the programming phase by transforming the UML diagrams into Petri nets. The results indicate that the high costs of the implementation phase will be reduced.
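
    To illustrate the kind of check that becomes possible once UML activity flows are mapped to Petri nets (an illustrative toy only, not the evaluation procedure of the paper; the place and transition names are invented), the snippet below encodes a two-step planning flow as a place/transition net and fires it.

    class PetriNet:
        """Very small place/transition net; markings are dicts of token counts."""
        def __init__(self, transitions):
            # transitions: name -> (list of input places, list of output places)
            self.transitions = transitions

        def enabled(self, marking, t):
            inputs, _ = self.transitions[t]
            return all(marking.get(p, 0) > 0 for p in inputs)

        def fire(self, marking, t):
            if not self.enabled(marking, t):
                raise ValueError(f"transition {t} not enabled")
            inputs, outputs = self.transitions[t]
            new = dict(marking)
            for p in inputs:
                new[p] -= 1
            for p in outputs:
                new[p] = new.get(p, 0) + 1
            return new

    net = PetriNet({
        "analyse_requirements": (["strategy_defined"], ["requirements_ready"]),
        "design_architecture": (["requirements_ready"], ["architecture_planned"]),
    })
    marking = {"strategy_defined": 1}
    for t in ["analyse_requirements", "design_architecture"]:
        marking = net.fire(marking, t)
    print(marking)   # {'strategy_defined': 0, 'requirements_ready': 0, 'architecture_planned': 1}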

  17. C3PO - A Dynamic Data Placement Agent for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, Thomas; The ATLAS collaboration; Barisits, Martin-Stefan; Serfon, Cedric; Garonne, Vincent

    2016-01-01

    This contribution introduces a new dynamic data placement agent for the ATLAS distributed data management system. This agent is designed to pre-place potentially popular data to make it more widely available. It uses data from a variety of sources. Those include input datasets and site workload information from the ATLAS workload management system, network metrics from different sources like FTS and PerfSonar, historical popularity data collected through a tracer mechanism, and more. With this data it decides if, when and where to place new replicas that can then be used by the WMS to distribute the workload more evenly over available computing resources and ultimately reduce job waiting times. The new replicas are created with a short lifetime that is extended when the data is accessed, so the system behaves like a big cache. This paper gives an overview of the architecture and the final implementation of this new agent. The paper also includes an evaluation of different placement algorithm...
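
    The placement idea can be paraphrased roughly as follows (a hypothetical sketch, not the actual C3PO agent; thresholds, metric names and the scoring rule are our assumptions): predict whether a dataset will become popular, pick a site with free resources and a good network path, and give the new replica a short lifetime that is extended whenever it is accessed, so that unused replicas expire like cache entries.

    import time

    BASE_LIFETIME = 7 * 24 * 3600      # assumed: one week for a freshly placed dynamic replica
    EXTENSION = 3 * 24 * 3600          # assumed: extend by three days on each access

    def choose_site(sites):
        """Pick the site with the best combination of free slots and network quality."""
        return max(sites, key=lambda s: s["free_slots"] * s["network_score"])

    def place_if_popular(dataset, sites, popularity_threshold=0.7):
        """Return a replica record if predicted popularity justifies pre-placement."""
        if dataset["predicted_popularity"] < popularity_threshold:
            return None
        site = choose_site(sites)
        return {"dataset": dataset["name"], "site": site["name"],
                "expires_at": time.time() + BASE_LIFETIME}

    def on_access(replica):
        """Accessing a dynamic replica extends its lifetime, giving cache-like behaviour."""
        replica["expires_at"] = max(replica["expires_at"], time.time() + EXTENSION)

    sites = [{"name": "SITE_A", "free_slots": 120, "network_score": 0.9},
             {"name": "SITE_B", "free_slots": 300, "network_score": 0.6}]
    replica = place_if_popular({"name": "dataset-A", "predicted_popularity": 0.85}, sites)
    if replica:
        on_access(replica)
        print(replica)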

  18. Reframing Architecture

    DEFF Research Database (Denmark)

    Riis, Søren

    2013-01-01

    I would like to thank Prof. Stephen Read (2011) and Prof. Andrew Benjamin (2011) for both giving inspiring and elaborate comments on my article “Dwelling in-between walls: the architectural surround”. As I will try to demonstrate below, their two different responses not only supplement my article...... focuses on how the absence of an initial distinction might threaten the endeavour of my paper. In my reply to Read and Benjamin, I will discuss their suggestions and arguments, while at the same time hopefully clarifying the postphenomenological approach to architecture....

  19. Proposal for logistics information management system using distributed architecture; Bunsangata butsuryu joho system no teian to kensho

    Energy Technology Data Exchange (ETDEWEB)

    Kataoka, N.; Koizumi, H.; Shimizu, H. [Mitsubishi Electric Power Corp., Tokyo (Japan)

    1998-03-01

    Conventional host-based central-processing type logistics information systems collect all information about stocked products (sales results, inventory, out-of-stock items) on a single host computer, and based on this information perform ordering, shipping, receiving, and other processing. In a client/server architecture, the system is not simply downsized: in order to ensure more effective use of logistics information and closer coordination with manufacturing information systems, the logistics information system must be configured as a distributed system specific to a given factory and its various products. In such distributed systems, each function acts independently, but at the same time the overall system of which it is part must operate in harmony to perform cost optimization, adjust allocation of resources among different factories and business locations, and present a single monolithic interface to retailers and sales agents. In this paper, we propose a logistics information system with a distributed architecture as well as agents whose role is to coordinate operation of the overall system, as one means of realizing this combination of component autonomy and overall system harmony. The methodology proposed here was applied to a proving system, and its effectiveness was verified. 9 refs., 12 figs.

  20. Zeolite-templated carbon replica: a grand canonical Monte-Carlo simulation study

    International Nuclear Information System (INIS)

    Roussel, Th.; Pellenq, R.J.M.; Bichara, Ch.; Gadiou, R.; Didion, A.; Vix-Guterl, C.; Gaslain, F.; Parmentier, J.; Valtchev, V.; Patarin, J.

    2005-01-01

    Microporous carbon materials are interesting for several applications such as hydrogen storage, catalysis or electrical double layer capacitors. The development of the negative templating method to obtain carbon replicas from ordered templates has led to the synthesis of several new materials which have interesting textural properties, attractive for energy storage. Among the possible templates, zeolites can be used to obtain highly microporous carbon materials. Nevertheless, the phenomena involved in the replica synthesis are not fully understood, and the relationships between the structure of the template, the carbon precursor and the resulting carbon material need to be investigated. Experimental results for carbon zeolite-templated nanostructures can be found in a series of papers; see for instance ref. [1] in which Wang et al. describe a route to ultra-small single-wall carbon nanotubes (SWNTs) using the porosity of zeolite AlPO4-5. After matrix removal, the resulting structure is a free-standing bundle of 4 Angstrom nanotubes. However, it is highly desirable to obtain an ordered porous carbon structure that forms a real 3D network to be used for instance in gas storage applications. Carbon replicas of faujasite and EMT zeolites can have these properties since these zeolites have a 3D porous network made of 10 Angstrom cages connected to each other through 7 Angstrom windows. The first step of this study was to generate a theoretical carbon replica structure of various zeolites (faujasite, EMT, AlPO4-5, silicalite). For this purpose, we used the Grand Canonical Monte-Carlo (GCMC) technique in which the carbon-carbon interactions were described within the frame of a newly developed Tight Binding approach and the carbon-zeolite interactions were assumed to be characteristic of physisorption. The intrinsic stability of the subsequent carbon nanostructures was then investigated after mimicking the removal of the inorganic phase by switching
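
    For orientation, the standard grand canonical Monte-Carlo insertion and deletion acceptance rules used in this kind of study are sketched below (a generic skeleton with placeholder energy changes and reduced units; it does not reproduce the Tight Binding carbon-carbon model or the carbon-zeolite potential of the paper).

    import math, random

    def accept_insertion(dU, N, V, mu, beta, Lambda):
        """Acceptance test for inserting one particle (standard GCMC form).
        dU = U(N+1) - U(N); Lambda is the thermal de Broglie wavelength."""
        arg = V / (Lambda**3 * (N + 1)) * math.exp(beta * (mu - dU))
        return random.random() < min(1.0, arg)

    def accept_deletion(dU, N, V, mu, beta, Lambda):
        """Acceptance test for removing one particle. dU = U(N-1) - U(N)."""
        arg = (Lambda**3 * N) / V * math.exp(-beta * (mu + dU))
        return random.random() < min(1.0, arg)

    # toy numbers only: reduced units and invented energy changes
    print(accept_insertion(dU=-0.5, N=100, V=1000.0, mu=-2.0, beta=1.0, Lambda=1.0))
    print(accept_deletion(dU=0.5, N=100, V=1000.0, mu=-2.0, beta=1.0, Lambda=1.0))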

  1. Inferring predator behavior from attack rates on prey-replicas that differ in conspicuousness.

    Directory of Open Access Journals (Sweden)

    Yoel E Stuart

    Full Text Available Behavioral ecologists and evolutionary biologists have long studied how predators respond to prey items novel in color and pattern. Because a predatory response is influenced by both the predator's ability to detect the prey and a post-detection behavioral response, variation among prey types in conspicuousness may confound inference about post-prey-detection predator behavior. That is, a relatively high attack rate on a given prey type may result primarily from enhanced conspicuousness and not predators' direct preference for that prey. Few studies, however, account for such variation in conspicuousness. In a field experiment, we measured predation rates on clay replicas of two aposematic forms of the poison dart frog Dendrobates pumilio, one novel and one familiar, and two cryptic controls. To ask whether predators prefer or avoid a novel aposematic prey form independently of conspicuousness differences among replicas, we first modeled the visual system of a typical avian predator. Then, we used this model to estimate replica contrast against a leaf litter background to test whether variation in contrast alone could explain variation in predator attack rate. We found that absolute predation rates did not differ among color forms. Predation rates relative to conspicuousness did, however, deviate significantly from expectation, suggesting that predators do make post-detection decisions to avoid or attack a given prey type. The direction of this deviation from expectation, though, depended on assumptions we made about how avian predators discriminate objects from the visual background. Our results show that it is important to account for prey conspicuousness when investigating predator behavior and also that existing models of predator visual systems need to be refined.

  2. Anti-stiction coating of PDMS moulds for rapid microchannel fabrication by double replica moulding

    DEFF Research Database (Denmark)

    Zhuang, Guisheng; Kutter, Jörg Peter

    2011-01-01

    In this paper, we report a simple and precise method to rapidly replicate master structures for fast microchannel fabrication by double replica moulding of polydimethylsiloxane (PDMS). A PDMS mould was surface-treated by vapour phase deposition of 1H,1H,2H,2H-perfluorodecyltrichlorosilane (FDTS), which resulted in an anti-stiction layer for the improved release after PDMS casting. The deposition of FDTS on an O2 plasma-activated surface of PDMS produced a reproducible and well-performing anti-stiction monolayer of fluorocarbon, and we used the FDTS-coated moulds as micro-masters for rapid......

  3. A sequence of Clifford algebras and three replicas of Dirac particle

    International Nuclear Information System (INIS)

    Krolikowski, W.; Warsaw Univ.

    1990-01-01

    The embedding of Dirac algebra into a sequence N=1, 2, 3,... of Clifford algebras is discussed, leading to Dirac equations with N-1 additional, electromagnetically ''hidden'' spins 1/2. It is shown that there are three and only three replicas N=1, 3, 5 of the Dirac particle if the theory of relativity together with the probability interpretation of wave function is applied both to the ''visible'' spin and ''hidden'' spins, and a new ''hidden exclusion principle'' is imposed on the wave function (then ''hidden'' spins add up to zero). It is appealing to explore this idea in order to explain the puzzle of three generations of leptons and quarks. (author)

  4. Enhanced Sampling in Molecular Dynamics Using Metadynamics, Replica-Exchange, and Temperature-Acceleration

    Directory of Open Access Journals (Sweden)

    Cameron Abrams

    2013-12-01

    Full Text Available We review a selection of methods for performing enhanced sampling in molecular dynamics simulations. We consider methods based on collective variable biasing and on tempering, and offer both historical and contemporary perspectives. In collective-variable biasing, we first discuss methods stemming from thermodynamic integration that use mean force biasing, including the adaptive biasing force algorithm and temperature acceleration. We then turn to methods that use bias potentials, including umbrella sampling and metadynamics. We next consider parallel tempering and replica-exchange methods. We conclude with a brief presentation of some combination methods.
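
    As a pointer to the tempering part of this review, the standard replica-exchange (parallel tempering) swap between replicas i and j, held at inverse temperatures \beta_i and \beta_j with configurations x_i and x_j, is accepted with probability (textbook form, not a formula specific to this review)

        P_{\mathrm{acc}} = \min\left\{ 1,\; \exp\!\left[ (\beta_i - \beta_j)\,\bigl( U(x_i) - U(x_j) \bigr) \right] \right\}

    so a swap that hands the lower-energy configuration to the colder replica is always accepted.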

  5. Time-reversal focusing of an expanding soliton gas in disordered replicas

    KAUST Repository

    Fratalocchi, Andrea

    2011-05-31

    We investigate the properties of time reversibility of a soliton gas, originating from a dispersive regularization of a shock wave, as it propagates in a strongly disordered environment. An original approach combining information measures and spin glass theory shows that time-reversal focusing occurs for different replicas of the disorder in forward and backward propagation, provided the disorder varies on a length scale much shorter than the width of the soliton constituents. The analysis is performed by starting from a new class of reflectionless potentials, which describe the most general form of an expanding soliton gas of the defocusing nonlinear Schrödinger equation.

  6. Time-reversal focusing of an expanding soliton gas in disordered replicas

    KAUST Repository

    Fratalocchi, Andrea; Armaroli, A.; Trillo, S.

    2011-01-01

    We investigate the properties of time reversibility of a soliton gas, originating from a dispersive regularization of a shock wave, as it propagates in a strongly disordered environment. An original approach combining information measures and spin glass theory shows that time-reversal focusing occurs for different replicas of the disorder in forward and backward propagation, provided the disorder varies on a length scale much shorter than the width of the soliton constituents. The analysis is performed by starting from a new class of reflectionless potentials, which describe the most general form of an expanding soliton gas of the defocusing nonlinear Schrödinger equation.
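
    For reference, the defocusing nonlinear Schrödinger equation underlying both records above can be written, in one common dimensionless normalization (our choice of scaling, not necessarily the authors'), as

        i\,\partial_z \psi + \tfrac{1}{2}\,\partial_t^2 \psi - |\psi|^2\,\psi = 0

    where z is the propagation distance, t the retarded time and \psi the complex field envelope; the reflectionless potentials mentioned in the abstract correspond to pure multi-soliton states of this equation.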

  7. Textile Architecture

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen

    2010-01-01

    Textiles can be used as building skins, adding new aesthetic and functional qualities to architecture. Just like we as humans can put on a coat, buildings can also get dressed. Depending on our mood, or on the weather, we can change coat, and so can the building. But the idea of using textiles...

  8. Replica Node Detection Using Enhanced Single Hop Detection with Clonal Selection Algorithm in Mobile Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    L. S. Sindhuja

    2016-01-01

    Full Text Available Security of Mobile Wireless Sensor Networks is a vital challenge as the sensor nodes are deployed in an unattended environment and they are prone to various attacks. One among them is the node replication attack. In this attack, physically insecure nodes are captured by the adversary and cloned with the same identity as the captured node, and the adversary deploys an unpredictable number of replicas throughout the network. Hence replica node detection is an important challenge in Mobile Wireless Sensor Networks. Various replica node detection techniques have been proposed to detect these replica nodes. These methods incur control overheads and the detection accuracy is low when the replica is selected as a witness node. This paper proposes to solve these issues by enhancing the Single Hop Detection (SHD) method using the Clonal Selection algorithm to detect the clones by selecting the appropriate witness nodes. The advantages of the proposed method include (i) increase in the detection ratio, (ii) decrease in the control overhead, and (iii) increase in throughput. The performance of the proposed work is measured using detection ratio, false detection ratio, packet delivery ratio, average delay, control overheads, and throughput. The implementation is done using ns-2 to demonstrate the practicality of the proposed work.
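
    A bare-bones clonal selection loop of the kind used to pick good witness nodes is sketched below (a generic CLONALG-style skeleton; the affinity function, all parameters and the mapping to witness selection are our assumptions, not the authors' algorithm).

    import random

    def clonal_selection(candidates, affinity, generations=20, n_select=5, clones_per=4, sigma=0.1):
        """Generic clonal selection: keep the fittest candidates, clone and mutate them,
        then re-insert the clones and truncate the population back to its original size."""
        population = list(candidates)
        for _ in range(generations):
            population.sort(key=affinity, reverse=True)
            selected = population[:n_select]
            clones = []
            for antibody in selected:
                for _ in range(clones_per):
                    # mutate each coordinate slightly (full CLONALG mutates better antibodies less)
                    clones.append(tuple(x + random.gauss(0.0, sigma) for x in antibody))
            population = sorted(selected + clones, key=affinity, reverse=True)[:len(candidates)]
        return population[0]

    # toy affinity: prefer witness positions close to the centre (0.5, 0.5) of the deployment area
    affinity = lambda pos: -((pos[0] - 0.5) ** 2 + (pos[1] - 0.5) ** 2)
    candidates = [(random.random(), random.random()) for _ in range(20)]
    print(clonal_selection(candidates, affinity))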

  9. Synthesis and properties of ZnFe{sub 2}O{sub 4} replica with biological hierarchical structure

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongyan; Guo, Yiping, E-mail: ypguo@sjtu.edu.cn; Zhang, Yangyang; Wu, Fen; Liu, Yun; Zhang, Di, E-mail: zhangdi@sjtu.edu.cn

    2013-09-20

    Highlights: • ZFO replica with hierarchical structure was synthesized from butterfly wings. • Biotemplate has a significant impact on the properties of ZFO material. • Our method opens up new avenues for the synthesis of spinel ferrites. -- Abstract: ZnFe{sub 2}O{sub 4} replica with biological hierarchical structure was synthesized from Papilio paris by a sol–gel method followed by calcination. The crystallographic structure and morphology of the obtained samples were characterized by X-ray diffraction, field-emission scanning electron microscopy, and transmission electron microscopy. The results showed that the hierarchical structures were retained in the ZFO replica of spinel structure. The magnetic behavior of such novel products was measured by a vibrating sample magnetometer. A superparamagnetism-like behavior was observed due to nanostructuration size effects. In addition, the ZFO replica with “quasi-honeycomb-like structure” showed a much higher specific capacitance of 279.4 F g{sup −1} at 10 mV s{sup −1} in comparison with ZFO powder of 137.3 F g{sup −1}, attributed to the significantly increased surface area. These results demonstrated that ZFO replica is a promising candidate for novel magnetic devices and supercapacitors.

  10. From green architecture to architectural green

    DEFF Research Database (Denmark)

    Earon, Ofri

    2011-01-01

    The paper investigates the topic of green architecture from an architectural point of view and not an energy point of view. The purpose of the paper is to establish a debate about the architectural language and spatial characteristics of green architecture. In this light, green becomes an adjective that describes the architectural exclusivity of this particular architecture genre. The adjective green expresses architectural qualities differentiating green architecture from non-green architecture. Currently, adding trees and vegetation to the building’s facade is the main architectural characteristic...... they have overshadowed the architectural potential of green architecture. The paper questions how a green space should perform, look and function. Two examples are chosen to demonstrate thorough integrations between green and space. The examples are public buildings categorized as pavilions. One......

  11. Agent-Based Architectures and Algorithms for Energy Management in Smart Grids. Application to Smart Power Generation and Residential Demand Response

    International Nuclear Information System (INIS)

    Roche, Robin

    2012-01-01

    Due to the convergence of several profound trends in the energy sector, smart grids are emerging as the main paradigm for the modernization of the electric grid. Smart grids hold many promises, including the ability to integrate large shares of distributed and intermittent renewable energy sources, energy storage and electric vehicles, as well as the promise to give consumers more control on their energy consumption. Such goals are expected to be achieved through the use of multiple technologies, and especially of information and communication technologies, supported by intelligent algorithms. These changes are transforming power grids into even more complex systems, that require suitable tools to model, simulate and control their behaviors. In this dissertation, properties of multi-agent systems are used to enable a new systemic approach to energy management, and allow for agent-based architectures and algorithms to be defined. This new approach helps tackle the complexity of a cyber-physical system such as the smart grid by enabling the simultaneous consideration of multiple aspects such as power systems, the communication infrastructure, energy markets, and consumer behaviors. The approach is tested in two applications: a 'smart' energy management system for a gas turbine power plant, and a residential demand response system. An energy management system for gas turbine power plants is designed with the objective to minimize operational costs and emissions, in the smart power generation paradigm. A gas turbine model based on actual data is proposed, and used to run simulations with a simulator specifically developed for this problem. A meta-heuristic achieves dynamic dispatch among gas turbines according to their individual characteristics. Results show that the system is capable of operating the system properly while reducing costs and emissions. The computing and communication requirements of the system, resulting from the selected architecture, are
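
    To give a rough feel for the dispatch sub-problem mentioned above, the sketch below allocates a power demand across gas turbines in merit order (a deliberately simplified stand-in; the dissertation uses a meta-heuristic and a data-based turbine model, and the unit data here are invented).

    def merit_order_dispatch(units, demand_mw):
        """Allocate demand to units in order of increasing marginal cost,
        respecting each unit's minimum and maximum output."""
        plan = {u["name"]: 0.0 for u in units}
        remaining = demand_mw
        for u in sorted(units, key=lambda u: u["cost_per_mwh"]):
            if remaining < u["min_mw"]:
                continue                      # the unit cannot run below its technical minimum
            power = min(u["max_mw"], remaining)
            plan[u["name"]] = power
            remaining -= power
        return plan, remaining                # remaining > 0 would mean unserved demand

    units = [
        {"name": "GT1", "min_mw": 10, "max_mw": 50, "cost_per_mwh": 40.0},
        {"name": "GT2", "min_mw": 10, "max_mw": 50, "cost_per_mwh": 55.0},
        {"name": "GT3", "min_mw": 10, "max_mw": 50, "cost_per_mwh": 70.0},
    ]
    plan, unserved = merit_order_dispatch(units, demand_mw=85.0)
    print(plan, "unserved:", unserved)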

  12. A modular function architecture for adaptive and predictive energy management in hybrid electric vehicles; Eine modulare Funktionsarchitektur fuer adaptives und vorausschauendes Energiemanagement in Hybridfahrzeugen

    Energy Technology Data Exchange (ETDEWEB)

    Wilde, Andreas

    2009-10-27

    Due to the relatively low energy density of electrical energy storage devices, the control strategy of hybrid electric vehicles has to fulfil a variety of requirements in order to provide both the availability of hybrid functions and their efficient execution. Energy-consuming functions such as electric drive or electric boost need a high amount of energy stored in the battery. On the other hand, for optimum use of the energy regeneration function, a lower state of charge is preferable in order to enable storage of the kinetic energy of the vehicle in all situations, including deceleration from high speeds or downhill driving. These diverging requirements yield a conflict of objectives for the charging strategy of hybrid electric vehicles. This work proposes a way to overcome the restrictions on efficiency in hybrid electric vehicles without deteriorating overall driving performance, by charging or discharging the traction battery and by setting the energy management parameters according to the current and forthcoming driving situation. Specific charging and electric drive strategies are presented for various driving situations, which are identified by sensors such as navigation systems, cameras or radar. The sensor data fusion methods necessary for driving situation identification are described, and a modular function architecture for predictive energy management is derived that is plug-and-play compatible with a broad fleet of vehicles. In order to evaluate its potential, this work also covers the simulation of the energy functions and their implementation in an experimental vehicle. This allows measurements under real traffic conditions and a sensitivity analysis of the main module interactions within the architecture. (orig.)
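
    The situation-dependent parameterization described above can be sketched very simply; the situation labels and state-of-charge targets below are hypothetical placeholder values, not the charging strategies derived in the thesis.

```python
# Illustrative sketch: adapt the battery state-of-charge (SOC) target to the
# driving situation predicted from navigation/camera/radar data. The situation
# labels and SOC targets are hypothetical placeholder values.
SOC_TARGETS = {
    "city": 0.65,            # keep a reserve for electric driving and boost
    "highway": 0.55,         # moderate reserve; the engine is efficient anyway
    "downhill_ahead": 0.40,  # leave headroom to recuperate kinetic/potential energy
    "uphill_ahead": 0.70,    # pre-charge for expected boost demand
}

def soc_target(predicted_situation: str, default: float = 0.60) -> float:
    """Return the SOC set-point the energy management should steer towards."""
    return SOC_TARGETS.get(predicted_situation, default)

print(soc_target("downhill_ahead"))  # -> 0.40
```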

  13. Space Elevators Preliminary Architectural View

    Science.gov (United States)

    Pullum, L.; Swan, P. A.

    Space Systems Architecture has been expanded into a process by the US Department of Defense for its large-scale systems-of-systems development programs. This paper uses the steps in that process to establish a framework for Space Elevator systems to be developed and provides a methodology to manage complexity. This new approach to developing a family of systems is based upon three architectural views: Operational View (OV), Systems View (SV), and Technical Standards View (TV). The top-level view of the process establishes the stages for the development of the first Space Elevator and is called Architectural View - 1, Overview and Summary. This paper shows the guidelines and steps of the process while focusing upon components of the Space Elevator Preliminary Architecture View. This Preliminary Architecture View is presented as a draft starting point for the Space Elevator Project.

  14. Trends in PACS architecture

    International Nuclear Information System (INIS)

    Bellon, Erwin; Feron, Michel; Deprez, Tom; Reynders, Reinoud; Van den Bosch, Bart

    2011-01-01

    Radiological Picture Archiving and Communication Systems (PACS) have only relatively recently become abundant. Many hospitals made the transition to PACS about a decade ago. During that decade, requirements and available technology have changed considerably. In this paper we look at factors that influence the design of tomorrow's systems, especially those in larger multidisciplinary hospitals. We discuss their impact on PACS architecture (a technological perspective) as well as their impact on radiology (a management perspective). We emphasize that many of these influencing factors originate outside radiology and that radiology has little influence on them. That makes it all the more important for managers in radiology to be aware of architectural aspects, and it may change the cooperation of radiology with, among others, the hospital's central IT department.

  15. Phonon replica dynamics in high quality GaN epilayers and AlGaN/GaN quantum wells

    Energy Technology Data Exchange (ETDEWEB)

    Alderighi, D.; Vinattieri, A.; Colocci, M. [Ist. Nazionale Fisica della Materia, Firenze (Italy); Dipt. di Fisica and LENS, Firenze (Italy); Bogani, F. [Ist. Nazionale Fisica della Materia, Firenze (Italy); Dipt. di Energetica, Firenze (Italy); Gottardo, S. [Dipt. di Fisica and LENS, Firenze (Italy); Grandjean, N.; Massies, J. [Centre de Recherche sur l' Hetero-Epitaxie et ses Applications, CNRS, Valbonne (France)

    2001-01-01

    We present an experimental study of the exciton and phonon replica dynamics in high quality GaN epilayers and AlGaN/GaN quantum wells (QW) by means of picosecond time-resolved photoluminescence (PL) measurements. A non-exponential decay is observed both at the zero phonon line (ZPL) and at the n = 1 LO replica. Time-resolved spectra unambiguously assign the replica to the free exciton A recombination. Optical migration effects are detected in both the epilayer and the QW samples and disappear as the temperature increases up to 60-90 K. Even though the sample quality is comparable to state-of-the-art samples, localization effects dominate the exciton dynamics at low temperature in the studied GaN-based structures. (orig.)

  16. Dynamical self-arrest in symmetric and asymmetric diblock copolymer melts using a replica approach within a local theory.

    Science.gov (United States)

    Wu, Sangwook

    2009-03-01

    We investigate dynamical self-arrest in a diblock copolymer melt using a replica approach within a self-consistent local method based on dynamical mean-field theory (DMFT). The local replica approach effectively predicts $(\chi N)_A$ for dynamical self-arrest in a block copolymer melt for symmetric and asymmetric cases. We discuss the competition of the cubic and quartic interactions in the Landau free energy for a block copolymer melt in stabilizing a glassy state depending on the chain length. Our local replica theory provides a universal value for the dynamical self-arrest in block copolymer melts, with $(\chi N)_A \approx 10.5 + 64\,N^{-3/10}$ for the symmetric case.
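
    The reconstructed expression can be checked directly; the short snippet below (added here for illustration, not taken from the paper) shows how the chain-length correction decays towards the limiting value of about 10.5:

```python
# Evaluate (chi N)_A = 10.5 + 64 * N**(-3/10), the dynamical self-arrest value
# quoted in the abstract for the symmetric diblock case, at a few chain lengths.
def chi_n_arrest(N: float) -> float:
    return 10.5 + 64.0 * N ** (-0.3)

for N in (100, 1000, 10000):
    print(N, round(chi_n_arrest(N), 2))
```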

  17. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff bound over a code ensemble. We show that the resulting bound in this framework can be directly assessed by the replica method, which has been developed in the statistical mechanics of disordered systems, whereas Gallager's original methodology requires a further replacement by another bound utilizing Jensen's inequality. Our approach associates a seemingly ad hoc restriction on an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles, including low-density parity-check codes, although its mathematical justification is still open.
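
    For orientation, the Gallager-style random-coding bound that such an analysis starts from has the standard textbook form below; the notation is generic and may differ from the paper's.

```latex
% Gallager's random-coding bound for an ensemble of rate-R, length-N codes
% (standard textbook form; E_0 is the Gallager function of the channel).
% The adjustable parameter \rho is the quantity whose optimal choice the
% replica analysis relates to a transition between replica-symmetric solutions.
\[
  P_e \;\le\; \min_{0 \le \rho \le 1} \exp\!\Big(-N\big[E_0(\rho) - \rho R\big]\Big)
\]
```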

  18. PHYSICAL DISABILITY, STIGMA, AND PHYSICAL ACTIVITY IN CHILDREN: A REPLICA STUDY

    Directory of Open Access Journals (Sweden)

    Markus GEBHARDT

    2016-04-01

    Full Text Available Introduction: Stereotypes can be reduced through positive descriptions. A stigma that able-bodied adults hold towards children with physical disability can be reduced when the child is portrayed as being active. The original study found that a sporty active child who uses a wheelchair is perceived as more competent than a sporty active able-bodied child. Objective: This study is a replica study intended to test these hypotheses and to examine the stereotypes of able-bodied adults towards children with and without (physical) disabilities. Methods: This study presents two experimental replica studies using a 2 (physical disability) x 2 (sporting activity) design. The dependent variables were the perception of competence and warmth according to the Stereotype Content Model (SCM). Study 1 is an online experiment with 355 students of the Open University of Hagen. Study 2 surveyed 1176 participants (from Munich and Graz) with a paper-pencil questionnaire. Results: The expected interaction effect was not supported by our studies. The sporty able-bodied child was rated higher in competence than the sporty child who uses a wheelchair. Sporting activity only slightly reduces the stigma towards children with a physical disability. Conclusion: The stigma towards children with physical disability can be reduced when the child is portrayed as being active, but the effect was not strong enough to change the original classification by the SCM.

  19. Nuclear research emulsion neutron spectrometry at the Little-Boy replica

    International Nuclear Information System (INIS)

    Gold, R.; Roberts, J.H.; Preston, C.C.

    1985-10-01

    Nuclear research emulsions (NRE) have been used to characterize the neutron spectrum emitted by the Little-Boy replica. NRE were irradiated at the Little-Boy surface as well as approximately 2 m from the center of the Little-Boy replica using polar angles of 0°, 30°, 60° and 90°. For the NRE exposed at 2 m, neutron background was determined using shadow shields of borated polyethylene. Emulsion scanning to date has concentrated exclusively on the 2-m, 0° and 2-m, 90° locations. Approximately 5000 proton-recoil tracks have been measured in NRE irradiated at each of these locations. Neutron spectra obtained from these NRE proton-recoil spectra are compared with both liquid scintillator neutron spectrometry and Monte Carlo calculations. NRE and liquid scintillator neutron spectra generally agree within experimental uncertainties at the 2-m, 90° location. However, at the 2-m, 0° location, the neutron spectra derived from these two independent experimental methods differ significantly. NRE spectra and Monte Carlo calculations exhibit general agreement with regard to both intensity as well as energy dependence. Better agreement is attained between theory and experiment at the 2-m, 90° location, where the neutron intensity is considerably higher. 14 refs., 18 figs., 11 tabs

  20. One step replica symmetry breaking and extreme order statistics of logarithmic REMs

    Directory of Open Access Journals (Sweden)

    Xiangyu Cao, Yan V. Fyodorov, Pierre Le Doussal

    2016-12-01

    Full Text Available Building upon the one-step replica symmetry breaking formalism, duly understood and ramified, we show that the sequence of ordered extreme values of a general class of Euclidean-space logarithmically correlated random energy models (logREMs) behaves in the thermodynamic limit as a randomly shifted decorated exponential Poisson point process. The distribution of the random shift is determined solely by the large-distance ("infra-red", IR) limit of the model, and is equal to the free energy distribution at the critical temperature up to a translation. The decoration process is determined solely by the small-distance ("ultraviolet", UV) limit, in terms of the biased minimal process. Our approach provides connections of the replica framework to results in the probability literature and sheds further light on the freezing/duality conjecture which was the source of many previous results for logREMs. In this way we derive general and explicit formulae for the joint probability density of the depths of the first and second minima (as well as their higher-order generalizations) in terms of model-specific contributions from the UV and IR limits. In particular, we show that the second-minimum statistics is largely independent of the details of the UV data, whose influence is seen only through the mean value of the gap. For a given log-correlated field this parameter can be evaluated numerically, and we provide several numerical tests of our theory using the circular model of $1/f$-noise.

  1. A new creep-strain-replica method for evaluating the remaining life time of components

    International Nuclear Information System (INIS)

    Joas, H.D.

    2001-01-01

    To realise a safe and economic operation of older power or chemical plants, a maintenance strategy is necessary that makes it possible to operate a component or the plant for longer than 300,000 operating hours, even when the mode of operation has changed in the meantime. In Germany, a realistic evaluation of the remaining lifetime is done by comparing the actual calculated test data of a component with the codes TRD 301 and TRD 508 and additional non-destructive tests, or with other codes like ASME Sec. II, BS 5500 and AFCEN (1985). Owing to the many boundary conditions, the calculated data are inaccurate, and measuring creep strain at temperatures of about 600 °C with capacitive strain gauges is very expensive. Description of the approach: spot-welding of two gauges to the surface of a component (at a defined distance), forming a gap; production of replicas of the gap after certain operating hours at shut-down conditions by trained personnel; evaluation of the replicas to obtain the amount of creep strain using a scanning electron microscope; assessment of the creep-strain data. (Author)

  2. Efficacy of independence sampling in replica exchange simulations of ordered and disordered proteins.

    Science.gov (United States)

    Lee, Kuo Hao; Chen, Jianhan

    2017-11-15

    Recasting temperature replica exchange (T-RE) as a special case of Gibbs sampling has led to a simple and efficient scheme for enhanced mixing (Chodera and Shirts, J. Chem. Phys., 2011, 135, 194110). To critically examine whether T-RE with independence sampling (T-REis) improves conformational sampling, we performed T-RE and T-REis simulations of ordered and disordered proteins using coarse-grained and atomistic models. The results demonstrate that T-REis effectively increases replica mobility in temperature space with minimal computational overhead, especially for folded proteins. However, enhanced mixing does not translate well into improved conformational sampling. The convergence of the thermodynamic properties of interest is similar, with slight improvements for T-REis of ordered systems. The study re-affirms that the efficiency of T-RE does not appear to be limited by temperature diffusion, but by the inherent rates of spontaneous large-scale conformational re-arrangements. Due to its simplicity and efficacy of enhanced mixing, T-REis is expected to be more effective when incorporated with various Hamiltonian-RE protocols. © 2017 Wiley Periodicals, Inc.
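
    As background to the comparison above, the sketch below shows the standard pairwise T-RE Metropolis swap and a simple way to approximate independence sampling of the replica/temperature assignment by attempting many swaps between simulation segments (cf. Chodera and Shirts). The units, energies and attempt count are placeholder assumptions.

```python
# Standard pairwise T-RE swap criterion plus a simple approximation of
# independence sampling (T-REis) via many randomly chosen swap attempts.
# Energies are assumed to be in kcal/mol; all numbers here are placeholders.
import math
import random

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol K)

def swap_accepted(E_i, E_j, T_i, T_j):
    """Metropolis criterion for exchanging the configurations of two replicas."""
    delta = (1.0 / (K_B * T_i) - 1.0 / (K_B * T_j)) * (E_j - E_i)
    return delta <= 0.0 or random.random() < math.exp(-delta)

def mix_replicas(energies, temperatures, n_attempts=None):
    """Approximate Gibbs/independence sampling of the replica-to-temperature
    assignment by attempting many pairwise swaps between MD segments."""
    n = len(temperatures)
    perm = list(range(n))        # perm[k] = replica currently at temperature k
    if n_attempts is None:
        n_attempts = n ** 3      # heuristic placeholder for "many" attempts
    for _ in range(n_attempts):
        a, b = random.sample(range(n), 2)
        if swap_accepted(energies[perm[a]], energies[perm[b]],
                         temperatures[a], temperatures[b]):
            perm[a], perm[b] = perm[b], perm[a]
    return perm
```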

  3. Laboratory studies of groundwater degassing in replicas of natural fractured rock for linear flow geometry

    International Nuclear Information System (INIS)

    Geller, J.T.

    1998-02-01

    Laboratory experiments to simulate two-phase (gas and water) flow in fractured rock evolving from groundwater degassing were conducted in transparent replicas of natural rock fractures. These experiments extend the work by Geller et al. (1995) and Jarsjo and Geller (1996) that tested the hypothesis that groundwater degassing caused the observed flow reductions in the Stripa Simulated Drift Experiment (SDE). Understanding degassing effects over a range of gas contents is needed due to the uncertainty in the gas contents of the water at the SDE. The main objectives of this study were to: (1) measure the effect of groundwater degassing on liquid flow rates for lower gas contents than the values used by Geller et al., for linear flow geometry in the same fracture replicas; (2) provide a data set to develop a predictive model of two-phase flow in fractures for conditions of groundwater degassing; and (3) improve the certainty of experimental gas contents (this effort included modifications to the experimental system used by Geller et al. and separate gas-water equilibration tests). The Stripa site is being considered for a high-level radioactive waste repository.

  4. MUF architecture /art London

    DEFF Research Database (Denmark)

    Svenningsen Kajita, Heidi

    2009-01-01

    On MUF architecture, including an interview with Liza Fior and Katherine Clarke, partners in muf architecture/art.

  5. Virtual Replication of IoT Hubs in the Cloud: A Flexible Approach to Smart Object Management

    Directory of Open Access Journals (Sweden)

    Simone Cirani

    2018-03-01

    Full Text Available In future years, the Internet of Things is expected to interconnect billions of highly heterogeneous devices, denoted as "smart objects", enabling the development of innovative distributed applications. Smart objects are sensor/actuator-equipped devices that are constrained in terms of computational power and available memory. In order to cope with the diverse physical connectivity technologies of smart objects, the Internet Protocol is foreseen as the common "language" for full interoperability and as a unifying factor for integration with the Internet. Large-scale platforms for interconnected devices are required to effectively manage the resources provided by smart objects. In this work, we present a novel architecture for the management of large numbers of resources in a scalable, seamless, and secure way. The proposed architecture is based on a network element, denoted as IoT Hub, placed at the border of the constrained network, which implements the following functions: service discovery; border router; HTTP/Constrained Application Protocol (CoAP) and CoAP/CoAP proxy; cache; and resource directory. In order to protect smart objects (which cannot, because of their constrained nature, serve a large number of concurrent requests) and the IoT Hub (which serves as a gateway to the constrained network), we introduce the concept of a virtual IoT Hub replica: a Cloud-based "entity" replicating all the functions of a physical IoT Hub, which external clients query to access resources. IoT Hub replicas are constantly synchronized with the physical IoT Hub through a low-overhead protocol based on Message Queue Telemetry Transport (MQTT). An experimental evaluation, proving the feasibility and advantages of the proposed architecture, is presented.
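
    A minimal sketch of the cloud-side synchronization idea follows, using the paho-mqtt client (1.x callback style). The broker address, topic scheme and payload layout are assumptions made for illustration; they are not the protocol defined in the paper.

```python
# Sketch of a cloud-side IoT Hub replica keeping its resource cache in sync
# with the physical IoT Hub over MQTT (paho-mqtt 1.x callback style).
# Broker, topics and payload layout are illustrative assumptions only.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.org"          # hypothetical MQTT broker
SYNC_TOPIC = "iothub/+/resources/#"    # hypothetical topic scheme

resource_cache = {}                    # replica-side view of smart-object resources

def on_connect(client, userdata, flags, rc):
    client.subscribe(SYNC_TOPIC)

def on_message(client, userdata, msg):
    # The physical hub publishes resource state changes; the replica caches them
    # so external clients can be served without touching the constrained network.
    resource_cache[msg.topic] = json.loads(msg.payload.decode("utf-8"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```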

  6. Architectural fragments

    DEFF Research Database (Denmark)

    Bang, Jacob Sebastian

    2018-01-01

    I have created a large collection of plaster models: a collection of Obstructions, errors and opportunities that may develop into architecture. The models are fragments of different complex shapes as well as simpler circular models with different profiling and diameters. In this context I have ... I try to invent ways of drawing the models - ways that decode and unfold them into architectural fragments - into future buildings or constructions in the landscape. [1] Luigi Moretti: Italian architect, 1907-1973. [2] Man Ray: American artist, 1890-1976. In 2015, I saw the wonderful exhibition "Man Ray - Human Equations" at the Glyptotek in Copenhagen, organized by the Phillips Collection in Washington D.C. and the Israel Museum in Jerusalem (in 2013). See also the "Man Ray - Human Equations" catalogue published by Hatje Cantz Verlag, Germany, 2014.

  7. Kosmos = architecture

    Directory of Open Access Journals (Sweden)

    Tine Kurent

    1985-12-01

    Full Text Available The old Greek word "kosmos" means not only "cosmos", but also "the beautiful order", "the way of building", "building", "scenography", "mankind", and, in the time of the New Testament, also "pagans". The word "arhitekton", meaning first the "master of theatrical scenography", acquired the meaning of "builder" when the words "kosmos" and "kosmetes" became pejorative. The fear that architecture was not considered one of the arts before the Renaissance, since none of the Muses supervised the art of building, results from a misunderstanding of the word "kosmos". Urania was the goddess of the activity implied in the verb "kosmein", meaning "to put in the beautiful order" - everything, from the universe to man-made space, i.e. architecture.

  8. Metabolistic Architecture

    DEFF Research Database (Denmark)

    2013-01-01

    Textile Spaces presents different approaches to using textile as a spatial definer and artistic medium. The publication collages images and text, art and architecture, science, philosophy and literature, process and product, past, present and future. It forms an insight into soft materials' functional and poetic potentials, linking the disciplines through fragments that aim to inspire a further look into the artists' and architects' practices, while simultaneously framing these textile visions in a wider context.

  9. Mimicking the action of folding chaperones by Hamiltonian replica-exchange molecular dynamics simulations : Application in the refinement of de novo models

    NARCIS (Netherlands)

    Fan, Hao; Periole, Xavier; Mark, Alan E.

    The efficiency of using a variant of Hamiltonian replica-exchange molecular dynamics (Chaperone H-replica-exchange molecular dynamics [CH-REMD]) for the refinement of protein structural models generated de novo is investigated. In CH-REMD, the interaction between the protein and its environment,

  10. A Digital Architecture for a Network-Based Learning Health System: Integrating Chronic Care Management, Quality Improvement, and Research.

    Science.gov (United States)

    Marsolo, Keith; Margolis, Peter A; Forrest, Christopher B; Colletti, Richard B; Hutton, John J

    2015-01-01

    We collaborated with the ImproveCareNow Network to create a proof-of-concept architecture for a network-based Learning Health System. This collaboration involved transitioning an existing registry to one that is linked to the electronic health record (EHR), enabling a "data in once" strategy. We sought to automate a series of reports that support care improvement while also demonstrating the use of observational registry data for comparative effectiveness research. We worked with three leading EHR vendors to create EHR-based data collection forms. We automated many of ImproveCareNow's analytic reports and developed an application for storing protected health information and tracking patient consent. Finally, we deployed a cohort identification tool to support feasibility studies and hypothesis generation. There is ongoing uptake of the system. To date, 31 centers have adopted the EHR-based forms and 21 centers are uploading data to the registry. Usage of the automated reports remains high and investigators have used the cohort identification tools to respond to several clinical trial requests. The current process for creating EHR-based data collection forms requires groups to work individually with each vendor. A vendor-agnostic model would allow for more rapid uptake. We believe that interfacing network-based registries with the EHR would allow them to serve as a source of decision support. Additional standards are needed in order for this vision to be achieved, however. We have successfully implemented a proof-of-concept Learning Health System while providing a foundation on which others can build. We have also highlighted opportunities where sponsors could help accelerate progress.

  11. Flexible automated systems of real time mining operation management: concepts, architecture, models of network engineering for data transmission and processing

    Energy Technology Data Exchange (ETDEWEB)

    Markhasin, A.B.

    1987-11-01

    Since the mid-1960s, considerable effort has been invested by the mining industry, its research institutions and universities to create real-time mining management automation systems. Some of the shortcomings that still persist in realizing the efficiency such systems can offer are due to objective and subjective factors within and outside the management systems: the creation of the component base, automation equipment and computer technology on the one hand, and the organization, process engineering and coordination of mining work on the other. This review addresses several of these shortcomings with recommendations for their solution in a systematic way, and suggests methods for the implementation of microprocessors and a network of flexible data transmission and processing facilities for both surface and underground mining.

  12. A Proposed Solution for Managing Doctor's Smart Cards in Hospitals Using a Single Sign-On Central Architecture

    OpenAIRE

    Mauro, Christian; Sunyaev, Ali; Leimeister, Jan Marco; Schweiger, Andreas; Krcmar, Helmut

    2014-01-01

    This paper describes a single sign-on solution for the central management of health care providers' smart cards in hospitals. The proposed approach, which is expected to be an improvement over current methods, is made possible through the introduction of a national healthcare telematics infrastructure in Germany, where every physician and every patient will automatically be given an electronic health smart card (for patients) and a corresponding health professional card (for health care providers)...

  13. DeSyGNER: A Building Block Architecture Fostering Independent Cooperative Development of Multimedia Knowledge Management Applications

    OpenAIRE

    Deibel, Stephan R.A.; Greenes, Robert A.; Snydr-Michal, Jan T.

    1990-01-01

    Multimedia knowledge management requires generalized sharing, composition and inter-relation of disparate data and knowledge elements. Since traditional computer operating systems treat programs as bodies of isolated functionality, inter-program connections can be made only by considerable effort on the part of each program author. We have been developing a multimedia kernel called DeSyGNER (the Decision Systems Group Nucleus of Extensible Resources) that provides basic multimedia knowledge m...

  14. Geometry anisotropy and mechanical property isotropy in titanium foam fabricated by replica impregnation method

    Energy Technology Data Exchange (ETDEWEB)

    Manonukul, Anchalee, E-mail: anchalm@mtec.or.th [National Metal and Materials Technology Center (MTEC), National Science and Technology Development Agency (NSTDA), 114 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Srikudvien, Pathompoom [National Metal and Materials Technology Center (MTEC), National Science and Technology Development Agency (NSTDA), 114 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Tange, Makiko [Taisei Kogyo Thailand Co., Ltd., Room INC2d-409, Innovation Cluster 2 Building, Tower D, 141 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Puncreobutr, Chedtha [Department of Metallurgical Engineering, Faculty of Engineering, Chulalongkorn University, Pathumwan, Bangkok 10330 (Thailand)

    2016-02-08

    Polyurethane (PU) foams have both geometry and mechanical property anisotropy. Metal foams, which are manufactured by investment casting or melt deposition methods using PU foam as a template, also have mechanical property anisotropy. This work studied the mechanical properties in two directions of titanium foam with four different cell sizes fabricated using the replica impregnation method. The two directions are (1) the loading direction parallel to the foaming direction, where the cells are elongated (EL direction), and (2) the loading direction perpendicular to the foaming direction, where the cells are equiaxed (EQ direction). The results show that the compression responses for both the EL and EQ directions are isotropic. Micrographs and X-ray micro-computed tomography show that the degree of geometry anisotropy is not strong enough to result in mechanical property anisotropy.

  15. Geometry anisotropy and mechanical property isotropy in titanium foam fabricated by replica impregnation method

    International Nuclear Information System (INIS)

    Manonukul, Anchalee; Srikudvien, Pathompoom; Tange, Makiko; Puncreobutr, Chedtha

    2016-01-01

    Polyurethane (PU) foams have both geometry and mechanical property anisotropy. Metal foams, which are manufactured by investment casting or melt deposition methods using PU foam as a template, also have mechanical property anisotropy. This work studied the mechanical properties in two directions of titanium foam with four different cell sizes fabricated using the replica impregnation method. The two directions are (1) the loading direction parallel to the foaming direction, where the cells are elongated (EL direction), and (2) the loading direction perpendicular to the foaming direction, where the cells are equiaxed (EQ direction). The results show that the compression responses for both the EL and EQ directions are isotropic. Micrographs and X-ray micro-computed tomography show that the degree of geometry anisotropy is not strong enough to result in mechanical property anisotropy.

  16. Computing Relative Free Energies of Solvation using Single Reference Thermodynamic Integration Augmented with Hamiltonian Replica Exchange.

    Science.gov (United States)

    Khavrutskii, Ilja V; Wallqvist, Anders

    2010-11-09

    This paper introduces an efficient single-topology variant of Thermodynamic Integration (TI) for computing relative transformation free energies in a series of molecules with respect to a single reference state. The presented TI variant, which we refer to as Single-Reference TI (SR-TI), combines well-established molecular simulation methodologies into a practical computational tool. Augmented with Hamiltonian Replica Exchange (HREX), the SR-TI variant can deliver enhanced sampling in select degrees of freedom. The utility of the SR-TI variant is demonstrated in calculations of relative solvation free energies for a series of benzene derivatives with increasing complexity. Notably, the SR-TI variant with the HREX option provides converged results in a challenging case of an amide molecule with a high (13-15 kcal/mol) barrier for internal cis/trans interconversion, using simulation times of only 1 to 4 ns.
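
    At its core, the TI part of the method is a numerical integration of the average of dU/dλ over the coupling parameter. The sketch below shows only that quadrature step, with made-up window averages; it is not the SR-TI/HREX implementation itself.

```python
# Thermodynamic integration in its simplest form: the relative free energy is
#     dF = integral_0^1 < dU/dlambda >_lambda dlambda,
# here evaluated with the trapezoidal rule over discrete lambda windows.
# The per-window averages below are made-up numbers for illustration only.
import numpy as np

lambdas = np.linspace(0.0, 1.0, 11)                      # lambda windows
du_dl = np.array([-3.2, -2.8, -2.1, -1.5, -0.9, -0.2,    # <dU/dlambda> per window
                   0.4,  1.1,  1.8,  2.5,  3.1])         # (kcal/mol)

# Trapezoidal rule written out explicitly.
delta_F = float(np.sum(0.5 * (du_dl[1:] + du_dl[:-1]) * np.diff(lambdas)))
print(f"Relative free energy: {delta_F:.2f} kcal/mol")
```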

  17. Pion emission from the T2K replica target: method, results and application

    CERN Document Server

    Abgrall, N.; Anticic, T.; Antoniou, N.; Argyriades, J.; Baatar, B.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bravar, A.; Brooks, W.; Brzychczyk, J.; Bubak, A.; Bunyatov, S.A.; Busygina, O.; Christakoglou, P.; Chung, P.; Czopowicz, T.; Davis, N.; Debieux, S.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Esposito, L.S.; Feofilov, G.A.; Fodor, Z.; Ferrero, A.; Fulop, A.; Gazdzicki, M.; Golubeva, M.; Grabez, B.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hakobyan, H.; Hasegawa, T.; Idczak, R.; Igolkin, S.; Ivanov, Y.; Ivashkin, A.; Kadija, K.; Kapoyannis, A.; Katrynska, N.; Kielczewska, D.; Kikola, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kochebina, O.; Kolesnikov, V.I.; Kolev, D.; Kondratiev, V.P.; Korzenev, A.; Kowalski, S.; Krasnoperov, A.; Kuleshov, S.; Kurepin, A.; Lacey, R.; Larsen, D.; Laszlo, A.; Lyubushkin, V.V.; Mackowiak-Pawlowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A.I.; Maletic, D.; Marchionni, A.; Marcinek, A.; Maris, I.; Marin, V.; Marton, K.; Matulewicz, T.; Matveev, V.; Melkumov, G.L.; Messina, M.; Mrowczynski, St.; Murphy, S.; Nakadaira, T.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A.D.; Paul, T.; Peryt, W.; Petukhov, O.; Planeta, R.; Pluta, J.; Popov, B.A.; Posiadala, M.; Pulawski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Renfordt, R.; Robert, A.; Rohrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Sekiguchi, T.; Seyboth, P.; Shibata, M.; Sipos, M.; Skrzypczak, E.; Slodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Strabel, C.; Strobele, H.; Susa, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V.V.; Vesztergombi, G.; Wilczek, A.; Wlodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszynski, O.; Zambelli, L.; Zipper, W.; Hartz, M.; Ichikawa, A.K.; Kubo, H.; Marino, A.D.; Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Yuan, T.; Zimmerman, E.D.

    2013-01-01

    The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as used by T2K, exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper, details of the experiment, data taking and data analysis method, and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.

  18. Behavioural responses of dogs to asymmetrical tail wagging of a robotic dog replica.

    Science.gov (United States)

    Artelle, K A; Dumoulin, L K; Reimchen, T E

    2011-03-01

    Recent evidence suggests that bilateral asymmetry in the amplitude of tail wagging of domestic dogs (Canis familiaris) is associated with approach (right wag) versus withdrawal (left wag) motivation and may be the by-product of hemispheric dominance. We consider whether such asymmetry in motion of the tail, a crucial appendage in intra-specific communication in all canids, provides visual information to a conspecific leading to differential behaviour. To evaluate this, we experimentally investigated the approach behaviour of free-ranging dogs to the asymmetric tail wagging of a life-size robotic dog replica. Our data, involving 452 separate interactions, showed a significantly greater proportion of dogs approaching the model continuously without stopping when the tail wagged to the left, compared with a right wag, which was more likely to yield stops. While the results indicate that laterality of a wagging tail provides behavioural information to conspecifics, the responses are not readily integrated into the predicted behaviour based on hemispheric dominance.

  19. Localization-Free Detection of Replica Node Attacks in Wireless Sensor Networks Using Similarity Estimation with Group Deployment Knowledge

    Directory of Open Access Journals (Sweden)

    Chao Ding

    2017-01-01

    Full Text Available Due to the unattended nature and poor security guarantees of wireless sensor networks (WSNs), adversaries can easily make replicas of compromised nodes and place them throughout the network to launch various types of attacks. Such an attack is dangerous because it enables the adversaries to control large numbers of nodes and extend the damage of attacks to most of the network at quite limited cost. To stop the node replica attack, we propose a location similarity-based detection scheme using deployment knowledge. Compared with prior solutions, our scheme provides extra functionality that prevents replicas from generating false location claims without deploying resource-consuming localization techniques on the resource-constrained sensor nodes. We evaluate the security performance of our proposal under different attack strategies through heuristic analysis, and show that our scheme achieves secure and robust replica detection by increasing the cost of node replication. Additionally, we evaluate the impact of the network environment on the proposed scheme through theoretical analysis and simulation experiments, and show that our scheme achieves effectiveness and efficiency with substantially lower communication, computational, and storage overhead than prior works under different situations and attack strategies.

  20. A Generalized DRM Architectural Framework

    Directory of Open Access Journals (Sweden)

    PATRICIU, V. V.

    2011-02-01

    Full Text Available The online distribution environment for digital goods has led to the need for systems to protect digital intellectual property. Digital Rights Management (DRM) is the system born to protect and control the distribution and use of those digital assets. The present paper is a review of the current state of DRM, focusing on architectural design, security technologies, and important DRM deployments. The paper primarily synthesizes DRM architectures within a general framework. We also present the DRM ecosystem as providing a better understanding of what is currently happening to content rights management from a technological point of view. This paper includes conclusions of several DRM initiative studies related to rights management systems, with the purpose of identifying and describing the most significant DRM architectural models. The basic functions and processes of DRM solutions are identified.

  1. New force replica exchange method and protein folding pathways probed by force-clamp technique.

    Science.gov (United States)

    Kouza, Maksim; Hu, Chin-Kun; Li, Mai Suan

    2008-01-28

    We have developed a new extended replica exchange method to study the thermodynamics of a system in the presence of an external force. Our idea is based on the exchange between different force replicas to accelerate the equilibration process. This new approach was applied to obtain the force-temperature phase diagram and other thermodynamic quantities of the three-domain ubiquitin. Using the Cα-Go model and Langevin dynamics, we have shown that the refolding pathways of single ubiquitin depend on which terminus is fixed. If the N terminus is fixed then the folding pathways are different compared to the case when both termini are free, but fixing the C terminus does not change them. Surprisingly, we have found that the anchored terminus does not affect the pathways of individual secondary structures of three-domain ubiquitin, indicating the important role of the multidomain construction. Therefore, force-clamp experiments, in which one end of a protein is kept fixed, can probe the refolding pathways of a single free-end ubiquitin if one uses either the polyubiquitin or a single domain with the C terminus anchored. However, it is shown that anchoring one end does not affect the refolding pathways of the titin domain I27, and force-clamp spectroscopy is always capable of predicting the folding sequencing of this protein. We have obtained a reasonable estimate for the unfolding barrier of ubiquitin, using the microscopic theory for the dependence of the unfolding time on the external force. The linkage between residue Lys48 and the C terminus of ubiquitin is found to have a dramatic effect on the location of the transition state along the end-to-end distance reaction coordinate, but the multidomain construction leaves the transition state almost unchanged. We have found that the maximum force in the force-extension profile from constant-velocity force pulling simulations depends on temperature nonlinearly. However, for some narrow temperature interval this dependence becomes ...
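
    For readers unfamiliar with the construction, exchanging force values between replicas can be treated analogously to Hamiltonian replica exchange; a plausible acceptance criterion of that type is given below. This is a reconstruction for illustration, not necessarily the exact expression used by the authors.

```latex
% Plausible Metropolis criterion for swapping the external forces f_i and f_j
% between two replicas whose instantaneous end-to-end extensions are x_i and x_j,
% obtained by treating H_k = H_0 - f_k x as a Hamiltonian replica exchange.
% Reconstruction for illustration; not necessarily the authors' exact expression.
\[
  P_{\mathrm{acc}}(i \leftrightarrow j)
  = \min\!\left\{ 1,\; \exp\!\left[-\beta\,(f_i - f_j)(x_i - x_j)\right] \right\},
  \qquad \beta = \frac{1}{k_B T}.
\]
```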

  2. Long-time atomistic simulations with the Parallel Replica Dynamics method

    Science.gov (United States)

    Perez, Danny

    Molecular Dynamics (MD) -- the numerical integration of atomistic equations of motion -- is a workhorse of computational materials science. Indeed, MD can in principle be used to obtain any thermodynamic or kinetic quantity, without introducing any approximation or assumptions beyond the adequacy of the interaction potential. It is therefore an extremely powerful and flexible tool to study materials with atomistic spatio-temporal resolution. These enviable qualities however come at a steep computational price, hence limiting the system sizes and simulation times that can be achieved in practice. While the size limitation can be efficiently addressed with massively parallel implementations of MD based on spatial decomposition strategies, allowing for the simulation of trillions of atoms, the same approach usually cannot extend the timescales much beyond microseconds. In this article, we discuss an alternative parallel-in-time approach, the Parallel Replica Dynamics (ParRep) method, that aims at addressing the timescale limitation of MD for systems that evolve through rare state-to-state transitions. We review the formal underpinnings of the method and demonstrate that it can provide arbitrarily accurate results for any definition of the states. When an adequate definition of the states is available, ParRep can simulate trajectories with a parallel speedup approaching the number of replicas used. We demonstrate the usefulness of ParRep by presenting different examples of materials simulations where access to long timescales was essential to access the physical regime of interest and discuss practical considerations that must be addressed to carry out these simulations. Work supported by the United States Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division.
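
    The ParRep loop described above can be summarized schematically as follows; dephase, run_md_segment and detect_transition are placeholders standing in for a real MD engine and a concrete state definition.

```python
# Schematic of one Parallel Replica Dynamics escape: many statistically
# independent replicas explore the same state, and the first observed escape is
# accepted with the simulation time accumulated over all replicas. The helper
# callables are placeholders for a real MD engine and state detector.
def parrep_escape(replicas, t_block, dephase, run_md_segment, detect_transition):
    for r in replicas:
        dephase(r)                    # decorrelate replicas within the current state
    t_state = 0.0                     # physical time accumulated across all replicas
    while True:
        for r in replicas:
            run_md_segment(r, t_block)
            t_state += t_block        # every replica's block counts toward state time
            new_state = detect_transition(r)
            if new_state is not None:
                # First escape wins; the transition time is the summed state time.
                return new_state, t_state
```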

  3. Gamma-ray spectra and doses from the Little Boy replica

    International Nuclear Information System (INIS)

    Moss, C.E.; Lucas, M.C.; Tisinger, E.W.; Hamm, M.E.

    1984-01-01

    Most radiation safety guidelines in the nuclear industry are based on the data concerning the survivors of the nuclear explosions at Hiroshima and Nagasaki. Crucial to determining these guidelines is the radiation from the explosions. We have measured gamma-ray pulse-height distributions from an accurate replica of the Little Boy device used at Hiroshima, operated at low power levels near critical. The device was placed outdoors on a stand 4 m from the ground to minimize environmental effects. The power levels were based on a monitor detector calibrated very carefully in independent experiments. High-resolution pulse-height distributions were acquired with a germanium detector to identify the lines and to obtain line intensities. The 7631 to 7645 keV doublet from neutron capture in the heavy steel case was dominant. Low-resolution pulse-height distributions were acquired with bismuth-germanate detectors. We calculated flux spectra from these distributions using accurately measured detector response functions and efficiency curves. We then calculated dose-rate spectra from the flux spectra using a flux-to-dose-rate conversion procedure. The integral of each dose-rate spectrum gave an integral dose rate. The integral doses at 2 m ranged from 0.46 to 1.03 mrem per 10¹³ fissions. The output of the Little Boy replica can be calculated with Monte Carlo codes. Comparison of our experimental spectra, line intensities, and integral doses can be used to verify these calculations at low power levels and give increased confidence to the calculated values from the explosion at Hiroshima. These calculations then can be used to establish better radiation safety guidelines. 7 references, 7 figures, 2 tables

  4. A Probabilistic Framework for Constructing Temporal Relations in Replica Exchange Molecular Trajectories.

    Science.gov (United States)

    Chattopadhyay, Aditya; Zheng, Min; Waller, Mark Paul; Priyakumar, U Deva

    2018-05-23

    Knowledge of the structure and dynamics of biomolecules is essential for elucidating the underlying mechanisms of biological processes. Given the stochastic nature of many biological processes, like protein unfolding, it is almost impossible that two independent simulations will generate the exact same sequence of events, which makes direct analysis of simulations difficult. Statistical models like Markov chains and transition networks help shed some light on the mechanistic nature of such processes by predicting the long-time dynamics of these systems from short simulations. However, such methods fall short in analyzing trajectories with partial or no temporal information, for example, replica exchange molecular dynamics or Monte Carlo simulations. In this work we propose a probabilistic algorithm, borrowing concepts from graph theory and machine learning, to extract reactive pathways from molecular trajectories in the absence of temporal data. A suitable vector representation was chosen to represent each frame in the macromolecular trajectory (as a series of interaction and conformational energies) and dimensionality reduction was performed using principal component analysis (PCA). The trajectory was then clustered using a density-based clustering algorithm, where each cluster represents a metastable state on the potential energy surface (PES) of the biomolecule under study. A graph was created with these clusters as nodes, with the edges learnt using an iterative expectation-maximization algorithm. The most reactive path is conceived as the widest path along this graph. We have tested our method on an RNA hairpin unfolding trajectory in aqueous urea solution. Our method makes the mechanism of unfolding in the RNA hairpin molecule more tractable to understand. As this method does not rely on temporal data, it can be used to analyze trajectories from Monte Carlo sampling techniques and replica exchange molecular dynamics (REMD).
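
    The pipeline described above can be sketched with standard tooling; the feature matrix, PCA/DBSCAN parameters and edge weights below are placeholders, and in the paper the edge weights are learnt with an expectation-maximization step rather than supplied directly.

```python
# Sketch of the pipeline: represent each frame by an energy feature vector,
# reduce dimensionality with PCA, cluster into metastable states with DBSCAN,
# and extract a "widest" (max-bottleneck) path on the resulting state graph.
import heapq
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

def cluster_frames(features, n_components=3, eps=0.5, min_samples=10):
    """Label each frame with a metastable-state index (placeholder parameters)."""
    reduced = PCA(n_components=n_components).fit_transform(features)
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(reduced)

def widest_path(graph, source, target):
    """Max-bottleneck path on a dict-of-dicts graph {u: {v: weight}}."""
    best = {source: float("inf")}
    heap = [(-best[source], source, [source])]
    while heap:
        neg_width, node, path = heapq.heappop(heap)
        if node == target:
            return path, -neg_width
        for nbr, w in graph.get(node, {}).items():
            width = min(-neg_width, w)
            if width > best.get(nbr, 0.0):
                best[nbr] = width
                heapq.heappush(heap, (-width, nbr, path + [nbr]))
    return None, 0.0
```

    For example, on the toy graph {0: {1: 0.7, 2: 0.2}, 1: {3: 0.5}, 2: {3: 0.9}}, widest_path(graph, 0, 3) returns ([0, 1, 3], 0.5), i.e. the path whose weakest edge is strongest.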

  5. Flow field analysis in a compliant acinus replica model using particle image velocimetry (PIV).

    Science.gov (United States)

    Berg, Emily J; Weisman, Jessica L; Oldham, Michael J; Robinson, Risa J

    2010-04-19

    Inhaled particles reaching the alveolar walls have the potential to cross the blood-gas barrier and enter the blood stream. Experimental evidence of pulmonary dosimetry, however, cannot be explained by current whole lung dosimetry models. Numerical and experimental studies shed some light on the mechanisms of particle transport, but realistic geometries have not been investigated. In this study, a three dimensional expanding model including two generations of respiratory bronchioles and five terminal alveolar sacs was created from a replica human lung cast. Flow visualization techniques were employed to quantify the fluid flow while utilizing streamlines to evaluate recirculation. Pathlines were plotted to track the fluid motion and estimate penetration depth of inhaled air. This study provides evidence that the two generations immediately proximal to the terminal alveolar sacs do not have recirculating eddies, even for intense breathing. Results of Peclet number calculations indicate that substantial convective motion is present in vivo for the case of deep breathing, which significantly increases particle penetration into the alveoli. However, particle diffusion remains the dominant mechanism of particle transport over convection, even for intense breathing because inhaled particles do not reach the alveolar wall in a single breath by convection alone. Examination of the velocity fields revealed significant uneven ventilation of the alveoli during a single breath, likely due to variations in size and location. This flow field data, obtained from replica model geometry with realistic breathing conditions, provides information to better understand fluid and particle behavior in the acinus region of the lung. Copyright 2009 Elsevier Ltd. All rights reserved.

  6. Architecture design of a generic centralized adjudication module integrated in a web-based clinical trial management system.

    Science.gov (United States)

    Zhao, Wenle; Pauls, Keith

    2016-04-01

    Centralized outcome adjudication has been used widely in multicenter clinical trials in order to prevent potential biases and to reduce variations in important safety and efficacy outcome assessments. Adjudication procedures can vary significantly among different studies. In practice, the coordination of outcome adjudication procedures in many multicenter clinical trials remains a manual process with low efficiency and high risk of delay. Motivated by the demands from two large clinical trial networks, a generic outcome adjudication module has been developed by the networks' data management center within a homegrown clinical trial management system. In this article, the system design strategy and database structure are presented. A generic database model was created to transfer different adjudication procedures into a unified set of sequential adjudication steps. Each adjudication step was defined by one activate condition, one lock condition, one to five categorical data items to capture adjudication results, and one free text field for general comments. Based on this model, a generic outcome adjudication user interface and a generic data processing program were developed within a homegrown clinical trial management system to provide automated coordination of outcome adjudication. By the end of 2014, this generic outcome adjudication module had been implemented in 10 multicenter trials. A total of 29 adjudication procedures were defined with the number of adjudication steps varying from 1 to 7. The implementation of a new adjudication procedure in this generic module took an experienced programmer 1 or 2 days. A total of 7336 outcome events had been adjudicated and 16,235 adjudication step activities had been recorded. In a multicenter trial, 1144 safety outcome event submissions went through a three-step adjudication procedure and reported a median of 3.95 days from safety event case report form submission to adjudication completion. In another trial
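
    To make the generic step definition above concrete, here is a minimal data-model sketch; the field names and the string representation of the conditions are illustrative assumptions, not the system's actual database schema.

```python
# Minimal sketch of the generic adjudication-step definition described above:
# each step has one activate condition, one lock condition, one to five
# categorical result items and one free-text comment field. Field names and
# the condition representation are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CategoricalItem:
    label: str
    choices: List[str]

@dataclass
class AdjudicationStep:
    name: str
    activate_condition: str                  # e.g. expression evaluated on event data
    lock_condition: str                      # condition under which the step is frozen
    result_items: List[CategoricalItem] = field(default_factory=list)  # 1-5 items
    comment: Optional[str] = None            # free-text general comment

    def __post_init__(self):
        if not 1 <= len(self.result_items) <= 5:
            raise ValueError("each step carries one to five categorical items")

# Example: a single-step procedure classifying a suspected safety outcome.
step = AdjudicationStep(
    name="Central review",
    activate_condition="safety_event_form.submitted == True",
    lock_condition="step.completed == True",
    result_items=[CategoricalItem("Event confirmed",
                                  ["Yes", "No", "Unable to determine"])],
)
```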

  7. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Science.gov (United States)

    2010-07-01

    Federal Management Regulation, Real Property, Part 102-77 (Art-in-Architecture), § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  8. istSOS, a new sensor observation management system: software architecture and a real-case application for flood protection

    Directory of Open Access Journals (Sweden)

    M. Cannata

    2015-11-01

    Full Text Available istSOS (Istituto scienze della Terra Sensor Observation Service) is an implementation of the Sensor Observation Service (SOS) standard from the Open Geospatial Consortium. The development of istSOS started in 2009 in order to provide a simple implementation of the SOS for the management, provision and integration of hydro-meteorological data collected in Canton Ticino (Southern Switzerland). istSOS is Open Source, entirely written in Python and based on reliable software like PostgreSQL/PostGIS and Apache/mod_wsgi. This paper illustrates the latest software enhancements, including a RESTful Web service and a Web-based graphical user interface, which enable a better and simplified interaction with measurements and SOS service settings. The robustness of the implemented solution has been validated in a real-case application: the Verbano Lake Early Warning System. In this application, near-real-time data have to be exchanged by inter-regional partners and used in a hydrological model for lake level forecasting and flooding hazard assessment. This system is linked with a dedicated geoportal used by the civil protection authorities for the management, alerting and protection of the population and the assets of the Locarno area. Practical considerations, technical issues and foreseen improvements are presented and discussed.
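
    As an illustration of the RESTful access pattern, the snippet below fetches the latest observation from a hypothetical istSOS instance with the requests library; the host, service name, procedure and URL layout are assumptions, so the exact paths should be checked against the istSOS documentation for a given installation.

```python
# Sketch of fetching recent observations from an istSOS RESTful endpoint.
# The host, service name, procedure and URL layout are hypothetical; consult
# the istSOS documentation for the exact API of a given installation.
import requests

BASE = "http://example.org/istsos/wa/istsos/services/demo"   # hypothetical service URL

resp = requests.get(
    f"{BASE}/operations/getobservation/offerings/temporary/procedures/T_LUGANO/"
    "observedproperties/air-temperature/eventtime/last",
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```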

  9. Architectural Drawing

    DEFF Research Database (Denmark)

    Steinø, Nicolai

    2018-01-01

    In a time of computer-aided design, computer graphics and parametric design tools, the art of architectural drawing is in a state of neglect. But design and drawing are inseparably linked in ways which often go unnoticed. Essentially, it is very difficult, if not impossible, to conceive of a design ... is that computers can represent graphic ideas both faster and better than most medium-skilled draftsmen, drawing in design is not only about representing final designs. In fact, several steps involving the capacity to draw lie before the representation of a final design. Not only are drawing skills an important ... prerequisite for learning about the nature of existing objects and spaces, and thus for building a vocabulary of design. They are also a prerequisite for both reflecting and communicating about design ideas. In this paper, a taxonomy of notation, reflection, communication and presentation drawing is presented...

  10. Architectural Theatricality

    DEFF Research Database (Denmark)

    Tvedebrink, Tenna Doktor Olsen; Fisker, Anna Marie; Kirkegaard, Poul Henning

    2013-01-01

    In the attempt to improve patient treatment and recovery, researchers focus on applying concepts of hospitality to hospitals. Often these concepts are dominated by hotel-metaphors focusing on host–guest relationships or concierge services. Motivated by a project trying to improve patient treatment...... is known for his writings on theatricality, understood as a holistic design approach emphasizing the contextual, cultural, ritual and social meanings rooted in architecture. Relative hereto, the International Food Design Society recently argued, in a similar holistic manner, that the methodology used...... to provide an aesthetic eating experience includes knowledge on both food and design. Based on a hermeneutic reading of Semper’s theory, our thesis is that this holistic design approach is important when debating concepts of hospitality in hospitals. We use this approach to argue for how ‘food design...

  11. Lab architecture

    Science.gov (United States)

    Crease, Robert P.

    2008-04-01

    There are few more dramatic illustrations of the vicissitudes of laboratory architecture than the contrast between Building 20 at the Massachusetts Institute of Technology (MIT) and its replacement, the Ray and Maria Stata Center. Building 20 was built hurriedly in 1943 as temporary housing for MIT's famous Rad Lab, the site of wartime radar research, and it remained a productive laboratory space for over half a century. A decade ago it was demolished to make way for the Stata Center, an architecturally striking building designed by Frank Gehry to house MIT's computer science and artificial intelligence labs. But in 2004 - just two years after the Stata Center officially opened - the building was criticized for being unsuitable for research and became the subject of still ongoing lawsuits alleging design and construction failures.

  12. Dynamic configuration management of a multi-standard and multi-mode reconfigurable multi-ASIP architecture for turbo decoding

    Science.gov (United States)

    Lapotre, Vianney; Gogniat, Guy; Baghdadi, Amer; Diguet, Jean-Philippe

    2017-12-01

    The multiplication of connected devices goes along with a large variety of applications and traffic types with diverse requirements. Accompanying this evolution in connectivity, recent years have seen considerable evolution of wireless communication standards in the domain of mobile telephone networks, local/wide wireless area networks, and Digital Video Broadcasting (DVB). In this context, intensive research has been conducted to provide flexible turbo decoders targeting high throughput, multi-mode and multi-standard operation, and power efficiency. However, flexible turbo decoder implementations have not often considered dynamic reconfiguration issues in this context, which requires high-speed configuration switching. Starting from this assessment, this paper proposes the first solution that allows frame-by-frame run-time configuration management of a multi-processor turbo decoder without compromising decoding performance.

  13. Functional Interface Considerations within an Exploration Life Support System Architecture

    Science.gov (United States)

    Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad

    2016-01-01

    As notional life support system (LSS) architectures are developed and evaluated, myriad options must be considered pertaining to process technologies, components, and equipment assemblies. Each option must be evaluated relative to its impact on key functional interfaces within the LSS architecture. A leading notional architecture has been developed to guide the path toward realizing future crewed space exploration goals. This architecture includes atmosphere revitalization, water recovery and management, and environmental monitoring subsystems. Guiding requirements for developing this architecture are summarized and important interfaces within the architecture are discussed. The role of environmental monitoring within the architecture is described.

  14. Architectural design of experience based factory model for software ...

    African Journals Online (AJOL)

    architectural design. Automation features are incorporated in the design in which workflow system and intelligent agents are integrated, and the facilitation of cloud environment is empowered to further support the automation. Keywords: architectural design; knowledge management; experience factory; workflow;

  15. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture. Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  16. SUSTAINABLE ARCHITECTURE: WHAT ARCHITECTURE STUDENTS THINK

    OpenAIRE

    SATWIKO, PRASASTO

    2013-01-01

    Sustainable architecture has become a hot issue lately as the impacts of climate change become more intense. Architecture education has responded by integrating knowledge of sustainable design into the curriculum. However, in real life, new buildings keep coming with designs that completely ignore sustainable principles. This paper discusses the results of two national competitions on sustainable architecture targeted at architecture students (conducted in 2012 and 2013). The results a...

  17. Modeling Architectural Patterns Using Architectural Primitives

    NARCIS (Netherlands)

    Zdun, Uwe; Avgeriou, Paris

    2005-01-01

    Architectural patterns are a key point in architectural documentation. Regrettably, there is poor support for modeling architectural patterns, because the pattern elements are not directly matched by elements in modeling languages, and, at the same time, patterns support an inherent variability that

  18. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  19. Lightweight enterprise architectures

    CERN Document Server

    Theuerkorn, Fenix

    2004-01-01

    STATE OF ARCHITECTURE: Architectural Chaos / Relation of Technology and Architecture / The Many Faces of Architecture / The Scope of Enterprise Architecture / The Need for Enterprise Architecture / The History of Architecture / The Current Environment / Standardization Barriers / The Need for Lightweight Architecture in the Enterprise / The Cost of Technology / The Benefits of Enterprise Architecture / The Domains of Architecture / The Gap between Business and IT / Where Does LEA Fit? / LEA's Framework / Frameworks, Methodologies, and Approaches / The Framework of LEA / Types of Methodologies / Types of Approaches / Actual System Environmen

  20. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  1. Information architecture. Volume 1, The foundations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    The Information Management Planning and Architecture Coordinating Team was formed to establish an information architecture framework to meet DOE's current and future information needs. This department-wide activity was initiated in accordance with the DOE Information Management Strategic Plan; it also supports the Departmental Strategic Plan. It recognizes recent changes in emphasis as reflected in OMB Circular A-130 and the Information Resources Management Planning Process Improvement Team recommendations. The sections of this document provide the foundation for establishing DOE's Information Architecture: Background, Business Case (reduced duplication of effort, increased integration of activities, improved operational capabilities), Baseline (the technology baseline currently in place within DOE), Vision (guiding principles for the future DOE Information Architecture), Standards Process, Policy and Process Integration (describing the relations between the information architecture and business processes), and Next Steps. Following each section is a scenario. A glossary of terms is provided.

  2. Precision of fit between implant impression coping and implant replica pairs for three implant systems.

    Science.gov (United States)

    Nicoll, Roxanna J; Sun, Albert; Haney, Stephan; Turkyilmaz, Ilser

    2013-01-01

    The fabrication of an accurately fitting implant-supported fixed prosthesis requires multiple steps, the first of which is assembling the impression coping on the implant. An imprecise fit of the impression coping on the implant will cause errors that will be magnified in subsequent steps of prosthesis fabrication. The purpose of this study was to characterize the 3-dimensional (3D) precision of fit between impression coping and implant replica pairs for 3 implant systems. The selected implant systems represent the 3 main joint types used in implant dentistry: external hexagonal, internal trilobe, and internal conical. Ten impression copings and 10 implant replicas from each of the 3 systems, B (Brånemark System), R (NobelReplace Select), and A (NobelActive) were paired. A standardized aluminum test body was luted to each impression coping, and the corresponding implant replica was embedded in a stone base. A coordinate measuring machine was used to quantify the maximum range of displacement in a vertical direction as a function of the tightening force applied to the guide pin. Maximum angular displacement in a horizontal plane was measured as a function of manual clockwise or counterclockwise rotation. Vertical and rotational positioning was analyzed by using 1-way analysis of variance (ANOVA). The Fisher protected least significant difference (PLSD) multiple comparisons test of the means was applied when the F-test in the ANOVA was significant (α=.05). The mean and standard deviation for change in the vertical positioning of impression copings was 4.3 ±2.1 μm for implant system B, 2.8 ±4.2 μm for implant system R, and 20.6 ±8.8 μm for implant system A. The mean and standard deviation for rotational positioning was 3.21 ±0.98 degrees for system B, 2.58 ±1.03 degrees for system R, and 5.30 ±0.79 degrees for system A. The P-value for vertical positioning between groups A and B and between groups A and R was <.001. No significant differences were found for
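
    As a hedged illustration of the statistical step only (one-way ANOVA at α = .05), the sketch below runs scipy.stats.f_oneway on synthetic values generated around the group means and standard deviations reported above; these numbers are not the study's measurements.

```python
# Minimal sketch of the one-way ANOVA step, using synthetic vertical-displacement
# values generated around the means reported in the abstract (4.3, 2.8 and 20.6 um).
# These are NOT the study's measurements.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
group_B = rng.normal(4.3, 2.1, size=10)    # external hexagonal
group_R = rng.normal(2.8, 4.2, size=10)    # internal trilobe
group_A = rng.normal(20.6, 8.8, size=10)   # internal conical

f_stat, p_value = f_oneway(group_B, group_R, group_A)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")   # p << .05 flags a significant group effect
```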

  3. Additive Manufacturing: A Comparative Analysis of Dimensional Accuracy and Skin Texture Reproduction of Auricular Prostheses Replicas.

    Science.gov (United States)

    Unkovskiy, Alexey; Spintzyk, Sebastian; Axmann, Detlef; Engel, Eva-Maria; Weber, Heiner; Huettig, Fabian

    2017-11-10

    The use of computer-aided design/computer-aided manufacturing (CAD/CAM) and additive manufacturing in maxillofacial prosthetics has been widely acknowledged. Rapid prototyping can be considered for the manufacturing of auricular prostheses; so-called prosthesis replicas can therefore be fabricated by digital means. The objective of this study was to identify a superior additive manufacturing method to fabricate auricular prosthesis replicas (APRs) within a digital workflow. Auricles of 23 healthy subjects (mean age of 37.8 years) were measured in vivo with respect to an anthropometrical protocol. Landmarks were volumized with fiducial balls for 3D scanning using a handheld structured light scanner. The 3D CAD dataset was postprocessed, and the same anthropometrical measurements were made in the CAD software with the digital lineal. Each CAD dataset was materialized using fused deposition modeling (FDM), selective laser sintering (SLS), and stereolithography (SL), constituting 53 APR samples. All distances between the landmarks were measured on the APRs. After the determination of the measurement error within the five data groups (in vivo, CAD, FDM, SLS, and SL), the mean values were compared using the matched-pairs method. To this end, the in vivo and CAD datasets were set as references. Finally, the surface structure of the APRs was qualitatively evaluated with stereomicroscopy and profilometry to ascertain the level of skin detail reproduction. The anthropometrical approach showed drawbacks in measuring the protrusion of the ear's helix. The measurement error within all groups of measurements was calculated to be between 0.20 and 0.28 mm, implying high reproducibility. The lowest mean differences among the 53 produced APRs were found for FDM (0.43%), followed by SLS (0.54%) and SL (0.59%), compared to in vivo, and again for FDM (0.20%), followed by SL (0.36%) and SLS (0.39%), compared to CAD. None of these values exceed the threshold of clinical relevance (1.5%); however, the qualitative

  4. Marshall Application Realignment System (MARS) Architecture

    Science.gov (United States)

    Belshe, Andrea; Sutton, Mandy

    2010-01-01

    The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. The MARS Architecture key stakeholders are most

  5. Secure Architectures in the Cloud

    NARCIS (Netherlands)

    De Capitani di Vimercati, Sabrina; Pieters, Wolter; Probst, Christian W.

    2011-01-01

    This report documents the outcomes of Dagstuhl Seminar 11492 “Secure Architectures in the Cloud”. In cloud computing, data storage and processing are offered as services, and data are managed by external providers that reside outside the control of the data owner. The use of such services reduces

  6. Microprocessor architectures RISC, CISC and DSP

    CERN Document Server

    Heath, Steve

    1995-01-01

    'Why are there all these different processor architectures and what do they all mean? Which processor will I use? How should I choose it?' Given the task of selecting an architecture or design approach, both engineers and managers require a knowledge of the whole system and an explanation of the design tradeoffs and their effects. This is information that rarely appears in data sheets or user manuals. This book fills that knowledge gap. Section 1 provides a primer and history of the three basic microprocessor architectures. Section 2 describes the ways in which the architectures react with the

  7. On the Application of Replica Molding Technology for the Indirect Measurement of Surface and Geometry of Micromilled Components

    DEFF Research Database (Denmark)

    Baruffi, Federico; Parenti, Paolo; Cacciatore, Francesco

    2017-01-01

    the replica molding technology. The method consists of obtaining a replica of the feature that is inaccessible for standard measurement devices and performing its indirect measurement. This paper examines the performance of a commercial replication medium applied to the indirect measurement of micromilled...... components. Two specifically designed micromilled benchmark samples were used to assess the accuracy in replicating both surface texture and geometry. A 3D confocal microscope and a focus variation instrument were employed and the associated uncertainties were evaluated. The replication method proved...... to be suitable for characterizing micromilled surface texture, even though an average nanometre-level overestimation of the Sa parameter was observed. On the other hand, the replicated geometry generally underestimated that of the master, often leading to a different measurement output considering...

  8. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    Science.gov (United States)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data into molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.
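
    The central device, a restraint on the replica-averaged value of an observable pulled toward a possibly time-dependent experimental target, can be sketched in a few lines. The harmonic form, force constant, and toy observables below are assumptions for illustration; this is not the authors' implementation.

```python
# Schematic illustration of a replica-averaged restraint: the average of an
# observable over N replicas is biased toward an experimental target that may
# change with time (maximum-caliber setting). Harmonic form and all values
# are assumptions for illustration only.
import numpy as np

def restraint_energy_and_forces(obs_per_replica, target, k):
    """E = k/2 * (<O>_replicas - target)^2; returns E and dE/dO_i per replica."""
    avg = np.mean(obs_per_replica)
    diff = avg - target
    energy = 0.5 * k * diff ** 2
    # each replica feels 1/N of the gradient of the averaged restraint
    grad = k * diff / len(obs_per_replica) * np.ones_like(obs_per_replica)
    return energy, grad

# toy usage: 8 replicas, target drifting in time as in time-resolved data
obs = np.array([1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.0, 1.1])
for t, target in enumerate([1.0, 1.05, 1.1]):
    E, g = restraint_energy_and_forces(obs, target, k=100.0)
    print(f"t={t}: restraint energy = {E:.3f}")
```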

  9. A Draft Science Management Plan for Returned Samples from Mars: Recommendations from the International Mars Architecture for the Return of Samples (iMARS) Phase II Working Group

    Science.gov (United States)

    Haltigin, T.; Lange, C.; Mugnuolo, R.; Smith, C.

    2018-04-01

    This paper summarizes the findings and recommendations of the International Mars Architecture for the Return of Samples (iMARS) Phase II Working Group, an international team comprising 38 members from 16 countries and agencies.

  10. Metallic microwires obtained as replicas of etched ion tracks in polymer matrixes: Microscopy and emission properties

    International Nuclear Information System (INIS)

    Zagorski, D.L.; Bedin, S.A.; Oleinikov, V.A.; Polyakov, N.B.; Rybalko, O.G.; Mchedlishvili, B.V.

    2009-01-01

    Specially prepared porous matrixes (with through and dead-end pores of cylindrical or conical form) were used as templates for making ensembles of microwires. The process of electrodeposition of metal (Cu) into these pores was investigated. The AFM technique was used for studying the 'composite material' (metal microwires embedded in the polymer matrix). It was shown that the combination of different AFM modes (tapping with phase-contrast mode, contact with lateral-force mode) makes it possible to detect the metal in the polymer matrix. An additional spreading-resistance mode in the contact regime allowed the electrical conductivity of a single wire to be measured. The ensembles of free-standing microwires (metallic replicas of the pores obtained after removal of the polymer matrix) were used as substrates (for deposition of the probe) for ion emission in the mass spectrometer. It was shown that the intensity of the formed ion beam increases with the power of the laser pulse and with the mass of the probe. The dependence of the mass-spectral signal intensity on the laser pulse power has a threshold character, with saturation accompanied by the appearance of dimer ions. At the same time, this intensity decreases with increasing surface density of wires. Degradation of the wires during laser pulse irradiation was also observed.

  11. From Dalek half balls to Daft Punk helmets: Mimetic fandom and the crafting of replicas

    Directory of Open Access Journals (Sweden)

    Matt Hills

    2014-06-01

    Mimetic fandom is a surprisingly understudied mode of (culturally) masculinized fan activity in which fans research and craft replica props. Mimetic fandom can be considered as (in)authentic and (im)material, combining noncommercial status with grassroots marketing or brand reinforcement as well as fusing an emphasis on material artifacts with Web 2.0 collective intelligence. Simply analyzing mimetic fandom as part of fannish material culture fails to adequately assess the nonmaterial aspects of this collaborative creativity. Two fan cultures are taken as case studies: Dalek building groups and Daft Punk helmet constructors. These diverse cases indicate that mimetic fandom has a presence and significance that moves across media fandoms and is not restricted to the science fiction, fantasy, and horror followings with which it is most often associated. Mimetic fandom may be theorized as an oscillatory activity that confuses binaries and constructions of (academic/fan) authenticity. This fan practice desires and pursues a kind of ontological bridging or unity—from text to reality—that is either absent or less dominant in many other fan activities such as cosplay, screen-used prop collecting, and geographical pilgrimage. Fan studies may benefit from reassessing the place of mimesis, especially in order to theorize fan practices that are less clearly transformative in character.

  12. Conformational Ensembles of α-Synuclein Derived Peptide with Different Osmolytes from Temperature Replica Exchange Sampling

    Directory of Open Access Journals (Sweden)

    Salma Jamal

    2017-12-01

    Intrinsically disordered proteins (IDPs) are a class of proteins that do not have a stable three-dimensional structure and can adopt a range of conformations playing various vital functional roles. Alpha-synuclein is one such IDP which can aggregate into toxic protofibrils and has been associated largely with Parkinson's disease (PD) along with other neurodegenerative diseases. Osmolytes are small organic compounds that can alter the environment around proteins by acting as denaturants or protectants. In the present study, we conducted a series of replica exchange molecular dynamics simulations to explore the role of the osmolytes urea (a denaturant) and TMAO (trimethylamine N-oxide, a protecting osmolyte) in the aggregation and conformations of the synuclein peptide. We observed that the two osmolytes have significantly distinct impacts on the peptide and led to transitions of the peptide conformations from one state to another. Our findings highlight that urea attenuated peptide aggregation and resulted in the formation of extended peptide structures, whereas TMAO led to compact and folded forms of the peptide.

  13. Enhancing dry adhesives and replica molding with ethyl cyano-acrylate

    International Nuclear Information System (INIS)

    Bovero, E; Menon, C

    2014-01-01

    The use of cyano-acrylate to improve the performance of dry adhesives and their method of fabrication is investigated. Specifically, the contributions of this work are: (1) a new adhesion method to adhere to a large variety of surfaces, (2) a strategy to increase the compliance of dry adhesives, and (3) an improved fabrication process for micro-structured dry adhesives based on replica molding. For the first contribution, the adhesion method consists of anchoring a micro-structured dry adhesive to a surface through a layer of hardened ethyl cyano-acrylate (ECA). This method increases adhesion by orders of magnitude at the expense of leaving residue after detachment. However, the method preserves reusability. For the second contribution, a double-sided dry adhesive is obtained by introducing a substrate with a millimeter-sized pillar structure, which enables a further increase in adhesion. For the third contribution, an ECA layer is used as a mold for the fabrication of new adhesives. These new types of molds proved able to produce dry adhesives with high reproducibility and low degradation. (paper)

  14. Partial replicas of uv-irradiated bacteriophage T4 genomes and their role in multiplicity reactivation

    International Nuclear Information System (INIS)

    Rayssiguier, C.; Kozinski, A.W.; Doermann, A.H.

    1980-01-01

    A physicochemical study was made of the replication and transmission of uv-irradiated T4 genomes. The data presented in this paper justify the following conclusions. (i) For both low and high multiplicity of infection there was abundant replication from uv-irradiated parental templates. It exceeded by far the efficiency predicted by the hypothesis that a single lethal hit completely prevents replication of the killed phage DNA: i.e., some dead phage particles must replicate parts of their DNA. (ii) Replication of the uv-irradiated DNA was repetitive as shown by density reversal experiments. (iii) Newly synthesized progeny DNA originating from uv-irradiated templates appeared as significantly shorter segments of the genomes than progeny DNA produced from non-uv-irradiated templates. A good correlation existed between the number of uv hits and the number of random cuts that would be needed to reduce replication fragments to the length observed. (iv) The contribution of uv-irradiated parental DNA among progeny phage in multiplicity reactivation was disposed in shorter subunits than was the DNA from unirradiated parental phage. It is important to emphasize that it was mainly in the form of replicative hybrid. These conclusions appear to justify excluding interparental recombination as a prerequisite for multiplicity reactivation. They lead directly to some form of partial replica hypothesis for multiplicity reactivation

  15. Replica of human dentin treated with different desensitizing agents: a methodological SEM study in vitro

    Directory of Open Access Journals (Sweden)

    Pereira Jose Carlos

    2002-01-01

    This is a preliminary study to determine a methodological sequence in vitro which may allow the reproduction of dentin for SEM analysis after the use of different desensitizing agents. Dentin discs obtained from extracted human third molars were etched with 6% citric acid, an artificial smear layer was created, and the surface of each dentin disc was divided into four quadrants. Quadrants 2, 3 and 4 of each disc were conditioned with 6% citric acid. The desensitizing agents (Oxa-Gel®, Gluma Desensitizer and an experimental agent) were applied to quadrants 3 and 4. To evaluate the acid resistance of the treatment, quadrant 4 was etched again with 6% citric acid. An impression was then taken with Aquasil ULV. After a setting period of 6 min, each disc was removed from the impression and stored in a moisture-free environment for 24 h at 37ºC. After that time, a low-viscosity epoxy resin (Araltec GY 1109 BR) was poured into the impression and cured for 24 h. All specimens were metal-coated for SEM analysis. Comparison of the photomicrographs of the dentin discs with their respective impressions and resin replicas showed that this technique can reproduce the characteristics of a dentin surface treated with desensitizing agents.

  16. Replicas in Cultural Heritage: 3d Printing and the Museum Experience

    Science.gov (United States)

    Ballarin, M.; Balletti, C.; Vernier, P.

    2018-05-01

    3D printing has recently seen massive diffusion for several applications, not least in the field of Cultural Heritage. Being used for different purposes, such as study, analysis, conservation or access in museum exhibitions, 3D printed replicas need to undergo a process of validation, also in terms of metrical precision and accuracy. The Laboratory of Photogrammetry of Iuav University of Venice has started several collaborations with Italian museum institutions, firstly for the digital acquisition and then for the physical reproduction of objects of historical and artistic interest. The aim of the research is to analyse the metric characteristics of the printed model in relation to the original data, and to optimize the process that leads from the survey to the physical representation of an object. In fact, the object could be acquired through different methodologies that have different precisions (multi-image photogrammetry, TOF laser scanner, triangulation-based laser scanner), and this always involves a long processing phase. It should not be forgotten that the digital data have to undergo a series of simplifications which, on the one hand, eliminate the noise introduced by the acquisition process, but on the other hand can lead to discrepancies between the physical copy and the original geometry. In this paper we will show the results obtained on a small archaeological find that was acquired and reproduced for a museum exhibition intended for blind and partially sighted people.

  17. Plasticity of 150-loop in influenza neuraminidase explored by Hamiltonian replica exchange molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Nanyu Han

    Neuraminidase (NA) of influenza is a key target for antiviral inhibitors, and the 150-cavity in group-1 NA provides new insight into treating this disease. However, the NA of the 2009 pandemic influenza (09N1) was found to lack this cavity in a crystal structure. To address the issue of the flexibility of the 150-loop, Hamiltonian replica exchange molecular dynamics simulations were performed on different groups of NAs. The free energy landscape calculated based on the volume of the 150-cavity indicates that 09N1 prefers open forms of the 150-loop. The turn A (residues 147-150) of the 150-loop is identified as the most dynamical motif, which induces the inter-conversion of this loop among different conformations. Within turn A, the backbone dynamics of residue 149 is highly correlated with the shape of the 150-loop and can thus function as a marker for the conformation of the 150-loop. In contrast, the closed conformation of the 150-loop is more energetically favorable in N2, one of the group-2 NAs. The D147-H150 salt bridge is found to have no correlation with the conformation of the 150-loop. Instead, the intimate salt bridge interaction between the 150 and 430 loops in the N2 variant contributes the stabilizing factor for the closed form of the 150-loop. Clustering analysis elaborates the structural plasticity of the loop. This enhanced sampling simulation provides more information for further structure-based drug discovery on influenza virus.
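
    The free-energy landscape over the 150-cavity volume mentioned above is, in generic terms, obtained by histogramming the sampled volumes and taking F(V) = -k_B T ln P(V). The sketch below shows only that post-processing step on made-up volume samples; it is not the authors' analysis pipeline.

```python
# Generic sketch: estimate a 1D free-energy profile F(V) = -kT ln P(V) from
# sampled values of a collective variable (here, a made-up "cavity volume").
import numpy as np

kT = 0.593  # kcal/mol at ~298 K
volumes = np.random.default_rng(1).normal(loc=120.0, scale=25.0, size=20000)  # toy data, A^3

hist, edges = np.histogram(volumes, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0
free_energy = -kT * np.log(hist[mask])
free_energy -= free_energy.min()          # shift the global minimum to zero

for v, f in list(zip(centers[mask], free_energy))[::10]:
    print(f"V = {v:7.1f} A^3   F = {f:5.2f} kcal/mol")
```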

  18. Simulated tempering distributed replica sampling: A practical guide to enhanced conformational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Sarah; Pomes, Regis, E-mail: pomes@sickkids.ca

    2010-11-01

    Simulated tempering distributed replica sampling (STDR) is a generalized-ensemble method designed specifically for simulations of large molecular systems on shared and heterogeneous computing platforms [Rauscher, Neale and Pomes (2009) J. Chem. Theor. Comput. 5, 2640]. The STDR algorithm consists of an alternation of two steps: (1) a short molecular dynamics (MD) simulation; and (2) a stochastic temperature jump. Repeating these steps thousands of times results in a random walk in temperature, which allows the system to overcome energetic barriers, thereby enhancing conformational sampling. The aim of the present paper is to provide a practical guide to applying STDR to complex biomolecular systems. We discuss the details of our STDR implementation, which is a highly-parallel algorithm designed to maximize computational efficiency while simultaneously minimizing network communication and data storage requirements. Using a 35-residue disordered peptide in explicit water as a test system, we characterize the efficiency of the STDR algorithm with respect to both diffusion in temperature space and statistical convergence of structural properties. Importantly, we show that STDR provides a dramatic enhancement of conformational sampling compared to a canonical MD simulation.
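
    As a minimal sketch of step (2) only, the fragment below applies the standard simulated-tempering acceptance test for a jump between neighbouring temperatures, min(1, exp(-(beta_j - beta_i)E + (g_j - g_i))); the temperature ladder, weights g, and energy value are toy assumptions, not STDR's actual parameters or code.

```python
# Toy illustration of the stochastic temperature-jump step in simulated
# tempering. The temperature ladder, tempering weights g, and potential
# energy are illustrative placeholders, not values from STDR.
import math
import random

betas = [1.0 / (0.0019872041 * T) for T in (300, 310, 320, 330)]  # 1/(kcal/mol)
g = [0.0, 5.0, 9.8, 14.4]   # tempering weights (would be estimated iteratively)

def attempt_jump(i, energy):
    """Propose a jump to a neighbouring temperature and accept/reject it."""
    j = random.choice([k for k in range(len(betas)) if abs(k - i) == 1])
    log_acc = -(betas[j] - betas[i]) * energy + (g[j] - g[i])
    if random.random() < math.exp(min(0.0, log_acc)):
        return j      # jump accepted: continue MD at the new temperature
    return i          # rejected: stay at the current temperature

state, E = 0, -120.0  # current temperature index and potential energy (toy)
for step in range(5):
    state = attempt_jump(state, E)   # in STDR a short MD run precedes each jump
    print(f"step {step}: temperature index {state}")
```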

  19. Perovskite Quantum Dots Modeled Using ab Initio and Replica Exchange Molecular Dynamics

    KAUST Repository

    Buin, Andrei; Comin, Riccardo; Ip, Alexander H.; Sargent, Edward H.

    2015-01-01

    © 2015 American Chemical Society. Organometal halide perovskites have recently attracted tremendous attention at both the experimental and theoretical levels. Much of this work has been dedicated to bulk material studies, yet recent experimental work has shown the formation of highly efficient quantum-confined nanocrystals with tunable band edges. Here we investigate perovskite quantum dots from theory, predicting an upper bound of the Bohr radius of 45 Å that agrees well with literature values. When the quantum dots are stoichiometric, they are trap-free and have nearly symmetric contributions to confinement from the valence and conduction bands. We further show that surface-associated conduction bandedge states in perovskite nanocrystals lie below the bulk states, which could explain the difference in Urbach tails between mesoporous and planar perovskite films. In addition to conventional molecular dynamics (MD), we implement an enhanced phase-space sampling algorithm, replica exchange molecular dynamics (REMD). We find that in simulation of methylammonium orientation and global minima, REMD outperforms conventional MD. To the best of our knowledge, this is the first REMD implementation for realistic-sized systems in the realm of DFT calculations.
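
    For orientation, the quoted Bohr-radius bound is of the order given by the standard hydrogenic estimate of the exciton Bohr radius; the dielectric constant and reduced effective mass used below are assumed, representative literature-order values for a lead-halide perovskite, not numbers taken from this work.

```latex
% Hydrogenic estimate of the exciton Bohr radius; \varepsilon_r \approx 10 and
% \mu \approx 0.12\,m_0 are assumed representative values, not from the paper.
a_B^{*} = \varepsilon_r \,\frac{m_0}{\mu}\, a_0
        \approx 10 \times \frac{1}{0.12} \times 0.529\ \text{\AA}
        \approx 44\ \text{\AA}
```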

  1. Porous alumina-zirconia ceramics obtained by the replica method and coated with calcium phosphate

    International Nuclear Information System (INIS)

    Silva, A.D.R.; Rigoli, W.R.; Osiro, Denise; Pallone, E.M.J.A.

    2016-01-01

    Biomaterials used in bone replacement, including porous bioceramics, are often used as support structures for bone formation and repair. Porous bioceramics are used because they present features such as biocompatibility, high porosity and a pore morphology that confer adequate mechanical strength and induce bone growth. In this work, porous specimens of alumina containing 5% by volume of zirconia inclusions were produced by the replica method. The porous specimens had their surfaces chemically treated with phosphoric acid and were coated with calcium phosphate. The coating was performed using the biomimetic method over 14 days with an initial pH of 6.1. The porous specimens were characterized using the following techniques: porosity, axial compression tests, microtomography, scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS), X-ray diffraction (XRD) and pH measurements in SBF solution. The results showed specimens with a pore morphology suitable for application as a biomaterial, and even the reduced incubation time favored the formation of calcium phosphate phases on the material surfaces. (author)

  2. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  3. Architectural design decisions

    NARCIS (Netherlands)

    Jansen, Antonius Gradus Johannes

    2008-01-01

    A software architecture can be considered as the collection of key decisions concerning the design of the software of a system. Knowledge about this design, i.e. architectural knowledge, is key for understanding a software architecture and thus the software itself. Architectural knowledge is mostly

  4. Information Integration Architecture Development

    OpenAIRE

    Faulkner, Stéphane; Kolp, Manuel; Nguyen, Duy Thai; Coyette, Adrien; Do, Thanh Tung; 16th International Conference on Software Engineering and Knowledge Engineering

    2004-01-01

    Multi-Agent Systems (MAS) architectures are gaining popularity for building open, distributed, and evolving software required by systems such as information integration applications. Unfortunately, despite considerable work in software architecture during the last decade, few research efforts have aimed at truly defining patterns and languages for designing such multiagent architectures. We propose a modern approach based on organizational structures and architectural description lan...

  5. Fragments of Architecture

    DEFF Research Database (Denmark)

    Bang, Jacob Sebastian

    2016-01-01

    Topic 3: “Case studies dealing with the artistic and architectural work of architects worldwide, and the ties between specific artistic and architectural projects, methodologies and products”...

  6. Order parameter free enhanced sampling of the vapor-liquid transition using the generalized replica exchange method.

    Science.gov (United States)

    Lu, Qing; Kim, Jaegil; Straub, John E

    2013-03-14

    The generalized Replica Exchange Method (gREM) is extended into the isobaric-isothermal ensemble, and applied to simulate a vapor-liquid phase transition in Lennard-Jones fluids. Merging an optimally designed generalized ensemble sampling with replica exchange, gREM is particularly well suited for the effective simulation of first-order phase transitions characterized by "backbending" in the statistical temperature. While the metastable and unstable states in the vicinity of the first-order phase transition are masked by the enthalpy gap in temperature replica exchange method simulations, they are transformed into stable states through the parameterized effective sampling weights in gREM simulations, and join vapor and liquid phases with a succession of unimodal enthalpy distributions. The enhanced sampling across metastable and unstable states is achieved without the need to identify a "good" order parameter for biased sampling. We performed gREM simulations at various pressures below and near the critical pressure to examine the change in behavior of the vapor-liquid phase transition at different pressures. We observed a crossover from the first-order phase transition at low pressure, characterized by the backbending in the statistical temperature and the "kink" in the Gibbs free energy, to a continuous second-order phase transition near the critical pressure. The controlling mechanisms of nucleation and continuous phase transition are evident and the coexistence properties and phase diagram are found in agreement with literature results.
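
    Generalized-ensemble replica exchange replaces the Boltzmann factor in the usual swap test with parameterized sampling weights w_alpha(H). The snippet below is a generic sketch of such a swap under placeholder quadratic log-weights; it does not reproduce the specific gREM parameterization based on the statistical temperature.

```python
# Generic sketch of a replica-exchange swap between two generalized ensembles
# with sampling weights w_a(H) and w_b(H) (given here as log-weights). The
# particular effective-temperature parameterization used by gREM is not
# reproduced; the quadratic log-weights below are placeholders.
import math
import random

def log_w(alpha, H):
    """Placeholder log sampling weight for ensemble parameter alpha."""
    return -alpha * H - 0.001 * H ** 2

def swap_accepted(alpha_1, H_1, alpha_2, H_2):
    # detailed balance for exchanging configurations (enthalpies) between ensembles
    delta = (log_w(alpha_1, H_2) + log_w(alpha_2, H_1)
             - log_w(alpha_1, H_1) - log_w(alpha_2, H_2))
    return random.random() < math.exp(min(0.0, delta))

print(swap_accepted(0.9, -510.0, 1.1, -480.0))
```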

  7. Using the multi-objective optimization replica exchange Monte Carlo enhanced sampling method for protein-small molecule docking.

    Science.gov (United States)

    Wang, Hongrui; Liu, Hongwei; Cai, Leixin; Wang, Caixia; Lv, Qiang

    2017-07-10

    In this study, we extended the replica exchange Monte Carlo (REMC) sampling method to protein-small molecule docking conformational prediction using RosettaLigand. In contrast to the traditional Monte Carlo (MC) and REMC sampling methods, these methods use multi-objective optimization Pareto front information to facilitate the selection of replicas for exchange. The Pareto front information generated to select lower energy conformations as representative conformation structure replicas can facilitate the convergence of the available conformational space, including available near-native structures. Furthermore, our approach directly provides min-min scenario Pareto optimal solutions, as well as a hybrid of the min-min and max-min scenario Pareto optimal solutions with lower energy conformations for use as structure templates in the REMC sampling method. These methods were validated based on a thorough analysis of a benchmark data set containing 16 benchmark test cases. An in-depth comparison between MC, REMC, multi-objective optimization-REMC (MO-REMC), and hybrid MO-REMC (HMO-REMC) sampling methods was performed to illustrate the differences between the four conformational search strategies. Our findings demonstrate that the MO-REMC and HMO-REMC conformational sampling methods are powerful approaches for obtaining protein-small molecule docking conformational predictions based on the binding energy of complexes in RosettaLigand.
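
    The selection step relies on the Pareto front (the non-dominated set) of candidate conformations under several objectives to be minimized. The sketch below is a minimal, generic min-min Pareto filter on placeholder (energy, distance-like) scores; the objective functions are not RosettaLigand's.

```python
# Minimal min-min Pareto front filter: keep the conformations that are not
# dominated in any objective (lower is better for all objectives). Scores are
# placeholders, not actual RosettaLigand energy terms.
def pareto_front(points):
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

conformations = [(-12.1, 3.4), (-10.5, 2.1), (-12.5, 3.9), (-9.8, 2.0), (-11.0, 3.4)]
print(pareto_front(conformations))   # non-dominated (energy, distance-like) pairs
```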

  8. Application of carbon extraction replicas in grain-size measurements of high-strength steels using TEM

    International Nuclear Information System (INIS)

    Poorhaydari, Kioumars; Ivey, Douglas G.

    2007-01-01

    In this paper, the application of carbon extraction replicas in grain-size measurements is introduced and discussed. Modern high-strength microalloyed steels, used as structural or pipeline materials, have very small grains with substructures. Replicas examined in the transmission electron microscope can resolve the grain boundaries and can be used for systematic measurement of grain size in cases where the small size of the grains pushes the limits of resolution of conventional optical microscopes. The grain-size variations obtained from replicas are compared with those obtained from optical and scanning electron microscopy. An emphasis is placed on the importance of using the correct technique for imaging and the optimal magnification. Grain-size measurements are used for estimation of the grain-boundary strengthening contribution to yield strength. The variation in grain size is also correlated with hardness in the base metal of several microalloyed steels, as well as in the fine-grained heat-affected zone of a weld structure produced with several heat inputs
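
    Converting a measured grain size into a grain-boundary strengthening contribution is conventionally done with the Hall-Petch relation, sigma_y = sigma_0 + k_y * d^(-1/2). The sketch below evaluates that relation with generic, textbook-order constants for a ferritic steel, which are assumptions rather than coefficients from this paper.

```python
# Hall-Petch estimate of the grain-boundary strengthening contribution.
# sigma_0 and k_y below are generic, textbook-order values for ferritic steel,
# not coefficients taken from the paper.
import math

def hall_petch(d_um, sigma_0=70.0, k_y=600.0):
    """Yield strength (MPa) from grain size d in micrometres; k_y in MPa*um^0.5."""
    return sigma_0 + k_y / math.sqrt(d_um)

for d in (20.0, 5.0, 2.0):   # coarse vs fine grains
    print(f"d = {d:4.1f} um -> sigma_y ~ {hall_petch(d):6.1f} MPa")
```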

  9. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: these address the problems of management of evolution, and of overview, comprehension and navigation, respectively. The first main contribution is the Architectural Software Configuration...... Management Model: A software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture form the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: A visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  10. Advanced Ground Systems Maintenance Enterprise Architecture Project

    Science.gov (United States)

    Perotti, Jose M. (Compiler)

    2015-01-01

    The project implements an architecture for delivery of integrated health management capabilities for the 21st Century launch complex. The delivered capabilities include anomaly detection, fault isolation, prognostics and physics-based diagnostics.

  11. An Enterprise Information System Data Architecture Guide

    National Research Council Canada - National Science Library

    Lewis, Grace

    2001-01-01

    Data architecture defines how data is stored, managed, and used in a system. It establishes common guidelines for data operations that make it possible to predict, model, gauge, or control the flow of data in the system...

  12. CASPER: Embedding Power Estimation and Hardware-Controlled Power Management in a Cycle-Accurate Micro-Architecture Simulation Platform for Many-Core Multi-Threading Heterogeneous Processors

    Directory of Open Access Journals (Sweden)

    Arun Ravindran

    2012-02-01

    Despite the promising performance improvements observed in emerging many-core architectures in high-performance processors, high power consumption prohibitively affects their use and marketability in low-energy sectors, such as embedded processors, network processors and application-specific instruction processors (ASIPs). While most chip architects design power-efficient processors by finding an optimal power-performance balance in their design, some use sophisticated on-chip autonomous power management units, which dynamically reduce the voltage or frequencies of idle cores and hence extend battery life and reduce operating costs. For large-scale designs of many-core processors, a holistic approach integrating both these techniques at different levels of abstraction can potentially achieve maximal power savings. In this paper we present CASPER, a robust instruction-trace-driven, cycle-accurate, many-core, multi-threading micro-architecture simulation platform in which we have incorporated power estimation models for a wide variety of tunable many-core micro-architectural design parameters, thus enabling processor architects to explore a sufficiently large design space and achieve power-efficient designs. Additionally, CASPER is designed to accommodate cycle-accurate models of hardware-controlled power management units, enabling architects to experiment with and evaluate different autonomous power-saving mechanisms to study the run-time power-performance trade-offs in embedded many-core processors. We have implemented two such techniques in CASPER: Chipwide Dynamic Voltage and Frequency Scaling, and Performance-Aware Core-Specific Frequency Scaling, which show average power savings of 35.9% and 26.2% on a baseline 4-core SPARC-based architecture, respectively. This power-saving data accounts for the power consumption of the power management units themselves. The CASPER simulation platform also provides users with complete support of SPARCV9
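
    Both policies act by scaling core frequency (and voltage) with observed activity, with dynamic power following roughly P proportional to C*V^2*f. The toy policy below illustrates the idea only; the frequency levels, utilization mapping, and power model are assumptions and do not reproduce CASPER's hardware-controlled units.

```python
# Toy per-core frequency-scaling policy: lower the frequency (and, implicitly,
# the voltage) of under-utilized cores. Dynamic power is modeled as
# P ~ C * V^2 * f with V assumed proportional to f. Purely illustrative; this
# is not CASPER's hardware-controlled power management.
F_LEVELS = [0.4, 0.8, 1.2, 1.6, 2.0]   # GHz steps (hypothetical)

def pick_frequency(utilization):
    """Map a core's recent utilization (0..1) to a frequency step."""
    idx = min(int(utilization * len(F_LEVELS)), len(F_LEVELS) - 1)
    return F_LEVELS[idx]

def dynamic_power(freq_ghz, c_eff=1.0, v_per_ghz=0.5):
    volt = v_per_ghz * freq_ghz
    return c_eff * volt ** 2 * freq_ghz   # arbitrary units

utilizations = [0.95, 0.10, 0.55, 0.30]  # recent utilization of four cores
for core, u in enumerate(utilizations):
    f = pick_frequency(u)
    print(f"core {core}: util={u:.2f} -> {f:.1f} GHz, P~{dynamic_power(f):.2f}")
```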

  13. Architectural Design of a LMS with LTSA-Conformance

    Science.gov (United States)

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper illustrates an approach for architectural design of a Learning Management System (LMS), which is verifiable against the Learning Technology System Architecture (LTSA) conformance rules. We introduce a new method for software architectural design that extends the Unified Modeling Language (UML) component diagram with the formal…

  14. LTSA Conformance Testing to Architectural Design of LMS Using Ontology

    Science.gov (United States)

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…

  15. A COMPARATIVE STUDY OF SYSTEM NETWORK ARCHITECTURE Vs DIGITAL NETWORK ARCHITECTURE

    OpenAIRE

    Seema; Mukesh Arya

    2011-01-01

    Efficient management of resources is mandatory for the successful running of any network. This paper describes two of the most popular network architectures: System Network Architecture (SNA), developed by IBM, and Digital Network Architecture (DNA). Network standards and protocols are needed by network developers as well as users. Some standards are the IEEE 802.3 standards (The Institute of Electrical and Electronics Engineers, 1980) (LAN), IBM Sta...

  16. Communications Architecture Recommendations to Enable Joint Vision 2020

    National Research Council Canada - National Science Library

    Armstrong, R. B

    2003-01-01

    The Mission Information Management (MIM) Communications Architecture provides a framework to develop an integrated space, air, and terrestrial communications network that supports all national security users...

  17. FTIR study of ageing of fast drying oil colour (FDOC) alkyd paint replicas

    Science.gov (United States)

    Duce, Celia; Della Porta, Valentina; Tiné, Maria Rosaria; Spepi, Alessio; Ghezzi, Lisa; Colombini, Maria Perla; Bramanti, Emilia

    2014-09-01

    We propose ATR-FTIR spectroscopy for the characterization of the spectral changes in alkyd resin from the Griffin Alkyd Fast Drying Oil Colour range (Winsor & Newton), occurring over 550 days (˜18 months) of natural ageing and over six months of artificial ageing under an acetic acid atmosphere. Acetic acid is one of the atmospheric pollutants found inside museums in concentrations that can have a significant effect on the works exhibited. During natural ageing we observed an increase and broadening of the OH group band around 3300 cm-1 and an increase in bands in the region 1730-1680 cm-1 due to carbonyl stretching. We found a broad band around 1635 cm-1, likely due to C=O stretching vibrations of β-diketones. These spectral changes are the result of autooxidation and crosslinking reactions during natural ageing, which form alcohols and carbonyl species. The increase in absorbance at 1635 cm-1 was selected as a parameter to monitor the ageing process of paintings prepared with FDOC, without the need for any extractive procedure. FTIR spectra of paint replicas kept under an acetic acid atmosphere indicated the chemical groups involved in the reaction with the acid, thus suggesting which spectral FTIR regions could be investigated in order to follow degradation in real paintings. A red paint sample from a hyper-realistic artwork (“Racconta storie”, 2003) by the Italian painter Patrizia Zara was investigated by FTIR in order to evaluate the effects of 10 years of natural ageing on alkyd colours. The results obtained suggest that after the end of chemical drying (autooxidation), alkyd colours are very stable.

  18. Secure Service Oriented Architectures (SOA) Supporting NEC [Architecture orientée service (SOA) gérant la NEC]

    NARCIS (Netherlands)

    Meiler, P.P.; Schmeing, M.

    2009-01-01

    Combined scenario ; Data management ; Data processing ; Demonstrator ; Information systems ; Integrated systems ; Interoperability ; Joint scenario ; Network Enabled Capability (NEC) ; Operational effectiveness ; Operations research ; Scenarios ; Secure communication ; Service Oriented Architecture

  19. EPICS architecture

    International Nuclear Information System (INIS)

    Dalesio, L.R.; Kozubal, A.J.; Kraimer, M.R.

    1992-01-01

    The Experimental Physics and Industrial Control System (EPICS) provides control and data acquisition for the experimental physics community. Because the capabilities required by the experimental physics community for control were not available through industry, we began the design and implementation of EPICS. It is a distributed process control system built on a software communication bus. The functional subsystems, which provide data acquisition, supervisory control, closed loop control, archiving, and alarm management, greatly reduce the need for programming. Sequential control is provided through a sequential control language, allowing the implementer to express state diagrams easily. Data analysis of the archived data is provided through an interactive tool. The timing system provides distributed synchronization for control and time stamped data for data correlation across nodes in the network. The system is scalable from a single test station with a low channel count to a large distributed network with thousands of channels. The functions provided to the physics applications have proven helpful to the experiments while greatly reducing the time to deliver controls. (author)

  20. Medical Data Architecture Project Status

    Science.gov (United States)

    Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2018-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight to medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.

  1. Modeling Architectural Patterns’ Behavior Using Architectural Primitives

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Architectural patterns have an impact on both the structure and the behavior of a system at the architecture design level. However, it is challenging to model patterns’ behavior in a systematic way because modeling languages do not provide the appropriate abstractions and because each pattern

  2. Religious architecture: anthropological perspectives

    NARCIS (Netherlands)

    Verkaaik, O.

    2013-01-01

    Religious Architecture: Anthropological Perspectives develops an anthropological perspective on modern religious architecture, including mosques, churches and synagogues. Borrowing from a range of theoretical perspectives on space-making and material religion, this volume looks at how religious

  3. Avionics Architecture for Exploration

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics Architectures for Exploration (AAE) project is to develop a reference architecture that is based on standards and that can be scaled and...

  4. RATS: Reactive Architectures

    National Research Council Canada - National Science Library

    Christensen, Marc

    2004-01-01

    This project had two goals: To build an emulation prototype board for a tiled architecture and to demonstrate the utility of a global inter-chip free-space photonic interconnection fabric for polymorphous computer architectures (PCA...

  5. Rhein-Ruhr architecture

    DEFF Research Database (Denmark)

    2002-01-01

    Catalogue for the exhibition 'Rhein - Ruhr architecture', Meldahls smedie, 15 March - 28 April 2002. 99 pages.

  6. Architecture and Film

    OpenAIRE

    Mohammad Javaheri, Saharnaz

    2016-01-01

    Film does not exist without architecture. In every movie that has ever been made throughout history, the cinematic image of architecture is embedded within the picture. Throughout my studies and research, I began to see that there is no director who can consciously or unconsciously deny the use of architectural elements in his or her movies. Architecture offers a strong profile to distinguish characters and story. In the early days, films were shot in streets surrounde...

  7. Elements of Architecture

    DEFF Research Database (Denmark)

    Elements of Architecture explores new ways of engaging architecture in archaeology. It conceives of architecture both as the physical evidence of past societies and as existing beyond the physical environment, considering how people in the past have not just dwelled in buildings but have existed...

  8. PLM support to architecture based development

    DEFF Research Database (Denmark)

    Bruun, Hans Peter Lomholt

    ... organisation, processes, etc. Identifying, evaluating, and aligning aspects of these domains is necessary for developing the optimal layout of product architectures. It is stated in this thesis that architectures describe building principles for products, product families, and product programs, where this project... Developing architectures can be difficult to manage, update, and maintain during development. The concept of representing product architectures in computer-based product information tools has nevertheless been central in this research and in the creation of its results. A standard PLM tool (Windchill PDMLink... architectures in computer systems. The presented results build on research literature and experiences from industrial partners. Verification of the theoretical contributions, approaches, models, and tools has been carried out in industrial projects, with promising results. This thesis describes the means for: (1...

  9. Replica symmetry breaking solution for two-sublattice fermionic Ising spin glass models in a transverse field

    International Nuclear Information System (INIS)

    Zimmer, F.M.; Magalhaes, S.G.

    2007-01-01

    The one-step replica symmetry breaking is used to study the competition between spin glass (SG) and antiferromagnetic (AF) order in two-sublattice fermionic Ising SG models in the presence of a transverse magnetic field Γ and a parallel magnetic field H. Inter- and intra-sublattice exchange interactions following Gaussian distributions are considered. The problem is formulated in a Grassmann path integral formalism within the static ansatz. Results show that H favors the non-ergodic mixed phase (AF+SG) and destroys the AF order. The field Γ suppresses the magnetic orders, and the intra-sublattice interaction can introduce a discontinuous phase transition.
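    For orientation only (not quoted from the record, and with conventions assumed rather than taken from the paper), a two-sublattice Ising spin glass in transverse and parallel fields is commonly written along the following lines, with Gaussian-distributed inter- and intra-sublattice couplings:

        % Illustrative Hamiltonian; the notation is assumed, not the paper's own.
        \begin{equation}
          \mathcal{H} = -\sum_{i \in A,\; j \in B} J_{ij}\, S_i^{z} S_j^{z}
                        \;-\; \sum_{p \in \{A,B\}} \sum_{i<j \in p} J_{ij}^{(p)}\, S_i^{z} S_j^{z}
                        \;-\; \Gamma \sum_{i} S_i^{x}
                        \;-\; H \sum_{i} S_i^{z},
        \end{equation}
        % where J_{ij} and J_{ij}^{(p)} are independent Gaussian random couplings
        % (inter- and intra-sublattice, respectively).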

  10. The use of extraction and electronic diffraction replicas for precipitates characterization in welded Cr-Mo Steels

    International Nuclear Information System (INIS)

    Gutierrez de Saiz-Solabarria, S.; San Juan Nunez, J.M.

    1997-01-01

    The precipitates and phases found in the structure of welded joints of heat-exchanger tubes were studied and identified. The base material satisfied the requirements of ASME Sec. II, SA 213 Gr. T22 (2 1/4 Cr 1 Mo). The filler metal compositions were 2 1/4 Cr 1 Mo and 2 1/4 Cr 1 Mo 1/4 Nb. The chemical compositions of the base and weld materials were analyzed by atomic emission spectroscopy with high-vacuum electric discharge and by inductively coupled plasma. Extraction replicas and electron diffraction were used to characterize the constituents. (Author) 65 refs.

  11. Vital architecture, slow momentum policy

    DEFF Research Database (Denmark)

    Braae, Ellen Marie

    2010-01-01

    A reflection on the relation between Danish landscape architecture policy and the statements made through current landscape architectural projects.

  12. Exporting Humanist Architecture

    DEFF Research Database (Denmark)

    Nielsen, Tom

    2016-01-01

    The article is a chapter in the catalogue for the Danish exhibition at the 2016 Architecture Biennale in Venice. The catalogue is conceived as an independent book exploring the theme Art of Many - The Right to Space. The chapter is an essay in this anthology tracing and discussing the different... values and ethical stances involved in the export of Danish architecture. Abstract: Danish architecture has, in a sense, been driven by an unwritten contract between the architects and the democratic state and its institutions. This contract may be viewed as an ethos – an architectural tradition... with inherent aesthetic and moral values. Today, however, Danish architecture is also an export commodity. That raises questions which should be debated as openly as possible. What does it mean for architecture and architects to practice in cultures and under political systems that do not use architecture...

  13. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison, and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  14. The BWS Open Business Enterprise System Architecture

    Directory of Open Access Journals (Sweden)

    Cristian IONITA

    2011-01-01

    Business process management systems play a central role in supporting the business operations of medium and large organizations. This paper analyses the properties of current business enterprise systems and proposes a new application type called Open Business Enterprise System. A new open system architecture called Business Workflow System is proposed. This architecture combines instruments for flexible data management, business process management, and integration into a flexible system able to manage modern business operations. The architecture was validated by implementing it in the DocuMentor platform used by major companies in Romania and the US. These implementations provided the data needed to create and refine an enterprise integration methodology called DMCPI. The final section of the paper presents the concepts, stages, and techniques employed by the methodology.
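    As a purely illustrative aside (not drawn from the record; all names are hypothetical), the business-process-management core of such a system can be sketched as a state machine over named process steps:

        from dataclasses import dataclass, field
        from typing import Dict, List

        # Hypothetical sketch of a minimal workflow core: a process definition is
        # an ordered list of steps, and an instance tracks its current position.

        @dataclass
        class ProcessDefinition:
            name: str
            steps: List[str]          # e.g. ["draft", "review", "approve"]

        @dataclass
        class ProcessInstance:
            definition: ProcessDefinition
            current: int = 0
            data: Dict[str, str] = field(default_factory=dict)

            def advance(self) -> str:
                """Move to the next step and return its name."""
                if self.current + 1 >= len(self.definition.steps):
                    raise RuntimeError("process already completed")
                self.current += 1
                return self.definition.steps[self.current]

        invoice = ProcessDefinition("invoice-approval", ["draft", "review", "approve"])
        run = ProcessInstance(invoice, data={"amount": "1200 EUR"})
        print(run.advance())   # -> "review"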

  15. Information management

    Science.gov (United States)

    Ricks, Wendell; Corker, Kevin

    1990-01-01

    Research on Primary Flight Display (PFD) information management and cockpit display of information management is presented in viewgraph form. Topics covered include the information management problem in the cockpit, information management burdens, the key characteristics of an information manager, the interface management system that handles the flow of information and the dialogs between the system and the pilot, and the overall system architecture.

  16. Big data in cloud : a data architecture

    OpenAIRE

    Sá, Jorge Vaz de Oliveira e; Martins, César Silva; Simões, Paulo

    2015-01-01

    Nowadays, organizations have at their disposal a large volume of data of a wide variety of types. Technology-driven organizations want to capture, process, and analyze this data at high velocity in order to better understand and manage their customers, their operations, and their business processes. As data volume and variety increase and faster analytic results are needed, the demands placed on the data architecture grow. This data architecture should enable collecting,...
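    Purely as a sketch of the collect-process-analyze flow the record alludes to (none of these names come from the paper), the stages can be seen as composable functions over a batch of records:

        from typing import Dict, Iterable, List

        Record = Dict[str, object]

        def collect() -> List[Record]:
            # Stand-in for ingestion from heterogeneous sources.
            return [{"source": "web", "value": 3}, {"source": "sensor", "value": 7}]

        def process(batch: Iterable[Record]) -> List[Record]:
            # Normalize the incoming batch (here: coerce values to float).
            return [{**r, "value": float(r["value"])} for r in batch]

        def analyze(batch: Iterable[Record]) -> float:
            # A trivial aggregate standing in for "fast analytic results".
            values = [r["value"] for r in batch]
            return sum(values) / len(values)

        print(analyze(process(collect())))   # -> 5.0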

  17. Incorporating enterprise strategic plans into enterprise architecture

    NARCIS (Netherlands)

    Lins Borges Azevedo, Carlos

    2017-01-01

    In the last years, information technology (IT) executives have identified IT-business strategic alignment as a top management concern. In the information technology area, emphasis has been given to the Enterprise Architecture (EA) discipline with respect to enterprise management. The focus of the

  18. Virtual Sensor Web Architecture

    Science.gov (United States)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation-processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model-interaction management capabilities providing closed-loop control of collection resources driven by competing targeted-observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework), which is being extended to create VSICS.
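    Purely to illustrate the event-driven workflow execution idea mentioned in the record (this is not VSICS code; every name is hypothetical), handlers can be registered per event type and fired when an observation event arrives:

        from collections import defaultdict
        from typing import Callable, DefaultDict, Dict, List

        Event = Dict[str, str]
        Handler = Callable[[Event], None]

        class EventBus:
            """Hypothetical event bus triggering workflows from observation events."""

            def __init__(self) -> None:
                self._handlers: DefaultDict[str, List[Handler]] = defaultdict(list)

            def subscribe(self, event_type: str, handler: Handler) -> None:
                self._handlers[event_type].append(handler)

            def publish(self, event: Event) -> None:
                for handler in self._handlers[event["type"]]:
                    handler(event)

        def run_flare_workflow(event: Event) -> None:
            # Stand-in for composing and executing a model-based processing workflow.
            print(f"tasking follow-up observation for region {event['region']}")

        bus = EventBus()
        bus.subscribe("solar_flare_detected", run_flare_workflow)
        bus.publish({"type": "solar_flare_detected", "region": "AR1234"})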

  19. Effect of boundary conditions on the strength and deformability of replicas of natural fractures in welded tuff: Comparison between predicted and observed shear behavior using a graphical method

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.; Robertson, A.B.

    1993-09-01

    Four series of cyclic direct-shear experiments were conducted on several replicas of three natural fractures and a laboratory-developed tensile fracture of welded tuff from Yucca Mountain to test the graphical load-displacement analysis method proposed by Saeb (1989) and Amadei and Saeb (1990). Based on the results of shear tests conducted on several joint replicas under different levels of constant normal load ranging between 0.6 and 25.6 kips (2.7 and 113.9 kN), the shear behavior of the joint replicas under constant normal stiffness ranging between 14.8 and 187.5 kips/in. (25.9 and 328.1 kN/cm) was predicted using the graphical method. The predictions were compared to the results of actual shear tests conducted for the same range of constant normal stiffness. In general, good agreement was found between the predicted and the observed shear behavior.
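    For context, and under assumed notation rather than the report's own, the constant-normal-stiffness boundary condition couples the normal load on the joint to the dilation accumulated during shearing, which is what allows constant-normal-load results to be mapped graphically onto constant-stiffness load paths:

        % Assumed notation: N is the normal load, N_0 its initial value, K_n the
        % external normal stiffness, u the shear displacement, and \delta v the
        % joint dilation; a constant-normal-load test is the limit K_n = 0.
        \begin{equation}
          N(u) \;=\; N_0 \;+\; K_n\, \delta v(u, N)
        \end{equation}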

  20. Effect of boundary conditions on the strength and deformability of replicas of natural fractures in welded tuff: Data analysis

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.

    1994-04-01

    Assessing the shear behavior of intact rock and rock fractures is an important issue in the design of a potential nuclear waste repository at Yucca Mountain, Nevada. Cyclic direct-shear experiments were conducted on replicas of three natural fractures and a laboratory-developed tensile fracture of welded tuff. The tests were carried out under constant normal loads or constant normal stiffnesses with different initial normal load levels. Each test consisted of five cycles of forward and reverse shear motion. Based on the results of the shear tests conducted under constant normal load, the shear behavior of the joint replicas tested under constant normal stiffness was predicted using the graphical analysis method of Saeb (1989) and Amadei and Saeb (1990). A comparison between the predictions and the actual constant-stiffness direct-shear experiment results can be found in a report by Wibowo et al. (1993b). Results of the constant normal load shear experiments are analyzed using several constitutive models proposed in the rock mechanics literature for joint shear strength, dilatancy, and joint surface damage. It is shown that some of the existing models have limitations. New constitutive models are proposed and are included in a mathematical analysis tool that can be used to predict joint behavior under various boundary conditions.
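    As one representative example of the class of joint shear-strength models the record refers to (the specific models adopted in the report are not reproduced here), Barton's empirical criterion relates peak shear strength to normal stress, joint roughness, and joint wall strength:

        % Barton's empirical peak shear strength criterion, shown only as a
        % representative constitutive model: JRC is the joint roughness
        % coefficient, JCS the joint wall compressive strength, \sigma_n the
        % normal stress, and \phi_r the residual friction angle.
        \begin{equation}
          \tau_p \;=\; \sigma_n \tan\!\left[\mathrm{JRC}\,\log_{10}\!\left(\frac{\mathrm{JCS}}{\sigma_n}\right) + \phi_r\right]
        \end{equation}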