WorldWideScience

Sample records for replica management architecture

  1. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide-area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated, and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented.

  2. File-based replica management

    CERN Document Server

    Kunszt, Peter Z; Stockinger, Heinz; Stockinger, Kurt

    2005-01-01

    Data replication is one of the best known strategies to achieve high levels of availability and fault tolerance, as well as minimal access times for large, distributed user communities using a world-wide Data Grid. In certain scientific application domains, the data volume can reach the order of several petabytes; in these domains, data replication and access optimization play an important role in the manageability and usability of the Grid. In this paper, we present the design and implementation of a replica management Grid middleware that was developed within the EDG project [European Data Grid Project (EDG), http://www.eu-egee.org] and is designed to be extensible so that user communities can adjust its detailed behavior according to their QoS requirements.

  3. A Novel Buffer Management Architecture for Epidemic Routing in Delay Tolerant Networks (DTNs)

    KAUST Repository

    Elwhishi, Ahmed; Ho, Pin-Han; Naik, K.; Shihada, Basem

    2010-01-01

    Delay tolerant networks (DTNs) are wireless networks in which an end-to-end path for a given node pair may not exist for extended periods. Epidemic routing has been reported as a viable approach, launching multiple message replicas in order to increase the message delivery ratio and reduce message delivery delay. This advantage, nonetheless, comes at the expense of additional buffer space at each node, and the combination of custody and replication entails high buffer and bandwidth overhead. This paper investigates a new buffer management architecture for epidemic routing in DTNs, which helps each node decide which message should be forwarded or dropped. The proposed architecture is characterized by a suite of novel functional modules, including the Summary Vector Exchange Module (SVEM), Network State Estimation Module (NSEM), and Utility Calculation Module (UCM). Extensive simulation results show that the proposed architecture achieves superior performance against its counterparts in terms of delivery ratio and delivery delay.
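
The abstract does not specify the drop policy computed by the UCM, so the following is only a hedged sketch of the general idea: a buffer that, when full, evicts the message with the lowest utility score. The class name, method names, and the notion of a scalar per-message utility are assumptions for illustration, not details from the paper.

```python
class DtnBuffer:
    """Toy DTN buffer: each message carries a utility score
    (hypothetically produced by something like the paper's UCM).
    When capacity is exceeded, the lowest-utility message is dropped."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.messages = {}  # msg_id -> utility score

    def insert(self, msg_id, utility):
        """Store a message; if the buffer overflows, drop the message
        with the lowest utility and return its id (else None)."""
        self.messages[msg_id] = utility
        if len(self.messages) > self.capacity:
            victim = min(self.messages, key=self.messages.get)
            del self.messages[victim]
            return victim
        return None
```

For example, with capacity 2, inserting messages with utilities 0.9, 0.2, and 0.5 evicts the 0.2 message, keeping the two most valuable replicas in the buffer.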

  5. Implementation and performance analysis of the LHCb LFC replica using Oracle streams technology

    CERN Document Server

    Düllmann, D; Martelli, B; Peco, G; Bonifazzi, F; Da Fonte Perez, E; Baranowski, Z; Vagnoni, V

    2007-01-01

    The presentation will describe the architecture and the deployment of the LHCb read-only File Catalogue for the LHC Computing Grid (LFC) replica implemented at the Italian INFN National Centre for Telematics and Informatics (CNAF), and evaluate a series of tests on the LFC with replica. The LHCb computing model foresees the replication of the central LFC database in every Tier-1, in order to assure more scalability and fault tolerance to LHCb applications. Scientific data-intensive applications use a large collection of files for storing data; in particular, in the HEP community, data generated by large detectors will be managed and stored using databases. The intensive access to information stored in databases by Grid computing applications requires distributed database replication in order to guarantee scalability and, in case of failure, redundancy. Besides, the results of the tests will be an important reference for all Grid users. This talk will describe the replica implementation of L...

  6. Creating technical heritage object replicas in a virtual environment

    Science.gov (United States)

    Egorova, Olga; Shcherbinin, Dmitry

    2016-03-01

    The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. By performing 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects, the process of learning is supported while promoting the preservation of the replicas for future generations. Modern approaches based on the wide usage of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.

  7. Data Sets Replicas Placements Strategy from Cost-Effective View in the Cloud

    Directory of Open Access Journals (Sweden)

    Xiuguo Wu

    2016-01-01

    Replication technology is commonly used to improve data availability and reduce data access latency in cloud storage systems by providing users with different replicas of the same service. Most current approaches largely focus on system performance improvement, neglecting management cost when deciding the number of replicas and their storage locations, which causes a great financial burden for cloud users: in a pay-as-you-go paradigm, the cost of replica storage and consistency maintenance can lead to high overhead as the number of new replicas increases. In this paper, towards achieving an approximate minimum data set management cost benchmark in a practical manner, we propose a replica placement strategy from a cost-effective view, with the premise that system performance meets requirements. First, we design data set management cost models, including storage cost and transfer cost. Second, we use the access frequency and the average response time to decide which data set should be replicated. Then, a method of calculating the number of replicas and their storage locations with minimum management cost is proposed based on a location problem graph. Both theoretical analysis and simulations show that the proposed strategy offers the benefit of lower management cost with fewer replicas.
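
The paper's actual cost models and location-problem-graph method are not given in the abstract; as a hedged sketch of the storage-plus-transfer trade-off it describes, the following toy code (all names, the unit transfer cost, and the exhaustive search are illustrative assumptions) evaluates candidate replica sets and keeps the cheapest:

```python
from itertools import combinations

def management_cost(replica_sites, access_freq, dist, storage_cost):
    """Total cost = storage cost of each replica site
    + transfer cost: each access is served by its nearest replica,
    at an assumed unit transfer cost per unit distance."""
    storage = sum(storage_cost[s] for s in replica_sites)
    transfer = sum(freq * min(dist[u][s] for s in replica_sites)
                   for u, freq in access_freq.items())
    return storage + transfer

def best_placement(sites, access_freq, dist, storage_cost):
    """Exhaustively choose the replica set with minimum management cost
    (fine for toy instances; the paper proposes a cheaper graph method)."""
    best = None
    for k in range(1, len(sites) + 1):
        for cand in combinations(sites, k):
            c = management_cost(cand, access_freq, dist, storage_cost)
            if best is None or c < best[1]:
                best = (cand, c)
    return best
```

Note how adding a replica trades higher storage cost for lower aggregate transfer cost, which is exactly the balance the strategy above optimizes.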

  8. Replica methods for loopy sparse random graphs

    International Nuclear Information System (INIS)

    Coolen, ACC

    2016-01-01

    I report on the development of a novel statistical mechanical formalism for the analysis of random graphs with many short loops, and of processes on such graphs. The graphs are defined via maximum entropy ensembles, in which both the degrees (via hard constraints) and the adjacency matrix spectrum (via a soft constraint) are prescribed. The sum over graphs can be done analytically, using a replica formalism with complex replica dimensions. All known results for tree-like graphs are recovered in a suitable limit. For loopy graphs, the emerging theory has an appealing and intuitive structure, suggests how message passing algorithms should be adapted, and indicates the structure of theories describing spin systems on loopy architectures. However, the formalism is still largely untested, and may require further adjustment and refinement.

  9. Enterprise architecture management

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Gøtze, John; Møller, Charles

    2017-01-01

    Despite the growing interest in enterprise architecture management, researchers and practitioners lack a shared understanding of its applications in organizations. Building on findings from a literature review and eight case studies, we develop a taxonomy that categorizes applications of enterprise architecture management based on three classes of enterprise architecture scope. Organizations may adopt enterprise architecture management to help form, plan, and implement IT strategies; to help plan and implement business strategies; or to further complement the business strategy-formation process. The findings challenge the traditional IT-centric view of enterprise architecture management application and suggest enterprise architecture management as an approach that could support the consistent design and evolution of an organization as a whole.

  11. Fast Optimal Replica Placement with Exhaustive Search Using Dynamically Reconfigurable Processor

    Directory of Open Access Journals (Sweden)

    Hidetoshi Takeshita

    2011-01-01

    This paper proposes a new replica placement algorithm that expands the exhaustive search limit with reasonable calculation time. It combines a new type of parallel data-flow processor with an architecture tuned for fast calculation. The replica placement problem is to find a replica-server set satisfying service constraints in a content delivery network (CDN). It is derived from the set cover problem, which is known to be NP-hard. It is impractical to use exhaustive search to obtain optimal replica placement in large-scale networks, because calculation time increases with the number of combinations. To reduce calculation time, heuristic algorithms have been proposed, but no heuristic algorithm is assured of finding the optimal solution. The proposed algorithm suits parallel processing and pipeline execution and is implemented on DAPDNA-2, a dynamically reconfigurable processor. Experiments show that the proposed algorithm expands the exhaustive search limit by a factor of 18.8 compared to the conventional algorithm running on a von Neumann-type processor.
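
The set-cover formulation mentioned above can be made concrete with a hedged sketch (names and the coverage model are illustrative, not from the paper): find the smallest set of candidate replica servers whose combined coverage reaches every client, by brute force over subsets of increasing size. The exponential cost of this loop is precisely what motivates the paper's hardware acceleration.

```python
from itertools import combinations

def min_replica_cover(servers, coverage):
    """Smallest server subset whose combined coverage spans all clients.
    coverage maps server -> set of clients it can serve within the
    (assumed) service constraint. Brute force, smallest subsets first:
    exponential in the number of servers in the worst case."""
    clients = set().union(*coverage.values())
    for k in range(1, len(servers) + 1):
        for cand in combinations(servers, k):
            if set().union(*(coverage[s] for s in cand)) >= clients:
                return cand
    return None
```

Because subsets are enumerated smallest-first, the first covering subset found is guaranteed optimal in size, which is what heuristic placement algorithms cannot promise.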

  12. Validation of the replica trick for simple models

    Science.gov (United States)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
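
The replica analytic continuation discussed above is commonly written in its standard textbook form (this is the generic identity, not a formula taken from this particular paper):

```latex
\langle \ln Z \rangle
  \;=\; \lim_{n \to 0} \frac{\langle Z^n \rangle - 1}{n}
  \;=\; \lim_{n \to 0} \frac{\partial}{\partial n} \ln \langle Z^n \rangle ,
```

where the disorder average ⟨Z^n⟩ is first computed for integer n, interpreted as n coupled copies ("replicas") of the system, and the result is then analytically continued to real n near zero. The validity of this continuation is exactly what the study examines on solvable models.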

  13. Architecture for Data Management

    OpenAIRE

    Vukolic, Marko

    2015-01-01

    In this document we present the preliminary architecture of the SUPERCLOUD data management and storage. We start by defining the design requirements of the architecture, motivated by use cases and then review the state-of-the-art. We survey security and dependability technologies and discuss designs for the overall unifying architecture for data management that serves as an umbrella for different security and dependability data management features. Specifically the document lays out the archi...

  14. Hyper-V Replica essentials

    CERN Document Server

    Krstevski, Vangel

    2013-01-01

    Hyper-V Replica Essentials covers Hyper-V Replica in various deployment scenarios. It is for Windows Server administrators who want to improve their system availability and speed up disaster recovery. You will need experience in Hyper-V deployment, because Hyper-V Replica is built into the Hyper-V platform.

  15. A Validation Study of the Impression Replica Technique.

    Science.gov (United States)

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

    To validate the well-known and often-used impression replica technique for measuring fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated by the measuring instrument. Calculations were made for the exact values of three gaps between the male and female. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values received from measuring the specimen were then compared with the values received from the impression replicas, and the technique was thereby validated. The impression replica technique overvalued all measured gaps. Depending on location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  16. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...

  17. Beyond Virtual Replicas: 3D Modeling and Maltese Prehistoric Architecture

    Directory of Open Access Journals (Sweden)

    Filippo Stanco

    2013-01-01

    In the past decade, computer graphics have become strategic for the development of projects aimed at the interpretation of archaeological evidence and the dissemination of scientific results to the public. Among all the solutions available, the use of 3D models is particularly relevant for the reconstruction of poorly preserved sites and monuments destroyed by natural causes or human actions. These digital replicas are, at the same time, a virtual environment that can be used as a tool for the interpretative hypotheses of archaeologists and an effective medium for a visual description of the cultural heritage. In this paper, the innovative methodology, aims, and outcomes of a virtual reconstruction of the Borg in-Nadur megalithic temple, carried out by the Archeomatica Project of the University of Catania, are offered as a case study for a virtual archaeology of prehistoric Malta.

  18. An Architectural Model for Intelligent Network Management

    Institute of Scientific and Technical Information of China (English)

    罗军舟; 顾冠群; 费翔

    2000-01-01

    The traditional network management approach involves managing each vendor's equipment and network segment in isolation through its own proprietary element management system. It is necessary to set up a new network management architecture that allows operation consolidation across vendor and technology boundaries. In this paper, an architectural model for Intelligent Network Management (INM) is presented. The INM system includes a manager system, which controls all subsystems and coordinates different management tasks; an expert system, which is responsible for handling particularly difficult problems; and intelligent agents, which bring management closer to applications and user requirements by spreading intelligent agents through network segments or domains. Within the proposed expert system model, an intelligent fault management system is presented in particular. The architectural model is intended to build the INM system to meet the needs of managing modern network systems.

  19. Modeling Vocal Fold Intravascular Flow using Synthetic Replicas

    Science.gov (United States)

    Terry, Aaron D.; Ricks, Matthew T.; Thomson, Scott L.

    2017-11-01

    Vocal fold vibration that is induced by air flowing from the lungs is believed to decrease blood flow through the vocal folds. This is important due to the critical role of blood flow in maintaining tissue health. However, the precise mechanical relationships between vocal fold vibration and blood perfusion remain understudied. A platform for studying liquid perfusion in a synthetic, life-size, self-oscillating vocal fold replica has recently been developed. The replicas are fabricated using molded silicone with material properties comparable to those of human vocal fold tissues, and they include embedded microchannels through which liquid is perfused. The replicas are mounted on an air flow supply tube to initiate flow-induced vibration. A liquid reservoir is attached to the microchannel to cause liquid to perfuse through the replica in the anterior-posterior direction. As replica vibration is initiated and its amplitude increases, the perfusion flow rate decreases. In this presentation, the replica design will be presented, along with data quantifying the relationships between parameters such as replica vibration amplitude, stiffness, microchannel diameter, and perfusion flow rate. This work was supported by Grant NIDCD R01DC005788 from the National Institutes of Health.

  20. Layered Fault Management Architecture

    National Research Council Canada - National Science Library

    Sztipanovits, Janos

    2004-01-01

    ... UAVs or Organic Air Vehicles. The approach of this effort was to analyze fault management requirements of formation flight for fleets of UAVs, and develop a layered fault management architecture which demonstrates significant...

  1. Replica Fourier Transform: Properties and applications

    International Nuclear Information System (INIS)

    Crisanti, A.; De Dominicis, C.

    2015-01-01

    The Replica Fourier Transform is the generalization of the discrete Fourier Transform to quantities defined on an ultrametric tree. It finds use in conjunction with the replica method, used to study the thermodynamic properties of disordered systems such as spin glasses. Its definition is presented in a systematic and simple form, and its use is illustrated with some representative examples. In particular, we give a detailed discussion of the diagonalization, in Replica Fourier Space, of the Hessian matrix of the Gaussian fluctuations about the mean field saddle point of spin glass theory. The general results are finally discussed for a generic spherical spin glass model, where the Hessian can be computed analytically.

  2. The Efficacy of Epidemic Algorithms on Detecting Node Replicas in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Narasimha Shashidhar

    2015-12-01

    A node replication attack against a wireless sensor network involves surreptitious efforts by an adversary to insert duplicate sensor nodes into the network while avoiding detection. Due to the lack of tamper-resistant hardware and the low cost of sensor nodes, replication attacks take little effort to carry out. Naturally, detecting these replica nodes is a very important task and has been studied extensively. In this paper, we propose a novel distributed, randomized sensor duplicate detection algorithm called Discard to detect node replicas in group-deployed wireless sensor networks. Our protocol is an epidemic, self-organizing duplicate detection scheme, which exhibits emergent properties. Epidemic schemes have found diverse applications in distributed computing: load balancing, topology management, audio and video streaming, computing aggregate functions, failure detection, and network and resource monitoring, to name a few. To the best of our knowledge, our algorithm is the first attempt at exploring the potential of this paradigm to detect replicas in a wireless sensor network. Through analysis and simulation, we show that our scheme achieves robust replica detection with substantially lower communication, computational, and storage requirements than prior schemes in the literature.
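
The Discard protocol itself is not detailed in the abstract; the sketch below only illustrates the generic epidemic idea behind such schemes (function name and the claim format are assumptions): nodes gossip signed (node id, claimed location) tuples, and the same id observed at two different locations is flagged as a replica.

```python
def merge_claims(local, incoming):
    """Gossip-style merge of location claims. Both arguments map
    node_id -> claimed location. Returns (merged_claims, replicas),
    where replicas collects ids claimed at two different locations -
    the signature of a node replication attack."""
    merged = dict(local)
    replicas = set()
    for node_id, loc in incoming.items():
        if node_id in merged and merged[node_id] != loc:
            replicas.add(node_id)  # conflicting claim: likely a clone
        else:
            merged[node_id] = loc
    return merged, replicas
```

Repeated pairwise merges like this spread every claim epidemically through the network, so any conflicting pair of claims eventually meets at some node and the replica is detected without central coordination.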

  3. Bayesian ensemble refinement by replica simulations and reweighting

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
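
The linear scaling of restraint strength with replica number mentioned above can be written schematically as follows (the notation is assumed here, following the generic form of replica-averaged restrained simulations, not copied from the paper):

```latex
E_{\text{total}}(x_1,\dots,x_N)
  \;=\; \sum_{r=1}^{N} E_0(x_r)
  \;+\; \frac{k N}{2} \sum_{i}
        \left( \frac{1}{N} \sum_{r=1}^{N} y_i(x_r) - Y_i^{\text{exp}} \right)^{2} ,
```

where E₀ is the unrestrained energy of replica r, yᵢ are the computed observables, and Yᵢᵉˣᵖ the measured ensemble averages. The prefactor kN grows linearly with the number of replicas N, which is the scaling the authors show is needed for the restrained ensemble to converge to the optimal Bayesian result as N → ∞.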

  4. Enterprise Architecture in the Company Management Framework

    Directory of Open Access Journals (Sweden)

    Bojinov Bojidar Violinov

    2016-11-01

    The study aims to explore the role and importance of the concept of enterprise architecture in modern company management. For this purpose, it clarifies the nature, scope, and components of the enterprise architecture, and the relationships within it, using the Zachman model. Based on a critical analysis of works by leading scientists, a definition of enterprise architecture is presented as a general description of all elements of strategic management of the company, combined with a description of its organizational, functional, and operational structure, including the relationships between all tangible and intangible resources essential for its normal functioning and development. This in turn enables IT enterprise architecture to be defined as a set of corporate IT resources (hardware, software, and technology), their interconnection and integration within the overall architecture of the company, as well as their formal description, and methods and tools for their modeling and management in order to achieve the strategic business goals of the organization. In conclusion, the article summarizes the significance and role of enterprise architecture in the strategic management of the company in today's digital economy. The study underlines the importance of an integrated multidisciplinary approach to the work of a contemporary company, and the need for adequate matching and alignment of IT with the business priorities and objectives of the company.

  5. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    Science.gov (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
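
As context for the RE-based methods compared above, here is a minimal sketch of the standard Metropolis criterion for swapping configurations between two neighboring temperatures. This is textbook replica exchange, not the STDR or VREX variants introduced in the paper; the function name and interface are illustrative.

```python
import math
import random

def swap_accepted(beta_i, beta_j, energy_i, energy_j, rng=random.random):
    """Metropolis acceptance test for exchanging the configurations of
    two replicas at inverse temperatures beta_i and beta_j:
    accept with probability min(1, exp(delta)), where
    delta = (beta_i - beta_j) * (energy_i - energy_j)."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    if delta >= 0:
        return True  # swap always accepted when it lowers free energy
    return rng() < math.exp(delta)
```

A swap that moves the lower-energy configuration to the colder replica (delta ≥ 0) is always accepted; otherwise it is accepted stochastically, which is what drives the random walk in temperature described above.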

  6. Business architecture management architecting the business for consistency and alignment

    CERN Document Server

    Simon, Daniel

    2015-01-01

    This book presents a comprehensive overview of enterprise architecture management with a specific focus on the business aspects. While recent approaches to enterprise architecture management have dealt mainly with aspects of information technology, this book covers all areas of business architecture from business motivation and models to business execution. The book provides examples of how architectural thinking can be applied in these areas, thus combining different perspectives into a consistent whole. In-depth experiences from end-user organizations help readers to understand the abstract concepts of business architecture management and to form blueprints for their own professional approach. Business architecture professionals, researchers, and others working in the field of strategic business management will benefit from this comprehensive volume and its hands-on examples of successful business architecture management practices.

  7. Architectural Debt Management in Value-oriented Architecting

    NARCIS (Netherlands)

    Li, Z.; Liang, P.; Avgeriou, P.

    2014-01-01

    Architectural technical debt (ATD) may be incurred when making architecture decisions. In most cases, ATD is not effectively managed in the architecting process: It is not made explicit, and architecture decision making does not consider the ATD incurred by the different design options. This chapter

  8. A Reference Architecture for Space Information Management

    Science.gov (United States)

    Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.

    2006-01-01

    We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed, and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.

  9. Pervasive Application Rights Management Architecture

    OpenAIRE

    Dusparic, Ivana

    2005-01-01

    This dissertation describes an application rights management architecture that combines license management with digital rights management to provide an integrated platform for the specification, generation, delivery and management of application usage rights for pervasive computing environments. A new rights expression language is developed, extended from the existing language, ODRL, which allows the expression of mobile application usage rights and supports fine-grained usage ...

  10. SRF Cavity Surface Topography Characterization Using Replica Techniques

    Energy Technology Data Exchange (ETDEWEB)

    C. Xu, M.J. Kelley, C.E. Reece

    2012-07-01

    To better understand the role of topography in SRF cavity performance, we seek to obtain detailed topographic information from the curved practical cavity surfaces. Replicas taken from a cavity interior surface provide internal surface molds for fine Atomic Force Microscopy (AFM) and stylus profilometry. In this study, we confirm the replica resolution both on local surface defects, such as grain boundaries and etching pits, and on uniform surface roughness with the aid of the Power Spectral Density (PSD), from which roughness parameters can be obtained statistically at different scales. A series of sampling locations at the same magnetic field amplitude are chosen at the same latitude on a single-cell cavity to confirm uniformity. Another series of sampling locations at different magnetic field amplitudes are chosen on the same cavity for later power loss calculation. We also show that application of the replica followed by rinsing does not adversely affect the cavity performance.

  11. Neurovascular Modeling: Small-Batch Manufacturing of Silicone Vascular Replicas

    Science.gov (United States)

    Chueh, J.Y.; Wakhloo, A.K.; Gounis, M.J.

    2009-01-01

    BACKGROUND AND PURPOSE Realistic, population-based cerebrovascular replicas are required for the development of neuroendovascular devices. The objective of this work was to develop an efficient methodology for manufacturing realistic cerebrovascular replicas. MATERIALS AND METHODS Brain MR angiography data from 20 patients were acquired. The centerline of the vasculature was calculated, and geometric parameters were measured to describe quantitatively the internal carotid artery (ICA) siphon. A representative model was created on the basis of the quantitative measurements. Using this virtual model, we designed a mold with a core-shell structure and converted it into a physical object by fused-deposit manufacturing. Vascular replicas were created by injection molding of different silicones. Mechanical properties, including the stiffness and luminal coefficient of friction, were measured. RESULTS The average diameter, length, and curvature of the ICA siphon were 4.15 ± 0.09 mm, 22.60 ± 0.79 mm, and 0.34 ± 0.02 mm⁻¹ (average ± standard error of the mean), respectively. From these image datasets, we created a median virtual model, which was transformed into a physical replica by an efficient batch-manufacturing process. The coefficient of friction of the luminal surface of the replica was reduced by up to 55% by using liquid silicone rubber coatings. The modulus ranged from 0.67 to 1.15 MPa compared with 0.42 MPa from human postmortem studies, depending on the material used to make the replica. CONCLUSIONS Population-representative, smooth, and true-to-scale silicone arterial replicas with uniform wall thickness were successfully built for in vitro neurointerventional device testing by using a batch-manufacturing process. PMID:19321626

  12. Replica Fourier Transforms on Ultrametric Trees, and Block-Diagonalizing Multi-Replica Matrices

    Science.gov (United States)

    de Dominicis, C.; Carlucci, D. M.; Temesvári, T.

    1997-01-01

    The analysis of objects living on ultrametric trees, in particular the block-diagonalization of 4-replica matrices M^{αβ;γδ}, is shown to be dramatically simplified through the introduction of properly chosen operations on those objects. These are the Replica Fourier Transforms on ultrametric trees. Those transformations are defined and used in the present work.

  13. Popularity framework to process dataset traces and its application on dynamic replica reduction in the ATLAS experiment

    International Nuclear Information System (INIS)

    Molfetas, Angelos; Megino, Fernando Barreiro; Tykhonov, Andrii; Lassnig, Mario; Garonne, Vincent; Barisits, Martin; Campana, Simone; Dimitrov, Gancho; Jezequel, Stephane; Ueda, Ikuo; Viegas, Florbela Tique Aires

    2011-01-01

    The ATLAS experiment's data management system is constantly tracing file movement operations that occur on the Worldwide LHC Computing Grid (WLCG). Due to the large scale of the WLCG, statistical analysis of the traces is infeasible in real-time. Factors that contribute to the scalability problems include the capability for users to initiate on-demand queries, high dimensionality of tracer entries combined with very low cardinality parameters, and the large size of the namespace. These scalability issues are alleviated through the adoption of an incremental model that aggregates data for all combinations occurring in selected tracer fields on a daily basis. Using this model it is possible to query on-demand relevant statistics about system usage. We present an implementation of this popularity model in the experiment's distributed data management system, DQ2, and describe a direct application example of the popularity framework, an automated cleaning system, which uses the statistics to dynamically detect and reduce unpopular replicas from grid sites. This paper describes the architecture employed by the cleaning system and reports on the results collected from a prototype during the first months of the ATLAS collision data taking.
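
    The incremental aggregation model described in this abstract can be illustrated with a small sketch: raw tracer entries are rolled up once per day into counters keyed by the selected tracer fields, so that popularity queries hit the compact aggregate instead of the raw traces. Field and dataset names here are illustrative assumptions, not DQ2's actual schema.

```python
from collections import Counter

def aggregate_traces(traces):
    """Roll raw tracer entries up into daily popularity counters.

    Each trace is a dict with (illustrative) keys 'day', 'dataset' and
    'site'.  Aggregating once per day keeps on-demand popularity
    queries cheap regardless of the raw trace volume.
    """
    daily = Counter()
    for t in traces:
        # one counter per combination of the selected tracer fields
        daily[(t["day"], t["dataset"], t["site"])] += 1
    return daily

traces = [
    {"day": "2011-03-01", "dataset": "data10.A", "site": "SITE-1"},
    {"day": "2011-03-01", "dataset": "data10.A", "site": "SITE-1"},
    {"day": "2011-03-01", "dataset": "data10.B", "site": "SITE-2"},
]
daily = aggregate_traces(traces)
# dataset B, with a single access, is a candidate for replica reduction
```

    A cleaning agent would then scan such aggregates for combinations whose counts stay below a threshold and schedule the corresponding replicas for deletion.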

  14. Replica exchange with solute tempering: A method for sampling biological systems in explicit water

    Science.gov (United States)

    Liu, Pu; Kim, Byungchan; Friesner, Richard A.; Berne, B. J.

    2005-09-01

    An innovative replica exchange (parallel tempering) method called replica exchange with solute tempering (REST) for the efficient sampling of aqueous protein solutions is presented here. The method bypasses the poor scaling with system size of standard replica exchange and thus reduces the number of replicas (parallel processes) that must be used. This reduction is accomplished by deforming the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. For proof of concept, REST is compared with standard replica exchange for an alanine dipeptide molecule in water. The comparisons confirm that REST greatly reduces the number of CPUs required by regular replica exchange and increases the sampling efficiency. This method reduces the CPU time required for calculating thermodynamic averages and for the ab initio folding of proteins in explicit water.
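
    The poor scaling that REST addresses originates in the standard replica-exchange acceptance test, where the energy difference between replicas grows with the number of water molecules. That standard Metropolis swap criterion can be sketched as follows (a generic parallel-tempering sketch, not the deformed REST Hamiltonian itself):

```python
import math
import random

def swap_accept_prob(beta_i, beta_j, energy_i, energy_j):
    """Metropolis acceptance probability for exchanging the
    configurations of two replicas at inverse temperatures beta_i and
    beta_j:  min(1, exp[(beta_i - beta_j) * (energy_i - energy_j)]).
    """
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return min(1.0, math.exp(delta))

def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random.random):
    """Return True if the proposed swap is accepted."""
    return rng() < swap_accept_prob(beta_i, beta_j, energy_i, energy_j)

# the cold replica (beta = 1.0) sits at high energy while the hot one
# (beta = 0.5) found a low-energy basin, so the swap is always accepted
p = swap_accept_prob(1.0, 0.5, 0.0, -10.0)
```

    In explicit solvent the energies entering `delta` are dominated by the water, which is why standard replica exchange needs many closely spaced temperatures; REST deforms the Hamiltonian so that the water contribution cancels in the exchange test.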

  15. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    Enterprise architecture is considered both as an object of management, providing a general view of the enterprise and the mutual alignment of its parts into a single whole, and as the discipline that arose around this object. The architectural approach to modeling and designing the enterprise originally arose in the field of information technology, where it was used to design information systems and technical infrastructure and to formalize business requirements. Since the early 2000s, enterprise architecture has increasingly been used in organizational development and business transformation projects, especially where information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in a digital economy where business depends strongly on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinary nature of the subject, its generalized character and its close connection with practical experience. In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant only in individual situations. The paper proposes a simplified methodology for enterprise architecture management which, on the one hand, will be comprehensible to students and, on the other hand, will allow students to apply

  16. Replica calibration artefacts for optical 3D scanning of micro parts

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Carmignato, S.; Cantatore, Angela

    2009-01-01

    This work deals with the development of calibration artefacts produced using hard replica materials, achieving high-quality geometrical reproduction of suitable reference artefacts, high stability, and high surface cooperativeness. An investigation was carried out using a replica material for dental applications to reproduce the geometry of a step artefact, a miniature step gauge, and a curve standard for optical measuring machines. The replica artefacts were calibrated using a tactile coordinate measuring machine and measured on two different optical scanners. Replication quality and applicability of the artefacts to verify the accuracy of optical measurements, as well as the thermal expansion coefficient and stability of the replica artefacts over time, were documented.

  17. Application of extraction replicas and analytical electron microscopy to precipitate phase studies

    International Nuclear Information System (INIS)

    Kenik, E.A.; Maziasz, P.J.

    1984-01-01

    Extraction replicas provide a powerful extension of AEM techniques for the analysis of fine precipitates. In many cases, replicas allow more accurate analyses to be performed and, in some cases, allow unique analyses which cannot be performed in-foil. However, there are limitations to the use of extraction replicas in AEM, of which the analyst must be aware. Many can be eliminated by careful preparation. Often, combined AEM studies of precipitates in-foil and on extraction replicas provide complementary and corroborative information for the fullest analysis of precipitate phases.

  18. The GENiC architecture for integrated data centre energy management

    NARCIS (Netherlands)

    Pesch, D.; McGibney, A.; Sobonski, P.; Rea, S.; Scherer, Th.; Chen, L.; Engbersen, T.; Mehta, D.; O'Sullivan, B.; Pages, E.; Townley, J.; Kasinathan, Dh.; Torrens, J.I.; Zavrel, V.; Hensen, J.L.M.

    2015-01-01

    We present an architecture for integrated data centre energy management developed in the EC funded GENiC project. The architecture was devised to create a platform that can integrate functions for workload management, cooling, power management and control of heat recovery for future, highly

  19. A resource management architecture for metacomputing systems.

    Energy Technology Data Exchange (ETDEWEB)

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.

  20. Accuracy of three-dimensional printing for manufacturing replica teeth.

    Science.gov (United States)

    Lee, Keun-Young; Cho, Jin-Woo; Chang, Na-Young; Chae, Jong-Moon; Kang, Kyung-Hwa; Kim, Sang-Cheol; Cho, Jin-Hyoung

    2015-09-01

    Three-dimensional (3D) printing is a recent technological development that may play a significant role in orthodontic diagnosis and treatment. It can be used to fabricate skull models or study models, as well as to make replica teeth in autotransplantation or tooth impaction cases. The aim of this study was to evaluate the accuracy of fabrication of replica teeth made by two types of 3D printing technologies. Fifty extracted molar teeth were selected as samples. They were scanned to generate high-resolution 3D surface model stereolithography files. These files were converted into physical models using two types of 3D printing technologies: fused deposition modeling (FDM) and PolyJet technology. All replica teeth were scanned and 3D images generated. Computer software compared the replica teeth to the original teeth with linear measurements, volumetric measurements, and mean deviation measurements with best-fit alignment. Paired t-tests were used to statistically analyze the measurements. Most measurements of teeth formed using FDM tended to be slightly smaller, while those of the PolyJet replicas tended to be slightly larger, than those of the extracted teeth. Mean deviation measurements with best-fit alignment of the FDM and PolyJet groups were 0.047 mm and 0.038 mm, respectively. Although there were statistically significant differences, they were regarded as clinically insignificant. This study confirms that FDM and PolyJet technologies are accurate enough to be usable in orthodontic diagnosis and treatment.

  1. Architectural mismatch issues in identity management deployment

    DEFF Research Database (Denmark)

    Andersen, Mads Schaarup

    2010-01-01

    Integrating Commercial Off-The-Shelf products in a company's software product portfolio offers business value, but introduces challenges from a software architecture perspective. In this paper, the research challenges in relation to identity management in the Danish municipality administration system called Opus are outlined. Opus BRS is the identity management part of Opus. Opus integrates SAP, legacy mainframe systems, and other third party systems of the individual municipality. Each of these systems defines its own software architecture and access control model, leading to architectural mismatch with an impact on security, usability, and maintainability. The research project is discussed, and access control and identity provisioning are recognized as the major areas of interest in relation to the mismatch challenges. The project is carried out in close cooperation with KMD, one...

  2. An Architecture for Cross-Cloud System Management

    Science.gov (United States)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.

  3. A critical inventory of preoperative skull replicas.

    Science.gov (United States)

    Fasel, J H D; Beinemann, J; Schaller, K; Gailloud, P

    2013-09-01

    Physical replicas of organs are used increasingly for preoperative planning. The quality of these models is generally accepted by surgeons. In view of the strong trend towards minimally invasive and personalised surgery, however, the aim of this investigation was to assess qualitatively the accuracy of such replicas, using skull models as an example. Skull imaging was acquired for three cadavers by computed tomography using clinical routine parameters. After digital three-dimensional (3D) reconstruction, physical replicas were produced by 3D printing. The facsimilia were analysed systematically and compared with the best gold standard possible: the macerated skull itself. The skull models were far from anatomically accurate. Non-conforming rendering was observed in particular for foramina, sutures, notches, fissures, grooves, channels, tuberosities, thin-walled structures, sharp peaks and crests, and teeth. Surgeons should be aware that preoperative models may not yet render the exact anatomy of the patient under consideration and are advised to continue relying, in specific conditions, on their own analysis of the native computed tomography or magnetic resonance imaging.

  4. A DISTRIBUTED PROGNOSTIC HEALTH MANAGEMENT ARCHITECTURE

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper introduces a generic distributed prognostic health management (PHM) architecture with specific application to the electrical power systems domain. Current...

  5. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
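
    The list-based index described in this patent abstract can be illustrated with a minimal in-memory sketch (the class and its layout are hypothetical illustrations, not the patented implementation): each file name maps to a list of replica storage locations plus a checksum that can be evaluated to validate a replica before it is served.

```python
import hashlib

class ReplicaIndex:
    """Toy list-based index: file name -> list of replica locations,
    plus a checksum for validating file and replica contents."""

    def __init__(self):
        self._index = {}

    def add(self, name, location, data):
        # first call stores the checksum; later calls append replicas
        entry = self._index.setdefault(
            name, {"locations": [], "checksum": hashlib.sha256(data).hexdigest()}
        )
        entry["locations"].append(location)

    def locations(self, name):
        return list(self._index[name]["locations"])

    def validate(self, name, data):
        # recompute the checksum to detect a corrupted replica
        return hashlib.sha256(data).hexdigest() == self._index[name]["checksum"]

idx = ReplicaIndex()
idx.add("out.dat", "node-1:/store/out.dat", b"payload")
idx.add("out.dat", "node-7:/store/out.dat", b"payload")  # a replica
```

    A query against such an index returns the whole location list, so a reader can fall back to another replica if one storage node is unavailable or its copy fails validation.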

  6. Fabrication of the replica templated from butterfly wing scales with complex light trapping structures

    Science.gov (United States)

    Han, Zhiwu; Li, Bo; Mu, Zhengzhi; Yang, Meng; Niu, Shichao; Zhang, Junqiu; Ren, Luquan

    2015-11-01

    The polydimethylsiloxane (PDMS) positive replica, templated twice from the excellent light-trapping surface of butterfly Trogonoptera brookiana wing scales, was fabricated by a simple and promising route. The exact SiO2 negative replica was fabricated using a synthesis method combining a sol-gel process and subsequent selective etching. Afterwards, a vacuum-aided process was introduced to fill PDMS gel into the SiO2 negative replica, and the PDMS gel was solidified in an oven. The SiO2 negative replica was then used as a secondary template, and the structures on its surface were transcribed onto the surface of the PDMS. Finally, the PDMS positive replica was obtained. After comparing the PDMS positive replica and the original bio-template in terms of morphology, dimensions, reflectance spectra and so on, it is evident that the excellent light-trapping structures of the butterfly wing scales were inherited faithfully by the PDMS positive replica. This bio-inspired route could facilitate the preparation of complex light-trapping nanostructured surfaces without any assistance from other power-wasting and expensive nanofabrication technologies.

  7. Conformational sampling enhancement of replica exchange molecular dynamics simulations using swarm particle intelligence

    International Nuclear Information System (INIS)

    Kamberaj, Hiqmet

    2015-01-01

    In this paper, we present a new method based on swarm particle social intelligence for use in replica exchange molecular dynamics simulations. In this method, the replicas (representing the different system configurations) are allowed to communicate with each other through individual and social knowledge, in addition to being treated as a collection of real particles interacting through Newtonian forces. The new method is based on a modification of the equations of motion such that the replicas are driven towards the global energy minimum. The method was tested for Lennard-Jones clusters of N = 4, 5, and 6 atoms. Our results showed that the new method is more efficient than the conventional replica exchange method under the same practical conditions. In particular, the new method performed better at optimizing the distribution of the replicas among the thermostats with time, and ergodic convergence is observed to be faster. We also introduce a weighted histogram analysis method that allows analyzing the data from simulations by combining data from all of the replicas and rigorously removing the inserted bias.

  8. Fabrication of free-standing replicas of fragile, laminar, chitinous biotemplates

    Energy Technology Data Exchange (ETDEWEB)

    Lakhtakia, Akhlesh; Motyka, Michael A [Materials Research Institute and Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, PA 16802 (United States); MartIn-Palma, Raul J; Pantano, Carlo G [Materials Research Institute and Department of Materials Science and Engineering, Pennsylvania State University, University Park, PA 16802 (United States)], E-mail: akhlesh@psu.edu

    2009-09-01

    The conformal-evaporated-film-by-rotation technique, followed by the dissolution of chitin in an aqueous solution of orthophosphoric acid, can be used to fabricate free-standing replicas of fragile, laminar, chitinous biotemplates. This novel approach was demonstrated using butterfly wings as biotemplates and GeSeSb chalcogenide glass for replicas. (communication)

  9. Fabrication of free-standing replicas of fragile, laminar, chitinous biotemplates

    International Nuclear Information System (INIS)

    Lakhtakia, Akhlesh; Motyka, Michael A; MartIn-Palma, Raul J; Pantano, Carlo G

    2009-01-01

    The conformal-evaporated-film-by-rotation technique, followed by the dissolution of chitin in an aqueous solution of orthophosphoric acid, can be used to fabricate free-standing replicas of fragile, laminar, chitinous biotemplates. This novel approach was demonstrated using butterfly wings as biotemplates and GeSeSb chalcogenide glass for replicas. (communication)

  10. C3PO - A dynamic data placement agent for ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00346910; The ATLAS collaboration; Lassnig, Mario; Barisits, Martin-Stefan; Serfon, Cedric; Garonne, Vincent

    2017-01-01

    This paper introduces a new dynamic data placement agent for the ATLAS distributed data management system. This agent is designed to pre-place potentially popular data to make it more widely available. It therefore incorporates information from a variety of sources. Those include input datasets and sites workload information from the ATLAS workload management system, network metrics from different sources like FTS and PerfSonar, historical popularity data collected through a tracer mechanism and more. With this data it decides if, when and where to place new replicas that then can be used by the WMS to distribute the workload more evenly over available computing resources and then ultimately reduce job waiting times. This paper gives an overview of the architecture and the final implementation of this new agent. The paper also includes an evaluation of the placement algorithm by comparing the transfer times and the new replica usage.

  11. Replica approach to mean-variance portfolio optimization

    Science.gov (United States)

    Varga-Haszonits, Istvan; Caccioli, Fabio; Kondor, Imre

    2016-12-01

    We consider the problem of mean-variance portfolio optimization for a generic covariance matrix, subject to the budget constraint and the constraint on the expected return, applying the replica method borrowed from the statistical physics of disordered systems. We find that the replica symmetry of the solution does not need to be assumed, but emerges as the unique solution of the optimization problem. We also check the stability of this solution and find that the eigenvalues of the Hessian are positive for r = N/T < 1, where N is the dimension of the portfolio and T the length of the time series used to estimate the covariance matrix. The optimal in-sample variance is found to vanish at the critical point, inversely proportional to the divergent estimation error.
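
    For reference, the underlying optimization has a simple closed form when only the budget constraint is imposed: the global minimum-variance weights are w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A plain-Python sketch for two assets (just the textbook formula, none of the paper's replica machinery):

```python
def min_variance_weights(cov):
    """Global minimum-variance portfolio under the budget constraint
    sum(w) = 1 (the expected-return constraint is omitted for brevity):
        w = cov^{-1} 1 / (1' cov^{-1} 1)
    Plain-Python sketch for a 2x2 covariance matrix."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    y = [row[0] + row[1] for row in inv]   # cov^{-1} @ ones
    total = sum(y)                         # ones' cov^{-1} ones
    return [v / total for v in y]

# two uncorrelated assets: the lower-variance asset gets the larger weight
w = min_variance_weights([[0.04, 0.0], [0.0, 0.16]])
```

    The abstract's point is that when Σ must be estimated from T observations of N assets, such weights become unreliable as r = N/T approaches 1.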

  12. Experiences with Architectural Software Configuration Management in Ragnarok

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1998-01-01

    This paper describes a model, denoted architectural software configuration management, that minimises the gap between software design and configuration management by allowing developers to do configuration and version control of the abstractions and hierarchy in a software architecture. The model emphasises traceability and reproducibility by unifying the concepts of version and bound configuration. Experiences with such a model, implemented in the prototype “Ragnarok”, from three real-life, small- to medium-sized software development projects are reported. The conclusion is that the presented model...

  13. Patrol Detection for Replica Attacks on Wireless Sensor Networks

    OpenAIRE

    Wang, Liang-Min; Shi, Yang

    2011-01-01

    Replica attack is a critical concern in the security of wireless sensor networks. We employ mobile nodes as patrollers to detect replicas distributed in different zones in a network, in which a basic patrol detection protocol and two detection algorithms for stationary and mobile modes are presented. Then we perform security analysis to discuss the defense strategies against the possible attacks on the proposed detection protocol. Moreover, we show the advantages of the proposed protocol by d...

  14. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Tie-Jiun [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Gao, Jun [INPAC, Shanghai Key Laboratory for Particle Physics and Cosmology,Department of Physics and Astronomy, Shanghai Jiao-Tong University, Shanghai 200240 (China); High Energy Physics Division, Argonne National Laboratory,Argonne, Illinois, 60439 (United States); Huston, Joey [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Nadolsky, Pavel [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Schmidt, Carl; Stump, Daniel [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Wang, Bo-Ting; Xie, Ke Ping [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Dulat, Sayipjamal [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); School of Physics Science and Technology, Xinjiang University,Urumqi, Xinjiang 830046 (China); Center for Theoretical Physics, Xinjiang University,Urumqi, Xinjiang 830046 (China); Pumplin, Jon; Yuan, C.P. [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States)

    2017-03-20

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A correction is proposed to address a bias in asymmetric uncertainties introduced by the Taylor series approximation. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
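
    The basic Hessian-to-Monte-Carlo conversion underlying this work samples each Hessian eigenvector direction with a Gaussian random number, f = f0 + Σ_i r_i (f_i⁺ − f_i⁻)/2 with r_i ~ N(0, 1). The sketch below illustrates only this normal-sampling step, with PDFs reduced to single numbers; it is not the paper's full numerical method, which adds asymmetry and positivity corrections.

```python
import random

def make_replica(central, plus_sets, minus_sets, rng=random.gauss):
    """Generate one Monte-Carlo replica from a central value and paired
    Hessian eigenvector sets (normal-sampling sketch):
        f = f0 + sum_i r_i * (f_i+ - f_i-) / 2,   r_i ~ N(0, 1)
    """
    f = central
    for fp, fm in zip(plus_sets, minus_sets):
        f += rng(0.0, 1.0) * (fp - fm) / 2.0
    return f

central = 0.5
plus_sets, minus_sets = [0.52, 0.51], [0.48, 0.49]
replicas = [make_replica(central, plus_sets, minus_sets) for _ in range(10000)]
# the replica ensemble is centred on the Hessian central value
```

    Standard deviations computed over such an ensemble reproduce the symmetric Hessian uncertainty; the paper's master formulas extend this to the asymmetric case.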

  15. Multiscale implementation of infinite-swap replica exchange molecular dynamics.

    Science.gov (United States)

    Yu, Tang-Qing; Lu, Jianfeng; Abrams, Cameron F; Vanden-Eijnden, Eric

    2016-10-18

    Replica exchange molecular dynamics (REMD) is a popular method to accelerate conformational sampling of complex molecular systems. The idea is to run several replicas of the system in parallel at different temperatures that are swapped periodically. These swaps are typically attempted every few MD steps and accepted or rejected according to a Metropolis-Hastings criterion. This guarantees that the joint distribution of the composite system of replicas is the normalized sum of the symmetrized product of the canonical distributions of these replicas at the different temperatures. Here we propose a different implementation of REMD in which (i) the swaps obey a continuous-time Markov jump process implemented via Gillespie's stochastic simulation algorithm (SSA), which also samples exactly the aforementioned joint distribution and has the advantage of being rejection free, and (ii) this REMD-SSA is combined with the heterogeneous multiscale method to accelerate the rate of the swaps and reach the so-called infinite-swap limit that is known to optimize sampling efficiency. The method is easy to implement and can be trivially parallelized. Here we illustrate its accuracy and efficiency on the examples of alanine dipeptide in vacuum and C-terminal β-hairpin of protein G in explicit solvent. In this latter example, our results indicate that the landscape of the protein is a triple funnel with two folded structures and one misfolded structure that are stabilized by H-bonds.
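
    The rejection-free swap scheme described above can be illustrated with a single Gillespie SSA step: each candidate neighbour swap carries a rate, the waiting time to the next swap is exponentially distributed in the total rate, and which swap fires is chosen with probability proportional to its rate. This is a generic SSA sketch; the rates here are arbitrary stand-ins for the REMD swap rates.

```python
import math
import random

def ssa_step(rates, rng=random.random):
    """One Gillespie SSA step over candidate replica swaps.

    rates : list of propensities, one per candidate swap pair.
    Returns (waiting_time, chosen_index).  Rejection free: some swap
    always fires, unlike a Metropolis accept/reject step.
    """
    total = sum(rates)
    tau = -math.log(rng()) / total     # exponential waiting time
    threshold = rng() * total          # pick a swap proportional to its rate
    acc = 0.0
    for i, r in enumerate(rates):
        acc += r
        if threshold < acc:
            return tau, i
    return tau, len(rates) - 1

tau, i = ssa_step([0.5, 1.5, 2.0])
```

    Driving the swap rates up while keeping their ratios fixed approaches the infinite-swap limit the abstract refers to, where replicas effectively average over all temperature assignments.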

  16. C3PO - A Dynamic Data Placement Agent for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, Thomas; The ATLAS collaboration; Barisits, Martin-Stefan; Serfon, Cedric; Garonne, Vincent

    2016-01-01

    This contribution introduces a new dynamic data placement agent for the ATLAS distributed data management system. The agent is designed to pre-place potentially popular data to make it more widely available. It draws on data from a variety of sources: input datasets and site workload information from the ATLAS workload management system, network metrics from sources such as FTS and PerfSonar, historical popularity data collected through a tracer mechanism, and more. With this data it decides if, when and where to place new replicas, which the WMS can then use to distribute the workload more evenly over the available computing resources and ultimately reduce job waiting times. New replicas are created with a short lifetime that is extended whenever the data is accessed, so the system behaves like a large cache. This paper gives an overview of the architecture and the final implementation of this new agent, and includes an evaluation of different placement algorithm...
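The short-lifetime, extended-on-access behaviour described above is essentially a cache with sliding expiration. A toy model of that policy is sketched below; the class, names and lifetimes are illustrative, not the ATLAS implementation.

```python
import time

class ReplicaCache:
    """Toy model of a lifetime-based replica cache: newly placed replicas
    get a short lifetime that is extended on each access, so popular data
    stays resident and unused replicas expire (illustrative sketch only).
    """
    def __init__(self, base_lifetime=3600.0, extension=3600.0, clock=time.time):
        self.base_lifetime = base_lifetime
        self.extension = extension
        self.clock = clock
        self._expiry = {}                          # replica name -> expiry time

    def place(self, name):
        """Create a replica with the short initial lifetime."""
        self._expiry[name] = self.clock() + self.base_lifetime

    def access(self, name):
        """On a hit, extend the replica's lifetime; report hit/miss."""
        if name in self._expiry and self._expiry[name] > self.clock():
            self._expiry[name] += self.extension   # popular data lives longer
            return True
        return False

    def evict_expired(self):
        """Drop replicas whose lifetime has run out; return their names."""
        now = self.clock()
        expired = [n for n, t in self._expiry.items() if t <= now]
        for n in expired:
            del self._expiry[n]
        return expired
```

With an injectable clock the policy is easy to exercise: a replica accessed before expiry survives longer; one left untouched is evicted.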

  17. Accuracy of three-dimensional printing for manufacturing replica teeth

    OpenAIRE

    Lee, Keun-Young; Cho, Jin-Woo; Chang, Na-Young; Chae, Jong-Moon; Kang, Kyung-Hwa; Kim, Sang-Cheol; Cho, Jin-Hyoung

    2015-01-01

    Objective Three-dimensional (3D) printing is a recent technological development that may play a significant role in orthodontic diagnosis and treatment. It can be used to fabricate skull models or study models, as well as to make replica teeth in autotransplantation or tooth impaction cases. The aim of this study was to evaluate the accuracy of fabrication of replica teeth made by two types of 3D printing technologies. Methods Fifty extracted molar teeth were selected as samples. They were sc...

  18. Maritime Domain Awareness Architecture Management Hub Strategy

    National Research Council Canada - National Science Library

    2008-01-01

    This document provides an initial high level strategy for carrying out the responsibilities of the national Maritime Domain Awareness Architecture Management Hub to deliver a standards based service...

  19. The added value of the replica simulators in the exploitation of nuclear power plants; El valor anadido de los simuladores replica en la explotacion de las centrales nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Diaz Giron, P. a.; Ortega, F.; Rivero, N.

    2011-07-01

    Nuclear power plant full-scope replica simulators were in the past designed solely around the training needs of operational personnel. Nevertheless, these simulators not only feature a high-fidelity replica of the control room but also provide an accurate process response. Control room replica simulators are now based on complex technological platforms that permit the highest physical and functional fidelity, allowing them to be used as versatile, value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article identifies the simulators' uses in training and in applications beyond training. (Author)

  20. Internal structure analysis of particle-double network gels used in a gel organ replica

    Science.gov (United States)

    Abe, Mei; Arai, Masanori; Saito, Azusa; Sakai, Kazuyuki; Kawakami, Masaru; Furukawa, Hidemitsu

    2016-04-01

    In recent years, the fabrication of patient organ replicas using 3D printers has been attracting a great deal of attention in medical fields. However, the cost of these organ replicas is very high, as very expensive 3D printers and printing materials are required. Here we present a new gel organ replica of a human kidney, fabricated with a conventional molding technique using a particle-double network hydrogel (P-DN gel). The replica is transparent and has the feel of a real kidney. Gel organ replicas produced this way are expected to be a useful tool for the education of trainee surgeons and clinical ultrasonography technologists. In addition to developing a gel organ replica, we discuss the internal structure of the P-DN gel used. Because the P-DN gel has a complex structure comprising two different types of network, it has not previously been possible to investigate its interior in detail. Gels have an inhomogeneous network structure; if a more uniform structure could be obtained, this should lead to higher gel strength. In the present study we investigate the structure of the P-DN gel, using the gel organ replica, by means of Scanning Microscopic Light Scattering (SMILS), a non-contacting and non-destructive technique.

  1. Calculation of absolute protein-ligand binding free energy using distributed replica sampling.

    Science.gov (United States)

    Rodinger, Tomas; Howell, P Lynne; Pomès, Régis

    2008-10-21

    Distributed replica sampling [T. Rodinger et al., J. Chem. Theory Comput. 2, 725 (2006)] is a simple and general scheme for Boltzmann sampling of conformational space by computer simulation in which multiple replicas of the system undergo a random walk in reaction coordinate or temperature space. Individual replicas are linked through a generalized Hamiltonian containing an extra potential energy term or bias which depends on the distribution of all replicas, thus enforcing the desired sampling distribution along the coordinate or parameter of interest regardless of free energy barriers. In contrast to replica exchange methods, efficient implementation of the algorithm does not require synchronicity of the individual simulations. The algorithm is inherently suited for large-scale simulations using shared or heterogeneous computing platforms such as a distributed network. In this work, we build on our original algorithm by introducing Boltzmann-weighted jumping, which allows moves of a larger magnitude and thus enhances sampling efficiency along the reaction coordinate. The approach is demonstrated using a realistic and biologically relevant application; we calculate the standard binding free energy of benzene to the L99A mutant of T4 lysozyme. Distributed replica sampling is used in conjunction with thermodynamic integration to compute the potential of mean force for extracting the ligand from protein and solvent along a nonphysical spatial coordinate. Dynamic treatment of the reaction coordinate leads to faster statistical convergence of the potential of mean force than a conventional static coordinate, which suffers from slow transitions on a rugged potential energy surface.

  2. Information Architecture for Quality Management Support in Hospitals.

    Science.gov (United States)

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations enhanced knowledge but, above all, favours management. It simplifies the reinvention of processes, the reformulation of procedures, bridging, and cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a hospital, which allowed us to develop and implement QUALITUS, a computer application built to support quality management in a hospital unit. This solution translated into significant gains for the hospital unit under study: it accelerated the quality management process and reduced the number of tasks and documents, the information to be filled in, and information errors, amongst others.

  3. Re-engineering Nascom's network management architecture

    Science.gov (United States)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 Kbs) were developed following existing standards; but, there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as, X-windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. 
The MSS, CAP, MACS, and Ecom projects have indicated

  4. Trillo NPP full scope replica simulator project: The last great NPP simulation challenge in Spain

    International Nuclear Information System (INIS)

    Rivero, N.; Abascal, A.

    2006-01-01

    In the year 2000, Trillo NPP (a Spanish PWR-KWU design nuclear power plant) and Tecnatom agreed to develop a Trillo plant-specific simulator covering all plant systems operated either from the main control room or from the emergency panels. The simulator is operated both through a control room replica and through a graphical user interface, the latter based on plant schematics and the soft-panel concept. The Trillo simulator is primarily intended as a pedagogical tool for training Trillo's operational staff. Because of the engineering grade of the mathematical models, it will also have additional uses, such as: - Operation engineering (POE's validation, validation of new computerized operator support systems, etc.); - Emergency drills; - Assessment of plant design modifications. The project has become the largest simulation task Tecnatom has ever undertaken, and is structured in three subprojects: simulator manufacture, simulator acceptance, and training material production. The most relevant technological innovations the project brings are: highest accuracy in the nuclear island models, an advanced configuration management system, an open software architecture, a new human-machine interface design, a latest-design I/O system, and an instructor station with extended functionality. The Trillo simulator 'Ready for Training' milestone is due in September 2003, the Factory Acceptance Tests having started in autumn 2002. (author)

  5. Patrol Detection for Replica Attacks on Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2011-02-01

    Replica attack is a critical concern in the security of wireless sensor networks. We employ mobile nodes as patrollers to detect replicas distributed in different zones of a network, presenting a basic patrol detection protocol and two detection algorithms for stationary and mobile modes. We then perform a security analysis to discuss defense strategies against possible attacks on the proposed detection protocol. Moreover, we show the advantages of the proposed protocol by discussing and comparing its communication cost and detection probability with those of existing methods.

  6. Reference architecture of application services for personal wellbeing information management.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha

    2011-01-01

    Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.

  7. Towards a framework for managing enterprise architecture acceptance / Sonja Gilliland

    OpenAIRE

    Gilliland, Sonja

    2014-01-01

    An enterprise is a complex and changing entity, which is managed and maintained by humans. Enterprise architecture has been identified as an organisational strategy designed to assist enterprises with the understanding of complexity and the management of change. Acceptance, implementation and maintenance of enterprise architecture in organisations are complex and time-consuming. Work roles, responsibilities, common vocabulary, and buy-in are some of the cooperative human factors of stakeholde...

  8. Microlens fabrication by replica molding of frozen laser-printed droplets

    Science.gov (United States)

    Surdo, Salvatore; Diaspro, Alberto; Duocastella, Martí

    2017-10-01

    In this work, we synergistically combine laser-induced forward transfer (LIFT) and replica molding for the fabrication of microlenses with control of their geometry and size independent of the material or substrate used. Our approach is based on a multistep process in which liquid microdroplets of an aqueous solution are first printed on a substrate by LIFT. Following a freezing step, the microdroplets are used as a master to fabricate a polydimethylsiloxane (PDMS) mold. A subsequent replica molding step enables the creation of microlenses and microlens arrays on arbitrarily selected substrates and using different curable polymers. Thus, our method combines the rapid fabrication capabilities of LIFT and the perfectly smooth surface quality of the generated microdroplets with the advantages of replica molding in terms of parallelization and materials flexibility. We demonstrate our strategy by generating microlenses of different photocurable polymers and by characterizing their optical and morphological properties.

  9. Democratic management and architecture school

    Directory of Open Access Journals (Sweden)

    Silvana Aparecida de Souza

    2011-10-01

    This is a conceptual and theoretical study of school organization and its democratization, focusing on one aspect of an objective nature: its architecture. The study was based on the academic literature on democratization and on the theoretical contribution of Michel Foucault regarding the analysis of space as a resource for control, surveillance and training, and includes a historical review of the construction models of school buildings in Brazil. It is therefore a sociological analysis of the school environment in relation to the democratization of basic education, understood as ensuring the conditions of access and permanence in a universal, quality education, conceived and gestated from the collective interests of its users. We conclude that the architecture of public schools in Brazil does not favour democratic management, whether because of the controlling format of the buildings constructed in the republican period or because of the current economic priorities for the construction of public school buildings, which leave little or no space for collective activities. The character of the buildings remains controlling, no longer through their architecture alone, but through technological development, which allows monitoring by video cameras, carried out with the permission and support of the community.

  10. 3D printed replicas for endodontic education.

    Science.gov (United States)

    Reymus, M; Fotiadou, C; Kessler, A; Heck, K; Hickel, R; Diegritz, C

    2018-06-14

    To assess the feasibility of producing artificial teeth for endodontic training using 3D printing technology, to analyse the accuracy of the printing process, and to evaluate the teeth as judged by students using them during training. Sound extracted human teeth were selected, digitalized by cone beam computed tomography (CBCT) and appropriate software, and finally reproduced by a stereolithographic printer. The printed teeth were scanned and compared with the original ones (trueness) and with one another (precision). Undergraduate dental students in the third and fourth years performed root canal treatment on printed molars and were subsequently asked to evaluate their experience with these compared to real teeth. The workflow was feasible for manufacturing 3D-printed tooth replicas. The absolute deviation after printing (trueness) ranged from 50.9 μm to 104.3 μm. The values for precision ranged from 43.5 μm to 68.2 μm. Students reported great benefit from the use of the replicated teeth for training purposes. The presented workflow is feasible for any dental educational institution with access to a CBCT unit and a stereolithographic printer. The accuracy of the printing process is suitable for the production of tooth replicas for endodontic training. Undergraduate students favoured the availability of these replicas and the fairness they ensured in training due to standardization. This article is protected by copyright. All rights reserved.

  11. Fiber-wireless convergence in next-generation communication networks systems, architectures, and management

    CERN Document Server

    Chang, Gee-Kung; Ellinas, Georgios

    2017-01-01

    This book investigates new enabling technologies for Fi-Wi convergence. The editors discuss Fi-Wi technologies at the three major network levels involved in the path towards convergence: system level, network architecture level, and network management level. The main topics will be: a. At system level: Radio over Fiber (digitalized vs. analogic, standardization, E-band and beyond) and 5G wireless technologies; b. Network architecture level: NGPON, WDM-PON, BBU Hotelling, Cloud Radio Access Networks (C-RANs), HetNets. c. Network management level: SDN for convergence, Next-generation Point-of-Presence, Wi-Fi LTE Handover, Cooperative MultiPoint. • Addresses the Fi-Wi convergence issues at three different levels, namely at the system level, network architecture level, and network management level • Provides approaches in communication systems, network architecture, and management that are expected to steer the evolution towards fiber-wireless convergence • Contributions from leading experts in the field of...

  12. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner: Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  13. Designed-walk replica-exchange method for simulations of complex systems

    OpenAIRE

    Urano, Ryo; Okamoto, Yuko

    2015-01-01

    We propose a new implementation of the replica-exchange method (REM) in which replicas follow a pre-planned route in temperature space instead of a random walk. Our method satisfies the detailed balance condition in the proposed route. The method forces tunneling events between the highest and lowest temperatures to happen with an almost constant period. The number of tunneling counts is proportional to that of the random-walk REM multiplied by the square root of moving distance in temperatur...

  14. Standard practice for production and evaluation of field metallographic replicas

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2001-01-01

    1.1 This practice covers recognized methods for the preparation and evaluation of cellulose acetate or plastic film replicas which have been obtained from metallographically prepared surfaces. It is designed for the evaluation of replicas to ensure that all significant features of a metallographically prepared surface have been duplicated and preserved on the replica with sufficient detail to permit both LM and SEM examination with optimum resolution and sensitivity. 1.2 This practice may be used as a controlling document in commercial situations. 1.3 The values stated in SI units are to be regarded as the standard. Inch-pound units given in parentheses are for information only. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  15. CogWnet: A Resource Management Architecture for Cognitive Wireless Networks

    KAUST Repository

    Alqerm, Ismail

    2013-07-01

    With the increasing adoption of wireless communication technologies, there is a need to improve management of existing radio resources. Cognitive radio is a promising technology to improve the utilization of wireless spectrum. Its operating principle is based on building an integrated hardware and software architecture that configures the radio to meet application requirements within the constraints of spectrum policy regulations. However, such an architecture must be able to cope with radio environment heterogeneity. In this paper, we propose a cognitive resource management architecture, called CogWnet, that allocates channels, re-configures radio transmission parameters to meet QoS requirements, ensures reliability, and mitigates interference. The architecture consists of three main layers: Communication Layer, which includes generic interfaces to facilitate the communication between the cognitive architecture and TCP/IP stack layers; Decision-Making Layer, which classifies the stack layers input parameters and runs decision-making optimization algorithms to output optimal transmission parameters; and Policy Layer to enforce policy regulations on the selected part of the spectrum. The efficiency of CogWnet is demonstrated through a testbed implementation and evaluation.

  16. Architecture-based Model for Preventive and Operative Crisis Management

    National Research Council Canada - National Science Library

    Jungert, Erland; Derefeldt, Gunilla; Hallberg, Jonas; Hallberg, Niklas; Hunstad, Amund; Thuren, Ronny

    2004-01-01

    .... A system that should support activities of this type must not only have a high capacity, with respect to the dataflow, but also have suitable tools for decision support. To overcome these problems, an architecture for preventive and operative crisis management is proposed. The architecture is based on models for command and control, but also for risk analysis.

  17. An Architecture for Open Learning Management Systems

    NARCIS (Netherlands)

    Avgeriou, Paris; Retalis, Simos; Skordalakis, Manolis

    2003-01-01

    There is an urgent demand for defining architectures for Learning Management Systems, so that high-level frameworks for understanding these systems can be discovered, and quality attributes like portability, interoperability, reusability and modifiability can be achieved. In this paper we propose

  18. IN QUEST OF TOTAL QUALITY MANAGEMENT PRINCIPLES IN ARCHITECTURAL DESIGN SERVICES: EVIDENCE FROM TURKEY

    Directory of Open Access Journals (Sweden)

    Umut Durmus

    2010-12-01

    Proposal: Architectural design companies increasingly recognize that time spent on management is not at the expense of their production and that there are always better ways to organize the business. Although architects have long placed a traditional emphasis on quality, quality management is still a new concept for the majority of architectural design companies, which nowadays have to organize relatively more complicated operations to meet their clients' expectations. This study aims to understand how architectural design companies define quality and explores the extent to which Total Quality Management (TQM) principles like continual improvement, employee involvement, customer satisfaction and others can be pertinent in these companies. Adopting a qualitative research strategy, the authors interviewed the owner-managers of 10 widely recognized architectural design companies of different sizes in Istanbul. The results from the content analysis of the semi-structured interview data suggest that (i) TQM principles cannot be directly applied in architectural design companies without an appropriate translation; (ii) the special characteristics of design services are important for explaining the quality-related perceptions of owner-managers; and (iii) the owner-managers feel pressure from changing internal and external environmental conditions, yet few of them adopt a systematic and documented approach to quality management. Architectural design offices that aim to establish a quality management system can benefit from this study to understand potential problem areas on their road.

  19. Replica-Based High-Performance Tuple Space Computing

    DEFF Research Database (Denmark)

    Andric, Marina; De Nicola, Rocco; Lluch Lafuente, Alberto

    2015-01-01

    of concurrency and data access. We investigate issues related to replica consistency, provide an operational semantics that guides the implementation of the language, and discuss the main synchronization mechanisms of our prototypical run-time framework. Finally, we provide a performance analysis, which includes...

  20. A Novel General Chemistry Laboratory: Creation of Biomimetic Superhydrophobic Surfaces through Replica Molding

    Science.gov (United States)

    Verbanic, Samuel; Brady, Owen; Sanda, Ahmed; Gustafson, Carolina; Donhauser, Zachary J.

    2014-01-01

    Biomimetic replicas of superhydrophobic lotus and taro leaf surfaces can be made using polydimethylsiloxane. These replicas faithfully reproduce the microstructures of the leaves' surface and can be analyzed using contact angle goniometry, self-cleaning experiments, and optical microscopy. These simple and adaptable experiments were used to…

  1. Fault Management Architectures and the Challenges of Providing Software Assurance

    Science.gov (United States)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    The satellite system's Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most is system complexity due to a need to establish a multi-dimensional structure across hardware, software and operations. This structure is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. These architecture, implementation and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed these issues in terms of V&V for a representative set of architectures. NASA's IV&V is funded by NASA's Software Assurance Research Program (SARP) in partnership with NASA's Jet Propulsion Laboratory (JPL) to extend the work performed at the Workshop session. NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set for robustness, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This work focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures, visibility, and associated V&V/IV&V techniques provides a data set that can enable higher assurance that a satellite system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook, providing dissemination across NASA, other agencies and the satellite community. 
This paper discusses the approach taken to perform the evaluations and preliminary findings from the

  2. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images

    International Nuclear Information System (INIS)

    Laak, Jeroen A.W.M. van der; Dijkman, Henry B.P.M.; Pahlplatz, Martin M.M.

    2006-01-01

    The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000x to 200,000x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy
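The core idea, locating the dominant line spacing of a replica as the peak of the Fourier power spectrum, can be sketched on a 1-D intensity profile as follows. This is an illustrative reconstruction of the principle, not the authors' procedure.

```python
import numpy as np

def dominant_period(profile):
    """Return the dominant spatial period (in pixels) of a 1-D intensity
    profile, taken from the peak of its Fourier power spectrum.
    Illustrative of power-spectrum-based replica calibration.
    """
    x = np.asarray(profile, dtype=float)
    x = x - x.mean()                       # suppress the DC component
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size)        # cycles per pixel
    k = np.argmax(power[1:]) + 1           # skip the zero-frequency bin
    return 1.0 / freqs[k]

# synthetic line-replica profile: 1024 pixels, lines every 64 pixels
n, period = 1024, 64
profile = 1.0 + np.cos(2 * np.pi * np.arange(n) / period)
measured = dominant_period(profile)
```

Given the measured spacing in pixels, the physical pixel size, and the replica's known line spacing, the magnification follows as (spacing in pixels × pixel size) / known spacing.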

  3. Kinetics from Replica Exchange Molecular Dynamics Simulations.

    Science.gov (United States)

    Stelzl, Lukas S; Hummer, Gerhard

    2017-08-08

    Transitions between metastable states govern many fundamental processes in physics, chemistry and biology, from nucleation events in phase transitions to the folding of proteins. The free energy surfaces underlying these processes can be obtained from simulations using enhanced sampling methods. However, their altered dynamics makes kinetic and mechanistic information difficult or impossible to extract. Here, we show that, with replica exchange molecular dynamics (REMD), one can not only sample equilibrium properties but also extract kinetic information. For systems that strictly obey first-order kinetics, the procedure to extract rates is rigorous. For actual molecular systems whose long-time dynamics are captured by kinetic rate models, accurate rate coefficients can be determined from the statistics of the transitions between the metastable states at each replica temperature. We demonstrate the practical applicability of the procedure by constructing master equation (Markov state) models of peptide and RNA folding from REMD simulations.
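The kind of statistics involved, turning counted transitions between metastable states into first-order rate coefficients, can be sketched as follows. This is a simple illustrative estimator, not the authors' master-equation (Markov state) construction from REMD data.

```python
import numpy as np

def rate_matrix_from_counts(counts, lag_time):
    """Estimate a first-order rate matrix from transition counts between
    metastable states observed at a fixed lag time (illustrative sketch).

    Off-diagonal: k[i, j] ~ (transitions i -> j) / (time observed in i).
    Diagonal: set so each row sums to zero, as required of a rate matrix.
    """
    C = np.asarray(counts, dtype=float)
    dwell = C.sum(axis=1) * lag_time       # total observation time per state
    K = C / dwell[:, None]
    np.fill_diagonal(K, 0.0)
    np.fill_diagonal(K, -K.sum(axis=1))    # rows of a rate matrix sum to zero
    return K

# two-state example: transition counts observed at a 1 ps lag
counts = np.array([[90, 10],
                   [20, 80]])
K = rate_matrix_from_counts(counts, lag_time=1.0)
```

For the example counts this yields forward and backward rate coefficients of 0.1/ps and 0.2/ps, with diagonal entries balancing each row.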

  4. Synthesis and properties of ZnFe2O4 replica with biological hierarchical structure

    International Nuclear Information System (INIS)

    Liu, Hongyan; Guo, Yiping; Zhang, Yangyang; Wu, Fen; Liu, Yun; Zhang, Di

    2013-01-01

    Highlights: • ZFO replica with hierarchical structure was synthesized from butterfly wings. • Biotemplate has a significant impact on the properties of ZFO material. • Our method opens up new avenues for the synthesis of spinel ferrites. -- Abstract: ZnFe2O4 replica with biological hierarchical structure was synthesized from Papilio paris by a sol–gel method followed by calcination. The crystallographic structure and morphology of the obtained samples were characterized by X-ray diffraction, field-emission scanning electron microscopy, and transmission electron microscopy. The results showed that the hierarchical structures were retained in the ZFO replica of spinel structure. The magnetic behavior of these novel products was measured by a vibrating sample magnetometer. A superparamagnetism-like behavior was observed due to nanostructure size effects. In addition, the ZFO replica with "quasi-honeycomb-like structure" showed a much higher specific capacitance of 279.4 F g−1 at 10 mV s−1 than ZFO powder (137.3 F g−1), owing to the significantly increased surface area. These results demonstrate that the ZFO replica is a promising candidate for novel magnetic devices and supercapacitors.

  5. A distance-aware replica adaptive data gathering protocol for Delay Tolerant Mobile Sensor Networks.

    Science.gov (United States)

    Feng, Yong; Gong, Haigang; Fan, Mingyu; Liu, Ming; Wang, Xiaomin

    2011-01-01

    In Delay Tolerant Mobile Sensor Networks (DTMSNs), which have the inherent features of intermittent connectivity and frequently changing network topology, it is reasonable to utilize multi-replica schemes to improve data gathering performance. However, most existing multi-replica approaches inject a large number of message copies into the network to increase the probability of message delivery, which may drain each mobile node's limited battery supply faster and cause excessive contention for the restricted resources of the DTMSN; a proper data gathering scheme therefore needs to trade off the number of replica messages against network performance. In this paper, we propose a new data gathering protocol called DRADG (Distance-aware Replica Adaptive Data Gathering protocol), which economizes network resource consumption by using a self-adapting algorithm to cut down the number of redundant message replicas, and achieves good network performance by leveraging the delivery probabilities of the mobile sensors as the main routing metrics. Simulation results have shown that the proposed DRADG protocol achieves comparable or higher message delivery ratios at much lower transmission overhead than several current DTMSN data gathering schemes.
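
The self-adapting replica idea can be sketched as a simple policy: carriers with a high delivery probability inject fewer redundant copies. The function below is a hypothetical illustration of that trade-off, not the DRADG algorithm itself (all parameter values are invented):

```python
def replica_quota(delivery_prob, max_replicas=8, min_replicas=1):
    """Hypothetical self-adapting replica count: the higher the carrier's
    delivery probability, the fewer redundant copies are injected."""
    quota = round(max_replicas * (1.0 - delivery_prob))
    return max(min_replicas, min(max_replicas, quota))

# A reliable carrier (p=0.9) gets 1 copy; an unreliable one (p=0.1) gets 7.
print(replica_quota(0.9), replica_quota(0.1))   # 1 7
```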

  6. Architecture for Integrated System Health Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Managing the health of vehicle, crew, and habitat systems is a primary function of flight controllers today. We propose to develop an architecture for automating...

  7. Tomosynthesis-detected Architectural Distortion: Management Algorithm with Radiologic-Pathologic Correlation.

    Science.gov (United States)

    Durand, Melissa A; Wang, Steven; Hooley, Regina J; Raghu, Madhavi; Philpotts, Liane E

    2016-01-01

    As use of digital breast tomosynthesis becomes increasingly widespread, new management challenges are inevitable because tomosynthesis may reveal suspicious lesions not visible at conventional two-dimensional (2D) full-field digital mammography. Architectural distortion is a mammographic finding associated with a high positive predictive value for malignancy. It is detected more frequently at tomosynthesis than at 2D digital mammography and may even be occult at conventional 2D imaging. Few studies have focused on tomosynthesis-detected architectural distortions to date, and optimal management of these distortions has yet to be well defined. Since implementing tomosynthesis at our institution in 2011, we have learned some practical ways to assess architectural distortion. Because distortions may be subtle, tomosynthesis localization tools plus improved visualization of adjacent landmarks are crucial elements in guiding mammographic identification of elusive distortions. These same tools can guide more focused ultrasonography (US) of the breast, which facilitates detection and permits US-guided tissue sampling. Some distortions may be sonographically occult, in which case magnetic resonance imaging may be a reasonable option, both to increase diagnostic confidence and to provide a means for image-guided biopsy. As an alternative, tomosynthesis-guided biopsy, conventional stereotactic biopsy (when possible), or tomosynthesis-guided needle localization may be used to achieve tissue diagnosis. Practical uses for tomosynthesis in evaluation of architectural distortion are highlighted, potential complications are identified, and a working algorithm for management of tomosynthesis-detected architectural distortion is proposed. (©)RSNA, 2016.

  8. Information management architecture for an integrated computing environment for the Environmental Restoration Program. Environmental Restoration Program, Volume 3, Interim technical architecture

    International Nuclear Information System (INIS)

    1994-09-01

    This third volume of the Information Management Architecture for an Integrated Computing Environment for the Environmental Restoration Program--the Interim Technical Architecture (TA) (referred to throughout the remainder of this document as the ER TA)--represents a key milestone in establishing a coordinated information management environment in which information initiatives can be pursued with the confidence that redundancy and inconsistencies will be held to a minimum. This architecture is intended to be used as a reference by anyone whose responsibilities include the acquisition or development of information technology for use by the ER Program. The interim ER TA provides technical guidance at three levels. At the highest level, the technical architecture provides an overall computing philosophy or direction. At this level, the guidance does not address specific technologies or products but addresses more general concepts, such as the use of open systems, modular architectures, graphical user interfaces, and architecture-based development. At the next level, the technical architecture provides specific information technology recommendations regarding a wide variety of specific technologies. These technologies include computing hardware, operating systems, communications software, database management software, application development software, and personal productivity software, among others. These recommendations range from the adoption of specific industry or Martin Marietta Energy Systems, Inc. (Energy Systems) standards to the specification of individual products. At the third level, the architecture provides guidance regarding implementation strategies for the recommended technologies that can be applied to individual projects and to the ER Program as a whole

  9. A Layered Software Architecture for the Management of a Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Domenico CONSOLI

    2011-01-01

    Full Text Available In this paper we describe a layered software architecture for the management of a manufacturing company that intensively uses computer technology. Application tools, both new and legacy (after updating), operate in the context of an open, web-oriented architecture. The software architecture enables integration and interoperability among all tools that support business processes. Manufacturing Execution System and Text Mining tools are excellent interfaces, the former for internal production and management processes and the latter for external processes coming from the market. In this way, it is possible to implement a computer-integrated factory, flexible and agile, that immediately responds to customer requirements.

  10. Coulomb replica-exchange method: handling electrostatic attractive and repulsive forces for biomolecules.

    Science.gov (United States)

    Itoh, Satoru G; Okumura, Hisashi

    2013-03-30

    We propose a new type of Hamiltonian replica-exchange method (REM) for molecular dynamics (MD) and Monte Carlo simulations, which we refer to as the Coulomb REM (CREM). In this method, electrostatic charge parameters in the Coulomb interactions are exchanged among replicas, whereas temperatures are exchanged in the usual REM. By varying the atom charges, the CREM overcomes free-energy barriers and realizes more efficient sampling in conformational space than the REM. Furthermore, this method requires fewer replicas because only the atom charges of solute molecules are used as exchange parameters. We performed Coulomb replica-exchange MD simulations of an alanine dipeptide in explicit water solvent and compared the results with those of the conventional canonical, replica-exchange, and van der Waals REMs. Two force fields, AMBER parm99 and AMBER parm99SB, were used. As a result, the CREM sampled all local-minimum free-energy states more frequently than the other methods for both force fields. Moreover, the Coulomb, van der Waals, and usual REMs were applied to a fragment of an amyloid-β peptide (Aβ) in explicit water solvent to compare the sampling efficiency of these methods for a larger system. The CREM sampled structures of the Aβ fragment more efficiently than the other methods. We obtained β-helix, α-helix, 3(10)-helix, β-hairpin, and β-sheet structures as stable structures and deduced pathways of conformational transitions among these structures from a free-energy landscape. Copyright © 2012 Wiley Periodicals, Inc.
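
The exchange step underlying CREM, like other Hamiltonian replica-exchange variants, is a Metropolis test on cross-evaluated energies. A generic sketch of that test (not the authors' implementation; all values invented):

```python
import math
import random

def accept_swap(beta, E_ii, E_jj, E_ij, E_ji, rng=random):
    """Metropolis criterion for a Hamiltonian replica exchange: E_ij is
    the energy of replica j's configuration evaluated with replica i's
    parameter set (for CREM, replicas differ in their atomic charges)."""
    delta = beta * ((E_ij + E_ji) - (E_ii + E_jj))
    return delta <= 0 or rng.random() < math.exp(-delta)

# A swap that lowers the combined energy is always accepted:
print(accept_swap(1.0, E_ii=-10.0, E_jj=-8.0, E_ij=-12.0, E_ji=-9.0))  # True
```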

  11. A System Architecture for Autonomous Demand Side Load Management in Smart Buildings

    DEFF Research Database (Denmark)

    Costanzo, Giuseppe Tommaso; Zhu, Guchuan; Anjos, Miguel F.

    2012-01-01

    This paper presents a system architecture for load management in smart buildings which enables autonomous demand side load management in the smart grid. Being of a layered structure composed of three main modules for admission control, load balancing, and demand response management...... in multiple time-scales and allows seamless integration of diverse techniques for online operation control, optimal scheduling, and dynamic pricing. The design of a home energy manager based on this architecture is illustrated and the simulation results with Matlab/Simulink confirm the viability...

  12. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in cloud-based video surveillance systems, replicas occupy a large amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behaviors of the users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
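
A dynamic redundant-replica policy of the kind described can be sketched as a mapping from security level to copy count; the function and every parameter below are hypothetical illustrations, not the paper's mechanism:

```python
def replica_count(security_level, base=2, cap=6):
    """Hypothetical dynamic redundant-replica policy: footage with a
    higher security level keeps more copies, up to a storage cap."""
    return min(cap, base + security_level)

# Level-0 footage keeps 2 copies; level-3 footage keeps 5.
print(replica_count(0), replica_count(3))   # 2 5
```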

  13. Generalized Information Architecture for Managing Requirements in IBM's Rational DOORS® Application.

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner; Shannon, Sharon A.

    2014-12-01

    When a requirements engineering effort fails to meet expectations, oftentimes the requirements management tool is blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool. Version 1.0 4 November 201

  14. Replica analysis of partition-function zeros in spin-glass models

    International Nuclear Information System (INIS)

    Takahashi, Kazutaka

    2011-01-01

    We study the partition-function zeros in mean-field spin-glass models. We show that the replica method is useful to find the locations of zeros in a complex parameter plane. For the random energy model, we obtain the phase diagram in the plane and find that there are two types of distributions of zeros: two-dimensional distribution within a phase and one-dimensional one on a phase boundary. Phases with a two-dimensional distribution are characterized by a novel order parameter defined in the present replica analysis. We also discuss possible patterns of distributions by studying several systems.

  15. The added value of the replica simulators in the exploitation of nuclear power plants

    International Nuclear Information System (INIS)

    Diaz Giron, P. a.; Ortega, F.; Rivero, N.

    2011-01-01

    In the past, nuclear power plant full-scope replica simulators were designed solely according to operational personnel training criteria. Nevertheless, these simulators not only feature a faithful replica of the control room but also provide an accurate process response. Control room replica simulators are presently based on complex technological platforms permitting the highest physical and functional fidelity, allowing them to be used as versatile, value-added tools in diverse plant operation and maintenance activities. In recent years, Tecnatom has extended the use of such simulators to different engineering applications. This article identifies uses of the simulators in training and in other applications beyond training. (Author)

  16. A Formally Verified Decentralized Key Management Architecture for Wireless Sensor Networks

    NARCIS (Netherlands)

    Law, Y.W.; Corin, R.J.; Etalle, Sandro; Hartel, Pieter H.

    We present a decentralized key management architecture for wireless sensor networks, covering the aspects of key deployment, key refreshment and key establishment. Our architecture is based on a clear set of assumptions and guidelines. Balance between security and energy consumption is achieved by

  17. Parallel replica dynamics method for bistable stochastic reaction networks: Simulation and sensitivity analysis

    Science.gov (United States)

    Wang, Ting; Plecháč, Petr

    2017-12-01

    Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
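
The speedup mechanism rests on a property of exponential escape times: the minimum of n independent Exp(k) times is distributed as Exp(n·k), so running n replicas in parallel and rescaling the first escape time leaves the statistics unbiased. A sketch for a single metastable state (the rate and replica counts are invented for illustration):

```python
import random

def parallel_replica_escape(rate, n_replicas, rng):
    """One parallel-replica step (sketch): each replica independently
    attempts to escape a metastable state with exponential rate `rate`.
    The minimum of n i.i.d. Exp(rate) times is Exp(n*rate), so
    t_sim = n_replicas * t_first is an unbiased escape-time sample."""
    t_first = min(rng.expovariate(rate) for _ in range(n_replicas))
    return n_replicas * t_first

rng = random.Random(42)
samples = [parallel_replica_escape(0.5, 8, rng) for _ in range(20000)]
mean_escape = sum(samples) / len(samples)
# mean_escape should approach 1/rate = 2.0
```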

  19. Multi-agent based distributed control architecture for microgrid energy management and optimization

    International Nuclear Information System (INIS)

    Basir Khan, M. Reyasudin; Jidin, Razali; Pasupuleti, Jagadeesh

    2016-01-01

    Highlights: • A new multi-agent based distributed control architecture for energy management. • Multi-agent coordination based on non-cooperative game theory. • A microgrid model comprised of renewable energy generation systems. • Performance comparison of distributed with conventional centralized control. - Abstract: Most energy management systems are based on a centralized controller, which makes it difficult to satisfy criteria such as fault tolerance and adaptability. Therefore, a new multi-agent based distributed energy management system architecture is proposed in this paper. The distributed generation system is composed of several distributed energy resources and a group of loads. A multi-agent based decentralized control architecture was developed in order to provide control for the complex energy management of the distributed generation system. Then, non-cooperative game theory was used for multi-agent coordination in the system. The distributed generation system was assessed by simulation under renewable resource fluctuations, seasonal load demand, and grid disturbances. The simulation results show that the new energy management system provides more robust and higher-performance control than conventional centralized energy management systems.
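
Non-cooperative coordination of agents can be illustrated with a toy best-response iteration, in which each load-scheduling agent repeatedly optimizes against the others' latest decisions until a Nash equilibrium is reached. The quadratic cost model below is invented for illustration and is not the paper's microgrid model:

```python
def nash_schedule(d1, d2, iters=50):
    """Best-response iteration for a toy two-agent load-scheduling game:
    agent i minimizes price*own_load + (own_load - desired_i)^2, with
    price proportional to total load.  Setting the derivative
    2*x_i + x_j + 2*(x_i - d_i) = 0 gives the best response
    x_i = (2*d_i - x_j) / 4; alternating updates converge to Nash."""
    x1 = x2 = 0.0
    for _ in range(iters):
        x1 = (2 * d1 - x2) / 4   # agent 1's best response to x2
        x2 = (2 * d2 - x1) / 4   # agent 2's best response to x1
    return x1, x2

x1, x2 = nash_schedule(5.0, 5.0)
# symmetric equilibrium: x = 2*d/5 = 2.0 for both agents
```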

  20. Architecture Framework for Fault Management Assessment and Design (AFFMAD), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Architecture Framework for Fault Management Assessment And Design(AFFMAD) provides Fault Management (FM) trade space exploration and rigorous performance constraint...

  1. An ODMG-compatible testbed architecture for scalable management and analysis of physics data

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.

    1997-01-01

    This paper describes a testbed architecture for the investigation and development of scalable approaches to the management and analysis of massive amounts of high energy physics data. The architecture has two components: an interface layer that is compliant with a substantial subset of the ODMG-93 Version 1.2 specification, and a lightweight object persistence manager that provides flexible storage and retrieval services on a variety of single- and multi-level storage architectures, and on a range of parallel and distributed computing platforms

  2. Zeolite-templated carbon replica: a Grand Canonical Monte-Carlo simulation study

    International Nuclear Information System (INIS)

    Thomas Roussel; Roland J M Pellenq; Christophe Bichara; Roger Gadiou; Antoine Didion; Cathie Vix Guterl; Fabrice Gaslain; Julien Parmentier; Valentin Valtchev; Joel Patarin

    2005-01-01

    Microporous carbon materials are interesting for several applications such as hydrogen storage, catalysis, or electrical double-layer capacitors. The development of the negative templating method to obtain carbon replicas from ordered templates has led to the synthesis of several new materials with interesting textural properties, attractive for energy storage. Among the possible templates, zeolites can be used to obtain highly microporous carbon materials. Nevertheless, the phenomena involved in the replica synthesis are not fully understood, and the relationships between the structure of the template, the carbon precursor, and the resulting carbon material need to be investigated. Experimental results for carbon zeolite-templated nano-structures can be found in a series of papers; see for instance ref. [1], in which Wang et al. describe a route to ultra-small Single Wall Carbon Nano-tubes (SWNTs) using the porosity of zeolite AlPO4-5. After matrix removal, the resulting structure is a free-standing bundle of 4 Angstrom nano-tubes. However, it is highly desirable to obtain an ordered porous carbon structure that forms a real 3D network, to be used for instance in gas storage applications. Carbon replicas of faujasite and EMT zeolites can have these properties, since these zeolites have a 3D porous network made of 10 Angstrom cages connected to each other through 7 Angstrom windows. The first step of this study was to generate a theoretical carbon replica structure of various zeolites (faujasite, EMT, AlPO4-5, silicalite). For this purpose, we used the Grand Canonical Monte-Carlo (GCMC) technique, in which the carbon-carbon interactions were described within the frame of a newly developed Tight Binding approach and the carbon-zeolite interactions were assumed to be characteristic of physisorption. The intrinsic stability of the subsequent carbon nano-structures was then investigated after mimicking the removal of the inorganic phase by switching
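
The GCMC machinery referred to here rests on insertion/deletion acceptance rules. As a hedged illustration, the textbook insertion probability (with a generic energy model, not the paper's Tight Binding one) reads:

```python
import math

def gcmc_insertion_prob(dE, N, V, mu, beta, lam3):
    """Textbook Grand Canonical Monte-Carlo acceptance probability for
    inserting particle N+1 into a box of volume V:
        min(1, V / (lam3*(N+1)) * exp(beta*(mu - dE)))
    where lam3 is the cubed thermal de Broglie wavelength, mu the
    chemical potential, and dE the insertion energy change."""
    return min(1.0, V / (lam3 * (N + 1)) * math.exp(beta * (mu - dE)))

# An energetically favourable insertion into an empty box is certain:
print(gcmc_insertion_prob(dE=-1.0, N=0, V=1000.0, mu=0.0, beta=1.0, lam3=1.0))  # 1.0
```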

  3. Utility of replica techniques for x-ray microanalysis of second phase particles

    International Nuclear Information System (INIS)

    Bentley, J.

    1984-01-01

    X-ray microanalysis of second phase particles in ion-milled or electropolished thin foils is often complicated by the presence of the matrix nearby. Extraction replica techniques provide a means to avoid many of the complications of thin-foil analyses. In this paper, three examples of the analysis of second phase particles are described and illustrate the improvement obtained by the use of extraction replicas for qualitative analysis, quantitative analysis, and analysis of radioactive specimens

  4. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  5. Design of management information system for nuclear industry architectural project costs

    International Nuclear Information System (INIS)

    Zhang Xingzhi; Li Wei

    1996-01-01

    Management Information System (MIS) for nuclear industry architectural projects is analysed and designed in detail, based on the quota management and engineering budget management of the nuclear industry, with respect to the practice of the Qinshan Second Phase 2 x 600 MW Project.

  6. Virtual Replication of IoT Hubs in the Cloud: A Flexible Approach to Smart Object Management

    Directory of Open Access Journals (Sweden)

    Simone Cirani

    2018-03-01

    Full Text Available In future years, the Internet of Things is expected to interconnect billions of highly heterogeneous devices, denoted as "smart objects", enabling the development of innovative distributed applications. Smart objects are sensor/actuator-equipped devices constrained in terms of computational power and available memory. In order to cope with the diverse physical connectivity technologies of smart objects, the Internet Protocol is foreseen as the common "language" for full interoperability and as a unifying factor for integration with the Internet. Large-scale platforms for interconnected devices are required to effectively manage resources provided by smart objects. In this work, we present a novel architecture for the management of large numbers of resources in a scalable, seamless, and secure way. The proposed architecture is based on a network element, denoted as IoT Hub, placed at the border of the constrained network, which implements the following functions: service discovery; border router; HTTP/Constrained Application Protocol (CoAP) and CoAP/CoAP proxy; cache; and resource directory. In order to protect smart objects (which cannot, because of their constrained nature, serve a large number of concurrent requests) and the IoT Hub (which serves as a gateway to the constrained network), we introduce the concept of a virtual IoT Hub replica: a Cloud-based "entity" replicating all the functions of a physical IoT Hub, which external clients query to access resources. IoT Hub replicas are constantly synchronized with the physical IoT Hub through a low-overhead protocol based on Message Queue Telemetry Transport (MQTT). An experimental evaluation, proving the feasibility and advantages of the proposed architecture, is presented.
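
The replica-synchronization pattern can be sketched without any real broker: the physical hub publishes resource updates on a topic, and the Cloud-side replica mirrors them so external clients never touch the constrained network. Below is a minimal in-process stand-in (class names, topics, and resource paths are hypothetical; a real deployment would use an MQTT client library against an actual broker):

```python
class Broker:
    """Tiny in-process stand-in for a publish/subscribe broker
    (topic -> list of subscriber callbacks)."""
    def __init__(self):
        self.subs = {}

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self.subs.get(topic, []):
            cb(topic, payload)

class HubReplica:
    """Cloud-side virtual IoT Hub: mirrors the physical hub's resources
    so clients query the replica instead of the constrained network."""
    def __init__(self, broker, hub_id):
        self.resources = {}
        broker.subscribe(f"hub/{hub_id}/update", self.on_update)

    def on_update(self, topic, payload):
        self.resources.update(payload)   # keep the mirror in sync

broker = Broker()
replica = HubReplica(broker, "hub-1")
# The physical hub pushes a sensor reading; the replica stays in sync.
broker.publish("hub/hub-1/update", {"/sensors/temp": 21.5})
print(replica.resources["/sensors/temp"])   # 21.5
```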

  7. Acceleration of Lateral Equilibration in Mixed Lipid Bilayers Using Replica Exchange with Solute Tempering.

    Science.gov (United States)

    Huang, Kun; García, Angel E

    2014-10-14

    The lateral heterogeneity of cellular membranes plays an important role in many biological functions such as signaling and regulating membrane proteins. This heterogeneity can result from preferential interactions between membrane components or interactions with membrane proteins. One major difficulty in molecular dynamics simulations aimed at studying the membrane heterogeneity is that lipids diffuse slowly and collectively in bilayers, and therefore, it is difficult to reach equilibrium in lateral organization in bilayer mixtures. Here, we propose the use of the replica exchange with solute tempering (REST) approach to accelerate lateral relaxation in heterogeneous bilayers. REST is based on the replica exchange method but tempers only the solute, leaving the temperature of the solvent fixed. Since the number of replicas in REST scales approximately only with the degrees of freedom in the solute, REST enables us to enhance the configuration sampling of lipid bilayers with fewer replicas, in comparison with the temperature replica exchange molecular dynamics simulation (T-REMD), where the number of replicas scales with the degrees of freedom of the entire system. We apply the REST method to a cholesterol and 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) bilayer mixture and find that the lateral distribution functions of all molecular pair types converge much faster than in the standard MD simulation. The relative diffusion rate between molecules in REST is, on average, an order of magnitude faster than in the standard MD simulation. Although REST was initially proposed to study protein folding and its efficiency in protein folding is still under debate, we find a unique application of REST to accelerate lateral equilibration in mixed lipid membranes and suggest a promising way to probe membrane lateral heterogeneity through molecular dynamics simulation.
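
Why REST needs fewer replicas follows from a standard scaling argument: energy fluctuations grow as the square root of the number of tempered degrees of freedom, so a temperature ladder needs on the order of sqrt(f)·ln(T_high/T_low) rungs. A back-of-the-envelope estimate (the prefactor and the degree-of-freedom counts below are hypothetical, chosen only to show the scaling):

```python
import math

def replicas_needed(n_dof, t_low, t_high, c=0.2):
    """Rough replica-ladder size for replica exchange: neighbouring
    temperatures must overlap in energy, giving a geometric ladder with
    relative spacing ~ 1/sqrt(n_dof) and hence
    n ~ c * sqrt(n_dof) * ln(T_high/T_low) rungs (c is a hypothetical
    tuning constant, not a value from the paper)."""
    return math.ceil(c * math.sqrt(n_dof) * math.log(t_high / t_low)) + 1

# Tempering the whole solvated system vs. only the solute (REST):
full = replicas_needed(30000, 300.0, 450.0)   # all degrees of freedom
rest = replicas_needed(600, 300.0, 450.0)     # solute only
print(full, rest)   # 16 3
```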

  8. An architecture model for multiple disease management information systems.

    Science.gov (United States)

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support their daily practice and to optimize the inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. Although applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances reusability and saves time in information system development. The proposed architecture model was successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow had been improved. The overall user response is positive. The supportiveness during daily workflow is high. The system empowers the case managers with better information and leads to better decision making.

  9. Surgery planning and navigation by laser lithography plastic replica. Features, clinical applications, and advantages

    International Nuclear Information System (INIS)

    Kihara, Tomohiko; Tanaka, Yuuko; Furuhata, Kentaro

    1995-01-01

The use of three-dimensional replicas created by laser lithography has recently become popular for surgical planning and intraoperative navigation in plastic surgery and oral maxillofacial surgery. In this study, we investigated the many clinical applications in which we have been involved regarding the production of three-dimensional replicas. We have also analyzed the features, application classes, and advantages of this method. As a result, clinical applications are categorized into three classes: 'three-dimensional shape recognition', 'simulated surgery', and 'template'. The distinct features of three-dimensional replicas are 'direct recognition', 'fast manipulation', and 'free availability'. Meeting the requirements of surgical planning and intraoperative navigation, they have produced satisfactory results in clinical applications. (author)

  10. ARCHITECTURE SOFTWARE SOLUTION TO SUPPORT AND DOCUMENT MANAGEMENT QUALITY SYSTEM

    Directory of Open Access Journals (Sweden)

    Milan Eric

    2010-12-01

Full Text Available One of the bases of the JUS ISO 9000 series of standards is quality system documentation. The architecture of the quality system documentation depends on the complexity of the business system. Establishing efficient management of quality system documentation is of great importance for the business system, both in the phase of introducing the quality system and in further stages of its improvement. The study describes the architecture and capability of software solutions to support and manage quality system documentation in accordance with the requirements of the standards ISO 9001:2001, ISO 14001:2005, HACCP, etc.

  11. Synthesis and properties of ZnFe{sub 2}O{sub 4} replica with biological hierarchical structure

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongyan; Guo, Yiping, E-mail: ypguo@sjtu.edu.cn; Zhang, Yangyang; Wu, Fen; Liu, Yun; Zhang, Di, E-mail: zhangdi@sjtu.edu.cn

    2013-09-20

Highlights: • ZFO replica with hierarchical structure was synthesized from butterfly wings. • The biotemplate has a significant impact on the properties of the ZFO material. • Our method opens up new avenues for the synthesis of spinel ferrites. -- Abstract: ZnFe{sub 2}O{sub 4} (ZFO) replica with biological hierarchical structure was synthesized from Papilio paris by a sol–gel method followed by calcination. The crystallographic structure and morphology of the obtained samples were characterized by X-ray diffraction, field-emission scanning electron microscopy, and transmission electron microscopy. The results showed that the hierarchical structures were retained in the ZFO replica of spinel structure. The magnetic behavior of these novel products was measured by a vibrating sample magnetometer. A superparamagnetism-like behavior was observed due to nanostructure size effects. In addition, the ZFO replica with "quasi-honeycomb-like structure" showed a much higher specific capacitance of 279.4 F g{sup −1} at 10 mV s{sup −1} than ZFO powder (137.3 F g{sup −1}), attributed to the significantly increased surface area. These results demonstrate that the ZFO replica is a promising candidate for novel magnetic devices and supercapacitors.

  12. Replica treatment of the Calogero-Sutherland model

    International Nuclear Information System (INIS)

    Gangardt, Dimitry M.; Kamenev, Alex

    2001-01-01

Employing the Forrester-Ha method of Jack polynomials, we derive an integral identity connecting a certain N-fold coordinate average of the Calogero-Sutherland model with an n-fold replica integral. Subsequent analytic continuation in n leads to asymptotic expressions for the (static and dynamic) density-density correlation function of the model, as well as the Green's function, for an arbitrary coupling constant λ

  13. Intelligent web data management software architectures and emerging technologies

    CERN Document Server

    Ma, Kun; Yang, Bo; Sun, Runyuan

    2016-01-01

This book presents some of the emerging techniques and technologies used to handle Web data management. The authors present novel software architectures and emerging technologies and validate them using experimental data and real-world applications. The contents of this book are focused on four popular thematic categories of intelligent Web data management: cloud computing, social networking, monitoring, and literature management. The volume will be a valuable reference for researchers, students and practitioners in the fields of Web data management, cloud computing, and social networks using advanced intelligence tools.

  14. Architectural Decision Management for Digital Transformation of Products and Services

    Directory of Open Access Journals (Sweden)

    Alfred Zimmermann

    2016-04-01

Full Text Available The digitization of our society changes the way we live, work, learn, communicate, and collaborate. The Internet of Things, Enterprise Social Networks, Adaptive Case Management, Mobility systems, Analytics for Big Data, and Cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and service-oriented enterprise architectures. Our aim is to support flexibility and agile transformations for both business domains and related information technology. The present research paper investigates mechanisms for decision analytics in the context of multi-perspective explorations of enterprise services and their digital enterprise architectures by extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and collaborative architectural decision support. The paper's context focuses on digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on the example domain: the Internet of Things.

  15. Gamma-ray dosimetry measurements of the Little Boy replica

    International Nuclear Information System (INIS)

    Plassmann, E.A.; Pederson, R.A.

    1984-01-01

    We present the current status of our gamma-ray dosimetry results for the Little Boy replica. Both Geiger-Mueller and thermoluminescent detectors were used in the measurements. Future work is needed to test assumptions made in data analysis

  16. Reusable Rocket Engine Advanced Health Management System. Architecture and Technology Evaluation: Summary

    Science.gov (United States)

    Pettit, C. D.; Barkhoudarian, S.; Daumann, A. G., Jr.; Provan, G. M.; ElFattah, Y. M.; Glover, D. E.

    1999-01-01

    In this study, we proposed an Advanced Health Management System (AHMS) functional architecture and conducted a technology assessment for liquid propellant rocket engine lifecycle health management. The purpose of the AHMS is to improve reusable rocket engine safety and to reduce between-flight maintenance. During the study, past and current reusable rocket engine health management-related projects were reviewed, data structures and health management processes of current rocket engine programs were assessed, and in-depth interviews with rocket engine lifecycle and system experts were conducted. A generic AHMS functional architecture, with primary focus on real-time health monitoring, was developed. Fourteen categories of technology tasks and development needs for implementation of the AHMS were identified, based on the functional architecture and our assessment of current rocket engine programs. Five key technology areas were recommended for immediate development, which (1) would provide immediate benefits to current engine programs, and (2) could be implemented with minimal impact on the current Space Shuttle Main Engine (SSME) and Reusable Launch Vehicle (RLV) engine controllers.

  17. Conceptual Architecture of Building Energy Management Open Source Software (BEMOSS)

    Energy Technology Data Exchange (ETDEWEB)

    Khamphanchai, Warodom; Saha, Avijit; Rathinavel, Kruthika; Kuzlu, Murat; Pipattanasomporn, Manisa; Rahman, Saifur; Akyol, Bora A.; Haack, Jereme N.

    2014-12-01

    The objective of this paper is to present a conceptual architecture of a Building Energy Management Open Source Software (BEMOSS) platform. The proposed BEMOSS platform is expected to improve sensing and control of equipment in small- and medium-sized buildings, reduce energy consumption and help implement demand response (DR). It aims to offer: scalability, robustness, plug and play, open protocol, interoperability, cost-effectiveness, as well as local and remote monitoring. In this paper, four essential layers of BEMOSS software architecture -- namely User Interface, Application and Data Management, Operating System and Framework, and Connectivity layers -- are presented. A laboratory test bed to demonstrate the functionality of BEMOSS located at the Advanced Research Institute of Virginia Tech is also briefly described.

  18. Critiques, replicas and proposals for the New Urbanism Vision

    Directory of Open Access Journals (Sweden)

    Alaide Retana

    2014-03-01

Full Text Available The new urbanism (NU) is a vision of planning and urban design that emerged in 1993 and finds its basis in the design of traditional communities. This trend has drawn various criticisms and replicas, which were reviewed in relation to urban sprawl, transportation, re-densification, mix of land uses, design, gentrification, pedestrianization and safety, and analyzed in the neighborhood of Santa Barbara in Toluca, Mexico. This area was chosen for being traditional and forming part of the historical center of the city; even though it was not designed under the guidelines of the NU, it has the quality of being traditional, from which the NU would theoretically have taken its essence. The objective of this analysis is to establish whether the NU has the essence of a traditional Mexican neighborhood, as well as to check whether the criticisms of the NU are borne out when applied to a space belonging to a Mexican historic center that has been abandoned because of insecurity and degradation. The general conclusion is that traditional neighborhoods have provided design elements to the NU, which refutes some of the criticisms; however, proposals for the NU in neighborhoods of historic centers have to be based on the community, the architecture and the existing urbanism, since these are the elements that give the identity.

  19. Evaluation of creep damage development by the replica method; Utvaerdering av krypskadeutveckling med replikmetoden

    Energy Technology Data Exchange (ETDEWEB)

    Storesund, Jan [Det Norske Veritas AB, Stockholm (Sweden); Roennholm, Markku [Fortum (Sweden)

    2002-04-01

Creep damage development in high-temperature components can be monitored by the replica method. Damage is classified, and an experience-based time period for safe operation, after which a re-inspection should be conducted, is recommended. The original recommendations are still commonly used, but more recently developed ones are mostly less conservative. In the present project, a database of more than 6000 replicas, collected from welded components in Swedish and Finnish power plants, has been evaluated with respect to damage development. The results are in general in good agreement with the existing developed recommendations for re-inspections. Important factors that should be considered when using the recommendations are highlighted: service history; material, welding and heat treatment; measurement of pressure and temperature; system stresses; geometrical stress concentrations and stress distributions; design of components and welds; creep crack growth; starts and stops; extent and performance of the replica method. These factors have been analysed with respect to the evaluated data, resulting in comments on the existing recommendations. In addition, recommendations and conditions for high reliability of the replica method are described. The comments and recommendations can be read in separate sections at the end of the report.

  20. Comparison of pulsed versus continuous oxygen delivery using realistic adult nasal airway replicas

    Directory of Open Access Journals (Sweden)

    Chen JZ

    2017-08-01

Full Text Available John Z Chen,1 Ira M Katz,2 Marine Pichelin,2 Kaixian Zhu,3 Georges Caillibotte,2 Michelle L Noga,4 Warren H Finlay,1 Andrew R Martin1 1Department of Mechanical Engineering, University of Alberta, Edmonton, AB, Canada; 2Medical R&D, Air Liquide Santé International, Centre de Recherche Paris-Saclay, Les Loges-en-Josas, 3Centre Explor!, Air Liquide Healthcare, Gentilly, France; 4Radiology and Diagnostic Imaging, University of Alberta, Edmonton, AB, Canada Background: Portable oxygen concentrators (POCs) typically include pulse flow (PF) modes to conserve oxygen. The primary aims of this study were to develop a predictive in vitro model for inhaled oxygen delivery using a set of realistic airway replicas, and to compare PF for a commercial POC with steady flow (SF) from a compressed oxygen cylinder. Methods: Experiments were carried out using a stationary compressed oxygen cylinder, a POC, and 15 adult nasal airway replicas based on airway geometries derived from medical images. Oxygen delivery via nasal cannula was tested at PF settings of 2.0 and 6.0, and SF rates of 2.0 and 6.0 L/min. A test lung simulated three breathing patterns representative of a chronic obstructive pulmonary disease patient at rest, during exercise, and while asleep. The volume-averaged fraction of inhaled oxygen (FiO2) was calculated by analyzing oxygen concentrations sampled at the exit of each replica and inhalation flow rates over time. POC pulse volumes were also measured using a commercial O2 conserver test system to attempt to predict FiO2 for PF. Results: Relative volume-averaged FiO2 using PF ranged from 68% to 94% of SF values, increasing with breathing frequency and tidal volume. Three of 15 replicas failed to trigger the POC when used with the sleep breathing pattern at the 2.0 setting, and four of 15 replicas failed to trigger at the 6.0 setting. FiO2 values estimated from POC pulse characteristics followed similar trends but were lower than those derived from
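The volume-averaged FiO2 described in the Methods is the flow-weighted mean of the oxygen fraction sampled at the replica exit over an inhalation. A minimal numerical sketch (the function name and the sample breath data below are illustrative, not taken from the study):

```python
def volume_averaged_fio2(times, o2_fraction, flow):
    """Volume-averaged inhaled O2 fraction over one inhalation:
    integral(c * Q dt) / integral(Q dt), trapezoidal integration."""
    num = 0.0
    den = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        c_mid = 0.5 * (o2_fraction[i] + o2_fraction[i - 1])
        q_mid = 0.5 * (flow[i] + flow[i - 1])
        num += c_mid * q_mid * dt
        den += q_mid * dt
    return num / den

# Illustrative samples over a 2 s inhalation
t = [0.0, 0.5, 1.0, 1.5, 2.0]
c = [0.40, 0.35, 0.28, 0.24, 0.22]   # O2 fraction at replica exit
q = [0.0, 0.4, 0.5, 0.4, 0.0]        # inhalation flow rate, L/s
print(round(volume_averaged_fio2(t, c, q), 3))  # → 0.292
```

The flow weighting is what lets a front-loaded pulse of oxygen early in the inhalation count more than gas arriving as flow tails off.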

  1. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages formal modeling and architecture brings to project management. These emerging technologies have both great potential and challenges for improving information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  2. Effect of cation nature of zeolite on carbon replicas and their electrochemical capacitance

    International Nuclear Information System (INIS)

    Zhou, Jin; Li, Wen; Zhang, Zhongshen; Wu, Xiaozhong; Xing, Wei; Zhuo, Shuping

    2013-01-01

Graphical abstract: The cation nature of the zeolite influences the porosity and surface chemical properties of carbon replicas of the zeolite, resulting in different electrochemical capacitance. Highlights: ► The porosity of the carbon replica strongly depends on the zeolite's effective pore size. ► The surface chemical properties are influenced by the cation nature of the zeolite. ► N-doping introduces a large pseudo-capacitance. ► The HYC800 carbon showed a high capacitance of up to 312 F g−1 in 1 M H2SO4. ► The prepared carbons show good galvanostatic cycling durability. -- Abstract: N-doped carbon replicas of zeolite Y are prepared, and the effect of the cation nature of the zeolite (H+ or Na+) on the carbon replicas is studied. The morphology, structure and surface properties of the carbon materials are investigated by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), N2 adsorption, X-ray photoelectron spectroscopy (XPS) and Fourier transform infrared spectroscopy (FT-IR). The pore regularity, pore parameters and surface chemical properties of the carbons strongly depend on the cation nature of the zeolite Y. The carbon replicas of zeolite HY (H-form of zeolite Y) possess higher pore regularity and much larger surface area than those of zeolite NaY (Na-form of zeolite Y), while the latter carbons seem to possess higher carbonization degrees. Electrochemical measurements show a large faradaic capacitance related to the N- and O-containing groups of the prepared carbons. Owing to the large specific surface area, high pore regularity and heteroatom doping, the HYC800 sample derived from zeolite HY presents a very high gravimetric capacitance, up to 312.4 F g−1 in H2SO4 electrolyte, and this carbon can operate at 1.2 V with a good retention ratio in the range of 0.25 to 10 A g−1
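Gravimetric capacitances quoted at a scan rate, such as 312.4 F g−1 at 10 mV s−1, are conventionally extracted from a cyclic voltammogram as C = ∫|I| dV / (2·m·ν·ΔV). A hedged sketch of that calculation; the data below describe an idealized rectangular CV, not the paper's measurements:

```python
def gravimetric_capacitance(voltages, currents, mass_g, scan_rate_v_s):
    """Specific capacitance (F/g) from one CV cycle:
    C = integral(|I| dV) / (2 * mass * scan_rate * voltage_window)."""
    charge_integral = 0.0
    for i in range(1, len(voltages)):
        dv = abs(voltages[i] - voltages[i - 1])
        i_mid = 0.5 * (abs(currents[i]) + abs(currents[i - 1]))
        charge_integral += i_mid * dv
    window = max(voltages) - min(voltages)
    return charge_integral / (2 * mass_g * scan_rate_v_s * window)

# Idealized rectangular CV: 0 -> 1 V -> 0 V at constant |I| = 10 mA,
# electrode mass 5 mg, scan rate 10 mV/s
v = [0.0, 0.5, 1.0, 0.5, 0.0]
i_a = [0.01, 0.01, 0.01, -0.01, -0.01]
print(round(gravimetric_capacitance(v, i_a, 0.005, 0.01), 6))  # → 200.0
```

For this ideal capacitor the result reduces to |I| / (m·ν), which is a quick sanity check on the integration.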

  3. Efficient Round-Trip Time Optimization for Replica-Exchange Enveloping Distribution Sampling (RE-EDS).

    Science.gov (United States)

    Sidler, Dominik; Cristòfol-Clough, Michael; Riniker, Sereina

    2017-06-13

Replica-exchange enveloping distribution sampling (RE-EDS) allows the efficient estimation of free-energy differences between multiple end-states from a single molecular dynamics (MD) simulation. In EDS, a reference state is sampled, which can be tuned by two types of parameters, i.e., smoothness parameters (s) and energy offsets, such that all end-states are sufficiently sampled. However, the choice of these parameters is not trivial. Replica exchange (RE), or parallel tempering, is a widely applied technique to enhance sampling. By combining EDS with the RE technique, the parameter-choice problem can be simplified and the challenge shifted toward an optimal distribution of the replicas in the smoothness-parameter space. The choice of a certain replica distribution can alter the sampling efficiency significantly. In this work, global round-trip time optimization (GRTO) algorithms are tested for use in RE-EDS simulations. In addition, a local round-trip time optimization (LRTO) algorithm is proposed for systems with slowly adapting environments, where a reliable estimate of the round-trip time is challenging to obtain. The optimization algorithms were applied to RE-EDS simulations of a system of nine small-molecule inhibitors of phenylethanolamine N-methyltransferase (PNMT). The energy offsets were determined using our recently proposed parallel energy-offset (PEOE) estimation scheme. While the multistate GRTO algorithm yielded the best replica distribution for the ligands in water, the multistate LRTO algorithm was found to be the method of choice for the ligands in complex with PNMT. With this, the 36 alchemical free-energy differences between the nine ligands were calculated successfully from a single RE-EDS simulation 10 ns in length. Thus, RE-EDS presents an efficient method for the estimation of relative binding free energies.
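The RE step underlying this scheme exchanges configurations between neighboring replicas with a Metropolis criterion; the round-trip time being optimized is how long a replica takes to diffuse across the parameter ladder and back. A generic sketch of the swap test only, written for a temperature ladder as in standard parallel tempering rather than the paper's EDS smoothness-parameter ladder:

```python
import math
import random

def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random.random):
    """Metropolis acceptance for exchanging configurations between two
    replicas at inverse temperatures beta_i, beta_j with instantaneous
    energies E_i, E_j: accept with probability min(1, exp(delta)),
    delta = (beta_i - beta_j) * (E_i - E_j)."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0 or rng() < math.exp(delta)
```

A swap that moves the higher-energy configuration to the hotter replica is always accepted; the reverse move is accepted with Boltzmann-weighted probability, which is what lets replicas random-walk up and down the ladder.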

  4. DOD Business Systems Modernization: Military Departments Need to Strengthen Management of Enterprise Architecture Programs

    National Research Council Canada - National Science Library

    Hite, Randolph C; Johnson, Tonia; Eagle, Timothy; Epps, Elena; Holland, Michael; Lakhmani, Neela; LaPaza, Rebecca; Le, Anh; Paintsil, Freda

    2008-01-01

    .... Our framework for managing and evaluating the status of architecture programs consists of 31 core elements related to architecture governance, content, use, and measurement that are associated...

  5. A Replica Detection Scheme Based on the Deviation in Distance Traveled Sliding Window for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Alekha Kumar Mishra

    2017-01-01

Full Text Available A node replication attack poses a high level of threat in wireless sensor networks (WSNs), and it is severe when the sensors are mobile. A limited number of replica detection schemes in mobile WSNs (MWSNs) have been reported to date, and most of them are centralized in nature. The centralized detection schemes use time-location claims, and the base station (BS) is solely responsible for detecting replicas. Therefore, these schemes are prone to a single point of failure. There is also additional communication overhead associated with sending time-location claims to the BS. A distributed detection mechanism is always a preferred solution to this kind of problem due to its significantly lower communication overhead. In this paper, we propose a distributed replica detection scheme for MWSNs. In this scheme, the deviation in the distance traveled by a node and its replica is recorded by observer nodes. Every node is an observer node for some nodes in the network. Observers are responsible for maintaining a sliding window of the recent time-distance broadcasts of the nodes. A replica is detected by an observer based on the degree of violation computed from the deviations recorded using the time-distance sliding window. The analysis and simulation results show that the proposed scheme achieves a higher detection probability than distributed replica detection schemes such as Efficient Distributed Detection (EDD) and Multi-Time-Location Storage and Diffusion (MTLSD).
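The observer logic described above can be sketched as follows; the window size, speed bound, and violation threshold are illustrative placeholders, not the paper's parameters or its exact degree-of-violation measure:

```python
from collections import deque

class Observer:
    """Keeps a sliding window of (time, position) broadcasts per node id,
    counts violations of the maximum-speed constraint between claims, and
    flags a node as replicated once violations pass a threshold."""

    def __init__(self, max_speed=10.0, window=5, threshold=3):
        self.max_speed = max_speed
        self.window = window
        self.threshold = threshold
        self.claims = {}       # node id -> deque of (time, x, y)
        self.violations = {}   # node id -> violation count

    def receive_claim(self, node, t, x, y):
        """Record a broadcast; return True if the node is flagged."""
        win = self.claims.setdefault(node, deque(maxlen=self.window))
        for (t0, x0, y0) in win:
            dt = abs(t - t0)
            dist = ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5
            if dt > 0 and dist / dt > self.max_speed:
                self.violations[node] = self.violations.get(node, 0) + 1
        win.append((t, x, y))
        return self.violations.get(node, 0) >= self.threshold
```

A replica broadcasting the same identity from a distant location accumulates violations against the genuine node's recent claims, while a single mobile node stays within the speed bound.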

  6. Homeowner's Architectural Responses to Crime in Dar Es Salaan : Its impacts and implications to urban architecture, urban design and urban management

    OpenAIRE

    Bulamile, Ludigija Boniface

    2009-01-01

This study is about homeowners' architectural responses to crime in Dar es Salaam, Tanzania: its impacts and implications for urban architecture, urban design and urban management. The study explores and examines the processes through which homeowners respond to crimes of burglary, home robbery and the fear of them using architectural or physical elements. The processes are explored and examined using case study methodology in three cases in Dar es Salaam. The cases are residentia...

  7. A healthcare management system for Turkey based on a service-oriented architecture.

    Science.gov (United States)

    Herand, Deniz; Gürder, Filiz; Taşkin, Harun; Yuksel, Emre Nuri

    2013-09-01

The current Turkish healthcare management system has a structure that is extremely inordinate, cumbersome and inflexible. Furthermore, this structure has no common point of view and thus has no interoperability and responds slowly to innovations. The purpose of this study is to show which methods can give the Turkish healthcare management system a structure that is more modern, more flexible and quicker to respond to innovations and changes, taking advantage of the benefits of a service-oriented architecture (SOA). In this paper, the Turkish healthcare management system is chosen for examination since Turkey is considered one of the Third World countries and the information architecture of its existing healthcare management system has not yet been configured with SOA, a contemporary innovative approach that should provide the base architecture of the new solution. The innovation of this study is the symbiosis of two main integration approaches, SOA and Health Level 7 (HL7), for integrating divergent healthcare information systems. A model is developed which is based on SOA and enables obtaining a healthcare management system conforming to the SSF standards (HSSP Service Specification Framework) developed within the framework of the HSSP (Healthcare Services Specification Project) under the leadership of HL7 and the Object Management Group.

  8. Power-managed smart lighting using a semantic interoperability architecture

    NARCIS (Netherlands)

    Bhardwaj, S.; Syed, Aly; Ozcelebi, T.; Lukkien, J.J.

    2011-01-01

    We present a power-managed smart lighting system that allows collaboration of Consumer Electronics (CE) lighting-devices and corresponding system architectures provided by different CE suppliers. In the example scenario, the rooms of a building are categorized as low- and highpriority, each category

  9. A Hybrid Power Management (HPM) Based Vehicle Architecture

    Science.gov (United States)

    Eichenberg, Dennis J.

    2011-01-01

    Society desires vehicles with reduced fuel consumption and reduced emissions. This presents a challenge and an opportunity for industry and the government. The NASA John H. Glenn Research Center (GRC) has developed a Hybrid Power Management (HPM) based vehicle architecture for space and terrestrial vehicles. GRC's Electrical and Electromagnetics Branch of the Avionics and Electrical Systems Division initiated the HPM Program for the GRC Technology Transfer and Partnership Office. HPM is the innovative integration of diverse, state-of-the-art power devices in an optimal configuration for space and terrestrial applications. The appropriate application and control of the various power devices significantly improves overall system performance and efficiency. The basic vehicle architecture consists of a primary power source, and possibly other power sources, providing all power to a common energy storage system, which is used to power the drive motors and vehicle accessory systems, as well as provide power as an emergency power system. Each component is independent, permitting it to be optimized for its intended purpose. This flexible vehicle architecture can be applied to all vehicles to considerably improve system efficiency, reliability, safety, security, and performance. This unique vehicle architecture has the potential to alleviate global energy concerns, improve the environment, stimulate the economy, and enable new missions.
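The energy flow of the architecture described, where the primary power source only charges a common energy storage system and all loads (drive motors, accessories) draw from that storage, can be sketched as a toy model; the class name, method names, and numbers are illustrative assumptions, not GRC's design:

```python
class HybridPowerSystem:
    """Toy energy-flow model of an HPM-style architecture: the primary
    source only charges the storage; all loads draw from storage."""

    def __init__(self, storage_capacity_wh, charge_rate_w):
        self.capacity = storage_capacity_wh   # storage capacity, Wh
        self.stored = 0.0                     # current stored energy, Wh
        self.charge_rate = charge_rate_w      # primary source output, W

    def step(self, hours, load_w):
        # Primary source charges the storage; loads then discharge it.
        self.stored = min(self.capacity, self.stored + self.charge_rate * hours)
        drawn = min(self.stored, load_w * hours)
        self.stored -= drawn
        return drawn  # energy actually delivered to motors/accessories, Wh
```

Decoupling source and loads through the storage is what lets each component be sized and optimized independently, and the storage doubles as the emergency power reserve.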

  10. Using Enterprise Architecture for the Alignment of Information Systems in Supply Chain Management

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    Using information systems in supply chain management (SCM) has become commonplace, and therefore architectural issue are part of the agenda for this domain. This article uses three perspectives on enterprise architecture (EA) in the supply chain: The "correlation view," the "remote view...

  11. Power-managed smart lighting using a semantic interoperability architecture

    NARCIS (Netherlands)

    Bhardwaj, S.; Syed, Aly; Ozcelebi, T.; Lukkien, J.J.

    2011-01-01

    This paper presents a power-managed smart lighting system that allows collaboration of lighting consumer electronics (CE) devices and corresponding system architectures provided by different CE suppliers. In the example scenario, the rooms of a building are categorized as low and high priority, each

  12. Novel pervasive scenarios for home management: the Butlers architecture.

    Science.gov (United States)

    Denti, Enrico

    2014-01-01

Many efforts today aim at energy saving, promoting the user's awareness and virtuous behavior in a sustainability perspective. Our houses, appliances, energy meters and devices are becoming smarter and connected, domotics is increasing the possibilities for house automation and control, and ambient intelligence and assisted living are bringing attention to people's needs from different viewpoints. Our assumption is that considering these aspects together allows for novel intriguing possibilities. To this end, in this paper we combine home energy management with domotics, coordination technologies, intelligent agents, ambient intelligence, ubiquitous technologies and gamification to devise novel scenarios, where energy monitoring and management is just the basic brick of a much wider and more comprehensive home management system. The aim is to control home appliances well beyond energy consumption, combining home comfort, appliance scheduling, safety constraints, etc. with dynamically changeable user preferences, goals and priorities. At the same time, usability and attractiveness are seen as key success factors: the intriguing technologies available in most houses and smart devices are exploited to make system configuration and use simpler, entertaining and attractive for users. These aspects are also integrated with ubiquitous and pervasive technologies, geo-localization, social networks and communities to provide enhanced functionalities and support smarter application scenarios, thereby further strengthening technology acceptance and diffusion. Accordingly, we first analyse the system requirements and define a reference multi-layer architectural model - the Butlers architecture - that specifies seven layers of functionalities, correlating the requirements, the corresponding technologies and the consequent added value for users in each layer. Then, we outline a set of notable scenarios of increasing functionalities and complexity, discuss the structure of the

  13. Replica Node Detection Using Enhanced Single Hop Detection with Clonal Selection Algorithm in Mobile Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    L. S. Sindhuja

    2016-01-01

Full Text Available Security of mobile wireless sensor networks is a vital challenge, as the sensor nodes are deployed in unattended environments and are prone to various attacks. One among them is the node replication attack. In this attack, physically insecure nodes are acquired by the adversary, who clones them with the same identity as the captured node and deploys an unpredictable number of replicas throughout the network. Hence replica node detection is an important challenge in mobile wireless sensor networks. Various replica node detection techniques have been proposed to detect these replica nodes. These methods incur control overheads, and the detection accuracy is low when the replica is selected as a witness node. This paper proposes to solve these issues by enhancing the Single Hop Detection (SHD) method using the Clonal Selection algorithm to detect the clones by selecting the appropriate witness nodes. The advantages of the proposed method include (i) an increase in the detection ratio, (ii) a decrease in the control overhead, and (iii) an increase in throughput. The performance of the proposed work is measured using detection ratio, false detection ratio, packet delivery ratio, average delay, control overheads, and throughput. The implementation is done using ns-2 to demonstrate the practicality of the proposed work.

  14. An emergency management demonstrator using the high level architecture

    International Nuclear Information System (INIS)

    Williams, R.J.

    1996-12-01

    This paper addresses the issues of simulation interoperability within the emergency management training context. A prototype implementation in Java of a subset of the High Level Architecture (HLA) is described. The use of Web Browsers to provide graphical user interfaces to HLA is also investigated. (au)

  15. Design methods and design theory for architectural design management

    NARCIS (Netherlands)

    Achten, H.H.; Otter, den A.F.H.J.; Achten, H.H.; Pels, H.J.

    2008-01-01

Most parties that an architectural design manager meets in daily practice are engaged to some degree with design. What these parties are actually doing in a project is contingent on the concrete design project. Additionally, each party has some stake, and may employ different strategies to solve

  16. Zeolite-templated carbon replica: a grand canonical Monte-Carlo simulation study

    International Nuclear Information System (INIS)

    Roussel, Th.; Pellenq, R.J.M.; Bichara, Ch.; Gadiou, R.; Didion, A.; Vix-Guterl, C.; Gaslain, F.; Parmentier, J.; Valtchev, V.; Patarin, J.

    2005-01-01

Microporous carbon materials are interesting for several applications such as hydrogen storage, catalysis or electrical double-layer capacitors. The development of the negative templating method to obtain carbon replicas from ordered templates has led to the synthesis of several new materials which have interesting textural properties, attractive for energy storage. Among the possible templates, zeolites can be used to obtain highly microporous carbon materials. Nevertheless, the phenomena involved in the replica synthesis are not fully understood, and the relationships between the structure of the template, the carbon precursor and the resulting carbon material need to be investigated. Experimental results for carbon zeolite-templated nano-structures can be found in a series of papers; see for instance ref. [1], in which Wang et al describe a route to ultra-small Single Wall Carbon Nano-tubes (SWNTs) using the porosity of zeolite AlPO4-5. After matrix removal, the resulting structure is a free-standing bundle of 4 Angstrom nano-tubes. However, it is highly desirable to obtain an ordered porous carbon structure that forms a real 3D network, to be used for instance in gas storage applications. Carbon replicas of faujasite and EMT zeolites can have these properties, since these zeolites have a 3D porous network made of 10 Angstrom cages connected to each other through 7 Angstrom windows. The first step of this study was to generate a theoretical carbon replica structure of various zeolites (faujasite, EMT, AlPO4-5, silicalite). For this purpose, we used the Grand Canonical Monte-Carlo (GCMC) technique, in which the carbon-carbon interactions were described within the frame of a newly developed Tight Binding approach and the carbon-zeolite interactions were assumed to be characteristic of physisorption. The intrinsic stability of the subsequent carbon nano-structures was then investigated after mimicking the removal of the inorganic phase by switching

  17. Broken symmetry in the mean field theory of the Ising spin glass: replica way and no replica way

    International Nuclear Information System (INIS)

    De Dominicis, C.

    1983-06-01

    We review the type of symmetry breaking involved in the solution discovered by Parisi and in the static derivation of the solution first introduced via dynamics by Sompolinsky. We turn to a formulation of the problem due to Thouless, Anderson and Palmer (TAP) that puts forward a set of equations for the magnetization. A probability law for the magnetization is then built. We consider two cases: (i) a canonical distribution, which is shown to give identical results to the Hamiltonian formulation under a weak and physical assumption, and (ii) a white distribution characterized by two matrices and a response. We show what symmetry breaking is necessary to recover Sompolinsky's free energy. In section III we supplement replica indices in the Hamiltonian approach by ''time'' indices and show in particular that the analytic continuation involved in Sompolinsky's equilibrium derivation is trying to mimic a translational symmetry breaking in ''time'' that incorporates Sompolinsky's ansatz of a long time scale sequence. In section IV we apply the same treatment to the white average approach and show that replicas can be altogether discarded and replaced by ''time''. Finally, we briefly discuss the attribution of distinct answers for the standard spin glass order parameter depending on the physical situation: equilibrium or non-equilibrium, associated with canonical or white (non-canonical) initial conditions and density matrices

  18. The Essential Leadership Role of Senior Management in Adopting Architectural Management and Modular Strategies (AMMS), with Perspectives on Experiences of European Automotive Firms

    DEFF Research Database (Denmark)

    Sanchez, Ron

    2015-01-01

    The potential benefits of architectural approaches to developing new products and of using modular architectures as the basis for new kinds of product strategies have been recognized since the 1990s and elaborated at some length in management research. Relatively little attention has been paid, however, to the fundamental changes in management and organizational processes a firm must undergo in order to implement architectural management and modular strategies ("AMMS") successfully. A common misperception among some senior managers is that implementing AMMS involves primarily some technical... through the critical organizational and managerial changes required to implement and use AMMS effectively. This paper also suggests that there are two fundamentally different management approaches to leading the organizational change process needed to implement AMMS. We characterize...

  19. Neutron and gamma-ray dose-rates from the Little Boy replica

    International Nuclear Information System (INIS)

    Plassmann, E.A.; Pederson, R.A.

    1984-01-01

    We report dose-rate information obtained at many locations in the near vicinity of, and at distances out to 0.64 km from, the Little Boy replica while it was operated as a critical assembly. The measurements were made with modified conventional dosimetry instruments that used an Anderson-Braun detector for neutrons and a Geiger-Mueller tube for gamma rays with suitable electronic modules to count particle-induced pulses. Thermoluminescent dosimetry methods provide corroborative data. Our analysis gives estimates of both neutron and gamma-ray relaxation lengths in air for comparison with earlier calculations. We also show the neutron-to-gamma-ray dose ratio as a function of distance from the replica. Current experiments and further data analysis will refine these results. 7 references, 8 figures

  20. Systematic expansion in the order parameter for replica theory of the dynamical glass transition.

    Science.gov (United States)

    Jacquin, Hugo; Zamponi, Francesco

    2013-03-28

    It has been shown recently that predictions from mode-coupling theory for the glass transition of hard spheres become increasingly inaccurate as dimensionality increases, whereas replica theory predicts the correct scaling. Nevertheless, if one focuses on the regime around the dynamical transition in three dimensions, mode-coupling results are far more convincing than replica theory predictions. It thus seems necessary to reconcile the two theoretical approaches in order to obtain a theory that interpolates between low-dimensional mode-coupling results and "mean-field" results from replica theory. Even though quantitative results for the dynamical transition issued from replica theory are not accurate in low dimensions, two different approximation schemes--small cage expansion and replicated hyper-netted-chain (RHNC)--provide the correct qualitative picture for the transition, namely, a discontinuous jump of a static order parameter from zero to a finite value. The purpose of this work is to develop a systematic expansion around the RHNC result in powers of the static order parameter, and to calculate the first correction in this expansion. Interestingly, this correction involves the static three-body correlations of the liquid. More importantly, we separately demonstrate that higher order terms in the expansion are quantitatively relevant at the transition, and that the usual mode-coupling kernel, involving two-body direct correlation functions of the liquid, cannot be recovered from static computations.

  1. Building Quality into Learning Management Systems – An Architecture-Centric Approach

    OpenAIRE

    Avgeriou, P.; Retalis, Simos; Skordalakis, Manolis

    2003-01-01

    The design and development of contemporary Learning Management Systems (LMS), is largely focused on satisfying functional requirements, rather than quality requirements, thus resulting in inefficient systems of poor software and business quality. In order to remedy this problem there is a research trend into specifying and evaluating software architectures for LMS, since quality attributes in a system depend profoundly on its architecture. This paper presents a case study of appraising the s...

  2. Biosynthesis of cathodoluminescent zinc oxide replicas using butterfly (Papilio paris) wing scales as templates

    International Nuclear Information System (INIS)

    Zhang Wang; Zhang Di; Fan Tongxiang; Ding Jian; Gu Jiajun; Guo Qixin; Ogawa, Hiroshi

    2009-01-01

    Papilio paris butterflies have an iridescent blue color patch on their hind wings which is visible over a wide viewing angle. Optical and scanning electron microscopy observations of scales from the wings show that the blue color scales have a very different microstructure from the matt black ones which also populate the wings. Scanning electron micrographs of the blue scales show that their surfaces comprise a regular two-dimensional array of concavities. By contrast, the matt black scales have a fine, sponge-like structure between the ridges and the cross ribs in the scales. Using both types of scale as bio-templates, we obtain zinc oxide (ZnO) replicas of the microstructures of the original scales. Room temperature (T = 300 K) cathodoluminescence spectra of these ZnO replicas have also been studied. Both spectra show a similar sharp near-band-edge emission, but have different green emission, which we associate with the different microstructures of the ZnO replicas

  3. Biosynthesis of cathodoluminescent zinc oxide replicas using butterfly (Papilio paris) wing scales as templates

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Wang [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China); Zhang Di [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China)], E-mail: zhangdi@sjtu.edu.cn; Fan Tongxiang; Ding Jian; Gu Jiajun [State Key Lab of Metal Matrix Composites, Shanghai Jiao Tong University, 200240, Shanghai (China); Guo Qixin; Ogawa, Hiroshi [Department of Electrical and Electronic Engineering, Saga University, Saga 840-8502 (Japan)

    2009-01-01

    Papilio paris butterflies have an iridescent blue color patch on their hind wings which is visible over a wide viewing angle. Optical and scanning electron microscopy observations of scales from the wings show that the blue color scales have a very different microstructure from the matt black ones which also populate the wings. Scanning electron micrographs of the blue scales show that their surfaces comprise a regular two-dimensional array of concavities. By contrast, the matt black scales have a fine, sponge-like structure between the ridges and the cross ribs in the scales. Using both types of scale as bio-templates, we obtain zinc oxide (ZnO) replicas of the microstructures of the original scales. Room temperature (T = 300 K) cathodoluminescence spectra of these ZnO replicas have also been studied. Both spectra show a similar sharp near-band-edge emission, but have different green emission, which we associate with the different microstructures of the ZnO replicas.

  4. The architecture of psychological management: the Irish asylums (1801-1922).

    Science.gov (United States)

    Reuber, M

    1996-11-01

    This analysis examines some of the psychological, philosophical and sociological motives behind the development of pauper lunatic asylum architecture in Ireland during the time of the Anglo-Irish union (1801-1922). Ground plans and structural features are used to define five psycho-architectonic generations. While isolation and classification were the prime objectives in the first public asylum in Ireland (1810-1814), a combination of the ideas of a psychological, 'moral', management and 'panoptic' architecture led to a radial institutional design during the next phase of construction (1817-1835). The asylums of the third generation (1845-1855) lacked 'panoptic' features but they were still intended to allow a proper 'moral' management of the inmates, and to create a therapeutic family environment. By the time the institutions of the fourth epoch were erected (1862-1869) the 'moral' treatment approach had been given up, and asylums were built to allow a psychological management by 'association'. The last institutions (1894-1922) built before Ireland's acquisition of Dominion status (1922) were intended to foster the development of a curative society.

  5. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into the complexities of software systems' Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  6. Inter organizational System Management for integrated service delivery: an Enterprise Architecture Perspective

    OpenAIRE

    Elmir, Abir; Elmir, Badr; Bounabat, Bouchaib

    2015-01-01

    Service sharing is a prominent operating model to support business. Many large inter-organizational networks have implemented some form of value added integrated services in order to reach efficiency and to reduce costs sustainably. Coupling service orientation with the enterprise architecture paradigm is very important for improving organizational performance through business process optimization. Indeed, enterprise architecture management is increasingly discussed because of information system r...

  7. The architecture of the management system of complex steganographic information

    Science.gov (United States)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the creation of the algorithmic maintenance of the system, classic methods of steganography are used to embed information. Methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide service using the web-service via the Internet. It is meant to support the processing of multimedia data streams with many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; prevention of information leakage caused by insiders.
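
    The classic embedding methods the system builds on can be illustrated with least-significant-bit (LSB) steganography, one of the simplest such schemes: each cover byte's lowest bit is replaced by one bit of the message. The following toy sketch (byte strings instead of real multimedia streams) is an illustrative assumption, not the system's actual algorithm.

```python
def lsb_embed(cover: bytes, message_bits: str) -> bytes:
    """Replace the LSB of each cover byte with one message bit."""
    assert len(message_bits) <= len(cover), "cover too small for message"
    out = bytearray(cover)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & 0xFE) | int(bit)  # clear LSB, then set message bit
    return bytes(out)

def lsb_extract(stego: bytes, n_bits: int) -> str:
    """Read back the first n_bits least-significant bits."""
    return "".join(str(b & 1) for b in stego[:n_bits])

cover = bytes(range(16))
stego = lsb_embed(cover, "10110010")
print(lsb_extract(stego, 8))  # prints "10110010"
```

    Because only the lowest bit of each sample changes, the perceptual distortion of the cover medium is minimal, which is why statistical methods (as described above) are needed to detect the embedded payload.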

  8. System Architecture and Mobility Management for Mobile Immersive Communications

    Directory of Open Access Journals (Sweden)

    Mehran Dowlatshahi

    2007-01-01

    Full Text Available We propose a system design for delivery of immersive communications to mobile wireless devices based on a distributed proxy model. It is demonstrated that this architecture addresses key technical challenges for the delivery of these services, that is, constraints on link capacity and power consumption in mobile devices. However, additional complexity is introduced with respect to application layer mobility management. The paper proposes three possible methods for updating proxy assignments in response to mobility management and compares the performance of these methods.

  9. Research on fine management and visualization of ancient architectures based on integration of 2D and 3D GIS technology

    International Nuclear Information System (INIS)

    Jun, Yan; Shaohua, Wang; Jiayuan, Li; Qingwu, Hu

    2014-01-01

    Aimed at ancient architectures, which are characterized by huge data volumes, fine granularity and high precision, a 3D fine management and visualization method for ancient architectures based on the integration of 2D and 3D GIS is proposed. Firstly, after analysing the various data types and characteristics of digital ancient architectures, the main problems and key technologies in 2D and 3D data management are discussed. Secondly, a data storage and indexing model of digital ancient architecture based on 2D and 3D GIS integration was designed, and the integrated storage and management of 2D and 3D data were achieved. Then, through the study of a data retrieval method based on space-time indexing and a hierarchical object model of ancient architecture, 2D and 3D interaction with fine-grained 3D models of ancient architectures was achieved. Finally, taking the fine-grained database of Liangyi Temple on Wudang Mountain as an example, a fine management and visualization prototype of integrated 2D and 3D digital ancient buildings of Liangyi Temple was built. The integrated management and visual analysis of the 10 GB fine-grained model of the ancient architecture was realized, and a new implementation method for the storage, browsing, reconstruction, and architectural art research of ancient architecture models was provided

  10. Application of a Multimedia Service and Resource Management Architecture for Fault Diagnosis.

    Science.gov (United States)

    Castro, Alfonso; Sedano, Andrés A; García, Fco Javier; Villoslada, Eduardo; Villagrá, Víctor A

    2017-12-28

    Nowadays, the complexity of global video products has substantially increased. They are composed of several associated services whose functionalities need to adapt across heterogeneous networks with different technologies and administrative domains. Each of these domains has different operational procedures; therefore, the comprehensive management of multi-domain services presents serious challenges. This paper discusses an approach to service management linking a fault diagnosis system and Business Processes for Telefónica's global video service. The main contribution of this paper is the proposal of an extended service management architecture based on Multi Agent Systems able to integrate the fault diagnosis with other different service management functionalities. This architecture includes a distributed set of agents able to coordinate their actions under the umbrella of a Shared Knowledge Plane, inferring and sharing their knowledge with semantic techniques and three types of automatic reasoning: heterogeneous, ontology-based and Bayesian reasoning. This proposal has been deployed and validated in a real scenario in the video service offered by Telefónica Latam.

  11. MASM: a market architecture for sensor management in distributed sensor networks

    Science.gov (United States)

    Viswanath, Avasarala; Mullen, Tracy; Hall, David; Garga, Amulya

    2005-03-01

    Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide bandwidth communications and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques do not have the ability to simultaneously consider the wildly non-commensurate measures involved in sensor management in a single optimization routine. Market-oriented programming provides a valuable and principled paradigm to designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms have been found not to be directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for deciding task allocations to the consumers and their corresponding budgets and the sensor manager is responsible for resource allocation to the various consumers. In addition to having a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information theoretic sensor manager in meeting the mission objectives in the simulation experiments.
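
    The core of a market architecture like the one described above is the winner determination step of the combinatorial auction: choosing the set of non-conflicting bundle bids that maximizes total value. The brute-force sketch below illustrates that formulation only; MASM uses a modified winner determination algorithm, and all bidder names and values here are hypothetical.

```python
from itertools import combinations

def winner_determination(bids):
    """bids: list of (bidder, items_frozenset, value).
    Exhaustively search for the disjoint set of bids with maximum total value."""
    best_value, best_set = 0.0, []
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            items = [b[1] for b in subset]
            # accept only if no item is sold twice (bundles must be disjoint)
            if sum(len(s) for s in items) == len(frozenset().union(*items)):
                value = sum(b[2] for b in subset)
                if value > best_value:
                    best_value, best_set = value, list(subset)
    return best_value, best_set

# Hypothetical consumers bidding for sensor dwells and communication channels:
bids = [
    ("tracker-A", frozenset({"radar-1", "link-1"}), 5.0),
    ("tracker-B", frozenset({"radar-1"}), 3.0),
    ("classifier", frozenset({"eo-cam", "link-2"}), 4.0),
]
value, winners = winner_determination(bids)
print(value)  # prints 9.0: tracker-A and classifier win; tracker-B conflicts on radar-1
```

    Exhaustive search is exponential in the number of bids, which is exactly why practical sensor-management auctions need the specialized winner determination algorithms the paper describes.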

  12. Physical replicas and the Bose glass in cold atomic gases

    International Nuclear Information System (INIS)

    Morrison, S; Kantian, A; Daley, A J; Zoller, P; Katzgraber, H G; Lewenstein, M; Buechler, H P

    2008-01-01

    We study cold atomic gases in a disorder potential and analyse the correlations between different systems subjected to the same disorder landscape. Such independent copies with the same disorder landscape are known as replicas. While, in general, these are not accessible experimentally in condensed matter systems, they can be realized using standard tools for controlling cold atomic gases in an optical lattice. Of special interest is the overlap function which represents a natural order parameter for disordered systems and is a correlation function between the atoms of two independent replicas with the same disorder. We demonstrate an efficient measurement scheme for the determination of this disorder-induced correlation function. As an application, we focus on the disordered Bose-Hubbard model and determine the overlap function within the perturbation theory and a numerical analysis. We find that the measurement of the overlap function allows for the identification of the Bose-glass phase in certain parameter regimes

  13. Physical replicas and the Bose glass in cold atomic gases

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, S; Kantian, A; Daley, A J; Zoller, P [Institute for Theoretical Physics, University of Innsbruck, Technikerstr. 25, A-6020 Innsbruck (Austria); Katzgraber, H G [Theoretische Physik, ETH Zurich, CH-8093 Zuerich (Switzerland); Lewenstein, M [ICAO-Institut de Ciencies Fotoniques, Parc Mediterrani de la Tecnologia, E-08860 Castelldefels, Barcelona (Spain); Buechler, H P [Institute for Theoretical Physics III, University of Stuttgart, Pfaffenwaldring 57, 70550 Stuttgart (Germany)], E-mail: sarah.morrison@uibk.ac.at

    2008-07-15

    We study cold atomic gases in a disorder potential and analyse the correlations between different systems subjected to the same disorder landscape. Such independent copies with the same disorder landscape are known as replicas. While, in general, these are not accessible experimentally in condensed matter systems, they can be realized using standard tools for controlling cold atomic gases in an optical lattice. Of special interest is the overlap function which represents a natural order parameter for disordered systems and is a correlation function between the atoms of two independent replicas with the same disorder. We demonstrate an efficient measurement scheme for the determination of this disorder-induced correlation function. As an application, we focus on the disordered Bose-Hubbard model and determine the overlap function within the perturbation theory and a numerical analysis. We find that the measurement of the overlap function allows for the identification of the Bose-glass phase in certain parameter regimes.

  14. Platinum replica electron microscopy: Imaging the cytoskeleton globally and locally.

    Science.gov (United States)

    Svitkina, Tatyana M

    2017-05-01

    Structural studies reveal how smaller components of a system work together as a whole. However, combining high resolution of details with full coverage of the whole is challenging. In cell biology, light microscopy can image many cells in their entirety, but at a lower resolution, whereas electron microscopy affords very high resolution, but usually at the expense of the sample size and coverage. Structural analyses of the cytoskeleton are especially demanding, because cytoskeletal networks are unresolvable by light microscopy due to their density and intricacy, whereas their proper preservation is a challenge for electron microscopy. Platinum replica electron microscopy can uniquely bridge the gap between the "comfort zones" of light and electron microscopy by allowing high resolution imaging of the cytoskeleton throughout the entire cell and in many cells in the population. This review describes the principles and applications of platinum replica electron microscopy for studies of the cytoskeleton.

  15. Replica Exchange Gaussian Accelerated Molecular Dynamics: Improved Enhanced Sampling and Free Energy Calculation.

    Science.gov (United States)

    Huang, Yu-Ming M; McCammon, J Andrew; Miao, Yinglong

    2018-04-10

    Through adding a harmonic boost potential to smooth the system potential energy surface, Gaussian accelerated molecular dynamics (GaMD) provides enhanced sampling and free energy calculation of biomolecules without the need of predefined reaction coordinates. This work continues to improve the acceleration power and energy reweighting of the GaMD by combining the GaMD with replica exchange algorithms. Two versions of replica exchange GaMD (rex-GaMD) are presented: force constant rex-GaMD and threshold energy rex-GaMD. During simulations of force constant rex-GaMD, the boost potential can be exchanged between replicas of different harmonic force constants with fixed threshold energy. However, the algorithm of threshold energy rex-GaMD tends to switch the threshold energy between lower and upper bounds for generating different levels of boost potential. Testing simulations on three model systems, including the alanine dipeptide, chignolin, and HIV protease, demonstrate that through continuous exchanges of the boost potential, the rex-GaMD simulations not only enhance the conformational transitions of the systems but also narrow down the distribution width of the applied boost potential for accurate energetic reweighting to recover biomolecular free energy profiles.
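
    The force-constant exchange idea can be sketched as follows: each replica carries a GaMD boost potential dV(x) = ½k(E − V(x))² for V(x) < E, with a different harmonic force constant k per replica and a fixed threshold energy E, and neighbouring replicas attempt swaps under the standard Hamiltonian replica exchange Metropolis criterion. This is a hedged toy sketch of that criterion only (the numbers are hypothetical), not the authors' implementation.

```python
import math
import random

BETA = 1.0 / 0.5961  # 1/kT in kcal/mol at ~300 K (assumed units)

def boost(v, k, e_threshold):
    """GaMD harmonic boost potential for a configuration with potential energy v."""
    return 0.5 * k * (e_threshold - v) ** 2 if v < e_threshold else 0.0

def accept_swap(v_i, v_j, k_i, k_j, e_threshold):
    """Metropolis acceptance for exchanging configurations between replicas i and j.
    v_i, v_j: unbiased potential energies of the two configurations;
    k_i, k_j: the replicas' boost force constants (threshold energy E is fixed)."""
    delta = (boost(v_i, k_j, e_threshold) + boost(v_j, k_i, e_threshold)
             - boost(v_i, k_i, e_threshold) - boost(v_j, k_j, e_threshold))
    return random.random() < min(1.0, math.exp(-BETA * delta))
```

    When the two configurations have similar energies, delta is near zero and swaps are almost always accepted, which is how the continuous exchange of boost potentials enhances transitions while keeping each replica's boost distribution narrow enough for accurate reweighting.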

  16. Characterization of Nb SRF cavity materials by white light interferometry and replica techniques

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Chen [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); The Applied Science Department, The College of William and Mary, Williamsburg, VA 23185 (United States); Reece, Charles [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); Kelley, Michael, E-mail: mkelley@jlab.org [Thomas Jefferson National Accelerator Facility, Newport News, VA 23606 (United States); The Applied Science Department, The College of William and Mary, Williamsburg, VA 23185 (United States)

    2013-06-01

    Much work has shown that the topography of the interior surface is an important contributor to the performance of Nb superconducting radiofrequency (SRF) accelerator cavities. Micron-scale topography is implicated in non-linear loss mechanisms that limit the useful accelerating gradient range and impact cryogenic cost. Aggressive final chemical treatments in cavity production seek to reliably obtain “smoothest” surfaces with superior performance. Process development suffers because the cavity interior surface cannot be viewed directly without cutting out pieces, rendering the cavities unavailable for further study. Here we explore replica techniques as an alternative, providing imprints of the cavity internal surface that can be readily examined. A second matter is the topography measurement technique used. Atomic force microscopy (AFM) has proven successful, but too time intensive for routine use in this application. We therefore introduce white light interferometry (WLI) as an alternative approach. We examined real surfaces and their replicas, using AFM and WLI. We find that the replica/WLI combination promises to provide the large majority of the desired information, recognizing that a trade-off is being made between the best lateral resolution (AFM) and the opportunity to examine much more surface area (WLI).

  17. A Virtual Power Plant Architecture for the Demand-Side Management of Smart Prosumers

    Directory of Open Access Journals (Sweden)

    Marco Pasetti

    2018-03-01

    Full Text Available In this paper, we present a conceptual study on a Virtual Power Plant (VPP architecture for the optimal management of Distributed Energy Resources (DERs owned by prosumers participating in Demand-Side Management (DSM programs. Compared to classical VPP architectures, which aim to aggregate several DERs dispersed throughout the electrical grid, in the proposed VPP architecture the supervised physical domain is limited to single users, i.e., to single Points of Delivery (PODs of the distribution network. The VPP architecture is based on a service-oriented approach, where multiple agents cooperate to implement the optimal management of the prosumer’s assets, by also considering different forms of Demand Response (DR requests. The considered DR schemes range from Price-Based DRs to Event-Based DRs, covering both the normal operating functions and the emergency control requests applied in modern distribution networks. With respect to centralized approaches, in this study the control perspective is moved from the system level to the single prosumer’s level, who is allowed to independently provide flexible power profiles through the aggregation of multiple DERs. A generalized optimization model, formulated as a Mixed-Integer Linear Programming (MILP problem, is also introduced. Such a model is able to compute the optimal scheduling of a prosumer’s assets by considering both DR requests and end-users’ requirements in terms of comfort levels while minimizing the costs.
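
    The kind of scheduling problem the paper formulates as a MILP can be illustrated on a toy horizon: choose, for each hour, whether a prosumer's battery charges, idles, or discharges so that the cost of meeting a fixed load under a price-based DR tariff is minimised. The real model is a MILP over many DERs and DR schemes; here a tiny horizon lets exhaustive search stand in for the solver, and all prices, loads and battery figures are hypothetical (losses and export are ignored).

```python
from itertools import product

prices = [0.10, 0.12, 0.30, 0.28]   # EUR/kWh per hour (price-based DR signal)
load   = [1.0, 1.0, 2.0, 2.0]       # kWh consumed by the prosumer each hour
P_BAT, CAPACITY = 1.0, 2.0          # max (dis)charge per hour, storage size (kWh)

def best_schedule():
    """Enumerate all hourly battery actions (-1=discharge, 0=idle, 1=charge)
    and return the feasible plan with minimum energy cost."""
    best_cost, best_plan = float("inf"), None
    for plan in product((-1, 0, 1), repeat=len(prices)):
        soc, cost, feasible = 0.0, 0.0, True
        for price, demand, action in zip(prices, load, plan):
            soc += action * P_BAT
            if not 0.0 <= soc <= CAPACITY:   # state of charge must stay in bounds
                feasible = False
                break
            grid = demand + action * P_BAT   # battery shifts grid purchases in time
            if grid < 0:                     # no export in this toy model
                feasible = False
                break
            cost += price * grid
        if feasible and cost < best_cost:
            best_cost, best_plan = cost, plan
    return best_cost, best_plan

cost, plan = best_schedule()
print(plan)  # charges in the two cheap hours, discharges in the two expensive ones
```

    The optimal plan buys extra energy in the cheap hours and releases it in the expensive ones, which is precisely the flexible power profile a prosumer-level VPP agent would offer in response to a price-based DR request.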

  18. Connecting Architecture and Implementation

    Science.gov (United States)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, the architectural representation needs to be continuously updated and synchronized with the system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc, as well as UML tools, tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  19. Managing Innovation and Business Development Using Enterprise Architecture

    DEFF Research Database (Denmark)

    Tambo, Torben; Bækgaard, Lars

    2011-01-01

    ...of well planned technological changes. Management of technology (MOT) addresses identification, selection, (long term) planning, designing, implementation and operation of technology based business development. Information Technology (IT) is a key enabler for a vast range of contemporary corporate technologies. In management of IT, it has become increasingly popular to use Enterprise Architecture (EA) as a method, supported by a series of formal frameworks. EA maps artifacts and motives against the business strategy. In this paper, MOT and EA are reviewed for their mutual potentials and issues. Two case studies illustrate how enterprises can make major changes derived from business strategy, observing both MOT and EA. This suggests a combined view inspired from the IT business' dedication to EA, using EA's formalisms and the management orientation of MOT to improve understanding of technological...

  20. Case study B. Architectural design management using a project web

    NARCIS (Netherlands)

    DeClerck, F.; Pels, H.J.; Otter, den A.F.H.J.; Emmitt, S.; Prins, M.; Otter, den A.F.

    2009-01-01

    In this chapter the use and organization of use of a project website is described in the design and realization of a construction project. The case concerns a complicated project with a high number of different parties involved, managed by an architectural office and having an internationally

  1. Phonon replica dynamics in high quality GaN epilayers and AlGaN/GaN quantum wells

    Energy Technology Data Exchange (ETDEWEB)

    Alderighi, D.; Vinattieri, A.; Colocci, M. [Ist. Nazionale Fisica della Materia, Firenze (Italy); Dipt. di Fisica and LENS, Firenze (Italy); Bogani, F. [Ist. Nazionale Fisica della Materia, Firenze (Italy); Dipt. di Energetica, Firenze (Italy); Gottardo, S. [Dipt. di Fisica and LENS, Firenze (Italy); Grandjean, N.; Massies, J. [Centre de Recherche sur l' Hetero-Epitaxie et ses Applications, CNRS, Valbonne (France)

    2001-01-01

    We present an experimental study of the exciton and phonon replica dynamics in high quality GaN epilayers and AlGaN/GaN quantum wells (QW) by means of picosecond time-resolved photoluminescence (PL) measurements. A non-exponential decay is observed both at the zero phonon line (ZPL) and at the n = 1 LO replica. Time-resolved spectra unambiguously assign the replica to the free exciton A recombination. Optical migration effects are detected in both the epilayer and the QW samples and disappear as the temperature increases up to 60-90 K. Even though the sample quality is comparable to state-of-the-art samples, localization effects dominate the exciton dynamics at low temperature in the studied GaN-based structures. (orig.)

  2. Design management in the architectural engineering and construction sector : proceedings of the joint CIB W096 Architectural Management and CIB TG49. Architectural Engineering Conference held in conjunction with the 8th Brazilian Workshop on Building Design Management, University of Sao Paulo, 4-8 December 2008

    NARCIS (Netherlands)

    Melhado, S.; Prins, M.; Emmitt, S.; Bouchlaghem, D.; Otter, den A.F.H.J.

    2008-01-01

    Following the Denmark meeting, held in Lyngby 2005, the CIB W096 commission on Architectural Management merged its own meetings with two large events, the Adaptables Conference in Eindhoven 2006, and the CIB world Conference in Cape Town in 2007. Papers were invited under the theme Design Management

  3. Application of carbon extraction replicas in grain-size measurements of high-strength steels using TEM

    International Nuclear Information System (INIS)

    Poorhaydari, Kioumars; Ivey, Douglas G.

    2007-01-01

    In this paper, the application of carbon extraction replicas in grain-size measurements is introduced and discussed. Modern high-strength microalloyed steels, used as structural or pipeline materials, have very small grains with substructures. Replicas used in transmission electron microscopes can resolve the grain boundaries and can be used for systematic measurement of grain size in cases where the small grain size approaches the resolution limit of conventional optical microscopes. The grain-size variations obtained from replicas are compared with those obtained from optical and scanning electron microscopy. An emphasis is placed on the importance of using the correct imaging technique and the optimal magnification. Grain-size measurements are used for estimating the grain-boundary strengthening contribution to yield strength. The variation in grain size is also correlated with hardness in the base metal of several microalloyed steels, as well as in the fine-grained heat-affected zone of a weld structure with several heat inputs.

  4. Hybrid Power Management-Based Vehicle Architecture

    Science.gov (United States)

    Eichenberg, Dennis J.

    2011-01-01

    Hybrid Power Management (HPM) is the integration of diverse, state-of-the-art power devices in an optimal configuration for space and terrestrial applications (see figure). The appropriate application and control of the various power devices significantly improves overall system performance and efficiency. The basic vehicle architecture consists of a primary power source, and possibly other power sources, that provides all power to a common energy storage system that is used to power the drive motors and vehicle accessory systems. This architecture also provides power as an emergency power system. Each component is independent, permitting it to be optimized for its intended purpose. The key element of HPM is the energy storage system. All generated power is sent to the energy storage system, and all loads derive their power from that system. This can significantly reduce the power requirement of the primary power source, while increasing the vehicle reliability. Ultracapacitors are ideal for an HPM-based energy storage system due to their exceptionally long cycle life, high reliability, high efficiency, high power density, and excellent low-temperature performance. Multiple power sources and multiple loads are easily incorporated into an HPM-based vehicle. A gas turbine is a good primary power source because of its high efficiency, high power density, long life, high reliability, and ability to operate on a wide range of fuels. An HPM controller maintains optimal control over each vehicle component. This flexible operating system can be applied to all vehicles to considerably improve vehicle efficiency, reliability, safety, security, and performance. The HPM-based vehicle architecture has many advantages over conventional vehicle architectures. Ultracapacitors have a much longer cycle life than batteries, which greatly improves system reliability, reduces life-of-system costs, and reduces environmental impact as ultracapacitors will probably never need to be
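
    The storage-centric power flow described above can be sketched as a small simulation: every source charges a common store, every load draws from it, so the primary source can be sized for the average rather than the peak load. All ratings and the load profile below are illustrative, not from the article.

```python
# Sketch of the HPM idea: source power flows into a common energy store,
# loads draw from it, and the store buffers peaks. Numbers are illustrative.

def simulate(storage_kwh, source_kw, loads_kw, dt_h=1.0):
    """Track storage state of charge over a load profile (kW per step)."""
    soc = storage_kwh                      # start fully charged
    trace = []
    for load in loads_kw:
        soc += (source_kw - load) * dt_h   # source charges, load discharges
        soc = min(soc, storage_kwh)        # storage capacity limit
        if soc < 0:
            raise RuntimeError("storage depleted: primary source undersized")
        trace.append(round(soc, 2))
    return trace

# A peaky load profile whose average (25 kW) matches the source rating:
profile = [10, 10, 60, 20]                 # peak 60 kW >> source 25 kW
print(simulate(storage_kwh=50, source_kw=25, loads_kw=profile))
```

    A primary source rated at the 25 kW average rides through the 60 kW peak because the storage buffers the difference, which is the sizing advantage the abstract claims.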

  5. Dynamical self-arrest in symmetric and asymmetric diblock copolymer melts using a replica approach within a local theory.

    Science.gov (United States)

    Wu, Sangwook

    2009-03-01

    We investigate dynamical self-arrest in a diblock copolymer melt using a replica approach within a self-consistent local method based on dynamical mean-field theory (DMFT). The local replica approach effectively predicts (chi N)_A for dynamical self-arrest in a block copolymer melt for symmetric and asymmetric cases. We discuss the competition of the cubic and quartic interactions in the Landau free energy for a block copolymer melt in stabilizing a glassy state depending on the chain length. Our local replica theory provides a universal value for the dynamical self-arrest in block copolymer melts with (chi N)_A ≈ 10.5 + 64 N^(-3/10) for the symmetric case.
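
    The quoted crossover value can be evaluated directly; this short sketch just computes the abstract's formula and shows how the self-arrest threshold approaches the mean-field value 10.5 as the chain length N grows:

```python
# Evaluate the quoted self-arrest crossover for a symmetric diblock
# copolymer melt, (chi N)_A ≈ 10.5 + 64 * N^(-3/10).

def chi_n_arrest(N):
    return 10.5 + 64.0 * N ** (-3.0 / 10.0)

for N in (100, 1000, 10000):
    print(N, round(chi_n_arrest(N), 2))
```

    The correction term decays slowly (exponent 3/10), so even for long chains the arrest threshold sits noticeably above 10.5.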

  6. Replica scaling studies of hard missile impacts on reinforced concrete

    International Nuclear Information System (INIS)

    Barr, P.; Carter, P.G.; Howe, W.D.; Neilson, A.J.

    1982-01-01

    Missile and target combinations at three different linear scales have been used in an experimental assessment of the applicability of replica scaling to the dynamic behaviour of reinforced concrete structures impacted by rigid missiles. Experimental results are presented for models with relative linear scales of 1, 0.37 and 0.12. (orig.) [de]

  7. Effect of roughness and material strength on the mechanical properties of fracture replicas

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.

    1995-08-01

    This report presents the results of 11 rotary shear tests conducted on replicas of three hollow cylinders of natural fractures with JRC values of 7.7, 9.4 and 12.0. The JRC values were determined from the results of laser profilometer measurements. The replicas were created from gypsum cement. By varying the water-to-gypsum cement ratio from 30 to 45%, fracture replicas with different values of compressive strength (JCS) were created. The rotary shear experiments were performed under constant normal (nominal) stresses ranging between 0.2 and 1.6 MPa. In this report, the shear test results are compared with predictions using Barton's empirical peak shear strength equation. Observations during the experiments indicate that only certain parts of the fracture profiles influence fracture shear strength and dilatancy. Under relatively low applied normal stresses, the JCS does not seem to have a significant effect on shear behavior. As an alternative, a new procedure for predicting the shear behavior of fractures was developed. The approach is based on basic fracture properties such as fracture surface profile data and the compressive strength, modulus of elasticity, and Poisson's ratio of the fracture walls. Comparison between predictions and actual shear test results shows that the alternative procedure is a reliable method.
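
    Barton's empirical criterion referred to above has the well-known form tau_p = sigma_n * tan(phi_b + JRC * log10(JCS / sigma_n)). The sketch below evaluates it for a replica-like parameter set; the basic friction angle and JCS value are assumptions for illustration, not the report's measured values.

```python
import math

# Barton's empirical peak shear strength criterion:
#   tau_p = sigma_n * tan( phi_b + JRC * log10(JCS / sigma_n) )
# phi_b is the basic friction angle (degrees); values are illustrative.

def barton_peak_shear(sigma_n, jrc, jcs, phi_b_deg=30.0):
    angle_deg = phi_b_deg + jrc * math.log10(jcs / sigma_n)
    return sigma_n * math.tan(math.radians(angle_deg))

# A JRC = 9.4 replica under 0.8 MPa normal stress, assumed JCS = 20 MPa:
print(round(barton_peak_shear(0.8, 9.4, 20.0), 3))  # peak shear stress, MPa
```

    The JCS enters only through a logarithm, which is consistent with the report's observation that JCS has little effect at low normal stresses.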

  8. Replica field theory for a polymer in random media

    International Nuclear Information System (INIS)

    Goldschmidt, Yadin Y.

    2000-01-01

    In this paper we revisit the problem of a (non-self-avoiding) polymer chain in a random medium which was previously investigated by Edwards and Muthukumar (EM) [J. Chem. Phys. 89, 2435 (1988)]. As noticed by Cates and Ball (CB) [J. Phys. (France) 49, 2009 (1988)] there is a discrepancy between the predictions of the replica calculation of EM and the expectation that in an infinite medium the quenched and annealed results should coincide (for a chain that is free to move) and a long polymer should always collapse. CB argued that only in a finite volume one might see a ''localization transition'' (or crossover) from a stretched to a collapsed chain in three spatial dimensions. Here we carry out the replica calculation in the presence of an additional confining harmonic potential that mimics the effect of a finite volume. Using a variational scheme with five variational parameters we derive analytically, for d < 4, R ∼ μ^(-1/(4-d)) ∼ (g ln V)^(-1/(4-d)), where R is the radius of gyration, g is the strength of the disorder, μ is the spring constant associated with the confining potential, and V is the associated effective volume of the system. Thus the EM result is recovered with their constant replaced by ln V as argued by CB. We see that in the strict infinite volume limit the polymer always collapses, but for finite volume a transition from a stretched to a collapsed form might be observed as a function of the strength of the disorder. For d < 2 and V > V' ∼ exp(g^(2/(2-d)) L^((4-d)/(2-d))) the annealed results are recovered and R ∼ (Lg)^(1/(d-2)), where L is the length of the polymer. Hence the polymer also collapses in the large L limit. The one-step replica symmetry breaking solution is crucial for obtaining the above results. (c) 2000 The American Physical Society

  9. Control architectures for IT management

    International Nuclear Information System (INIS)

    Wang Ting

    2003-01-01

    This paper summarizes the three financial control architectures for an IT department in an enterprise or organization: unallocated cost center, allocated cost center, and profit center. It analyses their characteristics and concludes with detailed suggestions for choosing among these control architectures. (authors)

  10. Stability and replica symmetry in the ising spin glass: a toy model

    International Nuclear Information System (INIS)

    De Dominicis, C.; Mottishaw, P.

    1986-01-01

    Searching for possible replica symmetric solutions in an Ising spin glass (in the tree approximation), we investigate a toy model whose bond distribution has two non-vanishing cumulants (instead of only one, as in a Gaussian distribution).

  11. A cognitive decision agent architecture for optimal energy management of microgrids

    International Nuclear Information System (INIS)

    Velik, Rosemarie; Nicolay, Pascal

    2014-01-01

    Highlights: • We propose an optimization approach for energy management in microgrids. • The optimizer emulates processes involved in human decision making. • Optimization objectives are energy self-consumption and financial gain maximization. • We gain improved optimization results in significantly reduced computation time. - Abstract: Via the integration of renewable energy and storage technologies, buildings have started to change from passive (electricity) consumers to active prosumer microgrids. Along with this development comes a shift from centralized to distributed production and consumption models as well as discussions about the introduction of variable demand–supply-driven grid electricity prices. Together with upcoming ICT and automation technologies, these developments open space to a wide range of novel energy management and energy trading possibilities to optimally use available energy resources. However, what is considered as an optimal energy management and trading strategy heavily depends on the individual objectives and needs of a microgrid operator. Accordingly, elaborating the most suitable strategy for each particular system configuration and operator need can become quite a complex and time-consuming task, which can massively benefit from computational support. In this article, we introduce a bio-inspired cognitive decision agent architecture for optimized, goal-specific energy management in (interconnected) microgrids, which are additionally connected to the main electricity grid. For evaluating the performance of the architecture, a number of test cases are specified targeting objectives like local photovoltaic energy consumption maximization and financial gain maximization. Obtained outcomes are compared against a modified simulated annealing optimization approach in terms of objective achievement and computational effort. Results demonstrate that the cognitive decision agent architecture yields improved optimization results in
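
    The baseline in the comparison above is a simulated annealing optimizer. A generic variant (not the article's modified one) can be sketched as follows; the cooling schedule, step size, and the toy objective are all chosen for illustration only.

```python
import math
import random

# Generic simulated annealing minimizer of the kind used as the baseline
# optimizer above. Parameters and the toy objective are illustrative.

def anneal(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=2000, seed=1):
    rng = random.Random(seed)
    x, t = x0, t0
    best, best_cost = x0, cost(x0)
    for _ in range(steps):
        cand = neighbor(x, rng)
        delta = cost(cand) - cost(x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < best_cost:
                best, best_cost = x, cost(x)
        t *= cooling
    return best, best_cost

# Toy problem: pick a battery charge setpoint minimizing a quadratic cost
# with optimum at 0.6 (a stand-in for a self-consumption objective).
best, c = anneal(lambda x: (x - 0.6) ** 2,
                 lambda x, r: min(1.0, max(0.0, x + r.uniform(-0.1, 0.1))),
                 x0=0.0)
print(round(best, 2))
```

    The agent architecture in the article is compared against such a baseline on both solution quality and the number of cost evaluations needed.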

  12. Tunable hydrodynamic characteristics in microchannels with biomimetic superhydrophobic (lotus leaf replica) walls.

    Science.gov (United States)

    Dey, Ranabir; Raj M, Kiran; Bhandaru, Nandini; Mukherjee, Rabibrata; Chakraborty, Suman

    2014-05-21

    The present work comprehensively addresses the hydrodynamic characteristics through microchannels with lotus leaf replica (exhibiting low adhesion and superhydrophobic properties) walls. The lotus leaf replica is fabricated following an efficient, two-step, soft-molding process and is then integrated with rectangular microchannels. The inherent biomimetic, superhydrophobic surface-liquid interfacial hydrodynamics, and the consequential bulk flow characteristics, are critically analyzed by the micro-particle image velocimetry technique. It is observed that the lotus leaf replica mediated microscale hydrodynamics comprise two distinct flow regimes even within the low Reynolds number paradigm, unlike the commonly perceived solely apparent slip-stick dominated flows over superhydrophobic surfaces. While the first flow regime is characterized by an apparent slip-stick flow culminating in an enhanced bulk throughput rate, the second flow regime exhibits a complete breakdown of the aforementioned laminar and uni-axial flow model, leading to a predominantly no-slip flow. Interestingly, the critical flow condition dictating the transition between the two hydrodynamic regimes is intrinsically dependent on the micro-confinement effect. In this regard, an energetically consistent theoretical model is also proposed to predict the alterations in the critical flow condition with varying microchannel configurations, by addressing the underlying biomimetic surface-liquid interfacial conditions. Hence, the present research endeavour provides a new design-guiding paradigm for developing multi-functional microfluidic devices involving biomimetic, superhydrophobic surfaces, by judicious exploitation of the tunable hydrodynamic characteristics in the two regimes.

  13. Localization-Free Detection of Replica Node Attacks in Wireless Sensor Networks Using Similarity Estimation with Group Deployment Knowledge

    Directory of Open Access Journals (Sweden)

    Chao Ding

    2017-01-01

    Full Text Available Due to the unattended nature and poor security guarantee of the wireless sensor networks (WSNs, adversaries can easily make replicas of compromised nodes, and place them throughout the network to launch various types of attacks. Such an attack is dangerous because it enables the adversaries to control large numbers of nodes and extend the damage of attacks to most of the network with quite limited cost. To stop the node replica attack, we propose a location similarity-based detection scheme using deployment knowledge. Compared with prior solutions, our scheme provides extra functionalities that prevent replicas from generating false location claims without deploying resource-consuming localization techniques on the resource-constraint sensor nodes. We evaluate the security performance of our proposal under different attack strategies through heuristic analysis, and show that our scheme achieves secure and robust replica detection by increasing the cost of node replication. Additionally, we evaluate the impact of network environment on the proposed scheme through theoretic analysis and simulation experiments, and indicate that our scheme achieves effectiveness and efficiency with substantially lower communication, computational, and storage overhead than prior works under different situations and attack strategies.
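
    The similarity idea can be sketched as follows: each location claim for a node ID carries the deployment groups overheard in its neighborhood, and two claims for the same ID whose neighborhoods are nearly disjoint cannot come from one physical node. The similarity measure (Jaccard) and the threshold are assumptions for illustration, not the paper's exact scheme.

```python
# Sketch of similarity-based replica detection without localization:
# claims for one node ID are compared by the deployment groups heard
# around them. Threshold and group IDs are illustrative.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def is_replica(claim_a, claim_b, threshold=0.3):
    """Flag a node ID as replicated if its two claims look unrelated."""
    return jaccard(claim_a, claim_b) < threshold

claim_1 = {"G1", "G2", "G3"}         # groups overheard at one location
claim_2 = {"G7", "G8", "G9"}         # same node ID claimed elsewhere
print(is_replica(claim_1, claim_2))  # disjoint neighborhoods -> True
```

    Because the check uses only overheard group identities, no GPS or other resource-consuming localization is needed on the sensor nodes, which is the point the abstract emphasizes.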

  14. Evaluation of generalized degrees of freedom for sparse estimation by replica method

    Science.gov (United States)

    Sakata, A.

    2016-12-01

    We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
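
    The RS-phase reading of the GDF as the effective number of non-zero components can be illustrated for l1 regularization in the special case of an orthogonal design, where the estimate reduces to soft thresholding and the GDF estimate is simply a count of survivors. The data and threshold below are illustrative.

```python
# In the RS phase the GDF is interpreted as the effective number of
# non-zero fitted components. For l1 regularization with an orthogonal
# design the estimate is the soft-thresholded data, so the GDF estimate
# reduces to counting non-zeros. Data are illustrative.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_orthogonal(y, lam):
    """Lasso solution when the design matrix is orthogonal."""
    return [soft_threshold(z, lam) for z in y]

def gdf_estimate(coefs):
    return sum(1 for c in coefs if c != 0.0)

y = [3.0, -0.5, 0.2, -4.0, 1.1]
beta = lasso_orthogonal(y, lam=1.0)
print(beta, gdf_estimate(beta))  # 3 components survive the threshold
```

    The replica calculation in the paper generalizes this counting picture to random Gaussian predictors in the large-system-size limit.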

  15. An Architecture to Manage Incoming Traffic of Inter-Domain Routing Using OpenFlow Networks

    Directory of Open Access Journals (Sweden)

    Walber José Adriano Silva

    2018-04-01

    Full Text Available The Border Gateway Protocol (BGP) is the current state-of-the-art inter-domain routing protocol between Autonomous Systems (ASes). Although BGP has different mechanisms to manage outbound traffic in an AS domain, it lacks an efficient tool for inbound traffic control from transit ASes such as Internet Service Providers (ISPs). For inter-domain routing, the BGP’s destination-based forwarding paradigm limits the granularity of distributing the network traffic among the multiple paths of the current Internet topology. Thus, this work offers a new architecture to manage incoming traffic in the inter-domain using OpenFlow networks. The architecture explores direct inter-domain communication to exchange control information and the functionalities of the OpenFlow protocol. Based on the measured size of the exchanged messages, the proposed architecture is not only scalable, but also capable of performing load balancing for inbound traffic using different strategies.

  16. Mimicking the action of folding chaperones by Hamiltonian replica-exchange molecular dynamics simulations : Application in the refinement of de novo models

    NARCIS (Netherlands)

    Fan, Hao; Periole, Xavier; Mark, Alan E.

    The efficiency of using a variant of Hamiltonian replica-exchange molecular dynamics (Chaperone H-replica-exchange molecular dynamics [CH-REMD]) for the refinement of protein structural models generated de novo is investigated. In CH-REMD, the interaction between the protein and its environment,

  17. Architecture of a software quench management system

    International Nuclear Information System (INIS)

    Jerzy M. Nogiec et al.

    2001-01-01

    Testing superconducting accelerator magnets is inherently coupled with the proper handling of quenches; i.e., protecting the magnet and characterizing the quench process. Therefore, software implementations must include elements of both data acquisition and real-time controls. The architecture of the quench management software developed at Fermilab's Magnet Test Facility is described. This system consists of quench detection, quench protection, and quench characterization components that execute concurrently in a distributed system. Collaboration between the elements of quench detection, quench characterization and current control are discussed, together with a schema of distributed saving of various quench-related data. Solutions to synchronization and reliability in such a distributed quench system are also presented

  18. Architecture Governance: The Importance of Architecture Governance for Achieving Operationally Responsive Ground Systems

    Science.gov (United States)

    Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik

    2011-01-01

    Topics covered (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance (3) Strategic Elements (3a) Architectural Principles (3b) Architecture Board (3c) Architecture Compliance (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level

  19. Neutron and gamma dose and spectra measurements on the Little Boy replica

    International Nuclear Information System (INIS)

    Hoots, S.; Wadsworth, D.

    1984-01-01

    The radiation-measurement team of the Weapons Engineering Division at Lawrence Livermore National Laboratory (LLNL) measured neutron and gamma dose and spectra on the Little Boy replica at Los Alamos National Laboratory (LANL) in April 1983. This assembly is a replica of the gun-type atomic bomb exploded over Hiroshima in 1945. These measurements support the National Academy of Sciences Program to reassess the radiation doses due to atomic bomb explosions in Japan. Specifically, the following types of information were important: neutron spectra as a function of geometry, gamma to neutron dose ratios out to 1.5 km, and neutron attenuation in the atmosphere. We measured neutron and gamma dose/fission from close-in to a kilometer out, and neutron and gamma spectra at 90° and 30° close-in. This paper describes these measurements and the results. 12 references, 13 figures, 5 tables

  20. System Architecture Modeling for Technology Portfolio Management using ATLAS

    Science.gov (United States)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD) in planning the funding of technology development. While useful to a certain extent, these tools are limited in the ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system level requirements. Second, the modeler identifies technologies of interest and their impact on the SEA. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.

  1. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    Science.gov (United States)

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
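
    The relational side of such a comparison can be sketched with Python's built-in sqlite3 module (standing in here for MySQL/PostgreSQL): store dbSNP-style records, index the query column, and retrieve by rsid. The schema and rows are illustrative, not the dbSNP schema used in the study.

```python
import sqlite3

# Minimal relational sketch of the annotation workload compared above:
# store variant records, index the lookup column, query by rsid.
# Schema and rows are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE annotation (
                  rsid TEXT, chrom TEXT, pos INTEGER, ref TEXT, alt TEXT)""")
rows = [("rs123", "chr1", 12345, "A", "G"),
        ("rs456", "chr2", 67890, "C", "T")]
conn.executemany("INSERT INTO annotation VALUES (?, ?, ?, ?, ?)", rows)
conn.execute("CREATE INDEX idx_rsid ON annotation (rsid)")  # query index

hit = conn.execute("SELECT chrom, pos FROM annotation WHERE rsid = ?",
                   ("rs123",)).fetchone()
print(hit)  # ('chr1', 12345)
```

    A document store such as MongoDB would hold each record as one document and index the `rsid` field; the study's benchmark times exactly these load, index, and query operations across the two models.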

  2. The Architecture of Financial Risk Management Systems

    Directory of Open Access Journals (Sweden)

    Iosif ZIMAN

    2013-01-01

    Full Text Available The architecture of systems dedicated to risk management is probably one of the more complex tasks to tackle in the world of finance. Financial risk has been at the center of attention since the explosive growth of financial markets and even more so after the 2008 financial crisis. At multiple levels, financial companies, financial regulatory bodies, governments and cross-national regulatory bodies have all put the subject of financial risk, and the way it is calculated, managed, reported and monitored, under intense scrutiny. As a result, the technology underpinnings which support the implementation of financial risk systems have evolved considerably and have become one of the most complex areas involving systems and technology in the context of the financial industry. We present the main paradigms, requirements and design considerations when undertaking the implementation of a risk system and give examples of user requirements, sample product coverage and performance parameters.

  3. Enhanced risk management by an emerging multi-agent architecture

    Science.gov (United States)

    Lin, Sin-Jin; Hsu, Ming-Fu

    2014-07-01

    Classification in imbalanced datasets has attracted much attention from researchers in the field of machine learning. Most existing techniques tend not to perform well on minority class instances when the dataset is highly skewed because they focus on minimising the forecasting error without considering the relative distribution of each class. This investigation proposes an emerging multi-agent architecture, grounded on cooperative learning, to solve the class-imbalanced classification problem. Additionally, this study deals further with the obscure nature of the multi-agent architecture and expresses comprehensive rules for auditors. The results from this study indicate that the presented model performs satisfactorily in risk management and is able to tackle a highly class-imbalanced dataset comparatively well. Furthermore, the knowledge visualised process, supported by real examples, can assist both internal and external auditors who must allocate limited detecting resources; they can take the rules as roadmaps to modify the auditing programme.

  4. The Architecture Improvement Method: cost management and systematic learning about strategic product architectures

    NARCIS (Netherlands)

    de Weerd-Nederhof, Petronella C.; Wouters, Marc; Teuns, Steven J.A.; Hissel, Paul H.

    2007-01-01

    The architecture improvement method (AIM) is a method for multidisciplinary product architecture improvement, addressing uncertainty and complexity and incorporating feedback loops, facilitating trade-off decision making during the architecture creation process. The research reported in this paper

  5. Number of replications of dietary surveys required to estimate nutrient intake among pregnant women in Brazil

    Directory of Open Access Journals (Sweden)

    Daniela Saes Sartorelli

    2014-12-01

    Full Text Available Objectives: to determine the number of replications of dietary surveys needed to estimate the usual nutrient intake, and intake in categories of consumption, of pregnant women in Brazil. Methods: a prospective study conducted among 82 pregnant women, in which information on energy and 18 nutrients was obtained from three 24-hour dietary recalls, one in each trimester of pregnancy. Different formulas were used to calculate the number of replications of the method needed to classify the women into categories of intake, based on the within- to between-person variance ratio, and to estimate usual intake, based on the within-person variance. Results: to classify the women into categories, between 11 and 51 replications of the method are needed for a correlation coefficient of 0.9; assuming a correlation coefficient of 0.7, the number of replications ranges from four to 19. To estimate usual intake, between two and 33 replications are needed for a 10% error; for a 20% error, between one and seven replications of the dietary surveys are needed. Conclusions: a large number of replications of dietary surveys is needed to estimate nutrient intake during pregnancy, and using a small number of replications may attenuate associations between diet and maternal and fetal health outcomes.
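
    The two variance-based formulas that underlie such replication counts are standard: for estimating usual intake within D0 percent of the true mean, n = (z * CVw / D0)^2; for ranking individuals with target correlation r between observed and true intake, n = r^2 / (1 - r^2) * (within- to between-person variance ratio). The sketch below uses illustrative inputs, not the study's data.

```python
import math

# Standard formulas for the number of dietary-survey replications.
# cv_within: within-person coefficient of variation (%); d0: tolerated
# error (%); z: normal quantile; variance_ratio: within/between variance.
# All input values below are illustrative.

def days_for_usual_intake(cv_within, d0, z=1.96):
    return math.ceil((z * cv_within / d0) ** 2)

def days_for_ranking(r, variance_ratio):
    return math.ceil(r ** 2 / (1.0 - r ** 2) * variance_ratio)

print(days_for_usual_intake(cv_within=30.0, d0=20.0))  # e.g. 9 replications
print(days_for_ranking(r=0.9, variance_ratio=2.0))
```

    Nutrients with large day-to-day (within-person) variability drive the counts up, which is why the study's ranges reach dozens of replications.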

  6. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management

    OpenAIRE

    Qiang Luo; Yina Ma; Meghana A. Bhatt; P. Read Montague; Jianfeng Feng

    2017-01-01

    Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent...

  7. Information architecture. Volume 3: Guidance

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The purpose of this document, as presented in Volume 1, The Foundations, is to assist the Department of Energy (DOE) in developing and promulgating information architecture guidance. This guidance is aimed at increasing the development of information architecture as a Departmentwide management best practice. This document describes departmental information architecture principles and minimum design characteristics for systems and infrastructures within the DOE Information Architecture Conceptual Model, and establishes a Departmentwide standards-based architecture program. The publication of this document fulfills the commitment to address guiding principles, promote standard architectural practices, and provide technical guidance. This document guides the transition from the baseline or defacto Departmental architecture through approved information management program plans and budgets to the future vision architecture. This document also represents another major step toward establishing a well-organized, logical foundation for the DOE information architecture.

  8. Proactive replica checking to assure reliability of data in cloud storage with minimum replication

    Science.gov (United States)

    Murarka, Damini; Maheswari, G. Uma

    2017-11-01

The two major issues for cloud storage systems are data reliability and storage costs. For data reliability protection, the multi-replica replication strategy mostly used in current clouds incurs huge storage consumption, leading to a large storage cost for applications within the cloud. This paper presents a cost-efficient data reliability mechanism named PRCR to cut back on cloud storage consumption. PRCR ensures the data reliability of large cloud data sets with a minimum level of replication that can also serve as a cost-effective benchmark for replication. Evaluation shows that, compared to the standard three-replica approach, PRCR consumes only a fraction of the cloud storage, from as little as one third, hence considerably minimizing the cloud storage cost.
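The storage-cost argument in the abstract above can be sketched in a few lines. The numbers and the one-replica minimum are illustrative assumptions for the comparison, not the paper's exact policy:

```python
# Hedged sketch of the storage-cost comparison: the conventional approach
# keeps three replicas of every data set, while a PRCR-style scheme keeps
# a minimum number (here one, an illustrative assumption) and relies on
# proactive replica checking to preserve reliability.

def storage_cost(file_sizes_gb, replicas_per_file):
    """Total storage consumed when each file is stored `replicas_per_file` times."""
    return sum(file_sizes_gb) * replicas_per_file

files = [120.0, 80.0, 200.0]            # example data-set sizes in GB

three_replica = storage_cost(files, 3)  # conventional cloud default
minimal = storage_cost(files, 1)        # PRCR-style minimum replication

print(three_replica)                    # 1200.0
print(minimal)                          # 400.0
print(minimal / three_replica)          # one third of the storage
```

With these example figures, minimum replication consumes one third of the three-replica storage, which is the kind of reduction the abstract describes.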

  9. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    Science.gov (United States)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  10. Architecture of a micro grid energy manager; Arquitectura de un gestor energetico de microrredes

    Energy Technology Data Exchange (ETDEWEB)

    Jimeno-Huarte, J.; Anduaga-Muniozgueren, J.; Oyarzabal-Moreno, J.

    2009-07-01

Micro grids are defined as a set of aggregated micro generators and loads operating as a single system. Micro grids need energy management systems in order to coordinate the actions of the elements that compose them. In this way, Micro grids provide useful services both to connected users and to the electrical system. This paper presents the architecture of a Micro grid energy manager applying multi-agent technologies and communication standards. An application of this architecture to the secondary regulation function has been performed using TECNALIA's Micro grid as validation platform. The implementation of the secondary regulation takes economic criteria into account while the technical restrictions of the controlled equipment are fulfilled. (Author) 14 refs.

  11. Data architecture from zen to reality

    CERN Document Server

    Tupper, Charles D

    2011-01-01

    Data Architecture: From Zen to Reality explains the principles underlying data architecture, how data evolves with organizations, and the challenges organizations face in structuring and managing their data. It also discusses proven methods and technologies to solve the complex issues dealing with data. The book uses a holistic approach to the field of data architecture by covering the various applied areas of data, including data modelling and data model management, data quality , data governance, enterprise information management, database design, data warehousing, and warehouse design. This book is a core resource for anyone emplacing, customizing or aligning data management systems, taking the Zen-like idea of data architecture to an attainable reality.

  12. Smart Traffic Management Protocol Based on VANET architecture

    Directory of Open Access Journals (Sweden)

    Amilcare Francesco Santamaria

    2014-01-01

Full Text Available Nowadays one of the hottest themes in wireless-environment research is the application of the newest technologies to road safety and traffic management problems exploiting the VANET architecture. In this work, a novel protocol that aims to achieve better traffic management is proposed. The overall system is able to reduce traffic levels inside the city by exploiting inter-communication among vehicles and support infrastructures, also known as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications. We design a network protocol called Smart Traffic Management Protocol (STMP) that takes advantage of the IEEE 802.11p standard. On each road, several sensor systems are placed and are responsible for monitoring. Gathered data are spread through the network exploiting ad-hoc protocol messages. The increasing knowledge about environmental conditions makes it possible to take preventive actions. Moreover, with real-time monitoring of the lanes it is possible to detect road and city-block congestion in a shorter time. An entity external to the VANET is responsible for managing traffic and rearranging it along the lanes of the city, avoiding high traffic levels.

  13. Proposal of a referential Enterprise Architecture management framework for companies

    Directory of Open Access Journals (Sweden)

    César Esquetini Cáceres

    2014-12-01

    Full Text Available (Received: 2014/11/26 - Accepted: 2014/12/17Enterprise Architecture (EA is conceived nowadays as an essential management activity to visualize and evaluate the future direction of a company. The objective of this paper is to make a literature review on EA to evaluate its role as management tool. It is also explained how EA can fulfill two fundamental purposes, first as a tool for assessing the current situation (self-assessment of an organization; second as a tool to model and simulate future scenarios that allow better decision making for the restructuration and development of improvement plans. Furthermore an analysis is made of the integration possibilities of EA with other business management methodologies, as balanced score card (BSC and the model of the European Foundation for Quality Management (EFQM. As the result a management framework is presented, which includes the required elements to achieve excellence and quality standards in organizations.

  14. Replica Approach for Minimal Investment Risk with Cost

    Science.gov (United States)

    Shinzato, Takashi

    2018-06-01

    In the present work, the optimal portfolio minimizing the investment risk with cost is discussed analytically, where an objective function is constructed in terms of two negative aspects of investment, the risk and cost. We note the mathematical similarity between the Hamiltonian in the mean-variance model and the Hamiltonians in the Hopfield model and the Sherrington-Kirkpatrick model, show that we can analyze this portfolio optimization problem by using replica analysis, and derive the minimal investment risk with cost and the investment concentration of the optimal portfolio. Furthermore, we validate our proposed method through numerical simulations.
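A minimal sketch of the kind of objective function described above, in assumed notation: an investment-risk term plus a linear cost term over portfolio weights (the symbols and the precise form are illustrative, not taken from the paper):

```latex
% Hedged sketch: mean-variance-style risk with a linear cost penalty,
% minimized subject to a budget constraint on the portfolio weights w_i.
\begin{aligned}
H(\vec{w}) &=
  \underbrace{\frac{1}{2N}\sum_{\mu=1}^{p}
    \Big(\sum_{i=1}^{N} w_i x_{i\mu}\Big)^{2}}_{\text{investment risk}}
  \;+\;
  \underbrace{\frac{\gamma}{N}\sum_{i=1}^{N} c_i w_i}_{\text{cost}},
\qquad \text{s.t. } \sum_{i=1}^{N} w_i = N,
\end{aligned}
```

where $x_{i\mu}$ is the return of asset $i$ in scenario $\mu$, $c_i$ a per-asset cost, and $\gamma$ a trade-off parameter. Replica analysis then averages the minimal value of $H$ over the quenched returns $x_{i\mu}$, which is what makes the analogy to the Hopfield and Sherrington-Kirkpatrick Hamiltonians useful.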

  15. Educational strategies for architectural design management : the design of a new curriculum

    NARCIS (Netherlands)

    Prins, M.; Halman, J.I.M.

    1996-01-01

This paper is about the design of a new curriculum on Architectural Design Management Systems. This curriculum is embedded in the Stan Ackermans Institute (SAI). The SAI is a school for continuing post-graduate education in technological design. First, some recent developments in the building industry

  16. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  17. Nuclear research emulsion neutron spectrometry at the Little-Boy replica

    International Nuclear Information System (INIS)

    Gold, R.; Roberts, J.H.; Preston, C.C.

    1985-10-01

Nuclear research emulsions (NRE) have been used to characterize the neutron spectrum emitted by the Little-Boy replica. NRE were irradiated at the Little-Boy surface as well as approximately 2 m from the center of the Little-Boy replica using polar angles of 0°, 30°, 60° and 90°. For the NRE exposed at 2 m, neutron background was determined using shadow shields of borated polyethylene. Emulsion scanning to date has concentrated exclusively on the 2-m, 0° and 2-m, 90° locations. Approximately 5000 proton-recoil tracks have been measured in NRE irradiated at each of these locations. Neutron spectra obtained from these NRE proton-recoil spectra are compared with both liquid scintillator neutron spectrometry and Monte Carlo calculations. NRE and liquid scintillator neutron spectra generally agree within experimental uncertainties at the 2-m, 90° location. However, at the 2-m, 0° location, the neutron spectra derived from these two independent experimental methods differ significantly. NRE spectra and Monte Carlo calculations exhibit general agreement with regard to both intensity as well as energy dependence. Better agreement is attained between theory and experiment at the 2-m, 90° location, where the neutron intensity is considerably higher. 14 refs., 18 figs., 11 tabs

  18. An Agile Enterprise Regulation Architecture for Health Information Security Management

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-01-01

    Abstract Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  19. An agile enterprise regulation architecture for health information security management.

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital.

  20. Architectural management in the digital arena : proceedings of the CIB-W096 conference Vienna 2011, Vienna University of Technology, Austria, 13-14 October 2011

    NARCIS (Netherlands)

    Otter, den A.F.H.J.; Emmitt, S.; Achammer, Ch.

    2011-01-01

    Leading research into architectural design management is the CIB’s working committee W096 Architectural Management. CIB-W096 was officially established in 1993, following a conference on ‘Architectural Management’ at the University of Nottingham in the UK. Since this time the commission has been

  1. Space Station data management system architecture

    Science.gov (United States)

    Mallary, William E.; Whitelaw, Virginia A.

    1987-01-01

    Within the Space Station program, the Data Management System (DMS) functions in a dual role. First, it provides the hardware resources and software services which support the data processing, data communications, and data storage functions of the onboard subsystems and payloads. Second, it functions as an integrating entity which provides a common operating environment and human-machine interface for the operation and control of the orbiting Space Station systems and payloads by both the crew and the ground operators. This paper discusses the evolution and derivation of the requirements and issues which have had significant effect on the design of the Space Station DMS, describes the DMS components and services which support system and payload operations, and presents the current architectural view of the system as it exists in October 1986; one-and-a-half years into the Space Station Phase B Definition and Preliminary Design Study.

  2. Architecture and Patterns for IT Service Management, Resource Planning, and Governance Making Shoes for the Cobbler's Children

    CERN Document Server

    Betz, Charles T

    2011-01-01

    Information technology supports efficient operations, enterprise integration, and seamless value delivery, yet itself is too often inefficient, un-integrated, and of unclear value. This completely rewritten version of the bestselling Architecture and Patterns for IT Service Management, Resource Planning and Governance retains the original (and still unique) approach: apply the discipline of enterprise architecture to the business of large scale IT management itself. Author Charles Betz applies his deep practitioner experience to a critical reading of ITIL 2011, COBIT version 4, the CMMI suite

  3. Program information architecture/document hierarchy. [Information Management Systems, it's components and rationale

    Energy Technology Data Exchange (ETDEWEB)

    Woods, T.W.

    1991-09-01

The Nuclear Waste Management System (NWMS) Management Systems Improvement Strategy (MSIS) (DOE 1990) requires that the information within the computer program and information management system be ordered into a precedence hierarchy for consistency. Therefore, the US Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM), requested Westinghouse Hanford Company to develop a plan for NWMS program information, which the MSIS calls a document hierarchy. This report provides the results of that effort and describes the management system as a "program information architecture." 3 refs., 3 figs.

  4. Enterprise architecture evaluation using architecture framework and UML stereotypes

    Directory of Open Access Journals (Sweden)

    Narges Shahi

    2014-08-01

Full Text Available There is an increasing need for enterprise architecture in numerous organizations with complicated systems and various processes, as support for information technology and for organizational units whose elements maintain complex relationships. Enterprise architecture is so effective that failure to use it is regarded as an organization's institutional inability to manage information technology efficiently. The enterprise architecture process generally consists of three phases: strategic programming of information technology, enterprise architecture programming, and enterprise architecture implementation. Each phase must be implemented sequentially, and a single flaw in any phase may result in a flaw in the whole architecture and, consequently, in extra cost and time. If a model of the problem is mapped out and evaluated before enterprise architecture implementation in the second phase, possible flaws in the implementation process are prevented. In this study, the processes of enterprise architecture are illustrated through UML diagrams, and the architecture is evaluated in the programming phase by transforming the UML diagrams into Petri nets. The results indicate that the high costs of the implementation phase will be reduced.

  5. Fabrication of micropillar substrates using replicas of alpha-particle irradiated and chemically etched PADC films

    International Nuclear Information System (INIS)

    Ng, C.K.M.; Chong, E.Y.W.; Roy, V.A.L.; Cheung, K.M.C.; Yeung, K.W.K.; Yu, K.N.

    2012-01-01

    We proposed a simple method to fabricate micropillar substrates. Polyallyldiglycol carbonate (PADC) films were irradiated by alpha particles and then chemically etched to form a cast with micron-scale spherical pores. A polydimethylsiloxane (PDMS) replica of this PADC film gave a micropillar substrate with micron-scale spherical pillars. HeLa cells cultured on such a micropillar substrate had significantly larger percentage of cells entering S-phase, attached cell numbers and cell spreading areas. - Highlights: ► We proposed a simple method to fabricate micropillar substrates. ► Polyallyldiglycol carbonate films were irradiated and etched to form casts. ► Polydimethylsiloxane replica then formed the micropillar substrates. ► Attachment and proliferation of HeLa cells were enhanced on these substrates.

  6. Dialogue management in a home machine environment : linguistic components over an agent architecture

    OpenAIRE

    Quesada Moreno, José Francisco; García, Federico; Sena Pichardo, María Esther; Bernal Bermejo, José Ángel; Amores Carredano, José Gabriel de

    2001-01-01

    This paper presents the main characteristics of an Agent-based Architecture for the design and implementation of a Spoken Dialogue System. From a theoretical point of view, the system is based on the Information State Update approach, in particular, the system aims at the management of Natural Command Language Dialogue Moves in a Home Machine Environment. Specifically, the paper is focused on the Natural Language Understanding and Dialogue Management Agents...

  7. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    Directory of Open Access Journals (Sweden)

    Abdulaziz Alsahli

    2016-01-01

Full Text Available Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintaining changed requirements, promoting reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The objective of this research is to introduce an innovative approach for handling requirements and architecture changes simultaneously during global software development. The approach makes use of Case-Based Reasoning (CBR) and agile practices. Agile practices make our approach iterative, whereas CBR stores requirements and makes them reusable. Twin Peaks is our base model, meaning that requirements and architecture are handled simultaneously. For this research, grounded theory has been applied, and interviews with domain experts were conducted. Interview and literature transcripts formed the basis of data collection in grounded theory. Physical saturation of the theory has been achieved through a published case study and a developed tool. Expert reviews and statistical analysis have been used for evaluation. The proposed approach resulted in effective change management of requirements and architecture simultaneously during global software development.

  8. Axial displacement of abutments into implants and implant replicas, with the tapered cone-screw internal connection, as a function of tightening torque.

    Science.gov (United States)

    Dailey, Bruno; Jordan, Laurence; Blind, Olivier; Tavernier, Bruno

    2009-01-01

The passive fit of a superstructure on implant abutments is essential to success. One source of error when using a tapered cone-screw internal connection may be the difference between the tightening torque level applied to the abutments by the laboratory technician compared to that applied by the treating clinician. The purpose of this study was to measure the axial displacement of tapered cone-screw abutments into implants and their replicas as a function of the tightening torque level. Twenty tapered cone-screw abutments were selected. Two groups were created: 10 abutments were secured into 10 implants, and 10 abutments were secured into 10 corresponding implant replicas. Each abutment was tightened in increasing increments of 5 Ncm, from 0 Ncm to 45 Ncm, with a torque controller. The length of each sample was measured repeatedly with an electronic digital micrometer. The mean axial displacement for the implant group and the replica group was calculated. The data were analyzed by the Mann-Whitney and Spearman tests. For both groups, there was always an axial displacement of the abutment upon each incremental application of torque. The mean axial displacement values varied between 7 and 12 µm for the implant group and between 6 and 21 µm for the replica group at each 5-Ncm increment. From 0 to 45 Ncm, the total mean axial displacement values were 89 µm for the implant group and 122 µm for the replica group. There was a continuous axial displacement of the abutments into implants and implant replicas when the applied torque was raised from 0 to 45 Ncm. Torque applied above the level recommended by the manufacturer increased the difference in displacement between the two groups.

  9. Difficult Sudoku Puzzles Created by Replica Exchange Monte Carlo Method

    OpenAIRE

    Watanabe, Hiroshi

    2013-01-01

    An algorithm to create difficult Sudoku puzzles is proposed. An Ising spin-glass like Hamiltonian describing difficulty of puzzles is defined, and difficult puzzles are created by minimizing the energy of the Hamiltonian. We adopt the replica exchange Monte Carlo method with simultaneous temperature adjustments to search lower energy states efficiently, and we succeed in creating a puzzle which is the world hardest ever created in our definition, to our best knowledge. (Added on Mar. 11, the ...
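The replica-exchange step at the heart of the method above can be sketched generically. The energy values and inverse temperatures here are stand-ins, not the paper's Sudoku-difficulty Hamiltonian:

```python
import math
import random

# Illustrative sketch of a replica-exchange (parallel-tempering) sweep:
# neighbouring replicas at different inverse temperatures propose to swap
# configurations, which lets low-energy states migrate to cold replicas.

def exchange_step(states, energies, betas, rng=random.random):
    """Attempt to swap each pair of neighbouring replicas.

    A swap between replicas i and i+1 is accepted with probability
    min(1, exp((beta_i - beta_{i+1}) * (E_i - E_{i+1}))).
    """
    for i in range(len(states) - 1):
        delta = (betas[i] - betas[i + 1]) * (energies[i] - energies[i + 1])
        if delta >= 0 or rng() < math.exp(delta):
            states[i], states[i + 1] = states[i + 1], states[i]
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    return states, energies

# Usage: the lower-energy state moves toward the colder (larger-beta) replica.
states, energies = exchange_step(["a", "b"], [5.0, 1.0], [2.0, 0.5])
print(states)    # ['b', 'a']
print(energies)  # [1.0, 5.0]
```

Between exchange sweeps, each replica would run ordinary Metropolis updates at its own temperature; the paper additionally adjusts the temperatures on the fly, which this sketch omits.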

  10. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff's bound over a code ensemble. We show that the resulting bound in the framework can be directly assessed by the replica method, which has been developed in statistical mechanics of disordered systems, whereas in Gallager's original methodology further replacement by another bound utilizing Jensen's inequality is necessary. Our approach associates a seemingly ad hoc restriction with respect to an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles including low density parity check codes, although its mathematical justification is still open.

  11. Distributed Prognostics and Health Management with a Wireless Network Architecture

    Science.gov (United States)

    Goebel, Kai; Saha, Sankalita; Sha, Bhaskar

    2013-01-01

    A heterogeneous set of system components monitored by a varied suite of sensors and a particle-filtering (PF) framework, with the power and the flexibility to adapt to the different diagnostic and prognostic needs, has been developed. Both the diagnostic and prognostic tasks are formulated as a particle-filtering problem in order to explicitly represent and manage uncertainties in state estimation and remaining life estimation. Current state-of-the-art prognostic health management (PHM) systems are mostly centralized in nature, where all the processing is reliant on a single processor. This can lead to a loss in functionality in case of a crash of the central processor or monitor. Furthermore, with increases in the volume of sensor data as well as the complexity of algorithms, traditional centralized systems become for a number of reasons somewhat ungainly for successful deployment, and efficient distributed architectures can be more beneficial. The distributed health management architecture is comprised of a network of smart sensor devices. These devices monitor the health of various subsystems or modules. They perform diagnostics operations and trigger prognostics operations based on user-defined thresholds and rules. The sensor devices, called computing elements (CEs), consist of a sensor, or set of sensors, and a communication device (i.e., a wireless transceiver beside an embedded processing element). The CE runs in either a diagnostic or prognostic operating mode. The diagnostic mode is the default mode where a CE monitors a given subsystem or component through a low-weight diagnostic algorithm. If a CE detects a critical condition during monitoring, it raises a flag. Depending on availability of resources, a networked local cluster of CEs is formed that then carries out prognostics and fault mitigation by efficient distribution of the tasks. It should be noted that the CEs are expected not to suspend their previous tasks in the prognostic mode. When the
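The particle-filtering formulation described above can be illustrated with a minimal bootstrap update. The model and likelihood here are illustrative assumptions, not the paper's algorithms:

```python
import math
import random

# Minimal bootstrap particle-filter measurement update: reweight particles
# by the likelihood of the new observation, normalize, then resample so
# that probable states are represented by more particles. This is the
# mechanism by which uncertainty in state and remaining-life estimates is
# represented explicitly.

def pf_update(particles, weights, likelihood):
    """One measurement update: reweight by `likelihood`, then resample."""
    # 1. Reweight by how well each particle explains the new measurement.
    weights = [w * likelihood(p) for w, p in zip(weights, particles)]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 2. Multinomial resampling concentrates particles in likely regions.
    particles = random.choices(particles, weights=weights, k=len(particles))
    # 3. After resampling, all particles carry equal weight again.
    return particles, [1.0 / len(particles)] * len(particles)

# Usage: with a Gaussian-like likelihood centred on an observation of 1.0,
# particles near 1.0 tend to survive resampling.
random.seed(0)
ps, ws = pf_update([0.0, 1.0, 2.0], [1 / 3] * 3,
                   likelihood=lambda x: math.exp(-(x - 1.0) ** 2))
```

In the distributed setting described above, each computing element would run such an update locally on its own sensor stream, escalating to a networked cluster only when a critical condition is flagged.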

  12. Replica sizing strategy for aortic valve replacement improves haemodynamic outcome of the epic supra valve.

    Science.gov (United States)

    Gonzalez-Lopez, David; Faerber, Gloria; Diab, Mahmoud; Amorim, Paulo; Zeynalov, Natig; Doenst, Torsten

    2017-10-01

    Current sizing strategies suggest valve selection based on annulus diameter despite supra-annular placement of biological prostheses potentially allowing placement of a larger size. We assessed the frequency of selecting a larger prosthesis if prosthesis size was selected using a replica (upsizing) and evaluated its impact on haemodynamics. We analysed all discharge echocardiograms between June 2012 and June 2014, where a replica sizer was used for isolated aortic valve replacement (Epic Supra: 266 patients, Trifecta: 49 patients). Upsizing was possible in 71% of the Epic Supra valves (by 1 size: 168, by 2 sizes: 20) and in 59% of the Trifectas (by 1 size: 26, by 2 sizes: 3). Patients for whom upsizing was possible had the lowest pressure gradients within their annulus size groups. The difference was significant in annulus diameters of 21-22 or 25-26 mm (Epic Supra) and 23-24 mm (Trifecta). Trifecta gradients were the lowest. However, the ability to upsize the Epic Supra by 2 sizes eliminated the differences between Epic Supra and Trifecta. Upsizing did not cause intraoperative complications. Using replica sizers for aortic prosthesis size selection allows the implantation of bigger prostheses than recommended in most cases and reduces postoperative gradients, specifically for Epic Supra. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  13. Effectively Managing the Air Force Enterprise Architecture

    National Research Council Canada - National Science Library

    Sharkey, Jamie P

    2005-01-01

    The Air Force is developing and implementing an enterprise architecture to meet the Clinger-Cohen Act's requirement that all federal agencies use an architecture to guide their information technology (IT) investments...

  14. Inferring predator behavior from attack rates on prey-replicas that differ in conspicuousness.

    Directory of Open Access Journals (Sweden)

    Yoel E Stuart

    Full Text Available Behavioral ecologists and evolutionary biologists have long studied how predators respond to prey items novel in color and pattern. Because a predatory response is influenced by both the predator's ability to detect the prey and a post-detection behavioral response, variation among prey types in conspicuousness may confound inference about post-prey-detection predator behavior. That is, a relatively high attack rate on a given prey type may result primarily from enhanced conspicuousness and not predators' direct preference for that prey. Few studies, however, account for such variation in conspicuousness. In a field experiment, we measured predation rates on clay replicas of two aposematic forms of the poison dart frog Dendrobates pumilio, one novel and one familiar, and two cryptic controls. To ask whether predators prefer or avoid a novel aposematic prey form independently of conspicuousness differences among replicas, we first modeled the visual system of a typical avian predator. Then, we used this model to estimate replica contrast against a leaf litter background to test whether variation in contrast alone could explain variation in predator attack rate. We found that absolute predation rates did not differ among color forms. Predation rates relative to conspicuousness did, however, deviate significantly from expectation, suggesting that predators do make post-detection decisions to avoid or attack a given prey type. The direction of this deviation from expectation, though, depended on assumptions we made about how avian predators discriminate objects from the visual background. Our results show that it is important to account for prey conspicuousness when investigating predator behavior and also that existing models of predator visual systems need to be refined.

  15. An Open Distributed Architecture for Sensor Networks for Risk Management

    Directory of Open Access Journals (Sweden)

    Ralf Denzer

    2008-03-01

    Full Text Available Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word ‘sensors’ covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth’s surface, to instruments situated in deep boreholes and on the sea floor, providing highly detailed point-based information from single sites. Data from such sensors are used in all stages of risk management: hazard, vulnerability and risk assessment in the pre-event phase; information to provide on-site help during the crisis phase; and data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk, and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in many ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, their data are often difficult to access due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. Therefore, the current situation leads to a lack of efficiency and limited use of the available data, which have an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects: ‘Open Architecture and Spatial Data

  16. Fabrication of micropillar substrates using replicas of alpha-particle irradiated and chemically etched PADC films

    Energy Technology Data Exchange (ETDEWEB)

    Ng, C.K.M. [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong); Chong, E.Y.W. [Department of Orthopaedics and Traumatology, University of Hong Kong (Hong Kong); Roy, V.A.L. [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong); Cheung, K.M.C.; Yeung, K.W.K. [Department of Orthopaedics and Traumatology, University of Hong Kong (Hong Kong); Yu, K.N., E-mail: appetery@cityu.edu.hk [Department of Physics and Materials Science, City University of Hong Kong, Tat Chee Avenue, Kowloon Tong (Hong Kong)

    2012-07-15

    We proposed a simple method to fabricate micropillar substrates. Polyallyldiglycol carbonate (PADC) films were irradiated by alpha particles and then chemically etched to form a cast with micron-scale spherical pores. A polydimethylsiloxane (PDMS) replica of this PADC film gave a micropillar substrate with micron-scale spherical pillars. HeLa cells cultured on such a micropillar substrate showed a significantly larger percentage of cells entering S-phase, higher attached cell numbers, and larger cell spreading areas. - Highlights: • We proposed a simple method to fabricate micropillar substrates. • Polyallyldiglycol carbonate films were irradiated and etched to form casts. • Polydimethylsiloxane replicas then formed the micropillar substrates. • Attachment and proliferation of HeLa cells were enhanced on these substrates.

  17. An Empirical Investigation of Architectural Heritage Management Implications for Tourism: The Case of Portugal

    Directory of Open Access Journals (Sweden)

    Shahrbanoo Gholitabar

    2018-01-01

    Full Text Available The aims of this study are manifold: first, to investigate the potential of architectural heritage in the context of tourism destination development, and to examine public sector policies and plans toward the preservation of these resources; secondly, to appraise the outcome of preservation and its implications for tourism. The study is an effort to explore and understand the interrelationships between tourism and architectural heritage sites through tourist image and perception. For the purposes of this research, numerous heritage sites were sampled in Portugal. A mixed research method was utilized: a quantitative approach to gauge tourists’ image/perception of heritage resources and their impact, and a qualitative approach to assess the priorities of tourists in their visits and public-sector policies toward heritage resource management and planning. The fuzzy logic method was used to assess the architectural value and the tourist and preservation potential of historical buildings in Porto/Aveiro. The contribution and implications of the study are also explained. The results revealed that architectural heritage resources have the most appeal to tourists. The study demonstrates the architectural value and the tourist and preservation potential of the buildings observed via evaluation by fuzzy logic methods.
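
    The fuzzy evaluation referred to above can be sketched as follows; the criteria, membership breakpoints, and example buildings are illustrative assumptions, not values from the study:

```python
# Illustrative fuzzy scoring of historical buildings. The criterion names,
# the triangular membership breakpoints, and the building scores below are
# assumptions for demonstration only.

def triangular(x, a, b, c):
    """Triangular membership function: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_value(criteria):
    """Aggregate per-criterion membership degrees with a fuzzy AND (min)."""
    degrees = [triangular(v, 0.0, 0.7, 1.4) for v in criteria.values()]
    return min(degrees)

# Hypothetical criterion scores in [0, 1] for two buildings.
buildings = {
    "Building A": {"age": 0.3, "style": 0.9, "integrity": 0.8},
    "Building B": {"age": 0.9, "style": 0.8, "integrity": 0.7},
}
ranked = sorted(buildings, key=lambda b: fuzzy_value(buildings[b]), reverse=True)
```

    Using min as the aggregator is the classical Mamdani-style conjunction; a weighted mean would be an equally plausible choice for ranking preservation potential.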

  18. Information architecture. Volume 1, The foundations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    The Information Management Planning and Architecture Coordinating Team was formed to establish an information architecture framework to meet DOE's current and future information needs. This department-wide activity was initiated in accordance with the DOE Information Management Strategic Plan; it also supports the Departmental Strategic Plan. It recognizes recent changes in emphasis as reflected in OMB Circular A-130 and the Information Resources Management Planning Process Improvement Team recommendations. The sections of this document provide the foundation for establishing DOE's Information Architecture: Background; Business Case (reduced duplication of effort, increased integration of activities, improved operational capabilities); Baseline (the technology baseline currently in place within DOE); Vision (guiding principles for the future DOE Information Architecture); Standards Process; Policy and Process Integration (describing the relations between the information architecture and business processes); and Next Steps. Each section is followed by a scenario. A glossary of terms is provided.

  19. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    Science.gov (United States)

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.
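
    As a toy illustration of the kind of interoperable record such an architecture might exchange, the sketch below serializes continuous glucose readings to JSON; every field name is an assumption, taken neither from the article nor from any medical-device standard:

```python
# A minimal, hypothetical schema for exchanging continuous glucose readings
# between devices and a central repository. Field names are illustrative
# assumptions only.
import json
from dataclasses import dataclass, asdict

@dataclass
class GlucoseReading:
    patient_id: str
    device_id: str
    timestamp: str        # ISO 8601, e.g. "2008-09-01T07:30:00Z"
    glucose_mg_dl: float  # measured glucose concentration

def to_interchange(readings):
    """Serialize readings to a JSON document any peer system can parse."""
    return json.dumps([asdict(r) for r in readings], sort_keys=True)

batch = [GlucoseReading("p-001", "cgm-17", "2008-09-01T07:30:00Z", 104.0)]
doc = to_interchange(batch)
restored = json.loads(doc)
```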

  20. Every Second Counts: Integrating Edge Computing and Service Oriented Architecture for Automatic Emergency Management

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2018-01-01

    Full Text Available Emergency management has long been recognized as a social challenge due to the criticality of the response time. In emergency situations such as severe traffic accidents, minimizing the response time, which requires close collaboration between all stakeholders involved and distributed intelligence support, leads to a greater survival chance for the injured. However, the current response system is far from efficient, despite the rapid development of information and communication technologies. This paper presents an automated collaboration framework for emergency management that coordinates all stakeholders within the emergency response system and fully automates the rescue process. Applying the concept of multi-access edge computing, as well as choreography from service-oriented architecture, the system allows seamless coordination between multiple organizations in a distributed way through standard web services. A service choreography is designed to globally model the emergency management process from the time an accident occurs until the rescue is finished. The choreography can be synthesized to generate a detailed specification of the peer-to-peer interaction logic, and the specification can then be enacted and deployed on cloud infrastructures.
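
    A choreographed (orchestrator-free) rescue flow of the kind described can be simulated in a few lines: each party reacts to events and emits follow-up events on its own. The event and service names below are invented for illustration:

```python
# Toy simulation of a choreographed emergency response: there is no central
# orchestrator; each service subscribes to events and emits new ones. The
# event names and handlers are illustrative assumptions.
from collections import defaultdict

handlers = defaultdict(list)
log = []

def on(event_type):
    """Register a service's reaction to an event type."""
    def register(fn):
        handlers[event_type].append(fn)
        return fn
    return register

def emit(event_type, payload):
    log.append(event_type)
    for fn in list(handlers[event_type]):
        fn(payload)

@on("accident_detected")
def dispatch_ambulance(payload):
    emit("ambulance_dispatched", {"to": payload["location"]})

@on("ambulance_dispatched")
def prepare_hospital(payload):
    emit("hospital_notified", {"eta_min": 12})

emit("accident_detected", {"location": "E4 km 23"})
```

    In a real deployment each handler would be a standard web service at a different organization; the shared dictionary here merely stands in for the network.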

  1. A Risk Management Architecture for Emergency Integrated Aircraft Control

    Science.gov (United States)

    McGlynn, Gregory E.; Litt, Jonathan S.; Lemon, Kimberly A.; Csank, Jeffrey T.

    2011-01-01

    Enhanced engine operation--operation that is beyond normal limits--has the potential to improve the adaptability and safety of aircraft in emergency situations. Intelligent use of enhanced engine operation to improve the handling qualities of the aircraft requires sophisticated risk estimation techniques and a risk management system that spans the flight and propulsion controllers. In this paper, an architecture that weighs the risks of the emergency and of possible engine performance enhancements to reduce overall risk to the aircraft is described. Two examples of emergency situations are presented to demonstrate the interaction between the flight and propulsion controllers to facilitate the enhanced operation.
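
    One way to picture the trade-off described above is as a comparison of aggregate hazard probabilities with and without enhanced engine operation. This sketch is not NASA's actual risk-management algorithm, and all probabilities are made-up placeholders:

```python
# Illustrative risk trade-off: enable enhanced engine operation only when the
# reduction in loss-of-control risk outweighs the added engine-failure risk.
# Not NASA's actual method; all numbers are placeholders.

def total_risk(p_loss_of_control, p_engine_failure):
    """Probability that at least one hazard occurs, assuming independence."""
    return 1.0 - (1.0 - p_loss_of_control) * (1.0 - p_engine_failure)

def should_enhance(baseline, enhanced):
    """True if enhanced operation lowers the overall risk to the aircraft."""
    return total_risk(*enhanced) < total_risk(*baseline)

baseline = (0.20, 0.01)   # degraded handling qualities, nominal engine risk
enhanced = (0.05, 0.04)   # better handling, higher engine stress
decision = should_enhance(baseline, enhanced)
```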

  2. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
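
    The single-factor return model referred to above is conventionally written as follows (standard notation, assumed rather than taken from the paper):

```latex
% Single-factor model for asset returns (textbook form; the paper's exact
% notation is an assumption).
\begin{align}
  r_i &= \beta_i f + \varepsilon_i, \qquad i = 1, \dots, N, \\
  \operatorname{Cov}(r_i, r_j) &= \beta_i \beta_j \sigma_f^2
      + \delta_{ij}\,\sigma_\varepsilon^2 ,
\end{align}
% so a nonzero common factor f induces exactly the cross-asset correlations
% whose effect on the minimal investment risk the replica analysis quantifies;
% setting sigma_f = 0 recovers the independent-return case used for comparison.
```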

  3. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: These address the problems of management of evolution, and overview, comprehension and navigation respectively. The first main contribution is the Architectural Software Configuration...... Management Model: A software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture forms the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: A visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  4. Communications System Architecture Development for Air Traffic Management and Aviation Weather Information Dissemination

    Science.gov (United States)

    Gallagher, Seana; Olson, Matt; Blythe, Doug; Heletz, Jacob; Hamilton, Griff; Kolb, Bill; Homans, Al; Zemrowski, Ken; Decker, Steve; Tegge, Cindy

    2000-01-01

    This document is the NASA AATT Task Order 24 Final Report. NASA Research Task Order 24 calls for the development of eleven distinct task reports. Each task was a necessary exercise in the development of a comprehensive communications system architecture (CSA) for air traffic management and aviation weather information dissemination for 2015, the definition of the interim architecture for 2007, and the transition plan to achieve the desired End State. The eleven tasks are summarized along with the associated Task Order references. The output of each task was an individual task report. The task reports that make up the main body of this document include Tasks 5, 6, 7, 8, 10, and 11. The other tasks provide the supporting detail used in the development of the architecture; these reports are included in the appendices. The detailed user needs, functional communications requirements, and engineering requirements associated with Tasks 1, 2, and 3 have been put into a relational database and are provided electronically.

  5. ENTERPRISE ARCHITECTURE: AN INTERFACE CONCEPT BETWEEN THE ECONOMICS AND THE MANAGEMENT OF THE FIRM

    Directory of Open Access Journals (Sweden)

    José Carlos Cavalcanti

    2010-01-01

    Full Text Available This paper aims to broadly discuss a subject that serves as an interface between the economics and the management of the firm: the Enterprise Architecture. This concept is viewed here as the most appropriate means to understand the impact of information content, information systems, and information and communication technologies (ICTs) on the internal technological and organizational choices of the firm. In support of this argument the paper relies on three main steps. Initially, a brief review of the main theories (economic and management) of the firm is made, highlighting their contributions, caveats and convergences. Then the paper bases its analysis on the concept of the firm as an “engine of information” and on a concept from computing science and engineering, Enterprise Architecture, to point out that these concepts bring important contributions towards a more consistent interpretation of what the firm is (or how it is organized) today, when it is practically impossible for a firm to exist without modern information tools. Finally, an innovative methodology is presented, in analogy to the Structure-Conduct-Performance paradigm (traditionally used in empirical market analysis), which identifies the firm according to three linearly connected approaches: its architecture, its governance, and its growth strategy.


  7. Efficacy of independence sampling in replica exchange simulations of ordered and disordered proteins.

    Science.gov (United States)

    Lee, Kuo Hao; Chen, Jianhan

    2017-11-15

    Recasting temperature replica exchange (T-RE) as a special case of Gibbs sampling has led to a simple and efficient scheme for enhanced mixing (Chodera and Shirts, J. Chem. Phys., 2011, 135, 194110). To critically examine whether T-RE with independence sampling (T-REis) improves conformational sampling, we performed T-RE and T-REis simulations of ordered and disordered proteins using coarse-grained and atomistic models. The results demonstrate that T-REis effectively increases replica mobility in temperature space with minimal computational overhead, especially for folded proteins. However, enhanced mixing does not translate well into improved conformational sampling. The convergence of the thermodynamic properties of interest is similar, with slight improvements for T-REis of ordered systems. The study re-affirms that the efficiency of T-RE does not appear to be limited by temperature diffusion, but by the inherent rates of spontaneous large-scale conformational re-arrangements. Due to its simplicity and efficacy of enhanced mixing, T-REis is expected to be more effective when incorporated with various Hamiltonian-RE protocols. © 2017 Wiley Periodicals, Inc.
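
    The conventional T-RE neighbor-swap criterion that T-REis generalizes can be sketched as follows. This is the textbook Metropolis rule; the Gibbs/independence-sampling variant instead draws whole replica-to-temperature assignments, which this sketch does not reproduce:

```python
# Standard temperature replica exchange: neighboring replicas on the
# temperature ladder attempt to swap with the Metropolis probability
# min(1, exp((beta_i - beta_j) * (E_i - E_j))). Each replica is represented
# here only by its current potential energy.
import math
import random

def swap_probability(beta_i, beta_j, E_i, E_j):
    """Metropolis acceptance probability for exchanging two replicas."""
    return min(1.0, math.exp((beta_i - beta_j) * (E_i - E_j)))

def attempt_swaps(betas, energies, rng=random.random):
    """One sweep of neighbor swap attempts over the temperature ladder."""
    for k in range(len(betas) - 1):
        if rng() < swap_probability(betas[k], betas[k + 1],
                                    energies[k], energies[k + 1]):
            # Exchange the configurations (here, just their energies).
            energies[k], energies[k + 1] = energies[k + 1], energies[k]
    return energies
```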

  8. Order parameter free enhanced sampling of the vapor-liquid transition using the generalized replica exchange method.

    Science.gov (United States)

    Lu, Qing; Kim, Jaegil; Straub, John E

    2013-03-14

    The generalized Replica Exchange Method (gREM) is extended into the isobaric-isothermal ensemble, and applied to simulate a vapor-liquid phase transition in Lennard-Jones fluids. Merging an optimally designed generalized ensemble sampling with replica exchange, gREM is particularly well suited for the effective simulation of first-order phase transitions characterized by "backbending" in the statistical temperature. While the metastable and unstable states in the vicinity of the first-order phase transition are masked by the enthalpy gap in temperature replica exchange method simulations, they are transformed into stable states through the parameterized effective sampling weights in gREM simulations, and join vapor and liquid phases with a succession of unimodal enthalpy distributions. The enhanced sampling across metastable and unstable states is achieved without the need to identify a "good" order parameter for biased sampling. We performed gREM simulations at various pressures below and near the critical pressure to examine the change in behavior of the vapor-liquid phase transition at different pressures. We observed a crossover from the first-order phase transition at low pressure, characterized by the backbending in the statistical temperature and the "kink" in the Gibbs free energy, to a continuous second-order phase transition near the critical pressure. The controlling mechanisms of nucleation and continuous phase transition are evident and the coexistence properties and phase diagram are found in agreement with literature results.

  9. A dynamic replication management strategy in distributed GIS

    Science.gov (United States)

    Pan, Shaoming; Xiong, Lian; Xu, Zhengquan; Chong, Yanwen; Meng, Qingxiang

    2018-03-01

    Replication is one of the effective solutions for meeting service response time requirements, preparing data in advance to avoid the delay of reading data from disks. This paper presents a new method to create copies that considers the selection of the replica set, the number of copies for each replica, and the placement strategy of all copies. First, the popularities of all data are computed considering both the historical access records and the timeliness of those records. The replica set is then selected based on recent popularity. An enhanced Q-value scheme is proposed to assign the number of copies for each replica. Finally, a copy placement strategy is designed to meet the requirement of load balance. In addition, we present several experiments that compare the proposed method with other replication management strategies. The results show that the proposed model performs better than the other algorithms in all respects. Experiments with different parameters also demonstrate the effectiveness and adaptability of the proposed algorithm.
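
    The popularity computation described above might look like the following sketch, where each recorded access decays with its age so that recent accesses count more. The exponential decay form and the top-k selection are assumptions; the paper's enhanced Q-value copy-assignment scheme is not reproduced here:

```python
# Time-decayed popularity scoring for replica selection (illustrative).
# Each access contributes a weight that halves every `half_life` hours.
import math

def popularity(access_times, now, half_life=24.0):
    """Sum of exponentially decayed weights over the access history (hours)."""
    lam = math.log(2) / half_life
    return sum(math.exp(-lam * (now - t)) for t in access_times)

def select_replicas(access_log, now, k=2):
    """Pick the k most popular data items as the replica set."""
    scored = {item: popularity(times, now) for item, times in access_log.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]

access_log = {
    "tile_a": [0.0, 1.0, 2.0],     # many accesses, but long ago (hours)
    "tile_b": [47.0, 47.5, 48.0],  # recent accesses
    "tile_c": [46.0],
}
chosen = select_replicas(access_log, now=48.0)
```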

  10. Peer-To-Peer Architectures in Distributed Data Management Systems for Large Hadron Collider Experiments

    CERN Document Server

    Lo Presti, Giuseppe; Lo Re, G; Orsini, L

    2005-01-01

    The main goal of the presented research is to investigate Peer-to-Peer architectures and to leverage distributed services to support networked autonomous systems. The research work focuses on development and demonstration of technologies suitable for providing autonomy and flexibility in the context of distributed network management and distributed data acquisition. A network management system enables the network administrator to monitor a computer network and properly handle any failure that can arise within the network. An online data acquisition (DAQ) system for high-energy physics experiments has to collect, combine, filter, and store for later analysis a huge amount of data, describing subatomic particles collision events. Both domains have tight constraints which are discussed and tackled in this work. New emerging paradigms have been investigated to design novel middleware architectures for such distributed systems, particularly the Active Networks paradigm and the Peer-to-Peer paradigm. A network man...

  11. Impact of Channel Estimation Errors on Multiuser Detection via the Replica Method

    Directory of Open Access Journals (Sweden)

    Li Husheng

    2005-01-01

    Full Text Available For practical wireless DS-CDMA systems, channel estimation is imperfect due to noise and interference. In this paper, the impact of channel estimation errors on multiuser detection (MUD is analyzed under the framework of the replica method. System performance is obtained in the large system limit for optimal MUD, linear MUD, and turbo MUD, and is validated by numerical results for finite systems.

  12. A COMPARATIVE STUDY OF SYSTEM NETWORK ARCHITECTURE Vs DIGITAL NETWORK ARCHITECTURE

    OpenAIRE

    Seema; Mukesh Arya

    2011-01-01

    Efficient management of resources is mandatory for the successful running of any network. This paper describes two of the most popular network architectures: Systems Network Architecture (SNA), developed by IBM, and Digital Network Architecture (DNA). As we know, network standards and protocols are needed by network developers as well as users. Some standards are the IEEE 802.3 standards (The Institute of Electrical and Electronics Engineers, 1980) (LAN), IBM Sta...

  13. Using the multi-objective optimization replica exchange Monte Carlo enhanced sampling method for protein-small molecule docking.

    Science.gov (United States)

    Wang, Hongrui; Liu, Hongwei; Cai, Leixin; Wang, Caixia; Lv, Qiang

    2017-07-10

    In this study, we extended the replica exchange Monte Carlo (REMC) sampling method to protein-small molecule docking conformational prediction using RosettaLigand. In contrast to the traditional Monte Carlo (MC) and REMC sampling methods, our methods use multi-objective optimization Pareto front information to facilitate the selection of replicas for exchange. The Pareto front information, generated to select lower-energy conformations as representative replica structures, can facilitate convergence over the available conformational space, including available near-native structures. Furthermore, our approach directly provides min-min scenario Pareto optimal solutions, as well as a hybrid of the min-min and max-min scenario Pareto optimal solutions with lower-energy conformations, for use as structure templates in the REMC sampling method. These methods were validated through a thorough analysis of a benchmark data set containing 16 benchmark test cases. An in-depth comparison between the MC, REMC, multi-objective optimization-REMC (MO-REMC), and hybrid MO-REMC (HMO-REMC) sampling methods was performed to illustrate the differences between the four conformational search strategies. Our findings demonstrate that the MO-REMC and HMO-REMC conformational sampling methods are powerful approaches for obtaining protein-small molecule docking conformational predictions based on the binding energy of complexes in RosettaLigand.
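
    Extracting the min-min Pareto front used to pick representative replica conformations can be sketched as below; the two objectives are placeholders (e.g. interface energy and total energy), not RosettaLigand's actual score terms:

```python
# Min-min Pareto front extraction: keep the conformations not dominated in
# either objective (both objectives minimized). Objective values below are
# illustrative placeholders.

def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    """Return the non-dominated subset (the min-min Pareto front)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (objective_1, objective_2) for four hypothetical conformations:
conformations = [(-12.0, 3.0), (-10.0, 1.0), (-8.0, 2.5), (-12.0, 3.5)]
front = pareto_front(conformations)
```

    The front trades one objective off against the other; replicas for exchange would then be drawn from this non-dominated set rather than ranked by a single score.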

  14. Application of Data Architecture Model in Enterprise Management

    Directory of Open Access Journals (Sweden)

    Shi Song

    2017-01-01

    Full Text Available In today's era of rapid information development and high-speed expansion of data volumes, communication, sharing, and integration are difficult in legacy systems. To integrate data resources, eliminate the “information island”, and build an enterprise development blueprint, people have gradually realized the importance of top-level design. Many enterprises have established their own top-level enterprise architecture design, and the data architecture model at its core also varies across industries according to their different stages of development. This paper mainly studies the data architecture model, expounding its role and its relationships.

  15. Analysis of control and management plane for hybrid fiber radio architectures

    DEFF Research Database (Denmark)

    Kardaras, Georgios; Pham, Tien Thang; Soler, José

    2010-01-01

    This paper presents the existing Radio over Fiber (RoF) architectures and focuses on the control and management plane of the Remote Antenna Unit (RAU). Broadband wireless standards, such as WiMAX and LTE, incorporate optical technologies following the distributed base station concept. The control...... and management of the RAU becomes a critical task, since it can facilitate allocation of resources, configuration and upgrade of the remote unit and constant monitoring of its performance. In the case of baseband over fiber, two protocols (OBSAI and CPRI) introduce a well-defined control and management plane....... In the case of intermediate/radio frequency over fiber, this paper presents a simple approach, which can provide configurability and real-time monitoring of the RAU over the same optical link. This is realized by multiplexing high frequency user data with baseband frequency control data at the Central Office...

  16. Three-Dimensional Interpretation of Sculptural Heritage with Digital and Tangible 3D Printed Replicas

    Science.gov (United States)

    Saorin, José Luis; Carbonell-Carrera, Carlos; Cantero, Jorge de la Torre; Meier, Cecile; Aleman, Drago Diaz

    2017-01-01

    Spatial interpretation features as a skill to be acquired in educational curricula. The visualization and interpretation of three-dimensional objects on tactile devices, and the possibility of digital manufacturing with 3D printers, offer an opportunity to include replicas of sculptures in teaching and, thus, facilitate the 3D interpretation of…

  17. PHYSICAL DISABILITY, STIGMA, AND PHYSICAL ACTIVITY IN CHILDREN: A REPLICA STUDY

    Directory of Open Access Journals (Sweden)

    Markus GEBHARDT

    2016-04-01

    Full Text Available Introduction: Stereotypes can be reduced through positive descriptions. The stigma that able-bodied adults hold towards children with physical disability can be reduced when the child is portrayed as being active. A previous study found that a sporty active child who uses a wheelchair is perceived as more competent than a sporty active able-bodied child. Objective: This study is a replica study to support those hypotheses and to examine the stereotypes of able-bodied adults towards children with and without (physical) disabilities. Methods: This study presents two experimental replica studies using a 2 (physical disability) x 2 (sporting activity) design. The dependent variables were the perception of competence and warmth according to the Stereotype Content Model (SCM). Study 1 is an online experiment with 355 students of the Open University of Hagen. Study 2 surveyed 1176 participants (from Munich and Graz) with a paper-pencil questionnaire. Results: The significant interaction effect was not supported by our studies. The sporty able-bodied child was rated higher in competence than the sporty child who uses a wheelchair. Sporting activity only slightly reduces the stigma towards children with a physical disability. Conclusion: The stigma towards children with physical disability can be reduced when the child is portrayed as being active, but the effect was not strong enough to change the original classification by the SCM.

  18. Energy Management Systems and tertiary regulation in hierarchical control architectures for islanded micro-grids

    DEFF Research Database (Denmark)

    Sanseverino, Eleonora Riva; Di Silvestre, Maria Luisa; Quang, Ninh Nguyen

    2015-01-01

    In this paper, the structure of the highest level of a hierarchical control architecture for micro-grids is proposed. This structure includes two sub-levels: the Energy Management System (EMS) and the tertiary regulation. The first is devoted to energy resource allocation in each time slot based

  19. Toward Measures for Software Architectures

    National Research Council Canada - National Science Library

    Chastek, Gary; Ferguson, Robert

    2006-01-01

    .... Defining these architectural measures is very difficult. The software architecture deeply affects subsequent development and project management decisions, such as the breakdown of the coding tasks and the definition of the development increments...

  20. Developing intelligent transportation systems using the national ITS architecture: an executive edition for senior transportation managers

    Science.gov (United States)

    1998-02-01

    This document has been produced to provide senior transportation managers of state and local departments of transportation with practical guidance for deploying Intelligent Transportation Systems (ITS) consistent with the National ITS Architecture. T...

  1. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Full Text Available Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (QR Code for short) as a less complex and more cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture covers tag generation, image acquisition and pre-processing, and product inventory and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.
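
    A minimal sketch of the payload such a QR Code tag might carry at each checkpoint is shown below. The field layout is an assumption, and the actual image encoding/decoding and camera-side pre-processing are left to an external open source QR library:

```python
# Hypothetical track-and-trace payload to be encoded into a QR Code symbol.
# Field names and the fixed timestamp are illustrative assumptions.
import json
from datetime import datetime, timezone

def make_payload(sku, batch, station):
    """Build the string to be encoded in the QR symbol at a checkpoint."""
    record = {
        "sku": sku,
        "batch": batch,
        "station": station,
        "scanned_at": datetime(2012, 1, 15, 8, 0,
                               tzinfo=timezone.utc).isoformat(),
    }
    return json.dumps(record, sort_keys=True)

def read_payload(text):
    """Decode a scanned QR payload back into a tracking record."""
    return json.loads(text)

payload = make_payload("SKU-4411", "B-207", "warehouse-3")
record = read_payload(payload)
```

    Keeping the payload as plain JSON makes the tags readable by any scanner software, at the cost of a few extra bytes compared to a packed binary layout.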

  2. Information Architecture: Looking Ahead.

    Science.gov (United States)

    Rosenfeld, Louis

    2002-01-01

    Considers the future of the field of information architecture. Highlights include a comparison with the growth of the field of professional management; the design of information systems since the Web; more demanding users; the need for an interdisciplinary approach; and how to define information architecture. (LRW)

  3. A Generalized DRM Architectural Framework

    Directory of Open Access Journals (Sweden)

    PATRICIU, V. V.

    2011-02-01

    Full Text Available The online distribution environment for digital goods leads to the need for a system to protect digital intellectual property. Digital Rights Management (DRM) is the system born to protect and control the distribution and use of those digital assets. The present paper is a review of the current state of DRM, focusing on architectural design, security technologies, and important DRM deployments. The paper primarily synthesizes DRM architectures within a general framework. We also present the DRM ecosystem to provide a better understanding of what is currently happening to content rights management from a technological point of view. This paper includes conclusions of several DRM initiative studies related to rights management systems, with the purpose of identifying and describing the most significant DRM architectural models. The basic functions and processes of the DRM solutions are identified.

  4. Fully distributed monitoring architecture supporting multiple trackees and trackers in indoor mobile asset management application.

    Science.gov (United States)

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2014-03-21

    A tracking service like asset management is essential in a dynamic hospital environment consisting of numerous mobile assets (e.g., wheelchairs or infusion pumps) that are continuously relocated throughout a hospital. The tracking service is accomplished based on the key technologies of an indoor location-based service (LBS), such as locating and monitoring multiple mobile targets inside a building in real time. An indoor LBS such as a tracking service entails numerous resource lookups being requested concurrently and frequently from several locations, as well as a network infrastructure requiring support for high scalability in indoor environments. A traditional centralized architecture needs to maintain a geographic map of the entire building or complex in its central server, which can cause low scalability and traffic congestion. This paper presents a self-organizing and fully distributed indoor mobile asset management (MAM) platform, and proposes an architecture for multiple trackees (such as mobile assets) and trackers based on the proposed distributed platform in real time. In order to verify the suggested platform, scalability performance according to increases in the number of concurrent lookups was evaluated in a real test bed. Tracking latency and traffic load ratio in the proposed tracking architecture was also evaluated.

  5. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Science.gov (United States)

    2010-07-01

    ...-Architecture efforts? 102-77.15 Section 102-77.15 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  6. Marshall Application Realignment System (MARS) Architecture

    Science.gov (United States)

    Belshe, Andrea; Sutton, Mandy

    2010-01-01

    The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. 
The MARS Architecture key stakeholders are most

  7. IT Service Management Architectures

    DEFF Research Database (Denmark)

    Tambo, Torben; Filtenborg, Jacob

    2018-01-01

    IT service providers tend to view their services as quasi-embedded in the client organisation's infrastructure. Therefore, IT service providers lack a full picture of being an organisation with its own enterprise architecture. By systematically developing an enterprise architecture using the unification operating model, IT service providers can much more efficiently develop relevant service catalogues with connected reporting services related to SLAs and KPIs based on ITIL and newer frameworks like SIAM.

  8. Concepts and diagram elements for architectural knowledge management

    NARCIS (Netherlands)

    Orlic, B.; Mak, R.H.; David, I.; Lukkien, J.J.

    2011-01-01

    Capturing architectural knowledge is very important for the evolution of software products. There is increasing awareness that an essential part of this knowledge is in fact the very process of architectural reasoning and decision making, and not just its end results. Therefore, a conceptual

  9. Power Management for A Distributed Wireless Health Management Architecture

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed wireless architectures for prognostics is an important enabling step in prognostic research in order to achieve feasible real-time system health...

  10. Long-time atomistic simulations with the Parallel Replica Dynamics method

    Science.gov (United States)

    Perez, Danny

    Molecular Dynamics (MD) -- the numerical integration of atomistic equations of motion -- is a workhorse of computational materials science. Indeed, MD can in principle be used to obtain any thermodynamic or kinetic quantity, without introducing any approximation or assumptions beyond the adequacy of the interaction potential. It is therefore an extremely powerful and flexible tool to study materials with atomistic spatio-temporal resolution. These enviable qualities however come at a steep computational price, hence limiting the system sizes and simulation times that can be achieved in practice. While the size limitation can be efficiently addressed with massively parallel implementations of MD based on spatial decomposition strategies, allowing for the simulation of trillions of atoms, the same approach usually cannot extend the timescales much beyond microseconds. In this article, we discuss an alternative parallel-in-time approach, the Parallel Replica Dynamics (ParRep) method, that aims at addressing the timescale limitation of MD for systems that evolve through rare state-to-state transitions. We review the formal underpinnings of the method and demonstrate that it can provide arbitrarily accurate results for any definition of the states. When an adequate definition of the states is available, ParRep can simulate trajectories with a parallel speedup approaching the number of replicas used. We demonstrate the usefulness of ParRep by presenting different examples of materials simulations where access to long timescales was essential to access the physical regime of interest and discuss practical considerations that must be addressed to carry out these simulations. Work supported by the United States Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division.
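    The ParRep bookkeeping described above can be sketched with a toy model: when escape from a state is memoryless (exponential waiting times), running N independent replicas and advancing the simulation clock by N times the wall-clock time of the first escape reproduces the single-trajectory escape statistics while costing only the minimum time. The replica count and rate below are illustrative, not values from the article:

    ```python
    import random

    def parrep_step(n_replicas: int, escape_rate: float, rng: random.Random):
        """One ParRep cycle for a memoryless (exponential) escape process.

        Each replica runs independently; the wall-clock cost of the cycle is
        the time of the *first* replica to escape, while the simulation clock
        advances by the time accumulated over all replicas. For exponential
        escape times this bookkeeping is exact.
        """
        times = [rng.expovariate(escape_rate) for _ in range(n_replicas)]
        wall_time = min(times)
        simulated_time = n_replicas * wall_time  # total time before first escape
        return wall_time, simulated_time

    rng = random.Random(42)
    wall = sim = 0.0
    for _ in range(2000):
        w, s = parrep_step(8, 1.0, rng)
        wall += w
        sim += s
    # With 8 replicas, the simulated timescale approaches 8x the wall-clock cost,
    # and the mean simulated escape time stays close to 1/escape_rate.
    print(round(sim / wall, 2))
    ```

    This is only the time-accounting skeleton; real ParRep also requires state detection, dephasing, and correlated-event handling, as the article discusses.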

  11. Gamma-ray spectra and doses from the Little Boy replica

    International Nuclear Information System (INIS)

    Moss, C.E.; Lucas, M.C.; Tisinger, E.W.; Hamm, M.E.

    1984-01-01

    Most radiation safety guidelines in the nuclear industry are based on the data concerning the survivors of the nuclear explosions at Hiroshima and Nagasaki. Crucial to determining these guidelines is the radiation from the explosions. We have measured gamma-ray pulse-height distributions from an accurate replica of the Little Boy device used at Hiroshima, operated at low power levels near critical. The device was placed outdoors on a stand 4 m from the ground to minimize environmental effects. The power levels were based on a monitor detector calibrated very carefully in independent experiments. High-resolution pulse-height distributions were acquired with a germanium detector to identify the lines and to obtain line intensities. The 7631 to 7645 keV doublet from neutron capture in the heavy steel case was dominant. Low-resolution pulse-height distributions were acquired with bismuth-germanate detectors. We calculated flux spectra from these distributions using accurately measured detector response functions and efficiency curves. We then calculated dose-rate spectra from the flux spectra using a flux-to-dose-rate conversion procedure. The integral of each dose-rate spectrum gave an integral dose rate. The integral doses at 2 m ranged from 0.46 to 1.03 mrem per 10^13 fissions. The output of the Little Boy replica can be calculated with Monte Carlo codes. Comparison of our experimental spectra, line intensities, and integral doses can be used to verify these calculations at low power levels and give increased confidence to the calculated values from the explosion at Hiroshima. These calculations then can be used to establish better radiation safety guidelines. 7 references, 7 figures, 2 tables

  12. A Global Navigation Management Architecture Applied to Autonomous Robots in Urban Environments

    OpenAIRE

    Kenmogne , Ide-Flore; Alves De Lima , Danilo; Corrêa Victorino , Alessandro

    2015-01-01

    This paper presents a global behavioral architecture used as a coordinator for the global navigation of an autonomous vehicle in an urban context including traffic laws and other features. As an extension to our previous work, the approach presented here focuses on how this manager uses perceived information (from low cost cameras and laser scanners) combined with digital road-map data to take decisions. This decision consists in retrieving the car's state regarding th...

  13. A procedure to analyze surface profiles of the protein molecules visualized by quick-freeze deep-etch replica electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kimori, Yoshitaka [Division of Biomolecular Imaging, Institute of Medical Science, The University of Tokyo, Minato-ku, Tokyo 108-8639 (Japan); Department of Bioscience and Bioinformatics, Kyushu Institute of Technology, Iizuka, Fukuoka 820-8502 (Japan); Oguchi, Yosuke [Department of Electric Engineering, Kogakuin University, Hachioji, Tokyo 192-0015 (Japan); Ichise, Norihiko [Department of Visual Communication, Komazawa Women's University, Inagi, Tokyo 206-8511 (Japan); Baba, Norio [Department of Electric Engineering, Kogakuin University, Hachioji, Tokyo 192-0015 (Japan); Katayama, Eisaku [Division of Biomolecular Imaging, Institute of Medical Science, The University of Tokyo, Minato-ku, Tokyo 108-8639 (Japan)]. E-mail: ekatayam@ims.u-tokyo.ac.jp

    2007-01-15

    Quick-freeze deep-etch replica electron microscopy gives high contrast snapshots of individual protein molecules under physiological conditions in vitro or in situ. The images show delicate internal patterns, possibly reflecting the rotary-shadowed surface profile of the molecule. As a step toward building a new system for the 'Structural analysis of single molecules', we propose a procedure to quantitatively characterize the structural properties of individual molecules, e.g. the conformational type and precise view-angle of the molecules, if the crystallographic structure of the target molecule is available. This paper presents a framework to determine the observed face of the protein molecule by analyzing the surface profile of individual molecules visualized in freeze-replica specimens. A comprehensive set of rotary-shadowed views of the protein molecule was artificially generated from the available atomic coordinates using light-rendering software. Exploiting a new mathematical morphology-based image filter, characteristic features were extracted from each image and stored as templates. Similar features were extracted from the true replica image, and the most likely projection angle and the conformation of the observed particle were determined by quantitative comparison with the set of archived images. The performance and the robustness of the procedure were examined with the myosin head structure in a defined configuration for actual application.

  14. A procedure to analyze surface profiles of the protein molecules visualized by quick-freeze deep-etch replica electron microscopy

    International Nuclear Information System (INIS)

    Kimori, Yoshitaka; Oguchi, Yosuke; Ichise, Norihiko; Baba, Norio; Katayama, Eisaku

    2007-01-01

    Quick-freeze deep-etch replica electron microscopy gives high contrast snapshots of individual protein molecules under physiological conditions in vitro or in situ. The images show delicate internal patterns, possibly reflecting the rotary-shadowed surface profile of the molecule. As a step toward building a new system for the 'Structural analysis of single molecules', we propose a procedure to quantitatively characterize the structural properties of individual molecules, e.g. the conformational type and precise view-angle of the molecules, if the crystallographic structure of the target molecule is available. This paper presents a framework to determine the observed face of the protein molecule by analyzing the surface profile of individual molecules visualized in freeze-replica specimens. A comprehensive set of rotary-shadowed views of the protein molecule was artificially generated from the available atomic coordinates using light-rendering software. Exploiting a new mathematical morphology-based image filter, characteristic features were extracted from each image and stored as templates. Similar features were extracted from the true replica image, and the most likely projection angle and the conformation of the observed particle were determined by quantitative comparison with the set of archived images. The performance and the robustness of the procedure were examined with the myosin head structure in a defined configuration for actual application

  15. Development of isothermal-isobaric replica-permutation method for molecular dynamics and Monte Carlo simulations and its application to reveal temperature and pressure dependence of folded, misfolded, and unfolded states of chignolin

    Science.gov (United States)

    Yamauchi, Masataka; Okumura, Hisashi

    2017-11-01

    We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method is a better alternative to the replica-exchange method. It was originally developed in the canonical ensemble. This method employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon that misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: TYR2 and TRP9 side chains cover the hydrogen bonds that form a β-hairpin structure. The hydrogen bonds are protected from the water molecules that approach the protein as the pressure increases.
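    For context on what the Suwa-Todo permutation improves upon: in the standard isothermal-isobaric replica-exchange method, a pairwise swap between replicas at conditions (T_i, P_i) and (T_j, P_j) is accepted with the Metropolis probability min(1, e^Δ), where Δ couples the energy and volume differences to the differences in β = 1/(k_B T) and βP. A minimal sketch of that baseline criterion (k_B = 1, values in the demo are illustrative):

    ```python
    import math

    def npt_exchange_prob(beta_i, p_i, u_i, v_i, beta_j, p_j, u_j, v_j):
        """Metropolis acceptance probability for swapping two isothermal-isobaric
        replicas (the pairwise baseline that replica permutation improves upon).

        beta = 1/(k_B T) with k_B = 1; u = potential energy; v = volume.
        """
        delta = ((beta_i - beta_j) * (u_i - u_j)
                 + (beta_i * p_i - beta_j * p_j) * (v_i - v_j))
        return min(1.0, math.exp(delta))

    # Identical thermodynamic conditions: the swap is always accepted.
    print(npt_exchange_prob(1.0, 1.0, 5.0, 2.0, 1.0, 1.0, 7.0, 3.0))  # -> 1.0
    ```

    The replica-permutation method replaces this two-replica Metropolis step with a Suwa-Todo update over permutations of all temperature-pressure pairs, which is what minimizes the rejection ratio.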

  16. Scalable Multi-core Architectures Design Methodologies and Tools

    CERN Document Server

    Jantsch, Axel

    2012-01-01

    As Moore’s law continues to unfold, two important trends have recently emerged. First, the growth of chip capacity is translated into a corresponding increase of number of cores. Second, the parallelization of the computation and 3D integration technologies lead to distributed memory architectures. This book provides a current snapshot of industrial and academic research, conducted as part of the European FP7 MOSART project, addressing urgent challenges in many-core architectures and application mapping.  It addresses the architectural design of many core chips, memory and data management, power management, design and programming methodologies. It also describes how new techniques have been applied in various industrial case studies. Describes trends towards distributed memory architectures and distributed power management; Integrates Network on Chip with distributed, shared memory architectures; Demonstrates novel design methodologies and frameworks for multi-core design space exploration; Shows how midll...

  17. Optimizing bulk data transfers using network measurements: A practical case

    International Nuclear Information System (INIS)

    Ciuffoletti, A; Merola, L; Palmieri, F; Russo, G; Pardi, S

    2010-01-01

    In modern Data Grid infrastructures, we increasingly face the problem of providing running applications with fast and reliable access to large data volumes, often geographically distributed across the network. As a direct consequence, the concept of replication has been adopted by the grid community to increase data availability and maximize job throughput. To be really effective, such a process has to be driven by specific optimization strategies that define when and where replicas should be created or deleted on a per-site basis, and which replicas a job should use. These strategies have to take into account the available network bandwidth as a primary resource, prior to any consideration about storage or processing power. We present a novel replica management service, integrated within the Gluedomains active network monitoring architecture, designed and implemented within the centralized collective middleware framework of the SCoPE project to provide network-aware transfer services for data intensive Grid applications.
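    The core of any bandwidth-driven strategy like the one above is choosing a replica source from measured network data. A minimal sketch, with hypothetical site names and bandwidth figures (the actual Gluedomains measurements and selection policy are not specified here):

    ```python
    def pick_replica(replicas: dict[str, float], file_size_mb: float) -> str:
        """Choose the replica site with the lowest estimated transfer time,
        given measured bandwidths in MB/s (e.g. from an active network
        monitoring service). Site names and numbers are illustrative.
        """
        return min(replicas, key=lambda site: file_size_mb / replicas[site])

    sites = {"cern": 40.0, "naples": 120.0, "lyon": 75.0}
    print(pick_replica(sites, 500.0))  # -> naples (fastest estimated transfer)
    ```

    A production service would refresh the bandwidth estimates continuously and also factor in storage load, which is exactly the kind of policy the replica management service encapsulates.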

  18. Enterprise-wide PACS: beyond radiology, an architecture to manage all medical images.

    Science.gov (United States)

    Bandon, David; Lovis, Christian; Geissbühler, Antoine; Vallée, Jean-Paul

    2005-08-01

    Picture archiving and communication systems (PACS) have the vocation to manage all medical images acquired within the hospital. To address the various situations encountered in the imaging specialties, the traditional architecture used for the radiology department has to evolve. We present our preliminary results toward an enterprise-wide PACS intended to support all kinds of image production in medicine, from biomolecular images to whole-body pictures. Our solution is based on an existing radiologic PACS system from which images are distributed through an electronic patient record to all care facilities. This platform is enriched with a flexible integration framework supporting Digital Imaging and Communications in Medicine (DICOM) and DICOM-XML formats. In addition, a generic, highly customizable workflow engine is used to drive work processes. Echocardiology; hematology; ear, nose, and throat; and dermatology (including wound follow-up) are the first extensions implemented outside of radiology. We also propose a global strategy for further developments based on three possible architectures for an enterprise-wide PACS.

  19. On the Application of Replica Molding Technology for the Indirect Measurement of Surface and Geometry of Micromilled Components

    DEFF Research Database (Denmark)

    Baruffi, Federico; Parenti, Paolo; Cacciatore, Francesco

    2017-01-01

    the replica molding technology. The method consists of obtaining a replica of the feature that is inaccessible for standard measurement devices and performing its indirect measurement. This paper examines the performance of a commercial replication media applied to the indirect measurement of micromilled...... components. Two specifically designed micromilled benchmark samples were used to assess the accuracy in replicating both surface texture and geometry. A 3D confocal microscope and a focus variation instrument were employed and the associated uncertainties were evaluated. The replication method proved...... to be suitable for characterizing micromilled surface texture even though an average overestimation in the nano-metric level of the Sa parameter was observed. On the other hand, the replicated geometry generally underestimated that of the master, often leading to a different measurement output considering...
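    The Sa parameter mentioned above is the standard areal average roughness: the mean absolute deviation of surface heights from their mean level. A minimal sketch of how it is computed, with hypothetical height samples standing in for master and replica measurements:

    ```python
    def sa(heights: list[float]) -> float:
        """Areal average roughness Sa: mean absolute deviation of surface
        heights from their mean (a flat mean plane, for simplicity)."""
        mean = sum(heights) / len(heights)
        return sum(abs(z - mean) for z in heights) / len(heights)

    # Hypothetical height samples (micrometres) for a master and its replica;
    # a uniform 8% scale error on the replica inflates Sa by the same factor.
    master = [0.00, 0.10, -0.10, 0.05, -0.05, 0.00]
    replica = [z * 1.08 for z in master]
    print(round(sa(replica) / sa(master), 2))  # -> 1.08
    ```

    Comparing Sa on replica versus master, as in the study, exposes exactly this kind of systematic over- or underestimation of the replication medium.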

  20. A new creep-strain-replica method for evaluating the remaining life time of components

    International Nuclear Information System (INIS)

    Joas, H.D.

    2001-01-01

    To realise safe and economic operation of older power or chemical plants, a maintenance strategy is necessary that makes it possible to operate a component or the plant beyond 300,000 operating hours, even when the mode of operation has changed in the meantime. In Germany, a realistic evaluation of the remaining lifetime is done by comparing the actual calculated test data of a component with the codes TRD 301 and TRD 508 and additional non-destructive tests, or with other codes like ASME Sec. II, BS 5500, and AFCEN (1985). Owing to many boundary conditions, the calculated data are inaccurate, and measuring creep-strain at temperatures of about 600 °C with capacitive strain gauges is very expensive. The approach is as follows: spot-welding two gauges to the surface of a component (at a defined distance) so that they form a gap; producing replicas of the gap after certain operating hours under shut-down conditions by trained personnel; evaluating the replicas with a scanning electron microscope to obtain the amount of creep-strain; and assessing the creep-strain data. (Author)

  1. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context.

    Science.gov (United States)

    Cerchecci, Matteo; Luti, Francesco; Mecocci, Alessandro; Parrino, Stefano; Peruzzi, Giacomo; Pozzebon, Alessandro

    2018-04-21

    This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented.
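    The fill-level sensing described above reduces to a simple geometric conversion on the node: the ultrasonic sensor under the lid measures the distance to the waste surface, and the firmware turns that into a fill percentage before transmitting it over LoRa. A minimal sketch, with illustrative geometry (the paper's actual bin dimensions and calibration are not given here):

    ```python
    def fill_level_percent(distance_cm: float, bin_depth_cm: float,
                           sensor_offset_cm: float = 0.0) -> float:
        """Convert an ultrasonic distance reading (sensor mounted under the
        lid) into a fill percentage. Geometry parameters are illustrative.
        """
        usable = bin_depth_cm - sensor_offset_cm
        waste_height = usable - (distance_cm - sensor_offset_cm)
        # Clamp spurious echoes shorter or longer than the physical bin.
        waste_height = max(0.0, min(usable, waste_height))
        return 100.0 * waste_height / usable

    print(fill_level_percent(25.0, 100.0))  # sensor sees 25 cm of air -> 75.0
    ```

    Keeping this computation on the microcontroller means only a single small value needs to be radioed out, which is consistent with the paper's emphasis on minimizing power drawn by the transmission module.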

  2. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context

    Directory of Open Access Journals (Sweden)

    Matteo Cerchecci

    2018-04-01

    Full Text Available This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented.

  3. A Low Power IoT Sensor Node Architecture for Waste Management Within Smart Cities Context

    Science.gov (United States)

    Cerchecci, Matteo; Luti, Francesco; Mecocci, Alessandro; Parrino, Stefano; Peruzzi, Giacomo

    2018-01-01

    This paper focuses on the realization of an Internet of Things (IoT) architecture to optimize waste management in the context of Smart Cities. In particular, a novel typology of sensor node based on the use of low cost and low power components is described. This node is provided with a single-chip microcontroller, a sensor able to measure the filling level of trash bins using ultrasounds and a data transmission module based on the LoRa LPWAN (Low Power Wide Area Network) technology. Together with the node, a minimal network architecture was designed, based on a LoRa gateway, with the purpose of testing the IoT node performances. Especially, the paper analyzes in detail the node architecture, focusing on the energy saving technologies and policies, with the purpose of extending the batteries lifetime by reducing power consumption, through hardware and software optimization. Tests on sensor and radio module effectiveness are also presented. PMID:29690552

  4. The BWS Open Business Enterprise System Architecture

    Directory of Open Access Journals (Sweden)

    Cristian IONITA

    2011-01-01

    Full Text Available Business process management systems play a central role in supporting the business operations of medium and large organizations. This paper analyses the properties of current business enterprise systems and proposes a new application type called Open Business Enterprise System. A new open system architecture called Business Workflow System is proposed. This architecture combines instruments for flexible data management, business process management, and integration into a flexible system able to manage modern business operations. The architecture was validated by implementing it in the DocuMentor platform used by major companies in Romania and the US. These implementations provided the data needed to create and refine an enterprise integration methodology called DMCPI. The final section of the paper presents the concepts, stages, and techniques employed by the methodology.

  5. Knowledge Management Practice in Two Australian Architecture-Engineering-Construction (AEC) Companies

    Directory of Open Access Journals (Sweden)

    Patrick Zou

    2012-11-01

    Full Text Available Knowledge management (KM) could be described as a management system that supports the creation, sharing, and retrieval of valued information, expertise, and insight within and across communities of people and related organizations using information and communication technologies; it is thus a combination of the effective application of information technology and the management of human resources. KM is becoming a core competitive factor in construction operations. This paper presents the results of two case studies of KM practices in large AEC (architecture, engineering, and construction) companies, carried out through desk-top study and semi-structured interviews. The results indicate that implementing KM in AEC companies leads to competitive advantages and improved decision-making, problem solving, and business performance. The results also indicated that while technology plays an important role, top management commitment, total employee involvement, performance assessment, and a culture of knowledge learning and sharing must be considered when implementing KM. It is therefore suggested that the implementation of KM should incorporate the company's vision, work processes, technology, and culture to improve the ability of creating, capturing, sharing, and retrieving knowledge and, ultimately, to improve the company's competitive advantage, decision making, problem solving, and innovation.

  6. Enterprise Architecture : Management tool and blueprint for the organization

    NARCIS (Netherlands)

    Jonkers, Henk; Lankhorst, Marc M.; ter Doest, Hugo W.L.; Arbab, Farhad; Bosma, Hans; Wieringa, Roelf J.

    2006-01-01

    This is an editorial to a special issue of ISF on enterprise architecture. We define the concept of enterprise architecture, motivate its importance, and then introduce the papers in this special issue.

  7. One step replica symmetry breaking and extreme order statistics of logarithmic REMs

    Directory of Open Access Journals (Sweden)

    Xiangyu Cao, Yan V. Fyodorov, Pierre Le Doussal

    2016-12-01

    Full Text Available Building upon the one-step replica symmetry breaking formalism, duly understood and ramified, we show that the sequence of ordered extreme values of a general class of Euclidean-space logarithmically correlated random energy models (logREMs) behaves in the thermodynamic limit as a randomly shifted decorated exponential Poisson point process. The distribution of the random shift is determined solely by the large-distance ("infra-red", IR) limit of the model, and is equal to the free energy distribution at the critical temperature up to a translation. The decoration process is determined solely by the small-distance ("ultraviolet", UV) limit, in terms of the biased minimal process. Our approach connects the replica framework to results in the probability literature and sheds further light on the freezing/duality conjecture which was the source of many previous results for logREMs. In this way we derive general and explicit formulae for the joint probability density of the depths of the first and second minima (as well as its higher-order generalizations) in terms of model-specific contributions from the UV and IR limits. In particular, we show that the second-minimum statistics are largely independent of the details of the UV data, whose influence is seen only through the mean value of the gap. For a given log-correlated field this parameter can be evaluated numerically, and we provide several numerical tests of our theory using the circular model of $1/f$-noise.

  8. Space Station needs, attributes and architectural options. Volume 2, book 2, part 2, Task 2: Information management system

    Science.gov (United States)

    1983-01-01

    Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.

  9. The Platform Architecture and Key Technology of Cloud Service that Support Wisdom City Management

    Directory of Open Access Journals (Sweden)

    Liang Xiao

    2013-05-01

    Full Text Available To meet the new requirement of constructing a "resource sharing and service on demand" wisdom city system, this paper puts forward a cloud-service platform architecture for wisdom city management that supports the IaaS, PaaS, and SaaS service models. The architecture rests on research into the operation mode of a wisdom city under a cloud computing environment and into several key technologies: mass storage of cloud data, construction of cloud resource pools, scheduling and monitoring of cloud resources, and security management and control of the cloud platform. The platform enables the wisdom city system to optimize business and resource scheduling and to manage large-scale hardware and software in a unified and efficient way, with cross-domain resource scheduling, cross-domain data sharing, cross-domain facilities integration, and cross-domain service integration.

  10. 2005 dossier: clay. Tome: architecture and management of the geologic disposal facility

    International Nuclear Information System (INIS)

    2005-01-01

    This document makes a status of the researches carried out by the French national agency of radioactive wastes (ANDRA) about the design of a geologic disposal facility for high-level and long-lived radioactive wastes in argilite formations. Content: 1 - approach of the study: goal, main steps of the design study, iterative approach, content; 2 - general description: high-level and long-lived radioactive wastes, purposes of a reversible disposal, geologic context of the Meuse/Haute-Marne site - the Callovo-Oxfordian formation, design principles of the disposal facility architecture, role of the different disposal components; 3 - high-level and long-lived wastes: production scenarios, description of primary containers, inventory model, hypotheses about receipt fluxes of primary containers; 4- disposal containers: B-type waste containers, C-type waste containers, spent fuel disposal containers; 5 - disposal modules: B-type waste disposal modules, C-type waste disposal modules, spent-fuel disposal modules; 6 - overall underground architecture: main safety questions, overall design, dimensioning factors, construction logic and overall exploitation of the facility, dimensioning of galleries, underground architecture adaptation to different scenarios; 7 - boreholes and galleries: general needs, design principles retained, boreholes description, galleries description, building up of boreholes and galleries, durability of facilities, backfilling and sealing up of boreholes and galleries; 8 - surface facilities: general organization, nuclear area, industrial and administrative area, tailings area; 9 - nuclear exploitation means of the facility: receipt of primary containers and preparation of disposal containers, transfer of disposal containers from the surface to the disposal alveoles, setting up of containers inside alveoles; 10 - reversible management of the disposal: step by step disposal process, mastery of disposal behaviour and action capacity, observation and

  11. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management.

    Science.gov (United States)

    Luo, Qiang; Ma, Yina; Bhatt, Meghana A; Montague, P Read; Feng, Jianfeng

    2017-01-01

    Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent noise (SDN), which captured the directional connectivity underlying impression management during the bargaining game. We found that the sophisticated strategists engaged stronger directional connectivity from both the dorsal anterior cingulate cortex and the retrosplenial cortex to the rostral prefrontal cortex, and the strengths of these directional influences were associated with a higher level of deception during the game. Using the directional connectivity as a neural signature, we identified strategic deception with 80% accuracy by a machine-learning classifier. These results suggest that different social strategies are supported by distinct patterns of directional connectivity among key brain regions for social cognition.
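
    The final classification step (decoding the strategy from directional-connectivity strengths) can be illustrated with a deliberately simple stand-in. The study does not specify its classifier or feature set beyond "directional connectivity", so the nearest-centroid rule, the two-dimensional features, and the labels below are illustrative assumptions:

```python
import math

def nearest_centroid_fit(X, y):
    """Compute one centroid per class label from training samples.
    X is a list of feature vectors (e.g., hypothetical connectivity
    strengths), y the matching list of class labels."""
    sums, counts = {}, {}
    for features, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(features))
        for d, v in enumerate(features):
            acc[d] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def nearest_centroid_predict(centroids, features):
    """Assign the class whose centroid is closest in Euclidean distance."""
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c, features)))
    return min(centroids, key=lambda label: dist(centroids[label]))
```

    Any classifier over such features would follow the same train/predict shape; the nearest-centroid rule is merely the smallest self-contained example.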

  12. Enhanced Sampling in Molecular Dynamics Using Metadynamics, Replica-Exchange, and Temperature-Acceleration

    Directory of Open Access Journals (Sweden)

    Cameron Abrams

    2013-12-01

    Full Text Available We review a selection of methods for performing enhanced sampling in molecular dynamics simulations. We consider methods based on collective variable biasing and on tempering, and offer both historical and contemporary perspectives. In collective-variable biasing, we first discuss methods stemming from thermodynamic integration that use mean force biasing, including the adaptive biasing force algorithm and temperature acceleration. We then turn to methods that use bias potentials, including umbrella sampling and metadynamics. We next consider parallel tempering and replica-exchange methods. We conclude with a brief presentation of some combination methods.
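
    The replica-exchange (parallel tempering) step mentioned above can be sketched as follows. This is the generic Metropolis swap criterion for a temperature ladder, not code from the review: configurations at neighbouring inverse temperatures beta_i and beta_j are exchanged with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]).

```python
import math
import random

def swap_probability(E_i, E_j, beta_i, beta_j):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas with energies E_i, E_j held at inverse temperatures
    beta_i, beta_j: min(1, exp[(beta_i - beta_j)(E_i - E_j)])."""
    delta = (beta_i - beta_j) * (E_j - E_i)
    return min(1.0, math.exp(-delta))

def attempt_swaps(energies, betas, rng=random.random):
    """Attempt nearest-neighbour swaps along the temperature ladder.
    energies[r] is the current energy of replica r; betas is ordered by
    temperature slot. Returns the permutation mapping slot -> replica."""
    order = list(range(len(betas)))
    for k in range(len(betas) - 1):
        i, j = order[k], order[k + 1]
        if rng() < swap_probability(energies[i], energies[j],
                                    betas[k], betas[k + 1]):
            order[k], order[k + 1] = j, i
    return order
```

    In a full simulation each replica would run molecular dynamics between swap attempts; only the exchange bookkeeping is shown here.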

  13. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    OpenAIRE

    Alsahli, Abdulaziz; Khan, Hameed; Alyahya, Sultan

    2016-01-01

    Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintaining changed requirements, enabling reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The o...

  14. INTEGRATED INFORMATION SYSTEM ARCHITECTURE PROVIDING BEHAVIORAL FEATURE

    Directory of Open Access Journals (Sweden)

    Vladimir N. Shvedenko

    2016-11-01

    Full Text Available The paper deals with the creation of an integrated information system architecture capable of supporting management decisions through behavioral features. It considers the architecture of an information decision support system for production system management. Giving the information system a behavioral feature ensures extraction and processing of information and management decision-making, with both automated and automatic modes of the decision-making subsystem permitted. Practical implementation of an information system with behavior is based on a service-oriented architecture: the information system contains a set of independent services that provide data from its subsystems, or data processing by a separate application, according to the chosen way of resolving the problematic situation. For the creation of an integrated information system with behavior we propose an architecture including the following subsystems: a data bus, a subsystem for interaction with the integrated applications based on metadata, a business process management subsystem, a subsystem for analyzing the current state of the enterprise and making management decisions, and a behavior training subsystem. For each problematic situation a separate logical-layer service is created in the Unified Service Bus that handles problematic situations. This architecture reduces the information complexity of the system because, with a constant number of system elements, the number of links decreases, since each layer provides a communication center of responsibility for the resource with the services of the corresponding applications. If a similar problematic situation recurs, its resolution is retrieved automatically from the problem-situation metamodel repository, together with the business-process metamodel of its settlement, and during business process execution commands are generated to the corresponding centers of responsibility to settle the situation.
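
    The Unified Service Bus behaviour described above (one handler service registered per class of problematic situation, with recurring situations settled by the stored handler) can be sketched as follows; the class and method names are illustrative assumptions, not from the paper:

```python
class ServiceBus:
    """Minimal sketch of a service bus that routes each class of
    problematic situation to its registered handler service."""

    def __init__(self):
        self._handlers = {}  # situation type -> handler service (callable)

    def register(self, situation_type, handler):
        """Register the service responsible for one class of situation."""
        self._handlers[situation_type] = handler

    def settle(self, situation_type, payload):
        """Dispatch a situation to its handler; unknown types are an error
        (in the paper's terms, no stored resolution metamodel exists yet)."""
        try:
            handler = self._handlers[situation_type]
        except KeyError:
            raise LookupError(f"no service registered for {situation_type!r}")
        return handler(payload)
```

    In the architecture above, each registered handler would correspond to one logical-layer service created for a class of problematic situation.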

  15. Medical Data Architecture Project Status

    Science.gov (United States)

    Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2018-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) element in reducing the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight into medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.

  16. Enterprise Architecture v systému řízení

    OpenAIRE

    Sládek, Pavel

    2008-01-01

    This thesis deals with the subject of enterprise architecture management. Topics covered are enterprise environment integration, description of the parts of enterprise architecture, and the shift in the focus of enterprise architecture. This trend can be described as a shift from an IT-related activity for IT management, through a tool for IT and business alignment, to the final state as a tool for business change implementation. There are four goals of this thesis. Description of enterprise architecture, ...

  17. Laboratory studies of groundwater degassing in replicas of natural fractured rock for linear flow geometry

    International Nuclear Information System (INIS)

    Geller, J.T.

    1998-02-01

    Laboratory experiments to simulate two-phase (gas and water) flow in fractured rock evolving from groundwater degassing were conducted in transparent replicas of natural rock fractures. These experiments extend the work by Geller et al. (1995) and Jarsjo and Geller (1996) that tested the hypothesis that groundwater degassing caused observed flow reductions in the Stripa Simulated Drift Experiment (SDE). Understanding degassing effects over a range of gas contents is needed due to the uncertainty in the gas contents of the water at the SDE. The main objectives of this study were to: (1) measure the effect of groundwater degassing on liquid flow rates, for linear flow geometry in the same fracture replicas used by Geller et al., at lower gas contents than the values they used; (2) provide a data set to develop a predictive model of two-phase flow in fractures for conditions of groundwater degassing; and (3) improve the certainty of experimental gas contents (this effort included modifications to the experimental system used by Geller et al. and separate gas-water equilibration tests). The Stripa site is being considered for a high-level radioactive waste repository

  18. A sequence of Clifford algebras and three replicas of Dirac particle

    International Nuclear Information System (INIS)

    Krolikowski, W.; Warsaw Univ.

    1990-01-01

    The embedding of the Dirac algebra into a sequence N = 1, 2, 3, ... of Clifford algebras is discussed, leading to Dirac equations with N−1 additional, electromagnetically "hidden" spins 1/2. It is shown that there are three and only three replicas N = 1, 3, 5 of the Dirac particle if the theory of relativity together with the probability interpretation of the wave function is applied both to the "visible" spin and the "hidden" spins, and a new "hidden exclusion principle" is imposed on the wave function (the "hidden" spins then add up to zero). It is appealing to explore this idea in order to explain the puzzle of three generations of leptons and quarks. (author)

  19. Time-reversal focusing of an expanding soliton gas in disordered replicas

    KAUST Repository

    Fratalocchi, Andrea; Armaroli, A.; Trillo, S.

    2011-01-01

    We investigate the properties of time reversibility of a soliton gas, originating from a dispersive regularization of a shock wave, as it propagates in a strongly disordered environment. An original approach combining information measures and spin glass theory shows that time-reversal focusing occurs for different replicas of the disorder in forward and backward propagation, provided the disorder varies on a length scale much shorter than the width of the soliton constituents. The analysis is performed by starting from a new class of reflectionless potentials, which describe the most general form of an expanding soliton gas of the defocusing nonlinear Schrödinger equation.

  1. Architectural Design of a LMS with LTSA-Conformance

    Science.gov (United States)

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper illustrates an approach for architectural design of a Learning Management System (LMS), which is verifiable against the Learning Technology System Architecture (LTSA) conformance rules. We introduce a new method for software architectural design that extends the Unified Modeling Language (UML) component diagram with the formal…

  2. Life Management of high energy piping girth welds

    International Nuclear Information System (INIS)

    Cohn, M.J.; Paterson, S.R.

    1994-01-01

    Life management of high energy piping systems is a synergistic process that combines the collective results from nondestructive examination (NDE), stress analysis, metallurgical replication, and fracture mechanics evaluations. To achieve conclusions with high confidence and reliability, the methodology requires that: (1) all weldments must be appropriately examined to establish initial baseline data and indicate both fabrication and in-service damage, (2) as-found stress analyses must be consistent with field-estimated displacements to select sites of maximum in-service damage, and (3) metallurgical replicas must be taken at high-stress sites determined as an outcome of a life exhaustion evaluation. The multidiscipline tasks are effectively managed by developing a rational framework incorporating all of the above requirements. Analytical algorithms that estimate linear and nonlinear degradation effects are included in the remaining-life predictions. High-damage locations are selected based on a life exhaustion calculation. Localized remaining life and reexamination intervals for failures governed by creep and fatigue damage are estimated from metallurgical replica cavitation damage and the service life. Stress-based creep damage identified in metallurgical replicas should correspond to the predicted high-damage locations from the life exhaustion calculation. The methodology minimizes future reexamination locations while providing high certainty of monitoring lead-the-fleet damage. This life management program recognizes the aging process in plant equipment. It establishes a continuous process of examinations, evaluations, and decisions to track degradation over the piping system life cycle. Potential problems are identified long before failures occur. Corrective action is taken during scheduled outages to maintain the required level of plant performance

  3. SET: Session Layer-Assisted Efficient TCP Management Architecture for 6LoWPAN with Multiple Gateways

    Directory of Open Access Journals (Sweden)

    Akbar AliHammad

    2010-01-01

    Full Text Available 6LoWPAN (IPv6 based Low-Power Personal Area Network is a protocol specification that facilitates communication of IPv6 packets on top of IEEE 802.15.4 so that Internet and wireless sensor networks can be inter-connected. This interconnection is especially required in commercial and enterprise applications of sensor networks where reliable and timely data transfers such as multiple code updates are needed from Internet nodes to sensor nodes. For this type of inbound traffic which is mostly bulk, TCP as transport layer protocol is essential, resulting in end-to-end TCP session through a default gateway. In this scenario, a single gateway tends to become the bottleneck because of non-uniform connectivity to all the sensor nodes besides being vulnerable to buffer overflow. We propose SET; a management architecture for multiple split-TCP sessions across a number of serving gateways. SET implements striping and multiple TCP session management through a shim at session layer. Through analytical modeling and ns2 simulations, we show that our proposed architecture optimizes communication for ingress bulk data transfer while providing associated load balancing services. We conclude that multiple split-TCP sessions managed in parallel across a number of gateways result in reduced latency for bulk data transfer and provide robustness against gateway failures.
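
    The striping idea behind SET's multiple split-TCP sessions can be sketched as follows. This is a simplified illustration of round-robin striping by offset with receiver-side reassembly; the function names and chunk size are assumptions, not the paper's protocol:

```python
def stripe(payload: bytes, gateways: list, chunk: int = 4):
    """Split a bulk transfer into fixed-size stripes and assign them to
    gateways round-robin. Each stripe keeps its byte offset so the
    receiver can reassemble regardless of per-gateway arrival order."""
    plan = {gw: [] for gw in gateways}
    for i in range(0, len(payload), chunk):
        gw = gateways[(i // chunk) % len(gateways)]
        plan[gw].append((i, payload[i:i + chunk]))
    return plan

def reassemble(plan):
    """Receiver side: merge stripes from all gateways back by offset."""
    pieces = sorted(p for stripes in plan.values() for p in stripes)
    return b"".join(data for _, data in pieces)
```

    In the actual architecture each gateway's stripe list would travel over its own split-TCP session, so the loss or congestion of one gateway delays only its share of the payload.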

  4. An agent based architecture for high-risk neonate management at neonatal intensive care unit.

    Science.gov (United States)

    Malak, Jaleh Shoshtarian; Safdari, Reza; Zeraati, Hojjat; Nayeri, Fatemeh Sadat; Mohammadzadeh, Niloofar; Farajollah, Seide Sedighe Seied

    2018-01-01

    In recent years, the use of new tools and technologies has decreased the neonatal mortality rate. Despite the positive effect of using these technologies, the decisions are complex and uncertain in critical conditions when the neonate is preterm or has a low birth weight or malformations. There is a need to automate the high-risk neonate management process by creating real-time and more precise decision support tools. The aim was to create a collaborative and real-time environment to manage neonates with critical conditions at the NICU (Neonatal Intensive Care Unit) and to overcome high-risk neonate management weaknesses by applying a multi-agent-based analysis and design methodology as a new solution for NICU management. This study was basic research for medical informatics method development that was carried out in 2017. The requirement analysis was done by reviewing articles on NICU decision support systems. PubMed, Science Direct, and IEEE databases were searched. Only English articles published after 1990 were included; also, a needs assessment was done by reviewing the extracted features and current processes at the NICU environment where the research was conducted. We analyzed the requirements and identified the main system roles (agents) and interactions by a comparative study of existing NICU decision support systems. The Universal Multi Agent Platform (UMAP) was applied to implement a prototype of our multi-agent-based high-risk neonate management architecture. Local environment agents interacted inside a container, and each container interacted with external resources, including other NICU systems and consultation centers. In the NICU container, the main identified agents were reception, monitoring, NICU registry, and outcome prediction, which interacted with human agents including nurses and physicians. Managing patients at NICU units requires online data collection, real-time collaboration, and management of many components. Multi agent systems are applied as

  5. PICNIC Architecture.

    Science.gov (United States)

    Saranummi, Niilo

    2005-01-01

    The PICNIC architecture aims at supporting inter-enterprise integration and the facilitation of collaboration between healthcare organisations. The concept of a Regional Health Economy (RHE) is introduced to illustrate the varying nature of inter-enterprise collaboration between healthcare organisations collaborating in providing health services to citizens and patients in a regional setting. The PICNIC architecture comprises a number of PICNIC IT Services and the interfaces between them, and presents a way to assemble these into a functioning Regional Health Care Network meeting the needs and concerns of its stakeholders. The PICNIC architecture is presented through a number of views relevant to different stakeholder groups. The stakeholders of the first view are national and regional health authorities and policy makers. The view describes how the architecture enables the implementation of national and regional health policies, strategies and organisational structures. The stakeholders of the second view, the service viewpoint, are the care providers, health professionals, patients and citizens. The view describes how the architecture supports and enables regional care delivery and process management including continuity of care (shared care) and citizen-centred health services. The stakeholders of the third view, the engineering view, are those who design, build and implement the RHCN. The view comprises four sub-views: software engineering, IT services engineering, security and data. The proposed architecture is grounded in the mainstream of how distributed computing environments are evolving. The architecture is realised using the web services approach. A number of well established technology platforms and generic standards exist that can be used to implement the software components. The software components that are specified in PICNIC are implemented in Open Source.

  6. QoS Management and Control for an All-IP WiMAX Network Architecture: Design, Implementation and Evaluation

    Directory of Open Access Journals (Sweden)

    Thomas Michael Bohnert

    2008-01-01

    Full Text Available The IEEE 802.16 standard provides a specification for a fixed and mobile broadband wireless access system, offering high data rate transmission of multimedia services with different Quality-of-Service (QoS) requirements through the air interface. The WiMAX Forum, going beyond the air interface, defined an end-to-end WiMAX network architecture based on an all-IP platform in order to complete the standards required for a commercial rollout of WiMAX as a broadband wireless access solution. As the WiMAX network architecture is only a functional specification, this paper focuses on an innovative end-to-end WiMAX network architecture in compliance with the WiMAX Forum specification. To the best of our knowledge, this is the first WiMAX architecture built by a research consortium globally; the work was performed within the framework of the European IST project WEIRD (WiMAX Extension to Isolated Research Data networks). One of the principal features of our architecture is support for end-to-end QoS, achieved by the integration of resource control in the WiMAX wireless link and resource management in the wired domains of the network core. In this paper we present the architectural design of these QoS features in the overall WiMAX all-IP framework and their functional as well as performance evaluation. The presented results can safely be considered as unique and timely for any WiMAX system integrator.

  7. Microprocessor architectures RISC, CISC and DSP

    CERN Document Server

    Heath, Steve

    1995-01-01

    'Why are there all these different processor architectures and what do they all mean? Which processor will I use? How should I choose it?' Given the task of selecting an architecture or design approach, both engineers and managers require a knowledge of the whole system and an explanation of the design tradeoffs and their effects. This is information that rarely appears in data sheets or user manuals. This book fills that knowledge gap. Section 1 provides a primer and history of the three basic microprocessor architectures. Section 2 describes the ways in which the architectures react with the

  8. The Functional Architecture of the Brain Underlies Strategic Deception in Impression Management

    Directory of Open Access Journals (Sweden)

    Qiang Luo

    2017-11-01

    Full Text Available Impression management, as one of the most essential skills of social function, impacts one's survival and success in human societies. However, the neural architecture underpinning this social skill remains poorly understood. By employing a two-person bargaining game, we exposed three strategies involving distinct cognitive processes for social impression management with different levels of strategic deception. We utilized a novel adaptation of Granger causality accounting for signal-dependent noise (SDN, which captured the directional connectivity underlying the impression management during the bargaining game. We found that the sophisticated strategists engaged stronger directional connectivity from both dorsal anterior cingulate cortex and retrosplenial cortex to rostral prefrontal cortex, and the strengths of these directional influences were associated with higher level of deception during the game. Using the directional connectivity as a neural signature, we identified the strategic deception with 80% accuracy by a machine-learning classifier. These results suggest that different social strategies are supported by distinct patterns of directional connectivity among key brain regions for social cognition.

  9. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture. Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  10. Space Elevators Preliminary Architectural View

    Science.gov (United States)

    Pullum, L.; Swan, P. A.

    Space Systems Architecture has been expanded into a process by the US Department of Defense for their large-scale systems-of-systems development programs. This paper uses the steps in the process to establish a framework for Space Elevator systems to be developed and provides a methodology to manage complexity. This new approach to developing a family of systems is based upon three architectural views: Operational View (OV), Systems View (SV), and Technical Standards View (TV). The top-level view of the process establishes the stages for the development of the first Space Elevator and is called Architectural View - 1, Overview and Summary. This paper will show the guidelines and steps of the process while focusing upon components of the Space Elevator Preliminary Architecture View. This Preliminary Architecture View is presented as a draft starting point for the Space Elevator Project.

  11. Management of cyber physical objects in the future Internet of Things methods, architectures and applications

    CERN Document Server

    Loscri, Valeria; Rovella, Anna; Fortino, Giancarlo

    2016-01-01

    This book focuses on new methods, architectures, and applications for the management of Cyber Physical Objects (CPOs) in the context of the Internet of Things (IoT). It covers a wide range of topics related to CPOs, such as resource management, hardware platforms, communication and control, and control and estimation over networks. It also discusses decentralized, distributed, and cooperative optimization as well as effective discovery, management, and querying of CPOs. Other chapters outline the applications of control, real-time aspects, and software for CPOs and introduce readers to agent-oriented CPOs, communication support for CPOs, real-world deployment of CPOs, and CPOs in Complex Systems. There is a focus on the importance of application of IoT technologies for Smart Cities.

  12. A Novel Architecture of Metadata Management System Based on Intelligent Cache

    Institute of Scientific and Technical Information of China (English)

    SONG Baoyan; ZHAO Hongwei; WANG Yan; GAO Nan; XU Jin

    2006-01-01

    This paper introduces a novel architecture of a metadata management system based on an intelligent cache, called Metadata Intelligent Cache Controller (MICC). By using an intelligent cache to control the metadata system, MICC can deal with different scenarios, such as splitting and merging queries into sub-queries for metadata sets available locally, in order to reduce the access time of remote queries. An application can find partial results in the local cache, while the remaining portion of the metadata can be fetched from remote locations. Using the existing metadata, it can not only enhance the fault tolerance and load balancing of the system effectively, but also improve access efficiency while ensuring access quality.
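
    The query-splitting behaviour described above (serve what the local cache holds, fetch only the residual sub-query remotely) can be sketched as follows; the function names and cache shape are illustrative assumptions, not MICC's actual interface:

```python
def split_query(keys, cache):
    """Split a metadata query: satisfy what we can from the local cache
    and return the residual keys that need a remote sub-query."""
    local = {k: cache[k] for k in keys if k in cache}
    remote_keys = [k for k in keys if k not in cache]
    return local, remote_keys

def resolve(keys, cache, fetch_remote):
    """Answer a metadata query partially from cache, fetching the rest
    and populating the cache so future queries are served locally."""
    local, missing = split_query(keys, cache)
    if missing:
        fetched = fetch_remote(missing)
        cache.update(fetched)
        local.update(fetched)
    return local
```

    A second identical query then hits the cache entirely, which is the access-time reduction the abstract describes.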

  13. Measurements of the absolute neutron fluence spectrum emitted at 0° and 90° from the Little-Boy replica

    International Nuclear Information System (INIS)

    Roberts, J.H.; Gold, R.; Preston, C.C.

    1986-01-01

    Nuclear research emulsions (NRE) have been used to characterize the neutron spectrum emitted by the Little-Boy replica. NRE were irradiated at the Little-Boy surface, as well as approximately 2 m from the center of the Little-Boy replica, using polar angles of 0°, 30°, 60°, and 90°. For the NRE exposed at 2 m, neutron background was determined using shadow shields of borated polyethylene. Emulsion scanning to date has concentrated exclusively on the 2-m, 0° and 2-m, 90° locations. Approximately 5000 proton-recoil tracks have been measured in NRE irradiated at each of these locations. At the 2-m, 90° location, the NRE neutron spectrum extends from 0.37 MeV up to 8.2 MeV; whereas the NRE neutron spectrum at the 2-m, 0° location is much softer and extends only up to 2.7 MeV. NRE neutron spectrometry results at these two locations are compared with both liquid scintillator neutron spectrometry and Monte Carlo calculations. (author)

  14. Enterprise architecture approach to mining companies engineering

    Directory of Open Access Journals (Sweden)

    Ilin’ Igor

    2017-01-01

    As the Russian economy is still largely oriented towards commodities production, there are many cities where mining and commodity-oriented enterprises are the backbone of the city economy. These enterprises largely define the quality of life of citizens in such cities; thus there are high requirements for the engineering of city-forming enterprises. The paper describes an enterprise architecture approach to management system engineering for mining enterprises. The paper contains a model of the mining enterprise architecture, an approach to the development and implementation of an integrated management system based on the concept of enterprise architecture, and the structure of the information systems and information technology infrastructure of the mining enterprise.

  15. Trends in PACS architecture

    International Nuclear Information System (INIS)

    Bellon, Erwin; Feron, Michel; Deprez, Tom; Reynders, Reinoud; Van den Bosch, Bart

    2011-01-01

    Radiological Picture Archiving and Communication Systems (PACS) have only relatively recently become abundant. Many hospitals have made the transition to PACS about a decade ago. During that decade requirements and available technology have changed considerably. In this paper we look at factors that influence the design of tomorrow's systems, especially those in larger multidisciplinary hospitals. We discuss their impact on PACS architecture (a technological perspective) as well as their impact on radiology (a management perspective). We emphasize that many of these influencing factors originate outside radiology and that radiology has little impact on these factors. That makes it the more important for managers in radiology to be aware of architectural aspects and it may change cooperation of radiology with, among others, the hospital's central IT department.

  16. Seed-a distributed data base architecture for global management of steam-generator inspection data

    International Nuclear Information System (INIS)

    Soon Ju Kang; Yu Rak Choi; Hee Gon Woo; Seong Su Choi

    1996-01-01

    This paper presents a data management system called SEED (Steam-generator Eddy-current Expert Database) for the global handling of steam generator (SG) tube inspection data in nuclear power plants. SEED integrates all stages of the SG tube inspection process and supports all related data, such as raw eddy-current data, inspection history data, and SG tube information. SEED is implemented under a client/server computing architecture and supports LAN/WAN-based graphical user interface facilities built with WWW programming tools. (author)

  17. Design, Analysis and User Acceptance of Architectural Design Education in Learning System Based on Knowledge Management Theory

    Science.gov (United States)

    Wu, Yun-Wu; Lin, Yu-An; Wen, Ming-Hui; Perng, Yeng-Hong; Hsu, I-Ting

    2016-01-01

    The major purpose of this study is to develop an architectural design knowledge management learning system with corresponding learning activities to help students engage in meaningful learning and improve their design capability during the learning process. Firstly, the system can help the students obtain and share useful knowledge. Secondly,…

  18. Satellite ATM Networks: Architectures and Guidelines Developed

    Science.gov (United States)

    vonDeak, Thomas C.; Yegendu, Ferit

    1999-01-01

    An important element of satellite-supported asynchronous transfer mode (ATM) networking will involve support for the routing and rerouting of active connections. Work published under the auspices of the Telecommunications Industry Association (http://www.tiaonline.org), describes basic architectures and routing protocol issues for satellite ATM (SATATM) networks. The architectures and issues identified will serve as a basis for further development of technical specifications for these SATATM networks. Three ATM network architectures for bent pipe satellites and three ATM network architectures for satellites with onboard ATM switches were developed. The architectures differ from one another in terms of required level of mobility, supported data rates, supported terrestrial interfaces, and onboard processing and switching requirements. The documentation addresses low-, middle-, and geosynchronous-Earth-orbit satellite configurations. The satellite environment may require real-time routing to support the mobility of end devices and nodes of the ATM network itself. This requires the network to be able to reroute active circuits in real time. In addition to supporting mobility, rerouting can also be used to (1) optimize network routing, (2) respond to changing quality-of-service requirements, and (3) provide a fault tolerance mechanism. Traffic management and control functions are necessary in ATM to ensure that the quality-of-service requirements associated with each connection are not violated and also to provide flow and congestion control functions. Functions related to traffic management were identified and described. Most of these traffic management functions will be supported by on-ground ATM switches, but in a hybrid terrestrial-satellite ATM network, some of the traffic management functions may have to be supported by the onboard satellite ATM switch. 
Future work is planned to examine the tradeoffs of placing traffic management functions onboard a satellite as

  19. Parylene C coating for high-performance replica molding.

    Science.gov (United States)

    Heyries, Kevin A; Hansen, Carl L

    2011-12-07

    This paper presents an improvement to the soft lithography fabrication process that uses chemical vapor deposition of poly(chloro-p-xylylene) (parylene C) to protect microfabricated masters and to improve the release of polymer devices following replica molding. Chemical vapor deposition creates nanometre thick conformal coatings of parylene C on silicon wafers having arrays of 30 μm high SU8 pillars with densities ranging from 278 to 10,040 features per mm(2) and aspect ratios (height : width) from 1 : 1 to 6 : 1. A single coating of parylene C was sufficient to permanently promote poly(dimethyl)siloxane (PDMS) mold release and to protect masters for an indefinite number of molding cycles. We also show that the improved release properties of parylene treated masters allow for fabrication with hard polymers, such as poly(urethane), that would otherwise not be compatible with SU8 on silicon masters. Parylene C provides a robust and high performance mold release coating for soft lithography microfabrication that extends the life of microfabricated masters and improves the achievable density and aspect ratio of replicated features.

  20. ANALYSIS OF THE KEY ACTIVITIES OF THE LIFE CYCLE OF KNOWLEDGE MANAGEMENT IN THE UNIVERSITY AND DEVELOPMENT OF THE CONCEPTUAL ARCHITECTURE OF THE KNOWLEDGE MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Eugene N. Tcheremsina

    2013-01-01

    This article analyzes the key activities of the life cycle of knowledge management in terms of the features of knowledge management in higher education. Based on this analysis, we propose a model of the conceptual architecture of the virtual knowledge space of a university. The proposed model is the basis for the development of the kernel of an intercollegiate virtual knowledge space based on cloud technology.

  1. Medical Data Architecture (MDA) Project Status

    Science.gov (United States)

    Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2018-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight to medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.

  2. Non-invasive Florentine Renaissance Panel Painting Replica Structures Investigation by Using Terahertz Time-Domain Imaging (THz-TDI) Technique

    DEFF Research Database (Denmark)

    Dandolo, Corinna Ludovica Koch; Picollo, Marcello; Cucci, Costanza

    2016-01-01

    The potentials of the Terahertz Time-Domain Imaging (THz-TDI) technique for a non-invasive inspection of panel paintings have been considered in detail. The THz-TD data acquired on a replica of a panel painting made in imitation of Italian Renaissance panel paintings were processed in order to pr...

  3. Architecture of security management unit for safe hosting of multiple agents

    Science.gov (United States)

    Gilmont, Tanguy; Legat, Jean-Didier; Quisquater, Jean-Jacques

    1999-04-01

    In growing areas such as remote applications in large public networks, electronic commerce, digital signatures, intellectual property and copyright protection, and even operating system extensibility, the hardware security level offered by existing processors is insufficient. They lack protection mechanisms that prevent the user from tampering with critical data owned by those applications. Some devices are exceptions, but have neither enough processing power nor enough memory to stand up to such applications (e.g., smart cards). This paper proposes an architecture for a secure processor, in which the classical memory management unit is extended into a new security management unit. It allows ciphered code execution and ciphered data processing. An internal permanent memory can store cipher keys and critical data for several client agents simultaneously. The ordinary supervisor privilege scheme is replaced by a privilege inheritance mechanism that is better suited to operating system extensibility. The result is a secure processor that has hardware support for extensible multitask operating systems and can be used both for general applications and for critical applications needing strong protection. The security management unit and the internal permanent memory can be added to an existing CPU core without loss of performance and do not require the core to be modified.

  4. Trust information-based privacy architecture for ubiquitous health.

    Science.gov (United States)

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network operates in an open and unsecure information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, in ubiquitous health it is impossible to assume the existence of a priori trust between the DS and service providers and to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations of the systems involved often remain unknown. Furthermore, health care-specific regulations do not govern the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. 
Based on principles, models, and requirements, architectural components and their interconnections were developed using system

  5. The Management of Manufacturing-Oriented Informatics Systems Using Efficient and Flexible Architectures

    Directory of Open Access Journals (Sweden)

    Constantin Daniel AVRAM

    2011-01-01

    Industry, and in particular the manufacturing-oriented sector, has always been researched and innovated as a result of technological progress, diversification, and differentiation among consumers' demands. A company that provides its customers with products perfectly matching their demands at competitive prices has a great advantage over its competitors. Manufacturing-oriented information systems are becoming more flexible and configurable, and they require integration with the entire organization. This can be done using efficient software architectures that allow the coexistence of commercial solutions and open source components while sharing computing resources organized in grid infrastructures and under the governance of powerful management tools.

  6. PLM support to architecture based development

    DEFF Research Database (Denmark)

    Bruun, Hans Peter Lomholt

    , organisation, processes, etc. To identify, evaluate, and align aspects of these domains are necessary for developing the optimal layout of product architectures. It is stated in this thesis that architectures describe building principles for products, product families, and product programs, where this project...... and developing architectures can be difficult to manage, update, and maintain during development. The concept of representing product architectures in computer-based product information tools has though been central in this research, and in the creation of results. A standard PLM tool (Windchill PDMLink...... architectures in computer systems. Presented results build on research literature and experiences from industrial partners. Verification of the theory contributions, approaches, models, and tools, have been carried out in industrial projects, with promising results. This thesis describes the means for: (1...

  7. Effect of boundary conditions on the strength and deformability of replicas of natural fractures in welded tuff: Comparison between predicted and observed shear behavior using a graphical method

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.; Robertson, A.B.

    1993-09-01

    Four series of cyclic direct-shear experiments were conducted on several replicas of three natural fractures and a laboratory-developed tensile fracture of welded tuff from Yucca Mountain to test the graphical load-displacement analysis method proposed by Saeb (1989) and Amadei and Saeb (1990). Based on the results of shear tests conducted on several joint replicas under different levels of constant normal load ranging between 0.6 and 25.6 kips (2.7 and 113.9 kN), the shear behavior of joint replicas under constant normal stiffness ranging between 14.8 and 187.5 kips/in. (25.9 and 328.1 kN/cm) was predicted by using the graphical method. The predictions were compared to the results of actual shear tests conducted for the same range of constant normal stiffness. In general, a good agreement was found between the predicted and the observed shear behavior

  8. Architecture of Environmental Engineering

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Alting, Leo

    2006-01-01

    An architecture of Environmental Engineering has been developed comprising the various disciplines and tools involved. It identifies industry as the major actor and target group, and it builds on the concept of Eco-efficiency. To improve Eco-efficiency, there is a limited number of intervention......-efficiency is the aim of Environmental Engineering, the discipline of synthesis – design and creation of solutions – will form a core pillar of the architecture. Other disciplines of Environmental Engineering exist forming the necessary background and frame for the synthesis. Environmental Engineering, thus, in essence...... comprise the disciplines of: management, system description & inventory, analysis & assessment, prioritisation, synthesis, and communication, each existing at all levels of intervention. The developed architecture of Environmental Engineering, thus, consists of thirty individual disciplines, within each...

  9. Architecture of Environmental Engineering

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Alting, Leo

    2004-01-01

    An architecture of Environmental Engineering has been developed comprising the various disciplines and tools involved. It identifies industry as the major actor and target group, and it builds on the concept of Eco-efficiency. To improve Eco-efficiency, there is a limited number of intervention...... of Eco-efficiency is the aim of Environmental Engineering, the discipline of synthesis – design and creation of solutions – will form a core pillar of the architecture. Other disciplines of Environmental Engineering exist forming the necessary background and frame for the synthesis. Environmental...... Engineering, thus, in essence comprise the disciplines of: management, system description & inventory, analysis & assessment, prioritisation, synthesis, and communication, each existing at all levels of intervention. The developed architecture of Environmental Engineering, thus, consists of thirty individual...

  10. An Enhanced System Architecture for Optimized Demand Side Management in Smart Grid

    Directory of Open Access Journals (Sweden)

    Anzar Mahmood

    2016-04-01

    Demand Side Management (DSM) through optimization of home energy consumption in the smart grid environment is now one of the well-known research areas. Appliance scheduling has been done through many different algorithms to reduce peak load and, consequently, the Peak to Average Ratio (PAR). This paper presents a Comprehensive Home Energy Management Architecture (CHEMA) with integration of multiple appliance scheduling options and enhanced load categorization in a smart grid environment. The CHEMA model consists of six layers and has been modeled in Simulink with embedded MATLAB code. A single knapsack optimization technique is used for scheduling, and four different cases of cost reduction are modeled at the second layer of CHEMA. Fault identification and electricity theft control have also been added to CHEMA. Furthermore, carbon footprint calculations have been incorporated in order to make users aware of environmental concerns. Simulation results prove the effectiveness of the proposed model.
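
    The single-knapsack scheduling step mentioned in this abstract can be illustrated with a generic 0/1 knapsack: choose which appliances to run in a peak interval so that total utility is maximized without exceeding a power cap. This is a plain-Python sketch of the general technique, not the CHEMA Simulink/MATLAB implementation; the appliance names, units, and utility values are invented for the example.

```python
def schedule_appliances(appliances, power_cap):
    """0/1 knapsack: pick the appliance subset maximizing utility
    within an integer power cap.
    appliances: list of (name, power, utility) with integer power units."""
    n = len(appliances)
    # best[i][c] = max utility using the first i appliances under cap c
    best = [[0] * (power_cap + 1) for _ in range(n + 1)]
    for i, (_, p, u) in enumerate(appliances, 1):
        for cap in range(power_cap + 1):
            best[i][cap] = best[i - 1][cap]  # skip appliance i
            if p <= cap:  # or run it, if it fits
                best[i][cap] = max(best[i][cap], best[i - 1][cap - p] + u)
    # backtrack to recover which appliances were chosen
    chosen, cap = [], power_cap
    for i in range(n, 0, -1):
        if best[i][cap] != best[i - 1][cap]:
            name, p, _ = appliances[i - 1]
            chosen.append(name)
            cap -= p
    return best[n][power_cap], chosen
```

    The same routine can be re-run per scheduling interval with a different cap, which is one simple way to flatten peak load and hence the PAR.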

  11. Putting Home Data Management into Perspective

    Science.gov (United States)

    2009-12-01

    lounge. Third, I performed user studies to evaluate the impact of view-based management. I created a user interface for replica management and placed... computers]. An electronic desert island out there. An e-Island.” Of course some of the data left behind on old devices should be left there; a device upgrade... bank, was unwilling to back up all their data and had a separate policy for raw video footage and finished items to save space and time. This cost

  12. A MCIN-based architecture of smart agriculture

    Directory of Open Access Journals (Sweden)

    Xiang Gu

    2017-09-01

    Purpose – Material conscious and information network (MCIN) is a kind of cyber-physical-social system. This paper aims to study the MCIN modeling method and design the MCIN-based architecture of smart agriculture (MCIN-ASA), which differs from the current vertical architecture and involves production, management, and commerce. The architecture is composed of three kinds of MCIN-ASA participants: enterprises, individuals, and commodities. Design/methodology/approach – The architecture uses enterprises' and individuals' personalized portals as the carriers, which are linked precisely with each other through a peer-to-peer network called a six-degrees-of-separation blockchain. The authors aim to establish a self-organizing, open, and ecological operational system that includes active, personalized consumption; direct, centralized distribution; and distributed, smart production. Findings – The paper models the three main MCIN-ASA participants and designs the smart supply, demand, and management functions, which shows the feasibility, innovation, and high efficiency of implementing MCIN in agriculture. The paper also presents a prototype system based on the architecture. Originality/value – The authors think that MCIN-ASA improves current agriculture greatly and offers much inspiration for production-marketing-combined electronic commerce.

  13. Designing an architectural style for dynamic medical Cross-Organizational Workflow management system: an approach based on agents and web services.

    Science.gov (United States)

    Bouzguenda, Lotfi; Turki, Manel

    2014-04-01

    This paper shows how the combined use of agent and web services technologies can help to design an architectural style for a dynamic medical Cross-Organizational Workflow (COW) management system. Medical COW aims at supporting the collaboration between several autonomous and possibly heterogeneous medical processes, distributed over different organizations (hospitals, clinics, or laboratories). Dynamic medical COW refers to occasional cooperation between these health organizations, free of structural constraints, where the medical partners involved and their number are not pre-defined. More precisely, this paper proposes a new architectural style based on agents and web services technologies to deal with two key coordination issues of dynamic COW: finding medical partners and negotiating between them. It also shows how the proposed dynamic medical COW management system architecture can connect to a multi-agent system coupling the Clinical Decision Support System (CDSS) with Computerized Prescriber Order Entry (CPOE). The idea is to assist health professionals such as doctors, nurses, and pharmacists with decision-making tasks, such as determining a diagnosis or analyzing patient data, without stopping their clinical processes, in order to act in a coherent way and give care to the patient.

  14. Research and Design in Unified Coding Architecture for Smart Grids

    Directory of Open Access Journals (Sweden)

    Gang Han

    2013-09-01

    A standardized and shared information platform is the foundation of the Smart Grid. In order to improve dispatching center information integration in power grids and achieve efficient data exchange, sharing, and interoperability, a unified coding architecture is proposed. The architecture includes a coding management layer, a coding generation layer, an information models layer, and an application system layer. The hierarchical design allows the whole coding architecture to adapt to different application environments, different interfaces, and loosely coupled requirements, realizing the integrated model management function of the power grid. The life cycle and a survivability evaluation method for the unified coding architecture are proposed, ensuring the stability and availability of the coding architecture. Finally, future directions for coding technology in Smart Grids are discussed.

  15. Autotransplantation of Premolars With a 3-Dimensional Printed Titanium Replica of the Donor Tooth Functioning as a Surgical Guide: Proof of Concept.

    Science.gov (United States)

    Verweij, Jop P; Moin, David Anssari; Mensink, Gertjan; Nijkamp, Peter; Wismeijer, Daniel; van Merkesteyn, J P Richard

    2016-06-01

    Autotransplantation of premolars is a good treatment option for young patients who have missing teeth. This study evaluated the use of a preoperatively 3-dimensional (3D)-printed replica of the donor tooth that functions as a surgical guide during autotransplantation. Five consecutive procedures were prospectively observed. Transplantations of maxillary premolars with optimal root development were included in this study. A 3D-printed replica of the donor tooth was used to prepare a precisely fitting new alveolus at the recipient site before extracting the donor tooth. Procedure time, extra-alveolar time, and number of attempts needed to achieve a good fit of the donor tooth in the new alveolus were recorded. For each transplantation procedure, the surgical time was shorter than 30 minutes. An immediate good fit of the donor tooth in the new alveolus was achieved with an extra-alveolar time shorter than 1 minute for all transplantations. These results show that the extra-alveolar time is very short when the surgical guide is used; therefore, the chance of iatrogenic damage to the donor tooth is minimized. The use of a replica of the donor tooth makes the autotransplantation procedure easier for the surgeon and facilitates optimal placement of the transplant. Copyright © 2016 The American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Fabrication of porous TiO{sub 2} films using a spongy replica prepared by layer-by-layer self-assembly method: Application to dye-sensitized solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Tsuge, Yosuke [Department of Applied Physics and Physico-informatics, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama-shi 223-8522 (Japan)]. E-mail: yotsuge@appi.keio.ac.jp; Inokuchi, Kohei [Department of Applied Physics and Physico-informatics, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama-shi 223-8522 (Japan); Onozuka, Katsuhiro [Department of Applied Physics and Physico-informatics, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama-shi 223-8522 (Japan); Shingo, Ohno [Research and Development Division, Bridgestone Corporation, 3-1-1 Ogawahigashi-cho, Kodaira-shi, Tokyo-to 187-8531 (Japan); Sugi, Shinichiro [Research and Development Division, Bridgestone Corporation, 3-1-1 Ogawahigashi-cho, Kodaira-shi, Tokyo-to 187-8531 (Japan); Yoshikawa, Masato [Research and Development Division, Bridgestone Corporation, 3-1-1 Ogawahigashi-cho, Kodaira-shi, Tokyo-to 187-8531 (Japan); Shiratori, Seimei [Department of Applied Physics and Physico-informatics, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama-shi 223-8522 (Japan)]. E-mail: shiratori@appi.keio.ac.jp

    2006-03-21

    In this study, we report the fabrication of anatase TiO{sub 2} films with high porosity using a new spongy replica prepared by the layer-by-layer self-assembly technique. Scanning electron microscope photographs revealed that the spongy replica has an extremely porous microstructure and a high surface area. Moreover, this porous replica was easily fabricated from a very flat film through reaction with a silver acetate solution. This method yielded porous TiO{sub 2} films with a high surface area. Additionally, with this method, the necking between the TiO{sub 2} layers was strong and the amount of loaded dye was increased, enhancing forward electron transfer between the TiO{sub 2} film at the surface and the TiO{sub 2} film on the substrate. By using the fabricated porous TiO{sub 2} films as the photoelectrode of a dye-sensitized solar cell, an improvement of the photocurrent-voltage characteristic was achieved, resulting in an energy conversion efficiency of Eff = 2.66% at a thickness of approximately 5 {mu}m.

  17. Managing the complexity of collective architectural designing

    NARCIS (Netherlands)

    Sebastian, R.

    2006-01-01

    This paper addresses the complexity of architectural designing whereby multiple designers work in close collaboration to conceive the design of an integrated building project in an urban context. Collaborative design conception has only recently been observed as a phenomenon of the built environment.

  18. A Probabilistic Framework for Constructing Temporal Relations in Replica Exchange Molecular Trajectories.

    Science.gov (United States)

    Chattopadhyay, Aditya; Zheng, Min; Waller, Mark Paul; Priyakumar, U Deva

    2018-05-23

    Knowledge of the structure and dynamics of biomolecules is essential for elucidating the underlying mechanisms of biological processes. Given the stochastic nature of many biological processes, like protein unfolding, it is almost impossible that two independent simulations will generate the exact same sequence of events, which makes direct analysis of simulations difficult. Statistical models like Markov chains, transition networks, etc., help shed some light on the mechanistic nature of such processes by predicting the long-time dynamics of these systems from short simulations. However, such methods fall short in analyzing trajectories with partial or no temporal information, for example, replica exchange molecular dynamics or Monte Carlo simulations. In this work we propose a probabilistic algorithm, borrowing concepts from graph theory and machine learning, to extract reactive pathways from molecular trajectories in the absence of temporal data. A suitable vector representation was chosen to represent each frame in the macromolecular trajectory (as a series of interaction and conformational energies), and dimensionality reduction was performed using principal component analysis (PCA). The trajectory was then clustered using a density-based clustering algorithm, where each cluster represents a metastable state on the potential energy surface (PES) of the biomolecule under study. A graph was created with these clusters as nodes, with the edges learnt using an iterative expectation maximization algorithm. The most reactive path is conceived as the widest path along this graph. We have tested our method on an RNA hairpin unfolding trajectory in aqueous urea solution. Our method makes the understanding of the unfolding mechanism of the RNA hairpin molecule more tractable. As this method does not rely on temporal data, it can be used to analyze trajectories from Monte Carlo sampling techniques and replica exchange molecular dynamics (REMD).
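
    The widest-path step at the end of this abstract (maximize the minimum edge weight along a path between two metastable states) can be sketched with a Dijkstra-style search. This is an illustrative sketch of the maximum-bottleneck-path technique, not the authors' code; the graph, node names, and weights are invented for the example.

```python
import heapq

def widest_path(graph, source, target):
    """Maximum-bottleneck path: among all source->target paths, maximize
    the minimum edge weight. graph: {node: {neighbor: weight}}."""
    width = {source: float("inf")}  # best bottleneck width found per node
    prev = {}
    heap = [(-float("inf"), source)]  # min-heap on negated width = max-heap
    while heap:
        neg_w, node = heapq.heappop(heap)
        w = -neg_w
        if node == target:
            break  # widest entry for target popped: optimal
        if w < width.get(node, 0):
            continue  # stale heap entry
        for nbr, edge_w in graph.get(node, {}).items():
            cand = min(w, edge_w)  # bottleneck if we extend through this edge
            if cand > width.get(nbr, 0):
                width[nbr] = cand
                prev[nbr] = node
                heapq.heappush(heap, (-cand, nbr))
    # reconstruct the path by walking predecessors back from the target
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return width[target], list(reversed(path))
```

    With edge weights taken as transition weights between cluster states, the returned path plays the role of the "most reactive path" described above.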

  19. Computer simulation study of in-zeolites templated carbon replicas: structural and adsorption properties for hydrogen storage application

    International Nuclear Information System (INIS)

    Roussel, T.

    2007-05-01

    Hydrogen storage is the key issue in envisaging this gas, for instance, as an energy vector in the field of transportation. Porous carbons are materials considered as possible candidates. We have studied well-controlled microporous carbon nano-structures, carbonaceous replicas of mesoporous ordered silica materials and zeolites. We generated numerically (using Grand Canonical Monte Carlo simulations, GCMC) the atomic nano-structures of the carbon replicas of four zeolites: AlPO4-5, silicalite-1, and faujasite (FAU and EMT). The faujasite replicas allow nano-casting of a new form of crystalline carbon solid made of tetrahedrally or hexagonally interconnected single-wall nano-tubes. The pore networks are nano-metric, giving these materials optimized molecular hydrogen storage capacities (for pure carbon phases). However, we demonstrate that these new carbon forms are not interesting for efficient room-temperature storage compared to the void space of a classical gas cylinder. We showed that, by doping with an alkaline element such as lithium, one could store at 350 bar the same quantities as a classical tank at 700 bar. This result is a possible route toward interesting performances for on-board docking systems, for instance. (author)

  20. From green architecture to architectural green

    DEFF Research Database (Denmark)

    Earon, Ofri

    2011-01-01

    that describes the architectural exclusivity of this particular architecture genre. The adjective green expresses architectural qualities differentiating green architecture from non-green architecture. Currently, adding trees and vegetation to the building’s facade is the main architectural characteristic...... they have overshadowed the architectural potential of green architecture. The paper questions how a green space should perform, look like and function. Two examples are chosen to demonstrate thorough integrations between green and space. The examples are public buildings categorized as pavilions. One......The paper investigates the topic of green architecture from an architectural point of view and not an energy point of view. The purpose of the paper is to establish a debate about the architectural language and spatial characteristics of green architecture. In this light, green becomes an adjective...

  1. The Walk-Man Robot Software Architecture

    Directory of Open Access Journals (Sweden)

    Mirko Ferrati

    2016-05-01

    Full Text Available A software and control architecture for a humanoid robot is a complex and large project, which requires a coordinated team of developers/researchers and many hard design choices. If such a project has to be completed in a very limited time, i.e., less than 1 year, more constraints are added, and concepts such as modular design, code reusability, and API definition need to be used as much as possible. In this work, we describe the software architecture developed for Walk-Man, a robot that participated in the DARPA Robotics Challenge (DRC). The challenge required the robot to execute many different tasks, such as walking, driving a car, and manipulating objects. These tasks need to be solved by robotics specialists in their corresponding research fields, such as humanoid walking, motion planning, or object manipulation. The proposed architecture was developed in 10 months, provided boilerplate code for most of the functionalities required to control a humanoid robot, and allowed robotics researchers to produce their control modules for DRC tasks in a short time. Additional capabilities of the architecture include firmware and hardware management, mixing of different middlewares, unreliable network management, and the operator control station GUI. All the source code related to the architecture and some control modules have been released as open source projects.
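    The modular, API-driven design described above can be illustrated with a minimal plug-in-style sketch. All class and method names here are hypothetical, chosen for illustration; they are not the actual Walk-Man API:

```python
class ControlModule:
    """Boilerplate base class: each task-specific module implements run()."""
    name = "base"

    def run(self, state):
        raise NotImplementedError

class WalkingModule(ControlModule):
    name = "walking"

    def run(self, state):
        return f"walking with posture={state['posture']}"

class ManipulationModule(ControlModule):
    name = "manipulation"

    def run(self, state):
        return f"grasping object={state['object']}"

class Dispatcher:
    """Routes the shared robot state to the module that owns the task,
    so specialists can develop modules independently behind one API."""

    def __init__(self):
        self._modules = {}

    def register(self, module):
        self._modules[module.name] = module

    def execute(self, task, state):
        return self._modules[task].run(state)

d = Dispatcher()
d.register(WalkingModule())
d.register(ManipulationModule())
print(d.execute("walking", {"posture": "upright"}))
# -> walking with posture=upright
```

    The point of the pattern is that adding a new DRC task is one new subclass plus one `register` call; no existing module changes.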

  2. Management practices and influences on IT architecture decisions: a case study in a telecom company

    OpenAIRE

    Hsing, Chen Wen; Souza, Cesar Alexandre de

    2012-01-01

    The study aims to analyze the IT architecture management practices associated with their degree of maturity and the influence of institutional and strategic factors on the decisions involved through a case study in a large telecom organization. The case study allowed us to identify practices that led the company to its current stage of maturity and identify practices that can lead the company to the next stage. The strategic influence was mentioned by most respondents and the institutional in...

  3. EVALUATION OF UTILIZING SERVICE ORIENTED ARCHITECTURE AS A SUITABLE SOLUTION TO ALIGN UNIVERSITY MANAGEMENT INFORMATION SYSTEMS AND LEARNING MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. M. RIAD

    2009-10-01

    Full Text Available To help universities achieve their goals, it is important to align managerial functionalities side by side with educational aspects. Universities use University Management Information Systems (UMIS) to handle managerial aspects, as they do Learning Management Systems (LMS) to achieve learning objectives. UMIS predates LMS by decades and has reached a stable and mature level of consistency. Compared to UMIS, LMS is the more recently acquired solution in universities, and adopting an LMS can be achieved via three different deployment approaches. The first approach holds that an LMS can replace UMIS and perform its functionalities. The second approach presents the idea of extending UMIS to include LMS functionalities. The third approach arises from the shortcomings of the two preceding approaches and presents integration between both as the appropriate deployment approach. Service Oriented Architecture (SOA) is a design pattern that can be used as a suitable architectural solution to align UMIS and LMS. SOA can be utilized in universities to overcome some information systems challenges, such as the integration between UMIS and LMS. This paper presents the current situation at Mansoura University, Egypt, presents integration as the most suitable solution, and evaluates three different implementation techniques: Dynamic Query, Stored Procedure, and Web services. The evaluation concludes that although SOA enhanced many different aspects of both UMIS and LMS, and consequently the university overall, it is not recommended to adopt SOA via Web services as the building unit of the system, but rather as the interdisciplinary interface between systems.

  4. Essential Layers, Artifacts, and Dependencies of Enterprise Architecture

    OpenAIRE

    Winter, Robert; Fischer, Ronny

    2007-01-01

    After a period where implementation speed was more important than integration, consistency and reduction of complexity, architectural considerations have become a key issue of information management in recent years again. Enterprise architecture is widely accepted as an essential mechanism for ensuring agility and consistency, compliance and efficiency. Although standards like TOGAF and FEAF have developed, however, there is no common agreement on which architecture layers, which artifact typ...

  5. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...... correlation between the study of existing architectures and the training of competences to design for present-day realities.......This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand...

  6. Functional Interface Considerations within an Exploration Life Support System Architecture

    Science.gov (United States)

    Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad

    2016-01-01

    As notional life support system (LSS) architectures are developed and evaluated, myriad options must be considered pertaining to process technologies, components, and equipment assemblies. Each option must be evaluated relative to its impact on key functional interfaces within the LSS architecture. A leading notional architecture has been developed to guide the path toward realizing future crewed space exploration goals. This architecture includes atmosphere revitalization, water recovery and management, and environmental monitoring subsystems. Guiding requirements for developing this architecture are summarized and important interfaces within the architecture are discussed. The role of environmental monitoring within the architecture is described.

  7. Enterprise Architecture in the Supply Chain

    DEFF Research Database (Denmark)

    Tambo, Torben; Koch, Christian

    2010-01-01

    Information systems in supply chain management (SCM) is common, bringing architecture on the agenda . The paper uses three perspectives on enterprise architecture (EA) in the supply chain: The correlation view, the remote view and the institutional view. It is shown that the EA in the domain...... of supply chain has to meet quite a complicated set of demands. Coherency Management (CM) for the aligning of business processes and underlying technology is used by proposing three parameters for EA: Alignment, agility and assurance. Alignment addresses the depth of business vs. technology correspondence...... is presented and discussed. The case outlines potentials for an enhanced alignment and coherence between management, business processes and underlying information system; innovation is led by tighter integration with business partners, higher versatility in the adaption to formal business requirements...

  8. The NASA Integrated Information Technology Architecture

    Science.gov (United States)

    Baldridge, Tim

    1997-01-01

    of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality described in the Systems Architecture. The primary components described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities; it also defines a plan, aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.

  9. The brand architecture of grocery retailers

    DEFF Research Database (Denmark)

    Bech-Larsen, Tino; Esbjerg, Lars

    2009-01-01

    This article discusses how the brand architecture of grocery retailers set material and symbolic boundaries for consumer choice, thus limiting consumer sovereignty. The article first discusses previous work on store atmospherics, servicescapes and brand architecture. It is argued that work based...... on these concepts has taken an internal management perspective on how retailers can manipulate aspects of the retail setting to serve their own interests. Then, we develop an alternative conceptualisation of retailer brand architecture that takes into account that consumers (and other constituents) are active co......- constructors of material and symbolic aspects of retail settings. It is discussed how consumers participate in constructing retailer brand architecture and how this concept differs from previous research. Implications for both research and practice are discussed....

  10. Popularity Prediction Tool for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Stewart, G; Lassnig, M; Garonne, V; Barisits, M; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2013-01-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near-term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution, a grid simulator is introduced that is able to replay real workloads on different data distri...
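    The trend-learning step described above can be sketched, under strong simplifying assumptions, as a tiny feed-forward network that maps a window of past weekly access counts to the next week's count. The data here is synthetic and the network deliberately minimal; it is not the actual DDM popularity feed or model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic access history: decaying dataset popularity plus noise.
weeks = np.arange(60)
accesses = 500 * np.exp(-weeks / 20) + rng.normal(0, 5, 60)

# Build (window of W past weeks -> next week's count) training pairs.
W = 4
X = np.array([accesses[i:i + W] for i in range(len(accesses) - W)])
y = accesses[W:]

# Normalise for stable training.
scale = accesses.max()
X, y = X / scale, y / scale

# One-hidden-layer network trained with plain gradient descent.
H, lr = 8, 0.05
W1 = rng.normal(0, 0.5, (W, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = (h @ W2 + b2).ravel()      # predicted next-week accesses
    err = pred - y
    # Backpropagate mean-squared-error gradients.
    gW2 = h.T @ err[:, None] / len(y); gb2 = err.mean(keepdims=True)
    gh = err[:, None] @ W2.T * (1 - h ** 2)
    gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Forecast the week after the observed history.
last = accesses[-W:] / scale
forecast = float((np.tanh(last @ W1 + b1) @ W2 + b2) * scale)
print(f"forecast accesses next week: {forecast:.1f}")
```

    In the tool described above, such forecasts would then feed the replica redistribution step, weighing replication cost against the predicted access gain.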

  11. Popularity Prediction Tool for ATLAS Distributed Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Stewart, G; Lassnig, M; Garonne, V; Barisits, M; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near-term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution, a grid simulator is introduced that is able to replay real workloads on different data distri...

  12. The GOES-R Product Generation Architecture

    Science.gov (United States)

    Dittberner, G. J.; Kalluri, S.; Hansen, D.; Weiner, A.; Tarpley, A.; Marley, S.

    2011-12-01

    The GOES-R system will substantially improve users' ability to succeed in their work by providing data with significantly enhanced instruments, higher resolution, much shorter relook times, and an increased number and diversity of products. The Product Generation architecture is designed to provide the computer and memory resources necessary to achieve the necessary latency and availability for these products. Over time, new and updated algorithms are expected to be added and old ones removed as science advances and new products are developed. The GOES-R GS architecture is being planned to maintain functionality so that when such changes are implemented, operational product generation will continue without interruption. The primary parts of the PG infrastructure are the Service Based Architecture (SBA) and the Data Fabric (DF). SBA is the middleware that encapsulates and manages science algorithms that generate products. It is divided into three parts, the Executive, which manages and configures the algorithm as a service, the Dispatcher, which provides data to the algorithm, and the Strategy, which determines when the algorithm can execute with the available data. SBA is a distributed architecture, with services connected to each other over a compute grid and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so a scalable and reliable messaging is necessary. The SBA uses the DF to provide this data communication layer between algorithms. The DF provides an abstract interface over a distributed and persistent multi-layered storage system (e.g., memory based caching above disk-based storage) and an event management system that allows event-driven algorithm services to know when instrument data are available and where they reside. Together, the SBA and the DF provide a

  13. Replica analysis for the duality of the portfolio optimization problem.

    Science.gov (United States)

    Shinzato, Takashi

    2016-11-01

    In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimization problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.
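    For a concrete instance of the primal problem described above (without the replica machinery), the equality-constrained quadratic minimisation can be solved directly from its KKT system. The covariance matrix, return vector, and constraint levels below are toy assumptions, not the paper's calculation:

```python
import numpy as np

# Minimise portfolio risk w^T C w subject to a budget constraint
# (sum of positions = N) and an expected-return constraint (r^T w = R).
rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n))
C = A @ A.T + n * np.eye(n)     # positive-definite covariance (toy)
r = rng.uniform(0.5, 1.5, n)    # expected returns (toy)
N, R = float(n), 1.2 * n        # budget and required expected return

# Stationarity plus constraints gives one linear (KKT) system:
# [ C   1   r ] [w ]   [0]
# [ 1^T 0   0 ] [l1] = [N]
# [ r^T 0   0 ] [l2]   [R]
ones = np.ones(n)
KKT = np.block([
    [C,             ones[:, None],    r[:, None]],
    [ones[None, :], np.zeros((1, 1)), np.zeros((1, 1))],
    [r[None, :],    np.zeros((1, 1)), np.zeros((1, 1))],
])
rhs = np.concatenate([np.zeros(n), [N], [R]])
sol = np.linalg.solve(KKT, rhs)
w = sol[:n]                      # optimal portfolio; sol[n:] are multipliers

print("optimal portfolio:", np.round(w, 3))
print("budget:", w.sum(), "return:", r @ w)
```

    The dual problem of the abstract (return maximisation under a risk budget) has the same KKT structure with the roles of objective and constraint exchanged, which is the primal-dual symmetry the paper analyses.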

  14. Replica analysis for the duality of the portfolio optimization problem

    Science.gov (United States)

    Shinzato, Takashi

    2016-11-01

    In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimization problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.

  15. Hadoop Oriented Smart Cities Architecture

    Science.gov (United States)

    Bologa, Ana-Ramona; Bologa, Razvan

    2018-01-01

    A smart city implies a consistent use of technology for the benefit of the community. As the city develops over time, components and subsystems such as smart grids, smart water management, smart traffic and transportation systems, smart waste management systems, smart security systems, or e-governance are added. These components ingest and generate a multitude of structured, semi-structured or unstructured data that may be processed using a variety of algorithms in batches, micro batches or in real-time. The ICT architecture must be able to handle the increased storage and processing needs. When vertical scaling is no longer a viable solution, Hadoop can offer efficient linear horizontal scaling, solving storage, processing, and data analyses problems in many ways. This enables architects and developers to choose a stack according to their needs and skill-levels. In this paper, we propose a Hadoop-based architectural stack that can provide the ICT backbone for efficiently managing a smart city. On the one hand, Hadoop, together with Spark and the plethora of NoSQL databases and accompanying Apache projects, is a mature ecosystem. This is one of the reasons why it is an attractive option for a Smart City architecture. On the other hand, it is also very dynamic; things can change very quickly, and many new frameworks, products and options continue to emerge as others decline. To construct an optimized, modern architecture, we discuss and compare various products and engines based on a process that takes into consideration how the products perform and scale, as well as the reusability of the code, innovations, features, and support and interest in online communities. PMID:29649172

  16. Hadoop Oriented Smart Cities Architecture

    Directory of Open Access Journals (Sweden)

    Vlad Diaconita

    2018-04-01

    Full Text Available A smart city implies a consistent use of technology for the benefit of the community. As the city develops over time, components and subsystems such as smart grids, smart water management, smart traffic and transportation systems, smart waste management systems, smart security systems, or e-governance are added. These components ingest and generate a multitude of structured, semi-structured or unstructured data that may be processed using a variety of algorithms in batches, micro batches or in real-time. The ICT architecture must be able to handle the increased storage and processing needs. When vertical scaling is no longer a viable solution, Hadoop can offer efficient linear horizontal scaling, solving storage, processing, and data analyses problems in many ways. This enables architects and developers to choose a stack according to their needs and skill-levels. In this paper, we propose a Hadoop-based architectural stack that can provide the ICT backbone for efficiently managing a smart city. On the one hand, Hadoop, together with Spark and the plethora of NoSQL databases and accompanying Apache projects, is a mature ecosystem. This is one of the reasons why it is an attractive option for a Smart City architecture. On the other hand, it is also very dynamic; things can change very quickly, and many new frameworks, products and options continue to emerge as others decline. To construct an optimized, modern architecture, we discuss and compare various products and engines based on a process that takes into consideration how the products perform and scale, as well as the reusability of the code, innovations, features, and support and interest in online communities.

  17. Hadoop Oriented Smart Cities Architecture.

    Science.gov (United States)

    Diaconita, Vlad; Bologa, Ana-Ramona; Bologa, Razvan

    2018-04-12

    A smart city implies a consistent use of technology for the benefit of the community. As the city develops over time, components and subsystems such as smart grids, smart water management, smart traffic and transportation systems, smart waste management systems, smart security systems, or e-governance are added. These components ingest and generate a multitude of structured, semi-structured or unstructured data that may be processed using a variety of algorithms in batches, micro batches or in real-time. The ICT architecture must be able to handle the increased storage and processing needs. When vertical scaling is no longer a viable solution, Hadoop can offer efficient linear horizontal scaling, solving storage, processing, and data analyses problems in many ways. This enables architects and developers to choose a stack according to their needs and skill-levels. In this paper, we propose a Hadoop-based architectural stack that can provide the ICT backbone for efficiently managing a smart city. On the one hand, Hadoop, together with Spark and the plethora of NoSQL databases and accompanying Apache projects, is a mature ecosystem. This is one of the reasons why it is an attractive option for a Smart City architecture. On the other hand, it is also very dynamic; things can change very quickly, and many new frameworks, products and options continue to emerge as others decline. To construct an optimized, modern architecture, we discuss and compare various products and engines based on a process that takes into consideration how the products perform and scale, as well as the reusability of the code, innovations, features, and support and interest in online communities.

  18. Rucio WebUI - The Web Interface for the ATLAS Distributed Data Management

    CERN Document Server

    Beermann, Thomas; The ATLAS collaboration; Barisits, Martin-Stefan; Serfon, Cedric; Garonne, Vincent

    2016-01-01

    With the current distributed data management system for ATLAS, called Rucio, all user interactions, e.g. the Rucio command line tools or the ATLAS workload management system, communicate with Rucio through the same REST-API. This common interface makes it possible to interact with Rucio using many different programming languages, including Javascript. Using common web application frameworks like JQuery and web.py, a web application for Rucio was built. The main component is R2D2 - the Rucio Rule Definition Droid - which gives users a simple way to manage their data on the grid. They can search for particular datasets, get details about their metadata and available replicas, and easily create rules to make new replicas and delete them when no longer needed. On the other hand, it is possible for site admins to restrict transfers to their site by setting quotas and manually approving transfers. Besides R2D2, additional features include transfer backlog monitoring for shifters, group space monitoring for gr...

  19. LTSA Conformance Testing to Architectural Design of LMS Using Ontology

    Science.gov (United States)

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…

  20. Program information architecture/document hierarchy

    International Nuclear Information System (INIS)

    Woods, T.W.

    1991-09-01

    The Nuclear Waste Management System (NWMS) Management Systems Improvement Strategy (MSIS) (DOE 1990) requires that the information within the computer program and information management system be ordered into a precedence hierarchy for consistency. Therefore, the US Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM), requested Westinghouse Hanford Company to develop a plan for NWMS program information, which the MSIS calls a document hierarchy. This report provides the results of that effort and describes the management system as a ''program information architecture.'' 3 refs., 3 figs

  1. Validation of the BUGJEFF311.BOLIB, BUGENDF70.BOLIB and BUGLE-B7 broad-group libraries on the PCA-Replica (H2O/Fe) neutron shielding benchmark experiment

    OpenAIRE

    Pescarini Massimo; Orsi Roberto; Frisoni Manuela

    2016-01-01

    The PCA-Replica 12/13 (H2O/Fe) neutron shielding benchmark experiment was analysed using the TORT-3.2 3D SN code. PCA-Replica reproduces a PWR ex-core radial geometry with alternate layers of water and steel including a pressure vessel simulator. Three broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format with the same energy group structure (47 n + 20 γ) and based on different nuclear data were alternatively used: the ENEA BUGJEFF311.BOLIB (JEFF-3.1.1) and U...

  2. An extensible database architecture for nationwide power quality monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Kuecuek, Dilek; Inan, Tolga; Salor, Oezguel; Demirci, Turan; Buhan, Serkan; Boyrazoglu, Burak [TUBITAK Uzay, Power Electronics Group, TR 06531 Ankara (Turkey); Akkaya, Yener; Uensar, Oezguer; Altintas, Erinc; Haliloglu, Burhan [Turkish Electricity Transmission Co. Inc., TR 06490 Ankara (Turkey); Cadirci, Isik [TUBITAK Uzay, Power Electronics Group, TR 06531 Ankara (Turkey); Hacettepe University, Electrical and Electronics Eng. Dept., TR 06532 Ankara (Turkey); Ermis, Muammer [METU, Electrical and Electronics Eng. Dept., TR 06531 Ankara (Turkey)

    2010-07-15

    Electrical power quality (PQ) data is one of the prevalent types of engineering data. Its measurement at relevant sampling rates leads to large volumes of PQ data to be managed and analyzed. In this paper, an extensible database architecture is presented based on a novel generic data model for PQ data. The proposed architecture is operated on the nationwide PQ data of the Turkish Electricity Transmission System measured in the field by mobile PQ monitoring systems. The architecture is extensible in the sense that it can be used to store and manage PQ data collected by any means with little or no customization. The architecture has three modules: a PQ database corresponding to the implementation of the generic data model, a visual user query interface to enable its users to specify queries to the PQ database and a query processor acting as a bridge between the query interface and the database. The operation of the architecture is illustrated on the field PQ data with several query examples through the visual query interface. The execution of the architecture on this data of considerable volume supports its applicability and convenience for PQ data. (author)
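    The "generic data model" idea above can be sketched as a narrow measurement table: each reading is a (site, timestamp, parameter, value) row, so storing a new PQ parameter type requires no schema change. The table and parameter names below are illustrative assumptions, not the paper's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE pq_measurement (
        site      TEXT NOT NULL,
        ts        TEXT NOT NULL,   -- ISO-8601 timestamp
        parameter TEXT NOT NULL,   -- e.g. 'voltage_thd', 'flicker_pst'
        value     REAL NOT NULL
    )""")

# A few field readings (values hypothetical).
rows = [
    ("ankara-1", "2010-01-01T00:00:00", "voltage_thd", 2.1),
    ("ankara-1", "2010-01-01T00:00:00", "flicker_pst", 0.8),
    ("ankara-1", "2010-01-01T00:10:00", "voltage_thd", 2.4),
]
con.executemany("INSERT INTO pq_measurement VALUES (?, ?, ?, ?)", rows)

# The kind of query a visual query interface might generate and a query
# processor would run: per-parameter averages for one site.
avg = con.execute("""
    SELECT parameter, AVG(value) FROM pq_measurement
    WHERE site = 'ankara-1' GROUP BY parameter ORDER BY parameter
""").fetchall()
print(avg)
```

    The trade-off of this row-per-reading layout is query verbosity versus extensibility: new monitoring hardware can feed the same table without any customization, which is the extensibility property the architecture claims.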

  3. Architectural design of experience based factory model for software ...

    African Journals Online (AJOL)

    architectural design. Automation features are incorporated in the design in which workflow system and intelligent agents are integrated, and the facilitation of cloud environment is empowered to further support the automation. Keywords: architectural design; knowledge management; experience factory; workflow;

  4. Flow field analysis in a compliant acinus replica model using particle image velocimetry (PIV).

    Science.gov (United States)

    Berg, Emily J; Weisman, Jessica L; Oldham, Michael J; Robinson, Risa J

    2010-04-19

    Inhaled particles reaching the alveolar walls have the potential to cross the blood-gas barrier and enter the blood stream. Experimental evidence of pulmonary dosimetry, however, cannot be explained by current whole lung dosimetry models. Numerical and experimental studies shed some light on the mechanisms of particle transport, but realistic geometries have not been investigated. In this study, a three dimensional expanding model including two generations of respiratory bronchioles and five terminal alveolar sacs was created from a replica human lung cast. Flow visualization techniques were employed to quantify the fluid flow while utilizing streamlines to evaluate recirculation. Pathlines were plotted to track the fluid motion and estimate penetration depth of inhaled air. This study provides evidence that the two generations immediately proximal to the terminal alveolar sacs do not have recirculating eddies, even for intense breathing. Results of Peclet number calculations indicate that substantial convective motion is present in vivo for the case of deep breathing, which significantly increases particle penetration into the alveoli. However, particle diffusion remains the dominant mechanism of particle transport over convection, even for intense breathing because inhaled particles do not reach the alveolar wall in a single breath by convection alone. Examination of the velocity fields revealed significant uneven ventilation of the alveoli during a single breath, likely due to variations in size and location. This flow field data, obtained from replica model geometry with realistic breathing conditions, provides information to better understand fluid and particle behavior in the acinus region of the lung. Copyright 2009 Elsevier Ltd. All rights reserved.
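    The Peclet number comparison mentioned above is a one-line calculation, Pe = uL/D, weighing convective against diffusive transport. The speed, length scale, and diffusion coefficient below are representative assumptions for illustration, not the paper's measured values:

```python
# Peclet number: ratio of convective to diffusive transport rates.
u = 0.01      # airflow speed in a respiratory bronchiole, m/s (assumed)
L = 0.5e-3    # characteristic duct diameter, m (assumed)
D = 5.4e-8    # Brownian diffusivity of a ~10 nm particle in air, m^2/s (assumed)

Pe = u * L / D   # dimensionless; Pe >> 1 favours convection, Pe << 1 diffusion
print(f"Pe = {Pe:.1f}")  # -> Pe = 92.6
```

    Whether convection or diffusion dominates in the acinus thus depends strongly on particle size and breathing intensity, which is why the study evaluates both regimes.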

  5. Replica symmetry breaking in short range spin glasses: A review of the theoretical foundations and of the numerical evidence

    International Nuclear Information System (INIS)

    Marinari, E.; Zuliani, F.; Parisi, G.; Ricci-Tersenghi, F.; Ruiz-Lorenzo, J.J.

    2000-04-01

We discuss Replica Symmetry Breaking (RSB) in spin glasses. We present an update on the state of the matter, from both the analytical and the numerical points of view. We pay particular attention to the difficulties stressed by Newman and Stein concerning the problem of constructing pure states in spin glass systems. We mainly discuss what happens in finite-dimensional, realistic spin glasses. Together with a detailed review of some of the most important features, facts, data and phenomena, we present some new theoretical ideas and numerical results. We discuss, among other things, the basic idea of the RSB theory, correlation functions, interfaces, overlaps, pure states, random fields and the dynamical approach. We present new numerical results on the behavior of coupled replicas and on the numerical verification of sum rules, and we review some of the available numerical results that we consider of greater importance (for example the determination of the phase transition point, the correlation functions, the window overlaps, and the dynamical behavior of the system). (author)
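The central order parameter of the RSB picture, the overlap between two replicas evolving with the same couplings, is simple to state. A minimal sketch, with random uncorrelated configurations standing in for properly equilibrated replicas:

```python
import random

def overlap(s_a, s_b):
    """Replica overlap q = (1/N) * sum_i s_i^a * s_i^b of two spin configurations."""
    return sum(x * y for x, y in zip(s_a, s_b)) / len(s_a)

def link_overlap(s_a, s_b, links):
    """Link overlap q_l averaged over a set of bonds <ij>."""
    return sum(s_a[i] * s_a[j] * s_b[i] * s_b[j] for i, j in links) / len(links)

random.seed(0)
N = 1000
links = [(i, i + 1) for i in range(N - 1)]        # bonds of a 1D chain
s_a = [random.choice((-1, 1)) for _ in range(N)]  # "replica a"
s_b = [random.choice((-1, 1)) for _ in range(N)]  # "replica b"

print(overlap(s_a, s_a))            # identical replicas: q = 1
print(round(overlap(s_a, s_b), 3))  # uncorrelated replicas: q close to 0
print(link_overlap(s_a, s_b, links))
```

In an actual simulation the histogram of q over many pairs of equilibrated replicas gives P(q), whose non-trivial structure is the numerical signature of RSB discussed in the review.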

  6. Supporting self-management of obesity using a novel game architecture.

    Science.gov (United States)

    Giabbanelli, Philippe J; Crutzen, Rik

    2015-09-01

Obesity has commonly been addressed using a 'one size fits all' approach centred on a combination of diet and exercise. This has not succeeded in halting the obesity epidemic, as two-thirds of American adults are now obese or overweight. Practitioners are increasingly highlighting that one's weight is shaped by myriad factors, suggesting that interventions should be tailored to the specific needs of individuals. Health games have the potential to provide such a tailored approach. However, they currently tend to focus on communicating and/or reinforcing knowledge in order to stimulate learning in the participants. We argue that it would be equally, if not more, valuable for games to learn from participants using recommender systems. This would allow treatments to be comprehensive, as games can deduce from a participant's behaviour which factors seem most relevant to his or her weight and focus on them. We introduce a novel game architecture and discuss its implications for facilitating the self-management of obesity. © The Author(s) 2014.

  7. Anti-stiction coating of PDMS moulds for rapid microchannel fabrication by double replica moulding

    DEFF Research Database (Denmark)

    Zhuang, Guisheng; Kutter, Jörg Peter

    2011-01-01

In this paper, we report a simple and precise method to rapidly replicate master structures for fast microchannel fabrication by double replica moulding of polydimethylsiloxane (PDMS). A PDMS mould was surface-treated by vapour phase deposition of 1H,1H,2H,2H-perfluorodecyltrichlorosilane (FDTS), which resulted in an anti-stiction layer for improved release after PDMS casting. The deposition of FDTS on an O2 plasma-activated surface of PDMS produced a reproducible and well-performing anti-stiction monolayer of fluorocarbon, and we used the FDTS-coated moulds as micro-masters for rapid...

  8. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  9. Evaluating the Use of Synthetic Replicas for SEM Identification of Bloodstains (with Emphasis on Archaeological and Ethnographic Artifacts).

    Science.gov (United States)

    Hortolà, Policarp

    2015-12-01

    Some archaeological or ethnographic specimens are unavailable for direct examination using a scanning electron microscope (SEM) due to methodological obstacles or legal issues. In order to assess the feasibility of using SEM synthetic replicas for the identification of bloodstains (BSs) via morphology of red blood cells (RBCs), three fragments of different natural raw material (inorganic, stone; plant, wood; animal, shell) were smeared with peripheral human blood. Afterwards, molds and casts of the bloodstained areas were made using vinyl polysiloxane (VPS) silicone impression and polyurethane (PU) resin casting material, respectively. Then, the original samples and the resulting casts were coated with gold and examined in secondary-electron mode using a high-vacuum SEM. Results suggest that PU resin casts obtained from VPS silicone molds can preserve RBC morphology in BSs, and consequently that synthetic replicas are feasible for SEM identification of BSs on cultural heritage specimens made of natural raw materials. Although the focus of this study was on BSs, the method reported in this paper may be applicable to organic residues other than blood, as well as to the surface of other specimens when, for any reason, the original is unavailable for an SEM.

  10. Hybrid Three-Phase/Single-Phase Microgrid Architecture with Power Management Capabilities

    DEFF Research Database (Denmark)

    Sun, Qiuye; Zhou, Jianguo; Guerrero, Josep M.

    2015-01-01

With the fast proliferation of single-phase distributed generation (DG) units and loads integrated into residential microgrids, independent power sharing per phase and full use of the energy generated by DGs have become crucial. To address these issues, this paper proposes a hybrid microgrid architecture and its power management strategy. In this microgrid structure, a power sharing unit (PSU), composed of three single-phase back-to-back (SPBTB) converters, is proposed to be installed at the point of common coupling (PCC). The aim of the PSU is mainly to realize the power exchange and coordinated control of load power sharing among phases, as well as to allow full utilization of the energy generated by DGs. Meanwhile, the method combining the modified adaptive backstepping-sliding mode control approach and droop control is also proposed to design the SPBTB system controllers. With the application...
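The droop side of the proposed controller combination follows the conventional P-f/Q-V characteristic, under which parallel converters share load without explicit communication. A minimal sketch with invented gains and setpoints (the backstepping-sliding mode part is well beyond a few lines):

```python
def droop(P, Q, f_nom=50.0, V_nom=230.0, m=1e-4, n=1e-3, P0=0.0, Q0=0.0):
    """Conventional P-f / Q-V droop: the inverter lowers its frequency and
    voltage setpoints as active/reactive output rises, so parallel units
    naturally share load. Gains m, n are illustrative, not from the paper."""
    f = f_nom - m * (P - P0)
    V = V_nom - n * (Q - Q0)
    return f, V

print(droop(0.0, 0.0))        # no load: nominal 50 Hz, 230 V
print(droop(5000.0, 1000.0))  # loaded unit droops below nominal
```

Because every unit follows the same characteristic, a unit drawing more active power settles at a slightly lower frequency, which is the signal its neighbours use to pick up their share.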

  11. Trust-based information system architecture for personal wellness.

    Science.gov (United States)

    Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd

    2014-01-01

Modern eHealth, ubiquitous health and personal wellness systems operate in an unsecure and ubiquitous information space where no predefined trust exists. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in a ubiquitous environment. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider, and to use it to design personal privacy policies for the trustworthy use of health and wellness services. For trust calculation a novel set of measurable, context-aware and health information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service provider-specific policies. Focus groups and information modelling were used to develop a wellness information model. A system analysis method based on sequential steps, which makes it possible to combine the results of the analysis of privacy and trust concerns with the selection of trust and privacy services, was used to develop the information system architecture. Its services (e.g. trust calculation, decision support, policy management and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.
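One plausible shape for such a trust calculation is a weighted mean over per-provider attribute scores, with the result bound to a privacy policy. The attribute names, weights and threshold below are invented for illustration; they are not the measurable attribute set developed in the paper.

```python
def trust_value(scores, weights):
    """Dynamic trust value in [0, 1] as a weighted mean of attribute scores.
    Attribute names and weights are illustrative assumptions only."""
    total = sum(weights[k] for k in scores)
    return sum(weights[k] * v for k, v in scores.items()) / total

provider_scores = {
    "data_protection": 0.9,      # observed handling of personal data
    "transparency": 0.6,         # clarity of the provider's policies
    "reputation": 0.8,           # community feedback
    "context_sensitivity": 0.7,  # how well the service adapts to the situation
}
weights = {"data_protection": 0.4, "transparency": 0.2,
           "reputation": 0.2, "context_sensitivity": 0.2}

t = trust_value(provider_scores, weights)
# A personal policy then binds the computed trust to an action:
decision = "disclose wellness data" if t >= 0.75 else "withhold"
print(round(t, 2), decision)
```

Context awareness would enter by recomputing the scores (and possibly the weights) whenever the situation changes, so the same provider can earn different trust values in different contexts.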

  12. Meme media and meme market architectures knowledge media for editing distributing and managing intellectual resources

    CERN Document Server

    Tanaka, Y

    2003-01-01

    "In this book, Yuzuru Tanaka proposes a powerful new paradigm: that knowledge media, or "memes," operate in a way that closely resembles the biological function of genes, with their network publishing repository working as a gene pool to accelerate the evolution of knowledge shared in our societies. In Meme Media and Meme Market Architectures: Knowledge Media for Editing, Distributing, and Managing Intellectual Resources, Tanaka outlines a ready-to-use knowledge media system, supplemented with sample media objects, which allows readers to experience the knowledge media paradigm."--Jacket.

  13. The Membranes of the Basal Labyrinth in Kidney Cells of the Stickleback, Gasterosteus aculeatus, Studied in Ultrathin Sections and Freeze-Etch Replicas

    NARCIS (Netherlands)

    Wendelaar Bonga, S.E.; Veenhuis, M.

    1974-01-01

    The structure of the basal labyrinth in kidney cells of freshwater sticklebacks was studied in ultrathin sections (after fixation with permanganate, osmium tetroxide, and combinations of glutaraldehyde with osmium tetroxide) and in freeze-etch replicas (after pretreatment with glutaraldehyde and/or

  14. Emerging trends in the evolution of service-oriented and enterprise architectures

    CERN Document Server

    Zimmermann, Alfred; Jain, Lakhmi

    2016-01-01

    This book presents emerging trends in the evolution of service-oriented and enterprise architectures. New architectures and methods of both business and IT are integrating services to support mobility systems, Internet of Things, Ubiquitous Computing, collaborative and adaptive business processes, Big Data, and Cloud ecosystems. They inspire current and future digital strategies and create new opportunities for the digital transformation of next digital products and services. Services Oriented Architectures (SOA) and Enterprise Architectures (EA) have emerged as a useful framework for developing interoperable, large-scale systems, typically implementing various standards, like Web Services, REST, and Microservices. Managing the adaptation and evolution of such systems presents a great challenge. Service-Oriented Architecture enables flexibility through loose coupling, both between the services themselves and between the IT organizations that manage them. Enterprises evolve continuously by transforming and ext...

  15. Communications technologies for demand side management, DSM, and European utility communications architecture, EurUCA

    Energy Technology Data Exchange (ETDEWEB)

Uuspaeae, P. [VTT Energy, Espoo (Finland)]

    1996-12-31

    The scope of this research is data communications for electric utilities. Demand Side Management (DSM) calls for communication between the Electric Utility and the Customer. The communication capacity needed will depend on the functions that are chosen for DSM, and on the number of customers. Some functions may be handled with one-way communications, some functions require two-way communication. Utility Communication Architecture looks for an overall view of the communications needs and communication systems in an electric utility. The objective is to define and specify suitable and compatible communications procedures within the Utility and also to outside parties. (27 refs.)

  17. The Measurement and Modeling of a P2P Streaming Video Service

    Science.gov (United States)

    Gao, Peng; Liu, Tao; Chen, Yanming; Wu, Xingyao; El-Khatib, Yehia; Edwards, Christopher

Most of the work on grid technology in the video area has generally been restricted to aspects of resource scheduling and replica management. The traffic of such services has many characteristics in common with that of traditional video services. However, the architecture and user behavior in grid networks are quite different from those of the traditional Internet. Considering the potential of grid networks and video sharing services, measuring and analyzing P2P IPTV traffic are important and fundamental tasks in the field of grid networks.

  18. The architectural evaluation of buildings’ indices in explosion crisis management

    Directory of Open Access Journals (Sweden)

    Mahdi Bitarafan

    2016-12-01

Full Text Available Identifying probable damage plays an important role in preparing to withstand and resist the negative effects of military attacks on urban areas. The ultimate goal of this study was to identify facilities and solutions for immunizing buildings against military attacks and resisting explosion effects. An explosion and the blast waves it produces, caused by bombardment, damage buildings and create serious difficulties. Indices are therefore needed to identify the architectural vulnerability of buildings to explosions. The basic indices for evaluating blast-resistant architectural spaces were identified in this study using library resources. The proposed indices were refined through interviews with architectural and explosives experts. The study also applied a group decision-making method based on a pairwise comparison model, and the necessity degree of each index was calculated. Finally, the preferences and final weights of the indices were determined.
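Pairwise-comparison weighting of the kind the study applies can be sketched with the geometric-mean approximation familiar from AHP-style group decision making. The 3x3 matrix below is an invented example, not the study's actual indices or judgments:

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix using the
    geometric-mean (row) approximation common in AHP-style methods."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Invented 3-index example: index 0 judged 3x as important as index 1
# and 5x as important as index 2; reciprocals fill the lower triangle.
M = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
print([round(w, 3) for w in ahp_weights(M)])
```

Group judgments are typically aggregated by taking the element-wise geometric mean of the experts' individual comparison matrices before computing the weights.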

  19. Incorporating enterprise strategic plans into enterprise architecture

    NARCIS (Netherlands)

    Lins Borges Azevedo, Carlos

    2017-01-01

    In the last years, information technology (IT) executives have identified IT-business strategic alignment as a top management concern. In the information technology area, emphasis has been given to the Enterprise Architecture (EA) discipline with respect to enterprise management. The focus of the

  20. Study on Information Management for the Conservation of Traditional Chinese Architectural Heritage - 3d Modelling and Metadata Representation

    Science.gov (United States)

    Yen, Y. N.; Weng, K. H.; Huang, H. Y.

    2013-07-01

After over 30 years of practice and development, Taiwan's architectural conservation field is moving rapidly into digitalization and its applications. Compared to modern buildings, traditional Chinese architecture has considerably more complex elements and forms. Documenting and digitizing these unique heritages over their conservation lifecycle is a new and important issue. This article takes the caisson ceiling of the Taipei Confucius Temple, octagonal with 333 elements in 8 types, as a case study in digitization practice. The application of metadata representation and 3D modelling are the two key issues discussed. Both Revit and SketchUp were applied in this research to compare their effectiveness for metadata representation. Due to limitations of the Revit database, the final 3D models were built with SketchUp. The research found that, firstly, cultural heritage databases must convey that while many elements are similar in appearance, they are unique in value; although 3D simulations help the general understanding of architectural heritage, software such as Revit and SketchUp can, at this stage, only be used to model basic visual representations, and is ineffective in documenting additional critical data about individually unique elements. Secondly, when establishing conservation lifecycle information for application in management systems, a full and detailed presentation of the metadata must also be implemented; the existing applications of BIM in managing conservation lifecycles are still insufficient. The research recommends SketchUp as a tool for present modelling needs, and BIM for sharing data between users, but the implementation of metadata representation is of the utmost importance.

  1. A Geo-Distributed System Architecture for Different Domains

    Science.gov (United States)

    Moßgraber, Jürgen; Middleton, Stuart; Tao, Ran

    2013-04-01

The presentation will describe work on the system-of-systems (SoS) architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". In this project we deal with two use-cases: Natural Crisis Management (e.g. Tsunami Early Warning) and Industrial Subsurface Development (e.g. drilling for oil). These use-cases seem quite different at first sight but share a lot of similarities, like managing and looking up available sensors, extracting data from them and annotating it semantically, intelligently managing the data (a big data problem), running mathematical analysis algorithms on the data and finally providing decision support on this basis. The main challenge was to create a generic architecture which fits both use-cases. The requirements on the architecture are manifold and the whole spectrum of a modern, geo-distributed and collaborative system comes into play. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. The most important architectural challenges we needed to address are: 1. Building a scalable communication layer for a system-of-systems; 2. Building a resilient communication layer for a system-of-systems; 3. Efficiently publishing large volumes of semantically rich sensor data; 4. Scalable and high-performance storage of large distributed datasets; 5. Handling federated multi-domain heterogeneous data; 6. Discovery of resources in a geo-distributed SoS; 7. Coordination of work between geo-distributed systems. The design decisions made for each of them will be presented. These developed concepts are also applicable to the requirements of the Future Internet (FI) and Internet of Things (IoT) which will provide services like smart grids, smart metering, logistics and

  2. Real-time collaboration in activity-based architectures

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak

    2004-01-01

With the growing research into mobile and ubiquitous computing, there is a need for addressing how such infrastructures can support collaboration between nomadic users. We present the activity-based computing paradigm and outline a proposal for handling collaboration in an activity-based architecture. We argue that activity-based computing establishes a natural and sound conceptual and architectural basis for session management in real-time, synchronous collaboration.

  3. Multilevel and Hybrid Architecture for Device Abstraction and Context Information Management in Smart Home Environments

    Science.gov (United States)

    Peláez, Víctor; González, Roberto; San Martín, Luis Ángel; Campos, Antonio; Lobato, Vanesa

    Hardware device management, and context information acquisition and abstraction are key factors to develop the ambient intelligent paradigm in smart homes. This work presents an architecture that addresses these two problems and provides a usable framework to develop applications easily. In contrast to other proposals, this work addresses performance issues specifically. Results show that the execution performance of the developed prototype is suitable for deployment in a real environment. In addition, the modular design of the system allows the user to develop applications using different techniques and different levels of abstraction.

  4. Security in the Cache and Forward Architecture for the Next Generation Internet

    Science.gov (United States)

    Hadjichristofi, G. C.; Hadjicostis, C. N.; Raychaudhuri, D.

The future Internet architecture will be composed predominantly of wireless devices. It is evident at this stage that the TCP/IP protocol that was developed decades ago will not properly support the required network functionalities, since contemporary communication profiles tend to be data-driven rather than host-based. To address this paradigm shift in data propagation, a next-generation architecture has been proposed, the Cache and Forward (CNF) architecture. This research investigates security aspects of this new Internet architecture. More specifically, we discuss content privacy, secure routing, key management and trust management. We identify security weaknesses of this architecture that need to be addressed and we derive security requirements that should guide future research directions. Aspects of the research can be adopted as a stepping-stone as we build the future Internet.

  5. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    Science.gov (United States)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be facilitated for other plate observing systems (e.g., the European Plate

  6. The architecture and prototype implementation of the Model Environment system

    Science.gov (United States)

    Donchyts, G.; Treebushny, D.; Primachenko, A.; Shlyahtun, N.; Zheleznyak, M.

    2007-01-01

An approach that simplifies the software development of model-based decision support systems for environmental management has been introduced. The approach is based on the definition and management of metadata and data related to a computational model without losing data semantics, and on proposed methods for integrating new modules into the information system and managing them. An architecture of the integrated modelling system is presented. The proposed architecture has been implemented as a prototype of the integrated modelling system using .NET/Gtk# and is currently being used to re-design the European Decision Support System for Nuclear Emergency Management RODOS (http://www.rodos.fzk.de) using Java/Swing.

  7. Autotransplantation of premolars with a 3-dimensional printed titanium replica of the donor tooth functioning as a surgical guide: proof of concept

    NARCIS (Netherlands)

    Verweij, J.P.; Moin, D.A.; Mensink, G.; Nijkamp, P.; Wismeijer, D.; van Merkesteyn, J.P.R.

    2016-01-01

    Purpose: Autotransplantation of premolars is a good treatment option for young patients who have missing teeth. This study evaluated the use of a preoperatively 3-dimensional (3D)-printed replica of the donor tooth that functions as a surgical guide during autotransplantation. Materials and Methods:

  8. Study on Global GIS architecture and its key technologies

    Science.gov (United States)

    Cheng, Chengqi; Guan, Li; Lv, Xuefeng

    2010-11-01

Global GIS (G2IS) is a system which supports huge data processing and direct global manipulation on a global grid based on a spheroid or ellipsoid surface. Based on the global subdivision grid (GSG), a Global GIS architecture is presented in this paper, taking advantage of computer cluster theory, space-time integration technology and virtual reality technology. The Global GIS system architecture is composed of five layers: the data storage layer, data representation layer, network and cluster layer, data management layer and data application layer. Within this architecture, a four-level protocol framework and a three-layer data management pattern are designed for the organization, management and publication of spatial information in the Global GIS. Three kinds of core supporting technologies (computer cluster theory, space-time integration technology and virtual reality technology) and their application patterns in the Global GIS are introduced in detail. The primary ideas of the Global GIS presented in this paper represent an important development trend for GIS.
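The idea of a hierarchical global subdivision grid can be illustrated with a toy quadtree encoding of the lon-lat rectangle. This is an invented scheme for illustration only, not the specific GSG encoding the paper builds on:

```python
def cell_code(lat, lon, depth):
    """Hierarchical code of the grid cell containing (lat, lon) after `depth`
    recursive quad subdivisions of the lon-lat rectangle. Each digit encodes
    one subdivision step: +2 for the upper latitude half, +1 for the eastern
    longitude half. (Illustrative scheme, not the paper's GSG encoding.)"""
    lat0, lat1, lon0, lon1 = -90.0, 90.0, -180.0, 180.0
    code = ""
    for _ in range(depth):
        mid_lat, mid_lon = (lat0 + lat1) / 2, (lon0 + lon1) / 2
        q = (2 if lat >= mid_lat else 0) + (1 if lon >= mid_lon else 0)
        code += str(q)
        lat0, lat1 = (mid_lat, lat1) if lat >= mid_lat else (lat0, mid_lat)
        lon0, lon1 = (mid_lon, lon1) if lon >= mid_lon else (lon0, mid_lon)
    return code

print(cell_code(39.9, 116.4, 4))  # a point near Beijing
```

The useful property for data management is that a cell's code is a prefix of all its descendants' codes, so spatial containment reduces to string-prefix tests that index and cluster well.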

  9. CIM overview. (2). ; Architecture, infrastructure, information technology. CIM soron. (2). ; Architecture, infra, joho riyo shien gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Seryo, K [IBM Japan Ltd., Tokyo (Japan)

    1991-12-10

The materialization of computer integrated manufacturing (CIM) requires the establishment of its systematizing framework, i.e., an architecture, the planning of the infrastructure to support it, and the basic engineering to support informational utilization. The CIM architecture covers management and system structure, strategic planning methods, system development and introduction methods, etc. The infrastructure aims at dramatically raising productivity and speed by integrating production planning, engineering design, accounting, sales, general business affairs, production engineering, production activities, and the activities of suppliers and clients. The informational utilization support engineering comprises management support tools, decision making support tools, application development tools, etc. What is important is to establish a system of systematizing engineering in order not to fall behind in the coming era of strategic use of information. 17 refs., 4 figs., 1 tab.

  10. Dynamic tables: an architecture for managing evolving, heterogeneous biomedical data in relational database management systems.

    Science.gov (United States)

    Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
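The vertical object-attribute-value design the paper starts from is easy to sketch in SQL via Python's sqlite3; the schema and the clinical attributes below are invented for illustration:

```python
import sqlite3

# A minimal entity-attribute-value (EAV, "vertical") schema of the kind the
# paper contrasts with conventional wide tables; attributes are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE eav (entity INTEGER, attribute TEXT, value TEXT)")
con.executemany("INSERT INTO eav VALUES (?, ?, ?)", [
    (1, "diagnosis", "hypertension"),
    (1, "systolic_bp", "150"),
    (2, "genotype", "APOE-e4"),  # entity 2 carries entirely different attributes
])

# Pivoting an entity back into a conventional row needs one aggregate per
# attribute -- the query overhead a sparse column-store aims to avoid.
pivoted = con.execute(
    "SELECT MAX(CASE WHEN attribute = 'diagnosis' THEN value END), "
    "       MAX(CASE WHEN attribute = 'systolic_bp' THEN value END) "
    "FROM eav WHERE entity = 1").fetchone()
print(pivoted)
```

The flexibility is visible in the data: new attributes need no schema change, which is exactly why sparse, evolving biomedical data gravitates toward this design despite the pivoting cost.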

  11. Popularity Prediction Tool for ATLAS Distributed Data Management

    Science.gov (United States)

    Beermann, T.; Maettig, P.; Stewart, G.; Lassnig, M.; Garonne, V.; Barisits, M.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near-term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access of data for analysis). To evaluate the benefit of the redistribution a grid simulator is introduced that is able to replay real workloads on different data distributions. This article describes the popularity prediction method and the simulator that is used to evaluate the redistribution.
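The two-step scheme can be sketched in a few lines. The naive linear-trend forecast below stands in for the paper's neural-network predictor (same kind of input, much simpler model), and the replica-count rule and its thresholds are likewise invented illustrations of the redistribution step:

```python
def predict_accesses(history, window=3):
    """Forecast next-period dataset accesses by extrapolating the recent
    linear trend. A toy stand-in for the paper's neural-network predictor."""
    recent = history[-window:]
    if len(recent) < 2:
        return float(recent[0]) if recent else 0.0
    slope = (recent[-1] - recent[0]) / (len(recent) - 1)
    return max(0.0, recent[-1] + slope)

def replica_delta(predicted, current_replicas, hot=100.0, cold=10.0):
    """Invented redistribution rule: add a replica for hot data, and trim a
    replica for cold data (the real decision also weighs bandwidth and
    storage-load costs against the access-speed gain)."""
    if predicted > hot:
        return +1
    if predicted < cold and current_replicas > 1:
        return -1
    return 0

cooling = [120, 150, 90, 60, 30]  # weekly access counts of a cooling dataset
p = predict_accesses(cooling)
print(p, replica_delta(p, current_replicas=3))
```

A simulator then replays a recorded workload against both the old and the proposed replica placement to check that the predicted savings actually materialize.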

  13. HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains

    Science.gov (United States)

    Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro

    The Service-Oriented Architecture (SOA) development paradigm has emerged to improve the critical issues of creating, modifying and extending solutions for business processes integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows the SOA's principles for developing and deploying applications. Besides, Web services are considered as the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, which are the main features of supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business processes pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and reliable messaging service.
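
The event publish/subscribe mechanism that an EDA layer adds on top of SOA services can be sketched in a few lines. This is a generic illustration; the class and topic names are invented here and are not HYDRA's actual API.

```python
# Minimal publish/subscribe event bus of the kind an event-driven
# architecture layers over request/response SOA services.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)    # topic -> list of handlers

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the event to every handler registered for the topic.
        for handler in self._subs[topic]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order.created", received.append)   # e.g. a procurement service
bus.publish("order.created", {"sku": "A-42", "qty": 3})
```

In a supply-chain setting, services such as monitoring or reliable messaging would subscribe to business-process events in exactly this decoupled fashion.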

  14. 2005 dossier: granite. Tome: architecture and management of the geologic disposal

    International Nuclear Information System (INIS)

    2005-01-01

    This document makes a status of the researches carried out by the French national agency of radioactive wastes (ANDRA) about the geologic disposal of high-level and long-lived radioactive wastes in granite formations. Content: 1 - Approach of the study: main steps since the December 30, 1991 law, ANDRA's research program on disposal in granitic formations; 2 - high-level and long-lived (HLLL) wastes: production scenarios, waste categories, inventory model; 3 - disposal facility design in granitic environment: definition of the geologic disposal functions, the granitic material, general facility design options; 4 - general architecture of a disposal facility in granitic environment: surface facilities, underground facilities, disposal process, operational safety; 5 - B-type wastes disposal area: primary containers of B-type wastes, safety options, concrete containers, disposal alveoles, architecture of the B-type wastes disposal area, disposal process and feasibility aspects, functions of disposal components with time; 6 - C-type wastes disposal area: C-type wastes primary containers, safety options, super-containers, disposal alveoles, architecture of the C-type wastes disposal area, disposal process in a reversibility logics, functions of disposal components with time; 7 - spent fuels disposal area: spent fuel assemblies, safety options, spent fuel containers, disposal alveoles, architecture of the spent fuel disposal area, disposal process in a reversibility logics, functions of disposal components with time; 8 - conclusions: suitability of the architecture with various types of French granites, strong design, reversibility taken into consideration. (J.S.)

  15. A Hybrid Three Layer Architecture for Fire Agent Management in Rescue Simulation Environment

    Directory of Open Access Journals (Sweden)

    Alborz Geramifard

    2005-06-01

    Full Text Available This paper presents a new architecture called FAIS for implementing intelligent agents that cooperate in a special multi-agent environment, namely the RoboCup Rescue Simulation System. It is a layered architecture customized for solving the fire-extinguishing problem. Structural decision-making algorithms are combined with heuristic ones in this model, so it is a hybrid architecture.

  16. Secure Service Oriented Architectures (SOA) Supporting NEC [Architecture orientée service (SOA) gérant la NEC

    NARCIS (Netherlands)

    Meiler, P.P.; Schmeing, M.

    2009-01-01

    Combined scenario ; Data management ; Data processing ; Demonstrator ; Information systems ; Integrated systems ; Interoperability ; Joint scenario ; Network Enabled Capability (NEC) ; Operational effectiveness ; Operations research ; Scenarios ; Secure communication ; Service Oriented Architecture

  17. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  18. A Review of Enterprise Architecture Use in Defence

    Science.gov (United States)

    2014-09-01

    dictionary of terms; architecture description language; architectural information (pertaining both to specific projects and higher level …); ANSI/NISO Z39.19-2005, Guidelines for the Construction, Format, and Management of Monolingual Controlled Vocabularies, Bethesda: NISO Press, 2005; BABOK 2009 …

  19. A Hybrid Three Layer Architecture for Fire Agent Management in Rescue Simulation Environment

    Directory of Open Access Journals (Sweden)

    Alborz Geramifard

    2008-11-01

    Full Text Available This paper presents a new architecture called FAIS for implementing intelligent agents that cooperate in a special multi-agent environment, namely the RoboCup Rescue Simulation System. It is a layered architecture customized for solving the fire-extinguishing problem. Structural decision-making algorithms are combined with heuristic ones in this model, so it is a hybrid architecture.

  20. An Ingenious Super Light Trapping Surface Templated from Butterfly Wing Scales

    Science.gov (United States)

    Han, Zhiwu; Li, Bo; Mu, Zhengzhi; Yang, Meng; Niu, Shichao; Zhang, Junqiu; Ren, Luquan

    2015-08-01

    Based on the super light trapping property of butterfly Trogonoptera brookiana wings, a SiO2 replica of this bionic functional surface was successfully synthesized using a simple and highly effective method combining a sol-gel process with subsequent selective etching. First, the reflectivity of the butterfly wing scales was carefully examined. The reflectance spectrum of the butterfly wings stayed at a low level (less than 10%) across the visible range, confirming that the wings possess a super light trapping effect. Afterwards, the morphologies and detailed architectures of the wing scales were investigated using an ultra-depth three-dimensional (3D) microscope and field emission scanning electron microscopy (FESEM). The scales are composed of parallel ridges with a quasi-honeycomb-like structure between them. Based on these biological properties and functions, an exact SiO2 negative replica was fabricated through the same sol-gel and selective-etching route. Finally, a comparative analysis of morphological feature sizes and reflectance spectra between the SiO2 negative replica and a flat plate was conducted. It could be concluded that the SiO2 negative replica inherited not only the original super light trapping architectures but also the super light trapping characteristics of the bio-template. This work may open up an avenue for the design and fabrication of super light trapping materials and encourage the search for more super light trapping architectures in nature.

  1. A blockchain-based architecture for asset management in coalition operations

    Science.gov (United States)

    Verma, Dinesh; Desai, Nirmit; Preece, Alun; Taylor, Ian

    2017-05-01

    To support dynamic communities of interests in coalition operations, new architectures for efficient sharing of ISR assets are needed. The use of blockchain technology in wired business environments, such as digital currency systems, offers an interesting solution by creating a way to maintain a distributed shared ledger without requiring a single trusted authority. In this paper, we discuss how a blockchain-based system can be modified to provide a solution for dynamic asset sharing amongst coalition members, enabling the creation of a logically centralized asset management system by a seamless policy-compliant federation of different coalition systems. We discuss the use of blockchain for three different types of assets in a coalition context, showing how blockchain can offer a suitable solution for sharing assets in those environments. We also discuss the limitations in the current implementations of blockchain which need to be overcome for the technology to become more effective in a decentralized tactical edge environment.
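
The core property the paper leans on, a shared ledger that is tamper-evident without a single trusted authority, can be shown with a minimal hash-chained ledger. This is a drastic simplification of a real blockchain (no consensus, no distribution); the record fields are invented for illustration.

```python
# Minimal hash-chained ledger: each block commits to the previous block's
# hash, so any edit to an earlier record invalidates every later hash.

import hashlib
import json

def add_block(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"asset": "UAV-7", "lease_to": "coalition-B"})    # illustrative
add_block(chain, {"asset": "sensor-3", "lease_to": "coalition-A"})
```

In the coalition setting, each member would hold a copy of such a chain, and policy-compliant federation would govern which asset transactions may be appended.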

  2. Enhancing dry adhesives and replica molding with ethyl cyano-acrylate

    International Nuclear Information System (INIS)

    Bovero, E; Menon, C

    2014-01-01

    The use of cyano-acrylate to improve the performance of dry adhesives and their method of fabrication is investigated. Specifically, the contributions of this work are: (1) a new adhesion method for adhering to a large variety of surfaces, (2) a strategy to increase the compliance of dry adhesives, and (3) an improved fabrication process for micro-structured dry adhesives based on replica molding. For the first contribution, the adhesion method consists of anchoring a micro-structured dry adhesive to a surface through a layer of hardened ethyl cyano-acrylate (ECA). This method increases adhesion by orders of magnitude at the expense of leaving residue after detachment; the adhesive nevertheless remains reusable. For the second contribution, a double-sided dry adhesive is obtained by introducing a substrate with a millimeter-sized pillar structure, which enables a further increase in adhesion. For the third contribution, an ECA layer is used as a mold for the fabrication of new adhesives. These molds proved able to produce dry adhesives with high reproducibility and low degradation. (paper)

  3. Big data in cloud : a data architecture

    OpenAIRE

    Sá, Jorge Vaz de Oliveira e; Martins, César Silva; Simões, Paulo

    2015-01-01

    Nowadays, organizations have at their disposal a large volume of data with a wide variety of types. Technology-driven organizations want to capture, process and analyze this data at a fast velocity, in order to better understand and manage their customers, their operations and their business processes. As data volume and variety increase and faster analytic results are needed, the demands on the data architecture grow. This data architecture should enable collecting,...

  4. Multicore technology architecture, reconfiguration, and modeling

    CERN Document Server

    Qadri, Muhammad Yasir

    2013-01-01

    The saturation of design complexity and clock frequencies for single-core processors has resulted in the emergence of multicore architectures as an alternative design paradigm. Nowadays, multicore/multithreaded computing systems are not only a de-facto standard for high-end applications, they are also gaining popularity in the field of embedded computing. The start of the multicore era has altered the concepts relating to almost all of the areas of computer architecture design, including core design, memory management, thread scheduling, application support, inter-processor communication, debu

  5. MANAGEMENT PRACTICES AND INFLUENCES ON IT ARCHITECTURE DECISIONS: A CASE STUDY IN A TELECOM COMPANY

    Directory of Open Access Journals (Sweden)

    Chen Wen Hsing

    2012-12-01

    Full Text Available The study aims to analyze IT architecture management practices, their degree of maturity, and the influence of institutional and strategic factors on the decisions involved, through a case study in a large telecom organization. The case study allowed us to identify the practices that led the company to its current stage of maturity and the practices that can lead it to the next stage. Strategic influence was mentioned by most respondents, while institutional influence was present in decisions related to innovation and those dealing with a higher level of uncertainty.

  6. A Tokenization-Based Communication Architecture for HCE-Enabled NFC Services

    Directory of Open Access Journals (Sweden)

    Busra Ozdenizci

    2016-01-01

    Full Text Available Following the announcement of Host Card Emulation (HCE) technology, card emulation mode based Near Field Communication (NFC) services have gained further appreciation as an enabler of the Cloud-based Secure Element (SE) concept. A comprehensive and complete architecture with a centralized and feasible business model for diverse HCE-based NFC services would be highly appreciated, particularly by Service Providers and users. To satisfy the need in this new emerging research area, a Tokenization-based communication architecture for HCE-based NFC services is presented in this paper. Our architecture proposes Two-Phased Tokenization to enable the identity management of both user and Service Provider. NFC smartphone users can store, manage, and make use of their sensitive data on the Cloud for NFC services; Service Providers can also provide diverse card emulation NFC services easily through the proposed architecture. In this paper, we initially present the Two-Phased Tokenization model and then validate the proposed architecture by providing a case study on access control. We further evaluate the usability aspect in terms of an authentication scheme. We then discuss the ecosystem and business model comprised of the proposed architecture and emphasize the contributions to ecosystem actors. Finally, suggestions are provided for data protection in transit and at rest.
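
The basic tokenization idea, replacing sensitive data with an opaque surrogate and keeping the mapping server-side, can be sketched briefly. This is far simpler than the paper's two-phased, HCE-aware architecture; the class and method names are illustrative assumptions.

```python
# Toy token vault: sensitive data (e.g. a card number) never leaves the
# vault; callers hold only a random, meaningless token.

import secrets

class TokenVault:
    def __init__(self):
        self._vault = {}                  # token -> sensitive value

    def tokenize(self, pan):
        token = secrets.token_hex(16)     # opaque 128-bit surrogate
        self._vault[token] = pan
        return token

    def detokenize(self, token):
        return self._vault.get(token)     # None for unknown tokens

vault = TokenVault()
tok = vault.tokenize("4111111111111111")  # test card number, illustrative
```

In the proposed architecture this mapping would live in the cloud-based SE, with a second tokenization phase covering the Service Provider's identity.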

  7. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer provide a coherent architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications, and that can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge of radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can reasonably be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This work potentially opens a new direction in image information management. We illustrate the innovative design with examples taken from an implemented system.

  8. Obtaining porous alumina-zirconia ceramics coated with calcium phosphate by the replica method

    International Nuclear Information System (INIS)

    Silva, A.D.R.; Rigoli, W.R.; Osiro, Denise; Pallone, E.M.J.A.

    2016-01-01

    Biomaterials used in bone replacement, including porous bioceramics, are often used as support structures for bone formation and repair. Porous bioceramics are used because they present features such as biocompatibility, high porosity and a pore morphology that confer adequate mechanical strength and induce bone growth. In this work, porous alumina specimens containing 5% zirconia inclusions by volume were produced by the replica method. The porous specimens had their surfaces chemically treated with phosphoric acid and were coated with calcium phosphate. The coating was performed using the biomimetic method for 14 days at an initial pH of 6.1. The porous specimens were characterized using the following techniques: porosity, axial compression tests, microtomography, scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS), X-ray diffraction (XRD) and pH measurements in SBF solution. The results showed specimens with a pore morphology suitable for application as a biomaterial, and even the reduced incubation time favored the formation of calcium phosphate phases on the material surfaces. (author)

  9. Fault Tolerant Control Architecture Design for Mobile Manipulation in Scientific Facilities

    Directory of Open Access Journals (Sweden)

    Mohammad M. Aref

    2015-01-01

    Full Text Available This paper describes one of the challenging issues implied by scientific infrastructures for a mobile robot cognition architecture. For a generally applicable cognition architecture, we study the dependencies and logical relations between several tasks and subsystems. The overall view of the software modules is described, including their relationship with a fault management module that monitors the consistency of the data flow among the modules. The fault management module is the deliberative architecture's solution to single-point failures, while the safety anchor is the reactive solution to faults, relying on redundant equipment. In addition, a hardware architecture is proposed to ensure safe robot movement as a redundancy for the robot's cognition. The method is designed for a four-wheel-steerable (4WS) mobile manipulator (iMoro) as a case study.

  10. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  11. Using Runtime Systems Tools to Implement Efficient Preconditioners for Heterogeneous Architectures

    Directory of Open Access Journals (Sweden)

    Roussel Adrien

    2016-11-01

    Full Text Available Solving large sparse linear systems is a time-consuming step in basin modeling or reservoir simulation. The choice of a robust preconditioner strongly impacts the performance of the overall simulation. Heterogeneous architectures based on General Purpose computing on Graphic Processing Units (GPGPU) or many-core architectures introduce programming challenges which can be managed in a way that is transparent to the developer through the use of runtime systems. Nevertheless, algorithms need to be well suited to these massively parallel architectures. In this paper, we present preconditioning techniques which take advantage of emerging architectures. We also present our task-based implementations built on the HARTS (Heterogeneous Abstract RunTime System) runtime system, which targets these recent architectures. We focus on two preconditioners. The first is an ILU(0) preconditioner implemented on distributed-memory systems. The second is a multi-level domain decomposition method implemented on a shared-memory system. Results obtained on the corresponding architectures are then presented, opening the way to a discussion of the scalability of such methods with respect to numerical performance, keeping in mind that the next step is to propose massively parallel implementations of these techniques.
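
The role a preconditioner plays in an iterative solver can be illustrated with a compact example. A diagonal (Jacobi) preconditioner is used here as a much simpler stand-in for the ILU(0) and domain decomposition preconditioners the paper actually studies; the matrix is a tiny illustrative SPD system.

```python
# Preconditioned conjugate gradient with a Jacobi (diagonal) preconditioner.
# M^-1 ~ diag(A)^-1 clusters the spectrum, reducing CG iterations.

def pcg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                        # residual b - A*x
    minv = [1.0 / A[i][i] for i in range(n)]        # M^-1 = diag(A)^-1
    z = [minv[i] * r[i] for i in range(n)]          # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [minv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]                        # small SPD example
x = pcg(A, [1.0, 2.0])                              # exact solution (1/11, 7/11)
```

On heterogeneous hardware, each of these vector and matrix-vector operations would be expressed as tasks for a runtime system such as HARTS to schedule.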

  12. Three-Dimensional (X,Y,Z) Deterministic Analysis of the PCA-Replica Neutron Shielding Benchmark Experiment using the TORT-3.2 Code and Group Cross Section Libraries for LWR Shielding and Pressure Vessel Dosimetry

    OpenAIRE

    Pescarini Massimo; Orsi Roberto; Frisoni Manuela

    2016-01-01

    The PCA-Replica 12/13 (H2O/Fe) neutron shielding benchmark experiment was analysed using the ORNL TORT-3.2 3D SN code. PCA-Replica, specifically conceived to test the accuracy of nuclear data and transport codes employed in LWR shielding and radiation damage calculations, reproduces a PWR ex-core radial geometry with alternate layers of water and steel including a PWR pressure vessel simulator. Three broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format with ...

  13. Architecture of a consent management suite and integration into IHE-based Regional Health Information Networks.

    Science.gov (United States)

    Heinze, Oliver; Birkle, Markus; Köster, Lennart; Bergh, Björn

    2011-10-04

    The University Hospital Heidelberg is implementing a Regional Health Information Network (RHIN) in the Rhine-Neckar-Region in order to establish a shared-care environment, which is based on established Health IT standards and in particular Integrating the Healthcare Enterprise (IHE). Similar to all other Electronic Health Record (EHR) and Personal Health Record (PHR) approaches the chosen Personal Electronic Health Record (PEHR) architecture relies on the patient's consent in order to share documents and medical data with other care delivery organizations, with the additional requirement that the German legislation explicitly demands a patients' opt-in and does not allow opt-out solutions. This creates two issues: firstly the current IHE consent profile does not address this approach properly and secondly none of the employed intra- and inter-institutional information systems, like almost all systems on the market, offers consent management solutions at all. Hence, the objective of our work is to develop and introduce an extensible architecture for creating, managing and querying patient consents in an IHE-based environment. Based on the features offered by the IHE profile Basic Patient Privacy Consent (BPPC) and literature, the functionalities and components to meet the requirements of a centralized opt-in consent management solution compliant with German legislation have been analyzed. Two services have been developed and integrated into the Heidelberg PEHR. The standard-based Consent Management Suite consists of two services. The Consent Management Service is able to receive and store consent documents. It can receive queries concerning a dedicated patient consent, process it and return an answer. It represents a centralized policy enforcement point. The Consent Creator Service allows patients to create their consents electronically. Interfaces to a Master Patient Index (MPI) and a provider index allow to dynamically generate XACML-based policies which are
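
The opt-in semantics the German legislation imposes can be shown with a toy policy enforcement sketch. The real service evaluates IHE BPPC consent documents via XACML policies; the dictionary-based check below is an invented simplification.

```python
# Toy policy enforcement point for opt-in consent: access is DENIED unless
# the patient has explicitly granted it (no opt-out default).

consents = {}   # patient_id -> set of organizations granted read access

def grant(patient, org):
    """Record a patient's explicit consent for one organization."""
    consents.setdefault(patient, set()).add(org)

def may_access(patient, org):
    # Opt-in rule: absence of a consent record means denial.
    return org in consents.get(patient, set())

grant("patient-17", "clinic-A")                    # illustrative identifiers
```

A production Consent Management Service would additionally resolve patients through an MPI and generate the corresponding XACML policies dynamically.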

  14. Architectural slicing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2013-01-01

    Architectural prototyping is a widely used practice, concerned with taking architectural decisions through experiments with lightweight implementations. However, many architectural decisions are only taken when systems are already (partially) implemented. This is problematic in the context of architectural prototyping since experiments with full systems are complex and expensive and thus architectural learning is hindered. In this paper, we propose a novel technique for harvesting architectural prototypes from existing systems, "architectural slicing", based on dynamic program slicing. Given a system and a slicing criterion, architectural slicing produces an architectural prototype that contains the elements in the architecture that are dependent on the elements in the slicing criterion. Furthermore, we present an initial design and implementation of an architectural slicer for Java.
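
The slicing idea reduces to dependency reachability: keep exactly the architectural elements the slicing criterion transitively depends on. The sketch below illustrates that core computation; the element names are invented, and real dynamic slicing derives the dependency edges from observed executions rather than a hand-written map.

```python
# Compute an architectural slice as the transitive dependency closure
# of the slicing criterion over a recorded dependency graph.

def slice_architecture(deps, criterion):
    """deps: element -> set of elements it depends on."""
    keep, stack = set(), list(criterion)
    while stack:
        elem = stack.pop()
        if elem not in keep:
            keep.add(elem)
            stack.extend(deps.get(elem, ()))
    return keep

deps = {"UI": {"Service"}, "Service": {"DB"}, "Logger": set(), "DB": set()}
prototype = slice_architecture(deps, {"UI"})   # Logger is sliced away
```

The resulting element set is what would be packaged as the lightweight architectural prototype.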

  15. Obtaining porous alumina-zirconia ceramics coated with calcium phosphate by the replica method; Obtencao de ceramicas porosas de alumina-zirconia pelo metodo da replica recobertas com fosfato de calcio

    Energy Technology Data Exchange (ETDEWEB)

    Silva, A.D.R.; Rigoli, W.R.; Osiro, Denise; Pallone, E.M.J.A., E-mail: adinizrs@yahoo.com.br [Universidade de Sao Paulo (FZEA/USP), Pirassununga, SP (Brazil). Faculdade de Zootecnia e Engenharia de Alimentos; Lobo, A.O. [Universidade do Vale do Paraiba (UNIVAP), Sao Jose dos Campos, SP (Brazil)

    2016-07-01

    Biomaterials used in bone replacement, including porous bioceramics, are often used as support structures for bone formation and repair. Porous bioceramics are used because they present features such as biocompatibility, high porosity and a pore morphology that confer adequate mechanical strength and induce bone growth. In this work, porous alumina specimens containing 5% zirconia inclusions by volume were produced by the replica method. The porous specimens had their surfaces chemically treated with phosphoric acid and were coated with calcium phosphate. The coating was performed using the biomimetic method for 14 days at an initial pH of 6.1. The porous specimens were characterized using the following techniques: porosity, axial compression tests, microtomography, scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS), X-ray diffraction (XRD) and pH measurements in SBF solution. The results showed specimens with a pore morphology suitable for application as a biomaterial, and even the reduced incubation time favored the formation of calcium phosphate phases on the material surfaces. (author)

  16. The Role of Institutional Logics in Shaping Architecture Governance

    DEFF Research Database (Denmark)

    Andersen, Peter; Svejvig, Per; Carugati, Andrea

    2016-01-01

    … are discussed in relation to current practice and theory on both architecture and IT governance. Generally, the findings show how architecture governance is shaped through a complex, contextual and social process beyond rational, managerial decision-making. Finally, we propose that institutional logics can … to IT governance literature in general. IT governance is often described as a management prerogative; however, using institutional logics as a theoretical lens and sensitizing device, we show how different logics emerged over time and influenced how the organization governed its architecture. These findings …

  17. Procurement of Architectural and Engineering Services for Sustainable Buildings: A Guide for Federal Project Managers

    Energy Technology Data Exchange (ETDEWEB)

    2004-06-01

    This guide was prepared to be a resource for federal construction project managers and others who want to integrate the principles of sustainable design into the procurement of professional building design and consulting services. To economize on energy costs and improve the safety, comfort, and health of building occupants, building design teams can incorporate daylighting, energy efficiency, renewable energy, and passive solar design into all projects in which these elements are technically and economically feasible. The information presented here will help project leaders begin the process and manage the inclusion of sustainable design in the procurement process. The section on establishing selection criteria contains key elements to consider before selecting an architectural and engineering (A/E) firm. The section on preparing the statement of work discusses the broad spectrum of sustainable design services that an A/E firm can provide. Several helpful checklists are included.

  18. SUSTAINABLE ARCHITECTURE : WHAT ARCHITECTURE STUDENTS THINK

    OpenAIRE

    SATWIKO, PRASASTO

    2013-01-01

    Sustainable architecture has become a hot issue lately as the impacts of climate change become more intense. Architecture education has responded by integrating knowledge of sustainable design into its curricula. However, in real life, new buildings keep appearing with designs that completely ignore sustainable principles. This paper discusses the results of two national competitions on sustainable architecture targeted at architecture students (conducted in 2012 and 2013). The results a...

  19. Enhancing Architecture-Implementation Conformance with Change Management and Support for Behavioral Mapping

    Science.gov (United States)

    Zheng, Yongjie

    2012-01-01

    Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…

  20. Architecture

    OpenAIRE

    Clear, Nic

    2014-01-01

    When discussing science fiction’s relationship with architecture, the usual practice is to look at the architecture “in” science fiction—in particular, the architecture in SF films (see Kuhn 75-143) since the spaces of literary SF present obvious difficulties as they have to be imagined. In this essay, that relationship will be reversed: I will instead discuss science fiction “in” architecture, mapping out a number of architectural movements and projects that can be viewed explicitly as scien...

  1. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  2. Group key management

    Energy Technology Data Exchange (ETDEWEB)

    Dunigan, T.; Cao, C.

    1997-08-01

    This report describes an architecture and implementation for doing group key management over a data communications network. The architecture describes a protocol for establishing a shared encryption key among an authenticated and authorized collection of network entities. Group access requires one or more authorization certificates. The implementation includes a simple public key and certificate infrastructure. Multicast is used for some of the key management messages. An application programming interface multiplexes key management and user application messages. An implementation using the new IP security protocols is postulated. The architecture is compared with other group key management proposals, and the performance and the limitations of the implementation are described.
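
    The rekeying flow described above can be sketched in miniature: a manager generates a fresh group key and "wraps" one copy per authorized member under that member's pairwise secret. This is an illustrative reconstruction, not the report's implementation; an HMAC-derived XOR keystream stands in for the public-key and certificate infrastructure the report uses, and all names are invented.

    ```python
    import os
    import hmac
    import hashlib

    def wrap(pairwise_key: bytes, group_key: bytes) -> bytes:
        # Derive a one-block keystream from the pairwise secret and XOR it
        # onto the group key (stand-in for real public-key encryption).
        stream = hmac.new(pairwise_key, b"wrap", hashlib.sha256).digest()
        return bytes(a ^ b for a, b in zip(group_key, stream))

    def unwrap(pairwise_key: bytes, wrapped: bytes) -> bytes:
        return wrap(pairwise_key, wrapped)  # XOR wrapping is its own inverse

    # Authorized members, each sharing a pairwise secret with the manager.
    members = {name: os.urandom(32) for name in ("alice", "bob", "carol")}
    group_key = os.urandom(32)

    # The manager multicasts one wrapped copy of the group key per member.
    rekey_message = {name: wrap(k, group_key) for name, k in members.items()}

    # Every member recovers the same shared group key from its own copy.
    assert all(unwrap(members[n], rekey_message[n]) == group_key for n in members)
    ```

    A real system would also authenticate the rekey message and re-run this flow whenever membership changes, so departing members cannot read future traffic.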

  3. CASPER: Embedding Power Estimation and Hardware-Controlled Power Management in a Cycle-Accurate Micro-Architecture Simulation Platform for Many-Core Multi-Threading Heterogeneous Processors

    Directory of Open Access Journals (Sweden)

    Arun Ravindran

    2012-02-01

    Full Text Available Despite the promising performance improvement observed in emerging many-core architectures in high performance processors, high power consumption prohibitively affects their use and marketability in the low-energy sectors, such as embedded processors, network processors and application specific instruction processors (ASIPs). While most chip architects design power-efficient processors by finding an optimal power-performance balance in their design, some use sophisticated on-chip autonomous power management units, which dynamically reduce the voltage or frequencies of idle cores and hence extend battery life and reduce operating costs. For large scale designs of many-core processors, a holistic approach integrating both these techniques at different levels of abstraction can potentially achieve maximal power savings. In this paper we present CASPER, a robust instruction trace driven cycle-accurate many-core multi-threading micro-architecture simulation platform where we have incorporated power estimation models of a wide variety of tunable many-core micro-architectural design parameters, thus enabling processor architects to explore a sufficiently large design space and achieve power-efficient designs. Additionally CASPER is designed to accommodate cycle-accurate models of hardware controlled power management units, enabling architects to experiment with and evaluate different autonomous power-saving mechanisms to study the run-time power-performance trade-offs in embedded many-core processors. We have implemented two such techniques in CASPER: Chipwide Dynamic Voltage and Frequency Scaling, and Performance Aware Core-Specific Frequency Scaling, which show average power savings of 35.9% and 26.2% on a baseline 4-core SPARC based architecture respectively. This power saving data accounts for the power consumption of the power management units themselves. The CASPER simulation platform also provides users with complete support of SPARCV9

  4. Effect of boundary conditions on the strength and deformability of replicas of natural fractures in welded tuff: Data analysis

    International Nuclear Information System (INIS)

    Wibowo, J.; Amadei, B.; Sture, S.

    1994-04-01

    Assessing the shear behavior of intact rock and rock fractures is an important issue in the design of a potential nuclear waste repository at Yucca Mountain, Nevada. Cyclic direct shear experiments were conducted on replicas of three natural fractures and a laboratory-developed tensile fracture of welded tuff. The tests were carried out under constant normal loads or constant normal stiffnesses with different initial normal load levels. Each test consisted of five cycles of forward and reverse shear motion. Based on the results of the shear tests conducted under constant normal load, the shear behavior of the joint replicas tested under constant normal stiffness was predicted by using the graphical analysis method of Saeb (1989), and Amadei and Saeb (1990). Comparison between the predictions and the actual constant stiffness direct shear experiment results can be found in a report by Wibowo et al. (1993b). Results of the constant normal load shear experiments are analyzed using several constitutive models proposed in the rock mechanics literature for joint shear strength, dilatancy, and joint surface damage. It is shown that some of the existing models have limitations. New constitutive models are proposed and are included in a mathematical analysis tool that can be used to predict joint behavior under various boundary conditions.

  5. Cost management and cross-functional communication through product architectures

    NARCIS (Netherlands)

    Zwerink, Ruud; Wouters, Marc; Hissel, Paul; Kerssens-van Drongelen, I.C.

    2007-01-01

    Product architecture decisions regarding, for example, product modularity, component commonality, and design re-use, are important for balancing costs, responsiveness, quality, and other important business objectives. Firms are challenged with complex tradeoffs between competing design priorities,

  6. Secure Architectures in the Cloud

    NARCIS (Netherlands)

    De Capitani di Vimercati, Sabrina; Pieters, Wolter; Probst, Christian W.

    2011-01-01

    This report documents the outcomes of Dagstuhl Seminar 11492 "Secure Architectures in the Cloud". In cloud computing, data storage and processing are offered as services, and data are managed by external providers that reside outside the control of the data owner. The use of such services reduces

  7. Nano-textured polymers for future architectural needs

    Directory of Open Access Journals (Sweden)

    Cees W.M. Bastiaansen

    2013-12-01

    Full Text Available The rapid developments in molecular sciences like nanotechnology and self-organizing molecular systems generate a wealth of new materials and functions. In comparison to electronics the application in architecture remains somewhat underexposed. New functionalities in optics, responsive mechanics, sensing and adjustable permeation for gases and water might add to new opportunities in providing for personal comfort and energy management in houses and professional buildings. With a number of examples we demonstrate how complex but well-controlled molecular architectures provide functionalities worthwhile of being integrated in architectural designs. Optical coatings are capable of switching colors or reflectivity, creating possibilities for design but also for the control of thermal transmission through windows. They respond to temperature, light intensity, or both. Selectively-reflective thin polymer layers or paint pigments can be designed to switch between infrared and visible regions of the solar spectrum. Coatings can be designed to change their topology and thereby their appearance, of interest for in-house light management, or just for aesthetic appeal. Plastic materials can be imbued with the property of autonomous sun tracking and provided with morphing behavior upon contact with moisture or exposure to light. Many of these materials need further development to meet the requirements for building integration with respect to robustness, lifetime, and the like, which will only be accomplished after demonstration of interest from the architectural world.

  8. Pion emission from the T2K replica target: method, results and application

    CERN Document Server

    Abgrall, N.; Anticic, T.; Antoniou, N.; Argyriades, J.; Baatar, B.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bravar, A.; Brooks, W.; Brzychczyk, J.; Bubak, A.; Bunyatov, S.A.; Busygina, O.; Christakoglou, P.; Chung, P.; Czopowicz, T.; Davis, N.; Debieux, S.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Esposito, L.S.; Feofilov, G.A.; Fodor, Z.; Ferrero, A.; Fulop, A.; Gazdzicki, M.; Golubeva, M.; Grabez, B.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hakobyan, H.; Hasegawa, T.; Idczak, R.; Igolkin, S.; Ivanov, Y.; Ivashkin, A.; Kadija, K.; Kapoyannis, A.; Katrynska, N.; Kielczewska, D.; Kikola, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kochebina, O.; Kolesnikov, V.I.; Kolev, D.; Kondratiev, V.P.; Korzenev, A.; Kowalski, S.; Krasnoperov, A.; Kuleshov, S.; Kurepin, A.; Lacey, R.; Larsen, D.; Laszlo, A.; Lyubushkin, V.V.; Mackowiak-Pawlowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A.I.; Maletic, D.; Marchionni, A.; Marcinek, A.; Maris, I.; Marin, V.; Marton, K.; Matulewicz, T.; Matveev, V.; Melkumov, G.L.; Messina, M.; Mrowczynski, St.; Murphy, S.; Nakadaira, T.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A.D.; Paul, T.; Peryt, W.; Petukhov, O.; Planeta, R.; Pluta, J.; Popov, B.A.; Posiadala, M.; Pulawski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Renfordt, R.; Robert, A.; Rohrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Sekiguchi, T.; Seyboth, P.; Shibata, M.; Sipos, M.; Skrzypczak, E.; Slodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Strabel, C.; Strobele, H.; Susa, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V.V.; Vesztergombi, G.; Wilczek, A.; Wlodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszynski, O.; Zambelli, L.; Zipper, W.; Hartz, M.; Ichikawa, A.K.; Kubo, H.; Marino, A.D.; 
Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Yuan, T.; Zimmerman, E.D.

    2013-01-01

    The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as used by T2K exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper details of the experiment, data taking, data analysis method and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.

  9. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links.

    Science.gov (United States)

    Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen

    2018-05-25

    Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite link in both Global Positioning System (GPS) and BeiDou Navigation Satellite System (BDS) adopt the long code spread spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection
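
    The folding step at the heart of XFAST-style acquisition can be illustrated with a toy, noise-free sketch: the long PN code is folded segment-wise into a short replica, and circular correlation of a received block against the folded replica peaks at the code phase modulo the block length. All lengths and the phase below are invented for the example; a real receiver would compute the correlation with an FFT and compensate code Doppler, which this sketch omits.

    ```python
    import random

    random.seed(1)

    L, N = 512, 128                          # long-code length, block length
    code = [random.choice((-1, 1)) for _ in range(L)]

    # Fold: element-wise sum of the L//N segments into one length-N replica.
    folded = [sum(code[s * N + i] for s in range(L // N)) for i in range(N)]

    true_phase = 200                         # unknown code phase of the signal
    incoming = [code[(true_phase + i) % L] for i in range(N)]

    def circular_correlation(x, y):
        # Direct O(n^2) circular correlation; an FFT would be used in practice.
        n = len(x)
        return [sum(x[i] * y[(i + k) % n] for i in range(n)) for k in range(n)]

    corr = circular_correlation(incoming, folded)
    est = corr.index(max(corr))
    assert est == true_phase % N             # phase recovered modulo N
    ```

    The folding is what makes long-code search tractable: one length-N correlation covers all L//N candidate segments at once, at the cost of the extra SNR loss from the summed, non-matching segments that the abstract mentions.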

  10. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links

    Directory of Open Access Journals (Sweden)

    Hongbo Zhao

    2018-05-01

    Full Text Available Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long code spread spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher

  11. Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study

    Science.gov (United States)

    Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Schuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.

    2011-01-01

    A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV) to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault-tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.

  12. Modeling Architectural Patterns Using Architectural Primitives

    NARCIS (Netherlands)

    Zdun, Uwe; Avgeriou, Paris

    2005-01-01

    Architectural patterns are a key point in architectural documentation. Regrettably, there is poor support for modeling architectural patterns, because the pattern elements are not directly matched by elements in modeling languages, and, at the same time, patterns support an inherent variability that

  13. Preserving the Boltzmann ensemble in replica-exchange molecular dynamics.

    Science.gov (United States)

    Cooke, Ben; Schmidler, Scott C

    2008-10-28

    We consider the convergence behavior of replica-exchange molecular dynamics (REMD) [Sugita and Okamoto, Chem. Phys. Lett. 314, 141 (1999)] based on properties of the numerical integrators in the underlying isothermal molecular dynamics (MD) simulations. We show that a variety of deterministic algorithms favored by molecular dynamics practitioners for constant-temperature simulation of biomolecules fail either to be measure invariant or irreducible, and are therefore not ergodic. We then show that REMD using these algorithms also fails to be ergodic. As a result, the entire configuration space may not be explored even in an infinitely long simulation, and the simulation may not converge to the desired equilibrium Boltzmann ensemble. Moreover, our analysis shows that for initial configurations with unfavorable energy, it may be impossible for the system to reach a region surrounding the minimum energy configuration. We demonstrate these failures of REMD algorithms for three small systems: a Gaussian distribution (simple harmonic oscillator dynamics), a bimodal mixture of Gaussians distribution, and the alanine dipeptide. Examination of the resulting phase plots and equilibrium configuration densities indicates significant errors in the ensemble generated by REMD simulation. We describe a simple modification to address these failures based on a stochastic hybrid Monte Carlo correction, and prove that this is ergodic.
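
    For readers unfamiliar with the method under analysis, a minimal Monte Carlo sketch of replica exchange shows the two moves involved: per-replica Metropolis updates and temperature swaps accepted with probability min(1, exp((βi − βj)(Ei − Ej))). This is illustrative only; the paper analyzes deterministic MD integrators, whereas this sketch uses simple Metropolis moves, and the double-well potential, temperatures and step size are invented for the example.

    ```python
    import math
    import random

    random.seed(0)

    def energy(x):
        return (x * x - 1.0) ** 2          # double well, minima at x = -1, +1

    def metropolis_step(x, beta, step=0.5):
        y = x + random.uniform(-step, step)
        d_e = energy(y) - energy(x)
        if d_e <= 0.0 or random.random() < math.exp(-beta * d_e):
            return y
        return x

    def try_swaps(x, betas):
        # Standard exchange criterion: accept with prob min(1, exp(dBeta*dE)).
        for i in range(len(betas) - 1):
            d = (betas[i] - betas[i + 1]) * (energy(x[i]) - energy(x[i + 1]))
            if d >= 0.0 or random.random() < math.exp(d):
                x[i], x[i + 1] = x[i + 1], x[i]

    betas = [4.0, 2.0, 1.0, 0.5]           # coldest replica first
    x = [1.0] * len(betas)                 # all replicas start in the right well
    cold_samples = []
    for t in range(20000):
        x = [metropolis_step(xi, b) for xi, b in zip(x, betas)]
        if t % 10 == 0:
            try_swaps(x, betas)
        cold_samples.append(x[0])

    # Exchanges with hotter replicas carry the cold replica over the barrier,
    # so it visits both wells instead of staying trapped in one.
    assert min(cold_samples) < -0.5 and max(cold_samples) > 0.5
    ```

    The paper's point is that this argument for ergodicity silently assumes the per-replica dynamics are themselves measure-invariant and irreducible; when they are not, the exchange moves cannot repair the ensemble.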

  14. Advanced Ground Systems Maintenance Enterprise Architecture Project

    Science.gov (United States)

    Perotti, Jose M. (Compiler)

    2015-01-01

    The project implements an architecture for delivery of integrated health management capabilities for the 21st Century launch complex. The delivered capabilities include anomaly detection, fault isolation, prognostics, and physics-based diagnostics.

  15. Replica symmetry breaking solution for two-sublattice fermionic Ising spin glass models in a transverse field

    International Nuclear Information System (INIS)

    Zimmer, F.M.; Magalhaes, S.G.

    2007-01-01

    The one-step replica symmetry breaking is used to study the competition between spin glass (SG) and antiferromagnetic (AF) order in two-sublattice fermionic Ising SG models in the presence of a transverse field Γ and a parallel field H. Inter- and intra-sublattice exchange interactions following Gaussian distributions are considered. The problem is formulated in a Grassmann path integral formalism within the static ansatz. Results show that H favors the non-ergodic mixed phase (AF+SG) and destroys the AF order. The field Γ suppresses the magnetic orders, and the intra-sublattice interaction can introduce a discontinuous phase transition.

  16. The use of extraction and electronic diffraction replicas for precipitates characterization in welded Cr-Mo Steels

    International Nuclear Information System (INIS)

    Gutierrez de Saiz-Solabarria, S.; San Juan Nunez, J.M.

    1997-01-01

    The precipitates and phases found in the structure of welded joints of heat exchanger tubes were studied and identified. The base material satisfied the requirements of ASME Sec. II, SA 213 Gr. T22 (2 1/4 Cr 1 Mo). The compositions of the filler metals were 2 1/4 Cr 1 Mo and 2 1/4 Cr 1 Mo 1/4 Nb. The chemical compositions of the base and weld materials were analyzed by atomic emission spectroscopy in high-vacuum electric discharge and by inductively coupled plasma. Extraction replicas and electron diffraction microscopy were used for the characterization of the constituents. (Author) 65 refs

  17. Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.

    Science.gov (United States)

    Beltrametti, Monica; English, Will

    1994-01-01

    Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…

  18. Communications Architecture Recommendations to Enable Joint Vision 2020

    National Research Council Canada - National Science Library

    Armstrong, R. B

    2003-01-01

    The Mission Information Management (MIM) Communications Architecture provides a framework to develop an integrated space, air, and terrestrial communications network that supports all national security users...

  19. A multi-agent system architecture for sensor networks.

    Science.gov (United States)

    Fuentes-Fernández, Rubén; Guijarro, María; Pajares, Gonzalo

    2009-01-01

    The design of the control systems for sensor networks presents important challenges. Besides the traditional problems about how to process the sensor data to obtain the target information, engineers need to consider additional aspects such as the heterogeneity and high number of sensors, and the flexibility of these networks regarding topologies and the sensors in them. Although there are partial approaches for resolving these issues, their integration relies on ad hoc solutions requiring important development efforts. In order to provide an effective approach for this integration, this paper proposes an architecture based on the multi-agent system paradigm with a clear separation of concerns. The architecture considers sensors as devices used by an upper layer of manager agents. These agents are able to communicate and negotiate services to achieve the required functionality. Activities are organized according to roles related with the different aspects to integrate, mainly sensor management, data processing, communication and adaptation to changes in the available devices and their capabilities. This organization largely isolates and decouples the data management from the changing network, while encouraging reuse of solutions. The use of the architecture is facilitated by a specific modelling language developed through metamodelling. A case study concerning a generic distributed system for fire fighting illustrates the approach and the comparison with related work.
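
    The layered separation the paper describes can be sketched in a few lines: sensors are plain devices, manager agents above them advertise and serve capabilities, and a simple negotiation picks which agent handles a request, adapting when devices disappear. This is a hypothetical toy, not the paper's architecture or modelling language; all class and role names are invented.

    ```python
    class Sensor:
        """A plain device; all intelligence lives in the agent layer above."""
        def __init__(self, name, kind):
            self.name, self.kind = name, kind

        def read(self):
            return f"{self.name}:reading"

    class ManagerAgent:
        def __init__(self, name):
            self.name, self.devices = name, []

        def attach(self, sensor):
            self.devices.append(sensor)

        def detach(self, sensor_name):
            # Adaptation to change: a device vanishing only updates this agent.
            self.devices = [s for s in self.devices if s.name != sensor_name]

        def can_serve(self, kind):
            return any(s.kind == kind for s in self.devices)

        def serve(self, kind):
            return [s.read() for s in self.devices if s.kind == kind]

    def negotiate(agents, kind):
        # Toy service negotiation: the first agent advertising the capability wins.
        for agent in agents:
            if agent.can_serve(kind):
                return agent.serve(kind)
        return []

    a1, a2 = ManagerAgent("north"), ManagerAgent("south")
    a1.attach(Sensor("t1", "temperature"))
    a2.attach(Sensor("t2", "temperature"))

    assert negotiate([a1, a2], "temperature") == ["t1:reading"]
    a1.detach("t1")   # a device disappears; requests are re-routed transparently
    assert negotiate([a1, a2], "temperature") == ["t2:reading"]
    ```

    The decoupling is the point: the requester never names a sensor, so changes in the device population stay isolated behind the agent layer, as the abstract argues.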

  20. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t
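
    The flavor of such a formulation can be shown on a toy instance of one of the book's case-study families: assigning tasks to cores with binary decision variables x[t][c] to minimize the makespan. The data are invented, and since no MILP solver ships with the standard library, exhaustive enumeration stands in for the solver; the formulation, not the search, is the point.

    ```python
    from itertools import product

    # MILP view: minimize max_c sum_t times[t] * x[t][c]
    # subject to sum_c x[t][c] == 1 for every task t, x binary.
    times = [3, 5, 2, 4]          # task execution times (assumed)
    cores = 2

    best_makespan, best_assign = None, None
    # Enumerating core choices per task enforces the assignment constraint.
    for assign in product(range(cores), repeat=len(times)):
        load = [0] * cores
        for t, c in enumerate(assign):
            load[c] += times[t]
        makespan = max(load)
        if best_makespan is None or makespan < best_makespan:
            best_makespan, best_assign = makespan, assign

    assert best_makespan == 7     # optimum splits the work as {3,4} vs {5,2}
    ```

    A real MILP solver explores the same space with branch-and-bound and cutting planes, which is what makes the approach scale to the instruction-set and scheduling problems the book treats.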

  1. 2005 dossier: granite. Tome: architecture and management of the geologic disposal; Dossier 2005: granite. Tome architecture et gestion du stockage geologique

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This document reviews the research carried out by the French national agency for radioactive waste management (ANDRA) on the geologic disposal of high-level and long-lived radioactive wastes in granite formations. Content: 1 - approach of the study: main steps since the December 30, 1991 law, ANDRA's research program on disposal in granitic formations; 2 - high-level and long-lived (HLLL) wastes: production scenarios, waste categories, inventory model; 3 - disposal facility design in granitic environment: definition of the geologic disposal functions, the granitic material, general facility design options; 4 - general architecture of a disposal facility in granitic environment: surface facilities, underground facilities, disposal process, operational safety; 5 - B-type wastes disposal area: primary containers of B-type wastes, safety options, concrete containers, disposal alveoles, architecture of the B-type wastes disposal area, disposal process and feasibility aspects, functions of disposal components with time; 6 - C-type wastes disposal area: C-type wastes primary containers, safety options, super-containers, disposal alveoles, architecture of the C-type wastes disposal area, disposal process in a reversibility logics, functions of disposal components with time; 7 - spent fuels disposal area: spent fuel assemblies, safety options, spent fuel containers, disposal alveoles, architecture of the spent fuel disposal area, disposal process in a reversibility logics, functions of disposal components with time; 8 - conclusions: suitability of the architecture to various types of French granite, robustness of the design, and consideration of reversibility. (J.S.)

  2. Managing changes in the enterprise architecture modelling context

    Science.gov (United States)

    Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya

    2016-07-01

    Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has been however limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.
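
    The change impact analysis described above can be reduced to its core: treat the EA model as a dependency graph and compute which elements are reachable from a changed element along reversed dependency edges. This is a generic illustration, not the article's ChangeAwareHierarchicalEA technique or its Alloy-based repair plans; the element names and dependencies are invented.

    ```python
    from collections import deque

    # Toy EA model: element -> elements it depends on (assumed for the example).
    depends_on = {
        "InvoicingProcess": ["BillingService"],
        "BillingService": ["CustomerDB"],
        "ReportingService": ["CustomerDB"],
        "CustomerDB": [],
    }

    def impacted_by(changed):
        # Reverse the edges: who depends, directly or transitively, on `changed`?
        rev = {}
        for elem, deps in depends_on.items():
            for d in deps:
                rev.setdefault(d, []).append(elem)
        seen, queue = set(), deque([changed])
        while queue:
            e = queue.popleft()
            for dependant in rev.get(e, []):
                if dependant not in seen:
                    seen.add(dependant)
                    queue.append(dependant)
        return seen

    # A primary change to the database ripples up through two EA levels.
    assert impacted_by("CustomerDB") == {
        "BillingService", "ReportingService", "InvoicingProcess"
    }
    ```

    Change propagation then works on the same graph in the other direction: each impacted element becomes a candidate for a secondary change needed to restore consistency.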

  3. BIM and architectural heritage: towards an operational methodology for the knowledge and the management of Cultural Heritage

    Directory of Open Access Journals (Sweden)

    Laura Inzerillo

    2016-06-01

    Full Text Available The study aims to answer the growing need to virtuously organize informational apparatuses related to Cultural Heritage. We propose a methodology that integrates multidisciplinary processes of interaction with information aimed at the survey, documentation, management, knowledge and enhancement of historic artifacts. The procedures for instrumental data acquisition, and for the standardization and structuring of the acquired data into a three-dimensional semantic model, need to be reviewed and updated, as do the subsequent representability and accessibility of the model and the related database. While the use of Building Information Modeling has in recent years seen a consolidation of procedures and the identification of standard methods in the design process, in the field of architectural heritage the challenge of identifying operational methodologies for conservation, management and enhancement remains open.

  4. Performance assessment of distributed communication architectures in smart grid.

    OpenAIRE

    Jiang, Jing; Sun, Hongjian

    2016-01-01

    The huge amount of smart meters and growing frequent data readings have become a big challenge on data acquisition and processing in smart grid advanced metering infrastructure systems. This requires a distributed communication architecture in which multiple distributed meter data management systems (MDMSs) are deployed and meter data are processed locally. In this paper, we present the network model for supporting this distributed communication architecture and propos...

  5. Traceability of Requirements and Software Architecture for Change Management

    NARCIS (Netherlands)

    Göknil, Arda

    2011-01-01

    At the present day, software systems get more and more complex. The requirements of software systems change continuously and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations to the architecture and source code of the system

  6. A meta-level architecture for strategic reasoning in naval planning (Extended abstract)

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jonker, C.M.; van Maanen, P.P.; Treur, J.

    2005-01-01

    The management of naval organizations aims at the maximization of mission success by means of monitoring, planning, and strategic reasoning. This paper presents a meta-level architecture for strategic reasoning in naval planning. The architecture is instantiated with decision knowledge acquired from

  7. An Enterprise Information System Data Architecture Guide

    National Research Council Canada - National Science Library

    Lewis, Grace

    2001-01-01

    Data architecture defines how data is stored, managed, and used in a system. It establishes common guidelines for data operations that make it possible to predict, model, gauge, or control the flow of data in the system...

  8. Context Aware Middleware Architectures: Survey and Challenges

    Directory of Open Access Journals (Sweden)

    Xin Li

    2015-08-01

    Full Text Available Context aware applications, which can adapt their behaviors to changing environments, are attracting more and more attention. To simplify the complexity of developing applications, context aware middleware, which introduces context awareness into the traditional middleware, is highlighted to provide a homogeneous interface involving generic context management solutions. This paper provides a survey of state-of-the-art context aware middleware architectures proposed during the period from 2009 through 2015. First, a preliminary background, such as the principles of context, context awareness, context modelling, and context reasoning, is provided for a comprehensive understanding of context aware middleware. On this basis, an overview of eleven carefully selected middleware architectures is presented and their main features explained. Then, thorough comparisons and analysis of the presented middleware architectures are performed based on technical parameters including architectural style, context abstraction, context reasoning, scalability, fault tolerance, interoperability, service discovery, storage, security & privacy, context awareness level, and cloud-based big data analytics. The analysis shows that there is actually no context aware middleware architecture that complies with all requirements. Finally, challenges are pointed out as open issues for future work.

  9. Smart SOA platforms in cloud computing architectures

    CERN Document Server

    Exposito , Ernesto

    2014-01-01

    This book is intended to introduce the principles of the Event-Driven and Service-Oriented Architecture (SOA 2.0) and its role in the new interconnected world based on the cloud computing architecture paradigm. In this new context, the concept of "service" is widely applied to the hardware and software resources available in the new generation of the Internet. The authors focus on how current and future SOA technologies provide the basis for the smart management of the service model provided by the Platform as a Service (PaaS) layer.

  10. Derivation of the probability distribution function for the local density of states of a disordered quantum wire via the replica trick and supersymmetry

    International Nuclear Information System (INIS)

    Bunder, J.E.; McKenzie, R.H.

    2001-01-01

    We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian which is written solely in terms of a pair of coupled 'spins' which are elements of u(1,1). This ground state is explicitly found for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams

  11. An Architecture for Continuous Data Quality Monitoring in Medical Centers.

    Science.gov (United States)

    Endler, Gregor; Schwab, Peter K; Wahl, Andreas M; Tenschert, Johannes; Lenz, Richard

    2015-01-01

    In the medical domain, data quality is very important. Since requirements and data change frequently, continuous and sustainable monitoring and improvement of data quality is necessary. Working together with managers of medical centers, we developed an architecture for a data quality monitoring system. The architecture enables domain experts to adapt the system during runtime to match their specifications using a built-in rule system. It also allows arbitrarily complex analyses to be integrated into the monitoring cycle. We evaluate our architecture by matching its components to the well-known data quality methodology TDQM.
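    The runtime-adaptable rule system described in this abstract can be sketched roughly as below. The rule names, record fields, and thresholds are invented for illustration and are not details from the paper:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]   # returns True when a record passes

def monitor(records, rules):
    """One monitoring cycle: count rule violations over all records."""
    report = {rule.name: 0 for rule in rules}
    for rec in records:
        for rule in rules:
            if not rule.check(rec):
                report[rule.name] += 1
    return report

# Rules can be added or replaced at runtime, mimicking the paper's
# expert-adaptable rule system (these example rules are hypothetical).
rules = [
    QualityRule("age_in_range", lambda r: 0 <= r.get("age", -1) <= 120),
    QualityRule("has_patient_id", lambda r: bool(r.get("patient_id"))),
]
records = [
    {"patient_id": "P1", "age": 42},
    {"patient_id": "", "age": 130},
]
print(monitor(records, rules))  # {'age_in_range': 1, 'has_patient_id': 1}
```

    Because rules are plain data, a domain expert (or a configuration file) can extend the rule list between monitoring cycles without redeploying the system.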

  12. Design of an application using microservices architecture and its deployment in the cloud

    OpenAIRE

    Fernández Garcés, Lidia

    2016-01-01

    The design of modern large business application systems is moving from monolithic enterprise architectures towards microservices based architectures. These are especially well suited to run in cloud environments, because each microservice can be developed, deployed and managed individually, which allows much more fine-grained control and scalability. Based on a concrete example, this project should build an application using a microservice architecture, showi...

  13. New force replica exchange method and protein folding pathways probed by force-clamp technique.

    Science.gov (United States)

    Kouza, Maksim; Hu, Chin-Kun; Li, Mai Suan

    2008-01-28

    We have developed a new extended replica exchange method to study thermodynamics of a system in the presence of external force. Our idea is based on the exchange between different force replicas to accelerate the equilibrium process. This new approach was applied to obtain the force-temperature phase diagram and other thermodynamical quantities of the three-domain ubiquitin. Using the C(alpha)-Go model and the Langevin dynamics, we have shown that the refolding pathways of single ubiquitin depend on which terminus is fixed. If the N end is fixed then the folding pathways are different compared to the case when both termini are free, but fixing the C terminal does not change them. Surprisingly, we have found that the anchoring terminal does not affect the pathways of individual secondary structures of three-domain ubiquitin, indicating the important role of the multidomain construction. Therefore, force-clamp experiments, in which one end of a protein is kept fixed, can probe the refolding pathways of a single free-end ubiquitin if one uses either the polyubiquitin or a single domain with the C terminus anchored. However, it is shown that anchoring one end does not affect refolding pathways of the titin domain I27, and force-clamp spectroscopy is always capable of predicting the folding sequencing of this protein. We have obtained a reasonable estimate for the unfolding barrier of ubiquitin, using the microscopic theory for the dependence of unfolding time on the external force. The linkage between residue Lys48 and the C terminal of ubiquitin is found to have a dramatic effect on the location of the transition state along the end-to-end distance reaction coordinate, but the multidomain construction leaves the transition state almost unchanged. We have found that the maximum force in the force-extension profile from constant velocity force pulling simulations depends on temperature nonlinearly. However, for some narrow temperature interval this dependence becomes
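    The core of a force replica exchange scheme of this kind is a Metropolis swap criterion between replicas held at different pulling forces. Below is a minimal sketch, not the authors' code: it assumes each replica samples exp(-beta*(H0 - f*x)) with x the end-to-end extension, so swapping forces f_i and f_j is accepted with probability min(1, exp(-beta*(f_i - f_j)*(x_i - x_j))):

```python
import math
import random

def accept_exchange(beta, f_i, f_j, x_i, x_j, rng=random.random):
    """Metropolis criterion for swapping two force replicas.

    delta is the (negative) change in total Boltzmann exponent caused
    by exchanging the forces while keeping each configuration fixed.
    """
    delta = -beta * (f_i - f_j) * (x_i - x_j)
    return delta >= 0 or rng() < math.exp(delta)

# A swap that matches high force with large extension is always accepted:
print(accept_exchange(1.0, f_i=5.0, f_j=1.0, x_i=0.2, x_j=0.8))  # True
```

    In a full simulation this test would be attempted periodically between neighboring force replicas, exactly as temperature swaps are attempted in conventional replica exchange.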

  14. Architectural prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders...

  15. Architectural communication: Intra and extra activity of architecture

    Directory of Open Access Journals (Sweden)

    Stamatović-Vučković Slavica

    2013-01-01

    Full Text Available Apart from a brief overview of architectural communication viewed from the standpoint of theory of information and semiotics, this paper contains two forms of dualistically viewed architectural communication. The duality denotation/connotation (”primary” and ”secondary” architectural communication is one of semiotic postulates taken from Umberto Eco who viewed architectural communication as a semiotic phenomenon. In addition, architectural communication can be viewed as an intra and an extra activity of architecture where the overall activity of the edifice performed through its spatial manifestation may be understood as an act of communication. In that respect, the activity may be perceived as the ”behavior of architecture”, which corresponds to Lefebvre’s production of space.

  16. MWAHCA: a multimedia wireless ad hoc cluster architecture.

    Science.gov (United States)

    Diaz, Juan R; Lloret, Jaime; Jimenez, Jose M; Sendra, Sandra

    2014-01-01

    Wireless Ad hoc networks provide a flexible and adaptable infrastructure to transport data over a great variety of environments. Recently, real-time audio and video data transmission has increased due to the appearance of many multimedia applications. One of the major challenges is to ensure the quality of multimedia streams when they have passed through a wireless ad hoc network. It requires adapting the network architecture to the multimedia QoS requirements. In this paper we propose a new architecture to organize and manage cluster-based ad hoc networks in order to provide multimedia streams. The proposed architecture adapts the wireless network topology in order to improve the quality of audio and video transmissions. In order to achieve this goal, the architecture uses some information such as each node's capacity and the QoS parameters (bandwidth, delay, jitter, and packet loss). The architecture splits the network into clusters which are specialized in specific multimedia traffic. The real system performance study provided at the end of the paper will demonstrate the feasibility of the proposal.
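    The QoS-driven assignment of a stream to a specialized cluster can be illustrated with the sketch below. The metric values, field names, and the spare-bandwidth tie-breaker are invented for illustration, not taken from the paper:

```python
def satisfies(cluster, req):
    """Does a cluster's measured QoS meet the stream's requirements?"""
    return (cluster["bandwidth"] >= req["bandwidth"]
            and cluster["delay"] <= req["delay"]
            and cluster["jitter"] <= req["jitter"]
            and cluster["loss"] <= req["loss"])

def pick_cluster(clusters, req):
    ok = [c for c in clusters if satisfies(c, req)]
    # Among qualifying clusters, prefer the one with the most spare bandwidth.
    return max(ok, key=lambda c: c["bandwidth"] - req["bandwidth"], default=None)

clusters = [
    {"name": "video", "bandwidth": 8.0, "delay": 40, "jitter": 5, "loss": 0.5},
    {"name": "audio", "bandwidth": 2.0, "delay": 20, "jitter": 2, "loss": 0.2},
]
req = {"bandwidth": 4.0, "delay": 100, "jitter": 10, "loss": 1.0}
print(pick_cluster(clusters, req)["name"])  # video
```

    A real implementation would refresh the per-cluster measurements continuously and re-evaluate assignments as the wireless topology changes.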

  17. MWAHCA: A Multimedia Wireless Ad Hoc Cluster Architecture

    Directory of Open Access Journals (Sweden)

    Juan R. Diaz

    2014-01-01

    Full Text Available Wireless Ad hoc networks provide a flexible and adaptable infrastructure to transport data over a great variety of environments. Recently, real-time audio and video data transmission has increased due to the appearance of many multimedia applications. One of the major challenges is to ensure the quality of multimedia streams when they have passed through a wireless ad hoc network. It requires adapting the network architecture to the multimedia QoS requirements. In this paper we propose a new architecture to organize and manage cluster-based ad hoc networks in order to provide multimedia streams. The proposed architecture adapts the wireless network topology in order to improve the quality of audio and video transmissions. In order to achieve this goal, the architecture uses some information such as each node’s capacity and the QoS parameters (bandwidth, delay, jitter, and packet loss). The architecture splits the network into clusters which are specialized in specific multimedia traffic. The real system performance study provided at the end of the paper will demonstrate the feasibility of the proposal.

  18. Establishing ‘Architectural Thinking’ in Organizations

    OpenAIRE

    Winter , Robert

    2016-01-01

    Part 1: Keynote; International audience; After having harvested ‘low hanging fruits’ in early stages of Enterprise Architecture Management (EAM), it becomes increasingly difficult to keep up with large benefit realizations in later stages. The focus on the traditional EAM players (IT unit, architects, enterprise management) should be widened to ‘that other 90 % of the enterprise’ that are not directly related to the IT function. In order to create impact beyond IT, it appears necessary to com...

  19. Geometry anisotropy and mechanical property isotropy in titanium foam fabricated by replica impregnation method

    International Nuclear Information System (INIS)

    Manonukul, Anchalee; Srikudvien, Pathompoom; Tange, Makiko; Puncreobutr, Chedtha

    2016-01-01

    Polyurethane (PU) foams have both geometry and mechanical property anisotropy. Metal foams, which are manufactured by investment casting or melt deposition methods using PU foam as a template, also have mechanical property anisotropy. This work studied the mechanical properties in two directions of titanium foam with four different cell sizes fabricated using the replica impregnation method. The two directions are (1) the loading direction parallel to the foaming direction where the cells are elongated (EL direction) and (2) the loading direction perpendicular to the foaming direction where the cells are equiaxed (EQ direction). The results show that the compression responses for both EL and EQ directions are isotropic. Micrographs and X-ray micro-computed tomography show that the degree of geometry anisotropy is not strong enough to result in mechanical property anisotropy.

  20. Geometry anisotropy and mechanical property isotropy in titanium foam fabricated by replica impregnation method

    Energy Technology Data Exchange (ETDEWEB)

    Manonukul, Anchalee, E-mail: anchalm@mtec.or.th [National Metal and Materials Technology Center (MTEC), National Science and Technology Development Agency (NSTDA), 114 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Srikudvien, Pathompoom [National Metal and Materials Technology Center (MTEC), National Science and Technology Development Agency (NSTDA), 114 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Tange, Makiko [Taisei Kogyo Thailand Co., Ltd., Room INC2d-409, Innovation Cluster 2 Building, Tower D, 141 Thailand Science Park, Paholyothin Rd., Klong 1, Klong Luang, Pathumthani 12120 (Thailand); Puncreobutr, Chedtha [Department of Metallurgical Engineering, Faculty of Engineering, Chulalongkorn University, Pathumwan, Bangkok 10330 (Thailand)

    2016-02-08

    Polyurethane (PU) foams have both geometry and mechanical property anisotropy. Metal foams, which are manufactured by investment casting or melt deposition methods using PU foam as a template, also have mechanical property anisotropy. This work studied the mechanical properties in two directions of titanium foam with four different cell sizes fabricated using the replica impregnation method. The two directions are (1) the loading direction parallel to the foaming direction where the cells are elongated (EL direction) and (2) the loading direction perpendicular to the foaming direction where the cells are equiaxed (EQ direction). The results show that the compression responses for both EL and EQ directions are isotropic. Micrographs and X-ray micro-computed tomography show that the degree of geometry anisotropy is not strong enough to result in mechanical property anisotropy.

  1. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have...... proportions, to organize the process on site choosing either one-room wall components or several-room wall components – either horizontally or vertically. Combined with the seamless joint, playing with these possibilities means the new industrialized architecture can deliver variations in choice of solutions...... for retrofit design. If we add the question of the installations e.g. ventilation to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer story telling about new and smart system based thinking behind architectural expression....

  2. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have...... expression in the specific housing area. It is the aim of this article to expand the different design strategies which architects can use – to give the individual project attitudes and designs with architectural quality. Through the customized component production it is possible to choose different...... for retrofit design. If we add the question of the installations e.g. ventilation to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer story telling about new and smart system based thinking behind architectural expression....

  3. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    Architectural freedom and industrialized architecture. Inge Vestergaard, Associate Professor, Cand. Arch. Aarhus School of Architecture, Denmark Noerreport 20, 8000 Aarhus C Telephone +45 89 36 0000 E-mai l inge.vestergaard@aarch.dk Based on the repetitive architecture from the "building boom" 1960...... customization, telling exactly the revitalized storey about the change to a contemporary sustainable and better performed expression in direct relation to the given context. Through the last couple of years we have in Denmark been focusing a more sustainable and low energy building technique, which also include...... to the building physic problems a new industrialized period has started based on light weight elements basically made of wooden structures, faced with different suitable materials meant for individual expression for the specific housing area. It is the purpose of this article to widen up the different design...

  4. A New Electronic Commerce Architecture in the Cloud

    OpenAIRE

    Guigang Zhang; Chao Li; Sixin Xue; Yuenan Liu; Yong Zhang; Chunxiao Xing

    2012-01-01

    In this paper, the authors propose a new electronic commerce architecture in the cloud that satisfies the requirements of the cloud. This architecture includes five technologies, which are the massive EC data storage technology in the cloud, the massive EC data processing technology in the cloud, the EC security management technology in the cloud, OLAP technology for EC in the cloud, and active EC technology in the cloud. Finally, a detailed discussion of future trends for EC in the cloud env...

  5. A Multi-Agent System Architecture for Sensor Networks

    Directory of Open Access Journals (Sweden)

    María Guijarro

    2009-12-01

    Full Text Available The design of the control systems for sensor networks presents important challenges. Besides the traditional problems about how to process the sensor data to obtain the target information, engineers need to consider additional aspects such as the heterogeneity and high number of sensors, and the flexibility of these networks regarding topologies and the sensors in them. Although there are partial approaches for resolving these issues, their integration relies on ad hoc solutions requiring important development efforts. In order to provide an effective approach for this integration, this paper proposes an architecture based on the multi-agent system paradigm with a clear separation of concerns. The architecture considers sensors as devices used by an upper layer of manager agents. These agents are able to communicate and negotiate services to achieve the required functionality. Activities are organized according to roles related with the different aspects to integrate, mainly sensor management, data processing, communication and adaptation to changes in the available devices and their capabilities. This organization largely isolates and decouples the data management from the changing network, while encouraging reuse of solutions. The use of the architecture is facilitated by a specific modelling language developed through metamodelling. A case study concerning a generic distributed system for fire fighting illustrates the approach and the comparison with related work.

  6. A DRM Security Architecture for Home Networks

    NARCIS (Netherlands)

    Popescu, B.C.; Crispo, B.; Kamperman, F.L.A.J.; Tanenbaum, A.S.; Kiayias, A.; Yung, M.

    2004-01-01

    This paper describes a security architecture allowing digital rights management in home networks consisting of consumer electronic devices. The idea is to allow devices to establish dynamic groups, so called "Authorized Domains", where legally acquired copyrighted content can seamlessly move from

  7. Grid tied PV/battery system architecture and power management for fast electric vehicle charging

    Science.gov (United States)

    Badawy, Mohamed O.

    The prospective spread of electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs) raises the need for fast charging rates. Higher charging rate requirements lead to high power demands, which cannot always be supported by the grid. Thus, the use of on-site sources alongside the electrical grid for EV charging is a rising area of interest. In this dissertation, a photovoltaic (PV) source is used to support high power EV charging. However, the PV output power has an intermittent nature that is dependent on the weather conditions. Thus, battery storage is combined with the PV in a grid-tied system, providing a steady source for on-site EV use in a renewable energy based fast charging station. Verily, renewable energy based fast charging stations should be cost effective, efficient, and reliable to increase the penetration of EVs in the automotive market. Thus, this dissertation proposes a novel power flow management topology that aims at decreasing the running cost, along with innovative hardware solutions and control structures for the developed architecture. The developed power flow management topology operates the hybrid system at the minimum operating cost while extending the battery lifetime. An optimization problem is formulated and two stages of optimization, i.e., online and offline stages, are adopted to optimize the batteries' state of charge (SOC) scheduling and continuously compensate for the forecasting errors. The proposed power flow management topology is validated and tested with two metering systems, i.e., unified and dual metering systems. The results suggested that minimal power flow is anticipated from the battery storage to the grid in the dual metering system. Thus, the power electronic interfacing system is designed accordingly. Interconnecting bi-directional DC/DC converters are analyzed, and a cascaded buck-boost (CBB) converter is chosen and tested under 80 kW power flow rates.
The need to perform power factor correction (PFC) on
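    The offline SOC-scheduling idea can be caricatured by a greedy toy like the one below. All prices, capacities, and the single lossless battery model are invented; this is not the dissertation's two-stage optimizer, only an illustration of charging from PV surplus and discharging before buying from the grid:

```python
def schedule(pv, demand, price, capacity, soc0=0.0):
    """Greedy hourly schedule: returns (grid energy cost, per-slot plan)."""
    soc, cost, plan = soc0, 0.0, []
    for p, d, pr in zip(pv, demand, price):
        surplus = p - d
        if surplus >= 0:                      # charge battery with spare PV
            soc += min(surplus, capacity - soc)
            grid = 0.0
        else:                                 # discharge before buying from grid
            discharge = min(-surplus, soc)
            soc -= discharge
            grid = -surplus - discharge
            cost += grid * pr
        plan.append((round(soc, 3), round(grid, 3)))
    return cost, plan

cost, plan = schedule(pv=[5, 6, 1, 0], demand=[2, 2, 4, 3],
                      price=[0.1, 0.1, 0.3, 0.3], capacity=5.0)
print(cost)  # grid energy is bought only after the battery is empty
```

    A real scheduler would instead solve for the SOC trajectory that minimizes expected cost over the forecast horizon, then correct online for forecasting errors, as the abstract describes.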

  8. Precision of fit between implant impression coping and implant replica pairs for three implant systems.

    Science.gov (United States)

    Nicoll, Roxanna J; Sun, Albert; Haney, Stephan; Turkyilmaz, Ilser

    2013-01-01

    The fabrication of an accurately fitting implant-supported fixed prosthesis requires multiple steps, the first of which is assembling the impression coping on the implant. An imprecise fit of the impression coping on the implant will cause errors that will be magnified in subsequent steps of prosthesis fabrication. The purpose of this study was to characterize the 3-dimensional (3D) precision of fit between impression coping and implant replica pairs for 3 implant systems. The selected implant systems represent the 3 main joint types used in implant dentistry: external hexagonal, internal trilobe, and internal conical. Ten impression copings and 10 implant replicas from each of the 3 systems, B (Brånemark System), R (NobelReplace Select), and A (NobelActive) were paired. A standardized aluminum test body was luted to each impression coping, and the corresponding implant replica was embedded in a stone base. A coordinate measuring machine was used to quantify the maximum range of displacement in a vertical direction as a function of the tightening force applied to the guide pin. Maximum angular displacement in a horizontal plane was measured as a function of manual clockwise or counterclockwise rotation. Vertical and rotational positioning was analyzed by using 1-way analysis of variance (ANOVA). The Fisher protected least significant difference (PLSD) multiple comparisons test of the means was applied when the F-test in the ANOVA was significant (α=.05). The mean and standard deviation for change in the vertical positioning of impression copings was 4.3 ±2.1 μm for implant system B, 2.8 ±4.2 μm for implant system R, and 20.6 ±8.8 μm for implant system A. The mean and standard deviation for rotational positioning was 3.21 ±0.98 degrees for system B, 2.58 ±1.03 degrees for system R, and 5.30 ±0.79 degrees for system A. The P-value for vertical positioning between groups A and B and between groups A and R was <.001. No significant differences were found for
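    The one-way ANOVA used in this study reduces to an F statistic comparing between-group and within-group variance. A self-contained sketch follows; the displacement values are hypothetical stand-ins (the real study measured ten coping/replica pairs per system):

```python
def one_way_anova(*groups):
    """F statistic for a one-way ANOVA across k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical vertical displacements (micrometers) for systems B, R, A:
f = one_way_anova([4.0, 5.0, 4.0], [3.0, 3.0, 2.0], [20.0, 21.0, 21.0])
print(round(f, 1))  # 890.3 — a large F, consistent with system A differing
```

    A large F relative to the F(k-1, n-k) distribution is what licenses the post hoc Fisher PLSD comparisons mentioned in the abstract.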

  9. Storage system architectures and their characteristics

    Science.gov (United States)

    Sarandrea, Bryan M.

    1993-01-01

    Not all users' storage requirements call for 20 MB/s data transfer rates, multi-tier file or data migration schemes, or even automated retrieval of data. The number of available storage solutions reflects the broad range of user requirements. It is foolish to think that any one solution can address the complete range of requirements. For users with simple off-line storage requirements, the cost and complexity of high-end solutions would provide no advantage over a more simple solution. The correct answer is to match the requirements of a particular storage need to the various attributes of the available solutions. The goal of this paper is to introduce basic concepts of archiving and storage management in combination with the most common architectures and to provide some insight into how these concepts and architectures address various storage problems. The intent is to provide potential consumers of storage technology with a framework within which to begin the hunt for a solution which meets their particular needs. This paper is not intended to be an exhaustive study or to address all possible solutions or new technologies, but is intended to be a more practical treatment of today's storage system alternatives. Since most commercial storage systems today are built on Open Systems concepts, the majority of these solutions are hosted on the UNIX operating system. For this reason, some of the architectural issues discussed focus around specific UNIX architectural concepts. However, most of the architectures are operating system independent and the conclusions are applicable to such architectures on any operating system.

  10. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  11. Lightweight enterprise architectures

    CERN Document Server

    Theuerkorn, Fenix

    2004-01-01

    State of Architecture: Architectural Chaos; Relation of Technology and Architecture; The Many Faces of Architecture; The Scope of Enterprise Architecture; The Need for Enterprise Architecture; The History of Architecture; The Current Environment; Standardization Barriers; The Need for Lightweight Architecture in the Enterprise; The Cost of Technology; The Benefits of Enterprise Architecture; The Domains of Architecture; The Gap between Business and IT; Where Does LEA Fit?; LEA's Framework; Frameworks, Methodologies, and Approaches; The Framework of LEA; Types of Methodologies; Types of Approaches; Actual System Environmen

  12. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  13. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) in order for them to share data and control each other to improve existing processes for making people’s life better. IoT aims to connect all physical devices, like fridges, cars, utilities, buildings and cities, so that they can take advantage of small pieces of information collected by each one of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of various vendor support, connectivity options and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other's capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability, and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  14. Weighted Components of i-Government Enterprise Architecture

    Science.gov (United States)

    Budiardjo, E. K.; Firmansyah, G.; Hasibuan, Z. A.

    2017-01-01

    Lack of government performance is due, among other factors, to poor coordination and communication among government agencies. Enterprise Architecture (EA) in government can be used as a strategic planning tool to improve productivity, efficiency, and effectiveness. However, the existing components of Government Enterprise Architecture (GEA) do not indicate their level of importance, which causes difficulty in implementing good e-government for good governance. This study explores the weights of GEA components using Principal Component Analysis (PCA) in order to discover an inherent structure of e-government. The results show that the IT governance component plays a major role in the GEA. The remaining components, consisting of the e-government system, e-government regulation, e-government management, and application key operational, contribute more or less equally. Besides that, GEAs from other countries are analyzed comparatively based on common enterprise architecture components. These weighted components are used to construct the i-Government enterprise architecture and show the relative importance of each component in order to establish priorities in developing e-government.
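    Deriving component weights from PCA loadings can be sketched as below. The respondent-by-component score matrix is synthetic (only the component names come from the abstract), so the numbers illustrate the mechanics rather than the study's findings:

```python
import numpy as np

components = ["IT governance", "e-government system", "e-government regulation",
              "e-government management", "application key operational"]

# Synthetic survey scores; a real analysis would load the study's data.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
X[:, 0] += 2.0 * X[:, 1]          # give "IT governance" the dominant variance

Xc = X - X.mean(axis=0)           # centre each column
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]              # loadings of the first principal component
weights = np.abs(pc1) / np.abs(pc1).sum()

for name, w in sorted(zip(components, weights), key=lambda t: -t[1]):
    print(f"{name}: {w:.2f}")     # "IT governance" receives the largest weight
```

    Normalizing the absolute first-component loadings is one simple way to turn PCA output into weights; other conventions (e.g. variance-weighted sums over several components) are equally defensible.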

  15. An OER Architecture Framework: Need and Design

    Directory of Open Access Journals (Sweden)

    Pankaj Khanna

    2013-03-01

    Full Text Available This paper describes an open educational resources (OER architecture framework that would bring significant improvements in a well-structured and systematic way to the educational practices of distance education institutions of India. The OER architecture framework is articulated with six dimensions: pedagogical, technological, managerial, academic, financial, and ethical. These dimensions are structured with the component areas of relevance: IT infrastructure services, management support systems, open content development and maintenance, online teaching-learning, and learner assessment and evaluation of the OER architecture framework. An OER knowledge and information base, including a web portal, is proposed in the form of a series of knowledge repositories. This system would not only streamline the delivery of distance education but also would enhance the quality of distance learning through the development of high quality e-content, instructional processes, course/programme content development, IT infrastructure, and network systems. Thus the proposed OER architecture framework when implemented in the distance education system (DES of India would improve the quality of distance education and also increase its accessibility in a well-organised and structured way.

  16. Indigenous architecture as a context-oriented architecture, a look at ...

    African Journals Online (AJOL)

    What has become problematic as the achievement of international style and globalization of architecture during the time has been the purely technological look at architecture, and the architecture without belonging to a place. In recent decades, the topic of sustainable architecture and reconsidering indigenous architecture ...

  17. Developing Distributed System With Service Resource Oriented Architecture

    Directory of Open Access Journals (Sweden)

    Hermawan Hermawan

    2012-06-01

    Full Text Available Service Oriented Architecture is a design paradigm in software engineering with which a distributed system is built for an enterprise. This paradigm aims at providing the system as a service through a protocol in web service technology, namely the Simple Object Access Protocol (SOAP). However, SOA is limited to the service-level agreements of web services. For this reason, this research aims at combining SOA with Resource Oriented Architecture in order to expand the scalability of services. This combination creates Service Resource Oriented Architecture (SROA), with which a distributed system is developed that integrates services within project management software. Following this design, the software is developed according to a framework of Agile Model Driven Development which can reduce the complexities of the whole process of software development.

  18. Drone Defense System Architecture for U.S. Navy Strategic Facilities

    Science.gov (United States)

    2017-09-01

    Distribution unlimited. Project report: DRONE DEFENSE SYSTEM ARCHITECTURE FOR U.S. NAVY STRATEGIC FACILITIES, by David Arteche, Kenneth Chivers, Bryce Howard, Terrell Long, Walter...

  19. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Science.gov (United States)

    2010-07-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  20. Architecture in the Islamic Civilization: Muslim Building or Islamic Architecture

    OpenAIRE

    Yassin, Ayat Ali; Utaberta, Dr. Nangkula

    2012-01-01

    The main problem of the theory in the arena of islamic architecture is that it is affected by some Western thoughts, stereotyping islamic architecture according to Western thought; this leads to the breakdown of the foundations of islamic architecture. It is a myth that islamic architecture is subjected to the influence of foreign architectures. This paper will highlight the dialectical concept of islamic architecture or muslim buildings and the areas of recognition in islamic architec...

  1. Execution Management in the Virtual Ship Architecture Issue 1.00

    National Research Council Canada - National Science Library

    Cramp, Anthony

    2000-01-01

    The Virtual Ship is an application of the High Level Architecture (HLA) in which simulation models that represent the components of a warship are brought together in a distributed manner to create a virtual representation of a warship...

  2. A MultiAgent Architecture for Collaborative Serious Game applied to Crisis Management Training: Improving Adaptability of Non Player Characters

    Directory of Open Access Journals (Sweden)

    M’hammed Ali Oulhaci

    2014-05-01

    Full Text Available Serious Games (SG) are more and more used for training, as in the crisis management domain, where several hundred stakeholders can be involved, causing various organizational difficulties in field exercises. SGs' specific benefits include player immersion and detailed tracking of players' actions during a virtual exercise. Moreover, Non Player Characters (NPC) can adapt the crisis management exercise perimeter to the available stakeholders or to specific training objectives. In this paper we present a Multi-Agent System architecture supporting behavioural simulation as well as monitoring and assessment of human players. An NPC is enacted by a Game Agent which reproduces the behaviour of a human actor, based on a deliberative model (Belief-Desire-Intention). To facilitate scenario design, an Agent editor allows a designer to configure agents' behaviours. The behaviour simulation was implemented within the pre-existing SIMFOR project, a serious game for training in crisis management.
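    The deliberative (Belief-Desire-Intention) loop behind such a Game Agent can be sketched in a few lines. This is an illustrative toy, and the class, goal and belief names are hypothetical, not taken from SIMFOR:

    ```python
    # Minimal BDI-style agent loop for a non-player character (illustrative
    # sketch; all names are hypothetical, not the SIMFOR implementation).

    class BDIAgent:
        def __init__(self, desires):
            self.beliefs = {}          # what the agent currently holds true
            self.desires = desires     # (goal, precondition) pairs, by priority
            self.intention = None      # the goal currently being pursued

        def perceive(self, percept):
            # Update beliefs from the (simulated) environment.
            self.beliefs.update(percept)

        def deliberate(self):
            # Commit to the highest-priority desire whose precondition holds.
            for goal, precondition in self.desires:
                if all(self.beliefs.get(k) == v for k, v in precondition.items()):
                    self.intention = goal
                    return self.intention
            self.intention = None
            return None

    # Usage: a rescuer NPC that evacuates only once a victim is located.
    npc = BDIAgent(desires=[("evacuate_victim", {"victim_located": True}),
                            ("patrol", {})])
    npc.perceive({"victim_located": False})
    first = npc.deliberate()   # falls through to "patrol"
    npc.perceive({"victim_located": True})
    second = npc.deliberate()  # now commits to "evacuate_victim"
    ```

    An editor like the one described would then manipulate the `desires` table rather than the loop itself.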

  3. Replica exchange enveloping distribution sampling (RE-EDS): A robust method to estimate multiple free-energy differences from a single simulation.

    Science.gov (United States)

    Sidler, Dominik; Schwaninger, Arthur; Riniker, Sereina

    2016-10-21

    In molecular dynamics (MD) simulations, free-energy differences are often calculated using free energy perturbation or thermodynamic integration (TI) methods. However, both techniques are only suited to calculating free-energy differences between two end states. Enveloping distribution sampling (EDS) presents an attractive alternative that allows multiple free-energy differences to be calculated in a single simulation. In EDS, a reference state is simulated which "envelopes" the end states. The challenge of this methodology is the determination of optimal reference-state parameters to ensure equal sampling of all end states. Currently, the automatic determination of the reference-state parameters for multiple end states is an unsolved issue that limits the application of the methodology. To resolve this, we have generalised the replica-exchange EDS (RE-EDS) approach, introduced by Lee et al. [J. Chem. Theory Comput. 10, 2738 (2014)] for constant-pH MD simulations. By exchanging configurations between replicas with different reference-state parameters, the complexity of the parameter-choice problem can be substantially reduced. A new robust scheme to estimate the reference-state parameters from a short initial RE-EDS simulation with default parameters was developed, which allowed the calculation of 36 free-energy differences between nine small-molecule inhibitors of phenylethanolamine N-methyltransferase from a single simulation. The resulting free-energy differences were in excellent agreement with values obtained previously by TI and two-state EDS simulations.
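    The configuration swap at the heart of such a replica-exchange scheme follows the usual Metropolis criterion for Hamiltonian replica exchange. A minimal sketch with illustrative energies, making no claim to match the actual RE-EDS implementation:

    ```python
    # Metropolis swap criterion for Hamiltonian replica exchange (illustrative
    # sketch, not the RE-EDS code): replicas i and j, each with its own
    # reference-state parameters, attempt to exchange configurations x_i, x_j.
    import math
    import random

    def swap_probability(E_i_xi, E_i_xj, E_j_xi, E_j_xj, beta=1.0):
        """Acceptance probability for swapping configurations between replicas
        i and j; E_a_xb is the energy of configuration x_b evaluated with
        replica a's reference-state parameters (reduced units assumed)."""
        delta = beta * ((E_i_xj + E_j_xi) - (E_i_xi + E_j_xj))
        return min(1.0, math.exp(-delta))

    def attempt_swap(E_i_xi, E_i_xj, E_j_xi, E_j_xj, beta=1.0, rng=random.random):
        # Accept the exchange with the Metropolis probability.
        return rng() < swap_probability(E_i_xi, E_i_xj, E_j_xi, E_j_xj, beta)

    # A swap that lowers the combined energy is always accepted:
    p = swap_probability(E_i_xi=3.0, E_i_xj=1.0, E_j_xi=2.0, E_j_xj=4.0)  # 1.0
    ```

    Exchanging configurations this way lets poorly parameterized replicas borrow sampling from well-parameterized ones, which is why the parameter-choice problem becomes easier.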

  4. Novel approach to the fabrication of an artificial small bone using a combination of sponge replica and electrospinning methods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yang-Hee; Lee, Byong-Taek, E-mail: lbt@sch.ac.kr [Department of Biomedical Engineering and Materials, School of Medicine, Soonchunhyang University 366-1, Ssangyong-dong, Cheonan, Chungnam 330-090 (Korea, Republic of)

    2011-06-15

    In this study, a novel artificial small bone consisting of ZrO{sub 2}-biphasic calcium phosphate/polymethylmethacrylate-polycaprolactone-hydroxyapatite (ZrO{sub 2}-BCP/PMMA-PCL-HAp) was fabricated using a combination of sponge replica and electrospinning methods. To mimic the cancellous bone, the ZrO{sub 2}/BCP scaffold was composed of three layers, ZrO{sub 2}, ZrO{sub 2}/BCP and BCP, fabricated by the sponge replica method. The PMMA-PCL fibers loaded with HAp powder were wrapped around the ZrO{sub 2}/BCP scaffold using the electrospinning process. To imitate the Haversian canal region of the bone, HAp-loaded PMMA-PCL fibers were wrapped around a steel wire of 0.3 mm diameter. As a result, the bundles of fiber wrapped around the wires imitated the osteon structure of the cortical bone. Finally, the ZrO{sub 2}/BCP scaffold was surrounded by HAp-loaded PMMA-PCL composite bundles. After removal of the steel wires, the ZrO{sub 2}/BCP scaffold and bundles of HAp-loaded PMMA-PCL formed an interconnected structure resembling the human bone. Its diameter, compressive strength and porosity were approximately 12 mm, 5 MPa and 70%, respectively, and the viability of MG-63 osteoblast-like cells was determined to be over 90% by the MTT (3-(4, 5-dimethylthiazol-2-yl)-2, 5-diphenyltetrazolium bromide) assay. This artificial bone shows excellent cytocompatibility and is a promising bone regeneration material.

  5. Architecture of conference control functions

    Science.gov (United States)

    Kausar, Nadia; Crowcroft, Jon

    1999-11-01

    Conference control is an integral part in many-to-many communications that is used to manage and co-ordinate multiple users in conferences. There are different types of conferences which require different types of control. Some of the features of conference control may be user invoked while others are for internal management of a conference. In recent years, ITU (International Telecommunication Union) and IETF (Internet Engineering Task Force) have standardized two main models of conferencing, each system providing a set of conference control functionalities that are not easily provided in the other one. This paper analyzes the main activities appropriate for different types of conferences and presents an architecture for conference control called GCCP (Generic Conference Control Protocol). GCCP interworks different types of conferencing and provides a set of conference control functions that can be invoked by users directly. As an example of interworking, interoperation of IETF's SIP and ITU's H.323 call control functions have been examined here. This paper shows that a careful analysis of a conferencing architecture can provide a set of control functions essential for any group communication model that can be extensible if needed.

  6. Three-dimensional morphological characterization of the skin surface micro-topography using a skin replica and changes with age.

    Science.gov (United States)

    Masuda, Y; Oguri, M; Morinaga, T; Hirao, T

    2014-08-01

    Skin surface micro-topography (SSMT), consisting of pores, ridges and furrows, reflects the skin condition and is an important factor determining the aesthetics of the skin. Most previous studies evaluating SSMT have employed two-dimensional image analysis of magnified pictures captured by a video microscope. To improve the accuracy of SSMT analysis, we established a three-dimensional (3D) analysis method for SSMT, developed various parameters including the skin ridge number, and applied the method to study the age-dependent change in skin. Confocal laser scanning microscopy was used for 3D measurement of the surface morphology of silicone replicas taken from the cheek. We then used these data to calculate the parameters that reflect the nature of SSMT, including the skin ridge number, using originally developed software. Employing a superscription technique, we investigated the variation in SSMT with age for replicas taken from the cheeks of 103 Japanese females (5-85 years old). The skin surface area and roughness, the area of pores, the area, length, depth and width of skin furrows and the number of skin ridges were examined. The surface roughness, the area of pores and the depth of skin furrows increased with age. The area and length of skin furrows and the number of skin ridges decreased with age. The method proposed to analyse SSMT three-dimensionally is an effective tool with which to characterize the condition of the skin. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. The discipline of architecture and freedom of spirit: Architect Ivan Antić

    Directory of Open Access Journals (Sweden)

    Milašinović-Marić Dijana

    2005-01-01

    Full Text Available Each epoch has its distinct characters who, by their talent, creativity, knowledge and skills, manage to surmount the limiting context and stand out as individuals. One such creator is the architect Ivan Antić, Member of the Academy, who in his design projects, and with his distinct architectural credo, managed to express the ideals of the time in which he was creating. His buildings – masterpieces of Serbian modern architecture, e.g. the Museum of Modern Art at Ušće in Belgrade, the Museum for victims of execution in Šumarice, sport arenas, public and business edifices – all represent unique architectural achievements. These building forms are both functional and aesthetic, almost a sublimate of content and construction requirements.

  8. A resilient and secure software platform and architecture for distributed spacecraft

    Science.gov (United States)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, where information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objectives of this layer.

  9. Real-life IT architecture design reports and their relation to IEEE Std 1471 stakeholders and concerns

    NARCIS (Netherlands)

    van Vliet, H.; Koning, H.

    2006-01-01

    Architectural designs are an important means to manage the development and deployment of information technology (IT). Much debate has been going on about a proper definition of architecture in IT and about how to describe it. In 2000, the IEEE Std 1471 proposed a model of an architecture description

  10. Web-Based Course Management and Web Services

    Science.gov (United States)

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  11. An Enterprise Security Program and Architecture to Support Business Drivers

    OpenAIRE

    Brian Ritchot

    2013-01-01

    This article presents a business-focused approach to developing and delivering enterprise security architecture that is focused on enabling business objectives while providing a sensible and balanced approach to risk management. A balanced approach to enterprise security architecture can create the important linkages between the goals and objectives of a business, and it provides appropriate measures to protect the most critical assets within an organization while accepting risk where appropr...

  12. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  13. A proposal for an SDN-based SIEPON architecture

    Science.gov (United States)

    Khalili, Hamzeh; Sallent, Sebastià; Piney, José Ramón; Rincón, David

    2017-11-01

    Passive Optical Network (PON) elements such as the Optical Line Terminal (OLT) and Optical Network Units (ONUs) are currently managed by inflexible legacy network management systems. Software-Defined Networking (SDN) is a new networking paradigm that improves the operation and management of networks. In this paper, we propose a novel architecture, based on the SDN concept, for Ethernet Passive Optical Networks (EPON) that includes the Service Interoperability standard (SIEPON). In our proposal, the OLT is partially virtualized and some of its functionalities are allocated to the core network management system, while the OLT itself is replaced by an OpenFlow (OF) switch. A new MultiPoint MAC Control (MPMC) sublayer extension based on the OpenFlow protocol is presented. This would allow the SDN controller to manage and enhance resource utilization, flow monitoring, bandwidth assignment, quality-of-service (QoS) guarantees, and energy management of the optical access network, to name a few possibilities. The OpenFlow switch is extended with synchronous ports to retain the time-critical nature of the EPON network. OpenFlow messages are also extended with new functionalities to implement the concept of EPON Service Paths (ESPs). Our simulation-based results demonstrate the effectiveness of the new architecture, while retaining a similar (or improved) performance in terms of delay and throughput when compared to legacy PONs.
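    The match/action behaviour a controller-managed switch would apply to traffic on such service paths can be illustrated with a toy priority-ordered flow table. Field names, priorities and actions below are hypothetical, not taken from the SIEPON standard or the OpenFlow specification:

    ```python
    # Toy match/action flow table in the spirit of an OpenFlow switch serving
    # EPON Service Paths (illustrative only; fields and actions are made up).

    class FlowTable:
        def __init__(self):
            self.rules = []  # (priority, match_dict, action) tuples

        def add_rule(self, priority, match, action):
            self.rules.append((priority, match, action))
            self.rules.sort(key=lambda r: -r[0])  # highest priority wins

        def lookup(self, packet):
            # Return the action of the first (highest-priority) full match.
            for _, match, action in self.rules:
                if all(packet.get(k) == v for k, v in match.items()):
                    return action
            return "drop"  # table-miss behaviour chosen for this sketch

    table = FlowTable()
    # Controller installs an ESP: ONU 7's voice traffic gets the high-QoS queue.
    table.add_rule(priority=10, match={"onu": 7, "traffic": "voice"},
                   action="enqueue:0")
    table.add_rule(priority=1, match={"onu": 7}, action="enqueue:3")

    voice = table.lookup({"onu": 7, "traffic": "voice"})  # "enqueue:0"
    data = table.lookup({"onu": 7, "traffic": "data"})    # "enqueue:3"
    ```

    In the proposed architecture the controller, not the OLT, would install and update such rules, which is what makes bandwidth assignment and QoS centrally programmable.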

  14. Staged Event-Driven Architecture As A Micro-Architecture Of Distributed And Pluginable Crawling Platform

    Directory of Open Access Journals (Sweden)

    Leszek Siwik

    2013-01-01

    Full Text Available There are many crawling systems available on the market, but they are rather closed systems dedicated to performing particular kinds and classes of tasks with a predefined scope, strategy etc. In real life, however, there are meaningful groups of users (e.g. marketing, criminal or governmental analysts) requiring not just yet another crawling system dedicated to performing predefined tasks. They need an easy-to-use, user-friendly, all-in-one studio not only for executing and running internet robots and crawlers, but also for graphically (re)defining and (re)composing crawlers according to dynamically changing requirements and use-cases. To realize the above-mentioned idea, the Cassiopeia framework has been designed and developed. One has to remember, however, that the enormous size and structural complexity of the WWW are the reasons that, from a technical and architectural point of view, developing effective internet robots – and the more so developing a framework supporting graphical robot composition – becomes a really challenging task. The crucial aspect in the context of crawling efficiency and scalability is the concurrency model applied. There are two most typical concurrency management models, i.e. classical concurrency based on a pool of threads and processes, and event-driven concurrency. Neither of them is an ideal approach. That is why research on alternative models is still conducted to propose an efficient and convenient architecture for concurrent and distributed applications. One promising model is the staged event-driven architecture, mixing to some extent both of the above-mentioned classical approaches and providing some additional benefits, such as splitting the application into separate stages connected by event queues – which is interesting taking requirements about crawler (re)composition into account. The goal of this paper is to present the idea and the PoC implementation of the Cassiopeia framework, with the special
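    A staged event-driven design in miniature: each stage owns an input queue and a handler, and forwards events to the next stage's queue. The two-stage fetch/parse pipeline below is a hypothetical example, not Cassiopeia code:

    ```python
    # Minimal staged event-driven sketch (illustrative): stages are connected
    # only by queues, so they can be recomposed or scaled independently.
    import queue
    import threading

    class Stage:
        def __init__(self, handler, downstream=None):
            self.inbox = queue.Queue()
            self.handler = handler
            self.downstream = downstream

        def run(self):
            while True:
                event = self.inbox.get()
                if event is None:              # poison pill stops the stage
                    if self.downstream:
                        self.downstream.inbox.put(None)
                    break
                out = self.handler(event)
                if self.downstream:
                    self.downstream.inbox.put(out)

    results = []
    parse = Stage(handler=lambda page: results.append(page.upper()))
    fetch = Stage(handler=lambda url: f"page-of-{url}", downstream=parse)

    threads = [threading.Thread(target=s.run) for s in (fetch, parse)]
    for t in threads:
        t.start()
    for url in ("a", "b"):
        fetch.inbox.put(url)       # events enter the first stage's queue
    fetch.inbox.put(None)          # shut the pipeline down
    for t in threads:
        t.join()
    # results == ["PAGE-OF-A", "PAGE-OF-B"]
    ```

    The queue boundaries are exactly what makes graphical (re)composition attractive: swapping a stage means rewiring queues, not rewriting threads.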

  15. Trust-Management, Intrusion-Tolerance, Accountability, and Reconstitution Architecture (TIARA)

    Science.gov (United States)

    2009-12-01

    Keywords: tainting, tagged metadata, architecture, hardware, processor, microkernel, zero-kernel, co-design. ... microkernels (e.g., [27]) embraced the idea that it was beneficial to reduce the kernel, separating out services as separate processes isolated from ... limited adoption. More recently, Tanenbaum [72] notes the security virtues of microkernels and suggests the modern importance of security makes it

  16. Behavioural responses of dogs to asymmetrical tail wagging of a robotic dog replica.

    Science.gov (United States)

    Artelle, K A; Dumoulin, L K; Reimchen, T E

    2011-03-01

    Recent evidence suggests that bilateral asymmetry in the amplitude of tail wagging of domestic dogs (Canis familiaris) is associated with approach (right wag) versus withdrawal (left wag) motivation and may be the by-product of hemispheric dominance. We consider whether such asymmetry in motion of the tail, a crucial appendage in intra-specific communication in all canids, provides visual information to a conspecific leading to differential behaviour. To evaluate this, we experimentally investigated the approach behaviour of free-ranging dogs to the asymmetric tail wagging of a life-size robotic dog replica. Our data, involving 452 separate interactions, showed a significantly greater proportion of dogs approaching the model continuously without stopping when the tail wagged to the left, compared with a right wag, which was more likely to yield stops. While the results indicate that laterality of a wagging tail provides behavioural information to conspecifics, the responses are not readily integrated into the predicted behaviour based on hemispheric dominance.

  17. dSDiVN: a distributed Software-Defined Networking architecture for Infrastructure-less Vehicular Networks

    OpenAIRE

    Alioua, Ahmed; Senouci, Sidi-Mohammed; Moussaoui, Samira

    2017-01-01

    In the last few years, the emerging network architecture paradigm of Software-Defined Networking (SDN) has become one of the most important technologies for managing large-scale networks such as Vehicular Ad-hoc Networks (VANETs). Recently, several works have shown interest in the use of the SDN paradigm in VANETs. SDN brings flexibility, scalability and management facility to current VANETs. However, almost all proposed Software-Defined VANET (SDVN) architectures are infrastructure-based. This pa...

  18. Roofline model toolkit: A practical tool for architectural and program analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Yu Jung [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ligocki, Terry J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cordery, Matthew J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hall, Mary W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-18

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented microbenchmarks implemented with the Message Passing Interface (MPI) and OpenMP, used to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
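    The Roofline model itself reduces to a single bound: attainable performance is the lesser of the compute peak and memory bandwidth times arithmetic intensity. A sketch with made-up machine numbers:

    ```python
    # Roofline model: performance is capped either by the compute roof or by
    # the memory-bandwidth roof, whichever is lower at a kernel's intensity.

    def attainable_gflops(peak_gflops, bandwidth_gbs, flops, bytes_moved):
        ai = flops / bytes_moved              # arithmetic intensity (flop/byte)
        return min(peak_gflops, bandwidth_gbs * ai)

    # Illustrative machine: 200 GFLOP/s peak, 50 GB/s memory bandwidth.
    # A STREAM-like triad (2 flops per 24 bytes) is bandwidth-bound:
    triad = attainable_gflops(200.0, 50.0, flops=2, bytes_moved=24)   # ~4.17
    # A compute-heavy kernel (64 flops per 8 bytes) hits the compute roof:
    dense = attainable_gflops(200.0, 50.0, flops=64, bytes_moved=8)   # 200.0
    ```

    The toolkit's microbenchmarks supply measured values for the two roofs (per memory level) instead of the datasheet numbers assumed here.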

  19. DIRAC Data Management System

    CERN Document Server

    Smith, A C

    2007-01-01

    The LHCb experiment, being built to utilize CERN’s flagship Large Hadron Collider, will generate data to be analysed by a community of over 600 physicists worldwide. DIRAC, LHCb’s Workload and Data Management System, facilitates the use of underlying EGEE Grid resources to generate, process and analyse this data in the distributed environment. The Data Management System, presented here, provides real-time, data-driven distribution in accordance with LHCb’s Computing Model. The data volumes produced by the LHC experiments are unprecedented, rendering individual institutes, and even countries, unable to provide the computing and storage resources required to make full use of the produced data. EGEE Grid resources make the processing of LHCb data possible in a distributed fashion, and LHCb’s Computing Model is based on this approach. Data Management in this environment requires reliable and high-throughput transfer of data, homogeneous access to storage resources and the cataloguing of data replicas, all of...

  20. Implementation of Model View Controller (Mvc) Architecture on Building Web-based Information System

    OpenAIRE

    'Uyun, Shofwatul; Ma'arif, Muhammad Rifqi

    2010-01-01

    The purpose of this paper is to introduce the use of the MVC architecture in web-based information systems development. The MVC (Model-View-Controller) architecture is a way to decompose an application into three parts: model, view and controller. It was originally applied to the graphical user interaction model of input, processing and output. It is expected that by using the MVC architecture, applications can be built and maintained in a more modular, reusable, and easily migrated form. We have developed a management system of sch...

  1. IMPLEMENTATION OF MODEL VIEW CONTROLLER (MVC) ARCHITECTURE ON BUILDING WEB-BASED INFORMATION SYSTEM

    OpenAIRE

    'Uyun, Shofwatul; Ma'arif, Muhammad Rifqi

    2010-01-01

    The purpose of this paper is to introduce the use of the MVC architecture in web-based information systems development. The MVC (Model-View-Controller) architecture is a way to decompose an application into three parts: model, view and controller. It was originally applied to the graphical user interaction model of input, processing and output. It is expected that by using the MVC architecture, applications can be built and maintained in a more modular, reusable, and easily migrated form. We have developed a management system of sch...
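    The model/view/controller split described above can be sketched minimally; the student-registration names below are invented for illustration:

    ```python
    # Minimal Model-View-Controller decomposition (illustrative sketch):
    # the model holds data, the view renders it, the controller mediates input.

    class StudentModel:
        def __init__(self):
            self.students = []

        def add(self, name):
            self.students.append(name)

    class StudentView:
        def render(self, students):
            return "Students: " + ", ".join(students)

    class StudentController:
        def __init__(self, model, view):
            self.model, self.view = model, view

        def register(self, name):    # input handling lives in the controller
            self.model.add(name)

        def show(self):              # the view never touches input directly
            return self.view.render(self.model.students)

    app = StudentController(StudentModel(), StudentView())
    app.register("Ana")
    app.register("Budi")
    page = app.show()   # "Students: Ana, Budi"
    ```

    The maintainability claim in the abstract follows from this separation: a new view (say, JSON instead of text) replaces one class without touching the model or controller.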

  2. Theoretical Perspectives of Enterprise Architecture for Technological Transformation

    DEFF Research Database (Denmark)

    Tambo, Torben

    2017-01-01

    The purpose of this article is to investigate the completeness of the theoretical foundations of Enterprise Architecture (EA) by reviewing four selected disciplines from Management of Technology (MOT). Often theory on EA is based on prior EA contributions or more distant contributions such as service science, semiotics, psycho-social constructs, business process analytics, and systems science. It is here argued that other theories might be more supportive to EA. The current article is based on a review of the MOT literature and a subsequent literature review within each of the four specialized...

  3. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix-transpose application in CUDA. One particular interest was to research how well the optimization techniques, applied to software written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480, compared with the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to our best knowledge) tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales with these software performance-improving techniques.
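    The core idea behind an optimized transpose kernel, processing the matrix in small tiles so that reads and writes each touch a compact working set, can be illustrated in pure Python. CUDA specifics such as shared-memory staging and coalescing are omitted; this is only the tiling pattern:

    ```python
    # Tiled transpose of a flat row-major n x n matrix (illustrative sketch of
    # the blocking idea used by optimized GPU transpose kernels).

    def transpose_tiled(src, n, tile=32):
        dst = [0] * (n * n)
        for bi in range(0, n, tile):
            for bj in range(0, n, tile):
                # Within one tile, both src reads and dst writes stay local.
                for i in range(bi, min(bi + tile, n)):
                    for j in range(bj, min(bj + tile, n)):
                        # element (i, j) of src lands at (j, i) of dst
                        dst[j * n + i] = src[i * n + j]
        return dst

    # 4x4 example with 2x2 tiles:
    m = transpose_tiled(list(range(16)), n=4, tile=2)
    ```

    On a GPU, the tile would first be staged through shared memory so that both the global-memory read and write are coalesced; the loop structure is the same.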

  4. Open architecture design and approach for the Integrated Sensor Architecture (ISA)

    Science.gov (United States)

    Moulton, Christine L.; Krzywicki, Alan T.; Hepp, Jared J.; Harrell, John; Kogut, Michael

    2015-05-01

    Integrated Sensor Architecture (ISA) is designed in response to stovepiped integration approaches. The design, based on the principles of Service Oriented Architectures (SOA) and Open Architectures, addresses the problem of integration, and is not designed for specific sensors or systems. The use of SOA and Open Architecture approaches has led to a flexible, extensible architecture. Using these approaches, and supported with common data formats, open protocol specifications, and Department of Defense Architecture Framework (DoDAF) system architecture documents, an integration-focused architecture has been developed. ISA can help move the Department of Defense (DoD) from costly stovepipe solutions to a more cost-effective plug-and-play design to support interoperability.

  5. Agent-Oriented Privacy-Based Information Brokering Architecture for Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Abdulmutalib Masaud-Wahaishi

    2009-01-01

    Full Text Available The healthcare industry is facing a major reform at all levels: locally, regionally, nationally, and internationally. Healthcare services and systems have become very complex and comprise a vast number of components (software systems, doctors, patients, etc.) that are characterized by shared, distributed and heterogeneous information sources with a variety of clinical and other settings. The challenge now faced in decision making and management of care is to operate effectively in order to meet the information needs of healthcare personnel. Currently, researchers, developers, and systems engineers are working toward achieving better efficiency and quality of service in various sectors of healthcare, such as hospital management, patient care, and treatment. This paper presents a novel information brokering architecture that supports privacy-based information gathering in healthcare. Architecturally, the brokering is viewed as a layer of services where a brokering service is modeled as an agent with a specific architecture and interaction protocol that are appropriate to serve various requests. Within the context of brokering, we model privacy in terms of an entity's ability to hide or reveal information related to its identities, requests, and/or capabilities. A prototype of the proposed architecture has been implemented to support information-gathering capabilities in healthcare environments using the FIPA-compliant platform JADE.

  6. Agent-oriented privacy-based information brokering architecture for healthcare environments.

    Science.gov (United States)

    Masaud-Wahaishi, Abdulmutalib; Ghenniwa, Hamada

    2009-01-01

    The healthcare industry is facing a major reform at all levels: locally, regionally, nationally, and internationally. Healthcare services and systems have become very complex and comprise a vast number of components (software systems, doctors, patients, etc.) that are characterized by shared, distributed and heterogeneous information sources with a variety of clinical and other settings. The challenge now faced in decision making and management of care is to operate effectively in order to meet the information needs of healthcare personnel. Currently, researchers, developers, and systems engineers are working toward achieving better efficiency and quality of service in various sectors of healthcare, such as hospital management, patient care, and treatment. This paper presents a novel information brokering architecture that supports privacy-based information gathering in healthcare. Architecturally, the brokering is viewed as a layer of services where a brokering service is modeled as an agent with a specific architecture and interaction protocol that are appropriate to serve various requests. Within the context of brokering, we model privacy in terms of an entity's ability to hide or reveal information related to its identities, requests, and/or capabilities. A prototype of the proposed architecture has been implemented to support information-gathering capabilities in healthcare environments using the FIPA-compliant platform JADE.
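    The privacy notion described, an entity choosing which of its identity, request, and capability attributes to reveal, can be sketched as a broker that masks hidden attributes before forwarding a request. Every name below is hypothetical; this is not the JADE-based prototype's API:

    ```python
    # Illustrative broker that forwards a request while hiding attributes the
    # requester marked private (identity, request details, capabilities).

    def brokered_view(request, hidden):
        """Return the request as seen by a provider: attributes in `hidden`
        are replaced by an opaque marker held only by the broker."""
        return {k: ("<withheld>" if k in hidden else v)
                for k, v in request.items()}

    request = {"identity": "clinic-42",
               "request": "patient history, diabetes cohort",
               "capability": "read-only"}

    # The requester reveals its capability but hides who it is:
    provider_view = brokered_view(request, hidden={"identity"})
    ```

    In the layered-brokering view, the broker agent would keep the withheld attributes itself so it can still route replies back to the anonymous requester.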

  7. Safety-Critical Partitioned Software Architecture: A Partitioned Software Architecture for Robotic

    Science.gov (United States)

    Horvath, Greg; Chung, Seung H.; Cilloniz-Bicchi, Ferner

    2011-01-01

    The flight software on virtually every mission currently managed by JPL has several major flaws that make it vulnerable to potentially fatal software defects. Many of these problems can be addressed by recently developed partitioned operating systems (OS). JPL has avoided adopting a partitioned operating system on its flight missions, primarily because doing so would require significant changes in flight software design, and the risks associated with changes of that magnitude cannot be accepted by an active flight project. The choice of a partitioned OS can have a dramatic effect on the overall system and software architecture, allowing for realization of benefits far beyond the concerns typically associated with the choice of OS. Specifically, we believe that a partitioned operating system, when coupled with an appropriate architecture, can provide a strong infrastructure for developing systems for which reusability, modifiability, testability, and reliability are essential qualities. By adopting a partitioned OS, projects can gain benefits throughout the entire development lifecycle, from requirements and design, all the way to implementation, testing, and operations.

  8. Impact of airborne particle size, acoustic airflow and breathing pattern on delivery of nebulized antibiotic into the maxillary sinuses using a realistic human nasal replica.

    Science.gov (United States)

    Leclerc, Lara; Pourchez, Jérémie; Aubert, Gérald; Leguellec, Sandrine; Vecellio, Laurent; Cottier, Michèle; Durand, Marc

    2014-09-01

    Improvement of clinical outcome in patients with sinuses disorders involves targeting delivery of nebulized drug into the maxillary sinuses. We investigated the impact of nebulization conditions (with and without 100 Hz acoustic airflow), particle size (9.9 μm, 2.8 μm, 550 nm and 230 nm) and breathing pattern (nasal vs. no nasal breathing) on enhancement of aerosol delivery into the sinuses using a realistic nasal replica developed by our team. After segmentation of the airways by means of high-resolution computed tomography scans, a well-characterized nasal replica was created using a rapid prototyping technology. A total of 168 intrasinus aerosol depositions were performed with changes of aerosol particle size and breathing patterns under different nebulization conditions using gentamicin as a marker. The results demonstrate that the fraction of aerosol deposited in the maxillary sinuses is enhanced by use of submicrometric aerosols, e.g. 8.155 ± 1.476 mg/L of gentamicin in the left maxillary sinus for the 2.8 μm particles vs. 2.056 ± 0.0474 for the 550 nm particles. Utilization of 100-Hz acoustic airflow nebulization also produced a 2- to 3-fold increase in drug deposition in the maxillary sinuses (e.g. 8.155 ± 1.476 vs. 3.990 ± 1.690 for the 2.8 μm particles). Our study clearly shows that optimum deposition was achieved using submicrometric particles and 100-Hz acoustic airflow nebulization with no nasal breathing. It is hoped that our new respiratory nasal replica will greatly facilitate the development of more effective delivery systems in the future.

  9. Information Systems’ Portfolio: Contributions of Enterprise and Process Architecture

    Directory of Open Access Journals (Sweden)

    Silvia Fernandes

    2017-09-01

    Full Text Available We are witnessing a need for a quick and intelligent reaction from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the biggest disruptions comes from online and/or mobile apps and platforms. These are having a tremendous impact by launching innovative and competitive services through the combination of digital and physical features. This leads organizations to actively rethink their enterprise information systems' portfolio, its management, and its suitability. One relevant way for enterprises to manage their IT/IS in order to cope with these challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements inter-relate inside and outside it. IT/IS portfolio management requires increasing modeling of data and process flows for better discernment and action in their selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. It has to be open, agile, and context-aware to allow well-designed services that match users' expectations. This study includes two real cases/problems to be solved quickly in companies, and solutions are presented in line with this architectural approach.

  10. HBIM and Virtual Tools: A New Chance to Preserve Architectural Heritage

    Directory of Open Access Journals (Sweden)

    Anna Osello

    2018-01-01

    Full Text Available Nowadays, architectural heritage is increasingly exposed to danger from natural disasters or invasive human actions. Management and conservation represent crucial phases within the life cycle of historical buildings. Unfortunately, the complexity of conservation practices and the lack of knowledge of historic buildings cause an inefficient recovery process in case of emergencies. To overcome this problem, this research aims to ensure the preservation of relevant information through the use of the building information modeling (BIM) methodology. By developing historic building information models (HBIMs), it is possible to enhance the architectural heritage. This represents an opportunity to incorporate digital media into the global heritage conservation field. To achieve this goal, a historical castle was selected as a case study; this unique piece of architecture is located in the Piedmont region, close to the city of Turin (Italy). The results show a direct relation between a historical digital model, aimed at the management of architectural and system components, and visualization tools. In conclusion, the adoption of this strategy is an effective way to preserve and consult information using advanced visualization techniques based on augmented and virtual reality (AR and VR).

  11. Nuclear architecture and landscape: the power plant creates the site

    International Nuclear Information System (INIS)

    Parent, Claude; Bouvier, Yves

    2005-01-01

    The implementation, from 1974, of the French nuclear programme was accompanied by an 'Architecture Plan' requested by Michel Hug, Equipment Manager at the power utility EDF. The objective was to create an architectural language specific to nuclear power. Far from trying to hide the nuclear power stations, the nuclear architecture college instead designed a set of ambitious and powerful forms. Systematically paired with a landscape designer and a colourist, the architect sought to make the best possible use of the potential of each site. The power station should not blend into the landscape but, on the contrary, participate in the creation of a new landscape

  12. Analysis of Architecture Pattern Usage in Legacy System Architecture Documentation

    NARCIS (Netherlands)

    Harrison, Neil B.; Avgeriou, Paris

    2008-01-01

    Architecture patterns are an important tool in architectural design. However, while many architecture patterns have been identified, there is little in-depth understanding of their actual use in software architectures. For instance, there is no overview of how many patterns are used per system or

  13. Measurements of hadron yields from the T2K replica target in the NA61/SHINE experiment for neutrino flux prediction in T2K

    CERN Document Server

    AUTHOR|(CDS)2086777

    T2K is an accelerator-based long-baseline neutrino experiment in Japan. The main goal of the T2K experiment is a search for CP violation in the lepton sector by measuring electron (anti)neutrino appearance in a muon (anti)neutrino beam. The initial (anti)neutrino flux is produced in decays of hadrons which originate from the interactions and re-interactions of a $30\:$GeV proton beam with a $90\:$cm long graphite target. Knowledge of the T2K neutrino flux is limited by large hadron production uncertainties. To address this problem, a series of hadron production measurements was performed in the NA61/SHINE experiment at CERN. Measurements were performed with a proton beam and two target types: a thin graphite target and a replica of the T2K target. The work presented in this thesis concentrates on the T2K replica target data taken in 2010 and on the development of the analysis and calibration software. The aim of these measurements is to fully constrain production of $\pi^+$, $\pi^-$, $K^+$, $K^-$ and $p$ coming from t...

  14. On the Architectural Engineering Competences in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    2007-01-01

    In 1997 a new education in Architecture & Design at Department of Architecture and Design, Aalborg University was started with 50 students. During the recent years this number has increased to approximately 100 new students each year, i.e. approximately 500 students are following the 3 years...... bachelor (BSc) and the 2 years master (MSc) programme. The first 5 semesters are common for all students followed by 5 semesters with specialization into Architectural Design, Urban Design, Industrial Design or Digital Design. The present paper gives a short summary of the architectural engineering...

  15. Agent-based Personal Network (PN) service architecture

    DEFF Research Database (Denmark)

    Jiang, Bo; Olesen, Henning

    2004-01-01

    In this paper we propose a new concept for a centralized agent system as the solution for the PN service architecture, which aims to efficiently control and manage the PN resources and enable the PN based services to run seamlessly over different networks and devices. The working principle...

  16. Measurements of charged pion differential yields from the surface of the T2K replica target for incoming 31 GeV/c protons with the NA61/SHINE spectrometer at the CERN SPS

    CERN Document Server

    Abgrall, N.; Ajaz, M.; Ali, Y.; Andronov, E.; Anticic, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blümer, J.; Bogomilov, M.; Brandin, A.; Bravar, A.; Brzychczyk, J.; Bunyatov, S.A.; Busygina, O.; Christakoglou, P.; Cirkovic, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Deveaux, M.; Diakonos, F.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Feofilov, G.A.; Fodor, Z.; Garibov, A.; Gazdzicki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hervé, A.E.; Hierholzer, M.; Igolkin, S.; Ivashkin, A.; Johnson, S.R.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kisiel, J.; Kobayashi, T.; Kolesnikov, V.I.; Kolev, D.; Kondratiev, V.P.; Korzenev, A.; Kowalik, K.; Kowalski, S.; Koziel, M.; Krasnoperov, A.; Kuich, M.; Kurepin, A.; Larsen, D.; László, A.; Lewicki, M.; Lyubushkin, V.V.; Mackowiak-Pawłowska, M.; Maksiak, B.; Malakhov, A.I.; Manic, D.; Marcinek, A.; Marino, A.D.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G.L.; Messerly, B.; Mills, G.B.; Morozov, S.; Mrówczynski, S.; Nagai, Y.; Nakadaira, T.; Naskret, M.; Nirkko, M.; Nishikawa, K.; Panagiotou, A.D.; Paolone, V.; Pavin, M.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Popov, B.A.; Posiadała-Zezula, M.; Puławski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Was, E.; Robert, A.; Röhrich, D.; Rondio, E.; Roth, M.; Rubbia, A.; Rumberger, B.T.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Sarnecki, R.; Schmidt, K.; Sekiguchi, T.; Selyuzhenkov, I.; Seryakov, A.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Słodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Ströbele, H.; Šuša, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tefelska, A.; Tefelski, D.; Tereshchenko, V.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V.V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszynski, O.; Yarritu, K.; Zambelli, L.; Zimmerman, E.D.; Friend, M.; Galymov, V.; Hartz, M.; Hiraki, T.; Ichikawa, A.; Kubo, H.; Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Tzanov, M.; Yu, M.

    2016-01-01

    Measurements of particle emission from a replica of the T2K 90 cm-long carbon target were performed in the NA61/SHINE experiment at the CERN SPS, using data collected during a high-statistics run in 2009. An efficient use of the long-target measurements for neutrino flux predictions in T2K requires dedicated reconstruction and analysis techniques. Fully corrected differential yields of charged pions from the surface of the T2K replica target for incoming 31 GeV/c protons are presented. A possible strategy for implementing these results into the T2K neutrino beam predictions is discussed, and the propagation of the uncertainties of these results to the final neutrino flux is performed.

  17. Advanced parallel processing with supercomputer architectures

    International Nuclear Information System (INIS)

    Hwang, K.

    1987-01-01

    This paper investigates advanced parallel processing techniques and innovative hardware/software architectures that can be applied to boost the performance of supercomputers. Critical issues on architectural choices, parallel languages, compiling techniques, resource management, concurrency control, programming environment, parallel algorithms, and performance enhancement methods are examined and the best answers are presented. The authors cover advanced processing techniques suitable for supercomputers, high-end mainframes, minisupers, and array processors. The coverage emphasizes vectorization, multitasking, multiprocessing, and distributed computing. In order to achieve these operation modes, parallel languages, smart compilers, synchronization mechanisms, load balancing methods, mapping parallel algorithms, operating system functions, application library, and multidiscipline interactions are investigated to ensure high performance. At the end, they assess the potentials of optical and neural technologies for developing future supercomputers
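    One of the load-balancing ideas surveyed above—handing each task to the next free processor rather than pre-partitioning the work—can be sketched in a few lines of Python. The helper name `parallel_map` is illustrative, not from the paper:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def parallel_map(fn, items, workers=4):
        """Apply fn to each item using a pool of workers.

        The executor performs dynamic load balancing: each idle worker
        pulls the next pending task, so uneven task costs do not leave
        some workers stalled while others are overloaded.
        """
        with ThreadPoolExecutor(max_workers=workers) as pool:
            # Executor.map preserves the input order of results.
            return list(pool.map(fn, items))

    squares = parallel_map(lambda x: x * x, range(8))
    ```

    A process pool (or vectorized array operations, for the paper's vectorization mode) would follow the same pattern; a thread pool keeps the sketch self-contained.
    
    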

  18. Human skulls with turquoise inlays: pre-Hispanic origin or replicas?

    International Nuclear Information System (INIS)

    Silva V, Y.; Castillo M, M.T.; Bautista M, J.P.; Arenas A, J.

    2006-01-01

    The lack of an archaeological context for determining whether the manufacture of two human skulls adorned with turquoise inlays is of pre-Columbian origin or not (replicas) led to further studies. Under these conditions, besides the orthodox methodology commonly used to assign chronology and cultural aspects such as form, style, decoration, iconography, etc., it was necessary to obtain further results based on the use of characterization techniques. The techniques employed were Scanning Electron Microscopy (SEM), X-Ray Energy Dispersive Spectroscopy (EDS), Transmission Electron Microscopy (TEM) and Fourier Transform Infrared Spectroscopy (FTIR), in order to determine the manufacturing techniques and the chemical composition of the materials used for the cementant. SEM analysis showed the presence of zones composed of Ca, O, C and Al. In some cases Mg, Cl, Fe and Pb were identified. A high concentration of Cu was present in all samples, due to residues of the turquoise inlays (CuAl6(PO4)4(OH)8·4H2O) with which the skulls were decorated. In the cementant, Ca was identified as the base element, along with particles < 100 nm with irregular morphology and other amorphous zones. FTIR spectra indicated the presence of organic substances that could have been used as binder in the cementant. The current work shows progress in identifying the techniques involved in the manufacture of two human skulls with turquoise inlays. (Author)

  19. Replica Exchange Simulations of the Thermodynamics of Aβ Fibril Growth

    Science.gov (United States)

    Takeda, Takako; Klimov, Dmitri K.

    2009-01-01

    Replica exchange molecular dynamics and an all-atom implicit solvent model are used to probe the thermodynamics of deposition of Alzheimer's Aβ monomers on preformed amyloid fibrils. Consistent with the experiments, two deposition stages have been identified. The docking stage occurs over a wide temperature range, starting with the formation of the first peptide-fibril interactions at 500 K. Docking is completed when a peptide fully adsorbs on the fibril edge at the temperature of 380 K. The docking transition appears to be continuous, and occurs without free energy barriers or intermediates. During docking, incoming Aβ monomer adopts a disordered structure on the fibril edge. The locking stage occurs at the temperature of ≈360 K and is characterized by the rugged free energy landscape. Locking takes place when incoming Aβ peptide forms a parallel β-sheet structure on the fibril edge. Because the β-sheets formed by locked Aβ peptides are typically off-registry, the structure of the locked phase differs from the structure of the fibril interior. The study also reports that binding affinities of two distinct fibril edges with respect to incoming Aβ peptides are different. The peptides bound to the concave edge have significantly lower free energy compared to those bound on the convex edge. Comparison with the available experimental data is discussed. PMID:19167295
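    The replica exchange method used in this study periodically attempts to swap configurations between replicas simulated at different temperatures, accepting each swap with a Metropolis criterion. A minimal sketch of that acceptance test (the function name and interface are illustrative, not taken from the paper):

    ```python
    import math
    import random

    def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random.random):
        """Metropolis acceptance test for exchanging the configurations of
        two replicas held at inverse temperatures beta_i and beta_j, whose
        current potential energies are energy_i and energy_j.

        The swap is accepted with probability
        min(1, exp[(beta_i - beta_j) * (energy_i - energy_j)]),
        which preserves the Boltzmann distribution at every temperature.
        """
        delta = (beta_i - beta_j) * (energy_i - energy_j)
        return delta >= 0 or rng() < math.exp(delta)

    # A swap that moves the lower-energy configuration to the colder
    # replica (larger beta) is always accepted:
    accepted = attempt_swap(2.0, 1.0, 3.0, 1.0)  # delta = +2 -> accept
    ```

    Swaps between neighboring temperatures let low-temperature replicas escape local minima, which is what makes the temperature range (here up to 500 K) an effective sampling tool rather than just a thermal scan.
    
    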

  20. Launch Vehicle Control Center Architectures

    Science.gov (United States)

    Watson, Michael D.; Epps, Amy; Woodruff, Van; Vachon, Michael Jacob; Monreal, Julio; Williams, Randall; McLaughlin, Tom

    2014-01-01

    This analysis is a survey of the control center architectures of the NASA Space Launch System (SLS), United Launch Alliance (ULA) Atlas V and Delta IV, and the European Space Agency (ESA) Ariane 5. Each of these control center architectures has similarities in basic structure and differences in the functional distribution of responsibilities across the phases of operations: (a) launch vehicles in the international community vary greatly in configuration and process; (b) each launch site has a unique processing flow based on its specific configuration; (c) launch and flight operations are managed through a set of control centers associated with each launch site, though flight operations may be run from a different control center than the launch center; and (d) the engineering support centers are primarily located at the design center, with a small engineering support team at the launch site.