WorldWideScience

Sample records for information processing architecture

  1. Photonic Architecture for Scalable Quantum Information Processing in Diamond

    Directory of Open Access Journals (Sweden)

    Kae Nemoto

    2014-08-01

    Full Text Available Physics and information are intimately connected, and the ultimate information processing devices will be those that harness the principles of quantum mechanics. Many physical systems have been identified as candidates for quantum information processing, but none of them are immune from errors. The challenge remains to find a path from the experiments of today to a reliable and scalable quantum computer. Here, we develop an architecture based on a simple module comprising an optical cavity containing a single negatively charged nitrogen vacancy center in diamond. Modules are connected by photons propagating in a fiber-optical network and collectively used to generate a topological cluster state, a robust substrate for quantum information processing. In principle, all processes in the architecture can be deterministic, but current limitations lead to processes that are probabilistic but heralded. We find that the architecture enables large-scale quantum information processing with existing technology.
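
The "probabilistic but heralded" processes mentioned in the abstract can be illustrated with a toy repeat-until-success loop: a failed attempt is detected by the herald and simply retried. The success probability below is an arbitrary stand-in for illustration, not a figure from the paper.

```python
import random

def heralded_link(rng, p_success=0.3, max_attempts=1000):
    """One repeat-until-success linking sequence: each attempt either
    succeeds or fails, and a herald reports which, so failed attempts
    are detected and retried rather than silently corrupting the state."""
    for attempt in range(1, max_attempts + 1):
        if rng.random() < p_success:
            return attempt  # herald fired: link established on this attempt
    return None  # attempt budget exhausted

rng = random.Random(0)
attempts = [heralded_link(rng) for _ in range(5000)]
mean = sum(attempts) / len(attempts)
# For a geometric process the expected number of attempts is 1/p (about 3.33 here).
print(round(mean, 2))
```

The key point the abstract makes is that heralding turns an unreliable elementary operation into a reliable one at the cost of repetition, which is why the architecture can tolerate probabilistic inter-module links.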

  2. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  4. Information architecture: Standards adoption and retirement process service action plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this Service Action Plan is to announce, as well as provide, a high-level outline of a new Departmental process for the adoption and retirement of information technology standards. This process supports the implementation of a Department of Energy (DOE) Information Architecture. This plan was prepared with the Department of Energy information technology standards customers and stakeholders in mind. The process described in this plan will be serviced primarily by staff from the Office of the Deputy Assistant Secretary for Information Management with assistance from designated program and site Information Technology Standards Points of Contact. We welcome any comments regarding this new Departmental process and encourage the proposal of information technology standards for adoption or retirement.

  6. Information Systems’ Portfolio: Contributions of Enterprise and Process Architecture

    Directory of Open Access Journals (Sweden)

    Silvia Fernandes

    2017-09-01

    Full Text Available We are witnessing a need for quick and intelligent reactions from organizations to the level and speed of change in business processes. New information technologies and systems (IT/IS) are challenging business models and products. One of the great shake-ups comes from online and/or mobile apps and platforms. These are having a tremendous impact by launching innovative and competitive services through the combination of digital and physical features. This leads enterprises to actively rethink their information systems' portfolio, its management and its suitability. One relevant way for enterprises to manage their IT/IS in order to cope with these challenges is enterprise and process architecture. A decision-making culture based on processes helps to understand and define the different elements that shape an organization and how those elements interrelate inside and outside it. IT/IS portfolio management increasingly requires modeling data and process flows for better discernment and action in their selection and alignment with business goals. The new generation of enterprise architecture (NGEA) helps to design intelligent processes that answer quickly and creatively to new and challenging trends. It has to be open, agile and context-aware to allow well-designed services that match users' expectations. This study includes two real cases/problems to be solved quickly in companies, and solutions are presented in line with this architectural approach.

  7. Integrated optics architecture for trapped-ion quantum information processing

    Science.gov (United States)

    Kielpinski, D.; Volin, C.; Streed, E. W.; Lenzini, F.; Lobino, M.

    2016-12-01

    Standard schemes for trapped-ion quantum information processing (QIP) involve the manipulation of ions in a large array of interconnected trapping potentials. The basic set of QIP operations, including state initialization, universal quantum logic, and state detection, is routinely executed within a single array site by means of optical operations, including various laser excitations as well as the collection of ion fluorescence. Transport of ions between array sites is also routinely carried out in microfabricated trap arrays. However, it is still not possible to perform optical operations in parallel across all array sites. The lack of this capability is one of the major obstacles to scalable trapped-ion QIP and presently limits exploitation of current microfabricated trap technology. Here we present an architecture for scalable integration of optical operations in trapped-ion QIP. We show theoretically that diffractive mirrors, monolithically fabricated on the trap array, can efficiently couple light between trap array sites and optical waveguide arrays. Integrated optical circuits constructed from these waveguides can be used for sequencing of laser excitation and fluorescence collection. Our scalable architecture supports all standard QIP operations, as well as photon-mediated entanglement channels, while offering substantial performance improvements over current techniques.

  8. Future of information architecture

    CERN Document Server

    Baofu, Peter

    2008-01-01

    The Future of Information Architecture examines issues surrounding why information has been processed, stored and applied in the way that it has, since time immemorial. Contrary to the conventional wisdom held by many scholars in human history, the recurrent debate on the explanation of the most basic categories of information (e.g., space, time, causation, quality, quantity) has been misconstrued, to the effect that there exist some deeper categories and principles behind these categories of information, with enormous implications for our understanding of reality in general. To understand this, the b

  9. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers. This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  10. Reframing information architecture

    CERN Document Server

    Resmini, Andrea

    2014-01-01

    Information architecture has changed dramatically since the mid-1990s, and earlier conceptions of the world and the internet as different and separate have given way to a much more complex scenario in the present day. In the post-digital world that we now inhabit, the digital and the physical blend easily, and our activities and usage of information take place across multiple contexts, via multiple devices and through unstable, emergent choreographies. Information architecture is now steadily growing into a channel- or medium-specific multi-disciplinary framework, with contributions coming from a

  11. An architecture for distributed real-time large-scale information processing for intelligence analysis

    Science.gov (United States)

    Santos, Eugene, Jr.; Santos, Eunice E.; Santos, Eugene S.

    2004-04-01

    Given a massive and dynamic space of information (nuggets) and a query to be answered, how can the correct (answer) nuggets be retrieved in an effective and efficient manner? We present a large-scale distributed real-time architecture based on anytime intelligent foraging, gathering, and matching (I-FGM) over massive and dynamic information spaces. Simply put, we envision that when given a search query, large numbers of computational processes are alerted or activated in parallel to begin identifying and retrieving the appropriate information nuggets. In particular, our approach aims to provide an anytime capability which functions as follows: given finite computational resources, I-FGM will proceed to explore the information space and, over time, continuously identify and update promising candidate nuggets; thus, good candidates will be available at any time on request. Given the computational costs of evaluating the relevance of a candidate nugget, the anytime nature of I-FGM provides increasing confidence in nugget selections over time by providing admissible partial evaluations. When a new promising candidate is identified, the current set of selected nuggets is re-evaluated and updated appropriately. Essentially, I-FGM guides its finite computational resources toward locating the target information nuggets quickly and iteratively over time. In addition, I-FGM is designed to naturally handle new nuggets as they appear. A central element of our framework is a formal computational model of this massive, data-intensive problem.
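
The anytime behavior described in this abstract (cheap partial evaluations refined over time, with good candidates available whenever the caller asks) can be sketched roughly in Python. The class name, toy corpus and scoring functions below are hypothetical illustrations, not the authors' implementation.

```python
import heapq

class AnytimeForager:
    """Toy anytime candidate tracker: every nugget gets a cheap partial
    score up front; compute is then spent refining the most promising
    ones, so a 'best so far' answer is available at any time."""

    def __init__(self, nuggets, partial_score, refine):
        self._refine = refine           # one unit of costlier evaluation
        # Max-heap on the partial score (negated for heapq's min-heap).
        self._heap = [(-partial_score(n), n) for n in nuggets]
        heapq.heapify(self._heap)
        self._final = {}                # nugget -> refined score

    def step(self):
        """Spend one unit of computation on the most promising candidate."""
        if self._heap:
            _, nugget = heapq.heappop(self._heap)
            self._final[nugget] = self._refine(nugget)

    def best(self, k=3):
        """Anytime query: current top-k using the best scores we have."""
        scored = dict(self._final)
        for neg, n in self._heap:
            scored.setdefault(n, -neg)  # fall back to the partial score
        return sorted(scored, key=scored.get, reverse=True)[:k]

# Hypothetical corpus: score a nugget by its overlap with query terms.
query = {"quantum", "architecture"}
docs = {"d1": {"quantum", "architecture", "ion"},
        "d2": {"quantum"},
        "d3": {"history", "museum"}}
forager = AnytimeForager(
    docs,
    partial_score=lambda n: len(docs[n] & query),         # cheap estimate
    refine=lambda n: len(docs[n] & query) / len(docs[n]))  # costlier score
forager.step()          # refine the single most promising nugget
print(forager.best(2))  # good candidates are available at any point
```

The design choice mirrors the abstract: partial evaluations keep every nugget rankable from the start, and each additional unit of work only sharpens the ranking, so interrupting the process early still yields a usable answer.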

  12. Advanced information processing system: The Army Fault-Tolerant Architecture detailed design overview

    Science.gov (United States)

    Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven

    1994-01-01

    The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap-of-the-Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service in flight control, sensor data fusion, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. It contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, architecture, hardware design, operating system design, system performance measurements, and analytical models.

  13. Aspects of Information Architecture involved in process mapping in Military Organizations under the semiotic perspective

    Directory of Open Access Journals (Sweden)

    Mac Amaral Cartaxo

    2016-04-01

    Full Text Available Introduction: The description of processes used to represent activities in an organization has an important semiotic dimension, visible in the flowcharts, management reports and other forms of representation of the strategies used. The subsequent interpretation, by the organization's employees, of the tasks involved and of the symbols used to translate the meanings of management practices plays an essential role for the organization. Objective: The objective of this study was to identify conceptual and empirical evidence on aspects of information architecture involved in process mapping carried out in military organizations, from a semiotic perspective. Methodology: The research is characterized as a qualitative case study, and the data collection technique was the semi-structured interview, applied to management advisors. Results: The main results indicate that management practices described with the use of pictorial symbols and varied layouts have greater impact in explaining the relevance of management practices and indicators. Conclusion: With regard to semiotic appeal, it was found that the impact of a management report is significant due to the use of signs and layouts that invite further reading by simplifying complex concepts into tables and diagrams that summarize lengthy descriptions.

  14. Development of a multitechnology FPGA: a reconfigurable architecture for photonic information processing

    Science.gov (United States)

    Mal, Prosenjit; Toshniwal, Kavita; Hawk, Chris; Bhadri, Prashant R.; Beyette, Fred R., Jr.

    2004-06-01

    Over the years, Field Programmable Gate Arrays (FPGAs) have made a profound impact on the electronics industry, with rapidly improving semiconductor manufacturing technology ranging from sub-micron to deep sub-micron processes and equally innovative CAD tools. Though the FPGA has revolutionized programmable/reconfigurable digital logic technology, one limitation of current FPGAs is that the user is limited to strictly electronic designs. Thus, they are not suitable for applications that are not purely electronic, such as optical communications, photonic information processing systems and other multi-technology applications (e.g., analog devices, MEMS devices and microwave components). In recent years, the growing trend has been toward the incorporation of non-traditional device technologies into traditional CMOS VLSI systems. The integration of these technologies requires a new kind of FPGA that can merge conventional FPGA technology with photonic and other multi-technology devices. The proposed new class of field programmable device will extend the flexibility, rapid prototyping and reusability benefits associated with conventional electronics into the photonic and multi-technology domain, and give rise to the development of a wider class of programmable and embedded integrated systems. This new technology will create a tremendous opportunity for applying conventional programmable/reconfigurable hardware concepts in other disciplines such as photonic information processing. To substantiate this novel architectural concept, we have fabricated proof-of-concept CMOS VLSI multi-technology FPGA (MT-FPGA) chips that include both digital field programmable logic blocks and threshold-programmable photoreceivers suitable for sensing optical signals. Results from these chips strongly support the feasibility of this new optoelectronic device concept.

  15. Information architecture. Volume 4: Vision

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The Vision document marks the transition from definition to implementation of the Department of Energy (DOE) Information Architecture Program. A description of the possibilities for the future, supported by actual experience with a process model and tool set, points toward implementation options. The directions for future information technology investments are discussed. Practical examples of how technology answers the business and information needs of the organization through coordinated and meshed data, applications, and technology architectures are related. This document is the fourth and final volume in the planned series for defining and exhibiting the DOE information architecture. The targeted scope of this document includes DOE Program Offices, field sites, contractor-operated facilities, and laboratories. This document paints a picture of how, over the next 7 years, technology may be implemented, dramatically improving the ways business is conducted at DOE. While technology is mentioned throughout this document, the vision is not about technology. The vision concerns the transition afforded by technology and the process steps to be completed to ensure alignment with business needs. This goal can be met if those directing the changing business and mission-support processes understand the capabilities afforded by architectural processes.

  16. Enterprise Information Technology Architectures

    Science.gov (United States)

    2000-05-01

    ARCHITECTURES. OPR: HQ AFCA/ITAI (Mr. Kenneth Fore). Certified by: HQ USAF/SCXX (Lt Col Terry G. Pricer, Sr.). Pages: 15. Distribution: F. ... coordinating their drafts with Headquarters Air Force Communications Agency (HQ AFCA/ITAI), 203 W. Losey Street, Room 1065, Scott AFB IL 62225-5224. Send ... Communications and Information Center (HQ AFCIC/ITAI), 1250 Air Force Pentagon, Washington DC 20330-1250. Use AF Form 847, Recommendation for Change

  17. Information architecture. Volume 3: Guidance

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    The purpose of this document, as presented in Volume 1, The Foundations, is to assist the Department of Energy (DOE) in developing and promulgating information architecture guidance. This guidance is aimed at increasing the development of information architecture as a Departmentwide management best practice. This document describes departmental information architecture principles and minimum design characteristics for systems and infrastructures within the DOE Information Architecture Conceptual Model, and establishes a Departmentwide standards-based architecture program. The publication of this document fulfills the commitment to address guiding principles, promote standard architectural practices, and provide technical guidance. This document guides the transition from the baseline, or de facto, Departmental architecture through approved information management program plans and budgets to the future vision architecture. This document also represents another major step toward establishing a well-organized, logical foundation for the DOE information architecture.

  18. Ontology-driven health information systems architectures.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank

    2009-01-01

    Following an architecture vision such as the Generic Component Model (GCM) architecture framework, health information systems for supporting personalized care have to be based on a component-oriented architecture. Representing concepts and their interrelations, the GCM perspectives (system architecture, domains, and development process) can be described by the domains' ontologies. The paper introduces ontology principles and ontology references to the GCM, as well as some practical aspects of ontology-driven approaches to semantically interoperable and sustainable health information systems.

  19. Information architecture. Volume 1, The foundations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    The Information Management Planning and Architecture Coordinating Team was formed to establish an information architecture framework to meet DOE's current and future information needs. This department-wide activity was initiated in accordance with the DOE Information Management Strategic Plan; it also supports the Departmental Strategic Plan. It recognizes recent changes in emphasis as reflected in OMB Circular A-130 and the Information Resources Management Planning Process Improvement Team recommendations. The sections of this document provide the foundation for establishing DOE's Information Architecture: Background, Business Case (reduced duplication of effort, increased integration of activities, improved operational capabilities), Baseline (technology baseline currently in place within DOE), Vision (guiding principles for the future DOE Information Architecture), Standards Process, Policy and Process Integration (describing relations between the information architecture and business processes), and Next Steps. Following each section is a scenario. A glossary of terms is provided.

  20. Information architecture for an economy of information

    Directory of Open Access Journals (Sweden)

    Mac Amaral Cartaxo

    2017-04-01

    Full Text Available Introduction: Information is understood as a basic raw material on which decision-making and learning processes in the organizational context depend. Thus information, rather than being merely a premise of the economy, is actually more than that, to the extent that it is desired by all economic agents. Objective: The objective of this study was to identify conceptual evidence of the heterogeneity of information as an important phenomenon that produces a significant impact on market equilibrium and the welfare of economic agents. Methodology: The research is characterized as a literature review in the field of information science, specifically with respect to information architecture, contrasted with economics to establish a parallel with the so-called information economy. Results: The main results indicate that an information architecture that makes information readily visible and provides easy access to the knowledge sought by those who demand it, owing to its clarity, connectivity, and content contextualized to users' level of knowledge, enables organizations to maximize the effectiveness of their decisions. Conclusion: With regard to the use of an information architecture within an information economy, consumers increase their level of utility as information translated into cheap, accessible knowledge reduces the degree of dispersion and price volatility.

  1. Advanced information processing system: The Army fault tolerant architecture conceptual study. Volume 2: Army fault tolerant architecture design and analysis

    Science.gov (United States)

    Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.

    1992-01-01

    Described here is the Army Fault Tolerant Architecture (AFTA) hardware architecture and components and the operating system. The architectural and operational theory of the AFTA Fault Tolerant Data Bus is discussed. The test and maintenance strategy developed for use in fielded AFTA installations is presented. An approach to be used in reducing the probability of AFTA failure due to common mode faults is described. Analytical models for AFTA performance, reliability, availability, life cycle cost, weight, power, and volume are developed. An approach is presented for using VHSIC Hardware Description Language (VHDL) to describe and design AFTA's developmental hardware. A plan is described for verifying and validating key AFTA concepts during the Dem/Val phase. Analytical models and partial mission requirements are used to generate AFTA configurations for the TF/TA/NOE and Ground Vehicle missions.

  2. ARCHITECTURE INFORMS HISTORY

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Clusters of ancient architecture in central China have recently been entered on the world heritage list. A group of ancient architecture in Dengfeng, central China's Henan Province, was added to the world heritage list at the 34th session of the World Heritage Committee in Brazil on August 1 this year. The architectural collection is China's 39th property inscribed on the list, and the third world heritage site in the province after the Longmen Grottoes and Yinxu in Anyang, site of the capital of the late Shang Dynasty (1600-1046 B.C.).

  3. ARCHITECTURE INFORMS HISTORY

    Institute of Scientific and Technical Information of China (English)

    ZAN JIFANG

    2010-01-01

    A group of ancient architecture in Dengfeng, central China's Henan Province, was added to the world heritage list at the 34th session of the World Heritage Committee in Brazil on August 1 this year. The architectural collection is China's 39th property inscribed on the list, and the third world heritage site in the province after the Longmen Grottoes and Yinxu in Anyang, site of the capital of the late Shang Dynasty (1600-1046 B.C.).

  4. The Impact of Building Information Modeling on the Architectural Design Process

    Science.gov (United States)

    Moreira, P. F.; Silva, Neander F.; Lima, Ecilamar M.

    Many benefits of Building Information Modeling (BIM) have been suggested by several authors and by software vendors. In this paper we describe an experiment in which two groups of designers were observed developing an assigned design task. One group used a BIM system, while the other used a standard computer-aided drafting system. The results show that some of the promises of BIM hold true, such as consistency maintenance and error avoidance in the design documentation process. Other promises, such as changing the design process itself, also seemed plausible, but more research is needed to determine the depth of such changes.

  5. On Information System Security Architecture

    Institute of Scientific and Technical Information of China (English)

    Chunfang Jiang; Chaoyuan Yue; Jianguo Zuo

    2004-01-01

    The current studies on security architecture and information system security architecture (ISSA) are surveyed in this paper, and some types of ISSA and their features are discussed. Then the structural elements of ISSA are analyzed, and steps for constructing an ISSA are proposed.

  6. Assured information flow capping architecture

    Science.gov (United States)

    Black, M. D.; Carvin, N. A.

    1985-05-01

    The Tactical Air Control System (TACS) is the set of Tactical Air Force assets used to assess the air and ground situation, and to plan, allocate, commit, and control assigned resources. Previous studies noted that the TACS elements should be more highly distributed to improve survivability on the battlefield of the future. This document reports the results of the Assured Information Flow Capping Architecture study, which developed governing concepts for communications architectures that can support the information flow requirements of a future, distributed TACS. Architectures comprising existing and planned communications equipment were postulated and compared with a set of goals to identify deficiencies. Architectures using new equipment that resolve many of the deficiencies were then postulated, and areas needing further investigation were identified.

  7. RASSP signal processing architectures

    Science.gov (United States)

    Shirley, Fred; Bassett, Bob; Letellier, J. P.

    1995-06-01

    display. This paper discusses the impact of simulation on choosing signal processing algorithms and architectures, drawing from the experiences of the Demonstration and Benchmark inter-company teams at Lockheed Sanders, Motorola, Hughes, and ISX.

  8. The Architectural Information Map: Semantic modeling in conceptual architectural design

    NARCIS (Netherlands)

    Tunçer, E.B.

    2009-01-01

    This research focuses on the acquisition, representation, sharing and reuse of design information and knowledge in the conceptual phase of architectural design, and targets the creation of situated digital environments where communities of architectural practice communicate and collaborate using thi

  10. Information Architectures for Information Sharing Management — A Literature Review

    OpenAIRE

    Shuyan Xie; Markus Helfert; Lukasz Ostrowski

    2012-01-01

    The struggle for commercial supremacy through information is being fought on two fronts: information management and enabling technologies. Over the last years, there has been an increasing focus on information architecture (IA) to help organisations distinguish and manage information as a corporate resource. As information complexity increases, more IA studies show that IA could provide the structural and process design to facilitate enterprise interoperation under the information sharing e...

  11. Enterprise Information Architecture for Mission Development

    Science.gov (United States)

    Dutra, Jayne

    2007-01-01

    This slide presentation reviews the concept of an information architecture to assist in mission development. The integrated information architecture will create a unified view of the information using metadata and values (i.e., a taxonomy).

  13. The NASA Integrated Information Technology Architecture

    Science.gov (United States)

    Baldridge, Tim

    1997-01-01

    This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. 
    They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context

  14. Extensible packet processing architecture

    Science.gov (United States)

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signal to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
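
    The claim-and-mark scheme described above can be sketched in a few lines. This is a minimal illustration, not the patented design: the per-engine capacity used as the claiming policy, and all class and field names, are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Packet:
    flow_id: int
    payload: str
    claimed_by: Optional[int] = None  # the mark: which engine owns this flow

class Engine:
    """One processing engine on the serial flow-through data bus."""
    def __init__(self, engine_id: int, capacity: int,
                 process: Callable[[str], str]):
        self.engine_id = engine_id
        self.capacity = capacity   # hypothetical claiming policy: max flows
        self.process = process
        self.claimed = set()       # flows this engine has claimed

    def handle(self, packet: Packet) -> Packet:
        # Claim a still-unclaimed flow if we have capacity left.
        if packet.claimed_by is None and len(self.claimed) < self.capacity:
            self.claimed.add(packet.flow_id)
        # Mark and process every packet of a flow we own, then pass it on.
        if packet.flow_id in self.claimed:
            packet.claimed_by = self.engine_id
            packet.payload = self.process(packet.payload)
        return packet

def run_bus(engines, packets):
    """Send each packet through the engines in series, as on the bus."""
    out = []
    for p in packets:
        for e in engines:
            p = e.handle(p)
        out.append(p)
    return out
```

    Each engine marks the packets of the flows it owns, so downstream engines skip them; which engine claims a new flow falls out of the (assumed) capacity rule.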

  15. Extensible packet processing architecture

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signal to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.

  16. Pacific Missile Test Center Information Resources Management Organization (code 0300): The ORACLE client-server and distributed processing architecture

    Energy Technology Data Exchange (ETDEWEB)

    Beckwith, A. L.; Phillips, J. T.

    1990-06-10

    Computing architectures using distributed processing and distributed databases are increasingly considered acceptable solutions for advanced data processing systems. This is occurring even though there is still considerable professional debate as to what "truly" distributed computing actually is, and despite the relative lack of advanced relational database management software (RDBMS) capable of meeting database and system integrity requirements for developing reliable integrated systems. This study investigates the functionality of ORACLE database management software performing distributed processing between a MicroVAX/VMS minicomputer and three MS-DOS-based microcomputers. The ORACLE database resides on the MicroVAX and is accessed from the microcomputers with ORACLE SQL*NET, DECnet, and ORACLE PC TOOL PACKS. Data gathered during the study reveal a demonstrable decrease in CPU demand on the MicroVAX, due to "distributed processing", when the ORACLE PC Tools are used to access the database, as opposed to database access from "dumb" terminals. Also discovered were several hardware/software constraints that must be considered in implementing various software modules. The results of the study indicate that this distributed data processing architecture is becoming sufficiently mature and reliable, and should be considered for developing applications that reduce processing on central hosts. 33 refs., 2 figs.

  17. Information Systems for Enterprise Architecture

    Directory of Open Access Journals (Sweden)

    Oswaldo Moscoso Zea

    2014-03-01

    Full Text Available (Received: 2014/02/14 - Accepted: 2014/03/25) Enterprise Architecture (EA) has emerged as one of the most important topics in Information Systems studies and has grown into an essential business management activity for visualizing and evaluating the future direction of a company. Nowadays there are several software tools on the market that support Enterprise Architects in working with EA. In order to decrease the risk of purchasing software tools that do not fulfill stakeholders' needs, it is important to assess the software before making an investment. In this paper a literature review of the state of the art of EA is carried out. Furthermore, evaluation initiatives and existing information systems are analyzed which can support decision makers in choosing the appropriate software tools for their companies.

  18. The methodic of information transport from the business architecture to the software architecture

    Directory of Open Access Journals (Sweden)

    Ivana Rábová

    2004-01-01

    Full Text Available The creation of the business architecture is one of the fundamental activities in business process reengineering and also in the development of a software product that supports the optimized business processes. One part of the application development life cycle is the modeling of the software architecture (the functions and data of the application). There are two different models: the model of the business process and the model of the information system. This article deals with the value of using the same notation (such as UML) for visualizing both models, and it gives guidelines for the information transfer from the business architecture to the software architecture.

  19. Information Architecture Process of Digital Library

    Institute of Scientific and Technical Information of China (English)

    刘颖

    2011-01-01

    Information architecture, as a new theory of information management, has been more and more applied in digital library construction and provides new ideas for it. The information architecture process of the digital library can be divided into three steps: investigation and analysis, model design, and operation testing; each step is analyzed in detail.

  20. Information architecture: Profile of adopted standards

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    The Department of Energy (DOE), like other Federal agencies, is under increasing pressure to use information technology to improve efficiency in mission accomplishment as well as delivery of services to the public. Because users and systems have become interdependent, DOE has enterprise-wide needs for common application architectures, communication networks, databases, security, and management capabilities. Users need open systems that provide interoperability of products and portability of people, data, and applications that are distributed throughout heterogeneous computing environments. The level of interoperability necessary requires the adoption of DOE-wide standards, protocols, and best practices. The Department has developed an information architecture and a related standards adoption and retirement process to assist users in developing strategies and plans for acquiring information technology products and services based upon open systems standards that support application software interoperability, portability, and scalability. This set of Departmental Information Architecture standards represents guidance for achieving higher degrees of interoperability within the greater DOE community, business partners, and stakeholders. While these standards are not mandatory, due consideration of their application in contractual matters and their use in technology implementations Department-wide are goals of the Chief Information Officer.

  1. Deep Space Network information system architecture study

    Science.gov (United States)

    Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.

    1992-01-01

    The purpose of this article is to describe an architecture for the DSN information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990s. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies--i.e., computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.

  2. The Architecture of Information at Plateau Beaubourg

    Science.gov (United States)

    Branda, Ewan Edward

    2012-01-01

    During the course of the 1960s, computers and information networks made their appearance in the public imagination. To architects on the cusp of architecture's postmodern turn, information technology offered new forms, metaphors, and techniques by which modern architecture's technological and utopian basis could be reasserted. Yet by the…

  5. GPGPU Processing in CUDA Architecture

    CERN Document Server

    Ghorpade, Jayshree; Kulkarni, Madhura; Bawaskar, Amit

    2012-01-01

    The future of computation is the Graphical Processing Unit, i.e. the GPU. The promise that the graphics cards have shown in the field of image processing and accelerated rendering of 3D scenes, and the computational capability that these GPUs possess, mean they are developing into great parallel computing units. It is quite simple to program a graphics processor to perform general parallel tasks. But after understanding the various architectural aspects of the graphics processor, it can be used to perform other taxing tasks as well. In this paper, we will show how CUDA can fully utilize the tremendous power of these GPUs. CUDA is NVIDIA's parallel computing architecture. It enables dramatic increases in computing performance by harnessing the power of the GPU. This paper talks about CUDA and its architecture. It takes us through a comparison of CUDA C/C++ with other parallel programming languages like OpenCL and DirectCompute. The paper also lists out the common myths about CUDA and how the future seems to be promising for CUDA.

  6. The sustainable IT architecture resilient information systems

    CERN Document Server

    Bonnet, P

    2009-01-01

    This book focuses on Service Oriented Architecture (SOA), the basis of sustainable and more agile IT systems that are able to adapt themselves to new trends and manage processes involving a third party. The discussion is based on the public Praxeme method and features a number of examples taken from large SOA projects which were used to rewrite the information systems of an insurance company; as such, decision-makers, creators of IT systems, programmers and computer scientists, as well as those who will use these new developments, will find this a useful resource.

  7. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  9. Information System Architectures: Representation, Planning and Evaluation

    Directory of Open Access Journals (Sweden)

    André Vasconcelos

    2003-12-01

    Full Text Available In recent years organizations have been faced with increasingly demanding business environments - pushed by factors like market globalization, the need for product and service innovation, and product life cycle reduction - and with new information technology changes and opportunities - such as the component-off-the-shelf paradigm, telecommunications improvements, or the availability of off-the-shelf Enterprise System modules - all of which impose a continuous redraw and reorganization of business strategies and processes. Nowadays, Information Technology makes possible high-speed, efficient and low-cost access to enterprise information, providing the means for business process automation and improvement. In spite of these important technological advances, the information systems that support business do not usually respond efficiently enough to the continuous demands that organizations are faced with, causing non-alignment between business and information technologies (IT) and therefore reducing organizations' competitive abilities. This article discusses the vital role that the definition of an Information System Architecture (ISA) has in the development of Enterprise Information Systems that are capable of staying fully aligned with organization strategy and business needs. In this article the authors propose a restricted collection of founding and basis operations, which will provide the conceptual paradigm and tools for proper ISA handling. These tools are then used in order to represent, plan and evaluate the ISA of a Financial Group.

  10. Towards architectural information in implementation (NIER track)

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2011-01-01

    Agile development methods favor speed and feature producing iterations. Software architecture, on the other hand, is ripe with techniques that are slow and not oriented directly towards implementation of costumers’ needs. Thus, there is a major challenge in retaining architectural information...

  11. Ontology Processing Architecture for Chinese Healthcare Informative Data Set

    Institute of Scientific and Technical Information of China (English)

    刘晖; 林欣; 韦俊银

    2009-01-01

    Since the healthcare information data set does not support semantic integration, automatic processing, or reasoning, this paper proposes a Knowledge Base (KBS) processing architecture based on an ontology database, with Description Logics as its mathematical foundation. It brings the digitalization of healthcare information into the track of knowledge engineering and provides a large, normative, dependable, and maintainable KBS. Using the data set of pre-marriage medical examinations as a reference case, the results show that the processing architecture can implement automatic processing and reasoning of knowledge.
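
    As a rough illustration of the kind of automatic reasoning such an ontology-backed knowledge base enables, the sketch below checks concept subsumption over a toy is-a hierarchy. The concept names and links are invented; a real Description Logics reasoner over the healthcare data set would be far richer.

```python
# Hypothetical mini-taxonomy for a health-examination data set; each entry
# declares one direct is-a (subclass) link.
SUBCLASS_OF = {
    "PreMaritalExam": "MedicalExam",
    "MedicalExam": "HealthRecord",
    "LabResult": "HealthRecord",
}

def is_subsumed_by(concept, ancestor):
    """Decide concept <= ancestor by walking the declared is-a links."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS_OF.get(concept)  # step up one level, if any
    return False
```

    A query engine built on this idea can answer, automatically, that a pre-marital examination record is also a health record, without that fact being stated anywhere in the data set.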

  12. GPGPU PROCESSING IN CUDA ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Jayshree Ghorpade

    2012-02-01

    Full Text Available The future of computation is the Graphical Processing Unit, i.e. the GPU. The promise that the graphics cards have shown in the field of image processing and accelerated rendering of 3D scenes, and the computational capability that these GPUs possess, they are developing into great parallel computing units. It is quite simple to program a graphics processor to perform general parallel tasks. But after understanding the various architectural aspects of the graphics processor, it can be used to perform other taxing tasks as well. In this paper, we will show how CUDA can fully utilize the tremendous power of these GPUs. CUDA is NVIDIA's parallel computing architecture. It enables dramatic increases in computing performance, by harnessing the power of the GPU. This paper talks about CUDA and its architecture. It takes us through a comparison of CUDA C/C++ with other parallel programming languages like OpenCL and DirectCompute. The paper also lists out the common myths about CUDA and how the future seems to be promising for CUDA.

  13. ADaPPT: Enterprise Architecture Thinking for Information Systems Development

    Directory of Open Access Journals (Sweden)

    Hanifa Shah

    2011-01-01

    Full Text Available Enterprises have architecture: whether it is visible or invisible is another matter. An enterprise's architecture determines the way in which it works to deliver its business objectives and the way in which it can change to continue to meet its evolving business objectives. Enterprise architectural thinking can facilitate effective strategic planning and information systems development. This paper reviews enterprise architecture (EA) and its concepts. It briefly considers EA frameworks. It describes the ADaPPT (Aligning Data, People, Processes and Technology) EA approach as a means of managing organisational complexity and change. Future research directions are discussed.

  14. Algorithms, architectures and information systems security

    CERN Document Server

    Sur-Kolay, Susmita; Nandy, Subhas C; Bagchi, Aditya

    2008-01-01

    This volume contains articles written by leading researchers in the fields of algorithms, architectures, and information systems security. The first five chapters address several challenging geometric problems and related algorithms. These topics have major applications in pattern recognition, image analysis, digital geometry, surface reconstruction, computer vision and in robotics. The next five chapters focus on various optimization issues in VLSI design and test architectures, and in wireless networks. The last six chapters comprise scholarly articles on information systems security.

  15. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.
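
    A data warehouse answers such executive queries by aggregating a fact table along its dimensions. The sketch below shows an OLAP-style roll-up in miniature; the table, field names, and figures are invented for illustration and stand in for a real OLAP engine.

```python
from collections import defaultdict

# Illustrative fact table (rows a data warehouse might hold); the fields
# and revenue figures are invented for the example.
FACTS = [
    {"region": "South", "quarter": "Q1", "revenue": 120.0},
    {"region": "South", "quarter": "Q2", "revenue": 80.0},
    {"region": "North", "quarter": "Q1", "revenue": 200.0},
    {"region": "North", "quarter": "Q2", "revenue": 150.0},
]

def rollup(facts, dimension, measure="revenue"):
    """Aggregate a measure along one dimension (an OLAP 'roll-up')."""
    totals = defaultdict(float)
    for row in facts:
        totals[row[dimension]] += row[measure]
    return dict(totals)
```

    The same fact table serves both "revenue by region" and "revenue by quarter", which is the agility the transaction systems in the study could not offer.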

  16. Jupiter Europa Orbiter Architecture Definition Process

    Science.gov (United States)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  17. Information Architecture: The Data Warehouse Foundation.

    Science.gov (United States)

    Thomas, Charles R.

    1997-01-01

    Colleges and universities are initiating data warehouse projects to provide integrated information for planning and reporting purposes. A survey of 40 institutions with active data warehouse projects reveals the kinds of tools, contents, data cycles, and access currently used. Essential elements of an integrated information architecture are…

  18. Information Architecture: Sharing the Shareable Resource.

    Science.gov (United States)

    Vogel, Douglas R.; Wetherbe, James C.

    1991-01-01

    A methodology for developing a strategic, long-range organizational information architecture is described. The development of a plan to identify key categories of information to enhance decision making and operational productivity is detailed. Figures and tables detail the methodology's application at the University of Minnesota. (DB)

  19. Information Architecture the Design of Digital Information Spaces

    CERN Document Server

    Ding, Wei

    2009-01-01

    Information Architecture is about organizing and simplifying information, designing and integrating information spaces/systems, and creating ways for people to find and interact with information content. Its goal is to help people understand and manage information and make right decisions accordingly. In the ever-changing social, organizational and technological contexts, Information Architects not only design individual information spaces (e.g., individual websites, software applications, and mobile devices), but also tackle strategic aggregation and integration of multiple information spaces

  20. The architecture of enterprise hospital information system.

    Science.gov (United States)

    Lu, Xudong; Duan, Huilong; Li, Haomin; Zhao, Chenhui; An, Jiye

    2005-01-01

    Because of the complexity of the hospital environment, there exist a lot of medical information systems from different vendors with incompatible structures. In order to establish an enterprise hospital information system, the integration among these heterogeneous systems must be considered. Complete integration should cover three aspects: data integration, function integration and workflow integration. However most of the previous design of architecture did not accomplish such a complete integration. This article offers an architecture design of the enterprise hospital information system based on the concept of digital neural network system in hospital. It covers all three aspects of integration, and eventually achieves the target of one virtual data center with Enterprise Viewer for users of different roles. The initial implementation of the architecture in the 5-year Digital Hospital Project in Huzhou Central hospital of Zhejiang Province is also described.

  1. On Knowledge Management & Information Architecture

    Institute of Scientific and Technical Information of China (English)

    张新民; 梁战平

    2003-01-01

    The paper outlines the concept and implementation process of knowledge management, introduces a new domain of study in information science, that is, the concept, composition and process of information architecture, and explores the relationship between knowledge management and information architecture.

  2. An Ontology-Based Representation Architecture of Unstructured Information

    Institute of Scientific and Technical Information of China (English)

    GU Jin-guang; CHEN He-ping; CHEN Xin-meng

    2004-01-01

    Integrating the respective advantages of XML Schema and ontologies, this paper puts forward a semantic information processing architecture, OBSA, to solve the problems of heterogeneity of information sources and semantic uncertainty. It introduces an F-Logic-based semantic information presentation mechanism, presents the design of an ontology-based semantic representation language and a mapping algorithm converting an ontology to XML DTD/Schema, and an adapter framework for accessing distributed and heterogeneous information.

  3. Information Architecture and Electronic Market Performance

    NARCIS (Netherlands)

    O.R. Koppius (Otto)

    2002-01-01

    textabstractElectronic markets are one of the most prominent business applications of the Internet, so determining the factors that drive their performance is of great value. This thesis shows that an important driver of electronic market performance is the information architecture of the market, wh

  4. BADD phase II: DDS information management architecture

    Science.gov (United States)

    Stephenson, Thomas P.; DeCleene, Brian T.; Speckert, Glen; Voorhees, Harry L.

    1997-06-01

    The DARPA Battlefield Awareness and Data Dissemination (BADD) Phase II Program will provide the next generation multimedia information management architecture to support the warfighter. One goal of this architecture is proactive dissemination of information to the warfighter through strategies such as multicast and 'smart push and pull' designed to minimize latency and make maximum use of available communications bandwidth. Another goal is to support integration of information from widely distributed legacy repositories. This will enable the next generation of battlefield awareness applications to form a common operational view of the battlefield to aid joint service and/or multi-national peacekeeping forces. This paper discusses the approach we are taking to realize such an architecture for BADD. Our architecture and its implementation, known as the Distributed Dissemination Services (DDS), are based on two key concepts: a global database schema and an intelligent, proactive caching scheme. A global schema provides a common logical view of the information space in which the warfighter operates. This schema (or subsets of it) is shared by all warfighters through a distributed object database providing local access to all relevant metadata. This approach provides both scalability to a large number of warfighters, and support for tethered as well as autonomous operations. By utilizing DDS information integration services that provide transparent access to legacy databases, related information from multiple 'stovepipe' systems is now available to battlefield awareness applications. The second key concept embedded in our architecture is an intelligent, hierarchical caching system supported by proactive dissemination management services which push both lightweight and heavyweight data such as imagery and video to warfighters based on their information profiles. The goal of this approach is to transparently and proactively stage data which is likely to be requested by
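
    The 'smart push' idea of matching disseminated data against information profiles can be sketched simply. The profile fields, item metadata, and exact-match rule below are invented for illustration and are not the DDS interfaces.

```python
def matches(profile, item):
    """An item matches when every constraint in the profile is satisfied."""
    return all(item.get(key) == want for key, want in profile.items())

def disseminate(items, subscribers):
    """Route each published item to every subscriber whose profile matches."""
    delivered = {name: [] for name in subscribers}
    for item in items:
        for name, profile in subscribers.items():
            if matches(profile, item):
                delivered[name].append(item["id"])
    return delivered
```

    In a real system the matched items would be pushed to each subscriber's cache ahead of demand, rather than collected into a result dictionary.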

  5. The architecture of the management system of complex steganographic information

    Science.gov (United States)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide-area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the creation of the algorithmic maintenance of the system, classic methods of steganography are used to embed information. Methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology in order to provide its services as a web service via the Internet. It is meant to support the processing of multimedia data streams, i.e., streams with many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; prevention of information leakage caused by insiders.
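
    One of the classic embedding methods such a system might manage is least-significant-bit (LSB) steganography. The sketch below hides bytes in the LSBs of 8-bit cover samples (e.g., grayscale pixel values); it illustrates the general technique, not the paper's algorithms.

```python
def embed(cover, secret):
    """Write each bit of `secret` into the LSB of one cover sample."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for secret")
    stego = list(cover)
    for pos, bit in enumerate(bits):
        stego[pos] = (stego[pos] & ~1) | bit  # keep 7 high bits, set LSB
    return stego

def extract(stego, n_bytes):
    """Read n_bytes back out of the LSBs, mirroring the embed order."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte |= (stego[i * 8 + j] & 1) << j
        out.append(byte)
    return bytes(out)
```

    Detection methods of the kind the paper mentions work by spotting the statistical disturbance this substitution leaves in the cover's LSB plane.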

  6. Trust-based information system architecture for personal wellness.

    Science.gov (United States)

    Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd

    2014-01-01

    Modern eHealth, ubiquitous health and personal wellness systems take place in an unsecure and ubiquitous information space where no predefined trust exists. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in a ubiquitous environment. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider, and to use it to design personal privacy policies for trustworthy use of health and wellness services. For the trust calculation a novel set of measurable context-aware and health-information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service-provider-specific policies. Focus groups and information modelling were used for developing a wellness information model. A system analysis method based on sequential steps, which makes it possible to combine the results of the analysis of privacy and trust concerns with the selection of trust and privacy services, was used for the development of the information system architecture. Its services (e.g. trust calculation, decision support, policy management and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.
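
    A context-aware trust value of this kind can be illustrated as a weighted combination of measurable attributes, with a personal policy applied as a threshold. The attribute names, weights, and threshold below are invented for the sketch and differ from the paper's actual attribute set.

```python
# Hypothetical attribute weights; each observed attribute scores in [0, 1].
WEIGHTS = {
    "data_encryption": 0.4,
    "policy_compliance_history": 0.4,
    "provider_transparency": 0.2,
}

def trust_value(attributes, weights=WEIGHTS):
    """Combine observed attribute scores into one trust value in [0, 1]."""
    total = sum(weights.values())
    return sum(weights[k] * attributes.get(k, 0.0) for k in weights) / total

def allow_sharing(attributes, threshold=0.7):
    """A personal policy: share wellness data only above the threshold."""
    return trust_value(attributes) >= threshold
```

    Because the score is recomputed from current attribute observations, the same provider can be trusted in one context and refused in another, which is the dynamic behaviour the architecture aims for.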

  7. Information security considerations in open systems architectures

    Energy Technology Data Exchange (ETDEWEB)

    Klein, S.A. (Atlantic Research Corp., Rockville, MD (United States)); Menendez, J.N. (Atlantic Research Corp., Hanover, MD (United States))

    1993-02-01

    This paper is part of a series of papers invited by the IEEE POWER CONTROL CENTER WORKING GROUP concerning the changing designs of modern control centers. Papers invited by the Working Group discuss the following issues: Benefits of Openness, Criteria for Evaluating Open EMS Systems, Hardware Design, Configuration Management, Security, Project Management, Data Bases, SCADA, Inter- and Intra-System Communications, and Man-Machine Interfaces. This paper discusses information security and issues related to its achievement in open systems architectures. Beginning with a discussion of the goals of information security and their relation to open systems, the paper provides examples of the threats to electric utility computer systems and the consequences associated with these threats, presents basic countermeasures applicable to all computer systems, and discusses issues specific to open systems architectures.

  8. A new architecture for enterprise information systems.

    Science.gov (United States)

    Covvey, H D; Stumpf, J J

    1999-01-01

    Irresistible economic and technical forces are forcing healthcare institutions to develop regionalized services such as consolidated or virtual laboratories. Technical realities, such as the lack of an enabling enterprise-level information technology (IT) integration infrastructure, the existence of legacy systems, and non-existent or embryonic enterprise-level IT services organizations, are delaying or frustrating the achievement of the desired configuration of shared services. On attempting to address this matter, we discover that the state-of-the-art in integration technology is not wholly adequate, and itself becomes a barrier to the full realization of shared healthcare services. In this paper we report new work from the field of Co-operative Information Systems that proposes a new architecture of systems that are intrinsically cooperation-enabled, and we extend this architecture to both the regional and national scales.

  9. The Information Architecture of Behavior Change Websites

    OpenAIRE

    2005-01-01

    The extraordinary growth in Internet use offers researchers important new opportunities to identify and test new ways to deliver effective behavior change programs. The information architecture (IA)—the structure of website information—is an important but often overlooked factor to consider when adapting behavioral strategies developed in office-based settings for Web delivery. Using examples and relevant perspectives from multiple disciplines, we describe a continuum of website IA designs ra...

  10. PESOI: Process Embedded Service-Oriented Architecture

    Institute of Scientific and Technical Information of China (English)

    Wei-Tek Tsai; Yinong Chen; Chun Fan

    2006-01-01

    Service-Oriented Architecture (SOA) has drawn significant attention recently, and numerous architecture approaches have been proposed to represent SOA-based applications. The architecture of SOA-based applications differs from traditional software architecture, which is mainly static. The architecture of an SOA-based application is dynamic, i.e., the application can be composed at runtime using existing services, and thus the architecture is really determined at runtime instead of design time. SOA applications have provided a new direction for software architecture study, where the architecture can be dynamically changed at runtime to meet new application requirements. This paper proposes a Process-Embedded Service-Oriented Infrastructure (PESOI) to build SOA-based applications. This infrastructure embeds the entire software lifecycle management and service-oriented system engineering into the applications developed on it. Thus, users can easily re-develop the applications during operation to meet changing environments and requirements, through the support provided by the embedded infrastructure.
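
The core idea that "the architecture is really determined at runtime" can be sketched as late-bound service composition: services are published to a registry and looked up by name only when the application actually runs. The registry, service names and pipeline shape below are illustrative, not PESOI's actual mechanism.

```python
# Minimal sketch of runtime service composition: the application is just a
# list of service names, resolved against the registry at call time.

from typing import Callable

registry: dict = {}

def publish(name: str, service: Callable) -> None:
    """Register a service under a name, possibly while the system is live."""
    registry[name] = service

def compose_at_runtime(names: list) -> Callable:
    """Return an application whose pipeline is bound only when invoked."""
    def application(payload: str) -> str:
        for name in names:
            payload = registry[name](payload)  # late binding happens here
        return payload
    return application

publish("normalize", str.strip)
publish("shout", str.upper)
app = compose_at_runtime(["normalize", "shout"])
print(app("  order received  "))  # → ORDER RECEIVED
```

Re-publishing a service under the same name changes the application's behaviour without rebuilding it, which is the re-development-during-operation property the abstract emphasizes.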

  11. Architecture for Survivable System Processing (ASSP)

    Science.gov (United States)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are aimed at applying new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  12. SUPPLY CHAIN INFORMATION INTEGRATION THROUGH SERVICE ORIENTED ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Igor Milanovic

    2013-12-01

    Full Text Available In recent years, information integration has become a significant problem for both natural and legal persons in everyday operations. Huge amounts of information are available but insufficiently processed to have useful value. Choosing the right combination of tools and technologies for integration is a prerequisite for acquiring information from multiple heterogeneous sources and for its simple, high-quality use afterwards. In this paper, we focus on information integration within companies that are part of a supply chain or network. This environment typically includes a varied mix of sources, structured (such as relational or other databases) and unstructured (such as document repositories, spreadsheets, documents, web pages, emails and others). Effective information integration and sharing significantly enhances supply chain practices. Service-oriented architecture (SOA) is an architectural style for building software applications that use services available in a network such as the web. The use of SOA to achieve inter-enterprise supply network information integration has many advantages.

  13. A Layered Trust Information Security Architecture

    Directory of Open Access Journals (Sweden)

    Robson de Oliveira Albuquerque

    2014-12-01

    Full Text Available Information can be considered the most important asset of any modern organization. Securing this information involves preserving confidentiality, integrity and availability, the well-known CIA triad. In addition, information security is a risk management job; the task is to manage the inherent risks of information disclosure. Current information security platforms do not deal with the different facets of information technology. This paper presents a layered trust information security architecture (TISA), whose creation was motivated by the need to consider information and security from different points of view in order to protect it. This paper also extends and discusses security information extensions as a way of helping the CIA triad. Furthermore, this paper suggests information representation and treatment elements, operations and support components that can be integrated to show the various risk sources when dealing with both information and security. An overview of how information is represented and treated nowadays in the technological environment is shown, and the reason why it is so difficult to guarantee security in all aspects of the information pathway is discussed.

  14. A layered trust information security architecture.

    Science.gov (United States)

    de Oliveira Albuquerque, Robson; Villalba, Luis Javier García; Orozco, Ana Lucila Sandoval; Buiati, Fábio; Kim, Tai-Hoon

    2014-12-01

    Information can be considered the most important asset of any modern organization. Securing this information involves preserving confidentiality, integrity and availability, the well-known CIA triad. In addition, information security is a risk management job; the task is to manage the inherent risks of information disclosure. Current information security platforms do not deal with the different facets of information technology. This paper presents a layered trust information security architecture (TISA), whose creation was motivated by the need to consider information and security from different points of view in order to protect it. This paper also extends and discusses security information extensions as a way of helping the CIA triad. Furthermore, this paper suggests information representation and treatment elements, operations and support components that can be integrated to show the various risk sources when dealing with both information and security. An overview of how information is represented and treated nowadays in the technological environment is shown, and the reason why it is so difficult to guarantee security in all aspects of the information pathway is discussed.

  15. The architecture of information in organisations

    Directory of Open Access Journals (Sweden)

    Tiko Iyamu

    2011-03-01

    Full Text Available Over the last two decades, competition amongst organisations, including financial institutions, has increased tremendously. The value of information is critical to competition in different organisations. In addition, managing the cost of delivery and the cohesiveness of information flow and use in organisations continues to challenge information technology (IT). In an attempt to address these challenges, many organisations have sought various solutions, including enterprise information architecture (EIA). The EIA is intended to address the needs of the organisation for competitive advantage. This research article focused on the role of principles in the development and implementation of EIA. The article aimed to investigate how EIA could be best leveraged, exploited, or otherwise used to provide business value. The research brings a fresh perspective and new methodological principles required in architecting the enterprise information.

  16. Information Architecture for Quality Management Support in Hospitals.

    Science.gov (United States)

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations an enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging and cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a hospital, which allowed us to develop and implement QUALITUS, the computer application developed to support Quality Management in a Hospital Unit. This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing the number of tasks and documents, the information to be filled in, and information errors, amongst others.

  17. An agile enterprise regulation architecture for health information security management.

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital.

  18. A multi-agent system architecture for geographic information gathering.

    Science.gov (United States)

    Gao, Gang-Yi; Wang, Shen-Kang

    2004-11-01

    The World Wide Web (WWW) is a vast repository of information, including a great deal of geographic information. But locating and retrieving geographic information requires a significant amount of time and effort. In addition, different users usually have different views of and interests in the same information. To resolve such problems, this paper first proposes a model of geographic information gathering based on a multi-Agent (MA) architecture. Based on this model, we then construct a prototype system with GML (Geography Markup Language). This system consists of three tiers: Client, Web Server and Data Resource. Finally, we elaborate on the process of the Web Server.

  19. A multi-Agent system architecture for geographic information gathering

    Institute of Scientific and Technical Information of China (English)

    高刚毅; 王申康

    2004-01-01

    The World Wide Web (WWW) is a vast repository of information, including a great deal of geographic information. But locating and retrieving geographic information requires a significant amount of time and effort. In addition, different users usually have different views of and interests in the same information. To resolve such problems, this paper first proposes a model of geographic information gathering based on a multi-Agent (MA) architecture. Based on this model, we then construct a prototype system with GML (Geography Markup Language). This system consists of three tiers: Client, Web Server and Data Resource. Finally, we elaborate on the process of the Web Server.

  20. An Agile Enterprise Regulation Architecture for Health Information Security Management

    Science.gov (United States)

    Chen, Ying-Pei; Hsieh, Sung-Huai; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-01-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  1. The architecture of information architecture, interaction design and the patterning of digital information

    CERN Document Server

    Dade-Robertson, Martyn

    2011-01-01

    This book looks at relationships between the organization of physical objects in space and the organization of ideas. Historical, philosophical, psychological and architectural knowledge are united to develop an understanding of the relationship between information and its representation. Despite its potential to break the mould, digital information has relied on metaphors from a pre-digital era. In particular, architectural ideas have pervaded discussions of digital information, from the urbanization of cyberspace in science fiction, through to the adoption of spatial visualization

  2. Business process architectures: overview, comparison and framework

    Science.gov (United States)

    Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.

    2016-02-01

    With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.

  3. Publishing perishing? Towards tomorrow's information architecture

    Directory of Open Access Journals (Sweden)

    Gerstein Mark B

    2007-01-01

    Full Text Available Scientific articles are tailored to present information in human-readable aliquots. Although the Internet has revolutionized the way our society thinks about information, the traditional text-based framework of the scientific article remains largely unchanged. This format imposes sharp constraints upon the type and quantity of biological information published today. Academic journals alone cannot capture the findings of modern genome-scale inquiry. Like many other disciplines, molecular biology is a science of facts: information inherently suited to database storage. In the past decade, a proliferation of public and private databases has emerged to house genome sequence, protein structure information, functional genomics data and more; these digital repositories are now a vital component of scientific communication. The next challenge is to integrate this vast and ever-growing body of information with academic journals and other media. To truly integrate scientific information we must modernize academic publishing to exploit the power of the Internet. This means more than online access to articles, hyperlinked references and web-based supplemental data; it means making articles fully computer-readable with intelligent markup and Structured Digital Abstracts. Here, we examine the changing roles of scholarly journals and databases. We present our vision of the optimal information architecture for the biosciences, and close with tangible steps to improve our handling of scientific information today while paving the way for an expansive central index in the future.
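
What "computer-readable with intelligent markup" means in practice can be sketched by restating one free-text finding as structured data. The vocabulary and field names below are invented for illustration; they are not the article's actual Structured Digital Abstract schema.

```python
# A free-text finding, and the same fact as a machine-readable assertion
# (subject-predicate-object plus provenance) that a database could index.

import json

sentence = "Protein A interacts with protein B in yeast."

assertion = {
    "subject": "Protein A",
    "predicate": "interacts_with",   # hypothetical controlled vocabulary term
    "object": "Protein B",
    "organism": "Saccharomyces cerevisiae",
    "evidence": "experimental",
}

print(json.dumps(assertion, indent=2))
```

Unlike the sentence, the structured form can be queried, aggregated across thousands of articles, and cross-linked with the sequence and structure databases the abstract mentions.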

  4. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies as ISO 7498-2 and Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also we focus on the specific components.

  5. A security architecture for health information networks.

    Science.gov (United States)

    Kailar, Rajashekar; Muralidhar, Vinod

    2007-10-11

    Health information network security needs to balance exacting security controls with practicality, and ease of implementation in today's healthcare enterprise. Recent work on 'nationwide health information network' architectures has sought to share highly confidential data over insecure networks such as the Internet. Using basic patterns of health network data flow and trust models to support secure communication between network nodes, we abstract network security requirements to a core set to enable secure inter-network data sharing. We propose a minimum set of security controls that can be implemented without needing major new technologies, but yet realize network security and privacy goals of confidentiality, integrity and availability. This framework combines a set of technology mechanisms with environmental controls, and is shown to be sufficient to counter commonly encountered network security threats adequately.
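
The abstract's point that network security goals can be met "without needing major new technologies" can be illustrated with a standard-library sketch of one basic control: message integrity and authenticity between two nodes via a keyed HMAC. The pre-shared key and message are illustrative; key management and transport encryption are out of scope here.

```python
# Integrity/authenticity check between two health-network nodes using only
# stdlib primitives: a shared key and HMAC-SHA256.

import hashlib
import hmac

SHARED_KEY = b"pre-shared-key-between-nodes"  # illustrative only

def sign(message: bytes) -> bytes:
    """Compute an authentication tag for a message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Constant-time comparison, resistant to timing attacks."""
    return hmac.compare_digest(sign(message), tag)

msg = b"patient record update"
tag = sign(msg)
print(verify(msg, tag))          # → True  (valid tag accepted)
print(verify(b"tampered", tag))  # → False (altered message rejected)
```

Confidentiality would require an additional encryption layer (e.g. TLS on the transport), which is exactly the kind of environmental control the framework combines with mechanisms like this one.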

  6. Real-time optical information processing

    CERN Document Server

    Javidi, Bahram

    1994-01-01

    Real-Time Optical Information Processing covers the most recent developments in optical information processing, pattern recognition, neural computing, and materials for devices in optical computing. Intended for researchers and graduate students in signal and information processing with some elementary background in optics, the book provides both theoretical and practical information on the latest in information processing in all its aspects. Leading researchers in the field describe the significant signal processing algorithms and architectures in optics as well as basic hardware concepts,

  7. The Information Flow Framework: New architecture

    CERN Document Server

    Kent, Robert E

    2011-01-01

    This presentation discusses a new, modular, more mature architecture for the Information Flow Framework (IFF). The IFF uses institution theory as a foundation for the semantic integration of ontologies. It represents metalogic, and as such operates at the structural level of ontologies. The content, form and experience of the IFF could contribute to the development of a standard ontology for category theory. The foundational aspect of the IFF helps to explain the relationship between the fundamental concepts of set theory and category theory. The development of the IFF follows two design principles: conceptual warrant and categorical design. Both are limitations of the logical expression. Conceptual warrant limits the content of logical expression, by requiring us to justify the introduction of new terminology (and attendant axiomatizations). Categorical design limits the form of logical expression (of all mathematical concepts and constraints) to atomic expressions: declarations, equations or relational expr...

  8. An Enterprise Information Architecture: A Case Study for Decentralized Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Watson, R.W.

    1999-06-15

    As enterprises become increasingly information based, making improvements in their information activities is a top priority to assure their continuing competitiveness. A key to achieving these improvements is developing an Enterprise Information Architecture (EIA). An EIA can be viewed as a structured set of multidimensional interrelated elements that support all information processes. The current ad hoc EIAs in place within many enterprises can not meet their future needs because of a lack of a coherent framework, incompatibilities, missing elements, few and poorly understood standards, uneven quality and unnecessary duplications. This paper discusses the EIA developed at Lawrence Livermore National Laboratory as a case study, for other information based enterprises, particularly those with decentralized and autonomous organization structures and cultures. While the architecture is important, the process by which it is developed and sustained over time is equally important. This paper outlines the motivation for an EIA and discusses each of the interacting elements identified. It also presents an organizational structure and processes for building a sustainable EIA activity.

  9. An Enterprise Information Architecture: A Case Study for Decentralized Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Watson, R.W.

    1999-09-28

    As enterprises become increasingly information based, making improvements in their information activities is a top priority to assure their continuing competitiveness. A key to achieving these improvements is developing an Enterprise Information Architecture (EIA). An EIA can be viewed as a structured set of multidimensional interrelated elements that support all information processes. The current ad hoc EIAs in place within many enterprises can not meet their future needs because of a lack of a coherent framework, incompatibilities, missing elements, few and poorly understood standards, uneven quality and unnecessary duplications. This paper discusses the EIA developed at Lawrence Livermore National Laboratory as a case study, for other information based enterprises, particularly those with decentralized and autonomous organization structures and cultures. While the architecture is important, the process by which it is developed and sustained over time is equally important. This paper outlines the motivation for an EIA and discusses each of the interacting elements identified. It also presents an organizational structure and processes for building a sustainable EIA activity.

  10. A Model for Information Integration Using Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    C. Punitha Devi

    2014-06-01

    Full Text Available Business agility remains the keyword that drives business in different directions, enabling a 360-degree shift in the business process. To achieve agility, the organization should work on real-time information and data. The need for instant access to information appears to be an ever-present requirement of all organizations and enterprises. Access to information does not come from a single query but from a complex process termed information integration. Information integration has existed for the past two decades and has been progressing ever since, yet its challenges and issues persist as the information integration problem continues to evolve. This paper addresses the issues in the approaches, techniques and models pertaining to information integration and identifies the need for a complete model. As SOA is the architectural style that is changing business patterns today, this paper proposes a service-oriented model for information integration. The model mainly focuses on giving a complete structure for information integration that is adaptable to any environment and open in nature. Here, information is converted into services, and the information services are then integrated through service-oriented integration to provide the integrated information also as a service.

  11. A Service-Oriented Architecture for Proactive Geospatial Information Services

    Directory of Open Access Journals (Sweden)

    Haifeng Li

    2011-12-01

    Full Text Available Advances in sensor networks, linked data, and service-oriented computing have indicated a trend in information technology, i.e., toward an open, flexible, and distributed architecture. However, existing information technologies lack effective sharing, aggregation, and cooperation services to handle the sensors, data, and processing resources needed to fulfill users' complicated tasks in near real-time. This paper presents a service-oriented architecture for proactive geospatial information services (PGIS), which integrates sensors, data, processing, and human services. PGIS is designed to organize, aggregate, and co-operate services by composing small-scale services into service chains to meet complicated user requirements. It is a platform providing real-time or near real-time data collection, storage, and processing capabilities. It is a flexible, reusable, and scalable system for sharing and interoperating geospatial data, information, and services. The developed PGIS framework has been implemented and preliminary experiments have been performed to verify its performance. The results show that basic functions such as task analysis, managing sensors for data acquisition, service composition, and service chain construction and execution are validated, and that the important properties of PGIS, including interoperability, flexibility, and reusability, are achieved.

  12. A multi-Agent system architecture for geographic information gathering

    Institute of Scientific and Technical Information of China (English)

    高刚毅; 王申康

    2004-01-01

    World Wide Web (WWW) is a vast repository of information, including a great deal of geographic information. But the location and retrieval of geographic information will require a significant amount of time and effort. In addition, different users usually have different views and interests in the same information. To resolve such problems, this paper first proposed a model of geographic information gathering based on multi-Agent (MA) architecture. Then based on this model, we construct a prototype system with GML (Geography Markup Language). This system consists of three tiers: Client, Web Server and Data Resource. Finally, we expatiate on the process of Web Server.

  13. Hybridization of Architectural Styles for Integrated Enterprise Information Systems

    Science.gov (United States)

    Bagusyte, Lina; Lupeikiene, Audrone

    Current enterprise systems engineering theory does not provide adequate support for the development of information systems on demand; more precisely, it is still taking shape. This chapter proposes the main architectural decisions that underlie the design of integrated enterprise information systems. It argues for extending service-oriented architecture by merging it with the component-based paradigm at the design stage and using connectors of different architectural styles. The suitability of the general-purpose language SysML for modeling integrated enterprise information systems architectures is described, and supporting arguments are presented.

  14. CDC WONDER: a cooperative processing architecture for public health.

    Science.gov (United States)

    Friede, A; Rosen, D H; Reid, J A

    1994-01-01

    CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange.
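
The cooperative-processing idea — components coupled only by message passing, so the user never needs to know where data lives — can be sketched with in-process queues standing in for the network. Component names follow the abstract; the message format and the sample database contents are invented.

```python
# Sketch of pure message passing between a Remote Client and a Data Server,
# mediated by queues (the Communications Server / Queue Manager role).

import queue

request_q = queue.Queue()
reply_q = queue.Queue()

# Stand-in for the ~40 databases behind the Data Servers.
DATABASES = {"mortality": ["row-1", "row-2"]}

def data_server() -> None:
    """Serve one queued request; knows nothing about the client."""
    msg = request_q.get()
    reply_q.put({"id": msg["id"], "rows": DATABASES[msg["db"]]})

def remote_client(db: str) -> list:
    """The client only passes messages; it never touches the data directly."""
    request_q.put({"id": 1, "db": db})
    data_server()  # in the real system this runs at CDC, not in-process
    return reply_q.get()["rows"]

print(remote_client("mortality"))  # → ['row-1', 'row-2']
```

Because the coupling is only the message format, either side can change hardware or software freely, which is the flexibility the architecture description claims; the cost is the queuing latency behind the noted limitation that some results arrive slowly.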

  15. Formal Modeling for Information Appliance Using Abstract MVC Architecture

    OpenAIRE

    Arichika, Yuji; Araki, Keijiro

    2004-01-01

    In information appliance development, it is important to separate core functions from display functions, because information appliances have various user interfaces and their display functions change frequently. Using the MVC architecture is one way to divide display functions and core functions. But MVC is an implementation architecture, and there are some gaps in deriving an abstract model from it. On the other hand, it is known that formal methods are useful for constructing abstract models. Therefore we intend t...
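The core/display split that the abstract attributes to MVC can be illustrated with a minimal sketch: one Model carrying the core function and several interchangeable Views. The clock example and all names here are assumptions for illustration, not the authors' formal model.

```python
class ClockModel:
    """Core function: keeps state independently of any display."""
    def __init__(self):
        self.minutes = 0

    def tick(self):
        self.minutes = (self.minutes + 1) % (24 * 60)

    @property
    def time(self):
        return divmod(self.minutes, 60)  # (hours, minutes)

class DigitalView:
    """One display function: a digital readout."""
    def render(self, model):
        h, m = model.time
        return f"{h:02d}:{m:02d}"

class WordsView:
    """Another display function for the same core function."""
    def render(self, model):
        h, m = model.time
        return f"{m} past {h}"

model = ClockModel()
for _ in range(61):
    model.tick()

# Display functions can change freely; the core function is untouched.
print(DigitalView().render(model))
print(WordsView().render(model))
```

Swapping or adding a view requires no change to `ClockModel`, which is exactly why the separation matters when display functions change frequently.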

  16. Platform for Assessing Strategic Alignment Using Enterprise Architecture: Application to E-Government Process Assessment

    OpenAIRE

    Kaoutar Elhari; Bouchaib Bounabat

    2011-01-01

    This paper presents an overview of S2AEA (v2) (Strategic Alignment Assessment based on Enterprise Architecture (version2)), a platform for modelling enterprise architecture and for assessing strategic alignment based on internal enterprise architecture metrics. The idea of the platform is based on the fact that enterprise architecture provides a structure for business processes and information systems that supports them. This structure can be used to measure the degree of consistency between ...

  17. Minimizing the Risk of Architectural Decay by using Architecture-Centric Evolution Process

    CERN Document Server

    Farid, Humaira; Iqbal, M Aqeel

    2011-01-01

    Software systems endure many noteworthy changes throughout their life-cycle in order to follow the evolution of their problem domains. Generally, a software system's architecture cannot follow the rapid evolution of a problem domain, which results in discrepancies between the implemented and the designed architecture. Software architecture illustrates a system's structure and global properties and consequently not only determines how the system should be constructed but also guides its evolution. Architecture plays an important role in ensuring that a system satisfies its business and mission goals during implementation and evolution. However, the capabilities of the designed architecture may be lost when the implementation does not conform to the designed architecture. Such a loss of consistency creates the risk of architectural decay. Architectural decay can be avoided if architectural changes are made as early as possible. The paper presents the Process Model for Architecture-Centric Evolution which ...

  18. The information architecture of behavior change websites.

    Science.gov (United States)

    Danaher, Brian G; McKay, H Garth; Seeley, John R

    2005-05-18

    The extraordinary growth in Internet use offers researchers important new opportunities to identify and test new ways to deliver effective behavior change programs. The information architecture (IA), the structure of website information, is an important but often overlooked factor to consider when adapting behavioral strategies developed in office-based settings for Web delivery. Using examples and relevant perspectives from multiple disciplines, we describe a continuum of website IA designs ranging from the matrix design to the tunnel design. The free-form matrix IA design allows users free rein to use multiple hyperlinks to explore available content according to their idiosyncratic interests. The more directive tunnel IA design (commonly used in e-learning courses) guides users step-by-step through a series of Web pages that are arranged in a particular order to improve the chances of achieving a goal that is measurable and consistent. Other IA designs are also discussed, including hierarchical IA and hybrid IA designs. In the hierarchical IA design, program content is arranged in a top-down manner, which helps the user find content of interest. The more complex hybrid IA design incorporates some combination of components that use matrix, tunnel, and/or hierarchical IA designs. Each of these IA designs is discussed in terms of usability, participant engagement, and program tailoring, as well as how they might best be matched with different behavior change goals (using Web-based smoking cessation interventions as examples). Our presentation underscores the importance of considering and clearly reporting the IA designs used when creating effective Web-based interventions. We also encourage the adoption of a multidisciplinary perspective as we move towards a more mature view of Internet intervention research.
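The tunnel/matrix contrast above can be made concrete by modeling each IA design as a page-link graph: the tunnel offers one forward link per page, the matrix links every page to every other. The page names are invented for illustration (loosely echoing the smoking-cessation example) and are not from the article.

```python
pages = ["intro", "set-quit-date", "coping-skills", "relapse-plan"]

# Tunnel IA: each page links only to the next step in the fixed sequence.
tunnel = {p: [pages[i + 1]] for i, p in enumerate(pages[:-1])}
tunnel[pages[-1]] = []  # final page: goal reached, nowhere further to go

# Matrix IA: every page links freely to every other page.
matrix = {p: [q for q in pages if q != p] for p in pages}

def choices(ia, page):
    """Number of navigation choices a user faces on a given page."""
    return len(ia[page])

print(choices(tunnel, "intro"))  # directive: a single prescribed next step
print(choices(matrix, "intro"))  # free-form: every other page is reachable
```

Hierarchical and hybrid designs would sit between these extremes: a tree of links in the first case, and a mix of chain, tree, and full-mesh regions in the second.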

  19. Organizational Architecture and Success in the Information Technology Industry

    OpenAIRE

    Haim Mendelson

    2000-01-01

    This paper studies an organizational architecture that I call information-age architecture. I define a measure of organizational IQ and test whether it is related to financial and market success using data from the fast-moving information technology industry. Higher organizational IQ is associated with higher profitability and growth. This relationship is stronger in business environments that are characterized by faster clockspeeds.

  20. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garcia, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  1. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
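The "black box behind a standardized interface" idea that both records above describe can be sketched with an abstract interface that every component implements, so components plug together without knowing each other's internals. All class, method, and message names here are assumptions for illustration; the actual reference architecture defines its own interfaces and messaging formats.

```python
from abc import ABC, abstractmethod

class TelemedicineComponent(ABC):
    """The standardized interface every pluggable component must implement."""
    @abstractmethod
    def handle(self, message: dict) -> dict: ...

class VideoConference(TelemedicineComponent):
    def handle(self, message):
        return {"component": "video", "status": "session opened"}

class StoreAndForward(TelemedicineComponent):
    def handle(self, message):
        return {"component": "store-forward", "status": "image queued"}

class TelemedicineSystem:
    """Knows only the interface, never a component's internals, so any
    vendor's conforming component can be plugged in ("lego-like")."""
    def __init__(self):
        self.components = {}

    def plug(self, name, component: TelemedicineComponent):
        self.components[name] = component

    def dispatch(self, name, message):
        return self.components[name].handle(message)

system = TelemedicineSystem()
system.plug("video", VideoConference())
system.plug("sf", StoreAndForward())
print(system.dispatch("sf", {"image": "xray-001"}))
```

A best-of-breed replacement for either component only has to honor `handle`, which is the interoperability property the papers argue the market lacks.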

  2. A Distributed DB Architecture for Processing cPIR Queries

    Directory of Open Access Journals (Sweden)

    Sultan.M

    2013-06-01

    Full Text Available Information retrieval is the process of obtaining materials, usually documents, from unstructured, huge volumes of data. Several protocols are available for retrieving bits of information from distributed databases. A cloud framework provides a platform for private information retrieval. In this article, we combine the artifacts of distributed systems with a cloud framework for extracting information from unstructured databases. The process involves distributing the database to a number of co-operative peers, which reduces query response time by leveraging the computational resources of the peers. A single query is subdivided into multiple queries and processed in parallel across the distributed sites. Our simulation results using CloudSim show that this distributed database architecture reduces the cost of computational private information retrieval, with reduced response time and processor load at peer sites.
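The query-splitting step described above (one query subdivided into sub-queries, run in parallel on co-operative peers, then merged) can be sketched as follows. The partitioning, document names, and thread-based parallelism are assumptions for illustration; the paper's scheme additionally involves cPIR cryptography, which is omitted here.

```python
from concurrent.futures import ThreadPoolExecutor

# The database is partitioned across co-operative peers.
peers = [
    {"doc1": "alpha", "doc2": "beta"},
    {"doc3": "gamma", "doc4": "delta"},
    {"doc5": "epsilon"},
]

def sub_query(peer_db, wanted):
    """Each peer answers only for the documents it actually holds."""
    return {k: v for k, v in peer_db.items() if k in wanted}

def distributed_query(wanted):
    """Split one query into per-peer sub-queries, run them in parallel,
    and merge the partial answers."""
    with ThreadPoolExecutor(max_workers=len(peers)) as pool:
        partials = list(pool.map(lambda db: sub_query(db, wanted), peers))
    merged = {}
    for part in partials:
        merged.update(part)
    return merged

print(distributed_query({"doc2", "doc5"}))
```

Each peer scans only its own partition, which is where the reduced response time and per-site processor load come from.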

  3. Hierarchical Architecture for Enterprise Information System under Dynamic Environment

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In a dynamic environment, it is vital for enterprises to have a flexible information system architecture to integrate ERP, Supply Chain Management (SCM) and E-Commerce (EC). The traditional systems are established on the ERP-centered flat architecture. This architecture has some disadvantages in supporting the dynamics of enterprises. Firstly, ERP is already a very expensive and complex system; the extension based on it can only increase the complexity and make the implementation more expensive and risk...

  4. Data Centric Integration and Analysis of Information Technology Architectures

    Science.gov (United States)

    2007-09-01

    [Extraction residue from the thesis: section headings on the Systems Engineering Analysis Process and DoDAF Architecture Development Process and on federating architecture databases and tools, plus a note that source data were collected via NPS Library queries of EBSCOhost, BOSUN...]

  5. Kuhlthau's Information Search Process.

    Science.gov (United States)

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  6. A collaboration process for enterprise architecture creation

    NARCIS (Netherlands)

    Nakakawa, Agnes

    2012-01-01

    Designing an enterprise architecture involves architect-specific tasks (those that are executed by enterprise architects) and collaboration dependent tasks (those whose proper execution requires enterprise architects to collaborate with organizational stakeholders). Enterprise architecture framework

  7. A collaboration process for enterprise architecture creation

    NARCIS (Netherlands)

    Nakakawa, Agnes

    2012-01-01

    Designing an enterprise architecture involves architect-specific tasks (those that are executed by enterprise architects) and collaboration dependent tasks (those whose proper execution requires enterprise architects to collaborate with organizational stakeholders). Enterprise architecture

  8. Mapping Signal Processing Kernels to Tiled Architectures

    Science.gov (United States)

    2007-11-02

    [Extraction residue: a fragment noting that tiled architectures are attractive alternatives to monolithic computer architecture designs because they allow a larger design to be built from smaller modules and limit the..., mixed with reference entries (ACM Transactions on Computer Systems, 2(4):289-308, November 1984; Steven Swanson, Ken Michelson, Andrew Schwerin, et al.) and HPEC 2004 slide headers (MIT Lincoln Laboratory, 28 Sep 2004) stating that monolithic single-chip architectures are becoming rare in the industry.]

  9. Classical Process diagrams and Service oriented Architecture

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available SOA (Service Oriented Architecture) has played a very useful role in the design philosophy of target software over the last two decades. The basic units of software for which the mentioned philosophy holds are called services. It is generally assumed that the advanced implementation of services is achieved using so-called Web services on the platform of Internet 2.0. Naturally, it has also been assumed that the services will be used in software applications designed by professional programmers. Later, the concept of software services was supported by the enterprise concept of the SOE type (Service Oriented Enterprise) and by the creation of the SOA paradigm. Many computer scientists, including Thomas Erl, doyen of SOA, understand SOA neither as an integrated technology nor as a development methodology. Proofs of this statement are in the following definitions. "SOA is a form of technology architecture that adheres to the principles of service-orientation. When realized through the Web services technology platform, SOA establishes the potential to support and promote these principles throughout the business processes and automation domains of an enterprise" (Erl, 2006). Thomas Erl (Erl, 2007) has expressed the idea of SOA implementation in the following definition: "SOA establishes an architectural model that aims to enhance the efficiency, agility, and productivity of an enterprise by positioning services as the primary means through which solution logic is represented in support of the realization of strategic goals associated with service-oriented computing." Nevertheless, the key principles on which SOA is constructed (Erl, 2006) are not significantly reflected in any of the previous definitions. Some of the mentioned principles are still included at least in the freer definitions of SOA, for example (Barry, 2003): a service-oriented architecture is essentially a collection of services. These

  10. SOFTWARE ARCHITECTURE FOR FIJI NATIONAL UNIVERSITY CAMPUS INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Bimal Aklesh Kumar

    2011-04-01

    Full Text Available Software architecture defines the overall structure of a system: its components and the relationships among them. Architectural design is very important in the development of large-scale software solutions and plays a very active role in achieving business goals, quality, and a reusable solution. It is often difficult to choose the best software architecture for a system from the several candidate types available. In this paper we examine several architectural types, compare them based on the key requirements of our system, and select the most appropriate architecture for the implementation of campus information systems at Fiji National University. Finally, we provide details of the proposed architecture and outline future plans for the implementation of our system.

  11. Software Architecture for Fiji National University Campus Information Systems

    CERN Document Server

    Kumar, Bimal Aklesh

    2011-01-01

    Software architecture defines the overall structure of a system: its components and the relationships among them. Architectural design is very important in the development of large-scale software solutions and plays a very active role in achieving business goals, quality, and a reusable solution. It is often difficult to choose the best software architecture for a system from the several candidate types available. In this paper we examine several architectural types, compare them based on the key requirements of our system, and select the most appropriate architecture for the implementation of campus information systems at Fiji National University. Finally, we provide details of the proposed architecture and outline future plans for the implementation of our system.

  12. A secure and efficiently searchable health information architecture.

    Science.gov (United States)

    Yasnoff, William A

    2016-06-01

    Patient-centric repositories of health records are an important component of health information infrastructure. However, patient information in a single repository is potentially vulnerable to loss of the entire dataset from a single unauthorized intrusion. A new health record storage architecture, the personal grid, eliminates this risk by separately storing and encrypting each person's record. The tradeoff for this improved security is that a personal grid repository must be sequentially searched, since each record must be individually accessed and decrypted. To allow reasonable search times for large numbers of records, parallel processing with hundreds (or even thousands) of on-demand virtual servers (now available in cloud computing environments) is used. Estimated search times for a 10 million record personal grid using 500 servers vary from 7 to 33 minutes depending on the complexity of the query. Since extremely rapid searching is not a critical requirement of health information infrastructure, the personal grid may provide a practical and useful alternative architecture that eliminates the large-scale security vulnerabilities of traditional databases by sacrificing unnecessary searching speed.
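The search-time tradeoff above is easy to check with back-of-envelope arithmetic: 10 million records spread over 500 servers means 20,000 records per server, each accessed and decrypted individually. The per-record costs below are assumptions reverse-engineered to reproduce the abstract's 7-33 minute range; they are not figures from the paper.

```python
def search_minutes(records, servers, ms_per_record):
    """Wall-clock search time when each server sequentially decrypts and
    scans its own share of the individually encrypted records."""
    per_server = records / servers          # records handled by one server
    return per_server * ms_per_record / 1000 / 60

# 10 million records on 500 on-demand virtual servers:
simple_query = search_minutes(10_000_000, 500, 21)   # assumed 21 ms/record
complex_query = search_minutes(10_000_000, 500, 99)  # assumed 99 ms/record
print(simple_query, complex_query)  # 7.0 33.0 minutes
```

Doubling the server count halves both figures, which is why on-demand cloud servers make the sequential-search penalty tolerable.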

  13. Final report: An enabling architecture for information driven manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Griesmeyer, J.M.

    1997-08-01

    This document is the final report for the LDRD: An Enabling Architecture for Information Driven Manufacturing. The project was motivated by the need to bring quality products to market quickly and to remain efficient and profitable with small lot sizes, intermittent production, and short product life cycles. The emphasis is on integration of the product realization process and the information required to drive it. Enterprise-level information was not addressed except insofar as the enterprise must provide appropriate information to the production equipment to specify what to produce, and the equipment must return enough information to record what was produced. A production script approach was developed in which the production script specifies all of the information required to produce a quality product. A task sequencer decomposes the script into process steps, which are dispatched to capable Standard Manufacturing Modules. The plug-and-play interface to these modules allows rapid introduction of new modules into the production system and speeds up the product realization cycle. The results of applying this approach to the Agile Manufacturing Prototyping System are described.
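The production-script approach above can be sketched as a sequencer that walks a script, dispatches each step to a module advertising the matching capability, and logs what was produced. The capabilities, parts, and module behaviors are invented assumptions; the report's actual modules and script format are richer than this.

```python
# Modules register themselves by the capability they provide (assumed names).
modules = {
    "mill": lambda step: f"milled {step['part']}",
    "drill": lambda step: f"drilled {step['part']}",
    "inspect": lambda step: f"inspected {step['part']}",
}

# A production script: everything needed to produce a quality product.
production_script = [
    {"capability": "mill", "part": "bracket"},
    {"capability": "drill", "part": "bracket"},
    {"capability": "inspect", "part": "bracket"},
]

def sequencer(script):
    """Decompose the script into steps, dispatch each to a capable module,
    and record what was produced (the information the enterprise gets back)."""
    log = []
    for step in script:
        log.append(modules[step["capability"]](step))
    return log

print(sequencer(production_script))
```

Adding a new module is just another entry in `modules`, which is the plug-and-play property the report credits with speeding up the product realization cycle.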

  14. Financial information processing

    Institute of Scientific and Technical Information of China (English)

    Shuo BAI; Shouyang WANG; Lean YU; Aoying ZHOU

    2009-01-01

    The rapid growth in financial data volume has made financial information processing more and more difficult due to the increase in complexity, which has forced businesses and academics alike to turn to sophisticated information processing technologies for better solutions. A typical feature is that high-performance computers and advanced computational techniques play increasingly important roles in giving businesses and industries competitive advantages. Accordingly, financial information processing has emerged as a new cross-disciplinary field integrating computer science, mathematics, financial economics, intelligent techniques, and computer simulations to make different decisions based on processed financial information.

  15. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets produced in different systems such as KMS, financial and accounting systems, office and industrial automation systems, and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model give organizations a strong incentive to implement cloud computing. Maintaining and managing information security are the main challenges in developing and accepting this model. In this paper, first, following the design science research methodology and compatible with the design process in information systems research, a complete categorization of organizational assets, comprising 355 different types of information assets in 7 groups and 3 levels, is presented so that managers can plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the methodology resulting from it, and the presented classification model were discussed and verified using the Delphi method and expert comments.

  16. Integrated quality control architecture for multistage machining processes

    Science.gov (United States)

    Yang, Jie; Liu, Guixiong

    2010-12-01

    To solve problems concerning process quality prediction and control for multistage machining processes, an integrated quality control architecture is proposed in this paper. First, a hierarchical multiple-criteria decision model is established for the key process, and the stratified weight matrix method is discussed. Predictive control of manufacturing quality is demanded not only at the on-site monitoring and control layer but also at the enterprise control layer and the remote quality monitoring level; therefore, XML is used to achieve a unified description of manufacturing quality information and to enable its transfer and sharing among the different sources of quality information. This lays a good foundation for global quality predictive control, analysis, and diagnosis, and helps achieve a more practical, open, and standardized manufacturing quality information integration system.
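The XML-based unified description mentioned above can be illustrated with a small serialization/parse round trip: any layer (on-site monitoring, enterprise control, remote monitoring) can produce and consume the same form. The element and attribute names, and the example measurement, are invented for illustration; the paper would define its own schema.

```python
import xml.etree.ElementTree as ET

def quality_record(stage, measurement, value, unit):
    """Serialize one quality measurement in an assumed unified XML form."""
    rec = ET.Element("qualityRecord", stage=stage)
    m = ET.SubElement(rec, "measurement", name=measurement, unit=unit)
    m.text = str(value)
    return rec

# A machining stage reports one measurement.
rec = quality_record("turning", "diameter", 24.98, "mm")
xml_text = ET.tostring(rec, encoding="unicode")
print(xml_text)

# Any other layer parses the same representation without custom adapters.
parsed = ET.fromstring(xml_text)
m = parsed.find("measurement")
print(parsed.get("stage"), m.get("name"), m.text, m.get("unit"))
```

Sharing one schema across layers is what lets the quality data flow between otherwise heterogeneous monitoring and control systems.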

  17. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  18. ENTERPRISE SERVICES ARCHITECTURE IN THE WORLD OF INFORMATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Stefan IOVAN

    2012-05-01

    Full Text Available Enterprise Services Architecture (ESA) is a blueprint for how enterprise software should be constructed to provide maximum business value. The challenge facing most companies is not whether to adopt Service-Oriented Architecture (SOA), but when and how to do so. There is always a lag between technological vision and business feasibility. It also takes time to fully realize the potential of existing technologies, a process that does not stop the moment the new thing arrives. But when the value of a new approach such as ESA starts to make a difference and produces a competitive advantage, the motivation to change skyrockets. The time to change becomes now and the hunger for learning grows. The goal of this paper is to satisfy the hunger for information of those who suspect that ESA may be a gateway to transforming Information Technology (IT) into a strategic weapon. This paper will explain, in more detail than ever before, what ESA is and how the concept is brought to life across products as a platform supported by an ecosystem.

  19. Developing the architecture for the Climate Information Portal for Copernicus

    Science.gov (United States)

    Som de Cerff, Wim; Thijsse, Peter; Plieger, Maarten; Pascoe, Stephen; Jukes, Martin; Leadbetter, Adam; Goosen, Hasse; de Vreede, Ernst

    2015-04-01

    environment and society, but will develop an end-to-end processing chain (indicator toolkit), from comprehensive information on the climate state through to highly aggregated, decision-relevant products. This processing chain will be demonstrated within three thematic areas: water, rural and urban. Indicators of climate change and climate-change impact will be provided, and a toolkit to update and post-process the collection of indicators will be integrated into the portal. For the indicators, three levels (tiers) have been loosely defined: Tier 1: fields summarising properties of the climate system, e.g. temperature change; Tier 2: expressed in terms of environmental properties outside the climate system, e.g. flooding change; Tier 3: expressed in terms of social and economic impact. For the architecture, CLIPC has two interlocked themes: 1. Harmonised access to climate datasets derived from models, observations and re-analyses; 2. A climate impact toolkit to evaluate, rank and aggregate indicators. For development of the CLIPC architecture, an Agile 'storyline' approach is taken. The storyline is a real-world use case and consists of producing a Tier 3 indicator (Urban Heat Vulnerability) and making it available through the CLIPC infrastructure for a user group. In this way architecture concepts can be directly tested and improved, and the produced indicator can be shown to users to refine requirements. The main components of the CLIPC architecture are 1) data discovery and access, 2) data processing, 3) data visualization, 4) knowledge base and 5) user management. The main challenge for the data discovery and access component is to provide harmonized access to various sources of climate data (ngEO, EMODNET/SeaDataNet, ESGF, MyOcean). The discovery service concept will be provided using a CLIPC data and data-product catalogue and via a structured data search on selected infrastructures, using NERC vocabulary services and mappings. Data processing will be provided using OGC WPS services, linking

  20. Information Architecture and the Comic Arts: Knowledge Structure and Access

    Science.gov (United States)

    Farmer, Lesley S. J.

    2015-01-01

    This article explains information architecture, focusing on comic arts' features for representing and structuring knowledge. Then it details information design theory and information behaviors relative to this format, also noting visual literacy. Next, applications of comic arts in education are listed. With this background, several research…

  1. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the scientific information processing skill. It decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction, and document analysis, were used to build a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  2. Processes for an Architecture of Volume

    DEFF Research Database (Denmark)

    Mcgee, Wes; Feringa, Jelle; Søndergaard, Asbjørn

    2013-01-01

    This paper addresses both the architectural, conceptual motivations and the tools and techniques necessary for the digital production of an architecture of volume. The robotic manufacturing techniques of shaping volumetric materials by hot wire and abrasive wire cutting are discussed through...

  3. Design of Information Processing System Architecture Based on Cloud Computing%基于云计算信息处理系统体系结构设计

    Institute of Scientific and Technical Information of China (English)

    刘燕

    2012-01-01

    This study investigates the architecture of an intelligent management system oriented to cloud computing and the theory and methods of intensive data management. It establishes a prototype of the cloud-oriented intelligent management system, a schema normalization theory for the data space, and a requirement-reduction method to achieve personalized cloud computing management; it adopts a separated management architecture to achieve on-demand, two-way dynamic consistency maintenance and the extraction, fusion, and analysis of multi-source, unstructured data; and it establishes a security architecture oriented to data security and privacy protection to ensure the safety and trustworthiness of all types of data in the system.

  4. 2016 37th International Conference Information Systems Architecture and Technology

    CERN Document Server

    Grzech, Adam; Świątek, Jerzy; Wilimowska, Zofia

    2017-01-01

    This four volume set of books constitutes the proceedings of the 2016 37th International Conference Information Systems Architecture and Technology (ISAT), or ISAT 2016 for short, held on September 18–20, 2016 in Karpacz, Poland. The conference was organized by the Department of Management Systems and the Department of Computer Science, Wrocław University of Science and Technology, Poland. The papers included in the proceedings have been subject to a thorough review process by highly qualified peer reviewers. The accepted papers have been grouped into four parts: Part I—addressing topics including, but not limited to, systems analysis and modeling, methods for managing complex planning environment and insights from Big Data research projects. Part II—discoursing about topics including, but not limited to, Web systems, computer networks, distributed computing, and multi-agent systems and Internet of Things. Part III—discussing topics including, but not limited to, mobile and Service Oriented Architect...

  5. 36th International Conference on Information Systems Architecture and Technology

    CERN Document Server

    Grzech, Adam; Świątek, Jerzy; Wilimowska, Zofia

    2016-01-01

    This four volume set of books constitutes the proceedings of the 36th International Conference Information Systems Architecture and Technology 2015, or ISAT 2015 for short, held on September 20–22, 2015 in Karpacz, Poland. The conference was organized by the Computer Science and Management Systems Departments, Faculty of Computer Science and Management, Wroclaw University of Technology, Poland. The papers included in the proceedings have been subject to a thorough review process by highly qualified peer reviewers. The accepted papers have been grouped into four parts: Part I—addressing topics including, but not limited to, systems analysis and modeling, methods for managing complex planning environment and insights from Big Data research projects. Part II—discoursing about topics including, but not limited to, Web systems, computer networks, distributed computing, and multi-agent systems and Internet of Things. Part III—discussing topics including, but not limited to, mobile and Service Oriented Archi...

  6. Platform for Assessing Strategic Alignment Using Enterprise Architecture: Application to E-Government Process Assessment

    Directory of Open Access Journals (Sweden)

    Kaoutar Elhari

    2011-01-01

    Full Text Available This paper presents an overview of S2AEA (v2) (Strategic Alignment Assessment based on Enterprise Architecture, version 2), a platform for modelling enterprise architecture and for assessing strategic alignment based on internal enterprise architecture metrics. The idea of the platform is based on the fact that enterprise architecture provides a structure for business processes and the information systems that support them. This structure can be used to measure the degree of consistency between business strategies and information systems. In that sense, this paper presents a platform illustrating the role of enterprise architecture in the strategic alignment assessment. This assessment can be used in auditing information systems. The platform is applied to assess an e-government process.

  7. Platform for Assessing Strategic Alignment Using Enterprise Architecture: Application to E-Government Process Assessment

    CERN Document Server

    Elhari, Kaoutar

    2011-01-01

    This paper presents an overview of S2AEA (v2) (Strategic Alignment Assessment based on Enterprise Architecture (version 2)), a platform for modelling enterprise architecture and for assessing strategic alignment based on internal enterprise architecture metrics. The idea of the platform is based on the fact that enterprise architecture provides a structure for business processes and the information systems that support them. This structure can be used to measure the degree of consistency between business strategies and information systems. In that sense, this paper presents a platform illustrating the role of enterprise architecture in the strategic alignment assessment. This assessment can be used in auditing information systems. The platform is applied to assess an e-government process.

  8. Sparsity and Information Processing

    OpenAIRE

    Ikeda, Shiro

    2015-01-01

    Recently, many information processing methods utilizing the sparsity of the information source have been studied. We have reported some results in this line of research. Here we pick up two results from our own work. One is an image reconstruction method for radio interferometry and the other is a motor command computation method for a two-joint arm.

  9. Using an Architectural Metaphor for Information Design in Hypertext.

    Science.gov (United States)

    Deboard, Donn R.; Lee, Doris

    2001-01-01

    Uses Frank Lloyd Wright's (1867-1959) organic architecture as a metaphor to define the relationship between a part and a whole, whether the focus is on a building and its surroundings or information delivered via hypertext. Reviews effective strategies for designing text information via hypertext and incorporates three levels of information…

  10. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    Science.gov (United States)

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors and this makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to ensure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
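    The Mediator/Adapter pairing described in this abstract can be sketched in a few lines of Python. The class shapes, the flat `record` XML schema, and the `lab_system`/`radiology_system` names below are illustrative assumptions, not details from the paper; the point is only that each adapter translates its local format to a shared XML exchange format, and the mediator routes messages so the systems never talk to each other directly.

```python
import xml.etree.ElementTree as ET

class Adapter:
    """Translates one system's local records to and from a shared XML exchange
    format (the flat key/value schema here is a hypothetical illustration)."""

    def __init__(self, system_name):
        self.system_name = system_name

    def to_xml(self, record):
        # Local representation -> common exchange format.
        root = ET.Element("record", source=self.system_name)
        for key, value in record.items():
            ET.SubElement(root, key).text = str(value)
        return ET.tostring(root, encoding="unicode")

    def from_xml(self, xml_text):
        # Common exchange format -> local representation.
        root = ET.fromstring(xml_text)
        return {child.tag: child.text for child in root}


class Mediator:
    """Routes XML messages between registered adapters, so each institution
    keeps its local information system unchanged."""

    def __init__(self):
        self.adapters = {}

    def register(self, adapter):
        self.adapters[adapter.system_name] = adapter

    def forward(self, record, sender, receiver):
        xml_msg = self.adapters[sender].to_xml(record)
        return self.adapters[receiver].from_xml(xml_msg)


mediator = Mediator()
mediator.register(Adapter("lab_system"))
mediator.register(Adapter("radiology_system"))
result = mediator.forward({"id": "42", "test": "CBC"}, "lab_system", "radiology_system")
```

    The structured XML exchange is what lets a medical ontology layer (not sketched here) resolve semantic conflicts on top of this routing.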

  11. A Frame Based Architecture for Information Integration in CIMS

    Institute of Scientific and Technical Information of China (English)

    吴信东

    1992-01-01

    This paper formulates an architecture for information integration in computer integrated manufacturing systems (CIMS). The architecture takes the frame structure as the single link among applications and between applications and physical storage. All the advantages of form-feature-based integrated systems can be found in the frame-based architecture, since the frame structure here takes form features as its primitives. But other advantages that cannot be found in form-feature-based systems can also be obtained in frame-based architectures: default knowledge and dynamic domain knowledge can be attached to frames, and the frame structure is easy to change and extend, as the frame structure is a typical knowledge representation scheme in artificial intelligence on which much research and interest has been focused.
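    The frame properties the abstract highlights (default knowledge inherited from a parent frame, plus slots that are easy to change and extend) can be sketched as a tiny data structure. The `Frame` class and the machining feature/hole example are hypothetical illustrations of the general idea, not the paper's actual schema.

```python
class Frame:
    """A frame: named slots, with defaults inherited from a parent frame."""

    def __init__(self, name, parent=None, **slots):
        self.name = name
        self.parent = parent
        self.slots = dict(slots)

    def get(self, slot):
        # Walk up the parent chain, so parent slots act as default knowledge.
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:
            return self.parent.get(slot)
        raise KeyError(slot)

    def set(self, slot, value):
        # Frames are easy to change and extend: just add or override a slot.
        self.slots[slot] = value


# A generic machining-feature frame carrying default knowledge,
# specialized by a "hole" frame that adds and overrides slots.
feature = Frame("feature", material="steel", tolerance=0.1)
hole = Frame("hole", parent=feature, diameter=8.0)
hole.set("tolerance", 0.05)  # override the inherited default
```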

  12. An Architecture for Information Commerce Systems

    NARCIS (Netherlands)

    Hauswirth, Manfred; Jazayeri, Mehdi; Miklós, Zoltan; Podnar, Ivana; Di Nitto, Elisabetta; Wombacher, Andreas

    The increasing use of the Internet in business and commerce has created a number of new business opportunities and the need for supporting models and platforms. One of these opportunities is information commerce (i-commerce), a special case of ecommerce focused on the purchase and sale of

  13. An Architecture for Information Commerce Systems

    NARCIS (Netherlands)

    Hauswirth, Manfred; Jazayeri, Mehdi; Miklós, Zoltan; Podnar, Ivana; Di Nitto, Elisabetta; Wombacher, Andreas

    2001-01-01

    The increasing use of the Internet in business and commerce has created a number of new business opportunities and the need for supporting models and platforms. One of these opportunities is information commerce (i-commerce), a special case of ecommerce focused on the purchase and sale of informatio

  14. The Process Architecture of EU Territorial Cohesion Policy

    Directory of Open Access Journals (Sweden)

    Andreas Faludi

    2010-08-01

    Full Text Available When preparing the European Spatial Development Perspective (ESDP), Member States were supported by the European Commission but denied the EU a competence in the matter. Currently, the Treaty of Lisbon identifies territorial cohesion as a competence shared between the Union and the Member States. This paper is about the process architecture of territorial cohesion policy. In the past, this architecture resembled the Open Method of Coordination (OMC), which the White Paper on European Governance praised, but only in areas where there was no EU competence. This reflected zero-sum thinking which may continue even under the Lisbon Treaty. After all, for as long as territorial cohesion was not a competence, voluntary cooperation as practiced in the ESDP process was pursued in this way. However, the practice of EU policies, even in areas where there is an EU competence, often exhibits features of the OMC. Surprisingly effective innovations hold the promise of rendering institutions of decision making comprehensible and democratically accountable. In the EU as a functioning polity, decision making is thus at least in part deliberative, so that actors’ preferences are transformed by the force of the better argument. This brings into focus the socialisation of the deliberators into epistemic communities. Largely an informal process, this is reminiscent of European spatial planning having been characterised as a learning process.

  15. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the Tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  16. Laying the groundwork for enterprise-wide medical language processing services: architecture and process.

    Science.gov (United States)

    Chen, Elizabeth S; Maloney, Francine L; Shilmayster, Eugene; Goldberg, Howard S

    2009-11-14

    A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs.

  17. Process Principle of Information

    Institute of Scientific and Technical Information of China (English)

    张高锋; 任君

    2006-01-01

    Ⅰ. Introduction. Information structure is the organization model of given and new information in the course of information transmission. A discourse contains a variety of information, and not all the information listed in the discourse is necessary and useful to us. When we decode a discourse, usually we do not need to read every word in the discourse or text, but skim or scan it to search for what we think is important or useful to us as quickly as possible. Ⅱ. Process Principles of Informati...

  18. The straight-line information security architecture

    Energy Technology Data Exchange (ETDEWEB)

    Nilsen, C.

    1995-08-01

    Comprehensive monitoring can provide a wealth of sensor data useful in enhancing the safety, security, and international accountability of stored nuclear material. However, care must be taken to distribute this type of data on a need-to-know basis to the various types of users. The following paper describes an exploratory effort on behalf of Sandia National Labs to integrate commercially available systems to securely disseminate (on a need-to-know basis) both classified and unclassified sensor information to a variety of users on the Internet.

  19. Information services and information processing

    Science.gov (United States)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  20. Shifts in the architecture of the Nationwide Health Information Network.

    Science.gov (United States)

    Lenert, Leslie; Sundwall, David; Lenert, Michael Edward

    2012-01-01

    In the midst of a US$30 billion investment in the Nationwide Health Information Network (NwHIN) and electronic health records systems, a significant change in the architecture of the NwHIN is taking place. Prior to 2010, the focus of information exchange in the NwHIN was the Regional Health Information Organization (RHIO). Since 2010, the Office of the National Coordinator (ONC) has been sponsoring policies that promote an internet-like architecture that encourages point-to-point information exchange and private health information exchange networks. The net effect of these activities is to undercut the limited business model for RHIOs, decreasing the likelihood of their success, while making the NwHIN dependent on nascent technologies for community-level functions such as record locator services. These changes may impact the health of patients and communities. Independent, scientifically focused debate is needed on the wisdom of ONC's proposed changes in its strategy for the NwHIN.

  1. Versatile architectures for onboard payload signal processing

    NARCIS (Netherlands)

    Walters, K.H.G.

    2013-01-01

    This thesis describes a system-on-chip (SoC) architecture for future space missions. The SoC market for deep-space missions develops slowly and is limited in features compared to the market of consumer electronics. Where consumers often cannot keep up with the features which are offered to them

  2. The holistic architectural approach to integrating the healthcare record in the overall information system.

    Science.gov (United States)

    Ferrara, F M; Sottile, P A; Grimson, W

    1999-01-01

    The integration and evolution of existing systems represents one of the most urgent problems facing those responsible for healthcare information systems so that the needs of the whole organisation are addressed. The management of the healthcare record represents one of the major requirements in the overall process, however it is also necessary to ensure that the healthcare record and other healthcare information is integrated within the context of an overall healthcare information system. The CEN ENV 12967-1 'Healthcare Information Systems Architecture' standard defines a holistic architectural approach where the various, organisational, clinical, administrative and managerial requirements co-exist and cooperate, relying on a common heritage of information and services. This paper reviews the middleware-based approach adopted by CEN ENV 12967-1 and the specialisation necessary for the healthcare record based on CEN ENV 12265 'Electronic Healthcare Record Architecture'.

  3. An Architectural Style for Closed-loop Process-Control

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    This report describes an architectural style for distributed closed-loop process control systems with high performance and hard real-time constraints. The style strikes a good balance between the architectural qualities of performance and modifiability/maintainability that traditionally are often...

  4. An Architectural Style for Closed-loop Process-Control

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Eriksen, Ole

    2003-01-01

    This report describes an architectural style for distributed closed-loop process control systems with high performance and hard real-time constraints. The style strikes a good balance between the architectural qualities of performance and modifiability/maintainability that traditionally are often...

  5. Information Processing - Administrative Data Processing

    Science.gov (United States)

    Bubenko, Janis

    A three semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH), is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on infological topics. The third semester aimed to deepen the students’ knowledge in different parts of ADP and at the writing of a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic into an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  6. An Information Architecture To Support the Visualization of Personal Histories.

    Science.gov (United States)

    Plaisant, Catherine; Shneiderman, Ben; Mushlin, Rich

    1998-01-01

    Proposes an information architecture for personal-history data and describes how the data model can be extended to a runtime model for a compact visualization using graphical timelines. The model groups personal-history events into aggregates that are contained in facets, crosslinks are made, and data attributes are mapped. (Author/LRW)

  7. Introduction to information processing

    CERN Document Server

    Dietel, Harvey M

    2014-01-01

    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers for the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  8. Theory and Key Technologies of Architecture and Intelligent Information Processing for Internet of Things

    Institute of Scientific and Technical Information of China (English)

    赵志军; 沈强; 唐晖; 方旭明

    2011-01-01

    Internet of Things (IoT) is a virtual network dedicated to combining the Internet and a huge number of things together, where those things are connected to the Internet via various technologies, e.g., wired or wireless technologies. By connecting to the Internet, things, e.g., RFID tags, sensors or actuators, can communicate with each other. Firstly, this paper introduces the characteristics and network architecture of IoT and the differences between IoT and other networks, e.g., wireless sensor networks and RFID. To achieve the target of large-scale application, we propose a general network architecture for IoT. Meanwhile, the concept of a Regional Server is presented for the first time in the proposed architecture to address the "information isolated island" problem. Then, this paper claims that semantic interoperability, knowledge representation and context-aware technology are key technologies for IoT. Accordingly, this paper investigates in depth the key problems of intelligent information processing, e.g., the definition of information spaces, quality of information and information processing technologies.

  9. A Semantics-Rich Information Technology Architecture for Smart Buildings

    Directory of Open Access Journals (Sweden)

    Dario Bonino

    2014-11-01

    Full Text Available The design of smart homes, buildings and environments currently suffers from a low maturity of available methodologies and tools. Technologies, devices and protocols strongly bias the design process towards vertical integration, and more flexible solutions based on separation of design concerns are seldom applied. As a result, the current landscape of smart environments is mostly populated by defectively designed solutions where application requirements (e.g., end-user functionality) are too often mixed and intertwined with technical requirements (e.g., managing the network of devices). A mature and effective design process must, instead, rely on a clear separation between the application layer and the underlying enabling technologies, to enable effective design reuse. The role of smart gateways is to enable this separation of concerns and to provide an abstracted view of available automation technology to higher software layers. This paper presents a blueprint for the information technology (IT) architecture of smart buildings that builds on top of established software engineering practices, such as model-driven development and semantic representation, and that avoids many pitfalls inherent in legacy approaches. The paper will also present a representative use case where the approach has been applied and the corresponding modeling and software tools.

  10. Information security architecture an integrated approach to security in the organization

    CERN Document Server

    Killmeyer, Jan

    2000-01-01

    An information security architecture is made up of several components. Each component in the architecture focuses on establishing acceptable levels of control. These controls are then applied to the operating environment of an organization. Functionally, information security architecture combines technical, practical, and cost-effective solutions to provide an adequate and appropriate level of security.Information Security Architecture: An Integrated Approach to Security in the Organization details the five key components of an information security architecture. It provides C-level executives

  11. Generalized Information Architecture for Managing Requirements in IBM's Rational DOORS® Application.

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner; Shannon, Sharon A.

    2014-12-01

    When a requirements engineering effort fails to meet expectations, oftentimes the requirements management tool is blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool.

  12. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture

    Science.gov (United States)

    Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one’s fist at arm’s length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  13. Formation process of Malaysian modern architecture under influence of nationalism

    OpenAIRE

    宇高, 雄志; 山崎, 大智

    2001-01-01

    This paper examines the formation process of Malaysian modern architecture under the influence of nationalism, through the process of independence of Malaysia. The national style, "Malaysian national architecture", emerged against the background of the political environment of the post-colonial situation. Malaysian urban design is also determined by the balance between ethnic culture and national culture. In Malaysia, they decided to choose Malay ethnic culture as the national culture...

  14. Modeling cognitive and emotional processes: a novel neural network architecture.

    Science.gov (United States)

    Khashman, Adnan

    2010-12-01

    In our continuous attempts to model natural intelligence and emotions in machine learning, many research works emerge with different methods that are often driven by engineering concerns and have the common goal of modeling human perception in machines. This paper aims to go further in that direction by investigating the integration of emotion at the structural level of cognitive systems using the novel emotional DuoNeural Network (DuoNN). This network has hidden-layer DuoNeurons, where each has two embedded neurons: a dorsal neuron and a ventral neuron for cognitive and emotional data processing, respectively. When input visual stimuli are presented to the DuoNN, the dorsal cognitive neurons process local features while the ventral emotional neurons process the entire pattern. We present the computational model and the learning algorithm of the DuoNN, the parallel streaming method for input information (cognitive and emotional), and a comparison between the DuoNN and a recently developed emotional neural network. Experimental results show that the DuoNN architecture, configuration, and the additional emotional information processing yield higher recognition rates and faster learning and decision making.
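    The DuoNeuron idea in this abstract (one hidden unit whose dorsal neuron sees a local patch while its ventral neuron sees the whole pattern) can be sketched as a single forward pass. The fixed weights, the averaging combination rule and the sigmoid choice below are assumptions for illustration only; the paper's actual model is trained and its combination scheme may differ.

```python
import math

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

class DuoNeuron:
    """One hidden unit with two embedded neurons: a dorsal neuron fed a local
    patch (cognitive path) and a ventral neuron fed the entire input pattern
    (emotional path)."""

    def __init__(self, dorsal_w, ventral_w):
        self.dorsal_w = dorsal_w    # weights over the local patch
        self.ventral_w = ventral_w  # weights over the whole pattern

    def activate(self, patch, pattern):
        dorsal = sigmoid(sum(w * x for w, x in zip(self.dorsal_w, patch)))
        ventral = sigmoid(sum(w * x for w, x in zip(self.ventral_w, pattern)))
        # Combine the cognitive (dorsal) and emotional (ventral) responses;
        # a plain average is an assumption made for this sketch.
        return 0.5 * (dorsal + ventral)


pattern = [0.2, 0.8, 0.5, 0.1]
patch = pattern[:2]  # the local features this unit attends to
unit = DuoNeuron(dorsal_w=[1.0, -0.5], ventral_w=[0.3, 0.3, 0.3, 0.3])
out = unit.activate(patch, pattern)
```

    Both streams are computed from the same stimulus in parallel, which mirrors the paper's parallel cognitive/emotional streaming of input information.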

  15. ELISA, a demonstrator environment for information systems architecture design

    Science.gov (United States)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life-cycle activity. In late '92, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of Commercial Off-The-Shelf tools. ELISA was delivered to CNES in January '94 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). It is on the way to new evolutions.

  16. A Coprocessor for Accelerating Visual Information Processing

    CERN Document Server

    Stechele, W; Herrmann, S; Simon, J Lidon

    2011-01-01

    Visual information processing will play an increasingly important role in future electronics systems. In many applications, e.g. video surveillance cameras, the data throughput of microprocessors is not sufficient and power consumption is too high. Instruction profiling on a typical test algorithm has shown that pixel address calculations are the dominant operations to be optimized. Therefore AddressLib, a structured scheme for pixel addressing, was developed that can be accelerated by AddressEngine, a coprocessor for visual information processing. In this paper, the architectural design of AddressEngine is described, which in the first step supports a subset of the AddressLib. Dataflow and memory organization are optimized during architectural design. AddressEngine was implemented in an FPGA and was tested with the MPEG-7 Global Motion Estimation algorithm. Results on processing speed and circuit complexity are given and compared to a pure software implementation. The next step will be the support for the full Addres...

  17. An architecture for biological information extraction and representation.

    Science.gov (United States)

    Vailaya, Aditya; Bluvas, Peter; Kincaid, Robert; Kuchinsky, Allan; Creech, Michael; Adler, Annette

    2005-02-15

    Technological advances in biomedical research are generating a plethora of heterogeneous data at a high rate. There is a critical need for extraction, integration and management tools for information discovery and synthesis from these heterogeneous data. In this paper, we present a general architecture, called ALFA, for information extraction and representation from diverse biological data. The ALFA architecture consists of: (i) a networked, hierarchical, hyper-graph object model for representing information from heterogeneous data sources in a standardized, structured format; and (ii) a suite of integrated, interactive software tools for information extraction and representation from diverse biological data sources. As part of our research efforts to explore this space, we have currently prototyped the ALFA object model and a set of interactive software tools for searching, filtering, and extracting information from scientific text. In particular, we describe BioFerret, a meta-search tool for searching and filtering relevant information from the web, and ALFA Text Viewer, an interactive tool for user-guided extraction, disambiguation, and representation of information from scientific text. We further demonstrate the potential of our tools in integrating the extracted information with experimental data and diagrammatic biological models via the common underlying ALFA representation. aditya_vailaya@agilent.com.

  18. A Proposed Precision Network Measurements Architecture for the Philippine Research, Education, and Government Information Network (PREGINET)

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper proposes a network measurements architecture for the Philippine Research, Education, and Government Information Network (PREGINET). The proposed architecture is an NTP-based hybrid network measurements system that offers precise measurements, easy management, and low bandwidth consumption. An NTP-via-GPS setup is included in the architecture to provide precise time synchronization across the network. This setup provides a precise time reference both for the source of the measurements data and for the collecting or processing machines. The current network measurements implementations in PREGINET, presented in the latter part of the paper, provide a hint on what tools have to be developed in order to implement the proposed architecture.
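The precision of such an NTP-based setup rests on the standard four-timestamp request/response exchange, from which clock offset and round-trip delay are computed:

```python
def ntp_offset_delay(t0, t1, t2, t3):
    """Standard NTP estimates from one request/response exchange:
    t0 = client send, t1 = server receive,
    t2 = server send,  t3 = client receive."""
    offset = ((t1 - t0) + (t2 - t3)) / 2.0   # estimated clock offset
    delay = (t3 - t0) - (t2 - t1)            # round-trip network delay
    return offset, delay
```

With a GPS-disciplined server, t1 and t2 are traceable to a precise reference, so the offset estimate lets every measurement node timestamp data on a common time base.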

  19. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  20. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services of the Advanced Information Processing System (AIPS). This introduction and overview section briefly outlines the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  1. A learnable parallel processing architecture towards unity of memory and computing.

    Science.gov (United States)

    Li, H; Gao, B; Chen, Z; Zhao, Y; Huang, P; Ye, H; Liu, L; Liu, X; Kang, J

    2015-08-14

    Developing energy-efficient parallel information processing systems beyond the von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need for efficient information processing in data-driven applications such as big data and the Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named "iMemComp", where memory and logic are unified with single-type devices. Leveraging the nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped "iMemComp" with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such an architecture eliminates the energy-hungry data movement of von Neumann computers. Compared with contemporary silicon technology, adder circuits based on "iMemComp" improve speed by 76.8% and reduce power dissipation by 60.3%, together with a 700-fold reduction in circuit area.
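Logic-in-memory families built on resistive switches typically realize material implication (IMPLY) natively; NAND, and from it an adder, can be composed from IMPLY steps. A hedged sketch of that generic gate-level composition in plain Python (this illustrates the principle, not the actual iMemComp circuit):

```python
def imply(p, q):
    """Material implication, the primitive realized natively by many
    resistive-switching logic families: p IMP q = (NOT p) OR q."""
    return (not p) or q

def nand(p, q):
    # NAND from two IMPLY steps and one device reset to 0:
    # s = p IMP 0 = NOT p;  out = q IMP s = (NOT q) OR (NOT p) = NAND(p, q)
    s = imply(p, False)
    return imply(q, s)

def full_adder(a, b, cin):
    """1-bit full adder composed purely of NAND gates (standard 9-gate form)."""
    t1 = nand(a, b)
    t2 = nand(a, t1)
    t3 = nand(b, t1)
    s1 = nand(t2, t3)      # a XOR b
    t4 = nand(s1, cin)
    t5 = nand(s1, t4)
    t6 = nand(cin, t4)
    total = nand(t5, t6)   # sum = (a XOR b) XOR cin
    carry = nand(t1, t4)   # carry = ab OR (a XOR b)cin
    return total, carry
```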

  2. Information theoretic derivation of network architecture and learning algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R.D.; Barnes, C.W.; Lee, Y.C.; Mead, W.C.

    1991-01-01

    Using variational techniques, we derive a feedforward network architecture that minimizes a least squares cost function with the soft constraint that the mutual information between input and output be maximized. This permits optimum generalization for a given accuracy. A set of learning algorithms is also obtained. The network and learning algorithms are tested on a set of test problems that emphasize time series prediction. 6 refs., 1 fig.
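A least-squares cost with a soft mutual-information term can be sketched using a plug-in histogram estimate of I(X;Y). The estimator and weighting below are illustrative; the paper derives the architecture variationally rather than training directly against this cost:

```python
import numpy as np

def hist_mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def soft_constrained_cost(y_pred, y_true, x, lam=0.1):
    """Least-squares cost softly pushed toward high input-output
    mutual information (form is illustrative)."""
    mse = float(np.mean((y_pred - y_true) ** 2))
    return mse - lam * hist_mutual_information(x, y_pred)
```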

  3. The Process Architecture of EU Territorial Cohesion Policy

    OpenAIRE

    Andreas Faludi

    2010-01-01

    When preparing the European Spatial Development Perspective (ESDP), Member States were supported by the European Commission but denied the EU a competence in the matter. Currently, the Treaty of Lisbon identifies territorial cohesion as a competence shared between the Union and the Member States. This paper is about the process architecture of territorial cohesion policy. In the past, this architecture resembled the Open Method of Coordination (OMC) which the White Paper on European Governanc...

  4. Weather Information Processing

    Science.gov (United States)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  5. Complex processes from dynamical architectures with time-scale hierarchy.

    Directory of Open Access Journals (Sweden)

    Dionysios Perdikis

    Full Text Available The idea that complex motor, perceptual, and cognitive behaviors are composed of smaller units, which are somehow brought into a meaningful relation, permeates the biological and life sciences. However, no principled framework defining the constituent elementary processes has been developed to this date. Consequently, functional configurations (or architectures) relating elementary processes and external influences are mostly piecemeal formulations suitable to particular instances only. Here, we develop a general dynamical framework for distinct functional architectures characterized by the time-scale separation of their constituents and evaluate their efficiency. Thereto, we build on the (phase) flow of a system, which prescribes the temporal evolution of its state variables. The phase flow topology allows for the unambiguous classification of qualitatively distinct processes, which we consider to represent the functional units or modes within the dynamical architecture. Using the example of a composite movement we illustrate how different architectures can be characterized by their degree of time scale separation between the internal elements of the architecture (i.e. the functional modes) and external interventions. We reveal a tradeoff of the interactions between internal and external influences, which offers a theoretical justification for the efficient composition of complex processes out of non-trivial elementary processes or functional modes.

  6. Complex processes from dynamical architectures with time-scale hierarchy.

    Science.gov (United States)

    Perdikis, Dionysios; Huys, Raoul; Jirsa, Viktor

    2011-02-10

    The idea that complex motor, perceptual, and cognitive behaviors are composed of smaller units, which are somehow brought into a meaningful relation, permeates the biological and life sciences. However, no principled framework defining the constituent elementary processes has been developed to this date. Consequently, functional configurations (or architectures) relating elementary processes and external influences are mostly piecemeal formulations suitable to particular instances only. Here, we develop a general dynamical framework for distinct functional architectures characterized by the time-scale separation of their constituents and evaluate their efficiency. Thereto, we build on the (phase) flow of a system, which prescribes the temporal evolution of its state variables. The phase flow topology allows for the unambiguous classification of qualitatively distinct processes, which we consider to represent the functional units or modes within the dynamical architecture. Using the example of a composite movement we illustrate how different architectures can be characterized by their degree of time scale separation between the internal elements of the architecture (i.e. the functional modes) and external interventions. We reveal a tradeoff of the interactions between internal and external influences, which offers a theoretical justification for the efficient composition of complex processes out of non-trivial elementary processes or functional modes.
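The time-scale separation at the heart of the framework can be illustrated with a toy two-timescale system: a fast functional-mode oscillation riding on a slow external drift. The equations are illustrative only, not the authors' movement model:

```python
import math

def simulate_fast_slow(eps=0.05, dt=0.001, steps=20000):
    """Euler integration of a minimal two-timescale system: the fast
    phase advances at rate 1/eps while the slow variable drifts at rate 1.
    Returns (slow, sin(fast)) samples along the trajectory."""
    slow, fast = 0.0, 0.0
    traj = []
    for _ in range(steps):
        slow += dt * 1.0                  # slow, external-level drift
        fast += dt * (2 * math.pi / eps)  # fast functional-mode oscillation
        traj.append((slow, math.sin(fast)))
    return traj
```

The smaller eps is, the more cycles the fast mode completes per unit of slow time, i.e. the cleaner the separation between internal modes and external influences.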

  7. An architecture model for multiple disease management information systems.

    Science.gov (United States)

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support their daily practice and to optimize the inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. While applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances the reusability and reduces the development time of disease management information systems. The proposed architecture model has been successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the development time was consumed, and the workflow was improved. Overall user response is positive, and the system's supportiveness during the daily workflow is high. The system empowers case managers with better information and leads to better decision making.

  8. Heterogeneous architecture to process swarm optimization algorithms

    Directory of Open Access Journals (Sweden)

    Maria A. Dávila-Guzmán

    2014-01-01

    Full Text Available In recent years, parallel processing has been embedded in personal computers through the inclusion of co-processing units, such as graphics processing units, resulting in a heterogeneous platform. This paper presents the implementation of swarm algorithms on this platform to solve several functions from optimization problems, algorithms which highlight their inherent parallel processing and distributed control features. In the swarm algorithms, each individual and each problem dimension are parallelized according to the granularity of the processing system, which also offers low communication latency between individuals through the embedded processing. To evaluate the potential of swarm algorithms on graphics processing units we have implemented two of them: the particle swarm optimization algorithm and the bacterial foraging optimization algorithm. The algorithms' performance is measured as the acceleration of the NVIDIA GeForce GTX480 heterogeneous platform over a typical sequential processing platform; the results show that the particle swarm algorithm obtained a speedup of up to 36.82x and the bacterial foraging algorithm up to 9.26x. Finally, the effect of increasing the population size is evaluated; we show that both the dispersion and the quality of the solutions decrease despite the high acceleration performance, since the initial distribution of the individuals can converge to a locally optimal solution.
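A minimal particle swarm optimizer on the sphere function can be written in a vectorized style, where array operations over the whole population stand in for the per-particle and per-dimension parallelism the paper maps onto the GPU. Parameter values are conventional defaults, not the paper's settings:

```python
import numpy as np

def pso_sphere(n_particles=30, dim=5, iters=200, seed=0):
    """Vectorized particle swarm optimization minimizing sum(x^2).
    Every update touches all particles and dimensions at once."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = (x ** 2).sum(axis=1)
    gbest = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = (x ** 2).sum(axis=1)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())
```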

  9. Dynamic information architecture system (DIAS) : multiple model simulation management.

    Energy Technology Data Exchange (ETDEWEB)

    Simunich, K. L.; Sydelko, P.; Dolph, J.; Christiansen, J.

    2002-05-13

    Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application contexts. The modeling domain of a specific DIAS-based simulation is determined by (1) software Entity (domain-specific) objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. In DIAS, models communicate only with Entity objects, never with each other. Each Entity object has a number of Parameter and Aspect (of behavior) objects associated with it. The Parameter objects contain the state properties of the Entity object. The Aspect objects represent the behaviors of the Entity object and how it interacts with other objects. DIAS extends the "Object" paradigm by abstracting the object's dynamic behaviors, separating the "WHAT" from the "HOW." DIAS object class definitions contain an abstract description of the various aspects of the object's behavior (the WHAT), but no implementation details (the HOW). Separate DIAS models/applications carry the implementation of object behaviors (the HOW). Any model deemed appropriate, including existing legacy-type models written in other languages, can drive entity object behavior. The DIAS design promotes plug-and-play of alternative models, with minimal recoding of existing applications. The DIAS Context Builder object builds a construct, or scenario, for the simulation, based on developer specification and user inputs. Because DIAS is a discrete event simulation system, there is a Simulation Manager object through which all events are processed. Any class that registers to receive events must implement an event handler (method) to process the event during execution. Event handlers
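The event-processing pattern described, a central Simulation Manager dispatching to registered handlers in time order, can be sketched with a priority queue. This is a generic discrete-event kernel, not the DIAS API:

```python
import heapq

class SimulationManager:
    """Central discrete-event kernel: objects register handlers for
    event types; events are dispatched in time order (sketch only)."""
    def __init__(self):
        self._queue = []
        self._handlers = {}
        self._seq = 0       # tie-breaker for events scheduled at equal times
        self.now = 0.0

    def register(self, event_type, handler):
        self._handlers.setdefault(event_type, []).append(handler)

    def schedule(self, time, event_type, payload=None):
        heapq.heappush(self._queue, (time, self._seq, event_type, payload))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, event_type, payload = heapq.heappop(self._queue)
            for handler in self._handlers.get(event_type, []):
                handler(self, payload)
```

Even if events are scheduled out of order, the heap guarantees handlers fire in simulated-time order, which is what lets independent models plug into one shared clock.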

  10. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adapts the "4+1" architectural-view approach. In this new architecture, PACS and RIS become one while the user interaction can be automated by customized workflow process. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  11. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    Science.gov (United States)

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
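The exploratory evolutionary computation the paper relates to morphogenetic design can be illustrated with the simplest possible evolutionary loop, a (1+1) evolution strategy over a parameter vector. The fitness function and parameters below are placeholders for whatever design criterion a morphology would be evaluated against:

```python
import random

def evolve(fitness, dim=4, generations=1000, sigma=0.1, seed=1):
    """(1+1) evolution strategy: mutate one parent per generation and
    keep the child only if it is at least as fit (maximization)."""
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(dim)]
    best = fitness(parent)
    for _ in range(generations):
        child = [p + rng.gauss(0, sigma) for p in parent]
        f = fitness(child)
        if f >= best:
            parent, best = child, f
    return parent, best
```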

  12. Versatile architectures for onboard payload signal processing

    NARCIS (Netherlands)

    Walters, Karel Hubertus Gerardus

    2013-01-01

    This thesis presents a hardware fused-multiply-add floating point unit, called Sabrewing, which has properties that satisfy the needs for payload processing units. Sabrewing is BSD licensed and made in Europe such that it bypasses the export regulations of the USA. It combines floating-point and fix

  13. Architecture for Cross-Organizational Business Processes

    NARCIS (Netherlands)

    Hoffner, Yigal; Ludwig, Heiko; Gülcü, Ceki; Grefen, P.W.P.J.

    Efficient means of electronic interaction are an essential requirement for the integration of different companies' business processes along the value chain. Until recently, this interaction relied on expensive, complex and inflexible solutions, mostly based on EDI or some proprietary means. The high

  14. Advanced Computing Architectures for Cognitive Processing

    Science.gov (United States)

    2009-07-01

    customized datapath elements, encryption circuits optimized for specific keys, string matching circuits for publish/subscribe computations or...parallel datapaths, RC implementations can concurrently search various paths for determining likely meanings or predictions for text strings. This...signal processing applications, with the ability to relatively easily build a pipelined datapath optimized for the specific application needs. For this

  15. Maximum density of quantum information in a scalable CMOS implementation of the hybrid qubit architecture

    Science.gov (United States)

    Rotta, Davide; De Michielis, Marco; Ferraro, Elena; Fanciulli, Marco; Prati, Enrico

    2016-06-01

    Scalability from single-qubit operations to multi-qubit circuits for quantum information processing requires architecture-specific implementations. Semiconductor hybrid qubit architecture is a suitable candidate to realize large-scale quantum information processing, as it combines a universal set of logic gates with fast and all-electrical manipulation of qubits. We propose an implementation of hybrid qubits, based on Si metal-oxide-semiconductor (MOS) quantum dots, compatible with the CMOS industrial technological standards. We discuss the realization of multi-qubit circuits capable of fault-tolerant computation and quantum error correction, by evaluating the time and space resources needed for their implementation. As a result, the maximum density of quantum information is extracted from a circuit including eight logical qubits encoded by the [[7, 1, 3]] quantum error correction code.

  16. An interoperability architecture for the health information exchange in Rwanda

    CSIR Research Space (South Africa)

    Crichton, R

    2012-08-01

    Full Text Available in Rwanda and enable other systems currently implemented in the country to connect and inter-operate more easily. The HIE plans to promote data re-use between the connected systems and to facilitate information sharing. It also aims to provide... inputs were received from the Rwanda Health Enterprise Architecture (RHEA) project team, including Mead Walker, Beatriz de Faria Leao, Paul Biondich, Wayne Naidoo and Eduardo Jezierski. Additional support was obtained from eZ-Vida in Brazil and...

  17. Software Architecture for Modeling and Simulation of Underwater Acoustic Information Systems

    Institute of Scientific and Technical Information of China (English)

    WANG Xi-min; CAI Zhi-ming

    2009-01-01

    The simulation of underwater acoustic information flow is an important way to research sonar performance and its engagement effectiveness in the ocean environment. This paper analyzes the significance of modeling an open and sophisticated simulation software architecture using an object-oriented method, and introduces the modeling processes and the means of expressing the simulation architecture. According to the requirements of the simulation system and the underwater acoustic information flow, the logical architecture of the simulation software system is modeled by the object-oriented method. A use-case view captures the system requirements, and the logical view shows the logical architecture of the software system. The simulation software is decomposed into loosely coupled constituent parts by layering and partitioning the packages for maintainability. Design patterns enable the simulation software to have good expansibility and reusability. A simulation system involving multiple targets and multiple sonars was developed based on the architecture model. Practice shows that the model meets the needs for simulating an open and sophisticated system.

  18. Lockheed Martin Idaho Technologies Company information management technology architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, M.J.; Lau, P.K.S.

    1996-05-01

    The Information Management Technology Architecture (TA) is being driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document constantly being reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.

  19. Combinatorial structures and processing in neural blackboard architectures

    NARCIS (Netherlands)

    van der Velde, Frank; van der Velde, Frank; de Kamps, Marc; Besold, Tarek R.; d'Avila Garcez, Artur; Marcus, Gary F.; Miikkulainen, Risto

    2015-01-01

    We discuss and illustrate Neural Blackboard Architectures (NBAs) as the basis for variable binding and combinatorial processing in the brain. We focus on the NBA for sentence structure. NBAs are based on the notion that conceptual representations are in situ, hence cannot be copied or transported.

  20. Architecture and key technologies of grid geographic information system

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The highly efficient and stable collaborative computation platform for geospatial information can be constructed on the basis of Grid computing technology, combined with Peer-to-Peer (P2P) computing technology and geospatial database technology. This paper proposed the architecture and key technologies of the Grid GIS (Geographic Information System) incorporated with P2P structure, and correspondingly a Grid GIS prototype named Nebula was designed and then implemented. Nebula is a suite of middleware for geospatial Grid computing, which could be deployed onto various service nodes in network. Based on Grid protocols and infrastructure, Nebula provides invocation interfaces to users in form of Grid services. By using P2P message based communication mechanism, complex geospatial computation tasks could be accomplished by Nebula in a collaborative way. This paper introduced Nebula’s architecture and key modules, and according to experimental data, we discussed the Grid GIS’s advantages, application scenarios and future directions.

  1. The Complex Information Process

    Directory of Open Access Journals (Sweden)

    Edwina Taborsky

    2000-07-01

    Full Text Available This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  2. The Complex Information Process

    Science.gov (United States)

    Taborsky, Edwina

    2000-09-01

    This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  3. Processes models, environmental analyses, and cognitive architectures: quo vadis quantum probability theory?

    Science.gov (United States)

    Marewski, Julian N; Hoffrage, Ulrich

    2013-06-01

    A lot of research in cognition and decision making suffers from a lack of formalism. The quantum probability program could help to improve this situation, but we wonder whether it would provide even more added value if its presumed focus on outcome models were complemented by process models that are, ideally, informed by ecological analyses and integrated into cognitive architectures.

  4. An Executable Architecture Tool for the Modeling and Simulation of Operational Process Models

    Science.gov (United States)

    2015-03-16

    national coordination and iterative development. This paper includes a literature review, background information on process models and architecture...future work involves coordination with Subject Matter Experts (SMEs), and extracting data from experiments to assign more appropriate values. 3) Sub...development. This paper provided a brief description of other available tools, Fig. 10. Snapshot of Simulation Output Results for Example 3 background

  5. Architecture of Maritime Awareness System Supplied with External Information

    Directory of Open Access Journals (Sweden)

    Stróżyna Milena

    2016-12-01

    Full Text Available In this paper, we discuss a software architecture which has been developed for the needs of the System for Intelligent Maritime Monitoring (SIMMO). The system is based on state-of-the-art information fusion and intelligence analysis techniques, generates an enhanced Recognized Maritime Picture, and thus supports situation analysis and decision-making. The SIMMO system aims to automatically fuse up-to-date maritime data from the Automatic Identification System (AIS) and open Internet sources. Based on the collected data, data analysis is performed to detect suspicious vessels. The functionality of the system is realized in a number of different modules (web crawler, data fusion, anomaly detection, and visualization modules) that share the AIS and external data stored in the system’s database. The aim of this article is to demonstrate how external information can be leveraged in a maritime awareness system and what software solutions are necessary. A working system is presented as a proof of concept.
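An anomaly-detection module over AIS-like records can be sketched, in the most reduced form, as a statistical outlier test on one reported field. The record fields (`mmsi`, `sog`) follow AIS naming, but the z-score rule is purely illustrative, not the SIMMO algorithm:

```python
from statistics import mean, stdev

def flag_speed_anomalies(tracks, z_thresh=3.0):
    """Toy anomaly detector: flag vessels whose reported speed over
    ground (sog) deviates strongly from the fleet average."""
    speeds = [t["sog"] for t in tracks]
    mu, sigma = mean(speeds), stdev(speeds)
    if sigma == 0:
        return []
    return [t["mmsi"] for t in tracks
            if abs(t["sog"] - mu) / sigma > z_thresh]
```

A real module would fuse the AIS stream with crawled external data before scoring, but the pattern of shared records in, flagged vessel identifiers out, is the same.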

  6. Handbook on neural information processing

    CERN Document Server

    Maggini, Marco; Jain, Lakhmi

    2013-01-01

    This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: deep architectures; recurrent, recursive, and graph neural networks; cellular neural networks; Bayesian networks; approximation capabilities of neural networks; semi-supervised learning; statistical relational learning; kernel methods for structured data; multiple classifier systems; self organisation and modal learning; and applications to ...

  7. The architecture of a modern military health information system.

    Science.gov (United States)

    Mukherji, Raj J; Egyhazy, Csaba J

    2004-06-01

    This article describes a melding of a government-sponsored architecture for complex systems with the open systems engineering architecture developed by the Institute of Electrical and Electronics Engineers (IEEE). Our experience in using these two architectures to build a complex healthcare system is described in this paper. The work described shows that it is possible to combine these two architectural frameworks in describing the systems, operational, and technical views of a complex automation system. The advantage of combining the two architectural frameworks lies in the simplicity of implementation and the ease with which medical professionals can understand the automation system's architectural elements.

  8. Architectural design of heterogeneous metallic nanocrystals--principles and processes.

    Science.gov (United States)

    Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang

    2014-12-16

    CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design

  9. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order...

  10. Optoelectronic Information Processing

    Science.gov (United States)

    2012-03-07

    printing for III-V/Si heterogeneous integration • Single-layer Si NM photonic crystal Fano membrane reflector (MR) replaces conventional DBR ... Monolithic integration on focal plane arrays using standard processes • Wavelength & polarization tunable on a pixel-by-pixel basis • Collection... photonic crystals, nano-antennas, nano-emitters & modulators. [Agency Reviews, National Academies input] Integrated Photonics, Optical Components

  11. Bioinspired decision architectures containing host and microbiome processing units.

    Science.gov (United States)

    Heyde, K C; Gallagher, P W; Ruder, W C

    2016-09-27

    Biomimetic robots have been used to explore and explain natural phenomena ranging from the coordination of ants to the locomotion of lizards. Here, we developed a series of decision architectures inspired by the information exchange between a host organism and its microbiome. We first modeled the biochemical exchanges of a population of synthetically engineered E. coli. We then built a physical, differential-drive robot that contained an integrated, onboard computer vision system. A relay was established between the simulated population of cells and the robot's microcontroller. By placing the robot within a target-containing, two-dimensional arena, we explored how different aspects of the simulated cells and the robot's microcontroller could be integrated to form hybrid decision architectures. We found that distinct decision architectures allow us to develop models of computation with specific strengths such as runtime efficiency or minimal memory allocation. Taken together, our hybrid decision architectures provide a new strategy for developing bioinspired control systems that integrate both living and nonliving components.
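    The host-microbiome relay described above can be caricatured in a few lines. The saturating cell response and the decision rule below are illustrative assumptions, not the model from the cited paper:

```python
# Toy sketch of a hybrid decision loop: a simulated cell population's
# output is relayed to a robot controller that also sees a vision input.
# The response function and decision rule are illustrative assumptions.

def cell_population_signal(inducer):
    """Simulated population output: reporter level grows with inducer
    concentration and saturates (simple Hill-like response)."""
    return inducer / (1.0 + inducer)

def robot_decision(reporter, target_visible):
    """Drive toward the target only if the vision system sees it AND
    the simulated cells report a strong enough signal."""
    return "drive" if target_visible and reporter > 0.5 else "search"

print(robot_decision(cell_population_signal(3.0), True))   # drive
print(robot_decision(cell_population_signal(0.5), True))   # search
```

    Swapping the decision rule (e.g., letting the cells alone gate motion, or adding memory of past readings) is what yields the distinct architectures with different runtime and memory trade-offs.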

  12. Security Risk Assessment Process for UAS in the NAS CNPC Architecture

    Science.gov (United States)

    Iannicca, Dennis Christopher; Young, Daniel Paul; Suresh, Thadhani; Winter, Gilbert A.

    2013-01-01

    This informational paper discusses the risk assessment process conducted to analyze Control and Non-Payload Communications (CNPC) architectures for integrating civil Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The assessment employs the National Institute of Standards and Technology (NIST) Risk Management Framework to identify threats, vulnerabilities, and risks to these architectures and recommends corresponding mitigating security controls. This process builds upon earlier work performed by RTCA Special Committee (SC) 203 and the Federal Aviation Administration (FAA) to roadmap the risk assessment methodology and to identify categories of information security risks that pose a significant impact to aeronautical communications systems. The deviations from the typical process with regard to this aeronautical communications system are also described. Due to the sensitive nature of the information, data resulting from the risk assessment pertaining to threats, vulnerabilities, and risks is beyond the scope of this paper.

  13. E-health and healthcare enterprise information system leveraging service-oriented architecture.

    Science.gov (United States)

    Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Cheng, Po-Hsun; Lai, Feipei

    2012-04-01

    To present the successful experiences of an integrated, collaborative, distributed, large-scale enterprise healthcare information system over a wired and wireless infrastructure in National Taiwan University Hospital (NTUH). In order to smoothly and sequentially transfer from the complex relations among the old (legacy) systems to the new-generation enterprise healthcare information system, we adopted a multitier framework based on service-oriented architecture to integrate the heterogeneous systems as well as to interoperate among many other components and multiple databases. We also present mechanisms of a logical-layer reusability approach and data (message) exchange flow via Health Level 7 (HL7) middleware, the DICOM standard, and the Integrating the Healthcare Enterprise workflow. The architecture and protocols of the NTUH enterprise healthcare information system, especially the Inpatient Information System (IIS), are discussed in detail. The NTUH Inpatient Healthcare Information System is designed and deployed on service-oriented architecture middleware frameworks. The mechanisms of integration as well as interoperability among the components and the multiple databases apply the HL7 standards for data exchanges, which are embedded in XML formats, and Microsoft .NET Web services to integrate heterogeneous platforms. The preliminary performance of the currently operating IIS is evaluated and analyzed to verify the efficiency and effectiveness of the designed architecture; it shows reliability and robustness in the highly demanding traffic environment of NTUH. The newly developed NTUH IIS provides an open and flexible environment not only to share medical information easily among other branch hospitals, but also to reduce the cost of maintenance. The HL7 message standard is widely adopted to cover all data exchanges in the system. All services are independent modules that enable the system to be deployed and configured with the highest degree of flexibility.
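    The HL7 data exchange mentioned above rests on pipe-delimited HL7 v2 messages (segments separated by carriage returns, fields by `|`). A minimal parsing sketch, using a synthetic message rather than NTUH data, looks like this:

```python
# Minimal sketch of HL7 v2 pipe-delimited message parsing, illustrating
# the kind of data exchange routed through HL7 middleware. The sample
# message below is synthetic.

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [field_lists, ...]}."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = ("MSH|^~\\&|HIS|HOSP|LAB|HOSP|202401010830||ADT^A01|1234|P|2.3\r"
       "PID|1||9876||DOE^JOHN")
parsed = parse_hl7(msg)
print(parsed["PID"][0][2])  # 9876 (the patient identifier field)
```

    Production systems would use a full HL7 library that also handles component (`^`) and repetition (`~`) separators and escape sequences; this sketch only shows the segment/field layering.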

  14. Industrial Information Processing

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order” challenge. An outcome of this capability is that the potential market for customized products will expand, resulting in a reduction in administrative and manufacturing costs. This potential for cost reduction, simultaneous with market expansion, is a source of competitive advantage; hence manufacturers have...

  15. Spatialization of social process vs singular object of architecture

    Directory of Open Access Journals (Sweden)

    Lujak Mihailo

    2010-01-01

    Full Text Available The fundamental subject of this research is the spatialization of social processes in the period of modernism, manifested through transformation and/or change in the meaning of space under a variety of social processes without changing the physical structure of space. These changes in meaning represent the specificity of development in space under the influence of the said social processes, in this case Yugoslav modernism, resulting in the creation of a singular object of architecture specific to a certain environment. These processes have been researched in the residential complex of Block 19a in New Belgrade, designed by architects Milan Lojanica, Predrag Cagić, and Borivoje Jovanović, and constructed between 1975 and 1982. The basic objective of this paper is, through research and critical analysis of the residential complex of Block 19a, to establish the crucial reasons why this complex is considered a landmark of the design practice of its time in Yugoslavia, and to determine its importance and potential influence on architectural development in the period following its construction. In other words, the basic objective is to establish whether the residential complex Block 19a represents a singular object of architecture in Yugoslavia/Serbia.

  16. Innovation in teaching and learning methods: Integrating sustainability subjects in the architectural design process

    Directory of Open Access Journals (Sweden)

    Carolina Sepúlveda M.

    2016-01-01

    Full Text Available Architectural training has always been linked to two opposed and complementary processes: creative thinking and linear/technical thinking. Nowadays, the training process of an architect is usually based on an experimental design studio, complemented by the formal teaching of theoretical and technical subjects. This system is based on the idea that it will produce a comprehensive professional who is capable of achieving creative, appropriate and viable solutions. However, this teaching method can carry hidden difficulties that may hinder the development of architecture students to their full potential. This article reports on the methodology and results of applying an innovative method of teaching and learning architecture. This method aims at maximising the capacity of students to integrate their creative and technical competencies in order to increase the quality of the work of future architects.

  17. Free-form architectural envelopes: Digital processes opportunities of industrial production at a reasonable price

    Directory of Open Access Journals (Sweden)

    E. Castaneda

    2015-06-01

    Full Text Available Free-form architecture is one of the major challenges for architects, engineers, and the building industry, due to the inherent difficulty of manufacturing double-curvature facades at reasonable prices and quality. This paper discusses the possibilities of manufacturing free-form facade panels for architectural envelopes supported by recent advances in CAD/CAM systems and digital processes. These methods allow for mould-free processes, thus reducing the final price. Examples of actual constructions are presented to prove the viability of computer numerically controlled (CNC) fabrication technologies. The scientific literature is reviewed, and promising fabrication methods (additive, subtractive, forming) to accomplish this goal are discussed. This research provides valuable information regarding the feasibility of manufacturing free-form panels for architectural envelopes at lower prices.

  18. 77 FR 58141 - Public Buildings Service; Information Collection; Art-in-Architecture Program National Artist...

    Science.gov (United States)

    2012-09-19

    ...-Architecture & Fine Arts Division (PCAC), 1800 F Street NW., Room 3305, Washington, DC 20405, at telephone(202... ADMINISTRATION Public Buildings Service; Information Collection; Art-in- Architecture Program National Artist... requirement regarding Art-in Architecture Program National Artist Registry (GSA Form 7437). The Art-in...

  19. Program information architecture/document hierarchy. [Information Management Systems, it's components and rationale

    Energy Technology Data Exchange (ETDEWEB)

    Woods, T.W.

    1991-09-01

    The Nuclear Waste Management System (NWMS) Management Systems Improvement Strategy (MSIS) (DOE 1990) requires that the information within the computer program and information management system be ordered into a precedence hierarchy for consistency. Therefore, the US Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) requested Westinghouse Hanford Company to develop a plan for NWMS program information, which the MSIS calls a document hierarchy. This report provides the results of that effort and describes the management system as a program information architecture. 3 refs., 3 figs.

  20. An Architecture of Deterministic Quantum Central Processing Unit

    OpenAIRE

    Xue, Fei; Chen, Zeng-Bing; Shi, Mingjun; Zhou, Xianyi; Du, Jiangfeng; Han, Rongdian

    2002-01-01

    We present an architecture for a QCPU (Quantum Central Processing Unit), based on a discrete quantum gate set, that can be programmed to approximate any n-qubit computation in a deterministic fashion. It can be built efficiently to implement computations with any required accuracy. The QCPU makes it possible to implement universal quantum computation with fixed, general-purpose hardware. Thus the complexity of the quantum computation can be put into the software rather than the hardware.
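    The idea of composing a fixed discrete gate set into target operations can be shown concretely with the standard identity X = H·T⁴·H (since T⁴ = Z and HZH = X). This sketch, which is a generic illustration and not the QCPU design itself, verifies the identity numerically with plain Python:

```python
# Sketch: compiling a target gate from a discrete gate set, the idea
# behind a gate-set-based QCPU. Here X is built exactly from {H, T}.
import cmath
import math

def matmul(a, b):
    """Multiply two 2x2 complex matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                          # Hadamard gate
T = [[1, 0], [0, cmath.exp(1j * math.pi / 4)]]  # pi/8 (T) gate

# Compose the program H T T T T H
U = H
for g in [T, T, T, T, H]:
    U = matmul(U, g)

X = [[0, 1], [1, 0]]  # target: Pauli-X
err = max(abs(U[i][j] - X[i][j]) for i in range(2) for j in range(2))
print(err < 1e-12)  # True
```

    Approximating arbitrary rotations (rather than this exact case) requires longer H/T sequences, with accuracy improving as the program length grows.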

  1. Using Enterprise Architecture for the Alignment of Information Systems in Supply Chain Management

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    Using information systems in supply chain management (SCM) has become commonplace, and architectural issues are therefore part of the agenda for this domain. This article uses three perspectives on enterprise architecture (EA) in the supply chain: the "correlation view," the "remote view" and the "institutional view." It is shown that EA in the supply chain domain has to meet quite a complicated set of demands. Drawing strongly on Doucet et al. (2009), attention is given to practical alignment and assurance strategies between EA and SCM. A case of an apparel company with a global supply chain using a bespoke ERP system for supply chain support is presented and discussed. The case outlines potentials for enhanced alignment and coherence between management, business processes and the underlying information system; innovation is led by tighter integration with business partners...

  2. A novel digital pulse processing architecture for nuclear instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole [CEA, LIST - Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Paindavoine, Michel [CNRS, Universite de Bourgogne - Laboratoire d' Etude de l' Apprentissage et du Developpement, 21000 DIJON, (France)

    2015-07-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research; new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method, while the literature proposes other algorithms based on the frequency domain or wavelet theory that show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian characteristic of the signal, composed of randomly arriving pulses, which requires current architectures to work in data flow. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to achieve the performance necessary to deal with dead time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable digital pulse processing (DPP) architecture in a high-level language such as C or C++ which can reduce dead time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet
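    The dead-time loss discussed above can be quantified with the standard non-paralyzable dead-time model, m = n / (1 + nτ), where n is the true Poisson pulse rate, τ the per-pulse processing time, and m the recorded rate. A short sketch, assuming an illustrative 1 µs dead time, shows how losses grow with rate:

```python
# Sketch of the dead-time loss the record refers to: with a
# non-paralyzable dead time tau, a true Poisson pulse rate n is
# recorded as m = n / (1 + n*tau). The 1 us value is illustrative.

def recorded_rate(true_rate, dead_time):
    """Non-paralyzable dead-time model: recorded pulse rate."""
    return true_rate / (1.0 + true_rate * dead_time)

tau = 1e-6  # assumed 1 microsecond processing dead time
for n in (1e4, 1e5, 1e6):
    m = recorded_rate(n, tau)
    print(f"true {n:.0e}/s -> recorded {m:.3e}/s ({1 - m / n:.1%} lost)")
```

    At 10^6 pulses per second with τ = 1 µs, half the pulses are lost, which is exactly the regime that motivates architectures that do not paralyze acquisition during per-pulse processing.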

  3. SKOPE A connectionist/symbolic architecture of spoken Korean processing

    CERN Document Server

    Lee, G; Lee, Geunbae; Lee, Jong-Hyeok

    1995-01-01

    Spoken language processing requires speech and natural language integration. Moreover, spoken Korean calls for a unique processing methodology due to its linguistic characteristics. This paper presents SKOPE, a connectionist/symbolic spoken Korean processing engine, which emphasizes that: 1) connectionist and symbolic techniques must be selectively applied according to their relative strengths and weaknesses, and 2) the linguistic characteristics of Korean must be fully considered for phoneme recognition, speech and language integration, and morphological/syntactic processing. The design and implementation of SKOPE demonstrate how connectionist/symbolic hybrid architectures can be constructed for spoken agglutinative language processing. SKOPE also presents many novel ideas for speech and language processing. The phoneme recognition, morphological analysis, and syntactic analysis experiments show that SKOPE is a viable approach for spoken Korean processing.

  4. A Networked Perspective on the Engineering Design Process: At the Intersection of Process and Organisation Architectures

    DEFF Research Database (Denmark)

    Parraguez, Pedro

    projects often fail to be on time, on budget, and to meet specifications. Despite the wealth of process models available, previous approaches have been insufficient to provide a networked perspective that allows the challenging combination of organisational and process complexity to unfold. The lack...... the process. This combination of process structure—how people and activities are connected—and composition—the functional diversity of the groups participating in the process—is referred to as the actual design process architecture. This thesis reports on research undertaken to develop, apply and test......-driven reflection of the relationship between process architecture and performance, and to provide the means to compare process plans against the actual process. The framework is based on a multi-domain network approach to process architecture and draws on previous research using matrix-based and graph

  5. VLSI architecture of NEO spike detection with noise shaping filter and feature extraction using informative samples.

    Science.gov (United States)

    Hoang, Linh; Yang, Zhi; Liu, Wentai

    2009-01-01

    An emerging class of multi-channel neural recording systems aims to simultaneously monitor the activity of many neurons by miniaturizing and increasing the number of recording channels. The vast volume of data from these recording systems, however, presents a challenge for processing and wireless transmission. An on-chip neural signal processor is needed to filter out uninteresting recording samples and perform spike sorting. This paper presents a VLSI architecture of a neural signal processor that can reliably detect spikes via a nonlinear energy operator, enhance the spike signal-to-noise ratio with a noise-shaping filter, and select meaningful recording samples for clustering by using informative samples. The architecture is implemented in a 90-nm CMOS process, occupies 0.2 mm², and consumes 0.5 mW of power.
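    The nonlinear energy operator (NEO) used for spike detection has a simple closed form, ψ[n] = x[n]² − x[n−1]·x[n+1], which is large only where a sample is both high-amplitude and high-frequency. A minimal sketch, with a toy signal and an illustrative threshold rather than the paper's parameters, follows:

```python
# Sketch of the nonlinear energy operator (NEO) for spike detection:
# psi[n] = x[n]^2 - x[n-1]*x[n+1]. The signal and threshold below are
# illustrative toy values.

def neo(x):
    """NEO of a sampled signal, defined for samples 1 .. len(x)-2."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

signal = [0.1, -0.1, 0.0, 0.2, 3.0, -2.5, 0.3, 0.1, -0.2]
psi = neo(signal)
threshold = 3.0  # illustrative; real systems derive it from noise stats
spikes = [i + 1 for i, v in enumerate(psi) if v > threshold]
print(spikes)  # [4, 5] -- the sharp biphasic transient
```

    The noise-shaping filter described in the record would precede this step, whitening the background so a fixed threshold separates spikes more reliably.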

  6. BRICS and Quantum Information Processing

    DEFF Research Database (Denmark)

    Schmidt, Erik Meineche

    1998-01-01

    BRICS is a research centre and international PhD school in theoretical computer science, based at the University of Aarhus, Denmark. The centre has recently become engaged in quantum information processing in cooperation with the Department of Physics, also at the University of Aarhus. This extended abstract surveys activities at BRICS with special emphasis on the activities in quantum information processing.

  7. An Information Architecture Framework for the USAF: Managing Information from an Enterprise Perspective

    Science.gov (United States)

    2010-03-01

    Architecture Framework (E2AF) • Computer Integrated Manufacturing Open Systems Architecture (CIMOSA) • The Open Group Architecture Framework (TOGAF) ... API: Application Programming Interface; CDM: conceptual data model; CIMOSA: Computer Integrated Manufacturing Open Systems Architecture; DoDAF

  8. All-IP-Ethernet architecture for real-time sensor-fusion processing

    Science.gov (United States)

    Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya

    2016-03-01

    Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with much greater difficulties. To fulfill these requirements, we adopt an all-IP based architecture: the all-IP-Ethernet based data processing system consists of (1) sensors/detectors that directly output data as an IP-Ethernet packet stream, (2) aggregation into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves the existing problems in constructing large-scale sensor-fusion systems.

  9. An Approach to Share Architectural Drawing Information and Document Information for Automated Code Checking System

    Institute of Scientific and Technical Information of China (English)

    Jungsik Choi; Inhan Kim

    2008-01-01

    The purpose of this study is to suggest a way of optimally managing and sharing information between standard architectural drawings and construction documents in the Korean architectural industry for an automated code checking system via linked STEP and XML. To achieve this purpose, the authors have analyzed current research and technical developments for STEP and XML linking and developed a prototype system for sharing information between model-based drawings and XML-based construction documents. Finally, the authors suggest a practical use scenario for sharing information through linked STEP and XML using a test case of automated code checking. In the paper, the possibility of constructing an integrated architectural computing environment through the exchange and sharing of drawing information and external data for the whole building life-cycle, from the conceptual design stage to the construction and maintenance stage, is examined. Automated code checking through linked STEP and XML could be enhanced through business collaboration, more complete codes, improved building performance, and reduced construction costs.
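    The automated code-checking idea can be sketched against a toy XML export: building data is parsed and checked against a simple rule. The element names, attributes and the door-width rule below are hypothetical, not the STEP/XML schema discussed in the paper:

```python
# Sketch of automated code checking over an XML building document.
# Element/attribute names and the rule are illustrative assumptions.
import xml.etree.ElementTree as ET

doc = """<building>
  <door id="D1" width_mm="850"/>
  <door id="D2" width_mm="950"/>
</building>"""

MIN_DOOR_WIDTH_MM = 900  # hypothetical accessibility requirement

root = ET.fromstring(doc)
violations = [d.get("id") for d in root.iter("door")
              if int(d.get("width_mm")) < MIN_DOOR_WIDTH_MM]
print(violations)  # ['D1']
```

    A production checker would draw the geometry from the STEP-linked drawing model and the rules from a machine-readable encoding of the building code, but the check itself reduces to queries of this shape.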

  10. The start up as a phase of architectural design process.

    Science.gov (United States)

    Castro, Iara Sousa; Lima, Francisco de Paula Antunes; Duarte, Francisco José de Castro Moura

    2012-01-01

    Alterations made to an architectural design can be considered a continuous process, from its conception to the moment the built environment is already in use. This article focuses on the "moving phase", the initial moment of occupation of the environment and the start-up of services. It aims to show that the continuity of ergonomics interventions during the moving phase, or start-up, may reveal inadequacies of the built environment, clearly showing needs not met by the design and allowing instant decisions to solve unforeseen problems. The results reveal some lessons experienced by users during a critical stage not usually included in the design process.

  11. COMPLEX THINKING IN THE PROCESS OF LEARNING ARCHITECTURAL COMPOSITION

    Directory of Open Access Journals (Sweden)

    Špela Hudnik

    2013-12-01

    Full Text Available Complex thinking has a central role in a learning process whose aim is developing original creativity. It is important for the sensibilisation and intensification of individual creative abilities. A multidisciplinary approach and various mind strategies and techniques for creating and resolving problems encourage individual and group creativity, innovation, teamwork and critical thinking. The article presents four examples of the process in which new creative ideas are born and translated into complex graphical compositions that combine architectural and fine-arts content, experience, ethical and aesthetic sensitivity, existential self-awareness and holistic personal development.

  12. Critical zone architecture and processes: a geophysical perspective

    Science.gov (United States)

    Holbrook, W. S.

    2016-12-01

    The "critical zone" (CZ), Earth's near-surface layer that reaches from treetop to bedrock, sustains terrestrial life by storing water and producing nutrients. Despite its central importance, however, the CZ remains poorly understood, due in part to the complexity of the interacting biogeochemical and physical processes that take place there, and in part to the difficulty of measuring CZ properties and processes at depth. Major outstanding questions include: What is the architecture of the CZ? How does that architecture vary across scales and across gradients in climate, lithology, topography, biology and regional states of stress? What processes control the architecture of the CZ? At what depth does weathering initiate, and what controls the rates at which it proceeds? Based on recent geophysical campaigns at seven Critical Zone Observatory (CZO) sites and several other locations, a geophysical perspective on CZ architecture and processes is emerging. CZ architecture can be usefully divided into four layers, each of which has distinct geophysical properties: soil, saprolite, weathered bedrock and protolith. The distribution of those layers across landscapes varies depending on protolith composition and internal structure, topography, climate (P/T) and the regional state of stress. Combined observations from deep CZ drilling, geophysics and geochemistry demonstrate that chemical weathering initiates deep in the CZ, in concert with mechanical weathering (fracturing), as chemical weathering appears concentrated along fractures in borehole walls. At the Calhoun CZO, the plagioclase weathering front occurs at nearly 40 m depth, at the base of a 25-m-thick layer of weathered bedrock. The principal boundary in porosity, however, occurs at the saprolite/weathered bedrock boundary: porosity decreases by over an order of magnitude, from 50% to 5%, over an 8-m-thick zone at the base of the saprolite. Porosity in weathered bedrock is between 2-5%. Future progress will depend on (1

  13. AGENT AND RADIO FREQUENCY IDENTIFICATION BASED ARCHITECTURE FOR SUPERMARKET INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Hasan Al-Sakran

    2013-01-01

    Full Text Available In recent years the acceptance of Radio Frequency Identification (RFID) technology in business environments has been increasing rapidly due to its competitive business value. Adopting a suitable RFID-based information system has become increasingly important for supermarkets. However, most supermarkets still use conventional barcode-based systems to manage their information processes, which are consistently reported as one of the most frustrating aspects of supermarket shopping for both customers and management. We propose an RFID agent-based architecture that combines intelligent agent technology with RFID-based applications. RFID provides the capability to uniquely identify an object within a supermarket area, while agents are able to establish a channel of communication between an RFID device and the supermarket back-end system. The proposed framework includes a design for an intelligent mobile shopping cart equipped with both RFID and agent technologies. As a result of using the proposed RFID agent-based architecture, the customer shopping experience will be improved due to the ease of retrieving detailed information on items and quick checkout by scanning all items at once, thus eliminating queues. From the supermarket management point of view, the proposed architecture will reduce the cost of operation, e.g., decreasing the cost of goods sold through labor efficiency in checkout operation and inventory management, and alerting management when a certain product is running out of stock and needs to be restocked.
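    The "scan all items at once" checkout described above reduces, on the back-end side, to mapping each RFID tag read to a catalog entry in one pass. A minimal sketch, with illustrative tag IDs and prices rather than any real catalog, follows:

```python
# Sketch of one-pass RFID checkout: every tag read in the cart maps to
# a catalog entry; the total is computed in a single pass. Tag IDs and
# prices are illustrative.

CATALOG = {"tag-001": ("milk", 1.20), "tag-002": ("bread", 0.90)}

def checkout(tag_reads):
    """Total the cart from a list of RFID tag reads."""
    total = 0.0
    for tag in tag_reads:
        name, price = CATALOG[tag]
        total += price
    return round(total, 2)

print(checkout(["tag-001", "tag-002", "tag-001"]))  # 3.3
```

    In the proposed architecture an agent would mediate this lookup between the cart's RFID reader and the back-end inventory system, also updating stock levels as items leave the shelf.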

  14. Information processing, computation, and cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  15. Semantic Annotation Framework For Intelligent Information Retrieval Using KIM Architecture

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Malik

    2010-11-01

    Full Text Available Due to the explosion of information and knowledge on the web and the wide use of search engines to find desired information, the role of knowledge management (KM) is becoming more significant in organizations. Knowledge management in an organization is used to create, capture, store, share, retrieve, and manage information efficiently. The semantic web, an intelligent and meaningful web, tends to provide a promising platform for knowledge management systems and vice versa, since they have the potential to give each other the real substance for machine-understandable web resources, which in turn will lead to intelligent, meaningful, and efficient information retrieval on the web. Today, the challenge for the web community is to integrate the distributed heterogeneous resources on the web with the objective of an intelligent web environment focusing on data semantics and user requirements. Semantic annotation (SA), which is widely used, is about assigning to the entities in the text links to their semantic descriptions. Various tools such as KIM and Amaya may be used for semantic annotation. In this paper, we introduce semantic annotation as one of the key technologies in an intelligent web environment, then revisit, review, discuss, and explore knowledge management and semantic annotation. A knowledge management framework and a framework for semantic annotation and semantic search with a knowledge base (GATE) and ontology are presented. Then the KIM annotation platform architecture, including the KIM Ontology (KIMO), the KIM knowledge base, and the KIM front ends, is highlighted. Finally, intelligent pattern search and the related GATE framework, with a KIM annotation example, are illustrated, working towards intelligent information retrieval.

  16. An interoperability architecture for the health information exchange in Rwanda

    CSIR Research Space (South Africa)

    Crichton, R

    2012-08-01

    Full Text Available district in the initial phase, to national level without requiring a fundamental change in technology or design paradigm. This paper describes the key requirements and the design of the current architecture using ISO/IEC/IEEE 42010 standard architecture...

  17. Image processing methods and architectures in diagnostic pathology.

    Directory of Open Access Journals (Sweden)

    Oscar DĂŠniz

    2010-05-01

    Full Text Available Grid technology has enabled the clustering of, and efficient and secure access to and interaction among, a wide variety of geographically distributed resources such as supercomputers, storage systems, data sources, instruments, and special devices and services. Its main applications include large-scale computational and data-intensive problems in science and engineering. This paper considers general grid structures and methodologies, for both software and hardware, in image analysis for virtual tissue-based diagnosis. These methods focus on the user-level middleware. The article describes the distributed programming system developed by the authors for virtual slide analysis in diagnostic pathology. The system supports different image analysis operations commonly performed in anatomical pathology, and it takes into account security aspects and specialized infrastructures with high-level services designed to meet application requirements. Grids are likely to have a deep impact on health-related applications, and therefore they seem suitable for tissue-based diagnosis too. The implemented system is a joint application that mixes both Web and Grid Service architectures around a distributed architecture for image processing. It has proven a successful solution for analyzing a large and heterogeneous set of histological images on an architecture of massively parallel processors using message passing and non-shared memory.

  18. RECORDING INFORMATION ON ARCHITECTURAL HERITAGE SHOULD MEET THE REQUIREMENTS FOR CONSERVATION Digital Recording Practices at the Summer Palace

    OpenAIRE

    L. Zhang; Y. Cong; Bai, C; Wu, C

    2017-01-01

    The recording of architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, the recording of architectural heritage information supports heritage research, conservation, management, and display. What information do we record and collect, and what technology do we use for recording it? How do we determine the level of accuracy required when recording architectural ...

  19. APPLICATION OF IMPROVED PRODUCTION ACTIVITY CONTROL ARCHITECTURE FOR SHOP FLOOR INFORMATION SYSTEM IN DIGITAL MANUFACTURING

    Institute of Scientific and Technical Information of China (English)

    SHAHID Ikramullah Butt; SUN Houfang; GAO Zhengqing

    2006-01-01

    Shop floor control (SFC) is responsible for the coordination and control of the physical and information flow within the shop floor of a manufacturing system. Weaknesses of the production activity control (PAC) architecture of the shop floor are addressed by Maglica's new system architecture. This architecture allows an unlimited number of movers and producers, evolving into a more complex but decentralized architecture. The Beijing Institute of Technology production activity control (BIT-PAC) architecture introduces the idea of sub-producers and sub-movers, thus reducing the complexity of the architecture. All equipment, including sub-producers and sub-movers, is considered passive in the proposed shop floor information system. The dissemination of information from sub-producers and sub-movers is done manually through a PC. The proposed BIT-PAC SFC architecture facilitates information flow from the shop floor to other areas of the organization. Internet Information Services (IIS) and SQL2000 are used along with ASP.NET technology to implement the application logic. The applicability of software based on the BIT-PAC architecture is verified by running it on a networked PC that supports the dynamic flow of information from sub-producers and sub-movers to other parts of the organization. Its use is also demonstrated for the BIT training workshop, supporting the use of the SFC architecture in similar environments.

  20. Cognition: Human Information Processing. Introduction.

    Science.gov (United States)

    Griffith, Belver C.

    1981-01-01

    Summarizes the key research issues and developments in cognitive science, especially with respect to the similarities, differences, and interrelationships between human and machine information processing. Nine references are listed. (JL)

  1. THE ARCHITECTURAL INFORMATION SYSTEM SIARCH3D-UNIVAQ FOR ANALYSIS AND PRESERVATION OF ARCHITECTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    M. Centofanti

    2012-09-01

    Full Text Available The research group of L'Aquila University defined a procedure to create an architectural information system called SIArch-Univaq. This information system can be integrated with the Italian "Risk Map" database. SIArch-Univaq is based on the importation of three-dimensional photorealistic architectural models into a GIS environment. The 3D models are built from building constructive elements derived from a critical architectural survey; importing the models into the GIS allows interrogation of the constructive elements (e.g., beam, window, door): this supports knowledge of the architectural heritage, an indispensable requirement for planning processes of restoration, maintenance, and management.

  2. A novel architecture for information retrieval system based on semantic web

    Science.gov (United States)

    Zhang, Hui

    2011-12-01

    Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces a new challenge of information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of a document. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which provides new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when there is not enough knowledge in the retrieval system, it returns a large number of meaningless results to users because of the huge volume of matches. In this paper, we present an architecture for information retrieval based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
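
The routing step described in this abstract can be sketched in a few lines; the ontology contents and function names below are illustrative assumptions, not the paper's actual components. The idea is that queries the knowledge base can interpret go to the semantic engine, while the rest fall back to keyword search.

```python
# Toy sketch of query routing between a semantic and a keyword search engine.
# The "ontology" and routing rule are hypothetical simplifications.

ONTOLOGY = {"person", "city", "company"}   # concepts the knowledge base covers

def route(query):
    """Return which engine should answer the query."""
    tokens = set(query.lower().split())
    # Inference step: if the ontology can interpret part of the query,
    # the semantic engine can produce meaningful, concept-level results.
    if tokens & ONTOLOGY:
        return "semantic"
    return "keyword"

print(route("largest city in Sweden"))   # semantic
print(route("cheap flight deals"))       # keyword
```

A real inference engine would reason over an RDF/OWL knowledge base rather than doing token lookup, but the control flow is the same: route to semantics when knowledge exists, otherwise degrade gracefully.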

  3. Algorithm-Architecture Matching for Signal and Image Processing

    CERN Document Server

    Gogniat, Guy; Morawiec, Adam; Erdogan, Ahmet

    2011-01-01

    Advances in signal and image processing, together with increasing computing power, are bringing mobile technology closer to applications in a variety of domains such as automotive, health, telecommunications, multimedia, and entertainment. The development of these leading applications, involving a large diversity of algorithms (e.g., signal, image, video, 3D, communication, cryptography), is classically divided into three consecutive steps: a theoretical study of the algorithms, a study of the target architecture, and finally the implementation. Such a linear design flow is reaching its limits.

  4. A Supply Chain Architecture Based on Multi-agent Systems to Support Decentralized Collaborative Processes

    Science.gov (United States)

    Hernández, Jorge E.; Poler, Raúl; Mula, Josefa

    In a supply chain management context, the enterprise architecture concept has been evolving to efficiently support the collaborative processes among the supply chain members involved. Each supply chain has an organizational structure that describes the hierarchical relationships among its members, ranging from centralized to decentralized organizations. From a decentralized perspective, each supply chain member is able to identify collaborative and non-collaborative partners and the kind of information to be exchanged to support negotiation processes. The same concepts of organizational structure and negotiation rules can be applied to a multi-agent system. This paper proposes a novel supply chain architecture to support decentralized collaborative processes in supply chains by considering a multi-agent-based system modeling approach.
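
A minimal sketch of the decentralized idea above; the protocol, class, and partner names are assumptions for illustration, not the paper's design. Each agent knows its collaborative partners and exchanges order proposals only over those links.

```python
# Hypothetical decentralized supply-chain agents: proposals flow only
# between members that have identified each other as collaborative partners.

class SupplyAgent:
    def __init__(self, name, partners):
        self.name = name
        self.partners = partners   # names of collaborative partners
        self.inbox = []            # received (sender, quantity) proposals

    def propose(self, other, qty):
        # A proposal is delivered only over an established collaborative link.
        if other.name in self.partners:
            other.inbox.append((self.name, qty))
            return True
        return False

retailer = SupplyAgent("retailer", partners={"wholesaler"})
wholesaler = SupplyAgent("wholesaler", partners={"retailer", "factory"})

sent = retailer.propose(wholesaler, 100)                        # delivered
ignored = retailer.propose(SupplyAgent("outsider", set()), 5)   # no link
```

In a full multi-agent system the inbox would feed a negotiation loop (counter-offers, acceptance rules) rather than a list, but the partner check captures the collaborative/non-collaborative distinction the abstract makes.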

  5. Formal and Informal Modeling of Fault Tolerant Noc Architectures

    Directory of Open Access Journals (Sweden)

    Mostefa BELARBI

    2015-11-01

    Full Text Available The suggested approach, based on B-Event formal techniques, consists of addressing the aspects and constraints related to the reliability of NoCs (Networks-on-Chip) and the cost of fault-tolerance solutions: a fault-tolerant NoC design for SoCs (Systems-on-Chip) built on configurable FPGA (Field-Programmable Gate Array) technology, obtained by extracting the properties of the NoC architecture. We illustrate our methodology by developing several refinements that produce a QNoC (Quality-of-Service Network-on-Chip) switch architecture from specification to test. We show how the B-Event formalism can follow the life cycle of NoC design and test: for example, VHDL (VHSIC Hardware Description Language) simulation of a given architecture can help us optimize it and produce a new one, and the properties of the new QNoC architecture can be injected back into the formal B-Event specification. B-Event is associated with the Rodin tool environment. As a case study, the last refinement stage used a wireless network in order to generate a complete test environment for the studied application.

  6. Combining cognitive engineering and information fusion architectures to build effective joint systems

    Science.gov (United States)

    Sliva, Amy L.; Gorman, Joe; Voshell, Martin; Tittle, James; Bowman, Christopher

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture concept was previously described as a novel approach to integrating analytic and decision-making processes in joint human/automation systems in highly complex sociotechnical settings. In this paper, we extend the DNDW construct with a description of the components in this framework, combining structures of the Dual Node Network (DNN) for information fusion and resource management with extensions of Rasmussen's Decision Ladder (DL) to provide guidance on constructing information systems that better serve decision-making support requirements. The DNN takes a component-centered approach to system design, decomposing each asset in terms of data inputs and outputs according to its role and interactions in a fusion network. However, to ensure relevance to, and organizational fit within, command and control (C2) processes, principles from cognitive systems engineering emphasize that system design must take a human-centered systems view, integrating information needs and decision-making requirements to drive the architecture design and the capabilities of network assets. In the current work, we present an approach for structuring and assessing DNDW systems that uses a hybrid of DNN top-down system design with human-centered process design, combining DNN node decomposition with artifacts from cognitive analysis (i.e., system abstraction decomposition models and decision ladders) to provide work-domain and task-level insights at different levels in an example intelligence, surveillance, and reconnaissance (ISR) system setting. This DNDW structure will ensure not only that the information fusion technologies and processes are structured effectively, but also that the resulting information products will align with the requirements of human decision makers and be adaptable to different work settings.

  7. SoC-based architecture for biomedical signal processing.

    Science.gov (United States)

    Gutiérrez-Rivas, R; Hernández, A; García, J J; Marnane, W

    2015-01-01

    Over the last decades, many algorithms have been proposed for processing biomedical signals. Most of these algorithms have focused on eliminating the noise and artifacts present in these signals so that they can be used for automatic monitoring and/or diagnosis applications. With regard to remote monitoring, the use of portable devices often requires a reduced number of resources and low power consumption, making it necessary to reach a trade-off between the accuracy of the algorithms and their computational complexity. This paper presents a SoC (System-on-Chip) architecture, based on an FPGA (Field-Programmable Gate Array) device, suitable for the implementation of biomedical signal processing. The proposal has been successfully validated by implementing an efficient QRS complex detector. The results show that, using a reduced amount of resources, sensitivity and positive predictive values above 99.49% are achieved, which makes the proposed approach suitable for telemedicine applications.
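
For readers unfamiliar with QRS detection, here is a deliberately simple threshold-based sketch of the idea; it is not the paper's FPGA implementation, and a practical detector would band-pass filter the ECG first (all parameter values are illustrative).

```python
# Toy QRS detection: flag local maxima above a threshold, enforcing a
# refractory period so one heartbeat is not detected twice.

def detect_qrs(signal, threshold=0.5, refractory=20):
    peaks, last = [], -refractory
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks

# Synthetic trace: flat baseline with two sharp "R peaks" at samples 30 and 80.
ecg = [0.0] * 120
ecg[30], ecg[80] = 1.0, 0.9
print(detect_qrs(ecg))   # [30, 80]
```

The trade-off the abstract mentions is visible even here: a more accurate detector (adaptive thresholds, filtering) costs more computation, which matters on a resource-constrained SoC.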

  8. Structural optimization for materially informed design to robotic production processes

    NARCIS (Netherlands)

    Bier, H.H.; Mostafavi, S.

    2015-01-01

    Hyperbody’s materially informed Design-to-Robotic-Production (D2RP) processes for additive and subtractive manufacturing aim to achieve performative porosity in architecture at various scales. An extended series of D2RP experiments aiming to produce prototypes at 1:1 scale wherein design materiality

  9. Design of a Parallel Sampling Encoder for Analog to Information (A2I) Converters: Theory, Architecture and CMOS Implementation

    Directory of Open Access Journals (Sweden)

    Andreas G. Andreou

    2013-03-01

    Full Text Available We discuss the architecture and design of parallel sampling front ends for analog-to-information (A2I) converters. As an example, we detail the design of a custom 0.5 µm CMOS implementation of a mixed-signal parallel sampling encoder architecture. The system consists of configurable parallel analog processing channels whose output is sampled by traditional analog-to-digital converters (ADCs). The analog front end modulates the signal of interest with a high-speed digital chipping sequence and integrates the result prior to sampling at a low rate. An FPGA is employed to generate the chipping sequences and process the digitized samples.
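
The modulate-and-integrate step can be sketched numerically; the window length, signal, and variable names below are assumptions for illustration, not values from the paper.

```python
# Sketch of one A2I channel: multiply the fast input by a +/-1 chipping
# sequence, integrate over a window, and keep only one low-rate measurement.
import random

random.seed(1)
N = 64                                          # fast-rate samples per window
chip = [random.choice((-1, 1)) for _ in range(N)]
x = [0.5] * N                                   # toy constant input signal

# Modulate then integrate: this single value is what the slow ADC samples.
measurement = sum(c * s for c, s in zip(chip, x))
```

Several such channels with different chipping sequences run in parallel, so the slow ADCs collectively capture enough information to reconstruct suitable (e.g., sparse) signals despite the low sampling rate.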

  10. SANDS: a service-oriented architecture for clinical decision support in a National Health Information Network.

    Science.gov (United States)

    Wright, Adam; Sittig, Dean F

    2008-12-01

    In this paper, we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. The SANDS architecture for decision support has several significant advantages over other architectures for clinical decision support. The most salient of these are:

  11. Chaotic signal processing: information aspects

    CERN Document Server

    Andreyev, Y V; Efremova, E V; Anagnostopoulos, A N

    2003-01-01

    One of the features of chaotic signals that makes them different from other types of signals is their special information properties. In this paper, we investigate the effect of these properties on procedures for chaotic signal processing. Using the examples of cleaning chaotic signals of noise, chaotic synchronization, and separation of chaotic signals, we demonstrate the existence of basic limits imposed by information theory on chaotic signal processing, independent of concrete algorithms. The relations of these limits to the Second Law, Shannon's theorems, and Landauer's principle are discussed.

  12. Distributed, Modular, Network Enabled Architecture For Process Control Military Applications

    Directory of Open Access Journals (Sweden)

    Abhijit Kamble*,

    2014-02-01

    Full Text Available In the process control world, the use of a distributed modular embedded controller architecture drastically reduces the amount and complexity of cabling; at the same time, it increases system computing performance and response for real-time applications as compared to a centralized control system. We propose a design based on the ARM Cortex-M4 hardware architecture and Cortex Microcontroller Software Interface Standard (CMSIS) based software development. The ARM Cortex-M series ensures a compatible target processor and provides common core peripherals, whereas the CMSIS abstraction layer reduces development time, supports software reusability, and provides a seamless application software interface for controllers. Being a custom design, we can build in features such as Built-In Test Equipment (BITE), single-point fault tolerance, redundancy, and 2-out-of-3 logic, which are desirable for military applications. This paper describes the design of a generic embedded hardware module that can be configured as a local I/O controller, an application controller, or a Man-Machine Interface (MMI). It also proposes a philosophy for step-by-step hardware and software development.
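
The "2/3 logic" mentioned above is a majority vote over three redundant channels; a minimal sketch (function name is ours) makes the fault-masking property concrete.

```python
# Illustrative 2-out-of-3 ("2/3 logic") voter: the majority of three
# redundant channel readings wins, so any single faulty channel is masked.

def vote_2oo3(a, b, c):
    # Majority function over three boolean channel states.
    return (a and b) or (a and c) or (b and c)

# One channel stuck at False is outvoted by the two healthy channels.
print(vote_2oo3(True, True, False))   # True
```

In hardware this is typically three AND gates and an OR gate per bit; in a controller like the one described, the same vote would be taken over sensor readings before acting on them.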

  13. The Internet information infrastructure: Terrorist tool or architecture for information defense?

    Energy Technology Data Exchange (ETDEWEB)

    Kadner, S.; Turpen, E. [Aquila Technologies Group, Albuquerque, NM (United States); Rees, B. [Los Alamos National Lab., NM (United States)

    1998-12-01

    The Internet is a culmination of information-age technologies and an agent of change. As with any infrastructure, dependency upon the so-called global information infrastructure creates vulnerabilities. Moreover, unlike physical infrastructures, the Internet is a multi-use technology. While information technologies such as the Internet can be used as a tool of terror, these same technologies can facilitate the implementation of solutions to mitigate the threat. In this vein, this paper analyzes the multifaceted nature of the Internet information infrastructure and argues that policymakers should concentrate on the solutions it provides rather than the vulnerabilities it creates. Minimizing risks and realizing possibilities in the information age will require institutional activities that translate, exploit, and convert information technologies into positive solutions. What follows is a discussion of the Internet information infrastructure as it relates to increasing vulnerabilities and positive potential. The following four applications of the Internet are addressed: as the infrastructure for information competence; as a terrorist tool; as the terrorist's target; and as an architecture for rapid response.

  14. Quality Criteria for Architectural 3D Data in Usage and Preservation Processes

    DEFF Research Database (Denmark)

    Lindlar, Michelle; Tamke, Martin; Myrup Jensen, Morten

    2014-01-01

    Quality assessment of digital material has been just one of the new tasks the digital revolution brought into the library domain. With the first big print-material digitization efforts in the digital heritage domain dating back to the 1980s, plenty of experience has been gathered … responsible for the archival of information about publicly funded buildings. Architectural practice today commonly includes 3D object processing. The output of these processes is slowly reaching the aforementioned cultural heritage institutions, which are now facing the task of quality assessment … of the material. The paper will present a first analysis of potential quality factors and compare architectural and cultural heritage domain expectations of 3D data quality. It will look at two forms of 3D data: modelled 3D objects and scanned 3D objects. The work presented in this paper is based on work …

  16. Executable Behavioral Modeling of System and Software Architecture Specifications to Inform Resourcing Decisions

    Science.gov (United States)

    2016-09-01

    Executable Behavioral Modeling of System- and Software-Architecture Specifications to Inform Resourcing Decisions, by Monica F. Farah-Stapleton. Doctoral dissertation, September 2016. … intellectual, programmatic, and organizational resources. Precise behavioral modeling offers a way to assess architectural design decisions prior to …

  17. Intelligent Information Retrieval and Web Mining Architecture Using SOA

    Science.gov (United States)

    El-Bathy, Naser Ibrahim

    2010-01-01

    This dissertation provides a solution to a very specific problem instance in the areas of data mining, data warehousing, and service-oriented architecture in the publishing and newspaper industries. The research question focuses on the integration of data mining and data warehousing. The research problem focuses on the development of…

  18. Building information modeling in the architectural design phases

    DEFF Research Database (Denmark)

    Hermund, Anders

    2009-01-01

    … with an architectural quality? In Denmark the implementation of the digital working methods related to BIM has been introduced by government law in 2007. Will the important role of the architect as designer change in accordance with these new methods, and does the idea of one big integrated model represent a paradox … in relation to designing? The BIM mindset requires changes on many levels. …

  19. The Swedish strategy and method for development of a national healthcare information architecture.

    Science.gov (United States)

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted the National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of National Healthcare Information Architecture is to achieve high level semantic interoperability for clinical content and clinical contexts. High level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes that are formal definitions of the clinical and demographic concepts and some administrative data were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. Generic clinical process model was made concrete and analyzed. For each decision

  20. Wind farm protection using an IEC 61850 process bus architecture

    Energy Technology Data Exchange (ETDEWEB)

    McGinn, D. [GE Digital Energy, Toronto, ON (Canada)

    2009-07-01

    This presentation discussed wind farm protection using an IEC 61850 process bus architecture. Data is transmitted via IEC 61850 messages over a fiber optic communications network to a central relaying unit (CRU) that executes protection and control (P and C) functions for a whole wind farm. The presentation also discussed the need to further simplify P and C design. Challenges of wind farm protection include physical distance; relay location; and selectivity/isolation. P and C hardware evolution was illustrated and its advantages were presented. These included one relay to set, configure, test, and maintain; eliminate the need for transfer trips; free monitoring; free disturbance recording; and easy integration of data. The development of commercially available systems began around 1990 with the IPACS system developed by Hydro One, and the SIPSUR system developed in Spain. 7 figs.

  1. Fidelity in Archaeal Information Processing

    Directory of Open Access Journals (Sweden)

    Bart de Koning

    2010-01-01

    Full Text Available A key element during the flow of genetic information in living systems is fidelity. The accuracy of DNA replication influences the genome size as well as the rate of genome evolution. The large amount of energy invested in gene expression implies that fidelity plays a major role in fitness. On the other hand, an increase in fidelity generally coincides with a decrease in velocity. Hence, an important determinant of the evolution of life has been the establishment of a delicate balance between fidelity and variability. This paper reviews the current knowledge on quality control in archaeal information processing. While the majority of these processes are homologous in Archaea, Bacteria, and Eukaryotes, examples are provided of nonorthologous factors and processes operating in the archaeal domain. In some instances, evidence for the existence of certain fidelity mechanisms has been provided, but the factors involved still remain to be identified.

  2. Holistic Layer of the Enterprise Architecture on the Basis of Process-Driven Organization

    Directory of Open Access Journals (Sweden)

    Stepan Alexa

    2017-07-01

    Full Text Available The growing complexity of the enterprise ecosystem, along with the existence of legacy approaches in the organization, can result in a number of challenges when maintaining a solid baseline of its information assets. Over the past two decades, the digital industry has passed through rapid evolution, triggered both by the availability of new technologies and by new business, operating, and funding models. These enablers have a direct impact on the way organizations design and execute their business processes in order to maintain the alignment between their capabilities and targets. This trend implies that enterprises and organizations need to remain flexible by maintaining the alignment of their business and their infrastructure in a dynamically changing and integrated ecosystem. It has been widely recognized that enterprise architecture, as well as process-driven approaches, provides tools organizations use to explain how business, resources, and other elements within the organization are related to each other. This article discusses the role, and associated value, of enterprise architecture and the process-driven approach in describing what constitutes the enterprise. At the same time, it elaborates on the principles and constructs of a model of the holistic layer of the enterprise architecture on the basis of the process-driven approach. The proposed model aims to combine a unified view of the infrastructure and behavior of the enterprise with lean principles in order to identify and focus on the key elements of the enterprise.

  3. Efficiency of cellular information processing

    CERN Document Server

    Barato, Andre C; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the E. coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium i...
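    The central bound in this record can be sketched in schematic notation (the symbols below are illustrative; the paper's exact formalism may differ): the rate l at which the internal process reduces its conditional Shannon entropy about the external process is bounded by the thermodynamic entropy production rate σ, which yields an informational efficiency η:

```latex
l \;\equiv\; -\frac{\mathrm{d}}{\mathrm{d}t}\, H(X_{\mathrm{ext}} \mid X_{\mathrm{int}})
\;\le\; \sigma ,
\qquad
\eta \;\equiv\; \frac{l}{\sigma} \;\le\; 1 .
```

    On this reading, the dissipationless learning of the single-receptor model corresponds to entropy production inside the cell being zero while the external process performs the compensating chemical work.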

  4. Information processing. [in human performance

    Science.gov (United States)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  5. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

    In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely, to organize behavior so as to ensure survival, and an understanding of the evolutionary genesis of the brain. The principles and strategies developed include the self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction, as well as the embedding of neural systems into the world, all of which form the framework of the presented description. Since, in brains, partial self-organization, lifelong adaptation and the use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  6. A Single Information Space for aerospace enterprises based on a Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    Vasilyeva Ekaterina

    2016-01-01

    Full Text Available Nowadays aerospace enterprises have a variety of Information Systems (IS) and Software Complexes (SC). Their strategic aim is to integrate all IS and SC and thus to create a Single Information Space (SIS). An approach to creating the SIS based on the Service Oriented Architecture (SOA) concept at aerospace enterprises is suggested. It is shown that different tools to implement this approach are available. The SOA model of the SIS of a typical enterprise, which has a custom set of services, IS and SC, was developed. Two modifications of this SOA model are proposed. The first modification is suitable for creation of the SIS at enterprises which have Business Intelligence (BI) systems based on a Data Warehouse (DW). The other modification of the SOA-based SIS is important for those enterprises which need to process at least part of their technological and manufacturing data in real time.

  7. SArEM: A SPEM extension for software architecture extraction process

    Directory of Open Access Journals (Sweden)

    Mira Abboud

    2016-04-01

    Full Text Available In order to maintain a system, it’s critical to understand its architecture. However, even though every system has an architecture, not every system has a reliable representation of its architecture. To deal with this problem many researchers have engaged in software architecture extraction, where the system’s architecture is recovered from its source code. While there is a plethora of approaches aiming at extracting software architectures, there is no common means or tool for measuring these approaches, which makes comparison between the different approaches a hard task. To address this gap, we developed a meta-model, based on the SPEM meta-model, that specifies the software architecture extraction process. Such a meta-model serves as a tool to compare, analyze and evaluate the approaches in this research field. In this paper we detail our meta-model, called SArEM (Software Architecture Extraction Meta-model), and clarify its concepts.

  8. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

    Full Text Available E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the ‘action and design’ class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered and the model’s generic applicability is discussed.

  9. A Concept Transformation Learning Model for Architectural Design Learning Process

    Science.gov (United States)

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  11. Quantum communication and information processing

    Science.gov (United States)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.

  12. Historic Building Information Modelling - Adding intelligence to laser and image based surveys of European classical architecture

    Science.gov (United States)

    Murphy, Maurice; McGovern, Eugene; Pavia, Sara

    2013-02-01

    Historic Building Information Modelling (HBIM) is a novel prototype library of parametric objects, based on historic architectural data and a system of cross platform programmes for mapping parametric objects onto point cloud and image survey data. The HBIM process begins with remote collection of survey data using a terrestrial laser scanner combined with digital photo modelling. The next stage involves the design and construction of a parametric library of objects, which are based on the manuscripts ranging from Vitruvius to 18th century architectural pattern books. In building parametric objects, the problem of file format and exchange of data has been overcome within the BIM ArchiCAD software platform by using geometric descriptive language (GDL). The plotting of parametric objects onto the laser scan surveys as building components to create or form the entire building is the final stage in the reverse engineering process. The final HBIM product is the creation of full 3D models including detail behind the object's surface concerning its methods of construction and material make-up. The resultant HBIM can automatically create cut sections, details and schedules in addition to the orthographic projections and 3D models (wire frame or textured) for both the analysis and conservation of historic objects, structures and environments.

  13. Information Architecture and Information Quality (信息构建与信息质量)

    Institute of Scientific and Technical Information of China (English)

    刘永

    2005-01-01

    The American architect Richard Saul Wurman proposed the concept of Information Architecture (IA) in 1975, studying the collection, organization and use of information about the urban environment from an architectural perspective, with the aim of providing information support for architects, urban planners, utility and traffic engineers, and citizens. In the 1990s, the rapid development and application of information technology brought networking, digitization and informatization, and researchers began to study the application of information architecture to the organization of networked information.

  14. Thinking in networks: artistic–architectural responses to ubiquitous information

    Directory of Open Access Journals (Sweden)

    Yvonne Spielmann

    2011-12-01

    Full Text Available The article discusses creative practices that intervene in aesthetic-technical ways into computer-networked communication systems. I am interested in artistic practices that use networks in different ways to make us aware of the possibilities of rethinking media-cultural environments. I use the example of the Japanese art-architectural group Double Negative Architecture to give an example of creative thinking in networks. Yvonne Spielmann (Ph.D., Dr. habil.) is presently Research Professor and Chair of New Media at The University of the West of Scotland. Her work focuses on inter-relationships between media and culture, technology, art, science and communication, and in particular on Western/European and non-Western/South-East Asian interaction. Milestones of her published research output are four authored monographs and about 90 single-authored articles. Her book “Video, the Reflexive Medium” (published by MIT Press 2008; Japanese edition by Sangen-sha Press 2011) was awarded the 2009 Lewis Mumford Award for Outstanding Scholarship in the Ecology of Technics. Her most recent book, “Hybrid Cultures”, was published in German by Suhrkamp Press in 2010, with an English edition from MIT Press in 2012. Spielmann's work has been published in German and English and has been translated into French, Polish, Croatian, Swedish, Japanese, and Korean. She holds the 2011 Swedish Prize for Swedish-German scientific co-operation.

  15. UNREALIZED COMPETITIONS OF THE 1920-1930ies IN THE CONTEXT OF THE ARCHITECTURE DEVELOPMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Dudka Elena Nikolaevna

    2012-10-01

    Full Text Available An analysis of organizational and methodical issues associated with the architectural competitions of the 1920s-1930s and an evaluation of their results are provided in the article. The role of architectural development and the historical, cultural, social and ideological backgrounds of the architectural competitions are also covered by the author. The review of architectural competitions makes it possible to identify their impact on the development of the theory and practice of architecture. The period between 1920 and 1930 was marked by a quantitative and qualitative boom in architectural design competitions. Analysis of the practice of architectural competitions makes it possible to identify periods of revolutionary transformations. The most prominent buildings that date back to these periods include theatre buildings in Kharkov, Rostov-on-Don and Sverdlovsk; a competition for the architectural design of the Gosprom Building in Kharkov; a competition for the architectural design of the Palace of Soviets in Moscow, etc. These competitions shaped new approaches to volume- and space-related solutions as well as architectural forms. Research into the integral process of development of approaches and methods of organizing architectural design competitions makes it possible to identify their role as effective catalysts of architectural theory and practice.

  16. Architectures for enabling flexible business processes : A research agenda

    NARCIS (Netherlands)

    Overbeek, S.J.; Gong, Y.

    2010-01-01

    For decades, information systems have been designed for controlling and managing business processes. In the past, these systems were often monolithic in nature and not made for interacting and communicating with other systems. Today, departments and organizations must collaborate, which requires dis

  18. Media architecture using information and media as construction material

    CERN Document Server

    Wiethoff, Alexander

    2017-01-01

    The buzzwords "Information Society" and "Age of Access" suggest that information is now universally accessible without any form of hindrance. Indeed, the German constitution calls for all citizens to have open access to information. Yet in reality, there are multifarious hurdles to information access - whether physical, economic, intellectual, linguistic, political, or technical. Thus, while new methods and practices for making information accessible arise on a daily basis, we are nevertheless confronted by limitations to information access in various domains. This new book series assembles ac

  19. Algorithms and Software Architecture for the Production of Information Products From LIDAR Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Diamond Data Systems (DDS) proposes the development of a new advanced architecture, algorithms and software to support the end-to-end processing of LIDAR data to...

  20. Optical information storage and processing

    Science.gov (United States)

    Liu, Zhiwen

    Optical information storage and optical information processing are the two themes of this thesis. Chapters two and three discuss the issue of storage, while the final two chapters investigate the topic of optical computing. In the second chapter, we demonstrate a holographic system which is able to record phenomena at nanosecond speed. Laser-induced shock wave propagation is recorded by angularly multiplexing pulsed holograms. Five frames can be recorded with a frame interval of 12 ns and a time resolution of 5.9 ns. We also demonstrate a system which can record fast events holographically on a CCD camera. Carrier multiplexing is used to store 3 frames in a single CCD frame with a frame interval of 12 ns. This technique can be extended to record femtosecond events. Information storage in subwavelength structures is discussed in the third chapter. A 2D simulation tool using the FDTD algorithm is developed and applied to calculate the far-field scattering from subwavelength trenches. The simulation agrees with the experimental data very well. Width, depth and angle multiplexing are investigated as ways to encode information in subwavelength features. An eigenfunction approach is adopted to analyze how much information can be stored given the length of the feature. Finally, we study the effect of a non-linear buffer layer. We switch gears to holographic correlators in the fourth chapter. We study various properties of the defocused correlator, which can control shift invariance conveniently. An approximate expression for the shift selectivity is derived. We demonstrate a real-time correlator with 480 templates. The cross talk of the correlators is also analyzed. Finally, in the fifth chapter we apply the optical correlator to fingerprint identification and study the performance of correlation-based algorithms. Windowed correlation can improve rotation and distortion tolerance.
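    The far-field scattering calculation in this record relies on the finite-difference time-domain method. A minimal 1D sketch of the Yee leapfrog update at its core is given below (normalized units; the grid size, source position and Courant number 0.5 are illustrative choices, not parameters of the thesis's 2D tool):

```python
import numpy as np

def fdtd_1d(nx=200, nt=400, src=100):
    """Minimal 1D FDTD (Yee leapfrog) in normalized units with a
    Courant number of 0.5; a soft Gaussian pulse drives cell `src`."""
    ez = np.zeros(nx)  # electric field at integer grid points
    hy = np.zeros(nx)  # magnetic field at half-integer grid points
    for t in range(nt):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])      # update H from curl E
        ez[1:-1] += 0.5 * (hy[1:-1] - hy[:-2])   # update E from curl H
        ez[src] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source
    return ez

field = fdtd_1d()
```

    The endpoints ez[0] and ez[-1] are never updated, which makes the grid boundaries act as perfect electric conductors; a production tool would add absorbing boundaries and, for scattering problems like the trenches above, a second spatial dimension.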

  1. Digital signal processor and its application. ; Animation processing DSP architectures. Digital signal processor to sono oyo. ; Dogazo shoriyo DSP architecture

    Energy Technology Data Exchange (ETDEWEB)

    Murakami, T.; Ohira, H. (Mitsubishi Electric Corp., Tokyo (Japan))

    1991-12-20

    A description is given of the internationally standardized animation (moving-picture) coding system and of existing and next-generation image-processing digital signal processor (DSP) architectures. The internationally standardized animation coding system stratifies images into the units of picture element, frame, and block, with each stratum given its own processing. A TV conference or TV telephone conversation involves a huge amount of moving-image data. To process these data in real time, current video image processing systems take a multi-DSP configuration. Methods to split the load and allocate each part statically to a DSP can be classified into splitting by coding-function unit, by target image region, and by the amount of computation required. These splitting methods are applied to the processing of each stratum, and matching the splitting to each stratum's processing improved efficiency. Since the method is software-based, it can be applied not only to the internationally standardized system but also to vector quantization systems. Although present LSI technology is not sufficiently capable of mounting architectures covering the full stratified configuration on one chip, an architecture specialized to the functions of each stratum could serve as a one-chip DSP. 14 refs., 5 figs., 4 tabs.
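    The third splitting method in this record, allocating work to DSPs according to the amount of computation each part requires, can be sketched with a greedy longest-processing-time assignment, a standard load-balancing heuristic (the per-block load estimates and DSP count below are hypothetical):

```python
import heapq

def assign_blocks(block_loads, n_dsps):
    """Greedy longest-processing-time (LPT) assignment: hand each
    image-block load, largest first, to the least-loaded DSP."""
    # heap entries: (accumulated load, dsp index, assigned block loads);
    # the unique index breaks ties so lists are never compared
    heap = [(0.0, i, []) for i in range(n_dsps)]
    heapq.heapify(heap)
    for load in sorted(block_loads, reverse=True):
        total, i, blocks = heapq.heappop(heap)
        blocks.append(load)
        heapq.heappush(heap, (total + load, i, blocks))
    return sorted(heap, key=lambda entry: entry[1])  # order by DSP index

# Hypothetical load estimates for five blocks on a two-DSP configuration
dsps = assign_blocks([5, 4, 3, 3, 3], n_dsps=2)
```

    LPT is a static heuristic, which matches the record's "allocate each load fixedly" framing; a real coder would refine the estimates per frame.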

  2. Enterprise Information System Architecture Based on Web 2.0

    Institute of Scientific and Technical Information of China (English)

    YI Xiushuang; WANG Yu; LIU Jinghong; WEN Zhankao

    2006-01-01

    Enterprise information systems that make extensive use of Web 2.0 technologies will be more open, free, and efficient. By contrasting classic Web technologies with Web 2.0 technologies, we present a sample enterprise information system based on Web 2.0 and show how the use of Web 2.0 technologies changes the data exchange model of enterprise information systems and improves their efficiency and effectiveness.

  3. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides a foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on state-of-the-art results from the literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  4. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    Full Text Available The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.

  5. Has the architectural process of public buildings changed since the romans?

    DEFF Research Database (Denmark)

    Brink Rasmussen, Mai; Jensen, Rasmus Lund; Fisker, Anna Marie

    2014-01-01

    Has the architectural process of public buildings changed as much as society has since the Romans? This paper compares the Roman approach to public buildings with the Danish approach today.

  6. Architectural drawing in the process of visual research: The new school concept of the representation of space

    Directory of Open Access Journals (Sweden)

    Kovač Vladimir

    2016-01-01

    Full Text Available The viewpoint of architect Đorđe Petrović on drawing as a research process, driven by his work at the Faculty of Architecture, University of Belgrade within the field of architectural drawing, is to be taken as a starting point for the analysis of the process of visual representation of architectural space in this paper. The analysis is primarily focused on the relevant period from the beginning of the seventies, when the concept of the New School was formed, and Petrović introduced the concepts of visual research and visual communications to the curriculum, in his reassessment of the role of architectural drawings as a purely technical and information resource. The basic methodological question concerns the interpretation of the concept of visual research, conducted within the reformed curriculum, as well as its position in the then socio-cultural context and in relation to the actual practice of the time and the period that preceded it. Looking at the drawing as a powerful means of representations of space, the paper discusses architectural discourse determined by architectural drawing as the product of social and theoretical practice, similar to the hypothesis of Henri Lefebvre, presented in his work The Production of Space.

  7. The Process of Learning from Information.

    Science.gov (United States)

    Kuhlthau, Carol Collier

    1995-01-01

    Presents the process of learning from information as the key concept for the library media center in the information age school. The Information Search Process Approach is described as a model for developing information skills fundamental to information literacy, and process learning is discussed. (Author/LRW)

  8. Mobilizing disability experience to inform architectural education. Lessons learned from a field experiment

    OpenAIRE

    Heylighen, Ann; Vermeersch, Peter-Willem

    2015-01-01

    Through their bodily interaction with the designed environment, disabled people are able to appreciate qualities designers may not be attuned to. In architectural practice, however, disability experience is hardly acknowledged as a valuable resource for design. Since attitudes developed in educational settings are carried into people's professional careers, this paper examines the added value of mobilizing disability experience to inform architectural education. To this end it analyses the...

  9. Developing Historic Building Information Modelling Guidelines and Procedures for Architectural Heritage in Ireland

    Science.gov (United States)

    Murphy, M.; Corns, A.; Cahill, J.; Eliashvili, K.; Chenau, A.; Pybus, C.; Shaw, R.; Devlin, G.; Deevy, A.; Truong-Hong, L.

    2017-08-01

    Cultural heritage researchers have recently begun applying Building Information Modelling (BIM) to historic buildings. The model comprises intelligent objects with semantic attributes which represent the elements of a building structure and are organised within a 3D virtual environment. Case studies in Ireland are used to test and develop suitable systems for (a) data capture/digital surveying/processing, (b) developing a library of architectural components and (c) mapping these architectural components onto the laser scan or digital survey to relate the intelligent virtual representation of a historic structure (HBIM). While BIM platforms have the potential to create a virtual and intelligent representation of a building, their full exploitation and use is restricted to a narrow set of expert users with access to costly hardware, software and skills. The testing of open BIM approaches, in particular IFCs, and the use of game engine platforms are fundamental components for developing much wider dissemination. The semantically enriched model can then be transferred into a web-based game engine platform.

  10. Information management architecture for an integrated computing environment for the Environmental Restoration Program. Environmental Restoration Program, Volume 3, Interim technical architecture

    Energy Technology Data Exchange (ETDEWEB)

    1994-09-01

    This third volume of the Information Management Architecture for an Integrated Computing Environment for the Environmental Restoration Program--the Interim Technical Architecture (TA) (referred to throughout the remainder of this document as the ER TA)--represents a key milestone in establishing a coordinated information management environment in which information initiatives can be pursued with the confidence that redundancy and inconsistencies will be held to a minimum. This architecture is intended to be used as a reference by anyone whose responsibilities include the acquisition or development of information technology for use by the ER Program. The interim ER TA provides technical guidance at three levels. At the highest level, the technical architecture provides an overall computing philosophy or direction. At this level, the guidance does not address specific technologies or products but addresses more general concepts, such as the use of open systems, modular architectures, graphical user interfaces, and architecture-based development. At the next level, the technical architecture provides specific information technology recommendations regarding a wide variety of specific technologies. These technologies include computing hardware, operating systems, communications software, database management software, application development software, and personal productivity software, among others. These recommendations range from the adoption of specific industry or Martin Marietta Energy Systems, Inc. (Energy Systems) standards to the specification of individual products. At the third level, the architecture provides guidance regarding implementation strategies for the recommended technologies that can be applied to individual projects and to the ER Program as a whole.

  11. Botnet Detection Architecture Based on Heterogeneous Multi-sensor Information Fusion

    Directory of Open Access Journals (Sweden)

    HaiLong Wang

    2011-12-01

    Full Text Available As technology has developed rapidly, botnet threats to the global cyber community are also increasing, and botnet detection has recently become a major research topic in the field of network security. Most current detection approaches work only on evidence from a single information source, which cannot capture all the traces of a botnet and hardly achieves high accuracy. In this paper, a novel botnet detection architecture based on heterogeneous multi-sensor information fusion is proposed. The architecture is designed to carry out information integration at the three fusion levels of data, feature, and decision. As the core component, a feature extraction module is also elaborately designed, and an extended algorithm of Dempster-Shafer (D-S) theory is proved and adopted in decision fusion. Furthermore, a representative case is provided to illustrate that the detection architecture can effectively fuse complicated information from various sensors and thus achieve a better detection effect.
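    As a concrete illustration of the decision-fusion level this record describes, the following sketch implements the basic Dempster rule of combination for two sensor mass functions (this is the standard rule, not the paper's extended algorithm; the sensor names, hypotheses and masses are hypothetical):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic belief assignments given as
    dicts mapping frozenset hypotheses to masses summing to 1."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict  # normalization constant
    return {h: m / k for h, m in combined.items()}

# Hypothetical evidence about a host from a flow sensor and a DNS sensor
m_flow = {frozenset({"botnet"}): 0.6, frozenset({"botnet", "benign"}): 0.4}
m_dns = {frozenset({"botnet"}): 0.5, frozenset({"benign"}): 0.2,
         frozenset({"botnet", "benign"}): 0.3}
fused = dempster_combine(m_flow, m_dns)
```

    In this toy case the two sensors reinforce each other: the fused mass on {botnet} exceeds either sensor's individual belief, which is the effect a decision-fusion layer exploits.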

  12. Proprioceptive information processing in schizophrenia

    DEFF Research Database (Denmark)

    Arnfred, Sidse M H

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus

  13. Clinical engineering and risk management in healthcare technological process using architecture framework.

    Science.gov (United States)

    Signori, Marcos R; Garcia, Renato

    2010-01-01

    This paper presents a model that aids Clinical Engineering in dealing with Risk Management in the Healthcare Technological Process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework, MODAF (Ministry of Defence Architecture Framework), was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve Clinical Engineering decision making for Risk Management in the Healthcare Technological process.

  14. Semantic Web-Driven LMS Architecture towards a Holistic Learning Process Model Focused on Personalization

    Science.gov (United States)

    Kerkiri, Tania

    2010-01-01

    A comprehensive presentation is here made on the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…

  15. Physics Colloquium: The optical route to quantum information processing

    CERN Multimedia

    Université de Genève

    2011-01-01

    Geneva University Physics Department 24, Quai Ernest Ansermet CH-1211 Geneva 4 Monday 11 April 2011 17h00 - Ecole de Physique, Auditoire Stückelberg The optical route to quantum information processing Prof. Terry Rudolph/Imperial College, London Photons are attractive as carriers of quantum information both because they travel, and can thus transmit information, and because of their good coherence properties and the ease with which they undergo single-qubit manipulations. The main obstacle to their use in information processing is inducing an effective interaction between them in order to produce entanglement. The most promising approach in photon-based information processing architectures is so-called measurement-based quantum computing. This relies on creating upfront a multi-qubit highly entangled state (the cluster state) which has the remarkable property that, once prepared, it can be used to perform quantum computation by making only single qubit measurements. In this talk I will discuss generically the...

  16. Stigmergic construction and topochemical information shape ant nest architecture.

    Science.gov (United States)

    Khuong, Anaïs; Gautrais, Jacques; Perna, Andrea; Sbaï, Chaker; Combe, Maud; Kuntz, Pascale; Jost, Christian; Theraulaz, Guy

    2016-02-01

    The nests of social insects are not only impressive because of their sheer complexity but also because they are built from individuals whose work is not centrally coordinated. A key question is how groups of insects coordinate their building actions. Here, we use a combination of experimental and modeling approaches to investigate nest construction in the ant Lasius niger. We quantify the construction dynamics and the 3D structures built by ants. Then, we characterize individual behaviors and the interactions of ants with the structures they build. We show that two main interactions are involved in the coordination of building actions: (i) a stigmergic-based interaction that controls the amplification of depositions at some locations and is attributable to a pheromone added by ants to the building material; and (ii) a template-based interaction in which ants use their body size as a cue to control the height at which they start to build a roof from existing pillars. We then develop a 3D stochastic model based on these individual behaviors to analyze the effect of pheromone presence and strength on construction dynamics. We show that the model can quantitatively reproduce key features of construction dynamics, including a large-scale pattern of regularly spaced pillars, the formation and merging of caps over the pillars, and the remodeling of built structures. Finally, our model suggests that the lifetime of the pheromone is a highly influential parameter that controls the growth and form of nest architecture.
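
    The positive feedback at the heart of stigmergic construction (deposits laced with pheromone attract further deposits, while the pheromone evaporates) can be caricatured in a one-dimensional toy model; all parameters here are invented and far simpler than the paper's 3D stochastic model:

```python
import random

def simulate(n_sites=60, n_steps=2000, evaporation=0.01, seed=1):
    """Toy 1D stigmergy: an ant visits a random site and deposits material
    with probability that grows with local pheromone; every deposit adds
    pheromone, which decays everywhere at rate `evaporation`."""
    rng = random.Random(seed)
    material = [0] * n_sites
    pheromone = [0.0] * n_sites
    for _ in range(n_steps):
        i = rng.randrange(n_sites)
        # deposition probability amplified by local pheromone (positive feedback)
        p = (0.1 + pheromone[i]) / (1.0 + pheromone[i])
        if rng.random() < p:
            material[i] += 1
            pheromone[i] += 1.0
        pheromone = [max(0.0, c - evaporation) for c in pheromone]
    return material

heights = simulate()
print(max(heights), sum(heights))
```

    Raising `evaporation` (a shorter pheromone lifetime) weakens the feedback and spreads material more evenly, mirroring the paper's conclusion that pheromone lifetime strongly controls the growth and form of the emerging structure.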

  17. Information security architecture an integrated approach to security in the organization

    CERN Document Server

    Killmeyer, Jan

    2006-01-01

    Information Security Architecture, Second Edition incorporates the knowledge developed during the past decade that has pushed the information security life cycle from infancy to a more mature, understandable, and manageable state. It simplifies security by providing clear and organized methods and by guiding you to the most effective resources available.

  18. Design of a Business-to-Government Information Sharing Architecture Using Business Rules

    NARCIS (Netherlands)

    Van Engelenburg, S.H.; Janssen, M.F.W.H.A.; Klievink, A.J.

    2016-01-01

    Information sharing between businesses and government agencies is of vital importance, yet businesses are often reluctant to share information, e.g. as it might be misused. Taking this into account is, however, often overlooked in the design of software architectures. In this research we apply a design

  20. Architectural Design for the Global Legal Information Network

    Science.gov (United States)

    Kalpakis, Konstantinos

    1999-01-01

    In this report, we provide a summary of our activities regarding the goals, requirements analysis, design, and prototype implementation for the Global Legal Information Network, a joint effort between the Law Library of Congress and NASA.

  1. A Novel Software Architecture for the Provision of Context-Aware Semantic Transport Information

    Directory of Open Access Journals (Sweden)

    Asier Moreno

    2015-05-01

    Full Text Available The effectiveness of Intelligent Transportation Systems depends largely on the ability to integrate information from diverse sources and the suitability of this information for the specific user. This paper describes a new approach for the management and exchange of this information, related to multimodal transportation. A novel software architecture is presented, with particular emphasis on the design of the data model and the enablement of services for information retrieval, thereby obtaining a semantic model for the representation of transport information. The publication of transport data as semantic information is established through the development of a Multimodal Transport Ontology (MTO and the design of a distributed architecture allowing dynamic integration of transport data. The advantages afforded by the proposed system due to the use of Linked Open Data and a distributed architecture are stated, comparing it with other existing solutions. The adequacy of the information generated in regard to the specific user’s context is also addressed. Finally, a working solution of a semantic trip planner using actual transport data and running on the proposed architecture is presented, as a demonstration and validation of the system.
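
    The Linked-Data style of publication described above amounts to exposing transport facts as subject-predicate-object triples that any consumer can query. A minimal, library-free sketch (all ontology terms and identifiers below are invented, not taken from the actual MTO):

```python
# Transport facts as RDF-style triples (identifiers are hypothetical)
triples = {
    ("stop:Central", "rdf:type", "mto:BusStop"),
    ("stop:Central", "mto:servedBy", "route:42"),
    ("route:42", "rdf:type", "mto:BusRoute"),
    ("route:42", "mto:operatedBy", "agency:CityBus"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard,
    in the spirit of a SPARQL basic graph pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Which routes serve the Central stop?
print(query(s="stop:Central", p="mto:servedBy"))
```

    Because every dataset publishes against shared vocabulary terms, independently produced transport data can be merged into one triple set and queried uniformly, which is the integration benefit the record attributes to Linked Open Data.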

  2. Proprioceptive information processing in schizophrenia.

    Science.gov (United States)

    Arnfred, Sidse M H

    2012-03-01

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus was constructed that stimulated the proprioceptive dimension of recognition of applied force. This load stimulus was tested both in simple and several types of more complex stimulus paradigms, with and without tasks, in total in 66 healthy volunteers. The evoked potential (EP) resulting from the load stimulus was named the proprioceptive EP. The later components of the proprioceptive EP (> 150 ms) were modulated similarly to previously reported electrical somatosensory EPs by repetition and cognitive task. The earlier activity was further investigated through decomposition of the time-frequency transformed data by a new non-negative matrix analysis, and previous research and visual inspection validated these results. Several time-frequency components emerged in the proprioceptive EP. The contra-lateral parietal gamma component (60-70 ms; 30-41 Hz) had not previously been described in the somatosensory modality without electrical stimulation. 
The parietal beta component (87-103 ms; 19-22 Hz) was increased when the proprioceptive stimulus appeared in a predictable sequence in

  3. Service oriented architecture for the integration of clinical and physiological data for real-time event stream processing.

    Science.gov (United States)

    Kamaleswaran, Rishikesan; McGregor, Carolyn; Percival, Jennifer

    2009-01-01

    This paper proposes a framework for the integration of physiological and clinical health data within a Service-Oriented architecture framework. This integration will subsequently be used in real-time event stream processing in intelligent patient monitoring devices. Service-oriented architecture offers a unique method of integrating health data as information is collected from multiple medical devices that lack any substantial means of standardization. Employing various services to facilitate the transmission and integration of these data will result in significant improvement in both efficacy and analytical velocity of intelligent patient monitoring systems. We demonstrate this approach within the Neonatal Intensive Care setting.
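
    The kind of real-time rule such an architecture would run over the integrated device streams can be sketched in a few lines; the rule, threshold, and readings below are invented for illustration only:

```python
def low_rate_alerts(stream, threshold=100, window=3):
    """Illustrative stream rule (hypothetical threshold): emit an alert
    timestamp when `window` consecutive readings fall below `threshold`."""
    run = 0
    alerts = []
    for t, value in stream:
        run = run + 1 if value < threshold else 0
        if run == window:  # fires once when the run first reaches `window`
            alerts.append(t)
    return alerts

# Invented heart-rate readings as (timestamp, bpm) events
readings = [(0, 120), (1, 95), (2, 92), (3, 90), (4, 110), (5, 94)]
print(low_rate_alerts(readings))  # → [3]
```

    In a service-oriented deployment, a rule like this would sit behind a stream-processing service that receives events from the device-integration services rather than from an in-memory list.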

  4. Creativity, Complexity, and Precision: Information Visualization for (Landscape) Architecture

    DEFF Research Database (Denmark)

    Buscher, Monika; Christensen, Michael; Mogensen, Preben Holst

    2000-01-01

    Drawing on ethnographic studies of (landscape) architects at work, this paper presents a human-centered approach to information visualization. A 3D collaborative electronic workspace allows people to configure, save and browse arrangements of heterogeneous work materials. Spatial arrangements… and links are created and maintained as an integral part of ongoing work with `live' documents and objects. The result is an extension of the physical information space of the architects' studio that utilizes the potential of electronic data storage, visualization and network technologies to support work…

  5. Proposal for logistics information management system using distributed architecture; Bunsangata butsuryu joho system no teian to kensho

    Energy Technology Data Exchange (ETDEWEB)

    Kataoka, N.; Koizumi, H.; Shimizu, H. [Mitsubishi Electric Power Corp., Tokyo (Japan)

    1998-03-01

    Conventional host-based central-processing type logistics information systems collect all information about stocked products (sales results, inventory, out-of-stock items) on a single host computer, and based on this information perform ordering, shipping, receiving, and other processing. In a client/server architecture, the system is not simply downsized: in order to ensure more effective use of logistics information and closer coordination with manufacturing information systems, the logistics information system must be configured as a distributed system specific to a given factory and its various products. In such distributed systems, each function acts independently, but at the same time the overall system of which they are part must operate in harmony to perform cost optimization, adjust allocation of resources among different factories and business locations, and present a single monolithic interface to retailers and sales agents. In this paper, we propose a logistics information system with a distributed architecture as well as agents whose role is to coordinate operation of the overall system, as one means of realizing this combination of component autonomy and overall system harmony. The methodology proposed here was applied to a proving system, and its effectiveness was verified. 9 refs., 12 figs.
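
    The combination of component autonomy and overall harmony can be illustrated with a toy coordinating agent that fills an order by drawing on autonomous per-factory inventories; the factory names and the largest-surplus-first policy are invented for illustration:

```python
def coordinate(factories, order_qty):
    """Hypothetical coordinator agent: fill an order by drawing stock
    from autonomous factory systems, largest inventory first."""
    allocation = {}
    remaining = order_qty
    for name, stock in sorted(factories.items(), key=lambda kv: -kv[1]):
        take = min(stock, remaining)
        if take:
            allocation[name] = take
            factories[name] -= take  # each factory updates its own stock
            remaining -= take
        if remaining == 0:
            break
    if remaining:
        raise ValueError(f"short by {remaining} units across all factories")
    return allocation

stock = {"factory_a": 40, "factory_b": 25, "factory_c": 10}
print(coordinate(stock, 55))  # → {'factory_a': 40, 'factory_b': 15}
```

    Each factory system remains the authority over its own inventory; the agent only decides the cross-factory allocation, which is the division of labor the record proposes.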

  6. An Architecture to Support Information Availability in the Tactical Domain

    Science.gov (United States)

    2012-10-01

  7. Mobile Technology and CAD Technology Integration in Teaching Architectural Design Process for Producing Creative Product

    Science.gov (United States)

    Bin Hassan, Isham Shah; Ismail, Mohd Arif; Mustafa, Ramlee

    2011-01-01

    The purpose of this research is to examine the effect of integrating the mobile and CAD technology on teaching architectural design process for Malaysian polytechnic architectural students in producing a creative product. The website is set up based on Caroll's minimal theory, while mobile and CAD technology integration is based on Brown and…

  8. Digital Crust: Information architecture for heterogeneous data integration

    Science.gov (United States)

    Richard, S. M.; Zaslavsky, I.; Fan, Y.; Bristol, S.; Peters, S. E.

    2015-12-01

    The Digital Crust EarthCube Building block is addressing the issue of multiple, heterogeneous but related datasets characteristic of field and sample based research using a 'loose-schema' approach, with linked entity and attribute definitions in an information model (ontology) registry (IMR). Various data entities (RDA 'data types') are defined by mapping entity and attribute definitions to definitions in the IMR. Inclusion (loading) of new data at the simplest level can bring in entities that are not registered, but these will not be 'integratable' with other data until someone does the schema matching into the IMR. New datasets can be designed using registered entity and attributes that will from the beginning be integrated into the system (similar to the approach used by the National Information Exchange Model). The fundamental abstract components in this system are 1) a data repository that allows storage of key-value structured data objects; and 2) a registry that documents information models-- the base data types, attributes and entities -- and mappings from the registered types in the datastore to the registered items. This constitutes the data repository subsystem. Data access is enabled by caching views of aggregated data from the datastore (aggregated based on the semantics of the registered items in the IMR) and creating indexes based on the registered items in the IMR. Contributing data to this system will be greatly facilitated by using existing, documented information models. It can accept datasets that are not 'standardized' as well, but the consequence is that those data will not be integratable with other existing data until the work is done to document the entities and attributes in the data and to map those into existing registered types.
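
    The registry-plus-datastore idea above (data loads without registration, but becomes integratable only once its attributes are mapped to registered terms) can be sketched as follows; the dataset names, local attribute names, and `ims:` terms are invented for illustration:

```python
# Hypothetical information-model registry: maps each dataset's local
# attribute names onto registered canonical terms.
registry = {
    "wells_2014": {"depth_ft": "ims:depth", "temp_f": "ims:temperature"},
    "boreholes_eu": {"z_m": "ims:depth"},  # temperature not yet mapped
}

def integrate(datasets):
    """Aggregate records under canonical attribute names; unmapped
    attributes are loadable but dropped from the integrated view
    until someone does the schema matching."""
    merged = []
    for name, records in datasets.items():
        mapping = registry.get(name, {})
        for rec in records:
            merged.append({mapping[k]: v for k, v in rec.items() if k in mapping})
    return merged

data = {
    "wells_2014": [{"depth_ft": 1200, "temp_f": 95}],
    "boreholes_eu": [{"z_m": 350, "quality": "A"}],  # 'quality' unmapped
}
print(integrate(data))
```

    Adding the missing `quality` mapping to the registry would make that attribute appear in the integrated view without touching the stored data, which is the loose-schema benefit the record describes.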

  9. The Latent Curriculum: Breaking Conceptual Barriers to Information Architecture

    Directory of Open Access Journals (Sweden)

    Catherine Boden

    2012-05-01

    Full Text Available In online instruction there is a physical and temporal distance between students and instructors that is not present in face-to-face instruction, which has implications for developing online curricula. This paper examines information literacy components of Introduction to Systematic Reviews, an online graduate-level course offered at the University of Saskatchewan. Course evaluation suggested that, although the screencast tutorials were well accepted by the students as a method of learning, there was need to enhance their content. Through grading of assignments, consultations with the students, and evaluation of the final search strategies, the authors identified common aspects of search strategy development with which the students struggled throughout the course. There was a need to unpack the curriculum to more clearly identify specific areas that needed to be expanded or improved. Bloom’s Revised Taxonomy was utilized as the construct to identify information literacy learning objectives at a relatively granular level. Comparison of learning objectives and the content of the screencast tutorials revealed disparities between desired outcomes and the curriculum (particularly for high-level thinking – the latent curriculum. Analyzing curricula using a tool like Bloom’s Revised Taxonomy will help information literacy librarians recognize hidden or latent learning objectives.

  10. A UML-based ontology for describing hospital information system architectures.

    Science.gov (United States)

    Winter, A; Brigl, B; Wendt, T

    2001-01-01

    To control the heterogeneity inherent in hospital information systems, information management needs appropriate modeling methods or techniques for hospital information systems. This paper shows that, for several reasons, available modeling approaches are not able to answer relevant questions of information management. To overcome this major deficiency we offer a UML-based ontology for describing hospital information system architectures. This ontology comprises three layers: the domain layer, the logical tool layer, and the physical tool layer, and defines the relevant components. The relations between these components, especially between components of different layers, make it possible to answer our information management questions.

  11. Methodological process for chromatic reading of traditional architectural elements

    Directory of Open Access Journals (Sweden)

    Andrea Costa Romão Silva

    2016-01-01

    Full Text Available This article investigates the complex phenomenon of color in the architectural environment by presenting a specific methodology, based on historical-documentary and technical-architectural research, developed for the particular case of six religious monuments of the Historical Center of São Cristóvão, in Sergipe, Brazil. This type of research is relevant because color is inserted into the physiognomy of a space as a cultural factor, related to historical issues and symbolic associations, although, due to its heterogeneity, it is subject to constant changes in appearance. Documentation therefore becomes an important tool for organizing and registering this renascent color memory, safeguarding it while also supporting restoration work that may occur.

  12. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand… that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can… the autonomy of architecture, not as an esoteric concept but as a valid source of information in a pragmatic design practice, may help us overcome the often-proclaimed dichotomy between formal autonomy and a societally committed architecture. It follows that in architectural education there can be a close…

  13. Open Computer Forensic Architecture a Way to Process Terabytes of Forensic Disk Images

    Science.gov (United States)

    Vermaas, Oscar; Simons, Joep; Meijer, Rob

    This chapter describes the Open Computer Forensics Architecture (OCFA), an automated system that dissects complex file types, extracts metadata from files and ultimately creates indexes on forensic images of seized computers. It consists of a set of collaborating processes, called modules. Each module is specialized in processing a certain file type. When it receives a so-called 'evidence' (the information that has been extracted so far about the file together with the actual data), it either adds new information about the file or uses the file to derive a new 'evidence'. All evidence, original and derived, is sent to a router after being processed by a particular module. The router decides which module should process the evidence next, based upon the metadata associated with the evidence. Thus the OCFA system can recursively process images until the embedded files, if any, have been extracted from every compound file, all information that the system can derive has been derived, and all extracted text is indexed. Compound files include, but are not limited to, archive and zip files, disk images, text documents of various formats and, for example, mailboxes. The output of an OCFA run is a repository full of derived files, a database containing all extracted information about the files and an index which can be used when searching. This is presented in a web interface. Moreover, processed data is easily fed to third-party software for further analysis or for use in data-mining or text-mining tools. The main advantage of the OCFA system is scalability: it is able to process large amounts of data.
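
    The recursive module/router loop described above can be sketched as follows; this is a toy illustration of the dispatch pattern, not OCFA's actual implementation, and the module behavior is faked:

```python
from collections import deque

index = {}  # text -> list of file names it came from

def unzip_module(evidence):
    """Pretend to unpack an archive: members become new derived evidence."""
    return [{"type": "text", "name": n, "text": n.upper()}
            for n in evidence["members"]]

def index_module(evidence):
    """Index extracted text; derives no further evidence."""
    index.setdefault(evidence["text"], []).append(evidence["name"])
    return []

# Router table: evidence metadata ('type') decides the next module
ROUTES = {"zip": unzip_module, "text": index_module}

def run(initial):
    queue = deque(initial)
    while queue:
        ev = queue.popleft()
        module = ROUTES.get(ev["type"])
        if module:
            queue.extend(module(ev))  # recurse on derived evidence

run([{"type": "zip", "name": "seized.zip", "members": ["a.txt", "b.txt"]}])
print(index)
```

    The loop terminates when no module derives further evidence, mirroring the chapter's description of recursion until every compound file is fully unpacked and indexed.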

  14. Infochemistry Information Processing at the Nanoscale

    CERN Document Server

    Szacilowski, Konrad

    2012-01-01

    Infochemistry: Information Processing at the Nanoscale defines a new field of science, and describes the processes, systems and devices at the interface between chemistry and information sciences. The book is devoted to the application of molecular species and nanostructures to advanced information processing. It includes the design and synthesis of suitable materials and nanostructures, their characterization, and finally applications of molecular species and nanostructures for information storage and processing purposes. Divided into twelve chapters, the first three chapters serve as an int…

  15. Social Information Processing in Deaf Adolescents

    Science.gov (United States)

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  16. The Use of Supporting Documentation for Information Architecture by Australian Libraries

    Science.gov (United States)

    Hider, Philip; Burford, Sally; Ferguson, Stuart

    2009-01-01

    This article reports the results of an online survey that examined the development of information architecture of Australian library Web sites with reference to documented methods and guidelines. A broad sample of library Web managers responded from across the academic, public, and special sectors. A majority of libraries used either in-house or…

  17. A Secure and Efficient Communications Architecture for Global Information Grid Users Via Cooperating Space Assets

    Science.gov (United States)

    2008-06-19

    Each of these proposed architectures, selected from the literature, improves system scalability and security in different ways.

  18. Research on Information-Based Teaching in Reform and Practice of Architectural Design

    Science.gov (United States)

    Hao, Li-Jun; Xiao, Zhe-Tao

    2017-01-01

    In China, with the development of the era, the Architectural Design (AD) education has been given the requirement that students should master creative thinking mode and design method. The teaching target of integrating the Information-Based Teaching (IBT) into Creative Thinking (CT) mode is analyzed, and the Teaching Mode (TM) of integrating the…

  19. Processing of remote sensing information in cooperative intelligent grid environment

    Science.gov (United States)

    Sun, Jie; Ma, Hongchao; Zhong, Liang

    2008-12-01

    In order to raise the intelligence level and improve the cooperative ability of the grid, this paper proposes an agent-oriented middleware, which is applied to the traditional OGSA architecture to compose a new architecture named CIG (Cooperative Intelligent Grid). The paper also expounds the types of cooperative processing of remote sensing, the architecture of CIG, and how to implement cooperation in the CIG environment.

  20. Facilitating Software Architecting by Ranking Requirements based on their Impact on the Architecture Process

    NARCIS (Netherlands)

    Galster, Matthias; Eberlein, Armin; Sprinkle, J; Sterritt, R; Breitman, K

    2011-01-01

    Ranking software requirements helps decide what requirements to implement during a software development project, and when. Currently, requirements ranking techniques focus on resource constraints or stakeholder priorities and neglect the effect of requirements on the software architecture process.

  2. Wireless receiver architectures and design antennas, RF, synthesizers, mixed signal, and digital signal processing

    CERN Document Server

    Rouphael, Tony J

    2014-01-01

    Wireless Receiver Architectures and Design presents the various designs and architectures of wireless receivers in the context of modern multi-mode and multi-standard devices. This one-stop reference and guide to designing low-cost low-power multi-mode, multi-standard receivers treats analog and digital signal processing simultaneously, with equal detail given to the chosen architecture and modulating waveform. It provides a complete understanding of the receiver's analog front end and the digital backend, and how each affects the other. The book explains the design process in great detail…

  3. The processing architectures of whole-object features: A logical-rules approach.

    Science.gov (United States)

    Moneer, Sarah; Wang, Tony; Little, Daniel R

    2016-09-01

    In this article, we examine whether dimensions comprising the entirety of an object (e.g., size and saturation) are processed independently or pooled into a single whole-object representation. These whole-object features, while notionally separable, sometimes show empirical effects consistent with integrality. A recently proposed theoretical distinction between integral and separable dimensions that emphasizes the time course of information processing, can be used to differentiate whether whole-object features are processed independently, either in serial or in parallel, or pooled into a single coactive process (see, e.g., Little, Nosofsky, Donkin, & Denton, 2013). The current research examines this theoretical distinction in the processing of 3 sets of whole-object-featured stimuli that vary on any pair of the dimensions of saturation, size, and orientation. We found that a mixture of serial and parallel architectures underlies the processing of whole-object features. These results indicate that whole-object features are processed independently.

  4. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    Science.gov (United States)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimation and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms onto parallel processing architectures such as systolic arrays with efficient fault-tolerant schemes are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. Fault diagnosis, order-degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order-degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.
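
    The core numerical operation of a QRD-based least-squares array, annihilating subdiagonal entries with Givens rotations and back-substituting on the triangular factor, can be shown in a sequential sketch; a systolic array pipelines exactly these rotations across processing cells:

```python
import math

def qr_solve(A, b):
    """Solve min ||Ax - b|| by zeroing subdiagonal entries of A with
    Givens rotations (the operation a QRD systolic array pipelines),
    then back-substituting on the resulting triangular factor."""
    m, n = len(A), len(A[0])
    R = [row[:] for row in A]
    y = b[:]
    for j in range(n):
        for i in range(j + 1, m):
            if R[i][j] == 0.0:
                continue
            r = math.hypot(R[j][j], R[i][j])
            c, s = R[j][j] / r, R[i][j] / r
            # rotate rows j and i to annihilate R[i][j]
            for k in range(j, n):
                R[j][k], R[i][k] = c * R[j][k] + s * R[i][k], -s * R[j][k] + c * R[i][k]
            y[j], y[i] = c * y[j] + s * y[i], -s * y[j] + c * y[i]
    # back-substitution on the triangular factor
    x = [0.0] * n
    for j in range(n - 1, -1, -1):
        x[j] = (y[j] - sum(R[j][k] * x[k] for k in range(j + 1, n))) / R[j][j]
    return x

# Overdetermined line fit: y = 2t + 1 sampled at t = 0..3 (exact data)
A = [[float(t), 1.0] for t in range(4)]
b = [2.0 * t + 1.0 for t in range(4)]
print([round(v, 6) for v in qr_solve(A, b)])  # → [2.0, 1.0]
```

    In the recursive (RLS) setting, each new data row is rotated into the stored triangular factor in the same way, which is why the rotation cell is the basic building block of the array.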

  5. Processing information system for highly specialized information in corporate networks

    Science.gov (United States)

    Petrosyan, M. O.; Kovalev, I. V.; Zelenkov, P. V.; Brezitskaya, VV; Prohorovich, G. A.

    2016-11-01

    A new structure for the formation and management of highly specialized information in corporate systems is offered. The main distinguishing feature of this structure is that it involves the processing of multilingual information within a single user request.

  6. Wireless Information and Power Transfer: Architecture Design and Rate-Energy Tradeoff

    CERN Document Server

    Zhou, Xun; Ho, Chin Keong

    2012-01-01

    Simultaneous information and power transfer over the wireless channels potentially offers great convenience to mobile users. Yet practical receiver designs impose technical constraints on its hardware realization, as practical circuits for harvesting energy from radio signals are not yet able to decode the carried information directly. To make theoretical progress, we propose a general receiver operation, namely, dynamic power splitting (DPS), which splits the received signal with adjustable power for energy harvesting and for information decoding. Moreover, we propose two types of practical receiver architectures, namely, separated versus integrated information and energy receivers. The integrated receiver integrates the front-end components of the separated receiver, thus achieving a smaller form factor. The rate-energy tradeoff for these two architectures is characterized by a so-called rate-energy (R-E) region. Numerical results show that the R-E region of the integrated receiver is superior to that of th...
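
    The dynamic-power-splitting tradeoff can be made concrete with a toy model: a fraction rho of the received power is harvested and the remainder feeds the information decoder, so sweeping rho traces the boundary of an R-E region. The SNR value and normalization below are invented for illustration:

```python
import math

def rate_energy(rho, snr=20.0, power=1.0):
    """Illustrative DPS model: harvest a fraction rho of received power,
    decode with the remaining (1 - rho) fraction."""
    rate = math.log2(1.0 + (1.0 - rho) * snr)   # bits/s/Hz
    energy = rho * power                        # harvested power (normalized)
    return rate, energy

# Sweeping rho trades rate against harvested energy
for rho in (0.0, 0.5, 1.0):
    r, e = rate_energy(rho)
    print(f"rho={rho:.1f}  rate={r:.2f}  energy={e:.2f}")
```

    The endpoints recover the two extremes: rho = 0 is information-only reception at maximum rate, while rho = 1 harvests all received power and decodes nothing.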

  7. Improvements of sensorimotor processes during action cascading associated with changes in sensory processing architecture-insights from sensory deprivation.

    Science.gov (United States)

    Gohil, Krutika; Hahne, Anja; Beste, Christian

    2016-06-20

    In most everyday situations sensorimotor processes are quite complex because situations often require carrying out several actions in a specific temporal order; i.e., one has to cascade different actions. While it is known that changes to stimuli affect action cascading mechanisms, it is unknown whether action cascading changes when sensory stimuli are not manipulated but the neural architecture to process these stimuli is altered. In the current study we test this hypothesis using prelingually deaf subjects as a model, taking a systems neurophysiological approach with event-related potentials (ERPs) and source localization techniques. We show that prelingually deaf subjects exhibit improvements in action cascading. However, this improvement is most likely not due to changes at the perceptual (P1-ERP) or attentional processing levels (N1-ERP), but to changes at the response selection level (P3-ERP). It seems that the temporo-parietal junction (TPJ) is important for these effects to occur, because the TPJ comprises overlapping networks important for the processing of sensory information and the selection of responses. Sensory deprivation thus affects cognitive processes downstream of sensory processing, and only these seem to be important for behavioral improvements in situations requiring complex sensorimotor processes and action cascading.

  8. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
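The token-game semantics underlying such Petri-net-based process models can be sketched in a few lines. This is a toy classical place/transition net, not the XML-net formalism the chapter introduces, and the place and transition names are hypothetical:

```python
class PetriNet:
    """Minimal place/transition net: a marking maps places to token counts;
    a transition fires by consuming input tokens and producing output tokens."""

    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# A two-step business process: an order is approved, then closed.
net = PetriNet({"received": 1, "approved": 0, "done": 0})
net.add_transition("approve", {"received": 1}, {"approved": 1})
net.add_transition("close", {"approved": 1}, {"done": 1})
net.fire("approve")
net.fire("close")
print(net.marking)  # {'received': 0, 'approved': 0, 'done': 1}
```

XML nets extend this idea by letting tokens carry structured XML documents rather than being indistinguishable black dots, which is what makes them suitable for modeling business processes over business documents.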

  9. Theory of Neural Information Processing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Galla, Tobias [Abdus Salam International Centre for Theoretical Physics and INFM/CNR SISSA-Unit, Strada Costiera 11, I-34014 Trieste (Italy)

    2006-04-07

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard

  10. Integrating acoustic analysis in the architectural design process using parametric modelling

    DEFF Research Database (Denmark)

    Peters, Brady

    2011-01-01

    This paper discusses how parametric modeling techniques can be used to provide architectural designers with a better understanding of the acoustic performance of their designs and provide acoustic engineers with models that can be analyzed using computational acoustic analysis software. Architects...... provide a method by which architects and engineers can work together more efficiently and communicate better. This research is illustrated through the design of an architectural project, a new school in Copenhagen, Denmark by JJW Architects, where parametric modeling techniques have been used in different......, acoustic performance can inform the geometry and material logic of the design. In this way, the architectural design and the acoustic analysis model become linked....

  11. Heterogeneous reconfigurable processors for real-time baseband processing from algorithm to architecture

    CERN Document Server

    Zhang, Chenxin; Öwall, Viktor

    2016-01-01

    This book focuses on domain-specific heterogeneous reconfigurable architectures, demonstrating for readers a computing platform which is flexible enough to support multiple standards, multiple modes, and multiple algorithms. The content is multi-disciplinary, covering areas of wireless communication, computing architecture, and circuit design. The platform described provides real-time processing capability with reasonable implementation cost, achieving balanced trade-offs among flexibility, performance, and hardware costs. The authors discuss efficient design methods for wireless communication processing platforms, from both an algorithm and architecture design perspective. Coverage also includes computing platforms for different wireless technologies and standards, including MIMO, OFDM, Massive MIMO, DVB, WLAN, LTE/LTE-A, and 5G. •Discusses reconfigurable architectures, including hardware building blocks such as processing elements, memory sub-systems, Network-on-Chip (NoC), and dynamic hardware reconfigur...

  12. Five Computational Actions in Information Processing

    Directory of Open Access Journals (Sweden)

    Stefan Vladutescu

    2014-12-01

    Full Text Available This study is circumscribed to Information Science. The zetetic aim of the research is twofold: (a) to define the concept of an action of computational information processing and (b) to design a taxonomy of actions of computational information processing. Our thesis is that any information processing is a computational processing. First, the investigation tries to demonstrate that the computational actions of information processing, or informational actions, are computational-investigative configurations for structuring information: clusters of highly aggregated operations which are carried out in a unitary manner, operate convergently, and behave like a unique computational device. From a methodological point of view, they belong to the category of analytical instruments for the informational processing of raw material: of data and of vague, confused, unstructured informational elements. In their internal articulation, the actions are patterns for the integrated carrying out of operations of informational investigation. Secondly, we propose an inventory and a description of five basic informational computational actions: exploring, grouping, anticipation, schematization, and inferential structuring. R. S. Wyer and T. K. Srull (2014) speak about "four information processing". We would like to continue with further investigation of the relationship between operations, actions, strategies and mechanisms of informational processing.

  13. Multi-Agent Architecture for Implementation of ITIL Processes: Case of Incident Management Process

    OpenAIRE

    Youssef SEKHARA; Medromi, Hicham; Sayouti, Adil

    2014-01-01

    ITIL (Information Technology Infrastructure Library) is the most widely accepted approach to IT service management in the world. Upon the adoption of ITIL processes, organizations face many challenges that can lead to increased complexity. In this paper we use the advantages of agent technology to make implementation and use of ITIL processes more efficient, starting by the incident management process.

  14. Multi-Agent Architecture for Implementation of ITIL Processes: Case of Incident Management Process

    Directory of Open Access Journals (Sweden)

    Youssef SEKHARA

    2014-08-01

    Full Text Available ITIL (Information Technology Infrastructure Library is the most widely accepted approach to IT service management in the world. Upon the adoption of ITIL processes, organizations face many challenges that can lead to increased complexity. In this paper we use the advantages of agent technology to make implementation and use of ITIL processes more efficient, starting by the incident management process.

  15. NLP-PIER: A Scalable Natural Language Processing, Indexing, and Searching Architecture for Clinical Notes.

    Science.gov (United States)

    McEwan, Reed; Melton, Genevieve B; Knoll, Benjamin C; Wang, Yan; Hultman, Gretchen; Dale, Justin L; Meyer, Tim; Pakhomov, Serguei V

    2016-01-01

    Many design considerations must be addressed in order to provide researchers with full-text and semantic search of unstructured healthcare data such as clinical notes and reports. Institutions looking at providing this functionality must also address the big data aspects of their unstructured corpora. Because these systems are complex and demand a non-trivial investment, there is an incentive to make the system capable of servicing future needs as well, further complicating the design. We present architectural best practices as lessons learned in the design and implementation of NLP-PIER (Patient Information Extraction for Research), a scalable, extensible, and secure system for processing, indexing, and searching clinical notes at the University of Minnesota.

  16. Research on the Architecture of a Basic Reconfigurable Information Communication Network

    Directory of Open Access Journals (Sweden)

    Ruimin Wang

    2013-01-01

    Full Text Available The current information network cannot fundamentally meet some urgent requirements, such as providing ubiquitous information services and various types of heterogeneous network, supporting diverse and comprehensive network services, possessing high quality communication effects, ensuring the security and credibility of information interaction, and implementing effective supervisory control. This paper provides the theory system for the basic reconfigurable information communication network based on the analysis of present problems on the Internet and summarizes the root of these problems. It also provides an in-depth discussion about the related technologies and the prime components of the architecture.

  17. The Future of Architecture Collaborative Information Sharing: DoDAF Version 2.03 Updates

    Science.gov (United States)

    2012-04-30

    [Extraction fragment: a table of architecture tools, vendors, and supported notations, including Salamander and Select Solution Factory (Select Business Solutions; BPMN, UML), SimonTool (Simon Labs), SimProcess (CACI; BPMN), System Architecture Management...for DoDAF, Mega (UML), Metastorm ProVision (Metastorm; BPMN), Naval Simulation System - 4 Aces (METRON), NetViz (CA), and OPNET (OPNET).]

  18. Information Selection in Intelligence Processing

    Science.gov (United States)

    2011-12-01

    ... problem of overload." As another example, Whaley (Whaley, 1974) argues that one of the causes of the Pearl Harbor and Barbarossa strategic surprises is ... which becomes more and more important as the Internet evolves. The IR problem and the information selection problem share some similar ... all the algorithms tend more towards exploration: the temperature parameter in Softmax is higher (0.12 instead of 0.08), the delta for the VDBE ...

  19. Recording Information on Architectural Heritage Should Meet the Requirements for Conservation Digital Recording Practices at the Summer Palace

    Science.gov (United States)

    Zhang, L.; Cong, Y.; Wu, C.; Bai, C.; Wu, C.

    2017-08-01

    The recording of Architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, the recording of architectural heritage information supports heritage research, conservation, management and architectural heritage display. What information do we record and collect and what technology do we use for information recording? How do we determine the level of accuracy required when recording architectural information? What method do we use for information recording? These questions should be addressed in relation to the nature of the particular heritage site and the specific conditions for the conservation work. In recent years, with the rapid development of information acquisition technology such as Close Range Photogrammetry, 3D Laser Scanning as well as high speed and high precision Aerial Photogrammetry, many Chinese universities, research institutes and heritage management bureaux have purchased considerable equipment for information recording. However, the lack of understanding of both the nature of architectural heritage and the purpose for which the information is being collected has led to several problems. For example: some institutions when recording architectural heritage information aim solely at high accuracy. Some consider that advanced measuring methods must automatically replace traditional measuring methods. Information collection becomes the purpose, rather than the means, of architectural heritage conservation. Addressing these issues, this paper briefly reviews the history of architectural heritage information recording at the Summer Palace (Yihe Yuan, first built in 1750), Beijing. Using the recording practices at the Summer Palace during the past ten years as examples, we illustrate our achievements and lessons in recording architectural heritage information with regard to the following aspects: (buildings') ideal status desired, (buildings') current status

  20. RECORDING INFORMATION ON ARCHITECTURAL HERITAGE SHOULD MEET THE REQUIREMENTS FOR CONSERVATION Digital Recording Practices at the Summer Palace

    Directory of Open Access Journals (Sweden)

    L. Zhang

    2017-08-01

    Full Text Available The recording of Architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, the recording of architectural heritage information supports heritage research, conservation, management and architectural heritage display. What information do we record and collect and what technology do we use for information recording? How do we determine the level of accuracy required when recording architectural information? What method do we use for information recording? These questions should be addressed in relation to the nature of the particular heritage site and the specific conditions for the conservation work. In recent years, with the rapid development of information acquisition technology such as Close Range Photogrammetry, 3D Laser Scanning as well as high speed and high precision Aerial Photogrammetry, many Chinese universities, research institutes and heritage management bureaux have purchased considerable equipment for information recording. However, the lack of understanding of both the nature of architectural heritage and the purpose for which the information is being collected has led to several problems. For example: some institutions when recording architectural heritage information aim solely at high accuracy. Some consider that advanced measuring methods must automatically replace traditional measuring methods. Information collection becomes the purpose, rather than the means, of architectural heritage conservation. Addressing these issues, this paper briefly reviews the history of architectural heritage information recording at the Summer Palace (Yihe Yuan, first built in 1750), Beijing. Using the recording practices at the Summer Palace during the past ten years as examples, we illustrate our achievements and lessons in recording architectural heritage information with regard to the following aspects: (buildings') ideal status desired, (buildings

  1. Information Search Process in Science Education.

    Science.gov (United States)

    McNally, Mary Jane; Kuhlthau, Carol C.

    1994-01-01

    Discussion of the development of an information skills curriculum focuses on science education. Topics addressed include information seeking behavior; information skills models; the search process of scientists; science education; a process approach for student activities; and future possibilities. (Contains 15 references.) (LRW)

  2. Scalable architecture for a room temperature solid-state quantum information processor.

    Science.gov (United States)

    Yao, N Y; Jiang, L; Gorshkov, A V; Maurer, P C; Giedke, G; Cirac, J I; Lukin, M D

    2012-04-24

    The realization of a scalable quantum information processor has emerged over the past decade as one of the central challenges at the interface of fundamental science and engineering. Here we propose and analyse an architecture for a scalable, solid-state quantum information processor capable of operating at room temperature. Our approach is based on recent experimental advances involving nitrogen-vacancy colour centres in diamond. In particular, we demonstrate that the multiple challenges associated with operation at ambient temperature, individual addressing at the nanoscale, strong qubit coupling, robustness against disorder and low decoherence rates can be simultaneously achieved under realistic, experimentally relevant conditions. The architecture uses a novel approach to quantum information transfer and includes a hierarchy of control at successive length scales. Moreover, it alleviates the stringent constraints currently limiting the realization of scalable quantum processors and will provide fundamental insights into the physics of non-equilibrium many-body quantum systems.

  3. Information architecture: study and analysis of data Public Medical base (PubMed

    Directory of Open Access Journals (Sweden)

    Odete Máyra Mesquita Sales

    2016-07-01

    Full Text Available Objective. Based on the principles proposed by Rosenfeld and Morville (2006), the present study examined the PubMed database interface, since a well-structured information architecture contributes to good usability in any digital environment. Method. The research was developed through literature review and an empirical study analyzing the information architecture, based on the organization, navigation, labeling and search systems recommended by Rosenfeld and Morville (2006), with respect to the usability of the PubMed database. For better understanding and description of these principles, we used the technique of content analysis. Results. The results showed that the database interface meets the criteria established by the elements of information architecture, such as an organization based on a hypertext structure, a horizontal menu with local content divided into categories, identification of active links, global navigation, breadcrumbs, textual and iconographic labeling, and a prominent search engine. Conclusions. This research showed that the PubMed database interface is well structured, friendly and objective, with numerous possibilities for search and information retrieval. However, there is a need to adopt accessibility standards on this website so that it more efficiently achieves its purpose of facilitating access to the information organized and stored in the PubMed database.

  4. 2D Material Device Architectures: Process Optimisation and Characterisation

    DEFF Research Database (Denmark)

    Gammelgaard, Lene

    encapsulated in hexagonal boron nitride have been fabricated and studied electrically. These devices have field-effect mobilities comparable with the highest values reported. Furthermore, state-of-the-art nano-patterns have been fabricated into encapsulated graphene. It was also explored how graphene layers...... perform as tunable contacts to transition metal dichalcogenide layers encapsulated in hexagonal boron nitride. This architecture yields high-performance devices, where high mobilities of the air-sensitive MoTe2 crystals have been measured, and a metal-insulator transition has been observed in monolayer MoS2...... devices. Additionally, the long-term stability of transition metal dichalcogenides has been studied, and the order of the layers has been demonstrated to be detectable by atomic force microscopy. The encapsulated van der Waals heterostructures give high performance and long-term stability of two

  5. An Architecture for Real-Time Processing of OSIRIS-REx Engineering and Science Data, from Raw Telemetry to PDS

    Science.gov (United States)

    Selznick, S. H.

    2017-06-01

    Herein we describe an architecture developed for processing engineering and science data for the OSIRIS-REx mission. The architecture is soup-to-nuts, starting with raw telemetry and ending with submission to PDS.

  6. A Platform Architecture for Sensor Data Processing and Verification in Buildings

    Science.gov (United States)

    Ortiz, Jorge Jose

    2013-01-01

    This thesis examines the state of the art of building information systems and evaluates their architecture in the context of emerging technologies and applications for deep analysis of the built environment. We observe that modern building information systems are difficult to extend, do not provide general services for application development, do…

  8. Mathematics of Information Processing and the Internet

    Science.gov (United States)

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  9. Scaling the Information Processing Demands of Occupations

    Science.gov (United States)

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  10. SAMS--a systems architecture for developing intelligent health information systems.

    Science.gov (United States)

    Yılmaz, Özgün; Erdur, Rıza Cenk; Türksever, Mustafa

    2013-12-01

    In this paper, SAMS, a novel health information system architecture for developing intelligent health information systems is proposed and also some strategies for developing such systems are discussed. The systems fulfilling this architecture will be able to store electronic health records of the patients using OWL ontologies, share patient records among different hospitals and provide physicians expertise to assist them in making decisions. The system is intelligent because it is rule-based, makes use of rule-based reasoning and has the ability to learn and evolve itself. The learning capability is provided by extracting rules from previously given decisions by the physicians and then adding the extracted rules to the system. The proposed system is novel and original in all of these aspects. As a case study, a system is implemented conforming to SAMS architecture for use by dentists in the dental domain. The use of the developed system is described with a scenario. For evaluation, the developed dental information system will be used and tried by a group of dentists. The development of this system proves the applicability of SAMS architecture. By getting decision support from a system derived from this architecture, the cognitive gap between experienced and inexperienced physicians can be compensated. Thus, patient satisfaction can be achieved, inexperienced physicians are supported in decision making and the personnel can improve their knowledge. A physician can diagnose a case, which he/she has never diagnosed before, using this system. With the help of this system, it will be possible to store general domain knowledge in this system and the personnel's need to medical guideline documents will be reduced.

  11. A heterogeneous multiprocessor architecture for low-power audio signal processing applications

    DEFF Research Database (Denmark)

    Paker, Ozgun; Sparsø, Jens; Haandbæk, Niels

    2001-01-01

    This paper describes a low-power programmable DSP architecture that targets audio signal processing. The architecture can be characterized as a heterogeneous multiprocessor consisting of small and simple instruction set processors called mini-cores that communicate using message passing. Early results obtained from the design of a prototype chip containing filter processors for a hearing aid application indicate a power consumption that is an order of magnitude better than current state-of-the-art low-power audio DSPs implemented using full-custom techniques. This is due to: (1
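The mini-core idea, small processing elements each running one fixed task and communicating only by message passing, can be caricatured in software with threads and queues. The two-stage gain/offset pipeline below is purely illustrative and is not the hearing-aid filter chain from the paper:

```python
import queue
import threading

def mini_core(fn, inbox, outbox):
    """A tiny 'mini-core': read a sample from its inbox, apply its fixed
    function, and pass the result downstream. None is a shutdown sentinel."""
    while True:
        x = inbox.get()
        if x is None:
            outbox.put(None)   # propagate shutdown to the next stage
            return
        outbox.put(fn(x))

q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    threading.Thread(target=mini_core, args=(lambda x: 2 * x, q_in, q_mid)),
    threading.Thread(target=mini_core, args=(lambda x: x + 1, q_mid, q_out)),
]
for t in stages:
    t.start()
for sample in [1, 2, 3]:
    q_in.put(sample)
q_in.put(None)

results = []
while (y := q_out.get()) is not None:
    results.append(y)
for t in stages:
    t.join()
print(results)  # [3, 5, 7]
```

In hardware the payoff of this structure is that each mini-core can be clocked, sized, and powered independently, which is one source of the power savings the abstract reports.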

  12. Visual Information Processing Based on Qualitative Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Hua; LIU Yongchang; LI Chao

    2007-01-01

    Visual information processing is not only an important research direction in fields such as psychology, neuroscience and artificial intelligence, but also the research base for biological recognition theory and its technological realization. Existing approaches to visual information processing, e.g. visual information processing oriented to neural computation, visual information processing using shape extraction and wavelets under high noise, ANN-based visual information processing, etc., are comparatively complex. Using qualitative mapping, this text describes the specific attributes in the course of visual information processing, and the results are more brief and straightforward. The software program for vision recognition is therefore probably easier to realize.

  13. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    CERN Document Server

    Chang, Chein-I

    2017-01-01

    This book explores recursive architectures in designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering into algorithm design, so that hyperspectral imagery can be processed not only progressively, sample by sample or band by band, but also recursively via recursive equations. This book can be considered a companion to the author's book, Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. It explores recursive structures in algorithm architecture; implements algorithmic recursive architecture in conjunction with progressive sample and band processing; derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to the Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format; and develops Recursive Hyperspectral Band Processing (RHBP) techniques according to the Band SeQuential (BSQ) acquisition format for hyperspectral data.
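The flavor of recursive sample processing can be sketched with a running update of the sample correlation matrix, a quantity many hyperspectral detectors rely on. This is a generic recursion under the BIP-style sample-by-sample assumption, not the book's specific RHSP/RHBP equations, and the data are synthetic:

```python
import numpy as np

def recursive_correlation(samples):
    """Update the sample correlation matrix one pixel vector at a time:
    R_k = ((k-1)/k) * R_{k-1} + (1/k) * x_k x_k^T,
    so downstream detectors can run progressively as pixels stream in."""
    R = None
    for k, x in enumerate(samples, start=1):
        outer = np.outer(x, x)
        R = outer if R is None else ((k - 1) / k) * R + outer / k
        yield R

rng = np.random.default_rng(1)
pixels = rng.normal(size=(100, 4))         # 100 pixels, 4 spectral bands
R_stream = None
for R_stream in recursive_correlation(pixels):
    pass                                    # each iterate is usable immediately

R_batch = pixels.T @ pixels / len(pixels)  # batch result for comparison
print(np.allclose(R_stream, R_batch))      # True
```

The recursive form needs only the previous estimate and the new pixel, which is exactly the property that lets such algorithms run in real time without buffering the whole data cube.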

  14. A configurable process for design of object-oriented software architectures

    DEFF Research Database (Denmark)

    Lønvig, Birgitte

    combinations of problems and solutions in a number of different domains. The workflow of how to configure a process for a domain is nevertheless applicable across different domains. The software architecture design process is based on a general conceptual framework consisting of domain characteristics, requirements......When we design large complex software systems, such as systems in the telecommunications world, and we follow one of the standard object-oriented methods or processes, we end up with a system that fulfills the requirements of functionality. However, it is difficult to ensure that other requirements......, such as modifiability and reusability, are fulfilled. Furthermore, the architecture is not explicitly described and is therefore difficult to comprehend. This Ph.D. dissertation defines a configurable process for the design of object-oriented software architectures. The process can be regarded as an extension to standard

  15. Scalable Engineering of Quantum Optical Information Processing Architectures (SEQUOIA)

    Science.gov (United States)

    2016-12-13

    ... interfacing with telecom quantum networks / qubit distribution; 4. DV quantum computing using CV cluster ... embed circuit-model quantum computing into CV ... linear-optics mode transformations. Realizing scalable, high-fidelity interferometric networks is a central challenge to be addressed on the path ... methods for characterizing these large interferometric networks. Figure 1: Photonic integrated circuit. Left: programmable PIC. Right: Transmission at

  16. CURRENT DEVELOPMENTS IN COMPLEX INFORMATION PROCESSING,

    Science.gov (United States)

    DATA PROCESSING, COMPUTER PROGRAMMING, INFORMATION RETRIEVAL, DATA STORAGE SYSTEMS, MATHEMATICAL LOGIC, ARTIFICIAL INTELLIGENCE, PATTERN RECOGNITION, GAME THEORY, PROGRAMMING LANGUAGES, MATHEMATICAL ANALYSIS.

  17. The working out of architectural concept for a new type public building — multi-information and education center by using information technologies and mathematical models

    Directory of Open Access Journals (Sweden)

    Михаил Владимирович Боровиков

    2012-12-01

    Full Text Available The architectural concept of a multifunctional information and educational center and its implementation are presented in the author's project. Advanced information technologies and mathematical models were used in the development of the project.

  18. The LHCb Data Acquisition and High Level Trigger Processing Architecture

    Science.gov (United States)

    Frank, M.; Gaspar, C.; Jost, B.; Neufeld, N.

    2015-12-01

    The LHCb experiment at the LHC accelerator at CERN collects collisions of particle bunches at 40 MHz. After a first level of hardware trigger with an output rate of 1 MHz, the physically interesting collisions are selected by running dedicated trigger algorithms in the High Level Trigger (HLT) computing farm. This farm consists of up to roughly 25000 CPU cores in roughly 1750 physical nodes, each equipped with up to 4 TB of local storage space. This work describes the LHCb online system with an emphasis on the developments implemented during the current long shutdown (LS1). We elaborate on the architecture to treble the available CPU power of the HLT farm and the technicalities to determine and verify the precise calibration and alignment constants which are fed to the HLT event selection procedure. We describe how the constants are fed into a two-stage HLT event selection facility that makes extensive use of the local disk-buffering capabilities on the worker nodes. With the installed disk buffers, the CPU resources can be used during periods of up to ten days without beams. In the past, these periods accounted for more than 70% of the total time.
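    The two-stage selection with local disk buffering described in this record can be sketched as a toy model. The class and method names, the event format, and the selection cuts below are all invented for illustration and do not reflect the actual LHCb software:

```python
import os
import pickle
import tempfile
from collections import deque

class DeferredHLT:
    """Toy two-stage trigger with a local disk buffer between the stages.

    Stage 1 (hlt1) runs a cheap, inclusive selection in real time and parks
    accepted events on local disk; stage 2 (hlt2_drain) replays the buffered
    events through a tighter selection later, e.g. during beam-off periods,
    once calibration and alignment constants are available.
    """

    def __init__(self, buffer_dir, hlt1_accept, hlt2_accept):
        self.buffer_dir = buffer_dir
        self.hlt1_accept = hlt1_accept
        self.hlt2_accept = hlt2_accept
        self._files = deque()
        self._n = 0

    def hlt1(self, event):
        """First stage: filter in real time, buffer accepted events on disk."""
        if self.hlt1_accept(event):
            path = os.path.join(self.buffer_dir, f"evt_{self._n:06d}.pkl")
            self._n += 1
            with open(path, "wb") as f:
                pickle.dump(event, f)
            self._files.append(path)

    def hlt2_drain(self):
        """Second stage: read buffered events back and apply the full selection."""
        selected = []
        while self._files:
            path = self._files.popleft()
            with open(path, "rb") as f:
                event = pickle.load(f)
            os.remove(path)
            if self.hlt2_accept(event):
                selected.append(event)
        return selected

# Usage with invented cuts: buffer events with pt > 1, then keep mass > 5.
with tempfile.TemporaryDirectory() as buf:
    hlt = DeferredHLT(buf,
                      hlt1_accept=lambda e: e["pt"] > 1.0,
                      hlt2_accept=lambda e: e["mass"] > 5.0)
    for event in [{"pt": 0.5, "mass": 9.0},   # rejected by stage 1
                  {"pt": 2.0, "mass": 9.0},   # kept by both stages
                  {"pt": 3.0, "mass": 1.0}]:  # buffered, rejected by stage 2
        hlt.hlt1(event)
    print(hlt.hlt2_drain())  # [{'pt': 2.0, 'mass': 9.0}]
```

    The deque of file paths stands in for the worker nodes' disk buffers: stage 1 never blocks on stage 2, which is the point of the deferred design.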

  19. Intelligent query processing for semantic mediation of information systems

    Directory of Open Access Journals (Sweden)

    Saber Benharzallah

    2011-11-01

    Full Text Available We propose an intelligent and efficient query processing approach for the semantic mediation of information systems, together with a generic multi-agent architecture that supports it. Our approach focuses on the exploitation of intelligent agents for query reformulation and the use of a new technology for semantic representation. The algorithm adapts itself to changes in the environment, offers broad applicability, and resolves various data conflicts dynamically; it reformulates the query using schema mediation for the discovered systems and context mediation for the other systems.

  20. Scalable Architecture for a Room Temperature Solid-State Quantum Information Processor

    CERN Document Server

    Yao, Norman Y; Gorshkov, Alexey V; Maurer, Peter C; Giedke, Geza; Cirac, J Ignacio; Lukin, Mikhail D

    2010-01-01

    The realization of a scalable quantum information processor has emerged over the past decade as one of the central challenges at the interface of fundamental science and engineering. Much progress has been made towards this goal. Indeed, quantum operations have been demonstrated on several trapped ion qubits, and other solid-state systems are approaching similar levels of control. Extending these techniques to achieve fault-tolerant operations in larger systems with more qubits remains an extremely challenging goal, in part, due to the substantial technical complexity of current implementations. Here, we propose and analyze an architecture for a scalable, solid-state quantum information processor capable of operating at or near room temperature. The architecture is applicable to realistic conditions, which include disorder and relevant decoherence mechanisms, and includes a hierarchy of control at successive length scales. Our approach is based upon recent experimental advances involving Nitrogen-Vacancy colo...

  1. An architecture for EEG signal processing and interpretation during sleep (ESPIS).

    Science.gov (United States)

    Toussaint, M; Schaltenbrand, N; Paiva, T; Pollmacher, T; Pflieger, C; Luthringer, R; Macher, J P

    1994-10-01

    The project's aim is to develop a dedicated workstation to process multiple channels of electrophysiological signals in real time during sleep. In ESPIS we aim to define both an architecture and an environment for EEG signal interpretation in medicine based on computer-science gold standards (Unix, X Window, Motif). Signal processing and pattern-recognition analysis are provided by parallel processing on a specifically developed acquisition architecture (DSP) based on transputers. The main result is a high-performance prototype demonstrating signal interpretation during sleep, which has already been tested in a medical environment. The overall specifications allow this biomedical device to be extended to other types of medical signals.

  2. Integrated and Modular Design of an Optimized Process Architecture

    Directory of Open Access Journals (Sweden)

    Colin Raßfeld

    2013-07-01

    Full Text Available Global economic integration has increased the complexity of business activities, so organizations are forced to become more efficient every day. Process organization is a very useful way of aligning organizational systems with business processes. However, an organization must do more than just focus its attention and efforts on processes. The layout design also has a significant impact on system performance. We contribute to this field by developing a tailored process-oriented organizational structure and a new layout design for the quality assurance of a leading German automotive manufacturer. The target concept we developed was evaluated by process owners and an IT-based process simulation. Our results provide solid empirical backing in which the performance and effects are assessed from a qualitative and a quantitative perspective.

  3. Information Processing Approaches to Cognitive Development

    Science.gov (United States)

    1989-08-04

    This chapter reviews the history and current status of information-processing approaches to cognitive development. Because the approach is so...a detailed analysis of self-modifying production systems and their potential for formulating theories of cognitive development. Keywords: Information processing; Cognitive development; Self modification; Production system.

  4. Chinese Information Processing and Its Prospects

    Institute of Scientific and Technical Information of China (English)

    Sheng Li; Tie-Jun Zhao

    2006-01-01

    The paper presents the main progress and achievements in Chinese information processing. It focuses on six aspects, i.e., Chinese syntactic analysis, Chinese semantic analysis, machine translation, information retrieval, information extraction, and speech recognition and synthesis. The important techniques and possible key problems of each branch in the near future are discussed as well.

  5. A Hybrid FPGA/Coarse Parallel Processing Architecture for Multi-modal Visual Feature Descriptors

    DEFF Research Database (Denmark)

    Jensen, Lars Baunegaard With; Kjær-Nielsen, Anders; Alonso, Javier Díaz

    2008-01-01

    This paper describes the hybrid architecture developed for speeding up the processing of so-called multi-modal visual primitives which are sparse image descriptors extracted along contours. In the system, the first stages of visual processing are implemented on FPGAs due to their highly parallel...

  6. Process-based Architecture for Robustness Applying Linux isolation mechanism in MG-R

    NARCIS (Netherlands)

    Matsinger, A.A.J.; Kourzanov, P.; Gopakumar, G.N.

    2006-01-01

    This report contains the results of a feasibility study of applying Linux facilities for isolating and protecting processes, and for communication and synchronisation between processes, to the MG-R architecture so as to improve robustness. Moreover, some guidelines and trade-offs are discussed how

  7. Using Multiple FPGA Architectures for Real-time Processing of Low-level Machine Vision Functions

    Science.gov (United States)

    Thomas H. Drayer; William E. King; Philip A. Araman; Joseph G. Tront; Richard W. Conners

    1995-01-01

    In this paper, we investigate the use of multiple Field Programmable Gate Array (FPGA) architectures for real-time machine vision processing. The use of FPGAs for low-level processing represents an excellent tradeoff between software and special purpose hardware implementations. A library of modules that implement common low-level machine vision operations is presented...

  8. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand...... the obligation to prepare students to perform in a profession that is largely defined by forces outside that discipline. It will be proposed that the autonomy of architecture can be understood as a unique kind of information: as architecture’s self-reliance or knowledge-about itself. A knowledge...... that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...

  9. Beyond Information Architecture: A Systems Integration Approach to Web-site Design

    Directory of Open Access Journals (Sweden)

    Krisellen Maloney

    2017-09-01

    Full Text Available Users' needs and expectations regarding access to information have fundamentally changed, creating a disconnect between how users expect to use a library Web site and how the site was designed. At the same time, library technical infrastructures include legacy systems that were not designed for the Web environment. The authors propose a framework that combines elements of information architecture with approaches to incremental system design and implementation. The framework allows for the development of a Web site that is responsive to changing user needs, while recognizing the need for libraries to adopt a cost-effective approach to implementation and maintenance.

  10. Information Architecture in the Smart TV Environment. For the LG Smart TV platform

    Directory of Open Access Journals (Sweden)

    María Victoria Nuño Moral

    2015-07-01

    Full Text Available This paper discusses the feasibility of applying the elements of Information Architecture (systems of organization, navigation, labeling, and search) to the world of Smart TV, and how the different systems and services studied are interpreted on these platforms. Specifically, the study focuses on the LG Smart TV platform. One of the questions raised is whether the advances emerging in related disciplines are also reflected in smart TVs. In this particular area there is still a long way to go, because applications that allow users to manage information directly have yet to be developed.

  11. Architectural Prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders......' concerns with respect to a system under development. An architectural prototype is primarily a learning and communication vehicle used to explore and experiment with alternative architectural styles, features, and patterns in order to balance different architectural qualities. The use of architectural...... prototypes in the development process is discussed, and we argue that such prototypes can play a role throughout the entire process. The use of architectural prototypes is illustrated by three distinct cases of creating software systems. We argue that architectural prototyping can provide key insights...

  12. Architectural prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders......' concerns with respect to a system under development. An architectural prototype is primarily a learning and communication vehicle used to explore and experiment with alternative architectural styles, features, and patterns in order to balance different architectural qualities. The use of architectural...... prototypes in the development process is discussed, and we argue that such prototypes can play a role throughout the entire process. The use of architectural prototypes is illustrated by three distinct cases of creating software systems. We argue that architectural prototyping can provide key insights...

  13. DataTourism : Designing an Architecture to Process Tourism Data

    OpenAIRE

    Soualah-Alila, Fayrouz; Coustaty, Mickaël; Rempulski, Nicolas; Doucet, Antoine

    2016-01-01

    International audience; With the rapid diffusion of new technologies in tourism, professionals face new challenges in using efficiently the vast amount of data created by tourists. Nowadays, this information comes from multiple and varied sources, such as cellular or social networks, touristic location attendance, or dematerialized satisfaction surveys, and in huge amounts. It is an important resource for the tourism industry, but its heterogeneity makes it difficult to aggregate and analys...

  14. Streamlining the Process of Acquiring Secure Open Architecture Software Systems

    Science.gov (United States)

    2013-04-01

    Office of Small Business Programs, Department of the Navy  Director, Office of Acquisition Resources and Analysis ( ARA )  Deputy Assistant...innovative ways and means to acquire/develop component-based OA software systems that are subject to diverse, heterogeneous IP licenses (Alspaugh... heterogeneously licensed systems. Journal of the Association for Information Systems, 11(11), 730–755. Anderson, S. (2012, July-September). Software

  15. Finite volume method room acoustic simulations integrated into the architectural design process

    DEFF Research Database (Denmark)

    Pind Jörgensson, Finnur Kári; Jeong, Cheol-Ho; Engsig-Karup, Allan Peter

    2017-01-01

    with the architectural design from the earliest design stage, as a part of a holistic design process. A new procedure to integrate room acoustics into architectural design is being developed in a Ph.D. project, with the aim of promoting this early stage holistic design process. This project aims to develop a new hybrid......In many cases, room acoustics are neglected during the early stage of building design. This can result in serious acoustical problems that could have been easily avoided and can be difficult or expensive to remedy at later stages. Ideally, the room acoustic design should interact...

  16. Aligning application architecture to the business context

    NARCIS (Netherlands)

    Wieringa, R.J.; Blanken, H.M.; Fokkinga, M.M.; Grefen, P.W.P.J.; Eder, J.; Missikoff, M.

    2003-01-01

    Alignment of application architecture to business architecture is a central problem in the design, acquisition and implementation of information systems in current large-scale information-processing organizations. Current research in architecture alignment is either too strategic or too software imp

  17. A Biologically-Inspired Neural Network Architecture for Image Processing

    Science.gov (United States)

    1990-12-01

    findings, in accord with other research cited here, were obtained from cortical measurements on 15 adult cats and 12 kittens, all anesthetized (9...software models on a Cray computer. Furthermore, care should be taken to avoid exceeding machine memory capacity when running intensive processes

  18. Techniques and software architectures for medical visualisation and image processing

    NARCIS (Netherlands)

    Botha, C.P.

    2005-01-01

    This thesis presents a flexible software platform for medical visualisation and image processing, a technique for the segmentation of the shoulder skeleton from CT data and three techniques that make contributions to the field of direct volume rendering. Our primary goal was to investigate the use

  19. A Reasoning Architecture for Expert Troubleshooting of Complex Processes

    Science.gov (United States)

    2012-09-01

    Vachtsevanos 1, 2, 5 Department of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, 30332, USA naveed@gatech.edu...appropriate models and measurements (data) to perform accurately and expeditiously expert troubleshooting for complex military and industrial processes...military and industrial systems (machines, aircraft, etc.) experience fault/failure modes that must be diagnosed accurately and rapidly in order to

  20. Strategic investment of embodied energy during the architectural planning process

    NARCIS (Netherlands)

    Hildebrand, L.

    2014-01-01

    It is an interesting time in the building industry; for more than a decade, sustainability has been a planning parameter that essentially impacts construction-related processes. The reduction of operational energy was initiated after the oil crisis and changed the type of construction by including heat trans

  1. All-solution-processed organic solar cells with conventional architecture

    NARCIS (Netherlands)

    Franeker, J.J. van; Voorthuijzen, W.P.; Gorter, H.; Hendriks, K.H.; Janssen, R.A.J.; Hadipour, A.; Andriessen, H.A.J.M.; Galagan, Y.O.

    2013-01-01

    Abstract All-solution processed organic solar cells with a conventional device structure were demonstrated. The evaporated low work function LiF/Al electrode was replaced by a printed high work function silver electrode combined with an additional electron transport layer (ETL). Two electron transpo

  2. Mapping individual logical processes in information searching

    Science.gov (United States)

    Smetana, F. O.

    1974-01-01

    An interactive dialog with a computerized information collection was recorded and plotted in the form of a flow chart. The process permits one to identify the logical processes employed in considerable detail and is therefore suggested as a tool for measuring individual thought processes in a variety of situations. A sample of an actual test case is given.
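    The record-and-plot idea above can be sketched as a small recorder that logs each step of a search dialog and emits Graphviz DOT text for a flow chart. The class name, step labels, and linear step-to-step structure are all assumptions for illustration, not the method of the original paper:

```python
class SearchDialogRecorder:
    """Records the steps of an interactive search session and renders them
    as a Graphviz DOT flow chart, one node per logged step, in order."""

    def __init__(self):
        self.steps = []

    def log(self, action, detail):
        """Record one step of the dialog, e.g. a query, a refinement, a view."""
        self.steps.append((action, detail))

    def to_dot(self):
        """Render the logged steps as a linear DOT digraph."""
        lines = ["digraph search {"]
        for i, (action, detail) in enumerate(self.steps):
            lines.append(f'  n{i} [label="{action}: {detail}"];')
        for i in range(1, len(self.steps)):
            lines.append(f"  n{i-1} -> n{i};")
        lines.append("}")
        return "\n".join(lines)

# Example dialog: an initial query, a refinement, then viewing one record.
rec = SearchDialogRecorder()
rec.log("query", "information retrieval")
rec.log("refine", "add keyword aerospace")
rec.log("view", "record 17")
print(rec.to_dot())
```

    The emitted DOT text can be rendered with the `dot` tool to produce the kind of flow chart the record describes; a real analysis would also log branches and backtracking, not just a linear chain.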

  3. Aptitude from an Information Processing Perspective.

    Science.gov (United States)

    McLaughlin, Barry

    An information-processing approach to language learning is examined; language aptitude is factored into the approach, and the role of working memory is discussed. The process of learning includes two processes that make heavy use of working memory: automatization and restructuring. At first, learners must make a conscious effort to remember and…

  4. Mapping one year's design processes at an architecture firm specialized in sustainable architecture- How do sustainability certification systems affect design processes?

    DEFF Research Database (Denmark)

    Landgren, M.; Jensen, Lotte Bjerregaard; Heller, Alfred;

    2016-01-01

    The current study mapped how a Danish architecture firm integrated sustainability in their projects over a year. All the projects concerned were aimed at being sustainable within the framework of the DGNB certification system. The focus of DGNB is equally divided between environmental, economic...... and social aspects. During the mapping process, a picture was drawn of the state of the art for integrating DGNB in design processes and of the challenges involved. Case studies formed the basis of the study and helped substantiate the complexity of integrating DGNB’s criteria as design parameters...

  5. Ion trapping for quantum information processing

    Institute of Scientific and Technical Information of China (English)

    WAN Jin-yin; WANG Yu-zhu; LIU Liang

    2007-01-01

    In this paper we have reviewed recent progress on ion trapping for quantum information processing and quantum computation. We first discuss the basic principles of quantum information theory and then focus on ion trapping for quantum information processing. Many variations, especially the techniques of ion chips, have been investigated since the original ion trap quantum computation scheme was proposed. Full two-dimensional control of multiple ions on an ion chip is promising for the realization of scalable ion trap quantum computation and the implementation of quantum networks.

  6. System Maturity and Architecture Assessment Methods, Processes, and Tools

    Science.gov (United States)

    2012-03-02

    1 For a detailed description of the SRL methodology see Sauser, B., J.E. Ramirez- Marquez , D. Nowicki, A...and Ramirez- Marquez 2009; Magnaye, Sauser et al. 2010). Although there are guidelines and tools to support the assessment process (Nolte, Kennedy...employ these metrics (Tan, Sauser et al. 2011). Graettinger, et al. (Graettinger, Garcia et al. 2002) reports that approaches for readiness level

  7. Clinical Decision Support for Whole Genome Sequence Information Leveraging a Service-Oriented Architecture: a Prototype

    Science.gov (United States)

    Welch, Brandon M.; Rodriguez-Loya, Salvador; Eilbeck, Karen; Kawamoto, Kensaku

    2014-01-01

    Whole genome sequence (WGS) information could soon be routinely available to clinicians to support the personalized care of their patients. At such time, clinical decision support (CDS) integrated into the clinical workflow will likely be necessary to support genome-guided clinical care. Nevertheless, developing CDS capabilities for WGS information presents many unique challenges that need to be overcome for such approaches to be effective. In this manuscript, we describe the development of a prototype CDS system that is capable of providing genome-guided CDS at the point of care and within the clinical workflow. To demonstrate the functionality of this prototype, we implemented a clinical scenario of a hypothetical patient at high risk for Lynch Syndrome based on his genomic information. We demonstrate that this system can effectively use service-oriented architecture principles and standards-based components to deliver point of care CDS for WGS information in real-time. PMID:25954430

  8. Revealed Quantum Information in Weak Interaction Processes

    CERN Document Server

    Hiesmayr, B C

    2014-01-01

    We analyze the achievable limits of the quantum information processing of the weak interaction revealed by hyperons with spin. We find that the weak decay process corresponds to an interferometric device with a fixed visibility and fixed phase difference for each hyperon. Nature chooses rather low visibilities expressing a preference to parity conserving or violating processes (except for the decay $\\Sigma^+\\longrightarrow p \\pi^0$). The decay process can be considered as an open quantum channel that carries the information of the hyperon spin to the angular distribution of the momentum of the daughter particles. We find a simple geometrical information theoretic interpretation of this process: two quantization axes are chosen spontaneously with probabilities $\\frac{1\\pm\\alpha}{2}$ where $\\alpha$ is proportional to the visibility times the real part of the phase shift. Differently stated the weak interaction process corresponds to spin measurements with an imperfect Stern-Gerlach apparatus. Equipped with this...
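    The abstract's geometric picture, in which two quantization axes are chosen with probabilities (1 ± α)/2, can be illustrated numerically. The angular density used below, I(cos θ) = (1 + α P cos θ)/2 for a polarized spin-1/2 hyperon decay, is taken from the general weak-decay literature rather than from this record, and the numerical value of α is illustrative:

```python
import math

def axis_probabilities(alpha):
    """Probabilities of the two spontaneously chosen quantization axes,
    p = (1 + alpha)/2 and (1 - alpha)/2, as in the abstract's picture."""
    return (1 + alpha) / 2, (1 - alpha) / 2

def decay_density(cos_theta, alpha, polarization=1.0):
    """Angular density in cos(theta) for a polarized spin-1/2 hyperon decay,
    (1 + alpha * P * cos_theta) / 2, normalized over cos_theta in [-1, 1]."""
    return (1 + alpha * polarization * cos_theta) / 2

alpha = 0.64  # illustrative decay-asymmetry parameter (roughly Lambda -> p pi-)
p_plus, p_minus = axis_probabilities(alpha)
print(round(p_plus, 2), round(p_minus, 2))  # 0.82 0.18

# Sanity check: the density integrates to 1 over cos(theta) in [-1, 1].
# The midpoint rule is exact here because the density is linear in cos(theta).
n = 100000
integral = sum(decay_density(-1 + (2 * k + 1) / n, alpha) * (2 / n)
               for k in range(n))
print(round(integral, 6))  # 1.0
```

    The two probabilities sum to one by construction, matching the interpretation of the decay as a spin measurement with an imperfect Stern-Gerlach apparatus.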

  9. A scalable healthcare information system based on a service-oriented architecture.

    Science.gov (United States)

    Yang, Tzu-Hsiang; Sun, Yeali S; Lai, Feipei

    2011-06-01

    Many existing healthcare information systems are composed of a number of heterogeneous systems and face the important issue of system scalability. This paper first describes the comprehensive healthcare information systems used in National Taiwan University Hospital (NTUH) and then presents a service-oriented architecture (SOA)-based healthcare information system (HIS) based on the service standard HL7. The proposed architecture focuses on system scalability, in terms of both hardware and software. Moreover, we describe how scalability is implemented in rightsizing, service groups, databases, and hardware scalability. Although SOA-based systems sometimes display poor performance, through a performance evaluation of our HIS based on SOA, the average response time for outpatient, inpatient, and emergency HL7Central systems are 0.035, 0.04, and 0.036 s, respectively. The outpatient, inpatient, and emergency WebUI average response times are 0.79, 1.25, and 0.82 s. The scalability of the rightsizing project and our evaluation results show that the SOA HIS we propose provides evidence that SOA can provide system scalability and sustainability in a highly demanding healthcare information system.

  10. From Blueprint to Digital Model: The Information Age, Archives and the Future of Architectural History*

    Directory of Open Access Journals (Sweden)

    Michelangelo Sabatino

    2012-10-01

    Full Text Available The digital revolution has not only transformed the process of thinking and making architecture, but has also led to shifts for researchers in the field and the institutions that safeguard and interpret evidence of the architect's design process. As the rise of PowerPoint made it less cumbersome to view multiple images simultaneously, pioneering art historian Heinrich Wölfflin's more limited binary lantern slide presentation was effectively rendered obsolete. However, digital imaging and projection in the field brought risks as great as the new freedoms it afforded. The shift from a work environment dominated until recently by drawings on paper and architectural models (even as CAD was being implemented over the last 20 years) to one dominated by digital design and 3D modeling has irrevocably affected the ways contemporary architects produce and save their drawings, as well as how they are stored and accessed in archives, how they are displayed, and how they are published. As technology has brought new horizons to the profession, the image of the architect has gone from the solitary scholar of Medieval architecture depicted by A. W. N. Pugin in 1841 to that of a savvy manager overseeing large firms like Foster + Partners; the historian too has shed the image of a recluse toiling in the bowels of a dusty archive or library.

  11. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can...... the obligation to prepare students to perform in a profession that is largely defined by forces outside that discipline. It will be proposed that the autonomy of architecture can be understood as a unique kind of information: as architecture’s self-reliance or knowledge-about itself. A knowledge...... be transformed and reapplied endlessly through its confrontation with shifting information from outside the realms of architecture. A selection of architects’ statements on their own work will be used to demonstrate how in quite diverse contemporary practices the re-use of existing architectures is applied...

  12. Information processing in the CNS: a supramolecular chemistry?

    Science.gov (United States)

    Tozzi, Arturo

    2015-10-01

    How does the central nervous system process information? Current theories are based on two tenets: (a) information is transmitted by action potentials, the language by which neurons communicate with each other; and (b) homogeneous neuronal assemblies of cortical circuits operate on these neuronal messages, where the operations are characterized by the intrinsic connectivity among neuronal populations. In this view, the size and time course of any spike is stereotypic and the information is restricted to the temporal sequence of the spikes; namely, the "neural code". However, an increasing amount of novel data points towards an alternative hypothesis: (a) the role of the neural code in information processing is overemphasized. Instead of simply passing messages, action potentials play a role in dynamic coordination at multiple spatial and temporal scales, establishing network interactions across several levels of a hierarchical modular architecture, modulating and regulating the propagation of neuronal messages. (b) Information is processed at all levels of neuronal infrastructure, from macromolecules to population dynamics. For example, intra-neuronal factors (changes in protein conformation, concentration and synthesis) and extra-neuronal factors (extracellular proteolysis, substrate patterning, myelin plasticity, microbes, metabolic status) can have a profound effect on neuronal computations. This means molecular message passing may have cognitive connotations. This essay introduces the concept of "supramolecular chemistry", involving the storage of information at the molecular level and its retrieval, transfer and processing at the supramolecular level, through transitory non-covalent molecular processes that are self-organized, self-assembled and dynamic. Finally, we note that the cortex comprises extremely heterogeneous cells, with distinct regional variations, macromolecular assembly, receptor repertoire and intrinsic microcircuitry. This suggests that every neuron (or group of

  13. Occurrence reporting and processing of operations information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  14. Study on integration of geographical information system and real-time control system based on Agent architecture

    Institute of Scientific and Technical Information of China (English)

    王远飞; 叶雷; 何洪林; 张超

    2004-01-01

    Real-time control systems that deal with spatial information will become an important part of artificial control systems in the future. Geographical information systems, as platforms and powerful tools for analyzing and processing spatial data, will play an increasingly important role in the real-time control field. Agent-based architecture, a concept from artificial intelligence, is introduced in this paper. A new intelligent software agent, the spatial-info Agent, was developed; analogous to a central nervous system, it integrates GIS with the traditional real-time control system. The realization model structure of the spatial-info Agent is given as well, and the techniques and integration methods are discussed through the integration of MapInfo and a fiber integration measurement system.

  15. The Historical Foundations: Historical Architectural Treatises as an Information Source on the Architectonic Heritage

    Directory of Open Access Journals (Sweden)

    Fernando Da Casa

    2011-12-01

    Full Text Available In order to address architectural heritage conservation, we must be familiar with the medium with which we will be working, its function and its response to incidents or external actions (natural or anthropogenic), and with how the buildings were conceived and constructed, in order to understand how they will be affected by the intervention process to which they will be subjected and to adopt adequate measures so that these processes will not harm the buildings. An important element is the foundation. This is a fundamental, yet often forgotten, element. It is important to know the history of the foundations, how and why they were constructed, and for this, it is essential to study architectural treatises as the origin of their design. It is surprising to read classical architecture treatises and observe that they do not refer to calculations of dimensions, but to constructive solutions that today may seem clever because they are obvious, but in reality, they do not address the thoughts of the designer or builder. The historic architectural treatises on construction that significantly influenced Spanish construction, which we studied and will present in this article, include Vitruvius and Palladio as well as the developments in the eighteenth and nineteenth centuries, and even into the first half of the twentieth century: Vitruvius (1st century BC), Palladio (1524), Alberti (1582), Cristóbal de Rojas (1598), Fray Laurencio San Nicolás (1639), Brizguz y Bru (1738), Rieger (1763), Fornes y Gurrea (1841), Espinosa (1859), Marcos y Bausá (1879), Ger y Lobez (1898) and Barberot (1927).

  16. On the Influences of Information Architecture on Information Science%论信息构建对情报学的影响

    Institute of Scientific and Technical Information of China (English)

    周晓英

    2003-01-01

    According to its activities, principles and goals, Information Architecture (IA) is a new branch of Information Science. The author discusses the influences of IA on Information Science and probes into the new demands that the emergence of IA has brought to Information Science.

  17. Communications System Architecture Development for Air Traffic Management and Aviation Weather Information Dissemination

    Science.gov (United States)

    Gallagher, Seana; Olson, Matt; Blythe, Doug; Heletz, Jacob; Hamilton, Griff; Kolb, Bill; Homans, Al; Zemrowski, Ken; Decker, Steve; Tegge, Cindy

    2000-01-01

    This document is the NASA AATT Task Order 24 Final Report. NASA Research Task Order 24 calls for the development of eleven distinct task reports. Each task was a necessary exercise in the development of comprehensive communications systems architecture (CSA) for air traffic management and aviation weather information dissemination for 2015, the definition of the interim architecture for 2007, and the transition plan to achieve the desired End State. The eleven tasks are summarized along with the associated Task Order reference. The output of each task was an individual task report. The task reports that make up the main body of this document include Task 5, Task 6, Task 7, Task 8, Task 10, and Task 11. The other tasks provide the supporting detail used in the development of the architecture. These reports are included in the appendices. The detailed user needs, functional communications requirements and engineering requirements associated with Tasks 1, 2, and 3 have been put into a relational database and are provided electronically.

  18. Design and implementation of information acquisition system architecture for multi-sensor robots

    Institute of Scientific and Technical Information of China (English)

    Chen Guoliang; Huang Xinhan; Wang Min

    2007-01-01

    A multi-layer controller architecture based on a digital signal processor (DSP) and an on-chip MCU was proposed for a multi-sensor information acquisition system; it consisted of a data acquisition unit and a data fusion unit, and used a host controller to connect the two units into an integrated system. Compared with the architectures of traditional acquisition systems, this architecture had good openness and good adaptability of algorithms in hardware. To validate its feasibility, a small-scale prototype was designed, which adopted the ADuC812, TMS320F206 and 89C51 as controllers, and had a 16-channel ADC and a 12-channel DAC with a high accuracy of 12 bits. The interfaces between the different controllers were introduced in detail. Some basic parameters of the prototype were obtained by board-level tests and by comparison with two other systems. The prototype was employed to provide on-line state measurement, parameter estimation and decision-making for trajectory tracking of a wheeled mobile robot. Experimental results show that the prototype achieves the goals of data acquisition, fusion and control.

  19. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
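The quantity QUAIL computes can be illustrated with a small sketch. Treating a deterministic program as a channel from a uniformly distributed secret to an observable output, the Shannon leakage is the prior entropy of the secret minus the expected posterior entropy given the output. The `leakage` helper and the parity example below are our own illustration under those assumptions, not QUAIL's API or attacker model:

```python
from collections import defaultdict
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a {value: probability} distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def leakage(secrets, program):
    """Mutual information between a uniform secret and the program's
    observable output: H(secret) - H(secret | output)."""
    prior = {s: 1 / len(secrets) for s in secrets}
    # Group secrets by the output they produce (the attacker's view).
    by_output = defaultdict(list)
    for s in secrets:
        by_output[program(s)].append(s)
    # Expected posterior entropy, averaged over outputs.
    cond = 0.0
    for group in by_output.values():
        p_out = len(group) / len(secrets)
        posterior = {s: 1 / len(group) for s in group}
        cond += p_out * entropy(posterior)
    return entropy(prior) - cond

# A 2-bit secret; the program reveals only its parity, so one bit leaks.
bits_leaked = leakage(range(4), lambda s: s % 2)
```

A program that outputs the secret itself would leak the full 2 bits, and a constant program would leak 0.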

  20. Information processing among high-performance managers

    Directory of Open Access Journals (Sweden)

    S.C. Garcia-Santos

    2010-01-01

    Full Text Available The purpose of this study was to evaluate the information processing of 43 business managers with a professional superior performance. The theoretical framework considers three models: the Theory of Managerial Roles of Henry Mintzberg, the Theory of Information Processing, and Process Model Response to Rorschach by John Exner. The participants have been evaluated by Rorschach method. The results show that these managers are able to collect data, evaluate them and establish rankings properly. At same time, they are capable of being objective and accurate in the problems assessment. This information processing style permits an interpretation of the world around on basis of a very personal and characteristic processing way or cognitive style.

  1. The ToolBus: A Service-oriented Architecture for Language processing Tools

    NARCIS (Netherlands)

    Klint, P.

    2007-01-01

    The paradigm of service orientation creates new opportunities for language-processing tools and Interactive Development Environments. At CWI we have developed the ToolBus, a service-oriented architecture with application areas like software renovation and the implementation of domain-specific languages.

  2. A Design of Pipelined Architecture for on-the-Fly Processing of Big Data Streams

    Directory of Open Access Journals (Sweden)

    Usamah Algemili

    2015-01-01

    Full Text Available Conventional processing infrastructures have been challenged by the huge demand of stream-based applications. The industry responded by introducing traditional stream processing engines along with newly emerging technologies. The ongoing paradigm embraces parallel computing as the most suitable proposition. Pipelining and parallelism have been intensively studied in recent years, yet parallel programming on multiprocessor architectures stands as one of the biggest challenges to the software industry. Parallel computing relies on parallel programs that may encounter internal memory constraints. In addition, parallel computing requires special programming skill sets as well as software conversions. This paper presents a reconfigurable pipelined architecture. The design is aimed especially at Big Data clustering, and it adopts symmetric multiprocessing (SMP) along with a crossbar switch and forced interrupts. The main goal of this promising architecture is to efficiently process big data streams on-the-fly, while it can also process sequential programs on a parallel-pipelined model. The system overcomes the internal memory constraints of multicore architectures by applying forced interrupts and crossbar switching. It reduces the complexity, data dependency, high latency, and cost overhead of parallel computing.
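The paper's SMP/crossbar design is hardware-level, but the basic idea of pipelined on-the-fly stream processing, stages connected by bounded buffers so data flows through without being materialized, can be sketched in a few lines. This is a generic illustration, not the authors' architecture:

```python
import queue
import threading

SENTINEL = object()

def stage(fn, inbox, outbox):
    """One pipeline stage: apply fn to each item until the sentinel arrives."""
    while (item := inbox.get()) is not SENTINEL:
        outbox.put(fn(item))
    outbox.put(SENTINEL)  # propagate shutdown downstream

# Two stages (tokenize, then count) connected by bounded queues; the
# bounded queue stands in for inter-stage buffering and back-pressure.
q1, q2, q3 = (queue.Queue(maxsize=8) for _ in range(3))
threading.Thread(target=stage, args=(str.split, q1, q2)).start()
threading.Thread(target=stage, args=(len, q2, q3)).start()

for line in ["a b c", "d e", "f"]:
    q1.put(line)
q1.put(SENTINEL)

counts = []
while (out := q3.get()) is not SENTINEL:
    counts.append(out)
# counts == [3, 2, 1]
```

Each item is processed as it arrives, so the stream never has to fit in memory at once, which is the "on-the-fly" property the abstract describes.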

  3. A process framework for information security management

    Directory of Open Access Journals (Sweden)

    Knut Haufe

    2016-01-01

    Full Text Available Securing sensitive organizational data has become increasingly vital to organizations. An Information Security Management System (ISMS) is a systematic approach for establishing, implementing, operating, monitoring, reviewing, maintaining and improving an organization's information security. Key elements of the operation of an ISMS are ISMS processes. However, in spite of its importance, an ISMS process framework with a description of ISMS processes and their interaction, as well as their interaction with other management processes, is not available in the literature. Cost-benefit analysis of information security investments regarding single measures protecting information and ISMS processes is not in the focus of current research, which concentrates mostly on economics. This article aims to fill this research gap by proposing such an ISMS process framework as its main contribution, based on a set of ISMS processes agreed upon in existing standards such as the ISO 27000 series, COBIT and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. This framework helps to focus on the operation of the ISMS, instead of focusing on measures and controls. By this, as a main finding, the systemic character of the ISMS, consisting of processes, and the perception of the relevant roles of the ISMS are strengthened.

  5. Critical Awareness in the Era of Globalisation: Lessons for Landscape Architecture from an informal Community in Tijuana, Mexico

    Directory of Open Access Journals (Sweden)

    Kyle Brown

    2007-11-01

    Full Text Available The profession of landscape architecture has undoubtedly benefited from global economic investment, which has spurred development projects involving landscape architects in developing countries around catalysts such as industrial growth and tourism. However, these globalisation trends have also been blamed for various environmental and social ills, and pose substantial risk and uncertainty for the profession. This paper examines the consequences of globalisation, including the impact on informal communities that may not directly benefit from such activities. These consequences are illustrated through literature review as well as a description of a community we have been engaged with in Tijuana, Mexico, a case that is typical of many global trends. We also examine the role of landscape architecture practice and education in this globalisation process, arguing that greater consideration is warranted of the profession's role in maintaining or transforming existing social structures that are conducive to inequities and injustices. We argue that critical awareness of a given situation is essential for landscape architects to facilitate social transformation, and we outline a strategy used in Tijuana to gain critical awareness and to dialogue effectively with informal communities.

  6. Multilevel and Hybrid Architecture for Device Abstraction and Context Information Management in Smart Home Environments

    Science.gov (United States)

    Peláez, Víctor; González, Roberto; San Martín, Luis Ángel; Campos, Antonio; Lobato, Vanesa

    Hardware device management, and context information acquisition and abstraction are key factors to develop the ambient intelligent paradigm in smart homes. This work presents an architecture that addresses these two problems and provides a usable framework to develop applications easily. In contrast to other proposals, this work addresses performance issues specifically. Results show that the execution performance of the developed prototype is suitable for deployment in a real environment. In addition, the modular design of the system allows the user to develop applications using different techniques and different levels of abstraction.

  7. Design of Enterprise and Web Application Architecture for Secure Information System

    Directory of Open Access Journals (Sweden)

    Dr. Banta Singh Jangra

    2012-04-01

    Full Text Available Tampering with systems can cause huge damage, and hence it becomes extremely critical to understand all aspects and avenues of security threats as well as the possible solutions to safeguard and secure the information flow. A Security Architecture Blueprint is a must to bring focus to the key areas of concern for the enterprise, highlighting decision criteria and context for each domain. Security services provide confidentiality, integrity, and availability for the platform. Security services are implemented as protection services, such as authentication and authorization; detection services, such as monitoring and auditing; and response services, such as incident response and forensics.

  8. ENERGETIC CHARGE OF AN INFORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    Popova T.M.

    2009-12-01

    Full Text Available Main laws of technical thermodynamics are universal and could be applied to processes other than thermodynamic ones. The results of the comparison of peculiarities of irreversible informational and thermodynamic processes are presented in the article and a new term “Infopy” is used. A more precise definition of “infopy” as an energetic charge is given in the article.

  9. Information Processing Approaches to Studying Spelling Deficiencies.

    Science.gov (United States)

    Gerber, Michael M.; Hall, Robert J.

    1987-01-01

    The article explores information processing models of spelling performance and argues that an adequate theory of spelling processes must include: (1) qualitative changes in performance as a function of maturation that underlie development of automaticity; (2) transactional development of spelling-related knowledge structures and efficient…

  10. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  11. An Internet of Things Generic Reference Architecture

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.; Riaz, Tahir; Madsen, Ole Brun

    2013-01-01

    , and keeping track of all these things for monitoring and controlling some information. IoT architecture is studied from the software architecture, overall system architecture and network architecture points of view. The paper puts forward the requirements of the software architecture along with its component...... and deployment diagram, and its process and interface diagram, at an abstract level. The paper proposes abstract generic IoT reference and concrete abstract generic IoT reference architectures. Network architecture is also presented as a state of the art. The paper briefly gives overviews of the protocols used for IoT. Some...

  12. Cellular Automata as a learning process in Architecture and Urban design

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Foged, Isak Worre

    2014-01-01

    design approach on a master-level urban design studio, this paper will discuss the strategies for dealing with complexity at an urban scale as well as the pedagogical considerations behind applying computational tools and methods to an urban design education.......This paper explores the application of cellular automata as a method for investigating the dynamic parameters and interrelationships that constitute urban space. With increasing aspects needed for integration during the architectural and urban design process with the relations between....... An architectural methodological response to this situation is presented through the development of a conceptual computational design system that allows these dynamics to unfold and to be observed for architectural design decision-taking. Reflecting on the development and implementation of a cellular automata based...
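The mechanics of a cellular automaton, local rules applied synchronously to produce global dynamics, can be shown with a toy one-dimensional "urban growth" rule. The rule below is our own illustration of the technique, not the studio's actual model:

```python
def step(cells, rule):
    """One synchronous update of a 1-D cellular automaton with fixed
    (empty) boundaries; rule maps (left, self, right) -> new state."""
    padded = [0] + cells + [0]
    return [rule(padded[i - 1], padded[i], padded[i + 1])
            for i in range(1, len(padded) - 1)]

# Toy growth rule: an empty plot (0) develops (1) once any
# neighbouring plot is already developed.
growth = lambda l, c, r: 1 if (l == 1 or c == 1 or r == 1) else 0

city = [0, 0, 1, 0, 0]       # a single developed plot in the middle
for _ in range(2):
    city = step(city, growth)
# development spreads outward one plot per generation
```

Even this trivial rule shows the pedagogical point of the paper: global patterns (here, a spreading front of development) emerge from purely local interactions that a designer can observe and adjust.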

  13. Systematic information processing style and perseverative worry.

    Science.gov (United States)

    Dash, Suzanne R; Meeten, Frances; Davey, Graham C L

    2013-12-01

    This review examines the theoretical rationale for conceiving of systematic information processing as a proximal mechanism for perseverative worry. Systematic processing is characterised by detailed, analytical thought about issue-relevant information, and in this way, is similar to the persistent, detailed processing of information that typifies perseverative worry. We review the key features and determinants of systematic processing, and examine the application of systematic processing to perseverative worry. We argue that systematic processing is a mechanism involved in perseverative worry because (1) systematic processing is more likely to be deployed when individuals feel that they have not reached a satisfactory level of confidence in their judgement and this is similar to the worrier's striving to feel adequately prepared, to have considered every possible negative outcome/detect all potential danger, and to be sure that they will successfully cope with perceived future problems; (2) systematic processing and worry are influenced by similar psychological cognitive states and appraisals; and (3) the functional neuroanatomy underlying systematic processing is located in the same brain regions that are activated during worrying. This proposed mechanism is derived from core psychological processes and offers a number of clinical implications, including the identification of psychological states and appraisals that may benefit from therapeutic interventions for worry-based problems.

  14. A software architecture for multi-cellular system simulations on graphics processing units.

    Science.gov (United States)

    Jeannin-Girardon, Anne; Ballet, Pascal; Rodin, Vincent

    2013-09-01

    The first aim of simulation in a virtual environment is to help biologists gain a better understanding of the simulated system. The cost of such simulation is significantly reduced compared to that of in vivo experimentation. However, the inherent complexity of biological systems makes them hard to simulate on non-parallel architectures: models might be made of sub-models and take several scales into account, and the number of simulated entities may be quite large. Today, graphics cards are used for general-purpose computing, which has been made easier thanks to frameworks like CUDA or OpenCL. Parallelization of models may, however, not be easy: parallel programming skills are often required, and several hardware architectures may be used to execute models. In this paper, we present the software architecture we built in order to implement various models able to simulate multi-cellular systems. This architecture is modular and implements data structures adapted to graphics processing unit architectures. It allows efficient simulation of biological mechanisms.

  15. Science-driven system architecture: A new process for leadership class computing

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-10-19

    Over the past several years, computational scientists have observed a frustrating trend of stagnating application performance despite dramatic increases in the peak performance of high performance computers. In 2002, researchers at Lawrence Berkeley National Laboratory, Argonne National Laboratory, and IBM proposed a new process to reverse this situation [1]. This strategy rests on new types of development partnerships with computer vendors, built around the concept of science-driven computer system design, and engages applications scientists well before an architecture is available for commercialization. The process is already producing results, and has further potential for dramatically improving system efficiency. This paper documents the progress to date and the potential for future benefits. An example of this process is discussed, using the IBM Power architecture with a computer architecture design that can lead to a sustained performance of 50 to 100 Tflop/s on a broad spectrum of applications in 2006 for a reasonable cost. This partnership will establish a collaborative approach to modifying computer architecture to enable heretofore unrealized achievements in computer capability-limited fields such as nanoscience, combustion modeling, fusion, climate modeling, and astrophysics.

  16. [Building Process and Architectural Planning Characteristics of Daehan Hospital Main Building].

    Science.gov (United States)

    Lee, Geauchul

    2016-04-01

    This paper explores the process by which Daehan Hospital was introduced from Japan as a modern medical facility in Korea, and its architectural planning characteristics as a medical facility, through the detailed building process of the Daehan Hospital main building. The most noticeable characteristic of Daehan Hospital is that it was designed and constructed not by Korean engineers but by Japanese engineers. Daehan Hospital was therefore influenced by early modern Japanese medical facilities, and the Japanese engineers modeled the Daehan Hospital main building on the Tokyo Medical School main building, which was constructed in 1876 as the first national medical school and hospital. The architectural type of the Tokyo Medical School main building was typical of school architecture in the early Japanese modern period, with a middle corridor and a pseudo Western-style tower, but the building became the model for a medical facility as the symbol of the medical department of Tokyo Imperial University. This was the introduction and transplantation of a Japanese modern 'model', as with other modern systems and technologies during the Korean modern transition period. However, unlike the Tokyo Medical School main building, the Daehan Hospital main building was constructed not as a wooden building but as a masonry one. Compared with its function, its architectural form and construction cost were on an excessive scale, because the Japanese Resident-General of Korea intended an ostentatious display of Japanese modernity as superior to the Korean Empire.

  17. Decreasing Data Analytics Time: Hybrid Architecture MapReduce-Massive Parallel Processing for a Smart Grid

    Directory of Open Access Journals (Sweden)

    Abdeslam Mehenni

    2017-03-01

    Full Text Available As our populations grow in a world of limited resources, enterprises seek ways to lighten our load on the planet. The idea of modifying consumer behavior appears as a foundation for smart grids. Enterprises demonstrate the value available from deep analysis of electricity consumption histories, consumers' messages, outage alerts, etc., mining massive structured and unstructured data. In a nutshell, smart grids result in a flood of data that needs to be analyzed in order to better adjust to demand and give customers more ability to delve into their power consumption. Simply put, smart grids will increasingly have a flexible data warehouse attached to them. The key driver for the adoption of data management strategies is clearly the need to handle and analyze the large amounts of information utilities are now faced with. New approaches to data integration are gaining momentum; Hadoop is in fact now being used by utilities to help manage the huge growth in data whilst maintaining the coherence of the data warehouse. In this paper we define a new Meter Data Management System (MDMS) architecture repository that differs from the three leading MDMSs, in which we use the MapReduce programming model for ETL and a parallel DBMS for query statements (Massive Parallel Processing, MPP).
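The MapReduce half of the proposed hybrid can be sketched in miniature: a map phase emits key-value pairs from raw records, and a reduce phase folds each key's group. The meter-reading record format and helper names below are assumptions for illustration, not the paper's implementation:

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Emit (key, value) pairs from raw records."""
    for record in records:
        yield from mapper(record)

def reduce_phase(pairs, reducer):
    """Group pairs by key, then fold each group with the reducer."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: reducer(values) for key, values in groups.items()}

# Hypothetical raw smart-meter lines: "meter_id,kWh"
lines = ["m1,2.0", "m2,1.5", "m1,3.0"]
mapper = lambda line: [(line.split(",")[0], float(line.split(",")[1]))]
totals = reduce_phase(map_phase(lines, mapper), sum)
# totals == {"m1": 5.0, "m2": 1.5}
```

In the paper's division of labor, an ETL job of this shape would run on Hadoop over raw meter data, while analytical queries over the cleaned result would go to the MPP database.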

  18. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  19. A third generation object-oriented process model:roles and architectures in focus

    OpenAIRE

    Kivistö, K. (Kari)

    2000-01-01

    Abstract This thesis examines and evaluates the Object-Oriented Client/Server (OOCS) model, a process model that can be used when IT organizations develop object-oriented client/server applications. In particular, it defines the roles in the development team and combines them into the process model. Furthermore, the model focuses on the client/server architecture, considering it explicitly. The model has been under construction for several years and it has been test...

  20. PHYSICAL RESOURCES OF INFORMATION PROCESSES AND TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Mikhail O. Kolbanev

    2014-11-01

    Full Text Available Subject of study. The paper describes basic information technologies for automating the information processes of data storage, distribution and processing in terms of the required physical resources. It is shown that studying these processes with such traditional objectives of modern computer science as the ability to transfer knowledge, the degree of automation, information security, coding, reliability, and others is not enough. The reasons are, on the one hand, the increase in the volume and intensity of information exchange in all subjects of human activity and, on the other hand, the approach to the efficiency limit of information systems based on semiconductor technologies. The creation of technologies which not only support information interaction but also consume a rational amount of physical resources has become an actual problem of modern engineering development. Thus, the basic information technologies for storage, distribution and processing of information to support interaction between people are the object of study, and the physical temporal, spatial and energy resources required for the implementation of these technologies are the subject of study. Approaches. An attempt is made to enlarge the possibilities of the traditional cybernetics methodology, which replaces consideration of the material component of information with a search over the states of information objects, by explicitly taking into account the amount of physical resources required for changes in the states of information media. Purpose of study. The paper works out a common approach to the comparison and subsequent selection of basic information technologies for storage, distribution and processing of data, taking into account not only the requirements for the quality of information exchange in a particular subject area and the degree of technology application, but also the amounts of physical resources consumed. Main findings. Classification of resources

  1. Continuous-variable quantum information processing

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Leuchs, G.; Silberhorn, C.

    2010-01-01

    Observables of quantum systems can possess either a discrete or a continuous spectrum. For example, upon measurements of the photon number of a light state, discrete outcomes will result, whereas measurements of the light's quadrature amplitudes result in continuous outcomes. If one uses the continuous degree of freedom of a quantum system for encoding, processing or detecting information, one enters the field of continuous-variable (CV) quantum information processing. In this paper we review the basic principles of CV quantum information processing with main focus on recent developments in the field. We will be addressing the three main stages of a quantum information system: the preparation stage, where quantum information is encoded into CVs of coherent states and single-photon states; the processing stage, where CV information is manipulated to carry out a specified protocol; and a detection...
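For reference, the quadrature amplitudes mentioned above are, in one standard convention (with ħ = 1), the two conjugate observables of a field mode with annihilation operator â:

```latex
\hat{x} = \frac{1}{\sqrt{2}}\left(\hat{a} + \hat{a}^{\dagger}\right), \qquad
\hat{p} = \frac{1}{i\sqrt{2}}\left(\hat{a} - \hat{a}^{\dagger}\right), \qquad
[\hat{x}, \hat{p}] = i
```

Both operators have continuous spectra, which is precisely what makes them suitable carriers of continuous-variable information, in contrast to the discrete photon-number observable.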

  2. Architecture and Behavior Design for Information Agents%信息Agent的体系结构与行为设计

    Institute of Scientific and Technical Information of China (English)

    郑淑丽; 杨敬安; 骆祥峰

    2002-01-01

    With the rapid development of the Internet, information Agents have aroused great interest owing to their potential power. Various information Agents have been built for different targets and implemented in different ways. To facilitate the open-system interoperability of autonomous Agents and reduce the programming load on developers, we need to specify a reusable architecture that supports reusable behaviors for information Agents. First, the function overview and the basic architecture of the information Agent are stated, and then the most important modules, the schedule system and the local DBMS, are discussed. Finally, five reusable behaviors are presented.

  3. Machine Process Capability Information Through Six Sigma

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.F.

    1998-03-13

    A project investigating details concerning machine process capability information and its accessibility has been conducted. The thesis of the project proposed designing a part (denoted a machine capability workpiece) based on the major machining features of a given machine. Parts are machined and measured to gather representative short-term production variation. The information is utilized to predict the expected defect rate, expressed in terms of a composite sigma-level process capability index, for a production part. Presently, decisions concerning process planning, particularly which machine will statistically produce the minimum number of defects based on machined features and associated tolerances, are rarely made. Six Sigma tools and methodology were employed to conduct this investigation at AlliedSignal FM and T. Tools such as the thought process map, factor relationship diagrams, and components of variance were used. This study is progressing toward completion. This research study was an example of how machine process capability information may be gathered for milling planar (horizontal) faces and slot features. The planning method used to determine where and how to gather variation for the part to be designed is known as factor relationship diagramming. Components of variance are then applied to the gathered data to arrive at the contributing level of variation illustrated within the factor relationship diagram. The idea of using this capability information beyond process planning in other business enterprise operations is proposed.
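The record does not give the paper's exact composite index, but the conventional Six Sigma conversion from a sigma level to an expected defect rate, including the customary 1.5-sigma long-term mean shift, can be sketched as follows:

```python
from statistics import NormalDist

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a given sigma level,
    using the conventional 1.5-sigma long-term mean shift."""
    z = NormalDist()
    return 1e6 * (1 - z.cdf(sigma_level - shift))

# The textbook Six Sigma figure: roughly 3.4 defects per million.
six_sigma_defects = dpmo(6.0)
```

Running such a conversion over the measured short-term variation of each machine is what would let a process planner pick the machine with the statistically lowest expected defect count for a given tolerance.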

  4. A Multi-Objective Compounded Local Mobile Cloud Architecture Using Priority Queues to Process Multiple Jobs

    Science.gov (United States)

    Wei, Xiaohui; Sun, Bingyi; Cui, Jiaxu; Xu, Gaochao

    2016-01-01

    As a result of the greatly increased use of mobile devices, the disadvantages of portable devices have gradually begun to emerge. To address these problems, mobile cloud computing assisted by cloud data centers has been proposed; however, cloud data centers are usually far from the mobile requesters. In this paper, we propose an improved multi-objective local mobile cloud model: Compounded Local Mobile Cloud Architecture with Dynamic Priority Queues (LMCpri). This new architecture briefly stores jobs that arrive simultaneously at the cloudlet in different priority positions according to the result of auction processing, and then executes partitioned tasks on capable helpers. In the Scheduling Module, NSGA-II is employed as the scheduling algorithm to shorten processing time and decrease requester cost relative to PSO and sequential scheduling. The simulation results show that setting the number of iterations to 30 is the best choice for the system. In addition, compared with LMCque, LMCpri can effectively accommodate a requester who would like his job to be executed in advance, and it shortens execution time. Finally, we conduct a comparative experiment between LMCpri and a cloud-assisting architecture; the results reveal that LMCpri performs better than the cloud-assisting architecture. PMID:27419854
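
The priority-position queueing that LMCpri applies to simultaneously arriving jobs can be illustrated with an ordinary priority queue. In this sketch the priorities are assumed to come from the auction step, which is not modeled here, and all names are illustrative rather than the paper's API:

```python
import heapq
import itertools

class PriorityJobQueue:
    """Minimal sketch of the LMCpri queueing idea: jobs that arrive
    together are stored at different priority positions (priorities are
    assumed to be the outcome of an auction, not modeled here)."""

    def __init__(self):
        self._heap = []
        self._tie = itertools.count()  # FIFO order among equal priorities

    def submit(self, job_id, priority):
        # heapq is a min-heap, so negate: higher auction bid -> served first
        heapq.heappush(self._heap, (-priority, next(self._tie), job_id))

    def next_job(self):
        return heapq.heappop(self._heap)[2]

q = PriorityJobQueue()
for job, bid in [("render", 2), ("backup", 1), ("voice", 5)]:
    q.submit(job, bid)
order = [q.next_job() for _ in range(3)]
print(order)  # highest-bid job dispatched first
```

A requester who wants a job executed in advance simply wins a higher priority position in the auction, which in this sketch is just a larger `bid`.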

  5. A Multi-Objective Compounded Local Mobile Cloud Architecture Using Priority Queues to Process Multiple Jobs.

    Directory of Open Access Journals (Sweden)

    Xiaohui Wei

    Full Text Available As a result of the greatly increased use of mobile devices, the disadvantages of portable devices have gradually begun to emerge. To address these problems, mobile cloud computing assisted by cloud data centers has been proposed; however, cloud data centers are usually far from the mobile requesters. In this paper, we propose an improved multi-objective local mobile cloud model: Compounded Local Mobile Cloud Architecture with Dynamic Priority Queues (LMCpri). This new architecture briefly stores jobs that arrive simultaneously at the cloudlet in different priority positions according to the result of auction processing, and then executes partitioned tasks on capable helpers. In the Scheduling Module, NSGA-II is employed as the scheduling algorithm to shorten processing time and decrease requester cost relative to PSO and sequential scheduling. The simulation results show that setting the number of iterations to 30 is the best choice for the system. In addition, compared with LMCque, LMCpri can effectively accommodate a requester who would like his job to be executed in advance, and it shortens execution time. Finally, we conduct a comparative experiment between LMCpri and a cloud-assisting architecture; the results reveal that LMCpri performs better than the cloud-assisting architecture.

  7. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

    Full Text Available The information system (IS change management and governance, according to the best practices, are defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT etc.. These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control by integrating the following significant resource management areas – information technology (IT governance, change management and enterprise architecture (EA change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on re-use and fusion of principles used by related methodologies as well as on empirical observations about typical IS change management mistakes in enterprises.

  8. Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics.

    Science.gov (United States)

    Sadler, Brian M; Hoyos, Sebastian

    2014-01-01

    The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications, and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we examine trends in the combination of analog and digital (mixed-signal) processing and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control.
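
The generalized sampling idea, projecting the input onto a parallel bank of analog basis functions and reconstructing on the digital side, can be sketched numerically. The cosine basis and branch count below are illustrative choices, not the architecture's actual analog front end:

```python
import numpy as np

def cosine_basis(num_branches, num_samples):
    """Orthonormal DCT-II rows, standing in for the parallel analog basis."""
    n = np.arange(num_samples)
    basis = np.array([np.cos(np.pi * k * (2 * n + 1) / (2 * num_samples))
                      for k in range(num_branches)])
    basis[0] /= np.sqrt(2)
    return basis * np.sqrt(2 / num_samples)

N, K = 64, 64                      # sample count and number of branches
phi = cosine_basis(K, N)
signal = np.sin(2 * np.pi * 3 * np.arange(N) / N)
coeffs = phi @ signal              # each branch integrates one coefficient
reconstructed = phi.T @ coeffs     # digital-side synthesis from the branches
print(np.allclose(reconstructed, signal))
```

With a full orthonormal basis the reconstruction is exact; the architecture's adaptability comes from choosing fewer or different branches for a given task, which this sketch does not model.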

  9. Interaction between Task Oriented and Affective Information Processing in Cognitive Robotics

    Science.gov (United States)

    Haazebroek, Pascal; van Dantzig, Saskia; Hommel, Bernhard

    There is an increasing interest in endowing robots with emotions. Robot control however is still often very task oriented. We present a cognitive architecture that allows the combination of and interaction between task representations and affective information processing. Our model is validated by comparing simulation results with empirical data from experimental psychology.

  10. Information Processing Structure of Quantum Gravity

    Science.gov (United States)

    Gyongyosi, Laszlo; Imre, Sandor

    2014-05-01

    The theory of quantum gravity aims to fuse general relativity with quantum theory into a more fundamental framework. Quantum gravity provides both the non-fixed causality of general relativity and the quantum uncertainty of quantum mechanics. In a quantum gravity scenario, the causal structure is indefinite and the processes are causally non-separable. We provide a model for the information processing structure of quantum gravity. We show that the quantum gravity environment is an information resource pool from which valuable information can be extracted. We analyze the structure of the quantum gravity space and the entanglement of the space-time geometry. We study the information transfer capabilities of the quantum gravity space and define the quantum gravity channel. We characterize the information transfer of the gravity space and the correlation measure functions of the gravity channel. We investigate the process of stimulated storage for quantum gravity memories, a phenomenon that exploits the information resource-pool property of quantum gravity. The results confirm that the benefits of the quantum gravity space can be exploited in quantum computations, particularly in the development of quantum computers. This work was supported by the grant COST Action MP1006.

  11. A flexible software architecture for scalable real-time image and video processing applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of messages. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
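
The topic-based publish/subscribe filtering described for the messaging layer can be sketched in a few lines. The class and method names below are illustrative, not the architecture's actual API:

```python
from collections import defaultdict

class MessageBus:
    """Sketch of topic-based filtering: publishers post to a topic, and
    only subscribers registered for that topic receive the message."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Route the message only to callbacks subscribed to this topic.
        for callback in self._subscribers[topic]:
            callback(message)

received = []
bus = MessageBus()
bus.subscribe("frames/raw", received.append)
bus.publish("frames/raw", "frame-001")
bus.publish("frames/stats", "ignored")  # no subscriber for this topic
print(received)  # ['frame-001']
```

Decoupling publishers from subscribers this way is what lets acquisition, visualization, and data-processing modules be composed without knowing about each other, as the abstract describes.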

  12. Introduction to quantum physics and information processing

    CERN Document Server

    Vathsan, Radhika

    2016-01-01

    An Elementary Guide to the State of the Art in the Quantum Information Field. Introduction to Quantum Physics and Information Processing guides beginners in understanding the current state of research in the novel, interdisciplinary area of quantum information. Suitable for undergraduate and beginning graduate students in physics, mathematics, or engineering, the book goes deep into issues of quantum theory without raising the technical level too much. The text begins with the basics of quantum mechanics required to understand how two-level systems are used as qubits. It goes on to show how quant

  13. Combining Genome-Wide Information with a Functional Structural Plant Model to Simulate 1-Year-Old Apple Tree Architecture

    Science.gov (United States)

    Migault, Vincent; Pallas, Benoît; Costes, Evelyne

    2017-01-01

    In crops, optimizing target traits in breeding programs can be fostered by selecting appropriate combinations of architectural traits, which determine light interception and carbon acquisition. In apple tree, architectural traits were observed to be under genetic control. However, architectural traits also result from many organogenetic and morphological processes interacting with the environment. The present study aimed at combining an FSPM built for apple tree, MAppleT, with the genetic determinisms of architectural traits previously described in a bi-parental population. We focused on parameters related to organogenesis (phyllochron and immediate branching) and morphogenesis (internode length and leaf area) during the first year of tree growth. Two independent datasets, collected in 2004 and 2007 on 116 genotypes issued from a ‘Starkrimson’ × ‘Granny Smith’ cross, were used. The phyllochron was estimated as a function of thermal time, and sylleptic branching was modeled subsequently depending on phyllochron. From a genetic map built with SNPs, marker effects were estimated on four MAppleT parameters with rrBLUP, using the 2007 data. These effects were then considered in MAppleT to simulate tree development under the two climatic conditions. The genome-wide prediction model gave consistent estimates of parameter values, with correlation coefficients between observed values and values estimated from SNP markers ranging from 0.79 to 0.96. However, the accuracy of the prediction model under cross-validation schemes was lower. Three integrative traits (the number of leaves, trunk length, and number of sylleptic laterals) were considered for validating MAppleT simulations. Under the 2007 climatic conditions, simulated values were close to observations, highlighting the correct simulation of genetic variability. However, under the 2004 conditions, which were not used for model calibration, the simulations differed from observations. This study demonstrates the possibility

  14. Department of Defense Technical Architecture Framework for Information Management. Volume 1-8: Overview. Version 3.0.

    Science.gov (United States)

    1996-04-30

    X/Open Portability Guide (XPG), an open environment based on standards; X/Open also brands products. Volume 4: DoD Standards-Based Architecture, Version 3.0. Interface Style Guide, 30 April 1996. Nes, F. 1986. "Space, Color, and Typography on Visual Display Terminals." Behavior and Information Technology 5(2

  15. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  16. Symposium on Information Processing in Organizations.

    Science.gov (United States)

    1982-04-01

    International, July 1981. The Maximization Process under Uncertainty, Richard M. Cyert and Morris H. DeGroot: the theory of the firm has been ... Richard Cyert and Morris DeGroot, C-MU: "The maximization process under uncertainty"; Peter Keen, MIT: "Information systems in organizations"; Patrick Larkey and ... uncertainty have been particularly fruitful. Some of these areas are oligopoly (Friedman, 1970; Shubik, 1959) and statistical decision theory (DeGroot, 1970)

  17. Terminal chaos for information processing in neurodynamics.

    Science.gov (United States)

    Zak, M

    1991-01-01

    A new nonlinear phenomenon, terminal chaos, caused by failure of the Lipschitz condition at equilibrium points of dynamical systems, is introduced. It is shown that terminal chaos has a well-organized probabilistic structure which can be predicted and controlled. This provides an opportunity to exploit the phenomenon for information processing. It appears that chaotic states of neuron activity are associated with higher-level cognitive processes such as generalization and abstraction.

  18. A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes

    Science.gov (United States)

    Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw

    2004-01-01

    There are continuous needs for engineering organizations to improve their design process. Current state of the art techniques use computational simulations to predict design performance, and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at an organization level beyond individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just for an individual. The architecture takes advantage of state of the art capabilities to produce a Web based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study for implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. Another example for rolling-out a design process for Design for Six Sigma is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.

  19. High-level specification of a proposed information architecture for support of a bioterrorism early-warning system.

    Science.gov (United States)

    Berkowitz, Murray R

    2013-01-01

    Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.

  20. Harnessing the Risk-Related Data Supply Chain: An Information Architecture Approach to Enriching Human System Research and Operations Knowledge

    Science.gov (United States)

    Buquo, Lynn E.; Johnson-Throop, Kathy A.

    2011-01-01

    An Information Architecture facilitates the understanding and, hence, harnessing of the human system risk-related data supply chain which enhances the ability to securely collect, integrate, and share data assets that improve human system research and operations. By mapping the risk-related data flow from raw data to useable information and knowledge (think of it as a data supply chain), the Human Research Program (HRP) and Space Life Science Directorate (SLSD) are building an information architecture plan to leverage their existing, and often shared, IT infrastructure.

  1. Planar ion chip design for scalable quantum information processing

    Institute of Scientific and Technical Information of China (English)

    Wan Jin-Yin; Wang Yu-Zhu; Liu Liang

    2008-01-01

    We investigate a planar ion chip design with a two-dimensional array of linear ion traps for scalable quantum information processing. Qubits are formed from the internal electronic states of trapped 40Ca+ ions. The segmented electrodes reside in a single plane on a substrate and a grounded metal plate, respectively; a combination of appropriate RF and DC potentials is applied to them for stable ion confinement. Every two adjacent electrodes can generate a linear ion trap in and between the electrodes above the chip, at a distance dependent on the geometrical scale and other considerations. The potential distributions are calculated qualitatively using a static electric field. This architecture provides a conceptually simple avenue toward microfabrication and large-scale quantum computation based on arrays of trapped ions.

  2. An Ontology Driven Information Architecture for Big Data and Diverse Domains

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Hardman, Sean; Joyner, Ron; Ramirez, Paul

    2013-04-01

    The Planetary Data System has just released the PDS4 system for first use. Its architecture comprises three principal parts: an ontology that captures knowledge from the planetary science domain; a federated registry/repository system for product identification, versioning, tracking, and storage; and a REST-based service layer for search, retrieval, and distribution. An ontology modeling tool is used to prescriptively capture product definitions that adhere to object-oriented principles and are compliant with specific registry, archive, and data dictionary reference models. The resulting information model is product-centric, allowing all information to be packaged into products and tracked in the registry. The flexibility required in a diverse domain is provided through object-oriented extensions and a hierarchical governance scheme with common, discipline, and mission levels. All PDS4 data standards are generated or derived from the information model. The federated registry provides identification, versioning, and tracking functionality across federated repositories and is deployed using configuration files generated from the ontology. The REST-based service layer provides metadata harvest, product transformation, packaging, search, and portal hosting. This model-driven architecture allows the data and software engineering teams to develop in parallel with minimal interaction, and the resulting software remains relatively stable as the domain evolves. Finally, the development of a single shared ontology promotes interoperability and data correlation and helps meet the expectations of modern scientists for science data discovery, access, and use. This presentation provides an overview of PDS4, focusing on the data standards, how they were developed, and how they are now being used, and presents some of the lessons learned while developing in a diverse scientific community. Copyright 2013 California

  3. A Unified Computational Architecture for Preprocessing Visual Information in Space and Time.

    Science.gov (United States)

    Skrzypek, Josef

    1986-06-01

    The success of autonomous mobile robots depends on the ability to understand continuously changing scenery. Present techniques for analysis of images are not always suitable because in sequential paradigm, computation of visual functions based on absolute values of stimuli is inefficient. Important aspects of visual information are encoded in discontinuities of intensity, hence a representation in terms of relative values seems advantageous. We present the computing architecture of a massively parallel vision module which optimizes the detection of relative intensity changes in space and time. Visual information must remain constant despite variation in ambient light level or velocity of target and robot. Constancy can be achieved by normalizing motion and lightness scales. In both cases, basic computation involves a comparison of the center pixels with the context of surrounding values. Therefore, a similar computing architecture, composed of three functionally-different and hierarchically-arranged layers of overlapping operators, can be used for two integrated parts of the module. The first part maintains high sensitivity to spatial changes by reducing noise and normalizing the lightness scale. The result is used by the second part to maintain high sensitivity to temporal discontinuities and to compute relative motion information. Simulation results show that response of the module is proportional to contrast of the stimulus and remains constant over the whole domain of intensity. It is also proportional to velocity of motion limited to any small portion of the visual field. Uniform motion throughout the visual field results in constant response, independent of velocity. Spatial and temporal intensity changes are enhanced because computationally, the module resembles the behavior of a DOG function.
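
The center-surround comparison described above is commonly modeled with a difference-of-Gaussians (DOG) operator. The 1-D sketch below illustrates how dividing the center-surround difference by the local surround value yields a response that depends on contrast rather than absolute intensity; the kernel sizes and normalization are illustrative choices, not the module's actual three-layer design:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def dog_response(signal, sigma_c=1.0, sigma_s=3.0, size=15):
    """1-D difference of Gaussians: narrow center minus broad surround,
    normalized by the local surround so the output tracks contrast,
    not absolute intensity."""
    center = np.convolve(signal, gaussian_kernel(size, sigma_c), mode="same")
    surround = np.convolve(signal, gaussian_kernel(size, sigma_s), mode="same")
    return (center - surround) / (surround + 1e-9)

# The same step edge at two ambient light levels: after normalization the
# responses match, illustrating lightness constancy.
edge_dim = np.r_[np.full(50, 10.0), np.full(50, 20.0)]
edge_bright = 10.0 * edge_dim
r_dim = dog_response(edge_dim)
r_bright = dog_response(edge_bright)
print(np.allclose(r_dim, r_bright))
```

The response is near zero over uniform regions and peaks at the intensity discontinuity, which is exactly the "relative values" representation the abstract argues for.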

  4. Information Processing in Auto-regulated Systems

    Directory of Open Access Journals (Sweden)

    Karl Javorszky

    2003-06-01

    Full Text Available We present a model of information processing based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective vs. objective, diversity vs. similarity, individual vs. collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out that which is the case from among the alternatives that could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion; a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within information processing and the theory of language.

  5. A process Approach to Information Services: Information Search Process (ISP Model

    Directory of Open Access Journals (Sweden)

    Hamid Keshavarz

    2010-12-01

    Full Text Available Information seeking is a behavior emerging from the interaction between an information seeker and an information system; it should be regarded as an episodic process, so as to meet the information needs of users, with different roles taken at its different stages. The present article introduces a process approach to information services in libraries using Carol Collier Kuhlthau's model. In this model, information seeking is regarded as a process consisting of six stages, in each of which users have different thoughts, feelings, and actions, and librarians correspondingly take different roles at each stage. These six stages are derived from constructivist learning theory based on the uncertainty principle. Despite some acceptable shortcomings, this model may be regarded as a new solution for rendering modern information services in libraries, especially in relation to new information environments and media.

  6. Processing Of Visual Information In Primate Brains

    Science.gov (United States)

    Anderson, Charles H.; Van Essen, David C.

    1991-01-01

    Report reviews and analyzes information-processing strategies and pathways in primate retina and visual cortex. Of interest both in biological fields and in such related computational fields as artificial neural networks. Focuses on data from macaque, which has superb visual system similar to that of humans. Authors stress concept of "good engineering" in understanding visual system.

  7. Introduction: Natural Language Processing and Information Retrieval.

    Science.gov (United States)

    Smeaton, Alan F.

    1990-01-01

    Discussion of research into information and text retrieval problems highlights the work with automatic natural language processing (NLP) that is reported in this issue. Topics discussed include the occurrences of nominal compounds; anaphoric references; discontinuous language constructs; automatic back-of-the-book indexing; and full-text analysis.…

  8. Springfield Processing Plant (SPP) Facility Information

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  9. Spatial information processing in humans and monkeys

    NARCIS (Netherlands)

    Oleksiak, A.

    2010-01-01

    In this thesis a series of experiments are described on human volunteers and rhesus monkeys (Macaca mulatta) in the context of spatial information processing. In the first single-unit recording experiments in monkeys a spatial summation algorithm was investigated. The responses of single neurons to

  10. Quantum process discrimination with information from environment

    Science.gov (United States)

    Wang, Yuan-Mei; Li, Jun-Gang; Zou, Jian; Xu, Bao-Ming

    2016-12-01

    In quantum metrology we usually extract information from the reduced probe system but ignore the information lost inevitably into the environment. However, K. Mølmer [Phys. Rev. Lett. 114, 040401 (2015)] showed that the information lost into the environment has an important effect on improving the successful probability of quantum process discrimination. Here we reconsider the model of a driven atom coupled to an environment and distinguish which of two candidate Hamiltonians governs the dynamics of the whole system. We mainly discuss two measurement methods, one of which obtains only the information from the reduced atom state and the other obtains the information from both the atom and its environment. Interestingly, for the two methods the optimal initial states of the atom, used to improve the successful probability of the process discrimination, are different. By comparing the two methods we find that the partial information from the environment is very useful for the discriminations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11274043, 11375025, and 11005008).

  11. A Mixed-Mode Signal Processing Architecture for Radix-2 DHT

    Directory of Open Access Journals (Sweden)

    Gautam A. Shah

    2011-06-01

    Full Text Available This paper proposes a mixed-mode signal processing architecture for the radix-2 DHT. In the known algorithms, the stage structures perform all the additions and multiplications. The proposed algorithm introduces multiplying structures which perform all the multiplications with the cosine coefficients and their related additions. This leads to (i) simplification of the stage structures, which now perform only the additions, and (ii) a reduction in the number of multiplications without affecting the number of additions. A mixed-mode signal processing architecture implementing the algorithm, utilizing an N-bit ring counter, a sample-and-hold array, and an analog block structure, is proposed. The validity of the design has been tested by simulating it with Orcad PSpice.
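
For reference, the transform being implemented can be stated directly: the DHT of a length-N sequence is H[k] = Σ_n x[n]·cas(2πnk/N), where cas(t) = cos(t) + sin(t). The sketch below is a plain O(N²) software reference for that definition, not the proposed radix-2 mixed-mode architecture:

```python
import numpy as np

def dht(x):
    """Direct discrete Hartley transform:
    H[k] = sum_n x[n] * cas(2*pi*n*k/N), cas(t) = cos(t) + sin(t).
    An O(N^2) reference implementation for checking fast algorithms."""
    N = len(x)
    n = np.arange(N)
    t = 2 * np.pi * np.outer(n, n) / N
    return (np.cos(t) + np.sin(t)) @ x

x = np.array([1.0, 2.0, 3.0, 4.0])
H = dht(x)
# The DHT is its own inverse up to a factor of N:
# applying it twice returns N * x.
print(np.allclose(dht(H), len(x) * x))
```

A radix-2 algorithm computes the same H[k] in O(N log N) stages; a reference like this is useful for validating each stage's output during a PSpice or software simulation.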

  12. Enterprise architecture evaluation using architecture framework and UML stereotypes

    Directory of Open Access Journals (Sweden)

    Narges Shahi

    2014-08-01

    Full Text Available There is an increasing need for enterprise architecture in organizations with complicated systems, varied processes, and organizational units whose elements maintain complex relationships, and the demand for information technology support keeps growing. Enterprise architecture is so effective that its non-use is regarded as an organization's inability to manage information technology efficiently. The enterprise architecture process generally consists of three phases: strategic information technology planning, enterprise architecture planning, and enterprise architecture implementation. Each phase must be carried out in sequence, and a single flaw in one phase may propagate to the whole architecture and, consequently, cause extra cost and time. If a model of the architecture is mapped and evaluated before implementation begins in the second phase, possible implementation flaws can be prevented. In this study, enterprise architecture processes are illustrated with UML diagrams, and the architecture is evaluated during the planning phase by transforming the UML diagrams into Petri nets. The results indicate that the high costs of the implementation phase can be reduced.
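
    The UML-to-Petri-net evaluation step can be illustrated with a toy net in which a transition fires when every input place holds a token; the place and transition names below are invented for illustration, not taken from the paper:

```python
# Minimal Petri-net sketch (hypothetical model, not the paper's UML mapping):
# a transition fires when every input place holds a token, moving tokens to outputs.

def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= 1 for p in pre)

def fire(marking, transition):
    pre, post = transition
    m = dict(marking)
    for p in pre:
        m[p] -= 1
    for p in post:
        m[p] = m.get(p, 0) + 1
    return m

# Two sequential activities from a hypothetical UML activity diagram:
# start -> t1 -> mid -> t2 -> done
net = {"t1": (["start"], ["mid"]), "t2": (["mid"], ["done"])}
marking = {"start": 1}
for name in ["t1", "t2"]:
    assert enabled(marking, net[name])
    marking = fire(marking, net[name])
assert marking.get("done") == 1  # the modeled process can complete on this path
```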

  13. Logic of historical development of the formation process of architectural and construction solutions

    OpenAIRE

    Baranov Valeriy Aleksandrovich

    2014-01-01

    In the research field of the development processes of architectural and construction decisions (ACD), a very considerable body of scientific, normative, and technical material has accumulated. Still, by the end of the research boom in this area (the 1990s), many authors noted its weak return in practical design. High methodological potential lies in understanding the act of ACD formation as a special kind of activity, and its structure as a historical phenomenon, which allows to put two main methodological p...

  14. New multi-DSP parallel computing architecture for real-time image processing

    Institute of Scientific and Technical Information of China (English)

    Hu Junhong; Zhang Tianxu; Jiang Haoyang

    2006-01-01

    The flexibility of traditional image processing systems is limited because those systems are designed for specific applications. In this paper, a new TMS320C64x-based multi-DSP parallel computing architecture is presented. It has many promising characteristics, such as powerful computing capability, broad I/O bandwidth, topology flexibility, and expansibility. The performance of the parallel system is evaluated by practical experiment.

  15. Architectural and performance considerations for a 10^7-instruction/sec optoelectronic central processing unit.

    Science.gov (United States)

    Arrathoon, R; Kozaitis, S

    1987-11-01

    Architectural considerations for a multiple-instruction, single-data-based optoelectronic central processing unit operating at 10^7 instructions per second are detailed. Central to the operation of this device is a giant fiber-optic content-addressable memory in a programmable logic array configuration. The design includes four instructions and emphasizes the fan-in and fan-out capabilities of optical systems. Interconnection limitations and scaling issues are examined.
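
    A content-addressable memory in a PLA configuration can be sketched in software as pattern matching with don't-care bits; the four-entry table and instruction names below are hypothetical, not the paper's instruction set:

```python
# Sketch of a content-addressable lookup with don't-care bits ('x'), as in a
# PLA-style decoder. The stored patterns and instruction names are invented.

def cam_match(word, table):
    """Return the data of the first stored pattern matching `word` bitwise."""
    for pattern, data in table:
        if all(p in ("x", w) for p, w in zip(pattern, word)):
            return data
    return None

table = [("00xx", "LOAD"), ("01xx", "STORE"), ("10xx", "ADD"), ("11xx", "JUMP")]
assert cam_match("0010", table) == "LOAD"
assert cam_match("1101", table) == "JUMP"
```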

  16. Performance analysis of massively parallel embedded hardware architectures for retinal image processing

    OpenAIRE

    Osorio Roberto; Nieto Alejandro; Brea Victor; Vilariño David

    2011-01-01

    Abstract This paper examines the implementation of a retinal vessel tree extraction technique on different hardware platforms and architectures. Retinal vessel tree extraction is a representative application of those found in the domain of medical image processing. The low signal-to-noise ratio of the images leads to a large amount of low-level tasks in order to meet the accuracy requirements. In some applications, this might compromise computing speed. This paper is focused on the assessment...

  17. Quantum information processing and nuclear magnetic resonance

    CERN Document Server

    Cummins, H K

    2001-01-01

    Quantum computers are information processing devices which operate by and exploit the laws of quantum mechanics, potentially allowing them to solve problems which are intractable using classical computers. This dissertation considers the practical issues involved in one of the more successful implementations to date, nuclear magnetic resonance (NMR). Techniques for dealing with systematic errors are presented, and a quantum protocol is implemented as spectrometer pulse sequence programs. Chapter 1 is a brief introduction to quantum computation. The physical basis of its efficiency and issues involved in its implementation are discussed. NMR quantum information processing is reviewed in more detail in Chapter 2. Chapter 3 considers some of the errors that may be introduced in the process of implementing an algorithm, and high-level ways of reducing the impact of these errors by using composite rotations. Novel general expressions for stabilising composite rotations are presented in Chapter 4 and a new class o...

  18. A Real-time Image Processing with a Compact FPGA-based Architecture

    Directory of Open Access Journals (Sweden)

    Ridha Djemal

    2005-01-01

    Full Text Available This study presents a field-programmable gate array implementation of a real-time video smoothing algorithm. In comparison with video smoothing techniques such as the deblocking filter in H.264 or smoothing in JPEG2000, the proposed method is implemented in hardware and its computational cost and complexity are reduced, since all pixel processing on the uncompressed video is done on the fly. The proposed architecture optimizes the design of a modified version of the Nagao filter in order to smooth video under real-time constraints. The filter smooths the video before an edge extraction stage used for manufacturing process control. The architecture, based on the RC1000-PP Virtex prototyping board, is analyzed to gain an understanding of the relationships between algorithmic features and implementation cost. Experimental results indicate that with this prototyping board and an optimized hardware architecture we can deliver real-time performance and improved video quality. The filter can process high-resolution video in real time, delivering 30 images per second at a 10 MHz clock.
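
    The Nagao filter selects, for each pixel, the most homogeneous of several subwindows around it and replaces the pixel with that subwindow's mean. The following is a simplified software sketch using four overlapping 3x3 quadrants (the original Nagao-Matsuyama filter uses nine subregions of a 5x5 window, and the paper's hardware version differs again):

```python
import numpy as np

def nagao_like_smooth(img):
    """Edge-preserving smoothing: each pixel becomes the mean of the 3x3
    quadrant (of the four overlapping ones) with the lowest variance."""
    pad = np.pad(img, 2, mode="edge")
    out = np.empty_like(img, dtype=float)
    offsets = [(0, 0), (0, 2), (2, 0), (2, 2)]  # top-left corners of quadrants
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            regions = [pad[i + di:i + di + 3, j + dj:j + dj + 3]
                       for di, dj in offsets]
            out[i, j] = min(regions, key=np.var).mean()
    return out

# A noiseless step edge is preserved: each pixel averages a uniform quadrant.
step = np.hstack([np.zeros((4, 4)), np.ones((4, 4))])
assert np.allclose(nagao_like_smooth(step), step)
```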

  19. A New Information Architecture, Web Site and Services for the CMS Experiment

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services and more than 100,000 documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture; the system design, implementation and monitoring; the document and content database; security aspects; and our deployment strategy which ensured continual smooth operation of all systems at all times.

  20. A new Information Architecture, Website and Services for the CMS Experiment

    Science.gov (United States)

    Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas

    2012-12-01

    The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture; the system design, implementation and monitoring; the document and content database; security aspects; and our deployment strategy, which ensured continual smooth operation of all systems at all times.

  1. Information Processing Approaches to Cognitive Development

    Science.gov (United States)

    1988-07-01

    psychology: Progress in cognitive development research. New York: Springer-Verlag. Atkinson, R.C., & Shiffrin, R.M. (1968). Human memory: A proposed... late 1960s and early 70s (cf. Atkinson & Shiffrin, 1968; Craik & Lockhart, 1972; Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several... 2.3 Production systems as cognitive architectures; 2.3.1 Working memory issues; 2.3.2 Production memory issues; 2.3.3 Conflict resolution issues; 2.4 Some

  2. Learning from bacteria about natural information processing.

    Science.gov (United States)

    Ben-Jacob, Eshel

    2009-10-01

    Under natural growth conditions, bacteria live in complex hierarchical communities. To conduct complex cooperative behaviors, bacteria utilize sophisticated communication to the extent that their chemical language includes semantic and even pragmatic aspects. I describe how complex colony forms (patterns) emerge through the communication-based interplay between individual bacteria and the colony. Individual cells assume newly co-generated traits and abilities that are not prestored in the genetic information of the cells, that is, not all the information required for efficient responses to all environmental conditions is stored. To solve newly encountered problems, they assess the problem via collective sensing, recall stored information of past experience, and then execute distributed information processing of the 10^9-10^12 bacteria in the colony--transforming the colony into a "super-brain." I show illuminating examples of swarming intelligence of live bacteria in which they solve optimization problems that are beyond what human beings can solve. This will lead to a discussion about the special nature of bacterial computational principles compared to Turing algorithm computational principles, in particular about the role of distributed information processing.

  3. Disjunctive Information Flow for Communicating Processes

    DEFF Research Database (Denmark)

    Li, Ximeng; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    The security validation of practical computer systems calls for the ability to specify and verify information flow policies that are dependent on data content. Such policies play an important role in concurrent, communicating systems: consider a scenario where messages are sent to different processes according to their tagging. We devise a security type system that enforces content-dependent information flow policies in the presence of communication and concurrency. The type system soundly guarantees a compositional noninterference property. All theoretical results have been formally proved...

  4. Information processing of earth resources data

    Science.gov (United States)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  5. Architecture in the network society

    DEFF Research Database (Denmark)

    2004-01-01

    Under the theme Architecture in the Network Society, participants were invited to focus on the dialog and sharing of knowledge between architects and other disciplines and to reflect on, and propose, new methods in the design process, to enhance and improve the impact of information technology on architecture. This conference and the past history of eCAADe are an example of establishing a social network for the sharing of knowledge regarding the use of computers in architectural education and research.

  6. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  7. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were in accordance with immunization guidelines and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452

  8. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were in accordance with immunization guidelines and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.

  9. Recognizing, Thinking and Learning as Information Processes

    Science.gov (United States)

    1987-08-30

    the stimuli that impinge upon the sense organs (McCulloch, Lettvin, Maturana and Pitts, 1959; Hubel and Wiesel, 1959). By some process (as we shall... Criticism. Moscow: Foreign Languages Publishing House. Lettvin, J. Y., H. R. Maturana, W. S. McCulloch, and W. H. Pitts. 1959. What the frog's eye tells the frog's brain. Proc. IRE, 47: 1940-1951. Lomov, B. F. 1984. Methodological and Theoretical Problems in

  10. Intelligent Information Processing in Imaging Fuzes

    Institute of Scientific and Technical Information of China (English)

    王克勇; 郑链; 宋承天

    2003-01-01

    In order to study intelligent information processing in new types of imaging fuzes, invariance features of target images are extracted, and a radial basis function neural network is used to recognize targets. Owing to its parallel processing, robustness, and generalization ability, the method can recognize the conditions of missile-target encounters and meet the real-time recognition requirements of an imaging fuze. It is shown that target recognition and burst point control based on artificial neural networks are feasible.
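
    A radial basis function network of the kind described reduces to Gaussian hidden units followed by a linear output layer. The following is a minimal sketch on a toy problem (XOR stands in for the missile-target encounter data, which is not available here):

```python
import numpy as np

def rbf_design(X, centers, gamma=2.0):
    """Gaussian RBF activations: exp(-gamma * ||x - c||^2) for each center c."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# XOR: not linearly separable, but separable in the RBF feature space.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
Phi = rbf_design(X, centers=X)               # one hidden unit per training point
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # solve the linear output layer
pred = (rbf_design(X, X) @ w > 0.5).astype(int)
assert (pred == y).all()
```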

  11. Digital image processing for information extraction.

    Science.gov (United States)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
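
    Removing camera anomalies such as vignetting and nonuniform intensity response is commonly done by flat-field correction. The following sketch uses synthetic data and the standard correction formula, which is an assumption here rather than the exact procedure the abstract refers to:

```python
import numpy as np

def flat_field_correct(raw, flat, dark):
    """Remove vignetting / nonuniform response: (raw - dark) / (flat - dark),
    rescaled so the corrected frame keeps the scene's overall level."""
    gain = flat - dark
    corrected = (raw - dark) / gain
    return corrected * gain.mean()

# Synthetic example: a uniform scene seen through radial vignetting.
yy, xx = np.mgrid[0:32, 0:32]
vignette = 1.0 - 0.4 * ((xx - 16) ** 2 + (yy - 16) ** 2) / (2 * 16 ** 2)
dark = np.full((32, 32), 5.0)
raw = 100.0 * vignette + dark        # uniform scene, vignetted, plus dark level
flat = 80.0 * vignette + dark        # flat-field frame of a uniform target
corrected = flat_field_correct(raw, flat, dark)
assert corrected.std() < 1e-9        # uniformity restored after correction
```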

  12. Lithological architecture, geological processes and energy-field environments are major factors for the formation of hydrocarbon reservoirs

    Institute of Scientific and Technical Information of China (English)

    ZHAO Wenzhi; WANG Zecheng; LI Xiaoqing; WANG Hongjun; WANG Zhaoyun

    2005-01-01

    The formation of hydrocarbon reservoirs is controlled by three major factors: lithological architecture, geological processes and energy-field environments. Among the three major factors, lithological architecture provides the storing medium for hydrocarbon; geological processes include hydrocarbon generation, migration, accumulation, preservation and modification; and energy-field environments refer to the various geothermal and geodynamic forces that affect the lithological architecture and drive the geological processes. In this study, we take Kela-2 and Sulige gas reservoirs as two examples to study relationships among the three major factors, and explain how these factors influence the scale and quality of hydrocarbon reservoirs.

  13. Vision and visual information processing in cubozoans

    DEFF Research Database (Denmark)

    Bielecki, Jan

    relationship between acuity and light sensitivity. Animals have evolved a wide variety of solutions to this problem, such as folded membranes, to gain larger receptive surfaces, and lenses, to focus light onto the receptive membranes. On the neural capacity side, complex eyes demand a huge processing network to analyse the received information, illustrated by the fact that one third of the human brain is devoted to visual information processing. The cost of maintaining such a neural network deters most organisms from investing in the camera-type option, if possible, and they settle for a model that will more precisely fit their needs. Visual neuroethology integrates optics, sensory equipment, neural networks and motor output to explain how animals can perform behaviour in response to a specific visual stimulus. In this doctoral thesis, I will elucidate the individual steps in a visual neuroethological pathway...

  14. Fractional Transforms in Optical Information Processing

    Directory of Open Access Journals (Sweden)

    Maria Luisa Calvo

    2005-06-01

    Full Text Available We review the progress achieved in optical information processing during the last decade by applying fractional linear integral transforms. The fractional Fourier transform and its applications for phase retrieval, beam characterization, space-variant pattern recognition, adaptive filter design, encryption, watermarking, and so forth are discussed in detail. A general algorithm for the fractionalization of linear cyclic integral transforms is introduced, and it is shown that they can be fractionalized in an infinite number of ways. Basic properties of fractional cyclic transforms are considered. The implementation of some fractional transforms in optics, such as the fractional Hankel, sine, cosine, Hartley, and Hilbert transforms, is discussed. New horizons for the application of fractional transforms in optical information processing are underlined.
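
    One standard way to fractionalize a linear cyclic transform is through its eigendecomposition, F^a = V diag(lam**a) V^-1; as the abstract notes, such fractionalizations are not unique (the eigenvalue branches can be chosen in infinitely many ways). The following is a numerical sketch for the discrete Fourier transform:

```python
import numpy as np

# Fractionalize the unitary DFT matrix via its eigendecomposition.
N = 8
n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)  # unitary DFT matrix

lam, V = np.linalg.eig(F)

def frft(a):
    """Discrete fractional Fourier transform of order a (a=1 is the plain DFT)."""
    return V @ np.diag(lam ** a) @ np.linalg.inv(V)

# Order additivity: half a transform applied twice equals the full transform.
assert np.allclose(frft(0.5) @ frft(0.5), frft(1.0), atol=1e-6)
assert np.allclose(frft(1.0), F, atol=1e-6)
```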

  15. Field Programmable DSP Arrays - A Novel Reconfigurable Architecture for Efficient Realization of Digital Signal Processing Functions

    Directory of Open Access Journals (Sweden)

    Amitabha Sinha

    2013-04-01

    Full Text Available Digital signal processing functions are widely used in real-time, high-speed applications. These functions are generally implemented either on ASICs, which are inflexible, or on FPGAs, which suffer from a relatively smaller utilization factor or lower speed compared to ASICs. The proposed reconfigurable DSP processor is reminiscent of an FPGA, but with basic fixed Common Modules (CMs) (such as adders, subtractors, multipliers, scaling units, and shifters) instead of CLBs. This paper introduces the development of a reconfigurable DSP processor that integrates different filter and transform functions. Switching between DSP functions is accomplished by reconfiguring the interconnections between the CMs. The proposed reconfigurable architecture has been validated on a Virtex-5 FPGA. The architecture provides a sufficient amount of flexibility, parallelism, and scalability.

  16. Information Processing Structure of Quantum Gravity

    CERN Document Server

    Gyongyosi, Laszlo

    2014-01-01

    The theory of quantum gravity aims to fuse general relativity with quantum theory into a more fundamental framework. The space of quantum gravity provides both the non-fixed causality of general relativity and the quantum uncertainty of quantum mechanics. In a quantum gravity scenario, the causal structure is indefinite and the processes are causally non-separable. In this work, we provide a model for the information processing structure of quantum gravity. We show that the quantum gravity environment is an information resource pool from which valuable information can be extracted. We analyze the structure of the quantum gravity space and the entanglement of the space-time geometry. We study the information transfer capabilities of quantum gravity space and define the quantum gravity channel. We reveal that the quantum gravity space acts as a background noise on the local environment states. We characterize the properties of the noise of the quantum gravity space and show that it allows the separate local...

  17. Art, Science and Architecture: Architecture as a Dynamic Process of Structuring Matter-Energy in the Spatio-Temporal World.

    Science.gov (United States)

    Minai, Asghar Talaye

    Methods of coordinating art and science in relation to the creation of the physical form of the environment were developed. Such an approach is directed towards a theory of form based on point theory or field theory in architecture and deals with the problem of potentiality or dispositional properties. Part I, Towards a Sociology of…

  18. An Action Selection Architecture for an Emotional Agent

    NARCIS (Netherlands)

    Burghouts, G.J.; op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Poel, Mannes; Nijholt, Antinus; Russell, I.; Haller, S.

    2003-01-01

    An architecture for action selection is presented linking emotion, cognition and behavior. It defines the information and emotion processes of an agent. The architecture has been implemented and used in a prototype environment.

  19. SOA - An Architecture Which Creates a Flexible Link between Business Processes and IT

    Directory of Open Access Journals (Sweden)

    Radu Stefan MOLEAVIN

    2012-05-01

    Full Text Available To remain viable, a company must adapt continuously to the market's requirements. Adapting a company to the market's requirements also means changing the firm's business processes. Until several years ago, every enterprise had a strong link between its business processes and its IT applications: any change in the business processes led to a change in the IT applications. This meant that any change on the business side consumed time, human resources, and material resources. This translated into slow adaptation to market changes and rising costs for changing the IT applications to fit the new business conditions. Today the market changes quickly and develops greatly, so an enterprise must adapt to the market's requirements rapidly and with low consumption of both human and material resources. Also, market changes that cause changes in the business processes should not cause major changes in the IT area. This goal can be achieved using a service-oriented architecture (SOA) for the enterprise. Such an architecture allows us, as in a puzzle game, to use the same pieces of material ("services") to build different models ("business processes").

  20. Logic of historical development of the formation process of architectural and construction solutions

    Directory of Open Access Journals (Sweden)

    Baranov Valeriy Aleksandrovich

    2014-05-01

    Full Text Available In the research field of the development processes of architectural and construction decisions (ACD), a very considerable body of scientific, normative, and technical material has accumulated. Still, by the end of the research boom in this area (the 1990s), many authors noted its weak return in practical design. High methodological potential lies in understanding the act of ACD formation as a special kind of activity, and its structure as a historical phenomenon, which allows two main methodological principles to be put at the basis of the research: the principle of activity and the principle of the level organization of historically formed objects. As a result, six main stages of the historical development of the ACD formation process, and six levels of its modern organization, were revealed. The origin stage is the transition from animal to human construction and the emergence of the first stage of ACD formation, measurement. The reproductive stage is characterized by the centuries-long reproduction of stable volumetric forms of constructions. At the composite stage, professional representation detaches the external form of a construction from architectural and construction actions and techniques; it becomes a subject of architectural and construction activity, and the method of composition is created on this basis. At the constructive stage, engineers become involved in the construction process; the style of thinking changes, structural designs emerge, a new ACD function of configuration appears, and, at last, architectural and construction design is finally separated from construction itself. The technological stage is a product of the industrialization of construction, in which systematization is carried out not only at the level of the subject but also with respect to the operational content, which leads to the emergence of ACD program formation.
    At the sixth, methodological stage of development of ASP, a leading way change of the ACD formation becomes a

  1. Architecture of digital signal processing units with reconfigurable structure

    Directory of Open Access Journals (Sweden)

    Sheik-Seikin A. N.

    2011-08-01

    Full Text Available A synthesis technique is developed for a computing-system architecture with a tunable structure under affine transformations of the graph of information connections (GIC). It assumes that the initial computing system (CS) is supplemented with an appropriate switching system that implements, depending on the character of the GIC change, one of several methods of block reorganization, providing effective implementation of the CS over a wide range of GIC parameter changes. The obtained estimates of hardware expenses for the various methods of reorganization allow the efficiency of the CS implementation to be determined for each specific case.

  2. Intelligent tools for building a scientific information platform: advanced architectures and solutions

    CERN Document Server

    Skonieczny, Lukasz; Rybinski, Henryk; Kryszkiewicz, Marzena; Niezgodka, Marek

    2013-01-01

    This book is a selection of results obtained within two years of research performed under SYNAT - a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and open knowledge society in Poland. The selection refers to the research in artificial intelligence, knowledge discovery and data mining, information retrieval and natural language processing, addressing the problems of implementing intelligent tools for building a scientific information platform. This book is a continuation and extension of the ideas presented in "Intelligent Tools for Building a Scientific Information Platform" published as volume 390 in the same series in 2012. It is based on the SYNAT 2012 Workshop held in Warsaw. The papers included in this volume present an overview and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering.

  3. Arquitetura da informação para biblioteca digital personalizável / Information architecture for a customizable digital library

    Directory of Open Access Journals (Sweden)

    Silvana Ap. Borseti Gregório Vidotti

    2006-01-01

    Full Text Available Some resources that can minimize difficulties related to information retrieval and dissemination are digital libraries, which provide efficient simultaneous and remote access to information, and customization services, which allow the user a personalized interaction based on his or her profile. The problem in providing such resources lies in the cost and difficulty of the development process, owing to the great number of processes and elements involved in their construction. In this context, an information architecture for customizable digital libraries is proposed, which aims to deal with the following problems: scarcity of specialized literature; lack of technological and informational elements; and limited use of content and interface customization services for diverse user types.

  4. Learning from Health Information Exchange Technical Architecture and Implementation in Seven Beacon Communities

    Science.gov (United States)

    McCarthy, Douglas B.; Propp, Karen; Cohen, Alexander; Sabharwal, Raj; Schachter, Abigail A.; Rein, Alison L.

    2014-01-01

    As health care providers adopt and make “meaningful use” of health information technology (health IT), communities and delivery systems must set up the infrastructure to facilitate health information exchange (HIE) between providers and numerous other stakeholders who have a role in supporting health and care. By facilitating better communication and coordination between providers, HIE has the potential to improve clinical decision-making and continuity of care, while reducing unnecessary use of services. When implemented as part of a broader strategy for health care delivery system and payment reform, HIE capability also can enable the use of analytic tools needed for population health management, patient engagement in care, and continuous learning and improvement. The diverse experiences of seven communities that participated in the three-year federal Beacon Community Program offer practical insight into factors influencing the technical architecture of exchange infrastructure and its role in supporting improved care, reduced cost, and a healthier population. The case studies also document challenges faced by the communities, such as significant time and resources required to harmonize variations in the interpretation of data standards. Findings indicate that their progress developing community-based HIE strategies, while driven by local needs and objectives, is also influenced by broader legal, policy, and market conditions. PMID:25848591

  6. Influence of macromolecular architecture on necking in polymer extrusion film casting process

    Energy Technology Data Exchange (ETDEWEB)

    Pol, Harshawardhan; Banik, Sourya; Azad, Lal Busher; Doshi, Pankaj; Lele, Ashish [CSIR-National Chemical Laboratory, Pune, Maharashtra (India); Thete, Sumeet [Purdue University, West Lafayette, Indiana (United States)

    2015-05-22

    Extrusion film casting (EFC) is an important polymer processing technique that is used to produce several thousand tons of polymer films/coatings on an industrial scale. In this research, we are interested in understanding quantitatively how macromolecular chain architecture (for example, long chain branching (LCB) or molecular weight distribution (MWD or PDI)) influences the necking and thickness distribution of extrusion cast films. We have used different polymer resins of linear and branched molecular architecture to produce extrusion cast films under controlled experimental conditions. The necking profiles of the films were imaged, and the velocity profiles during EFC were monitored using the particle tracking velocimetry (PTV) technique. Additionally, the temperature profiles were captured using IR thermography and thickness profiles were calculated. The experimental results are compared with predictions of the one-dimensional flow model of Silagy et al. [1], wherein the polymer resin rheology is modeled using molecular constitutive equations such as the Rolie-Poly (RP) and extended Pom-Pom (XPP). We demonstrate that the 1-D flow model containing the molecular constitutive equations provides new insights into the role of macromolecular chain architecture in film necking. [1] D. Silagy, Y. Demay, and J-F. Agassant, Polym. Eng. Sci., 36, 2614 (1996)

  7. Evaluating Defense Architecture Frameworks for C4I System Using Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Abdullah S. Alghamdi

    2009-01-01

    Full Text Available Problem statement: Command, Control, Communications, Computers and Intelligence (C4I) systems provide situational awareness about the operational environment and support decision making directed at that environment. These systems have been used by various agencies such as defense, police, investigation, road, rail, airport, and oil and gas related departments. The increased use of C4I systems has made them more important and attractive, and interest in the design and development of C4I systems has consequently grown among researchers. Many defense industry frameworks are available, but the problem is the suitable selection of a framework for the design and development of a C4I system. Approach: This study described the concepts, tool and methodology used for the evaluation analysis of different frameworks by the Analytic Hierarchy Process (AHP). Results: We compared different defense industry frameworks, namely the Department of Defense Architecture Framework (DODAF), the Ministry of Defence Architecture Framework (MODAF) and the NATO Architecture Framework (NAF), and found that AHP is a fairly good tool for such analysis. Conclusion: Different defense industry frameworks such as DODAF, MODAF and NAF were evaluated and compared using AHP.
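
    The AHP machinery the abstract refers to can be sketched compactly. The pairwise-comparison values below are hypothetical placeholders, not the paper's judgments, and the function names are our own; a minimal sketch of deriving priority weights and Saaty's consistency ratio:

```python
# Illustrative AHP sketch: derive priority weights for three alternatives
# (e.g., candidate frameworks; judgments below are made up, not the paper's)
# from a pairwise-comparison matrix via the principal right eigenvector.

def ahp_weights(matrix, iterations=100):
    """Approximate the principal eigenvector by power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(matrix, w):
    """Saaty's CR = CI / RI; the random index RI for n = 3 is 0.58."""
    n = len(matrix)
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / 0.58

# Hypothetical judgments: alternative A moderately preferred over B and C.
m = [[1, 2, 3],
     [1 / 2, 1, 2],
     [1 / 3, 1 / 2, 1]]
w = ahp_weights(m)
print([round(x, 3) for x in w])            # priority weights, sum to 1
print(round(consistency_ratio(m, w), 3))   # CR < 0.1 means acceptable consistency
```

    The same eigenvector-and-CR computation applies at every level of an AHP hierarchy; only the matrices change.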

  8. Information processing in convex operational theories

    Energy Technology Data Exchange (ETDEWEB)

    Barnum, Howard Nelch [Los Alamos National Laboratory; Wilce, Alexander G [SUSQUEHANNA UNIV

    2008-01-01

    In order to understand the source and extent of the greater-than-classical information processing power of quantum systems, one wants to characterize both classical and quantum mechanics as points in a broader space of possible theories. One approach to doing this, pioneered by Abramsky and Coecke, is to abstract the essential categorical features of classical and quantum mechanics that support various information-theoretic constraints and possibilities, e.g., the impossibility of cloning in the latter, and the possibility of teleportation in both. Another approach, pursued by the authors and various collaborators, is to begin with a very conservative, and in a sense very concrete, generalization of classical probability theory--which is still sufficient to encompass quantum theory--and to ask which 'quantum' informational phenomena can be reproduced in this much looser setting. In this paper, we review the progress to date in this second programme, and offer some suggestions as to how to link it with the categorical semantics for quantum processes developed by Abramsky and Coecke.

  9. Position paper: researching and developing open architectures for national health information systems in developing African countries

    CSIR Research Space (South Africa)

    Moodley, D

    2011-08-01

    Full Text Available and include an open and participatory approach that encourages reuse and sharing of artifacts and experiences [10]. The framework should offer a generalized methodology and suite of tools that can be used by many countries following customization... implementation. The development of a health enterprise architectural framework (HEAF) and an architectural artifact repository (HEART) are two of HEAL's current activities towards creating an open architectural framework. HIS designers and implementers...

  10. The Country-specific Organizational and Information Architecture of ERP Systems at Globalised Enterprises

    Directory of Open Access Journals (Sweden)

    Bálint Molnár

    2011-01-01

    Full Text Available The competition on the market forces companies to adapt to the changing environment. Most recently, the economic and financial crisis has been accelerating the alteration of both business and IT models of enterprises. The forces of globalization and internationalization motivate the restructuring of business processes and consequently IT processes. To depict the changes in a unified framework, we need the concept of Enterprise Architecture as a theoretical approach that deals with various tiers, aspects and views of business processes and different layers of application, software and hardware systems. The paper outlines a wide-range theoretical background for analyzing the re-engineering and re-organization of ERP systems at international or transnational companies in the middle-sized EU member states. The research carried out up to now has unravelled the typical structural changes, the models for internal business networks and their modification that reflect the centralization, decentralization and hybrid approaches. Based on the results obtained recently, a future research program has been drawn up to deepen our understanding of the trends within the world of ERP systems.

  11. Information processing speed in ecstasy (MDMA) users.

    Science.gov (United States)

    Wareing, Michelle; Fisk, John E; Montgomery, Catharine; Murphy, Philip N; Chandler, Martin D

    2007-03-01

    Previous research draws parallels between ecstasy-related and age-related deficits in cognitive functioning. Age-related impairments in working memory have been attributed to a slowdown in information processing speed. The present study compared 29 current ecstasy users, 10 previous users and 46 non-users on two tests measuring information processing speed and a computation span task measuring working memory. Results showed that ecstasy users performed worse than non-ecstasy users in the letter comparison task, although the overall difference was not significant (p=0.089). Results from the pattern recognition task showed that current ecstasy users produced significantly more errors than the other two groups, and that previous ecstasy users produced significantly more errors than non-ecstasy users, with both ecstasy-using groups performing significantly worse than non-users on the computation span measure. The mechanism responsible for impairments in the computation span measure is not the same as that in elderly adults, where processing speed generally removes most of the age-related variance. Also of relevance is the fact that the ecstasy users reported here had used a range of other drugs, making it difficult to unambiguously attribute the results obtained to ecstasy use.

  12. New FPSoC-based architecture for efficient FSBM motion estimation processing in video standards

    Science.gov (United States)

    Canals, J. A.; Martínez, M. A.; Ballester, F. J.; Mora, A.

    2007-05-01

    Due to the timing constraints in real-time video encoding, hardware accelerator cores are used for video compression. System-on-Chip (SoC) design tools offer complex microprocessor system design methodologies with easy Intellectual Property (IP) core integration. This paper presents a PowerPC-based SoC with a motion-estimation accelerator core attached to the system bus. Motion-estimation (ME) algorithms are the most critical part of video compression due to the huge amount of data transfers and processing time. The main goal of our proposed architecture is to minimize the number of memory accesses, thus exploiting the bandwidth of a direct memory connection. This architecture has been developed using Xilinx XPS, an SoC platform design tool. The results show that our system is able to process the integer-pixel full search block matching (FSBM) motion-estimation process and interframe mode decision of a QCIF frame (176*144 pixels), using a 48*48 pixel search window, with an embedded PPC in a Xilinx Virtex-4 FPGA running at 100 MHz, in 1.5 ms, 4.5% of the total processing time at 30 fps.
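
    The FSBM kernel that the paper's core accelerates reduces, in software, to an exhaustive SAD search. A minimal sketch with toy frame data and window sizes (not the paper's QCIF/48*48 configuration; all names and numbers here are illustrative):

```python
# Minimal full-search block-matching (FSBM) sketch: for one block of the
# current frame, exhaustively scan a search window in the reference frame
# and return the motion vector minimizing the sum of absolute differences.

def sad(cur, ref, bx, by, dx, dy, b):
    """Sum of absolute differences between a current block and a displaced
    reference block of size b x b."""
    return sum(abs(cur[by + i][bx + j] - ref[by + dy + i][bx + dx + j])
               for i in range(b) for j in range(b))

def full_search(cur, ref, bx, by, b, r):
    """Exhaustive search over displacements in [-r, r]^2; returns
    (best SAD, dx, dy)."""
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            # keep the displaced block inside the reference frame
            if 0 <= by + dy and by + dy + b <= h and \
               0 <= bx + dx and bx + dx + b <= w:
                cost = sad(cur, ref, bx, by, dx, dy, b)
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
    return best

# Toy 8x8 frames: 'cur' is 'ref' shifted by (+1, +1) in both coordinates.
ref = [[(7 * x + 13 * y) % 31 for x in range(8)] for y in range(8)]
cur = [[(7 * (x + 1) + 13 * (y + 1)) % 31 for x in range(8)] for y in range(8)]
print(full_search(cur, ref, 2, 2, 2, 2))  # → (0, 1, 1): zero SAD at (1, 1)
```

    The hardware win discussed in the abstract comes from streaming the search window once instead of re-fetching it for each of the (2r+1)^2 candidate displacements.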

  13. Vision and visual information processing in cubozoans

    DEFF Research Database (Denmark)

    Bielecki, Jan

    Eyes have been considered support for the divine design hypothesis over evolution because, surely, eyes cannot function with anything less than all the components that comprise a vertebrate camera type eye. Yet, devoted Darwinists have estimated that complex visual systems can evolve from a single light sensitive cell within 400 000 generations, and all intermediate stages can be found throughout the Metazoa. Eyes have evolved to accommodate increasingly more complex visual behaviours, from light sensitive tissues involved in circadian entrainment to the complex camera type eyes that can guide... to analyse the received information, illustrated by the fact that one third of the human brain is devoted to visual information processing. The cost of maintaining such a neural network deters most organisms from investing in the camera type option, if possible, settling instead for a model that will more precisely...

  14. Quantum information processing through nuclear magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Bulnes, J.D.; Sarthour, R.S.; Oliveira, I.S. [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Bonk, F.A.; Azevedo, E.R. de; Bonagamba, T.J. [Sao Paulo Univ., Sao Carlos, SP (Brazil). Inst. de Fisica; Freitas, J.C.C. [Espirito Santo Univ., Vitoria, ES (Brazil). Dept. de Fisica

    2005-09-15

    We discuss the applications of Nuclear Magnetic Resonance (NMR) to quantum information processing, focusing on the use of quadrupole nuclei for quantum computing. Various examples of experimental implementation of logic gates are given and compared to calculated NMR spectra and their respective density matrices. The technique of Quantum State Tomography for quadrupole nuclei is briefly described, and examples of measured density matrices in a two-qubit I = 3/2 spin system are shown. Experimental results of density matrices representing pseudo-Bell states are given, and an analysis of the entropy of theses states is made. Considering an NMR experiment as a depolarization quantum channel we calculate the entanglement fidelity and discuss the criteria for entanglement in liquid state NMR quantum information. A brief discussion on the perspectives for NMR quantum computing is presented at the end. (author)
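
    As a generic textbook companion to the density-matrix analysis described here (not the paper's NMR-specific entanglement-fidelity calculation), one can verify the entanglement of an ideal two-qubit Bell state with the Peres-Horodecki partial-transpose criterion:

```python
# Hedged illustration: build the Bell-state density matrix and check
# entanglement via a negative eigenvalue of the partial transpose.

import numpy as np

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho = np.outer(phi_plus, phi_plus)               # pure-state density matrix

# Partial transpose over the second qubit:
# rho[(a,b),(c,d)] -> rho[(a,d),(c,b)], i.e. swap axes 1 and 3.
rho_pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

eig = np.linalg.eigvalsh(rho_pt)
print(np.round(eig, 3))   # one negative eigenvalue (-0.5) signals entanglement
```

    For the pseudo-Bell states measured in NMR, the same check is applied to the experimentally reconstructed (tomographed) density matrix rather than the ideal one.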

  15. Computational architecture for image processing on a small unmanned ground vehicle

    Science.gov (United States)

    Ho, Sean; Nguyen, Hung

    2010-08-01

    Man-portable Unmanned Ground Vehicles (UGVs) have been fielded on the battlefield with limited computing power. This limitation constrains their use primarily to teleoperation control mode for clearing areas and bomb defusing. In order to extend their capability to include the reconnaissance and surveillance missions of dismounted soldiers, a separate processing payload is desired. This paper presents a processing architecture and the design details on the payload module that enables the PackBot to perform sophisticated, real-time image processing algorithms using data collected from its onboard imaging sensors including LADAR, IMU, visible, IR, stereo, and the Ladybug spherical cameras. The entire payload is constructed from currently available Commercial off-the-shelf (COTS) components including an Intel multi-core CPU and a Nvidia GPU. The result of this work enables a small UGV to perform computationally expensive image processing tasks that once were only feasible on a large workstation.

  16. Diffusive capture processes for information search

    CERN Document Server

    Lee, S; Kim, Y; Lee, Sungmin; Yook, Soon-Hyung; Kim, Yup

    2007-01-01

    We show how effectively the diffusive capture processes (DCP) on complex networks can be applied to information search in the networks. Numerical simulations show that our method generates only 2% of the traffic of the most popular flooding-based query-packet-forwarding (FB) algorithm. We find that the average searching time, $\langle T \rangle$, of our model is more scalable than another well known $n$-random-walker model and comparable to the FB algorithm both on the real Gnutella network and on scale-free networks with $\gamma = 2.4$. We also discuss the possible relationship between $\langle T \rangle$ and $\langle k^2 \rangle$, the second moment of the degree distribution of the networks.
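
    The traffic comparison the abstract reports can be illustrated on a toy graph. The sketch below contrasts flooding's message count with a single random walker; the network, seed, and counting conventions are our own simplifications, not the authors' model:

```python
# Illustrative sketch: compare the message traffic of flooding a query through
# a small network against one random walker, the kind of baseline the
# diffusive-capture approach is measured against. Graph and seed are arbitrary.

import random

random.seed(1)

# Adjacency list of a small hand-made network.
graph = {
    0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 4],
    3: [0, 4, 5], 4: [2, 3, 5], 5: [3, 4],
}

def flooding_traffic(graph):
    """Flooding forwards the query once over every edge in each direction."""
    return sum(len(nbrs) for nbrs in graph.values())

def walker_traffic(graph, source, target, max_hops=10_000):
    """Messages sent by one random walker until it reaches the target."""
    node, hops = source, 0
    while node != target and hops < max_hops:
        node = random.choice(graph[node])
        hops += 1
    return hops

print(flooding_traffic(graph))        # 16 directed messages on this graph
print(walker_traffic(graph, 0, 5))    # hop count for one walker realization
```

    On large networks the gap widens: flooding traffic grows with the number of edges, while walker-based schemes pay per hop, which is what makes the 2%-of-FB-traffic figure plausible.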

  17. Quantum-Information Processing with Semiconductor Macroatoms

    CERN Document Server

    Biolatti, E; Zanardi, P; Rossi, F; Biolatti, Eliana; Iotti, Rita C.; Zanardi, Paolo; Rossi, Fausto

    2000-01-01

    An all optical implementation of quantum information processing with semiconductor macroatoms is proposed. Our quantum hardware consists of an array of semiconductor quantum dots and the computational degrees of freedom are energy-selected interband optical transitions. The proposed quantum-computing strategy exploits exciton-exciton interactions driven by ultrafast sequences of multi-color laser pulses. Contrary to existing proposals based on charge excitations, the present all-optical implementation does not require the application of time-dependent electric fields, thus allowing for a sub-picosecond, i.e. decoherence-free, operation time-scale in realistic state-of-the-art semiconductor nanostructures.

  18. Architecture Knowledge Management: Challenges, Approaches, and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    Capturing the technical knowledge, contextual information, and rationale surrounding the design decisions underpinning system architectures can greatly improve the software development process. If not managed, this critical knowledge is implicitly embedded in the architecture, becoming tacit knowledge which erodes as personnel on the project change. Moreover, the unavailability of architecture knowledge precludes organizations from growing their architectural capabilities. In this tutorial, we highlight the benefits and challenges in managing software architecture knowledge. We discuss various approaches to characterize architecture knowledge based on the requirements of a particular domain. We describe various concepts and approaches to manage the architecture knowledge from both management and technical perspectives. We also demonstrate the utility of captured knowledge to support software architecture activities with a case study covering the use of architecture knowledge management techniques and tools in an industrial project.

  19. Quantum information processing with noisy cluster states

    CERN Document Server

    Tame, M S; Kim, M S; Vedral, V

    2005-01-01

    We provide an analysis of basic quantum information processing protocols under the effect of intrinsic non-idealities in cluster states. These non-idealities are based on the introduction of randomness in the entangling steps that create the cluster state and are motivated by the unavoidable imperfections faced in creating entanglement using condensed-matter systems. Aided by the use of an alternative and very efficient method to construct cluster state configurations, which relies on the concatenation of fundamental cluster structures, we address quantum state transfer and various fundamental gate simulations through noisy cluster states. We find that a winning strategy to limit the effects of noise is the management of small clusters processed via just a few measurements. Our study also reinforces recent ideas related to the optical implementation of a one-way quantum computer.

  20. Perception and information processing

    DEFF Research Database (Denmark)

    Scholderer, Joachim

    2010-01-01

    Consumer researchers are interested in the responses of people to commercial stimuli. Usually, these stimuli are products and services, including all attributes, issues, persons, communications, situations, and behaviours related to them. Perception is the first bottleneck in this process: as consumers, we can only respond to a stimulus if our senses are actually stimulated by it. Psychologically speaking, a stimulus only exists for us once we have formed an internal representation of it. The objective of this chapter is to introduce the systems that are involved in this processing of perceptual information and to characterise the operations they perform. To avoid confusion, it should be stressed that the term "perception" is often used in a colloquial sense in consumer research. In concepts like perceived quality, perceived value, or perceived risk, the modifier "perceived" simply highlights...

  1. Quantum information density scaling and qubit operation time constraints of CMOS silicon-based quantum computer architectures

    Science.gov (United States)

    Rotta, Davide; Sebastiano, Fabio; Charbon, Edoardo; Prati, Enrico

    2017-06-01

    Even the quantum simulation of an apparently simple molecule such as Fe2S2 requires a considerable number of qubits, of the order of 10^6, while more complex molecules such as alanine (C3H7NO2) require about a hundred times more. In order to assess such a multimillion scale of identical qubits and control lines, the silicon platform seems to be one of the most indicated routes as it naturally provides, together with qubit functionalities, the capability of nanometric, serial, and industrial-quality fabrication. The scaling trend of microelectronic devices predicting that computing power would double every 2 years, known as Moore's law, according to the new slope set after the 32-nm node of 2009, suggests that the technology roadmap will achieve the 3-nm manufacturability limit proposed by Kelly around 2020. Today, circuital quantum information processing architectures are predicted to take advantage of the scalability ensured by silicon technology. However, the maximum amount of quantum information per unit surface that can be stored in silicon-based qubits, and the consequent space constraints on qubit operations, have never been addressed so far. This represents one of the key parameters toward the implementation of quantum error correction for fault-tolerant quantum information processing and its dependence on the features of the technology node. The maximum quantum information per unit surface virtually storable and controllable in the compact exchange-only silicon double quantum dot qubit architecture is expressed as a function of the complementary metal-oxide-semiconductor technology node, so the size scale optimizing both physical qubit operation time and quantum error correction requirements is assessed by reviewing the physical and technological constraints. According to the requirements imposed by the quantum error correction method and the constraints given by the typical strength of the exchange coupling, we determine the workable operation frequency
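
    The kind of back-of-envelope scaling question the paper formalizes can be sketched as follows; the qubit footprint factor below is a made-up placeholder, not a value from the paper:

```python
# Back-of-envelope sketch: if one qubit cell occupies roughly k * F^2 of
# silicon at technology node F (in nm), how many physical qubits fit per mm^2?
# The footprint factor k = 1000 is purely illustrative.

def qubits_per_mm2(node_nm, footprint_factor=1000):
    cell_area_nm2 = footprint_factor * node_nm ** 2
    return int(1e12 / cell_area_nm2)   # 1 mm^2 = 1e12 nm^2

for node in (32, 14, 7, 3):
    print(node, qubits_per_mm2(node))  # density grows as the node shrinks
```

    The paper's contribution is to bound this density properly, folding in control-line routing, exchange-coupling strength, and error-correction overhead rather than a single geometric factor.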

  2. An operational information systems architecture for assessing sustainable transportation planning: principles and design.

    Science.gov (United States)

    Borzacchiello, Maria Teresa; Torrieri, Vincenzo; Nijkamp, Peter

    2009-11-01

    This paper offers the description of an integrated information system framework for the assessment of transportation planning and management. After an introductory exposition, in the first part of the paper, a broad overview of international experiences regarding information systems on transportation is given, focusing in particular on the relationship between transportation system's performance monitoring and the decision-making process, and on the importance of this connection in the evaluation and planning process, in Italian and European cases. Next, the methodological design of an information system to support efficient and sustainable transportation planning and management aiming to integrate inputs from several different data sources is presented. The resulting framework deploys modular and integrated databases which include data stemming from different national or regional data banks and which integrate information belonging to different transportation fields. For this reason, it allows public administrations to account for many strategic elements that influence their decisions regarding transportation, both from a systemic and infrastructural point of view.

  3. Real-time hypothesis driven feature extraction on parallel processing architectures

    DEFF Research Database (Denmark)

    Granmo, O.-C.; Jensen, Finn Verner

    2002-01-01

    Feature extraction in content-based indexing of media streams is often computationally intensive. Typically, a parallel processing architecture is necessary for real-time performance when extracting features brute force. On the other hand, Bayesian network based systems for hypothesis driven feature..., rather than one-by-one. Thereby, the advantages of parallel feature extraction can be combined with the advantages of hypothesis driven feature extraction. The technique is based on a sequential backward feature set search and a correlation based feature set evaluation function. In order to reduce...
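
    A sequential backward feature-set search with a correlation-based evaluation function, as named in the abstract, can be sketched as below. The merit function (a CFS-style relevance/redundancy trade-off) and the toy data are illustrative choices, not the authors' exact formulation:

```python
# Sketch: start from all features and repeatedly drop the feature whose
# removal most improves a correlation-based merit score.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def merit(features, target, subset):
    """Mean |feature-target correlation|, penalized by feature redundancy."""
    k = len(subset)
    if k == 0:
        return 0.0
    rcf = sum(abs(pearson(features[i], target)) for i in subset) / k
    rff = (sum(abs(pearson(features[i], features[j]))
               for i in subset for j in subset if i < j) / (k * (k - 1) / 2)
           if k > 1 else 0.0)
    return k * rcf / (k + k * (k - 1) * rff) ** 0.5

def backward_search(features, target):
    subset = list(range(len(features)))
    best = merit(features, target, subset)
    improved = True
    while improved and len(subset) > 1:
        improved = False
        for f in list(subset):
            trial = [g for g in subset if g != f]
            m = merit(features, target, trial)
            if m > best:
                subset, best, improved = trial, m, True
                break
    return subset, best

# Toy data: f0 tracks the target, f1 duplicates f0 (redundant), f2 is noise.
target = [1, 2, 3, 4, 5, 6]
features = [[1, 2, 3, 4, 5, 6], [2, 4, 6, 8, 10, 12], [5, 1, 4, 2, 6, 3]]
subset, score = backward_search(features, target)
print(subset)   # → [0, 1]: the noise feature f2 is dropped
```

    Evaluating candidate feature *sets* like this, rather than one feature at a time, is what lets the authors schedule whole groups of features for parallel extraction.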

  4. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2014-07-01

    Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation as a new paradigm that replaces mass-production, addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture, it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  5. Proposed Information Sharing Security Approach for Security Personnels, Vertical Integration, Semantic Interoperability Architecture and Framework for Digital Government

    CERN Document Server

    Headayetullah, Md; Biswas, Sanjay; Puthal, B

    2011-01-01

    This paper mainly depicts a conceptual overview of vertical integration and semantic interoperability architecture, such as the Educational Sector Architectural Framework (ESAF) for the New Zealand government, and different interoperability framework solutions for digital government. In this paper, we try to develop a secure information sharing approach for digital government to improve homeland security. This approach is a role- and cooperation-based approach for security personnel of different government departments. In order to run any successful digital government in the world, it is necessary to interact with citizens and to share secure information via different networks among citizens or other governments. Consequently, in order to make it easy for users to cooperate and share information seamlessly across different networks and databases universally, a safe and trusted information-sharing environment has been recognized as a very important requirement and t...

  6. An image based information system - Architecture for correlating satellite and topological data bases

    Science.gov (United States)

    Bryant, N. A.; Zobrist, A. L.

    1978-01-01

    The paper describes the development of an image based information system and its use to process a Landsat thematic map showing land use or land cover in conjunction with a census tract polygon file to produce a tabulation of land use acreages per census tract. The system permits the efficient cross-tabulation of two or more geo-coded data sets, thereby setting the stage for the practical implementation of models of diffusion processes or cellular transformation. Characteristics of geographic information systems are considered, and functional requirements, such as data management, geocoding, image data management, and data analysis are discussed. The system is described, and the potentialities of its use are examined.
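
    The cross-tabulation at the heart of such a system (land-use acreage per census tract) reduces to counting co-occurrences across two geo-coded layers. A toy sketch with made-up pixel data, counting pixels rather than acres:

```python
# Tiny cross-tabulation sketch: overlay a per-pixel land-cover layer with a
# per-pixel tract-ID layer and tally each cover class per tract. Data are
# invented for the demo; a real system would read co-registered rasters.

from collections import Counter

land_cover = ["urban", "forest", "urban", "water", "forest", "forest"]
tract_id   = [101,      101,      102,     102,     102,      101]

tab = Counter(zip(tract_id, land_cover))
for (tract, cover), pixels in sorted(tab.items()):
    print(tract, cover, pixels)   # e.g. tract 101 contains 2 forest pixels
```

    Converting pixel counts to acreage is then a single multiplication by the per-pixel ground area of the satellite imagery.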

  7. Model-based system-of-systems engineering for space-based command, control, communication, and information architecture design

    Science.gov (United States)

    Sindiy, Oleg V.

    This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in space-based operations domain. These questions motivate and guide the dissertation's contributions. A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and

  8. An information theoretic approach for combining neural network process models.

    Science.gov (United States)

    Sridhar, D V.; Bartlett, E B.; Seagrave, R C.

    1999-07-01

    Typically neural network modelers in chemical engineering focus on identifying and using a single, hopefully optimal, neural network model. Using a single optimal model implicitly assumes that one neural network model can extract all the information available in a given data set and that the other candidate models are redundant. In general, there is no assurance that any individual model has extracted all relevant information from the data set. Recently, Wolpert (Neural Networks, 5(2), 241 (1992)) proposed the idea of stacked generalization to combine multiple models. Sridhar, Seagrave and Bartlett (AIChE J., 42, 2529 (1996)) implemented stacked generalization for neural network models by integrating multiple neural networks into an architecture known as stacked neural networks (SNNs). SNNs consist of a combination of the candidate neural networks and were shown to provide improved modeling of chemical processes. However, in Sridhar's work SNNs were limited to using a linear combination of artificial neural networks. While a linear combination is simple and easy to use, it can utilize only those model outputs that have a high linear correlation to the output. Models that are useful in a nonlinear sense are wasted if a linear combination is used. In this work we propose an information theoretic stacking (ITS) algorithm for combining neural network models. The ITS algorithm identifies and combines useful models regardless of the nature of their relationship to the actual output. The power of the ITS algorithm is demonstrated through three examples, including application to a dynamic process modeling problem. The results obtained demonstrate that SNNs developed using the ITS algorithm can achieve highly improved performance compared to selecting and using a single, hopefully optimal, network or using SNNs based on a linear combination of neural networks.
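    The gist of the approach, scoring candidate models by how much information their outputs carry about the target, discarding uninformative ones, and then stacking the rest, can be sketched as below. The toy "models", the histogram mutual-information estimator, and the 0.2 nat threshold are illustrative assumptions, not details from the paper, and the final least-squares stack is the simple linear SNN step rather than the paper's ITS combiner:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 500)
    y = np.sin(3 * x)  # target process output

    # Candidate "networks" (stand-ins for trained models)
    preds = {
        "linear": 0.9 * x,                  # informative about y
        "cubic":  x ** 3,                   # informative, nonlinearly
        "noise":  rng.normal(0, 1, 500),    # carries no information about y
    }

    def mutual_info(a, b, bins=8):
        """Histogram estimate of mutual information I(a; b) in nats."""
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    # Keep only models whose outputs are informative about y
    mi = {name: mutual_info(p, y) for name, p in preds.items()}
    kept = [name for name, v in mi.items() if v > 0.2]

    # Stack the retained models (here with least squares, the linear SNN step)
    A = np.column_stack([preds[n] for n in kept] + [np.ones_like(y)])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    stacked = A @ w
    ```

    Because the information score is estimated before any combination is chosen, a model that is only nonlinearly related to the output still survives the selection step, which is the property the linear-correlation filter of the original SNNs lacks.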

  9. An agent-based service-oriented integration architecture for chemical process automation

    Institute of Scientific and Technical Information of China (English)

    Na Luo; Weimin Zhong; Feng Wan; Zhencheng Ye; Feng Qian

    2015-01-01

    In reality, a traditional process control system built upon centralized and hierarchical structures responds weakly to change and can be shut down by a single failure. Aiming at these problems, a new agent-based service-oriented integration architecture was proposed for chemical process automation systems. Web services were dynamically orchestrated on the internet and agent behaviors were built into them. Data analysis, modeling, optimization, control, fault diagnosis and so on were encapsulated into different web services. Agents were used for service composition by negotiation. A prototype system for poly(ethylene terephthalate) process automation was used as the case study to demonstrate the validity of the integration.

  10. A Three Tier Architecture Applied to LiDAR Processing and Monitoring

    Directory of Open Access Journals (Sweden)

    Efrat Jaeger-Frank

    2006-01-01

    Full Text Available Emerging Grid technologies enable solving scientific problems that involve large datasets and complex analyses, which in the past were often considered difficult to solve. Coordinating distributed Grid resources and computational processes requires adaptable interfaces and tools that provide modularized and configurable environments for accessing Grid clusters and executing high performance computational tasks. Computationally intensive processes are also subject to a high risk of component failures and thus require close monitoring. In this paper we describe a scientific workflow approach to coordinate various resources via data analysis pipelines. We present a three tier architecture for LiDAR interpolation and analysis, the high-performance processing of point-intensive datasets, utilizing a portal, a scientific workflow engine, and Grid technologies. Our proposed solution is available to the community in a unified framework through a shared cyberinfrastructure, the GEON portal, enabling scientists to focus on their scientific work and not be concerned with the implementation of the underlying infrastructure.

  11. Serving database information using a flexible server in a three tier architecture

    Energy Technology Data Exchange (ETDEWEB)

    Lee Lueking et al.

    2003-08-11

    The D0 experiment at Fermilab relies on a central Oracle database for storing all detector calibration information. Access to this data is needed by hundreds of physics applications distributed worldwide. In order to meet the demands of these applications from scarce resources, we have created a distributed system that isolates the user applications from the database facilities. This system, known as the Database Application Network (DAN) operates as the middle tier in a three tier architecture. A DAN server employs a hierarchical caching scheme and database connection management facility that limits access to the database resource. The modular design allows for caching strategies and database access components to be determined by runtime configuration. To solve scalability problems, a proxy database component allows for DAN servers to be arranged in a hierarchy. Also included is an event based monitoring system that is currently being used to collect statistics for performance analysis and problem diagnosis. DAN servers are currently implemented as a Python multithreaded program using CORBA for network communications and interface specification. The requirement details, design, and implementation of DAN are discussed along with operational experience and future plans.
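    The middle-tier idea, a local cache consulted first, then an optional parent DAN server, with the database touched only at the root of the hierarchy, can be sketched as follows. The class and key names are hypothetical illustrations, not the actual DAN (CORBA/Python) interfaces:

    ```python
    class Database:
        """Stand-in for the central Oracle calibration database."""
        def __init__(self):
            self.hits = 0

        def query(self, key):
            self.hits += 1
            return f"calibration[{key}]"

    class DanServer:
        """Middle-tier server: local cache first, parent proxy next, DB last."""
        def __init__(self, db, parent=None):
            self.db = db
            self.parent = parent
            self.cache = {}

        def get(self, key):
            if key in self.cache:            # 1. serve from local cache
                return self.cache[key]
            if self.parent is not None:      # 2. proxy to a parent DAN server
                value = self.parent.get(key)
            else:                            # 3. only the root touches the DB
                value = self.db.query(key)
            self.cache[key] = value
            return value

    db = Database()
    root = DanServer(db)
    leaf_a = DanServer(db, parent=root)
    leaf_b = DanServer(db, parent=root)

    leaf_a.get("pedestal/run42")   # one DB query; cached at root and leaf_a
    leaf_b.get("pedestal/run42")   # served from root's cache; no new DB hit
    ```

    Arranging servers in a hierarchy this way is what lets the scarce database resource see one query where hundreds of distributed clients made one each.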

  12. AN EFFICIENT 3-DIMENSIONAL DISCRETE WAVELET TRANSFORM ARCHITECTURE FOR VIDEO PROCESSING APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Ganapathi Hegde; Pukhraj Vaya

    2012-01-01

    This paper presents an optimized 3-D Discrete Wavelet Transform (3-D DWT) architecture. The 1-D DWT employed for the design of the 3-D DWT architecture uses a reduced lifting scheme approach. Further, the architecture is optimized by applying a block enabling technique, scaling, and rounding of the filter coefficients. The proposed architecture uses the biorthogonal (9/7) wavelet filter. The architecture is modeled using Verilog HDL, simulated using ModelSim, synthesized using Xilinx ISE, and finally implemented on a Virtex-5 FPGA. The proposed 3-D DWT architecture has a slice register utilization of 5%, an operating frequency of 396 MHz, and a power consumption of 0.45 W.
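    The lifting scheme the architecture builds on factors a wavelet filter into split, predict, and update steps, which is what makes the hardware reduction possible. The biorthogonal 9/7 filter uses four lifting steps plus scaling; the sketch below shows the same pattern for the simplest case, the Haar wavelet, purely as an illustration rather than the paper's filter:

    ```python
    def haar_lifting_forward(x):
        """One level of the Haar DWT via lifting: split, predict, update."""
        even, odd = x[0::2], x[1::2]                        # split (lazy wavelet)
        detail = [o - e for o, e in zip(odd, even)]         # predict odd from even
        approx = [e + d / 2 for e, d in zip(even, detail)]  # update: keep the mean
        return approx, detail

    def haar_lifting_inverse(approx, detail):
        """Invert by running the lifting steps backwards with flipped signs."""
        even = [a - d / 2 for a, d in zip(approx, detail)]
        odd = [d + e for d, e in zip(detail, even)]
        x = [0.0] * (2 * len(even))
        x[0::2], x[1::2] = even, odd                        # merge
        return x

    signal = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
    a, d = haar_lifting_forward(signal)
    assert haar_lifting_inverse(a, d) == signal             # perfect reconstruction
    ```

    Because each lifting step is trivially invertible, perfect reconstruction holds regardless of how the predict/update coefficients are rounded, which is why coefficient rounding is a safe hardware optimization.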

  13. Parallel processing architecture for H.264 deblocking filter on multi-core platforms

    Science.gov (United States)

    Prasad, Durga P.; Sonachalam, Sekar; Kunchamwar, Mangesh K.; Gunupudi, Nageswara Rao

    2012-03-01

    This work examines parallel processing architectures for the H.264 deblocking filter on multi-core platforms such as HyperX technology. Parallel techniques such as the parallel processing of independent macroblocks, sub-blocks, and pixel rows are examined. The deblocking architecture consists of a basic cell called the deblocking filter unit (DFU) and a dependent data buffer manager (DFM). The DFU can be instantiated multiple times, catering to different performance needs; the DFM serves the data required by the different DFUs and also manages all the neighboring data required for future data processing by the DFUs. This approach achieves the scalability, flexibility, and performance excellence required in deblocking filters.
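    The left/top neighbor dependency that constrains macroblock-level parallelism in deblocking can be made concrete with a wavefront schedule, one standard way of grouping independent macroblocks; the sketch below is illustrative and is not the paper's DFU/DFM design:

    ```python
    def wavefront_schedule(width_mb, height_mb):
        """Group macroblocks into waves. A macroblock at (x, y) depends on its
        left (x-1, y) and top (x, y-1) neighbors, both of which lie on the
        previous anti-diagonal, so all MBs in one wave can run in parallel."""
        waves = {}
        for y in range(height_mb):
            for x in range(width_mb):
                waves.setdefault(x + y, []).append((x, y))  # wave = anti-diagonal
        return [waves[k] for k in sorted(waves)]

    # A 4x3-macroblock frame yields 6 waves; wave sizes grow, then shrink
    schedule = wavefront_schedule(4, 3)
    ```

    The maximum wave size bounds how many filter units (DFUs, in the paper's terms) can be kept busy at once, which is the scalability knob such architectures expose.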

  14. Quantum Information Processing using Nonlinear Optical Effects

    DEFF Research Database (Denmark)

    Andersen, Lasse Mejling

    This PhD thesis treats applications of nonlinear optical effects for quantum information processing. The two main applications are four-wave mixing in the form of Bragg scattering (BS) for quantum-state-preserving frequency conversion, and sum-frequency generation (SFG) in second-order nonlinear... of the converted idler depends on the other pump. This allows for temporal-mode multiplexing. When the effects of nonlinear phase modulation (NPM) are included, the phases of the natural input and output modes are changed, reducing the separability. These effects are to some degree mediated by pre... to obtain a 100% conversion efficiency is to use multiple stages of frequency conversion, but this setup suffers from the combined effects of NPM. This problem is circumvented by using asymmetrically pumped BS, where one pump is continuous wave. For this setup, NPM is found to only lead to a linear phase...

  15. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions

    Science.gov (United States)

    Danaher, Brian G.; Brendryen, Håvar; Seeley, John R.; Tyler, Milagra S.; Woolley, Tim

    2015-01-01

    mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture. PMID:25750862

  16. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions

    Directory of Open Access Journals (Sweden)

    Brian G. Danaher

    2015-03-01

    Full Text Available mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be able to better achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  17. Preattentive Processing of Numerical Visual Information.

    Science.gov (United States)

    Hesse, Philipp N; Schmitt, Constanze; Klingenhoefer, Steffen; Bremmer, Frank

    2017-01-01

    Humans can perceive and estimate approximate numerical information, even when accurate counting is impossible, e.g., due to short presentation time. If the number of objects to be estimated is small, typically around 1-4 items, observers are able to give very fast and precise judgments with high confidence, an effect that is called subitizing. Due to its speed and effortless nature, subitizing has usually been assumed to be preattentive, putting it into the same category as other low-level visual features like color or orientation. More recently, however, a number of studies have suggested that subitizing might be dependent on attentional resources. In our current study we investigated the potentially preattentive nature of visual numerical perception in the subitizing range by means of EEG. We presented peripheral, task-irrelevant sequences of stimuli consisting of a certain number of circular patches while participants were engaged in a demanding, non-numerical detection task at the fixation point, drawing attention away from the number stimuli. Within a sequence of stimuli of a given number of patches (called "standards") we interspersed some stimuli of different numerosity ("oddballs"). We compared the evoked responses to visually identical stimuli that had been presented in two different conditions, serving as standard in one condition and as oddball in the other. We found significant visual mismatch negativity (vMMN) responses over parieto-occipital electrodes. In addition to the event-related potential (ERP) analysis, we performed a time-frequency analysis (TFA) to investigate whether the vMMN was accompanied by additional oscillatory processes. We found a concurrent increase in evoked theta power of similar strength over both hemispheres. Our results provide clear evidence for a preattentive processing of numerical visual information in the subitizing range.

  18. Quantum information processing with optical vortices

    Energy Technology Data Exchange (ETDEWEB)

    Khoury, Antonio Z. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)

    2012-07-01

    Full text: In this work we discuss several proposals for quantum information processing using the transverse structure of paraxial beams. Different techniques for production and manipulation of optical vortices have been employed and combined with polarization transformations in order to investigate fundamental properties of quantum entanglement as well as to propose new tools for quantum information processing. As an example, we have recently proposed and demonstrated a controlled NOT (CNOT) gate based on a Michelson interferometer in which the photon polarization is the control bit and the first order transverse mode is the target. The device is based on a single lens design for an astigmatic mode converter that transforms the transverse mode of paraxial optical beams. In analogy with Bell's inequality for two-qubit quantum states, we propose an inequality criterion for the non-separability of the spin-orbit degrees of freedom of a laser beam. A definition of separable and non-separable spin-orbit modes is used in consonance with the one presented in Phys. Rev. Lett. 99, 2007. As the usual Bell's inequality can be violated for entangled two-qubit quantum states, we show both theoretically and experimentally that the proposed spin-orbit inequality criterion can be violated for non-separable modes. The inequality is discussed both in the classical and quantum domains. We propose a polarization to orbital angular momentum teleportation scheme using entangled photon pairs generated by spontaneous parametric down conversion. By making a joint detection of the polarization and angular momentum parity of a single photon, we are able to detect all the Bell-states and perform, in principle, perfect teleportation from a discrete to a continuous system using minimal resources. The proposed protocol implementation demands experimental resources that are currently available in quantum optics laboratories. (author)

  19. Natural language processing and advanced information management

    Science.gov (United States)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object-oriented database (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system, and to use them effectively.

  20. Information Support of Processes in Warehouse Logistics

    Directory of Open Access Journals (Sweden)

    Gordei Kirill

    2013-11-01

    Full Text Available Under globalization and worldwide economic ties, the role of information support for business processes is increasing across various branches and fields of activity, and warehouse activity is no exception. Such information support is realized in warehouse logistics systems. In relation to a territorial administrative entity, the warehouse logistics system takes the form of a complex social and economic structure that controls the economic flows of the intermediary, trade, and transport organizations and of enterprises in other branches and spheres. The spatial movement of inventory items places new demands on the participants of merchandising. Warehousing, in the sense of storage, is one of the operations that constitutes logistics activity and a requirement for organizing a material flow. Treating warehousing as the "management of the spatial movement of stocks" is therefore justified. Understood this way, warehousing sheds its perception as the mere holding of stocks, a business expense. This aspiration is reflected in logistics systems working on the principles of "just in time", "lean production", and others. The role of warehouses as places of storage is thus transformed into an understanding of warehousing as an innovative logistics system.