WorldWideScience

Sample records for biosurveillance event-driven architecture

  1. The column architecture -- A novel architecture for event driven 2D pixel imagers

    International Nuclear Information System (INIS)

    Millaud, J.; Nygren, D.

    1996-01-01

The authors describe an electronic architecture for two-dimensional pixel arrays that permits very large increases in rate capability for event- or data-driven applications relative to conventional x-y architectures. The column architecture also permits more efficient use of silicon area in applications requiring local buffering and frameless data acquisition, and it entirely avoids the ambiguities that may arise in conventional approaches. Two examples of active implementation are described: high energy physics and protein crystallography.

  2. Staged Event-Driven Architecture As A Micro-Architecture Of Distributed And Pluginable Crawling Platform

    Directory of Open Access Journals (Sweden)

    Leszek Siwik

    2013-01-01

Full Text Available There are many crawling systems available on the market, but they are rather closed systems dedicated to performing particular kinds and classes of tasks with a predefined scope, strategy, etc. In real life, however, there are meaningful groups of users (e.g. marketing, criminal, or governmental analysts) requiring not just yet another crawling system dedicated to predefined tasks. They need an easy-to-use, user-friendly, all-in-one studio not only for executing and running internet robots and crawlers, but also for graphically (re)defining and (re)composing crawlers according to dynamically changing requirements and use-cases. To realize the above-mentioned idea, the Cassiopeia framework has been designed and developed. One has to remember, however, that the enormous size and structural complexity of the WWW are the reasons why, from a technical and architectural point of view, developing effective internet robots – and even more so a framework supporting graphical robot composition – becomes a really challenging task. The crucial aspect for crawling efficiency and scalability is the concurrency model applied. There are two typical concurrency management models: classical concurrency based on pools of threads and processes, and event-driven concurrency. Neither is an ideal approach, which is why research on alternative models is still conducted to propose an efficient and convenient architecture for concurrent and distributed applications. One promising model is the staged event-driven architecture, which mixes to some extent both of the above-mentioned classical approaches and provides additional benefits, such as splitting the application into separate stages connected by event queues – which is interesting given the requirements for crawler (re)composition. The goal of this paper is to present the idea and the PoC implementation of the Cassiopeia framework, with the special
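The staged event-driven architecture described above can be sketched in a few lines: independent stages, each with its own event queue and worker threads, chained into a pipeline. The stage names and handlers below are illustrative stand-ins, not Cassiopeia's actual API.

```python
import queue
import threading

class Stage:
    """A SEDA stage: an event handler fed by its own queue and worker threads."""
    def __init__(self, name, handler, workers=2):
        self.name = name
        self.handler = handler
        self.inbox = queue.Queue()
        self.next_stage = None
        self.threads = [threading.Thread(target=self._run, daemon=True)
                        for _ in range(workers)]

    def _run(self):
        while True:
            event = self.inbox.get()
            result = self.handler(event)
            # Forward the handler's output to the next stage, if any.
            if self.next_stage is not None and result is not None:
                self.next_stage.inbox.put(result)
            self.inbox.task_done()

    def start(self):
        for t in self.threads:
            t.start()

# Compose a toy two-stage crawler pipeline: URLs -> fetched pages -> processed pages.
parsed = []
parse = Stage("parse", lambda page: parsed.append(page.upper()))
fetch = Stage("fetch", lambda url: f"<html>{url}</html>")  # fake fetch
fetch.next_stage = parse
for s in (fetch, parse):
    s.start()
for url in ("a.example", "b.example"):
    fetch.inbox.put(url)
fetch.inbox.join()   # wait until all fetch events are handled
parse.inbox.join()   # then until all forwarded parse events are handled
```

Because stages are decoupled by queues, a stage can be replaced or rewired without touching its neighbours, which is exactly the property the paper leans on for crawler (re)composition.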

  3. System on chip module configured for event-driven architecture

    Science.gov (United States)

    Robbins, Kevin; Brady, Charles E.; Ashlock, Tad A.

    2017-10-17

A system on chip (SoC) module is described herein, wherein the SoC module comprises a processor subsystem and a hardware logic subsystem. The two subsystems are in communication with one another and transmit event messages between one another. The processor subsystem executes software actors, while the hardware logic subsystem includes hardware actors. The software actors and hardware actors conform to an event-driven architecture, such that both the software actors and the hardware actors receive and generate event messages.
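The actor-based event flow in this abstract can be modeled in software as actors that consume event messages and may emit new ones, with a dispatcher routing messages between them. The actor classes and event fields below are hypothetical stand-ins, not the SoC's actual interfaces.

```python
from collections import deque

class Actor:
    """Base actor: consumes an event message and may emit a new one."""
    def handle(self, event):
        raise NotImplementedError

class SensorActor(Actor):          # stands in for a hardware-logic actor
    def handle(self, event):
        if event["type"] == "sample_request":
            return {"type": "sample", "value": event["raw"] * 2}

class LoggerActor(Actor):          # stands in for a software actor
    def __init__(self):
        self.log = []
    def handle(self, event):
        if event["type"] == "sample":
            self.log.append(event["value"])

def dispatch(actors, initial_events):
    """Deliver each event to every actor; queue any events they emit."""
    pending = deque(initial_events)
    while pending:
        event = pending.popleft()
        for actor in actors:
            out = actor.handle(event)
            if out is not None:
                pending.append(out)

logger = LoggerActor()
dispatch([SensorActor(), logger],
         [{"type": "sample_request", "raw": 21}])
```

The point of the pattern is that software and hardware actors share one contract (receive event, optionally emit event), so either side of the boundary can be swapped out without changing the other.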

  4. An Overview of Internet biosurveillance

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, David M.; Nelson, Noele P.; Arthur, Ray; Barboza, P.; Collier, Nigel; Lightfoot, Nigel; Linge, J. P.; van der Goot, E.; Mawudeku, A.; Madoff, Lawrence; Vaillant, L.; Walters, Ronald A.; Yangarber, Roman; Mantero, Jas; Corley, Courtney D.; Brownstein, John S.

    2013-06-21

Internet biosurveillance utilizes unstructured data from diverse Web-based sources to provide early warning and situational awareness of public health threats. The scope of source coverage ranges from locally based media in the vernacular to international media in widely read languages. Internet biosurveillance is a timely modality available to government and public health officials, health care workers, and the public and private sector, serving as a real-time complementary approach to traditional indicator-based public health disease surveillance methods. Internet biosurveillance also supports the broader activity of epidemic intelligence. This review covers the current state of the field of Internet biosurveillance and provides a perspective on the future of the field.

  5. Toward Integrated DoD Biosurveillance: Assessment and Opportunities.

    Science.gov (United States)

    Moore, Melinda; Fisher, Gail; Stevens, Clare

    2014-01-01

In the context of the 2012 National Strategy for Biosurveillance, the Office of Management and Budget (OMB) asked the Department of Defense (DoD) to review its biosurveillance programs, prioritize missions and desired outcomes, evaluate how DoD programs contribute to these, and assess the appropriateness and stability of the department's funding system for biosurveillance. DoD sought external analytic support through the RAND Arroyo Center. In response to the questions posed by the OMB request, this study finds the following: (1) Current DoD biosurveillance supports three strategic missions. Based mostly on existing statute, the highest-priority mission is force health protection, followed by biological weapons defense and global health security. (2) Guidance issued by the White House on June 27, 2013, specified priorities for planning fiscal year 2015 budgets; it includes an explicit global health security priority, which strengthens the case for this as a key DoD biosurveillance strategic mission. (3) DoD biosurveillance also supports four desired outcomes: early warning and early detection, situational awareness, better decision making at all levels, and forecast of impacts. (4) Programs and measures that address priority missions-force health protection in particular-and desired outcomes should be prioritized over those that do not do so. (5) More near-real-time analysis and better internal and external integration could enhance the performance and value of the biosurveillance enterprise. (6) Improvements are needed in key enablers, including explicit doctrine/policy, efficient organization and governance, and increased staffing and improved facilities for the Armed Forces Health Surveillance Center (AFHSC). (7) AFHSC has requested additional funding to fully implement its current responsibilities under the 2012 Memorandum of Understanding between the Assistant Secretaries of Defense for Health Affairs and for Nuclear, Chemical, and Biological Defense Programs.

  6. Economics-driven software architecture

    CERN Document Server

    Mistrik, Ivan; Kazman, Rick; Zhang, Yuanyuan

    2014-01-01

    Economics-driven Software Architecture presents a guide for engineers and architects who need to understand the economic impact of architecture design decisions: the long term and strategic viability, cost-effectiveness, and sustainability of applications and systems. Economics-driven software development can increase quality, productivity, and profitability, but comprehensive knowledge is needed to understand the architectural challenges involved in dealing with the development of large, architecturally challenging systems in an economic way. This book covers how to apply economic consider

  7. Biosurveillance in Central Asia: Successes and Challenges of Tick-Borne Disease Research in Kazakhstan and Kyrgyzstan.

    Science.gov (United States)

    Hay, John; Yeh, Kenneth B; Dasgupta, Debanjana; Shapieva, Zhanna; Omasheva, Gulnara; Deryabin, Pavel; Nurmakhanov, Talgat; Ayazbayev, Timur; Andryushchenko, Alexei; Zhunushov, Asankadyr; Hewson, Roger; Farris, Christina M; Richards, Allen L

    2016-01-01

Central Asia is a vast geographic region that includes five former Soviet Union republics: Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan, and Uzbekistan. The region has a unique infectious disease burden, and a history that includes Silk Road trade routes and networks that were part of the anti-plague and biowarfare programs in the former Soviet Union. Post-Soviet Union biosurveillance research in this unique area of the world has met with several challenges, including lack of funding and resources to independently conduct hypothesis-driven, peer-review-quality research. Strides have been made, however, to increase scientific engagement and capability. Kazakhstan and Kyrgyzstan are examples of countries where biosurveillance research has been successfully conducted, particularly with respect to especially dangerous pathogens. In this review, we describe in detail the successes, challenges, and opportunities of conducting biosurveillance in Central Asia as exemplified by our recent research activities on ticks and tick-borne diseases in Kazakhstan and Kyrgyzstan.

  8. Event management for large scale event-driven digital hardware spiking neural networks.

    Science.gov (United States)

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
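The ordering behaviour that the structured heap queue implements in hardware can be modeled in software with an ordinary binary heap keyed by event timestamp, as in this minimal sketch (the pipelining that gives the hardware its constant processing time is not modeled here).

```python
import heapq

class EventQueue:
    """Timestamp-ordered event queue for an event-driven SNN simulation.

    Models only the ordering contract of the structured heap queue:
    O(log n) insert and extract-min on (timestamp, neuron) events.
    """
    def __init__(self):
        self._heap = []
        self._seq = 0          # tie-breaker so equal timestamps stay FIFO

    def push(self, timestamp, neuron_id):
        heapq.heappush(self._heap, (timestamp, self._seq, neuron_id))
        self._seq += 1

    def pop(self):
        timestamp, _, neuron_id = heapq.heappop(self._heap)
        return timestamp, neuron_id

q = EventQueue()
for t, n in [(5, "n2"), (1, "n0"), (3, "n1")]:
    q.push(t, n)
order = [q.pop() for _ in range(3)]   # events leave in timestamp order
```

The hardware contribution of the paper is precisely that this heap discipline is kept while insertions and extractions are pipelined, so event throughput no longer degrades as the queue grows.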

  9. NEBULAS A High Performance Data-Driven Event-Building Architecture based on an Asynchronous Self-Routing Packet-Switching Network

    CERN Multimedia

    Costa, M; Letheren, M; Djidi, K; Gustafsson, L; Lazraq, T; Minerskjold, M; Tenhunen, H; Manabe, A; Nomachi, M; Watase, Y

    2002-01-01

RD31 : The project is evaluating a new approach to event building for level-two and level-three processor farms at high rate experiments. It is based on the use of commercial switching fabrics to replace the traditional bus-based architectures used in most previous data acquisition systems. Switching fabrics permit the construction of parallel, expandable, hardware-driven event builders that can deliver higher aggregate throughput than the bus-based architectures. A standard industrial switching fabric technology is being evaluated. It is based on Asynchronous Transfer Mode (ATM) packet-switching network technology. Commercial, expandable ATM switching fabrics and processor interfaces, now being developed for the future Broadband ISDN infrastructure, could form the basis of an implementation. The goals of the project are to demonstrate the viability of this approach, to evaluate the trade-offs involved in make versus buy options, to study the interfacing of the physics frontend data buffers to such a fabric, a...

  10. Data-driven architectural production and operation

    NARCIS (Netherlands)

    Bier, H.H.; Mostafavi, S.

    2014-01-01

Data-driven architectural production and operation as explored within Hyperbody rely heavily on systems thinking, implying that all parts of a system are to be understood in relation to each other. These relations are increasingly established bi-directionally so that data-driven architecture is not

  11. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2014-07-01

Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation, as a new paradigm that replaces mass-production, addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture; it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  12. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2010-06-01

Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation, as a new paradigm that replaces mass-production, addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture; it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  13. Data-driven architectural design to production and operation

    NARCIS (Netherlands)

    Bier, H.H.; Mostafavi, S.

    2015-01-01

Data-driven architectural production and operation explored within Hyperbody rely heavily on systems thinking, implying that all parts of a system are to be understood in relation to each other. These relations are established bi-directionally so that data-driven architecture is not only produced

  14. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Directory of Open Access Journals (Sweden)

    Nicholas Generous

Full Text Available The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
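The core of a Multi-Attribute Utility Theory ranking is an additive weighted utility over criterion scores. The criteria, weights, and data-stream scores below are invented for illustration; they are not values from the paper.

```python
# Hypothetical criteria weights (must sum to 1) and per-stream scores in [0, 1].
weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}

streams = {
    "clinic_reports": {"timeliness": 0.6, "coverage": 0.9, "cost": 0.5},
    "social_media":   {"timeliness": 0.9, "coverage": 0.5, "cost": 0.8},
}

def utility(scores):
    """Additive multi-attribute utility: weighted sum of criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidate data streams by overall utility, highest first.
ranked = sorted(streams, key=lambda s: utility(streams[s]), reverse=True)
```

In practice the hard part is the one this abstract emphasizes: eliciting the criteria and weights from stakeholders so the weighted sum reflects what "essential" actually means for the biosurveillance mission.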

  15. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Science.gov (United States)

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.

  16. Notification Event Architecture for Traveler Screening: Predictive Traveler Screening Using Event Driven Business Process Management

    Science.gov (United States)

    Lynch, John Kenneth

    2013-01-01

    Using an exploratory model of the 9/11 terrorists, this research investigates the linkages between Event Driven Business Process Management (edBPM) and decision making. Although the literature on the role of technology in efficient and effective decision making is extensive, research has yet to quantify the benefit of using edBPM to aid the…

  17. Assessment of Wearable Sensor Technologies for Biosurveillance

    Science.gov (United States)

    2014-11-01

Topics include textile-based wearable sensors, epidermal tattoos, DNA and protein sensors, forensic detection of explosives, and remote environmental sensing. Contributor: David L. Hirschberg, PhD, Assistant Professor, Clinical Pathology.

  18. ORBiT: Oak Ridge biosurveillance toolkit for public health dynamics.

    Science.gov (United States)

    Ramanathan, Arvind; Pullum, Laura L; Hobson, Tanner C; Steed, Chad A; Quinn, Shannon P; Chennubhotla, Chakra S; Valkova, Silvia

    2015-01-01

    The digitization of health-related information through electronic health records (EHR) and electronic healthcare reimbursement claims and the continued growth of self-reported health information through social media provides both tremendous opportunities and challenges in developing effective biosurveillance tools. With novel emerging infectious diseases being reported across different parts of the world, there is a need to build systems that can track, monitor and report such events in a timely manner. Further, it is also important to identify susceptible geographic regions and populations where emerging diseases may have a significant impact. In this paper, we present an overview of Oak Ridge Biosurveillance Toolkit (ORBiT), which we have developed specifically to address data analytic challenges in the realm of public health surveillance. In particular, ORBiT provides an extensible environment to pull together diverse, large-scale datasets and analyze them to identify spatial and temporal patterns for various biosurveillance-related tasks. We demonstrate the utility of ORBiT in automatically extracting a small number of spatial and temporal patterns during the 2009-2010 pandemic H1N1 flu season using claims data. These patterns provide quantitative insights into the dynamics of how the pandemic flu spread across different parts of the country. We discovered that the claims data exhibits multi-scale patterns from which we could identify a small number of states in the United States (US) that act as "bridge regions" contributing to one or more specific influenza spread patterns. Similar to previous studies, the patterns show that the south-eastern regions of the US were widely affected by the H1N1 flu pandemic. Several of these south-eastern states act as bridge regions, which connect the north-east and central US in terms of flu occurrences. These quantitative insights show how the claims data combined with novel analytical techniques can provide important
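One standard way to pull a small number of spatial-temporal patterns out of a region-by-week case-count matrix, as ORBiT does with claims data, is non-negative matrix factorization. The sketch below uses synthetic data and a plain multiplicative-update NMF in numpy; it illustrates the idea of decomposing counts into a few additive spread patterns, not ORBiT's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic region-by-week case counts built from 2 latent spread patterns.
true_W = rng.random((10, 2))          # 10 regions x 2 patterns
true_H = rng.random((2, 30))          # 2 patterns x 30 weeks
V = true_W @ true_H

def nmf(V, k, iters=500):
    """Multiplicative-update NMF: V ~ W @ H with non-negative factors."""
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    eps = 1e-9                        # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(V, k=2)
# Relative reconstruction error; small because V really is rank 2.
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In this framing, rows of H are temporal spread patterns and columns of W say how strongly each region loads on each pattern; a region with large loadings on more than one pattern is a candidate "bridge region" in the sense the abstract describes.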

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  20. Authentic execution of distributed event-driven applications with a small TCB

    OpenAIRE

    Noorman, Job; Mühlberg, Tobias; Piessens, Frank

    2017-01-01

    This paper presents an approach to provide strong assurance of the secure execution of distributed event-driven applications on shared infrastructures, while relying on a small Trusted Computing Base. We build upon and extend security primitives provided by a Protected Module Architecture (PMA) to guarantee authenticity and integrity properties of applications, and to secure control of input and output devices used by these applications. More specifically, we want to guarantee that if an outp...

  1. Automated Testing of Event-Driven Applications

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning

    may be tested by selecting an interesting input (i.e. a sequence of events), and deciding if a failure occurs when the selected input is applied to the event-driven application under test. Automated testing promises to reduce the workload for developers by automatically selecting interesting inputs...... and detect failures. However, it is non-trivial to conduct automated testing of event-driven applications because of, for example, infinite input spaces and the absence of specifications of correct application behavior. In this PhD dissertation, we identify a number of specific challenges when conducting...... automated testing of event-driven applications, and we present novel techniques for solving these challenges. First, we present an algorithm for stateless model-checking of event-driven applications with partial-order reduction, and we show how this algorithm may be used to systematically test web...
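The input-selection problem described here, choosing a sequence of events and checking whether applying it causes a failure, can be sketched with the simplest selection strategy, random testing. The app under test, its injected bug, and the oracle below are all invented for illustration.

```python
import random

class Counter:
    """Toy event-driven app under test, with an injected bug."""
    def __init__(self):
        self.value = 0
    def handle(self, event):
        if event == "inc":
            self.value += 1
        elif event == "dec" and self.value > 0:
            self.value -= 2        # bug: decrements by two instead of one

def random_testing(app_factory, events, oracle, trials=200, max_len=10, seed=1):
    """Feed random event sequences to the app; return the first failing input."""
    rng = random.Random(seed)
    for _ in range(trials):
        app = app_factory()
        seq = [rng.choice(events) for _ in range(rng.randint(1, max_len))]
        for e in seq:
            app.handle(e)
        if not oracle(app, seq):
            return seq             # a failure-inducing event sequence
    return None

# Oracle: the counter value must never end up negative.
failing = random_testing(Counter, ["inc", "dec"],
                         lambda app, seq: app.value >= 0)
```

Stateless model checking with partial-order reduction, as in the dissertation, replaces the random choice with a systematic enumeration of event orderings while pruning orderings that are equivalent up to commuting independent events.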

  2. Global biosurveillance: enabling science and technology. Workshop background and motivation: international scientific engagement for global security

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Helen H [Los Alamos National Laboratory

    2011-01-18

Through discussion the conference aims to: (1) Identify core components of a comprehensive global biosurveillance capability; (2) Determine the scientific and technical bases to support such a program; (3) Explore the improvement in biosurveillance to enhance regional and global disease outbreak prediction; (4) Recommend an engagement approach to establishing an effective international community and regional or global network; (5) Propose implementation strategies and the measures of effectiveness; and (6) Identify the challenges that must be overcome in the next 3-5 years in order to establish an initial global biosurveillance capability that will have significant positive impact on BioNP as well as public health and/or agriculture. There is also a look back at the First Biothreat Nonproliferation Conference from December 2007. Whereas the first conference was an opportunity for problem solving to enhance and identify new paradigms for biothreat nonproliferation, this conference is moving towards integrated comprehensive global biosurveillance. Main reasons for global biosurveillance are: (1) Rapid assessment of unusual disease outbreaks; (2) Early warning of emerging, re-emerging, and engineered biothreats, enabling reduced morbidity and mortality; (3) Enhanced crop and livestock management; (4) Increased understanding of host-pathogen interactions and epidemiology; (5) Enhanced international transparency for infectious disease research supporting BWC goals; and (6) Greater sharing of technology and knowledge to improve global health.

  3. Biosurveillance in a Highly Mobile Population - Year 3

    Science.gov (United States)

    2012-07-01

provides an opportunity-rich testbed for the impact upon infectious disease modeling, biosurveillance, and public health. Kulldorff et al. (2005) assessed...Secular Cycles and Millennial Trends. URSS, Moscow 2006. Kulldorff M, Heffernan R, Hartman J, Assunção RM, Mostashari F. (2005). “Space-Time Permutation

  4. Model Driven Architecture: Foundations and Applications

    NARCIS (Netherlands)

    Rensink, Arend

    The OMG's Model Driven Architecture (MDA) initiative has been the focus of much attention in both academia and industry, due to its promise of more rapid and consistent software development through the increased use of models. In order for MDA to reach its full potential, the ability to manipulate

  5. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

of legacy systems to cloud computing. The framework leverages the software reengineering concepts that aim to recover the architecture from legacy source code. Then the framework exploits the software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures....... The Legacy-to-Cloud Migration Horseshoe comprises four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  6. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

The goal of the MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at the early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check practicability (feasibility) of software architecture models.
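A consistency rule of the kind the abstract describes can be reduced to a check over two diagram representations, for instance that every name used in a sequence diagram is declared in the class diagram. The model below is a hypothetical toy, not the paper's rule set or any UML tool's API.

```python
# Toy cross-diagram consistency rule: every lifeline named in a sequence
# diagram must refer to a class declared in the class diagram.
class_diagram = {"Order", "Customer", "Invoice"}
sequence_diagram = [
    # (source lifeline, message, target lifeline)
    ("Customer", "placeOrder", "Order"),
    ("Order", "createInvoice", "Invoce"),   # typo: undeclared class
]

def check_consistency(classes, interactions):
    """Return the set of names used in interactions but missing from classes."""
    used = {name for src, _msg, dst in interactions for name in (src, dst)}
    return used - classes

errors = check_consistency(class_diagram, sequence_diagram)
```

Running such rules over every diagram pair before code generation is what lets an MDA toolchain flag requirement errors while they are still cheap to fix.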

  7. Building a transnational biosurveillance network using semantic web technologies: requirements, design, and preliminary evaluation.

    Science.gov (United States)

    Teodoro, Douglas; Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-05-29

    Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. 
Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1 × 10

  8. ORBiT: Oak Ridge Bio-surveillance Toolkit for Public Health Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL; Pullum, Laura L [ORNL; Hobson, Tanner C [ORNL; Steed, Chad A [ORNL; Chennubhotla, Chakra [University of Pittsburgh School of Medicine; Quinn, Shannon [University of Pittsburgh School of Medicine

    2015-01-01

    With novel emerging infectious diseases being reported across different parts of the world, there is a need to build effective bio-surveillance systems that can track, monitor and report such events in a timely manner. Apart from monitoring for emerging disease outbreaks, it is also important to identify susceptible geographic regions and populations where these diseases may have a significant impact. The digitization of health related information through electronic health records (EHR) and electronic healthcare claim reimbursements (eHCR) and the continued growth of self-reported health information through social media provides both tremendous opportunities and challenges in developing novel public health surveillance tools. In this paper, we present an overview of Oak Ridge Bio-surveillance Toolkit (ORBiT), which we have developed specifically to address data analytic challenges in the realm of public health surveillance. In particular, ORBiT provides an extensible environment to pull together diverse, large-scale datasets and analyze them to identify spatial and temporal patterns for various bio-surveillance related tasks. We demonstrate the utility of ORBiT in automatically extracting a small number of spatial and temporal patterns during the 2009-2010 pandemic H1N1 flu season using eHCR data. These patterns provide quantitative insights into the dynamics of how the pandemic flu spread across different parts of the country. We discovered that the eHCR data exhibits multi-scale patterns from which we could identify a small number of states in the United States (US) that act as bridge regions contributing to one or more specific influenza spread patterns. Similar to previous studies, the patterns show that the south-eastern regions of the US were widely affected by the H1N1 flu pandemic. Several of these south-eastern states act as bridge regions, which connect the north-east and central US in terms of flu occurrences. 
These quantitative insights show how the e

  9. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    Science.gov (United States)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which capture system information and serve as triggers supporting the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
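
    The message-bus pattern described in this abstract can be sketched as a minimal in-process publish/subscribe bus. This is an illustrative sketch only; the component and topic names are invented, and the real MPCS is implemented in Java on an industry-standard messaging bus.

```python
# Minimal sketch of an event-driven component bus, loosely modeled on the
# MPCS description above. All names are hypothetical.

class MessageBus:
    """Routes standard messages from publishers to topic subscribers."""

    def __init__(self):
        self._subscribers = {}  # topic -> list of handler callables

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        # Each message doubles as a trigger: delivery drives the next
        # component, giving the system its event-driven character.
        for handler in self._subscribers.get(topic, []):
            handler(message)


class DownlinkProcessor:
    """Toy component: decodes raw telemetry frames and republishes them."""

    def __init__(self, bus):
        self.bus = bus
        bus.subscribe("telemetry.raw", self.on_frame)

    def on_frame(self, frame):
        decoded = {"id": frame["id"], "value": frame["raw"] * 0.5}
        self.bus.publish("telemetry.decoded", decoded)


bus = MessageBus()
DownlinkProcessor(bus)
received = []
bus.subscribe("telemetry.decoded", received.append)
bus.publish("telemetry.raw", {"id": 7, "raw": 42})
```

    Because components interact only through topics on the bus, a new consumer (for automation, logging, or display) can be added without modifying the producers.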

  10. Model-Driven Development of Safety Architectures

    Science.gov (United States)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2017-01-01

    We describe the use of model-driven development for safety assurance of a pioneering NASA flight operation involving a fleet of small unmanned aircraft systems (sUAS) flying beyond visual line of sight. The central idea is to develop a safety architecture that provides the basis for risk assessment and visualization within a safety case, the formal justification of acceptable safety required by the aviation regulatory authority. A safety architecture is composed from a collection of bow tie diagrams (BTDs), a practical approach to manage safety risk by linking the identified hazards to the appropriate mitigation measures. The safety justification for a given unmanned aircraft system (UAS) operation can have many related BTDs. In practice, however, each BTD is independently developed, which poses challenges with respect to incremental development, maintaining consistency across different safety artifacts when changes occur, and in extracting and presenting stakeholder specific information relevant for decision making. We show how a safety architecture reconciles the various BTDs of a system, and, collectively, provide an overarching picture of system safety, by considering them as views of a unified model. We also show how it enables model-driven development of BTDs, replete with validations, transformations, and a range of views. Our approach, which we have implemented in our toolset, AdvoCATE, is illustrated with a running example drawn from a real UAS safety case. The models and some of the innovations described here were instrumental in successfully obtaining regulatory flight approval.

  11. Second-Order Multiagent Systems with Event-Driven Consensus Control

    Directory of Open Access Journals (Sweden)

    Jiangping Hu

    2013-01-01

    Full Text Available Event-driven control scheduling strategies for multiagent systems play a key role in the future use of embedded microprocessors with limited resources that gather information and actuate the agent control updates. In this paper, a distributed event-driven consensus problem is considered for a multi-agent system with second-order dynamics. First, two kinds of event-driven control laws are designed for leaderless and leader-follower systems, respectively. Then, the input-to-state stability of the closed-loop multi-agent system with the proposed event-driven consensus control is analyzed, and a positive lower bound on the inter-event times is ensured. Finally, some numerical examples are presented to validate the proposed event-driven consensus control.
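
    The trigger rule at the heart of such schemes can be illustrated with a toy simulation. The sketch below reduces the problem to first-order (single-integrator) agents for brevity, whereas the paper treats second-order dynamics; the threshold, step size, and ring topology are arbitrary choices.

```python
# Toy event-driven consensus: agents rebroadcast their state only when the
# gap between the true state and the last broadcast value exceeds a
# threshold, so control updates are event-triggered rather than periodic.

def event_driven_consensus(x0, neighbors, threshold=0.05, dt=0.01, steps=2000):
    x = list(x0)       # true states
    xhat = list(x0)    # last broadcast states
    events = 0
    for _ in range(steps):
        # Trigger rule: broadcast when the measurement error is too large.
        for i in range(len(x)):
            if abs(x[i] - xhat[i]) > threshold:
                xhat[i] = x[i]
                events += 1
        # The consensus protocol uses only broadcast values.
        u = [sum(xhat[j] - xhat[i] for j in neighbors[i])
             for i in range(len(x))]
        x = [x[i] + dt * u[i] for i in range(len(x))]
    return x, events

# Four agents on a ring graph.
states, events = event_driven_consensus(
    [0.0, 1.0, 2.0, 3.0],
    neighbors={0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]},
)
```

    The agents approach agreement while communicating far less often than a periodically sampled scheme would, which is the motivation for event-driven scheduling on resource-limited microprocessors.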

  12. Interfacing a biosurveillance portal and an international network of institutional analysts to detect biological threats.

    Science.gov (United States)

    Riccardo, Flavia; Shigematsu, Mika; Chow, Catherine; McKnight, C Jason; Linge, Jens; Doherty, Brian; Dente, Maria Grazia; Declich, Silvia; Barker, Mike; Barboza, Philippe; Vaillant, Laetitia; Donachie, Alastair; Mawudeku, Abla; Blench, Michael; Arthur, Ray

    2014-01-01

    The Early Alerting and Reporting (EAR) project, launched in 2008, is aimed at improving global early alerting and risk assessment and evaluating the feasibility and opportunity of integrating the analysis of biological, chemical, radionuclear (CBRN), and pandemic influenza threats. At a time when no international collaborations existed in the field of event-based surveillance, EAR's innovative approach involved both epidemic intelligence experts and internet-based biosurveillance system providers in the framework of an international collaboration called the Global Health Security Initiative, which involved the ministries of health of the G7 countries and Mexico, the World Health Organization, and the European Commission. The EAR project pooled data from 7 major internet-based biosurveillance systems onto a common portal that was progressively optimized for biological threat detection under the guidance of epidemic intelligence experts from public health institutions in Canada, the European Centre for Disease Prevention and Control, France, Germany, Italy, Japan, the United Kingdom, and the United States. The group became the first end users of the EAR portal, constituting a network of analysts working with a common standard operating procedure and risk assessment tools on a rotation basis to constantly screen and assess public information on the web for events that could suggest an intentional release of biological agents. Following the first 2-year pilot phase, the EAR project was tested in its capacity to monitor biological threats, proving that its working model was feasible and demonstrating the high commitment of the countries and international institutions involved. During the testing period, analysts using the EAR platform did not miss intentional events of a biological nature and did not issue false alarms. Through the findings of this initial assessment, this article provides insights into how the field of epidemic intelligence can advance through an

  13. Toward Integrated DoD Biosurveillance: Assessment and Opportunities

    Science.gov (United States)

    2013-01-01

    … enables linkage of biological specimens with biosurveillance data (high-quality material for antibody testing, but less reliable for preservation of …). Table 3.2, "Potential Additional Information to Be Monitored by AFHSC," lists sources including the World Organization for Animal Health, the Food and Agricultural Organization, the Robert Koch Institute, and the Institut Pasteur Network; CDC also has …

  14. Using a data-centric event-driven architecture approach in the integration of real-time systems at DTP2

    International Nuclear Information System (INIS)

    Tuominen, Janne; Viinikainen, Mikko; Alho, Pekka; Mattila, Jouni

    2014-01-01

    Integration of heterogeneous and distributed systems is a challenging task, because they might be running on different platforms and written with different implementation languages by multiple organizations. Data-centricity and event-driven architecture (EDA) are concepts that help to implement versatile and well-scaling distributed systems. This paper focuses on the implementation of inter-subsystem communication in a prototype distributed remote handling control system developed at Divertor Test Platform 2 (DTP2). The control system consists of a variety of heterogeneous subsystems, including a client–server web application and hard real-time controllers. A standardized middleware solution (Data Distribution Services (DDS)) that supports a data-centric EDA approach is used to integrate the system. One of the greatest challenges in integrating a system with a data-centric EDA approach is in defining the global data space model. The selected middleware is currently only used for non-deterministic communication. For future application, we evaluated the performance of point-to-point communication with and without the presence of additional network load to ensure applicability to real-time systems. We found that, under certain limitations, the middleware can be used for soft real-time communication. Hard real-time use will require more validation with a more suitable environment.

  15. Using a data-centric event-driven architecture approach in the integration of real-time systems at DTP2

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi; Viinikainen, Mikko; Alho, Pekka; Mattila, Jouni

    2014-10-15

    Integration of heterogeneous and distributed systems is a challenging task, because they might be running on different platforms and written with different implementation languages by multiple organizations. Data-centricity and event-driven architecture (EDA) are concepts that help to implement versatile and well-scaling distributed systems. This paper focuses on the implementation of inter-subsystem communication in a prototype distributed remote handling control system developed at Divertor Test Platform 2 (DTP2). The control system consists of a variety of heterogeneous subsystems, including a client–server web application and hard real-time controllers. A standardized middleware solution (Data Distribution Services (DDS)) that supports a data-centric EDA approach is used to integrate the system. One of the greatest challenges in integrating a system with a data-centric EDA approach is in defining the global data space model. The selected middleware is currently only used for non-deterministic communication. For future application, we evaluated the performance of point-to-point communication with and without the presence of additional network load to ensure applicability to real-time systems. We found that, under certain limitations, the middleware can be used for soft real-time communication. Hard real-time use will require more validation with a more suitable environment.
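
    The data-centric idea, a global data space of keyed topic instances whose latest values are cached and pushed to subscribers, can be sketched as follows. This is a toy model of the concept, not the DDS API; all names are invented.

```python
# Toy global data space in the spirit of data-centric EDA: writers update
# keyed samples on typed topics, subscribers are notified of changes, and
# late joiners can still read the last known value.

class GlobalDataSpace:
    def __init__(self):
        self._topics = {}     # topic -> {key: latest sample}
        self._listeners = {}  # topic -> [callback(key, sample)]

    def write(self, topic, key, sample):
        self._topics.setdefault(topic, {})[key] = sample
        for listener in self._listeners.get(topic, []):
            listener(key, sample)  # event-driven delivery

    def read(self, topic, key):
        # Data-centricity: state lives in the data space, so a subsystem
        # that joins late can read the last value without a resend.
        return self._topics.get(topic, {}).get(key)

    def on_change(self, topic, callback):
        self._listeners.setdefault(topic, []).append(callback)


gds = GlobalDataSpace()
updates = []
gds.on_change("ManipulatorState", lambda k, s: updates.append((k, s)))
gds.write("ManipulatorState", key="arm1", sample={"joint1": 0.25})
```

    Defining the set of topics, keys, and sample types is exactly the "global data space model" that the abstract identifies as the hard part of the integration.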

  16. NOvA Event Building, Buffering and Data-Driven Triggering From Within the DAQ System

    Energy Technology Data Exchange (ETDEWEB)

    Fischler, M. [Fermilab; Green, C. [Fermilab; Kowalkowski, J. [Fermilab; Norman, A. [Fermilab; Paterno, M. [Fermilab; Rechenmacher, R. [Fermilab

    2012-06-22

    To make its core measurements, the NOvA experiment needs to make real-time data-driven decisions involving beam-spill time correlation and other triggering issues. NOvA-DDT is a prototype Data-Driven Triggering system, built using the Fermilab artdaq generic DAQ/event-building tool set. This provides the advantages of sharing online software infrastructure with other Intensity Frontier experiments, and of being able to use any offline analysis module, unchanged, as a component of the online triggering decisions. The NOvA-artdaq architecture chosen has significant advantages, including graceful degradation if the triggering decision software fails or cannot complete quickly enough for some fraction of the time-slice "events." We have tested and measured the performance and overhead of NOvA-DDT using an actual Hough-transform-based trigger decision module taken from the NOvA offline software. The results of these tests, a mean time of 98 ms per event using only 1/16 of the available processing power of a node, and overheads of about 2 ms per event, provide a proof of concept: NOvA-DDT is a viable strategy for data acquisition, event building, and trigger processing at the NOvA far detector.
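
    The graceful-degradation behavior described above can be sketched as a trigger loop with a per-event time budget: if the decision module fails or overruns, the event is kept rather than lost. The names and budget value are illustrative, not the artdaq API.

```python
# Sketch of a data-driven trigger loop with graceful degradation: a
# trigger decision module gets a time budget per event; on failure or
# overrun, the event is accepted rather than dropped.
import time

def run_trigger(events, decide, budget_s=0.098):
    accepted, degraded = [], 0
    for event in events:
        start = time.perf_counter()
        try:
            keep = decide(event)
            if time.perf_counter() - start > budget_s:
                keep, degraded = True, degraded + 1  # over budget: keep event
        except Exception:
            keep, degraded = True, degraded + 1      # module failed: keep event
        if keep:
            accepted.append(event)
    return accepted, degraded

accepted, degraded = run_trigger(
    events=[{"hits": n} for n in (3, 40, 5, 60)],
    decide=lambda e: e["hits"] > 10,  # stand-in for the Hough-transform module
)
```

    Falling back to "keep the event" on failure trades bandwidth for safety: a misbehaving trigger module can never silently discard physics data.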

  17. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation, and recovery interactions with the automation software is presented and discussed as a forward work item.
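
    The data-driven sequencing idea, a mission event plan loaded as data rather than compiled into code, can be sketched as a table-driven mode machine. The modes and events below are invented for illustration and are not Orion's actual GN&C modes.

```python
# Sketch of data-driven mode sequencing: the event plan is a table, so
# resequencing a mission means editing data, not flight code. Modes and
# events are hypothetical.

SEQUENCE_TABLE = {
    # (current_mode, event) -> next_mode
    ("PRELAUNCH", "LIFTOFF"):          "ASCENT",
    ("ASCENT", "MECO"):                "ORBIT_INSERTION",
    ("ORBIT_INSERTION", "ORBIT_OK"):   "ON_ORBIT",
    ("ON_ORBIT", "DEORBIT_BURN"):      "ENTRY",
}

def sequence(initial_mode, events, table=SEQUENCE_TABLE):
    mode, history = initial_mode, [initial_mode]
    for event in events:
        # Events with no entry for the current mode are ignored, leaving
        # ground and crew free to command transitions out of band.
        mode = table.get((mode, event), mode)
        history.append(mode)
    return mode, history

final_mode, history = sequence("PRELAUNCH", ["LIFTOFF", "MECO", "ORBIT_OK"])
```

    Keeping the transitions in a database-backed table is what makes the automation "evolvable": operational-concept changes become data updates that can be reviewed and reloaded per mission segment.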

  18. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning, and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event-filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
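
    The subscription-based event filtering described above can be sketched as follows: consumers register predicates, and only matching events are forwarded, reducing event traffic at the source. All names are hypothetical and unrelated to the IRI implementation.

```python
# Sketch of subscription-based event filtering for monitoring: each
# management tool subscribes with a predicate, and only events that
# match are forwarded to it, cutting overall event traffic.

class EventFilter:
    def __init__(self):
        self._subscriptions = []  # (predicate, sink) pairs

    def subscribe(self, predicate, sink):
        self._subscriptions.append((predicate, sink))

    def process(self, event):
        forwarded = 0
        for predicate, sink in self._subscriptions:
            if predicate(event):
                sink.append(event)
                forwarded += 1
        return forwarded

debugger, manager = [], []
f = EventFilter()
f.subscribe(lambda e: e["severity"] >= 3, debugger)  # only serious events
f.subscribe(lambda e: e["node"] == "srv1", manager)  # one node of interest
for ev in [{"node": "srv1", "severity": 1},
           {"node": "srv2", "severity": 4},
           {"node": "srv1", "severity": 5}]:
    f.process(ev)
```

    Placing such filters near the event sources, rather than at the consumers, is what lets the architecture reduce intrusiveness: uninteresting events never cross the network.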

  19. DOE's Institute for Advanced Architecture and Algorithms: An application-driven approach

    International Nuclear Information System (INIS)

    Murphy, Richard C

    2009-01-01

    This paper describes an application-driven methodology for understanding the impact of future architecture decisions at the end of the MPP era. Fundamental transistor device limitations, combined with application performance characteristics, have driven the switch to multicore/multithreaded architectures. Designing large-scale supercomputers to match application demands is particularly challenging, since performance characteristics are highly counter-intuitive; in fact, data movement, more than FLOPS, dominates. This work discusses basic performance analysis for a set of DOE applications, the limits of CMOS technology, and the impact of both on future architectures.

  20. Critical Path Driven Cosynthesis for Heterogeneous Target Architectures

    DEFF Research Database (Denmark)

    Bjørn-Jørgensen, Peter; Madsen, Jan

    1997-01-01

    This paper presents a critical-path-driven algorithm to produce a static schedule of a single-rate system onto a heterogeneous target architecture. Our algorithm is a list-based scheduling algorithm which concurrently assigns tasks to processors and allocates nets to interprocessor communication. … Experimental results show that our algorithm is able to find good results, as compared to other methods, in a small amount of CPU time.
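
    A critical-path-driven list scheduler of the general kind described can be sketched as follows: tasks are prioritized by the length of their longest path to the exit task, then greedily placed on the processor that finishes them earliest. The task graph, costs, and two-processor platform are invented for illustration; the paper's algorithm additionally allocates nets for interprocessor communication, which this sketch omits.

```python
# Sketch of critical-path-driven list scheduling onto a heterogeneous
# target: per-processor costs differ, priority is the longest remaining
# path, and each task goes to the processor with the earliest finish.

def schedule(tasks, succ, cost):
    # tasks: ids; succ: id -> successor ids; cost: (task, proc) -> time
    procs = sorted({p for (_, p) in cost})

    prio = {}
    def critical_path(t):
        """Length of the longest (cheapest-cost) path from t to the exit."""
        if t not in prio:
            base = min(cost[(t, p)] for p in procs)
            prio[t] = base + max((critical_path(s) for s in succ.get(t, [])),
                                 default=0)
        return prio[t]

    preds = {t: [u for u in tasks if t in succ.get(u, [])] for t in tasks}
    ready_time = {p: 0 for p in procs}      # when each processor frees up
    finish, placement = {}, {}
    # Higher critical-path length first; this order also respects deps.
    for t in sorted(tasks, key=critical_path, reverse=True):
        est = max((finish[u] for u in preds[t]), default=0)  # data ready
        p = min(procs, key=lambda q: max(est, ready_time[q]) + cost[(t, q)])
        start = max(est, ready_time[p])
        finish[t] = start + cost[(t, p)]
        ready_time[p] = finish[t]
        placement[t] = p
    return placement, max(finish.values())

succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
cost = {("A", "p0"): 2, ("A", "p1"): 3,
        ("B", "p0"): 3, ("B", "p1"): 2,
        ("C", "p0"): 2, ("C", "p1"): 4,
        ("D", "p0"): 2, ("D", "p1"): 2}
placement, makespan = schedule(["A", "B", "C", "D"], succ, cost)
```

    On this toy graph the scheduler overlaps B and C on different processors, finishing the whole fork-join in 6 time units.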

  1. The Event-Driven Software Library for YARP—With Algorithms and iCub Applications

    Directory of Open Access Journals (Sweden)

    Arren Glover

    2018-01-01

    Full Text Available Event-driven (ED cameras are an emerging technology that sample the visual signal based on changes in the signal magnitude, rather than at a fixed-rate over time. The change in paradigm results in a camera with a lower latency, that uses less power, has reduced bandwidth, and higher dynamic range. Such cameras offer many potential advantages for on-line, autonomous, robots; however, the sensor data do not directly integrate with current “image-based” frameworks and software libraries. The iCub robot uses Yet Another Robot Platform (YARP as middleware to provide modular processing and connectivity to sensors and actuators. This paper introduces a library that incorporates an event-based framework into the YARP architecture, allowing event cameras to be used with the iCub (and other YARP-based robots. We describe the philosophy and methods for structuring events to facilitate processing, while maintaining low-latency and real-time operation. We also describe several processing modules made available open-source, and three example demonstrations that can be run on the neuromorphic iCub.

  2. Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

    Directory of Open Access Journals (Sweden)

    Emre Neftci

    2014-01-01

    Full Text Available Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation, and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics, which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The reverberating activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation, and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.

  3. Event-driven contrastive divergence for spiking neuromorphic systems.

    Science.gov (United States)

    Neftci, Emre; Das, Srinjoy; Pedroni, Bruno; Kreutz-Delgado, Kenneth; Cauwenberghs, Gert

    2013-01-01

    Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train a RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
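
    The pair-based STDP rule that performs the weight updates in such event-driven schemes can be sketched as follows; the amplitudes and time constant are illustrative, not the paper's parameters.

```python
# Sketch of pair-based STDP: a presynaptic spike shortly before a
# postsynaptic one potentiates the synapse; the reverse order depresses
# it, with exponentially decaying influence in both directions.
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    dt = t_post - t_pre
    if dt >= 0:  # pre before post: potentiation
        return a_plus * math.exp(-dt / tau_ms)
    return -a_minus * math.exp(dt / tau_ms)  # post before pre: depression

# Apply the rule online, one spike pair at a time, as events arrive.
w = 0.5
for t_pre, t_post in [(0.0, 5.0), (40.0, 42.0), (90.0, 80.0)]:
    w += stdp_dw(t_pre, t_post)
```

    Because each update depends only on the two spike times, the rule is naturally asynchronous, which is what lets it stand in for the discrete, synchronized weight updates of standard CD.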

  4. HoneyComb: An Application-Driven Online Adaptive Reconfigurable Hardware Architecture

    Directory of Open Access Journals (Sweden)

    Alexander Thomas

    2012-01-01

    Full Text Available Since the introduction of the first reconfigurable devices in 1985, the field of reconfigurable computing has developed a broad variety of architectures, from fine-grained to coarse-grained types. However, the main disadvantages of reconfigurable approaches, namely the costs in area and power consumption, are still present. This contribution presents a solution for application-driven adaptation of our reconfigurable architecture at register transfer level (RTL) to reduce the resource requirements and power consumption while keeping the flexibility and performance for a predefined set of applications. Furthermore, implemented runtime-adaptive features such as online routing and configuration sequencing are presented and discussed. A presentation of the prototype chip of this architecture, designed in 90 nm standard cell technology manufactured by TSMC, concludes this contribution.

  5. Fork-join and data-driven execution models on multi-core architectures: Case study of the FMM

    KAUST Repository

    Amer, Abdelhalim

    2013-01-01

    Extracting maximum performance of multi-core architectures is a difficult task primarily due to bandwidth limitations of the memory subsystem and its complex hierarchy. In this work, we study the implications of fork-join and data-driven execution models on this type of architecture at the level of task parallelism. For this purpose, we use a highly optimized fork-join based implementation of the FMM and extend it to a data-driven implementation using a distributed task scheduling approach. This study exposes some limitations of the conventional fork-join implementation in terms of synchronization overheads. We find that these are not negligible and their elimination by the data-driven method, with a careful data locality strategy, was beneficial. Experimental evaluation of both methods on state-of-the-art multi-socket multi-core architectures showed up to 22% speed-ups of the data-driven approach compared to the original method. We demonstrate that a data-driven execution of FMM not only improves performance by avoiding global synchronization overheads but also reduces the memory-bandwidth pressure caused by memory-intensive computations. © 2013 Springer-Verlag.
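
    The difference between the two models can be sketched with a toy dependency scheduler: in the data-driven model, a task dispatches as soon as its own inputs are ready, with no global barrier holding it for unrelated tasks. The task names loosely echo FMM phases, but the graph is invented.

```python
# Sketch of data-driven task execution: tasks run when their own
# prerequisites are done, in contrast with fork-join, where a global
# barrier separates each level of the computation.

def data_driven_run(deps):
    """deps: task -> set of prerequisite tasks. Returns an execution order."""
    remaining = {t: set(d) for t, d in deps.items()}
    done, order = set(), []
    while remaining:
        # A task becomes ready the moment its inputs are done; it does
        # not wait at a barrier for unrelated tasks to finish.
        ready = sorted(t for t, d in remaining.items() if d <= done)
        if not ready:
            raise ValueError("cyclic dependency")
        for t in ready:
            order.append(t)
            done.add(t)
            del remaining[t]
    return order

order = data_driven_run({
    "P2M_a": set(), "P2M_b": set(),
    "M2M": {"P2M_a", "P2M_b"},
    "M2L": {"M2M"},
    "L2P": {"M2L"},
})
```

    In a parallel runtime, each ready task would be handed to a worker immediately; removing the join barriers is what eliminates the synchronization overhead the abstract measures.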

  6. Using secure web services to visualize poison center data for nationwide biosurveillance: a case study.

    Science.gov (United States)

    Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M Barry; Lee, Brian; Stinn, John; Worthen, Katherine

    2010-01-01

    Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on a secure web services architecture, with data stewardship remaining with the data provider. As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse, the National Poison Data System (NPDS), via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a JavaScript plotting library. Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance.
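
    The federated, aggregate-only exchange described above can be sketched as follows: the data steward rolls per-case records up to counts by region and time period, and only those aggregates cross the web-service boundary. The field names and data are invented for illustration.

```python
# Sketch of the data-stewardship idea behind the federated model: the
# provider aggregates its own records, and only (region, week) counts
# are exposed through the web service.
from collections import Counter

def aggregate(cases):
    """Roll per-case records up to (region, week) counts."""
    return Counter((c["region"], c["week"]) for c in cases)

cases = [
    {"region": "Region4", "week": "2010-W01", "effect": "nausea"},
    {"region": "Region4", "week": "2010-W01", "effect": "dizziness"},
    {"region": "Region6", "week": "2010-W02", "effect": "nausea"},
]
counts = aggregate(cases)  # individual records never leave the steward
```

    A visualization client like the one described then needs only these aggregates to draw its maps and time-series plots, which is what keeps stewardship of the raw records with the data provider.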

  7. An Application-Driven Modular IoT Architecture

    Directory of Open Access Journals (Sweden)

    Kumar Yelamarthi

    2017-01-01

    Full Text Available Building upon the advancements of recent years, a new paradigm in technology has emerged in the Internet of Things (IoT). IoT has allowed for communication with the surrounding environment through a multitude of sensors and actuators, while operating on limited energy. Several researchers have presented IoT architectures for their respective applications, which are often challenged by requiring major updates for adoption to a different application. Further, this comes with several uncertainties, such as the type of computational device required at the edge, the mode of wireless connectivity required, and the methods to obtain power efficiency, and it does not ensure rapid deployment. This paper starts by providing a horizontal overview of each layer in the IoT architecture and the options for different applications. It then presents a broad application-driven modular architecture which can be easily customized for rapid deployment. The paper describes the diverse hardware used in several IoT layers, such as sensors, embedded processors, wireless transceivers, internet gateways, and application management cloud servers. Finally, it presents implementation results for diverse applications, including healthcare, structural health monitoring, agriculture, and indoor tour guide systems. It is hoped that this research will assist potential users in easily choosing IoT hardware and software as it pertains to their respective needs.

  8. Complex and hierarchical micelle architectures from diblock copolymers using living, crystallization-driven polymerizations.

    Science.gov (United States)

    Gädt, Torben; Ieong, Nga Sze; Cambridge, Graeme; Winnik, Mitchell A; Manners, Ian

    2009-02-01

    Block copolymers consist of two or more chemically distinct polymer segments, or blocks, connected by a covalent link. In a selective solvent for one of the blocks, core-corona micelle structures are formed. We demonstrate that living polymerizations driven by the epitaxial crystallization of a core-forming metalloblock represent a synthetic tool that can be used to generate complex and hierarchical micelle architectures from diblock copolymers. The use of platelet micelles as initiators enables the formation of scarf-like architectures in which cylindrical micelle tassels of controlled length are grown from specific crystal faces. A similar process enables the fabrication of brushes of cylindrical micelles on a crystalline homopolymer substrate. Living polymerizations driven by heteroepitaxial growth can also be accomplished and are illustrated by the formation of tri- and pentablock and scarf architectures with cylinder-cylinder and platelet-cylinder connections, respectively, that involve different core-forming metalloblocks.

  9. Elsevier special issue on foundations and applications of model driven architecture

    NARCIS (Netherlands)

    Aksit, Mehmet; Ivanov, Ivan

    2008-01-01

    Model Driven Architecture (MDA) is an approach for software development proposed by Object Management Group (OMG). The basic principle of MDA is the separation of the specification of system functionality from the specification of the implementation of that functionality on a specific platform. The

  10. An architecture for a continuous, user-driven, and data-driven application of clinical guidelines and its evaluation.

    Science.gov (United States)

    Shalom, Erez; Shahar, Yuval; Lunenfeld, Eitan

    2016-02-01

    Design, implement, and evaluate a new architecture for realistic continuous guideline (GL)-based decision support, based on a series of requirements that we have identified, such as support for continuous care, for multiple task types, and for data-driven and user-driven modes. We designed and implemented a new continuous GL-based support architecture, PICARD, which accesses a temporal reasoning engine and provides several different types of application interfaces. We present the new architecture in detail in the current paper. To evaluate the architecture, we first performed a technical evaluation of the PICARD architecture, using 19 simulated scenarios in the preeclampsia/toxemia domain. We then performed a functional evaluation with the help of two domain experts, by generating patient records that simulate 60 decision points from six clinical guideline-based scenarios lasting from two days to four weeks. Finally, 36 clinicians made manual decisions in half of the scenarios and had access to the automated GL-based support in the other half. The measures used in all three experiments were correctness and completeness of the decisions relative to the GL. Mean correctness and completeness in the technical evaluation were 1±0.0 and 0.96±0.03, respectively. The functional evaluation produced only a few minor comments from the two experts, mostly regarding the output's style; otherwise the system's recommendations were validated. In the clinically oriented evaluation, the 36 clinicians manually applied approximately 41% of the GL's recommended actions. Completeness increased to approximately 93% when using PICARD. Manual correctness was approximately 94.5% and remained similar when using PICARD; but while 68% of the manual decisions included correct but redundant actions, only 3% of the actions included in decisions made when using PICARD were redundant. The PICARD architecture is technically feasible and is functionally valid, and addresses the realistic
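The correctness and completeness measures used in all three experiments can be read as simple set comparisons between the actions a clinician performed and those the guideline recommended. A minimal sketch of that reading (function names and action names are invented, not taken from the PICARD system):

```python
# Hypothetical sketch: correctness = fraction of performed actions that the
# guideline endorses; completeness = fraction of guideline-recommended
# actions that were actually performed.

def correctness(performed, recommended):
    """Fraction of performed actions that are correct per the guideline."""
    if not performed:
        return 1.0
    return len(set(performed) & set(recommended)) / len(set(performed))

def completeness(performed, recommended):
    """Fraction of guideline-recommended actions actually performed."""
    if not recommended:
        return 1.0
    return len(set(performed) & set(recommended)) / len(set(recommended))

# One invented decision point in a preeclampsia-like scenario:
recommended = ["measure_bp", "urine_protein", "order_cbc", "schedule_followup"]
performed = ["measure_bp", "urine_protein", "order_cbc"]

print(correctness(performed, recommended))   # 1.0  - every performed action was correct
print(completeness(performed, recommended))  # 0.75 - one recommended action was missed
```

Under this reading, redundant actions (correct but already performed) would be handled by a further partition of the intersection, which the sketch omits.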

  11. A CMOS self-powered front-end architecture for subcutaneous event-detector devices

    CERN Document Server

    Colomer-Farrarons, Jordi

    2011-01-01

    A CMOS Self-Powered Front-End Architecture for Subcutaneous Event-Detector Devices presents the conception and prototype realization of a self-powered architecture for subcutaneous detector devices. The architecture is designed to work as a true/false event detector or as a threshold-level alarm for certain substances, ions, etc., which are detected through a three-electrode amperometric biosensor approach. The device is conceived as a low-power subcutaneous implantable application powered by an inductive link, with one emitter antenna on the external side of the skin and the receiver antenna under the ski

  12. Data-driven simulation methodology using DES 4-layer architecture

    Directory of Open Access Journals (Sweden)

    Aida Saez

    2016-05-01

    Full Text Available In this study, we present a methodology to build data-driven simulation models of manufacturing plants. We go further than other research proposals and suggest focusing simulation model development on a 4-layer architecture (network, logic, database, and visual reality). The Network layer includes the system infrastructure. The Logic layer covers the operations planning and control system and the material handling equipment system. The Database layer holds all the information needed to perform the simulation, the results used for analysis, and the values that the Logic layer uses to manage the plant. Finally, the Visual Reality layer displays an augmented reality system including not only the machinery and its movement but also blackboards and other Andon elements. This architecture provides numerous advantages, as it helps to build a simulation model that consistently considers internal logistics in a very flexible way.

  13. Internet-based biosurveillance methods for vector-borne diseases: Are they novel public health tools or just novelties?

    Science.gov (United States)

    Pollett, Simon; Althouse, Benjamin M; Forshey, Brett; Rutherford, George W; Jarman, Richard G

    2017-11-01

    Internet-based surveillance methods for vector-borne diseases (VBDs) using "big data" sources such as Google, Twitter, and internet newswire scraping have recently been developed, yet reviews on such "digital disease detection" methods have focused on respiratory pathogens, particularly in high-income regions. Here, we present a narrative review of the literature that has examined the performance of internet-based biosurveillance for diseases caused by vector-borne viruses, parasites, and other pathogens, including Zika, dengue, other arthropod-borne viruses, malaria, leishmaniasis, and Lyme disease across a range of settings, including low- and middle-income countries. The fundamental features, advantages, and drawbacks of each internet big data source are presented for those with varying familiarity of "digital epidemiology." We conclude with some of the challenges and future directions in using internet-based biosurveillance for the surveillance and control of VBD.

  14. Internet-based biosurveillance methods for vector-borne diseases: Are they novel public health tools or just novelties?

    Directory of Open Access Journals (Sweden)

    Simon Pollett

    2017-11-01

    Full Text Available Internet-based surveillance methods for vector-borne diseases (VBDs using "big data" sources such as Google, Twitter, and internet newswire scraping have recently been developed, yet reviews on such "digital disease detection" methods have focused on respiratory pathogens, particularly in high-income regions. Here, we present a narrative review of the literature that has examined the performance of internet-based biosurveillance for diseases caused by vector-borne viruses, parasites, and other pathogens, including Zika, dengue, other arthropod-borne viruses, malaria, leishmaniasis, and Lyme disease across a range of settings, including low- and middle-income countries. The fundamental features, advantages, and drawbacks of each internet big data source are presented for those with varying familiarity of "digital epidemiology." We conclude with some of the challenges and future directions in using internet-based biosurveillance for the surveillance and control of VBD.

  15. Web Service Architecture for e-Learning

    Directory of Open Access Journals (Sweden)

    Xiaohong Qiu

    2005-10-01

    Full Text Available Message-based Web Service architecture provides a unified approach to applications and Web Services that incorporates the flexibility of messaging and distributed components. We propose SMMV and MMMV collaboration as the general architecture of collaboration based on a Web service model, which accommodates both instructor-led learning and participatory learning. This approach derives from our message-based Model-View-Controller (M-MVC architecture of Web applications, comprises an event-driven Publish/Subscribe scheme, and provides effective collaboration with high interactivity of rich Web content for diverse clients over heterogeneous network environments.
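The event-driven Publish/Subscribe scheme mentioned above can be illustrated with a minimal topic-based broker. This is a generic sketch of the pattern, not the M-MVC implementation (all names invented):

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based publish/subscribe broker (illustrative only)."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Deliver the event to every subscriber of the topic.
        for callback in self.subscribers[topic]:
            callback(event)

broker = Broker()
received = []
broker.subscribe("whiteboard", received.append)   # one "view" client
broker.subscribe("whiteboard", received.append)   # a second client
broker.publish("whiteboard", {"stroke": [0, 1]})  # the "model" publishes an update
print(len(received))  # 2 - both clients received the event
```

In a collaboration setting, the same decoupling lets instructor-led and participatory clients subscribe to the same event streams without knowing about each other.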

  16. Sawtooth events and O+ in the plasma sheet and boundary layer: CME- and SIR-driven events

    Science.gov (United States)

    Lund, E. J.; Nowrouzi, N.; Kistler, L. M.; Cai, X.; Liao, J.

    2017-12-01

    The role of ionospheric ions in sawtooth events is an open question. Simulations[1,2,3] suggest that O+ from the ionosphere produces a feedback mechanism for driving sawtooth events. However, observational evidence[4,5] suggests that the presence of O+ in the plasma sheet is neither necessary nor sufficient. In this study we investigate whether the solar wind driver of the geomagnetic storm has an effect on this result. Building on an earlier study[4] that used events for which Cluster data are available in the plasma sheet and boundary layer, we perform a superposed epoch analysis for coronal mass ejection (CME) driven storms and stream interaction region (SIR) driven storms separately, to investigate the hypothesis that ionospheric O+ is an important contributor for CME-driven storms but not for SIR-driven storms[2]. [1]O. J. Brambles et al. (2011), Science 332, 1183. [2]O. J. Brambles et al. (2013), JGR 118, 6026. [3]R. H. Varney et al. (2016), JGR 121, 9688. [4]J. Liao et al. (2014), JGR 119, 1572. [5]E. J. Lund et al. (2017), JGR, submitted.

  17. Event-driven simulation of neural population synchronization facilitated by electrical coupling.

    Science.gov (United States)

    Carrillo, Richard R; Ros, Eduardo; Barbour, Boris; Boucheny, Christian; Coenen, Olivier

    2007-02-01

    Most neural communication and processing tasks are driven by spikes, which has enabled the application of event-driven simulation schemes. However, the simulation of spiking neural networks based on complex models that cannot be simplified to analytical expressions (and therefore require numerical calculation) is very time consuming. Here we briefly describe an event-driven simulation scheme that uses pre-calculated table-based neuron characterizations to avoid numerical calculations during a network simulation, allowing the simulation of large-scale neural systems. More concretely, we explain how electrical coupling can be simulated efficiently within this computation scheme, reproducing synchronization processes observed in detailed simulations of neural populations.
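The pre-calculated table-based characterization described above amounts to replacing runtime numerical integration with a lookup of the neuron's state evolution between events. A toy sketch of the idea for a single leaky neuron (time constant, weight, and threshold are invented, and this is not the authors' simulator):

```python
import heapq, math

# Pre-calculate a decay table: membrane attenuation after dt milliseconds,
# so no exp() is evaluated inside the event loop (the core table-based idea).
TAU = 20.0  # ms, assumed membrane time constant
DECAY = [math.exp(-dt / TAU) for dt in range(0, 101)]  # dt = 0..100 ms

def run(spike_events, threshold=1.0, weight=0.6):
    """Process (time in ms) input spikes in event order; each input adds
    `weight` to one neuron's membrane potential, which decays between events
    via the precomputed table. Returns the times at which the neuron fired."""
    v, last_t, fired = 0.0, 0.0, []
    heapq.heapify(spike_events)          # event queue ordered by time
    while spike_events:
        t = heapq.heappop(spike_events)
        dt = min(int(t - last_t), 100)
        v = v * DECAY[dt] + weight       # table lookup instead of integration
        last_t = t
        if v >= threshold:
            fired.append(t)
            v = 0.0                      # reset after the output spike
    return fired

print(run([0.0, 1.0, 50.0]))  # [1.0] - two close inputs cross threshold; the late one does not
```

A full table-based scheme indexes tables by state variables as well as elapsed time, but the event loop has the same shape.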

  18. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order 10--100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high performance workstation ATM-interface, and (3) a VMEbus ATM-interface. The requirement for traffic shaping in ATM-based event-builders is discussed and an analysis of the performance of several such schemes is presented
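Independent of the switching technology, the core event-building task is assembling fragments that arrive asynchronously from many sources into complete events. A schematic sketch of that bookkeeping (source count and fragment format invented; this says nothing about ATM traffic shaping):

```python
class EventBuilder:
    """Collect per-source fragments keyed by event number; emit an event
    once all sources have contributed (illustrative, not the ATM design)."""
    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.partial = {}  # event_id -> {source_id: payload}

    def add_fragment(self, event_id, source_id, payload):
        frags = self.partial.setdefault(event_id, {})
        frags[source_id] = payload
        if len(frags) == self.n_sources:   # all fragments present: complete
            return self.partial.pop(event_id)
        return None

builder = EventBuilder(n_sources=3)
# Fragments arrive out of order and interleaved across events:
builder.add_fragment(7, 0, b"a")             # incomplete -> returns None
builder.add_fragment(8, 1, b"x")             # a fragment of a different event
builder.add_fragment(7, 1, b"b")
complete = builder.add_fragment(7, 2, b"c")  # last fragment completes event 7
print(sorted(complete))  # [0, 1, 2]
```

In a real event builder this table is distributed across destination nodes, and the switch fabric routes each fragment to the node assigned to its event number.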

  19. Effects of various event building techniques on data acquisition system architectures

    International Nuclear Information System (INIS)

    Barsotti, E.; Booth, A.; Bowden, M.

    1990-04-01

    The preliminary specifications for various new detectors throughout the world including those at the Superconducting Super Collider (SSC) already make it clear that existing event building techniques will be inadequate for the high trigger and data rates anticipated for these detectors. In the world of high-energy physics many approaches have been taken to solving the problem of reading out data from a whole detector and presenting a complete event to the physicist, while simultaneously keeping deadtime to a minimum. This paper includes a review of multiprocessor and telecommunications interconnection networks and how these networks relate to event building in general, illustrating advantages of the various approaches. It presents a more detailed study of recent research into new event building techniques which incorporate much greater parallelism to better accommodate high data rates. The future in areas such as front-end electronics architectures, high speed data links, event building and online processor arrays is also examined. Finally, details of a scalable parallel data acquisition system architecture being developed at Fermilab are given. 35 refs., 31 figs., 1 tab

  20. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches to higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic, event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  1. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    Science.gov (United States)

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.
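The event-driven side of such a design recomputes the control law only when an event-triggering condition fires, rather than at every sampling instant. The mechanism can be sketched in isolation on a scalar plant (plant, gain, and threshold invented; no adaptive critic or neural network here):

```python
def simulate(x0=5.0, dt=0.05, steps=200, trigger=0.5):
    """Scalar plant x' = x + u with feedback u = -2x held constant between
    events; a new control is computed only when the state has drifted more
    than `trigger` from the last event state (illustrative only)."""
    x, x_event, u, updates = x0, x0, -2 * x0, 0
    for _ in range(steps):
        if abs(x - x_event) > trigger:   # event-triggering condition
            x_event, u = x, -2 * x       # sample the state, recompute control
            updates += 1
        x += dt * (x + u)                # Euler step of the plant
    return x, updates

x_final, updates = simulate()
print(round(x_final, 3), updates)  # state kept near zero with far fewer than 200 control updates
```

The point of the mechanism is the same as in the paper's setting: control computation and communication happen only at events, trading a bounded state error for fewer updates.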

  2. A self-scaling, distributed information architecture for public health, research, and clinical care.

    Science.gov (United States)

    McMurry, Andrew J; Gilbert, Clint A; Reis, Ben Y; Chueh, Henry C; Kohane, Isaac S; Mandl, Kenneth D

    2007-01-01

    This study sought to define a scalable architecture to support the National Health Information Network (NHIN). This architecture must concurrently support a wide range of public health, research, and clinical care activities. The architecture fulfils five desiderata: (1) adopt a distributed approach to data storage to protect privacy, (2) enable strong institutional autonomy to engender participation, (3) provide oversight and transparency to ensure patient trust, (4) allow variable levels of access according to investigator needs and institutional policies, (5) define a self-scaling architecture that encourages voluntary regional collaborations that coalesce to form a nationwide network. Our model has been validated by a large-scale, multi-institution study involving seven medical centers for cancer research. It is the basis of one of four open architectures developed under funding from the Office of the National Coordinator of Health Information Technology, fulfilling the biosurveillance use case defined by the American Health Information Community. The model supports broad applicability for regional and national clinical information exchanges. This model shows the feasibility of an architecture wherein the requirements of care providers, investigators, and public health authorities are served by a distributed model that grants autonomy, protects privacy, and promotes participation.

  3. An event driven algorithm for fractal cluster formation

    NARCIS (Netherlands)

    González, S.; Gonzalez Briones, Sebastián; Thornton, Anthony Richard; Luding, Stefan

    2011-01-01

    A new cluster-based event-driven algorithm is developed to simulate the formation of clusters in a two-dimensional gas: particles move freely until they collide and "stick" together irreversibly. These clusters aggregate into bigger structures in an isotropic way, forming fractal structures whose
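The event-driven core of such an algorithm is: advance all particles ballistically to the time of the next collision, merge the colliding pair irreversibly (conserving mass and momentum), and repeat. A 1D analogue for illustration (not the authors' 2D code):

```python
def next_collision(particles):
    """Earliest pairwise collision time for 1D point particles
    given as (position, velocity, mass); returns (t, i, j) or None."""
    best = None
    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            (xi, vi, _), (xj, vj, _) = particles[i], particles[j]
            dv = vi - vj
            if dv == 0:
                continue
            t = (xj - xi) / dv      # time for particle i to reach particle j
            if t > 0 and (best is None or t < best[0]):
                best = (t, i, j)
    return best

def simulate(particles):
    """Event-driven sticking: jump to each collision, merge the pair
    irreversibly (mass and momentum conserved), repeat until no events."""
    particles = list(particles)
    while (ev := next_collision(particles)) is not None:
        t, i, j = ev
        particles = [(x + v * t, v, m) for x, v, m in particles]  # free flight
        (xi, vi, mi), (_, vj, mj) = particles[i], particles[j]
        merged = (xi, (mi * vi + mj * vj) / (mi + mj), mi + mj)
        particles = [p for k, p in enumerate(particles) if k not in (i, j)]
        particles.append(merged)
    return particles

# Two particles approach head-on and stick into one cluster of mass 2:
print(simulate([(0.0, 1.0, 1.0), (10.0, -1.0, 1.0)]))  # [(5.0, 0.0, 2.0)]
```

The 2D case replaces the scalar collision-time formula with a quadratic in the relative position and velocity of disk pairs, but the event loop is identical.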

  4. An event driven algorithm for fractal cluster formation

    NARCIS (Netherlands)

    González, S.; Thornton, Anthony Richard; Luding, Stefan

    2010-01-01

    A new cluster-based event-driven algorithm is developed to simulate the formation of clusters in a two-dimensional gas: particles move freely until they collide and "stick" together irreversibly. These clusters aggregate into bigger structures in an isotropic way, forming fractal structures whose

  5. Energy storage technologies and hybrid architectures for specific diesel-driven rail duty cycles: Design and system integration aspects

    International Nuclear Information System (INIS)

    Meinert, M.; Prenleloup, P.; Schmid, S.; Palacin, R.

    2015-01-01

    Highlights: • We assessed integration of energy storage systems into hybrid system architectures. • We considered mechanical and electrical energy storage systems. • Potential of different combinations has been analyzed by standardized duty cycles. • Most promising are diesel-driven suburban, regional and shunting operations. • Double-layer capacitors and Lithium-ion batteries have the highest potential. - Abstract: The use of diesel-driven traction is an intrinsic part of the functioning of railway systems and it is expected to continue being so for the foreseeable future. The recent introduction of more restrictive greenhouse gas emission levels and other legislation aiming at the improvement of the environmental performance of railway systems has led to the need of exploring alternatives for cleaner diesel rolling stock. This paper focuses on assessing energy storage systems and the design of hybrid system architectures to determine their potential use in specific diesel-driven rail duty cycles. Hydrostatic accumulators, flywheels, Lithium-ion batteries and double-layer capacitors have been assessed and used to design hybrid system architectures. The potential of the different technology combinations has been analyzed using standardized duty cycles enhanced with gradient profiles related to suburban, regional and shunting operations. The results show that double-layer capacitors and Lithium-ion batteries have the highest potential to be successfully integrated into the system architecture of diesel-driven rail vehicles. Furthermore, the results suggest that combining these two energy storage technologies into a single hybridisation package is a highly promising design that draws on their strengths without any significant drawbacks.

  6. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification.

    Science.gov (United States)

    Rueckauer, Bodo; Lungu, Iulia-Alexandra; Hu, Yuhuang; Pfeiffer, Michael; Liu, Shih-Chii

    2017-01-01

    Spiking neural networks (SNNs) can potentially offer an efficient way of doing inference because the neurons in the networks are sparsely activated and computations are event-driven. Previous work showed that simple continuous-valued deep Convolutional Neural Networks (CNNs) can be converted into accurate spiking equivalents. These networks did not include certain common operations such as max-pooling, softmax, batch-normalization and Inception-modules. This paper presents spiking equivalents of these operations, thereby allowing the conversion of nearly arbitrary CNN architectures. We show conversion of popular CNN architectures, including VGG-16 and Inception-v3, into SNNs that produce the best results reported to date on MNIST, CIFAR-10 and the challenging ImageNet dataset. SNNs can trade off classification error rate against the number of available operations, whereas deep continuous-valued neural networks require a fixed number of operations to achieve their classification error rate. From the examples of LeNet for MNIST and BinaryNet for CIFAR-10, we show that with an increase in error rate of a few percentage points, the SNNs can achieve more than 2x reductions in operations compared to the original CNNs. This highlights the potential of SNNs, in particular when deployed on power-efficient neuromorphic spiking neuron chips, for use in embedded applications.

  7. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification

    Directory of Open Access Journals (Sweden)

    Bodo Rueckauer

    2017-12-01

    Full Text Available Spiking neural networks (SNNs) can potentially offer an efficient way of doing inference because the neurons in the networks are sparsely activated and computations are event-driven. Previous work showed that simple continuous-valued deep Convolutional Neural Networks (CNNs) can be converted into accurate spiking equivalents. These networks did not include certain common operations such as max-pooling, softmax, batch-normalization and Inception-modules. This paper presents spiking equivalents of these operations, thereby allowing the conversion of nearly arbitrary CNN architectures. We show conversion of popular CNN architectures, including VGG-16 and Inception-v3, into SNNs that produce the best results reported to date on MNIST, CIFAR-10 and the challenging ImageNet dataset. SNNs can trade off classification error rate against the number of available operations, whereas deep continuous-valued neural networks require a fixed number of operations to achieve their classification error rate. From the examples of LeNet for MNIST and BinaryNet for CIFAR-10, we show that with an increase in error rate of a few percentage points, the SNNs can achieve more than 2x reductions in operations compared to the original CNNs. This highlights the potential of SNNs, in particular when deployed on power-efficient neuromorphic spiking neuron chips, for use in embedded applications.
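Such conversions rest on the observation that the firing rate of a non-leaky integrate-and-fire neuron driven by a constant input approximates the ReLU activation of the corresponding analog unit. A toy demonstration of that correspondence (threshold, step count, and reset scheme chosen for illustration, not taken from the paper):

```python
def spike_rate(net_input, threshold=1.0, steps=1000):
    """Simulate a non-leaky integrate-and-fire neuron driven by a constant
    net input for `steps` timesteps; return its firing rate in spikes/step."""
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += net_input
        if v >= threshold:
            spikes += 1
            v -= threshold   # reset by subtraction keeps the residual charge
    return spikes / steps

# The rate tracks max(0, x) for inputs in [0, threshold):
for x in [-0.5, 0.0, 0.3, 0.7]:
    print(f"input {x:+.1f}: rate {spike_rate(x):.3f} vs ReLU {max(0.0, x):.3f}")
```

Negative inputs never reach threshold (rate 0, like ReLU), and sub-threshold positive inputs fire at a rate proportional to the input; the paper's spiking max-pooling, softmax, and normalization layers extend the same rate-coding argument to those operations.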

  8. Materials Driven Architectural Design and Representation

    DEFF Research Database (Denmark)

    Kruse Aagaard, Anders

    2015-01-01

    This paper aims to outline a framework for a deeper connection between experimentally obtained material knowledge and architectural design. While materials and architecture in the process of realisation are tightly connected, architectural design and representation are often distanced from...... another role in relation to architectural production. It is, in this paper, the intention to point at material research as an active initiator in explorative approaches to architectural design methods and architectural representation. This paper will point at the inclusion of tangible and experimental...... material research in the early phases of architectural design and to that of the architectural set of tools and representation. The paper will through use of existing research and the author’s own material research and practice suggest a way of using a combination of digital drawing, digital fabrication...

  9. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    Science.gov (United States)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This viewgraph presentation reviews the architectural description of the Mission Data Processing and Control System (MPCS). MPCS is an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is designed with these factors in mind: (1) it enables a plug-and-play architecture; (2) it has strong inheritance from GDS components that have been developed for other flight projects (MER, MRO, DAWN, MSAP) and are currently being used in operations and ATLO; and (3) its components are Java-based, platform independent, and designed to consume and produce XML-formatted data.

  10. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    Science.gov (United States)

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  11. Client and event driven data hub system at CDF

    International Nuclear Information System (INIS)

    Kilminster, Ben; McFarland, Kevin; Vaiciulis, Tony; Matsunaga, Hiroyuki; Shimojima, Makoto

    2001-01-01

    The Consumer-Server Logger (CSL) system at the Collider Detector at Fermilab is a client and event driven data hub capable of receiving physics events from multiple connections, and logging them to multiple streams while distributing them to multiple online analysis programs (consumers). Its multiple-partitioned design allows data flowing through different paths of the detector sub-systems to be processed separately. The CSL system, using a set of internal memory buffers and message queues mapped to the location of events within its programs, and running on an SGI 2200 Server, is able to process at least the required 20 MB/s of constant event logging (75 Hz of 250 KB events) while also filtering up to 10 MB/s to consumers requesting specific types of events
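The pattern the abstract describes — one hub receiving events, logging them to streams, and fanning them out only to consumers that requested specific event types — can be sketched generically (names and data invented; not the actual CSL code, which uses shared memory buffers and message queues):

```python
from collections import defaultdict

class EventHub:
    """Toy event hub: log every event to its stream, and hand copies to
    consumers that registered interest in that event type."""
    def __init__(self):
        self.streams = defaultdict(list)    # stream name -> logged events
        self.consumers = defaultdict(list)  # event type -> consumer queues

    def register_consumer(self, event_type):
        queue = []
        self.consumers[event_type].append(queue)
        return queue

    def log_event(self, stream, event_type, payload):
        event = (event_type, payload)
        self.streams[stream].append(event)          # logging path
        for queue in self.consumers[event_type]:    # distribution path
            queue.append(event)

hub = EventHub()
muons = hub.register_consumer("muon")   # a consumer wanting muon events only
hub.log_event("streamA", "muon", b"...raw data...")
hub.log_event("streamA", "jet", b"...raw data...")
print(len(hub.streams["streamA"]), len(muons))  # 2 1 - two logged, one distributed
```

Partitioning, as in CSL, would amount to running independent hub instances per detector path.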

  12. Biosurveillance Using Clinical Diagnoses and Social Media Indicators in Military Populations

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Volkova, Svitlana [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rounds, Jeremiah [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Charles-Smith, Lauren E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Harrison, Joshua J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Joshua A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Han, Keith S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-02-23

    U.S. military influenza surveillance uses electronic reporting of clinical diagnoses to monitor the health of military personnel and detect naturally occurring and bioterrorism-related epidemics. While accurate, these systems lack timeliness. More recently, researchers have used novel data sources to detect influenza in real time and capture nontraditional populations. With data-mining techniques, military social media users are identified and influenza-related discourse is integrated, along with medical data, into a comprehensive disease model. By leveraging heterogeneous data streams and developing dashboard biosurveillance analytics, the researchers hope to increase the speed at which outbreaks are detected and provide accurate disease forecasting among military personnel.

  13. A computer architecture for intelligent machines

    Science.gov (United States)

    Lefebvre, D. R.; Saridis, G. N.

    1992-01-01

    The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  14. Oak Ridge Bio-surveillance Toolkit (ORBiT): Integrating Big-Data Analytics with Visual Analysis for Public Health Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL; Pullum, Laura L [ORNL; Steed, Chad A [ORNL; Chennubhotla, Chakra [University of Pittsburgh School of Medicine, Pittsburgh PA; Quinn, Shannon [University of Pittsburgh School of Medicine, Pittsburgh PA

    2013-01-01

    In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets, and (3) presenting the results from the analytics as a visual interface with which the end-user can interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near-real-time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease-susceptible regions.

  15. Constructing rigorous and broad biosurveillance networks for detecting emerging zoonotic outbreaks.

    Directory of Open Access Journals (Sweden)

    Mac Brown

    Full Text Available Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited, and mathematical modeling can play a supporting role in examining a wide range of scenarios of pathogen spread. We demonstrate how a hierarchy of mathematical and statistical tools can be used in surveillance planning to help guide successful surveillance and mitigation policies for a wide range of zoonotic pathogens. The model forecasts can help clarify the complexities of potential scenarios, and optimize biosurveillance programs for rapidly detecting infectious diseases. Using the highly pathogenic zoonotic H5N1 avian influenza 2006-2007 epidemic in Nigeria as an example, we determined the risk for infection for localized areas in an outbreak and designed biosurveillance stations that are effective for different pathogen strains and a range of possible outbreak locations. We created a general multi-scale, multi-host stochastic SEIR epidemiological network model, with both short and long-range movement, to simulate the spread of an infectious disease through Nigerian human, poultry, backyard duck, and wild bird populations. We chose parameter ranges specific to avian influenza (but not to a particular strain) and used a Latin hypercube sample experimental design to investigate epidemic predictions in a thousand simulations. We ranked the risk of local regions by the number of times they became infected in the ensemble of simulations. These spatial statistics were then compiled into a potential risk map of infection. Finally, we validated the results with a known outbreak, using spatial analysis of all the simulation runs to show that the progression matched closely the observed locations of the farms infected in the 2006-2007 epidemic.
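
    The ensemble-and-ranking idea above can be illustrated with a much smaller sketch. The following is a hedged, minimal version, not the paper's model: it replaces the multi-host Nigerian network with a ring of identical regions, and all rates, population sizes and the `travel` coupling are invented for illustration. It runs an ensemble of discrete-time stochastic SEIR simulations and ranks regions by how often they become infected, mirroring the risk-map construction.

```python
import random

def seir_run(rng, n_regions=5, pop=150, steps=50,
             beta=0.3, sigma=0.2, gamma=0.1, travel=0.02, seed_region=0):
    """One stochastic SEIR run on a ring of regions; returns which regions got infected."""
    S = [pop] * n_regions
    E = [0] * n_regions
    I = [0] * n_regions
    R = [0] * n_regions
    I[seed_region] = 5
    S[seed_region] -= 5
    hit = [False] * n_regions
    for _ in range(steps):
        for r in range(n_regions):
            N = S[r] + E[r] + I[r] + R[r]
            # force of infection: local mixing plus short-range movement to neighbors
            left, right = (r - 1) % n_regions, (r + 1) % n_regions
            foi = (beta * I[r] + travel * (I[left] + I[right])) / N
            new_e = sum(rng.random() < foi for _ in range(S[r]))
            new_i = sum(rng.random() < sigma for _ in range(E[r]))
            new_r = sum(rng.random() < gamma for _ in range(I[r]))
            S[r] -= new_e
            E[r] += new_e - new_i
            I[r] += new_i - new_r
            R[r] += new_r
            if I[r] > 0:
                hit[r] = True
    return hit

def risk_ranking(runs=40):
    """Count, over an ensemble, how often each region becomes infected."""
    rng = random.Random(42)
    counts = [0] * 5
    for _ in range(runs):
        for r, h in enumerate(seir_run(rng)):
            counts[r] += h
    return counts  # higher count = more frequently infected in the ensemble

print(risk_ranking())
```

    The paper additionally samples the parameter ranges themselves with a Latin hypercube design; here the parameters are simply fixed for brevity.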

  16. Fork-join and data-driven execution models on multi-core architectures: Case study of the FMM

    KAUST Repository

    Amer, Abdelhalim; Maruyama, Naoya; Pericàs, Miquel; Taura, Kenjiro; Yokota, Rio; Matsuoka, Satoshi

    2013-01-01

    Extracting maximum performance of multi-core architectures is a difficult task primarily due to bandwidth limitations of the memory subsystem and its complex hierarchy. In this work, we study the implications of fork-join and data-driven execution

  17. Smart SOA platforms in cloud computing architectures

    CERN Document Server

    Exposito, Ernesto

    2014-01-01

    This book is intended to introduce the principles of the Event-Driven and Service-Oriented Architecture (SOA 2.0) and its role in the new interconnected world based on the cloud computing architecture paradigm. In this new context, the concept of "service" is widely applied to the hardware and software resources available in the new generation of the Internet. The authors focus on how current and future SOA technologies provide the basis for the smart management of the service model provided by the Platform as a Service (PaaS) layer.

  18. The ATLAS Event Index: The Architecture of the Core Engine

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2017-01-01

    The global view of the ATLAS Event Index system has been presented in the 17th ACAT Conference. This article concentrates on the architecture of the system core component. This component handles the final stage of the event metadata import, it organizes its storage and provides a fast and feature-rich access to all information. A user is able to interrogate metadata in various ways, including by executing user-provided code on the data to make selections and to interpret the results. A wide spectrum of clients is available, from a set of linux-like commands to an interactive graphical Web Service. The stored event metadata contain the basic description of the related events, the references to the experiment event storage, the full trigger record and can be extended with other event characteristics. Derived collections of events can be created. Such collections can be annotated and tagged with further information.

  19. The ATLAS Event Index: The Architecture of the Core Engine

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration; Hrivnac, Julius

    2017-01-01

    The global view of the ATLAS Event Index system has been presented in the last ACAT. This talk will concentrate on the architecture of the system core component. This component handles the final stage of the event metadata import, it organizes its storage and provides a fast and feature-rich access to all information. A user is able to interrogate metadata in various ways, including by executing user-provided code on the data to make selections and to interpret the results. A wide spectrum of clients is available, from a set of linux-like commands to an interactive graphical Web Service. The stored event metadata contain the basic description of the related events, the references to the experiment event storage, the full trigger record and can be extended with other event characteristics. Derived collections of events can be created. Such collections can be annotated and tagged with further information. This talk will describe all system sub-components and their development evolution, which lead into the choi...

  20. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
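
    The kind of model the abstract describes can be sketched in a few dozen lines of discrete-event simulation. The sketch below is not the authors' model: it reduces the two-part framework to Poisson arrivals of a single request type competing for a fixed pool of servers, with all rates chosen purely for illustration, and reports the mean response time.

```python
import heapq
import random

def simulate(n_servers=4, arrival_rate=3.0, mean_service=1.0,
             horizon=10_000.0, seed=1):
    """Discrete-event simulation of requests sharing a fixed server pool."""
    rng = random.Random(seed)
    events = []  # min-heap of (time, kind)
    t = 0.0
    # pre-schedule Poisson arrivals over the horizon
    while t < horizon:
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrival"))
    free = n_servers
    queue = []       # arrival times of waiting requests
    responses = []   # completed response times
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            queue.append(now)
        else:            # a departure frees a server
            free += 1
        while free and queue:
            arrived = queue.pop(0)
            service = rng.expovariate(1.0 / mean_service)
            free -= 1
            heapq.heappush(events, (now + service, "departure"))
            responses.append(now + service - arrived)
    return sum(responses) / len(responses)

print(f"mean response time: {simulate():.2f}")
```

    Varying `n_servers` against a fixed demand profile is the simulation analogue of provisioning a traditional versus an elastic cloud architecture.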

  1. HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains

    Science.gov (United States)

    Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro

    The Service-Oriented Architecture (SOA) development paradigm has emerged to improve the critical issues of creating, modifying and extending solutions for business process integration, incorporating process automation and the automated exchange of information between organizations. Web services technology follows the SOA's principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, and these are a central feature of supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA, together with a set of mechanisms for business process pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and a reliable messaging service.
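
    The event publish/subscription mechanism that the EDA side contributes can be reduced to a few lines. This is a generic illustration, not HYDRA's implementation; the topic name, event fields and handlers are invented:

```python
from collections import defaultdict

class EventBroker:
    """Minimal publish/subscribe broker: topics map to lists of handlers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # push the event to every service subscribed to this topic
        for handler in self.subscribers[topic]:
            handler(event)

broker = EventBroker()
log = []
broker.subscribe("order.created", lambda e: log.append(f"procure {e['sku']}"))
broker.subscribe("order.created", lambda e: log.append(f"notify {e['buyer']}"))
broker.publish("order.created", {"sku": "A-42", "buyer": "acme"})
print(log)  # both subscribed services reacted to one published event
```

    In the hybrid SOA/EDA setting, the handlers would be service invocations rather than local callbacks, but the decoupling of producers from consumers is the same.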

  2. Virtual Sensor Web Architecture

    Science.gov (United States)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models; iii) event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iv) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  3. Kinetic Digitally-Driven Architectural Structures as ‘Marginal’ Objects – a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Sokratis Yiannoudes

    2014-07-01

    Full Text Available Although the most important reasons for designing digitally-driven kinetic architectural structures seem to be practical ones, namely functional flexibility and adaptation to changing conditions and needs, this paper argues that there is possibly an additional socio-cultural aspect driving their design and construction. Through this argument, the paper attempts to debate their status and question their concepts and practices. Looking at the design explorations and discourses of real or visionary technologically-augmented architecture since the 1960s, one cannot fail to notice the use of biological metaphors and concepts to describe them – an attempt to ‘naturalise’ them which culminates today in the conception of kinetic structures and intelligent environments as literally ‘alive’. Examining these attitudes in contemporary examples, the paper demonstrates that digitally-driven kinetic structures can be conceived as artificial ‘living’ machines that undermine the boundary between the natural and the artificial. It argues that by ‘humanising’ these structures, attributing biological characteristics such as self-initiated motion, intelligence and reactivity, their designers are ‘trying’ to subvert and blur the human-machine(-architecture) discontinuity. The argument is developed by building a conceptual framework which is based on evidence from the social studies of science and technology, in particular their critique of modern nature-culture and human-machine distinctions, as well as the history and theory of artificial life which discuss the cultural significance and sociology of ‘living’ objects. In particular, the paper looks into the techno-scientific discourses and practices which, since the 18th century, have been exploring the creation of ‘marginal’ objects, i.e. seemingly alive objects made to challenge the nature-artifice boundary.

  4. Discovering Multi-scale Co-occurrence Patterns of Asthma and Influenza with the Oak Ridge Bio-surveillance Toolkit

    Directory of Open Access Journals (Sweden)

    Arvind eRamanathan

    2015-08-01

    Full Text Available We describe a data-driven unsupervised machine learning approach to extract geo-temporal co-occurrence patterns of asthma and the flu from large-scale electronic healthcare reimbursement claims (eHRC) datasets. Specifically, we examine the eHRC data from the 2009-2010 pandemic H1N1 influenza season and analyze whether different geographic regions within the United States (US) showed an increase in co-occurrence patterns of the flu and asthma. Our analyses reveal that the temporal patterns extracted from the eHRC data show a distinct lag time between the peak incidence of asthma and of the flu. While the increased occurrence of asthma contributed to increased flu incidence during the pandemic, this co-occurrence is predominant for female patients. The geo-temporal patterns reveal that the co-occurrences of the flu and asthma are typically concentrated within the south-east US. Further, in agreement with previous studies, large urban areas (such as New York, Miami and Los Angeles) exhibit co-occurrence patterns that suggest a peak incidence of asthma and flu notably early in the spring and winter seasons. Together, our data-analytic approach, integrated within the Oak Ridge Bio-surveillance Toolkit platform, demonstrates how eHRC data can provide novel insights into co-occurring disease patterns.

  5. Enhancing 'Whole-of-Government' Response to Biological Events in Korea: Able Response 2014.

    Science.gov (United States)

    Tak, Sangwoo; Jareb, Anton; Choi, Suon; Sikes, Marvin; Choi, Yeon Hwa; Boo, Hyeong-Wook

    2018-01-01

    Since 2011, the Republic of Korea (ROK) and United States (U.S.) have been collaborating to conduct inter- and intra-governmental exercises to jointly respond to biological events in Korea. These exercises highlight U.S. interest in increasing its global biosurveillance capability and the ROK's interest in improving cooperation among ministries to respond to crises. With Able Response (AR) exercises, the ROK and U.S. have improved coordination among US and ROK government and defense agencies responding to potential bio-threats and identified additional areas on which to apply refinements in policies and practices. In 2014, the AR exercise employed a Biosurveillance Portal (BSP) to facilitate more effective communication among participating agencies and countries including Australia. In the present paper, we seek to provide a comprehensive assessment of the AR 2014 (AR14) exercise and make recommendations for future improvements. Incorporating a more realistic response in future scenarios by integrating a tactical response episode in the exercise is recommended.

  6. Biosurveillance in Central Asia: Successes and Challenges of Tick-Borne Disease Research in Kazakhstan and Kyrgyzstan

    OpenAIRE

    Hay, John; Yeh, Kenneth B.; Dasgupta, Debanjana; Shapieva, Zhanna; Omasheva, Gulnara; Deryabin, Pavel; Nurmakhanov, Talgat; Ayazbayev, Timur; Andryushchenko, Alexei; Zhunushov, Asankadyr; Hewson, Roger; Farris, Christina M.; Richards, Allen L.

    2016-01-01

    Central Asia is a vast geographic region that includes five former Soviet Union republics: Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan, and Uzbekistan. The region has a unique infectious disease burden, and a history that includes Silk Road trade routes and networks that were part of the anti-plague and biowarfare programs in the former Soviet Union. Post-Soviet Union biosurveillance research in this unique area of the world has met with several challenges, including lack of funding and ...

  7. LESSONS LEARNED Biosurveillance Mobile App Development Intern Competition (Summer 2013)

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Christine F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henry, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corley, Courtney D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-01-14

    The purpose of the lessons learned document for the BEOWulf Biosurveillance Mobile App Development Intern Competition is to capture the project’s lessons learned in a formal document for use by other project managers on similar future projects. This document may be used as part of new project planning for similar projects in order to determine what problems occurred and how those problems were handled and may be avoided in the future. Additionally, this document details what went well with the project and why, so that other project managers may capitalize on these actions. Project managers may also use this document to determine who the project team members were in order to solicit feedback for planning their projects in the future. This document will be formally communicated with the organization and will become a part of the organizational assets and archives.

  8. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined in the context of a case study involving evacuation from a commercial shopping mall. Pedestrian walking is modeled with Cellular Automata, while the event-driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. When simulating pedestrians' movement routes, the model takes into account customers' purchase intentions and pedestrian density. Based on the combined Cellular Automata with Dynamic Floor Field and event-driven evacuation model, we can reflect the behavioral characteristics of customers and clerks in both normal situations and emergency evacuations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
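
    A hedged, minimal illustration of the cellular-automaton core (static floor field only; the dynamic field, purchase intentions, and the customer/clerk layers from the paper are omitted): each pedestrian steps to the free neighboring cell with the smallest distance to the exit, and conflicts are resolved by sequential update. Grid size, exit position and starting positions are invented.

```python
def step(grid_w, grid_h, exit_cell, pedestrians):
    """Advance each pedestrian one cell toward exit_cell; returns (positions, evacuees)."""
    dist = lambda c: abs(c[0] - exit_cell[0]) + abs(c[1] - exit_cell[1])
    occupied = set(pedestrians)
    moved, evacuated = [], 0
    for p in pedestrians:
        x, y = p
        options = [(x + dx, y + dy)
                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if 0 <= x + dx < grid_w and 0 <= y + dy < grid_h]
        free = [c for c in options if c not in occupied]
        target = min(free + [p], key=dist)   # staying put is always allowed
        occupied.discard(p)
        if target == exit_cell:
            evacuated += 1                   # reached the exit: leave the grid
        else:
            occupied.add(target)
            moved.append(target)
    return moved, evacuated

peds = [(0, 0), (1, 0), (2, 2)]
evacuated = 0
for _ in range(50):                          # bounded number of time steps
    if not peds:
        break
    peds, out = step(5, 5, (4, 4), peds)
    evacuated += out
print("evacuated:", evacuated)
```

    A dynamic floor field would add a second, decaying grid recording where pedestrians have recently walked, biasing followers along the same paths; that is left out here for brevity.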

  9. Heterogeneous network architectures

    DEFF Research Database (Denmark)

    Christiansen, Henrik Lehrmann

    2006-01-01

    Future networks will be heterogeneous! Due to the sheer size of networks (e.g., the Internet) upgrades cannot be instantaneous and thus heterogeneity appears. This means that instead of trying to find the solution, networks should be designed as being heterogeneous. One of the key requirements here is flexibility. This thesis investigates such heterogeneous network architectures and how to make them flexible. A survey of algorithms for network design is presented, and it is described how using heuristics can increase the speed. A hierarchical, MPLS-based network architecture is described, and it is discussed that it is advantageous to heterogeneous networks, illustrated by a number of examples. Modeling and simulation is a well-known way of doing performance evaluation. An approach to event-driven simulation of communication networks is presented and mixed complexity modeling, which can simplify

  10. Estimating the Probability of Wind Ramping Events: A Data-driven Approach

    OpenAIRE

    Wang, Cheng; Wei, Wei; Wang, Jianhui; Qiu, Feng

    2016-01-01

    This letter proposes a data-driven method for estimating the probability of wind ramping events without exploiting the exact probability distribution function (PDF) of wind power. Actual wind data validates the proposed method.
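
    The letter's estimator is not reproduced here; as a generic, distribution-free illustration of the same idea, one can simply count how often the change in wind power across a time window exceeds a ramp threshold. The series and threshold below are invented:

```python
def ramp_probability(power, threshold, window=1):
    """Empirical fraction of windows whose absolute power change exceeds threshold."""
    diffs = [abs(power[i + window] - power[i])
             for i in range(len(power) - window)]
    return sum(d > threshold for d in diffs) / len(diffs)

# illustrative hourly wind-power series (MW) with three visible ramps
series = [10, 12, 30, 31, 15, 14, 40, 41]
print(ramp_probability(series, threshold=10))  # 3 of 7 windows ramp -> ~0.43
```

    Because the estimate is built directly from observed differences, no assumption about the wind-power PDF is needed, which is the point the letter makes.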

  11. Seabed resident event driven profiling system (SREP). Concept, design and tests

    Digital Repository Service at National Institute of Oceanography (India)

    Mascarenhas, A.A.M.Q.; Afzulpurkar, S.; Maurya, P.K.; Fernandes, L.; Madhan, R.; Desa, E.S.; Dabolkar, N.A.; Navelkar, G.S.; Naik, L.; Shetye, V.G.; Shetty, N.B.; Prabhudesai, S.P.; Nagvekar, S.; Vimalakumari, D.

    The seabed resident event driven profiling system (SREP) described here offers a novel, optimized approach to profiling in coastal waters from seabed to sea surface during the rough seas encountered in the southwest monsoon season (June...

  12. Selection of initial events of accelerator driven subcritical system

    International Nuclear Information System (INIS)

    Wang Qianglong; Hu Liqin; Wang Jiaqun; Li Yazhou; Yang Zhiyi

    2013-01-01

    The Probabilistic Safety Assessment (PSA) is an important tool in reactor safety analysis and a significant reference for the design and operation of reactors. The selection of initial events is the origin and foundation of the PSA for a reactor. The Accelerator Driven Subcritical System (ADS) has advanced design characteristics, complicated subsystems and little engineering and operating experience, which makes it much more difficult to identify the initial events of ADS. Based on the current design project of ADS, the system's safety characteristics and special issues were analyzed in this article. After a series of deductions with the Master Logic Diagram (MLD) method, and considering the related experience of other advanced research reactors, a preliminary list of initial events was compiled, which provides the foundation for the subsequent safety assessment. (authors)

  13. Teleradiology system analysis using a discrete event-driven block-oriented network simulator

    Science.gov (United States)

    Stewart, Brent K.; Dwyer, Samuel J., III

    1992-07-01

    Performance evaluation and trade-off analysis are the central issues in the design of communication networks. Simulation plays an important role in computer-aided design and analysis of communication networks and related systems, allowing testing of numerous architectural configurations and fault scenarios. We are using the Block Oriented Network Simulator (BONeS, Comdisco, Foster City, CA) software package to perform discrete, event-driven Monte Carlo simulations in capacity planning, trade-off analysis and evaluation of alternate architectures for a high-speed, high-resolution teleradiology project. A queuing network model of the teleradiology system has been devised, simulations executed, and the results analyzed. The wide area network link uses a switched, dial-up N × 56 kbps inverse multiplexer where the number of digital voice-grade lines (N) can vary from one (DS-0) through 24 (DS-1). The proposed goal of such a system is 200 films (2048 × 2048 × 12-bit) transferred between a remote and local site in an eight hour period with a mean delay time less than five minutes. It is found that: (1) the DS-1 service limit is around 100 films per eight hour period with a mean delay time of 412 +/- 39 seconds, short of the goal stipulated above; (2) compressed video teleconferencing can be run simultaneously with image data transfer over the DS-1 wide area network link without impacting the performance of the described teleradiology system; (3) there is little sense in upgrading to a higher bandwidth WAN link like DS-2 or DS-3 for the current system; and (4) the goal of transmitting 200 films in an eight hour period with a mean delay time less than five minutes can be achieved simply if the laser printer interface is updated from the current DR-11W interface to a much faster SCSI interface.
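
    A back-of-envelope check of the quoted numbers, assuming raw 12-bit pixel packing and ignoring protocol overhead and queueing (both of which the BONeS simulation does capture and which account for the higher observed delays):

```python
# one 2048 x 2048 x 12-bit film, transferred over N = 24 bonded 56 kbps lines (DS-1 case)
film_bits = 2048 * 2048 * 12
ds1_bps = 24 * 56_000
seconds_per_film = film_bits / ds1_bps

print(f"{seconds_per_film:.1f} s per film")                    # ~37.4 s
print(f"{100 * seconds_per_film / 3600:.2f} h for 100 films")  # ~1.04 h
```

    Raw bandwidth alone would comfortably move 200 films in eight hours, so the simulated 100-film limit and 412-second mean delay must come from contention, scheduling and the slow printer interface rather than from the link rate, which is consistent with findings (3) and (4).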

  14. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures in both normal and abnormal operating environments.

  15. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks.

    Science.gov (United States)

    Naveros, Francisco; Garrido, Jesus A; Carrillo, Richard R; Ros, Eduardo; Luque, Niceto R

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranging from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under
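
    The two families can be contrasted on the simplest of the listed models, the LIF neuron. The sketch below is illustrative only (parameters invented; no look-up tables or bi-fixed-step logic): the time-driven version iterates fixed steps, while the event-driven version jumps analytically from one input spike to the next using the closed-form membrane decay.

```python
import math

TAU, V_TH, W = 20.0, 1.0, 0.4   # membrane time constant (ms), threshold, synaptic weight

def time_driven(spike_times, dt=0.1, t_end=100.0):
    """Iterate fixed steps, decaying the membrane and delivering due inputs."""
    v, t, out = 0.0, 0.0, []
    pending = sorted(spike_times)
    while t < t_end:
        v *= math.exp(-dt / TAU)        # exact decay over one step
        while pending and pending[0] <= t:
            v += W
            pending.pop(0)
        if v >= V_TH:
            out.append(t)
            v = 0.0
        t += dt
    return out

def event_driven(spike_times):
    """Jump directly between input events using the closed-form decay."""
    v, last, out = 0.0, 0.0, []
    for t in sorted(spike_times):
        v *= math.exp(-(t - last) / TAU)
        v += W
        last = t
        if v >= V_TH:
            out.append(t)
            v = 0.0
    return out

inputs = [1.0, 2.0, 3.0, 50.0, 51.0, 52.0]
print(event_driven(inputs))   # fires on the third spike of each burst
```

    For this exactly solvable model the two approaches agree up to the step size; the paper's contribution is making this trade work for stiffer models (AdEx, HH) where no closed form exists.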

  16. Event-driven charge-coupled device design and applications therefor

    Science.gov (United States)

    Doty, John P. (Inventor); Ricker, Jr., George R. (Inventor); Burke, Barry E. (Inventor); Prigozhin, Gregory Y. (Inventor)

    2005-01-01

    An event-driven X-ray CCD imager device uses a floating-gate amplifier or other non-destructive readout device to non-destructively sense the charge level in a charge packet associated with a pixel. The output of the floating-gate amplifier is used to identify each pixel that has a charge level above a predetermined threshold. If the charge level is above the threshold, the charge in the triggering charge packet and in the charge packets from neighboring pixels needs to be measured accurately. A charge delay register is included in the event-driven X-ray CCD imager device to enable recovery of the charge packets from neighboring pixels for accurate measurement. When a charge packet reaches the end of the charge delay register, control logic either dumps the charge packet or steers it to a charge FIFO to preserve it, if the charge packet is determined to be one that needs accurate measurement. A floating-diffusion amplifier or other low-noise output stage device, which converts charge level to voltage level with high precision, provides the final measurement of the charge packets. The voltage level is eventually digitized by a high-linearity ADC.
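
    The selection policy the abstract describes (trigger on a threshold, then keep the triggering packet plus its neighbors for precise measurement) can be illustrated in one dimension. The charge values and threshold below are invented:

```python
def select_for_measurement(row, threshold):
    """Indices of pixels to measure precisely: each triggering pixel and its neighbors."""
    keep = set()
    for i, q in enumerate(row):
        if q > threshold:
            keep.update(j for j in (i - 1, i, i + 1) if 0 <= j < len(row))
    return sorted(keep)

charges = [3, 5, 120, 4, 2, 2, 98, 6]
print(select_for_measurement(charges, threshold=50))  # triggering pixels plus neighbors
```

    In the device this decision is made in hardware while packets traverse the charge delay register; everything not selected is dumped without ever reaching the slow, precise output stage.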

  17. A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises

    Science.gov (United States)

    Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.

    2012-04-01

    The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitates information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. 
The most prominent challenges on this layer

  18. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    Science.gov (United States)

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
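
    The core scheduling rule (grant the token by beacon data age rather than round-robin) is easy to sketch. The vehicle IDs, timestamps and token-hold time below are invented for illustration:

```python
def next_token_holder(last_beacon_time, now):
    """Pick the vehicle with the oldest beacon, i.e. the largest data age."""
    return max(last_beacon_time, key=lambda v: now - last_beacon_time[v])

# time (s) at which each platoon member last transmitted its beacon
last_beacon = {"veh1": 9.0, "veh2": 3.5, "veh3": 7.2}

order = []
now = 10.0
for _ in range(3):
    v = next_token_holder(last_beacon, now)
    order.append(v)
    last_beacon[v] = now   # transmitting refreshes that vehicle's beacon
    now += 0.01            # assumed token-hold time
print(order)
```

    Because transmitting resets a vehicle's data age to zero, the stalest member is always served next, which is what gives every vehicle repeated transmission opportunities within one beacon generation interval.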

  19. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies

    Directory of Open Access Journals (Sweden)

    Ali Balador

    2018-03-01

    Full Text Available Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.

  20. Architectural Strategies for Enabling Data-Driven Science at Scale

    Science.gov (United States)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  1. Model Driven Engineering

    Science.gov (United States)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  2. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied on the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of the satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.

  3. The ARCOMEM Architecture for Social- and Semantic-Driven Web Archiving

    Directory of Open Access Journals (Sweden)

    Thomas Risse

    2014-11-01

    Full Text Available The constantly growing amount of Web content and the success of the Social Web lead to increasing needs for Web archiving. These needs go beyond the pure preservation of Web pages. Web archives are turning into “community memories” that aim at building a better understanding of the public view on, e.g., celebrities, court decisions and other events. Due to the size of the Web, the traditional “collect-all” strategy is in many cases not the best method to build Web archives. In this paper, we present the ARCOMEM (From Collect-All Archives to Community Memories) architecture and implementation that uses semantic information, such as entities, topics and events, complemented with information from the Social Web to guide a novel Web crawler. The resulting archives are automatically enriched with semantic meta-information to ease the access and allow retrieval based on conditions that involve high-level concepts.

  4. The ATLAS EventIndex: architecture, design choices, deployment and first operation experience

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration; Cranshaw, Jack; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Glasman, Claudia; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Prokoshin, Fedor; Salt, José; Sánchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2015-01-01

    The EventIndex is the complete catalogue of all ATLAS events, keeping the references to all files that contain a given event in any processing stage. It replaces the TAG database, which had been in use during LHC Run 1. For each event it contains its identifiers, the trigger pattern and the GUIDs of the files containing it. Major use cases are event picking, feeding the Event Service used on some production sites, and technical checks of the completion and consistency of processing campaigns. The system design is highly modular so that its components (data collection system, storage system based on Hadoop, query web service and interfaces to other ATLAS systems) could be developed separately and in parallel during LS1. The EventIndex is in operation for the start of LHC Run 2. This paper describes the high-level system architecture, the technical design choices and the deployment process and issues. The performance of the data collection and storage systems, as well as the query services, are also reported.

  5. The ATLAS EventIndex: architecture, design choices, deployment and first operation experience

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration; Cranshaw, Jack; Favareto, Andrea; Fernandez Casani, Alvaro; Gallas, Elizabeth; Glasman, Claudia; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Malon, David; Prokoshin, Fedor; Salt, José; Sánchez, Javier; Rainer Toebbicke; Yuan, Ruijun

    2015-01-01

    The EventIndex is the complete catalogue of all ATLAS events, keeping the references to all files that contain a given event in any processing stage. It replaces the TAG database, which had been in use during LHC Run 1. For each event it contains its identifiers, the trigger pattern and the GUIDs of the files containing it. Major use cases are event picking, feeding the Event Service used on some production sites, and technical checks of the completion and consistency of processing campaigns. The system design is highly modular so that its components (data collection system, storage system based on Hadoop, query web service and interfaces to other ATLAS systems) could be developed separately and in parallel during LS1. The EventIndex is in operation for the start of LHC Run 2. This talk describes the high level system architecture, the technical design choices and the deployment process and issues. The performance of the data collection and storage systems, as well as the query services, will be reported.

  6. The ATLAS EventIndex: architecture, design choices, deployment and first operation experience

    Science.gov (United States)

    Barberis, D.; Cárdenas Zárate, S. E.; Cranshaw, J.; Favareto, A.; Fernández Casaní, Á.; Gallas, E. J.; Glasman, C.; González de la Hoz, S.; Hřivnáč, J.; Malon, D.; Prokoshin, F.; Salt Cairols, J.; Sánchez, J.; Többicke, R.; Yuan, R.

    2015-12-01

    The EventIndex is the complete catalogue of all ATLAS events, keeping the references to all files that contain a given event in any processing stage. It replaces the TAG database, which had been in use during LHC Run 1. For each event it contains its identifiers, the trigger pattern and the GUIDs of the files containing it. Major use cases are event picking, feeding the Event Service used on some production sites, and technical checks of the completion and consistency of processing campaigns. The system design is highly modular so that its components (data collection system, storage system based on Hadoop, query web service and interfaces to other ATLAS systems) could be developed separately and in parallel during LS1. The EventIndex is in operation for the start of LHC Run 2. This paper describes the high-level system architecture, the technical design choices and the deployment process and issues. The performance of the data collection and storage systems, as well as the query services, are also reported.

  7. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
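As a sketch of the third method, a generic additive Holt-Winters smoother with weekly seasonality can produce the one-step-ahead forecasts whose residuals feed the control chart (the smoothing constants below are illustrative, not the tuned values from the study):

```python
def holt_winters_additive(series, period=7, alpha=0.4, beta=0.1, gamma=0.3):
    """One-step-ahead additive Holt-Winters forecasts and residuals."""
    # initialise level, trend and seasonal offsets from the first two periods
    level = sum(series[:period]) / period
    trend = (sum(series[period:2 * period]) - sum(series[:period])) / period ** 2
    season = [series[i] - level for i in range(period)]
    forecasts, residuals = [], []
    for t in range(2 * period, len(series)):
        f = level + trend + season[t % period]
        forecasts.append(f)
        residuals.append(series[t] - f)  # residuals form the algorithmic input
        prev_level = level
        level = alpha * (series[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (series[t] - level) + (1 - gamma) * season[t % period]
    return forecasts, residuals

# usage: a purely weekly day-of-week pattern is forecast exactly,
# so the residuals handed to the detection algorithm vanish
counts = [10 + t % 7 for t in range(70)]
forecasts, residuals = holt_winters_additive(counts)
```

Subtracting these forecasts from the observations removes the day-of-week and trend behaviour, which is exactly what makes the residual stream suitable for control chart monitoring.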

  8. Client-Side Event Processing for Personalized Web Advertisement

    Science.gov (United States)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the current Web user visiting a particular piece of Web content. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking into account the user's current preferences. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by performing complex event processing on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.

  9. Polar cap flow channel events: spontaneous and driven responses

    Directory of Open Access Journals (Sweden)

    P. E. Sandholt

    2010-11-01

    Full Text Available We present two case studies of specific flow channel events appearing at the dusk and/or dawn polar cap boundary during passage at Earth of interplanetary (IP) coronal mass ejections (ICMEs) on 10 January and 25 July 2004. The channels of enhanced (>1 km/s) antisunward convection are documented by SuperDARN radars and dawn-dusk crossings of the polar cap by the DMSP F13 satellite. The relationship with Birkeland currents (C1–C2) located poleward of the traditional R1–R2 currents is demonstrated. The convection events are manifest in ground magnetic deflections obtained from the IMAGE (International Monitor for Auroral Geomagnetic Effects) Svalbard chain of ground magnetometer stations located within 71–76° MLAT. By combining the ionospheric convection data and the ground magnetograms we are able to study the temporal behaviour of the convection events. In the two ICME case studies the convection events belong to two different categories, i.e., directly driven and spontaneous events. In the 10 January case two sharp southward turnings of the ICME magnetic field excited corresponding convection events as detected by IMAGE and SuperDARN. We use this case to determine the ground magnetic signature of enhanced flow channel events (the NH-dusk/By<0 variant). In the 25 July case a several-hour-long interval of steady southwest ICME field (Bz<0; By<0) gave rise to a long series of spontaneous convection events as detected by IMAGE when the ground stations swept through the 12:00–18:00 MLT sector. From the ground-satellite conjunction on 25 July we infer the pulsed nature of the polar cap ionospheric flow channel events in this case. The typical duration of these convection enhancements in the polar cap is 10 min.

  10. Modeling event building architecture for the triggerless data acquisition system for PANDA experiment at the HESR facility at FAIR/GSI

    International Nuclear Information System (INIS)

    Korcyl, K; Konorov, I; Kühn, W; Schmitt, L

    2012-01-01

    A novel architecture is being proposed for the data acquisition and trigger system of the PANDA experiment at the HESR facility at FAIR/GSI. The experiment will run without a hardware trigger signal, using timestamps to correlate detector data from a given time window. The broad physics program in combination with the high rate of 2 × 10^7 interactions per second requires very selective filtering algorithms accessing information from many detectors. Therefore the effective filtering will happen later than in today's systems, i.e., after the event building. To achieve that, the complete architecture will be built of two stages: the data concentrator stage providing event building and the rate reduction stage. For the former stage, which requires a throughput of 100 GB/s to perform event building, we propose two layers of ATCA crates filled with Compute Nodes - modules designed at IHEP and the University of Giessen for trigger and data acquisition systems. Currently each board is equipped with 5 Virtex4 FX60 FPGAs, and high-bandwidth connectivity is provided by 8 front panel RocketIO ports and 12 backplane ports for inter-module communication. We designed simplified models of the components of the architecture and, using the SystemC library as support for the discrete event simulations, demonstrate the expected throughput of the full-size system. We also show the impact of some architectural choices and key parameters on the architecture's performance.
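In a triggerless system of this kind, event building reduces to grouping timestamped detector hits that fall within a common time window. A single-process Python sketch of that step (the real system performs it in parallel on FPGA-based Compute Nodes; the window value and data are illustrative):

```python
def build_events(hits, window):
    """Group (timestamp, data) hits into events: a hit starts a new event
    when it arrives more than `window` after the first hit of the current one."""
    events, current = [], []
    for ts, data in sorted(hits):
        if current and ts - current[0][0] > window:
            events.append(current)
            current = []
        current.append((ts, data))
    if current:
        events.append(current)
    return events

# usage: two hits close in time form one event, a late hit opens another
grouped = build_events([(0.0, "a"), (1.0, "b"), (10.0, "c")], window=2.0)
```

Only after events are assembled this way can the selective filtering algorithms run, which is why the filtering happens later than in conventionally triggered systems.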

  11. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    Science.gov (United States)

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections between each other. In addition to the theoretical findings including rigorous system stability and the boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is further provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  12. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    Directory of Open Access Journals (Sweden)

    Ali Albattat

    2016-08-01

    Full Text Available The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections between each other. In addition to the theoretical findings including rigorous system stability and the boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is further provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  13. Event-Driven Control for Networked Control Systems With Quantization and Markov Packet Losses.

    Science.gov (United States)

    Yang, Hongjiu; Xu, Yang; Zhang, Jinhui

    2016-05-23

    In this paper, an event-driven mechanism is used in a networked control system (NCS) which is subject to the effects of quantization and packet losses. A discrete event-detector is used to monitor specific events in the NCS. Both an arbitrary-region quantizer and Markov jump packet losses are also considered for the NCS. Based on a zoom strategy and Lyapunov theory, a complete proof is given to guarantee mean-square stability of the closed-loop system. Stabilization of the NCS is ensured by designing a feedback controller. Lastly, an inverted pendulum model is given to show the advantages and effectiveness of the proposed results.
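The essence of such an event detector is that the state is released over the network only when it deviates sufficiently from the last transmitted value. A minimal sketch (threshold and data are illustrative; the paper additionally treats quantization and Markov packet losses):

```python
def event_triggered_sends(states, threshold):
    """Return the sample indices at which an event detector would transmit:
    the first sample, then any sample deviating from the last transmitted
    value by more than `threshold`."""
    sent, last = [0], states[0]
    for k, x in enumerate(states[1:], start=1):
        if abs(x - last) > threshold:
            sent.append(k)
            last = x
    return sent

# usage: only three of the five samples cross the threshold and get sent
indices = event_triggered_sends([0.0, 0.05, 0.2, 0.25, 0.5], threshold=0.1)
```

The network thus carries traffic only when the plant state actually changes enough to matter, which is the mechanism behind the reduced network utilization these papers target.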

  14. Event-driven processing for hardware-efficient neural spike sorting

    Science.gov (United States)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide here a new efficient means for hardware implementation that is completely activity dependant. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented in a low power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting accuracies can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
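A level-crossing encoder of the kind described can be sketched as follows: instead of clocked samples, it emits an event each time the signal moves a step `delta` away from the running reference level (a simplified sketch on an already-discretised signal; `delta` and the input are illustrative):

```python
def level_crossing_encode(signal, delta):
    """Emit (sample_index, direction) events whenever the signal crosses
    the next level +/- delta from the running reference value."""
    events, ref = [], signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - ref >= delta:   # upward crossings
            ref += delta
            events.append((i, +1))
        while ref - x >= delta:   # downward crossings
            ref -= delta
            events.append((i, -1))
    return events

# usage: a rise through two levels then a fall back produces four events
spikes = level_crossing_encode([0.0, 1.0, 2.1, 0.0], delta=1.0)
```

Because events are produced only when the signal changes, quiet channels generate no data at all, which is the activity-dependent property the abstract exploits for hardware efficiency.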

  15. Global optimization driven by genetic algorithms for disruption predictors based on APODIS architecture

    Energy Technology Data Exchange (ETDEWEB)

    Rattá, G.A., E-mail: giuseppe.ratta@ciemat.es [Laboratorio Nacional de Fusión, CIEMAT, Madrid (Spain); Vega, J. [Laboratorio Nacional de Fusión, CIEMAT, Madrid (Spain); Murari, A. [Consorzio RFX, Associazione EURATOM/ENEA per la Fusione, Padua (Italy); Dormido-Canto, S. [Dpto. de Informática y Automática, Universidad Nacional de Educación a Distancia, Madrid (Spain); Moreno, R. [Laboratorio Nacional de Fusión, CIEMAT, Madrid (Spain)

    2016-11-15

    Highlights: • A global optimization method based on genetic algorithms was developed. • It allowed improving the prediction of disruptions using APODIS architecture. • It also provides the potential opportunity to develop a spectrum of future predictors using different training datasets. • The future analysis of how their structures reassemble and evolve in each test may help to improve the development of disruption predictors for ITER. - Abstract: Since 2010, the APODIS architecture has proven its accuracy in predicting disruptions in the JET tokamak. Nevertheless, it has shown margin for improvement, a fact made indisputable by the enhanced performance achieved in subsequent upgrades. In this article, a complete optimization driven by Genetic Algorithms (GA) is applied to it, aiming to consider all possible combinations of signals, signal features, number of models, their characteristics and internal parameters. This global optimization targets the creation of the best possible system with a reduced amount of required training data. The results leave no doubt about the reliability of the global optimization method, which outperforms previous versions: 91.77% successful predictions (89.24% with an anticipation higher than 10 ms) and 3.55% false alarms. Beyond its effectiveness, it also provides the potential opportunity to develop a spectrum of future predictors using different training datasets.
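A genetic-algorithm search of this kind can be sketched generically; the bit-mask chromosome below only selects a combination of candidate signals/features, whereas the actual APODIS optimization also encodes the number of models and their internal parameters:

```python
import random

def genetic_search(fitness, n_bits, pop_size=20, generations=40, p_mut=0.05, seed=1):
    """Minimal GA sketch: rank selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # keep the fitter half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = a[:cut] + b[cut:]
            # flip each bit with probability p_mut
            children.append([bit ^ (rng.random() < p_mut) for bit in child])
        pop = parents + children
    return max(pop, key=fitness)

# usage: reward masks close to a target combination of 8 candidate features
# (the fitness in the paper would instead be predictor performance on training data)
target = [1, 1, 1, 0, 0, 0, 0, 0]
best = genetic_search(lambda ind: -sum(i != t for i, t in zip(ind, target)), 8)
```

In the real setting the fitness evaluation is the expensive part, since each candidate configuration requires training and testing a full predictor.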

  16. Global optimization driven by genetic algorithms for disruption predictors based on APODIS architecture

    International Nuclear Information System (INIS)

    Rattá, G.A.; Vega, J.; Murari, A.; Dormido-Canto, S.; Moreno, R.

    2016-01-01

    Highlights: • A global optimization method based on genetic algorithms was developed. • It allowed improving the prediction of disruptions using APODIS architecture. • It also provides the potential opportunity to develop a spectrum of future predictors using different training datasets. • The future analysis of how their structures reassemble and evolve in each test may help to improve the development of disruption predictors for ITER. - Abstract: Since 2010, the APODIS architecture has proven its accuracy in predicting disruptions in the JET tokamak. Nevertheless, it has shown margin for improvement, a fact made indisputable by the enhanced performance achieved in subsequent upgrades. In this article, a complete optimization driven by Genetic Algorithms (GA) is applied to it, aiming to consider all possible combinations of signals, signal features, number of models, their characteristics and internal parameters. This global optimization targets the creation of the best possible system with a reduced amount of required training data. The results leave no doubt about the reliability of the global optimization method, which outperforms previous versions: 91.77% successful predictions (89.24% with an anticipation higher than 10 ms) and 3.55% false alarms. Beyond its effectiveness, it also provides the potential opportunity to develop a spectrum of future predictors using different training datasets.

  17. Comparing Transformation Possibilities of Topological Functioning Model and BPMN in the Context of Model Driven Architecture

    Directory of Open Access Journals (Sweden)

    Solomencevs Artūrs

    2016-05-01

    Full Text Available The approach called “Topological Functioning Model for Software Engineering” (TFM4SE) applies the Topological Functioning Model (TFM) for modelling the business system in the context of Model Driven Architecture. The TFM is a mathematically formal computation independent model (CIM). TFM4SE is compared to an approach that uses BPMN as a CIM. The comparison focuses on CIM modelling and on the transformation to a UML Sequence diagram on the platform independent (PIM) level. The results show the advantages and drawbacks that the formalism of TFM brings into the development.

  18. Science to Support Management of Receiving Waters in an Event-Driven Ecosystem: From Land to River to Sea

    Directory of Open Access Journals (Sweden)

    Stuart E. Bunn

    2013-06-01

    Full Text Available Managing receiving-water quality, ecosystem health and ecosystem service delivery is challenging in regions where extreme rainfall and runoff events occur episodically, confounding and often intensifying land-degradation impacts. We synthesize the approaches used in river, reservoir and coastal water management in the event-driven subtropics of Australia, and the scientific research underpinning them. Land-use change has placed the receiving waters of Moreton Bay, an internationally-significant coastal wetland, at risk of ecological degradation through increased nutrient and sediment loads. The event-driven climate exacerbates this issue, as the waterways and ultimately Moreton Bay receive large inputs of nutrients and sediment during events, well above those received throughout stable climatic periods. Research on the water quality and ecology of the region’s rivers and coastal waters has underpinned the development of a world-renowned monitoring program and, in combination with catchment-source tracing methods and modeling, has revealed the key mechanisms and management strategies by which receiving-water quality, ecosystem health and ecosystem services can be maintained and improved. These approaches provide a useful framework for management of water bodies in other regions driven by episodic events, or where novel stressors are involved (e.g., climate change, urbanization), to support sustained ecosystem service delivery and restoration of aquatic ecosystems.

  19. The GOES-R Product Generation Architecture

    Science.gov (United States)

    Dittberner, G. J.; Kalluri, S.; Hansen, D.; Weiner, A.; Tarpley, A.; Marley, S.

    2011-12-01

    The GOES-R system will substantially improve users' ability to succeed in their work by providing data with significantly enhanced instruments, higher resolution, much shorter relook times, and an increased number and diversity of products. The Product Generation architecture is designed to provide the computer and memory resources necessary to achieve the necessary latency and availability for these products. Over time, new and updated algorithms are expected to be added and old ones removed as science advances and new products are developed. The GOES-R GS architecture is being planned to maintain functionality so that when such changes are implemented, operational product generation will continue without interruption. The primary parts of the PG infrastructure are the Service Based Architecture (SBA) and the Data Fabric (DF). SBA is the middleware that encapsulates and manages science algorithms that generate products. It is divided into three parts, the Executive, which manages and configures the algorithm as a service, the Dispatcher, which provides data to the algorithm, and the Strategy, which determines when the algorithm can execute with the available data. SBA is a distributed architecture, with services connected to each other over a compute grid, and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so scalable and reliable messaging is necessary. The SBA uses the DF to provide this data communication layer between algorithms. The DF provides an abstract interface over a distributed and persistent multi-layered storage system (e.g., memory based caching above disk-based storage) and an event management system that allows event-driven algorithm services to know when instrument data are available and where they reside. Together, the SBA and the DF provide a

  20. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    International Nuclear Information System (INIS)

    Hadley, S; Kessler, M; Litzenberg, D; Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M; Keranen, W; Covington, E; Moran, J

    2015-01-01

    Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual steps. We investigate the use of an event-driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house developed subscription-publication service, EventNet, was added to the Aria OIS to act as a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented, and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, and Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the Plan Revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results, and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of the radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event-driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from
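
The publish-subscribe pattern at the heart of this design can be sketched in a few lines. The code below is a hypothetical minimal broker with one QA agent, not the Varian/Aria or EventNet API; all names (EventBroker, plan_approved, qa_result) are invented for illustration.

```python
# Minimal publish-subscribe broker with a software agent, loosely inspired
# by the EventNet description above. Names and event schemas are invented.
from collections import defaultdict

class EventBroker:
    def __init__(self):
        self._subscribers = defaultdict(list)  # event type -> [(filter, callback)]

    def subscribe(self, event_type, callback, event_filter=lambda e: True):
        self._subscribers[event_type].append((event_filter, callback))

    def publish(self, event_type, payload):
        for event_filter, callback in self._subscribers[event_type]:
            if event_filter(payload):
                callback(payload)

broker = EventBroker()
results = []

def plan_second_check_agent(event):
    # The agent runs its QA step immediately and publishes the outcome.
    outcome = {"plan": event["plan"], "status": "pass"}
    results.append(outcome)
    broker.publish("qa_result", outcome)

broker.subscribe("plan_approved", plan_second_check_agent)
# A physicist subscribes only to failing results via a custom filter.
failures = []
broker.subscribe("qa_result", failures.append, lambda e: e["status"] == "fail")

broker.publish("plan_approved", {"plan": "P123"})
```

Because the agent reacts to the sentinel event rather than being polled, the QA result exists as soon as the triggering event is published.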

  1. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, S; Kessler, M [The University of Michigan, Ann Arbor, MI (United States); Litzenberg, D [Univ Michigan, Ann Arbor, MI (United States); Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M [University of Michigan, Ann Arbor, MI (United States); Keranen, W [Varian Medical Systems, Palo Alto, CA (United States); Covington, E [University of Michigan Hospital and Health System, Ann Arbor, MI (United States); Moran, J [Univ Michigan Medical Center, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual steps. We investigate the use of an event-driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house developed subscription-publication service, EventNet, was added to the Aria OIS to act as a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented, and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, and Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the Plan Revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results, and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of the radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event-driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from

  2. An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Donev, A; Garcia, A L; Alder, B J

    2007-07-30

    A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells and interact with the solvent particles with hard-core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms; rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level; however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. Results do not confirm the existence of periodic (cycling) motion of the polymer chain.
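
The core primitive of event-driven hard-sphere MD is predicting the exact time at which two spheres next touch, by solving the quadratic |r + vt| = d for the smallest positive root. A minimal sketch of that calculation (illustrative, not taken from the paper's code):

```python
import math

def collision_time(r, v, diameter):
    """Time until two hard spheres touch, or None if they never collide.

    r: relative position (x, y, z); v: relative velocity. The spheres
    collide when |r + v*t| equals the sum of radii `diameter`.
    """
    b = sum(ri * vi for ri, vi in zip(r, v))   # r . v
    if b >= 0:            # not approaching each other
        return None
    v2 = sum(vi * vi for vi in v)
    r2 = sum(ri * ri for ri in r)
    disc = b * b - v2 * (r2 - diameter * diameter)
    if disc < 0:          # glancing miss
        return None
    return (-b - math.sqrt(disc)) / v2         # earliest root of the quadratic

# Head-on approach: spheres 3 units apart, closing at speed 1, touching at |r| = 1.
t = collision_time((3.0, 0.0, 0.0), (-1.0, 0.0, 0.0), 1.0)
```

An event-driven simulator keeps a priority queue of such predicted times and jumps directly from one collision to the next, which is where the efficiency over time-stepped MD comes from.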

  3. Temporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis.

    Science.gov (United States)

    Sussman, Elyse; Winkler, István; Kreuzer, Judith; Saher, Marieke; Näätänen, Risto; Ritter, Walter

    2002-12-01

    Our previous study showed that the auditory context could influence whether two successive acoustic changes occurring within the temporal integration window (approximately 200 ms) were pre-attentively encoded as a single auditory event or as two discrete events (Cogn Brain Res 12 (2001) 431). The aim of the current study was to assess whether top-down processes could influence the stimulus-driven processes in determining what constitutes an auditory event. The electroencephalogram (EEG) was recorded from 11 scalp electrodes to frequently occurring standard and infrequently occurring deviant sounds. Within the stimulus blocks, deviants either occurred only in pairs (successive feature changes) or both singly and in pairs. Event-related potential indices of change and target detection, the mismatch negativity (MMN) and the N2b component, respectively, were compared with the simultaneously measured performance in discriminating the deviants. Even though subjects could voluntarily distinguish the two successive auditory feature changes from each other, which was also indicated by the elicitation of the N2b target-detection response, top-down processes did not modify the event organization reflected by the MMN response. Top-down processes can extract elemental auditory information from a single integrated acoustic event, but the extraction occurs at a later processing stage than the one whose outcome is indexed by MMN. Initial processes of auditory event-formation are thus fully governed by the context within which the sounds occur: perception of the deviants as two separate sound events (the top-down effect) occurred without a corresponding change in the initial, stimulus-driven neural representation of the same deviants as one event (indexed by the MMN).

  4. Exporting Humanist Architecture

    DEFF Research Database (Denmark)

    Nielsen, Tom

    2016-01-01

    The article is a chapter in the catalogue for the Danish exhibition at the 2016 Architecture Biennale in Venice. The catalogue is conceived as an independent book exploring the theme Art of Many - The Right to Space. The chapter is an essay in this anthology tracing and discussing the different...... values and ethical stands involved in the export of Danish Architecture. Abstract: Danish architecture has, in a sense, been driven by an unwritten contract between the architects and the democratic state and its institutions. This contract may be viewed as an ethos – an architectural tradition...... with inherent aesthetic and moral values. Today, however, Danish architecture is also an export commodity. That raises questions, which should be debated as openly as possible. What does it mean for architecture and architects to practice in cultures and under political systems that do not use architecture...

  5. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  6. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  7. LCP method for a planar passive dynamic walker based on an event-driven scheme

    Science.gov (United States)

    Zheng, Xu-Dong; Wang, Qi

    2018-06-01

    The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
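
The event-driven scheme described above advances the dynamics until an event function (here, a contact condition) changes sign, locates the transition time, and then hands control to a state-transition rule. A generic sketch of such a loop, using a simple falling mass as a stand-in for the walker's contact events (the dynamics, step size, and tolerances are illustrative, not from the paper):

```python
# Generic event-driven stepping: integrate until the event function changes
# sign across a step, then bisect within that step to locate the event time.
def step(state, dt):
    x, v = state
    return (x + v * dt, v - 9.81 * dt)    # explicit Euler under gravity

def event(state):
    return state[0]                        # zero when the mass reaches the floor

def locate_event(state, dt, tol=1e-10):
    # Bisection on the sub-step time; event(state) > 0 and the event is
    # known to occur before `dt`, so the bracket [lo, hi] stays valid.
    lo, hi = 0.0, dt
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if event(step(state, mid)) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

state, t, dt = (1.0, 0.0), 0.0, 0.01       # drop from height 1 at rest
while event(step(state, dt)) > 0:          # look ahead one step
    state, t = step(state, dt), t + dt
t_event = t + locate_event(state, dt)      # contact time; switch modes here
```

At `t_event` an LCP solve would determine the post-contact state (stick, slip, or bounce); the loop then resumes with the new dynamics.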

  8. Events and mega events: leisure and business in tourism

    Directory of Open Access Journals (Sweden)

    Ricardo Alexandre Paiva

    2015-12-01

    Full Text Available The promotion of events and mega events mobilizes, whether in a coordinated way or not, both leisure and business practices, which are captured by the tourism industry as a stimulus for the reproduction of capitalism and for the range of other activities they give rise to (primary, secondary and tertiary), placing architecture and the city as protagonists in contemporary urban development. In this sense, the article analyzes how events and mega events are articulated with the provision of architecture and urban infrastructure, as well as with the construction of the tourist image of places, motivated by leisure and business activities. The methodological procedures are theoretical and exploratory in character, with multidisciplinary intentions. The article discusses, in a historical perspective, the concepts of leisure and business as activities that motivate movement or travel; it then delimits similarities and differences between event tourism and business tourism, before analyzing the distinctions between events and mega events, highlighting the complexity and the role of mega events as a major symptom of globalization; finally, it presents the spatial repercussions of (mega) events on architecture and the city, as well as their impact on the city's image. As a synthesis, it is important to note that the spatial repercussions of business tourism, events and mega events are manifested at various scales and with different levels of complexity, revealing the strengths and/or weaknesses of places. Urban planning, architecture and urbanism are important fields of knowledge and spatial intervention for ensuring infrastructure and urban and architectural structures appropriate for events, and they should be sensitive to the demands of tourists and host communities.

  9. Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data

    Science.gov (United States)

    Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.

    2017-12-01

    Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework provides an API for adding loosely coupled, distributed event handlers and data processes. This approach makes it easy to add new events and data processes, so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically trigger subscriptions that match new events, providing users with a living "album" of results that can continue to be curated as more information for an event becomes available. This capability allows users to improve their capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc.) in response to disaster (and other) events by preparing in advance for the data and information needs of future events. This presentation will provide an update on ED3 developments and deployments, and further explain its applicability for utilizing near-realtime data in hazards research, response and situational awareness.
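
The pre-planned subscription-matching behaviour described above can be sketched as follows. All names, the event schema, and the data process are hypothetical, not the actual ED3 API:

```python
# Sketch of event-driven data delivery: subscriptions are registered in
# advance; when an alert arrives, every matching subscription fires and its
# results accumulate in a living "album" for that subscription.
albums = {}          # subscription id -> list of collected products
subscriptions = []   # (sub_id, predicate, data_process)

def subscribe(sub_id, predicate, data_process):
    subscriptions.append((sub_id, predicate, data_process))
    albums[sub_id] = []

def on_event(event):
    for sub_id, predicate, data_process in subscriptions:
        if predicate(event):
            albums[sub_id].append(data_process(event))

# A user pre-plans: "for any Alabama tornado, collect a satellite subset."
subscribe(
    "al-tornado-imagery",
    lambda e: e["type"] == "tornado" and e["state"] == "AL",
    lambda e: f"satellite subset for {e['place']}",
)
on_event({"type": "tornado", "state": "AL", "place": "Huntsville"})
on_event({"type": "flood", "state": "AL", "place": "Mobile"})   # no match
```

Each later update for the same event would call `on_event` again, growing the album as more information becomes available.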

  10. A Hybrid Adaptive Routing Algorithm for Event-Driven Wireless Sensor Networks

    Science.gov (United States)

    Figueiredo, Carlos M. S.; Nakamura, Eduardo F.; Loureiro, Antonio A. F.

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary widely, as in an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to switch between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements in energy consumption. PMID:22423207
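
The idea of switching between reactive and proactive strategies based on an estimate of event activity can be sketched as below. The estimator, threshold, and smoothing factor are illustrative stand-ins, not the Multi-MAF algorithm itself:

```python
# Hybrid routing mode selection: estimate the event rate with an exponential
# moving average and choose a strategy accordingly. Frequent events justify
# maintaining routes proactively; otherwise routes are built on demand.
class HybridRouter:
    def __init__(self, threshold=0.5, alpha=0.2):
        self.rate = 0.0          # smoothed events-per-period estimate
        self.threshold = threshold
        self.alpha = alpha

    def observe(self, events_this_period):
        self.rate = (1 - self.alpha) * self.rate + self.alpha * events_this_period
        return self.mode()

    def mode(self):
        return "proactive" if self.rate > self.threshold else "reactive"

router = HybridRouter()
# A quiet network, a burst of detections, then quiet again.
modes = [router.observe(n) for n in [0, 0, 3, 3, 3, 0, 0, 0, 0, 0]]
```

The smoothing keeps the router from flapping between modes on single events, which is what saves energy relative to running either strategy alone.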

  11. Modeling event building architecture for the triggerless data acquisition system for PANDA experiment at the HESR facility at FAIR/GSI

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A novel architecture is being proposed for the data acquisition and trigger system for the PANDA experiment at the HESR facility at FAIR/GSI. The experiment will run without a hardware trigger signal and will use timestamps to correlate detector data from a given time window. The broad physics program, in combination with a high rate of 2 × 10^7 interactions, requires very selective filtering algorithms which access information from almost all detectors. Therefore the effective filtering will happen later than in today's systems, i.e., after the event building. To address that, the complete architecture will be built of two stages: the data concentrator stage providing event building and the rate reduction stage. For the former stage, which switches 100 GB/s of event fragments to perform event building, we propose two layers of ATCA crates filled with compute nodes, modules designed at the University of Giessen for trigger and data acquisition systems. Each board is equipped with 5 Virtex4 FX60 FPG...

  12. An Ontology Driven Information Architecture for Big Data and Diverse Domains

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Hardman, Sean; Joyner, Ron; Ramirez, Paul

    2013-04-01

    The Planetary Data System has just released the PDS4 system for first use. Its architecture comprises three principal parts: an ontology that captures knowledge from the planetary science domain; a federated registry/repository system for product identification, versioning, tracking, and storage; and a REST-based service layer for search, retrieval, and distribution. An ontology modeling tool is used to prescriptively capture product definitions that adhere to object-oriented principles and that are compliant with specific registry, archive, and data dictionary reference models. The resulting information model is product centric, allowing all information to be packaged into products and tracked in the registry. The flexibility required in a diverse domain is provided through the use of object-oriented extensions and a hierarchical governance scheme with common, discipline, and mission levels. All PDS4 data standards are generated or derived from the information model. The federated registry provides identification, versioning, and tracking functionality across federated repositories and is configured for deployment using configuration files generated from the ontology. A REST-based service layer provides metadata harvest, product transformation, packaging, search, and portal hosting. A model-driven architecture allows the data and software engineering teams to develop in parallel with minimal team interaction, and the resulting software remains relatively stable as the domain evolves. Finally, the development of a single shared ontology promotes interoperability and data correlation and helps meet the expectations of modern scientists for science data discovery, access and use. This presentation will provide an overview of PDS4, focusing on the data standards, how they were developed and how they are now being used, and will present some of the lessons learned while developing in a diverse scientific community. Copyright 2013 California

  13. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...
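
The "KPI versus baseline" style of continuous analytics described above can be sketched with a rolling window. This is an illustrative toy, not the FeedZai Pulse API; the window size and alert threshold are invented:

```python
# Streaming KPI monitor: compare each incoming value against a rolling
# baseline of recent history and flag values that deviate strongly.
from collections import deque

class KpiMonitor:
    def __init__(self, window=5, tolerance=2.0):
        self.history = deque(maxlen=window)   # bounded recent history
        self.tolerance = tolerance

    def update(self, value):
        # Baseline is the mean of recent history; the first value is its
        # own baseline, so it can never alert.
        baseline = sum(self.history) / len(self.history) if self.history else value
        alert = value > self.tolerance * baseline
        self.history.append(value)
        return baseline, alert

monitor = KpiMonitor()
# Four typical values, then a spike well above the rolling baseline.
alerts = [monitor.update(v)[1] for v in [100, 110, 90, 105, 400]]
```

A production system would update such monitors on every event and push alerts to a dashboard, but the per-event logic is this simple comparison.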

  14. Business model driven service architecture design for enterprise application integration

    OpenAIRE

    Gacitua-Decar, Veronica; Pahl, Claus

    2008-01-01

    Increasingly, organisations are using a Service-Oriented Architecture (SOA) as an approach to Enterprise Application Integration (EAI), which is required for the automation of business processes. This paper presents an architecture development process which guides the transition from business models to a service-based software architecture. The process is supported by business reference models and patterns. Firstly, the business process models are enhanced with domain model elements, applicat...

  15. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices

    Science.gov (United States)

    2018-01-01

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyzing the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take emotional behaviour into account. In addition, integrating new components into smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario. PMID:29748468

  16. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices.

    Science.gov (United States)

    Muñoz, Sergio; Araque, Oscar; Sánchez-Rada, J Fernando; Iglesias, Carlos A

    2018-05-10

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyzing the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take emotional behaviour into account. In addition, integrating new components into smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.

  17. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices

    Directory of Open Access Journals (Sweden)

    Sergio Muñoz

    2018-05-01

    Full Text Available The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyzing the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take emotional behaviour into account. In addition, integrating new components into smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.
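
The semantic event-driven rules described in these records boil down to event-condition-action triples. A deliberately simplified sketch follows; the article models rules with semantic technologies (e.g., RDF and ontologies), and the rule contents here (stress dims the lights) are invented for illustration:

```python
# Event-condition-action rules for an emotion-aware workplace: when an
# emotion event arrives, every rule whose condition matches contributes
# its adaptation actions. Rule contents are invented examples.
rules = [
    {"when": lambda e: e["emotion"] == "stressed",
     "then": ["dim_lights", "play_calm_music"]},
    {"when": lambda e: e["emotion"] == "tired",
     "then": ["increase_light_temperature"]},
]

def handle_emotion_event(event):
    actions = []
    for rule in rules:
        if rule["when"](event):
            actions.extend(rule["then"])
    return actions

actions = handle_emotion_event({"employee": "u42", "emotion": "stressed"})
```

Expressing the conditions semantically (rather than as Python lambdas) is what lets new sensors and actuators be integrated without rewriting the rule engine.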

  18. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making benefits from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.

  19. Analytical method of CIM to PIM transformation in Model Driven Architecture (MDA

    Directory of Open Access Journals (Sweden)

    Martin Kardos

    2010-06-01

    Full Text Available Information system models at higher levels of abstraction have become a daily routine in many software companies. The concept of Model Driven Architecture (MDA), published by the standardization body OMG in 2001, has become a framework for the creation of software applications and information systems. MDA specifies four levels of abstraction: the top three levels are created as graphical models and the last one as an implementation code model. Much MDA research focuses on the lower levels and the transformations between them. The top level of abstraction, called the Computation Independent Model (CIM), and its transformation to the lower level, called the Platform Independent Model (PIM), is not so extensively researched. Considering the great importance and usability of this level in the practice of IS development, our research activity is now focused on this highest level of abstraction, the CIM, and its possible transformation to the lower PIM level. In this article we present a possible approach to CIM modeling and an analytical method for its transformation to PIM. Keywords: transformation, MDA, CIM, PIM, UML, DFD.
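
A CIM-to-PIM transformation is, at its core, a rule-based mapping from business-level model elements to platform-independent design elements. The toy transformation below illustrates the idea; the element kinds and mapping rules are invented, not the method proposed in the article:

```python
# Toy model-to-model transformation in the MDA spirit: CIM-level business
# concepts are mapped by rule to PIM-level, UML-like design elements.
CIM = [
    {"kind": "business_entity", "name": "Customer", "attrs": ["name", "email"]},
    {"kind": "business_process", "name": "PlaceOrder"},
]

def cim_to_pim(model):
    pim = []
    for element in model:
        if element["kind"] == "business_entity":
            # An entity becomes a class with the same attributes.
            pim.append({"stereotype": "class", "name": element["name"],
                        "attributes": element["attrs"]})
        elif element["kind"] == "business_process":
            # A process becomes a service interface with one operation.
            pim.append({"stereotype": "interface",
                        "name": element["name"] + "Service",
                        "operations": [element["name"].lower()]})
    return pim

pim_model = cim_to_pim(CIM)
```

Real MDA tooling expresses such rules declaratively (e.g., in a transformation language) so the mapping itself is a reviewable model, but the structure is the same: traverse the source model, emit target elements by rule.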

  20. NOvA Event Building, Buffering and Data-Driven Triggering From Within the DAQ System

    International Nuclear Information System (INIS)

    Fischler, M; Rechenmacher, R; Green, C; Kowalkowski, J; Norman, A; Paterno, M

    2012-01-01

    The NOvA experiment is a long-baseline neutrino experiment designed to make precision probes of the structure of neutrino mixing. The experiment features a unique deadtimeless data acquisition system that is capable of acquiring and building an event data stream from the continuous readout of the more than 360,000 far detector channels. In order to achieve its physics goals the experiment must be able to buffer, correlate and extract the data in this stream with the beam spills that occur at Fermilab. In addition, the NOvA experiment seeks to enhance its data collection efficiency for rare classes of event topologies that are valuable for calibration through the use of data-driven triggering. The NOvA-DDT is a prototype data-driven triggering system. NOvA-DDT has been developed using the Fermilab artdaq generic DAQ/event-building toolkit. This toolkit provides the advantages of sharing online software infrastructure with other Intensity Frontier experiments, and of being able to use any offline analysis module, unchanged, as a component of the online triggering decisions. We have measured the performance and overhead of the NOvA-DDT framework using a Hough-transform-based trigger decision module developed for the NOvA detector to identify cosmic rays. The results of these tests, which were run on the NOvA prototype near detector, yielded a mean processing time of 98 ms per event, while consuming only 1/16th of the available processing capacity. These results provide a proof of concept that a NOvA-DDT-based processing system is a viable strategy for data acquisition and triggering for the NOvA far detector.

  1. A High-Speed, Event-Driven, Active Pixel Sensor Readout for Photon-Counting Microchannel Plate Detectors

    Science.gov (United States)

    Kimble, Randy A.; Pain, Bedabrata; Norton, Timothy J.; Haas, J. Patrick; Oegerle, William R. (Technical Monitor)

    2002-01-01

    Silicon array readouts for microchannel plate intensifiers offer several attractive features. In this class of detector, the electron cloud output of the MCP intensifier is converted to visible light by a phosphor; that light is then fiber-optically coupled to the silicon array. In photon-counting mode, the resulting light splashes on the silicon array are recognized and centroided to fractional pixel accuracy by off-chip electronics. This process can result in very high (MCP-limited) spatial resolution while operating at a modest MCP gain (desirable for dynamic range and long term stability). The principal limitation of intensified CCD systems of this type is their severely limited local dynamic range, as accurate photon counting is achieved only if there are no overlapping event splashes within the frame time of the device. This problem can be ameliorated somewhat by processing events only in pre-selected windows of interest or by using an addressable charge injection device (CID) for the readout array. We are currently pursuing the development of an intriguing alternative readout concept based on using an event-driven CMOS Active Pixel Sensor. APS technology permits the incorporation of discriminator circuitry within each pixel. When coupled with suitable CMOS logic outside the array area, the discriminator circuitry can be used to trigger the readout of small sub-array windows only when and where an event splash has been detected, completely eliminating the local dynamic range problem, while achieving a high global count rate capability and maintaining high spatial resolution. We elaborate on this concept and present our progress toward implementing an event-driven APS readout.

  2. Research on a Hierarchical Dynamic Automatic Voltage Control System Based on the Discrete Event-Driven Method

    Directory of Open Access Journals (Sweden)

    Yong Min

    2013-06-01

    Full Text Available In this paper, concepts and methods of hybrid control systems are adopted to establish a hierarchical dynamic automatic voltage control (HD-AVC) system, realizing the dynamic voltage stability of power grids. An HD-AVC system model consisting of three layers is built based on the hybrid control method and a discrete event-driven mechanism. In the Top Layer, discrete events are designed to drive the corresponding control block so as to avoid solving complex multiple objective functions; the power system’s characteristic matrix is formed and the minimum amplitude eigenvalue (MAE) is calculated through linearized differential-algebraic equations. The MAE is applied to judge the system’s voltage stability and security and to construct discrete events. The Middle Layer is responsible for management and operation, and is also driven by discrete events. Control values of the control buses are calculated based on the characteristics of power systems and the sensitivity method. These control values then generate control strategies through the interface block. In the Bottom Layer, various control devices receive and implement the control commands from the Middle Layer. In this way, a closed-loop power system voltage control is achieved. Computer simulations verify the validity and accuracy of the HD-AVC system, and show that it is more effective than conventional voltage control methods.
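    A minimal sketch of the MAE-based event construction, assuming a hypothetical linearized state matrix and an arbitrary threshold (the paper's actual characteristic matrix comes from the power system's differential-algebraic equations):

    ```python
    import numpy as np

    def minimum_amplitude_eigenvalue(A):
        """Return the eigenvalue of the linearized state matrix A with the
        smallest magnitude, used here as a crude stability indicator."""
        eigs = np.linalg.eigvals(A)
        return eigs[np.argmin(np.abs(eigs))]

    def voltage_event(A, threshold=0.1):
        """Construct a discrete event when the MAE magnitude drops below
        `threshold`, signalling the upper control layer to act."""
        mae = minimum_amplitude_eigenvalue(A)
        return "VOLTAGE_ALERT" if abs(mae) < threshold else "NORMAL"

    # Hypothetical linearized systems: one well damped, one near-critical.
    stable = np.diag([-5.0, -2.0, -1.0])
    marginal = np.diag([-5.0, -2.0, -0.01])
    ```

    The event name and threshold are placeholders; the point is that the continuous quantity (MAE) is turned into a discrete event that drives the control block.
    
    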

  3. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a ''standard model''. The ''standard model'' consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the ''standard model'' to determine if the requirements of ''non-standard'' architectures can be met. Several possible extensions to the ''standard model'' are suggested including software as well as hardware architectural features

  4. Pattern-Driven Architectural Partitioning. Balancing Functional and Non-functional Requirements

    NARCIS (Netherlands)

    Harrison, Neil; Avgeriou, Paris

    2007-01-01

    One of the vexing challenges of software architecture is the problem of satisfying the functional specifications of the system to be created while at the same time meeting its non-functional needs. In this work we focus on the early stages of the software architecture process, when initial

  5. Network-driven design principles for neuromorphic systems

    OpenAIRE

    Partzsch, Johannes; Schüffny, René

    2015-01-01

    Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse performance of existing neuromorphic architectures i...

  6. Convolutional neural networks for event-related potential detection: impact of the architecture.

    Science.gov (United States)

    Cecotti, H

    2017-07-01

    The detection of brain responses at the single-trial level in the electroencephalogram (EEG), such as event-related potentials (ERPs), is a difficult problem that requires different processing steps to extract relevant discriminant features. While most signal and classification techniques for the detection of brain responses are based on linear algebra, pattern recognition techniques such as the convolutional neural network (CNN), a type of deep learning technique, have attracted interest as they are able to process the signal after limited pre-processing. In this study, we propose to investigate the performance of CNNs in relation to their architecture and to how they are evaluated: a single system for each subject, or one system for all subjects. In particular, we want to address the change in performance between specializing a neural network to a single subject and training a neural network on a group of subjects, taking advantage of the larger number of trials from different subjects. The results support the conclusion that a convolutional neural network trained on different subjects can reach an AUC above 0.9 by using an appropriate architecture with spatial filtering and shift-invariant layers.
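    A toy forward pass, with invented layer sizes and random weights, shows the architectural ingredients named above: a spatial filter mixing EEG channels, a shift-invariant temporal convolution, and pooling before the decision unit (this is an illustration, not the paper's network):

    ```python
    import numpy as np

    def erp_cnn_forward(eeg, w_spatial, w_temporal, w_out):
        """Forward pass of a toy ERP detector: spatial filtering mixes the
        channels, a shift-invariant temporal convolution scans for the
        response at any latency, max-pooling collapses time, and a logistic
        unit outputs a target/non-target score in (0, 1)."""
        virtual = w_spatial @ eeg                        # (filters, time)
        feats = [np.convolve(row, w_temporal, mode="valid").max()
                 for row in virtual]                     # pool over shifts
        z = float(np.dot(w_out, feats))
        return 1.0 / (1.0 + np.exp(-z))                  # sigmoid score

    rng = np.random.default_rng(42)
    n_channels, n_samples = 8, 128                       # invented dimensions
    eeg = rng.standard_normal((n_channels, n_samples))
    w_spatial = 0.1 * rng.standard_normal((4, n_channels))
    w_temporal = 0.1 * rng.standard_normal(16)
    w_out = 0.1 * rng.standard_normal(4)
    score = erp_cnn_forward(eeg, w_spatial, w_temporal, w_out)
    ```
    
    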

  7. Tree-based server-middleman-client architecture: improving scalability and reliability for voting-based network games in ad hoc wireless networks

    Science.gov (United States)

    Guo, Y.; Fujinoki, H.

    2006-10-01

    The concept of a new tree-based architecture for networked multi-player games was proposed by Matuszek to improve scalability in network traffic while at the same time improving reliability. The architecture (we refer to it as the "Tree-Based Server-Middlemen-Client", or TB-SMC, architecture) will solve the two major problems in ad-hoc wireless networks, frequent link failures and significant battery power consumption at wireless transceivers, by using two new techniques: recursive aggregation of client messages and subscription-based propagation of game state. However, the performance of the TB-SMC architecture has never been quantitatively studied. In this paper, the TB-SMC architecture is compared with the client-server architecture using simulation experiments. We developed an event-driven simulator to evaluate the performance of the TB-SMC architecture. In the network traffic scalability experiments, the TB-SMC architecture resulted in less than 1/14 of the network traffic load for 200 end users. In the reliability experiments, the TB-SMC architecture improved the number of successfully delivered players' votes by 31.6, 19.0, and 12.4% over the client-server architecture at high (failure probability of 90%), moderate (50%) and low (10%) failure probabilities.
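    The recursive aggregation of client messages can be sketched as follows; the tree shape and vote format are hypothetical, but the traffic saving comes from each middleman forwarding one merged tally instead of relaying every child message upstream:

    ```python
    class Node:
        """Tree-Based Server-Middlemen-Client sketch: clients are leaves,
        middlemen merge their children's votes into one upstream message,
        and the root (server) receives a single aggregated tally."""
        def __init__(self, vote=None, children=()):
            self.vote = vote
            self.children = list(children)

        def aggregate(self):
            if not self.children:                 # leaf client: one vote
                return {self.vote: 1} if self.vote is not None else {}
            tally = {}                            # middleman: merge child tallies
            for child in self.children:
                for option, count in child.aggregate().items():
                    tally[option] = tally.get(option, 0) + count
            return tally

    # Four clients under two middlemen voting on a game action.
    root = Node(children=[
        Node(children=[Node(vote="left"), Node(vote="left")]),
        Node(children=[Node(vote="right"), Node(vote="left")]),
    ])
    result = root.aggregate()
    ```
    
    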

  8. A Full Parallel Event Driven Readout Technique for Area Array SPAD FLIM Image Sensors

    Directory of Open Access Journals (Sweden)

    Kaiming Nie

    2016-01-01

    Full Text Available This paper presents a full parallel event driven readout method which is implemented in an area array single-photon avalanche diode (SPAD image sensor for high-speed fluorescence lifetime imaging microscopy (FLIM. The sensor only records and reads out effective time and position information by adopting full parallel event driven readout method, aiming at reducing the amount of data. The image sensor includes four 8 × 8 pixel arrays. In each array, four time-to-digital converters (TDCs are used to quantize the time of photons’ arrival, and two address record modules are used to record the column and row information. In this work, Monte Carlo simulations were performed in Matlab in terms of the pile-up effect induced by the readout method. The sensor’s resolution is 16 × 16. The time resolution of TDCs is 97.6 ps and the quantization range is 100 ns. The readout frame rate is 10 Mfps, and the maximum imaging frame rate is 100 fps. The chip’s output bandwidth is 720 MHz with an average power of 15 mW. The lifetime resolvability range is 5–20 ns, and the average error of estimated fluorescence lifetimes is below 1% by employing CMM to estimate lifetimes.
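    The data-reduction idea behind the full parallel event-driven readout can be sketched like this (the record format is invented; only the 16 × 16 array size and the 97.6 ps TDC resolution are taken from the abstract):

    ```python
    TDC_LSB_PS = 97.6   # TDC time resolution quoted in the abstract, in ps

    def event_driven_readout(frame_events):
        """frame_events maps (row, col) -> photon arrival time in ps.  Only
        firing pixels produce output records (row, col, tdc_code); silent
        pixels generate no data at all, which is the point of event-driven
        readout compared with scanning the whole array every frame."""
        return sorted((r, c, int(t / TDC_LSB_PS))
                      for (r, c), t in frame_events.items())

    ARRAY_PIXELS = 16 * 16
    frame = {(2, 3): 1250.0, (7, 7): 430.0, (15, 0): 990.0}
    records = event_driven_readout(frame)
    reduction = ARRAY_PIXELS / len(records)  # vs. reading every pixel
    ```
    
    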

  9. Integration of Enzymes in Polyaniline-Sensitized 3D Inverse Opal TiO2 Architectures for Light-Driven Biocatalysis and Light-to-Current Conversion.

    Science.gov (United States)

    Riedel, Marc; Lisdat, Fred

    2018-01-10

    Inspired by natural photosynthesis, coupling of artificial light-sensitive entities with biocatalysts in a biohybrid format can result in advanced photobioelectronic systems. Herein, we report on the integration of sulfonated polyanilines (PMSA1) and PQQ-dependent glucose dehydrogenase (PQQ-GDH) into inverse opal TiO2 (IO-TiO2) electrodes. While PMSA1 introduces sensitivity for visible light into the biohybrid architecture and ensures the efficient wiring between the IO-TiO2 electrode and the biocatalytic entity, PQQ-GDH provides the catalytic activity for the glucose oxidation and therefore feeds the light-driven reaction with electrons for an enhanced light-to-current conversion. Here, the IO-TiO2 electrodes with pores of around 650 nm provide a suitable interface and morphology needed for the stable and functional assembly of polymer and enzyme. The IO-TiO2 electrodes have been prepared by a template approach applying spin coating, allowing an easy scalability of the electrode height and surface area. The successful integration of the polymer and the enzyme is confirmed by the generation of an anodic photocurrent, showing an enhanced magnitude with increasing glucose concentrations. Compared to flat and nanostructured TiO2 electrodes, the three-layered IO-TiO2 electrodes give access to a 24-fold and 29-fold higher glucose-dependent photocurrent due to the higher polymer and enzyme loading in IO films. The three-dimensional IO-TiO2|PMSA1|PQQ-GDH architecture reaches maximum photocurrent densities of 44.7 ± 6.5 μA cm⁻² at low potentials in the presence of glucose (for a three TiO2 layer arrangement). The onset potential for the light-driven substrate oxidation is found to be at -0.315 V vs Ag/AgCl (1 M KCl) under illumination with 100 mW cm⁻², which is more negative than the redox potential of the enzyme. The results demonstrate the advantageous properties of IO-TiO2|PMSA1|PQQ-GDH biohybrid architectures for the light-driven glucose conversion.

  10. Service Modularity and Architecture

    DEFF Research Database (Denmark)

    Brax, Saara A.; Bask, Anu; Hsuan, Juliana

    2017-01-01

    Purpose: Services are highly important in a world economy which has increasingly become service driven. There is a growing need to better understand the possibilities for, and requirements of, designing modular service architectures. The purpose of this paper is to elaborate on the roots of the emerging research stream on service modularity, provide a concise overview of existing work on the subject, and outline an agenda for future research on service modularity and architecture, including platform-based and mass-customized service business models, comparative research designs, customer perspectives and service experience, performance in the context of modular services, empirical evidence of benefits and challenges, architectural innovation in services, and modularization in multi-provider contexts. The articles in the special issue offer four diverse sets of research on service modularity and architecture. Design...

  11. Exploring a model-driven architecture (MDA) approach to health care information systems development.

    Science.gov (United States)

    Raghupathi, Wullianallur; Umar, Amjad

    2008-05-01

    To explore the potential of the model-driven architecture (MDA) in health care information systems development. An MDA is conceptualized and developed for a health clinic system to track patient information. A prototype of the MDA is implemented using an advanced MDA tool. The UML provides the underlying modeling support in the form of the class diagram. The PIM to PSM transformation rules are applied to generate the prototype application from the model. The result of the research is a complete MDA methodology to developing health care information systems. Additional insights gained include development of transformation rules and documentation of the challenges in the application of MDA to health care. Design guidelines for future MDA applications are described. The model has the potential for generalizability. The overall approach supports limited interoperability and portability. The research demonstrates the applicability of the MDA approach to health care information systems development. When properly implemented, it has the potential to overcome the challenges of platform (vendor) dependency, lack of open standards, interoperability, portability, scalability, and the high cost of implementation.
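    As a hedged illustration of a PIM-to-PSM transformation rule of the kind the paper applies (the class model, type map, and generated code shape are all invented for the example):

    ```python
    # Toy PIM: a platform-independent class model for a clinic system.
    PIM = {"Patient": {"name": "string", "dob": "date", "mrn": "int"}}

    # One hypothetical rule set: map PIM types onto a Python-flavoured
    # platform-specific model.
    TYPE_MAP = {"string": "str", "date": "datetime.date", "int": "int"}

    def pim_to_psm(pim, type_map=TYPE_MAP):
        """Generate platform-specific class stubs from the PIM by applying
        the type-mapping rules attribute by attribute."""
        lines = []
        for cls, attrs in pim.items():
            lines.append(f"class {cls}:")
            for name, pim_type in attrs.items():
                lines.append(f"    {name}: {type_map[pim_type]}")
        return "\n".join(lines)

    code = pim_to_psm(PIM)
    ```

    Swapping in a different `TYPE_MAP` (say, SQL column types) would retarget the same PIM to another platform, which is the portability argument the paper makes.
    
    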

  12. Self-Adaptive Event-Driven Simulation of Multi-Scale Plasma Systems

    Science.gov (United States)

    Omelchenko, Yuri; Karimabadi, Homayoun

    2005-10-01

    Multi-scale plasmas pose a formidable computational challenge. The explicit time-stepping models suffer from the global CFL restriction. Efficient application of adaptive mesh refinement (AMR) to systems with irregular dynamics (e.g. turbulence, diffusion-convection-reaction, particle acceleration etc.) may be problematic. To address these issues, we developed an alternative approach to time stepping: self-adaptive discrete-event simulation (DES). DES has its origins in operations research, war games and telecommunications. We combine finite-difference and particle-in-cell techniques with this methodology by adopting two caveats: (1) a local time increment, dt for a discrete quantity f can be expressed in terms of a physically meaningful quantum value, df; (2) f is considered to be modified only when its change exceeds df. Event-driven time integration is self-adaptive as it makes use of causality rules rather than parametric time dependencies. This technique enables asynchronous flux-conservative update of the solution in accordance with local temporal scales, removes the curse of the global CFL condition, eliminates unnecessary computation in inactive spatial regions and results in robust and fast parallelizable codes. It can be naturally combined with various mesh refinement techniques. We discuss applications of this novel technology to diffusion-convection-reaction systems and hybrid simulations of magnetosonic shocks.
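    A minimal sketch of the quantum-based event scheduling described above, applied to a simple decay equation (the quantum df and the rate are arbitrary; the real DES couples this idea to finite-difference and particle-in-cell updates):

    ```python
    import heapq
    import math

    def des_decay(f0, k, df, t_end):
        """Quantized discrete-event integration of df/dt = -k*f: rather than
        stepping a global clock, schedule an update for the instant the
        solution is predicted to have changed by one quantum df."""
        t, f = 0.0, f0
        queue = [(df / (k * f0), 0)]         # (time of next quantum change, id)
        while queue:
            t_next, _ = heapq.heappop(queue)
            if t_next > t_end:
                break
            f -= df                           # apply exactly one quantum
            t = t_next
            if f > df:                        # next event follows the *local* rate
                heapq.heappush(queue, (t + df / (k * f), 1))
        return t, f

    t, f = des_decay(f0=1.0, k=1.0, df=0.05, t_end=2.0)
    ```

    Updates become sparser as the local rate k*f shrinks, which is the self-adaptive behaviour: inactive regions simply schedule no events.
    
    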

  13. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a ''standard model''. The ''standard model'' consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the ''standard model'' to determine if the requirements of ''non-standard'' architectures can be met. Several possible extensions to the ''standard model'' are suggested including software as well as hardware architectural features

  14. Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Bawej, Tomasz; et al.

    2014-01-01

    TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running in a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability without the software being adequately aware of the IRQ (Interrupt Request), CPU and memory affinities. During the first long shutdown of LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software attempts to wrap the low-level socket library to ease higher-level programming with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved and the performance measurements of the software in the context of the CMS distributed event building.
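    The asynchronous, event-driven socket model mentioned above can be sketched with the standard readiness-notification API (this is generic illustration code, not the CMS online framework):

    ```python
    import selectors
    import socket

    def echo_once():
        """Event-driven (readiness-callback) I/O: the loop blocks in select()
        and a callback runs only when its socket actually has data, instead
        of dedicating a blocked thread to each connection."""
        sel = selectors.DefaultSelector()
        a, b = socket.socketpair()
        a.setblocking(False)
        b.setblocking(False)
        received = []

        def on_readable(conn):
            received.append(conn.recv(4096))

        sel.register(b, selectors.EVENT_READ, on_readable)
        a.sendall(b"event-fragment")
        for key, _events in sel.select(timeout=1.0):
            key.data(key.fileobj)            # dispatch the registered callback
        sel.close()
        a.close()
        b.close()
        return b"".join(received)

    data = echo_once()
    ```

    In a NUMA-aware system the same loop would additionally be pinned to cores near the NIC's memory and interrupts, which is the affinity point the abstract makes.
    
    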

  15. Event-driven control of a speed varying digital displacement machine

    DEFF Research Database (Denmark)

    Pedersen, Niels Henrik; Johansen, Per; Andersen, Torben O.

    2017-01-01

    To make synchronous linear control theory applicable to a variable speed digital displacement machine, a method based on event-driven control is presented, whereby the control problem can be treated as a Discrete Linear Time Invariant control problem with a synchronous sampling rate. Using this method, the time domain differential equations are converted into the spatial (position) domain to obtain a constant sampling rate, thus allowing the use of classical control theory. The controller synthesis is carried out as a discrete optimal deterministic problem with full state feedback. Based on a linear analysis of the feedback control system, stability is proven in a pre-specified operation region. The method is applied to a down scaled digital fluid power motor, where the motor speed is controlled at varying references under varying pressure and load torque conditions. Simulation of a non-linear evaluation model with the controller implemented shows great...
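    The core trick of sampling in the position domain instead of the time domain can be sketched as follows (the speed profile and angular increment are invented): the controller fires at fixed angular increments, so samples are equidistant in position even though they are aperiodic in time.

    ```python
    import math

    def angle_domain_sample_times(omega_of_t, dtheta, revs, dt=1e-4):
        """Integrate the shaft angle theta' = omega(t) and record the times
        at which theta crosses each fixed increment `dtheta`; these are the
        event-driven sampling instants of the position-domain controller."""
        times, theta, t = [], 0.0, 0.0
        next_theta = dtheta
        while theta < revs * 2.0 * math.pi:
            theta += omega_of_t(t) * dt      # integrate shaft position
            t += dt
            while theta >= next_theta:       # record each increment crossing
                times.append(t)
                next_theta += dtheta
        return times

    # Constant-speed check: equidistant angles give equidistant times.
    times = angle_domain_sample_times(lambda t: 10.0, dtheta=math.pi / 2, revs=1)
    ```

    Passing a time-varying `omega_of_t` stretches and compresses the time stamps while the angular spacing stays fixed, which is what makes the sampled model time invariant in the position domain.
    
    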

  16. The ATLAS Analysis Architecture

    International Nuclear Information System (INIS)

    Cranmer, K.S.

    2008-01-01

    We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability

  17. Future demands for an Industrialized Architecture?

    DEFF Research Database (Denmark)

    Beim, Anne

    2011-01-01

    When speaking about the future demands for industrialized architecture – or how to translate industrialized processes into tectonic sustainable design strategies in architecture – several questions come to mind. First of all, why is the building industry, in comparison to the production industry, …? While these questions raise a wide-spread discussion, one could argue that the building industry can benefit from different ways of architectural synthesis thinking as a basis for improvement, understood in such a way that industrialized manufacturing technologies and products should be driven by ideas...

  18. Science-Driven Computing: NERSC's Plan for 2006-2010

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.; Banda,Michael J.; Bethel, E. Wes; Craw, James M.; Fortney, William J.; Hules,John A.; Meyer, Nancy L.; Meza, Juan C.; Ng, Esmond G.; Rippe, Lynn E.; Saphir, William C.; Verdier, Francesca; Walter, Howard A.; Yelick,Katherine A.

    2005-05-16

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  19. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as an operation system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...
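    The event-building step can be sketched as follows; the fragment tuple format and three-source setup are invented, but the rule matches the description: an event is assembled, and only then handed to Level-3, once all readout locations have contributed their fragment.

    ```python
    def build_events(fragments, n_sources=16):
        """Assemble events from per-source fragments keyed by event number;
        an event is complete (and ready for the Level-3 farm) only once all
        n_sources readout locations have contributed."""
        pending, complete = {}, []
        for source_id, event_no, payload in fragments:
            parts = pending.setdefault(event_no, {})
            parts[source_id] = payload
            if len(parts) == n_sources:
                data = b"".join(parts[s] for s in sorted(parts))
                complete.append((event_no, data))
                del pending[event_no]
        return complete, pending

    # Interleaved fragments from three hypothetical readout crates.
    frags = [(s, e, bytes([e]) * 4) for e in (1, 2) for s in range(3)]
    events, leftovers = build_events(frags, n_sources=3)
    ```
    
    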

  20. Unraveling Supply-Driven Business Models of Architectural Firms

    NARCIS (Netherlands)

    Bos-De Vos, M.; Volker, L.; Wamelink, J.W.F.; Kaminsky, Jessica; Zerjav, Vedran

    2016-01-01

    Architectural firms deliver services for various, unique projects that are all characterized by a high level of uncertainty. To successfully propose, create and capture value, they need business models that are able to deal with this variety and uncertainty. So far, little is known about the

  1. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    Science.gov (United States)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research such as case study analysis and climatology studies involve discovering and gathering diverse data sets and information to support the research goals. To gather relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files [granules] from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven via an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.
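    The ontology-based relevancy ranking can be sketched like this (the mini-ontology, weights and threshold are entirely hypothetical):

    ```python
    # Hypothetical mini-ontology: concept weights for a "hurricane" event
    # album; negative weights down-rank known-irrelevant concepts.
    ONTOLOGY = {"hurricane": 3.0, "landfall": 2.0, "precipitation": 1.5,
                "wind": 1.0, "celebrity": -2.0}

    def relevancy(doc, ontology=ONTOLOGY):
        """Score a document by summing the weights of ontology terms it uses."""
        words = doc.lower().split()
        return sum(w for term, w in ontology.items() if term in words)

    def curate(docs, threshold=2.0):
        """Rank candidate items and keep only those whose ontology-weighted
        score clears the threshold, mimicking the album's filtering step."""
        ranked = sorted(docs, key=relevancy, reverse=True)
        return [d for d in ranked if relevancy(d) >= threshold]

    docs = ["hurricane landfall brings heavy precipitation",
            "celebrity spotted in hurricane party",
            "wind advisory issued"]
    album = curate(docs)
    ```
    
    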

  2. Ballooning-mirror instability and internally driven Pc 4--5 wave events

    International Nuclear Information System (INIS)

    Cheng, C.Z.; Qian, Q.; Takahashi, K.; Lui, A.T.Y.

    1994-03-01

    A kinetic-MHD field-aligned eigenmode stability analysis of low frequency ballooning-mirror instabilities has been performed for anisotropic pressure plasmas in the magnetosphere. The ballooning mode is mainly a transverse wave driven unstable by the pressure gradient in the bad curvature region. The mirror mode with a dominant compressional magnetic field perturbation is excited when the product of plasma beta and pressure anisotropy (P perpendicular /P parallel > 1) is large. From the AMPTE/CCE particle and magnetic field data observed during Pc 4--5 wave events the authors compute the ballooning-mirror instability parameters and perform a correlation study with the theoretical instability threshold. They find that compressional Pc 5 waves approximately satisfy the ballooning-mirror instability condition, and transverse Pc 4--5 waves are probably related to resonant ballooning instabilities with small pressure anisotropy

  3. Tectonic thinking in contemporary industrialized architecture

    DEFF Research Database (Denmark)

    Beim, Anne

    2013-01-01

    This paper argues for a new critical approach to the ways architectural design strategies are developing. The contemporary construction industry appears to evolve into highly specialized and optimized processes driven by industrialized manufacturing; therefore the role of the architect and the understanding of the architectural design process ought to be revised. The paper is based on the following underlying hypothesis: ‘Tectonic thinking – defined as a central attention towards the nature, the properties, and the application of building materials (construction) and how this attention forms a creative force in building constructions, structural features and architectural design (construing) – helps to identify and refine technology transfer in contemporary industrialized building construction’. Through various references from the construction industry, business theory and architectural practice...

  4. NEVESIM: event-driven neural simulation framework with a Python interface.

    Science.gov (United States)

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
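    The event-driven simulation strategy NEVESIM is built on, where neuron state is touched only when a spike event arrives rather than on a global clock tick, can be sketched with a priority queue (this toy model is not the NEVESIM API; the network, weights and delays are invented):

    ```python
    import heapq

    def simulate(weights, threshold=1.0, delay=1.0, t_end=10.0):
        """Event-driven toy spiking network: membrane state changes only when
        a spike event is popped from the global queue; firing neurons push
        delayed, weighted events to their postsynaptic targets."""
        n = len(weights)
        v = [0.0] * n                       # membrane state per neuron
        spike_record = []                   # (time, neuron) of each spike
        queue = [(0.0, 0, None)]            # seed: external input to neuron 0
        while queue:
            t, target, w = heapq.heappop(queue)
            if t > t_end:
                break
            v[target] += threshold if w is None else w
            if v[target] >= threshold:
                v[target] = 0.0             # fire and reset
                spike_record.append((t, target))
                for post, wij in enumerate(weights[target]):
                    if wij != 0.0:
                        heapq.heappush(queue, (t + delay, post, wij))
        return spike_record

    # A three-neuron chain with strong feed-forward weights: 0 -> 1 -> 2.
    W = [[0.0, 1.2, 0.0],
         [0.0, 0.0, 1.2],
         [0.0, 0.0, 0.0]]
    spikes = simulate(W)
    ```

    Decoupling the network-level event logic (the queue) from each neuron's internal dynamics (the update rule) mirrors the design split the abstract describes.
    
    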

  5. Sensitivity of Water Scarcity Events to ENSO-Driven Climate Variability at the Global Scale

    Science.gov (United States)

    Veldkamp, T. I. E.; Eisner, S.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2015-01-01

    Globally, freshwater shortage is one of the most dangerous risks for society. Changing hydro-climatic and socioeconomic conditions have aggravated water scarcity over the past decades. A wide range of studies show that water scarcity will intensify in the future, as a result of both increased consumptive water use and, in some regions, climate change. Although it is well-known that El Niño-Southern Oscillation (ENSO) affects patterns of precipitation and drought at global and regional scales, little attention has yet been paid to the impacts of climate variability on water scarcity conditions, despite its importance for adaptation planning. Therefore, we present the first global-scale sensitivity assessment of water scarcity to ENSO, the most dominant signal of climate variability. We show that over the time period 1961-2010, both water availability and water scarcity conditions are significantly correlated with ENSO-driven climate variability over a large proportion of the global land area (>28.1%); an area inhabited by more than 31.4% of the global population. We also found, however, that climate variability alone is often not enough to trigger the actual incidence of water scarcity events. The sensitivity of a region to water scarcity events, expressed in terms of land area or population exposed, is determined by both hydro-climatic and socioeconomic conditions. Currently, the population actually impacted by water scarcity events consists of 39.6% (CTA: consumption-to-availability ratio) and 41.1% (WCI: water crowding index) of the global population, whilst only 11.4% (CTA) and 15.9% (WCI) of the global population is at the same time living in areas sensitive to ENSO-driven climate variability. These results are contrasted, however, by differences in growth rates found under changing socioeconomic conditions, which are relatively high in regions exposed to water scarcity events. Given the correlations found between ENSO and water availability and scarcity

  6. BioCat 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Noonan, Christine F.; Bartholomew, Rachel A.; Franklin, Trisha L.; Hutchison, Janine R.; Lancaster, Mary J.; Madison, Michael C.; Piatt, Andrew W.

    2013-09-16

    The U.S. Department of Homeland Security (DHS) National Biosurveillance Integration Center (NBIC) was established in 2008 with a primary mission to “(1) enhance the capability of the Federal Government to (A) rapidly identify, characterize, localize, and track a biological event of national concern by integrating and analyzing data relating to human health, animal, plant, food, and environmental monitoring systems (both national and international); and (B) disseminate alerts and other information to Member Agencies and, in coordination with (and where possible through) Member Agencies, to agencies of State, local, and tribal governments, as appropriate, to enhance the ability of such agencies to respond to a biological event of national concern; and (2) oversee development and operation of the National Biosurveillance Integration System (NBIS).” Inherent in its mission then and the broader NBIS, NBIC is concerned with the identification, understanding, and use of a variety of biosurveillance models and systems. The goal of this project is to characterize, evaluate, classify, and catalog existing disease forecast and prediction models that could provide operational decision support for recognizing a biological event having a potentially significant impact. Additionally, gaps should be identified and recommendations made on using disease models in an operational environment to support real-time decision making.

  7. Architectural Analysis of Dynamically Reconfigurable Systems

    Science.gov (United States)

    Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly

    2010-01-01

    Topics include: the problem (increased flexibility of architectural styles decreases analyzability; behavior emerges and varies depending on the configuration; does the resulting system run according to the intended design; and architectural decisions can impede or facilitate testing); a top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.

  8. Architectural Innovations Influenced by Climatic Phenomena (4.2 Ka Event) in the Late Old Kingdom (Saqqara, Egypt)

    Directory of Open Access Journals (Sweden)

    Kuraszkiewicz Kamil O.

    2016-06-01

    Full Text Available The work of the Polish-Egyptian Archaeological Mission at Saqqara revealed a cemetery of palace officials that was in use during the late Old Kingdom. The evidence found during the exploration of the tombs indicates that, in the last years of the cemetery's use, the tomb builders were aware of the problems caused by torrential rains and devised architectural solutions against them. The discussed phenomena seem to be directly related to the 4.2 ka event.

  9. Minimizing cache misses in an event-driven network server: A case study of TUX

    DEFF Research Database (Denmark)

    Bhatia, Sapan; Consel, Charles; Lawall, Julia Laetitia

    2006-01-01

    We analyze the performance of CPU-bound network servers and demonstrate experimentally that the degradation in the performance of these servers under high-concurrency workloads is largely due to inefficient use of the hardware caches. We then describe an approach to speeding up event-driven network servers by optimizing their use of the L2 CPU cache in the context of the TUX Web server, known for its robustness to heavy load. Our approach is based on a novel cache-aware memory allocator and a specific scheduling strategy that together ensure that the total working data set of the server stays

  10. Event-Driven Process Chains (EPC)

    Science.gov (United States)

    Mendling, Jan

    This chapter provides a comprehensive overview of Event-driven Process Chains (EPCs) and introduces a novel definition of EPC semantics. EPCs became popular in the 1990s as a conceptual business process modeling language in the context of reference modeling. Reference modeling refers to the documentation of generic business operations in a model such as service processes in the telecommunications sector, for example. It is claimed that reference models can be reused and adapted as best-practice recommendations in individual companies (see [230, 168, 229, 131, 400, 401, 446, 127, 362, 126]). The roots of reference modeling can be traced back to the Kölner Integrationsmodell (KIM) [146, 147] that was developed in the 1960s and 1970s. In the 1990s, the Institute of Information Systems (IWi) in Saarbrücken worked on a project with SAP to define a suitable business process modeling language to document the processes of the SAP R/3 enterprise resource planning system. There were two results from this joint effort: the definition of EPCs [210] and the documentation of the SAP system in the SAP Reference Model (see [92, 211]). The extensive database of this reference model contains almost 10,000 sub-models: 604 of them non-trivial EPC business process models. The SAP Reference model had a huge impact with several researchers referring to it in their publications (see [473, 235, 127, 362, 281, 427, 415]) as well as motivating the creation of EPC reference models in further domains including computer integrated manufacturing [377, 379], logistics [229] or retail [52]. The wide-spread application of EPCs in business process modeling theory and practice is supported by their coverage in seminal text books for business process management and information systems in general (see [378, 380, 49, 384, 167, 240]). EPCs are frequently used in practice due to a high user acceptance [376] and extensive tool support. Some examples of tools that support EPCs are ARIS Toolset by IDS
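One of the standard EPC well-formedness rules is that events and functions alternate along the control flow (with connectors in between). A minimal, hypothetical encoding of an EPC as a typed graph, with a check for that alternation rule, could look like the following; node names and the example model are invented:

```python
# A minimal, hypothetical representation of an EPC as a typed graph, with one
# of the standard well-formedness checks: along direct arcs (ignoring
# connectors), events and functions must alternate.

EPC = {
    "nodes": {
        "e1": "event", "f1": "function", "e2": "event",
        "xor": "connector", "f2": "function", "f3": "function",
    },
    "arcs": [("e1", "f1"), ("f1", "e2"), ("e2", "xor"),
             ("xor", "f2"), ("xor", "f3")],
}

def alternation_violations(epc):
    """Return arcs that connect two events or two functions directly."""
    types = epc["nodes"]
    return [(a, b) for (a, b) in epc["arcs"]
            if types[a] == types[b] and types[a] in ("event", "function")]
```

Tool support of the kind mentioned above (ARIS and others) performs checks in this spirit, along with much richer semantic analyses.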

  11. Big data in cloud : a data architecture

    OpenAIRE

    Sá, Jorge Vaz de Oliveira e; Martins, César Silva; Simões, Paulo

    2015-01-01

    Nowadays, organizations have at their disposal a large volume of data of a wide variety of types. Technology-driven organizations want to capture, process, and analyze this data at high velocity in order to better understand and manage their customers, their operations, and their business processes. As data volume and variety increase, and as faster analytic results are needed, the demands on the data architecture grow. This data architecture should enable collecting,...

  12. Ontology Driven Meta-Modeling of Service Oriented Architecture

    African Journals Online (AJOL)

    pc

    2018-03-05


  13. Domain architecture conservation in orthologs

    Science.gov (United States)

    2011-01-01

    Background As orthologous proteins are expected to retain function more often than other homologs, they are often used for functional annotation transfer between species. However, ortholog identification methods do not take into account changes in domain architecture, which are likely to modify a protein's function. By domain architecture we refer to the sequential arrangement of domains along a protein sequence. To assess the level of domain architecture conservation among orthologs, we carried out a large-scale study of such events between human and 40 other species spanning the entire evolutionary range. We designed a score to measure domain architecture similarity and used it to analyze differences in domain architecture conservation between orthologs and paralogs relative to the conservation of primary sequence. We also statistically characterized the extents of different types of domain swapping events across pairs of orthologs and paralogs. Results The analysis shows that orthologs exhibit greater domain architecture conservation than paralogous homologs, even when differences in average sequence divergence are compensated for, for homologs that have diverged beyond a certain threshold. We interpret this as an indication of a stronger selective pressure on orthologs than paralogs to retain the domain architecture required for the proteins to perform a specific function. In general, orthologs as well as the closest paralogous homologs have very similar domain architectures, even at large evolutionary separation. The most common domain architecture changes observed in both ortholog and paralog pairs involved insertion/deletion of new domains, while domain shuffling and segment duplication/deletion were very infrequent. Conclusions On the whole, our results support the hypothesis that function conservation between orthologs demands higher domain architecture conservation than other types of homologs, relative to primary sequence conservation. 
This supports the
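The study designs its own domain-architecture similarity score, which is not reproduced here. As a simple stand-in, two architectures (ordered lists of domain identifiers) can be compared with an alignment-based ratio from the Python standard library; the Pfam-style accessions below are invented for illustration:

```python
# Stand-in for a domain-architecture similarity score: compare two ordered
# lists of domain identifiers with difflib's alignment ratio (2M/T, where M
# is the number of matched elements and T the total length of both lists).
from difflib import SequenceMatcher

def architecture_similarity(arch_a, arch_b):
    """Similarity in [0, 1] of two ordered domain-architecture lists."""
    return SequenceMatcher(None, arch_a, arch_b).ratio()

# Hypothetical architectures: an ortholog pair differing by one inserted
# domain, versus a paralog whose domains are shuffled.
human   = ["PF00069", "PF07714", "PF00169"]
mouse   = ["PF00069", "PF07714", "PF00017", "PF00169"]   # one insertion
paralog = ["PF00169", "PF00069", "PF07714"]              # shuffled order
```

With these toy inputs the ortholog pair scores higher than the shuffled paralog, mirroring the qualitative finding above that domain insertions are tolerated more than shuffling.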

  14. A distributed real-time system for event-driven control and dynamic data acquisition on a fusion plasma experiment

    International Nuclear Information System (INIS)

    Sousa, J.; Combo, A.; Batista, A.; Correia, M.; Trotman, D.; Waterhouse, J.; Varandas, C.A.F.

    2000-01-01

    A distributed real-time trigger and timing system, designed in a tree-type topology and implemented in VME and CAMAC versions, has been developed for a magnetic confinement fusion experiment. It provides sub-microsecond time latencies for the transport of small data objects allowing event-driven discharge control with failure counteraction, dynamic pre-trigger sampling and event recording as well as accurate simultaneous triggers and synchronism on all nodes with acceptable optimality and predictability of timeliness. This paper describes the technical characteristics of the hardware components (central unit composed by one or more reflector crates, event and synchronism reflector cards, event and pulse node module, fan-out and fan-in modules) as well as software for both tests and integration on a global data acquisition system. The results of laboratory operation for several configurations and the overall performance of the system are presented and analysed

  15. Architectural Refinement for the Design of Survivable Systems

    National Research Council Canada - National Science Library

    Ellison, Robert

    2001-01-01

    ...; that is, have no central administration and no unified security policy. The survivable architecture refinement is an iterative risk-driven process which adopts the structure of Boehm's Spiral Model Boehm 88...

  16. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    that a rain time series can be considered as an alternation of independent rain events and no-rain periods. The five selected features are used to perform a hierarchical clustering of the events. The well-known division between stratiform and convective events appears clearly. This classification into two classes is then refined into five fairly homogeneous subclasses. The data-driven analysis, performed on whole rain events instead of fixed-length samples, allows identifying strong relationships between macrophysical (based on rain rate) and microphysical (based on raindrops) features. We show that some of the five identified subclasses have specific microphysical characteristics. Obtaining information on the microphysical characteristics of rainfall events from rain gauge measurements has many implications for the development of quantitative precipitation estimation (QPE) and for the improvement of rain-rate retrieval algorithms in a remote sensing context.
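The event-based view described above can be sketched in a few lines: segment a rain-rate series into events separated by sufficiently long dry spells, then compute per-event macrophysical features. The gap length, units, and feature set below are illustrative choices, not the paper's:

```python
# Segment a rain-rate time series (e.g. one value per minute) into rain
# events separated by dry spells, then compute simple per-event features.
# The minimum dry-gap length and the feature set are assumptions.

def segment_events(rates, min_dry_gap=3):
    """Split a rain-rate series into events separated by >= min_dry_gap dry steps."""
    wet = [i for i, r in enumerate(rates) if r > 0]
    if not wet:
        return []
    events, start = [], wet[0]
    for prev, cur in zip(wet, wet[1:]):
        if cur - prev > min_dry_gap:          # dry spell long enough: close event
            events.append(rates[start:prev + 1])
            start = cur
    events.append(rates[start:wet[-1] + 1])
    return events

def features(event):
    """A few macrophysical features per event (duration, peak, mean rate)."""
    return {"duration": len(event),
            "peak": max(event),
            "mean": sum(event) / len(event)}

rates = [0, 2, 3, 0, 1, 0, 0, 0, 0, 5, 4, 0]
events = segment_events(rates)    # → [[2, 3, 0, 1], [5, 4]]
```

Feature vectors built this way per event could then feed a hierarchical clustering of the kind the abstract describes.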

  17. Future demands for an Industrialized Architecture?

    DEFF Research Database (Denmark)

    Beim, Anne

    2011-01-01

    these questions raise a wide-spread discussion, one could argue that the building industry can benefit from different ways of architectural synthesis thinking as a basis for improvement, understood in the sense that industrialized manufacturing technologies and products should be driven by ideas...

  18. Integrated Optoelectronic Networks for Application-Driven Multicore Computing

    Science.gov (United States)

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0102: Integrated Optoelectronic Networks for Application-Driven Multicore Computing. Sudeep Pasricha, Colorado State University; grant FA9550-13-1-0110. ...and supportive materials with innovative architectural designs that integrate these components according to system-wide application needs.

  19. The Need for Software Architecture Evaluation in the Acquisition of Software-Intensive Systems

    Science.gov (United States)

    2014-01-01

    Function and Performance Specification; GIG: Global Information Grid; ISO: International Standard Organisation; MDA: Model Driven Architecture (DSTO-TR-2936). ...architecture and design, which is a key part of the knowledge-based economy... allow Australian SMEs to...

  20. Laboratory infrastructure driven key performance indicator development using the smart grid architecture model

    DEFF Research Database (Denmark)

    Syed, Mazheruddin H.; Guillo-Sansano, Efren; Blair, Steven M.

    2017-01-01

    This study presents a methodology for collaboratively designing laboratory experiments and developing key performance indicators for the testing and validation of novel power system control architectures in multiple laboratory environments. The contribution makes use of the smart grid architecture...

  1. Trans-architecture

    Directory of Open Access Journals (Sweden)

    Tim Gough

    2017-12-01

    Full Text Available Starting from the intense experience of the gay club, this paper asks whether that experience or event can be acknowledged by architectural theory. Via a reading of Judith Butler’s Gender Trouble and Jacques Derrida’s Before the Law, it posits that the transing of gender can be a clue as to the transing of architecture away from essentialising ontologies. It then uses Deleuze and Guattari’s idea of an assemblage to show how this can be done, making reference to the assemblage of the gay seduction scene in Proust’s Remembrance of Things Past and the image of the interplay of orchid and wasp that is inspired by it. The paper concludes by showing how this ontology relates to a specific instance of transing architecture in the gay and SM clubs of Vauxhall, South London.

  2. Development of a data-driven algorithm to determine the W+jets background in t anti t events in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Mehlhase, Sascha

    2010-07-12

    The physics of the top quark is one of the key components in the physics programme of the ATLAS experiment at the Large Hadron Collider at CERN. In this thesis, general studies of the jet trigger performance for top quark events using fully simulated Monte Carlo samples are presented and two data-driven techniques to estimate the multi-jet trigger efficiency and the W+Jets background in top pair events are introduced to the ATLAS experiment. In a tag-and-probe based method, using a simple and common event selection and a high transverse momentum lepton as tag object, the possibility to estimate the multijet trigger efficiency from data in ATLAS is investigated and it is shown that the method is capable of estimating the efficiency without introducing any significant bias by the given tag selection. In the second data-driven analysis a new method to estimate the W+Jets background in a top-pair event selection is introduced to ATLAS. By defining signal and background dominated regions by means of the jet multiplicity and the pseudo-rapidity distribution of the lepton in the event, the W+Jets contribution is extrapolated from the background dominated into the signal dominated region. The method is found to estimate the given background contribution as a function of the jet multiplicity with an accuracy of about 25% for most of the top dominated region with an integrated luminosity of above 100 pb⁻¹ at √s = 10 TeV. This thesis also covers a study summarising the thermal behaviour and expected performance of the Pixel Detector of ATLAS. All measurements performed during the commissioning phase of 2008/09 yield results within the specification of the system and the performance is expected to stay within those even after several years of running under LHC conditions. (orig.)
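The extrapolation step described above follows the general pattern of the "ABCD" sideband method: two observables (here, jet multiplicity and lepton pseudo-rapidity) split the data into a signal-dominated region A and three background-dominated control regions B, C, D, and the background in A is estimated from the control-region counts. The thesis's actual binning and corrections are not reproduced here; the counts below are invented:

```python
# Sketch of ABCD-style background extrapolation. Assumes the two splitting
# observables are (approximately) uncorrelated for the background, so that
# N_A = N_B * N_C / N_D. The event counts are invented for illustration.

def abcd_estimate(n_b, n_c, n_d):
    """Estimated background count in the signal region A."""
    return n_b * n_c / n_d

# Hypothetical W+jets counts in the three control regions.
n_wjets_in_signal_region = abcd_estimate(n_b=480.0, n_c=250.0, n_d=600.0)
```

In practice the estimate would be repeated per jet-multiplicity bin and assigned a systematic uncertainty, consistent with the ~25% accuracy quoted above.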

  3. Development of a data-driven algorithm to determine the W+jets background in t anti t events in ATLAS

    International Nuclear Information System (INIS)

    Mehlhase, Sascha

    2010-01-01

    The physics of the top quark is one of the key components in the physics programme of the ATLAS experiment at the Large Hadron Collider at CERN. In this thesis, general studies of the jet trigger performance for top quark events using fully simulated Monte Carlo samples are presented and two data-driven techniques to estimate the multi-jet trigger efficiency and the W+Jets background in top pair events are introduced to the ATLAS experiment. In a tag-and-probe based method, using a simple and common event selection and a high transverse momentum lepton as tag object, the possibility to estimate the multijet trigger efficiency from data in ATLAS is investigated and it is shown that the method is capable of estimating the efficiency without introducing any significant bias by the given tag selection. In the second data-driven analysis a new method to estimate the W+Jets background in a top-pair event selection is introduced to ATLAS. By defining signal and background dominated regions by means of the jet multiplicity and the pseudo-rapidity distribution of the lepton in the event, the W+Jets contribution is extrapolated from the background dominated into the signal dominated region. The method is found to estimate the given background contribution as a function of the jet multiplicity with an accuracy of about 25% for most of the top dominated region with an integrated luminosity of above 100 pb⁻¹ at √s = 10 TeV. This thesis also covers a study summarising the thermal behaviour and expected performance of the Pixel Detector of ATLAS. All measurements performed during the commissioning phase of 2008/09 yield results within the specification of the system and the performance is expected to stay within those even after several years of running under LHC conditions. (orig.)

  4. Role of data aggregation in biosurveillance detection strategies with applications from ESSENCE.

    Science.gov (United States)

    Burkom, Howard S; Elbert, Y; Feldman, A; Lin, J

    2004-09-24

    Syndromic surveillance systems are used to monitor daily electronic data streams for anomalous counts of features of varying specificity. The monitored quantities might be counts of clinical diagnoses, sales of over-the-counter influenza remedies, school absenteeism among a given age group, and so forth. Basic data-aggregation decisions for these systems include determining which records to count and how to group them in space and time. This paper discusses the application of spatial and temporal data-aggregation strategies for multiple data streams to alerting algorithms appropriate to the surveillance region and public health threat of interest. Such a strategy was applied and evaluated for a complex, authentic, multisource, multiregion environment, including >2 years of data records from a system-evaluation exercise for the Defense Advanced Research Project Agency (DARPA). Multivariate and multiple univariate statistical process control methods were adapted and applied to the DARPA data collection. Comparative parametric analyses based on temporal aggregation were used to optimize the performance of these algorithms for timely detection of a set of outbreaks identified in the data by a team of epidemiologists. The sensitivity and timeliness of the most promising detection methods were tested at empirically calculated thresholds corresponding to multiple practical false-alert rates. Even at the strictest false-alert rate, all but one of the outbreaks were detected by the best method, and the best methods achieved a 1-day median time before alert over the set of test outbreaks. These results indicate that a biosurveillance system can provide a substantial alerting-timeliness advantage over traditional public health monitoring for certain outbreaks. Comparative analyses of individual algorithm results indicate further achievable improvement in sensitivity and specificity.
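One of the "multiple univariate" statistical process control methods alluded to above can be sketched as a one-sided CUSUM over daily syndromic counts. The baseline mean, allowance k, and threshold h below are illustrative choices, not ESSENCE's calibrated parameters:

```python
# One-sided CUSUM alerting on daily syndromic counts. Baseline mean,
# allowance k and threshold h are illustrative, not ESSENCE's settings.

def cusum_alerts(counts, mean, k=1.0, h=4.0):
    """Return the day indices on which the CUSUM statistic exceeds h."""
    s, alerts = 0.0, []
    for day, c in enumerate(counts):
        s = max(0.0, s + (c - mean - k))   # accumulate excess over mean + k
        if s > h:
            alerts.append(day)
            s = 0.0                        # reset after alerting
    return alerts

# Ten days of counts with an outbreak-like rise at the end.
daily = [9, 11, 10, 8, 12, 10, 11, 15, 18, 20]
alert_days = cusum_alerts(daily, mean=10.0)   # → [8, 9]
```

Running several such charts in parallel, one per data stream and syndrome group, and tuning h to an acceptable false-alert rate is the essence of the parametric comparisons described in the abstract.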

  5. GAUDI-Architecture design document

    CERN Document Server

    Mato, P

    1998-01-01

    98-064 This document is the result of the architecture design phase for the LHCb event data processing applications project. The architecture of the LHCb software system includes its logical and physical structure, which has been forged by all the strategic and tactical decisions applied during development. The strategic decisions should be made explicitly, with consideration of the trade-offs of each alternative. The other purpose of this document is to serve as the main material for the scheduled architecture review that will take place in the next weeks. The architecture review will allow us to identify the weaknesses and strengths of the proposed architecture, and we hope to obtain a list of suggested changes to improve it, all well before the system is realized in code. It is in our interest to identify possible problems at the architecture design phase of the software project, before much of the software is implemented. Strategic decisions must be cross checked caref...

  6. (I) A Declarative Framework for ERP Systems(II) Reactors: A Data-Driven Programming Model for Distributed Applications

    DEFF Research Database (Denmark)

    Stefansen, Christian Oskar Erik

    This dissertation is a collection of six adapted research papers pertaining to two areas of research. (I) A Declarative Framework for ERP Systems: • POETS: Process-Oriented Event-driven Transaction Systems. The paper describes an ontological analysis of a small segment of the enterprise domain......, namely the general ledger and accounts receivable. The result is an event-based approach to designing ERP systems and an abstract-level sketch of the architecture. • Compositional Specification of Commercial Contracts. The paper describes the design, multiple semantics, and use of a domain....... • Using Soft Constraints to Guide Users in Flexible Business Process Management Systems. The paper shows how the inability of a process language to express soft constraints—constraints that can be violated occasionally, but are closely monitored—leads to a loss of intentional information in process...

  7. Editorial - Special Issue on Model-driven Service-oriented architectures

    NARCIS (Netherlands)

    Andrade Almeida, João; Ferreira Pires, Luis; van Sinderen, Marten J.; Steen, M.W.A.

    2009-01-01

    Model-driven approaches to software development have proliferated in recent years owing to the availability of techniques based on metamodelling and model transformations, such as the meta-object facility (MOF) and the query view transformation (QVT) standards. During the same period,

  8. Architecture design of the multi-functional wavelet-based ECG microprocessor for realtime detection of abnormal cardiac events.

    Science.gov (United States)

    Cheng, Li-Fang; Chen, Tung-Chien; Chen, Liang-Gee

    2012-01-01

    Most of the abnormal cardiac events such as myocardial ischemia, acute myocardial infarction (AMI) and fatal arrhythmia can be diagnosed through continuous electrocardiogram (ECG) analysis. According to recent clinical research, early detection and alarming of such cardiac events can reduce the time delay to the hospital, and the clinical outcomes of these individuals can be greatly improved. Therefore, it would be helpful to have a long-term ECG monitoring system with the ability to identify abnormal cardiac events and provide realtime warning for the users. The combination of a wireless body area sensor network (BASN) and an on-sensor ECG processor is a possible solution for this application. In this paper, we aim to design and implement a digital signal processor that is suitable for continuous ECG monitoring and alarming based on the continuous wavelet transform (CWT), using both a programmable RISC processor and application-specific integrated circuits (ASICs) for performance optimization. According to the implementation results, the power consumption of the proposed processor integrated with an ASIC for CWT computation is only 79.4 mW. Compared with the single-RISC processor, a power reduction of about 91.6% is achieved.
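The CWT stage that the processor implements in hardware can be sketched in software as convolution of the signal with scaled Ricker ("Mexican hat") wavelets. This is a generic CWT illustration, not the paper's wavelet choice; the synthetic "ECG" is just an impulse standing in for an R-peak, and the scales are arbitrary:

```python
# Software sketch of a CWT: convolve the signal with scaled Ricker wavelets.
# Wavelet choice, scales, and the synthetic signal are assumptions.
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def cwt(signal, scales, points=101):
    """Rows of the returned matrix: signal convolved at each scale."""
    return np.vstack([np.convolve(signal, ricker(points, a), mode="same")
                      for a in scales])

sig = np.zeros(256)
sig[64] = 1.0                     # a lone spike standing in for an R-peak
coeffs = cwt(sig, scales=[2.0, 4.0, 8.0])
```

Thresholding the coefficient rows per scale is one simple way such a front end could flag candidate beats for the downstream classification logic.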

  9. Network-driven design principles for neuromorphic systems

    Directory of Open Access Journals (Sweden)

    Johannes ePartzsch

    2015-10-01

    Full Text Available Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on the basis of technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step-by-step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components for reducing total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. In total, our methods allow designers to implement more area-efficient neuromorphic systems and verify usability of the connectivity resources in these systems.
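The idea of fitting synaptic storage to a model's connection density can be caricatured with a back-of-the-envelope comparison: a full synapse crossbar pays for every neuron pair, while per-neuron target lists pay per actual connection (plus an address). The bit costs below are invented and much simpler than the paper's generalized description:

```python
# Toy cost model for fitting synaptic storage to connection density.
# Bit widths are invented; the paper's area model is far more detailed.
import math

def crossbar_bits(n, weight_bits=4):
    """Full n x n matrix: one weight slot per neuron pair."""
    return n * n * weight_bits

def sparse_bits(n, density, weight_bits=4):
    """Per-neuron target lists: address + weight per actual connection."""
    addr_bits = math.ceil(math.log2(n))
    return round(n * n * density) * (addr_bits + weight_bits)

def choose_layout(n, density):
    full, sparse = crossbar_bits(n), sparse_bits(n, density)
    return ("crossbar", full) if full <= sparse else ("sparse", sparse)
```

For 1024 neurons, 5% connectivity favours the sparse lists while 50% favours the crossbar, which is the density-adaptation trade-off the abstract describes in miniature.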

  10. Network-driven design principles for neuromorphic systems.

    Science.gov (United States)

    Partzsch, Johannes; Schüffny, Rene

    2015-01-01

    Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on the basis of technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step-by-step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components for reducing total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. In total, our methods allow designers to implement more area-efficient neuromorphic systems and verify usability of the connectivity resources in these systems.

  11. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interests (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers but in the future more sophisticated methods might also be defined. The camera provides 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices for example at ASDEX Upgrade and COMPASS with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.
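The intensity-change trigger described above can be illustrated with a small sketch: compare the mean intensity of each Region of Interest between consecutive frames and flag ROIs whose relative change exceeds a threshold. Frame sizes, ROI layout, and the threshold are invented, and EDICAM's actual firmware logic is of course far richer:

```python
# Sketch of ROI intensity-change event detection between consecutive frames.
# ROI layout, frame content, and the threshold are assumptions.
import numpy as np

def roi_events(prev_frame, frame, rois, rel_threshold=0.2):
    """Return names of ROIs whose mean intensity changed by > rel_threshold."""
    fired = []
    for name, (y0, y1, x0, x1) in rois.items():
        before = prev_frame[y0:y1, x0:x1].mean()
        after = frame[y0:y1, x0:x1].mean()
        if abs(after - before) > rel_threshold * max(before, 1e-9):
            fired.append(name)
    return fired

rois = {"edge": (0, 8, 0, 8), "core": (8, 16, 8, 16)}
f0 = np.full((16, 16), 100.0)
f1 = f0.copy()
f1[8:16, 8:16] = 150.0            # brightening event in the "core" ROI
```

In the camera, a trigger like this would steer which ROIs are read out at high rate while the full frame continues at the base 444 Hz.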

  12. Optimal event handling by multiple unmanned aerial vehicles

    NARCIS (Netherlands)

    de Roo, Martijn; Frasca, Paolo; Carloni, Raffaella

    This paper proposes a control architecture for a fleet of unmanned aerial vehicles that is responsible for handling the events that take place in a given area. The architecture guarantees that each event is handled by the required number of vehicles in the shortest time, while the rest of the fleet
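The allocation problem described above can be sketched in a deliberately simple greedy form: send the k nearest free vehicles to each event as it arrives. The paper's actual control architecture is not reproduced here; names, positions, and the greedy policy are assumptions:

```python
# Greedy sketch: assign the k nearest free vehicles to an event.
# Fleet layout and the greedy policy are assumptions for illustration.
import math

def assign(event_pos, vehicles, k):
    """Pick the k free vehicles closest to the event; mark them busy."""
    free = [(name, pos) for name, (pos, busy) in vehicles.items() if not busy]
    free.sort(key=lambda item: math.dist(item[1], event_pos))
    chosen = [name for name, _ in free[:k]]
    for name in chosen:
        vehicles[name] = (vehicles[name][0], True)
    return chosen

fleet = {"uav1": ((0.0, 0.0), False),
         "uav2": ((5.0, 5.0), False),
         "uav3": ((1.0, 1.0), False)}
team = assign((0.0, 0.0), fleet, k=2)   # → ["uav1", "uav3"]
```

A greedy rule like this handles one event quickly but can starve later events, which is exactly the kind of fleet-wide guarantee a proper control architecture has to add.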

  13. Synchronization of autonomous objects in discrete event simulation

    Science.gov (United States)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.
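The combination described above, continuous movement with event-driven efficiency, can be sketched by computing positions lazily from the last update time while a heap orders the pending events; no per-tick stepping is needed. The scenario and names below are invented:

```python
# Minimal event-driven simulation with a lazily-positioned autonomous object:
# positions are exact functions of time, events come off a min-heap.
# The scenario (one 1-D mover, two events) is invented for illustration.
import heapq

class Mover:
    def __init__(self, pos, velocity):
        self.pos, self.velocity, self.t = pos, velocity, 0.0

    def position_at(self, t):
        """Exact position at any time, without per-tick stepping."""
        return self.pos + self.velocity * (t - self.t)

events = []                        # (time, description) min-heap
heapq.heappush(events, (4.0, "waypoint"))
heapq.heappush(events, (1.5, "sensor ping"))

obj = Mover(pos=0.0, velocity=2.0)
log = []
while events:
    now, what = heapq.heappop(events)
    log.append((now, what, obj.position_at(now)))
```

A spatial blackboard would sit on top of this: objects post their motion equations to it, and interaction events are scheduled only when the posted trajectories predict contact.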

  14. Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines.

    Science.gov (United States)

    Neftci, Emre O; Augustine, Charles; Paul, Somnath; Detorakis, Georgios

    2017-01-01

    An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient descent Gradient Back Propagation (BP) rule, often relies on the immediate availability of network-wide information stored with high-precision memory during learning, and precise operations that are difficult to realize in neuromorphic hardware. Remarkably, recent work showed that exact backpropagated gradients are not essential for learning deep representations. Building on these results, we demonstrate an event-driven random BP (eRBP) rule that uses an error-modulated synaptic plasticity for learning deep representations. Using a two-compartment Leaky Integrate & Fire (I&F) neuron, the rule requires only one addition and two comparisons for each synaptic weight, making it very suitable for implementation in digital or mixed-signal neuromorphic hardware. Our results show that using eRBP, deep representations are rapidly learned, achieving classification accuracies on permutation invariant datasets comparable to those obtained in artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.
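The core trick, replacing the transposed forward weights in the backward pass with a fixed random feedback matrix, can be sketched with a tiny dense network (the spiking, two-compartment I&F machinery of eRBP itself is not reproduced here; network sizes, rates, and the toy regression task are all invented):

```python
# Feedback-alignment-style sketch of the eRBP idea: the hidden-layer update
# uses a FIXED random feedback matrix B instead of W2.T. All sizes, rates,
# and the toy task are assumptions; no spiking dynamics are modelled.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 16, 2
W1 = rng.normal(0, 0.5, (n_hid, n_in))
W2 = rng.normal(0, 0.5, (n_out, n_hid))
B = rng.normal(0, 0.5, (n_hid, n_out))    # fixed random feedback weights
M = rng.normal(0, 1.0, (n_out, n_in))     # target linear map to learn

def loss_on(x):
    h = np.tanh(W1 @ x)
    return float(np.sum((W2 @ h - M @ x) ** 2))

def avg_loss(xs):
    return sum(loss_on(x) for x in xs) / len(xs)

xs_test = rng.normal(0, 1.0, (8, n_in))
initial = avg_loss(xs_test)

lr = 0.01
for _ in range(3000):
    x = rng.normal(0, 1.0, n_in)
    h = np.tanh(W1 @ x)
    e = W2 @ h - M @ x                            # output error
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer((B @ e) * (1 - h ** 2), x)  # random feedback path

final = avg_loss(xs_test)
```

The point mirrored from the abstract is that the feedback weights never need to match the forward weights, which removes the weight-transport requirement that is awkward in neuromorphic hardware.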

  15. Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines

    Directory of Open Access Journals (Sweden)

    Emre O. Neftci

    2017-06-01

    Full Text Available An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient-descent Back-Propagation (BP) rule, often relies on the immediate availability of network-wide information stored with high-precision memory during learning, and precise operations that are difficult to realize in neuromorphic hardware. Remarkably, recent work showed that exact backpropagated gradients are not essential for learning deep representations. Building on these results, we demonstrate an event-driven random BP (eRBP) rule that uses error-modulated synaptic plasticity for learning deep representations. Using a two-compartment Leaky Integrate & Fire (I&F) neuron, the rule requires only one addition and two comparisons for each synaptic weight, making it very suitable for implementation in digital or mixed-signal neuromorphic hardware. Our results show that using eRBP, deep representations are rapidly learned, achieving classification accuracies on permutation-invariant datasets comparable to those obtained in artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.

  16. An optical study of multiple NEIAL events driven by low energy electron precipitation

    Directory of Open Access Journals (Sweden)

    J. M. Sullivan

    2008-08-01

    Full Text Available Optical data are compared with EISCAT radar observations of multiple Naturally Enhanced Ion-Acoustic Line (NEIAL) events in the dayside cusp. This study uses narrow field-of-view cameras to observe small-scale, short-lived auroral features. Using multiple-wavelength optical observations, a direct link between NEIAL occurrences and low energy (about 100 eV) optical emissions is shown. This is consistent with the Langmuir wave decay interpretation of NEIALs being driven by streams of low-energy electrons. Modelling work connected with this study shows that, for the measured ionospheric conditions and precipitation characteristics, growth of unstable Langmuir (electron plasma) waves can occur, which decay into ion-acoustic wave modes. The link with low energy optical emissions shown here will enable future studies of the shape, extent, lifetime, grouping and motions of NEIALs.

  17. On the Role of Ionospheric Ions in Sawtooth Events

    Science.gov (United States)

    Lund, E. J.; Nowrouzi, N.; Kistler, L. M.; Cai, X.; Frey, H. U.

    2018-01-01

    Simulations have suggested that feedback of heavy ions originating in the ionosphere is an important mechanism for driving sawtooth injections. However, this feedback may only be necessary for events driven by coronal mass ejections (CMEs), whereas in events driven by streaming interaction regions (SIRs), solar wind variability may suffice to drive these injections. Here we present case studies of two sawtooth events for which in situ data are available in both the magnetotail (Cluster) and the nightside auroral region (FAST), as well as global auroral images (IMAGE). One event, on 1 October 2001, was driven by a CME; the other, on 24 October 2002, was driven by an SIR. The available data do not support the hypothesis that heavy ion feedback is necessary to drive either event. This result is consistent with simulations of the SIR-driven event but disagrees with simulation results for a different CME-driven event. We also find that in an overwhelming majority of the sawtooth injections for which Cluster tail data are available, the O+ observed in the tail comes from the cusp rather than the nightside auroral region, which further casts doubt on the hypothesis that ionospheric heavy ion feedback is the cause of sawtooth injections.

  18. Architecture of reconciliation

    OpenAIRE

    Tyrrell, Roger; Astridge, S.

    2008-01-01

    One quarter of all the world’s cranes are located in the fastest growing city in the world; Dubai. The paradox is, that in striving for global economic recognition Dubai has become a parody of itself, a mythology of forms; an adult Disneyland built upon the silent deserts of the past. The emphasis upon ‘landmark’ architecture is primarily driven and controlled by global economics and the quest for recognition upon the global stage. As a result, these new forms lack empathy and humility and ha...

  19. Functional language and data flow architectures

    Science.gov (United States)

    Ercegovac, M. D.; Patel, D. R.; Lang, T.

    1983-01-01

    This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.

  20. Reference architecture of application services for personal wellbeing information management.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha

    2011-01-01

    Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.

  1. Model-Driven Theme/UML

    Science.gov (United States)

    Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán

    Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications, an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.

  2. Architecture-independent power bound for vibration energy harvesters

    International Nuclear Information System (INIS)

    Halvorsen, E; Le, C P; Mitcheson, P D; Yeatman, E M

    2013-01-01

    The maximum output power of energy harvesters driven by harmonic vibrations is well known for a range of specific harvester architectures. An architecture-independent bound based on the mechanical input power also exists and gives a strict limit on achievable power with one mechanical degree of freedom, but is a least upper bound only for lossless devices. We report a new theoretical bound on the output power of vibration energy harvesters that includes parasitic, linear mechanical damping while still being architecture independent. This bound greatly improves the previous bound at moderate force amplitudes and is compared to the performance of established harvester architectures, which are shown to agree with it in limiting cases. The bound is a hard limit on achievable power with one mechanical degree of freedom and cannot be circumvented by transducer or power-electronic-interface design.

  3. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  4. Semantic Web-Driven LMS Architecture towards a Holistic Learning Process Model Focused on Personalization

    Science.gov (United States)

    Kerkiri, Tania

    2010-01-01

    A comprehensive presentation is here made on the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…

  5. The CMS Event Builder

    CERN Document Server

    Brigljevic, V; Cano, E; Cittolin, Sergio; Csilling, Akos; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutleber, J; Jacobs, C; Kozlovszky, Miklos; Larsen, H; Magrans de Abril, Ildefons; Meijers, F; Meschi, E; Murray, S; Oh, A; Orsini, L; Pollet, L; Rácz, A; Samyn, D; Scharff-Hansen, P; Schwick, C; Sphicas, Paris; ODell, V; Suzuki, I; Berti, L; Maron, G; Toniolo, N; Zangrando, L; Ninane, A; Erhan, S; Bhattacharya, S; Branson, J G

    2003-01-01

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test-bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements on throughput and scaling are presented. The architecture of the baseline CMS event builder will be outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from fragmen...

  6. Discrete Event Modeling and Simulation-Driven Engineering for the ATLAS Data Acquisition Network

    CERN Document Server

    Bonaventura, Matias Alejandro; The ATLAS collaboration; Castro, Rodrigo Daniel

    2016-01-01

    We present an iterative and incremental development methodology for simulation models in network engineering projects. Driven by the DEVS (Discrete Event Systems Specification) formal framework for modeling and simulation, we assist network design, test, analysis and optimization processes. A practical application of the methodology is presented for a case study in the ATLAS particle physics detector, the largest scientific experiment built by man, where scientists around the globe search for answers about the origins of the universe. The ATLAS data network conveys real-time information produced by physics detectors as beams of particles collide. The produced sub-atomic evidence must be filtered and recorded for further offline scrutiny. Due to the criticality of the transported data, networks and applications undergo careful engineering processes with stringent quality-of-service requirements. A tight project schedule imposes time pressure on design decisions, while rapid technology evolution widens the palett...

  7. Fast, Accurate Memory Architecture Simulation Technique Using Memory Access Characteristics

    OpenAIRE

    小野, 貴継; 井上, 弘士; 村上, 和彰

    2007-01-01

    This paper proposes a fast and accurate memory architecture simulation technique. To design memory architecture, the first steps commonly involve using trace-driven simulation. However, expanding the design space increases the evaluation time. A fast simulation can be achieved by reducing the trace size, but this reduces the simulation accuracy. Our approach can reduce the simulation time while maintaining the accuracy of the simulation results. In order to evaluate the validity of the proposed techniq...

  8. EMIR: a configurable hierarchical system for event monitoring and incident response

    Science.gov (United States)

    Deich, William T. S.

    2014-07-01

    The Event Monitor and Incident Response system (EMIR) is a flexible, general-purpose system for monitoring and responding to all aspects of instrument, telescope, and general facility operations, and has been in use at the Automated Planet Finder telescope for two years. Responses to problems can include both passive actions (e.g. generating alerts) and active actions (e.g. modifying system settings). EMIR includes a monitor-and-response daemon, plus graphical user interfaces and text-based clients that automatically configure themselves from data supplied at runtime by the daemon. The daemon is driven by a configuration file that describes each condition to be monitored, the actions to take when the condition is triggered, and how the conditions are aggregated into hierarchical groups of conditions. EMIR has been implemented for the Keck Task Library (KTL) keyword-based systems used at Keck and Lick Observatories, but can be readily adapted to many event-driven architectures. This paper discusses the design and implementation of EMIR, and the challenges in balancing the competing demands for simplicity, flexibility, power, and extensibility. EMIR's design lends itself well to multiple purposes, and in addition to its core monitor and response functions, it provides an effective framework for computing running statistics, aggregate values, and summary state values from the primitive state data generated by other subsystems, and even for creating quick-and-dirty control loops for simple systems.
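    The configuration model this record describes (conditions with predicates and actions, aggregated into hierarchical groups) can be sketched as follows. The class names, the telemetry dictionary, and the triggering semantics are illustrative assumptions; the real system is driven by KTL keywords and a configuration file.

```python
# Hypothetical sketch of the monitor-and-response pattern described for EMIR:
# each condition pairs a predicate over current telemetry with a list of
# response actions, and conditions aggregate into hierarchical groups that
# are triggered whenever any member is.
class Condition:
    def __init__(self, name, predicate, actions=()):
        self.name, self.predicate, self.actions = name, predicate, actions

    def evaluate(self, telemetry):
        triggered = self.predicate(telemetry)
        if triggered:
            for action in self.actions:      # passive or active responses
                action(self.name, telemetry)
        return triggered

class Group:
    def __init__(self, name, children):
        self.name, self.children = name, children

    def evaluate(self, telemetry):
        # build the full list first so every child is evaluated (no
        # short-circuiting): each condition must get a chance to respond
        return any([c.evaluate(telemetry) for c in self.children])
```

    A real daemon would poll or subscribe to keyword updates and evaluate the root group on each change; here a single `evaluate` call per telemetry snapshot stands in for that loop.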

  9. Developing Distributed System With Service Resource Oriented Architecture

    Directory of Open Access Journals (Sweden)

    Hermawan Hermawan

    2012-06-01

    Full Text Available Service Oriented Architecture is a design paradigm in software engineering with which a distributed system is built for an enterprise. This paradigm aims at providing the system as a service through a protocol in web service technology, namely the Simple Object Access Protocol (SOAP). However, SOA by itself is bound to the service-level agreements of web services. For this reason, this research aims at combining SOA with Resource Oriented Architecture in order to expand the scalability of services. This combination creates Service Resource Oriented Architecture (SROA), with which a distributed system is developed that integrates services within project management software. Following this design, the software is developed according to a framework of Agile Model Driven Development, which can reduce the complexities of the whole process of software development.

  10. A New Path-Constrained Rendezvous Planning Approach for Large-Scale Event-Driven Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ahmadreza Vajdi

    2018-05-01

    Full Text Available We study the problem of employing a mobile-sink into large-scale Event-Driven Wireless Sensor Networks (EWSNs) for the purpose of data harvesting from sensor-nodes. Generally, this employment addresses the main weakness of WSNs, namely energy consumption in battery-driven sensor-nodes. The main motivation of our work is to address challenges which are related to a network’s topology by adopting a mobile-sink that moves in a predefined trajectory in the environment. Since, in this fashion, it is not possible to gather data from sensor-nodes individually, we adopt the approach of defining some of the sensor-nodes as Rendezvous Points (RPs) in the network. We argue that RP-planning in this case is a tradeoff between minimizing the number of RPs while decreasing the number of hops for a sensor-node that needs data transformation to the related RP, which leads to minimizing average energy consumption in the network. We address the problem by formulating the challenges and expectations as a Mixed Integer Linear Programming (MILP). Henceforth, by proving the NP-hardness of the problem, we propose three effective and distributed heuristics for RP-planning, identifying sojourn locations, and constructing routing trees. Finally, experimental results prove the effectiveness of our approach.

  11. A New Path-Constrained Rendezvous Planning Approach for Large-Scale Event-Driven Wireless Sensor Networks.

    Science.gov (United States)

    Vajdi, Ahmadreza; Zhang, Gongxuan; Zhou, Junlong; Wei, Tongquan; Wang, Yongli; Wang, Tianshu

    2018-05-04

    We study the problem of employing a mobile-sink into large-scale Event-Driven Wireless Sensor Networks (EWSNs) for the purpose of data harvesting from sensor-nodes. Generally, this employment addresses the main weakness of WSNs, namely energy consumption in battery-driven sensor-nodes. The main motivation of our work is to address challenges which are related to a network’s topology by adopting a mobile-sink that moves in a predefined trajectory in the environment. Since, in this fashion, it is not possible to gather data from sensor-nodes individually, we adopt the approach of defining some of the sensor-nodes as Rendezvous Points (RPs) in the network. We argue that RP-planning in this case is a tradeoff between minimizing the number of RPs while decreasing the number of hops for a sensor-node that needs data transformation to the related RP, which leads to minimizing average energy consumption in the network. We address the problem by formulating the challenges and expectations as a Mixed Integer Linear Programming (MILP). Henceforth, by proving the NP-hardness of the problem, we propose three effective and distributed heuristics for RP-planning, identifying sojourn locations, and constructing routing trees. Finally, experimental results prove the effectiveness of our approach.

  12. A New Path-Constrained Rendezvous Planning Approach for Large-Scale Event-Driven Wireless Sensor Networks

    Science.gov (United States)

    Zhang, Gongxuan; Wang, Yongli; Wang, Tianshu

    2018-01-01

    We study the problem of employing a mobile-sink into large-scale Event-Driven Wireless Sensor Networks (EWSNs) for the purpose of data harvesting from sensor-nodes. Generally, this employment addresses the main weakness of WSNs, namely energy consumption in battery-driven sensor-nodes. The main motivation of our work is to address challenges which are related to a network’s topology by adopting a mobile-sink that moves in a predefined trajectory in the environment. Since, in this fashion, it is not possible to gather data from sensor-nodes individually, we adopt the approach of defining some of the sensor-nodes as Rendezvous Points (RPs) in the network. We argue that RP-planning in this case is a tradeoff between minimizing the number of RPs while decreasing the number of hops for a sensor-node that needs data transformation to the related RP, which leads to minimizing average energy consumption in the network. We address the problem by formulating the challenges and expectations as a Mixed Integer Linear Programming (MILP). Henceforth, by proving the NP-hardness of the problem, we propose three effective and distributed heuristics for RP-planning, identifying sojourn locations, and constructing routing trees. Finally, experimental results prove the effectiveness of our approach. PMID:29734718
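    The RP-planning trade-off described in these records (fewer rendezvous points versus fewer hops to the nearest RP) can be illustrated with a deliberately naive greedy set-cover heuristic. This is not one of the paper's three heuristics; the graph representation, candidate set and hop bound are illustrative assumptions.

```python
from collections import deque

# BFS hop distances from a source node over an adjacency-list graph.
def hops_from(adj, src):
    dist, frontier = {src: 0}, deque([src])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                frontier.append(v)
    return dist

# Naive greedy sketch of RP selection: from candidate nodes on the sink's
# trajectory, repeatedly pick the one covering the most still-uncovered
# sensors within max_hops, until every sensor has a nearby RP (or none of
# the candidates can cover the remainder).
def greedy_rps(adj, candidates, max_hops):
    uncovered, rps = set(adj), []
    while uncovered:
        def coverage(c):
            return {n for n, d in hops_from(adj, c).items()
                    if d <= max_hops and n in uncovered}
        best = max(candidates, key=lambda c: len(coverage(c)))
        covered = coverage(best)
        if not covered:
            break          # remaining sensors unreachable from any candidate
        rps.append(best)
        uncovered -= covered
    return rps
```

    Capping `max_hops` bounds the per-sensor relay cost (and hence energy), while the greedy choice keeps the RP count small, which is exactly the tension the abstract formulates as an MILP.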

  13. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  14. A practice-driven systematic review of dependency analysis solutions

    NARCIS (Netherlands)

    Callo Arias, Trosky B.; Spek, Pieter van der; Avgeriou, Paris

    2011-01-01

    When following architecture-driven strategies to develop large software-intensive systems, the analysis of the dependencies is not an easy task. In this paper, we report a systematic literature review on dependency analysis solutions. Dependency analysis concerns making dependencies due to

  15. Event-driven Adaptation in COP

    Directory of Open Access Journals (Sweden)

    Pierpaolo Degano

    2016-06-01

    Full Text Available Context-Oriented Programming languages provide us with primitive constructs to adapt program behaviour depending on the evolution of their operational environment, namely the context. In previous work we proposed ML_CoDa, a context-oriented language with two components: a declarative constituent for programming the context and a functional one for computing. This paper describes an extension of ML_CoDa to deal with adaptation to unpredictable context changes notified by asynchronous events.
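    The mechanism this record describes, behavioural variations selected by the active context and switched by asynchronous events, can be sketched in a Python analogue. This is not ML_CoDa syntax; the dispatcher class, the `"default"` fallback context and the event-handler name are illustrative assumptions.

```python
# Illustrative Python analogue of context-oriented adaptation: behavioural
# variations are registered per (function, context) pair, dispatch picks the
# variation for the currently active context, and asynchronous events switch
# the active context at runtime.
class ContextDispatcher:
    def __init__(self, default):
        self.active = default
        self.variations = {}      # (function name, context) -> implementation

    def variation(self, name, context):
        def register(fn):
            self.variations[(name, context)] = fn
            return fn
        return register

    def call(self, name, *args):
        fn = self.variations.get((name, self.active))
        if fn is None:
            # no variation for the active context: fall back to the default
            fn = self.variations[(name, "default")]
        return fn(*args)

    def on_event(self, new_context):
        # an asynchronous context-change notification arrives here
        self.active = new_context
```

    After an `on_event("low_battery")` notification, subsequent calls transparently dispatch to the low-battery variation, which mirrors the adaptation-on-event behaviour the extension adds to ML_CoDa.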

  16. EDAS: An Evaluation Prototype for Autonomic Event-Driven Adaptive Security in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Waqas Aman

    2015-07-01

    Full Text Available In the Internet of Things (IoT), the main driving technologies are considered to be tiny sensory objects. These objects cannot host traditional preventive and detective technologies to provide protection against the increasing threat sophistication. Furthermore, these solutions are limited to analyzing particular contextual information, for instance network information or files, and do not provide holistic context for risk analysis and response. Analyzing a part of a situation may lead to false alarms and later to unnecessary and incorrect configurations. To overcome these concerns, we proposed an event-driven adaptive security (EDAS) model for IoT. EDAS aims to observe security events (changes) generated by various things in the monitored IoT environment, investigates any intentional or unintentional risks associated with the events and adapts to them autonomously. It correlates different events in time and space to reduce any false alarms and provides a mechanism to predict attacks before they are realized. Risks are responded to autonomically by utilizing a runtime adaptation ontology. The mitigation action is chosen after assessing essential information, such as the risk faced, user preferences, device capabilities and service requirements. Thus, it selects an optimal mitigation action in a particular adverse situation. The objective of this paper is to investigate EDAS feasibility and its aptitude as a real-world prototype in a remote patient monitoring context. It details how EDAS can be a practical choice for IoT-eHealth in terms of the security, design and implementation features it offers as compared to traditional security controls. We have explained the prototype’s major components and have highlighted the key technical challenges.

  17. Managing business compliance using model-driven security management

    Science.gov (United States)

    Lang, Ulrich; Schreiner, Rudolf

    Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time-consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated because of the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. Business Process Modelling (BPM)-driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we will illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.
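    The mapping step this record motivates, translating an abstract business-centric rule into concrete enforceable policy while keeping traceability, can be sketched as a tiny model transformation. The rule schema, model structure and field names are hypothetical illustrations, not the OpenPMF format.

```python
# Hypothetical sketch of a model-driven security transformation: one abstract
# compliance rule, expressed over roles and resource types, is expanded
# against a system model into concrete access-control entries, each carrying
# a trace back to the source rule for compliance auditing.
def transform(rule, model):
    """Expand one abstract rule against a system model into ACL entries."""
    entries = []
    for user in model["roles"].get(rule["role"], []):
        for resource in model["resource_types"].get(rule["resource_type"], []):
            entries.append({
                "user": user,
                "resource": resource,
                "action": rule["action"],
                "effect": rule["effect"],
                "trace": rule["id"],   # traceability back to the abstract rule
            })
    return entries
```

    Because the expansion is automated, a change in the system model (a new user in a role, a new resource) regenerates the low-level policy consistently, avoiding the manual, error-prone mapping the paper criticizes.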

  18. Architecture and development of the CDF hardware event builder

    International Nuclear Information System (INIS)

    Shaw, T.M.; Booth, A.W.; Bowden, M.

    1989-01-01

    A hardware Event Builder (EVB) has been developed for use at the Collider Detector experiment at the Fermi National Accelerator Laboratory (CDF). The Event Builder presently consists of five FASTBUS modules and has the task of reading out the front-end scanners, reformatting the data into YBOS bank structure, and transmitting the data to a Level 3 (L3) trigger system which is composed of multiple VME processing nodes. The Event Builder receives its instructions from a VAX-based Buffer Manager (BFM) program via a Unibus Processor Interface (UPI). The Buffer Manager instructs the Event Builder to read out one of the four CDF front-end buffers. The Event Builder then informs the Buffer Manager when the event has been formatted and is then instructed to push it up to the L3 trigger system. Once in the L3 system, a decision is made as to whether to write the event to tape.

  19. Model-driven Service Engineering with SoaML

    Science.gov (United States)

    Elvesæter, Brian; Carrez, Cyril; Mohagheghi, Parastoo; Berre, Arne-Jørgen; Johnsen, Svein G.; Solberg, Arnor

    This chapter presents a model-driven service engineering (MDSE) methodology that uses OMG MDA specifications such as BMM, BPMN and SoaML to identify and specify services within a service-oriented architecture. The methodology takes advantage of business modelling practices and provides a guide to service modelling with SoaML. The presentation is case-driven and illuminated using the telecommunication example. The chapter focuses in particular on the use of the SoaML modelling language as a means for expressing service specifications that are aligned with business models and can be realized in different platform technologies.

  20. Effect of Ceramic Scaffold Architectural Parameters on Biological Response

    Directory of Open Access Journals (Sweden)

    Maria Isabella Gariboldi

    2015-10-01

    Full Text Available Numerous studies have focused on the optimization of ceramic architectures to fulfill a variety of scaffold functional requirements and improve biological response. Conventional fabrication techniques, however, do not allow for the production of geometrically controlled, reproducible structures and often fail to allow the independent variation of individual geometric parameters. Current developments in additive manufacturing technologies suggest that 3D printing will allow a more controlled and systematic exploration of scaffold architectures. This more direct translation of design into structure requires a pipeline for design-driven optimization. A theoretical framework for systematic design and evaluation of architectural parameters on biological response is presented. Four levels of architecture are considered, namely (1) surface topography, (2) pore size and geometry, (3) porous networks and (4) macroscopic pore arrangement, including the potential for spatially varied architectures. Studies exploring the effect of various parameters within these levels are reviewed. This framework will hopefully allow uncovering of new relationships between architecture and biological response in a more systematic way, as well as inform future refinement of fabrication techniques to fulfill architectural necessities with a consideration of biological implications.

  1. Architectural Intention as the Mediator of Lean Housing Construction

    DEFF Research Database (Denmark)

    Frier, Marie; Kirkegaard, Poul Henning; Fisker, Anna Marie

    2008-01-01

    In recent years a number of companies have taken up the challenge of producing prefab houses using lean principles, hereby incorporating value-driven production theory as the means to optimize construction processes. However, the value of home is dependent on architectural qualities and interior ...

  2. A First-level Event Selector for the CBM Experiment at FAIR

    International Nuclear Information System (INIS)

    Cuveland, J de; Lindenstruth, V

    2011-01-01

    The CBM experiment at the upcoming FAIR accelerator aims to create highest baryon densities in nucleus-nucleus collisions and to explore the properties of super-dense nuclear matter. Event rates of 10 MHz are needed for high-statistics measurements of rare probes, while event selection requires complex global triggers like secondary vertex search. To meet these demands, the CBM experiment uses self-triggered detector front-ends and a data push readout architecture. The First-level Event Selector (FLES) is the central physics selection system in CBM. It receives all hits and performs online event selection on the 1 TByte/s input data stream. The event selection process requires high-throughput event building and full event reconstruction using fast, vectorized track reconstruction algorithms. The current FLES architecture foresees a scalable high-performance computer. To achieve the high throughput and computation efficiency, all available computing devices will have to be used, in particular FPGAs at the first stages of the system and heterogeneous many-core architectures such as CPUs for efficient track reconstruction. A high-throughput network infrastructure and flow control in the system are other key aspects. In this paper, we present the foreseen architecture of the First-level Event Selector.

  3. Digital architecture, wearable computers and providing affinity

    DEFF Research Database (Denmark)

    Guglielmi, Michel; Johannesen, Hanne Louise

    2005-01-01

    as the setting for the events of experience. Contemporary architecture is a meta-space residing in almost any thinkable field, striving to blur boundaries between art, architecture, design and urbanity and break down the distinction between the material and the user or inhabitant. The presentation for this paper will, through research, a workshop and participation in a Cumulus competition, focus on the exploration of boundaries between digital architecture, performative space and wearable computers. Our design method in general focuses on the interplay between the performing body and the environment – between

  4. Synchronous diversification of Sulawesi's iconic artiodactyls driven by recent geological events.

    Science.gov (United States)

    Frantz, Laurent A F; Rudzinski, Anna; Nugraha, Abang Mansyursyah Surya; Evin, Allowen; Burton, James; Hulme-Beaman, Ardern; Linderholm, Anna; Barnett, Ross; Vega, Rodrigo; Irving-Pease, Evan K; Haile, James; Allen, Richard; Leus, Kristin; Shephard, Jill; Hillyer, Mia; Gillemot, Sarah; van den Hurk, Jeroen; Ogle, Sharron; Atofanei, Cristina; Thomas, Mark G; Johansson, Friederike; Mustari, Abdul Haris; Williams, John; Mohamad, Kusdiantoro; Damayanti, Chandramaya Siska; Wiryadi, Ita Djuwita; Obbles, Dagmar; Mona, Stephano; Day, Hally; Yasin, Muhammad; Meker, Stefan; McGuire, Jimmy A; Evans, Ben J; von Rintelen, Thomas; Ho, Simon Y W; Searle, Jeremy B; Kitchener, Andrew C; Macdonald, Alastair A; Shaw, Darren J; Hall, Robert; Galbusera, Peter; Larson, Greger

    2018-04-11

    The high degree of endemism on Sulawesi has previously been suggested to have vicariant origins, dating back to 40 Ma. Recent studies, however, suggest that much of Sulawesi's fauna assembled over the last 15 Myr. Here, we test the hypothesis that more recent uplift of previously submerged portions of land on Sulawesi promoted diversification and that much of its faunal assemblage is much younger than the island itself. To do so, we combined palaeogeographical reconstructions with genetic and morphometric datasets derived from Sulawesi's three largest mammals: the babirusa, anoa and Sulawesi warty pig. Our results indicate that although these species most likely colonized the area that is now Sulawesi at different times (14 Ma to 2-3 Ma), they experienced an almost synchronous expansion from the central part of the island. Geological reconstructions indicate that this area was above sea level for most of the last 4 Myr, unlike most parts of the island. We conclude that emergence of land on Sulawesi (approx. 1-2 Myr) may have allowed species to expand synchronously. Altogether, our results indicate that the establishment of the highly endemic faunal assemblage on Sulawesi was driven by geological events over the last few million years. © 2018 The Authors.

  5. Roadmap to the SRS computing architecture

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, A.

    1994-07-05

    This document outlines the major steps that must be taken by the Savannah River Site (SRS) to migrate the SRS information technology (IT) environment to the new architecture described in the Savannah River Site Computing Architecture. This document proposes an IT environment that is "...standards-based, data-driven, and workstation-oriented, with larger systems being utilized for the delivery of needed information to users in a client-server relationship." Achieving this vision will require many substantial changes in the computing applications, systems, and supporting infrastructure at the site. This document consists of a set of roadmaps which provide explanations of the necessary changes for IT at the site and describes the milestones that must be completed to finish the migration.

  6. New Energy Architecture. Myanmar

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-06-15

    A global transition towards a new energy architecture is under way, driven by countries' need to respond to the changing dynamics of economic growth, environmental sustainability and energy security. The World Economic Forum, in collaboration with Accenture, has created the New Energy Architecture Initiative to address and accelerate this transition. The Initiative supports the development of national strategies and policy frameworks as countries seek to achieve the combined goals of energy security and access, sustainability, and economic growth and development. The World Economic Forum has formed a partnership with the Ministry of Energy of Myanmar to help apply the Initiative's approach to this developing and resource-rich nation. The Asian Development Bank and the World Economic Forum's Project Adviser, Accenture, have collaborated with the Forum on this consultation process, and have been supported by relevant government, industry and civil society stakeholders. The consultation process aims to understand the nation's current energy architecture challenges and provide an overview of a path to a New Energy Architecture through a series of insights. These insights could form the basis for a long-term multistakeholder roadmap to build Myanmar's energy sector in a way that is secure and sustainable, and promotes economic growth as the country makes its democratic transition. While not all recommendations can be implemented in the near term, they do provide options for creating a prioritized roadmap for Myanmar's energy transition. This report is the culmination of a nine-month multistakeholder process investigating Myanmar's energy architecture. Over the course of many visits to the country, the team has conducted numerous interviews, multistakeholder workshops, and learning and data-gathering exercises to ensure a comprehensive range of information and views. 
The team has also engaged with a variety of stakeholders to better inform their findings, which have come

  8. Framework for Infectious Disease Analysis: A comprehensive and integrative multi-modeling approach to disease prediction and management.

    Science.gov (United States)

    Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark

    2017-12-01

    The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
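
    The biosurveillance text-mining step described above can be pictured with a minimal sketch. The disease and location lexicons and the sentence co-occurrence rule below are invented stand-ins; the actual framework uses trained NLP models and machine learning:

```python
import re

# Hypothetical keyword lexicons; a production system would use trained NLP.
DISEASES = {"influenza", "dengue", "cholera"}
LOCATIONS = {"jakarta", "lagos", "houston"}

def extract_signals(text):
    """Very rough information extraction for biosurveillance: return
    (disease, location) pairs that co-occur within one sentence."""
    signals = []
    for sentence in re.split(r"[.!?]", text.lower()):
        words = set(re.findall(r"[a-z]+", sentence))
        for d in DISEASES & words:
            for loc in LOCATIONS & words:
                signals.append((d, loc))
    return signals

feed = "Officials report a dengue outbreak in Jakarta. Rainfall is heavy."
print(extract_signals(feed))  # → [('dengue', 'jakarta')]
```

    Signals extracted this way would then feed the framework's situational-awareness and prediction stages.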

  9. Security in the Cache and Forward Architecture for the Next Generation Internet

    Science.gov (United States)

    Hadjichristofi, G. C.; Hadjicostis, C. N.; Raychaudhuri, D.

    The future Internet architecture will be composed predominantly of wireless devices. It is evident at this stage that the TCP/IP protocol, developed decades ago, will not properly support the required network functionalities, since contemporary communication profiles tend to be data-driven rather than host-based. To address this paradigm shift in data propagation, a next generation architecture has been proposed, the Cache and Forward (CNF) architecture. This research investigates security aspects of this new Internet architecture. More specifically, we discuss content privacy, secure routing, key management and trust management. We identify security weaknesses of this architecture that need to be addressed and we derive security requirements that should guide future research directions. Aspects of the research can be adopted as a stepping-stone as we build the future Internet.

  10. Baseline Preferences for Daily, Event-Driven, or Periodic HIV Pre-Exposure Prophylaxis among Gay and Bisexual Men in the PRELUDE Demonstration Project

    Directory of Open Access Journals (Sweden)

    Stefanie J. Vaccher

    2017-12-01

    Full Text Available Introduction: The effectiveness of daily pre-exposure prophylaxis (PrEP) is well established. However, there has been increasing interest in non-daily dosing schedules among gay and bisexual men (GBM). This paper explores preferences for PrEP dosing schedules among GBM at baseline in the PRELUDE demonstration project. Materials and methods: Individuals at high risk of HIV were enrolled in a free PrEP demonstration project in New South Wales, Australia, between November 2014 and April 2016. At baseline, they completed an online survey containing detailed behavioural, demographic, and attitudinal questions, including their ideal way to take PrEP: daily (one pill taken every day), event-driven (pills taken only around specific risk events), or periodic (daily dosing during periods of increased risk). Results: Overall, 315 GBM (98% of the study sample) provided a preferred PrEP dosing schedule at baseline. One-third of GBM expressed a preference for non-daily PrEP dosing: 20% for event-driven PrEP, and 14% for periodic PrEP. Individuals with a trade/vocational qualification were more likely to prefer periodic to daily PrEP [adjusted odds ratio (aOR) = 4.58, 95% confidence interval (95% CI): 1.68-12.49], compared to individuals whose highest level of education was high school. Having an HIV-positive main regular partner was associated with a strong preference for daily, compared to event-driven PrEP [aOR = 0.20, 95% CI: 0.04-0.87]. Participants who rated themselves better at taking medications were more likely to prefer daily over periodic PrEP [aOR = 0.39, 95% CI: 0.20-0.76]. Discussion: Individuals' preferences for PrEP schedules are associated with demographic and behavioural factors that may impact on their ability to access health services and information about PrEP, and patterns of HIV risk. At the time of data collection, there were limited data available about the efficacy of non-daily PrEP schedules, and clinicians only recommended daily PrEP to
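
    The adjusted odds ratios quoted above come from multivariable logistic regression; for intuition, here is a minimal sketch of the unadjusted version computed from a 2x2 table (the counts below are hypothetical, not the study's data):

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Unadjusted odds ratio from a 2x2 table: the odds of the outcome in
    the exposed group divided by the odds in the unexposed group. (An
    adjusted OR additionally controls for covariates via regression.)"""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Hypothetical counts: 9 of 45 trade-qualified participants prefer
# periodic PrEP, versus 10 of 170 high-school-educated participants.
print(odds_ratio(9, 36, 10, 160))  # → 4.0
```

    An OR above 1 means the exposure is associated with higher odds of the preference; the bracketed 95% CI in the abstract quantifies the uncertainty around each estimate.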

  11. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally deploying and tuning the applications. In many applications of this type, the generated components coincide from application to application. Current trends in software engineering such as MDE, MDA, and MDD aim to automate the generation of applications by structuring a model and applying transformations until the application is obtained. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned in this summary of architectural models.

  12. Event processing for business organizing the real-time enterprise

    CERN Document Server

    Luckham, David C

    2011-01-01

    Find out how Event Processing (EP) works and how it can work for you. Business Event Processing: An Introduction and Strategy Guide thoroughly describes what EP is, how to use it, and how it relates to other popular information technology architectures such as Service Oriented Architecture. It explains how sense-and-respond architectures are being applied with tremendous results to businesses throughout the world, shows businesses how they can get started implementing EP, and shows how to choose business event processing technology to suit your specific business needs and how to keep the costs of adopting it

  13. A Generic Architecture for Autonomous Uninhabited Vehicles

    National Research Council Canada - National Science Library

    Barbier, Magali; Gabard, Jean-Francois; Ayreault, Herve

    2007-01-01

    ...; few solutions propose an architecture adaptable to several types of platform. Autonomous vehicles that move in partially known and dynamic environments have to deal with asynchronous disruptive events...

  14. Production experience with the ATLAS Event Service

    Science.gov (United States)

    Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real-time delivery of fine-grained workloads to running payload applications, which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture, the AES is currently being used by grid sites to assign low-priority workloads to otherwise idle computing resources, to harvest HPC resources in an efficient back-fill mode, and to scale out massively to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and the architecture of the Event Service, we will report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending the ES application beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
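
    The dispatch-and-stream pattern behind the AES can be sketched in a few lines. The names (`make_event_ranges`, the dict standing in for an object store) are illustrative, not the AES API; the point is that each fine-grained range is dispatched independently and its output is persisted immediately, so losing a transient worker loses only the range in flight:

```python
from queue import Queue

def make_event_ranges(n_events, chunk):
    """Split a job of n_events into fine-grained event ranges."""
    return [(i, min(i + chunk, n_events)) for i in range(0, n_events, chunk)]

object_store = {}          # stand-in for a scalable object store
dispatch = Queue()
for r in make_event_ranges(10, 4):
    dispatch.put(r)

def worker(name):
    """Process one dispatched range and stream the output immediately,
    so completed work survives even if this (transient) resource vanishes."""
    start, end = dispatch.get()
    object_store[f"{name}:{start}-{end}"] = [e * e for e in range(start, end)]

worker("spot-1"); worker("spot-2"); worker("hpc-1")
```

    A real event service also handles retries for lost ranges and merges the per-range outputs afterwards; both are omitted here.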

  15. Wavy Architecture Thin-Film Transistor for Ultrahigh Resolution Flexible Displays

    KAUST Repository

    Hanna, Amir Nabil; Kutbee, Arwa Talal; Subedi, Ram Chandra; Ooi, Boon S.; Hussain, Muhammad Mustafa

    2017-01-01

    A novel wavy-shaped thin-film-transistor (TFT) architecture, capable of achieving 70% higher drive current per unit chip area when compared with planar conventional TFT architectures, is reported for flexible display application. The transistor, due to its atypical architecture, does not alter the turn-on voltage or the OFF current values, leading to higher performance without compromising static power consumption. The concept behind this architecture is expanding the transistor's width vertically through grooved trenches in a structural layer deposited on a flexible substrate. Operation of zinc oxide (ZnO)-based TFTs is shown down to a bending radius of 5 mm with no degradation in the electrical performance or cracks in the gate stack. Finally, flexible low-power LEDs driven by the respective currents of the novel wavy, and conventional coplanar architectures are demonstrated, where the novel architecture is able to drive the LED at 2 × the output power, 3 versus 1.5 mW, which demonstrates the potential use for ultrahigh resolution displays in an area efficient manner.

  17. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced order model has the form of projected equations into a low-dimensional subspace that still contains important dynamical information about the system and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in
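
    The blending logic, keeping the imperfect model as a baseline everywhere and trusting the learned correction only where training data was dense, can be sketched minimally. The toy linear system, the `correction` function standing in for the LSTM-RNN, and the density threshold are all invented for illustration:

```python
def hybrid_step(x, imperfect_model, correction, data_density, threshold=0.5):
    """One prediction step of a blended reduced-order model: the imperfect
    model always provides a baseline; the learned correction (an LSTM-RNN
    in the paper, a plain function here) is applied only where data is dense."""
    base = imperfect_model(x)
    if data_density(x) >= threshold:
        return base + correction(x)
    return base

# Toy system: true dynamics x -> 1.5*x; the imperfect model knows only x -> x.
true_next = lambda x: 1.5 * x
model = lambda x: 1.0 * x
corr = lambda x: 0.5 * x                           # learned model-data mismatch
density = lambda x: 1.0 if abs(x) < 10 else 0.0    # data is sparse for extremes

assert hybrid_step(2.0, model, corr, density) == true_next(2.0)  # data-rich
assert hybrid_step(50.0, model, corr, density) == model(50.0)    # baseline only
```

    In the data-rich region the corrected prediction matches the true dynamics; for the extreme (data-sparse) state the sketch falls back to the imperfect model, exactly the behaviour the abstract describes.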

  19. A data acquisition architecture for the SSC

    International Nuclear Information System (INIS)

    Partridge, R.

    1990-01-01

    An SSC data acquisition architecture applicable to high-pT detectors is described. The architecture is based upon a small set of design principles that were chosen to simplify communication between data acquisition elements while providing the required level of flexibility and performance. The architecture features an integrated system for data collection, event building, and communication with a large processing farm. The interface to the front end electronics system is also discussed. A set of design parameters is given for a data acquisition system that should meet the needs of high-pT detectors at the SSC

  20. Design of a Scalable Event Notification Service: Interface and Architecture

    National Research Council Canada - National Science Library

    Carzaniga, Antonio; Rosenblum, David S; Wolf, Alexander L

    1998-01-01

    Event-based distributed systems are programmed to operate in response to events. An event notification service is an application-independent infrastructure that supports the construction of event-based systems...
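
    The core idea of an application-independent notification service, in which subscribers register content-based predicates and the service routes matching event notifications to them, can be sketched minimally (illustrative names, not the paper's interface; a scalable design such as Siena additionally distributes this matching across servers):

```python
class NotificationService:
    """Minimal content-based publish/subscribe sketch. Subscribers
    register predicates over event attributes; publish() delivers each
    event to every subscriber whose predicate matches."""

    def __init__(self):
        self.subs = []                       # (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        self.subs.append((predicate, callback))

    def publish(self, event):
        for predicate, callback in self.subs:
            if predicate(event):
                callback(event)

svc = NotificationService()
received = []
svc.subscribe(lambda e: e.get("type") == "alarm" and e.get("level", 0) > 3,
              received.append)
svc.publish({"type": "alarm", "level": 5})   # delivered
svc.publish({"type": "alarm", "level": 1})   # filtered out
```

    The interesting engineering problem, which this sketch ignores, is evaluating many such predicates efficiently across a wide-area network of servers.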

  1. 3D ARCHITECTURAL VIDEOMAPPING

    Directory of Open Access Journals (Sweden)

    R. Catanese

    2013-07-01

    Full Text Available 3D architectural mapping is a video projection technique that relies on a survey of a chosen building in order to achieve a perfect correspondence between its shapes and the projected images. As a performative kind of audiovisual artifact, the real event of the 3D mapping is a combination of a registered video animation file with a real architecture. This new kind of visual art is becoming very popular, and its large audience success testifies to new expressive possibilities in the field of urban design. My case study was carried out in Pisa for the Luminara feast in 2012.

  2. Architectural Lessons: Look Back In Order To Move Forward

    Science.gov (United States)

    Huang, T.; Djorgovski, S. G.; Caltagirone, S.; Crichton, D. J.; Hughes, J. S.; Law, E.; Pilone, D.; Pilone, T.; Mahabal, A.

    2015-12-01

    True elegance of scalable and adaptable architecture is not about incorporating the latest and greatest technologies. Its elegance is measured by its ability to scale and adapt as its operating environment evolves over time. Architecture is the link that bridges people, process, policies, interfaces, and technologies. Architectural development begins by observing the relationships which really matter to the problem domain. It proceeds with the creation of a single, shared, evolving pattern language, which everyone contributes to and everyone can use [C. Alexander, 1979]. Architects are true artists. Like all masterpieces, the value and strength of an architecture are measured not by the volume of publications but by its ability to evolve. An architect must look back in order to move forward. This talk discusses some prior works, including an onboard data analysis system, a knowledge-base system, and a cloud-based Big Data platform, as enablers that help shape the new generation of Earth Science projects at NASA and EarthCube, where a community-driven architecture is the key to enabling data-intensive science. [C. Alexander, The Timeless Way of Building, Oxford University Press, 1979.]

  3. Storyboard as a Representation of Urban Architectural Settings

    Directory of Open Access Journals (Sweden)

    Rahman Wahid Arif

    2018-01-01

    This paper aims to explore the potential of storyboarding practice in the Basic Design 2 studio as part of architectural education at the University of Indonesia. Adopting a narrative element, the storyboard in this studio is used to read urban architectural settings and retell everyday life events, scene by scene, unfolding in space and time, through different kinds of creative representations. By doing this exercise, the students' sense of spatial arrangement is developed through their understanding of the position and orientation of objects in settings. They also learn how time works, whether compressed or expanded. Decision-making in choosing the key events within the storyboard plays a role in making engaging visuals. In conclusion, a storyboarding exercise to represent urban architectural settings will enhance the students' sensitivity to space and time, and to how their ideas are told, by creating a rich, multi-layered narrative.

  4. Data-driven workflows for microservices

    DEFF Research Database (Denmark)

    Safina, Larisa; Mazzara, Manuel; Montesi, Fabrizio

    2016-01-01

    Microservices is an architectural style inspired by service-oriented computing that has recently started gaining popularity. Jolie is a programming language based on the microservices paradigm: the main building blocks of Jolie systems are services, in contrast to, e.g., functions or objects....... The primitives offered by the Jolie language elicit many of the recurring patterns found in microservices, like load balancers and structured processes. However, Jolie still lacks some useful constructs for dealing with message types and data manipulation that are present in service-oriented computing......). We show the impact of our implementation on some of the typical scenarios found in microservice systems. This shows how computation can move from a process-driven to a data-driven approach, and leads to the preliminary identification of recurring communication patterns that can be shaped as design...

  5. The Premises of the Event. Are architectural competitions incubators for events?

    Directory of Open Access Journals (Sweden)

    Loïse Lenne

    2013-12-01

    Full Text Available Around 1980, two important competitions were launched on both sides of the Channel. One led to the Grande Arche in La Défense (Johann Otto von Spreckelsen, 1982–1989 and the other to the Lloyd’s building of London (Richard Rogers Partnership, 1977–1986. Recalling the history of these two projects, we will try, in this article, to show how the programmes, their formulation, the methods used and, above all, the culture of the various actors influenced both the decisions and the built results. At the end of the paper, we propose to see these buildings as events. Based on the analysis of these competitions, we will show that these buildings can then be considered as belonging to two different categories – historical and spatial event – that we will define.

  6. Implementing a Dynamic Database-Driven Course Using LAMP

    Science.gov (United States)

    Laverty, Joseph Packy; Wood, David; Turchek, John

    2011-01-01

    This paper documents the formulation of a database-driven, open-source-architecture web development course. The design of a web-based curriculum faces many challenges: a) the relative emphasis of client- and server-side technologies, b) the choice of a server-side language, and c) the cost and efficient delivery of a dynamic web development, database-driven…

  7. Fostering Organizational Innovation based on modeling the Marketing Research Process through Event-driven Process Chain (EPC

    Directory of Open Access Journals (Sweden)

    Elena Fleacă

    2016-11-01

    Full Text Available Enterprises competing in the current business environment are required to win and maintain their competitiveness through flexibility, fast reaction, and conformance to changing customers' needs, based on innovation in products, services, and internal processes. The paper addresses these challenges, which become more complex under high pressure for innovation. The methodology commences with a literature review of the current knowledge on innovation through business process management. Secondly, the Event-driven Process Chain tool from the scientific literature is applied to model the variables of the marketing research process. The findings highlight the benefits of a marketing research workflow that enhances the value of market information while reducing the cost of obtaining it, in a coherent way.

  8. Model-driven development of smart grid services using SoaML

    DEFF Research Database (Denmark)

    Kosek, Anna Magdalena; Gehrke, Oliver

    2014-01-01

    This paper presents a model-driven software development process which can be applied to the design of smart grid services. The Service Oriented Architecture Modelling Language (SoaML) is used to describe the architecture as well as the roles and interactions between service participants....... The individual modelling steps and an example design of a SoaML model for a voltage control service are presented and explained. Finally, the paper discusses a proof-of-concept implementation of the modelled service in a smart grid testing laboratory....

  9. ONTOLOGY-DRIVEN TOOL FOR UTILIZING PROGRAMMING STYLES

    Directory of Open Access Journals (Sweden)

    Nikolay Sidorov

    2017-07-01

    Full Text Available The activities of a programmer are more effective, and the software more understandable, when programming styles (standards) that provide clarity of software texts are used during software development. Purpose: In this research, we present a tool that realizes a new ontology-based methodology of automated reasoning techniques for utilizing programming styles. In particular, we focus on representing programming styles in the form of formal ontologies, and study how a description logic reasoner can assist programmers in utilizing programming standards. Our research hypothesis is as follows: an ontological representation of programming styles can provide additional benefits over existing approaches in helping programmers utilize programming standards. Our research goal is to develop a tool to support ontology-based utilization of programming styles. Methods: ontological representation of programming styles; object-oriented programming; ontology-driven utilization of programming styles. Results: an architecture was obtained and a tool was developed in the Java language, which provides tool support for the ontology-driven programming style application method. Features of the implementation and application of the tool are illustrated on the example of the naming standard of the Java programming language. Discussion: application of programming styles in program coding; the lack of automated tools for applying programming standards; a tool based on the new method of ontology-driven application of programming styles; an example implementation of the tool architecture for the naming rules of the Java language standard.
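
    A rule-based flavour of the idea, checking identifiers against a small subset of the Java naming conventions, can be sketched as follows (plain regular expressions stand in for the paper's ontology and description-logic reasoner; the rule set is an illustrative subset, not the full standard):

```python
import re

# Illustrative subset of Java naming conventions, encoded as checkable
# rules the way an ontology-backed checker might surface them.
RULES = {
    "class":    re.compile(r"^[A-Z][A-Za-z0-9]*$"),   # UpperCamelCase
    "method":   re.compile(r"^[a-z][A-Za-z0-9]*$"),   # lowerCamelCase
    "constant": re.compile(r"^[A-Z][A-Z0-9_]*$"),     # UPPER_SNAKE_CASE
}

def check_name(kind, name):
    """Return True if `name` satisfies the naming rule for `kind`."""
    return bool(RULES[kind].match(name))

assert check_name("class", "HttpServer")
assert not check_name("class", "httpServer")
assert check_name("constant", "MAX_RETRIES")
```

    The ontological approach goes further than such regexes: rules become axioms a reasoner can combine, query, and explain, rather than opaque patterns.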

  10. Savannah River Site computing architecture

    Energy Technology Data Exchange (ETDEWEB)

    1991-03-29

    A computing architecture is a framework for making decisions about the implementation of computer technology and the supporting infrastructure. Because of the size, diversity, and amount of resources dedicated to computing at the Savannah River Site (SRS), there must be an overall strategic plan that can be followed by the thousands of site personnel who make decisions daily that directly affect the SRS computing environment and impact the site's production and business systems. This plan must address the following requirements: There must be SRS-wide standards for procurement or development of computing systems (hardware and software). The site computing organizations must develop systems that end users find easy to use. Systems must be put in place to support the primary function of site information workers. The developers of computer systems must be given tools that automate and speed up the development of information systems and applications based on computer technology. This document describes a proposal for a site-wide computing architecture that addresses the above requirements. In summary, this architecture is standards-based, data-driven, and workstation-oriented, with larger systems being utilized for the delivery of needed information to users in a client-server relationship.

  12. Peripheral visual feedback: a powerful means of supporting effective attention allocation in event-driven, data-rich environments.

    Science.gov (United States)

    Nikolic, M I; Sarter, N B

    2001-01-01

    Breakdowns in human-automation coordination in data-rich, event-driven domains such as aviation can be explained in part by a mismatch between the high degree of autonomy yet low observability of modern technology. To some extent, the latter is the result of an increasing reliance in feedback design on foveal vision--an approach that fails to support pilots in tracking system-induced changes and events in parallel with performing concurrent flight-related tasks. One possible solution to the problem is the distribution of tasks and information across sensory modalities and processing channels. A simulator study is presented that compared the effectiveness of current foveal feedback and two implementations of peripheral visual feedback for keeping pilots informed about uncommanded changes in the status of an automated cockpit system. Both peripheral visual displays resulted in higher detection rates and faster response times, without interfering with the performance of concurrent visual tasks any more than does currently available automation feedback. Potential applications include improved display designs that support effective attention allocation in a variety of complex dynamic environments, such as aviation, process control, and medicine.

  13. Real-time hypothesis driven feature extraction on parallel processing architectures

    DEFF Research Database (Denmark)

    Granmo, O.-C.; Jensen, Finn Verner

    2002-01-01

    the problem of higher-order feature-content/feature-feature correlation, causally complexly interacting features are identified through Bayesian network d-separation analysis and combined into joint features. When used on a moderately complex object-tracking case, the technique is able to select...... extraction, which selectively extract relevant features one-by-one, have in some cases achieved real-time performance on single processing element architectures. In this paper we propose a novel technique which combines the above two approaches. Features are selectively extracted in parallelizable sets...

  14. Active House: an all active eco-architecture building envelope concept

    NARCIS (Netherlands)

    Zeiler, W.

    2008-01-01

    The present trend in energy efficient eco-architecture dwellings is the passive house concept. The ventilation of many of these passive houses is critical. The development of sustainable buildings is driven by the need to preserve the balance of nature. The ventilation of many of these passive

  15. The Reification of Patterns in the Design of Description-Driven Systems

    CERN Document Server

    Le Goff, J M; Kovács, Z; McClatchey, R

    2001-01-01

    To address the issues of reusability and evolvability in designing self- describing systems, this paper proposes a pattern-based, object-oriented, description-driven system architecture. The proposed architecture embodies four pillars - first, the adoption of a multi-layered meta-modeling architecture and reflective meta-level architecture, second, the identification of four data modeling relationships that must be made explicit such that they can be examined and modified dynamically, third, the identification of five design patterns which have emerged from practice and have proved essential in providing reusable building blocks for data management, and fourth, the encoding of the structural properties of the five design patterns by means of one pattern, the Graph pattern. The CRISTAL research project served as the basis onto which the pattern-based meta-object approach has been applied. The proposed architecture allows the realization of reusability and adaptability, and is fundamental in the specification o...

  16. RAS Initiative - Events

    Science.gov (United States)

    The NCI RAS Initiative has organized multiple events with outside experts to discuss how the latest scientific and technological breakthroughs can be applied to discover vulnerabilities in RAS-driven cancers.

  17. Accuracy-Energy Configurable Sensor Processor and IoT Device for Long-Term Activity Monitoring in Rare-Event Sensing Applications

    Directory of Open Access Journals (Sweden)

    Daejin Park

    2014-01-01

    A specially designed sensor processor used as the main processor in an IoT (internet-of-things) device for rare-event sensing applications is proposed. The IoT device including the proposed sensor processor performs event-driven sensor data processing based on an accuracy-energy configurable event quantization at the architectural level. The received sensor signal is converted into a sequence of atomic events, which is extracted by the signal-to-atomic-event generator (AEG). Using an event signal processing unit (EPU) as an accelerator, the extracted atomic events are analyzed to build the final event. Instead of transmitting the sampled raw data via the internet, the proposed method delays communication with a host system until a semantic pattern of the signal is identified as a final event. The proposed processor is implemented on a single chip, tightly coupled at the bus-connection level with a microcontroller using a 0.18 μm CMOS embedded-flash process. For experimental results, we evaluated the proposed sensor processor using an IR (infrared radio) based signal reflection and sensor signal acquisition system. We successfully demonstrated that the expected power consumption is in the range of 20% to 50% of the baseline while allowing a 10% accuracy error.
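
    The atomic-event pipeline described above can be caricatured in a few lines: raw samples are reduced to threshold-crossing events (the AEG stage), and a final event is reported only when a semantic pattern is recognized (the EPU stage). This is a minimal sketch of the general idea, not the chip's actual algorithm; the threshold and the "pulse" pattern are assumptions.

```python
def atomic_events(samples, threshold):
    """AEG stage, simplified: emit 'rise'/'fall' atomic events at
    threshold crossings instead of forwarding every raw sample."""
    events, above = [], False
    for i, s in enumerate(samples):
        if s >= threshold and not above:
            events.append(("rise", i)); above = True
        elif s < threshold and above:
            events.append(("fall", i)); above = False
    return events

def final_event(events):
    """EPU stage, simplified: a 'pulse' final event is one rise followed
    by one fall; only then would the host be contacted."""
    kinds = [k for k, _ in events]
    return "pulse" if kinds == ["rise", "fall"] else None

samples = [0, 1, 5, 9, 8, 3, 1, 0]   # one pulse in the raw signal
ev = atomic_events(samples, threshold=4)
print(ev, final_event(ev))            # two atomic events instead of 8 samples
```

    Raising or lowering the threshold (or coarsening the quantization) is the accuracy-energy knob: fewer events mean less processing and communication at the cost of signal fidelity.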

  18. Extreme weather events and infectious disease outbreaks.

    Science.gov (United States)

    McMichael, Anthony J

    2015-01-01

    Human-driven climatic changes will fundamentally influence patterns of human health, including infectious disease clusters and epidemics following extreme weather events. Extreme weather events are projected to increase further with the advance of human-driven climate change. Both recent and historical experiences indicate that infectious disease outbreaks very often follow extreme weather events, as microbes, vectors and reservoir animal hosts exploit the disrupted social and environmental conditions of extreme weather events. This review article examines infectious disease risks associated with extreme weather events; it draws on recent experiences including Hurricane Katrina in 2005 and the 2010 Pakistan mega-floods, and historical examples from previous centuries of epidemics and 'pestilence' associated with extreme weather disasters and climatic changes. A fuller understanding of climatic change, the precursors and triggers of extreme weather events and health consequences is needed in order to anticipate and respond to the infectious disease risks associated with human-driven climate change. Post-event risks to human health can be constrained, nonetheless, by reducing background rates of persistent infection, preparatory action such as coordinated disease surveillance and vaccination coverage, and strengthened disaster response. In the face of changing climate and weather conditions, it is critically important to think in ecological terms about the determinants of health, disease and death in human populations.

  19. An Architectural Approach towards Innovative Renewable Energy Infrastructure in Kapisillit, Greenland

    DEFF Research Database (Denmark)

    Carruth, Susan; Krogh, Peter

    2014-01-01

    workshop with architecture students who were asked to create conceptual strategies, driven by distributed, community-controlled renewable energy, for the future of the village. It culminates in a discussion on how this empirical work contributes towards the construction of a vocabulary of material...

  20. Digital column readout architectures for hybrid pixel detector readout chips

    International Nuclear Information System (INIS)

    Poikela, T; Plosila, J; Westerlund, T; Buytaert, J; Campbell, M; Gaspari, M De; Llopart, X; Wyllie, K; Gromov, V; Kluit, R; Beuzekom, M van; Zappon, F; Zivkovic, V; Brezina, C; Desch, K; Fu, Y; Kruth, A

    2014-01-01

    In this paper, two digital column architectures suitable for sparse readout of data from a pixel matrix in trigger-less applications are presented. Each architecture reads out a pixel matrix of 256 × 256 pixels with a pixel pitch of 55 μm. The first architecture has been implemented in the Timepix3 chip, and this is presented together with initial measurements. Simulation results and measured data are compared. The second architecture has been designed for Velopix, a readout chip planned for the LHCb VELO upgrade. Unlike Timepix3, this has to be tolerant to radiation-induced single-event effects. Results from post-layout simulations are shown with the circuit architectures.
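
    The core of sparse (zero-suppressed) readout, as implemented by such column architectures, is that only hit pixels produce data packets carrying their own address. A toy sketch follows; the packet layout is invented for illustration and is not the chips' actual data format.

```python
def sparse_readout(matrix):
    """Zero-suppressed readout: only pixels with a nonzero time-over-threshold
    (ToT) value produce packets; each packet carries its own address."""
    packets = []
    for row, cols in enumerate(matrix):
        for col, tot in enumerate(cols):
            if tot > 0:  # hit pixel
                packets.append({"row": row, "col": col, "tot": tot})
    return packets

# A tiny 4x4 matrix standing in for the 256x256 pixel matrix.
matrix = [[0, 0, 0, 0],
          [0, 7, 0, 0],
          [0, 0, 0, 3],
          [0, 0, 0, 0]]
print(sparse_readout(matrix))  # two packets instead of 16 pixel values
```

    In a trigger-less system this suppression is what keeps the output bandwidth proportional to the hit rate rather than to the matrix size.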

  1. A new practice-driven approach to develop software in a cyber-physical system environment

    Science.gov (United States)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

    Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware is the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, and it needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice, and has been evolved from software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference when compared with OOA is that the proposed approach has different emphases and measures in every stage. It is more in accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.
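
    The event-oriented emphasis of the approach can be hinted at with a minimal publish/subscribe dispatch of the kind commonly used in smart home software. The event name and handler below are illustrative only and are not taken from the paper's case study.

```python
# Minimal event-driven dispatch: handlers register for named events,
# and emitting an event invokes every registered handler.
handlers = {}

def on(event_name):
    """Decorator that registers a handler for the given event name."""
    def register(fn):
        handlers.setdefault(event_name, []).append(fn)
        return fn
    return register

def emit(event_name, payload):
    """Deliver the payload to all handlers of the event; collect results."""
    return [fn(payload) for fn in handlers.get(event_name, [])]

@on("door_opened")
def turn_on_hall_light(payload):
    return f"hall light on at {payload['time']}"

print(emit("door_opened", {"time": "18:05"}))
```

    Designing requirements and tests around such events, rather than around objects, is the shift the proposed process advocates.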

  2. Architecture as mediator of culture, democracy and hope: XXIII World Congress of Architecture in Turin

    Directory of Open Access Journals (Sweden)

    Bogdanov Ana

    2009-01-01

    The XXIII World Congress of Architecture organized by the International Union of Architects (UIA) was held in Turin from June 29 to July 3. Every third year this great architectural event gathers more than one thousand professionals and students from the 126 countries that the UIA encompasses today. The main topic of this year's Congress was 'Transmitting Architecture', suggesting that future architecture will increasingly depend on communication and integration. It also suggests that the global problems which architecture has encountered in recent years should be considered at all levels through an ever-active network of information and communications. In that regard, exchanging and sharing knowledge and experience among architects of different national and social backgrounds is crucial. Besides, 'transmitting architecture' also has a faintly futuristic sound which points to new technologies and their increasing importance in architecture. Finally, the topic emphasizes that the Congress was oriented towards, and open to, not only academic discussions but also younger generations and students, who have always been the most experienced in using digital technologies. Throughout the Congress programme, which consisted of lectures, debates, workshops and exhibitions with more than 600 speakers, various topics were discussed with a common aim: to gain an insight into the current architectural scene, to understand the position of architects and architecture in today's sphere of social, political, cultural, technical and technological factors, and consequently to establish possible tracks and offer recommendations for further progress in the field. Outside the Congress, participants had the opportunity to be introduced to Turin, a city of vision and sustainability put into action.

  3. Innovation of IT metasystems by means of event-driven paradigm using QDMS

    Science.gov (United States)

    Nedic, Vladimir; Despotovic, Danijela; Cvetanovic, Slobodan; Despotovic, Milan; Eric, Milan

    2016-10-01

    Globalisation of the world economy brings new and more complex demands to business systems. In order to respond to these trends, business systems apply new paradigms that inevitably reflect on management metasystems - quality assurance (QA) - as well as on information technology (IT) metasystems. Small and medium enterprises (in particular in the food industry) do not have the possibility to access external resources to the extent that could adequately keep pace with these trends. This raises the question of how to enhance the synergetic effect of the interaction between existing QA and IT metasystems in order to overcome the resource gap and achieve the set goals with internal resources. The focus of this article is to propose a methodology for utilising the potential of a quality assurance document management system (QDMS) as a prototypical platform for initiating, developing, testing and improving new functionalities that are required of IT as support for business system management. In that way QDMS plays the role of a catalyst that not only accelerates but could also enhance the selectivity of the reactions of the QA and IT metasystems and direct them towards finding new functionalities based on the event-driven paradigm. The article shows the process of modelling, development and implementation of a possible approach to this problem through a conceptual survey and a practical solution in the food industry.

  4. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes the architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.
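
    Stripped of transport and throughput concerns, event building reduces to collecting one fragment per source for each event number and declaring the event complete when all fragments have arrived. The sketch below illustrates this core logic only; the source count and payloads are illustrative, and the real Readout Builder runs over TCP/IP on Gigabit Ethernet.

```python
from collections import defaultdict

N_SOURCES = 4  # toy stand-in for the 72 fragment sources of one Readout Builder

class EventBuilder:
    """Assemble full events from per-source fragments keyed by event number.

    An event is complete once every source has contributed one fragment."""
    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.partial = defaultdict(dict)  # event_id -> {source_id: fragment}

    def add_fragment(self, event_id, source_id, data):
        self.partial[event_id][source_id] = data
        if len(self.partial[event_id]) == self.n_sources:
            return self.partial.pop(event_id)  # full event built
        return None                            # still missing fragments

eb = EventBuilder(N_SOURCES)
full = None
for src in range(N_SOURCES):
    full = eb.add_fragment(42, src, b"fragment-payload")
print(full is not None and len(full) == N_SOURCES)  # event 42 complete
```

    The engineering difficulty in the real system lies in sustaining this assembly for kB-scale fragments at tens of kHz, which is where the TCP/IP tuning and hardware selection discussed above come in.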

  5. Impact of Material and Architecture Model Parameters on the Failure of Woven Ceramic Matrix Composites (CMCs) via the Multiscale Generalized Method of Cells

    Science.gov (United States)

    Liu, Kuang C.; Arnold, Steven M.

    2011-01-01

    It is well known that failure of a material is a locally driven event. In the case of ceramic matrix composites (CMCs), significant variations in the microstructure of the composite exist, and their significance for both deformation and life response needs to be assessed. Examples of these variations include changes in the fiber tow shape, tow shifting/nesting and voids within and between tows. In the present work, the effects of many of these architectural parameters and of material scatter on woven ceramic composite properties at the macroscale (woven RUC) are studied to assess their sensitivity. The recently developed Multiscale Generalized Method of Cells methodology is used to determine the overall deformation response, proportional elastic limit (first matrix cracking), and failure under tensile loading conditions. The macroscale responses investigated illustrate the effect of architectural and material parameters on a single RUC representing a five-harness satin weave fabric. Results show that the most critical architectural parameter is the weave void shape and content, with other parameters being less severe in their effect. Variation of the matrix material properties was also studied to illustrate the influence of material variability on the overall features of the composite stress-strain response.

  6. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  7. Assured Mission Support Space Architecture (AMSSA) study

    Science.gov (United States)

    Hamon, Rob

    1993-01-01

    The assured mission support space architecture (AMSSA) study was conducted with the overall goal of developing a long-term requirements-driven integrated space architecture to provide responsive and sustained space support to the combatant commands. Although derivation of an architecture was the focus of the study, there are three significant products from the effort. The first is a philosophy that defines the necessary attributes for the development and operation of space systems to ensure an integrated, interoperable architecture that, by design, provides a high degree of combat utility. The second is the architecture itself; based on an interoperable system-of-systems strategy, it reflects a long-range goal for space that will evolve as user requirements adapt to a changing world environment. The third product is the framework of a process that, when fully developed, will provide essential information to key decision makers for space systems acquisition in order to achieve the AMSSA goal. It is a categorical imperative that military space planners develop space systems that will act as true force multipliers. AMSSA provides the philosophy, process, and architecture that, when integrated with the DOD requirements and acquisition procedures, can yield an assured mission support capability from space to the combatant commanders. An important feature of the AMSSA initiative is the participation by every organization that has a role or interest in space systems development and operation. With continued community involvement, the concept of the AMSSA will become a reality. In summary, AMSSA offers a better way to think about space (philosophy) that can lead to the effective utilization of limited resources (process) with an infrastructure designed to meet the future space needs (architecture) of our combat forces.

  8. Ultra-Low-Power Event-Driven Radio Design

    NARCIS (Netherlands)

    Huang, X.

    2014-01-01

    The emerging field of internet of things promises mankind an enhanced life quality, productivity and security. One critical technology enabler is ubiquitous and unobtrusive wireless connectivity activated by ambient events and operated with little human intervention for configuration and

  9. A Pattern Language for the Evolution of Component-based Software Architectures

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Jamshidi, Pooyan; Pahl, Claus

    2013-01-01

    Modern software systems are prone to a continuous evolution under frequently varying requirements. Architecture-centric software evolution enables change in system’s structure and behavior while maintaining a global view of the software to address evolution-centric tradeoffs. Lehman’s law...... evolution problems. We propose that architectural evolution process requires an explicit evolution-centric knowledge – that can be discovered, shared, and reused – to anticipate and guide change management. Therefore, we present a pattern language as a collection of interconnected change patterns......) as a complementary and integrated phase to facilitate reuse-driven architecture change execution (pattern language application). Reuse-knowledge in the proposed pattern language is expressed as a formalised collection of interconnected-patterns. Individual patterns in the language build on each other to facilitate...

  10. From green architecture to architectural green

    DEFF Research Database (Denmark)

    Earon, Ofri

    2011-01-01

    The paper investigates the topic of green architecture from an architectural point of view and not an energy point of view. The purpose of the paper is to establish a debate about the architectural language and spatial characteristics of green architecture. In this light, green becomes an adjective... that describes the architectural exclusivity of this particular architecture genre. The adjective green expresses architectural qualities differentiating green architecture from non-green architecture. Currently, adding trees and vegetation to the building’s facade is the main architectural characteristic... they have overshadowed the architectural potential of green architecture. The paper questions how a green space should perform, look and function. Two examples are chosen to demonstrate thorough integrations between green and space. The examples are public buildings categorized as pavilions. One...

  11. The NASA Integrated Information Technology Architecture

    Science.gov (United States)

    Baldridge, Tim

    1997-01-01

    This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context

  12. Architecture on Architecture

    DEFF Research Database (Denmark)

    Olesen, Karen

    2016-01-01

    This paper will discuss the challenges faced by architectural education today. It takes as its starting point the double commitment of any school of architecture: on the one hand the task of preserving the particular knowledge that belongs to the discipline of architecture, and on the other hand... that is not scientific or academic but is more like a latent body of data that we find embedded in existing works of architecture. This information, it is argued, is not limited by the historical context of the work. It can be thought of as a virtual capacity – a reservoir of spatial configurations that can... correlation between the study of existing architectures and the training of competences to design for present-day realities.

  13. On minimalism in architecture - space as experience

    Directory of Open Access Journals (Sweden)

    Vasilski Dragana

    2016-01-01

    Architecture has to be experienced to be understood. The complexity of the experience is seen through a better understanding of the relationship between objectivity (architecture) and subjectivity (our life). Being physically, emotionally and psychologically aware of the space we occupy is an experience that could be described as being present, which is a sensation that is personal and difficult to describe explicitly. Research into experience through perception and emotion positions architecture within scientific fields, in particular psychological disciplines. Relying on the standpoints of Immanuel Kant, the paper considers the juxtaposition between (minimalism in) architecture and philosophy on the topic of experience. Starting from the basic aspects of perception and representation of the world around us, a thesis is presented in which the notions of silence and light as experienced in minimalism (in architecture) are considered as adequate counterparts to Kant’s factors of experience - the awareness of the objective order of events and the impossibility of perceiving time itself. Through a case study we verify the starting hypothesis on minimalism (in architecture), whereby space becomes an experience of how the world touches us.

  14. Ontology-driven extraction of event logs from relational databases

    NARCIS (Netherlands)

    Calvanese, Diego; Montali, Marco; Syamsiyah, Alifah; van der Aalst, Wil M P; Reichert, M.; Reijers, H.A.

    2015-01-01

    Process mining is an emerging discipline whose aim is to discover, monitor and improve real processes by extracting knowledge from event logs representing actual process executions in a given organizational setting. In this light, it can be applied only if faithful event logs, adhering to accepted
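
    Independently of the ontology layer, the basic extraction step, grouping relational rows into per-case traces ordered by timestamp, can be sketched as follows. The table and column names are hypothetical; the paper's approach derives such a mapping from an ontology rather than hard-coding SQL.

```python
import sqlite3

# Illustrative relational data: one row per recorded event.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_log (order_id TEXT, activity TEXT, ts TEXT)")
conn.executemany("INSERT INTO orders_log VALUES (?, ?, ?)", [
    ("o1", "create",  "2015-01-01"),
    ("o1", "approve", "2015-01-02"),
    ("o2", "create",  "2015-01-03"),
])

def extract_log(conn):
    """Build an event log: one trace per case id, events ordered by timestamp."""
    log = {}
    for case, act in conn.execute(
            "SELECT order_id, activity FROM orders_log ORDER BY order_id, ts"):
        log.setdefault(case, []).append(act)
    return log

print(extract_log(conn))  # {'o1': ['create', 'approve'], 'o2': ['create']}
```

    The value of the ontology-driven approach is precisely that the choice of case notion and event attributes becomes a declarative mapping instead of queries like the one above.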

  15. Event-by-Event Elliptic Flow Fluctuations from PHOBOS

    Science.gov (United States)

    Wosiek, B.; Alver, B.; Back, B. B.; Baker, M. D.; Ballintijn, M.; Barton, D. S.; Betts, R. R.; Bickley, A. A.; Bindel, R.; Busza, W.; Carroll, A.; Chai, Z.; Chetluru, V.; Decowski, M. P.; García, E.; Gburek, T.; George, N.; Gulbrandsen, K.; Halliwell, C.; Hamblen, J.; Harnarine, I.; Hauer, M.; Henderson, C.; Hofman, D. J.; Hollis, R. S.; Hołyński, R.; Holzman, B.; Iordanova, A.; Johnson, E.; Kane, J. L.; Khan, N.; Kulinich, P.; Kuo, C. M.; Li, W.; Lin, W. T.; Loizides, C.; Manly, S.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Reed, C.; Richardson, E.; Roland, C.; Roland, G.; Sagerer, J.; Seals, H.; Sedykh, I.; Smith, C. E.; Stankiewicz, M. A.; Steinberg, P.; Stephans, G. S. F.; Sukhanov, A.; Szostak, A.; Tonjes, M. B.; Trzupek, A.; Vale, C.; van Nieuwenhuizen, G. J.; Vaurynovich, S. S.; Verdier, R.; Veres, G. I.; Walters, P.; Wenger, E.; Willhelm, D.; Wolfs, F. L. H.; Woźniak, K.; Wyngaardt, S.; Wysłouch, B.

    2009-04-01

    Recently PHOBOS has focused on the study of fluctuations and correlations in particle production in heavy-ion collisions at the highest energies delivered by the Relativistic Heavy Ion Collider (RHIC). In this report, we present results on event-by-event elliptic flow fluctuations in Au+Au collisions at √sNN = 200 GeV. A data-driven method was used to estimate the dominant contribution from non-flow correlations. Over the broad range of collision centralities, the observed large elliptic flow fluctuations are in agreement with the fluctuations in the initial source eccentricity.

  16. Production experience with the ATLAS Event Service

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00066086; The ATLAS collaboration; Calafiura, Paolo; Childers, John Taylor; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2017-01-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real time delivery of fine grained workloads to running payload applications which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture the AES is currently being used by grid sites for assigning low priority workloads to otherwise idle computing resources; similarly harvesting HPC resources in an efficient back-fill mode; and massively scaling out to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Comp...
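
    The fine-grained dispatch model described in this abstract can be sketched in a few lines. The following is a simplified Python illustration, not the actual AES/PanDA code; the function names, range size, event numbers and in-memory "object store" are all hypothetical. A job's events are cut into small ranges, and each range's output is written out as soon as it is produced, so a preempted worker loses at most the one range in flight.

```python
def make_event_ranges(first_event, last_event, range_size):
    """Cut a job's events into fine-grained ranges for dispatch."""
    ranges, start = [], first_event
    while start <= last_event:
        end = min(start + range_size - 1, last_event)
        ranges.append((start, end))
        start = end + 1
    return ranges

def process_ranges(ranges, object_store):
    """Process each dispatched range and immediately stream its output
    to a (here simulated) object store."""
    for lo, hi in ranges:
        object_store[(lo, hi)] = "output for events %d-%d" % (lo, hi)

store = {}
process_ranges(make_event_ranges(0, 9, 4), store)
# store now holds one output object per completed range
```

    The design choice this sketch highlights is that the unit of loss on a transient resource is a single event range, not a whole multi-hour job.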

  17. Production Experience with the ATLAS Event Service

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration

    2016-01-01

    The ATLAS Event Service (ES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real time delivery of fine grained workloads to running payload applications which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture the ES is currently being used by grid sites for assigning low priority workloads to otherwise idle computing resources; similarly harvesting HPC resources in an efficient back-fill mode; and massively scaling out to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Comput...

  18. Ion trap architectures and new directions

    Science.gov (United States)

    Siverns, James D.; Quraishi, Qudsia

    2017-12-01

    Trapped ion technology has seen advances in performance, robustness and versatility over the last decade. With increasing numbers of trapped ion groups worldwide, a myriad of trap architectures are currently in use. Applications of trapped ions include: quantum simulation, computing and networking, time standards and fundamental studies in quantum dynamics. Design of such traps is driven by these various research aims, but some universally desirable properties have led to the development of ion trap foundries. Additionally, the excellent control achievable with trapped ions and the ability to do photonic readout has allowed progress on quantum networking using entanglement between remotely situated ion-based nodes. Here, we present a selection of trap architectures currently in use by the community and present their most salient characteristics, identifying features particularly suited for quantum networking. We also discuss our own in-house research efforts aimed at long-distance trapped ion networking.

  19. The ATLAS event filter

    CERN Document Server

    Beck, H P; Boissat, C; Davis, R; Duval, P Y; Etienne, F; Fede, E; Francis, D; Green, P; Hemmer, F; Jones, R; MacKinnon, J; Mapelli, Livio P; Meessen, C; Mommsen, R K; Mornacchi, Giuseppe; Nacasch, R; Negri, A; Pinfold, James L; Polesello, G; Qian, Z; Rafflin, C; Scannicchio, D A; Stanescu, C; Touchard, F; Vercesi, V

    1999-01-01

    An overview of the studies for the ATLAS Event Filter is given. The architecture and the high-level design of the DAQ-1 prototype are presented. The current status of the prototypes is briefly given. Finally, future plans and milestones are given. (11 refs).

  20. The NBS-LRR architectures of plant R-proteins and metazoan NLRs evolved in independent events.

    Science.gov (United States)

    Urbach, Jonathan M; Ausubel, Frederick M

    2017-01-31

    There are intriguing parallels between plants and animals, with respect to the structures of their innate immune receptors, that suggest universal principles of innate immunity. The cytosolic nucleotide binding site-leucine rich repeat (NBS-LRR) resistance proteins of plants (R-proteins) and the so-called NOD-like receptors of animals (NLRs) share a domain architecture that includes a STAND (signal transduction ATPases with numerous domains) family NTPase followed by a series of LRRs, suggesting inheritance from a common ancestor with that architecture. Focusing on the STAND NTPases of plant R-proteins, animal NLRs, and their homologs that represent the NB-ARC (nucleotide-binding adaptor shared by APAF-1, certain R gene products and CED-4) and NACHT (named for NAIP, CIIA, HET-E, and TEP1) subfamilies of the STAND NTPases, we analyzed the phylogenetic distribution of the NBS-LRR domain architecture, used maximum-likelihood methods to infer a phylogeny of the NTPase domains of R-proteins, and reconstructed the domain structure of the protein containing the common ancestor of the STAND NTPase domain of R-proteins and NLRs. Our analyses reject monophyly of plant R-proteins and NLRs and suggest that the protein containing the last common ancestor of the STAND NTPases of plant R-proteins and animal NLRs (and, by extension, all NB-ARC and NACHT domains) possessed a domain structure that included a STAND NTPase paired with a series of tetratricopeptide repeats. These analyses reject the hypothesis that the domain architecture of R-proteins and NLRs was inherited from a common ancestor and instead suggest the domain architecture evolved at least twice. It remains unclear whether the NBS-LRR architectures were innovations of plants and animals themselves or were acquired by one or both lineages through horizontal gene transfer.

  1. On Control Strategies for Responsive Architectural Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Parigi, Dario

    2012-01-01

    The present paper considers control of responsive architectural structures for improvement of structural performance by recognizing changes in their environments and loads, adapting to meet goals, and using past events to improve future performance or maintain serviceability. The general scope of...

  2. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
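
    The time-line mechanics described above (a planner queues events, a scheduler advances down the queue) can be sketched as follows. This is a toy Python illustration of the dPRA concept, not the IMM implementation; the event names and occurrence rates are invented for the example.

```python
import heapq
import random

def run_trial(initiating_events, horizon, rng):
    """One Monte Carlo trial: sample a first-occurrence time for each
    initiating event, queue it on the time line (the planner's role),
    then pop events in strict time order (the scheduler's role)."""
    queue = []
    for name, rate in initiating_events.items():
        t = rng.expovariate(rate)            # exponential occurrence time
        if t < horizon:
            heapq.heappush(queue, (t, name))
    timeline = []
    while queue:
        timeline.append(heapq.heappop(queue))  # progress down the time line
    return timeline

rng = random.Random(1)
# Hypothetical per-unit-time rates for two initiating events
trials = [run_trial({"event_a": 0.5, "event_b": 0.2}, 10.0, rng)
          for _ in range(1000)]
frac_with_event = sum(1 for t in trials if t) / len(trials)
```

    In a full dPRA, popping an event would also trigger rule-based consequences that push further events onto the same queue, which is how later outcomes come to depend on earlier ones.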

  3. An Enterprise Security Program and Architecture to Support Business Drivers

    Directory of Open Access Journals (Sweden)

    Brian Ritchot

    2013-08-01

    Full Text Available This article presents a business-focused approach to developing and delivering enterprise security architecture that is focused on enabling business objectives while providing a sensible and balanced approach to risk management. A balanced approach to enterprise security architecture can create the important linkages between the goals and objectives of a business, and it provides appropriate measures to protect the most critical assets within an organization while accepting risk where appropriate. Through a discussion of information assurance, this article makes a case for leveraging enterprise security architectures to meet an organization's need for information assurance. The approach is derived from the Sherwood Applied Business Security Architecture (SABSA) methodology, as put into practice by Seccuris Inc., an information assurance integrator. An understanding of Seccuris’ approach will illustrate the importance of aligning security activities with high-level business objectives while creating increased awareness of the duality of risk. This business-driven approach to enterprise security architecture can help organizations change the perception of IT security, positioning it as a tool to enable and assure business success, rather than as an obstacle to be avoided.

  4. ARCHITECTURAL PLACEMAKING OF TECHNOLOGY PARKS: ENCOURAGEMENT OF CREATIVE THINKING

    Directory of Open Access Journals (Sweden)

    Rykov Kirill Nikolaevich

    2012-10-01

    Full Text Available The present-day postindustrial or information-oriented society features an ever growing role of creative and intellectual abilities. This trend facilitates transformation of the workforce, as the portion of manual labor is reduced, while the one of intellectual labor goes up. As a result, architectural placemaking has to meet the new requirements driven by the specific nature of social and physiological constituents of the headwork. The aim of the article is the identification of new challenges that the high-quality architecture has to meet in its efforts to service the intellectual labour environment. For illustrative purposes, the author has chosen research and technology parks as the most typical postindustrial facilities. According to the author, intellectual constituents of the architectural practice represent systematic and research components. This division is the result of the analysis of research and technology parks. The author has made an attempt to identify special conditions of effective creativity in architectural practice. They include comfort, availability, information system development, calm, sociality, significance and variability. The list of conditions and general methods of their implementation presented by the author can be used in a wide range of project goals connected with the architectural design of research and technology parks and stimulation of creative potential of the people involved.

  5. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of a sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, event-driven programming model, enterprise service bus, and adaptive user interfaces to provide a strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment in Oak Ridge National Laboratory (ORNL).

  6. Automation Hooks Architecture for Flexible Test Orchestration - Concept Development and Validation

    Science.gov (United States)

    Lansdowne, C. A.; Maclean, John R.; Winton, Chris; McCartney, Pat

    2011-01-01

    The Automation Hooks Architecture Trade Study for Flexible Test Orchestration sought a standardized data-driven alternative to conventional automated test programming interfaces. The study recommended composing the interface using multicast DNS (mDNS/SD) service discovery, Representational State Transfer (RESTful) Web Services, and Automatic Test Markup Language (ATML). We describe additional efforts to rapidly mature the Automation Hooks Architecture candidate interface definition by validating it in a broad spectrum of applications. These activities have allowed us to further refine our concepts and provide observations directed toward objectives of economy, scalability, versatility, performance, severability, maintainability, scriptability and others.

  7. A model of nitrous oxide evolution from soil driven by rainfall events. I - Model structure and sensitivity. II - Model applications

    Science.gov (United States)

    Changsheng, LI; Frolking, Steve; Frolking, Tod A.

    1992-01-01

    Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.

  8. Developing architecture for upgrading I and C systems of an operating nuclear power plant using a quality attribute-driven design method

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Yong Suk; Keum, Jong Yong [SMART Technology Validation Division, Korea Atomic Energy Research Institute, 150-1 Dukjin-dong, Yuseong-gu, Daejon (Korea, Republic of); Kim, Hyeon Soo, E-mail: hskim401@cnu.ac.kr [Department of Computer Science and Engineering, Chungnam Nat' l Univ., 220 Gung-dong, Yuseong-gu, Daejon (Korea, Republic of)

    2011-12-15

    This paper presents the architecture for upgrading the instrumentation and control (I and C) systems of a Korean standard nuclear power plant (KSNP) as an operating nuclear power plant. This paper uses the analysis results of KSNP's I and C systems performed in a previous study. This paper proposes a Preparation-Decision-Design-Assessment (PDDA) process that focuses on quality-oriented development, as a cyclical process to develop the architecture. The PDDA was motivated by the practice of architecture-based development used in software engineering fields. In the preparation step of the PDDA, the architecture of digital-based I and C systems was set as an architectural goal. Single failure criterion and determinism were set as architectural drivers. In the decision step, defense-in-depth, diversity, redundancy, and independence were determined as architectural tactics to satisfy the single failure criterion, and sequential execution was determined as a tactic to satisfy the determinism. After determining the tactics, the primitive digital-based I and C architecture was determined. In the design step, 17 systems were selected from the KSNP's I and C systems for the upgrade and functionally grouped based on the primitive architecture. The overall architecture was developed to show the deployment of the systems. The detailed architecture of the safety systems was developed by applying a 2-out-of-3 voting logic, and the detailed architecture of the non-safety systems was developed by hot-standby redundancy. While developing the detailed architecture, three ways of signal transmission were determined with proper rationales: hardwire, datalink, and network. In the assessment step, the required network performance, considering the worst case of data transmission, was calculated: the datalink required 120 kbps, the safety network 5 Mbps, and the non-safety network 60 Mbps. The architecture covered 17 systems out of 22 KSNP's I and C

  9. Developing architecture for upgrading I and C systems of an operating nuclear power plant using a quality attribute-driven design method

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Keum, Jong Yong; Kim, Hyeon Soo

    2011-01-01

    This paper presents the architecture for upgrading the instrumentation and control (I and C) systems of a Korean standard nuclear power plant (KSNP) as an operating nuclear power plant. This paper uses the analysis results of KSNP's I and C systems performed in a previous study. This paper proposes a Preparation–Decision–Design–Assessment (PDDA) process that focuses on quality-oriented development, as a cyclical process to develop the architecture. The PDDA was motivated by the practice of architecture-based development used in software engineering fields. In the preparation step of the PDDA, the architecture of digital-based I and C systems was set as an architectural goal. Single failure criterion and determinism were set as architectural drivers. In the decision step, defense-in-depth, diversity, redundancy, and independence were determined as architectural tactics to satisfy the single failure criterion, and sequential execution was determined as a tactic to satisfy the determinism. After determining the tactics, the primitive digital-based I and C architecture was determined. In the design step, 17 systems were selected from the KSNP's I and C systems for the upgrade and functionally grouped based on the primitive architecture. The overall architecture was developed to show the deployment of the systems. The detailed architecture of the safety systems was developed by applying a 2-out-of-3 voting logic, and the detailed architecture of the non-safety systems was developed by hot-standby redundancy. While developing the detailed architecture, three ways of signal transmission were determined with proper rationales: hardwire, datalink, and network. In the assessment step, the required network performance, considering the worst case of data transmission, was calculated: the datalink required 120 kbps, the safety network 5 Mbps, and the non-safety network 60 Mbps. The architecture covered 17 systems out of 22 KSNP's I and C systems. The

  10. Motion camera based on a custom vision sensor and an FPGA architecture

    Science.gov (United States)

    Arias-Estrada, Miguel

    1998-09-01

    A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC computer which is used for high level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing like spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation and the FPGA architecture used in the motion camera system.
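
    The time-of-travel computation described above can be illustrated in software. This is a simplified Python sketch of the idea, not the FPGA design; the pixel pitch, addresses and timestamps are made up for the example. Speed follows from the pixel pitch divided by the time between events at adjacent pixel addresses.

```python
def edge_velocities(events, pixel_pitch_um):
    """Estimate edge speed from address-events (timestamp_us, address):
    the time-of-travel between adjacent pixels gives pitch / delta-t."""
    speeds = []
    for (t0, a0), (t1, a1) in zip(events, events[1:]):
        if abs(a1 - a0) == 1 and t1 > t0:          # adjacent pixels only
            speeds.append(pixel_pitch_um / (t1 - t0))  # um per us
    return speeds

# A moving edge crossing pixels 3 -> 6, one pixel every 50 us (hypothetical)
events = [(100, 3), (150, 4), (200, 5), (250, 6)]
speeds = edge_velocities(events, pixel_pitch_um=10.0)
# each entry is 10 um / 50 us = 0.2 um/us
```

    In the actual system this division is done per event pair in FPGA logic, which is what makes the velocity estimate available in real time.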

  11. Space architecture education for engineers and architects designing and planning beyond earth

    CERN Document Server

    Häuplik-Meusburger, Sandra

    2016-01-01

    This book considers two key educational tools for future generations of professionals with a space architecture background in the 21st century: (1) introducing the discipline of space architecture into the space system engineering curricula; and (2) developing space architecture as a distinct, complete training curriculum.  Professionals educated this way will help shift focus from solely engineering-driven transportation systems and “sortie” missions towards permanent off-world human presence. The architectural training teaches young professionals to operate at all scales from the “overall picture” down to the smallest details, to provide directive intention–not just analysis–to design opportunities, to address the relationship between human behavior and the built environment, and to interact with many diverse fields and disciplines throughout the project lifecycle. This book will benefit individuals and organizations responsible for planning transportation and habitat systems in space, while a...

  12. Interdisciplinary process driven performative morphologies : A morphogenomic approach towards developing context aware spatial formations

    NARCIS (Netherlands)

    Biloria, N.M.

    2011-01-01

    Architectural praxis is in continuous state of change. The introduction of information technology driven design techniques, constantly updating building information modeling protocols, new policy demands coupled together with environmental regulations and cultural fluctuations are all open-ended

  13. Kalman Filter Tracking on Parallel Architectures

    International Nuclear Information System (INIS)

    Cerati, Giuseppe; Elmer, Peter; Krutelyov, Slava; Lantz, Steven; Lefebvre, Matthieu; McDermott, Kevin; Riley, Daniel; Tadel, Matevž; Wittich, Peter; Würthwein, Frank; Yagil, Avi

    2016-01-01

    Power density constraints are limiting the performance improvements of modern CPUs. To address this we have seen the introduction of lower-power, multi-core processors such as GPGPU, ARM and Intel MIC. In order to achieve the theoretical performance gains of these processors, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High-Luminosity Large Hadron Collider (HL-LHC), for example, this will be by far the dominant problem. The need for greater parallelism has driven investigations of very different track finding techniques such as Cellular Automata or Hough Transforms. The most common track finding techniques in use today, however, are those based on a Kalman filter approach. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. They are known to provide high physics performance, are robust, and are in use today at the LHC. Given the utility of the Kalman filter in track finding, we have begun to port these algorithms to parallel architectures, namely Intel Xeon and Xeon Phi. We report here on our progress towards an end-to-end track reconstruction algorithm fully exploiting vectorization and parallelization techniques in a simplified experimental environment
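
    For readers unfamiliar with the technique, a textbook Kalman filter for a constant-velocity track model looks like the sketch below. This is the generic scalar-measurement algorithm in plain Python, not the vectorized Xeon/Xeon Phi code the abstract describes, and the noise values and hit positions are illustrative only.

```python
def kalman_track_fit(hits, dt=1.0, q=1e-3, r=0.25):
    """Fit position hits with a constant-velocity Kalman filter.
    State: (position, velocity); only position is measured."""
    x0, x1 = 0.0, 0.0                          # state estimate
    p00, p01, p10, p11 = 1.0, 0.0, 0.0, 1.0    # state covariance
    for z in hits:
        # Predict: propagate state and covariance one time step
        x0 += dt * x1
        p00 += dt * (p01 + p10) + dt * dt * p11 + q
        p01 += dt * p11
        p10 += dt * p11
        p11 += q
        # Update: blend the measurement in via the Kalman gain
        s = p00 + r                            # innovation covariance
        k0, k1 = p00 / s, p10 / s              # Kalman gain
        resid = z - x0                         # innovation (residual)
        x0 += k0 * resid
        x1 += k1 * resid
        p00, p01, p10, p11 = ((1 - k0) * p00, (1 - k0) * p01,
                              p10 - k1 * p00, p11 - k1 * p01)
    return x0, x1

# Noisy hits along a straight track moving roughly 1 unit per step
pos, vel = kalman_track_fit([1.1, 2.0, 2.9, 4.2, 5.0])
```

    The per-hit predict/update structure is also what makes the method hard to vectorize naively: each update depends on the previous one, so parallel implementations batch many track candidates side by side rather than parallelizing within one track.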

  14. System architecture of communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth

    2017-04-01

    The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS and S. In order to respond timely and in an adequate manner to such events, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability of inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chain of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts of several mature enterprise architecture frameworks including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication infrastructure related capabilities is presented. Following the capability modeling, a system architecture for secure and interoperable communication infrastructures

  15. Practical, redundant, failure-tolerant, self-reconfiguring embedded system architecture

    Science.gov (United States)

    Klarer, Paul R.; Hayward, David R.; Amai, Wendy A.

    2006-10-03

    This invention relates to system architectures, specifically failure-tolerant and self-reconfiguring embedded system architectures. The invention provides both a method and architecture for redundancy. There can be redundancy in both software and hardware for multiple levels of redundancy. The invention provides a self-reconfiguring architecture for activating redundant modules whenever other modules fail. The architecture comprises: a communication backbone connected to two or more processors and software modules running on each of the processors. Each software module runs on one processor and resides on one or more of the other processors to be available as a backup module in the event of failure. Each module and backup module reports its status over the communication backbone. If a primary module does not report, its backup module takes over its function. If the primary module becomes available again, the backup module returns to its backup status.
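
    The takeover rule described above can be illustrated with a toy Python sketch. This is a simplified model of the idea, not the patented implementation; the module names, report stream and miss threshold are invented. The backup is activated once the primary misses a fixed number of consecutive status reports on the backbone, and stands down when the primary reports again.

```python
def select_active(primary, backup, missed_limit, reports):
    """Track which module should be active, given the primary's stream
    of status reports on the communication backbone (True = report seen)."""
    active, missed = primary, 0
    for ok in reports:
        missed = 0 if ok else missed + 1
        if missed >= missed_limit:
            active = backup    # primary went silent: backup takes over
        elif ok:
            active = primary   # primary reports again: it resumes its function
    return active

# Primary falls silent for three consecutive cycles and never recovers
failed = select_active("primary", "backup", 3, [True, True, False, False, False])
# Primary misses three reports, then comes back and resumes
recovered = select_active("primary", "backup", 3, [False, False, False, True])
```

    Here `failed` ends up as the backup module and `recovered` as the primary, mirroring the fail-over and fail-back behavior the patent describes.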

  16. SOA and EDA: a comparative study - similarities, differences and conceptual guidelines on their usage

    NARCIS (Netherlands)

    Allah Bukhsh, Zaharah; van Sinderen, Marten J.; Singh, Prince Mayurank

    2015-01-01

    Changing business requirements and new technologies trigger the business stakeholders to shift their approach from many small isolated systems to a single connected system. Integration of isolated systems is partially supported by service oriented architecture (SOA) and event driven architecture

  17. Reconfiguration of Brain Network Architectures between Resting-State and Complexity-Dependent Cognitive Reasoning.

    Science.gov (United States)

    Hearne, Luke J; Cocchi, Luca; Zalesky, Andrew; Mattingley, Jason B

    2017-08-30

    Our capacity for higher cognitive reasoning has a measurable limit. This limit is thought to arise from the brain's capacity to flexibly reconfigure interactions between spatially distributed networks. Recent work, however, has suggested that reconfigurations of task-related networks are modest when compared with intrinsic "resting-state" network architecture. Here we combined resting-state and task-driven functional magnetic resonance imaging to examine how flexible, task-specific reconfigurations associated with increasing reasoning demands are integrated within a stable intrinsic brain topology. Human participants (21 males and 28 females) underwent an initial resting-state scan, followed by a cognitive reasoning task involving different levels of complexity, followed by a second resting-state scan. The reasoning task required participants to deduce the identity of a missing element in a 4 × 4 matrix, and item difficulty was scaled parametrically as determined by relational complexity theory. Analyses revealed that external task engagement was characterized by a significant change in functional brain modules. Specifically, resting-state and null-task demand conditions were associated with more segregated brain-network topology, whereas increases in reasoning complexity resulted in merging of resting-state modules. Further increments in task complexity did not change the established modular architecture, but affected selective patterns of connectivity between frontoparietal, subcortical, cingulo-opercular, and default-mode networks. Larger increases in network efficiency within the newly established task modules were associated with higher reasoning accuracy. Our results shed light on the network architectures that underlie external task engagement, and highlight selective changes in brain connectivity supporting increases in task complexity. SIGNIFICANCE STATEMENT Humans have clear limits in their ability to solve complex reasoning problems. 
It is thought that

  18. Collaborative Event-Driven Coverage and Rate Allocation for Event Miss-Ratio Assurances in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ozgur Sanli H

    2010-01-01

    Full Text Available Wireless sensor networks are often required to provide event miss-ratio assurance for a given event type. To meet such assurances along with minimum energy consumption, this paper shows how a node's activation and rate assignment is dependent on its distance to event sources, and proposes a practical coverage and rate allocation (CORA protocol to exploit this dependency in realistic environments. Both uniform event distribution and nonuniform event distribution are considered and the notion of ideal correlation distance around a clusterhead is introduced for on-duty node selection. In correlation distance guided CORA, rate assignment assists coverage scheduling by determining which nodes should be activated for minimizing data redundancy in transmission. Coverage scheduling assists rate assignment by controlling the amount of overlap among sensing regions of neighboring nodes, thereby providing sufficient data correlation for rate assignment. Extensive simulation results show that CORA meets the required event miss-ratios in realistic environments. CORA's joint coverage scheduling and rate allocation reduce the total energy expenditure by 85%, average battery energy consumption by 25%, and the overhead of source coding up to 90% as compared to existing rate allocation techniques.

  19. ISTTOK real-time architecture

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Ivo S., E-mail: ivoc@ipfn.ist.utl.pt; Duarte, Paulo; Fernandes, Horácio; Valcárcel, Daniel F.; Carvalho, Pedro J.; Silva, Carlos; Duarte, André S.; Neto, André; Sousa, Jorge; Batista, António J.N.; Hekkert, Tiago; Carvalho, Bernardo B.

    2014-03-15

    Highlights: • All real-time diagnostics and actuators were integrated in the same control platform. • A 100 μs control cycle was achieved under the MARTe framework. • Time-windows based control with several event-driven control strategies implemented. • AC discharges with exception handling on iron core flux saturation. • An HTML discharge configuration was developed for configuring the MARTe system. - Abstract: The ISTTOK tokamak was upgraded with a plasma control system based on the Advanced Telecommunications Computing Architecture (ATCA) standard. This control system was designed to improve the discharge stability and to extend the operational space to the alternate plasma current (AC) discharges as part of the ISTTOK scientific program. In order to accomplish these objectives all ISTTOK diagnostics and actuators relevant for real-time operation were integrated in the control system. The control system was programmed in C++ over the Multi-threaded Application Real-Time executor (MARTe) which provides, among other features, a real-time scheduler, an interrupt handler, an intercommunications interface between code blocks and a clearly bounded interface with the external devices. As a complement to the MARTe framework, the BaseLib2 library provides the foundations for the data, code introspection and also a Hypertext Transfer Protocol (HTTP) server service. Taking advantage of the modular nature of MARTe, the algorithms of each diagnostic data processing, discharge timing, context switch, control and actuators output reference generation, run on well-defined blocks of code named Generic Application Module (GAM). This approach allows reusability of the code, simplified simulation, replacement or editing without changing the remaining GAMs. The ISTTOK control system GAMs run sequentially each 100 μs cycle on an Intel® Q8200 4-core processor running at 2.33 GHz located in the ATCA crate. Two boards (inside the ATCA crate) with 32 analog

  20. ISTTOK real-time architecture

    International Nuclear Information System (INIS)

    Carvalho, Ivo S.; Duarte, Paulo; Fernandes, Horácio; Valcárcel, Daniel F.; Carvalho, Pedro J.; Silva, Carlos; Duarte, André S.; Neto, André; Sousa, Jorge; Batista, António J.N.; Hekkert, Tiago; Carvalho, Bernardo B.

    2014-01-01

    Highlights: • All real-time diagnostics and actuators were integrated in the same control platform. • A 100 μs control cycle was achieved under the MARTe framework. • Time-windows based control with several event-driven control strategies implemented. • AC discharges with exception handling on iron core flux saturation. • An HTML discharge configuration was developed for configuring the MARTe system. - Abstract: The ISTTOK tokamak was upgraded with a plasma control system based on the Advanced Telecommunications Computing Architecture (ATCA) standard. This control system was designed to improve the discharge stability and to extend the operational space to the alternate plasma current (AC) discharges as part of the ISTTOK scientific program. In order to accomplish these objectives all ISTTOK diagnostics and actuators relevant for real-time operation were integrated in the control system. The control system was programmed in C++ over the Multi-threaded Application Real-Time executor (MARTe) which provides, among other features, a real-time scheduler, an interrupt handler, an intercommunications interface between code blocks and a clearly bounded interface with the external devices. As a complement to the MARTe framework, the BaseLib2 library provides the foundations for the data, code introspection and also a Hypertext Transfer Protocol (HTTP) server service. Taking advantage of the modular nature of MARTe, the algorithms of each diagnostic data processing, discharge timing, context switch, control and actuators output reference generation, run on well-defined blocks of code named Generic Application Module (GAM). This approach allows reusability of the code, simplified simulation, replacement or editing without changing the remaining GAMs. The ISTTOK control system GAMs run sequentially each 100 μs cycle on an Intel ® Q8200 4-core processor running at 2.33 GHz located in the ATCA crate. Two boards (inside the ATCA crate) with 32 analog
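The GAM concept described in both ISTTOK records can be sketched schematically. The module names, signal dictionary, and Python rendering below are illustrative assumptions (MARTe itself is a C++ framework), and real-time timing and interrupt handling are omitted:

```python
class GAM:
    """Schematic stand-in for a MARTe Generic Application Module: each
    module reads shared signals, computes, and writes results back."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def execute(self, signals):
        # merge this module's outputs into the shared signal database
        signals.update(self.fn(signals))

def run_cycle(gams, signals):
    # MARTe-style schedulers run the configured GAMs sequentially within
    # each real-time cycle (100 us on ISTTOK); timing is not modeled here
    for gam in gams:
        gam.execute(signals)
    return signals

# two hypothetical modules: acquisition scaling, then a trivial controller
gams = [
    GAM("acquire", lambda s: {"current": s["raw"] * 0.5}),
    GAM("control", lambda s: {"reference": -s["current"]}),
]
out = run_cycle(gams, {"raw": 4.0})
```

Because each GAM touches only the shared signal dictionary, a module can be replaced or simulated in isolation without changing the others, which is the reusability property the abstract highlights.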

  1. Black-swan events in animal populations

    OpenAIRE

    Anderson, Sean C.; Branch, Trevor A.; Cooper, Andrew B.; Dulvy, Nicholas K.

    2017-01-01

    Black swans (statistically improbable events with profound consequences) happen more often than expected in financial, social, and natural systems. Our work demonstrates the rare but systematic presence of black-swan events in animal populations around the world (mostly birds, mammals, and insects). These events are predominantly downward, implying that unexpected population crashes occur more frequently than increases. Black-swan events are not driven by life history (e.g., lifespan) but by ex...

  2. Fluid-driven origami-inspired artificial muscles

    Science.gov (United States)

    Li, Shuguang; Vogt, Daniel M.; Rus, Daniela; Wood, Robert J.

    2017-12-01

    Artificial muscles hold promise for safe and powerful actuation for myriad common machines and robots. However, the design, fabrication, and implementation of artificial muscles are often limited by their material costs, operating principle, scalability, and single-degree-of-freedom contractile actuation motions. Here we propose an architecture for fluid-driven origami-inspired artificial muscles. This concept requires only a compressible skeleton, a flexible skin, and a fluid medium. A mechanical model is developed to explain the interaction of the three components. A fabrication method is introduced to rapidly manufacture low-cost artificial muscles using various materials and at multiple scales. The artificial muscles can be programed to achieve multiaxial motions including contraction, bending, and torsion. These motions can be aggregated into systems with multiple degrees of freedom, which are able to produce controllable motions at different rates. Our artificial muscles can be driven by fluids at negative pressures (relative to ambient). This feature makes actuation safer than most other fluidic artificial muscles that operate with positive pressures. Experiments reveal that these muscles can contract over 90% of their initial lengths, generate stresses of ˜600 kPa, and produce peak power densities over 2 kW/kg—all equal to, or in excess of, natural muscle. This architecture for artificial muscles opens the door to rapid design and low-cost fabrication of actuation systems for numerous applications at multiple scales, ranging from miniature medical devices to wearable robotic exoskeletons to large deployable structures for space exploration.

  3. Bridging a divide: architecture for a joint hospital-primary care data warehouse.

    Science.gov (United States)

    An, Jeff; Keshavjee, Karim; Mirza, Kashif; Vassanji, Karim; Greiver, Michelle

    2015-01-01

    Healthcare costs are driven by a surprisingly small number of patients. Predicting who is likely to require care in the near future could help reduce costs by pre-empting use of expensive health care resources such as emergency departments and hospitals. We describe the design of an architecture for a joint hospital-primary care data warehouse (JDW) that can monitor the effectiveness of in-hospital interventions in reducing readmissions and predict which patients are most likely to be admitted to hospital in the near future. The design identifies the key governance elements, the architectural principles, the business case, the privacy architecture, future work flows, the IT infrastructure, the data analytics and the high level implementation plan for realization of the JDW. This architecture fills a gap in bridging data from two separate hospital and primary care organizations, not a single managed care entity with multiple locations. The JDW architecture design was well received by the stakeholders engaged and by senior leadership at the hospital and the primary care organization. Future plans include creating a demonstration system and conducting a pilot study.

  4. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation

    Science.gov (United States)

    Camuñas-Mesa, Luis A.; Domínguez-Cordero, Yaisel L.; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2018-01-01

    Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network. PMID:29515349

  5. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation

    Directory of Open Access Journals (Sweden)

    Luis A. Camuñas-Mesa

    2018-02-01

    Full Text Available Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network.

  6. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation.

    Science.gov (United States)

    Camuñas-Mesa, Luis A; Domínguez-Cordero, Yaisel L; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2018-01-01

    Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network.
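The rate saturation mechanism, essentially a per-neuron refractory period, can be illustrated in a few lines. The event tuples and address scheme below are assumptions for illustration, not the node's actual AER event format:

```python
def rate_saturate(events, refractory):
    """Drop events that arrive within the refractory period of the last
    accepted event at the same address, guaranteeing a minimum time
    separation between consecutive events, as the node's rate
    saturation mechanism does for each neuron."""
    last = {}   # address -> timestamp of last accepted event
    out = []
    for t, addr in events:   # events are in arrival order per address
        if addr not in last or t - last[addr] >= refractory:
            out.append((t, addr))
            last[addr] = t
    return out

# neuron "n0" fires three times in quick succession; only two survive
events = [(0, "n0"), (1, "n0"), (5, "n0"), (2, "n1")]
kept = rate_saturate(events, refractory=3)
```

Each address is throttled independently, so a burst on one neuron never suppresses events elsewhere in the array.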

  7. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

    A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators exist only for individual parts. A systematic development and reuse based on a code generator product line is still in its inf...

  8. Developing a real-time emulation of multiresolutional control architectures for complex, discrete-event systems

    Energy Technology Data Exchange (ETDEWEB)

    Davis, W.J.; Macro, J.G.; Brook, A.L. [Univ. of Illinois, Urbana, IL (United States)] [and others]

    1996-12-31

    This paper first discusses an object-oriented, control architecture and then applies the architecture to produce a real-time software emulator for the Rapid Acquisition of Manufactured Parts (RAMP) flexible manufacturing system (FMS). In specifying the control architecture, the coordinated object is first defined as the primary modeling element. These coordinated objects are then integrated into a Recursive, Object-Oriented Coordination Hierarchy. A new simulation methodology, the Hierarchical Object-Oriented Programmable Logic Simulator, is then employed to model the interactions among the coordinated objects. The final step in implementing the emulator is to distribute the models of the coordinated objects over a network of computers and to synchronize their operation to a real-time clock. The paper then introduces the Hierarchical Subsystem Controller as an intelligent controller for the coordinated object. The proposed approach to intelligent control is then compared to the concept of multiresolutional semiosis that has been developed by Dr. Alex Meystel. Finally, the plans for implementing an intelligent controller for the RAMP FMS are discussed.

  9. A learnable parallel processing architecture towards unity of memory and computing.

    Science.gov (United States)

    Li, H; Gao, B; Chen, Z; Zhao, Y; Huang, P; Ye, H; Liu, L; Liu, X; Kang, J

    2015-08-14

    Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named "iMemComp", where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped "iMemComp" with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on "iMemComp" can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.

  10. A learnable parallel processing architecture towards unity of memory and computing

    Science.gov (United States)

    Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.

    2015-08-01

    Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.

  11. A radical paradigm shift for a new definition of architecture

    Directory of Open Access Journals (Sweden)

    Alessio Erioli

    2015-11-01

    Full Text Available Architecture is facing a period of great challenges and possibilities, typical of all crises. In such periods, the paradigms we have taken for granted so far are the same ones that generated the crisis; therefore, they should undergo a critical revision, and new modes of thought and operation (driven by computation) can be actively explored and pursued. Instead of retreating to intellectual “safe houses”, architecture should open itself up to and be restructured by the accelerated pace of change imposed by reality, renouncing obsolete methods of anticipating and exerting control and welcoming a more proactive behaviour. It is therefore of primary importance to promote the exercise of projective imagination.

  12. Towards a QoE-Driven Resource Control in LTE and LTE-A Networks

    Directory of Open Access Journals (Sweden)

    Gerardo Gómez

    2013-01-01

    Full Text Available We propose a novel architecture for providing quality of experience (QoE) awareness to mobile operator networks. In particular, we describe a possible architecture for QoE-driven resource control for long-term evolution (LTE) and LTE-advanced networks, including a selection of KPIs to be monitored in different network elements. We also provide a description and numerical results of the QoE evaluation process for different data services as well as potential use cases that would benefit from the rollout of the proposed framework.

  13. Cross-cultural differences in processing of architectural ranking: evidence from an event-related potential study.

    Science.gov (United States)

    Mecklinger, Axel; Kriukova, Olga; Mühlmann, Heiner; Grunwald, Thomas

    2014-01-01

    Visual object identification is modulated by perceptual experience. In a cross-cultural ERP study we investigated whether cultural expertise determines how buildings that vary in their ranking between high and low according to the Western architectural decorum are perceived. Two groups of German and Chinese participants performed an object classification task in which high- and low-ranking Western buildings had to be discriminated from everyday life objects. ERP results indicate that an early stage of visual object identification (i.e., object model selection) is facilitated for high-ranking buildings for the German participants, only. At a later stage of object identification, in which object knowledge is complemented by information from semantic and episodic long-term memory, no ERP evidence for cultural differences was obtained. These results suggest that the identification of architectural ranking is modulated by culturally specific expertise with Western-style architecture already at an early processing stage.

  14. Environmental health--champions of One Health.

    Science.gov (United States)

    Eddy, Christopher; Stull, Paul A; Balster, Erik

    2013-01-01

    The authors find overwhelming evidence among environmental health practitioners that One Health disease reporting concepts are essential to the early detection of, and expedient recovery from, pandemic disease events. The authors also find, however, extraordinary evidence that local public health is not prepared, and potentially unaware of their responsibility, to be the initiator of the zoonotic infectious disease information intelligence necessary to make such early event mitigation possible. The authors propose that NEHA take an affirmative step towards the development of local public health-initiated biosurveillance systems by organizing and leading a tabletop study group that includes the Centers for Disease Control and Prevention, American Veterinary Medical Association, American Medical Association, Food and Drug Administration, U.S. Department of Agriculture, Institute of Medicine, and a robust panel of NEHA state affiliates. This study group should discuss the infrastructure necessary for local public health, the frontline against community-acquired infectious disease, to be the initiators of environmental health, veterinary, and medical One Health biosurveillance systems. The need to establish a community-focused, integrated disease prevention strategy that cautions people about the risks associated with food, water, animal, and contaminated environmental media, both prior to and during epidemic and pandemic events is equally important.

  15. Shock propagation in locally driven granular systems

    Science.gov (United States)

    Joy, Jilmy P.; Pathak, Sudhir N.; Das, Dibyendu; Rajesh, R.

    2017-09-01

    We study shock propagation in a system of initially stationary hard spheres that is driven by a continuous injection of particles at the origin. The disturbance created by the injection of energy spreads radially outward through collisions between particles. Using scaling arguments, we determine the exponent characterizing the power-law growth of this disturbance in all dimensions. The scaling functions describing the various physical quantities are determined using large-scale event-driven simulations in two and three dimensions for both elastic and inelastic systems. The results are shown to describe well the data from two different experiments on granular systems that are similarly driven.
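The event-driven simulation method the abstract relies on can be sketched in a drastically simplified form: 1D point particles of equal mass, where an elastic collision just swaps velocities and only the earliest predicted collision is processed per step. This toy illustrates the event loop only; it is an assumption-laden miniature, not the authors' hard-sphere code:

```python
def simulate_1d(positions, velocities, t_end):
    """Minimal event-driven loop: predict neighbor-pair collision times,
    jump directly to the earliest event, and swap the velocities of the
    colliding pair (1D elastic collision, equal masses)."""
    n = len(positions)
    t = 0.0
    while True:
        events = []
        for i in range(n - 1):   # in 1D, particles cannot pass each other
            dx = positions[i + 1] - positions[i]
            dv = velocities[i] - velocities[i + 1]   # approach speed
            if dx > 0 and dv > 0:
                tc = t + dx / dv
                if tc <= t_end:
                    events.append((tc, i))
        if not events:
            break
        tc, i = min(events)   # process the earliest collision only
        positions[:] = [x + v * (tc - t) for x, v in zip(positions, velocities)]
        velocities[i], velocities[i + 1] = velocities[i + 1], velocities[i]
        t = tc
    # drift freely from the last event to the end time
    positions[:] = [x + v * (t_end - t) for x, v in zip(positions, velocities)]
    return positions, velocities

# one moving particle "drives" two stationary ones, like a tiny shock front
positions, velocities = simulate_1d([0.0, 1.0, 2.0], [1.0, 0.0, 0.0], t_end=3.0)
```

The key property of the event-driven approach, and the reason it scales to the large systems studied in the paper, is that time advances from collision to collision rather than in fixed steps, so quiescent particles cost nothing.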

  16. Automated Testing with Targeted Event Sequence Generation

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning; Prasad, Mukul R.; Møller, Anders

    2013-01-01

    Automated software testing aims to detect errors by producing test inputs that cover as much of the application source code as possible. Applications for mobile devices are typically event-driven, which raises the challenge of automatically producing event sequences that result in high coverage...

  17. Developing a Psychologically Inspired Cognitive Architecture for Robotic Control: The Symbolic and Subsymbolic Robotic Intelligence Control System (SS-RICS)

    Directory of Open Access Journals (Sweden)

    Troy Dale Kelley

    2006-09-01

    Full Text Available This paper describes the ongoing development of a robotic control architecture that was inspired by computational cognitive architectures from the discipline of cognitive psychology. The robotic control architecture combines symbolic and subsymbolic representations of knowledge into a unified control structure. The architecture is organized as a goal driven, serially executing, production system at the highest symbolic level; and a multiple algorithm, parallel executing, simple collection of algorithms at the lowest subsymbolic level. The goal is to create a system that will progress through the same cognitive developmental milestones as do human infants. Common robotics problems of localization, object recognition, and object permanence are addressed within the specified framework.

  18. Developing a Psychologically Inspired Cognitive Architecture for Robotic Control: The Symbolic and Subsymbolic Robotic Intelligence Control System (SS-RICS)

    Directory of Open Access Journals (Sweden)

    Troy Dale Kelley

    2008-11-01

    Full Text Available This paper describes the ongoing development of a robotic control architecture that was inspired by computational cognitive architectures from the discipline of cognitive psychology. The robotic control architecture combines symbolic and subsymbolic representations of knowledge into a unified control structure. The architecture is organized as a goal driven, serially executing, production system at the highest symbolic level; and a multiple algorithm, parallel executing, simple collection of algorithms at the lowest subsymbolic level. The goal is to create a system that will progress through the same cognitive developmental milestones as do human infants. Common robotics problems of localization, object recognition, and object permanence are addressed within the specified framework.
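A goal-driven, serially executing production system like the symbolic layer described in these two records can be sketched in a few lines. The rules and working-memory contents below are hypothetical, chosen only to echo the object-recognition example:

```python
def run_productions(rules, memory, max_cycles=10):
    """Tiny serially executing production system: on each cycle, fire
    the first rule whose condition matches working memory; stop when no
    rule matches (quiescence) or the cycle budget runs out."""
    for _ in range(max_cycles):
        for condition, action in rules:
            if condition(memory):
                action(memory)
                break   # serial execution: one rule fires per cycle
        else:
            break   # no rule matched: goal processing is done
    return memory

# hypothetical rules: name a seen object, then mark the goal achieved
rules = [
    (lambda m: "object_seen" in m and "object_named" not in m,
     lambda m: m.update(object_named="cup")),
    (lambda m: "object_named" in m and "goal_done" not in m,
     lambda m: m.update(goal_done=True)),
]
memory = run_productions(rules, {"object_seen": True})
```

In an SS-RICS-style architecture, facts such as `object_seen` would be supplied by the subsymbolic algorithms running in parallel below this symbolic loop.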

  19. The Current Scenario of Curvilinear Architecture in Malaysia

    Directory of Open Access Journals (Sweden)

    Faridah Aduan

    2007-12-01

    Full Text Available The Bilbao Effect incorporates itself into numerous iconic buildings and grand designs of international architects such as Frank Gehry, Santiago Calatrava, Norman Foster, Renzo Piano, Phillip Cox and Toyo Ito, who exploit curvilinear forms as their architectural language. Throughout the world, major events such as the Olympic Games have catalysed the implementation of curvilinear architecture, while in Malaysia, the development of Putrajaya has provided opportunity for iconic forms, expressed in curvilinearity. The paper focuses on the current scenario of curvilinear architecture in Malaysia and its position in the international arena. It strives to answer the question of 'How far have Malaysian architects gone in implementing curvilinear architecture?' This is done by first formulating a 'Taxonomy of Rigid Curvilinear Architectural Forms' based on the works of renowned international architects. The taxonomy constitutes the instrument for gauging the position of Malaysian architects. This is achieved by having the works of local architects mapped on to the taxonomy. The research findings indicate that the international architects have advanced by leaps and bounds ahead of their Malaysian counterparts in implementing curvilinear architecture. Several recommendations are proposed in order to narrow this gap. The paper focuses on column-free, rigid and permanent buildings completed from 1990 onwards.

  20. Performance of the CMS Event Builder

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J.M.; et al.

    2017-11-22

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at high aggregate throughput to the high-level trigger farm. The DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbit/s Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbit/s Infiniband FDR Clos network has been chosen for the event builder. This paper presents the implementation and performance of the event-building system.
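Event building itself, i.e., collecting per-event fragments from every readout source and emitting the event once all sources have reported, can be sketched as follows. The source names and data layout are invented for illustration and bear no relation to CMS implementation details:

```python
def build_events(fragments, n_sources):
    """Schematic event builder: group fragments by event id and emit an
    event as soon as all n_sources readout units have contributed."""
    pending = {}    # event_id -> {source: payload}
    complete = []
    for event_id, source, payload in fragments:
        parts = pending.setdefault(event_id, {})
        parts[source] = payload
        if len(parts) == n_sources:
            complete.append((event_id, parts))
            del pending[event_id]   # event fully assembled
    return complete, pending

# event 1 receives both fragments; event 2 is still waiting for "ru1"
frags = [(1, "ru0", b"a"), (2, "ru0", b"c"), (1, "ru1", b"b")]
complete, pending = build_events(frags, n_sources=2)
```

The real system performs this grouping at 100 kHz over an Infiniband Clos network, but the bookkeeping idea is the same: an event leaves the builder only when its fragment set is complete.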

  1. Opportunistic or event-driven maintenance at the Stanford Linear Accelerator Center

    International Nuclear Information System (INIS)

    Allen, C.W.; Anderson, S.; Erickson, R.; Linebarger, W.; Sheppard, J.C.; Stanek, M.

    1997-03-01

    The Stanford Linear Accelerator Center (SLAC) uses a maintenance management philosophy that is best described as opportunistic or event-driven. Opportunistic maintenance can be defined as a systematic method of collecting, investigating, pre-planning, and publishing a set of proposed maintenance tasks and acting on them when there is an unscheduled failure or repair "opportunity". Opportunistic maintenance can be thought of as a modification of the run-to-fail maintenance management philosophy. This maintenance plan was adopted and developed to improve the overall availability of SLAC's linear accelerator, beam delivery systems, and associated controls, power systems, and utilities. In the late 1980s, as the technical complexity of the accelerator facility increased, variations on a conventional maintenance plan were used with mixed results. These variations typically included some type of regular periodic interruption to operations. The periodic shutdowns and unscheduled failures were additive and resulted in unsatisfactory availability. Maintenance issues are evaluated in a daily meeting that includes the accelerator managers, maintenance supervisors and managers, safety office personnel, program managers, and accelerator operators. Lists of pending maintenance tasks are made available to the general SLAC population by a World Wide Web site on a local internet. A conventional information system which pre-dates the WWW site is still being used to provide paper copies to groups that are not yet integrated into the WWW system. The local internet provides real-time maintenance information, allowing people throughout the facility to track progress on tasks with essentially real-time status updates. With the introduction of opportunistic maintenance, the accelerator's availability has been measurably better. This paper will discuss processes, roles and responsibilities of key maintenance groups, and management tools developed to support opportunistic maintenance.
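One way to picture acting on a pre-planned task list when a repair opportunity arises is a greedy selection by priority within the downtime window. The tasks, durations, and selection rule below are purely illustrative assumptions, not SLAC's actual review process:

```python
def plan_opportunity(tasks, window_hours):
    """Illustrative opportunistic scheduler: given pre-planned pending
    tasks as (name, duration_hours, priority) and an unscheduled
    downtime window, greedily pick the highest-priority tasks that fit."""
    chosen = []
    remaining = window_hours
    for name, duration, _prio in sorted(tasks, key=lambda t: -t[2]):
        if duration <= remaining:
            chosen.append(name)
            remaining -= duration
    return chosen

# hypothetical pending-task list published ahead of time
tasks = [
    ("swap klystron", 4.0, 3),
    ("patch controls", 1.0, 2),
    ("paint rack", 6.0, 1),
]
# an unscheduled failure opens a 5-hour repair "opportunity"
chosen = plan_opportunity(tasks, window_hours=5.0)
```

The value of the approach described above is precisely that this list exists before the failure: when the beam is down anyway, work can start immediately instead of being deferred to a scheduled shutdown.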

  2. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

    Full Text Available Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm to enable advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of the patient information, along with the technical requirements (e.g., energy consumption) and capabilities for adaptability and personalization. Typically, the functionality of the systems is predefined by the patient's data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Therefore, creating such systems is indeed challenging. In this paper, we propose a model-driven framework to develop the IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and the application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.

  3. Fluid-driven origami-inspired artificial muscles.

    Science.gov (United States)

    Li, Shuguang; Vogt, Daniel M; Rus, Daniela; Wood, Robert J

    2017-12-12

    Artificial muscles hold promise for safe and powerful actuation for myriad common machines and robots. However, the design, fabrication, and implementation of artificial muscles are often limited by their material costs, operating principle, scalability, and single-degree-of-freedom contractile actuation motions. Here we propose an architecture for fluid-driven origami-inspired artificial muscles. This concept requires only a compressible skeleton, a flexible skin, and a fluid medium. A mechanical model is developed to explain the interaction of the three components. A fabrication method is introduced to rapidly manufacture low-cost artificial muscles using various materials and at multiple scales. The artificial muscles can be programmed to achieve multiaxial motions including contraction, bending, and torsion. These motions can be aggregated into systems with multiple degrees of freedom, which are able to produce controllable motions at different rates. Our artificial muscles can be driven by fluids at negative pressures (relative to ambient). This feature makes actuation safer than most other fluidic artificial muscles that operate with positive pressures. Experiments reveal that these muscles can contract over 90% of their initial lengths, generate stresses of ∼600 kPa, and produce peak power densities over 2 kW/kg, all equal to, or in excess of, natural muscle. This architecture for artificial muscles opens the door to rapid design and low-cost fabrication of actuation systems for numerous applications at multiple scales, ranging from miniature medical devices to wearable robotic exoskeletons to large deployable structures for space exploration. Copyright © 2017 the Author(s). Published by PNAS.

  4. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to generate automatically the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has actually been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It will in particular detail how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.
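
    The model-driven generation idea summarized above can be illustrated with a deliberately tiny Python sketch; the device model, template syntax, and generated pseudo-PLC text below are all invented for illustration and are not the actual LHC GCS tools:

```python
# Toy model-driven code generation: a declarative device model is
# expanded into target code through a template. The model keys,
# template syntax, and output format are hypothetical.
MODEL = {
    "system": "MixerGas",
    "valves": ["VIN01", "VOUT01"],
}

TEMPLATE = "VALVE {name} : ON_CMD => open_{name}; OFF_CMD => close_{name};"

def generate(model):
    """Expand the device model into target-language source text."""
    header = f"// auto-generated for {model['system']}"
    body = [TEMPLATE.format(name=v) for v in model["valves"]]
    return "\n".join([header] + body)

print(generate(MODEL))
```

    The maintenance benefit claimed by such approaches follows from this shape: regenerating 23 control systems means editing 23 models, not 23 code bases.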

  5. Dead-time free pixel readout architecture for ATLAS front-end IC

    CERN Document Server

    Einsweiler, Kevin F; Kleinfelder, S A; Luo, L; Marchesini, R; Milgrome, O; Pengg, F X

    1999-01-01

    A low power sparse scan readout architecture has been developed for the ATLAS pixel front-end IC. The architecture supports a dual discriminator and extracts the time over threshold (TOT) information along with a 2-D spatial address of the hits, associating them with a unique 7-bit beam crossing number. The IC implements level-1 trigger filtering along with event building (grouping together all hits in a beam crossing) in the end of column (EOC) buffer. The events are transmitted over a 40 MHz serial data link, with the protocol supporting buffer overflow handling by appending error flags to events. This mixed-mode full custom IC is implemented in a 0.8 μm HP process to meet the requirements for the pixel readout in the ATLAS inner detector. The circuits have been tested and the IC provides dead-time-free, ambiguity-free readout at a 40 MHz data rate.
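
    A toy Python sketch of the event-building step described above (illustrative only; the real logic is implemented in the front-end IC hardware, and the hit format here is invented) groups hits by their 7-bit beam-crossing number and applies level-1 trigger filtering:

```python
# Sketch of event building with level-1 trigger filtering: hits tagged
# with a 7-bit beam-crossing counter (wrapping at 128) are grouped into
# events, and only triggered crossings are kept. Hit tuples are a
# hypothetical (crossing, column, row, time-over-threshold) format.
from collections import defaultdict

def build_events(hits, triggered_crossings):
    """Group hits by 7-bit crossing number, keeping triggered crossings."""
    events = defaultdict(list)
    for crossing, col, row, tot in hits:
        bx = crossing % 128            # 7-bit beam-crossing counter
        if bx in triggered_crossings:  # level-1 trigger filtering
            events[bx].append((col, row, tot))
    return dict(events)

hits = [(5, 1, 2, 30), (5, 3, 4, 12), (6, 0, 0, 7), (133, 9, 9, 20)]
print(build_events(hits, triggered_crossings={5}))
# note: crossing 133 wraps to 5 and joins the same event
```
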

  6. RBAC Driven Least Privilege Architecture For Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hull, Julie [Honeywell International Inc., Golden Valley, MN (United States); Markham, Mark [Honeywell International Inc., Golden Valley, MN (United States)

    2014-01-25

    The concept of role based access control (RBAC) within the IT environment has been studied by researchers and was supported by NIST (circa 1992). This earlier work highlighted the benefits of RBAC which include reduced administrative workload and policies which are easier to analyze and apply. The goals of this research were to expand the application of RBAC in the following ways. Apply RBAC to the control systems environment: The typical RBAC model within the IT environment is used to control a user’s access to files. Within the control system environment files are replaced with measurement (e.g., temperature) and control (e.g. valve) points organized as a hierarchy of control assets (e.g. a boiler, compressor, refinery unit). Control points have parameters (e.g., high alarm limit, set point, etc.) associated with them. The RBAC model is extended to support access to points and their parameters based upon roles while at the same time allowing permissions for the points to be defined at the asset level or point level directly. In addition, centralized policy administration with distributed access enforcement mechanisms was developed to support the distributed architecture of distributed control systems and SCADA; Extend the RBAC model to include access control for software and devices: The established RBAC approach is to assign users to roles. This work extends that notion by first breaking the control system down into three layers 1) users, 2) software and 3) devices. An RBAC model is then created for each of these three layers. The result is that RBAC can be used to define machine-to-machine policy enforced via the IP security (IPsec) protocol. This highlights the potential to use RBAC for machine-to-machine connectivity within the internet of things; and Enable dynamic policy based upon the operating mode of the system: The IT environment is generally static with respect to policy. However, large cyber physical systems such as industrial controls have various
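
    A minimal Python sketch of the extended RBAC idea described above, assuming a hypothetical asset hierarchy and policy (not the Honeywell implementation): permissions may be granted on a control point directly or inherited from an enclosing asset:

```python
# Sketch of RBAC over a control-asset hierarchy: a permission check on
# a point walks up through its enclosing assets. All asset names,
# roles, and operations are invented for illustration.
ASSET_PARENT = {"boiler1.temp_sensor": "boiler1", "boiler1": "plant"}

# role -> {resource: set of permitted operations}
POLICY = {
    "operator": {"boiler1": {"read", "write_setpoint"}},
    "viewer":   {"plant": {"read"}},
}

def permitted(role, resource, op):
    """Check op on resource directly, then on each enclosing asset."""
    node = resource
    while node is not None:
        if op in POLICY.get(role, {}).get(node, set()):
            return True
        node = ASSET_PARENT.get(node)
    return False

print(permitted("operator", "boiler1.temp_sensor", "write_setpoint"))  # True
print(permitted("viewer", "boiler1.temp_sensor", "write_setpoint"))    # False
```

    Granting at the asset level, as above, is what keeps the policy small enough to analyze: one rule on `boiler1` covers every point beneath it.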

  7. Ligand Bridging-Angle-Driven Assembly of Molecular Architectures Based on Quadruply Bonded Mo-Mo Dimers

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jian-Rong; Yakovenko, Andrey A; Lu, Weigang; Timmons, Daren J; Zhuang, Wenjuan; Yuan, Daqiang; Zhou, Hong-Cai

    2010-12-15

    A systematic exploration of the assembly of Mo₂(O₂C-)₄-based metal–organic molecular architectures structurally controlled by the bridging angles of rigid organic linkers has been performed. Twelve bridging dicarboxylate ligands were designed to be of different sizes with bridging angles of 0, 60, 90, and 120° while incorporating a variety of nonbridging functional groups, and these ligands were used as linkers. These dicarboxylate linkers assemble with quadruply bonded Mo–Mo clusters acting as nodes to give 13 molecular architectures, termed metal–organic polygons/polyhedra with metal cluster node arrangements of a linear shape, triangle, octahedron, and cuboctahedron/anti-cuboctahedron. The syntheses of these complexes have been optimized and their structures determined by single-crystal X-ray diffraction. The results have shown that the shape and size of the resulting molecular architecture can be controlled by tuning the bridging angle and size of the linker, respectively. Functionalization of the linker can adjust the solubility of the ensuing molecular assembly but has little or no effect on the geometry of the product. Preliminary gas adsorption, spectroscopic, and electrochemical properties of selected members were also studied. The present work aims to enrich metal-containing supramolecular chemistry through the inclusion of well-characterized quadruply bonded Mo–Mo units into the structures, which can widen the prospect of additional electronic functionality, thereby leading to novel properties.

  8. Ligand bridging-angle-driven assembly of molecular architectures based on quadruply bonded Mo-Mo dimers.

    Science.gov (United States)

    Li, Jian-Rong; Yakovenko, Andrey A; Lu, Weigang; Timmons, Daren J; Zhuang, Wenjuan; Yuan, Daqiang; Zhou, Hong-Cai

    2010-12-15

    A systematic exploration of the assembly of Mo2(O2C-)4-based metal-organic molecular architectures structurally controlled by the bridging angles of rigid organic linkers has been performed. Twelve bridging dicarboxylate ligands were designed to be of different sizes with bridging angles of 0, 60, 90, and 120° while incorporating a variety of nonbridging functional groups, and these ligands were used as linkers. These dicarboxylate linkers assemble with quadruply bonded Mo-Mo clusters acting as nodes to give 13 molecular architectures, termed metal-organic polygons/polyhedra with metal cluster node arrangements of a linear shape, triangle, octahedron, and cuboctahedron/anti-cuboctahedron. The syntheses of these complexes have been optimized and their structures determined by single-crystal X-ray diffraction. The results have shown that the shape and size of the resulting molecular architecture can be controlled by tuning the bridging angle and size of the linker, respectively. Functionalization of the linker can adjust the solubility of the ensuing molecular assembly but has little or no effect on the geometry of the product. Preliminary gas adsorption, spectroscopic, and electrochemical properties of selected members were also studied. The present work aims to enrich metal-containing supramolecular chemistry through the inclusion of well-characterized quadruply bonded Mo-Mo units into the structures, which can widen the prospect of additional electronic functionality, thereby leading to novel properties.

  9. International collaboration and partnering in the supply chain as business opportunities for architectural firms

    NARCIS (Netherlands)

    Bos-de Vos, M.; Lieftink, B.; Volker, L.; Wamelink, J.W.F.

    2014-01-01

    Due to a shift towards market driven concepts, a risk allocation from the demand side to the supply side and the increasing competition with other skilled actors in the value chain, architectural firms have to adapt quickly to stay competitive. They need to innovate not only their products and

  10. Architectural slicing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2013-01-01

    Architectural prototyping is a widely used practice, concerned with taking architectural decisions through experiments with lightweight implementations. However, many architectural decisions are only taken when systems are already (partially) implemented. This is problematic in the context of architectural prototyping, since experiments with full systems are complex and expensive and thus architectural learning is hindered. In this paper, we propose a novel technique for harvesting architectural prototypes from existing systems, "architectural slicing", based on dynamic program slicing. Given a system and a slicing criterion, architectural slicing produces an architectural prototype that contains the elements in the architecture that are dependent on the elements in the slicing criterion. Furthermore, we present an initial design and implementation of an architectural slicer for Java.
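
    The slicing idea can be sketched in a few lines of Python (a toy dependency graph, not the paper's Java slicer): a slice is the transitive closure of dependencies reachable from the slicing criterion.

```python
# Sketch of architectural slicing over a dependency graph: starting from
# the slicing criterion, keep every element it transitively depends on.
# The element names and graph below are invented for illustration.
DEPENDS_ON = {
    "WebUI": ["OrderService"],
    "OrderService": ["Database", "PaymentGateway"],
    "ReportJob": ["Database"],
}

def architectural_slice(criterion):
    """Transitive closure of dependencies from the slicing criterion."""
    keep, stack = set(), list(criterion)
    while stack:
        element = stack.pop()
        if element not in keep:
            keep.add(element)
            stack.extend(DEPENDS_ON.get(element, []))
    return keep

print(sorted(architectural_slice({"WebUI"})))
# ReportJob is excluded: nothing in the criterion depends on it
```

    In the paper's setting the graph comes from dynamic analysis of the running system, so the slice reflects dependencies actually exercised, not just those declared.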

  11. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real-time systems need a time-predictable computing platform to enable static worst-case execution time (WCET) analysis. All performance-enhancing features need to be WCET analyzable. However, standard data caches containing heap-allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average-case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis-friendly design. Aiming for a time-predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  12. Generic Software Architecture for Launchers

    Science.gov (United States)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of generic software architecture for launchers is not so usual, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega for the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). Even if some of these reasons are still valid (e.g., the limited number of developments), today's increase in available CPU power makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of generic software architecture which could be envisaged for future launchers, based on the previously described principles and supported by model-driven engineering and automatic code generation.

  13. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design, and architecture of the computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design, and key technologies underlying some of the prominent databases (DBs) and web applications. We will mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  14. SUSTAINABLE ARCHITECTURE : WHAT ARCHITECTURE STUDENTS THINK

    OpenAIRE

    SATWIKO, PRASASTO

    2013-01-01

    Sustainable architecture has become a hot issue lately as the impacts of climate change become more intense. Architecture education has responded by integrating knowledge of sustainable design into its curricula. However, in real life, new buildings keep coming with designs that completely ignore sustainable principles. This paper discusses the results of two national competitions on sustainable architecture targeted at architecture students (conducted in 2012 and 2013). The results a...

  15. Running Parallel Discrete Event Simulators on Sierra

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jefferson, D. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  16. Architecture

    OpenAIRE

    Clear, Nic

    2014-01-01

    When discussing science fiction’s relationship with architecture, the usual practice is to look at the architecture “in” science fiction—in particular, the architecture in SF films (see Kuhn 75-143) since the spaces of literary SF present obvious difficulties as they have to be imagined. In this essay, that relationship will be reversed: I will instead discuss science fiction “in” architecture, mapping out a number of architectural movements and projects that can be viewed explicitly as scien...

  17. Usability and user driven innovation - unity or clash?

    DEFF Research Database (Denmark)

    Fronczek-Munter, Aneta

    2011-01-01

    Aim: To present different understandings of the concepts 'usability' and 'user driven innovation' and discuss if and how the built environment can benefit from these concepts and the unity of them. Approach and methodology: The paper is based on literature reviews of scientific journals and other influential publications within the academic fields of Facilities Management, Architecture and Engineering, Participatory Design and Software design. Outline: The paper will discuss different understandings of the concept 'usability' and its relation to 'user driven innovation', which depends on the academic field and area of professional application. The concept of usability has its roots in evaluations of consumer products and user interfaces of computer software. During the last 5-10 years there has been a new development of research in usability of buildings and workplaces. Recently researchers have...

  18. Architecture of petawatt-class z-pinch accelerators

    Directory of Open Access Journals (Sweden)

    W. A. Stygar

    2007-03-01

    Full Text Available We have developed an accelerator architecture that can serve as the basis of the design of petawatt-class z-pinch drivers. The architecture has been applied to the design of two z-pinch accelerators, each of which can be contained within a 104-m-diameter cylindrical tank. One accelerator is driven by slow (∼1 μs) Marx generators, which are a mature technology but which necessitate significant pulse compression to achieve the short pulses (≪1 μs) required to drive z pinches. The other is powered by linear transformer drivers (LTDs), which are less mature but produce much shorter pulses than conventional Marxes. Consequently, an LTD-driven accelerator promises to be (at a given pinch current and implosion time) more efficient and reliable. The Marx-driven accelerator produces a peak electrical power of 500 TW and includes the following components: (i) 300 Marx generators that comprise a total of 1.8×10^{4} capacitors, store 98 MJ, and erect to 5 MV; (ii) 600 water-dielectric triplate intermediate-store transmission lines, which also serve as pulse-forming lines; (iii) 600 5-MV laser-triggered gas switches; (iv) three monolithic radial-transmission-line impedance transformers, with triplate geometries and exponential impedance profiles; (v) a 6-level 5.5-m-diameter 15-MV vacuum insulator stack; (vi) six magnetically insulated vacuum transmission lines (MITLs); and (vii) a triple-post-hole vacuum convolute that adds the output currents of the six MITLs, and delivers the combined current to a z-pinch load. The accelerator delivers an effective peak current of 52 MA to a 10-mm-length z pinch that implodes in 95 ns, and 57 MA to a pinch that implodes in 120 ns. The LTD-driven accelerator includes monolithic radial transformers and a MITL system similar to those described above, but does not include intermediate-store transmission lines, multimegavolt gas switches, or a laser trigger system. Instead, this accelerator is driven by 210

  19. Architecture of petawatt-class z-pinch accelerators

    International Nuclear Information System (INIS)

    Stygar, William A.; Mazarakis, Michael Gerrassimos; Cuneo, Michael Edward; Leeper, Ramon Joe; Ives, H.C.; Headley, D.I.; Wagoner, Tim C.; Porter, John Larry Jr.

    2006-01-01

    We have developed an accelerator architecture that can serve as the basis of the design of petawatt-class z-pinch drivers. The architecture has been applied to the design of two z-pinch accelerators, each of which can be contained within a 104-m-diameter cylindrical tank. One accelerator is driven by slow (∼1 μs) Marx generators, which are a mature technology but which necessitate significant pulse compression to achieve the short pulses (≪1 μs) required to drive z pinches. The other is powered by linear transformer drivers (LTDs), which are less mature but produce much shorter pulses than conventional Marxes. Consequently, an LTD-driven accelerator promises to be (at a given pinch current and implosion time) more efficient and reliable. The Marx-driven accelerator produces a peak electrical power of 500 TW and includes the following components: (i) 300 Marx generators that comprise a total of 1.8×10⁴ capacitors, store 98 MJ, and erect to 5 MV; (ii) 600 water-dielectric triplate intermediate-store transmission lines, which also serve as pulse-forming lines; (iii) 600 5-MV laser-triggered gas switches; (iv) three monolithic radial-transmission-line impedance transformers, with triplate geometries and exponential impedance profiles; (v) a 6-level 5.5-m-diameter 15-MV vacuum insulator stack; (vi) six magnetically insulated vacuum transmission lines (MITLs); and (vii) a triple-post-hole vacuum convolute that adds the output currents of the six MITLs, and delivers the combined current to a z-pinch load. The accelerator delivers an effective peak current of 52 MA to a 10-mm-length z pinch that implodes in 95 ns, and 57 MA to a pinch that implodes in 120 ns. The LTD-driven accelerator includes monolithic radial transformers and a MITL system similar to those described above, but does not include intermediate-store transmission lines, multimegavolt gas switches, or a laser trigger system. Instead, this accelerator is driven by 210 LTD modules that include a total of 1×10⁶ capacitors and 5×10⁵ 200-kV electrically triggered gas switches. The LTD accelerator stores 182 MJ and produces a peak electrical power of 1000 TW. The accelerator delivers an effective peak current of 68 MA to a pinch that implodes in 95 ns, and 75 MA to a pinch that implodes in 120 ns. Conceptually straightforward upgrades to these designs would deliver even higher pinch currents and faster implosions.

  20. Communication and Memory Architecture Design of Application-Specific High-End Multiprocessors

    Directory of Open Access Journals (Sweden)

    Yahya Jan

    2012-01-01

    Full Text Available This paper is devoted to the design of communication and memory architectures of massively parallel hardware multiprocessors necessary for the implementation of highly demanding applications. We demonstrated that for massively parallel hardware multiprocessors the traditionally used flat communication architectures and multi-port memories do not scale well, and the memory and communication network influence on both the throughput and circuit area dominates the processors' influence. To resolve these problems and ensure scalability, we proposed to design highly optimized application-specific hierarchical and/or partitioned communication and memory architectures through exploring and exploiting the regularity and hierarchy of the actual data flows of a given application. Furthermore, we proposed data distribution and related data mapping schemes in the shared (global) partitioned memories with the aim of eliminating memory access conflicts, as well as ensuring that our communication design strategies will be applicable. We incorporated these architecture synthesis strategies into our quality-driven model-based multiprocessor design method and related automated architecture exploration framework. Using this framework, we performed a large series of experiments that demonstrate many important features of the synthesized memory and communication architectures. They also demonstrate that our method and related framework are able to efficiently synthesize well-scalable memory and communication architectures even for high-end multiprocessors. Gains as high as 12 times in performance and 25 times in area can be obtained when using hierarchical communication networks instead of flat networks. However, for high parallelism levels only the partitioned approach ensures scalability in performance.
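
    The conflict-free data-distribution idea mentioned above can be illustrated with a simple low-order interleaving scheme in Python (a generic textbook scheme, not necessarily the mapping used in the paper):

```python
# Sketch of conflict-free data distribution across memory banks:
# low-order interleaving maps consecutive array elements to different
# banks, so processors reading consecutive elements in the same cycle
# never contend for a bank. Parameters here are illustrative.
def bank_of(index, num_banks):
    """Low-order interleaving: element i lives in bank i mod num_banks."""
    return index % num_banks

NUM_BANKS = 4
# In each cycle, 4 processors read 4 consecutive elements.
for base in (0, 4, 8):
    accessed = {bank_of(base + p, NUM_BANKS) for p in range(NUM_BANKS)}
    assert len(accessed) == NUM_BANKS  # all accesses hit distinct banks
print("no bank conflicts")
```

    Real applications rarely have such regular access patterns, which is why the paper derives the mapping from the application's actual data flows rather than assuming one.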

  1. Modeling Architectural Patterns Using Architectural Primitives

    NARCIS (Netherlands)

    Zdun, Uwe; Avgeriou, Paris

    2005-01-01

    Architectural patterns are a key point in architectural documentation. Regrettably, there is poor support for modeling architectural patterns, because the pattern elements are not directly matched by elements in modeling languages, and, at the same time, patterns support an inherent variability that

  2. Stimulus-driven capture and contingent capture

    NARCIS (Netherlands)

    Theeuwes, J.; Olivers, C.N.L.; Belopolsky, A.V.

    2010-01-01

    Whether or not certain physical events can capture attention has been one of the most debated issues in the study of attention. This discussion is concerned with how goal-directed and stimulus-driven processes interact in perception and cognition. On one extreme of the spectrum is the idea that

  3. Manned/Unmanned Common Architecture Program (MCAP) net centric flight tests

    Science.gov (United States)

    Johnson, Dale

    2009-04-01

    Properly architected avionics systems can reduce the costs of periodic functional improvements, maintenance, and obsolescence. With this in mind, the U.S. Army Aviation Applied Technology Directorate (AATD) initiated the Manned/Unmanned Common Architecture Program (MCAP) in 2003 to develop an affordable, high-performance embedded mission processing architecture for potential application to multiple aviation platforms. MCAP analyzed Army helicopter and unmanned air vehicle (UAV) missions, identified supporting subsystems, surveyed advanced hardware and software technologies, and defined computational infrastructure technical requirements. The project selected a set of modular open systems standards and market-driven commercial-off-the-shelf (COTS) electronics and software, and developed experimental mission processors, network architectures, and software infrastructures supporting the integration of new capabilities, interoperability, and life cycle cost reductions. MCAP integrated the new mission processing architecture into an AH-64D Apache Longbow and participated in Future Combat Systems (FCS) network-centric operations field experiments in 2006 and 2007 at White Sands Missile Range (WSMR), New Mexico and at the Nevada Test and Training Range (NTTR) in 2008. The MCAP Apache also participated in PM C4ISR On-the-Move (OTM) Capstone Experiments 2007 (E07) and 2008 (E08) at Ft. Dix, NJ and conducted Mesa, Arizona local area flight tests in December 2005, February 2006, and June 2008.

  4. A scenario-driven approach for value, risk and cost analysis in system architecting for innovation

    NARCIS (Netherlands)

    Ionita, M.T.; America, P.H.M.; Hammer, D.K.; Obbink, J.H.; Trienekens, J.J.M.; Magee, J.; Szyperski, C.; Bosch, J.

    2004-01-01

    We present a quantitative method for scenario-driven value, risk, and cost analysis when proposing new system architectures for innovation projects. The method helps to articulate the relative benefits and/or disadvantages of the proposed set of scenarios in the early architecting phases of a new

  5. Event-Driven Messaging for Offline Data Quality Monitoring at ATLAS

    CERN Document Server

    Onyisi, Peter; The ATLAS collaboration

    2015-01-01

    During LHC Run 1, the information flow through the offline data quality monitoring in ATLAS relied heavily on chains of processes polling each other's outputs for handshaking purposes. This resulted in a fragile architecture with many possible points of failure and an inability to monitor the overall state of the distributed system. We report on the status of a project undertaken during the LHC shutdown to replace the ad hoc synchronization methods with a uniform message queue system. This enables the use of standard protocols to connect processes on multiple hosts; reliable transmission of messages between possibly unreliable programs; easy monitoring of the information flow; and the removal of inefficient polling-based communication.
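
    The pattern described above, replacing chains of polling processes with message passing, can be sketched with Python's standard library (an in-process toy; the ATLAS system uses a dedicated message-queue service, not this):

```python
# Sketch of polling-free handshaking: a producer publishes completion
# messages to a queue and the consumer blocks on it, instead of
# repeatedly polling the producer's outputs. Run numbers are invented.
import queue
import threading

events = queue.Queue()

def producer():
    for run in (1001, 1002):
        events.put({"run": run, "status": "processed"})  # publish, don't signal via files
    events.put(None)  # sentinel: no more messages

received = []
t = threading.Thread(target=producer)
t.start()
while (msg := events.get()) is not None:  # consumer blocks; no polling loop
    received.append(msg["run"])
t.join()
print(received)
# -> [1001, 1002]
```

    Beyond efficiency, the queue gives a single place to observe the information flow, which is what makes the distributed system's overall state monitorable.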

  6. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    Science.gov (United States)

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.
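
    A minimal Python sketch of the general idea of weighing QoS and QoC together in a control decision (hypothetical thresholds and rules invented for illustration; this is not the Event Based Quality Integral Cycle itself):

```python
# Sketch of a joint QoS/QoC decision: act only when the network is
# delivering timely data (QoS) AND the control error warrants action
# (QoC). Thresholds and action names are hypothetical.
def control_decision(qos_latency_ms, qoc_error):
    """Combine a QoS metric and a QoC metric into one control decision."""
    if qos_latency_ms > 50:
        return "hold"        # stale data: acting on it could destabilize
    if abs(qoc_error) < 0.1:
        return "no_action"   # within tolerance: save bandwidth/energy
    return "actuate"

print(control_decision(10, 0.5))   # fresh data, large error
print(control_decision(80, 0.5))   # stale data
print(control_decision(10, 0.05))  # fresh data, negligible error
```
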

  7. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    Directory of Open Access Journals (Sweden)

    Jose-Luis Poza-Lujan

    2015-02-01

    Full Text Available This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support for measuring QoS and QoC parameters. The novelty consists of simultaneously using the measured QoS and QoC parameters to make decisions about the control action with a new method called the Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the benefit of using QoS and QoC parameters jointly in distributed control systems.

  8. Assessing the Effectiveness of the Early Aberration Reporting System (EARS) for Early Event Detection of the H1N1 (Swine Flu) Virus

    Science.gov (United States)

    2010-09-01

    Figure captions recoverable from the extracted front matter: Figure 6, MCHD implementation of the EARS biosurveillance system (after Fricker & Hanni, 2010); Figure 8, confirmed and probable H1N1 case rate by age group from April 15–July 24, 2009 (from Hanni, 2009); … United States by age group from April 15 to July 24, 2009 (from Hanni, 2009); Figure 10, CDC Testing Recommendations for …

  9. Event driven adaptation, land use and human coping strategies

    DEFF Research Database (Denmark)

    Reenberg, Anette; Birch-Thomsen, Torben; Fog, Bjarne

    The paper focuses on assessing the wider perspectives of adaptive resource management strategies in former subsistence-agriculture societies in the SW Pacific. Firstly, we briefly introduce the theoretical context related to the livelihood framework, adaptation to socio-environmental change, and the concept of coupled human-environmental timelines. Secondly, taking as a point of departure a baseline characterization of Bellona Island derived from a comprehensive survey in the late 1960s and recent fieldwork in late 2006, we present the case of Bellona Island. Key issues addressed concern climatic events … perceive cause-effect relationships between societal and environmental events and their individual and collective management of resources. The coupled human-environment timelines are used to discuss ways in which the local communities' adaptive resource management strategies have been employed in the face …

  10. Nanosatellite Architectures for Improved Study of the Hydrologic Cycle

    Science.gov (United States)

    Blackwell, W. J.; Osaretin, I.; Cahoy, K.

    2012-12-01

    … spacecraft spinning mechanism provides a 60 RPM cross-track scan as the satellite orbits the Earth. Spatial, spectral, and radiometric performance is comparable to present state-of-the-art systems with costs exceeding $100M. The propulsion systems would be used to achieve formation flight (the satellites would be separated by approximately 500 ± 5 km) and to facilitate de-orbit. The cross-linked communication would provide: 1) reduced communications latency to ground, a key performance attribute that is currently lacking in present systems, leading to suboptimal utilization of observations of dynamic meteorological events such as tropical cyclones and hurricanes; and 2) data-driven sensing, whereby the lead sensor observes dynamic meteorological phenomena and sends a message to the following sensor to temporarily enable a very high resolution sensing mode (a higher sample rate, for example) to better capture the interesting event and preserve spacecraft resources for when they are most needed. The DOME constellation would allow global, high-resolution, persistent observations of the Earth's surface and atmosphere for studies of the hydrologic cycle and climate feedback processes.

  11. Architecture and Knowledge-Driven Self-Adaptive Security in Smart Space

    Directory of Open Access Journals (Sweden)

    Antti Evesti

    2013-03-01

    Full Text Available Dynamic and heterogeneous smart spaces cause challenges for security because it is impossible to anticipate all the possible changes at design-time. Self-adaptive security is an applicable solution for this challenge. This paper presents an architectural approach for security adaptation in smart spaces. The approach combines an adaptation loop, Information Security Measuring Ontology (ISMO and a smart space security-control model. The adaptation loop includes phases to monitor, analyze, plan and execute changes in the smart space. The ISMO offers input knowledge for the adaptation loop and the security-control model enforces dynamic access control policies. The approach is novel because it defines the whole adaptation loop and knowledge required in each phase of the adaptation. The contributions are validated as a part of the smart space pilot implementation. The approach offers reusable and extensible means to achieve adaptive security in smart spaces and up-to-date access control for devices that appear in the space. Hence, the approach supports the work of smart space application developers.

  12. Data driven processor 'Vertex Trigger' for B experiments

    International Nuclear Information System (INIS)

    Hartouni, E.P.

    1993-01-01

    Data Driven Processors (DDPs) are specialized computation engines configured to solve specific numerical problems, such as vertex reconstruction. The architecture of the DDP which is the subject of this talk was designed and implemented by W. Sippach and B.C. Knapp at Nevis Lab in the early 1980's. This particular implementation allows multiple parallel streams of data to provide input to a heterogeneous collection of simple operators whose interconnections form an algorithm. The local data flow control allows this device to execute algorithms extremely quickly, provided that care is taken in the layout of the algorithm. I/O rates of several hundred megabytes/second are routinely achieved, thus making DDPs attractive candidates for complex online calculations. The original question was: "Can a DDP reconstruct tracks in a Silicon Vertex Detector, find events with a separated vertex, and do it fast enough to be used as an online trigger?" Restating this inquiry as three questions and describing the answers to them will be the subject of this talk. The three specific questions are: (1) Can an algorithm be found which reconstructs tracks in a planar geometry with no magnetic field? (2) Can separated vertices be recognized in some way? (3) Can the algorithm be implemented in the Nevis-UMass DDP and execute in 10-20 μs?

  13. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    Science.gov (United States)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services.
The main component of the interoperability model is the comprehensive semantic description of the information
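The data-fusion idea sketched above — associating disparate monitoring streams — can be illustrated with a toy time-window correlation; the stream contents and the 60 s window are invented for the example:

```python
# Illustrative sketch: associate readings from two independent monitoring
# streams when their timestamps fall within a common time window.
def correlate(stream_a, stream_b, window_s=60.0):
    matches = []
    for t_a, val_a in stream_a:
        for t_b, val_b in stream_b:
            if abs(t_a - t_b) <= window_s:
                matches.append((t_a, val_a, val_b))
    return matches

seismic    = [(100.0, "M2.1"), (500.0, "M3.4")]   # (time in s, reading)
creepmeter = [(130.0, "0.8 mm"), (900.0, "0.1 mm")]
print(correlate(seismic, creepmeter))  # only the readings 30 s apart are associated
```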

  14. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    Science.gov (United States)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
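The channel-based model described above can be mimicked in miniature with threads and queues standing in for CSP processes and channels. This is an illustration of the modeling idea only, not the Cougaar or CSP-tool implementation, and the plugin names are invented:

```python
# Minimal stand-in for the CSP model: the blackboard and each plugin are
# processes (threads here); a queue plays the role of a CSP channel.
import queue
import threading

def plugin(name, to_blackboard):
    # A plugin publishes one fact to the blackboard over the channel.
    to_blackboard.put((name, f"fact from {name}"))

def blackboard(to_blackboard, n_plugins, store):
    for _ in range(n_plugins):
        name, fact = to_blackboard.get()  # rendezvous with one plugin
        store[name] = fact

channel = queue.Queue()
store = {}
bb = threading.Thread(target=blackboard, args=(channel, 2, store))
bb.start()
for name in ("planner", "executor"):
    threading.Thread(target=plugin, args=(name, channel)).start()
bb.join()
print(sorted(store))  # both plugins' publications reached the blackboard
```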

  15. Discrete Event Simulation of Distributed Team Communication

    Science.gov (United States)

    2012-03-22

    … performs, and auditory information that is provided through multiple audio devices with speech response. This paper extends previous discrete event workload … (2008, pg. 1) notes that “Architecture modeling furnishes abstractions for use in managing complexities, allowing engineers to visualise the proposed …

  16. System architecture for high speed reconstruction in time-of-flight positron tomography

    International Nuclear Information System (INIS)

    Campagnolo, R.E.; Bouvier, A.; Chabanas, L.; Robert, C.

    1985-06-01

    A new generation of Time Of Flight (TOF) positron tomographs with high resolution and high count rate capabilities is under development in our group. After a short recall of the data acquisition process and image reconstruction in a TOF PET camera, we present the data acquisition system, which achieves a data transfer rate of 0.8 million events per second, or more if necessary, in list mode. We describe the reconstruction process based on a five-stage pipeline architecture using home-made processors. The expected performance with this architecture is a reconstruction time of six seconds per image (256x256 pixels) of one million events. This time could be reduced to 4 seconds. We conclude with the future developments of the system

  17. Views of CMS Event Data Objects, Files, Collections, Virtual Data Products

    CERN Document Server

    Holtman, Koen

    2001-01-01

    The CMS data grid system will store many types of data maintained by the CMS collaboration. An important type of data is the event data, which is defined in this note as all data that directly represents simulated, raw, or reconstructed CMS physics events. Many views on this data will exist simultaneously. To a CMS physics code implementer this data will appear as C++ objects, to a tape robot operator the data will appear as files. This note identifies different views that can exist, describes each of them, and interrelates them by placing them into a vertical stack. This particular stack integrates several existing architectural structures, and is therefore a plausible basis for further prototyping and architectural work. This document is intended as a contribution to, and as common (terminological) reference material for, the CMS architectural efforts and for the Grid projects PPDG, GriPhyN, and the EU DataGrid.

  18. Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.

    Science.gov (United States)

    Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-12-01

    Developed under the Intelligence Advanced Research Projects Activity (IARPA) Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events over the past 2 years.
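The wire-format aspect of such a pipeline can be sketched with stdlib pieces; EMBERS uses ZeroMQ sockets as the transport, for which an in-process queue stands in here, and the message field names are invented:

```python
# Sketch of the JSON-on-the-wire pattern between shared-nothing stages.
import json
import queue

wire = queue.Queue()  # stands in for a ZeroMQ PUSH/PULL socket pair

# Upstream stage: serialize each prediction as a self-describing JSON message.
for country in ("AR", "BR", "MX"):
    wire.put(json.dumps({"type": "civil_unrest_forecast",
                         "country": country,
                         "probability": 0.5}))

# Downstream stage: no shared state, just messages taken off the wire.
received = [json.loads(wire.get()) for _ in range(3)]
print([m["country"] for m in received])
```

Because each message is self-describing text, stages can be written in different languages and restarted independently, which is the point of the loosely coupled design.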

  19. Modeling, realization and evaluation of a parallel architecture for the data acquisition in multidetectors

    International Nuclear Information System (INIS)

    Guirande, Ph.; Aleonard, M-M.; Dien, Q-T.; Pedroza, J-L.

    1997-01-01

    The efficiency increase in 4π arrays (EUROGAM, EUROBALL, DIAMANT) is achieved by an increase in granularity, hence in the event counting rate in the acquisition system. Consequently, an evolution of the architecture of the readout systems, coding and software is necessary. To achieve the required evaluation we have implemented a parallel architecture to check the quality of the events. The first application of this architecture was an improved data acquisition system for the DIAMANT multidetector. The DIAMANT data acquisition system is based on an ensemble of VME cards which must manage the event readout, storage on magnetic media, and histogram construction. The ensemble consists of processors distributed in a network, a workstation to control the experiment, and a display system for spectra and arrays. In such an architecture the VME bus quickly becomes a performance bottleneck, not only for data transfer but also for the coordination of the different processors. The parallel architecture used eases the load on the VME bus. It is based on three C40 DSPs (Digital Signal Processors) mounted on a commercial (LSI) VME board, provided with an external bus used to read the raw data from an interface card (ROCVI) between the 32-bit ECL bus and the real-time VME-based encoders. The tests performed evidenced jamming after data exchanges between the processors using two communication lines. Analysis of this problem indicated the necessity of dynamically reallocating tasks to avoid this blocking. Intrinsic evaluation (i.e., without transfer on the VME bus) has been carried out for two parallel topologies (processor farm and tree). The simulation software permitted the generation of event packets. The obtained rates are essentially equivalent (6 MB/s) independent of topology. The farm topology was chosen because it is simpler to implement. The load evaluation reduced the rate in 'simplex' communication mode to 5.3 MB/s and

  20. LHCb Event display

    CERN Document Server

    Trisovic, Ana

    2014-01-01

    The LHCb Event Display was made for educational purposes at the European Organization for Nuclear Research, CERN in Geneva, Switzerland. The project was implemented as a stand-alone application using C++ and ROOT, a framework developed by CERN for data analysis. This paper outlines the development and architecture of the application in detail, as well as the motivation for the development and the goals of the exercise. The application focuses on the visualization of events recorded by the LHCb detector, where an event represents a set of charged particle tracks in one proton-proton collision. Every particle track is coloured by its type and can be selected to see its essential information such as mass and momentum. The application allows students to save this information and calculate the invariant mass for any pair of particles. Furthermore, the students can use additional calculating tools in the application and build up a histogram of these invariant masses. The goal for the students is to find a $D^0$ par...
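The invariant-mass calculation the students perform follows directly from the particles' four-momenta. A minimal sketch, in natural units (c = 1), with sample momenta invented for illustration:

```python
# Invariant mass of a pair of particles from their four-momenta.
import math

def invariant_mass(p1, p2):
    # p = (E, px, py, pz); m^2 = (E1 + E2)^2 - |p1 + p2|^2
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(E * E - (px * px + py * py + pz * pz), 0.0))

# Two back-to-back massless particles of energy 1 GeV each:
# momenta cancel, so the pair's invariant mass is the full 2 GeV.
print(invariant_mass((1.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, -1.0)))  # 2.0
```

Filling a histogram with this quantity over many candidate pairs is what lets the students see a resonance peak above the combinatorial background.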

  1. Architectural prototyping

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Hansen, Klaus Marius

    2004-01-01

    A major part of software architecture design is learning how specific architectural designs balance the concerns of stakeholders. We explore the notion of "architectural prototypes", correspondingly architectural prototyping, as a means of using executable prototypes to investigate stakeholders...

  2. Smart House Interconnected System Architecture

    Directory of Open Access Journals (Sweden)

    ALBU Răzvan-Daniel

    2017-05-01

    Full Text Available In this research work we present the architecture of an intelligent house system capable of detecting accidents caused by floods or gas and of protecting against unauthorized access or burglary. Our system is not just an alarm; it continuously monitors the house and reports its state over the internet. Most of the current smart house systems available on the market alarm the user via email or SMS when an unwanted event happens. Thus, the user assumes that the house is not affected if an alarm message is not received. This is not always true, since the monitoring system components can themselves be damaged, or the entire system can become unable to send an alarm message even if it detects an unwanted event. This article also presents details about both the hardware and software implementation.
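The design point above — that silence must not be read as safety — is commonly handled with a heartbeat check; a minimal sketch, with the timestamps, interval, and function name invented for illustration:

```python
# Illustrative heartbeat check: the monitor treats a *missing* status report
# as a fault instead of assuming that silence means the house is fine.
def house_state(last_report_time, now, max_silence_s=30.0, last_status="ok"):
    if now - last_report_time > max_silence_s:
        # The monitoring chain itself may have failed.
        return "monitoring_lost"
    return last_status

print(house_state(last_report_time=100.0, now=110.0))           # 'ok'
print(house_state(last_report_time=100.0, now=200.0))           # 'monitoring_lost'
print(house_state(100.0, 110.0, last_status="flood_detected"))  # 'flood_detected'
```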

  3. Implementing An Image Understanding System Architecture Using Pipe

    Science.gov (United States)

    Luck, Randall L.

    1988-03-01

    This paper will describe PIPE and how it can be used to implement an image understanding system. Image understanding is the process of developing a description of an image in order to make decisions about its contents. The tasks of image understanding are generally split into low-level vision and high-level vision. Low-level vision is performed by PIPE, a high-performance parallel processor with an architecture specifically designed for processing video images at up to 60 fields per second. High-level vision is performed by one of several types of serial or parallel computers, depending on the application. An additional processor called ISMAP performs the conversion from iconic image space to symbolic feature space. ISMAP plugs into one of PIPE's slots and is memory mapped into the high-level processor. Thus it forms the high-speed link between the low- and high-level vision processors. The mechanisms for bottom-up, data-driven processing and top-down, model-driven processing are discussed.

  4. Architectural communication: Intra and extra activity of architecture

    Directory of Open Access Journals (Sweden)

    Stamatović-Vučković Slavica

    2013-01-01

    Full Text Available Apart from a brief overview of architectural communication viewed from the standpoint of information theory and semiotics, this paper contains two forms of dualistically viewed architectural communication. The duality denotation/connotation ("primary" and "secondary" architectural communication) is one of the semiotic postulates taken from Umberto Eco, who viewed architectural communication as a semiotic phenomenon. In addition, architectural communication can be viewed as an intra and an extra activity of architecture, where the overall activity of the edifice performed through its spatial manifestation may be understood as an act of communication. In that respect, the activity may be perceived as the "behavior of architecture", which corresponds to Lefebvre's production of space.

  5. The iCub Software Architecture: evolution and lessons learned

    Directory of Open Access Journals (Sweden)

    Lorenzo eNatale

    2016-04-01

    Full Text Available The complexity of humanoid robots is increasing with the availability of new sensors, embedded CPUs and actuators. This wealth of technologies allows researchers to investigate new problems like whole-body force control, multi-modal human-robot interaction and sensory fusion. Under the hood of these robots, the software architecture has an important role: it allows researchers to access the robot's functionality while focusing primarily on their research problems, and it supports code reuse to minimize development and debugging, especially when new hardware becomes available. But more importantly it allows increasing the complexity of the experiments that can be implemented before system integration becomes unmanageable and debugging draws more resources than research itself. In this paper we illustrate the software architecture of the iCub humanoid robot and the software engineering best practices that have emerged, driven by the needs of our research community. We describe the latest developments at the level of the middleware supporting interface definition and automatic code generation, logging, ROS compatibility and channel prioritization. We show the robot abstraction layer and how it has been modified to better address the requirements of the users and to support new hardware as it became available. We also describe the testing framework we have recently adopted for developing code using a test-driven methodology. We conclude the paper by discussing the lessons we have learned during the past eleven years of software development on the iCub humanoid robot.

  6. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    … to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have … proportions, to organize the process on site choosing either one-room wall components or several-room wall components, either horizontally or vertically. Combined with the seamless joint, playing with these possibilities the new industrialized architecture can deliver variations in choice of solutions … for retrofit design. If we add the question of the installations, e.g. ventilation, to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer story telling about new and smart system-based thinking behind architectural expression.

  7. Architectural freedom and industrialized architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    … to explain that architecture can be thought of as a complex and diverse design through customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have … expression in the specific housing area. It is the aim of this article to expand the different design strategies which architects can use, to give the individual project attitudes and designs with architectural quality. Through the customized component production it is possible to choose different … for retrofit design. If we add the question of the installations, e.g. ventilation, to this systematic thinking of building technique we get a diverse and functional architecture, thereby creating a new and clearer story telling about new and smart system-based thinking behind architectural expression.

  8. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    Architectural freedom and industrialized architecture. Inge Vestergaard, Associate Professor, Cand. Arch., Aarhus School of Architecture, Denmark, Noerreport 20, 8000 Aarhus C, telephone +45 89 36 0000, e-mail inge.vestergaard@aarch.dk. Based on the repetitive architecture from the "building boom" 1960 … customization, telling exactly the revitalized story about the change to a contemporary sustainable and better performing expression in direct relation to the given context. Through the last couple of years we have in Denmark been focusing on a more sustainable and low-energy building technique, which also includes … to the building physics problems a new industrialized period has started, based on lightweight elements basically made of wooden structures, faced with different suitable materials meant for individual expression for the specific housing area. It is the purpose of this article to widen the different design …

  9. CHANGING PARADIGMS IN SPACE THEORIES: Recapturing 20th Century Architectural History

    Directory of Open Access Journals (Sweden)

    Gül Kaçmaz Erk

    2013-03-01

    Full Text Available The concept of space entered architectural history as late as 1893. Studies in art opened up the discussion, and it has been studied in various ways in architecture ever since. This article aims to instigate an additional reading of architectural history, one that is not supported by "isms" but based on space theories in the 20th century. The objectives of the article are to bring the concept of space and its changing paradigms to the attention of architectural researchers, to introduce a conceptual framework to classify and clarify theories of space, and to enrich the discussions on 20th century architecture through theories that are beyond styles. The introduction of space in architecture will revolve around subject-object relationships, three-dimensionality and senses. Modern space will be discussed through concepts such as empathy, perception, abstraction, and geometry. A scientific approach will follow to study the concept of place through environment, event, behavior, and design methods. Finally, the research will look at contemporary approaches related to digitally supported space via concepts like reality-virtuality, mediated experience, and relationship with machines.

  10. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-06-01

    Full Text Available Spike-based neuromorphic sensors, such as retinas and cochleas, change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution on the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution on the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.

  11. La Candelaria Neighborthood City of Guatemala, Forgotten Architectural Heritage

    Directory of Open Access Journals (Sweden)

    Leonel Alberto de la Roca Coronado

    2016-07-01

    Full Text Available The new Guatemala de la Asunción has been impacted by climate change and, owing to its geographical location, tectonic plates, and volcanic soils, the country is permanently threatened by tragic events that occur suddenly and recurrently through natural hazards (volcanic eruptions, earthquakes, hurricanes, storms, floods, landslides). Because of the age of the District of La Candelaria (it is the second settlement in the Valle de la Ermita, following the transfer of the city in January 1776), and because it was one of the areas damaged by the earthquake of February 4, 1976, the architectural heritage of the District of La Candelaria is constantly at risk. In the 21st century, the problems of the nationwide architectural heritage have additional components that make it more vulnerable to ruin (social, economic and political deterioration, insecurity), which, added to the poor physical state of the buildings, reflects the lack of maintenance, scant financial support, and little interest of the authorities in applying the laws for the protection of immovable cultural heritage assets. Within the District of La Candelaria there are homes and architectural remains whose current state could be improved. Guatemala needs the State and private institutions to join together to ensure the prevention and safeguarding of this heritage.

  12. Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools

    Science.gov (United States)

    Januszkiewicz, Krystyna; Banachowicz, Marta

    2017-10-01

    The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in an integrated structural and architectural design in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects which involve optimization as a part of the original design process are presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006; SANAA's Learning Centre at EPFL in Lausanne, Switzerland, 2008; among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions used by Nature: works optimally shaped and forming their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.

  13. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data

    Directory of Open Access Journals (Sweden)

    Evangelos Stromatias

    2017-06-01

    Full Text Available This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.

  14. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data.

    Science.gov (United States)

    Stromatias, Evangelos; Soto, Miguel; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2017-01-01

    This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.
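    The core idea of the abstract above, collapsing spiking activity into per-neuron spike-count histograms and training a classifier on those frames with stochastic gradient descent, can be sketched as follows. This is an illustrative toy (not the authors' code): the spike data, network size, and learning-rate values are invented for the example.

```python
# Sketch of the histogram-based idea: spike trains from prior SNN layers are
# collapsed into per-neuron spike-count histograms, and a linear classifier
# is trained on these "frames" with plain stochastic gradient descent.
import numpy as np

rng = np.random.default_rng(0)

def spike_histogram(spike_events, n_neurons):
    """Count spikes per neuron; spike_events is a list of firing-neuron indices."""
    hist = np.zeros(n_neurons)
    for neuron_idx in spike_events:
        hist[neuron_idx] += 1
    return hist

# Toy data: class 0 fires mostly in neurons 0-4, class 1 in neurons 5-9.
n_neurons, n_classes = 10, 2
samples, labels = [], []
for label in (0, 1):
    for _ in range(50):
        lo = label * 5
        events = rng.integers(lo, lo + 5, size=30).tolist()
        samples.append(spike_histogram(events, n_neurons))
        labels.append(label)
X, y = np.array(samples), np.array(labels)

# Linear classifier trained with SGD on a softmax cross-entropy loss.
W = np.zeros((n_neurons, n_classes))
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(len(X)):
        logits = X[i] @ W
        p = np.exp(logits - logits.max())
        p /= p.sum()
        p[y[i]] -= 1.0                      # gradient of cross-entropy wrt logits
        W -= lr * np.outer(X[i], p)

accuracy = ((X @ W).argmax(axis=1) == y).mean()
```

On this separable toy data the classifier reaches full training accuracy; the paper's contribution lies in feeding such histograms from real DVS spike streams and leaky integrate-and-fire dynamics, which this sketch does not model.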

  15. A new Bayesian Event Tree tool to track and quantify volcanic unrest and its application to Kawah Ijen volcano

    Science.gov (United States)

    Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Caudron, Corentin; Marzocchi, Warner; Suparjan

    2016-07-01

    Although most volcanic hazard studies focus on magmatic eruptions, volcanic hazardous events can also occur when no migration of magma can be recognized. Examples are tectonic and hydrothermal unrest that may lead to phreatic eruptions. Recent events (e.g., the Ontake eruption in September 2014) have demonstrated that phreatic eruptions are still hard to forecast, despite being potentially very hazardous. For these reasons, it is of paramount importance to identify indicators that define the condition of nonmagmatic unrest, in particular for hydrothermal systems. Often, this type of unrest is driven by the movement of fluids, requiring alternative monitoring setups beyond the classical seismic-geodetic-geochemical architectures. Here we present a new version of the probabilistic BET (Bayesian Event Tree) model, specifically developed to include the forecasting of nonmagmatic unrest and related hazards. The structure of the new event tree differs from the previous schemes by adding a specific branch to detail nonmagmatic unrest outcomes. A further goal of this work is to provide a user-friendly, open-access, and straightforward tool to handle the probabilistic forecast and visualize the results as possible support during a volcanic crisis. The new event tree and tool are applied to Kawah Ijen stratovolcano, Indonesia, as an illustrative application. In particular, the tool is set up on the basis of monitoring data for the learning period 2000-2010, and is then blindly applied to the test period 2010-2012, during which significant unrest phases occurred.
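    The event-tree mechanics described above can be illustrated in a few lines. This is a toy sketch, not the BET software: the branch names, the probability values, and the Beta-Binomial update rule for a node are all illustrative assumptions, chosen only to show how a leaf probability multiplies down a branch and how monitoring counts can update a node.

```python
# Toy event-tree sketch: the probability of reaching a leaf outcome is the
# product of conditional probabilities along its branch, and each node
# probability can be updated from monitoring data (here via a Beta prior).
from functools import reduce

# Hypothetical branch: unrest -> nonmagmatic unrest -> phreatic eruption
branch = {
    "P(unrest)": 0.5,
    "P(nonmagmatic | unrest)": 0.4,
    "P(phreatic eruption | nonmagmatic)": 0.1,
}

def leaf_probability(branch):
    """Multiply conditional probabilities down one branch of the tree."""
    return reduce(lambda a, b: a * b, branch.values(), 1.0)

def beta_update(prior_a, prior_b, successes, trials):
    """Conjugate Beta-Binomial update: posterior mean of a node probability."""
    post_a = prior_a + successes
    post_b = prior_b + trials - successes
    return post_a / (post_a + post_b)

p_leaf = leaf_probability(branch)         # 0.5 * 0.4 * 0.1 = 0.02
# Update P(unrest) after observing unrest in 3 of 10 monitored months,
# starting from a uniform Beta(1, 1) prior:
p_unrest_post = beta_update(1, 1, 3, 10)  # (1 + 3) / (2 + 10) = 1/3
```

The real BET model adds many more branches, anomaly thresholds on monitoring parameters, and epistemic uncertainty on each node, but the multiply-and-update structure is the same.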

  16. A Case Study of Horizontal Reuse in a Project-Driven Organisation

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Røn, Henrik

    2000-01-01

    This experience paper presents observations, lessons learned, and recommendations based on a case study of reuse. The case study is concerned with the development, maturation, and reuse of a business domain independent software component (horizontal reuse) in a project-driven organisation that has...... knowledge is transferred within an organisation; (c) design patterns can be as risky as they can be beneficial; and (d) there is more to architectural mismatch than “merely” packaging mismatch....

  17. Alexandre Chemetoff: Visits; town and territory - architecture in dialogue

    DEFF Research Database (Denmark)

    Braae, Ellen Marie

    2010-01-01

    The launch of Visits could hardly have been better timed. The financial crisis and the slowing down of a long developer-driven building boom create the best imaginable backdrop for a grounded and reflexive practice regarding the transformation of the urban landscape as exemplified by Chemetoff. H...... and his office represent a long-awaited alternative to smashy and spectacular masterplan- and object-focused architecture. This review is also a warning to the (one hopes) many readers of the book: you’ll find neither easy answers nor flashy renderings of fixed goals....

  18. Dynamically adaptive data-driven simulation of extreme hydrological flows

    KAUST Repository

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2017-01-01

    evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses

  19. Enterprise architecture evaluation using architecture framework and UML stereotypes

    Directory of Open Access Journals (Sweden)

    Narges Shahi

    2014-08-01

    Full Text Available There is an increasing need for enterprise architecture in organizations with complicated systems, various processes, support for information technology, and organizational units whose elements maintain complex relationships. Enterprise architecture is so effective that its non-use in organizations is regarded as an institutional inability to manage information technology efficiently. The enterprise architecture process generally consists of three phases: strategic programming of information technology, enterprise architecture programming, and enterprise architecture implementation. Each phase must be implemented sequentially, and a single flaw in any phase may result in a flaw in the whole architecture and, consequently, in extra cost and time. If a model is mapped for the issue and evaluated before enterprise architecture implementation in the second phase, possible flaws in the implementation process are prevented. In this study, the processes of enterprise architecture are illustrated through UML diagrams, and the architecture is evaluated in the programming phase by transforming the UML diagrams into Petri nets. The results indicate that the high costs of the implementation phase will be reduced.
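    The evaluation step described above, checking a process model by executing it as a Petri net, can be sketched minimally. This is an illustrative toy, not the paper's tool: the three-phase process, the place names, and the firing loop are assumptions made only to show how token flow verifies that a modelled process can reach completion.

```python
# Minimal Petri net sketch: places hold tokens, and a transition fires when
# every input place has a token, moving tokens to its output places. Running
# the net checks whether the modelled process can reach its final state.
def enabled(marking, transition):
    return all(marking.get(p, 0) >= 1 for p in transition["in"])

def fire(marking, transition):
    m = dict(marking)
    for p in transition["in"]:
        m[p] -= 1
    for p in transition["out"]:
        m[p] = m.get(p, 0) + 1
    return m

# Hypothetical three-phase enterprise-architecture process as a transition chain.
transitions = [
    {"name": "strategic_planning", "in": ["start"], "out": ["planned"]},
    {"name": "ea_programming", "in": ["planned"], "out": ["programmed"]},
    {"name": "ea_implementation", "in": ["programmed"], "out": ["done"]},
]

marking = {"start": 1}
trace = []
progress = True
while progress:
    progress = False
    for t in transitions:
        if enabled(marking, t):
            marking = fire(marking, t)
            trace.append(t["name"])
            progress = True

# The trace records the firing order; a token in "done" means the process
# modelled by the net can complete.
```

A real UML-to-Petri-net evaluation would generate the places and transitions mechanically from activity diagrams and then analyse reachability, deadlocks, and timing rather than a single linear chain.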

  20. Advanced and secure architectural EHR approaches.

    Science.gov (United States)

    Blobel, Bernd

    2006-01-01

    Electronic Health Records (EHRs) provided as a lifelong patient record are advancing towards core applications of distributed and co-operating health information systems and health networks. For meeting the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and model driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model - Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives comprise the enterprise view (business processes, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. These views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical, technical, etc. Thus, security-related component models reflecting all the views mentioned have to be established for enabling both application and communication security services as an integral part of the system's architecture. Besides the decomposition and simplification of systems according to the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on properties of basic components to form a more complex structure. The resulting models describe both the structure and behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles.
In that context

  1. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  2. Lightweight enterprise architectures

    CERN Document Server

    Theuerkorn, Fenix

    2004-01-01

    STATE OF ARCHITECTURE: Architectural Chaos; Relation of Technology and Architecture; The Many Faces of Architecture; The Scope of Enterprise Architecture; The Need for Enterprise Architecture; The History of Architecture; The Current Environment; Standardization Barriers; The Need for Lightweight Architecture in the Enterprise; The Cost of Technology; The Benefits of Enterprise Architecture; The Domains of Architecture; The Gap between Business and IT; Where Does LEA Fit?; LEA's Framework; Frameworks, Methodologies, and Approaches; The Framework of LEA; Types of Methodologies; Types of Approaches; Actual System Environmen

  3. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  4. A flexible and testable software architecture: applying presenter first to a device server for the DOOCS accelerator control system of the European XFEL

    International Nuclear Information System (INIS)

    Beckmann, A.; Karabekyan, S.; Pflüger, J.

    2012-01-01

    Presenter First (PF) uses a variant of the Model View Presenter design pattern to add implementation flexibility and to improve the testability of complex event-driven applications. It was introduced in the context of GUI applications, but can easily be adapted to server applications. This paper describes how the Presenter First methodology is used to develop a device server for the Programmable Logic Controllers (PLCs) of the European XFEL undulator systems, which are Windows PCs running PLC software from Beckhoff. The server implements a ZeroMQ message interface to the PLC, allowing the DOOCS accelerator control system of the European XFEL to exchange data with the PLC by sending messages over the network. Our challenge is to develop a well-tested device server with a flexible architecture that allows integrating the server into other accelerator control systems like EPICS. (author)
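    The Model View Presenter variant mentioned above can be sketched in a few classes. This is an illustrative toy, not the XFEL server code: the class names and event hooks are invented to show the pattern's key property, namely that the presenter owns all wiring between a passive model and a passive view, so each piece can be tested in isolation.

```python
# Presenter First sketch: model and view expose events but know nothing of
# each other; the presenter subscribes to both and mediates all updates.
class Model:
    def __init__(self):
        self._value = None
        self.on_changed = lambda: None       # event hook, set by the presenter

    def set_value(self, value):
        self._value = value
        self.on_changed()

    def get_value(self):
        return self._value

class View:
    def __init__(self):
        self.displayed = None
        self.on_input = lambda text: None    # event hook, set by the presenter

    def show(self, text):
        self.displayed = text

    def simulate_user_input(self, text):     # stands in for a GUI/network event
        self.on_input(text)

class Presenter:
    def __init__(self, model, view):
        self.model, self.view = model, view
        view.on_input = model.set_value                  # view event -> model
        model.on_changed = lambda: view.show(            # model event -> view
            f"value = {model.get_value()}")

model, view = Model(), View()
presenter = Presenter(model, view)
view.simulate_user_input("42")
# After the round trip, the view displays "value = 42".
```

In a server setting, as in the abstract, the "view" role is played by the message interface (e.g. a ZeroMQ socket handler) and the "model" by the PLC state, which is what makes the server testable without real hardware.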

  5. Data-driven approach for creating synthetic electronic medical records

    Directory of Open Access Journals (Sweden)

    Moniz Linda

    2010-10-01

    Full Text Available Abstract Background New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. Methods This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. Results We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. Conclusions A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious

  6. Data-driven approach for creating synthetic electronic medical records.

    Science.gov (United States)

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

    New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. The pilot synthetic background records were in the 4
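    The three-step method in the abstract above can be sketched as a pipeline. This is a hypothetical toy, not the authors' system: the care-pattern table is a stub (in the paper it is mined from real EMR data), and the identity fields, pattern contents, and adaptation rule are invented for illustration.

```python
# Sketch of the three-step generation method: (1) create a synthetic patient
# identity, (2) select a care pattern identified for a similar health problem,
# (3) adapt that pattern to the synthetic patient.
import random

random.seed(1)  # deterministic toy output

# Stub standing in for care patterns mined from real EMRs (step 2's input).
CARE_PATTERNS = {
    "tularemia": ["visit", "blood culture order", "chest radiograph", "antibiotics"],
    "background": ["visit", "routine labs"],
}

def generate_identity(patient_id):
    """Step 1: synthetic identity and basic information (here: the 4-11 yr group)."""
    return {"id": patient_id, "age": random.randint(4, 11), "sex": random.choice("MF")}

def select_care_pattern(condition):
    """Step 2: care pattern the synthetic patient would receive."""
    return list(CARE_PATTERNS[condition])

def adapt_pattern(identity, pattern):
    """Step 3: adapt the pattern to the patient (here: just tag events with the id)."""
    return [{"patient": identity["id"], "event": ev} for ev in pattern]

def generate_emr(patient_id, condition):
    identity = generate_identity(patient_id)
    return identity, adapt_pattern(identity, select_care_pattern(condition))

identity, record = generate_emr("P-001", "tularemia")
# record is a list of clinical events, each tagged with patient "P-001"
```

The real method additionally generates visit timing, laboratory and radiology results, and validates the output against expert review; the sketch only shows the identity/pattern/adaptation skeleton.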

  7. TEACHING DESIGN AT THE LIMITS OF ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Nikos A. Salingaros

    2010-07-01

    Full Text Available Pre-industrial architects inherently knew the effectual dimension of design through its materiality, detail, and form. Until now, the intellectual dichotomy of human thinking held that mind and body were separate entities, drawing a distinction between reasoned thought and feeling. The early Greek philosophers distinguished between these two realms. Theories on beauty, the human aesthetic impulse, and design were divided along the objective and subjective lines for centuries. In more current architectural terms, the objective dimension of industry gave structure and perceived virtue to the modernist paradigm, while at the same time clearing the way (tabula rasa for the rampant subjectivity we now see in the idiosyncratic expressions of so many contemporary architects. By revealing the relationship between our physical and mental processes, neuroscience re-situates the debate on physical reality well outside the intellectual enterprise of aesthetically driven design. Clear measures can now be evidenced, documented, and applied to establish a new, more effective, and humanly engaging way to build. This new architecture draws upon those mechanisms of neuro-connectivity that help us to feel safe and secure.   From this knowledge we have developed a new model for building/rebuilding the world, called Intelligence-Based Design. Intelligence-Based Design is the purposeful manipulation of the built environment to engage humans in an essential manner through complex organized information. Intelligence-Based Theory evidences the direct neurological evaluations of surface, structure, pattern, texture, and form, etc., and maintains that our sense of well being is established through positive neuro-engagement with the physical world at the deepest level common to all people, i.e. “Innate Intelligence.” This paper describes a senior architectural design studio taught using the precepts of Intelligence-Based Design. We describe our methodology, and the

  8. Indigenous architecture as a context-oriented architecture, a look at ...

    African Journals Online (AJOL)

    What has become problematic as the achievement of international style and globalization of architecture during the time has been the purely technological look at architecture, and the architecture without belonging to a place. In recent decades, the topic of sustainable architecture and reconsidering indigenous architecture ...

  9. Forecasting Turbine Icing Events

    DEFF Research Database (Denmark)

    Davis, Neil; Hahmann, Andrea N.; Clausen, Niels-Erik

    2012-01-01

    In this study, we present a method for forecasting icing events. The method is validated at two European wind farms with known icing events. The icing model used was developed using current ice accretion methods and newly developed ablation algorithms. The model is driven by inputs from the WRF...... mesoscale model, allowing for both climatological estimates of icing and short term icing forecasts. The current model was able to detect periods of icing reasonably well at the warmer site. However, at the cold climate site, the model was not able to remove ice quickly enough, leading to large ice

  10. Intercorporate Security Event Correlation

    Directory of Open Access Journals (Sweden)

    D. O. Kovalev

    2010-03-01

    Full Text Available Security controls are prone to false positives and false negatives, which can lead to unwanted reputation losses for the bank. A reputational database within the security operations center (SOC) and intercorporate correlation of security events are offered as a solution to increase attack detection fidelity. The thesis introduces the definition and structure of reputation, architectures for reputational exchange, and the place of intercorporate correlation in overall SOC correlation analysis.

  11. The flaws of fragmented financial standard setting: why substantive economic debates matter for the architecture of global governance

    NARCIS (Netherlands)

    Mügge, D.; Perry, J.

    2014-01-01

    In the half decade following the 2007 financial crisis, the reform of global financial governance was driven by two separate policy debates: one on the substantive content of regulations, the other on the organizational architecture of their governance. The separation of the two debates among

  12. Relaxation near Supermassive Black Holes Driven by Nuclear Spiral Arms: Anisotropic Hypervelocity Stars, S-stars, and Tidal Disruption Events

    Energy Technology Data Exchange (ETDEWEB)

    Hamers, Adrian S. [Institute for Advanced Study, School of Natural Sciences, Einstein Drive, Princeton, NJ 08540 (United States); Perets, Hagai B., E-mail: hamers@ias.edu [Technion—Israel Institute of Technology, Haifa 32000 (Israel)

    2017-09-10

    Nuclear spiral arms are small-scale transient spiral structures found in the centers of galaxies. Similarly to their galactic-scale counterparts, nuclear spiral arms can perturb the orbits of stars. In the case of the Galactic center (GC), these perturbations can affect the orbits of stars and binaries in a region extending to several hundred parsecs around the supermassive black hole (SMBH), causing diffusion in orbital energy and angular momentum. This diffusion process can drive stars and binaries to close approaches with the SMBH, disrupting single stars in tidal disruption events (TDEs), or disrupting binaries, leaving a star tightly bound to the SMBH and an unbound star escaping the galaxy, i.e., a hypervelocity star (HVS). Here, we consider diffusion by nuclear spiral arms in galactic nuclei, specifically the Milky Way GC. We determine nuclear-spiral-arm-driven diffusion rates using test-particle integrations and compute disruption rates. Our TDE rates are up to 20% higher compared to relaxation by single stars. For binaries, the enhancement is up to a factor of ∼100, and our rates are comparable to the observed numbers of HVSs and S-stars. Our scenario is complementary to relaxation driven by massive perturbers. In addition, our rates depend on the inclination of the binary with respect to the Galactic plane. Therefore, our scenario provides a novel potential source for the observed anisotropic distribution of HVSs. Nuclear spiral arms may also be important for accelerating the coalescence of binary SMBHs and for supplying nuclear star clusters with stars and gas.

  13. Performance of the CMS Event Builder

    CERN Document Server

    Andre, Jean-Marc Olivier; Branson, James; Brummer, Philipp Maximilian; Chaze, Olivier; Cittolin, Sergio; Contescu, Cristian; Craigs, Benjamin Gordon; Darlea, Georgiana Lavinia; Deldicque, Christian; Demiragli, Zeynep; Dobson, Marc; Doualot, Nicolas; Erhan, Samim; Fulcher, Jonathan Richard; Gigi, Dominique; Gladki, Maciej Szymon; Glege, Frank; Gomez Ceballos, Guillelmo; Hegeman, Jeroen Guido; Holzner, Andre Georg; Janulis, Mindaugas; Jimenez Estupinan, Raul; Masetti, Lorenzo; Meijers, Franciscus; Meschi, Emilio; Mommsen, Remigius; Morovic, Srecko; O'Dell, Vivian; Orsini, Luciano; Paus, Christoph Maria Ernst; Petrova, Petia; Pieri, Marco; Racz, Attila; Reis, Thomas; Sakulin, Hannes; Schwick, Christoph; Simelevicius, Dainius; Zejdl, Petr

    2017-01-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz. It transports event data at an aggregate throughput of ~100 GB/s to the high-level trigger (HLT) farm. The CMS DAQ system has been completely rebuilt during the first long shutdown of the LHC in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gb/s Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gb/s Infiniband FDR CLOS network has been chosen for the event builder. We report on the performance of the event builder system and the steps taken to exploit the full potential of the network technologies.
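    The event-building task described above, assembling one complete event from fragments delivered by many readout sources, can be sketched minimally. This is a simplified illustration, not CMS code: the source count, event numbering, and completion rule are assumptions, and the real system does this across an Infiniband network at ~100 GB/s rather than in a Python dictionary.

```python
# Event-builder sketch: fragments arriving from many readout sources are keyed
# by event number, and an event is complete once one fragment from every
# source has been collected, regardless of arrival order.
from collections import defaultdict

N_SOURCES = 4
pending = defaultdict(dict)   # event_id -> {source_id: payload}
complete_events = []

def on_fragment(event_id, source_id, payload):
    pending[event_id][source_id] = payload
    if len(pending[event_id]) == N_SOURCES:
        fragments = pending.pop(event_id)
        complete_events.append((event_id, [fragments[s] for s in range(N_SOURCES)]))

# Fragments may arrive out of order across events and sources:
for src in range(N_SOURCES):
    on_fragment(2, src, f"evt2-src{src}")
for src in (1, 0, 3, 2):
    on_fragment(1, src, f"evt1-src{src}")

# Event 2 completes first, then event 1, each with all four fragments in
# source order.
```

The interesting engineering in the abstract lies in doing this at 100 kHz with reliable transport between custom electronics and commodity hardware; the dictionary above only captures the bookkeeping.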

  14. Architecture in the Islamic Civilization: Muslim Building or Islamic Architecture

    OpenAIRE

    Yassin, Ayat Ali; Utaberta, Dr. Nangkula

    2012-01-01

    The main problem of the theory in the arena of islamic architecture is that it is affected by Western thought and stereotypes islamic architecture according to Western thought; this leads to the breakdown of the foundations of islamic architecture. It is a myth that islamic architecture is subjected to the influence of foreign architectures. This paper will highlight the dialectical concept of islamic architecture or muslim buildings and the areas of recognition in islamic architec...

  15. Sleep architecture in children with arousal disorders

    Directory of Open Access Journals (Sweden)

    S. Hernández-Torres

    2017-07-01

    Conclusions: It has been reported that AD first manifests during the pre-school years and that the frequency of events gradually decreases and abates completely during adolescence, which is why AD is believed to be the manifestation of an immature central nervous system (CNS). It may be that the sleep architecture characteristics shown by patients in the ADG correspond to CNS immaturity in healthy but younger children.

  16. Network based on statistical multiplexing for event selection and event builder systems in high energy physics experiments

    International Nuclear Information System (INIS)

    Calvet, D.

    2000-03-01

    Systems for on-line event selection in future high energy physics experiments will use advanced distributed computing techniques and will need high speed networks. After a brief description of projects at the Large Hadron Collider, the architectures initially proposed for the Trigger and Data AcQuisition (T/DAQ) systems of the ATLAS and CMS experiments are presented and analyzed. A new architecture for the ATLAS T/DAQ is introduced. Candidate network technologies for this system are described. This thesis focuses on ATM. A variety of network structures and topologies suited to partial and full event building are investigated. The need for efficient networking is shown. Optimization techniques for high speed messaging and their implementation on ATM components are described. Small scale demonstrator systems consisting of up to 48 computers (∼1:20 of the final level 2 trigger) connected via ATM are described. Performance results are presented. Extrapolation of measurements and evaluation of needs leads to a proposed implementation for the main network of the ATLAS T/DAQ system. (author)

  17. CisLunar Habitat Internal Architecture Design Criteria

    Science.gov (United States)

    Jones, R.; Kennedy, K.; Howard, R.; Whitmore, M.; Martin, C.; Garate, J.

    2017-01-01

    The purpose of the CisLunar Habitat Internal Architecture Study is to become a forcing function to establish a common understanding of CisLunar Phase-1 habitation internal architecture design criteria, processes, and tools. The scope of the study is to design, develop, demonstrate, and evaluate a Phase-1 CisLunar Habitat common module internal architecture based on design criteria agreed to by NASA, the International Partners, and Commercial Exploration teams. This task is to define the CisLunar Phase-1 Internal Architecture Government Reference Design, to assist NASA in becoming a "smart buyer" for Phase-1 habitat concepts, and ultimately to derive standards and requirements from the internal architecture design process. The first step was to define habitat internal architecture design criteria and create a structured philosophy to be used by design teams as a filter for identifying the critical aspects of consideration in organizing and utilizing interior spaces. With the design criteria in place, the team will develop a series of iterative internal architecture concept designs, which will be assessed by means of an evaluation criteria and process. These assessments will successively drive and refine the design, leading to the combination and down-selection of design concepts. A single refined reference design configuration will be developed into a medium-to-high-fidelity mockup. A multi-day human-in-the-loop mission test will fully evaluate the reference design and validate its configuration. Lessons learned from the design and evaluation will enable the team to identify appropriate standards for Phase-1 CisLunar Habitat internal architecture and will enable NASA to develop derived requirements in support of maturing CisLunar habitation capabilities. This paper describes the criteria definition process, the workshop event, and the resulting CisLunar Phase-1 Habitat Internal Architecture Design Criteria.

  18. Phonon Spectrum Engineering in Rolled-up Micro- and Nano-Architectures

    Directory of Open Access Journals (Sweden)

    Vladimir M. Fomin

    2015-10-01

    Full Text Available We report on the possibility of efficient engineering of the acoustic phonon energy spectrum in multishell tubular structures produced by a novel high-tech method of self-organization of micro- and nano-architectures. The strain-driven roll-up procedure has paved the way for novel classes of metamaterials such as single semiconductor radial micro- and nano-crystals and multi-layer spiral micro- and nano-superlattices. The acoustic phonon dispersion is determined by solving the equations of elastodynamics for InAs and GaAs material systems. It is shown that the number of shells is an important control parameter of the phonon dispersion, together with the structure dimensions and the acoustic impedance mismatch between the superlattice layers. The obtained results suggest that rolled-up nano-architectures are promising for thermoelectric applications owing to the possibility of a significant reduction of the thermal conductivity without degradation of the electronic transport.

  19. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from the colored Petri Net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Thereafter, based on the statistical results from simulation of the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, which makes it reliable and a possible basis for structured analysis of discrete event systems. An example of a radar model illustrates the application of this approach, and the results are credible.
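The path notions in this abstract (cycle-free event paths ranked by a sensitivity score) can be sketched in a few lines. The graph, event names, and sensitivity values below are hypothetical illustrations, not the paper's radar model or its actual scoring formula.

```python
def simple_paths(graph, start, end, path=None):
    """Depth-first enumeration of all simple (cycle-free) event paths."""
    path = (path or []) + [start]
    if start == end:
        yield path
        return
    for nxt in graph.get(start, []):
        if nxt not in path:          # skip nodes already on the path -> no cycles
            yield from simple_paths(graph, nxt, end, path)

def key_event_path(graph, start, end, sensitivity):
    """Rank paths by summed per-event sensitivity; the maximum is the 'key' path."""
    scored = [(sum(sensitivity[e] for e in p), p)
              for p in simple_paths(graph, start, end)]
    return max(scored)[1]

# Hypothetical probabilistic event structure graph (adjacency lists).
g = {"detect": ["track", "log"], "track": ["engage"], "log": ["engage"]}
sens = {"detect": 0.9, "track": 0.7, "log": 0.1, "engage": 0.8}
key = key_event_path(g, "detect", "engage", sens)  # ['detect', 'track', 'engage']
```

A real analysis would derive the sensitivity scores from CPN simulation statistics rather than fixing them by hand.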

  20. Dynamically adaptive data-driven simulation of extreme hydrological flows

    Science.gov (United States)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
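The ensemble assimilation step described here can be illustrated with a minimal stochastic ensemble Kalman filter update. The scalar state, linear observation operator, and error variances below are hypothetical; a real AMR-coupled system operates on very large, spatially refined state vectors.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_var, rng):
    """Stochastic ensemble Kalman filter (EnKF) analysis step.

    ensemble : (n_members, n_state) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    obs_var  : observation error variance (scalar, assumed uncorrelated)
    """
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = X @ H.T                                     # observation-space anomalies
    P_yy = Y.T @ Y / (n - 1) + obs_var * np.eye(len(obs))
    P_xy = X.T @ Y / (n - 1)
    K = P_xy @ np.linalg.inv(P_yy)                  # Kalman gain
    # Perturb observations so the analysis ensemble keeps the correct spread.
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), (n, len(obs)))
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T

rng = np.random.default_rng(0)
ens = rng.normal(5.0, 2.0, (200, 1))    # hypothetical forecast: wave height ~ N(5, 2^2)
H = np.array([[1.0]])
analysis = enkf_update(ens, np.array([1.0]), H, obs_var=0.25, rng=rng)
# The analysis mean is pulled from the forecast mean toward the observation,
# and the analysis spread shrinks relative to the forecast spread.
```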

  1. Dynamically adaptive data-driven simulation of extreme hydrological flows

    KAUST Repository

    Kumar Jain, Pushkar

    2017-12-27

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.

  2. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  3. Bernard Tschumi Draws Architecture!

    Directory of Open Access Journals (Sweden)

    Gevork Hartoonian

    2014-08-01

    Full Text Available Bernard Tschumi’s delineation prepared for the Museu de Arte Contemporânea provides the starting point for this essay, which discusses the historicity of drawing and highlights the horizontality and the verticality that structure architecture’s contrast with the pictorial realm. Juxtaposing a freehand sketch with the digital image of the same project, Tschumi moves to address the paradox concerning the position of the body and drawing. This drawing also speaks for the reversal in the position of the body brought about by digital reproductivity.The reversal alludes to Tschumi’s theorization of architecture in terms of space and event. These, I will argue, are anticipated in The Manhattan Transcripts (1981 where a set of freehand drawings is used to evoke a filmic mood wherein the image is projected parallel to the spectator’s seated position. The essay goes further, suggesting that the theatricality permeating the present architecture is part of the shift from horizontality to the painterly, and yet the phenomenon is not merely a technical issue. Rather, it alludes to architecture’s dialogical rapport with painting at work since the Renaissance.

  4. Open architecture design and approach for the Integrated Sensor Architecture (ISA)

    Science.gov (United States)

    Moulton, Christine L.; Krzywicki, Alan T.; Hepp, Jared J.; Harrell, John; Kogut, Michael

    2015-05-01

    Integrated Sensor Architecture (ISA) is designed in response to stovepiped integration approaches. The design, based on the principles of Service Oriented Architectures (SOA) and Open Architectures, addresses the problem of integration, and is not designed for specific sensors or systems. The use of SOA and Open Architecture approaches has led to a flexible, extensible architecture. Using these approaches, and supported with common data formats, open protocol specifications, and Department of Defense Architecture Framework (DoDAF) system architecture documents, an integration-focused architecture has been developed. ISA can help move the Department of Defense (DoD) from costly stovepipe solutions to a more cost-effective plug-and-play design to support interoperability.

  5. A Parallel Saturation Algorithm on Shared Memory Architectures

    Science.gov (United States)

    Ezekiel, Jonathan; Siminiceanu

    2007-01-01

    Symbolic state-space generators are notoriously hard to parallelize. However, the Saturation algorithm implemented in the SMART verification tool differs from other sequential symbolic state-space generators in that it exploits the locality of firing events in asynchronous system models. This paper explores whether event locality can be utilized to efficiently parallelize Saturation on shared-memory architectures. Conceptually, we propose to parallelize the firing of events within a decision diagram node, which is technically realized via a thread pool. We discuss the challenges involved in our parallel design and conduct experimental studies on its prototypical implementation. On a dual-processor dual-core PC, our studies show speed-ups for several example models, e.g., of up to 50% for a Kanban model, when compared to running our algorithm only on a single core.
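The thread-pool idea can be sketched generically: fire the independent events enabled in one node concurrently and collect the successor states. The token-passing "events" here are a toy stand-in for the decision-diagram operations, not the SMART implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def fire_event(state, event):
    """Hypothetical 'firing': move one token along an event's (src, dst) edge."""
    src, dst = event
    new = list(state)
    new[src] -= 1
    new[dst] += 1
    return tuple(new)

def saturate_node(state, events, workers=4):
    """Fire all enabled events of one node in parallel, collecting successor states.

    Mirrors the idea of parallelising event firings within a decision-diagram
    node via a thread pool (the real algorithm then unions the results into
    the symbolic state space).
    """
    enabled = [e for e in events if state[e[0]] > 0]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return set(pool.map(lambda e: fire_event(state, e), enabled))

succ = saturate_node((2, 0, 1), [(0, 1), (0, 2), (1, 0)])
# Event (1, 0) is disabled (no token in place 1), so two successors result.
```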

  6. Architecture Level Safety Analyses for Safety-Critical Systems

    Directory of Open Access Journals (Sweden)

    K. S. Kushal

    2017-01-01

    Full Text Available The dependency of complex embedded Safety-Critical Systems across the Avionics and Aerospace domains on their underlying software and hardware components has gradually increased over time. Such application domain systems are developed based on a complex integrated architecture, which is modular in nature. Engineering practices assured with system safety standards to manage failure, faulty, and unsafe operational conditions are very much necessary. System safety analyses involve analysis of the complex software architecture of the system, a major aspect leading to fatal consequences in the behaviour of Safety-Critical Systems, and provide high reliability and dependability factors during their development. In this paper, we propose an architecture fault modeling and safety analysis approach that will aid in identifying and eliminating design flaws. The formal foundations of the SAE Architecture Analysis & Design Language (AADL) augmented with the Error Model Annex (EMV) are discussed. The fault propagation, failure behaviour, and the composite behaviour of the design flaws/failures are considered for architecture safety analysis. The proposed approach is validated by implementing the Speed Control Unit of a Power-Boat Autopilot (PBA) system. The Error Model Annex (EMV) is guided by the pattern of consideration and inclusion of probable failure scenarios and propagation of fault conditions in the Speed Control Unit of the Power-Boat Autopilot (PBA). This helps in validating the system architecture through detection of the error event in the model and its impact in the operational environment. It also provides insight into the certification impact that these exceptional conditions pose at various criticality and design assurance levels, and their implications for verifying and validating the designs.

  7. Location aware event driven multipath routing in Wireless Sensor Networks: Agent based approach

    Directory of Open Access Journals (Sweden)

    A.V. Sutagundar

    2013-03-01

    Full Text Available Wireless Sensor Networks (WSNs) demand reliable and energy efficient paths for critical information delivery to the sink node from an event occurrence node. Multipath routing facilitates reliable data delivery in case of critical information. This paper proposes an event triggered multipath routing in WSNs by employing a set of static and mobile agents. Every sensor node is assumed to know the location information of the sink node and itself. The proposed scheme works as follows: (1) The event node computes the arbitrary midpoint between the event node and the sink node by using location information. (2) The event node establishes a shortest path from itself to the sink node through the reference axis by using a mobile agent with the help of location information; the mobile agent collects the connectivity information and other parameters of all the nodes on the way and provides the information to the sink node. (3) The event node finds the arbitrary location of the special (middle) intermediate nodes (above/below the reference axis) by using the midpoint location information given in step 1. (4) Mobile agents clone from the event node; the clones carry the event type and discover the path passing through the special intermediate nodes, where the path above/below the reference axis looks like an arc. While migrating from one sensor node to another along the traversed path, each mobile agent gathers the node information (such as node id, location information, residual energy, available bandwidth, and neighbor connectivity) and delivers it to the sink node. (5) The sink node constructs a partial topology, connecting the event and sink nodes, by using the connectivity information delivered by the mobile agents. Using the partial topology information, the sink node finds the multipath and the path weight factor by using link efficiency, energy ratio, and hop distance. (6) The sink node selects the number of paths among the available paths based upon the criticalness of the event, and (7) if the event is non
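The path weight factor in step 5 is described only qualitatively (link efficiency, energy ratio, hop distance), so the linear combination and weights below are assumptions, sketching how the sink could rank and select paths in step 6.

```python
def path_weight(link_eff, energy_ratio, hops, w=(0.4, 0.4, 0.2)):
    """Hypothetical path weight factor: favour efficient links and high residual
    energy, penalise hop distance. The paper's exact formula may differ."""
    we, wr, wh = w
    return we * link_eff + wr * energy_ratio + wh / hops

def select_paths(paths, k):
    """Sink-side selection: keep the k best-weighted paths for a critical event."""
    ranked = sorted(paths, key=lambda p: path_weight(*p[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# Hypothetical candidate paths: (mean link efficiency, energy ratio, hop count).
paths = {
    "shortest":  (0.9, 0.5, 4),
    "arc_above": (0.7, 0.9, 6),
    "arc_below": (0.6, 0.4, 6),
}
best = select_paths(paths.items(), k=2)  # ['arc_above', 'shortest']
```

For a highly critical event the sink would raise k and forward the data redundantly over all selected paths.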

  8. Analysis of Architecture Pattern Usage in Legacy System Architecture Documentation

    NARCIS (Netherlands)

    Harrison, Neil B.; Avgeriou, Paris

    2008-01-01

    Architecture patterns are an important tool in architectural design. However, while many architecture patterns have been identified, there is little in-depth understanding of their actual use in software architectures. For instance, there is no overview of how many patterns are used per system or

  9. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been validated with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.
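The communication saving of such event-based strategies can be illustrated with a simple send-on-delta trigger: a robot transmits a measurement only when it has changed enough since the last transmission. The threshold and sensor readings below are hypothetical, and the paper's actual trigger condition may differ.

```python
def event_triggered(samples, threshold):
    """Send-on-delta rule: transmit only when the value has moved more than
    `threshold` since the last transmitted value (a common event-based trigger)."""
    sent, last = [], None
    for t, x in enumerate(samples):
        if last is None or abs(x - last) > threshold:
            sent.append((t, x))
            last = x
    return sent

# Hypothetical slowly varying wall-distance readings from a wall-following robot.
readings = [1.00, 1.01, 1.02, 1.30, 1.31, 1.32, 1.00]
msgs = event_triggered(readings, threshold=0.1)
# A periodic (discrete-time) controller would send 7 messages; the event
# trigger sends only 3, at t = 0, 3, and 6.
```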

  10. On the Architectural Engineering Competences in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    2007-01-01

    In 1997 a new education in Architecture & Design at Department of Architecture and Design, Aalborg University was started with 50 students. During the recent years this number has increased to approximately 100 new students each year, i.e. approximately 500 students are following the 3 years...... bachelor (BSc) and the 2 years master (MSc) programme. The first 5 semesters are common for all students followed by 5 semesters with specialization into Architectural Design, Urban Design, Industrial Design or Digital Design. The present paper gives a short summary of the architectural engineering...

  11. Radiation dose of aircrews during a solar proton event without ground-level enhancement

    Directory of Open Access Journals (Sweden)

    R. Kataoka

    2015-01-01

    Full Text Available A significant enhancement of radiation doses is expected for aircrews during ground-level enhancement (GLE) events, while the possible radiation hazard remains an open question during non-GLE solar energetic particle (SEP) events. Using a new air-shower simulation driven by the proton flux data obtained from GOES satellites, we show the possibility of significant enhancement of the effective dose rate of up to 4.5 μSv h⁻¹ at a conventional flight altitude of 12 km during the largest SEP event that did not cause a GLE. As a result, a new GOES-driven model is proposed to give an estimate of the contribution from the isotropic component of the radiation dose in the stratosphere during non-GLE SEP events.

  12. The NA60 experiment readout architecture

    CERN Document Server

    Floris, M; Usai, G L; David, A; Rosinsky, P; Ohnishi, H

    2004-01-01

    The NA60 experiment was designed to identify signatures of a new state of matter, the Quark Gluon Plasma, in heavy-ion collisions at the CERN Super Proton Synchrotron. The apparatus is composed of four main detectors: a muon spectrometer (MS), a zero degree calorimeter (ZDC), a silicon vertex telescope (VT), and a silicon microstrip beam tracker (BT). The readout of the whole experiment is based on a PCI architecture. The basic unit is a general purpose PCI card, interfaced to the different subdetectors via custom mezzanine cards. This allowed us to successfully implement several completely different readout protocols (from the VME-like protocol of the MS to the custom protocol of the pixel telescope). The system was fully tested with proton and ion beams, and several million events were collected in 2002 and 2003. This paper presents the readout architecture of NA60, with particular emphasis on the PCI layer common to all the subdetectors. (16 refs).

  13. Opto-VLSI-based reconfigurable free-space optical interconnects architecture

    DEFF Research Database (Denmark)

    Aljada, Muhsen; Alameh, Kamal; Chung, Il-Sug

    2007-01-01

    is the Opto-VLSI processor which can be driven by digital phase steering and multicasting holograms that reconfigure the optical interconnects between the input and output ports. The optical interconnects architecture is experimentally demonstrated at 2.5 Gbps using high-speed 1×3 VCSEL array and 1×3 photoreceiver array in conjunction with two 1×4096 pixel Opto-VLSI processors. The minimisation of the crosstalk between the output ports is achieved by appropriately aligning the VCSEL and PD elements with respect to the Opto-VLSI processors and driving the latter with optimal steering phase holograms....

  14. Forecast of icing events at a wind farm in Sweden

    DEFF Research Database (Denmark)

    Davis, Neil; Hahmann, Andrea N.; Clausen, Niels-Erik

    2014-01-01

    This paper introduces a method for identifying icing events using a physical icing model, driven by atmospheric data from the Weather Research and Forecasting (WRF) model, and applies it to a wind park in Sweden. Observed wind park icing events were identified by deviation from an idealized power...
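The identification rule, observed power deviating from an idealized power curve, can be sketched as follows. The power-curve shape, deficit fraction, and persistence requirement are hypothetical choices for illustration, not the paper's calibration.

```python
def icing_events(wind_speed, power, power_curve, deficit=0.2, min_hours=3):
    """Flag an icing event when observed power stays below (1 - deficit) times
    the idealized power curve for at least `min_hours` consecutive samples."""
    below = [p < (1 - deficit) * power_curve(v) for v, p in zip(wind_speed, power)]
    events, start = [], None
    for i, b in enumerate(below + [False]):        # sentinel closes a trailing run
        if b and start is None:
            start = i
        elif not b and start is not None:
            if i - start >= min_hours:
                events.append((start, i - 1))      # (first, last) flagged hour
            start = None
    return events

curve = lambda v: min(2000.0, 3.0 * v ** 3)        # hypothetical idealized curve (kW)
speeds = [8, 8, 8, 8, 8, 8]                        # hourly wind speed (m/s)
obs_kw = [1500, 900, 850, 900, 880, 1500]          # mid-run production collapse
events = icing_events(speeds, obs_kw, curve)       # [(1, 4)]
```

In practice the idealized curve would be fitted from ice-free operation data and the WRF-driven icing model would corroborate the flagged periods.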

  15. Connecting Architecture and Implementation

    Science.gov (United States)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  16. A Satellite Data-Driven, Client-Server Decision Support Application for Agricultural Water Resources Management

    Science.gov (United States)

    Johnson, Lee F.; Maneta, Marco P.; Kimball, John S.

    2016-01-01

    Water cycle extremes such as droughts and floods present a challenge for water managers and for policy makers responsible for the administration of water supplies in agricultural regions. In addition to the inherent uncertainties associated with forecasting extreme weather events, water planners need to anticipate water demands and water user behavior in atypical circumstances. This requires the use of decision support systems capable of simulating agricultural water demand with the latest available data. Unfortunately, managers from local and regional agencies often use different datasets of variable quality, which complicates coordinated action. In previous work we have demonstrated novel methodologies to use satellite-based observational technologies, in conjunction with hydro-economic models and state-of-the-art data assimilation methods, to enable robust regional assessment and prediction of drought impacts on agricultural production, water resources, and land allocation. These methods create an opportunity for new, cost-effective analysis tools to support policy and decision-making over large spatial extents. The methods can be driven with information from existing satellite-derived operational products, such as the Satellite Irrigation Management Support system (SIMS) operational over California, the Cropland Data Layer (CDL), and a modified light-use efficiency algorithm that retrieves crop yield from the synergistic use of MODIS and Landsat imagery. Here we present an integration of this modeling framework in a client-server architecture based on the Hydra platform. Assimilation and processing of resource-intensive remote sensing data, as well as hydrologic and other ancillary information, occur on the server side. This information is processed and summarized as attributes in water demand nodes that are part of a vector description of the water distribution network. With this architecture, our decision support system becomes a lightweight 'app' that

  17. ATLAS EventIndex General Dataflow and Monitoring Infrastructure

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2016-01-01

    The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure at CERN. A subset of this information is copied to an Oracle relational database for fast dataset discovery, event picking, crosschecks with other ATLAS systems, and checks for event duplication. The system design and its optimization serve event picking for requests ranging from a few events up to scales of tens of thousands of events, and in addition, data consistency checks are performed for large production campaigns. Detecting duplicate events within the scope of physics collections has recently arisen as an important use case. This paper describes the general architecture of the project and the data flow and operation issues, which are addressed by recent developments to improve the throughput of the overall system. In this direction, the data collection system is reducing the usage of the messaging infrastructure to overcome t...

  18. ATLAS EventIndex general dataflow and monitoring infrastructure

    CERN Document Server

    AUTHOR|(SzGeCERN)638886; The ATLAS collaboration; Barberis, Dario; Favareto, Andrea; Garcia Montoro, Carlos; Gonzalez de la Hoz, Santiago; Hrivnac, Julius; Prokoshin, Fedor; Salt, Jose; Sanchez, Javier; Toebbicke, Rainer; Yuan, Ruijun

    2017-01-01

    The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure at CERN. A subset of this information is copied to an Oracle relational database for fast dataset discovery, event picking, crosschecks with other ATLAS systems, and checks for event duplication. The system design and its optimization serve event picking for requests ranging from a few events up to scales of tens of thousands of events, and in addition, data consistency checks are performed for large production campaigns. Detecting duplicate events within the scope of physics collections has recently arisen as an important use case. This paper describes the general architecture of the project and the data flow and operation issues, which are addressed by recent developments to improve the throughput of the overall system. In this direction, the data collection system is reducing the usage of the messaging infrastructure to overcome th...

  19. Multiple-Event, Single-Photon Counting Imaging Sensor

    Science.gov (United States)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method of photon count registration. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be made arbitrarily short, this leads to very low dynamic range and makes the sensor useful only in very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially undermines the benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register one million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array is capable of performing single-photon counting from ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.
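The dynamic-range argument can be made concrete with a toy per-pixel registration model: a single-event pixel clips at one count per frame, while a multiple-event register preserves the true photon flux. Frame time, arrival times, and register capacity below are hypothetical.

```python
def frame_counts(photon_times, frame_time, capacity):
    """Per-frame photon-count registration for one pixel: counts clip at the
    pixel's register capacity (capacity=1 models a single-event design)."""
    frames = {}
    for t in photon_times:
        f = int(t // frame_time)
        frames[f] = min(frames.get(f, 0) + 1, capacity)
    return frames

# Hypothetical photon arrival times (ms) spanning two 10 ms frames.
arrivals = [0.1, 0.5, 3.2, 7.7, 12.0, 15.5]
single = frame_counts(arrivals, 10.0, capacity=1)          # single-event pixel
multi = frame_counts(arrivals, 10.0, capacity=1_000_000)   # multiple-event pixel
# The single-event pixel saturates at 1 count per frame in both frames; the
# multiple-event pixel recovers the true flux of 4 and 2 counts.
```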

  20. Implementation and integration in the L3 experimentation of a level-2 trigger with event building, based on C104 data driven cross-bar switches and on T9000 transputers

    International Nuclear Information System (INIS)

    Masserot, A.

    1995-01-01

    This thesis describes the new level-2 trigger system. It has been developed to meet the L3 requirements induced by the LEP phase 2 conditions. At each beam crossing, the system memorizes the trigger data, builds up the events selected by the level-1 hard-wired processors, and finally rejects on-line the background identified by algorithms coded in Fortran. Based on T9000 Transputers and on C104 data-driven cross-bar switches, the system uses prototypes designed by INMOS/SGS-THOMSON for parallel processing applications. Emphasis is placed on a new event-building technique, on its integration in L3, and on performance. (author). 38 refs., 68 figs., 36 tabs

  1. How organisation of architecture documentation affects architectural knowledge retrieval

    NARCIS (Netherlands)

    de Graaf, K.A.; Liang, P.; Tang, A.; Vliet, J.C.

    A common approach to software architecture documentation in industry projects is the use of file-based documents. This approach offers a single-dimensional arrangement of the architectural knowledge. Knowledge retrieval from file-based architecture documentation is efficient if the organisation of

  2. ASPIE: A Framework for Active Sensing and Processing of Complex Events in the Internet of Manufacturing Things

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

    Full Text Available Rapid perception and processing of critical monitoring events are essential to ensure healthy operation of Internet of Manufacturing Things (IoMT)-based manufacturing processes. In this paper, we propose a framework, the active sensing and processing architecture (ASPIE), for active sensing and processing of critical events in IoMT-based manufacturing, based on the characteristics of the IoMT architecture as well as its perception model. A relation model of complex events in manufacturing processes, together with related operators and unified XML-based semantic definitions, is developed to effectively process the complex-event big data. A template-based processing method for complex events is further introduced to conduct complex event matching using the Apriori frequent item mining algorithm. To evaluate the proposed models and methods, we developed a software platform based on ASPIE for a local chili sauce manufacturing company, which demonstrated the feasibility and effectiveness of the proposed methods for active perception and processing of complex events in IoMT-based manufacturing.
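The template-matching step relies on Apriori frequent item mining. A minimal Apriori over hypothetical event logs looks like this; the event names and support threshold are invented for illustration and are not from the paper's chili sauce case study.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: return every itemset occurring in at least `min_support`
    transactions, building candidate k-sets from frequent (k-1)-sets."""
    tsets = [set(t) for t in transactions]
    items = sorted({i for t in tsets for i in t})
    frequent = {}
    k_sets = [frozenset([i]) for i in items]
    k = 1
    while k_sets:
        counts = {c: sum(c <= t for t in tsets) for c in k_sets}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        k += 1
        # Candidate (k+1)-sets: unions of frequent k-sets of exactly the right size.
        k_sets = list({a | b for a, b in combinations(level, 2) if len(a | b) == k})
    return frequent

# Hypothetical complex-event logs from a bottling line (one tuple per window).
logs = [("temp_high", "pressure_drop", "valve_open"),
        ("temp_high", "pressure_drop"),
        ("temp_high", "valve_open"),
        ("pressure_drop", "valve_open")]
freq = apriori(logs, min_support=3)
# Each single event occurs 3 times; no pair reaches the support threshold.
```

Frequent itemsets found this way would seed the event templates that the matching engine compares incoming event streams against.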

  3. Simulating flaring events in complex active regions driven by observed magnetograms

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self-organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self-organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self-organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self-organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has
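The loading/relaxation cycle that drives such cellular automata to self-organized criticality can be illustrated with a classic one-dimensional sandpile. This is far simpler than the paper's magnetic-field model, but it shows the same mechanism: slow driving plus threshold relaxation produces avalanches on many scales.

```python
import random

def sandpile(n_cells=50, n_grains=2000, z_crit=2, seed=1):
    """1D sandpile: load one grain at a random cell per step, then relax every
    cell exceeding z_crit by toppling two grains to its neighbours (grains
    falling off the open boundaries are lost). Returns the avalanche size
    (number of topplings) recorded at each loading step."""
    random.seed(seed)
    z = [0] * n_cells
    sizes = []
    for _ in range(n_grains):
        z[random.randrange(n_cells)] += 1          # slow driving ("loading")
        size = 0
        unstable = [i for i in range(n_cells) if z[i] > z_crit]
        while unstable:                             # threshold relaxation
            i = unstable.pop()
            if z[i] <= z_crit:
                continue
            z[i] -= 2                               # topple: shed two grains
            for j in (i - 1, i + 1):
                if 0 <= j < n_cells:
                    z[j] += 1
                    if z[j] > z_crit:
                        unstable.append(j)
            if z[i] > z_crit:
                unstable.append(i)
            size += 1
        sizes.append(size)
    return sizes

sizes = sandpile()
# Once the pile reaches its statistically steady (SOC) state, avalanche sizes
# range from zero up to system-spanning events, the hallmark of power-law-like
# event statistics.
```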

  4. Evolution and Adaptation in Pseudomonas aeruginosa Biofilms Driven by Mismatch Repair System-Deficient Mutators

    DEFF Research Database (Denmark)

    Luján, Adela M.; Maciá, María D.; Yang, Liang

    2011-01-01

    … which are rarely eradicated despite intensive antibiotic therapy. Current knowledge indicates that three major adaptive strategies, biofilm development, phenotypic diversification, and mutator phenotypes [driven by a defective mismatch repair system (MRS)], play important roles in P. aeruginosa chronic … infections, but the relationship between these strategies is still poorly understood. We have used the flow-cell biofilm model system to investigate the impact of the mutS-associated mutator phenotype on the development, dynamics, diversification, and adaptation of P. aeruginosa biofilms. Through competition … diversification, evidenced by biofilm architecture features and by a wider range and proportion of morphotypic colony variants, respectively. Additionally, morphotypic variants generated in mutator biofilms showed increased competitiveness, providing further evidence for mutator-driven adaptive evolution …

  5. Leaf-architectured 3D Hierarchical Artificial Photosynthetic System of Perovskite Titanates Towards CO2 Photoreduction Into Hydrocarbon Fuels

    Science.gov (United States)

    Zhou, Han; Guo, Jianjun; Li, Peng; Fan, Tongxiang; Zhang, Di; Ye, Jinhua

    2013-01-01

    The development of an “artificial photosynthetic system” (APS) having both the analogous key structural elements and the reaction features of photosynthesis, so as to achieve solar-driven water splitting and CO2 reduction, is highly challenging. Here, we demonstrate a design strategy for a promising 3D APS architecture serving as an efficient mass-flow/light-harvesting network, relying on the morphological replacement of a concept prototype, the leaf's 3D architecture, into perovskite titanates for CO2 photoreduction into hydrocarbon fuels (CO and CH4). The process uses artificial sunlight as the energy source, water as the electron donor, and CO2 as the carbon source, mimicking what real leaves do. To our knowledge this is the first example utilizing biological systems as “architecture-directing agents” for an APS aimed at CO2 photoreduction, which hints at a more general principle for APS architectures drawing on a great variety of optimized biological geometries. This research has great significance for the potential realization of a global carbon-neutral cycle. PMID:23588925

  6. Thermomechanical architecture of the VIS focal plane for Euclid

    International Nuclear Information System (INIS)

    Martignac, Jerome; Carty, Michael; Tourette, Thierry; Bachet, Damien; Berthe, Michel; Augueres, Jean-Louis; Amiaux, Jerome; Fontignie, Jean; Horeau, Benoit; Renaud, Diana

    2014-01-01

    One of the main challenges for current and near-future space experiments is the increasing complexity of focal planes in terms of pixel count. In the framework of the ESA Euclid mission, to be launched in 2020, the Euclid Consortium is developing an extremely large and stable focal plane for the VIS instrument. CEA has developed the thermomechanical architecture of that focal plane, taking into account all the very stringent performance and mission-related requirements. The VIS Focal Plane Assembly integrates 36 CCDs (operated at 150 K) connected to their front-end electronics (operated at 280 K), yielding one of the largest focal planes (0.6 billion pixels) ever built for a space application, after that of GAIA. The CCDs are of the CCD273 type, specially designed and provided by the e2v company under ESA contract; the front-end electronics are designed and provided by MSSL. In this paper we first recall the specific requirements that have driven the overall architecture of the VIS-FPA, and especially the solutions proposed to meet the scientific need for an extremely stable focal plane, both mechanically and thermally. The mechanical structure, based on SiC material, used for the cold sub-assembly supporting the CCDs is detailed. We also describe the modular architecture concept that we selected taking into account AIT-AIV and programmatic constraints. (authors)

  7. Multiple ways to the prior occurrence of an event: an electrophysiological dissociation of experimental and conceptually driven familiarity in recognition memory.

    Science.gov (United States)

    Wiegand, Iris; Bader, Regine; Mecklinger, Axel

    2010-11-11

    Recent research has shown that familiarity contributes to associative memory when the to-be-associated stimuli are unitized during encoding. However, the specific processes underlying familiarity-based recognition of unitized representations remain unclear. In this study, we present electrophysiologically dissociable early old/new effects, presumably related to two different kinds of familiarity inherent in associative recognition tasks. In a study-test associative recognition memory paradigm, we employed encoding conditions that established unitized representations of two pre-experimentally unrelated words, e.g. vegetable-bible. We compared event-related potentials (ERPs) during the retrieval of these unitized word pairs using different retrieval cues. Word pairs presented in the same order as during unitization at encoding elicited a parietally distributed early old/new effect, which we interpret as reflecting conceptually driven familiarity for newly formed concepts. Conversely, word pairs presented in reversed order elicited only a topographically dissociable early effect, i.e. the mid-frontal old/new effect, the putative correlate of experimental familiarity. The late parietal old/new effect, the putative ERP correlate of recollection, was obtained irrespective of word order, though it was larger for words presented in the same order. These results indicate that familiarity may not be a unitary process and that different task demands can promote the assessment of conceptually driven familiarity for novel unitized concepts or experimentally induced increments of experimental familiarity, respectively. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Mechanics of Interrill Erosion with Wind-Driven Rain (WDR)

    Science.gov (United States)

    This article provides an evaluation analysis for the performance of the interrill component of the Water Erosion Prediction Project (WEPP) model for Wind-Driven Rain (WDR) events. The interrill delivery rates (Di) were collected in the wind tunnel rainfall simulator facility of the International Cen...

  9. Capital Architecture: Situating symbolism parallel to architectural methods and technology

    Science.gov (United States)

    Daoud, Bassam

    Capital Architecture is a symbol of a nation's global presence and the cultural and social focal point of its inhabitants. Since the advent of High Modernism in Western cities, and subsequently in decolonised capitals, civic architecture no longer seems to be strictly grounded in the philosophy that national buildings shape the legacy of government and the way a nation is regarded through its built environment. Amidst an exceedingly globalized architectural practice, and with the growing concern of key heritage foundations over the shortcomings of international modernism in representing its immediate socio-cultural context, the contextualization of public architecture within its sociological, cultural, and economic framework in capital cities became the key denominator of this thesis. Civic architecture in capital cities is essential to confronting the challenges of symbolizing a nation and demonstrating the legitimacy of its government. In today's dominantly secular Western societies, governmental architecture, especially where the seat of political power lies, is the ultimate form of architectural expression in conveying a sense of identity and underlining a nation's status. Departing from these convictions, this thesis investigates the embodied symbolic power, the representative capacity, and the inherent permanence of contemporary architecture and of its modes of production. Through a broad study of Modern architectural ideals and heritage, in parallel to methodologies, the thesis examines the future of large-scale governmental building practices and aims to identify and index the key constituents that may respond to the lack of representation in civic architecture in capital cities.

  10. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. … The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison, and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss … one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing …

  11. Big Data Analytics Embedded Smart City Architecture for Performance Enhancement through Real-Time Data Processing and Decision-Making

    Directory of Open Access Journals (Sweden)

    Bhagya Nathali Silva

    2017-01-01

    Full Text Available The concept of the smart city is widely favored, as it enhances the quality of life of urban citizens, involving multiple disciplines, that is, smart community, smart transportation, smart healthcare, smart parking, and many more. Continuous growth of complex urban networks is significantly challenged by real-time data processing and intelligent decision-making capabilities. Therefore, in this paper, we propose a smart city framework based on Big Data analytics. The proposed framework operates on three levels: (1) a data generation and acquisition level collecting heterogeneous data related to city operations, (2) a data management and processing level filtering, analyzing, and storing data to make decisions and raise events autonomously, and (3) an application level initiating execution of the events corresponding to the received decisions. In order to validate the proposed architecture, we analyze a few major types of dataset based on the proposed three-level architecture. Further, we tested authentic datasets on the Hadoop ecosystem to determine the threshold, and the analysis shows that the proposed architecture offers useful insights to community development authorities for improving the existing smart city architecture.
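
The three-level flow described in the abstract can be sketched as a tiny pipeline: acquisition feeds readings to a processing level that emits events, and an application level acts on them. The sensor categories, threshold, and action strings below are invented for illustration; the paper's Hadoop-based implementation is far richer:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    source: str    # e.g. "parking", "transport" (illustrative categories)
    value: float   # normalized sensor reading

def acquire():
    # Level 1: heterogeneous city data (hard-coded here for illustration).
    yield from (Reading("parking", 0.95), Reading("transport", 0.40),
                Reading("healthcare", 0.88))

def decide(reading, threshold=0.8):
    # Level 2: filter/analyze readings and raise events autonomously.
    if reading.value > threshold:
        return f"alert:{reading.source}"
    return None

def apply_events(events):
    # Level 3: initiate execution corresponding to received decisions.
    return [f"dispatched({e})" for e in events]

events = [e for r in acquire() if (e := decide(r))]
print(apply_events(events))  # parking and healthcare exceed the threshold
```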

  12. Architectural design decisions

    NARCIS (Netherlands)

    Jansen, Antonius Gradus Johannes

    2008-01-01

    A software architecture can be considered as the collection of key decisions concerning the design of the software of a system. Knowledge about this design, i.e. architectural knowledge, is key for understanding a software architecture and thus the software itself. Architectural knowledge is mostly

  13. An asynchronous data-driven readout prototype for CEPC vertex detector

    Science.gov (United States)

    Yang, Ping; Sun, Xiangming; Huang, Guangming; Xiao, Le; Gao, Chaosong; Huang, Xing; Zhou, Wei; Ren, Weiping; Li, Yashu; Liu, Jianchao; You, Bihui; Zhang, Li

    2017-12-01

    The Circular Electron Positron Collider (CEPC) is proposed as a Higgs boson and/or Z boson factory for high-precision measurements of the Higgs boson. The precision of the secondary-vertex impact parameter plays an important role in such measurements, which typically rely on flavor tagging. Silicon CMOS Pixel Sensors (CPS) are therefore the most promising technology candidate for a CEPC vertex detector, as they can feature high position resolution, low power consumption, and fast readout simultaneously. For the R&D of the CEPC vertex detector, we have developed a prototype, MIC4, in the TowerJazz 180 nm CMOS Image Sensor (CIS) process. We have proposed and implemented a new architecture of asynchronous zero-suppressed, data-driven readout inside the matrix, combined with a binary front-end inside the pixel. The matrix contains 128 rows and 64 columns with a small pixel pitch of 25 μm. The readout architecture implements the traditional OR-gate chain inside a super pixel, combined with a priority-arbiter tree between the super pixels, reading out only hit pixels. The MIC4 architecture is introduced in more detail in this paper. The chip will be taped out in May and characterized when it returns.
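
The zero-suppression idea, an OR of hits flags a super pixel, and a fixed-priority arbiter visits only flagged regions, can be sketched in software. The matrix geometry matches the abstract (128 rows × 64 columns); the 4×4 super-pixel size and the sorted-order arbiter are assumptions for illustration:

```python
# Sketch of zero-suppressed, data-driven readout in the spirit of MIC4:
# an OR of the hits inside each "super pixel" flags regions containing
# data, and a fixed-priority arbiter reads out only hit pixel addresses.
ROWS, COLS, SP = 128, 64, 4   # SP = super-pixel side (assumed)

def read_out(hits):
    """hits: set of (row, col) pairs. Returns hit addresses in a fixed
    priority order, visiting only super pixels whose OR is asserted."""
    flagged = sorted({(r // SP, c // SP) for r, c in hits})
    out = []
    for sr, sc in flagged:                    # only regions with data
        for r in range(sr * SP, sr * SP + SP):
            for c in range(sc * SP, sc * SP + SP):
                if (r, c) in hits:
                    out.append((r, c))
    return out

print(read_out({(0, 0), (0, 1), (127, 63)}))  # [(0, 0), (0, 1), (127, 63)]
```

The point of the scheme is that readout time scales with the number of hit regions rather than with the full 128 × 64 matrix.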

  14. Architecture Governance: The Importance of Architecture Governance for Achieving Operationally Responsive Ground Systems

    Science.gov (United States)

    Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik

    2011-01-01

    Topics covered (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance (3) Strategic Elements (3a) Architectural Principles (3b) Architecture Board (3c) Architecture Compliance (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level

  15. Progress on the design of a data push architecture for an array of optimized time tagging pixels

    International Nuclear Information System (INIS)

    Shapiro, S.; Cords, D.; Mani, S.; Holbrook, B.; Atlas, E.

    1993-06-01

    A pixel array has been proposed which features a completely data-driven architecture. A pixel cell has been designed that is optimized for this readout. It retains the features of preceding designs that allow low-noise operation, time stamping, analog signal processing, X-Y address recording, ghost elimination, and sparse data transmission. The pixel design eliminates a number of problems inherent in previous designs through the use of sampled-data techniques, destructive readout, and current-mode output drivers. This architecture and pixel design are directed at applications such as a forward spectrometer at the SSC, an e+e− B factory at SLAC, and fixed-target experiments at FNAL

  16. Minimalism in architecture: Architecture as a language of its identity

    Directory of Open Access Journals (Sweden)

    Vasilski Dragana

    2012-01-01

    Full Text Available Every architectural work is created on a principle that includes meaning, and the work is then read as an artifact of that particular meaning. The resources by which meaning is built, susceptible to transformation, as well as the routing of understanding (the decoding of messages carried by a work of architecture), are the subject of semiotics and communication theories, which have played a significant role for architecture and the architect. Minimalism in architecture, as a paradigm of twenty-first-century architecture, means searching for the essence located in the irreducible minimum. The inspired use of architectural units (archetypal elements), through the phantasm of simplicity, assumes the primary responsibility for providing the object's identity, because it participates in the formation of the language and therefore in its reading. Volume is formed by a clean language that builds the expression of fluid areas liberated from recharge needs. This reduced architectural language is appropriate to an age marked by electronic communications.

  17. Resource checking and event handling within the W7-X segment control framework

    International Nuclear Information System (INIS)

    Laqua, Heike; Bluhm, Torsten; Heimann, Peter; Hennig, Christine; Kroiss, Hugo; Krom, Jon G.; Kühner, Georg; Lewerentz, Marc; Maier, Josef; Schacht, Jörg; Spring, Anett; Werner, Andreas; Zilker, Manfred

    2012-01-01

    Highlights: ► Support for steady-state fusion experiments. ► Off-normal event handling. ► Plasma-event-driven control. - Abstract: ITER, Wendelstein 7-X, LHD, and TORE SUPRA are experimental facilities designed to lead the way to steady-state fusion devices. These experiments require strategies to sustain a discharge in case of unforeseen events, e.g. heat overloads of plasma-facing components or the failure of a plasma heating source. A recovery strategy is needed to get the discharge back for physics exploitation. For this purpose the W7-X segment control framework provides means for automated event detection along with options to formulate and initiate a recovery strategy. Besides the handling of failures and degradation, there are events that represent a desired plasma-physical effect. An example of this kind of event is a transition from Low- to High-Confinement mode. Such events indicate that a certain plasma state has been reached and the scientific examination can be altered accordingly, thus enabling multiple event-driven experiments per discharge. Examples of both kinds of events will be presented and compared to other approaches in the community.
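
The detection-plus-strategy pattern described above can be sketched as a table mapping detected events to recovery (or exploitation) actions, with a fail-safe default for anything unforeseen. All event and action names here are invented for illustration, not W7-X identifiers:

```python
# Minimal sketch of event-driven segment control: named events map to a
# configured recovery or exploitation action; unknown events fail safe.
# Event/action names are hypothetical, not the W7-X framework's own.
RECOVERY = {
    "heat_overload": "reduce_heating_power",
    "heating_source_failure": "switch_to_backup_source",
    "L_to_H_transition": "start_next_experiment_segment",  # desired event
}

def handle(event):
    action = RECOVERY.get(event)
    if action is None:
        return "abort_discharge"   # unforeseen event: fail safe
    return action

print(handle("heat_overload"))     # reduce_heating_power
print(handle("unknown_event"))     # abort_discharge
```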

  18. Architectural Narratives

    DEFF Research Database (Denmark)

    Kiib, Hans

    2010-01-01

    In this essay, I focus on the combination of programs and the architecture of cultural projects that have emerged within the last few years. These projects are characterized as “hybrid cultural projects,” because they intend to combine experience with entertainment, play, and learning. This essay … a functional framework for these concepts, but tries increasingly to endow the main idea of the cultural project with a spatially aesthetic expression, a shift towards “experience architecture.” A great number of these projects typically recycle and reinterpret narratives related to historical buildings and architectural heritage; another group tries to embed new performative technologies in expressive architectural representation. Finally, this essay provides a theoretical framework for the analysis of the political rationales of these projects and for how the architectural representation bridges the gap between …

  19. Grid production with the ATLAS Event Service

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration

    2018-01-01

    ATLAS has developed and previously presented a new computing architecture, the Event Service, that allows real time delivery of fine grained workloads which process dispatched events (or event ranges) and immediately streams outputs. The principal aim was to profit from opportunistic resources such as commercial cloud, supercomputing, and volunteer computing, and otherwise unused cycles on clusters and grids. During the development and deployment phase, its utility also on the grid and conventional clusters for the exploitation of otherwise unused cycles became apparent. Here we describe our experience commissioning the Event Service on the grid in the ATLAS production system. We study the performance compared with standard simulation production. We describe the integration with the ATLAS data management system to ensure scalability and compatibility with object stores. Finally, we outline the remaining steps towards a fully commissioned system.

  20. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    Science.gov (United States)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer

  1. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    Science.gov (United States)

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a major burden on the global health economy. T2DM care management requires a multi-disciplinary and multi-organizational approach. Because of different languages and terminologies, education, experiences, skills, etc., such an approach poses a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based, and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). For representing the functional aspects of a system, the Business Process Modeling Notation (BPMN) is used. The system architecture obtained is presented using GCM graphical notation, class diagrams, and BPMN diagrams. The architecture-centric approach considers the compositional nature of the real-world system and its functionalities, guarantees coherence, and supports sound inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In this way, intelligent, adaptive, and interoperable T2DM care systems can be derived from the presented model, as shown in another publication.

  2. Modeling the energy performance of event-driven wireless sensor network by using static sink and mobile sink.

    Science.gov (United States)

    Chen, Jiehui; Salim, Mariam B; Matsumoto, Mitsuji

    2010-01-01

    Wireless Sensor Networks (WSNs) designed for mission-critical applications suffer from limited sensing capacities, particularly fast energy depletion. Mobile sinks can be used to balance the energy consumption in WSNs, but the frequent location updates of the mobile sinks can lead to data collisions and rapid energy consumption for some specific sensors. This paper explores an optimal barrier-coverage-based sensor deployment for event-driven WSNs, where a dual-sink model was designed to evaluate the energy performance of not only the static sensors but also the Static Sink (SS) and Mobile Sinks (MSs) simultaneously, based on parameters such as the sensor transmission range r and the velocity v of the mobile sink. Moreover, an MS mobility model was developed to enable the SS and MSs to collaborate effectively while achieving spatiotemporal energy-performance efficiency, using the cumulative distribution function (cdf), a Poisson process, and an M/G/1 queue. The simulation results clearly demonstrated the improved energy performance of the whole network, and our eDSA algorithm is more efficient than the static-sink model, reducing energy consumption approximately by half. Moreover, we demonstrate that our results are robust to realistic sensing models and validate their correctness through extensive simulations.
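
Why a mobile sink balances energy can be shown with a toy comparison: each sensor pays transmission energy growing with the square of its distance to the nearest sink, so adding mobile-sink waypoints shortens many hops. The field size, sensor count, waypoint trajectory, and d² cost model are illustrative assumptions, not the paper's eDSA model:

```python
import random

# Toy static-sink vs dual-sink (static + mobile) energy comparison.
# Cost model: each sensor spends energy proportional to the squared
# distance to its nearest available sink (an assumed free-space model).
random.seed(1)
FIELD, N = 100.0, 200
sensors = [(random.uniform(0, FIELD), random.uniform(0, FIELD))
           for _ in range(N)]

def cost(p, sinks):
    """Squared distance from sensor p to its nearest sink."""
    return min((p[0] - sx) ** 2 + (p[1] - sy) ** 2 for sx, sy in sinks)

# Static-sink only: everyone reports to the center.
static_only = sum(cost(p, [(50.0, 50.0)]) for p in sensors)

# Dual-sink: the mobile sink is sampled at waypoints along an assumed
# trajectory; each sensor uses whichever sink is nearest when reporting.
waypoints = [(x, 10.0) for x in (10, 30, 50, 70, 90)]
dual = sum(min(cost(p, [(50.0, 50.0), w]) for w in waypoints)
           for p in sensors)
print(dual < static_only)   # True: the mobile sink shortens many hops
```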

  4. Building the Rainbow Nation. A critical analysis of the role of architecture in materializing a post-apartheid South African identity

    Directory of Open Access Journals (Sweden)

    Kim Raedt

    2012-02-01

    Full Text Available Soon after apartheid was abolished in 1994, the quest for a new, ‘authentic’ South African identity resulted in the emergence of the “Rainbow Nation” idea, picturing an equal, multicultural, and reconciled society. As architecture is considered a crucial element in the promotion of this Rainbow identity, the country witnessed a remarkable “building boom,” with its apogee roughly between 1998 and 2010. Huge investments have been made in state-driven projects that place the memory of apartheid at the center of the architectural debate, mostly museums and memorials. However, the focus of this paper lies on another, less highlighted tendency in current architectural practice. This paper demonstrates that, through the construction of urban community services, South African architects attempt to materialize the Rainbow Nation in a way that might be closer to the everyday reality of society. Key words: architecture, post-apartheid, Cape Town, South Africa, identity

  5. A roadmap for caGrid, an enterprise Grid architecture for biomedical research.

    Science.gov (United States)

    Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Chue Hong, Neil

    2008-01-01

    caGrid is a middleware system which combines the Grid computing, service-oriented architecture, and model-driven architecture paradigms to support the development of interoperable data and analytical resources and the federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG) program. This program was established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research, with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from the biomedical research, Grid computing, and high-performance computing communities.

  6. Architecture & Environment

    Science.gov (United States)

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  7. Shock Geometry and Spectral Breaks in Large SEP Events

    Science.gov (United States)

    Li, G.; Zank, G. P.; Verkhoglyadova, Olga; Mewaldt, R. A.; Cohen, C. M. S.; Mason, G. M.; Desai, M. I.

    2009-09-01

    Solar energetic particle (SEP) events are traditionally classified as "impulsive" or "gradual." It is now widely accepted that in gradual SEP events, particles are accelerated at coronal mass ejection-driven (CME-driven) shocks. In many of these large SEP events, particle spectra exhibit double power-law or exponential rollover features, with the break energy or rollover energy ordered as (Q/A)^α, with Q being the ion charge in units of e and A the ion mass in units of the proton mass m_p. This Q/A dependence of the spectral breaks provides an opportunity to study the underlying acceleration mechanism. In this paper, we examine how the Q/A dependence may depend on shock geometry. Using the nonlinear guiding center theory, we show that α ~ 1/5 for a quasi-perpendicular shock. Such a weak Q/A dependence is in contrast to the quasi-parallel shock case, where α can reach 2. This difference in α reflects the difference between the underlying parallel and perpendicular diffusion coefficients κ_∥ and κ_⊥. We also examine the Q/A dependence of the break energy for the most general oblique-shock case. Our analysis offers a possible way to remotely examine the geometry of a CME-driven shock when it is close to the Sun, where the acceleration of particles to high energies occurs.
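
The scaling stated in the abstract can be written compactly; the subscript notation below is ours, chosen only to restate the abstract's relation:

```latex
% Spectral-break scaling as described in the abstract: for ion species s
% with charge Q_s (in units of e) and mass A_s (in units of m_p),
\[
  E_{\mathrm{break},\,s} \;\propto\; \left(\frac{Q_s}{A_s}\right)^{\alpha},
\]
% with \alpha \sim 1/5 at a quasi-perpendicular shock and \alpha up to
% \sim 2 at a quasi-parallel shock; the difference traces the distinct
% energy dependences of the diffusion coefficients
% \kappa_{\parallel} and \kappa_{\perp}.
```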

  8. Fragments of Architecture

    DEFF Research Database (Denmark)

    Bang, Jacob Sebastian

    2016-01-01

    Topic 3: “Case studies dealing with the artistic and architectural work of architects worldwide, and the ties between specific artistic and architectural projects, methodologies and products”

  9. Simulating Hydrologic Flow and Reactive Transport with PFLOTRAN and PETSc on Emerging Fine-Grained Parallel Computer Architectures

    Science.gov (United States)

    Mills, R. T.; Rupp, K.; Smith, B. F.; Brown, J.; Knepley, M.; Zhang, H.; Adams, M.; Hammond, G. E.

    2017-12-01

    As the high-performance computing community pushes towards the exascale horizon, power and heat considerations have driven the increasing importance and prevalence of fine-grained parallelism in new computer architectures. High-performance computing centers have become increasingly reliant on GPGPU accelerators and "manycore" processors such as the Intel Xeon Phi line, and 512-bit SIMD registers have even been introduced in the latest generation of Intel's mainstream Xeon server processors. The high degree of fine-grained parallelism and more complicated memory hierarchy considerations of such "manycore" processors present several challenges to existing scientific software. Here, we consider how the massively parallel, open-source hydrologic flow and reactive transport code PFLOTRAN - and the underlying Portable, Extensible Toolkit for Scientific Computation (PETSc) library on which it is built - can best take advantage of such architectures. We will discuss some key features of these novel architectures and our code optimizations and algorithmic developments targeted at them, and present experiences drawn from working with a wide range of PFLOTRAN benchmark problems on these architectures.

  10. Modular Integrated Monitoring System (MIMS). Architecture and implementation

    International Nuclear Information System (INIS)

    Funkhouser, D.R.; Davidson, G.W.; Deland, S.M.

    1999-01-01

    The MIMS is being developed as a cost-effective means of performing safeguards in unattended remote monitoring applications. Based on industry standards and an open systems approach, the MIMS architecture supports both data acquisition and data review subsystems. Data includes images as well as discrete and analog sensor outputs. The MIMS uses an Echelon LonWorks network as a standard means and method of data acquisition from the sensors. A common database not only stores sensor and image data but also provides a structure by which dynamic changes to the sensor system can be reflected in the data acquisition and data review subsystems without affecting the execution software. The architecture includes standards for wide area communications between data acquisition systems and data review systems. Data authentication is provided as an integral part of the design. The MIMS also provides a generic set of tools for analyzing both system behavior and observed events. The MIMS software implements this architecture by combining the use of commercial applications with a set of custom 16- and 32-bit Microsoft Windows applications which run under the Windows NT and Windows 95 operating systems. (author)

  11. Enterprise architecture management

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Gøtze, John; Møller, Charles

    2017-01-01

    Despite the growing interest in enterprise architecture management, researchers and practitioners lack a shared understanding of its applications in organizations. Building on findings from a literature review and eight case studies, we develop a taxonomy that categorizes applications of enterprise...... architecture management based on three classes of enterprise architecture scope. Organizations may adopt enterprise architecture management to help form, plan, and implement IT strategies; help plan and implement business strategies; or to further complement the business strategy-formation process....... The findings challenge the traditional IT-centric view of enterprise architecture management application and suggest enterprise architecture management as an approach that could support the consistent design and evolution of an organization as a whole....

  13. Hierarchical nitrogen doped bismuth niobate architectures: Controllable synthesis and excellent photocatalytic activity

    International Nuclear Information System (INIS)

    Hou, Jungang; Cao, Rui; Wang, Zheng; Jiao, Shuqiang; Zhu, Hongmin

    2012-01-01

    Graphical abstract: Efficient visible-light-driven photocatalysts of peony-like nitrogen doped Bi3NbO7 hierarchical architectures and silver-layered Bi3NbO7−xNx heterostructures were successfully synthesized in this work. Highlights: ► N-Bi3NbO7 architectures were synthesized via a two-step hydrothermal process. ► Electronic structure calculations indicated that N replaced O in the samples. ► A growth mechanism is proposed for the transformation of nanoparticles to microflowers. ► Excellent degradation activities of the N-Bi3NbO7 architectures were obtained. ► Enhanced photocatalytic performance was observed for Ag/N-Bi3NbO7 architectures. - Abstract: Nitrogen doped bismuth niobate (N-Bi3NbO7) hierarchical architectures were synthesized via a facile two-step hydrothermal process. XRD patterns revealed that the defect fluorite-type crystal structure of Bi3NbO7 remained intact upon nitrogen doping. Electron microscopy showed that the N-Bi3NbO7 architecture has a unique peony-like spherical superstructure composed of numerous nanosheets. UV–vis spectra indicated that nitrogen doping in the compound results in a red-shift of the absorption edge from 450 nm to 470 nm. XPS indicated that [Bi/Nb]-N bonds were formed by inducing nitrogen to replace a small amount of oxygen in Bi3NbO7−xNx, which is explained by electronic structure calculations including energy band and density of states. Based on observations of the architectures' formation, a possible growth mechanism was proposed to explain the transformation of polyhedral-like nanoparticles to peony-like microflowers via an Ostwald ripening mechanism followed by self-assembly. Owing to their large specific surface area and nitrogen doping, the N-Bi3NbO7 architectures exhibited higher photocatalytic activities in the decomposition of organic pollutant under visible-light irradiation than Bi3NbO7 nanoparticles. Furthermore, an enhanced photocatalytic performance was also observed for Ag

  14. Performance evaluation of OpenFOAM on many-core architectures

    International Nuclear Information System (INIS)

    Brzobohatý, Tomáš; Říha, Lubomír; Karásek, Tomáš; Kozubek, Tomáš

    2015-01-01

    In this article, the application of the Open Source Field Operation and Manipulation (OpenFOAM) C++ libraries to solving engineering problems on many-core architectures is presented. The objective of this article is to present the scalability of OpenFOAM on parallel platforms for real engineering problems of fluid dynamics. Scalability tests of OpenFOAM are performed using various hardware and different implementations of the standard PCG and PBiCG Krylov iterative methods. Speed-ups of various implementations of linear solvers using GPU and MIC accelerators are presented in this paper. Numerical experiments on 3D lid-driven cavity flow for several cases with various numbers of cells are presented

  15. Performance evaluation of OpenFOAM on many-core architectures

    Energy Technology Data Exchange (ETDEWEB)

    Brzobohatý, Tomáš; Říha, Lubomír; Karásek, Tomáš, E-mail: tomas.karasek@vsb.cz; Kozubek, Tomáš [IT4Innovations National Supercomputing Center, VŠB-Technical University of Ostrava (Czech Republic)

    2015-03-10

    In this article, the application of the Open Source Field Operation and Manipulation (OpenFOAM) C++ libraries to solving engineering problems on many-core architectures is presented. The objective of this article is to present the scalability of OpenFOAM on parallel platforms for real engineering problems of fluid dynamics. Scalability tests of OpenFOAM are performed using various hardware and different implementations of the standard PCG and PBiCG Krylov iterative methods. Speed-ups of various implementations of linear solvers using GPU and MIC accelerators are presented in this paper. Numerical experiments on 3D lid-driven cavity flow for several cases with various numbers of cells are presented.

  16. A task-based support architecture for developing point-of-care clinical decision support systems for the emergency department.

    Science.gov (United States)

    Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B

    2013-01-01

    The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point of care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method, which allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing the knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE, a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model and interacts with hospital information systems. The proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.
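    The decoupling of executable agents from ontological knowledge described above can be sketched minimally. All class and rule names below are hypothetical, not taken from MET3-AE.

    ```python
    # Minimal sketch (hypothetical names, not the MET3-AE implementation) of
    # the decoupling idea: an agent is bound to a workflow task, while the
    # domain knowledge it consults is injected, so it can be swapped per disease.

    class OntologyModel:
        """Stands in for an ontological knowledge model an agent consults."""
        def __init__(self, rules):
            self.rules = rules  # finding -> recommendation

        def recommend(self, finding):
            return self.rules.get(finding, "no recommendation")

    class TriageAgent:
        """Executable component: knowledge is injected, not embedded."""
        def __init__(self, knowledge):
            self.knowledge = knowledge

        def handle(self, finding):
            return self.knowledge.recommend(finding)

    # The same agent class can be reused with a different knowledge model.
    asthma_knowledge = OntologyModel({"severe wheeze": "administer bronchodilator"})
    agent = TriageAgent(asthma_knowledge)
    print(agent.handle("severe wheeze"))  # -> administer bronchodilator
    ```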

  17. Next generation PET data acquisition architectures

    Science.gov (United States)

    Jones, W. F.; Reed, J. H.; Everman, J. L.; Young, J. W.; Seese, R. D.

    1997-06-01

    New architectures for higher performance data acquisition in PET are proposed. Improvements are demanded primarily by three areas of advancing PET state of the art. First, larger detector arrays such as the Hammersmith ECAT® EXACT HR++ exceed the addressing capacity of 32 bit coincidence event words. Second, better scintillators (LSO) make depth-of-interaction (DOI) and time-of-flight (TOF) operation more practical. Third, fully optimized single photon attenuation correction requires higher rates of data collection. New technologies which enable the proposed third generation Real Time Sorter (RTS III) include: (1) 80 Mbyte/sec Fibre Channel RAID disk systems, (2) PowerPC on both VMEbus and PCI Local bus, and (3) quadruple interleaved DRAM controller designs. Data acquisition flexibility is enhanced through a wider 64 bit coincidence event word. PET methodology support includes DOI (6 bits), TOF (6 bits), multiple energy windows (6 bits), 512×512 sinogram indexes (18 bits), and 256 crystal rings (16 bits). Throughput of 10 M events/sec is expected for list-mode data collection as well as both on-line and replay histogramming. Fully efficient list-mode storage for each PET application is provided by real-time bit packing of only the active event word bits. Real-time circuits provide DOI rebinning.
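    The 64-bit event word above budgets 6 + 6 + 6 + 18 + 16 = 52 bits across its fields. A packing round trip can be sketched as follows; the abstract only gives field widths, so the bit ordering here is an assumption for illustration.

    ```python
    # Sketch of packing the 64-bit coincidence event word described above.
    # Field widths come from the abstract (DOI 6, TOF 6, energy windows 6,
    # sinogram index 18, crystal rings 16); the LSB-first layout is assumed.

    FIELDS = [              # (name, width in bits), LSB-first
        ("doi", 6),
        ("tof", 6),
        ("energy_windows", 6),
        ("sinogram", 18),
        ("crystal_rings", 16),
    ]

    def pack_event(**values):
        word, shift = 0, 0
        for name, width in FIELDS:
            v = values[name]
            assert 0 <= v < (1 << width), f"{name} out of range"
            word |= v << shift
            shift += width
        return word  # uses 52 of the 64 available bits

    def unpack_event(word):
        out, shift = {}, 0
        for name, width in FIELDS:
            out[name] = (word >> shift) & ((1 << width) - 1)
            shift += width
        return out

    evt = pack_event(doi=5, tof=33, energy_windows=2, sinogram=123456, crystal_rings=200)
    assert unpack_event(evt) == {"doi": 5, "tof": 33, "energy_windows": 2,
                                 "sinogram": 123456, "crystal_rings": 200}
    ```

    The remaining 12 bits would be available for flags or future fields, which is one motivation for moving from 32-bit to 64-bit event words.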

  18. Next generation PET data acquisition architectures

    International Nuclear Information System (INIS)

    Jones, W.F.; Reed, J.H.; Everman, J.L.

    1996-01-01

    New architectures for higher performance data acquisition in PET are proposed. Improvements are demanded primarily by three areas of advancing PET state of the art. First, larger detector arrays such as the Hammersmith ECAT® EXACT HR++ exceed the addressing capacity of 32 bit coincidence event words. Second, better scintillators (LSO) make depth-of-interaction (DOI) and time-of-flight (TOF) operation more practical. Third, fully optimized single photon attenuation correction requires higher rates of data collection. New technologies which enable the proposed third generation Real Time Sorter (RTS III) include: (1) 80 Mbyte/sec Fibre Channel RAID disk systems, (2) PowerPC on both VMEbus and PCI Local bus, and (3) quadruple interleaved DRAM controller designs. Data acquisition flexibility is enhanced through a wider 64 bit coincidence event word. PET methodology support includes DOI (6 bits), TOF (6 bits), multiple energy windows (6 bits), 512 x 512 sinogram indexes (18 bits), and 256 crystal rings (16 bits). Throughput of 10 M events/sec is expected for list-mode data collection as well as both on-line and replay histogramming. Fully efficient list-mode storage for each PET application is provided by real-time bit packing of only the active event word bits. Real-time circuits provide DOI rebinning

  19. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between...... processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of agricultural robots are provided......, assisting with bridging the different engineering disciplines. Timing plays an important role in agricultural robotic applications; synchronisation of robot movement and implement actions is important in order to achieve precision spraying, mechanical weeding, individual feeding, etc. Discovering...

  20. Alterations of Brain Functional Architecture Associated with Psychopathic Traits in Male Adolescents with Conduct Disorder

    OpenAIRE

    Pu, Weidan; Luo, Qiang; Jiang, Yali; Gao, Yidian; Ming, Qingsen; Yao, Shuqiao

    2017-01-01

    Psychopathic traits of conduct disorder (CD) have a core callous-unemotional (CU) component and an impulsive-antisocial component. Previous task-driven fMRI studies have suggested that psychopathic traits are associated with dysfunction of several brain areas involved in different cognitive functions (e.g., empathy, reward, and response inhibition), but the relationship between psychopathic traits and intrinsic brain functional architecture has not yet been explored in CD. Using a holist...

  1. Modeling Architectural Patterns’ Behavior Using Architectural Primitives

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Architectural patterns have an impact on both the structure and the behavior of a system at the architecture design level. However, it is challenging to model patterns’ behavior in a systematic way because modeling languages do not provide the appropriate abstractions and because each pattern

  2. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    Science.gov (United States)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoders and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy, and encode the signal regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate that deep neural networks are robust classifiers that can be deployed in real environments to monitor the seismicity of restless volcanoes.
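    The feature scheme above (5 LPC coefficients per segment, three non-overlapping segments, hence 15 features per event regardless of duration) can be sketched with the standard autocorrelation + Levinson-Durbin recursion. The synthetic signal below stands in for a seismic trace; this is not the authors' code.

    ```python
    # Sketch of the LPC feature extraction described above: 5 LPC coefficients
    # over three non-overlapping segments of a signal, via the autocorrelation
    # method and the Levinson-Durbin recursion. The signal here is synthetic.
    import math

    def autocorr(x, max_lag):
        return [sum(x[i] * x[i - lag] for i in range(lag, len(x)))
                for lag in range(max_lag + 1)]

    def lpc(x, order):
        """Levinson-Durbin: LPC coefficients a[1..order] from signal x."""
        r = autocorr(x, order)
        a = [0.0] * (order + 1)
        err = r[0]
        for i in range(1, order + 1):
            acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
            k = acc / err                      # reflection coefficient
            new_a = a[:]
            new_a[i] = k
            for j in range(1, i):
                new_a[j] = a[j] - k * a[i - j]
            a = new_a
            err *= (1.0 - k * k)               # prediction error shrinks
        return a[1:]

    # Synthetic three-tone stand-in for a seismic trace.
    signal = [math.sin(0.3 * n) + 0.5 * math.sin(1.1 * n) + 0.2 * math.sin(2.0 * n)
              for n in range(300)]
    seg_len = len(signal) // 3
    features = []
    for s in range(3):                         # three non-overlapping segments
        seg = signal[s * seg_len:(s + 1) * seg_len]
        features.extend(lpc(seg, order=5))     # 5 LPC coefficients per segment
    print(len(features))  # 15 features, independent of signal duration
    ```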

  3. Space and Architecture's Current Line of Research? A Lunar Architecture Workshop With An Architectural Agenda.

    Science.gov (United States)

    Solomon, D.; van Dijk, A.

    The "2002 ESA Lunar Architecture Workshop" (June 3-16) ESTEC, Noordwijk, NL and V2_Lab, Rotterdam, NL) is the first-of-its-kind workshop for exploring the design of extra-terrestrial (infra) structures for human exploration of the Moon and Earth-like planets introducing 'architecture's current line of research', and adopting an architec- tural criteria. The workshop intends to inspire, engage and challenge 30-40 European masters students from the fields of aerospace engineering, civil engineering, archi- tecture, and art to design, validate and build models of (infra) structures for Lunar exploration. The workshop also aims to open up new physical and conceptual terrain for an architectural agenda within the field of space exploration. A sound introduc- tion to the issues, conditions, resources, technologies, and architectural strategies will initiate the workshop participants into the context of lunar architecture scenarios. In my paper and presentation about the development of the ideology behind this work- shop, I will comment on the following questions: * Can the contemporary architectural agenda offer solutions that affect the scope of space exploration? It certainly has had an impression on urbanization and colonization of previously sparsely populated parts of Earth. * Does the current line of research in architecture offer any useful strategies for com- bining scientific interests, commercial opportunity, and public space? What can be learned from 'state of the art' architecture that blends commercial and public pro- grammes within one location? * Should commercial 'colonisation' projects in space be required to provide public space in a location where all humans present are likely to be there in a commercial context? Is the wave in Koolhaas' new Prada flagship store just a gesture to public space, or does this new concept in architecture and shopping evolve the public space? 
* What can we learn about designing (infra-) structures on the Moon or any other

  4. ATLAS EventIndex General Dataflow and Monitoring Infrastructure

    CERN Document Server

    Fernandez Casani, Alvaro; The ATLAS collaboration

    2016-01-01

    The ATLAS EventIndex has been running in production since mid-2015, reliably collecting information worldwide about all produced events and storing it in a central Hadoop infrastructure at CERN. A subset of this information is copied to an Oracle relational database for fast access. The system design and its optimization serve event picking from requests of a few events up to scales of tens of thousands of events; in addition, data consistency checks are performed for large production campaigns. Detecting duplicate events within the scope of physics collections has recently arisen as an important use case. This paper describes the general architecture of the project and the data flow and operation issues, which are addressed by recent developments to improve the throughput of the overall system. In this direction, the data collection system is reducing the usage of the messaging infrastructure to overcome the performance shortcomings detected during production peaks; an object storage approach is instea...
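    The duplicate-event use case mentioned above reduces, at its core, to keying events and flagging repeated keys within a collection. A minimal sketch, with made-up run/event numbers and dataset identifiers (the real EventIndex keys and GUIDs differ):

    ```python
    # Sketch of duplicate-event detection scoped to a physics collection:
    # events are keyed by (run_number, event_number) and any key appearing
    # more than once is reported. All identifiers below are illustrative.
    from collections import defaultdict

    def find_duplicates(events):
        """events: iterable of (run_number, event_number, dataset_guid)."""
        seen = defaultdict(list)
        for run, evt, guid in events:
            seen[(run, evt)].append(guid)
        return {key: guids for key, guids in seen.items() if len(guids) > 1}

    records = [
        (284500, 1001, "guid-A"),
        (284500, 1002, "guid-A"),
        (284500, 1001, "guid-B"),  # same (run, event) appears in a second dataset
    ]
    print(find_duplicates(records))  # {(284500, 1001): ['guid-A', 'guid-B']}
    ```

    At the billions-of-events scale the abstract quotes, the same grouping is done as a distributed aggregation (e.g. a map-reduce over the Hadoop store) rather than an in-memory dictionary, but the logic is the same.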

  5. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  6. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  7. MEDITERRANEAN VISUAL MESSAGES: THE CONUNDRUM OF IDENTITY, ISMS, AND MEANING IN CONTEMPORARY EGYPTIAN ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    Ashraf M. Salama

    2007-03-01

    Egypt, like many of the Mediterranean countries, is an amalgam of influences. Its rich history and unique geographical position afforded many opportunities for the emergence of architectural trends and movements. This article presents a new positional interpretation of contemporary Egyptian architecture. It is culled from a spectrum of issues I have presented in several events and published in local and international conferences and trade magazines. However, it calls for a fresh look at the issue of meaning in architecture by critically analyzing the current status of architecture in Egypt through a reading of trends that emerged over the last decade. The article discusses the concepts of Mediterraneanism and Middle Easternism in association with the situation of architecture and urbanism in Egypt. A number of ISMS including postmodernism, historical revivalism, critical regionalism and confusing symbolism are identified and reviewed, and representative examples are critically analyzed. The article concludes by outlining an approach for a deeper insight toward the understanding of meaning in Egyptian architecture.

  8. LHCb Kalman Filter cross architecture studies

    Science.gov (United States)

    Cámpora Pérez, Daniel Hugo

    2017-10-01

    The 2020 upgrade of the LHCb detector will vastly increase the rate of collisions the Online system needs to process in software, in order to filter events in real time. 30 million collisions per second will pass through a selection chain, where each step is executed conditional on its prior acceptance. The Kalman Filter is a fit applied to all reconstructed tracks which, due to its time characteristics and early execution in the selection chain, consumes 40% of the whole reconstruction time in the current trigger software. This makes the Kalman Filter a time-critical component as the LHCb trigger evolves into a full software trigger in the Upgrade. I present a new Kalman Filter algorithm for LHCb that can efficiently make use of any kind of SIMD processor, and its design is explained in depth. Performance benchmarks are compared between a variety of hardware architectures, including x86_64, Power8, and the Intel Xeon Phi accelerator, and the suitability of said architectures to efficiently perform the LHCb Reconstruction process is determined.
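    The predict/update loop a Kalman-based track fit repeats per detector hit can be illustrated with a textbook scalar filter. This is a generic sketch, not the LHCb implementation or its SIMD layout; all parameter values are illustrative.

    ```python
    # Generic one-dimensional Kalman filter sketch: track a constant state
    # from noisy measurements via the predict/update recursion. Not the LHCb
    # algorithm; parameters q, r, x0, p0 are illustrative assumptions.

    def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
        """q: process noise variance, r: measurement noise variance."""
        x, p = x0, p0
        estimates = []
        for z in measurements:
            p = p + q                # predict: constant state, variance grows
            k = p / (p + r)          # Kalman gain
            x = x + k * (z - x)      # update with the measurement residual
            p = (1.0 - k) * p
            estimates.append(x)
        return estimates

    noisy = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
    est = kalman_1d(noisy)
    print(est[-1])  # converges toward the true value near 1.0
    ```

    A real track fit carries a state vector and covariance matrix instead of scalars, which is precisely what makes it amenable to SIMD: many independent tracks can be filtered in lockstep across vector lanes.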

  9. Data driven CAN node reliability assessment for manufacturing system

    Science.gov (United States)

    Zhang, Leiming; Yuan, Yong; Lei, Yong

    2017-01-01

    The reliability of the Controller Area Network (CAN) is critical to the performance and safety of the system. However, direct bus-off time assessment tools are lacking in practice due to the inaccessibility of node information and the complexity of node interactions upon errors. In order to measure the mean time to bus-off (MTTB) of all the nodes, a novel data-driven node bus-off time assessment method for CAN networks is proposed that directly uses network error information. First, the corresponding network error event sequence for each node is constructed using multiple-layer network error information. Then, a generalized zero-inflated Poisson process (GZIP) model is established for each node based on the error event sequence. Finally, the stochastic model is constructed to predict the MTTB of the node. Accelerated case studies with different error injection rates are conducted on a laboratory network to demonstrate the proposed method, where the network errors are generated by a computer-controlled error injection system. Experimental results show that the MTTB of nodes predicted by the proposed method agrees well with observations in the case studies. The proposed data-driven node time-to-bus-off assessment method for CAN networks can successfully predict the MTTB of nodes by directly using network error event data.
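    The modelling step above can be sketched with a plain zero-inflated Poisson (ZIP) fitted by the method of moments; the paper's generalized formulation is richer than this. The bus-off rule used (transmit error counter +8 per error, bus-off above 255) is the standard CAN error-confinement behaviour, simplified here by ignoring counter decrements on successful transmissions.

    ```python
    # Sketch (not the paper's full GZIP formulation): fit a zero-inflated
    # Poisson to per-window error counts by the method of moments, then form
    # a crude mean-time-to-bus-off estimate. Error counts are illustrative.
    from statistics import mean, pvariance

    def fit_zip(counts):
        """Method-of-moments ZIP fit: mean = (1-pi)*lam,
        variance = (1-pi)*lam*(1+pi*lam); returns (pi, lam)."""
        m, v = mean(counts), pvariance(counts)
        s = max(v / m - 1.0, 0.0)   # s = pi*lam; s = 0 reduces to plain Poisson
        lam = m + s
        return s / lam, lam

    def mttb_windows(pi, lam, errors_to_bus_off=256 // 8):
        """Expected observation windows until the TEC reaches bus-off,
        assuming errors accumulate at the ZIP mean rate (1-pi)*lam."""
        return errors_to_bus_off / ((1.0 - pi) * lam)

    counts = [0, 0, 3, 0, 2, 0, 0, 4, 0, 1]   # errors per 1 s observation window
    pi, lam = fit_zip(counts)
    print(f"pi={pi:.2f} lam={lam:.2f} MTTB~{mttb_windows(pi, lam):.0f} windows")
    ```

    The zero inflation captures the many error-free windows typical of a healthy bus, while lam models the burst intensity when errors do occur.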

  10. Domain-specific languages for enterprise systems

    DEFF Research Database (Denmark)

    Andersen, Jesper; Bahr, Patrick; Henglein, Fritz

    2014-01-01

    The process-oriented event-driven transaction systems (POETS) architecture introduced by Henglein et al. is a novel software architecture for enterprise resource planning (ERP) systems. POETS employs a pragmatic separation between (i) transactional data, that is, what has happened; (ii) reports...... auditability; and support for referable data that may evolve over time, also while retaining full auditability as well as referential integrity. Besides the revised architecture, we present the DSLs used to specify data definitions, reports, and contracts respectively. Finally, we illustrate a use case...

  11. Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE)

    Science.gov (United States)

    2005-04-01

    Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE). Felix Bachmann and Mark Klein, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213-3890, April 2005. Quality requirements and constraints are most important for architecture design. Here's some evidence: If the only concern is

  12. MEDITERRANEAN VISUAL MESSAGES: THE CONUNDRUM OF IDENTITY, ISMS, AND MEANING IN CONTEMPORARY EGYPTIAN ARCHITECTURE

    OpenAIRE

    Ashraf M. Salama

    2007-01-01

    Egypt like many of the Mediterranean countries is an amalgam of influences. Its rich history and unique geographical position afforded many opportunities for the emergence of architectural trends and movements. This article presents a new positional interpretation of contemporary Egyptian architecture. It is culled from a spectrum of issues I have presented in several events and published in local and international conferences and trade magazines. However, it calls for a fresh look at the iss...

  13. Soil hydraulic material properties and layered architecture from time-lapse GPR

    Science.gov (United States)

    Jaumann, Stefan; Roth, Kurt

    2018-04-01

    Quantitative knowledge of the subsurface material distribution and its effective soil hydraulic material properties is essential to predict soil water movement. Ground-penetrating radar (GPR) is a noninvasive and nondestructive geophysical measurement method that is suitable to monitor hydraulic processes. Previous studies showed that the GPR signal from a fluctuating groundwater table is sensitive to the soil water characteristic and the hydraulic conductivity function. In this work, we show that the GPR signal originating from both the subsurface architecture and the fluctuating groundwater table is suitable to estimate the position of layers within the subsurface architecture together with the associated effective soil hydraulic material properties with inversion methods. To that end, we parameterize the subsurface architecture, solve the Richards equation, convert the resulting water content to relative permittivity with the complex refractive index model (CRIM), and solve Maxwell's equations numerically. In order to analyze the GPR signal, we implemented a new heuristic algorithm that detects relevant signals in the radargram (events) and extracts the corresponding signal travel time and amplitude. This algorithm is applied to simulated as well as measured radargrams and the detected events are associated automatically. Using events instead of the full wave regularizes the inversion focussing on the relevant measurement signal. For optimization, we use a global-local approach with preconditioning. Starting from an ensemble of initial parameter sets drawn with a Latin hypercube algorithm, we sequentially couple a simulated annealing algorithm with a Levenberg-Marquardt algorithm. The method is applied to synthetic as well as measured data from the ASSESS test site. We show that the method yields reasonable estimates for the position of the layers as well as for the soil hydraulic material properties by comparing the results to references derived from ground
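    The CRIM conversion mentioned above (water content to relative permittivity) is a simple volumetric mixing formula and can be sketched standalone. Porosity and the component permittivities below are typical illustrative values, not data from the ASSESS site.

    ```python
    # The CRIM step from the abstract as a standalone sketch: bulk relative
    # permittivity from volumetric water content theta, with exponent 0.5:
    #   sqrt(eps) = (1-phi)*sqrt(eps_solid) + theta*sqrt(eps_water)
    #               + (phi-theta)*sqrt(eps_air)
    # All parameter values are illustrative assumptions.
    import math

    def crim_permittivity(theta, porosity=0.35,
                          eps_water=80.0, eps_solid=5.0, eps_air=1.0):
        assert 0.0 <= theta <= porosity, "water content cannot exceed porosity"
        root = ((1.0 - porosity) * math.sqrt(eps_solid)
                + theta * math.sqrt(eps_water)
                + (porosity - theta) * math.sqrt(eps_air))
        return root ** 2

    for theta in (0.05, 0.15, 0.30):
        print(theta, round(crim_permittivity(theta), 2))
    ```

    Because the GPR wave speed scales as 1/sqrt(eps), wetter soil (larger theta, larger eps) slows the signal and increases travel times, which is what makes the radargram sensitive to the hydraulic state.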

  14. Implementing the competences-based students-centered learning approach in Architectural Design Education. The case of the T MEDA Pilot Architectural Program at the Hashemite University (Jordan

    Directory of Open Access Journals (Sweden)

    Ahmad A. S. Al Husban

    2016-11-01

    Full Text Available Higher educational systems are becoming increasingly oriented towards the competences-based student-centered learning and outcome approach. Worldwide, these systems are focusing on the student as a whole: on their intellectual, professional, psychological, moral, and spiritual dimensions. This research was conducted in an attempt to answer the main research question: how can architectural design courses be designed based on the required competences, and how can the teaching, learning activities and assessment methods be structured and aligned in order to allow students to achieve and reach the intended learning outcomes? This research used a case-study-driven best practice research method to answer the research questions, based on the T MEDA pilot architectural program that was implemented at the Hashemite University, Jordan. This research found that it is important for architectural education to adopt the student-centered learning method. Such an approach increases the effectiveness of teaching and learning methods, enhances the design studio environment, and focuses on students’ engagement to develop their design process and product. Moreover, this research found that using different assessment methods in architectural design courses helps students develop their learning outcomes and informs teachers about the effectiveness of their teaching process. Furthermore, the involvement of students in assessment produces effective learning and enhances their design motivation. However, applying the competences-based student-centered learning and outcome approach needs more time and staff. Another problem is that some instructors resist changing to the new methods or approaches because they prefer to use their old and traditional systems. Applying this method for the first time requires intensive resources, more time, and good cooperation between different instructors and the course coordinator. However, within the time this method

  15. Visualization of decision processes using a cognitive architecture

    Science.gov (United States)

    Livingston, Mark A.; Murugesan, Arthi; Brock, Derek; Frost, Wende K.; Perzanowski, Dennis

    2013-01-01

    Cognitive architectures are computational theories of reasoning the human mind engages in as it processes facts and experiences. A cognitive architecture uses declarative and procedural knowledge to represent mental constructs that are involved in decision making. Employing a model of behavioral and perceptual constraints derived from a set of one or more scenarios, the architecture reasons about the most likely consequence(s) of a sequence of events. Reasoning of any complexity and depth involving computational processes, however, is often opaque and challenging to comprehend. Arguably, for decision makers who may need to evaluate or question the results of autonomous reasoning, it would be useful to be able to inspect the steps involved in an interactive, graphical format. When a chain of evidence and constraint-based decision points can be visualized, it becomes easier to explore both how and why a scenario of interest will likely unfold in a particular way. In initial work on a scheme for visualizing cognitively-based decision processes, we focus on generating graphical representations of models run in the Polyscheme cognitive architecture. Our visualization algorithm operates on a modified version of Polyscheme's output, which is accomplished by augmenting models with a simple set of tags. We provide example visualizations and discuss properties of our technique that pose challenges for our representation goals. We conclude with a summary of feedback solicited from domain experts and practitioners in the field of cognitive modeling.

  16. How Talisman Energy implemented oilfield fleet safety using event driven AVL technologies from TELUS

    Energy Technology Data Exchange (ETDEWEB)

    Munroe, D. [Telus Energy Sector Organization, Calgary, AB (Canada)

    2006-07-01

    This conference presentation provided information on how Talisman Energy implemented oilfield fleet safety using event driven automated vehicle location (AVL) technologies from TELUS. Background information on TELUS in the energy sector, TELUS geomatics, TELUS mobile resource management (MRM) application modules, as well as Talisman Energy was first provided. Talisman looked to TELUS for AVL technologies because it needed to identify where employees were working, when they arrived and how long they were there. Prior to meeting with TELUS, Talisman Energy had implemented a system consisting of a trunk radio network to transmit data to the control room on each employee's location; however, it was unable to use the radio network to dispatch orders and communicate on an as-needed basis. TELUS implemented a customized solution that tracks vehicles and provides an easy method of communication for employees to track how long they are working at a site. The service provides the control centre with the tools to monitor the location of the vehicles on a map, and to communicate to the employee via a horn when their time has expired at the site. If the employee does not respond, the control room can call the nearest personnel to check on the worker's status. The integrated TELUS solution uses the Global Positioning System (GPS), Geographic Information Systems (GIS) and wireless communication over the Internet. The presentation provided numerous Talisman-specific maps. It concluded with several issues to consider, such as cross-platform cellular communications supported on a single hardware platform; additional support for satellite communications utilizing the same GIS platform; the need for extensive support for peripheral devices; and the need for extensive roaming capabilities. tabs., figs.

  17. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    Science.gov (United States)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation for encountering different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in a Service Oriented Architecture (SOA) platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fires and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are represented using web-based processing of remote sensing imagery utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.

  18. AN ARCHITECTURE FOR AUTOMATED FIRE DETECTION EARLY WARNING SYSTEM BASED ON GEOPROCESSING SERVICE COMPOSITION

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Full Text Available Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation for encountering different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in a Service Oriented Architecture (SOA) platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fires and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are represented using web-based processing of remote sensing imagery utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
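The chaining idea behind the WPS composition (each geoprocessing step a stand-alone service, with one process's output piped into the next and a final notification step) can be mimicked in a few lines. The process names and data below are hypothetical stand-ins for the MODIS workflow, not the authors' actual service interfaces:

```python
# Each function stands in for one WPS process in the chain.
def acquire_scene(region):
    # Stand-in for fetching a MODIS scene over the region.
    return {"region": region, "pixels": [290.0, 305.0, 362.0, 318.0]}

def detect_fire(scene, threshold_kelvin=330.0):
    # Stand-in for a thermal-anomaly detection process.
    hits = [i for i, t in enumerate(scene["pixels"]) if t > threshold_kelvin]
    return {"region": scene["region"], "fire_pixels": hits}

def notify(result):
    # Stand-in for the notification process at the end of the chain.
    if result["fire_pixels"]:
        return f"ALERT {result['region']}: fire at pixels {result['fire_pixels']}"
    return f"{result['region']}: no fire detected"

def compose(*processes):
    # The "engine of composition": feed each process's output into the next.
    def workflow(payload):
        for process in processes:
            payload = process(payload)
        return payload
    return workflow

fire_warning = compose(acquire_scene, detect_fire, notify)
print(fire_warning("zone-42"))  # ALERT zone-42: fire at pixels [2]
```

In the real architecture each step is a remote WPS operation invoked over HTTP rather than a local function, but the composition pattern is the same.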

  19. Power-Law Statistics of Driven Reconnection in the Magnetically Closed Corona

    Science.gov (United States)

    Klimchuk, J. A.; DeVore, C. R.; Knizhnik, K. J.; Uritskiy, V. M.

    2018-01-01

    Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.

  20. Power-law Statistics of Driven Reconnection in the Magnetically Closed Corona

    Science.gov (United States)

    Knizhnik, K. J.; Uritsky, V. M.; Klimchuk, J. A.; DeVore, C. R.

    2018-01-01

    Numerous observations have revealed that power-law distributions are ubiquitous in energetic solar processes. Hard X-rays, soft X-rays, extreme ultraviolet radiation, and radio waves all display power-law frequency distributions. Since magnetic reconnection is the driving mechanism for many energetic solar phenomena, it is likely that reconnection events themselves display such power-law distributions. In this work, we perform numerical simulations of the solar corona driven by simple convective motions at the photospheric level. Using temperature changes, current distributions, and Poynting fluxes as proxies for heating, we demonstrate that energetic events occurring in our simulation display power-law frequency distributions, with slopes in good agreement with observations. We suggest that the braiding-associated reconnection in the corona can be understood in terms of a self-organized criticality model driven by convective rotational motions similar to those observed at the photosphere.
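As a reading aid for the two records above, the slope of such a power-law frequency distribution is commonly measured with the maximum-likelihood estimator for continuous data (the Clauset-Shalizi-Newman form). The data below are synthetic, not from the paper's coronal simulations:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "event energies": draw from a continuous power law
# p(x) ~ x^(-alpha) for x >= xmin, via inverse-transform sampling.
alpha_true, xmin, n = 1.8, 1.0, 50_000
u = rng.random(n)
energies = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Maximum-likelihood estimate of the power-law index for continuous data:
#   alpha_hat = 1 + n / sum(ln(x_i / xmin))
alpha_hat = 1.0 + n / np.sum(np.log(energies / xmin))

print(round(alpha_hat, 2))  # close to alpha_true = 1.8
```

This estimator avoids the bias of fitting a straight line to a log-log histogram, which matters when comparing simulated slopes against observed flare statistics.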

  1. Long-lasting solar energetic electron injection during the 26 Dec 2013 widespread SEP event

    Science.gov (United States)

    Dresing, N.; Klassen, A.; Temmer, M.; Gomez-Herrero, R.; Heber, B.; Veronig, A.

    2017-12-01

    The solar energetic particle (SEP) event on 26 Dec 2013 was detected all around the Sun by the two STEREO spacecraft and close-to-Earth observers. While the two STEREOs were separated by 59 degrees and situated on the front side of the associated large coronal event, it was a backside event for Earth. Nevertheless, significant and long-lasting solar energetic electron anisotropies together with long rise times were observed at all three viewpoints, pointing to an extended electron injection. Although the CME-driven shock appears, at first glance, to account for the SEP event, a more detailed view reveals a more complex scenario: a CME-CME interaction takes place during the very early phase of the SEP event. Furthermore, four hours after the onset of the event, a second component is measured at all three viewpoints on top of the first SEP increase, consisting mainly of high-energy particles. We find that the CME-driven shock alone can hardly account for the observed SEP event in total; a trapping scenario together with ongoing particle acceleration is more likely.

  2. Life events and borderline personality features: the influence of gene–environment interaction and gene–environment correlation

    NARCIS (Netherlands)

    Distel, M.A.; Middeldorp, C.M.; Trull, T.J.; Derom, C.A.; Willemsen, G.; Boomsma, D.I.

    2011-01-01

    Background Traumatic life events are generally more common in patients with borderline personality disorder (BPD) than in non-patients or patients with other personality disorders. This study investigates whether exposure to life events moderates the genetic architecture of BPD features. As the

  3. Architectural freedom and industrialised architecture

    DEFF Research Database (Denmark)

    Vestergaard, Inge

    2012-01-01

    to the building physic problems a new industrialized period has started based on light weight elements basically made of wooden structures, faced with different suitable materials meant for individual expression for the specific housing area. It is the purpose of this article to widen up the different design...... to this systematic thinking of the building technique we get a diverse and functional architecture. Creating a new and clearer story telling about new and smart system based thinking behind the architectural expression....

  4. Design management in the architectural engineering and construction sector : proceedings of the joint CIB W096 Architectural Management and CIB TG49. Architectural Engineering Conference held in conjunction with the 8th Brazilian Workshop on Building Design Management, University of Sao Paulo, 4-8 December 2008

    NARCIS (Netherlands)

    Melhado, S.; Prins, M.; Emmitt, S.; Bouchlaghem, D.; Otter, den A.F.H.J.

    2008-01-01

    Following the Denmark meeting, held in Lyngby 2005, the CIB W096 commission on Architectural Management merged its own meetings with two large events, the Adaptables Conference in Eindhoven 2006, and the CIB world Conference in Cape Town in 2007. Papers were invited under the theme Design Management

  5. The LCLS Timing Event System

    Energy Technology Data Exchange (ETDEWEB)

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at SLAC-NAL. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS Timing System must co-exist and 'know' about the existing SLC Timing System. This paper describes the architecture, construction and performance of the LCLS timing event system.

  6. Preemptive Architecture: Explosive Art and Future Architectures in Cursed Urban Zones

    Directory of Open Access Journals (Sweden)

    Stahl Stenslie

    2017-04-01

    Full Text Available This article describes the art and architectural research project Preemptive Architecture, which uses artistic strategies and approaches to create bomb-ready architectural structures that act as instruments for the undoing of violence in war. Increasing environmental usability through destruction represents an inverse strategy that reverses common thinking patterns about warfare, art and architecture. Building structures predestined for a constructive destruction becomes a creative act. One of the main motivations behind this paper is to challenge and expand the material thinking as well as the socio-political conditions related to artistic, architectural and design-based practices. Article received: December 12, 2016; Article accepted: January 10, 2017; Published online: April 20, 2017. Original scholarly paper. How to cite this article: Stenslie, Stahl, and Magne Wiggen. "Preemptive Architecture: Explosive Art and Future Architectures in Cursed Urban Zones." AM Journal of Art and Media Studies 12 (2017): 29-39. doi: 10.25038/am.v0i12.165

  7. An event driven hybrid identity management approach to privacy enhanced e-health.

    Science.gov (United States)

    Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio

    2012-01-01

    Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it offers a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, effective revocation of consent--considered a privacy rule in sensitive scenarios--has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which makes it possible to substitute time constraints and explicit revocation by activating and deactivating authorization rights according to events. Our approach is to integrate this concept in IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML-compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based mechanism and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism.
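A minimal sketch of the "sleepyhead credential" idea as the abstract describes it: rights are dormant until an activating event and dormant again after a deactivating one, so no revocation list is ever consulted. All class, event, and user names here are hypothetical illustrations, not the paper's SAML protocol:

```python
from dataclasses import dataclass

@dataclass
class SleepyheadCredential:
    # A right that sleeps until events wake it up or put it back to sleep.
    holder: str
    right: str
    active: bool = False

class EventDrivenAuthority:
    def __init__(self):
        self.credentials = []
        self.rules = []  # (event, credential, activate?) triples

    def issue(self, holder, right, activate_on, deactivate_on):
        cred = SleepyheadCredential(holder, right)
        self.credentials.append(cred)
        self.rules.append((activate_on, cred, True))
        self.rules.append((deactivate_on, cred, False))
        return cred

    def notify(self, event):
        # Events toggle rights directly; no revocation list is checked.
        for ev, cred, activate in self.rules:
            if ev == event:
                cred.active = activate

    def authorized(self, holder, right):
        return any(c.active and c.holder == holder and c.right == right
                   for c in self.credentials)

authority = EventDrivenAuthority()
authority.issue("dr_lee", "read_record", activate_on="patient_admitted",
                deactivate_on="patient_discharged")

print(authority.authorized("dr_lee", "read_record"))  # False: still dormant
authority.notify("patient_admitted")
print(authority.authorized("dr_lee", "read_record"))  # True: activated by event
authority.notify("patient_discharged")
print(authority.authorized("dr_lee", "read_record"))  # False: consent withdrawn
```

The point of the pattern is that "revocation" becomes a cheap local state change driven by an event, rather than a list lookup or an online status-check protocol.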

  8. A model-driven approach for representing clinical archetypes for Semantic Web environments.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto

    2009-02-01

    The life-long clinical information of any person supported by electronic means configures his Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. There are currently different standards for representing and exchanging EHR information among different systems. In advanced EHR approaches, clinical information is represented by means of archetypes. Most of these approaches use the Archetype Definition Language (ADL) to specify archetypes. However, ADL has some drawbacks when attempting to perform semantic activities in Semantic Web environments. In this work, Semantic Web technologies are used to specify clinical archetypes for advanced EHR architectures. The advantages of using the Ontology Web Language (OWL) instead of ADL are described and discussed in this work. Moreover, a solution combining Semantic Web and Model-driven Engineering technologies is proposed to transform ADL into OWL for the CEN EN13606 EHR architecture.

  9. A generative tool for building health applications driven by ISO 13606 archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  10. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    Science.gov (United States)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data generating systems or processes.
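ACCEPT itself is a MATLAB toolbox; purely as an illustration of the compare-and-contrast harness the abstract describes, here is a hypothetical Python sketch that runs two simple early-warning detectors over the same time series with an injected adverse event (the detector choices and parameters are illustrative, not ACCEPT's):

```python
import numpy as np

rng = np.random.default_rng(7)
signal = rng.normal(0.0, 1.0, 300)
signal[250:] += 6.0  # abrupt shift toward an adverse condition at t = 250

def zscore_detector(x, window=30, k=4.0):
    # Alarm when a sample deviates k sigma from a trailing window.
    for t in range(window, len(x)):
        ref = x[t - window:t]
        if abs(x[t] - ref.mean()) > k * ref.std():
            return t
    return None

def cusum_detector(x, drift=0.5, threshold=8.0):
    # One-sided CUSUM: accumulate excess above an allowed drift.
    s = 0.0
    for t, v in enumerate(x):
        s = max(0.0, s + v - drift)
        if s > threshold:
            return t
    return None

# The harness: run every detector on the same data and compare alarms.
for name, detector in [("z-score", zscore_detector), ("CUSUM", cusum_detector)]:
    print(f"{name}: first alarm at t = {detector(signal)}")
```

A framework like ACCEPT generalizes exactly this loop: a common data interface, many candidate algorithms, and shared metrics (detection delay, false-alarm rate) for ranking them.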

  11. Enterprise architecture patterns practical solutions for recurring IT-architecture problems

    CERN Document Server

    Perroud, Thierry

    2013-01-01

    Every enterprise architect faces similar problems when designing and governing the enterprise architecture of a medium to large enterprise. Design patterns are a well-established concept in software engineering, used to define universally applicable solution schemes. By applying this approach to enterprise architectures, recurring problems in the design and implementation of enterprise architectures can be solved over all layers, from the business layer to the application and data layer down to the technology layer. Inversini and Perroud describe patterns at the level of enterprise architecture

  12. MUF architecture /art London

    DEFF Research Database (Denmark)

    Svenningsen Kajita, Heidi

    2009-01-01

    About MUF architecture, including an interview with Liza Fior and Katherine Clarke, partners in muf architecture/art.

  13. Projections of extreme water level events for atolls in the western Tropical Pacific

    Science.gov (United States)

    Merrifield, M. A.; Becker, J. M.; Ford, M.; Yao, Y.

    2014-12-01

    Conditions that lead to extreme water levels and coastal flooding are examined for atolls in the Republic of the Marshall Islands based on a recent field study of wave transformations over fringing reefs, tide gauge observations, and wave model hindcasts. Wave-driven water level extremes pose the largest threat to atoll shorelines, with coastal levels scaling as approximately one-third of the incident breaking wave height. The wave-driven coastal water level is partitioned into a mean setup, low frequency oscillations associated with cross-reef quasi-standing modes, and wind waves that reach the shore after undergoing high dissipation due to breaking and bottom friction. All three components depend on the water level over the reef; however, the sum of the components is independent of water level due to cancelling effects. Wave hindcasts suggest that wave-driven water level extremes capable of coastal flooding are infrequent events that require a peak wave event to coincide with mid- to high-tide conditions. Interannual and decadal variations in sea level do not change the frequency of these events appreciably. Future sea-level rise scenarios significantly increase the flooding threat associated with wave events, with a nearly exponential increase in flooding days per year as sea level exceeds 0.3 to 1.0 m above current levels.
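The scaling reported in this record (coastal water level roughly one-third of the incident breaking wave height, with flooding requiring a large-wave event to coincide with mid- to high tide) can be illustrated with a toy calculation. All numbers below are hypothetical, not the Marshall Islands hindcast values:

```python
import numpy as np

rng = np.random.default_rng(3)
hours = 24 * 365

# Semidiurnal tide (m) and a synthetic breaking-wave-height record (m).
tide = 0.8 * np.sin(2 * np.pi * np.arange(hours) / 12.42)
waves = rng.gamma(shape=2.0, scale=0.8, size=hours)

# Wave-driven coastal level scales as ~1/3 of breaking wave height,
# so flooding needs big waves AND a favorable tide at the same time.
flood_threshold = 1.8  # m above datum, hypothetical
coastal_level = tide + waves / 3.0
flood_hours = int(np.sum(coastal_level > flood_threshold))

# With, say, 0.5 m of sea-level rise the same wave climate floods far more often.
flood_hours_slr = int(np.sum(coastal_level + 0.5 > flood_threshold))
print(flood_hours, flood_hours_slr)
```

Because the wave contribution is divided by three, only the tail of the wave distribution can flood the shore at current sea level, which is why the record describes flooding as an infrequent coincidence of peak waves and high tide and why a fixed sea-level offset inflates the count so sharply.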

  14. Architectural design and reliability analysis of a fail-operational brake-by-wire system from ISO 26262 perspectives

    International Nuclear Information System (INIS)

    Sinha, Purnendu

    2011-01-01

    Next generation drive-by-wire automotive systems enabling autonomous driving will build on the fail-operational capabilities of electronics, control and software (ECS) architectural solutions. Developing such architectural designs that would meet dependability requirements and satisfy other system constraints is a challenging task and will possibly lead to a paradigm shift in automotive ECS architecture design and development activities. This aspect is becoming quite relevant when designing battery-driven electric vehicles with integrated in-wheel drive-train and chassis subsystems. In such highly integrated dependable systems, many of the primary features and functions are assigned the highest safety-critical ratings. Brake-by-wire is one such system that interfaces with active safety features built into an automobile, and which in turn is expected to provide fail-operational capabilities. In this paper, building on the basic concepts of fail-silent and fail-operational systems design, we propose a system architecture for a brake-by-wire system with fail-operational capabilities. The design choices are supported with proper rationale and design trade-offs. Safety and reliability analysis of the proposed system architecture is performed as per the ISO 26262 standard for functional safety of electrical/electronic systems in road vehicles.

  15. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    Science.gov (United States)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.
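A priority queue is one simple way to picture intensity-driven readout: pixels leave the array in order of decreasing integrated intensity rather than raster order, so the brightest features (stars, laser spots) come out first. This is only an illustrative software analogue of the on-chip circuit, with made-up pixel values:

```python
import heapq

# A tiny hypothetical frame of integrated pixel intensities.
frame = [
    [ 3, 12,  1],
    [90,  4, 55],
    [ 2, 77,  8],
]

# Build a max-heap of (negated intensity, row, col) entries.
heap = [(-val, r, c) for r, row in enumerate(frame)
        for c, val in enumerate(row)]
heapq.heapify(heap)

# Read out pixels brightest-first, the way the sensor would.
readout = []
while heap:
    neg, r, c = heapq.heappop(heap)
    readout.append(((r, c), -neg))

print(readout[:3])  # brightest pixels first: 90, 77, 55
```

For sparse scenes like star tracking, this ordering means the few informative pixels are available almost immediately, without scanning the whole array.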

  16. The CMS event builder demonstrator and results with Myrinet

    CERN Document Server

    Antchev, G; Cittolin, Sergio; Erhan, S; Faure, B; Gigi, D; Gutleber, J; Jacobs, C; Meijers, F; Meschi, E; Ninane, A; Orsini, L; Pollet, Lucien; Rácz, A; Samyn, D; Schleifer, W; Sinanis, N; Sphicas, Paris

    2001-01-01

    The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large and high performance event building network. Several switch technologies are currently being evaluated in order to compare different architectures for the event builder. One candidate is Myrinet. This paper describes the demonstrator which has been setup to study a small-scale (16*16) event builder based on PCs running Linux connected to Myrinet and Ethernet switches. A detailed study of the Myrinet switch performance has been performed for various traffic conditions, including the behaviour of composite switches. Results from event building studies are presented, including measurements on throughput, overhead and scaling. Traffic shaping techniques have been implemented and the effect on the event building performance has been investigated. The paper reports on performances and maximum event rate obtainable using custom software, not described, for the Myrinet control program and the low-level communica...

  17. Real-Time MENTAT programming language and architecture

    Science.gov (United States)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  18. VERNACULAR ARCHITECTURE: AN INTRODUCTORY COURSE TO LEARN ARCHITECTURE IN INDIA

    Directory of Open Access Journals (Sweden)

    Miki Desai

    2010-07-01

    Full Text Available “The object in view of both my predecessors in office and by myself has been rather to bring out the reasoning powers of individual students, so that they may understand the inner meaning of the old forms and their original function and may develop and modernize and gradually produce an architecture, Indian in character, but at the same time as suited to present day India as the old styles were to their own times and environment.” (Claude Batley, 1940; Lang, Desai, Desai, 1997, p. 143). The article introduces the teaching philosophy, content and method of Basic Design I and II for first year students of architecture at the Faculty of Architecture, Centre for Environmental Planning and Technology (CEPT) University, Ahmedabad, India. It is framed within the Indian perspective of architectural education from the British colonial times. Commencing with important academic literature and biases of the initial colonial period, it quickly traces architectural education in CEPT, the sixteenth school of post-independence India, set up in 1962, discussing the foundation year teaching imparted. The school was Modernist and avant-garde. The author introduced these two courses against the backdrop of the Universalist Modernist credo of architecture and education. In the courses, the primary philosophy behind learning design emerges from a heuristic method. The aim of the first course is to infuse interest in the visual world and to develop manual skills and dexterity through the dictum of ‘look-feel-reason out-evaluate’ and ‘observe-record-interpret-synthesize-transform-express’. Due to the lack of architectural orientation in Indian schooling, the second course adopts vernacular architecture as a reasonable tool for a novice to understand the triangular relationship of society, architecture and physical context and its impact on design. The students are analytically exposed to the regional variety of architectures logically stemming from the geo

  19. Architecture Descriptions. A Contribution to Modeling of Production System Architecture

    DEFF Research Database (Denmark)

    Jepsen, Allan Dam; Hvam, Lars

    a proper understanding of the architecture phenomenon and the ability to describe it in a manner that allow the architecture to be communicated to and handled by stakeholders throughout the company. Despite the existence of several design philosophies in production system design such as Lean, that focus...... a diverse set of stakeholder domains and tools in the production system life cycle. To support such activities, a contribution is made to the identification and referencing of production system elements within architecture descriptions as part of the reference architecture framework. The contribution...

  20. Backpropagation architecture optimization and an application in nuclear power plant diagnostics

    International Nuclear Information System (INIS)

    Basu, A.; Bartlett, E.B.

    1993-01-01

    This paper presents a Dynamic Node Architecture (DNA) scheme to optimize the architecture of backpropagation Artificial Neural Networks (ANNs). This network scheme is used to develop an ANN-based diagnostic adviser capable of identifying the operating status of a nuclear power plant. Specifically, a root network is trained to diagnose whether the plant is in a normal operating condition or not. In the event of an abnormal condition, another classifier network is trained to recognize the particular transient taking place. These networks are trained using plant instrumentation data gathered during simulations of the various transients and normal operating conditions at the Iowa Electric Light and Power Company's Duane Arnold Energy Center (DAEC) operator training simulator
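The two-stage diagnostic scheme in this record (a root network deciding normal vs. abnormal, then a second classifier naming the transient) can be sketched with a nearest-centroid stand-in for the trained backpropagation networks. All sensor data and class names below are hypothetical:

```python
import numpy as np

# Stage 1 ("root network"): normal vs. abnormal.
centroids_root = {"normal": np.array([0.0, 0.0]),
                  "abnormal": np.array([3.0, 3.0])}

# Stage 2 (transient classifier), consulted only for abnormal cases.
centroids_transient = {"loss_of_coolant": np.array([4.0, 1.0]),
                       "turbine_trip": np.array([1.0, 4.0])}

def nearest(centroids, x):
    # Pick the class whose centroid is closest to the sensor vector.
    return min(centroids, key=lambda k: np.linalg.norm(centroids[k] - x))

def diagnose(sensors):
    if nearest(centroids_root, sensors) == "normal":
        return "normal"
    return nearest(centroids_transient, sensors)

print(diagnose(np.array([0.2, -0.1])))  # normal
print(diagnose(np.array([3.8, 1.2])))   # loss_of_coolant
print(diagnose(np.array([1.1, 3.9])))   # turbine_trip
```

Splitting the problem this way keeps each classifier small: the root model only needs to separate normal from abnormal, and the transient model is trained exclusively on abnormal data.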