WorldWideScience

Sample records for building scalable models

  1. The Concept of Business Model Scalability

    DEFF Research Database (Denmark)

    Lund, Morten; Nielsen, Christian

    2018-01-01

    Purpose: The purpose of the article is to define what scalable business models are. Central to the contemporary understanding of business models is the value proposition towards the customer and the hypotheses generated about delivering value to the customer, which become a good foundation for a long-term profitable business. However, the main message of this article is that while providing a good value proposition may help the firm ‘get by’, the really successful businesses of today are those able to reach the sweet-spot of business model scalability. Design/Methodology/Approach: The article is based on a five-year longitudinal action research project of over 90 companies that participated in the International Center for Innovation project aimed at building 10 global network-based business models. Findings: This article introduces and discusses the term scalability from a company-level perspective…

  2. Scalability of Sustainable Business Models in Hybrid Organizations

    Directory of Open Access Journals (Sweden)

    Adam Jabłoński

    2016-02-01

    The dynamics of change in modern business create new mechanisms by which company management can pursue and achieve high performance. Performance maintained over a long period of time becomes a source of business continuity for companies. The construct that enables such assumptions is a business model that can generate results in every possible market situation and that, moreover, has the features of permanent adaptability. The feature that describes the adaptability of a business model is its scalability. Because scalability denotes the ability to handle more work, and to handle it more efficiently, as the number of components grows, it can be applied to business models as the company’s ability to maintain similar or higher performance as it grows. Ensuring the company’s performance in the long term helps to build a so-called sustainable business model, one that often balances the objectives of stakeholders and shareholders and that is created by the implemented principles of value-based management and corporate social responsibility. This perception of business paves the way for building hybrid organizations that integrate business activities with pro-social ones. Combining the approach typical of hybrid organizations with the design and implementation of sustainable business models under the scalability criterion seems interesting from the cognitive point of view. Today, hybrid organizations are great spaces for building effective and efficient mechanisms for dialogue between business and society; this requires an appropriate business model. The purpose of the paper is to present the conceptualization and operationalization of the scalability of sustainable business models that determine the performance of a hybrid organization in the network environment. The paper presents the original concept of applying scalability to sustainable business models, with detailed…

  3. Building scalable apps with Redis and Node.js

    CERN Document Server

    Johanan, Joshua

    2014-01-01

    If the phrase scalability sounds alien to you, then this is an ideal book for you. You will not need much Node.js experience as each framework is demonstrated in a way that requires no previous knowledge of the framework. You will be building scalable Node.js applications in no time! Knowledge of JavaScript is required.

  4. An extended systematic mapping study about the scalability of i* Models

    Directory of Open Access Journals (Sweden)

    Paulo Lima

    2016-12-01

    i* models have been used for requirements specification in many domains, such as healthcare, telecommunication, and air traffic control. Managing the scalability and the complexity of such models is an important challenge in Requirements Engineering (RE). Scalability is also one of the most intractable issues in the design of visual notations in general: a well-known problem with visual representations is that they do not scale well. This issue has led us to investigate scalability in i* models and their variants by means of a systematic mapping study. This paper is an extended version of a previous paper on the scalability of i*, now including papers indicated by specialists. Moreover, we also discuss the challenges and open issues regarding the scalability of i* models and their variants. A total of 126 papers were analyzed in order to understand how the RE community perceives scalability and which proposals have considered this topic. We found that scalability issues are indeed perceived as relevant and that further work is still required, even though many potential solutions have already been proposed. This study can be a starting point for researchers aiming to further advance the treatment of scalability in i* models.

  5. From Digital Disruption to Business Model Scalability

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten; Thomsen, Peter Poulsen

    2017-01-01

    This article discusses the terms disruption, digital disruption, business models and business model scalability. It illustrates how managers should be using these terms for the benefit of their business by developing business models capable of achieving exponentially increasing returns to scale as a response to digital disruption. A series of case studies illustrate that besides frequent existing messages in the business literature relating to the importance of creating agile businesses, both in growing and declining economies, as well as hard to copy value propositions or value propositions that take … will seldom lead to business model scalability capable of competing with digital disruption(s).

  6. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    Science.gov (United States)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models at a large scale: simulations over large areas at high resolution. Heterogeneous computing hardware such as multi-core CPUs, GPUs or supercomputers potentially provides significant computing power to fulfil this demand. However, exploiting it requires detailed knowledge of the underlying hardware, parallel algorithm design, and implementation in an efficient systems programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate model building from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs built-in capabilities to make full use of the available hardware. Developing such a framework that provides understandable code for domain scientists while remaining runtime-efficient poses several challenges for its developers. For example, optimisations can be performed on individual operations or on the whole model, and tasks need to be generated for a well-balanced execution without explicit knowledge of the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks, and 2) the parallelisation of about 50 of these building blocks using…
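
    To make the building-block idea concrete, here is a minimal sketch, in Python with numpy, of how a modeller-facing framework can hide parallelisation concerns: the modeller composes pre-programmed raster operations, and the framework remains free to execute each block serially, multi-threaded, or on a GPU. The operations and the toy runoff model below are invented for illustration and are not the actual PCRaster API.

      import numpy as np

      def slope_driven_runoff(elevation, rainfall):
          # One pre-programmed building block: local runoff proportional to slope.
          # The framework, not the modeller, decides how to parallelise this.
          gy, gx = np.gradient(elevation)
          slope = np.hypot(gx, gy)
          return rainfall * slope / (1.0 + slope)

      def timestep(state, elevation, rainfall):
          # Modeller-facing model code: a composition of blocks, no parallel code.
          return state + rainfall - slope_driven_runoff(elevation, rainfall)

      elevation = np.random.rand(100, 100) * 50.0        # synthetic DEM
      state = np.zeros_like(elevation)
      for _ in range(10):                                # the temporal loop
          state = timestep(state, elevation, rainfall=np.full_like(elevation, 2.0))
      print(state.mean())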

  7. VPLS: an effective technology for building scalable transparent LAN services

    Science.gov (United States)

    Dong, Ximing; Yu, Shaohua

    2005-02-01

    Virtual Private LAN Service (VPLS) is generating considerable interest with enterprises and service providers, as it offers multipoint transparent LAN service (TLS) over MPLS networks. This paper describes VPLS, an effective technology that links virtual switch instances (VSIs) through MPLS to form an emulated Ethernet switch and build scalable transparent LAN services. It first focuses on the architecture of VPLS, with Ethernet bridging at the edge and MPLS at the core; it then elucidates the data forwarding mechanism within a VPLS domain, including learning and aging MAC addresses on a per-LSP basis, flooding of unknown frames, and replication for unknown, multicast, and broadcast frames. The loop-avoidance mechanism, known as split horizon forwarding, is also analyzed. Another important aspect of the VPLS service, its basic operation, including autodiscovery and signaling, is discussed as well. From the perspective of efficiency and scalability the paper compares two important signaling mechanisms, BGP and LDP, which are used to set up a PW between the PEs and bind the PWs to a particular VSI. As a VPLS grows and the full mesh of PWs between PE devices grows with it (n*(n-1)/2 PWs in all, i.e., O(n^2) growth), a VPLS instance could have a large number of remote PE associations, resulting in inefficient use of network bandwidth and system resources, since the ingress PE has to replicate each frame and append MPLS labels for each remote PE. So the latter part of this paper focuses on the scalability issue: Hierarchical VPLS. Within the HVPLS architecture, this paper addresses two ways to cope with a possibly large number of MAC addresses, which make VPLS operate more efficiently.
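
    The full-mesh arithmetic behind this scalability argument is easy to check. The Python sketch below contrasts the pseudowire count of a flat VPLS mesh with a hub-and-spoke hierarchy of the kind HVPLS enables; the two-tier sizing is hypothetical.

      def full_mesh_pws(n_pe):
          # Pseudowires in a flat VPLS full mesh of n_pe provider-edge nodes.
          return n_pe * (n_pe - 1) // 2

      def hierarchical_pws(n_hubs, spokes_per_hub):
          # Full mesh among hub PEs only, plus one spoke PW per edge device.
          return full_mesh_pws(n_hubs) + n_hubs * spokes_per_hub

      print(full_mesh_pws(100))        # 4950 PWs for a flat 100-PE VPLS
      print(hierarchical_pws(10, 9))   # 135 PWs for 10 hubs with 9 spokes each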

  8. Building Scalable Knowledge Graphs for Earth Science

    Science.gov (United States)

    Ramachandran, R.; Maskey, M.; Gatlin, P. N.; Zhang, J.; Duan, X.; Bugbee, K.; Christopher, S. A.; Miller, J. J.

    2017-12-01

    Estimates indicate that the world's information will grow by 800% in the next five years. In any given field, a single researcher or a team of researchers cannot keep up with this rate of knowledge expansion without the help of cognitive systems. Cognitive computing, defined as the use of information technology to augment human cognition, can help tackle large systemic problems. Knowledge graphs, one of the foundational components of cognitive systems, link key entities in a specific domain with other entities via relationships. Researchers can mine these graphs to make probabilistic recommendations and to infer new knowledge. At this point, however, there is a dearth of tools to generate scalable knowledge graphs from the existing corpus of scientific literature for Earth science research. Our project is developing an end-to-end automated methodology for incrementally constructing knowledge graphs for Earth science. Semantic Entity Recognition (SER) is one of the key steps in this methodology. SER for Earth science uses external resources (including metadata catalogs and controlled vocabularies) as references to guide entity extraction and recognition (i.e., labeling) from unstructured text, in order to build a large training set to seed the subsequent auto-learning component of our algorithm. Results from several SER experiments will be presented, as well as lessons learned.
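
    A minimal sketch of the vocabulary-guided labeling idea behind SER, in Python: a controlled vocabulary is matched against unstructured text to bootstrap labeled training examples. The vocabulary, labels, and sentence below are invented; a real pipeline would add tokenization, disambiguation, and the auto-learning stage.

      import re

      VOCABULARY = {            # terms as if drawn from a metadata catalog
          "GPM": "MISSION",
          "AMSR-E": "INSTRUMENT",
          "precipitation": "PHENOMENON",
      }

      def label_sentence(sentence):
          # Return (character span, term, label) for every vocabulary hit.
          hits = []
          for term, label in VOCABULARY.items():
              for m in re.finditer(re.escape(term), sentence):
                  hits.append(((m.start(), m.end()), term, label))
          return hits

      text = "GPM observations improve precipitation estimates over AMSR-E retrievals."
      for span, term, label in label_sentence(text):
          print(span, term, label)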

  9. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.
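
    A sketch of the storage idea, using sqlite as a stand-in for whatever database the application actually uses (the schema and field names below are hypothetical): every run's inputs live in one searchable, indexable store, so they can be recovered exactly, long after the simulations finish.

      import json
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE sim_inputs (
                       run_id TEXT PRIMARY KEY,
                       engine TEXT,   -- analysis engine name/version
                       inputs TEXT)   -- full input set, serialized as JSON""")
      con.execute("CREATE INDEX idx_engine ON sim_inputs(engine)")

      run = {"wall_r_value": 3.5, "window_u": 1.8, "schedule": "office_8to6"}
      con.execute("INSERT INTO sim_inputs VALUES (?, ?, ?)",
                  ("run-0001", "EnergyPlus-8.0", json.dumps(run)))

      # Years later, recover the exact inputs for validation or commissioning.
      row = con.execute("SELECT inputs FROM sim_inputs WHERE run_id = ?",
                        ("run-0001",)).fetchone()
      print(json.loads(row[0]))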

  10. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, and KML, and for using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when triggered by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use data assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as is Kepler's Distributed Data Parallel (DDP) capability, which provides a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from…

  11. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model.
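
    The grid-independent conceptual model is the heart of this approach: define subsurface properties as functions of position, then sample them onto whatever computational grid a given run uses. A minimal Python sketch, with an invented hydraulic conductivity field:

      import numpy as np

      def conductivity(x, y, z):
          # Conceptual model: a property field defined as a function of position,
          # with no reference to any computational grid (values are synthetic).
          return 1e-4 * np.exp(-z / 50.0) * (1.0 + 0.3 * np.sin(x / 200.0))

      def sample_on_grid(nx, ny, nz, extent=(1000.0, 1000.0, 100.0)):
          # Discretise the same conceptual model at any chosen resolution.
          x = np.linspace(0.0, extent[0], nx)
          y = np.linspace(0.0, extent[1], ny)
          z = np.linspace(0.0, extent[2], nz)
          X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
          return conductivity(X, Y, Z)

      print(sample_on_grid(10, 10, 5).shape)       # workstation-sized run
      print(sample_on_grid(200, 200, 200).shape)   # massively parallel run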

  12. Genetic algorithms and genetic programming for multiscale modeling: Applications in materials science and chemistry and advances in scalability

    Science.gov (United States)

    Sastry, Kumara Narasimha

    2007-03-01

    …building blocks in organic chemistry indicate that MOGAs produce high-quality semiempirical methods that (1) are stable to small perturbations, (2) yield accurate configuration energies on untested and critical excited states, and (3) yield ab initio quality excited-state dynamics. The proposed method enables simulations of more complex systems to realistic, multi-picosecond timescales, well beyond previous attempts or the expectations of human experts, with a 2-3 orders-of-magnitude reduction in computational cost. While the two applications use simple evolutionary operators, in order to tackle more complex systems their scalability and limitations have to be investigated. The second part of the thesis addresses some of the challenges involved in a successful design of genetic algorithms and genetic programming for multiscale modeling. The first issue addressed is the scalability of genetic programming, where facetwise models are built to assess the population size required by GP to ensure an adequate supply of raw building blocks and accurate decision-making between competing building blocks. This study also presents a design of competent genetic programming, where traditional fixed recombination operators are replaced by building and sampling probabilistic models of promising candidate programs. The proposed scalable GP, called extended compact GP (eCGP), combines ideas from the extended compact genetic algorithm (eCGA) and probabilistic incremental program evolution (PIPE) and adaptively identifies, propagates and exchanges important subsolutions of a search problem. Results show that eCGP scales cubically with problem size on both GP-easy and GP-hard problems. Finally, facetwise models are developed to explore the limits of MOGA scalability, addressing how reliably multiobjective algorithms maintain Pareto-optimal solutions. The results show that even when the building blocks are accurately identified, massive multimodality…

  13. Historical Building Monitoring Using an Energy-Efficient Scalable Wireless Sensor Network Architecture

    Science.gov (United States)

    Capella, Juan V.; Perles, Angel; Bonastre, Alberto; Serrano, Juan J.

    2011-01-01

    We present a set of novel low power wireless sensor nodes designed for monitoring wooden masterpieces and historical buildings, in order to perform an early detection of pests. Although our previous star-based system configuration has been in operation for more than 13 years, it does not scale well for sensorization of large buildings or when deploying hundreds of nodes. In this paper we demonstrate the feasibility of a cluster-based dynamic-tree hierarchical Wireless Sensor Network (WSN) architecture where realistic assumptions of radio frequency data transmission are applied to cluster construction, and a mix of heterogeneous nodes are used to minimize economic cost of the whole system and maximize power saving of the leaf nodes. Simulation results show that the specialization of a fraction of the nodes by providing better antennas and some energy harvesting techniques can dramatically extend the life of the entire WSN and reduce the cost of the whole system. A demonstration of the proposed architecture with a new routing protocol and applied to termite pest detection has been implemented on a set of new nodes and should last for about 10 years, but it provides better scalability, reliability and deployment properties. PMID:22346630

  14. Historical building monitoring using an energy-efficient scalable wireless sensor network architecture.

    Science.gov (United States)

    Capella, Juan V; Perles, Angel; Bonastre, Alberto; Serrano, Juan J

    2011-01-01

    We present a set of novel low power wireless sensor nodes designed for monitoring wooden masterpieces and historical buildings, in order to perform an early detection of pests. Although our previous star-based system configuration has been in operation for more than 13 years, it does not scale well for sensorization of large buildings or when deploying hundreds of nodes. In this paper we demonstrate the feasibility of a cluster-based dynamic-tree hierarchical Wireless Sensor Network (WSN) architecture where realistic assumptions of radio frequency data transmission are applied to cluster construction, and a mix of heterogeneous nodes are used to minimize economic cost of the whole system and maximize power saving of the leaf nodes. Simulation results show that the specialization of a fraction of the nodes by providing better antennas and some energy harvesting techniques can dramatically extend the life of the entire WSN and reduce the cost of the whole system. A demonstration of the proposed architecture with a new routing protocol and applied to termite pest detection has been implemented on a set of new nodes and should last for about 10 years, but it provides better scalability, reliability and deployment properties.

  15. A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Gent, Peter; Lamarque, Jean-Francois; Conley, Andrew; Vertenstein, Mariana; Craig, Anthony

    2013-02-13

    The objective of this award was to build a scalable and extensible Earth System Model that can be used to study climate change science. That objective has been achieved with the public release of the Community Earth System Model, version 1 (CESM1). In particular, the development of the CESM1 atmospheric chemistry component was substantially funded by this award, as was the development of the significantly improved coupler component. The CESM1 allows new climate change science in areas such as future air quality in very large cities, the effects of recovery of the southern hemisphere ozone hole, and effects of runoff from ice melt in the Greenland and Antarctic ice sheets. Results from a whole series of future climate projections using the CESM1 are also freely available via the web from the CMIP5 archive at the Lawrence Livermore National Laboratory. Many research papers using these results have now been published, and will form part of the 5th Assessment Report of the United Nations Intergovernmental Panel on Climate Change, which is to be published late in 2013.

  16. Scalable Frequent Subgraph Mining

    KAUST Repository

    Abdelhamid, Ehab

    2017-06-19

    A graph is a data structure that contains a set of nodes and a set of edges connecting these nodes. Nodes represent objects while edges model relationships among these objects. Graphs are used in various domains due to their ability to model complex relations among several objects. Given an input graph, the Frequent Subgraph Mining (FSM) task finds all subgraphs with frequencies exceeding a given threshold. FSM is crucial for graph analysis, and it is an essential building block in a variety of applications, such as graph clustering and indexing. FSM is computationally expensive, and its existing solutions are extremely slow. Consequently, these solutions are incapable of mining modern large graphs. This slowness is caused by the underlying approaches of these solutions which require finding and storing an excessive amount of subgraph matches. This dissertation proposes a scalable solution for FSM that avoids the limitations of previous work. This solution is composed of four components. The first component is a single-threaded technique which, for each candidate subgraph, needs to find only a minimal number of matches. The second component is a scalable parallel FSM technique that utilizes a novel two-phase approach. The first phase quickly builds an approximate search space, which is then used by the second phase to optimize and balance the workload of the FSM task. The third component focuses on accelerating frequency evaluation, which is a critical step in FSM. To do so, a machine learning model is employed to predict the type of each graph node, and accordingly, an optimized method is selected to evaluate that node. The fourth component focuses on mining dynamic graphs, such as social networks. To this end, an incremental index is maintained during the dynamic updates. Only this index is processed and updated for the majority of graph updates. Consequently, search space is significantly pruned and efficiency is improved. The empirical evaluation shows that the
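
    To make the FSM task concrete, here is a toy Python sketch of the first level of the pattern lattice that any FSM miner starts from: count labeled single-edge subgraphs and keep those meeting the frequency threshold, after which only frequent patterns are grown. The graph and labels are invented, and real single-graph miners use more careful support measures (such as minimum-image counting) than this raw count.

      from collections import Counter

      # One labeled graph as (node_label_u, edge_label, node_label_v) edges.
      edges = [("A", "x", "B"), ("A", "x", "B"), ("B", "y", "C"),
               ("A", "x", "B"), ("C", "y", "C")]

      def frequent_single_edges(edges, threshold):
          counts = Counter()
          for u, e, v in edges:
              # Canonical orientation so (A,x,B) and (B,x,A) are one pattern.
              counts[min((u, e, v), (v, e, u))] += 1
          return {p: c for p, c in counts.items() if c >= threshold}

      print(frequent_single_edges(edges, threshold=2))   # {('A','x','B'): 3}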

  17. Semantic Models for Scalable Search in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Dennis Pfisterer

    2013-03-01

    The Internet of Things is anticipated to connect billions of embedded devices equipped with sensors to perceive their surroundings. Thereby, the state of the real world will be available online and in real time, and can be combined with other data and services in the Internet to realize novel applications such as Smart Cities, Smart Grids, or Smart Healthcare. This requires an open representation of sensor data and scalable search over data from diverse sources, including sensors. In this paper we show how the Semantic Web technologies RDF (an open semantic data format) and SPARQL (a query language for RDF-encoded data) can be used to address those challenges. In particular, we describe how prediction models can be employed for scalable sensor search, how these prediction models can be encoded as RDF, and how the models can be queried by means of SPARQL.
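
    A minimal, concrete version of that recipe in Python with rdflib: sensor metadata (and, in the paper, prediction models) is encoded as RDF and retrieved with SPARQL. The namespace and triples below are invented for illustration.

      from rdflib import Graph

      ttl = """
      @prefix ex: <http://example.org/iot#> .
      ex:sensor1 ex:observes ex:Temperature ; ex:locatedIn ex:Room42 .
      ex:sensor2 ex:observes ex:Humidity    ; ex:locatedIn ex:Room42 .
      """

      g = Graph()
      g.parse(data=ttl, format="turtle")

      q = """
      PREFIX ex: <http://example.org/iot#>
      SELECT ?s WHERE { ?s ex:observes ex:Temperature ; ex:locatedIn ex:Room42 . }
      """
      for row in g.query(q):
          print(row.s)   # -> http://example.org/iot#sensor1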

  18. The Concept of Business Model Scalability

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten

    2015-01-01

    The power of business models lies in their ability to visualize and clarify how firms’ may configure their value creation processes. Among the key aspects of business model thinking are a focus on what the customer values, how this value is best delivered to the customer and how strategic partners...... are leveraged in this value creation, delivery and realization exercise. Central to the mainstream understanding of business models is the value proposition towards the customer and the hypothesis generated is that if the firm delivers to the customer what he/she requires, then there is a good foundation...... for a long-term profitable business. However, the message conveyed in this article is that while providing a good value proposition may help the firm ‘get by’, the really successful businesses of today are those able to reach the sweet-spot of business model scalability. This article introduces and discusses...

  19. Think 500, not 50! A scalable approach to student success in STEM.

    Science.gov (United States)

    LaCourse, William R; Sutphin, Kathy Lee; Ott, Laura E; Maton, Kenneth I; McDermott, Patrice; Bieberich, Charles; Farabaugh, Philip; Rous, Philip

    2017-01-01

    UMBC, a diverse public research university, "builds" upon its reputation in producing highly capable undergraduate scholars to create a comprehensive new model, STEM BUILD at UMBC. This program is designed to help more students develop the skills, experience and motivation to excel in science, technology, engineering, and mathematics (STEM). This article provides an in-depth description of STEM BUILD at UMBC and provides the context of this initiative within UMBC's vision and mission. The STEM BUILD model targets promising STEM students who enter as freshmen or transfer students and do not qualify for significant university or other scholarship support. Of primary importance to this initiative are capacity, scalability, and institutional sustainability, as we distill the advantages and opportunities of UMBC's successful scholars programs and expand their application to more students. The general approach is to infuse the mentoring and training process into the fabric of the undergraduate experience while fostering community, scientific identity, and resilience. At the heart of STEM BUILD at UMBC is the development of BUILD Group Research (BGR), a sequence of experiences designed to overcome the challenges that undergraduates without programmatic support often encounter (e.g., limited internship opportunities, mentorships, and research positions for which top STEM students are favored). BUILD Training Program (BTP) Trainees serve as pioneers in this initiative, which is potentially a national model for universities as they address the call to retain and graduate more students in STEM disciplines - especially those from underrepresented groups. As such, BTP is a research study using random assignment trial methodology that focuses on the scalability and eventual incorporation of successful measures into the traditional format of the academy. Critical measures to transform institutional culture include establishing an extensive STEM Living and Learning Community to

  20. A Scalable Heuristic for Viral Marketing Under the Tipping Model

    Science.gov (United States)

    2013-09-01

    Flixster is a social media website that allows users to share reviews and other information about cinema [35]. It was extracted in Dec. 2010. … FourSquare … work of Reichman were developed independently; we also note that Reichman performs no experimental evaluation of the algorithm. … other diffusion models, such as the independent cascade model [21] and evolutionary graph theory [25], as well as probabilistic variants of the…
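
    For context, the tipping model named in the title is usually formalized as a threshold cascade: a node activates once the fraction of its active neighbours reaches its threshold, and the marketing problem is to pick a small seed set that tips the whole graph. A toy Python sketch follows; the graph and thresholds are invented, and a real heuristic chooses seeds far more cleverly than this fixed seed.

      def tip(adj, thresholds, seeds):
          # Linear threshold dynamics: activate a node once the fraction of its
          # active neighbours reaches its threshold; repeat to a fixed point.
          active = set(seeds)
          changed = True
          while changed:
              changed = False
              for v, nbrs in adj.items():
                  if v not in active and nbrs:
                      if sum(n in active for n in nbrs) / len(nbrs) >= thresholds[v]:
                          active.add(v)
                          changed = True
          return active

      adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
      thresholds = {v: 0.5 for v in adj}
      print(sorted(tip(adj, thresholds, seeds={1})))   # one seed tips all five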

  1. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin

    2017-12-08

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference algorithms for such a model are increasingly limited due to their high time complexity and poor scalability. In this paper, we propose a multi-stage maximum likelihood approach to recover the latent parameters of the stochastic block model, in time linear with respect to the number of edges. We also propose a parallel algorithm based on message passing. Our algorithm can overlap communication and computation, providing speedup without compromising accuracy as the number of processors grows. For example, to process a real-world graph with about 1.3 million nodes and 10 million edges, our algorithm requires about 6 seconds on 64 cores of a contemporary commodity Linux cluster. Experiments demonstrate that the algorithm can produce high-quality results on both benchmark and real-world graphs. An example of finding more meaningful communities than those found by a popular modularity maximization algorithm is also illustrated.
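
    To make the inference target concrete, here is the generative side of the stochastic block model in a few lines of Python with numpy: nodes carry latent community labels, and the probability of an edge depends only on the pair of labels. The sizes and probabilities below are toy values.

      import numpy as np

      rng = np.random.default_rng(0)
      n, k = 300, 3
      z = rng.integers(0, k, size=n)                 # latent community labels
      P = np.full((k, k), 0.01) + 0.09 * np.eye(k)   # 0.10 within, 0.01 between

      A = rng.random((n, n)) < P[z[:, None], z[None, :]]
      A = np.triu(A, 1)
      A = A | A.T                                    # undirected, no self-loops

      # Inference must recover z and P from A alone; check the two densities:
      same = z[:, None] == z[None, :]
      print(round(A[same].mean(), 3), round(A[~same].mean(), 3))  # ~0.10, ~0.01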

  2. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad; Elsawy, Hesham; Bader, Ahmed; Alouini, Mohamed-Slim

    2017-01-01

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.
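
    Two of the compared strategies, persistent transmission and random backoff, can be contrasted with a toy slotted simulation in Python. This is a collision model only; the paper's stochastic-geometry model additionally captures interference, and power ramping is ignored here entirely.

      import random
      from collections import Counter

      def delivered(n_devices, frames, p_tx, channels=10, seed=1):
          # Each backlogged device transmits with probability p_tx per frame on
          # a random channel; a packet succeeds only if it is alone there.
          rng = random.Random(seed)
          pending = set(range(n_devices))
          done = 0
          for _ in range(frames):
              tx = {d: rng.randrange(channels)
                    for d in pending if rng.random() < p_tx}
              load = Counter(tx.values())
              winners = [d for d, ch in tx.items() if load[ch] == 1]
              pending.difference_update(winners)
              done += len(winners)
          return done

      print(delivered(500, 50, p_tx=1.0))    # persistent: collisions dominate
      print(delivered(500, 50, p_tx=0.02))   # backoff: far more get through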

  3. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.

  4. Scalable Deployment of Advanced Building Energy Management Systems

    Science.gov (United States)

    2013-05-01

    …build their own visualization screens containing charts and 3D graphics. … lack of functionality for generating comprehensive reports that can be sent… …through the windows and subsequently absorbed by interior walls, floors and furniture; air leakage through doors; sensible air from HVAC; and sensible… The recoverable chiller-model parameter ranges are: Temperature of Air Entering Condenser, 14 to 35 ºC; Temperature of Chilled Water Leaving Chiller, 5 to 12 ºC; Part Load Ratio (dimensionless), 0.1 to 1.2.

  5. Scalable coherent interface

    International Nuclear Information System (INIS)

    Alnaes, K.; Kristiansen, E.H.; Gustavson, D.B.; James, D.V.

    1990-01-01

    The Scalable Coherent Interface (IEEE P1596) is establishing an interface standard for very high performance multiprocessors, supporting a cache-coherent-memory model scalable to systems with up to 64K nodes. This Scalable Coherent Interface (SCI) will supply a peak bandwidth per node of 1 GigaByte/second. The SCI standard should facilitate assembly of processor, memory, I/O and bus bridge cards from multiple vendors into massively parallel systems with throughput far above what is possible today. The SCI standard encompasses two levels of interface, a physical level and a logical level. The physical level specifies electrical, mechanical and thermal characteristics of connectors and cards that meet the standard. The logical level describes the address space, data transfer protocols, cache coherence mechanisms, synchronization primitives and error recovery. In this paper we address logical level issues such as packet formats, packet transmission, transaction handshake, flow control, and cache coherence. 11 refs., 10 figs

  6. A Scalable Version of the Navy Operational Global Atmospheric Prediction System Spectral Forecast Model

    Directory of Open Access Journals (Sweden)

    Thomas E. Rosmond

    2000-01-01

    The Navy Operational Global Atmospheric Prediction System (NOGAPS) includes a state-of-the-art spectral forecast model similar to models run at several major operational numerical weather prediction (NWP) centers around the world. The model, developed by the Naval Research Laboratory (NRL) in Monterey, California, has run operationally at the Fleet Numerical Meteorological and Oceanographic Center (FNMOC) since 1982, and most recently has been running on a Cray C90 in a multi-tasked configuration. Typically the multi-tasked code runs on 10 to 15 processors with an overall parallel efficiency of about 90%. Operational resolution is T159L30, but other operational and research applications run at significantly lower resolutions. A scalable NOGAPS forecast model has been developed by NRL in anticipation of an FNMOC C90 replacement in about 2001, as well as for current NOGAPS research requirements to run on DOD High-Performance Computing (HPC) scalable systems. The model is designed to run with message passing (MPI). Model design criteria include bit reproducibility for different processor numbers and reasonably efficient performance on fully shared memory, distributed memory, and distributed shared memory systems for a wide range of model resolutions. Results for a wide range of processor numbers, model resolutions, and different vendor architectures are presented. Single-node performance has been disappointing on RISC-based systems, at least compared to vector processor performance. This is a common complaint, and will require careful re-examination of traditional NWP model software design and data organization to fully exploit future scalable architectures.

  7. Object-oriented integrated approach for the design of scalable ECG systems.

    Science.gov (United States)

    Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija

    2009-01-01

    The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) systems. The purpose of this methodology is to preserve real-world structure and relations with the aim of minimizing information loss during the process of modeling, especially for Real-Time (RT) systems. We report on a case study of a design that integrates OO and RT methods with the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The experience gained, based on the strongly defined semantics of the object model, is discussed and related problems are analyzed.

  8. A scalable infrastructure model for carbon capture and storage: SimCCS

    International Nuclear Information System (INIS)

    Middleton, Richard S.; Bielicki, Jeffrey M.

    2009-01-01

    In the carbon capture and storage (CCS) process, CO2 sources and geologic reservoirs may be widely spatially dispersed and need to be connected through a dedicated CO2 pipeline network. We introduce a scalable infrastructure model for CCS (SimCCS) that generates a fully integrated, cost-minimizing CCS system. SimCCS determines where and how much CO2 to capture and store, and where to build and connect pipelines of different sizes, in order to minimize the combined annualized costs of sequestering a given amount of CO2. SimCCS is able to aggregate CO2 flows between sources and reservoirs into trunk pipelines that take advantage of economies of scale. Pipeline construction costs take into account factors including topography and social impacts. SimCCS can be used to calculate the scale of CCS deployment (local, regional, national). SimCCS' deployment of a realistic, capacitated pipeline network is a major advancement for planning CCS infrastructure. We demonstrate SimCCS using a set of 37 CO2 sources and 14 reservoirs for California. The results highlight the importance of systematic planning for CCS infrastructure by examining the sensitivity of CCS infrastructure, as optimized by SimCCS, to varying CO2 targets. We finish by identifying critical future research areas for CCS infrastructure.
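
    At the core of a SimCCS-style model is a cost-minimizing flow from CO2 sources to reservoirs over candidate pipeline links. The Python sketch below runs networkx's min-cost-flow solver on an invented four-node instance; the real model is a mixed-integer program that additionally chooses capture amounts and discrete pipeline sizes to exploit economies of scale.

      import networkx as nx

      G = nx.DiGraph()
      G.add_node("src_A", demand=-5)   # captures 5 units of CO2 per year
      G.add_node("src_B", demand=-3)
      G.add_node("res_1", demand=6)    # injection capacity to be filled
      G.add_node("res_2", demand=2)
      # Candidate pipelines: capacity and unit transport cost per link.
      G.add_edge("src_A", "res_1", capacity=6, weight=4)
      G.add_edge("src_A", "res_2", capacity=2, weight=7)
      G.add_edge("src_B", "res_1", capacity=3, weight=2)
      G.add_edge("src_B", "res_2", capacity=3, weight=6)

      flow = nx.min_cost_flow(G)
      print(flow)   # which links carry flow, and how much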

  9. LoRa Scalability: A Simulation Model Based on Interference Measurements.

    Science.gov (United States)

    Haxhibeqiri, Jetmir; Van den Abeele, Floris; Moerman, Ingrid; Hoebeke, Jeroen

    2017-05-23

    LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.
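
    The pure-Aloha baseline quoted above can be sanity-checked with the classical result that a packet survives only if no other transmission starts within its two-packet vulnerability window, giving a per-packet success probability of exp(-2G) at offered load G. The traffic numbers in this Python snippet are invented, chosen only to land in the regime the abstract describes.

      import math

      def aloha_loss(n_nodes, pkts_per_hour, airtime_s):
          g = n_nodes * pkts_per_hour * airtime_s / 3600.0   # offered load
          return 1.0 - math.exp(-2.0 * g)

      print(round(aloha_loss(1000, 30, 0.15), 2))   # ~0.92 loss at 1000 nodes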

  10. Building a scalable event-level metadata service for ATLAS

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Goosens, L; Viegas, F T A; McGlone, H

    2008-01-01

    The ATLAS TAG Database is a multi-terabyte event-level metadata selection system, intended to allow discovery, selection of, and navigation to events of interest to an analysis. The TAG Database encompasses file- and relational-database-resident event-level metadata, distributed across all ATLAS Tiers. A global TAG relational database containing all ATLAS events, implemented in Oracle, will exist at Tier 0. Implementing a system that is both performant and manageable at this scale is a challenge. A 1 TB relational TAG Database has been deployed at Tier 0 using simulated tag data. The database contains one billion events, each described by two hundred event metadata attributes, and is currently undergoing extensive testing in terms of queries, population and manageability. These 1 TB tests aim to demonstrate and optimise the performance and scalability of an Oracle TAG Database on a global scale. Partitioning and indexing strategies are crucial to well-performing queries and manageability of the database, and have implications for database population and distribution, so these are investigated. Physics query patterns are anticipated, but a crucial feature of the system must be to support a broad range of queries across all attributes. Concurrently, event tags from ATLAS Computing System Commissioning distributed simulations are accumulated in an Oracle-hosted database at CERN, providing an event-level selection service valuable for user experience and for gathering information about physics query patterns. In this paper we describe the status of the global TAG relational database scalability work and highlight areas of future direction.

  11. Intelligent Controls for Net-Zero Energy Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Haorong; Cho, Yong; Peng, Dongming

    2011-10-30

    The goal of this project is to develop and demonstrate enabling technologies that can empower homeowners to convert their homes into net-zero energy buildings in a cost-effective manner. The project objectives and expected outcomes are as follows: • To develop rapid and scalable building information collection and modeling technologies that can obtain and process “as-built” building information in an automated or semiautomated manner. • To identify low-cost measurements and develop low-cost virtual sensors that can monitor building operations in a plug-n-play and low-cost manner. • To integrate and demonstrate low-cost building information modeling (BIM) technologies. • To develop decision support tools which can empower building owners to perform energy auditing and retrofit analysis. • To develop and demonstrate low-cost automated diagnostics and optimal control technologies which can improve building energy efficiency in a continual manner.

  12. SAME4HPC: A Promising Approach in Building a Scalable and Mobile Environment for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Karthik, Rajasekar [ORNL

    2014-01-01

    In this paper, an architecture for building a Scalable And Mobile Environment for High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exists a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both types of devices, providing high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks is among the key open-source and industry-standard components adopted in this architecture.

  13. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    2010-01-01

    For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence…
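
    Loevinger's H in miniature, as a worked Python example: for each item pair (easier item first), a Guttman error is a correct answer on the harder item paired with an incorrect one on the easier item, and H compares the observed error count with the count expected under independence. The 0/1 response matrix below (rows are persons, columns are items) is invented.

      import numpy as np

      X = np.array([[1, 1, 0],     # invented 0/1 responses
                    [1, 0, 0],
                    [1, 1, 1],
                    [0, 0, 0],
                    [1, 1, 0]])
      X = X[:, np.argsort(-X.mean(axis=0))]   # order items easiest to hardest
      n, k = X.shape

      obs = exp = 0.0
      for i in range(k):                      # item i is easier than item j
          for j in range(i + 1, k):
              obs += np.sum((X[:, i] == 0) & (X[:, j] == 1))   # Guttman errors
              exp += n * (X[:, i] == 0).mean() * (X[:, j] == 1).mean()
      print(1.0 - obs / exp)   # H = 1 here: a perfect Guttman scale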

  14. Scalable, full-colour and controllable chromotropic plasmonic printing

    Science.gov (United States)

    Xue, Jiancai; Zhou, Zhang-Kai; Wei, Zhiqiang; Su, Rongbin; Lai, Juan; Li, Juntao; Li, Chao; Zhang, Tengwei; Wang, Xue-Hua

    2015-01-01

    Plasmonic colour printing has drawn wide attention as a promising candidate for the next-generation colour-printing technology. However, an efficient approach to realize full colour and scalable fabrication is still lacking, which prevents plasmonic colour printing from practical applications. Here we present a scalable and full-colour plasmonic printing approach by combining conjugate twin-phase modulation with a plasmonic broadband absorber. More importantly, our approach also demonstrates controllable chromotropic capability, that is, the ability to undergo reversible colour transformations. This chromotropic capability affords enormous potential for building functionalized prints for anticounterfeiting, special labels, and high-density data encryption storage. With such excellent performance in functional colour applications, this colour-printing approach could pave the way for plasmonic colour printing in real-world commercial utilization. PMID:26567803

  15. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  16. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using the LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  17. LoRa Scalability: A Simulation Model Based on Interference Measurements

    Directory of Open Access Journals (Sweden)

    Jetmir Haxhibeqiri

    2017-05-01

    LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.

  18. Scalable devices

    KAUST Repository

    Krüger, Jens J.

    2014-01-01

    In computer science in general, and in the field of high performance computing and supercomputing in particular, the term scalable plays an important role. It indicates that a piece of hardware, a concept, an algorithm, or an entire system scales with the size of the problem, i.e., it can not only be used in a very specific setting but is applicable to a wide range of problems, from small scenarios to possibly very large settings. In this spirit, there exist a number of fixed areas of research on scalability. There are works on scalable algorithms and scalable architectures, but what are scalable devices? In the context of this chapter, we are interested in a whole range of display devices, ranging from small-scale hardware such as tablet computers, pads, and smart-phones up to large tiled display walls. What interests us most is not so much the hardware setup as the visualization algorithms behind these display systems, which scale from the average smart phone up to the largest gigapixel display walls.

  19. Constraint Solver Techniques for Implementing Precise and Scalable Static Program Analysis

    DEFF Research Database (Denmark)

    Zhang, Ye

    …developers to build reliable software systems more quickly and with fewer bugs or security defects. While designing and implementing a program analysis remains hard work, making it both scalable and precise is even more challenging. In this dissertation, we show that with a general inclusion constraint solver using unification we could make a program analysis easier to design and implement, much more scalable, and still as precise as expected. We present an inclusion constraint language with explicit equality constructs for specifying program analysis problems, and a parameterized framework… For data flow analyses for the C language, we demonstrate that a large amount of equivalences could be detected by off-line analyses, and they could then be used by a constraint solver to significantly improve the scalability of an analysis without sacrificing any precision.

  20. Numeric Analysis for Relationship-Aware Scalable Streaming Scheme

    Directory of Open Access Journals (Sweden)

    Heung Ki Lee

    2014-01-01

    Frequent packet loss of media data is a critical problem that degrades the quality of streaming services over mobile networks. A packet loss invalidates the frame containing the lost packets and other related frames at the same time; this indirect loss decreases the quality of streaming. A scalable streaming service can decrease the amount of dropped multimedia resulting from a single packet loss. Content providers typically divide one large media stream into several layers through a scalable streaming service and then provide each scalable layer to the user depending on the mobile network. A scalable streaming service also makes it possible to decode partial multimedia data, depending on the relationships between frames and layers, and therefore provides a way to decrease the wasted multimedia data when one packet is lost. However, the hierarchical structure between frames and layers of scalable streams determines the service quality of the scalable streaming service: even if all packets of a layer are transmitted successfully, they cannot be decoded in the absence of their reference frames and layers. A complicated relationship between frames and layers in a scalable stream therefore increases the volume of abandoned layers. To provide a high-quality scalable streaming service, we must choose a proper relationship between scalable layers as well as the amount of transmitted multimedia data depending on the network situation. We prove that a simple scalable scheme outperforms a complicated scheme in an error-prone network. We suggest an adaptive set-top box (AdaptiveSTB) to lower the dependency between scalable layers in a scalable stream. Also, we provide a numerical model to obtain the indirect loss of multimedia data and apply it to various multimedia streams. Our AdaptiveSTB enhances the quality of a scalable streaming service by removing indirect loss.
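
    The indirect-loss effect is easy to illustrate: a layer is decodable only if every layer it references is decodable, so one lost layer can invalidate layers that arrived intact. A toy Python sketch (the layer graphs are invented) contrasting a deep dependency chain with a flat, simple one:

      def decodable(received, deps):
          # A layer decodes only if it arrived and all its references decode.
          ok = set()
          for layer in sorted(deps):          # ids assumed in decode order
              if layer in received and all(d in ok for d in deps[layer]):
                  ok.add(layer)
          return ok

      chain = {0: [], 1: [0], 2: [1], 3: [2]}   # complicated: deep chain
      star = {0: [], 1: [0], 2: [0], 3: [0]}    # simple: all reference the base

      received = {0, 2, 3}                      # layer 1 lost in transit
      print(decodable(received, chain))         # {0}: layers 2 and 3 wasted
      print(decodable(received, star))          # {0, 2, 3}: only layer 1 lost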

  1. A framework for scalable parameter estimation of gene circuit models using structural information

    KAUST Repository

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-06-21

    Motivation: Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Results: Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems.
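
    The decomposition idea can be illustrated with a toy two-gene system: each rate equation is integrated on its own against an interpolated trajectory of the other species, and the pair is iterated to a fixed point. This is only a sketch of the general idea with made-up kinetics, not the authors' framework.

    ```python
    # Decompose a coupled two-gene repressor system into individual rate
    # equations, integrating each against the other's previous trajectory.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.interpolate import interp1d

    a, n, d = 10.0, 2.0, 1.0                 # made-up kinetic parameters
    t = np.linspace(0.0, 10.0, 201)
    x_traj, y_traj = np.zeros_like(t), np.zeros_like(t)   # initial guesses

    for _ in range(20):                      # fixed-point iteration over trajectories
        y_fun = interp1d(t, y_traj, fill_value="extrapolate")
        x_traj = solve_ivp(lambda s, x: a / (1 + y_fun(s)**n) - d * x,
                           (t[0], t[-1]), [1.0], t_eval=t).y[0]
        x_fun = interp1d(t, x_traj, fill_value="extrapolate")
        y_traj = solve_ivp(lambda s, y: a / (1 + x_fun(s)**n) - d * y,
                           (t[0], t[-1]), [0.0], t_eval=t).y[0]

    print(x_traj[-1], y_traj[-1])            # reconstructed mean time evolution
    ```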

  3. Building-related health impacts in European and Chinese cities: a scalable assessment method.

    Science.gov (United States)

    Tuomisto, Jouni T; Niittynen, Marjo; Pärjälä, Erkki; Asikainen, Arja; Perez, Laura; Trüeb, Stephan; Jantunen, Matti; Künzli, Nino; Sabel, Clive E

    2015-12-14

    Public health is often affected by societal decisions that are not primarily about health. Climate change mitigation requires intensive actions to minimise greenhouse gas emissions in the future. Many of these actions take place in cities due to their traffic, buildings, and energy consumption. Active climate mitigation policies will also, aside from their long-term global impacts, have short-term local impacts on public health, both positive and negative. Our main objective was to develop a generic open impact model to estimate the health impacts of emissions due to the heat and power consumption of buildings. In addition, the model should be usable for policy comparisons by non-health experts at the city level with city-specific data; it should give guidance on particular climate mitigation questions while at the same time increasing understanding of the related health impacts; and it should follow the building stock in time, make comparisons between scenarios, propagate uncertainties, and scale to different levels of detail. We tested the functionalities of the model in two case cities, namely Kuopio and Basel, and estimated the health and climate impacts of two actual policies planned or implemented in the cities. The assessed policies were the replacement of peat with wood chips in the co-generation of district heat and power, and improved energy efficiency of buildings achieved by renovations. Health impacts were not large in the two cities, but clear differences in implementation and predictability between the two tested policies were seen. Renovation policies can improve the energy efficiency of buildings and reduce greenhouse gas emissions significantly, but this requires systematic policy sustained for decades. In contrast, fuel changes in large district heating facilities may have rapid and large impacts on emissions. However, the life-cycle impacts of different fuels remain somewhat an open question. In conclusion, we were able to develop a practical model for city...

  4. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

    A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  5. Content-Aware Scalability-Type Selection for Rate Adaptation of Scalable Video

    Directory of Open Access Journals (Sweden)

    Tekalp A Murat

    2007-01-01

    Full Text Available Scalable video coders provide different scaling options, such as temporal, spatial, and SNR scalabilities, where rate reduction by discarding enhancement layers of different scalability types results in different kinds and/or levels of visual distortion depending on the content and bitrate. This dependency between scalability type, video content, and bitrate is not well investigated in the literature. To this effect, we first propose an objective function that quantifies the flatness, blockiness, blurriness, and temporal jerkiness artifacts caused by rate reduction through spatial size, frame rate, and quantization parameter scaling. Next, the weights of this objective function are determined for different content (shot) types and different bitrates using a training procedure with subjective evaluation. Finally, a method is proposed for choosing, for each temporal segment, the scaling type that results in minimum visual distortion according to this objective function, given the content type of the segment. Two subjective tests have been performed to validate the proposed procedure for content-aware selection of the best scalability type on soccer videos. Soccer videos scaled from 600 kbps to 100 kbps by the proposed content-aware selection of scalability type have been found visually superior to those that are scaled using a single scalability option over the whole sequence.
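
    The selection step can be sketched in a few lines; the artifact scores and trained weights below are placeholders invented for illustration, not the paper's measured values.

    ```python
    # Pick the scalability type with the smallest weighted artifact score.
    def distortion(scores, weights):
        """Weighted sum of flatness, blockiness, blurriness, jerkiness scores."""
        return sum(weights[k] * scores[k] for k in weights)

    # Made-up artifact scores for scaling one temporal segment to a target bitrate.
    candidates = {
        "spatial":  {"flat": 0.2, "block": 0.1, "blur": 0.7, "jerky": 0.1},
        "temporal": {"flat": 0.1, "block": 0.1, "blur": 0.2, "jerky": 0.8},
        "SNR":      {"flat": 0.5, "block": 0.6, "blur": 0.3, "jerky": 0.1},
    }
    weights = {"flat": 0.3, "block": 0.3, "blur": 0.2, "jerky": 0.2}  # from training

    best = min(candidates, key=lambda s: distortion(candidates[s], weights))
    print(best)   # scalability type with minimum predicted visual distortion
    ```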

  6. A scalable variational inequality approach for flow through porous media models with pressure-dependent viscosity

    Science.gov (United States)

    Mapakshi, N. K.; Chang, J.; Nakshatrala, K. B.

    2018-04-01

    Mathematical models for flow through porous media typically enjoy the so-called maximum principles, which place bounds on the pressure field. It is highly desirable to preserve these bounds on the pressure field in predictive numerical simulations, that is, one needs to satisfy discrete maximum principles (DMP). Unfortunately, many of the existing formulations for flow through porous media models do not satisfy DMP. This paper presents a robust, scalable numerical formulation based on variational inequalities (VI), to model non-linear flows through heterogeneous, anisotropic porous media without violating DMP. VI is an optimization technique that places bounds on the numerical solutions of partial differential equations. To crystallize the ideas, a modification to Darcy equations by taking into account pressure-dependent viscosity will be discretized using the lowest-order Raviart-Thomas (RT0) and Variational Multi-scale (VMS) finite element formulations. It will be shown that these formulations violate DMP, and, in fact, these violations increase with an increase in anisotropy. It will be shown that the proposed VI-based formulation provides a viable route to enforce DMP. Moreover, it will be shown that the proposed formulation is scalable, and can work with any numerical discretization and weak form. A series of numerical benchmark problems are solved to demonstrate the effects of heterogeneity, anisotropy and non-linearity on DMP violations under the two chosen formulations (RT0 and VMS), and that of non-linearity on solver convergence for the proposed VI-based formulation. Parallel scalability on modern computational platforms will be illustrated through strong-scaling studies, which will prove the efficiency of the proposed formulation in a parallel setting. Algorithmic scalability as the problem size is scaled up will be demonstrated through novel static-scaling studies. The performed static-scaling studies can serve as a guide for users to be able to select
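
    As a toy stand-in for the bound-enforcement idea (not the paper's RT0/VMS/VI formulation), one can contrast an unconstrained least-squares solve of a discrete system with a box-constrained solve that respects DMP-style bounds:

    ```python
    # Enforcing maximum-principle-style bounds via a box-constrained solve.
    import numpy as np
    from scipy.optimize import lsq_linear

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))              # stand-in discrete operator
    p_true = rng.uniform(0.0, 1.0, 20)             # a field inside [0, 1]
    b = A @ p_true + rng.normal(0.0, 0.5, 40)      # noisy right-hand side

    p_plain = np.linalg.lstsq(A, b, rcond=None)[0]      # may leave [0, 1]
    p_bounded = lsq_linear(A, b, bounds=(0.0, 1.0)).x   # DMP-style box bounds

    print(p_plain.min(), p_plain.max())            # bound violations possible
    print(p_bounded.min(), p_bounded.max())        # guaranteed inside [0, 1]
    ```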

  7. Big data integration: scalability and sustainability

    KAUST Repository

    Zhang, Zhang

    2016-01-26

    Integration of various types of omics data is critically indispensable for addressing most important and complex biological questions. In the era of big data, however, data integration becomes increasingly tedious, time-consuming and expensive, posing a significant obstacle to fully exploiting the wealth of big biological data. Here we propose a scalable and sustainable architecture that integrates big omics data through community-contributed modules. Community modules are contributed and maintained by different committed groups; each module corresponds to a specific data type, deals with data collection, processing and visualization, and delivers data on demand via web services. Based on this community-based architecture, we build Information Commons for Rice (IC4R; http://ic4r.org), a rice knowledgebase that integrates a variety of rice omics data from multiple community modules, including genome-wide expression profiles derived entirely from RNA-Seq data, resequencing-based genomic variations obtained from re-sequencing data of thousands of rice varieties, plant homologous genes covering multiple diverse plant species, post-translational modifications, rice-related literature, and community annotations. Taken together, such an architecture achieves integration of different types of data from multiple community-contributed modules and accordingly features scalable, sustainable and collaborative integration of big data, as well as low costs for database update and maintenance. It is thus helpful for building IC4R into a comprehensive knowledgebase covering all aspects of rice data, beneficial for both basic and translational research.

  8. A framework for scalable parameter estimation of gene circuit models using structural information.

    Science.gov (United States)

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. Availability: http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.

  9. Performance and scalability of finite-difference and finite-element wave-propagation modeling on Intel's Xeon Phi

    NARCIS (Netherlands)

    Zhebel, E.; Minisini, S.; Kononov, A.; Mulder, W.A.

    2013-01-01

    With the rapid developments in parallel compute architectures, algorithms for seismic modeling and imaging need to be reconsidered in terms of parallelization. The aim of this paper is to compare scalability of seismic modeling algorithms: finite differences, continuous mass-lumped finite elements

  10. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics for Scientific Data and Analysis

    Data.gov (United States)

    National Aeronautics and Space Administration — We will construct SciSpark, a scalable system for interactive model evaluation and for the rapid development of climate metrics and analyses. SciSpark directly...

  11. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria

    2016-01-01

    Scalability is a design principle often valued for the engineering of complex systems. Scalability is the ability of a system to change the current value of one of its specification parameters. Although targeted frameworks are available for the evaluation of scalability for specific digital systems...... re-engineering of 5 independent system modules, from the replacement of a wireless Bluetooth interface, to the revision of the ADC sample-and-hold operation could help increase system bandwidth....

  12. Scalable Video Coding with Interlayer Signal Decorrelation Techniques

    Directory of Open Access Journals (Sweden)

    Yang Wenxian

    2007-01-01

    Full Text Available Scalability is one of the essential requirements in the compression of visual data for present-day multimedia communications and storage. The basic building block for providing the spatial scalability in the scalable video coding (SVC standard is the well-known Laplacian pyramid (LP. An LP achieves the multiscale representation of the video as a base-layer signal at lower resolution together with several enhancement-layer signals at successive higher resolutions. In this paper, we propose to improve the coding performance of the enhancement layers through efficient interlayer decorrelation techniques. We first show that, with nonbiorthogonal upsampling and downsampling filters, the base layer and the enhancement layers are correlated. We investigate two structures to reduce this correlation. The first structure updates the base-layer signal by subtracting from it the low-frequency component of the enhancement layer signal. The second structure modifies the prediction in order that the low-frequency component in the new enhancement layer is diminished. The second structure is integrated in the JSVM 4.0 codec with suitable modifications in the prediction modes. Experimental results with some standard test sequences demonstrate coding gains up to 1 dB for I pictures and up to 0.7 dB for both I and P pictures.
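
    The Laplacian-pyramid split into a decimated base layer plus a full-resolution residual enhancement layer can be sketched in a few lines. This is an illustrative one-level pyramid with a linear-interpolation filter, not the SVC codec's actual filters.

    ```python
    # One Laplacian pyramid level: base layer + residual enhancement layer.
    import numpy as np
    from scipy import ndimage

    def lp_split(frame):
        """Decimated base layer plus the residual against its upsampling."""
        base = frame[::2, ::2]
        up = ndimage.zoom(base, 2.0, order=1)       # upsample back to full size
        return base, frame - up[:frame.shape[0], :frame.shape[1]]

    def lp_merge(base, enhancement):
        up = ndimage.zoom(base, 2.0, order=1)
        return up[:enhancement.shape[0], :enhancement.shape[1]] + enhancement

    frame = np.random.rand(64, 64)                  # stand-in luma frame
    base, enh = lp_split(frame)
    print(np.allclose(lp_merge(base, enh), frame))  # True: perfect reconstruction
    ```

    The correlation the paper targets arises exactly here: with non-biorthogonal filters, the residual still carries low-frequency content that duplicates information in the base layer.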

  13. A Scalable Data Taking System at a Test Beam for LHC

    CERN Multimedia

    2002-01-01

    RD-13: A Scalable Data Taking System at a Test Beam for LHC. We have installed a test beam read-out facility for the simultaneous test of LHC detectors, trigger and read-out electronics, together with the development of the supporting architecture in a multiprocessor environment. The aim of the project is to build a system which incorporates all the functionality of a complete read-out chain. Emphasis is put on a highly modular design, such that new hardware and software developments can be conveniently introduced. Exploiting this modularity, the set-up will evolve, driven by progress in technologies and new software developments. One of the main thrusts of the project is modelling and integration of different read-out architectures to provide a valuable training ground for new techniques. To address these aspects in a realistic manner, we collaborate with detector R&D projects in order to test higher level trigger systems, event building and high rate data transfers, once the techniques involve...

  14. Design issues for numerical libraries on scalable multicore architectures

    International Nuclear Information System (INIS)

    Heroux, M A

    2008-01-01

    Future generations of scalable computers will rely on multicore nodes for a significant portion of overall system performance. At present, most applications and libraries cannot exploit multiple cores beyond running additional MPI processes per node. In this paper we discuss important multicore architecture issues, programming models, algorithm requirements and software design related to effective use of scalable multicore computers. In particular, we focus on important issues for library research and development, making recommendations for how to effectively develop libraries for future scalable computer systems

  15. Enhancing Scalability of Sparse Direct Methods

    International Nuclear Information System (INIS)

    Li, Xiaoye S.; Demmel, James; Grigori, Laura; Gu, Ming; Xia, Jianlin; Jardin, Steve; Sovinec, Carl; Lee, Lie-Quan

    2007-01-01

    TOPS is providing high-performance, scalable sparse direct solvers, which have had significant impacts on the SciDAC applications, including fusion simulation (CEMM) and accelerator modeling (COMPASS), as well as many other mission-critical applications in DOE and elsewhere. Our recent developments have focused on new techniques to overcome the scalability bottlenecks of direct methods, in both time and memory. These include parallelizing the symbolic analysis phase and developing linear-complexity sparse factorization methods. The new techniques will make sparse direct methods more widely usable in large 3D simulations on highly-parallel petascale computers

  16. Solar energy in buildings solved by building information modeling

    Science.gov (United States)

    Chudikova, B.; Faltejsek, M.

    2018-03-01

    Building trends lead us to use renewable energy sources for all types of buildings. The use of solar energy is one alternative that can be applied with a good ratio of space, price and resulting benefits. Building Information Modelling is a modern and effective way of dealing with buildings with regard to all aspects of the life cycle. The basis is careful planning and simulation in the pre-investment phase, where it is possible to determine the effective result and influence the lifetime of the building and the cost of its operation. By simulating and analysing a building model inserted into its future environment, where climate conditions and surrounding buildings play a role, it is possible to predict the usability of solar energy and establish an ideal model. Solar systems also strongly affect the internal layout of buildings. Pre-investment phase analysis, with a view to these future aspects, will ensure that the resulting building is both low-energy and environmentally friendly.

  17. Use of modeling to assess the scalability of Ethernet networks for the ATLAS second level trigger

    CERN Document Server

    Korcyl, K; Dobinson, Robert W; Saka, F

    1999-01-01

    The second level trigger of LHC's ATLAS experiment has to perform real-time analyses on detector data at 10 GBytes/s. A switching network is required to connect more than a thousand read-out buffers to about a thousand processors that execute the trigger algorithm. We are investigating the use of Ethernet technology to build this large switching network. Ethernet is attractive because of the huge installed base, competitive prices, and the recent introduction of the high-performance Gigabit version. Due to the network's size, it has to be constructed as a layered structure of smaller units. To assess the scalability of such a structure, we evaluated a single switch unit.

  18. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  19. Computational scalability of large size image dissemination

    Science.gov (United States)

    Kooper, Rob; Bajcsy, Peter

    2011-01-01

    We have investigated the computational scalability of image pyramid building needed for the dissemination of very large image data. The sources of large images include high resolution microscopes and telescopes, remote sensing and airborne imaging, and high resolution scanners. The term 'large' is understood from a user perspective: either larger than the display size, or larger than the memory/disk available to hold the image data. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150 MB, or about 5000x8000 pixels, with the total number around 200,000) and the UIUC library scanning project for historical maps from the 17th and 18th century (a smaller number of larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.

  20. Scalable and balanced dynamic hybrid data assimilation

    Science.gov (United States)

    Kauranne, Tuomo; Amour, Idrissa; Gunia, Martin; Kallio, Kari; Lepistö, Ahti; Koponen, Sampsa

    2017-04-01

    Scalability of complex weather forecasting suites is dependent on the technical tools available for implementing highly parallel computational kernels, but to an equally large extent also on the dependence patterns between various components of the suite, such as observation processing, data assimilation and the forecast model. Scalability is a particular challenge for 4D variational assimilation methods, which necessarily couple the forecast model into the assimilation process and subject this combination to an inherently serial quasi-Newton minimization process. Ensemble based assimilation methods are naturally more parallel, but large models force ensemble sizes to be small, and that results in poor assimilation accuracy, somewhat akin to shooting with a shotgun in a million-dimensional space. The Variational Ensemble Kalman Filter (VEnKF) is an ensemble method that can attain the accuracy of 4D variational data assimilation with a small ensemble size. It achieves this by processing a Gaussian approximation of the current error covariance distribution, instead of a set of ensemble members, analogously to the Extended Kalman Filter (EKF). Ensemble members are re-sampled from a new approximation of that Gaussian distribution every time a new set of observations is processed, which makes VEnKF a dynamic assimilation method. After this, a smoothing step is applied that turns VEnKF into a dynamic Variational Ensemble Kalman Smoother (VEnKS). In this smoothing step, the same process is iterated with frequent re-sampling of the ensemble, but now using past iterations as surrogate observations, until the end result is a smooth and balanced model trajectory. In principle, VEnKF could suffer from similar scalability issues as 4D-Var. However, this can be avoided by isolating the forecast model completely from the minimization process by implementing the latter as a wrapper code whose only link to the model is calling for many parallel and totally independent model runs, all of them
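
    The re-sampling step that distinguishes VEnKF from a static ensemble method can be illustrated minimally: after each analysis, a fresh ensemble is drawn from the current Gaussian approximation of the error distribution. Toy dimensions only; this is not the VEnKF implementation.

    ```python
    # Re-sample an ensemble from the Gaussian approximation N(mean, cov).
    import numpy as np

    def resample_ensemble(mean, cov, n_members, rng):
        """Draw a fresh ensemble from the current Gaussian approximation."""
        return rng.multivariate_normal(mean, cov, size=n_members)

    rng = np.random.default_rng(42)
    mean = np.zeros(5)                    # analysis mean (toy 5-dimensional state)
    cov = 0.1 * np.eye(5)                 # analysis error covariance
    ensemble = resample_ensemble(mean, cov, n_members=20, rng=rng)
    print(ensemble.shape)                 # (20, 5): members for independent model runs
    ```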

  1. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  2. Scalable Simulation of Electromagnetic Hybrid Codes

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.; Fujimoto, Richard; Karimabadi, Dr. Homa

    2006-01-01

    New discrete-event formulations of physics simulation models are emerging that can outperform models based on traditional time-stepped techniques. Detailed simulation of the Earth's magnetosphere, for example, requires execution of sub-models that are at widely differing timescales. In contrast to time-stepped simulation which requires tightly coupled updates to entire system state at regular time intervals, the new discrete event simulation (DES) approaches help evolve the states of sub-models on relatively independent timescales. However, parallel execution of DES-based models raises challenges with respect to their scalability and performance. One of the key challenges is to improve the computation granularity to offset synchronization and communication overheads within and across processors. Our previous work was limited in scalability and runtime performance due to the parallelization challenges. Here we report on optimizations we performed on DES-based plasma simulation models to improve parallel performance. The net result is the capability to simulate hybrid particle-in-cell (PIC) models with over 2 billion ion particles using 512 processors on supercomputing platforms

  3. Towards a Scalable, Biomimetic, Antibacterial Coating

    Science.gov (United States)

    Dickson, Mary Nora

    Corneal afflictions are the second leading cause of blindness worldwide. When a corneal transplant is unavailable or contraindicated, an artificial cornea device is the only chance to save sight. Bacterial or fungal biofilm build-up on artificial cornea devices can lead to serious complications, including the need for systemic antibiotic treatment and even explantation. As a result, much emphasis has been placed on anti-adhesion chemical coatings and antibiotic-leaching coatings. These methods are not long-lasting, and microorganisms can eventually circumvent these measures. Thus, I have developed a surface-topographical antimicrobial coating. Various surface structures, including rough surfaces, superhydrophobic surfaces, and the natural surfaces of insects' wings and sharks' skin, are promising anti-biofilm candidates; however, none meet the criteria necessary for implementation on the surface of an artificial cornea device. In this thesis I: 1) developed scalable fabrication protocols for a library of biomimetic nanostructured polymer surfaces; 2) assessed the potential of poly(methyl methacrylate) nanopillars to kill or prevent formation of biofilm by E. coli bacteria and species of Pseudomonas and Staphylococcus bacteria, and improved upon a proposed mechanism for the rupture of Gram-negative bacterial cell walls; 3) developed a scalable, commercially viable method for producing antibacterial nanopillars on a curved PMMA artificial cornea device; and 4) developed scalable fabrication protocols for implantation of antibacterial nanopatterned surfaces on thermoplastic polyurethane materials, commonly used in catheter tubings. This project constitutes a first step towards fabrication of the first entirely PMMA artificial cornea device. The major finding of this work is that by precisely controlling the topography of a polymer surface at the nano-scale, we can kill adherent bacteria and prevent biofilm formation of certain pathogenic bacteria

  4. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolla, Hemanth [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Borghesi, Giulio [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-05-01

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  5. Scalable Database Design of End-Game Model with Decoupled Countermeasure and Threat Information

    Science.gov (United States)

    2017-11-01

    ...the Army Modular Active Protection System (MAPS) program to provide end-to-end APS modeling and simulation capabilities. The SSES simulation features... research project of scalable database design was initiated in support of SSES modularization efforts with respect to 4 major software components... Abbreviations: Iron Curtain; KE, kinetic energy; MAPS, Modular Active Protective System; OLE DB, object linking and embedding database; RDB, relational database; RPG.

  6. Demand Response Resource Quantification with Detailed Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Elaine; Horsey, Henry; Merket, Noel; Stoll, Brady; Nag, Ambarish

    2017-04-03

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  7. Highly scalable Ab initio genomic motif identification

    KAUST Repository

    Marchand, Benoit; Bajic, Vladimir B.; Kaushik, Dinesh

    2011-01-01

    We present results of scaling an ab initio motif family identification system, Dragon Motif Finder (DMF), to 65,536 processor cores of IBM Blue Gene/P. DMF seeks groups of mutually similar polynucleotide patterns within a set of genomic sequences and builds various motif families from them. Such information is of relevance to many problems in life sciences. Prior attempts to scale such ab initio motif-finding algorithms achieved limited success. We solve the scalability issues using a combination of mixed-mode MPI-OpenMP parallel programming, master-slave work assignment, multi-level workload distribution, multi-level MPI collectives, and serial optimizations. While the scalability of our algorithm was excellent (94% parallel efficiency on 65,536 cores relative to 256 cores on a modest-size problem), the final speedup with respect to the original serial code exceeded 250,000 when serial optimizations are included. This enabled us to carry out many large-scale ab initio motif-finding simulations in a few hours, while the original serial code would have needed decades of execution time.

  8. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure, including non-Gaussian noise, while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments, and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.

  9. Application of response surface methodology to maximize the productivity of scalable automated human embryonic stem cell manufacture.

    Science.gov (United States)

    Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J

    2013-01-01

    Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
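
    The response-surface step can be illustrated with synthetic data: fit a quadratic model of yield against two coded factors and locate the predicted optimum on a grid. The factors, data and surface below are invented for illustration, not the study's empirical models.

    ```python
    # Quadratic response-surface fit (RSM) over two coded design factors.
    import numpy as np

    rng = np.random.default_rng(1)
    x1 = rng.uniform(-1, 1, 30)     # coded cell seeding density
    x2 = rng.uniform(-1, 1, 30)     # coded media volume
    y = 100 - 20*(x1 - 0.3)**2 - 15*(x2 + 0.2)**2 + rng.normal(0, 1, 30)

    # Design matrix with intercept, linear, interaction and quadratic terms.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Evaluate the fitted surface on a grid and report the predicted optimum.
    g = np.linspace(-1, 1, 101)
    G1, G2 = np.meshgrid(g, g)
    surf = (beta[0] + beta[1]*G1 + beta[2]*G2 + beta[3]*G1*G2
            + beta[4]*G1**2 + beta[5]*G2**2)
    i, j = np.unravel_index(surf.argmax(), surf.shape)
    print(f"predicted optimum near x1={G1[i, j]:.2f}, x2={G2[i, j]:.2f}")
    ```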

  10. Economical and scalable synthesis of 6-amino-2-cyanobenzothiazole

    Directory of Open Access Journals (Sweden)

    Jacob R. Hauser

    2016-09-01

    Full Text Available 2-Cyanobenzothiazoles (CBTs) are useful building blocks for: 1) luciferin derivatives for bioluminescent imaging; and 2) handles for bioorthogonal ligations. A particularly versatile CBT is 6-amino-2-cyanobenzothiazole (ACBT), which has an amine handle for straightforward derivatisation. Here we present an economical and scalable synthesis of ACBT based on a cyanation catalysed by 1,4-diazabicyclo[2.2.2]octane (DABCO), and discuss its advantages for scale-up over previously reported routes.

  11. Irregular Shaped Building Design Optimization with Building Information Modelling

    Directory of Open Access Journals (Sweden)

    Lee Xia Sheng

    2016-01-01

    Full Text Available This research examines the function of Building Information Modelling (BIM) in design optimization for irregular shaped buildings. The study focuses on a conceptual irregular shaped 'twisted' building design similar to some existing sculpture-like architectures. Form and function are the two most important aspects of new buildings, which are becoming more sophisticated as parts of equally sophisticated 'systems' that we are living in. Nowadays, it is common to have irregular shaped or sculpture-like buildings which are very different from regular buildings. Construction industry stakeholders face stiff challenges in many aspects, such as buildability, cost effectiveness, delivery time and facility management, when dealing with irregular shaped building projects. Building Information Modelling (BIM) is being utilized to enable architects, engineers and constructors to gain improved visualization of irregular shaped buildings, with the purpose of identifying critical issues before physical construction work begins. In this study, three variations of design options differing in rotating angle (30 degrees, 60 degrees and 90 degrees) are created to conduct quantifiable comparisons. Discussions focus on three major aspects: structural planning, usable building space, and structural constructability. This research concludes that Building Information Modelling is instrumental in facilitating design optimization for irregular shaped buildings. In the process of comparing different design variations, instead of just giving a 'yes or no' type of response, stakeholders can now easily visualize, evaluate and decide to achieve the right balance based on their own criteria. Therefore, construction project stakeholders are empowered with superior evaluation and decision-making capability.

  12. JPEG2000-Compatible Scalable Scheme for Wavelet-Based Video Coding

    Directory of Open Access Journals (Sweden)

    Thomas André

    2007-03-01

    Full Text Available We present a simple yet efficient scalable scheme for wavelet-based video coders, able to provide on-demand spatial, temporal, and SNR scalability, and fully compatible with the still-image coding standard JPEG2000. Whereas hybrid video coders must undergo significant changes in order to support scalability, our coder only requires a specific wavelet filter for temporal analysis, as well as an adapted bit allocation procedure based on models of rate-distortion curves. Our study shows that scalably encoded sequences have the same or almost the same quality than nonscalably encoded ones, without a significant increase in complexity. A full compatibility with Motion JPEG2000, which tends to be a serious candidate for the compression of high-definition video sequences, is ensured.

  14. Investigation of the blockchain systems’ scalability features using the agent based modelling

    OpenAIRE

    Šulnius, Aleksas

    2017-01-01

    Investigation of the BlockChain Systems' Scalability Features using Agent Based Modelling. BlockChain is currently in the spotlight of the whole FinTech industry. The technology is being called revolutionary, ground-breaking, disruptive and even the WEB 3.0. On the other hand, it is widely agreed that BlockChain is in its early stages of development; in its current state, BlockChain is in a similar position to that of the Internet in the early nineties. In order for this technology to gain m...

  15. Scalable Nanomanufacturing—A Review

    Directory of Open Access Journals (Sweden)

    Khershed Cooper

    2017-01-01

    Full Text Available This article describes the field of scalable nanomanufacturing, its importance and need, and its research activities and achievements. The National Science Foundation is taking a leading role in fostering basic research in scalable nanomanufacturing (SNM). From this effort several novel nanomanufacturing approaches have been proposed, studied and demonstrated, including scalable nanopatterning. This paper discusses SNM research areas in materials, processes and applications, scale-up methods with project examples, and manufacturing challenges that need to be addressed to move nanotechnology discoveries closer to the marketplace.

  16. Accounting Fundamentals and the Variation of Stock Price: Factoring in the Investment Scalability

    Directory of Open Access Journals (Sweden)

    Sumiyana Sumiyana

    2010-05-01

    Full Text Available This study develops a new return model with respect to accounting fundamentals, based on Chen and Zhang (2007). This study takes into account the investment scalability information. Specifically, it splits the scale of the firm's operations into short-run and long-run investment scalabilities. We document that five accounting fundamentals explain the variation of annual stock return. The factors, comprising book value, earnings yield, short-run and long-run investment scalabilities, and growth opportunities, associate positively with stock price. The remaining factor, the pure interest rate, is negatively related to annual stock return. This study finds that inducing short-run and long-run investment scalabilities into the model improves the degree of association. In other words, they have value relevance. Finally, this study suggests that basic trading strategies will improve if investors revert to the accounting fundamentals. Keywords: accounting fundamentals; book value; earnings yield; growth opportunities; short-run and long-run investment scalabilities; trading strategy; value relevance

  17. SUSY GUT Model Building

    International Nuclear Information System (INIS)

    Raby, Stuart

    2008-01-01

    In this talk I discuss the evolution of SUSY GUT model building as I see it. Starting with 4-dimensional model building, I then consider orbifold GUTs in 5 dimensions and finally orbifold GUTs embedded into the E8 x E8 heterotic string.

  18. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  19. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

    Full Text Available The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that is recently beginning to receive...

  20. More scalability, less pain: A simple programming model and its implementation for extreme computing

    International Nuclear Information System (INIS)

    Lusk, E.L.; Pieper, S.C.; Butler, R.M.

    2010-01-01

    This is the story of a simple programming model, its implementation for extreme computing, and a breakthrough in nuclear physics. A critical issue for the future of high-performance computing is the programming model to use on next-generation architectures. Described here is a promising approach: program very large machines by combining a simplified programming model with a scalable library implementation. The presentation takes the form of a case study in nuclear physics. The chosen application addresses fundamental issues in the origins of our Universe, while the library developed to enable this application on the largest computers may have applications beyond this one.
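
    The flavor of such a simplified programming model, where application code only describes independent work units and a library handles their distribution, can be sketched with a local work pool. This toy analogue is not the library described in the report.

    ```python
    # Application code stays a simple map over tasks; the pool distributes work.
    from multiprocessing import Pool

    def evaluate(task):
        # Stand-in for an expensive physics kernel acting on one work unit.
        return sum(i * i for i in range(task))

    if __name__ == "__main__":
        tasks = [10_000 + i for i in range(64)]     # independent work units
        with Pool(processes=4) as pool:             # the "library" side
            results = pool.map(evaluate, tasks)     # the application just maps
        print(len(results), "tasks completed")
    ```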

  1. Scalability of Semi-Implicit Time Integrators for Nonhydrostatic Galerkin-based Atmospheric Models on Large Scale Cluster

    Science.gov (United States)

    2011-01-01

    present performance statistics to explain the scalability behavior. Keywords: atmospheric models, time integrators, MPI, scalability, performance. ...across inter-element boundaries. Basis functions are constructed as tensor products of Lagrange polynomials, ψ_i(x) = h_α(ξ) ⊗ h_β(η) ⊗ h_γ(ζ), where h_α

  2. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    International Nuclear Information System (INIS)

    Desai, Ajit; Pettit, Chris; Poirel, Dominique; Sarkar, Abhijit

    2017-01-01

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems or linearized systems for non-linear problems with billions of unknowns. For stochastic modeling, it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems. And though these algorithms exhibit excellent scalabilities, significant algorithmic and implementational challenges exist to extend them to solve extreme-scale stochastic systems using emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolution in both spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain level local systems through multi-level iterative solvers. We also use parallel sparse matrix–vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation having spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.

  3. Declarative and Scalable Selection for Map Visualizations

    DEFF Research Database (Denmark)

    Kefaloukos, Pimin Konstantin Balic

    and is itself a source and cause of prolific data creation. This calls for scalable map processing techniques that can handle the data volume and play well with the predominant data models on the Web. (4) Maps are now consumed around the clock by a global audience. While historical maps were single-user...... -defined constraints as well as custom objectives. The purpose of the language is to derive a target multi-scale database from a source database according to holistic specifications. (b) The Glossy SQL compiler allows Glossy SQL to be scalably executed in a spatial analytics system, such as a spatial relational......, there are indications that the method is scalable for databases that contain millions of records, especially if the target language of the compiler is substituted by a cluster-ready variant of SQL. While several realistic use cases for maps have been implemented in CVL, additional non-geographic data visualization uses...

  4. Scalable Brain Network Construction on White Matter Fibers.

    Science.gov (United States)

    Chung, Moo K; Adluru, Nagesh; Dalton, Kim M; Alexander, Andrew L; Davidson, Richard J

    2011-02-12

    DTI offers a unique opportunity to characterize the structural connectivity of the human brain non-invasively by tracing white matter fiber tracts. Whole brain tractography studies routinely generate up to half a million tracts per brain, which serve as edges in an extremely large 3D graph with up to half a million edges. Currently there is no agreed-upon method for constructing brain structural network graphs out of such large numbers of white matter tracts. In this paper, we present a scalable iterative framework called the ε-neighbor method for building a network graph, and apply it to testing abnormal connectivity in autism.
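
    A simplified sketch of the ε-neighbor idea, assuming hypothetical endpoint data: each tract endpoint is attached to an existing graph node within distance ε, or opens a new node, so the node count stays far smaller than the raw tract count.

    ```python
    # Stream tract endpoints into a compact graph via the ε-neighbor rule.
    import numpy as np

    def epsilon_neighbor_graph(tracts, eps):
        nodes, edges = [], []
        def node_for(p):
            for i, q in enumerate(nodes):
                if np.linalg.norm(p - q) < eps:   # reuse a nearby node
                    return i
            nodes.append(p)                        # otherwise open a new node
            return len(nodes) - 1
        for a, b in tracts:                        # one tract contributes one edge
            edges.append((node_for(a), node_for(b)))
        return np.array(nodes), edges

    rng = np.random.default_rng(0)
    tracts = [(rng.random(3), rng.random(3)) for _ in range(100)]
    nodes, edges = epsilon_neighbor_graph(tracts, eps=0.25)
    print(len(nodes), "nodes,", len(edges), "edges")
    ```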

  5. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The pcircle software builds on top of the ubiquitous MPI in a cluster computing environment and the work-stealing pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.

  6. On the scalability of uncoordinated multiple access for the Internet of Things

    KAUST Repository

    Chisci, Giovanni

    2017-11-16

    The Internet of things (IoT) will entail a massive number of wireless connections with sporadic traffic patterns. To support IoT traffic, several technologies are evolving to support low power wide area (LPWA) wireless communications. However, LPWA networks rely on variations of uncoordinated spectrum access, either for data transmissions or scheduling requests, thus imposing a scalability problem on the IoT. This paper presents a novel spatiotemporal model to study the scalability of the ALOHA medium access. In particular, the developed mathematical model relies on stochastic geometry and queueing theory to account for the spatial and temporal attributes of the IoT. To this end, the scalability of ALOHA is characterized by the percentile of IoT devices that can be served while keeping their queues stable. The results highlight the scalability problem of ALOHA and quantify the extent to which ALOHA can scale in terms of number of devices, traffic requirements, and transmission rate.
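
    A back-of-the-envelope sketch of the stability notion used above: a device's queue is stable when its per-slot arrival rate stays below its ALOHA service rate. The collision-channel service model and all parameters below are simplifying assumptions, not the paper's stochastic-geometry analysis.

    ```python
    # Fraction of devices with stable queues under a crude slotted-ALOHA model.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500                                     # devices sharing the channel
    p = 0.01                                    # per-slot transmission probability
    arrivals = rng.uniform(0.0, 0.01, n)        # per-slot packet arrival rates

    busy = min(1.0, arrivals.mean() / p)        # crude fraction of backlogged peers
    mu = p * (1 - p * busy) ** (n - 1)          # success probability, collision channel
    stable_pct = 100 * np.mean(arrivals < mu)
    print(f"service rate {mu:.5f}; {stable_pct:.1f}% of devices have stable queues")
    ```

    Even this crude model shows the effect the abstract describes: as n grows, the success probability collapses and the stable percentile drops sharply.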

  7. Building online genomics applications using BioPyramid.

    Science.gov (United States)

    Stephenson, Liam; Wakeham, Yoshua; Seidenman, Nick; Choi, Jarny

    2018-03-29

    BioPyramid is a Python package which serves as a scaffold for building an online application for the exploration of gene expression data. It is designed for bioinformaticians wishing to quickly share transformed data and interactive analyses with collaborators. Current R-based tools similarly address the need to quickly share omics data in an exploratory format, but these are generally small-scale, single-dataset solutions. BioPyramid is written in the Python Pyramid framework and is scalable to address longer-term or more complex projects. It contains a number of components designed to reduce the time and effort of building such an application from scratch, including gene annotation, dataset models and visualisation tools. Freely available at http://github.com/jarny/biopyramid. Implemented in Python and JavaScript. jarnyc@unimelb.edu.au.
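
    For readers unfamiliar with the underlying framework, a minimal Pyramid application looks roughly like this. The route and view names are hypothetical; the real BioPyramid scaffold at the repository above is considerably richer.

    ```python
    # Minimal Pyramid web application with one parameterized route.
    from wsgiref.simple_server import make_server
    from pyramid.config import Configurator
    from pyramid.response import Response

    def gene_view(request):
        # Placeholder endpoint that would serve expression values for one gene.
        gene = request.matchdict["gene"]
        return Response(f"expression profile for {gene}")

    if __name__ == "__main__":
        config = Configurator()
        config.add_route("gene", "/genes/{gene}")
        config.add_view(gene_view, route_name="gene")
        app = config.make_wsgi_app()
        make_server("0.0.0.0", 6543, app).serve_forever()
    ```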

  8. Optimizing Energy Consumption in Building Designs Using Building Information Model (BIM

    Directory of Open Access Journals (Sweden)

    Egwunatum Samuel

    2016-09-01

    Given the ability of a Building Information Model (BIM) to serve as a multi-disciplinary data repository, this paper seeks to explore and exploit the sustainability value of Building Information Modelling in delivering buildings that require less energy for their operation, emit less CO2 and at the same time provide a comfortable living environment for their occupants. This objective was achieved by a critical and extensive review of the literature covering: (1) building energy consumption, (2) building energy performance and analysis, and (3) building information modeling and energy assessment. The literature cited in this paper showed that linking an energy analysis tool with a BIM model helped project design teams to predict and create optimized energy consumption. To validate this finding, an in-depth analysis was carried out on a completed BIM-integrated construction project, the Arboleda Project in the Dominican Republic. The findings showed that the BIM-based energy analysis helped the design team achieve the world’s first 103% positive energy building. From the research findings, the paper concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results, and deliver energy-efficient buildings. The study further recommends that the adoption of level 2 BIM and the integration of BIM in energy optimization analysis should be made compulsory for all projects, irrespective of the method of procurement (government-funded or otherwise) or its size.

  9. Model building

    International Nuclear Information System (INIS)

    Frampton, Paul H.

    1998-01-01

    In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly seen at HERA

  10. Toward a scalable flexible-order model for 3D nonlinear water waves

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Ducrozet, Guillaume; Bingham, Harry B.

    For marine and coastal applications, current work is directed toward the development of a scalable numerical 3D model for fully nonlinear potential water waves over arbitrary depths. The model is high-order accurate, robust and efficient for large-scale problems, and support will be included...... for flexibility in the description of structures by the use of curvilinear boundary-fitted meshes. The mathematical equations for potential waves in the physical domain are transformed through $\sigma$-mapping(s) to a time-invariant boundary-fitted domain which then becomes a basis for an efficient solution...... strategy on a time-invariant mesh. The 3D numerical model is based on a finite difference method as in the original works \cite{LiFleming1997,BinghamZhang2007}. Full details and other aspects of an improved 3D solution can be found in \cite{EBL08}. The new and improved approach for three...
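
    For reference, the $\sigma$-mapping mentioned above is, in its standard form in this line of work (the paper's own notation may differ), the vertical coordinate transform

    \[
    \sigma = \frac{z + h(x, y)}{\eta(x, y, t) + h(x, y)} \in [0, 1],
    \]

    which maps the moving fluid column between the sea bed $z = -h$ and the free surface $z = \eta$ onto a fixed unit interval, so that the transformed computational domain, and hence the mesh, is time-invariant even though the free surface evolves.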

  11. Scalable photoreactor for hydrogen production

    KAUST Repository

    Takanabe, Kazuhiro; Shinagawa, Tatsuya

    2017-01-01

    Provided herein are scalable photoreactors that can include a membrane-free water-splitting electrolyzer and systems that can include a plurality of membrane-free water-splitting electrolyzers. Also provided herein are methods of using the scalable photoreactors provided herein.

  12. Scalable photoreactor for hydrogen production

    KAUST Repository

    Takanabe, Kazuhiro

    2017-04-06

    Provided herein are scalable photoreactors that can include a membrane-free water-splitting electrolyzer and systems that can include a plurality of membrane-free water-splitting electrolyzers. Also provided herein are methods of using the scalable photoreactors provided herein.

  13. Resource-aware complexity scalability for mobile MPEG encoding

    NARCIS (Netherlands)

    Mietens, S.O.; With, de P.H.N.; Hentschel, C.; Panchanatan, S.; Vasudev, B.

    2004-01-01

    Complexity scalability attempts to scale the required resources of an algorithm with the chosen quality settings, in order to broaden the application range. In this paper, we present complexity-scalable MPEG encoding of which the core processing modules are modified for scalability. Scalability is

  14. Developing a scalable modeling architecture for studying survivability technologies

    Science.gov (United States)

    Mohammad, Syed; Bounker, Paul; Mason, James; Brister, Jason; Shady, Dan; Tucker, David

    2006-05-01

    To facilitate interoperability of models in a scalable environment, and provide a relevant virtual environment in which Survivability technologies can be evaluated, the US Army Research Development and Engineering Command (RDECOM) Modeling Architecture for Technology Research and Experimentation (MATREX) Science and Technology Objective (STO) program has initiated the Survivability Thread which will seek to address some of the many technical and programmatic challenges associated with the effort. In coordination with different Thread customers, such as the Survivability branches of various Army labs, a collaborative group has been formed to define the requirements for the simulation environment that would in turn provide them a value-added tool for assessing models and gauging system-level performance relevant to Future Combat Systems (FCS) and the Survivability requirements of other burgeoning programs. An initial set of customer requirements has been generated in coordination with the RDECOM Survivability IPT lead, through the Survivability Technology Area at the RDECOM Tank-automotive Research Development and Engineering Center (TARDEC, Warren, MI). The results of this project are aimed at a culminating experiment and demonstration scheduled for September 2006, which will include a multitude of components from within RDECOM and provide the framework for future experiments to support Survivability research. This paper details the components with which the MATREX Survivability Thread was created and executed, and provides insight into the capabilities currently demanded by the Survivability faculty within RDECOM.

  15. Model building

    International Nuclear Information System (INIS)

    Frampton, P.H.

    1998-01-01

    In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly seen at HERA. copyright 1998 American Institute of Physics

  16. A Scalable Cloud Library Empowering Big Data Management, Diagnosis, and Visualization of Cloud-Resolving Models

    Science.gov (United States)

    Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.

    2015-12-01

    A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25–5 km horizontal grid spacings. The main advantage of the CRM is that it allows explicit interactive processes between microphysics, radiation, turbulence, surface, and aerosols without subgrid cloud fraction, overlapping and convective parameterization. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize/inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate against NASA's field campaign data and L1/L2 satellite data products, due to the large data volume (~10 TB) and the complexity of CRM physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capability includes: (1) an SCL data model that enables various CRM simulation outputs in NetCDF, including the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) models, to be accessed and processed by Hadoop; (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs; (3) a technique to visualize Hadoop-resident data with IDL; (4) a technique to subset Hadoop-resident data, compliant with the SCL data model, with HIVE or Impala via HUE's Web interface; (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system, PVFS2 or CephFS, where high performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA Field Campaigns and Satellite data to a
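
    The record mentions Hive/Impala subsetting and ongoing Spark tests; a flavour of such Hadoop-resident subsetting is sketched below using the standard PySpark API. The file path and column names are made-up placeholders, not the SCL's actual schema.

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("scl-subset").getOrCreate()

    # CSV rows as produced by a NetCDF-to-CSV converter; the column
    # names here are hypothetical placeholders for illustration.
    sim = spark.read.csv("hdfs:///scl/gce_run42.csv", header=True,
                         inferSchema=True)

    # Subset one storm region and compute mean rain rate per time step,
    # without ever copying the full dataset off the cluster.
    subset = (sim
              .filter((F.col("lat").between(25.0, 30.0)) &
                      (F.col("lon").between(-95.0, -90.0)))
              .groupBy("time_step")
              .agg(F.avg("rain_rate").alias("mean_rain_rate")))

    subset.write.mode("overwrite").csv("hdfs:///scl/run42_subset")
    ```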

  17. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    Science.gov (United States)

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  18. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  19. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions and the aggregation and sharing of geographically-distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  20. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as with the OpenMP standard organization, to help ensure that OpenMP evolves in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed-memory systems by exploring different programming models, language extensions, compiler optimizations, and runtime library support.

  1. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin

    2016-07-29

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available data such as aerial images and street-view images. First, we propose a parametric model to describe the exterior of a building (with a varying number of parameters) and propose a set of attributes as a building representation with fixed dimensionality. Second, we propose a hierarchical graphical model with hidden variables to encode the relationships between building attributes and learn both the structure and parameters of the model from the database. Third, we propose optimization algorithms to generate three-dimensional models based on building attributes sampled from the graphical model. Finally, we demonstrate our framework by synthesizing new building models and completing partially observed building models from photographs.

  2. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The highly automated PHENIX AutoBuild wizard is described. The procedure can be applied equally well to phases derived from isomorphous/anomalous and molecular-replacement methods. The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution

  3. Scalable Open Source Smart Grid Simulator (SGSim)

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Jacobsen, Rune Hylsberg; Stefanni, Francesco

    2017-01-01

    This paper presents an open source smart grid simulator (SGSim). The simulator is based on the open source SystemC Network Simulation Library (SCNSL) and aims to model scalable smart grid applications. SGSim has been tested under different smart grid scenarios that contain hundreds of thousands of households...

  4. Iterative model-building, structure refinement, and density modification with the PHENIX AutoBuild Wizard

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D.

    2007-04-29

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R-factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  5. Petascale Hierarchical Modeling VIA Parallel Execution

    Energy Technology Data Exchange (ETDEWEB)

    Gelman, Andrew [Principal Investigator]

    2014-04-14

    The research allows more effective model building. By allowing researchers to fit complex models to large datasets in a scalable manner, our algorithms and software enable more effective scientific research. In the new area of “big data,” it is often necessary to fit “big models” to adjust for systematic differences between sample and population. For this task, scalable and efficient model-fitting tools are needed, and these have been achieved with our new Hamiltonian Monte Carlo algorithm, the no-U-turn sampler, and our new C++ program, Stan. In layman’s terms, our research enables researchers to create improved mathematical models for large and complex systems.
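
    As a concrete (textbook, not project-specific) example of the hierarchical models such tools are built to fit, consider $J$ groups with observed effects $y_j$ and known standard errors $\sigma_j$:

    \[
    y_j \sim \mathcal{N}(\theta_j, \sigma_j^2), \qquad \theta_j \sim \mathcal{N}(\mu, \tau^2), \qquad j = 1, \dots, J,
    \]

    with weakly informative priors on the hyperparameters $\mu$ and $\tau$. The no-U-turn sampler explores the joint posterior over $(\theta_1, \dots, \theta_J, \mu, \tau)$ without hand-tuned step sizes or trajectory lengths, which is what makes fitting such models practical as $J$ and the data size grow.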

  6. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    Directory of Open Access Journals (Sweden)

    Sven Van Poucke

    With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting-edge predictive methods and data manipulation requires substantial programming skills, limiting their direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, the correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  7. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    Science.gov (United States)

    Van Poucke, Sven; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; De Deyne, Cathy

    2016-01-01

    With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting-edge predictive methods and data manipulation requires substantial programming skills, limiting their direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, the correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  8. Accounting Fundamentals and the Variation of Stock Price: Factoring in the Investment Scalability

    OpenAIRE

    Sumiyana, Sumiyana; Baridwan, Zaki; Sugiri, Slamet; Hartono, Jogiyanto

    2010-01-01

    This study develops a new return model with respect to accounting fundamentals. The new return model is based on Chen and Zhang (2007). This study takes into account the investment scalability information. Specifically, this study splits the scale of the firm’s operations into short-run and long-run investment scalabilities. We document that five accounting fundamentals explain the variation of annual stock return. The factors comprise book value, earnings yield, short-run and long-run investment s...

  9. Virtual building environments (VBE) - Applying information modeling to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  10. Adaptive format conversion for scalable video coding

    Science.gov (United States)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can also be used in addition to residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application that AFC is well-suited for is the migration path for digital television where AFC can provide immediate video scalability as well as assist future migrations.

  11. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    2010-11-01

    Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion-channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and, as conformational changes, internal dynamics. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.

  12. Scalable power selection method for wireless mesh networks

    CSIR Research Space (South Africa)

    Olwal, TO

    2009-01-01

    This paper addresses the problem of a scalable dynamic power control (SDPC) for wireless mesh networks (WMNs) based on IEEE 802.11 standards. An SDPC model that accounts for architectural complexities witnessed in multiple radios and hops...

  13. A conclusive scalable model for the complete actuation response for IPMC transducers

    International Nuclear Information System (INIS)

    McDaid, A J; Aw, K C; Haemmerle, E; Xie, S Q

    2010-01-01

    This paper proposes a conclusive scalable model for the complete actuation response for ionic polymer metal composites (IPMC). This single model is proven to be able to accurately predict the free displacement/velocity and force actuation at varying displacements, with up to 3 V inputs. An accurate dynamic relationship between the force and displacement has been established which can be used to predict the complete actuation response of the IPMC transducer. The model is accurate at large displacements and can also predict the response when interacting with external mechanical systems and loads. This model equips engineers with a useful design tool which enables simple mechanical design, simulation and optimization when integrating IPMC actuators into an application. The response of the IPMC is modelled in three stages: (i) a nonlinear equivalent electrical circuit to predict the current drawn, (ii) an electromechanical coupling term and (iii) a segmented mechanical beam model which includes an electrically induced torque for the polymer. Model parameters are obtained using the dynamic time response and results are presented demonstrating the correspondence between the model and experimental results over a large operating range. This newly developed model is a large step forward, aiding in the progression of IPMCs towards wide acceptance as replacements to traditional actuators

  14. Scalable Density-Based Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2011-01-01

    For knowledge discovery in high dimensional databases, subspace clustering detects clusters in arbitrary subspace projections. Scalability is a crucial issue, as the number of possible projections is exponential in the number of dimensions. We propose a scalable density-based subspace clustering...... method that steers mining to few selected subspace clusters. Our novel steering technique reduces subspace processing by identifying and clustering promising subspaces and their combinations directly. Thereby, it narrows down the search space while maintaining accuracy. Thorough experiments on real...... and synthetic databases show that steering is efficient and scalable, with high quality results. For future work, our steering paradigm for density-based subspace clustering opens research potential for speeding up other subspace clustering approaches as well....

  15. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  16. Scalable devices

    KAUST Repository

    Krüger, Jens J.; Hadwiger, Markus

    2014-01-01

    In computer science in general and in particular the field of high performance computing and supercomputing the term scalable plays an important role. It indicates that a piece of hardware, a concept, an algorithm, or an entire system scales

  17. Evaluation and simulation of event building techniques for a detector at the LHC

    CERN Document Server

    Spiwoks, R

    1995-01-01

    The main objectives of future experiments at the Large Hadron Collider are the search for the Higgs boson (or bosons), the verification of the Standard Model and the search beyond the Standard Model in a new energy range up to a few TeV. These experiments will have to cope with unprecedented high data rates and will need event building systems which can offer a bandwidth of 1 to 100GB/s and which can assemble events from 100 to 1000 readout memories at rates of 1 to 100kHz. This work investigates the feasibility of parallel event building systems using commercially available high speed interconnects and switches. Studies are performed by building a small-scale prototype and by modelling this prototype and realistic architectures with discrete-event simulations. The prototype is based on the HiPPI standard and uses commercially available VME-HiPPI interfaces and a HiPPI switch together with modular and scalable software. The setup operates successfully as a parallel event building system of limited size in...

  18. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems

  19. Flexible building stock modelling with array-programming

    DEFF Research Database (Denmark)

    Brøgger, Morten; Wittchen, Kim Bjarne

    2017-01-01

    Many building stock models employ archetype-buildings in order to capture the essential characteristics of a diverse building stock. However, these models often require multiple archetypes, which make them inflexible. This paper proposes an array-programming based model, which calculates the heat...... tend to overestimate potential energy-savings, if we do not consider these discrepancies. The proposed model makes it possible to compute and visualize potential energy-savings in a flexible and transparent way....
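
    The array-programming idea can be illustrated with a small NumPy sketch (the U-values, areas and degree-hour figure below are hypothetical, not the paper's model): heat losses for many buildings are computed in one vectorized expression instead of looping over archetypes.

    ```python
    import numpy as np

    # Hypothetical building stock: one row per building, one column per
    # envelope component (wall, roof, window, floor).
    areas = np.array([[120.0,  90.0, 25.0,  80.0],     # m^2
                      [200.0, 140.0, 40.0, 120.0],
                      [ 95.0,  70.0, 18.0,  60.0]])
    u_values = np.array([[0.8, 0.4, 2.8, 0.6],         # W/(m^2 K)
                         [0.5, 0.3, 1.9, 0.4],
                         [1.2, 0.6, 3.0, 0.9]])

    degree_hours = 90_000.0  # K*h per year, assumed climate figure

    # Vectorized transmission heat loss per building, in kWh/year:
    # sum of U*A over components, times degree hours, converted to kWh.
    heat_loss_kwh = (u_values * areas).sum(axis=1) * degree_hours / 1000.0
    print(heat_loss_kwh)  # one figure per building, no per-archetype loop
    ```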

  20. Building America Case Study: Accelerating the Delivery of Home-Performance Upgrades Using a Synergistic Business Model, Minneapolis, Minnesota

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    Achieving Building America energy savings goals (40 percent by 2030) will require many existing homes to install energy upgrades. Engaging large numbers of homeowners in building science-guided upgrades during a single remodeling event has been difficult for a number of reasons. Performance upgrades in existing homes tend to occur over multiple years and usually result from component failures (furnace failure) and weather damage (ice dams, roofing, siding). This research attempted to: A) understand the homeowner's motivations regarding investing in building science-based performance upgrades; B) determine a rapidly scalable approach to engage large numbers of homeowners directly through existing customer networks; C) access a business model that will manage all aspects of the contractor-homeowner-performance professional interface to ensure good upgrade decisions over time. The solution results from a synergistic approach utilizing networks of suppliers merging with networks of homeowner customers. Companies in the $400 to $800 billion home services industry have proven direct marketing and sales proficiencies that have led to the development of vast customer networks. Companies such as pest control, lawn care, and security have nurtured these networks by successfully addressing the ongoing needs of homes. This long-term access to customers and trust established with consistent delivery has also provided opportunities for home service providers to grow by successfully introducing new products and services like attic insulation and air sealing. The most important component for success is a business model that will facilitate and manage the process. The team analyzes a group that developed a working model.

  1. Cooperative Scalable Moving Continuous Query Processing

    DEFF Research Database (Denmark)

    Li, Xiaohui; Karras, Panagiotis; Jensen, Christian S.

    2012-01-01

    of the global view and handle the majority of the workload. Meanwhile, moving clients, having basic memory and computation resources, handle small portions of the workload. This model is further enhanced by dynamic region allocation and grid size adjustment mechanisms that reduce the communication...... and computation cost for both servers and clients. An experimental study demonstrates that our approaches offer better scalability than competitors...

  2. A Numerical Study of Scalable Cardiac Electro-Mechanical Solvers on HPC Architectures

    Directory of Open Access Journals (Sweden)

    Piero Colli Franzone

    2018-04-01

    We introduce and study some scalable domain decomposition preconditioners for cardiac electro-mechanical 3D simulations on parallel HPC (High Performance Computing) architectures. The electro-mechanical model of the cardiac tissue is composed of four coupled sub-models: (1) the static finite elasticity equations for the transversely isotropic deformation of the cardiac tissue; (2) the active tension model describing the dynamics of the intracellular calcium, cross-bridge binding and myofilament tension; (3) the anisotropic Bidomain model describing the evolution of the intra- and extra-cellular potentials in the deforming cardiac tissue; and (4) the ionic membrane model describing the dynamics of ionic currents, gating variables, ionic concentrations and stretch-activated channels. This strongly coupled electro-mechanical model is discretized in time with a splitting semi-implicit technique and in space with isoparametric finite elements. The resulting scalable parallel solver is based on Multilevel Additive Schwarz preconditioners for the solution of the Bidomain system and on BDDC preconditioned Newton-Krylov solvers for the non-linear finite elasticity system. The results of several 3D parallel simulations show the scalability of both linear and non-linear solvers and their application to the study of both physiological excitation-contraction cardiac dynamics and re-entrant waves in the presence of different mechano-electrical feedbacks.

  3. New Region-Scalable Discriminant and Fitting Energy Functional for Driving Geometric Active Contours in Medical Image Segmentation

    Directory of Open Access Journals (Sweden)

    Xuchu Wang

    2014-01-01

    This paper proposes a geometric active contour model that uses a region-scalable discriminant and fitting energy functional for handling the intensity inhomogeneity and weak boundary problems in medical image segmentation. The region-scalable discriminant and fitting energy functional is defined to capture the image intensity characteristics in local and global regions for driving the evolution of the active contour. The discriminant term in the model aims at separating background and foreground in scalable regions while the fitting term tends to fit the intensity in these regions. This model is then transformed into a variational level set formulation with a level set regularization term for accurate computation. The new model utilizes intensity information in the local and global regions as much as possible; it therefore not only handles intensity inhomogeneity better, but is also more robust to noise and allows more flexible initialization than the original global-region and region-scalable based models. Experimental results for synthetic and real medical image segmentation show the advantages of the proposed method in terms of accuracy and robustness.
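
    For context, the classic region-scalable fitting energy of Li et al., which this family of models extends (the discriminant term added by this paper is not reproduced here), reads, for a contour embedded in a level set function $\phi$,

    \[
    E^{\mathrm{fit}}(\phi, f_1, f_2) = \sum_{i=1}^{2} \lambda_i \int \left[ \int K_\sigma(x - y)\, \lvert I(y) - f_i(x) \rvert^2 \, M_i(\phi(y))\, dy \right] dx,
    \]

    where $K_\sigma$ is a Gaussian kernel whose width $\sigma$ sets the region scale, $f_1, f_2$ are locally fitted intensities, and $M_1(\phi) = H(\phi)$, $M_2(\phi) = 1 - H(\phi)$ select the regions inside and outside the contour.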

  4. Progress Report 2008: A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Drake, John B [ORNL]; Worley, Patrick H [ORNL]; Hoffman, Forrest M [ORNL]; Jones, Phil [Los Alamos National Laboratory (LANL)]

    2009-01-01

    This project employs multi-disciplinary teams to accelerate development of the Community Climate System Model (CCSM), based at the National Center for Atmospheric Research (NCAR). A consortium of eight Department of Energy (DOE) National Laboratories collaborates with NCAR and the NASA Global Modeling and Assimilation Office (GMAO). The laboratories are Argonne (ANL), Brookhaven (BNL), Los Alamos (LANL), Lawrence Berkeley (LBNL), Lawrence Livermore (LLNL), Oak Ridge (ORNL), Pacific Northwest (PNNL) and Sandia (SNL). The work plan focuses on scalability for petascale computation and extensibility to a more comprehensive earth system model. Our stated goal is to support the DOE mission in climate change research by helping ... To determine the range of possible climate changes over the 21st century and beyond through simulations using a more accurate climate system model that includes the full range of human and natural climate feedbacks with increased realism and spatial resolution.

  5. U.S. Department of Energy Commercial Reference Building Models of the National Building Stock

    Energy Technology Data Exchange (ETDEWEB)

    Deru, M.; Field, K.; Studer, D.; Benne, K.; Griffith, B.; Torcellini, P.; Liu, B.; Halverson, M.; Winiarski, D.; Rosenberg, M.; Yazdanian, M.; Huang, J.; Crawley, D.

    2011-02-01

    The U.S. Department of Energy (DOE) Building Technologies Program has set the aggressive goal of producing marketable net-zero energy buildings by 2025. This goal will require collaboration between the DOE laboratories and the building industry. We developed standard or reference energy models for the most common commercial buildings to serve as starting points for energy efficiency research. These models represent fairly realistic buildings and typical construction practices. Fifteen commercial building types and one multifamily residential building were determined by consensus between DOE, the National Renewable Energy Laboratory, Pacific Northwest National Laboratory, and Lawrence Berkeley National Laboratory, and represent approximately two-thirds of the commercial building stock.

  6. Scalable algorithms for contact problems

    CERN Document Server

    Dostál, Zdeněk; Sadowská, Marie; Vondrák, Vít

    2016-01-01

    This book presents a comprehensive and self-contained treatment of the authors’ newly developed scalable algorithms for the solutions of multibody contact problems of linear elasticity. The brand new feature of these algorithms is theoretically supported numerical scalability and parallel scalability demonstrated on problems discretized by billions of degrees of freedom. The theory supports solving multibody frictionless contact problems, contact problems with possibly orthotropic Tresca’s friction, and transient contact problems. It covers BEM discretization, jumping coefficients, floating bodies, mortar non-penetration conditions, etc. The exposition is divided into four parts, the first of which reviews appropriate facets of linear algebra, optimization, and analysis. The most important algorithms and optimality results are presented in the third part of the volume. The presentation is complete, including continuous formulation, discretization, decomposition, optimality results, and numerical experimen...

  7. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  8. Efficient Delivery of Scalable Video Using a Streaming Class Model

    Directory of Open Access Journals (Sweden)

    Jason J. Quinlan

    2018-03-01

    When we couple the rise in video streaming with the growing number of portable devices (smart phones, tablets, laptops), we see an ever-increasing demand for high-definition video online while on the move. Wireless networks are inherently characterised by restricted shared bandwidth and relatively high error loss rates, thus presenting a challenge for the efficient delivery of high quality video. Additionally, mobile devices can support/demand a range of video resolutions and qualities. This demand for mobile streaming highlights the need for adaptive video streaming schemes that can adjust to available bandwidth and heterogeneity, and can provide graceful changes in video quality, all while respecting viewing satisfaction. In this context, the use of well-known scalable/layered media streaming techniques, commonly known as scalable video coding (SVC), is an attractive solution. SVC encodes a number of video quality levels within a single media stream. This has been shown to be an especially effective and efficient solution, but it fares badly in the presence of datagram losses. While multiple description coding (MDC) can reduce the effects of packet loss on scalable video delivery, the increased delivery cost is counterproductive for constrained networks. This situation is accentuated in cases where only the lower quality level is required. In this paper, we assess these issues and propose a new approach called Streaming Classes (SC) through which we can define a key set of quality levels, each of which can be delivered in a self-contained manner. This facilitates efficient delivery, yielding reduced transmission byte-cost for devices requiring lower quality, relative to MDC and Adaptive Layer Distribution (ALD) (42% and 76% respective reductions for layer 2), while also maintaining high levels of consistent quality. We also illustrate how a selective packetisation technique can further reduce the effects of packet loss on viewable quality by

  9. A Massively Scalable Architecture for Instant Messaging & Presence

    NARCIS (Netherlands)

    Schippers, Jorrit; Remke, Anne Katharina Ingrid; Punt, Henk; Wegdam, M.; Haverkort, Boudewijn R.H.M.; Thomas, N.; Bradley, J.; Knottenbelt, W.; Dingle, N.; Harder, U.

    2010-01-01

    This paper analyzes the scalability of Instant Messaging & Presence (IM&P) architectures. We take a queueing-based modelling and analysis approach to find the bottlenecks of the current IM&P architecture at the Dutch social network Hyves, as well as of alternative architectures. We use the

  10. Comparison of Building Energy Modeling Programs: Building Loads

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Dandan [Tsinghua Univ., Beijing (China)]; Hong, Tianzhen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Yan, Da [Tsinghua Univ., Beijing (China)]; Wang, Chuang [Tsinghua Univ., Beijing (China)]

    2012-06-01

    This technical report presents the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort between Lawrence Berkeley National Laboratory, USA, and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). The Energy Foundation, an industrial partner of CERC-BEE, co-sponsored this study. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs, where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on in-depth understandings of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China’s building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to

  11. MicROS-drt: supporting real-time and scalable data distribution in distributed robotic systems.

    Science.gov (United States)

    Ding, Bo; Wang, Huaimin; Fan, Zedong; Zhang, Pengfei; Liu, Hui

    A primary requirement in distributed robotic software systems is the dissemination of data to all interested collaborative entities in a timely and scalable manner. However, providing such a service in a highly dynamic and resource-limited robotic environment is a challenging task, and existing robot software infrastructure has limitations in this aspect. This paper presents a novel robot software infrastructure, micROS-drt, which supports real-time and scalable data distribution. The solution is based on a loosely coupled data publish-subscribe model with the ability to support various time-related constraints. And to realize this model, a mature data distribution standard, the data distribution service for real-time systems (DDS), is adopted as the foundation of the transport layer of this software infrastructure. By elaborately adapting and encapsulating the capability of the underlying DDS middleware, micROS-drt can meet the requirement of real-time and scalable data distribution in distributed robotic systems. Evaluation results in terms of scalability, latency jitter and transport priority as well as the experiment on real robots validate the effectiveness of this work.
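
    The publish-subscribe-with-time-constraints idea can be sketched in a few lines of Python. This is a conceptual illustration only; micROS-drt itself delegates this to DDS middleware, and the class, topic and callback names here are invented.

    ```python
    import time

    class DeadlineBus:
        """Minimal topic-based publish-subscribe bus where each
        subscriber declares a deadline: the maximum age (in seconds) a
        sample may have when delivered. A sketch of the time-constrained
        pub-sub idea, not the DDS implementation."""

        def __init__(self):
            self.subscribers = {}  # topic -> list of (deadline, callback)

        def subscribe(self, topic, deadline, callback):
            self.subscribers.setdefault(topic, []).append((deadline, callback))

        def publish(self, topic, sample):
            stamp = time.monotonic()
            for deadline, callback in self.subscribers.get(topic, []):
                age = time.monotonic() - stamp
                # Age is ~0 here because delivery is synchronous; in real
                # middleware this check guards queued or delayed samples.
                if age <= deadline:
                    callback(sample)   # fresh enough: deliver
                # else: drop the sample or raise a missed-deadline event

    bus = DeadlineBus()
    bus.subscribe("/laser_scan", deadline=0.1,
                  callback=lambda s: print("scan:", s))
    bus.publish("/laser_scan", [0.5, 0.7, 1.2])  # delivered immediately
    ```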

  12. A Scalable Method for Regioselective 3-Acylation of 2-Substituted Indoles under Basic Conditions

    DEFF Research Database (Denmark)

    Johansson, Karl Henrik; Urruticoechea, Andoni; Larsen, Inna

    2015-01-01

    Privileged structures such as 2-arylindoles are recurrent molecular scaffolds in bioactive molecules. We here present an operationally simple, high yielding and scalable method for regioselective 3-acylation of 2-substituted indoles under basic conditions using functionalized acid chlorides. ... The method shows good tolerance to both electron-withdrawing and donating substituents on the indole scaffold and gives ready access to a variety of functionalized 3-acylindole building blocks suited for further derivatization.

  13. Integration of design applications with building models

    DEFF Research Database (Denmark)

    Eastman, C. M.; Jeng, T. S.; Chowdbury, R.

    1997-01-01

    This paper reviews various issues in the integration of applications with a building model... (Truncated.)

  14. Armagh Observatory - Historic Building Information Modelling for Virtual Learning in Building Conservation

    Science.gov (United States)

    Murphy, M.; Chenaux, A.; Keenaghan, G.; Gibson, V.; Butler, J.; Pybusr, C.

    2017-08-01

    In this paper the recording and design for a Virtual Reality Immersive Model of Armagh Observatory is presented, which will replicate the historic buildings and landscape, with distant meridian markers and the position of its principal historic instruments, within a model of the night sky showing the position of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated within 3D space to demonstrate how the position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines, virtual learning platforms and related concepts.

  15. Semi-Automatic Modelling of Building FAÇADES with Shape Grammars Using Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
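
    The procedural idea can be caricatured in a few lines: parametric library objects are combined into a façade by proportion rules, and changing the user parameters regenerates a different configuration. GDL is ArchiCAD's embedded language; the Python stand-in below and its rules are invented purely for illustration.

    ```python
    # Toy procedural facade: parametric objects placed by a fixed proportion rule.
    from dataclasses import dataclass

    @dataclass
    class Window:            # a "parametric library object"
        width: float
        height: float

    def build_facade(facade_width, storeys, window_ratio=0.18):
        """Place windows per storey according to an invented proportion rule."""
        w = facade_width * window_ratio            # window width from proportions
        per_storey = int(facade_width // (2 * w))  # rule: one window per 2 widths
        return [[Window(w, 1.5 * w) for _ in range(per_storey)]
                for _ in range(storeys)]

    facade = build_facade(facade_width=12.0, storeys=3)
    print(len(facade), "storeys,", len(facade[0]), "windows each,",
          f"{facade[0][0].width:.2f} m wide")
    ```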

  16. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  17. Component-Based Modelling for Scalable Smart City Systems Interoperability: A Case Study on Integrating Energy Demand Response Systems.

    Science.gov (United States)

    Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan

    2016-10-28

    Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., the Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system, assuring by design the scalability, interoperability and correctness of component cooperation.

  18. Artificial intelligence support for scientific model-building

    Science.gov (United States)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  19. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  20. New Complexity Scalable MPEG Encoding Techniques for Mobile Applications

    Directory of Open Access Journals (Sweden)

    Stephan Mietens

    2004-03-01

    Complexity scalability offers the advantage of one-time design of video applications for a large product family, including mobile devices, without the need to redesign the applications on the algorithmic level to meet the requirements of the different products. In this paper, we present complexity scalable MPEG encoding having core modules with modifications for scalability. The interdependencies of the scalable modules and the system performance are evaluated. Experimental results show that scalability gives a smooth change in complexity and corresponding video quality. Scalability is basically achieved by varying the number of computed DCT coefficients and the number of evaluated motion vectors, but other modules are designed such that they scale with the previous parameters. In the experiments using the “Stefan” sequence, the elapsed execution time of the scalable encoder, reflecting the computational complexity, can be gradually reduced to roughly 50% of its original execution time. The video quality scales between 20 dB and 48 dB PSNR with unity quantizer setting, and between 21.5 dB and 38.5 dB PSNR for different sequences targeting 1500 kbps. The implemented encoder and the scalability techniques can be successfully applied in mobile systems based on MPEG video compression.
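
    The scalability knob described above can be previewed with a toy, not the paper's encoder: restrict each 8x8 block to its first k low-frequency DCT coefficients and watch quality trade against complexity. The sketch assumes numpy and scipy are available.

    ```python
    # Vary the number of retained DCT coefficients per 8x8 block (the "k" knob).
    import numpy as np
    from scipy.fftpack import dct, idct

    def dct2(b):  return dct(dct(b, axis=0, norm='ortho'), axis=1, norm='ortho')
    def idct2(b): return idct(idct(b, axis=0, norm='ortho'), axis=1, norm='ortho')

    def lowfreq_mask(k, n=8):
        """Keep the k coefficients in a low-frequency-first (zigzag-style) order."""
        idx = sorted(((i, j) for i in range(n) for j in range(n)),
                     key=lambda p: (p[0] + p[1], p))
        m = np.zeros((n, n))
        for i, j in idx[:k]:
            m[i, j] = 1.0
        return m

    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, (8, 8)).astype(float)
    for k in (4, 16, 64):                      # the scalability knob
        rec = idct2(dct2(block) * lowfreq_mask(k))
        psnr = 10 * np.log10(255**2 / (np.mean((block - rec)**2) + 1e-12))
        print(f"k={k:2d} coefficients -> PSNR {psnr:5.1f} dB")
    ```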

  1. Buildings Lean Maintenance Implementation Model

    Science.gov (United States)

    Abreu, Antonio; Calado, João; Requeijo, José

    2016-11-01

    Nowadays, companies in global markets have to achieve high levels of performance and competitiveness to stay "alive". Within this assumption, building maintenance cannot be done in a casual and improvised way, due to the costs involved. Starting with a discussion of lean management and building maintenance, this paper introduces a model to support the Lean Building Maintenance (LBM) approach. Finally, based on a real case study from a Portuguese company, the benefits, challenges and difficulties are presented and discussed.

  2. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
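
    A minimal sketch of the modelling step, with invented placeholder features and synthetic labels rather than the study's actual predictors and incident data, might look like this (using scikit-learn):

    ```python
    # Logistic regression on building attributes to estimate fire probability.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 500
    X = np.column_stack([
        rng.integers(1, 12, n),        # e.g. number of storeys (placeholder)
        rng.uniform(1900, 2010, n),    # e.g. year of construction (placeholder)
        rng.uniform(50, 5000, n),      # e.g. floor area in m^2 (placeholder)
    ])
    # Synthetic labels: older, larger buildings made slightly more fire-prone here.
    p = 1 / (1 + np.exp(0.01 * (X[:, 1] - 1950) - 0.0002 * X[:, 2]))
    y = rng.random(n) < p

    model = LogisticRegression(max_iter=1000).fit(X, y)
    new_building = [[4, 1935, 1200.0]]
    print("estimated fire probability:", model.predict_proba(new_building)[0, 1])
    ```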

  3. Guidelines for Using Building Information Modeling for Energy Analysis of Buildings

    Directory of Open Access Journals (Sweden)

    Thomas Reeves

    2015-12-01

    Building energy modeling (BEM), a subset of building information modeling (BIM), integrates energy analysis into the design, construction, and operation and maintenance of buildings. As there are various existing BEM tools available, there is a need to evaluate the utility of these tools in various phases of the building lifecycle. The goal of this research was to develop guidelines for evaluation and selection of BEM tools to be used in particular building lifecycle phases. The objectives of this research were to: (1) evaluate existing BEM tools; (2) illustrate the application of the three BEM tools; (3) re-evaluate the three BEM tools; and (4) develop guidelines for evaluation, selection and application of BEM tools in the design, construction and operation/maintenance phases of buildings. Twelve BEM tools were initially evaluated using four criteria: interoperability, usability, available inputs, and available outputs. Each of the top three BEM tools selected based on this initial evaluation was used in a case study to simulate and evaluate energy usage, daylighting performance, and natural ventilation for two academic buildings (LEED-certified and non-LEED-certified). The results of the case study were used to re-evaluate the three BEM tools using the initial criteria with the addition of two new criteria (speed and accuracy), and to develop guidelines for evaluating and selecting BEM tools to analyze building energy performance. The major contribution of this research is the development of these guidelines that can help potential BEM users to identify the most appropriate BEM tool for application in particular building lifecycle phases.

  4. NYU3T: teaching, technology, teamwork: a model for interprofessional education scalability and sustainability.

    Science.gov (United States)

    Djukic, Maja; Fulmer, Terry; Adams, Jennifer G; Lee, Sabrina; Triola, Marc M

    2012-09-01

    Interprofessional education is a critical precursor to effective teamwork and the collaboration of health care professionals in clinical settings. Numerous barriers have been identified that preclude scalable and sustainable interprofessional education (IPE) efforts. This article describes NYU3T: Teaching, Technology, Teamwork, a model that uses novel technologies such as Web-based learning, virtual patients, and high-fidelity simulation to overcome some of the common barriers and drive implementation of evidence-based teamwork curricula. It outlines the program's curricular components, implementation strategy, evaluation methods, and lessons learned from the first year of delivery and describes implications for future large-scale IPE initiatives. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Statistical models describing the energy signature of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Thavlov, Anders

    2010-01-01

    Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve the energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building ... or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as UA-values, time constants of the building, and other parameters related to the heat dynamics. A method for selecting the most appropriate model for a given building is outlined and finally a perspective of the applications is given. Acknowledgements to the Danish Energy Saving Trust and the Interreg IV "Vind i...
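
    The simplest energy-signature model reduces to a steady-state balance Q ≈ UA·(T_in − T_out), so a UA-value estimate falls out of a linear regression of heat demand on temperature difference. The sketch below uses synthetic numbers and is far simpler than the dynamical FlexHouse models described above.

    ```python
    # Estimate a building's UA-value from (synthetic) energy-signature data.
    import numpy as np

    rng = np.random.default_rng(1)
    t_diff = rng.uniform(5, 25, 200)                 # indoor-outdoor difference [K]
    ua_true = 180.0                                  # [W/K], invented ground truth
    q = ua_true * t_diff + rng.normal(0, 150, 200)   # heat demand [W] with noise

    ua_hat, intercept = np.polyfit(t_diff, q, 1)     # slope is the UA estimate
    print(f"estimated UA-value: {ua_hat:.1f} W/K (true {ua_true} W/K)")
    ```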

  6. Algorithmic psychometrics and the scalable subject.

    Science.gov (United States)

    Stark, Luke

    2018-04-01

    Recent public controversies, ranging from the 2014 Facebook 'emotional contagion' study to psychographic data profiling by Cambridge Analytica in the 2016 American presidential election, Brexit referendum and elsewhere, signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern. The entangled history of these two fields grounds the application of applied psychological techniques to digital technologies, and an investment in applying calculability to human subjectivity. Today, a quantifiable psychological subject position has been translated, via 'big data' sets and algorithmic analysis, into a model subject amenable to classification through digital media platforms. I term this position the 'scalable subject', arguing it has been shaped and made legible by algorithmic psychometrics - a broad set of affordances in digital platforms shaped by psychology and the behavioral sciences. In describing the contours of this 'scalable subject', this paper highlights the urgent need for renewed attention from STS scholars on the psy sciences, and on a computational politics attentive to psychology, emotional expression, and sociality via digital media.

  7. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  8. Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.

    Science.gov (United States)

    Maani, Ehsan; Katsaggelos, Aggelos K

    2009-09-01

    The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.
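
    The extraction idea can be sketched as a greedy prioritization: rank NAL units by estimated distortion reduction per bit and keep them until the bit budget is exhausted. The toy below uses invented per-unit numbers, ignores the inter-layer dependency constraints a real SVC extractor must respect, and is not the paper's drift-aware estimator.

    ```python
    # Greedy budget-constrained extraction over (rate, distortion-reduction) pairs.
    nal_units = [  # (id, bits, estimated distortion reduction) - illustrative values
        ("base/0", 4000, 900.0), ("enh1/0", 2500, 300.0), ("enh2/0", 2000, 120.0),
        ("base/1", 4200, 880.0), ("enh1/1", 2600, 290.0), ("enh2/1", 2100, 110.0),
    ]

    def extract(units, budget_bits):
        ranked = sorted(units, key=lambda u: u[2] / u[1], reverse=True)
        kept, spent = [], 0
        for uid, bits, gain in ranked:
            if spent + bits <= budget_bits:   # keep while the budget allows
                kept.append(uid)
                spent += bits
        return kept, spent

    kept, spent = extract(nal_units, budget_bits=12000)
    print("kept:", kept, "| bits used:", spent)
    ```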

  9. Neuromorphic adaptive plastic scalable electronics: analog learning systems.

    Science.gov (United States)

    Srinivasa, Narayan; Cruz-Albrecht, Jose

    2012-01-01

    Decades of research to build programmable intelligent machines have demonstrated limited utility in complex, real-world environments. Comparing their performance with biological systems, these machines are less efficient by a factor of 1 million to 1 billion in complex, real-world environments. The Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program is a multifaceted Defense Advanced Research Projects Agency (DARPA) project that seeks to break the programmable machine paradigm and define a new path for creating useful, intelligent machines. Since real-world systems exhibit infinite combinatorial complexity, electronic neuromorphic machine technology would be preferable in a host of applications, but useful and practical implementations still do not exist. HRL Laboratories LLC has embarked on addressing these challenges, and, in this article, we provide an overview of our project and progress made thus far.

  10. Scalability Dilemma and Statistic Multiplexed Computing — A Theory and Experiment

    Directory of Open Access Journals (Sweden)

    Justin Yuan Shi

    2017-08-01

    For the last three decades, end-to-end computing paradigms, such as MPI (Message Passing Interface), RPC (Remote Procedure Call) and RMI (Remote Method Invocation), have been the de facto paradigms for distributed and parallel programming. Despite their successes, applications built using these paradigms suffer because the probability of a crash grows with application size. Checkpoint/restore and backup/recovery are the only means to save otherwise lost critical information. The scalability dilemma is such a practical challenge that the probability of data loss increases as the application scales in size. The theoretical significance of this practical challenge is that it undermines the fundamental structure of the scientific discovery process and mission critical services in production today. In 1997, the direct use of the end-to-end reference model in distributed programming was recognized as a fallacy, and the scalability dilemma was predicted. However, this voice was overrun by the passage of time. Today, the rapidly growing volume of digitized data demands solving the increasingly critical scalability challenges. Computing architecture scalability, although loosely defined, is now front and center of large-scale computing efforts. Constrained only by the economic law of diminishing returns, this paper proposes a narrow definition of a Scalable Computing Service (SCS). Three scalability tests are also proposed in order to distinguish service architecture flaws from poor application programming. Scalable data intensive service requires additional treatments; thus, the data storage is assumed reliable in this paper. A single-sided Statistic Multiplexed Computing (SMC) paradigm is proposed. A UVR (Unidirectional Virtual Ring) SMC architecture is examined under SCS tests. SMC was designed to circumvent the well-known impossibility of end-to-end paradigms. It relies on the proven statistic multiplexing principle to deliver reliable service ...

  11. Iterative-build OMIT maps: map improvement by iterative model building and refinement without model bias

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Adams, Paul D.; Read, Randy J.; Zwart, Peter H.; Hung, Li-Wei

    2008-01-01

    An OMIT procedure is presented that has the benefits of iterative model building, density modification and refinement, yet is essentially unbiased by the atomic model that is built. A procedure for carrying out iterative model building, density modification and refinement is presented in which the density in an OMIT region is essentially unbiased by an atomic model. Density from a set of overlapping OMIT regions can be combined to create a composite ‘iterative-build’ OMIT map that is everywhere unbiased by an atomic model but also everywhere benefiting from the model-based information present elsewhere in the unit cell. The procedure may have applications in the validation of specific features in atomic models as well as in overall model validation. The procedure is demonstrated with a molecular-replacement structure and with an experimentally phased structure, and a variation on the method is demonstrated by removing model bias from a structure from the Protein Data Bank.

  12. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  13. Multidisciplinary Energy Assessment of Tertiary Buildings: Automated Geomatic Inspection, Building Information Modeling Reconstruction and Building Performance Simulation

    Directory of Open Access Journals (Sweden)

    Faustino Patiño-Cambeiro

    2017-07-01

    There is an urgent need for energy efficiency in buildings within the European framework, considering its environmental implications, and Europe’s energy dependence. Furthermore, the need for enhancing and increasing productivity in the building industry turns new technologies and building energy performance simulation environments into extremely interesting solutions towards rigorous analysis and decision making in renovation within acceptable risk levels. The present work describes a multidisciplinary approach for the estimation of the energy performance of an educational building. The research involved data acquisition with advanced geomatic tools, the development of an optimized building information model, and energy assessment in Building Performance Simulation (BPS) software. Interoperability issues were observed in the different steps of the process. The inspection and diagnostic phases were conducted in a timely, accurate manner thanks to automated data acquisition and subsequent analysis using Building Information Modeling based tools (BIM-based tools). Energy simulation was performed using Design Builder, and the results obtained were compared with those yielded by the official software tool established by Spanish regulations for energy certification. The discrepancies between the results of both programs have proven that the official software program is conservative in this sense. This may cause the depreciation of the assessed buildings.

  14. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992

  15. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
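
    One common way to partition by a spatial index, sketched below with invented details rather than the project's actual scheme, is to interleave chunk coordinates into a Morton (Z-order) key and hand contiguous key ranges to cluster nodes, which keeps spatially nearby chunks on the same node.

    ```python
    # Morton (Z-order) keys for 3-d chunk coordinates, used to shard data.
    def morton3(x, y, z, bits=10):
        """Interleave the bits of (x, y, z) into one 3*bits-bit key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (3 * i)
            key |= ((y >> i) & 1) << (3 * i + 1)
            key |= ((z >> i) & 1) << (3 * i + 2)
        return key

    def node_for_chunk(x, y, z, n_nodes=8, bits=10):
        """Assign chunks to nodes by contiguous Morton-key ranges."""
        return morton3(x, y, z, bits) * n_nodes >> (3 * bits)

    print(node_for_chunk(0, 0, 0), node_for_chunk(1, 0, 0),
          node_for_chunk(1023, 1023, 1023))
    ```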

  16. ARMAGH OBSERVATORY – HISTORIC BUILDING INFORMATION MODELLING FOR VIRTUAL LEARNING IN BUILDING CONSERVATION

    Directory of Open Access Journals (Sweden)

    M. Murphy

    2017-08-01

    In this paper the recording and design for a Virtual Reality Immersive Model of Armagh Observatory is presented, which will replicate the historic buildings and landscape with distant meridian markers and position of its principal historic instruments within a model of the night sky showing the position of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated within 3D space to demonstrate how the position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines and virtual learning platforms and concepts.

  17. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects which have been focused on model-based theory building, specifically the construction of a mi...

  18. Scalability and efficiency of genetic algorithms for geometrical applications

    NARCIS (Netherlands)

    Dijk, van S.F.; Thierens, D.; Berg, de M.; Schoenauer, M.

    2000-01-01

    We study the scalability and efficiency of a GA that we developed earlier to solve the practical cartographic problem of labeling a map with point features. We argue that the special characteristics of our GA make it fit well with theoretical models predicting the optimal population size.

  19. Building RESTful web services with Go learn how to build powerful RESTful APIs with Golang that scale gracefully

    CERN Document Server

    Yellavula, Naren

    2017-01-01

    REST is an architectural style that tackles the challenges of building scalable web services, and in today's connected world, APIs have taken a central role on the web. APIs provide the fabric through which systems interact, and REST has become synonymous with APIs. The depth, breadth, and ease of use of Go make it a breeze for developers to ...

  20. A review of building information modelling

    Science.gov (United States)

    Wang, Wen; Han, Rui

    2018-05-01

    Building Information Modelling (BIM) is widely seen as a catalyst for innovation and productivity. It is becoming standard for new construction and is the most significant technology changing how we design, build, use and manage buildings. It is a dominant technological trend in the software industry and, although the theoretical groundwork was laid in the previous century, it remains a popular topic in academic research. BIM is discussed in this study, whose results can provide better and more comprehensive choices for building owners, designers, and developers in future.

  1. Scalability of the muscular action in a parametric 3D model of the index finger.

    Science.gov (United States)

    Sancho-Bru, Joaquín L; Vergara, Margarita; Rodríguez-Cervantes, Pablo-Jesús; Giurintano, David J; Pérez-González, Antonio

    2008-01-01

    A method for scaling the muscle action is proposed and used to achieve a 3D inverse dynamic model of the human finger with all its components scalable. This method is based on scaling the physiological cross-sectional area (PCSA) in a Hill muscle model. Different anthropometric parameters and maximal grip force data have been measured, and their correlations have been analyzed and used for scaling the PCSA of each muscle. A linear relationship between the normalized PCSA and the product of the length and breadth of the hand has finally been used for scaling, with a slope of 0.01315 cm(-2), with the length and breadth of the hand expressed in centimeters. The parametric muscle model has been included in a parametric finger model previously developed by the authors, and it has been validated by reproducing the results of an experiment in which subjects from different population groups exerted maximal voluntary forces with their index finger in a controlled posture.
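
    Using the quoted relation directly, a subject-specific scale factor follows from hand length and breadth. In the sketch below the reference PCSA and the specific-tension value are placeholder assumptions (the abstract does not state how the normalization reference or maximal force are defined), so only the scale-factor line reflects the paper.

    ```python
    # Scale a muscle's PCSA from hand dimensions per the abstract's linear relation.
    SLOPE = 0.01315                       # [cm^-2], slope quoted in the abstract

    def pcsa_scale_factor(hand_length_cm, hand_breadth_cm):
        """Normalized PCSA predicted from hand size via the linear relation."""
        return SLOPE * hand_length_cm * hand_breadth_cm

    # Max isometric force in Hill-type models is often PCSA * specific tension;
    # both values below are illustrative assumptions, not the paper's data.
    REF_PCSA_CM2 = 1.7                    # [cm^2], hypothetical reference muscle
    SIGMA = 35.0                          # [N/cm^2], assumed specific tension

    factor = pcsa_scale_factor(hand_length_cm=18.5, hand_breadth_cm=8.5)
    pcsa = REF_PCSA_CM2 * factor
    print(f"scale factor {factor:.3f} -> PCSA {pcsa:.2f} cm^2, "
          f"F_max ~ {pcsa * SIGMA:.0f} N")
    ```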

  2. COMPLEMENTARITY OF HISTORIC BUILDING INFORMATION MODELLING AND GEOGRAPHIC INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    X. Yang

    2016-06-01

    In this paper, we discuss the potential of integrating both semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build the detailed 3D historic model. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and non-architectural information that are necessary for construction and management of buildings. GIS has potential in handling and managing spatial data, especially exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in the BIM and GIS environment, how to build the enriched historic model, and why to construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  3. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.
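
    One classic structured reduction, shown here purely as an illustration on an invented 3-zone toy model rather than any of the paper's surveyed methods, is modal truncation: keep only the slow eigenmodes of a linear thermal model x' = Ax + Bu.

    ```python
    # Modal truncation of a toy linear thermal network (3 states -> 2 modes).
    import numpy as np

    A = np.array([[-2.0,  1.0,  0.0],
                  [ 1.0, -3.0,  1.0],
                  [ 0.0,  1.0, -1.5]])        # [1/h] invented zone couplings
    B = np.array([[1.0], [0.0], [0.5]])       # heat input distribution

    w, V = np.linalg.eig(A)                   # eigenmodes of the thermal network
    order = np.argsort(-w.real)               # slowest (least negative) first
    keep = order[:2]                          # truncate the fastest mode
    Vr = V[:, keep]
    Ar = np.diag(w[keep].real)                # reduced, decoupled dynamics
    Br = np.linalg.pinv(Vr) @ B               # project the input onto kept modes

    print("kept eigenvalues:", np.round(w[keep].real, 3))
    print("reduced A:\n", np.round(Ar, 3))
    print("reduced B:\n", np.round(Br.real, 3))
    ```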

  4. Modelling the heat dynamics of buildings using stochastic

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae; Madsen, Henrik

    2000-01-01

    This paper describes the continuous time modelling of the heat dynamics of a building. The considered building is a residential-like test house divided into two test rooms with water-based central heating. Each test room is divided into thermal zones in order to describe both short and long term variations. Besides modelling the heat transfer between thermal zones, attention is put on modelling the heat input from radiators and solar radiation. The applied modelling procedure is based on collected building performance data and statistical methods. The statistical methods are used in parameter...

  5. Detailed Modeling and Evaluation of a Scalable Multilevel Checkpointing System

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronevetsky, Greg [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); de Supinski, Bronis R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-01

    High-performance computing (HPC) systems are growing more powerful by utilizing more components. As the system mean time before failure correspondingly drops, applications must checkpoint frequently to make progress. But, at scale, the cost of checkpointing becomes prohibitive. A solution to this problem is multilevel checkpointing, which employs multiple types of checkpoints in a single run: lightweight checkpoints can handle the most common failure modes, while more expensive checkpoints can handle severe failures. We designed a multilevel checkpointing library, the Scalable Checkpoint/Restart (SCR) library, that writes lightweight checkpoints to node-local storage in addition to the parallel file system. We present probabilistic Markov models of SCR's performance. We show that on future large-scale systems, SCR can lead to a gain in machine efficiency of up to 35 percent, and reduce the load on the parallel file system by a factor of two. In addition, we predict that checkpoint scavenging, or only writing checkpoints to the parallel file system on application termination, can reduce the load on the parallel file system by 20× on today's systems and still maintain high application efficiency.
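
    The trade-off that SCR's Markov models capture can be previewed with a much simpler classic stand-in, Young's approximation for the checkpoint interval; all numbers below are hypothetical, and the model is far coarser than the paper's.

    ```python
    # Checkpoint interval vs. efficiency under a simple failure/overhead model.
    import math

    def optimal_interval(checkpoint_cost_s, mtbf_s):
        """Young's approximation for the checkpoint interval."""
        return math.sqrt(2 * checkpoint_cost_s * mtbf_s)

    def efficiency(interval_s, checkpoint_cost_s, mtbf_s):
        """Useful-work fraction: 1 - checkpoint overhead - expected rework."""
        overhead = checkpoint_cost_s / interval_s
        rework = (interval_s / 2 + checkpoint_cost_s) / mtbf_s
        return max(0.0, 1.0 - overhead - rework)

    mtbf = 4 * 3600.0                      # 4 h system MTBF, hypothetical
    for cost in (30.0, 300.0):             # fast node-local vs slow parallel FS
        t = optimal_interval(cost, mtbf)
        print(f"ckpt cost {cost:5.0f}s -> interval {t/60:5.1f} min, "
              f"efficiency {efficiency(t, cost, mtbf):.1%}")
    ```

    The two costs illustrate why multilevel schemes help: cheap node-local checkpoints sustain noticeably higher efficiency than checkpoints written only to the parallel file system.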

  6. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. Thischapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems.A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  7. Model building and new particles

    International Nuclear Information System (INIS)

    Frampton, P.H.

    1992-01-01

    After an outline of the Standard Model, indications of new physics beyond it are discussed. The nature of model building is illustrated by three examples which predict, respectively, new particles called the axigluon, sarks and the aspon. (author). 11 refs

  8. Scalable approximate policies for Markov decision process models of hospital elective admissions.

    Science.gov (United States)

    Zhu, George; Lizotte, Dan; Hoey, Jesse

    2014-05-01

    To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that given an initial start state, generate an action on-demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow for the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate solutions that are near-optimal in about 100s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
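
    A sample-based planner of this kind needs only a generative simulator step(state, action) -> (next_state, reward): it estimates each action's value at the current state by Monte Carlo rollouts and acts greedily, never enumerating the full model. The admissions dynamics below are a trivial invented stand-in, not the paper's model or its planner variant.

    ```python
    # On-demand action selection by Monte Carlo rollouts over a simulator.
    import random

    ACTIONS = ["admit", "defer"]

    def rollout_value(step, state, action, depth=20, n_rollouts=50):
        total = 0.0
        for _ in range(n_rollouts):
            s, r = step(state, action)
            value = r
            for _ in range(depth - 1):
                s, r = step(s, random.choice(ACTIONS))   # random default policy
                value += r
            total += value
        return total / n_rollouts

    def plan(step, state):
        return max(ACTIONS, key=lambda a: rollout_value(step, state, a))

    def toy_step(beds_free, action):
        """Invented dynamics: reward admissions, penalize resource overload."""
        if action == "admit":
            beds_free -= 1
        beds_free += random.random() < 0.3               # a discharge frees a bed
        reward = (1.0 if action == "admit" else 0.0) + (-5.0 if beds_free < 0 else 0.0)
        return beds_free, reward

    print("chosen action:", plan(toy_step, state=2))
    ```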

  9. Analysis of a Residential Building Energy Consumption Demand Model

    Directory of Open Access Journals (Sweden)

    Meng Liu

    2011-03-01

    In order to estimate the energy consumption demand of residential buildings, this paper first discusses the status and shortcomings of current domestic energy consumption models. It then proposes and develops a residential building energy consumption demand model based on a back propagation (BP) neural network model. After that, taking residential buildings in Chongqing (P.R. China) as an example, 16 energy consumption indicators are introduced as characteristics of the residential buildings in Chongqing. The index system of the BP neural network prediction model is established and the multi-factorial BP neural network prediction model of Chongqing residential building energy consumption is developed in the C# language, based on the SQL Server 2005 platform. The results obtained by applying the model in Chongqing are in good agreement with actual ones. In addition, the model provides corresponding approximate data by taking into account potential energy structure adjustments and relevant energy policy regulations.
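
    A minimal sketch of this model class, with the 16 indicators reduced to three invented placeholders, synthetic data, and scikit-learn standing in for the paper's C#/SQL Server implementation:

    ```python
    # Back-propagation neural network mapping building indicators to consumption.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)
    n = 400
    X = np.column_stack([
        rng.uniform(50, 200, n),     # e.g. floor area per household [m^2] (placeholder)
        rng.integers(1, 6, n),       # e.g. number of occupants (placeholder)
        rng.uniform(0, 1, n),        # e.g. air-conditioning ownership rate (placeholder)
    ])
    y = 40 * X[:, 0] + 600 * X[:, 1] + 2000 * X[:, 2] + rng.normal(0, 300, n)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                         random_state=0).fit(X, y)
    print("predicted annual consumption [kWh]:", model.predict([[120, 3, 0.8]])[0])
    ```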

  10. Investigating the Role of Biogeochemical Processes in the Northern High Latitudes on Global Climate Feedbacks Using an Efficient Scalable Earth System Model

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Atul K. [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2016-09-14

    The overall objective of this DOE-funded project is to combine scientific and computational challenges in climate modeling by expanding our understanding of the biogeophysical-biogeochemical processes and their interactions in the northern high latitudes (NHLs) using an earth system modeling (ESM) approach, and by adopting an adaptive parallel runtime system in an ESM to achieve efficient and scalable climate simulations through improved load balancing algorithms.

  11. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  12. PKI Scalability Issues

    OpenAIRE

    Slagell, Adam J; Bonilla, Rafael

    2004-01-01

    This report surveys different PKI technologies such as PKIX and SPKI and the issues of PKI that affect scalability. Much focus is spent on certificate revocation methodologies and status verification systems such as CRLs, Delta-CRLs, CRS, Certificate Revocation Trees, Windowed Certificate Revocation, OCSP, SCVP and DVCS.

  13. Silicon nanophotonics for scalable quantum coherent feedback networks

    International Nuclear Information System (INIS)

    Sarovar, Mohan; Brif, Constantin; Soh, Daniel B.S.; Cox, Jonathan; DeRose, Christopher T.; Camacho, Ryan; Davids, Paul

    2016-01-01

    The emergence of coherent quantum feedback control (CQFC) as a new paradigm for precise manipulation of dynamics of complex quantum systems has led to the development of efficient theoretical modeling and simulation tools and opened avenues for new practical implementations. This work explores the applicability of the integrated silicon photonics platform for implementing scalable CQFC networks. If proven successful, on-chip implementations of these networks would provide scalable and efficient nanophotonic components for autonomous quantum information processing devices and ultra-low-power optical processing systems at telecommunications wavelengths. We analyze the strengths of the silicon photonics platform for CQFC applications and identify the key challenges to both the theoretical formalism and experimental implementations. In particular, we determine specific extensions to the theoretical CQFC framework (which was originally developed with bulk-optics implementations in mind), required to make it fully applicable to modeling of linear and nonlinear integrated optics networks. We also report the results of a preliminary experiment that studied the performance of an in situ controllable silicon nanophotonic network of two coupled cavities and analyze the properties of this device using the CQFC formalism. (orig.)

  14. Silicon nanophotonics for scalable quantum coherent feedback networks

    Energy Technology Data Exchange (ETDEWEB)

    Sarovar, Mohan; Brif, Constantin [Sandia National Laboratories, Livermore, CA (United States); Soh, Daniel B.S. [Sandia National Laboratories, Livermore, CA (United States); Stanford University, Edward L. Ginzton Laboratory, Stanford, CA (United States); Cox, Jonathan; DeRose, Christopher T.; Camacho, Ryan; Davids, Paul [Sandia National Laboratories, Albuquerque, NM (United States)

    2016-12-15

    The emergence of coherent quantum feedback control (CQFC) as a new paradigm for precise manipulation of dynamics of complex quantum systems has led to the development of efficient theoretical modeling and simulation tools and opened avenues for new practical implementations. This work explores the applicability of the integrated silicon photonics platform for implementing scalable CQFC networks. If proven successful, on-chip implementations of these networks would provide scalable and efficient nanophotonic components for autonomous quantum information processing devices and ultra-low-power optical processing systems at telecommunications wavelengths. We analyze the strengths of the silicon photonics platform for CQFC applications and identify the key challenges to both the theoretical formalism and experimental implementations. In particular, we determine specific extensions to the theoretical CQFC framework (which was originally developed with bulk-optics implementations in mind), required to make it fully applicable to modeling of linear and nonlinear integrated optics networks. We also report the results of a preliminary experiment that studied the performance of an in situ controllable silicon nanophotonic network of two coupled cavities and analyze the properties of this device using the CQFC formalism. (orig.)

  15. On Scalability and Replicability of Smart Grid Projects—A Case Study

    Directory of Open Access Journals (Sweden)

    Lukas Sigrist

    2016-03-01

    This paper studies the scalability and replicability of smart grid projects. Currently, most smart grid projects are still in the R&D or demonstration phases. The full roll-out of the tested solutions requires a suitable degree of scalability and replicability to prevent project demonstrators from remaining local experimental exercises. Scalability and replicability are the preliminary requisites to perform scaling-up and replication successfully; therefore, scalability and replicability enable, or at least reduce barriers for, the growth and reuse of the results of project demonstrators. The paper proposes factors that influence and condition a project's scalability and replicability. These factors involve technical, economic, regulatory and stakeholder acceptance related aspects, and they describe requirements for scalability and replicability. In order to assess and evaluate the identified scalability and replicability factors, data has been collected from European and national smart grid projects by means of a survey, reflecting the projects' view and results. The evaluation of the factors allows quantifying the status quo of on-going projects with respect to scalability and replicability, i.e., it provides feedback on the extent to which projects take these factors into account and on whether the projects' results and solutions are actually scalable and replicable.

  16. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    Science.gov (United States)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of energy waste is unnecessary heat flow through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving the thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions for existing buildings directly translate into energy cost savings in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation is to understand the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and to fill these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and

  17. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  18. Heterotic model building: 16 special manifolds

    International Nuclear Information System (INIS)

    He, Yang-Hui; Lee, Seung-Joo; Lukas, Andre; Sun, Chuang

    2014-01-01

    We study heterotic model building on 16 specific Calabi-Yau manifolds constructed as hypersurfaces in toric four-folds. These 16 manifolds are the only ones among the more than half a billion manifolds in the Kreuzer-Skarke list with a non-trivial first fundamental group. We classify the line bundle models on these manifolds, both for SU(5) and SO(10) GUTs, which lead to consistent supersymmetric string vacua and have three chiral families. A total of about 29000 models is found, most of them corresponding to SO(10) GUTs. These models constitute a starting point for detailed heterotic model building on Calabi-Yau manifolds in the Kreuzer-Skarke list. The data for these models can be downloaded from http://www-thphys.physics.ox.ac.uk/projects/CalabiYau/toricdata/index.html.

  19. A scalable implementation of RI-SCF on parallel computers

    International Nuclear Information System (INIS)

    Fruechtl, H.A.; Kendall, R.A.; Harrison, R.J.

    1996-01-01

    In order to avoid the integral bottleneck of conventional SCF calculations, the Resolution of the Identity (RI) method is used to obtain an approximate solution to the Hartree-Fock equations. In this approximation only three-center integrals are needed to build the Fock matrix. It has been implemented as part of the NWChem package of portable and scalable ab initio programs for parallel computers. Utilizing the V-approximation, both the Coulomb and exchange contributions to the Fock matrix can be calculated from a transformed set of three-center integrals which have to be precalculated and stored. A distributed in-core method as well as a disk-based implementation have been programmed. Details of the implementation as well as the parallel programming tools used are described. We also give results and timings from benchmark calculations.
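    A minimal sketch of the RI idea for the Coulomb term, with random arrays standing in for real three-center integrals and a symmetric positive-definite metric; the V-approximation in the record also covers the exchange contribution, which is omitted here:

      import numpy as np

      # RI/density-fitting Coulomb build: J = (mn|P) [V^-1]_{PQ} (Q|ls) D_{ls}.
      # Toy sizes and random tensors; real integrals come from the basis sets.
      nbf, naux = 10, 30
      rng = np.random.default_rng(0)

      eri3 = rng.normal(size=(nbf, nbf, naux))           # (mu nu | P)
      eri3 = 0.5 * (eri3 + eri3.transpose(1, 0, 2))      # symmetric in mu, nu
      V = rng.normal(size=(naux, naux))
      V = V @ V.T + naux * np.eye(naux)                  # SPD metric (P|Q)
      D = rng.normal(size=(nbf, nbf))
      D = 0.5 * (D + D.T)                                # toy density matrix

      gamma = np.einsum("lsq,ls->q", eri3, D)            # contract with density
      coeff = np.linalg.solve(V, gamma)                  # apply metric inverse
      J = np.einsum("mnp,p->mn", eri3, coeff)            # assemble Coulomb matrix
      print(J.shape)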

  20. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding by mainly exploiting the source statistics at the decoder, based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran...... codec provides frame by frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% bits compared to JPEG LS and H.264 Intra frame lossless coding and do so as a scalable-to-lossless coding....

  1. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  2. WIFIRE: A Scalable Data-Driven Monitoring, Dynamic Prediction and Resilience Cyberinfrastructure for Wildfires

    Science.gov (United States)

    Altintas, I.; Block, J.; Braun, H.; de Callafon, R. A.; Gollner, M. J.; Smarr, L.; Trouve, A.

    2013-12-01

    Recent studies confirm that climate change will cause wildfires to increase in frequency and severity in the coming decades, especially for California and much of the North American West. The most critical sustainability issue in the midst of these ever-changing dynamics is how to achieve a new social-ecological equilibrium of this fire ecology. Wildfire wind speeds and directions change in an instant, and first responders can only be effective when they take action as quickly as the conditions change. To deliver the information needed for sustainable policy and management in this dynamically changing fire regime, we must capture these details to understand the environmental processes. We are building an end-to-end cyberinfrastructure (CI), called WIFIRE, for real-time and data-driven simulation, prediction and visualization of wildfire behavior. The WIFIRE integrated CI system supports social-ecological resilience to the changing fire ecology regime in the face of urban dynamics and climate change. Networked observations, e.g., heterogeneous satellite data and real-time remote sensor data, are integrated with computational techniques in signal processing, visualization, modeling and data assimilation to provide a scalable, technological, and educational solution to monitor weather patterns and predict a wildfire's Rate of Spread. Our collaborative WIFIRE team of scientists, engineers, technologists, government policy managers, private industry, and firefighters designs and implements CI pathways that enable joint innovation for wildfire management. Scientific workflows are used as an integrative distributed programming model and simplify the implementation of engineering modules for data-driven simulation, prediction and visualization while allowing integration with large-scale computing facilities. WIFIRE will be scalable to users with different skill levels via specialized web interfaces and user-specified alerts for environmental events broadcasted to receivers before

  3. Climate change and high-resolution whole-building numerical modelling

    NARCIS (Netherlands)

    Blocken, B.J.E.; Briggen, P.M.; Schellen, H.L.; Hensen, J.L.M.

    2010-01-01

    This paper briefly discusses the need of high-resolution whole-building numerical modelling in the context of climate change. High-resolution whole-building numerical modelling can be used for detailed analysis of the potential consequences of climate change on buildings and to evaluate remedial

  4. Working group report: Flavor physics and model building

    Indian Academy of Sciences (India)

    This is the report of the flavor physics and model building working group at ... those in model building have been primarily devoted to neutrino physics. ...

  5. Application of 6D Building Information Model (6D BIM) for Business-storage Building in Slovenia

    Science.gov (United States)

    Pučko, Zoran; Vincek, Dražen; Štrukelj, Andrej; Šuman, Nataša

    2017-10-01

    The aim of this paper is to present an application of 6D building information modelling (6D BIM) to a real business-storage building in Slovenia. First, the features of building maintenance in general are described according to current Slovenian legislation, and a general principle of BIM is given. After that, the step-by-step activities for modelling 6D BIM are presented, from the element list for maintenance, through the determination of element lifetimes and service measures, cost analysis and time analysis, to 6D BIM modelling. The presented 6D BIM model is designed in a unique way: the cost analysis is performed as a 5D BIM model with data linked to BIM construction project management software (Vico Office) and integrated with the 3D BIM model, whereas the time analysis, as a 4D BIM model, is carried out with non-linked data in Excel (without connection to the 3D BIM model). The paper is intended to serve as a guide for building owners to prepare 6D BIM and to provide insight into the relevant dynamic information about intervals and costs for the execution of maintenance works over the whole building lifecycle.

  6. Scalable Transactions for Web Applications in the Cloud

    NARCIS (Netherlands)

    Zhou, W.; Pierre, G.E.O.; Chi, C.-H.

    2009-01-01

    Cloud Computing platforms provide scalability and high availability properties for web applications but they sacrifice data consistency at the same time. However, many applications cannot afford any data inconsistency. We present a scalable transaction manager for NoSQL cloud database services to

  7. NPTool: Towards Scalability and Reliability of Business Process Management

    Science.gov (United States)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently, an important challenge in business process management is to provide scalability and reliability of business process executions at the same time. This difficulty becomes more accentuated when the execution control involves countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data-flow management.

  8. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    OpenAIRE

    Lee, Jin Kook; Kim, Mi Jeong

    2014-01-01

    This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelli...

  9. Whole-Building Hygrothermal Modeling in IEA Annex 41

    DEFF Research Database (Denmark)

    Rode, Carsten; Woloszyn, Monika

    2007-01-01

    The IEA Annex 41 project runs from 2004–2007, coming to conclusion just before the Thermal Performance of the Exterior Envelopes of Whole Buildings X conference. The Annex 41 project and its Subtask 1 do not aim to produce one state-of-the-art hygrothermal simulation model for whole buildings, but rather...... the modeling, free scientific contributions have been invited from specific fields that need the most attention in order to better accomplish the integral building simulations. This paper will give an overview of the advances in whole-building hygrothermal simulation that have been accomplished and presented...

  10. Requirements for Scalable Access Control and Security Management Architectures

    National Research Council Canada - National Science Library

    Keromytis, Angelos D; Smith, Jonathan M

    2005-01-01

    Maximizing local autonomy has led to a scalable Internet. Scalability and the capacity for distributed control have unfortunately not extended well to resource access control policies and mechanisms...

  11. Experimental and analytical studies of a deeply embedded reactor building model considering soil-building interaction. Pt. 1

    International Nuclear Information System (INIS)

    Tanaka, H.; Ohta, T.; Uchiyama, S.

    1979-01-01

    The purpose of this paper is to describe the dynamic characteristics of a deeply embedded reactor building model derived from experimental and analytical studies which consider soil-building interaction behaviour. The model building is made of reinforced concrete. It has two stories above ground level and a basement, resting on a sandy gravel layer at a depth of 3 meters. The backfill around the building was made to ground level. The model building is simplified and reduced to about one-fifteenth (1/15) of the prototype. It has a bearing wall system for the basement and the first story, and a frame system for the second. (orig.)

  12. Scalable cloud without dedicated storage

    Science.gov (United States)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without separate dedicated storage. The dedicated storage is replaced by distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases the utilization of cluster resources as well as improving the fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with relatively low initial and maintenance costs. The solution is built on the basis of open-source components like OpenStack, CEPH, etc.

  13. Building energy modeling for green architecture and intelligent dashboard applications

    Science.gov (United States)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied a passive architectural technique, the roof solar chimney, for reducing the cooling load in homes. Three models of the chimney were created: a zonal building energy model, a computational fluid dynamics model, and a numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it reliably for building simulation. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high- and low-efficiency constructions and three ventilation strategies. The results showed that in four US climates, the roof solar chimney yields significant cooling load energy savings of up to 90%. After developing this new method for the small-scale representation of a passive architecture technique in BEM, the study expanded its scope to address a fundamental issue in modeling - the implementation of the uncertainty from, and improvement of, occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy-efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration into the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the
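    The record names the technique (neural networks acting on real-time building automation system data) but not the model itself; a minimal sketch of a savings predictor over synthetic BAS-style features (the feature names and the target function are invented for illustration) could be:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in for BAS snapshots:
      # [outdoor temp C, solar W/m2, occupancy fraction, supply-air setpoint C]
      rng = np.random.default_rng(1)
      X = rng.uniform([-5, 0, 0, 12], [35, 900, 1, 18], size=(2000, 4))
      # Hypothetical target: kWh saved by one behavior scenario, as a made-up
      # nonlinear function of the state plus noise.
      y = (0.4 * np.maximum(X[:, 0] - 22, 0) * X[:, 2]
           + 0.001 * X[:, 1] + rng.normal(0, 0.1, 2000))

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(32, 32),
                                         max_iter=2000, random_state=0))
      model.fit(X, y)
      print("predicted savings (kWh):", model.predict(X[:3]))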

  14. A Learning Framework for Control-Oriented Modeling of Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.; Vishnu, Abhinav; Vrabie, Draguna L.

    2018-01-18

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data-driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control-oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data-driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework that can drive several use cases related to building energy management.
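    The paper's exact architecture is not given in the record, so the following is only a generic sketch of a control-oriented recurrent model of this kind (hypothetical dimensions, random stand-in data, PyTorch assumed as the framework):

      import torch
      import torch.nn as nn

      # Given a window of past inputs (e.g., weather, setpoints, occupancy),
      # predict the next energy-consumption value.
      class EnergyRNN(nn.Module):
          def __init__(self, n_features=4, hidden=32):
              super().__init__()
              self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
              self.head = nn.Linear(hidden, 1)

          def forward(self, x):               # x: (batch, seq_len, n_features)
              out, _ = self.lstm(x)
              return self.head(out[:, -1])    # predict from the last time step

      model = EnergyRNN()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      x = torch.randn(64, 24, 4)              # 64 windows of 24 time steps
      y = torch.randn(64, 1)                  # toy next-step consumption
      for _ in range(10):                     # a few illustrative updates
          opt.zero_grad()
          loss = nn.functional.mse_loss(model(x), y)
          loss.backward()
          opt.step()
      print(float(loss))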

  15. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin; Wonka, Peter

    2016-01-01

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available

  16. Modeling urban building energy use: A review of modeling approaches and procedures

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wenliang; Zhou, Yuyu; Cetin, Kristen; Eom, Jiyong; Wang, Yu; Chen, Gang; Zhang, Xuesong

    2017-12-01

    With rapid urbanization and economic development, the world has been experiencing an unprecedented increase in energy consumption and greenhouse gas (GHG) emissions. While reducing energy consumption and GHG emissions is a common interest shared by major developed and developing countries, actions to enable these global reductions are generally implemented at the city scale. This is because baseline information from individual cities plays an important role in identifying economical options for improving building energy efficiency and reducing GHG emissions. Numerous approaches have been proposed for modeling urban building energy use in the past decades. This paper aims to provide an up-to-date review of the broad categories of energy models for urban buildings and describes the basic workflow of physics-based, bottom-up models and their applications in simulating urban-scale building energy use. Because there are significant differences across models with varied potential for application, strengths and weaknesses of the reviewed models are also presented. This is followed by a discussion of challenging issues associated with model preparation and calibration.

  17. Salvus: A scalable software suite for full-waveform modelling & inversion

    Science.gov (United States)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; Fichtner, A.

    2017-12-01

    Full-waveform inversion (FWI), whether at the lab, exploration, or planetary scale, requires the cooperation of five principal components. (1) The geometry of the domain needs to be properly discretized and an initial guess of the model parameters must be projected onto it; (2) Large volumes of recorded waveform data must be collected, organized, and processed; (3) Synthetic waveform data must be efficiently and accurately computed through complex domains; (4) Suitable misfit functions and optimization techniques must be used to relate discrepancies in data space to perturbations in the model; and (5) Some form of workflow management must be employed to schedule and run (1) - (4) in the correct order. Each one of these components can represent a formidable technical challenge which redirects energy from the true task at hand: using FWI to extract new information about some underlying continuum. In this presentation we give an overview of the current status of the Salvus software suite, which was introduced to address the challenges listed above. Specifically, we touch on (1) salvus_mesher, which eases the discretization of complex Earth models into hexahedral meshes; (2) salvus_seismo, which integrates with LASIF and ObsPy to streamline the processing and preparation of seismic data; (3) salvus_wave, a high-performance and scalable spectral-element solver capable of simulating waveforms through general unstructured 2- and 3-D domains, and (4) salvus_opt, an optimization toolbox specifically designed for full-waveform inverse problems. Tying everything together, we also discuss (5) salvus_flow: a workflow package designed to orchestrate and manage the rest of the suite. It is our hope that these developments represent a step towards the automation of large-scale seismic waveform inversion, while also lowering the barrier of entry for new applications. We include several examples of Salvus' use in (extra-) planetary seismology, non-destructive testing, and medical

  18. Conscientiousness at the workplace: Applying mixture IRT to investigate scalability and predictive validity

    NARCIS (Netherlands)

    Egberink, I.J.L.; Meijer, R.R.; Veldkamp, Bernard P.

    2010-01-01

    Mixture item response theory (IRT) models have been used to assess multidimensionality of the construct being measured and to detect different response styles for different groups. In this study a mixture version of the graded response model was applied to investigate scalability and predictive

  19. Conscientiousness in the workplace : Applying mixture IRT to investigate scalability and predictive validity

    NARCIS (Netherlands)

    Egberink, I.J.L.; Meijer, R.R.; Veldkamp, B.P.

    Mixture item response theory (IRT) models have been used to assess multidimensionality of the construct being measured and to detect different response styles for different groups. In this study a mixture version of the graded response model was applied to investigate scalability and predictive

  20. Economic aspects and models for building codes

    DEFF Research Database (Denmark)

    Bonke, Jens; Pedersen, Dan Ove; Johnsen, Kjeld

    It is the purpose of this bulletin to present an economic model for estimating the consequence of new or changed building codes. The object is to allow comparative analysis in order to improve the basis for decisions in this field. The model is applied in a case study.

  1. Dynamic analysis of reactor containment building using axisymmetric finite element model

    International Nuclear Information System (INIS)

    Thakkar, S.K.; Dubey, R.N.

    1989-01-01

    The structural safety of the nuclear reactor building during an earthquake is of great importance in view of the possibility of radiation hazards. The rational evaluation of forces and displacements in various portions of the structure and foundation during strong ground motion is most important for the safe performance and economic design of the reactor building. The accuracy of the results of dynamic analysis naturally depends on the type of mathematical model employed. Three types of mathematical models are employed for the dynamic analysis of reactor buildings: the beam model, the axisymmetric finite element model and the three-dimensional model. In this paper emphasis is laid on the axisymmetric model. This model of the containment building is considered a refinement over the conventional beam model of the structure. A nuclear reactor building on a rocky foundation is considered herein; the foundation-structure interaction is relatively small in this condition. The objective of the paper is to highlight the significance of modelling the non-axisymmetric portions of the building, such as the reactor internals, by an equivalent axisymmetric body, on the structural response of the building

  2. THE EFFECT OF BUILDING FAÇADE MODEL ON LIGHT DISTRIBUTION (CASE STUDY: MENARA PHINISI BUILDING OF UNM)

    Directory of Open Access Journals (Sweden)

    Nurul Jamala

    2017-12-01

    Full Text Available Global warming issues influence the temperature of the earth's surface and have an impact on energy consumption, especially in buildings. Utilization of daylight is one of the factors that need to be considered in order to minimize the energy consumed for artificial lighting. This study analyzed the distribution of light in the Menara Phinisi building of Makassar State University, using a quantitative method that describes simulation data from the Autodesk Ecotect program. The research objective was to determine the effect of the building facade model on the illumination values inside the building. The study concluded that the distribution of light on the building decreases by 3.16% (236 lux) when the facade is used compared with when it is not, and that the horizontal and diagonal facade models differ by 2.5%. The design analysis of the building serves as a guide for analyzing the influence of the building facade model so that energy efficient buildings can be created.

  3. GPU-based Scalable Volumetric Reconstruction for Multi-view Stereo

    Energy Technology Data Exchange (ETDEWEB)

    Kim, H; Duchaineau, M; Max, N

    2011-09-21

    We present a new scalable volumetric reconstruction algorithm for multi-view stereo using a graphics processing unit (GPU). It is an effectively parallelized GPU algorithm that simultaneously uses a large number of GPU threads, each of which performs voxel carving, in order to integrate depth maps with images from multiple views. Each depth map, triangulated from pair-wise semi-dense correspondences, represents a view-dependent surface of the scene. This algorithm also provides scalability for large-scale scene reconstruction in a high resolution voxel grid by utilizing streaming and parallel computation. The output is a photo-realistic 3D scene model in a volumetric or point-based representation. We demonstrate the effectiveness and the speed of our algorithm with a synthetic scene and real urban/outdoor scenes. Our method can also be integrated with existing multi-view stereo algorithms such as PMVS2 to fill holes or gaps in textureless regions.
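    A much-simplified sketch of depth-map carving, using orthographic views along the coordinate axes instead of calibrated cameras and a synthetic ball as the scene; everything a view can see past its depth surface is carved away as free space:

      import numpy as np

      N = 32
      idx = np.indices((N, N, N))
      truth = ((idx - N / 2) ** 2).sum(0) <= (N / 3) ** 2   # synthetic solid

      def carve(volume, occupied, axis):
          # Observer on the high-index side, looking along -axis. The depth
          # surface is the highest occupied index per line of sight; voxels
          # above it are visible free space and get carved.
          surf = np.where(
              occupied.any(axis=axis),
              N - 1 - np.argmax(np.flip(occupied, axis=axis), axis=axis),
              -1)                                 # -1: nothing along this ray
          coords = np.indices((N, N, N))[axis]
          return volume & (coords <= np.expand_dims(surf, axis))

      vol = np.ones((N, N, N), dtype=bool)        # start from a full grid
      for ax in range(3):
          vol = carve(vol, truth, ax)             # depth maps rendered from truth
      print("true voxels:", truth.sum(), "carved estimate:", vol.sum())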

  4. SWAP-Assembler: scalable and efficient genome assembly towards thousands of cores.

    Science.gov (United States)

    Meng, Jintao; Wang, Bingqiang; Wei, Yanjie; Feng, Shengzhong; Balaji, Pavan

    2014-01-01

    There is a widening gap between the throughput of massively parallel sequencing machines and the ability to analyze these sequencing data. Traditional assembly methods, which require long execution times and large amounts of memory on a single workstation, limit their use on these massive data. This paper presents a highly scalable assembler named SWAP-Assembler for processing massive sequencing data using thousands of cores, where SWAP is an acronym for Small World Asynchronous Parallel model. In the paper, a mathematical description of the multi-step bi-directed graph (MSG) is provided to resolve the computational interdependence on merging edges, and a highly scalable computational framework for SWAP is developed to automatically perform the parallel computation of all operations. Graph cleaning and contig extension are also included for generating contigs with high quality. Experimental results show that SWAP-Assembler scales up to 2048 cores on the Yanhuang dataset using only 26 minutes, which is better than several other parallel assemblers, such as ABySS, Ray, and PASHA. Results also show that SWAP-Assembler can generate high-quality contigs with good N50 size and low error rate; in particular, it generated the longest N50 contig sizes for the Fish and Yanhuang datasets. In this paper, we presented a highly scalable and efficient genome assembly software, SWAP-Assembler. Compared with several other assemblers, it showed very good performance in terms of scalability and contig quality. This software is available at: https://sourceforge.net/projects/swapassembler.

  5. Model for Refurbishment of Heritage Buildings

    DEFF Research Database (Denmark)

    Rasmussen, Torben Valdbjørn

    2014-01-01

    the Heritage Agency, the Danish Working Environment Authority and the owner as a team cooperated in identifying feasible refurbishments. In this case, the focus centered on restoring and identifying potential energy savings and deciding on energy upgrading measures for the listed complex. The refurbished...... with the requirements for the use of the building. The model focuses on the cooperation and dialogue between authorities and owners, who refurbish heritage buildings. The developed model was used for the refurbishment of the listed complex, Fæstningens Materialgård. Fæstningens Materialgård is a case study where...

  6. BIM-Enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in an IFC's schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.

  7. Modular Universal Scalable Ion-trap Quantum Computer

    Science.gov (United States)

    2016-06-02

    The main goal of the original MUSIQC proposal was to construct and demonstrate a modular and universally expandable ion-trap quantum computer. (Final report covering 1 Aug 2010 - 31 Jan 2016; distribution unlimited. Keywords: ion trap quantum computation, scalable modular architectures.)

  8. Scalable and Media Aware Adaptive Video Streaming over Wireless Networks

    Directory of Open Access Journals (Sweden)

    Béatrice Pesquet-Popescu

    2008-07-01

    Full Text Available This paper proposes an advanced video streaming system based on scalable video coding in order to optimize resource utilization in wireless networks with retransmission mechanisms at the radio protocol level. The key component of this system is a packet scheduling algorithm which operates on the different substreams of a main scalable video stream and which is implemented in a so-called media-aware network element. The type of transport channel concerned is a dedicated channel subject to parameter variations (bitrate, loss rate) over the long run. Moreover, we propose a combined scalability approach in which common temporal and SNR scalability features can be used jointly with a partitioning of the image into regions of interest. Simulation results show that our approach provides substantial quality gain compared to classical packet transmission methods, and they demonstrate how ROI coding combined with SNR scalability further improves the visual quality.
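    A toy version of layered, media-aware scheduling (an illustrative greedy policy, not the paper's algorithm) makes the idea concrete: under a byte budget, base-layer packets always go first and enhancement layers are dropped from the top down:

      # Each packet belongs to a frame and a scalability layer (0 = base).
      packets = [
          {"frame": 0, "layer": 0, "bytes": 900},
          {"frame": 0, "layer": 1, "bytes": 600},
          {"frame": 0, "layer": 2, "bytes": 600},
          {"frame": 1, "layer": 0, "bytes": 900},
          {"frame": 1, "layer": 1, "bytes": 600},
      ]

      def schedule(packets, budget_bytes):
          sent, used = [], 0
          # Lower layer = higher priority; keep frame order within a layer.
          for p in sorted(packets, key=lambda p: (p["layer"], p["frame"])):
              if used + p["bytes"] <= budget_bytes:
                  sent.append(p)
                  used += p["bytes"]
          return sorted(sent, key=lambda p: (p["frame"], p["layer"]))

      for p in schedule(packets, budget_bytes=2500):
          print(p)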

  9. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  10. Myria: Scalable Analytics as a Service

    Science.gov (United States)

    Howe, B.; Halperin, D.; Whitaker, A.

    2014-12-01

    At the UW eScience Institute, we're working to empower non-experts, especially in the sciences, to write and use data-parallel algorithms. To this end, we are building Myria, a web-based platform for scalable analytics and data-parallel programming. Myria's internal model of computation is the relational algebra extended with iteration, such that every program is inherently data-parallel, just as every query in a database is inherently data-parallel. But unlike databases, iteration is a first-class concept, allowing us to express machine learning tasks, graph traversal tasks, and more. Programs can be expressed in a number of languages and can be executed on a number of execution environments, but we emphasize a particular language called MyriaL that supports both imperative and declarative styles and a particular execution engine called MyriaX that uses an in-memory column-oriented representation and asynchronous iteration. We deliver Myria over the web as a service, providing an editor, performance analysis tools, and catalog browsing features in a single environment. We find that this web-based "delivery vector" is critical in reaching non-experts: they are insulated from the irrelevant technical work associated with installation, configuration, and resource management. The MyriaX backend, one of several execution runtimes we support, is a main-memory, column-oriented, RDBMS-on-the-worker system that supports cyclic data flows as a first-class citizen and has been shown to outperform competitive systems on 100-machine cluster sizes. I will describe the Myria system, give a demo, and present some new results in large-scale oceanographic microbiology.
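    As a small illustration of that computation model (relational algebra plus first-class iteration), here is semi-naive evaluation of transitive closure in plain Python; this is the textbook pattern, not MyriaL or Myria code:

      edges = {("a", "b"), ("b", "c"), ("c", "d")}

      reach = set(edges)          # facts known so far
      delta = set(edges)          # facts discovered in the last round
      while delta:                # iterate until a fixpoint is reached
          # join only the newly discovered pairs against the edge relation
          new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
          delta = new - reach     # keep only genuinely new facts
          reach |= delta
      print(sorted(reach))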

  11. A Heat Dynamic Model for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2015-01-01

    This article presents a heat dynamic model for prediction of the indoor temperature in an office building. The model has been used in several flexible load applications, where the indoor temperature is allowed to vary around a given reference to provide power system services by shifting the heating...... of the building in time. This way the thermal mass of the building can be used to absorb energy from renewable energy sources when available and postpone heating in periods with a lack of renewable energy generation. The model is used in a model predictive controller to ensure the residential comfort over a given
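    The record does not give the model equations; a minimal sketch of a first-order grey-box heat dynamic model of this general kind, with illustrative (not identified) parameters, is:

      import numpy as np

      # 1R1C zone model: dT/dt = (T_out - T)/(R*C) + phi/C
      R = 5.0e-3     # thermal resistance to ambient [K/W]
      C = 1.0e7      # thermal capacitance of the zone [J/K]
      dt = 600.0     # time step [s]

      T = 21.0                                  # indoor temperature [C]
      T_out = 5.0                               # ambient temperature [C]
      heater = np.r_[np.full(72, 3000.0), np.zeros(72)]   # 12 h on, 12 h off [W]

      history = []
      for phi in heater:                        # forward-Euler integration
          T += dt * ((T_out - T) / (R * C) + phi / C)
          history.append(T)
      print(f"min {min(history):.1f} C, max {max(history):.1f} C")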

  12. Scalability Modeling for Optimal Provisioning of Data Centers in Telenor: A better balance between under- and over-provisioning

    OpenAIRE

    Rygg, Knut Helge

    2012-01-01

    The scalability of an information system describes the relationship between system capacity and system size. This report studies the scalability of Microsoft Lync Server 2010 in order to provide guidelines for provisioning hardware resources. Optimal provisioning is required to reduce both deployment and operational costs, while keeping an acceptable service quality. All Lync servers in the test setup are virtualized using VMware ESXi 5.0 and the system runs on a Cisco Unified Computing System...

  13. Decentralized control of a scalable photovoltaic (PV)-battery hybrid power system

    International Nuclear Information System (INIS)

    Kim, Myungchin; Bae, Sungwoo

    2017-01-01

    Highlights: • This paper introduces the design and control of a PV-battery hybrid power system. • Reliable and scalable operation of hybrid power systems is achieved. • System and power control are performed without a centralized controller. • Reliability and scalability characteristics are studied in a quantitative manner. • The system control performance is verified using realistic solar irradiation data. - Abstract: This paper presents the design and control of a sustainable standalone photovoltaic (PV)-battery hybrid power system (HPS). The research aims to develop an approach that contributes to an increased level of reliability and scalability for an HPS. To achieve such objectives, a PV-battery HPS with a passively connected battery was studied. A quantitative hardware reliability analysis was performed to assess the effect of the energy storage configuration on the overall system reliability. Instead of requiring feedback control information on load power through a centralized supervisory controller, the power flow in the proposed HPS is managed by a decentralized control approach that takes advantage of the system architecture. Reliable system operation of an HPS is achieved through the proposed control approach without requiring a separate supervisory controller. Furthermore, performance degradation of the energy storage can be prevented by selecting the controller gains such that the charge rate does not exceed operational requirements. The performance of the proposed system architecture with the control strategy was verified by simulation results using realistic irradiance data and a battery model that accounts for temperature effects. With the objective of supporting scalable operation, details of how the proposed design could be applied were also studied so that the HPS could satisfy potential system growth requirements. Such scalability was verified by simulating various cases that involve connection and disconnection of sources and loads. The

  14. Scalable Fabrication of Integrated Nanophotonic Circuits on Arrays of Thin Single Crystal Diamond Membrane Windows.

    Science.gov (United States)

    Piracha, Afaq H; Rath, Patrik; Ganesan, Kumaravelu; Kühn, Stefan; Pernice, Wolfram H P; Prawer, Steven

    2016-05-11

    Diamond has emerged as a promising platform for nanophotonic, optical, and quantum technologies. High-quality, single crystalline substrates of acceptable size are a prerequisite to meet the demanding requirements on low-level impurities and low absorption loss when targeting large photonic circuits. Here, we describe a scalable fabrication method for single crystal diamond membrane windows that achieves three major goals with one fabrication method: providing high quality diamond, as confirmed by Raman spectroscopy; achieving homogeneously thin membranes, enabled by ion implantation; and providing compatibility with established planar fabrication via lithography and vertical etching. On such suspended diamond membranes we demonstrate a suite of photonic components as building blocks for nanophotonic circuits. Monolithic grating couplers are used to efficiently couple light between photonic circuits and optical fibers. In waveguide coupled optical ring resonators, we find loaded quality factors up to 66 000 at a wavelength of 1560 nm, corresponding to propagation loss below 7.2 dB/cm. Our approach holds promise for the scalable implementation of future diamond quantum photonic technologies and all-diamond photonic metrology tools.
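    The quoted figures can be cross-checked with the standard conversion between loaded quality factor and propagation loss; the group index is not stated in the record, so a value must be assumed (n_g ≈ 2.7, plausible for a diamond waveguide, reproduces the quoted number):

      \[
      \alpha = \frac{2\pi\, n_g}{Q_{\mathrm{loaded}}\,\lambda},
      \qquad
      \alpha_{\mathrm{dB}} = 10 \log_{10}(e)\,\alpha .
      \]
      With \(Q_{\mathrm{loaded}} = 66\,000\), \(\lambda = 1560\,\mathrm{nm}\) and an assumed \(n_g \approx 2.7\):
      \[
      \alpha = \frac{2\pi \times 2.7}{66\,000 \times 1.56 \times 10^{-6}\,\mathrm{m}}
      \approx 165\ \mathrm{m^{-1}} \approx 7.2\ \mathrm{dB/cm}.
      \]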

  15. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    Science.gov (United States)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires are hazardous events that can lead to disaster and massive destruction, and their management and handling have always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of building fire scenes were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) were implemented, and the relationships between the elements were discussed as well. Finally, with the theory and framework of VGE, the building fire scene system was designed across the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  16. Automatic Generation of 3D Building Models with Multiple Roofs

    Institute of Scientific and Technical Information of China (English)

    Kenichi Sugihara; Yoshitugu Hayashi

    2008-01-01

    Based on building footprints (building polygons) on digital maps, we propose a GIS- and CG-integrated system that automatically generates 3D building models with multiple roofs. Most building polygons' edges meet at right angles (orthogonal polygons). The integrated system partitions orthogonal building polygons into a set of rectangles and places rectangular roofs and box-shaped building bodies on these rectangles. In order to partition an orthogonal polygon, we proposed a useful polygon expression for deciding from which vertex a dividing line is drawn. In this paper, we propose a new scheme for partitioning building polygons and show the process of creating 3D roof models.
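    A simplified sketch of the partitioning step, rasterizing the footprint to a grid instead of using the paper's vertex-based polygon expression; maximal horizontal runs with identical column extents are merged vertically into rectangles:

      # 'X' marks cells inside the (orthogonal) building footprint.
      footprint = [
          "XXXXXX....",
          "XXXXXX....",
          "XXXXXXXXXX",
          "XXXXXXXXXX",
          "....XXXXXX",
      ]
      grid = [[c == "X" for c in row] for row in footprint]

      def runs(row):
          # Maximal horizontal runs of occupied cells as (start, end) pairs.
          out, start = [], None
          for x, filled in enumerate(row + [False]):
              if filled and start is None:
                  start = x
              elif not filled and start is not None:
                  out.append((start, x))
                  start = None
          return out

      rects, open_ = [], {}                 # open_: run -> row where it began
      for y, row in enumerate(grid + [[False] * len(grid[0])]):
          current = set(runs(list(row)))
          for r in set(open_) - current:    # run ended: emit its rectangle
              rects.append((r[0], open_.pop(r), r[1], y))  # half-open (x0,y0,x1,y1)
          for r in current - set(open_):    # a new run begins
              open_[r] = y
      print(rects)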

  17. Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

    Science.gov (United States)

    Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth

    2017-12-01

    The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GP modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
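    The covariance family the method exploits is easy to write down and to check naively; the sketch below builds the dense matrix and evaluates the Gaussian log-likelihood in O(N^3), whereas the paper's contribution is computing the same quantity in O(N) (term parameters here are illustrative):

      import numpy as np

      # k(tau) = sum_j a_j * exp(-c_j * tau) * cos(d_j * tau)
      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 10, 200))        # unevenly spaced times
      yerr = 0.1

      terms = [(1.0, 0.5, 2.0), (0.3, 0.1, 0.0)]  # (a_j, c_j, d_j)
      tau = np.abs(t[:, None] - t[None, :])
      K = sum(a * np.exp(-c * tau) * np.cos(d * tau) for a, c, d in terms)
      K += yerr ** 2 * np.eye(len(t))             # measurement noise

      Lk = np.linalg.cholesky(K)
      y = Lk @ rng.normal(size=len(t))            # draw a sample path

      # log N(y | 0, K) = -(y^T K^-1 y + log|K| + N log 2pi) / 2
      alpha = np.linalg.solve(Lk.T, np.linalg.solve(Lk, y))
      logdet = 2 * np.log(np.diag(Lk)).sum()
      ll = -0.5 * (y @ alpha + logdet + len(t) * np.log(2 * np.pi))
      print(f"log-likelihood: {ll:.2f}")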

  18. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Don W.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  19. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Science.gov (United States)

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and dynamically attach a number of output processors. The user application defines jobs in a plug-in-like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. The models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878

  20. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Directory of Open Access Journals (Sweden)

    Tewari Susanta

    2012-10-01

    Full Text Available Abstract Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and dynamically attach a number of output processors. The user application defines jobs in a plug-in-like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. The models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach.
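    One classic exact calculation in this domain is the probability of a sample configuration under the infinite-alleles model, given by the Ewens sampling formula; the sketch below is illustrative Python, not the framework's API:

      from math import factorial

      def ewens_probability(a, theta):
          # a[j] = number of allele types observed exactly j+1 times;
          # theta = scaled mutation rate.
          n = sum((j + 1) * aj for j, aj in enumerate(a))
          rising = 1.0
          for i in range(n):          # theta * (theta+1) * ... * (theta+n-1)
              rising *= theta + i
          p = factorial(n) / rising
          for j, aj in enumerate(a):
              p *= (theta / (j + 1)) ** aj / factorial(aj)
          return p

      # n = 5 genes: three singleton alleles and one allele seen twice.
      print(ewens_probability([3, 1], theta=1.0))   # 1/12 for theta = 1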

  1. Demand Response Technology Readiness Levels for Energy Management in Blocks of Buildings

    Directory of Open Access Journals (Sweden)

    Tracey Crosbie

    2018-01-01

    Full Text Available Fossil fuels deliver most of the flexibility in contemporary electricity systems. The pressing need to reduce CO2 emissions requires new methods to provide this flexibility. Demand response (DR) offers consumers a significant role in the delivery of flexibility by reducing or shifting their electricity usage during periods of stress or constraint. Blocks of buildings offer more flexibility in the timing and use of energy than single buildings; however, a lack of relevant scalable ICT tools hampers DR in blocks of buildings. To ameliorate this problem, a current innovation project called “Demand Response in Blocks of Buildings” (DR-BoB: www.dr-bob.eu) has integrated existing technologies into a scalable cloud-based solution for DR in blocks of buildings. The degree to which the DR-BoB energy management solution can increase the ability of any given site to participate in DR depends upon its current energy systems, i.e., the energy metering, the telemetry and control technologies in building management systems, and the existence/capacity of local power generation and storage plants. To encourage the owners and managers of blocks of buildings to participate in DR, a method of assessing and validating the technological readiness to participate in DR energy management solutions at any given site is required. This paper describes the DR-BoB energy management solution and outlines what we have called the demand response technology readiness levels (DRTRLs) for the implementation of such a solution in blocks of buildings.

  2. Scalable Nonlinear Solvers for Fully Implicit Coupled Nuclear Fuel Modeling. Final Report

    International Nuclear Information System (INIS)

    Cai, Xiao-Chuan; Yang, Chao; Pernice, Michael

    2014-01-01

    The focus of the project is on the development and customization of highly scalable, domain decomposition based preconditioning techniques for the numerical solution of nonlinear, coupled systems of partial differential equations (PDEs) arising from nuclear fuel simulations. These high-order PDEs represent multiple interacting physical fields (for example, heat conduction, oxygen transport, solid deformation), each modeled by a certain type of Cahn-Hilliard and/or Allen-Cahn equation. Most existing approaches involve a careful splitting of the fields and the use of field-by-field iterations to obtain a solution of the coupled problem. Such approaches have many advantages, such as ease of implementation since only single-field solvers are needed, but also exhibit disadvantages. For example, certain nonlinear interactions between the fields may not be fully captured, and for unsteady problems, stable time integration schemes are difficult to design. In addition, when implemented on large-scale parallel computers, the sequential nature of the field-by-field iterations substantially reduces the parallel efficiency. To overcome the disadvantages, fully coupled approaches have been investigated in order to obtain full physics simulations.
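    As a small, single-field illustration of the fully coupled, fully implicit approach (far from the paper's large-scale multi-field setting), a backward-Euler time step of 1-D Allen-Cahn solved with Newton's method looks like this; the paper's concern is preconditioning such Newton systems scalably:

      import numpy as np

      # u_t = eps^2 u_xx - (u^3 - u), homogeneous Neumann boundaries.
      n, eps, dt = 100, 0.05, 0.1
      x = np.linspace(0, 1, n)
      u = np.tanh((x - 0.5) / (np.sqrt(2) * eps))   # initial interface

      h = x[1] - x[0]
      L = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
           + np.diag(np.ones(n - 1), -1)) / h ** 2
      L[0, :2] = [-2 / h ** 2, 2 / h ** 2]          # Neumann via ghost node
      L[-1, -2:] = [2 / h ** 2, -2 / h ** 2]

      for step in range(10):                        # implicit time stepping
          u_old = u.copy()
          for it in range(20):                      # Newton iterations
              F = u - u_old - dt * (eps ** 2 * (L @ u) - (u ** 3 - u))
              J = np.eye(n) - dt * (eps ** 2 * L - np.diag(3 * u ** 2 - 1))
              du = np.linalg.solve(J, -F)
              u += du
              if np.linalg.norm(du) < 1e-10:
                  break
      print(f"u stays in [{u.min():.3f}, {u.max():.3f}]")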

  3. A Unified Building Model for 3D Urban GIS

    Directory of Open Access Journals (Sweden)

    Ihab Hijazi

    2012-07-01

    Full Text Available Several tasks in urban and architectural design are today undertaken in a geospatial context. Building Information Models (BIM) and geospatial technologies offer 3D data models that provide information about buildings and the surrounding environment. The Industry Foundation Classes (IFC) and CityGML are today the two most prominent semantic models for the representation of BIM and geospatial models, respectively. CityGML has emerged as a standard for modeling city models, while IFC has been developed as a reference model for building objects and sites. Current CAD and geospatial software provide tools that allow the conversion of information from one format to the other. These tools are, however, fairly limited in their capabilities, often resulting in data and information losses in the transformations. This paper describes a new approach for data integration based on a unified building model (UBM) which encapsulates both the CityGML and IFC models, thus avoiding translations between the models and loss of information. To build the UBM, all classes and related concepts were initially collected from both models, overlapping concepts were merged, new objects were created to ensure the capturing of both indoor and outdoor objects, and finally, spatial relationships between the objects were redefined. Unified Modeling Language (UML) notations were used for representing its objects and the relationships between them. Two use-case scenarios, both set in a hospital (“evacuation” and “allocating spaces for patient wards”), were developed to validate and test the proposed UBM data model. Based on these two scenarios, four validation queries were defined in order to validate the appropriateness of the proposed unified building model. It has been validated, through the case scenarios and four queries, that the UBM being developed is able to integrate CityGML data as well as IFC data in an apparently seamless way. Constraints and enrichment functions are

  4. Scuba: scalable kernel-based gene prioritization.

    Science.gov (United States)

    Zampieri, Guido; Tran, Dinh Van; Donini, Michele; Navarin, Nicolò; Aiolli, Fabio; Sperduti, Alessandro; Valle, Giorgio

    2018-01-25

    The uncovering of genes linked to human diseases is a pressing challenge in molecular biology and precision medicine. This task is often hindered by the large number of candidate genes and by the heterogeneity of the available information. Computational methods for the prioritization of candidate genes can help to cope with these problems. In particular, kernel-based methods are a powerful resource for the integration of heterogeneous biological knowledge; however, their practical implementation is often precluded by their limited scalability. We propose Scuba, a scalable kernel-based method for gene prioritization. It implements a novel multiple kernel learning approach, based on a semi-supervised perspective and on the optimization of the margin distribution. Scuba is optimized to cope with strongly unbalanced settings where known disease genes are few and large-scale predictions are required. Importantly, it is able to efficiently deal both with a large amount of candidate genes and with an arbitrary number of data sources. As a direct consequence of scalability, Scuba also integrates a new efficient strategy to select optimal kernel parameters for each data source. We performed cross-validation experiments and simulated a realistic usage setting, showing that Scuba outperforms a wide range of state-of-the-art methods. Scuba achieves state-of-the-art performance and has enhanced scalability compared to existing kernel-based approaches for genomic data. This method can be useful for prioritizing candidate genes, particularly when their number is large or when the input data is highly heterogeneous. The code is freely available at https://github.com/gzampieri/Scuba.

  5. Scalable shared-memory multiprocessing

    CERN Document Server

    Lenoski, Daniel E

    1995-01-01

    Dr. Lenoski and Dr. Weber have experience with leading-edge research and practical issues involved in implementing large-scale parallel systems. They were key contributors to the architecture and design of the DASH multiprocessor. Currently, they are involved with commercializing scalable shared-memory technology.

  6. Alternatives to quintessence model building

    International Nuclear Information System (INIS)

    Avelino, P.P.; Beca, L.M.G.; Pinto, P.; Carvalho, J.P.M. de; Martins, C.J.A.P.

    2003-01-01

    We discuss the issue of toy model building for the dark energy component of the universe. Specifically, we consider two generic toy models recently proposed as alternatives to quintessence models, respectively known as Cardassian expansion and the Chaplygin gas. We show that the former is entirely equivalent to a class of quintessence models. We determine the observational constraints on the latter, coming from recent supernovae results and from the shape of the matter power spectrum. As expected, these restrict the model to a behavior that closely matches that of a standard cosmological constant Λ
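    For reference, the Chaplygin gas mentioned in the record is defined by the equation of state below; combining it with energy conservation gives a density that interpolates between dust-like and cosmological-constant-like behaviour (these are standard results, added here for orientation):

      \[
      p = -\frac{A}{\rho}, \qquad
      \dot{\rho} + 3\frac{\dot{a}}{a}\left(\rho + p\right) = 0
      \;\Longrightarrow\;
      \rho(a) = \sqrt{A + \frac{B}{a^{6}}},
      \]
      so that \(\rho \sim a^{-3}\) at early times (matter-like) and \(\rho \to \sqrt{A}\) at late times (\(\Lambda\)-like).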

  7. Impacts of building information modeling on facility maintenance management

    Energy Technology Data Exchange (ETDEWEB)

    Ahamed, Shafee; Neelamkavil, Joseph; Canas, Roberto [Centre for Computer-assisted Construction Technologies, National Research Council of Canada, London, Ontario (Canada)

    2010-07-01

    Building information modeling (BIM) is a digital representation of the physical and functional properties of a building; it has been used by construction professionals for a long time, and stakeholders are now using it in different aspects of the building lifecycle. This paper presents how BIM impacts the construction industry and how it can be used for facility maintenance management. The maintenance and operations of buildings are in most cases still managed through the use of drawings and spreadsheets, although the life cycle costs of a building are significantly higher than the initial investment costs; thus, the use of BIM could help achieve higher efficiency and, with it, important benefits. This study is part of an ongoing research project, the nD modeling project, which aims at predicting building energy consumption with better accuracy.

  8. Updating of a dynamic finite element model from the Hualien scale model reactor building

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Lebailly, P.

    1996-08-01

    The forces occurring at the soil-structure interface of a building generally have a large influence on the way the building reacts to an earthquake. One may be tempted to characterise these forces more accurately by updating a model of the structure. However, this procedure requires an updating method suitable for dissipative models, since significant damping can be observed at the soil-structure interface of buildings. Such a method is presented here. It is based on the minimization of a mechanical energy built from the difference between eigendata calculated by the model and eigendata obtained from experimental tests on the real structure. An experimental validation of this method is then proposed on a model of the HUALIEN scale-model reactor building. This scale model, built on the HUALIEN site in TAIWAN, is devoted to the study of soil-structure interaction. The updating concerned the soil impedances, modelled by a layer of springs and viscous dampers attached to the building foundation. A good agreement was found between the eigenmodes and dynamic responses calculated by the updated model and the corresponding experimental data. (authors). 12 refs., 3 figs., 4 tabs

  9. Integration of an intelligent systems behavior simulator and a scalable soldier-machine interface

    Science.gov (United States)

    Johnson, Tony; Manteuffel, Chris; Brewster, Benjamin; Tierney, Terry

    2007-04-01

    As the Army's Future Combat Systems (FCS) introduce emerging technologies and new force structures to the battlefield, soldiers will increasingly face new challenges in workload management. The next generation warfighter will be responsible for effectively managing robotic assets in addition to performing other missions. Studies of future battlefield operational scenarios involving the use of automation, including the specification of existing and proposed technologies, will provide significant insight into potential problem areas regarding soldier workload. The US Army Tank Automotive Research, Development, and Engineering Center (TARDEC) is currently executing an Army technology objective program to analyze and evaluate the effect of automated technologies and their associated control devices with respect to soldier workload. The Human-Robotic Interface (HRI) Intelligent Systems Behavior Simulator (ISBS) is a human performance measurement simulation system that allows modelers to develop constructive simulations of military scenarios with various deployments of interface technologies in order to evaluate operator effectiveness. One such interface is TARDEC's Scalable Soldier-Machine Interface (SMI). The scalable SMI provides a configurable machine interface application that is capable of adapting to several hardware platforms by recognizing the physical space limitations of the display device. This paper describes the integration of the ISBS and Scalable SMI applications, which will ultimately benefit both systems. The ISBS will be able to use the Scalable SMI to visualize the behaviors of virtual soldiers performing HRI tasks, such as route planning, and the scalable SMI will benefit from stimuli provided by the ISBS simulation environment. The paper describes the background of each system and details of the system integration approach.

  10. Working towards a scalable model of problem-based learning instruction in undergraduate engineering education

    Science.gov (United States)

    Mantri, Archana

    2014-05-01

    The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing curriculum around a set of open-ended problems (OEPs). The detailed statistical analysis of the data collected to measure the effects of traditional and PBL instruction for three courses in Electronics and Communication Engineering, namely Analog Electronics, Digital Electronics and Pulse, Digital & Switching Circuits, is presented here. It measures the effects of pedagogy, gender and cognitive styles on the knowledge, skill and attitude of the students. The study was conducted twice, with content designed around the same set of OEPs but with two different trained facilitators for all three courses. The repeatability of results for the effects of the independent parameters on the dependent parameters is studied and inferences are drawn.

  11. Physical and JIT Model Based Hybrid Modeling Approach for Building Thermal Load Prediction

    Science.gov (United States)

    Iino, Yutaka; Murai, Masahiko; Murayama, Dai; Motoyama, Ichiro

    Energy conservation in the building sector is one of the key issues from an environmental point of view, as it is in the industrial, transportation and residential sectors. HVAC (Heating, Ventilating and Air Conditioning) systems account for about half of the total energy consumption in a building. In order to realize energy conservation in HVAC systems, a thermal load prediction model for the building is required. This paper proposes a hybrid modeling approach with physical and Just-in-Time (JIT) models for building thermal load prediction. The proposed method has the following features and benefits: (1) it is applicable to cases in which past operation data for load prediction model learning are scarce, (2) it has a self-checking function, which always supervises whether the data-driven load prediction and the physics-based one are consistent, so that it can detect when something is wrong in the load prediction procedure, and (3) it has the ability to adjust the load prediction in real time against sudden changes of model parameters and environmental conditions. The proposed method is evaluated with real operation data of an existing building, and the improvement in load prediction performance is illustrated.
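
    As a rough illustration of the hybrid idea, assuming a toy steady-state physical model and a nearest-neighbour regressor playing the role of the JIT model, one can predict with the physical model and let the data-driven part correct its residuals, flagging large disagreements in the spirit of the paper's self-checking function. All names and constants below are invented for the example.

        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        def physical_load(t_out, t_set=22.0, ua_kw_per_k=0.35):
            # Simplified steady-state envelope model: heating load (kW) grows
            # with the indoor-outdoor temperature difference
            return ua_kw_per_k * np.maximum(t_set - t_out, 0.0)

        rng = np.random.default_rng(1)
        t_hist = rng.uniform(-10, 15, size=500)                       # past outdoor temps
        resid = 2.0 * np.sin(t_hist / 4.0) + rng.normal(0, 0.3, 500)  # unmodelled behaviour
        load_hist = physical_load(t_hist) + resid                     # "measured" loads

        # JIT part: a local model of the physical model's residuals
        jit = KNeighborsRegressor(n_neighbors=10).fit(
            t_hist.reshape(-1, 1), load_hist - physical_load(t_hist))

        def hybrid_predict(t_out):
            t = np.atleast_1d(np.asarray(t_out, dtype=float))
            correction = jit.predict(t.reshape(-1, 1))
            # Self-check: flag large disagreement between the two parts
            if np.any(np.abs(correction) > 0.5 * np.abs(physical_load(t)) + 1.0):
                print("warning: data-driven and physical predictions diverge")
            return physical_load(t) + correction

        print(hybrid_predict([-5.0, 10.0]))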

  12. Modeling of HVAC operational faults in building performance simulation

    International Nuclear Information System (INIS)

    Zhang, Rongpeng; Hong, Tianzhen

    2017-01-01

    Highlights: • Discuss significance of capturing operational faults in existing buildings. • Develop a novel feature in EnergyPlus to model operational faults of HVAC systems. • Compare three approaches to fault modeling using EnergyPlus. • A case study demonstrates the use of the fault-modeling feature. • Future developments of new faults are discussed. -- Abstract: Operational faults are common in the heating, ventilating, and air conditioning (HVAC) systems of existing buildings, leading to a decrease in energy efficiency and occupant comfort. Various fault detection and diagnostic methods have been developed to identify and analyze HVAC operational faults at the component or subsystem level. However, current methods lack a holistic approach to predicting the overall impacts of faults at the building level—an approach that adequately addresses the coupling between various operational components, the synchronized effect between simultaneous faults, and the dynamic nature of fault severity. This study introduces the novel development of a fault-modeling feature in EnergyPlus which fills in the knowledge gap left by previous studies. This paper presents the design and implementation of the new feature in EnergyPlus and discusses in detail the fault-modeling challenges faced. The new fault-modeling feature enables EnergyPlus to quantify the impacts of faults on building energy use and occupant comfort, thus supporting decision making on timely fault correction. Including actual building operational faults in energy models also improves the accuracy of the baseline model, which is critical in the measurement and verification of retrofit or commissioning projects. As an example, EnergyPlus version 8.6 was used to investigate the impacts of a number of typical operational faults in an office building across several U.S. climate zones. The results demonstrate that the faults have significant impacts on building energy performance as well as on occupant

  13. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Mustafaraj, Giorgio; Marini, Dashamir; Costa, Andrea; Keane, Marcus

    2014-01-01

    Highlights: • Developing a 3D model relating to building architecture, occupancy and HVAC operation. • Two calibration stages developed, with the final model providing accurate results. • Using an onsite weather station for generating the weather data file in EnergyPlus. • Predicting thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy saving opportunities of 20–27% related to the heat pump were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building area. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, a two-level calibration methodology was applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file for the year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Coefficient of Variation of the Root Mean Squared Error (CV(RMSE)) on an hourly basis for heat pump electricity consumption varied within the following ranges: MBE (hourly) from −5.6% to 7.5% and CV(RMSE) (hourly) from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further possibilities of energy savings in the water-to-water heat pump supplying the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis
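
    The two calibration criteria are standard and easy to reproduce. Under their common ASHRAE Guideline 14 style definitions (a simplification; the authors' exact variant may differ), MBE and CV(RMSE) can be computed as follows; the sample readings are invented.

        import numpy as np

        def mbe_percent(measured, simulated):
            # Mean Bias Error: sum of errors normalised by the sum of measured values
            m, s = np.asarray(measured, float), np.asarray(simulated, float)
            return 100.0 * (m - s).sum() / m.sum()

        def cv_rmse_percent(measured, simulated):
            # Coefficient of Variation of the RMSE, normalised by the measured mean
            m, s = np.asarray(measured, float), np.asarray(simulated, float)
            rmse = np.sqrt(((m - s) ** 2).mean())
            return 100.0 * rmse / m.mean()

        measured  = [12.1, 13.4, 11.8, 14.0]   # hypothetical hourly kWh readings
        simulated = [11.5, 13.9, 12.2, 13.1]
        print(mbe_percent(measured, simulated), cv_rmse_percent(measured, simulated))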

  14. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  15. A fuzzy-based model to implement the global safety buildings index assessment for agri-food buildings

    Directory of Open Access Journals (Sweden)

    Francesco Barreca

    2014-06-01

    The latest EU policies focus on the issue of food safety with a view to ensuring adequate and standard quality levels for the food produced and/or consumed within the EC. To that purpose, the environment where agricultural products are manufactured and processed plays a crucial role in achieving food hygiene. As a consequence, it is of the utmost importance to adopt proper building solutions which meet health and hygiene requirements, as well as to use suitable tools to measure the levels achieved. Similarly, it is necessary to verify and evaluate the level of workers’ safety and welfare in their working environment. Workers’ safety has not only an ethical and social value but also an economic implication, since possible accidents or environmental stressors are the major causes of lower efficiency and productivity of workers. Therefore, it is fundamental to design suitable models of analysis that allow assessing buildings as a whole, taking into account both health and hygiene safety as well as workers’ safety and welfare. Hence, this paper proposes an assessment model that, based on an established study protocol and on the application of a fuzzy logic procedure, allows assessing the global safety level of an agri-food building by means of a global safety buildings index. The model presented here is original in that it uses fuzzy logic to evaluate the performance of both the technical and environmental systems of an agri-food building in terms of the health and hygiene safety of the manufacturing process as well as of workers’ health and safety. The result of the assessment is expressed through a triangular fuzzy membership function, which allows carrying out comparative analyses of different buildings. A specific procedure was developed to apply the model to a case study, which tested its operational simplicity and the validity of its results. The proposed model allows obtaining a synthetic and global value of the building performance of
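
    A triangular membership function and a weighted aggregation of sub-indices are simple to state in code. The sketch below is only a schematic reading of the approach: the sub-indices, weights and the parameters of the “safe” fuzzy set are invented, not the paper's calibrated values.

        import numpy as np

        def triangular(x, a, b, c):
            # Membership of x in a triangular fuzzy set with feet a, c and peak b
            x = np.asarray(x, dtype=float)
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        # Hypothetical sub-indices scored in [0, 10]: hygiene safety and worker safety
        hygiene, worker = 7.2, 5.8
        weights = {"hygiene": 0.6, "worker": 0.4}   # assumed relative importance
        global_index = weights["hygiene"] * hygiene + weights["worker"] * worker

        # Degree to which the building belongs to an illustrative "safe" fuzzy set
        print(global_index, triangular(global_index, 4.0, 7.0, 10.0))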

  16. A scalable healthcare information system based on a service-oriented architecture.

    Science.gov (United States)

    Yang, Tzu-Hsiang; Sun, Yeali S; Lai, Feipei

    2011-06-01

    Many existing healthcare information systems are composed of a number of heterogeneous systems and face the important issue of system scalability. This paper first describes the comprehensive healthcare information systems used in National Taiwan University Hospital (NTUH) and then presents a service-oriented architecture (SOA)-based healthcare information system (HIS) built on the HL7 service standard. The proposed architecture focuses on system scalability, in terms of both hardware and software. Moreover, we describe how scalability is implemented in rightsizing, service groups, databases, and hardware scalability. Although SOA-based systems sometimes display poor performance, a performance evaluation of our SOA-based HIS showed that the average response times for the outpatient, inpatient, and emergency HL7Central systems are 0.035, 0.04, and 0.036 s, respectively. The outpatient, inpatient, and emergency WebUI average response times are 0.79, 1.25, and 0.82 s. The scalability of the rightsizing project and our evaluation results show that the proposed SOA HIS provides evidence that SOA can deliver system scalability and sustainability in a highly demanding healthcare information system.

  17. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...
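
    The talk is about Oracle's R stack, but the model-per-entity pattern it describes is language-agnostic. As a loose Python analogue (not the Oracle R Enterprise API), one model can be fitted per entity key with a plain group-by; the data, entity key and model choice below are synthetic and arbitrary.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "customer": rng.integers(0, 100, 5000),   # hypothetical entity key
            "x": rng.normal(size=5000),
        })
        df["y"] = 2.0 * df["x"] + df["customer"] * 0.01 + rng.normal(0, 0.1, 5000)

        # One model per entity: the embarrassingly parallel pattern the talk targets
        models = {
            cid: LinearRegression().fit(g[["x"]], g["y"])
            for cid, g in df.groupby("customer")
        }
        print(models[0].coef_)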

  18. Exploitation of Semantic Building Model in Indoor Navigation Systems

    Science.gov (United States)

    Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min

    2009-04-01

    There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but for large-scale building environments such as hospitals, governmental offices, etc. the end user needs more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution that is based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android's concepts to facilitate the realization of the results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. 1 Introduction: The built environment is a central factor in our daily life, and a big portion of human life is spent inside buildings. Traditionally, buildings are documented using building maps and plans created with IT tools such as computer-aided design (CAD) applications. Documenting the maps electronically is already pervasive, but CAD drawings do not satisfy the requirements regarding effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with the advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve interpretability for building inhabitants and visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication

  19. Activity-Tracking Service for Building Operating Systems

    DEFF Research Database (Denmark)

    Hviid, Jakob; Kjærgaard, Mikkel Baun

    2018-01-01

    Several high-consuming electricity loads in retail stores are currently highly intertwined with human activities. Without knowledge of such activities it is difficult to improve the energy efficiency of the loads' operation for sustainability and cost reasons. The increasing availability of Internet of Things sensors and devices promises to deliver rich data about human activities and control of loads. However, existing proposals for building operating systems that should combine such data and control opportunities do not provide concepts and support for activity data. In this paper we propose an activity-tracking service for building operating systems. The service is designed to consider the security, privacy, integration, extendability and scalability challenges in the building setting. We provide initial findings for testing the system in a proof-of-concept evaluation using a set of common...

  20. IMPROVING TRADITIONAL BUILDING REPAIR CONSTRUCTION QUALITY USING HISTORIC BUILDING INFORMATION MODELING CONCEPT

    Directory of Open Access Journals (Sweden)

    T. C. Wu

    2013-07-01

    In addition to following the repair principles contemplated by heritage experts, a repair construction project should be recorded and measured at all times for monitoring, to ensure the quality of the repair. Conventional construction record methods mostly depend on localized shooting of 2D digital images coupled with text and tables for illustration to achieve the purpose of monitoring. Such methods cannot fully and comprehensively record the 3D spatial relationships of the real world. The construction records of traditional buildings are therefore very important but cannot fulfil their function due to technical limitations. This study applied 3D laser scanning technology to establish a 3D point cloud model of the repair construction of historical buildings. It also broke down the detailed components of the 3D point cloud model using the concept of historic building information modeling, and established the 3D models of the various components and their attribute data in a 3DGIS platform database. In the construction process, according to the time of completion of each stage as laid out in the construction project, this study conducted 3D laser scanning and database establishment for each stage, and applied 3DGIS spatial and attribute information comparison and analysis to assess the differences between the completion stages, thereby improving traditional building repair construction quality. This method helps to improve the quality of repair construction work on tangible cultural assets around the world. The established 3DGIS platform can be used as a powerful tool for subsequent management and maintenance.

  1. Modelling energy demand in the Norwegian building stock

    Energy Technology Data Exchange (ETDEWEB)

    Sartori, Igor

    2008-07-15

    Energy demand in the building stock in Norway represents about 40% of the final energy consumption, of which 22% goes to the residential sector and 18% to the service sector. In Norway there is a strong dependency on electricity for heating purposes, with electricity covering about 80% of the energy demand in buildings. The building sector can play an important role in the achievement of a more sustainable energy system. The work performed in the articles presented in this thesis investigates various aspects related to the energy demand in the building sector, both in singular cases and in the stock as a whole. The work performed in the first part of this thesis on development and survey of case studies provided background knowledge that was then used in the second part, on modelling the entire stock. In the first part, a literature survey of case studies showed that, in a life cycle perspective, the energy used in the operating phase of buildings is the single most important factor. Design of low-energy buildings is then beneficial and should be pursued, even though it implies a somewhat higher embodied energy. A case study was performed on a school building. First, a methodology using a Monte Carlo method in the calibration process was explored. Then, the calibrated model of the school was used to investigate measures for the achievement of high energy efficiency standard through renovation work. In the second part, a model was developed to study the energy demand in a scenario analysis. The results showed the robustness of policies that included conservation measures against the conflicting effects of the other policies. Adopting conservation measures on a large scale showed the potential to reduce both electricity and total energy demand from present day levels while the building stock keeps growing. The results also highlighted the inertia to change of the building stock, due to low activity levels compared to the stock size. It also became clear that a deeper
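
    The Monte Carlo step used in calibrating the school-building model can be sketched generically, under the assumption of a drastically simplified (and entirely invented) stand-in for the building simulation: sample uncertain parameters, simulate, and retain the draws whose output matches the measurement within a tolerance.

        import numpy as np

        def simulate_annual_demand(u_value, ach):
            # Stand-in building model: annual heating demand (kWh/m2) from an
            # envelope U-value (W/m2K) and air-change rate (1/h); purely illustrative
            return 60.0 * u_value + 25.0 * ach

        measured = 95.0                     # hypothetical measured demand, kWh/m2
        rng = np.random.default_rng(42)
        u = rng.uniform(0.3, 1.2, 10000)    # sampled parameter ranges
        ach = rng.uniform(0.2, 1.5, 10000)

        demand = simulate_annual_demand(u, ach)
        accepted = np.abs(demand - measured) < 2.0   # keep runs close to the data
        print("acceptance rate:", accepted.mean())
        print("calibrated U-value:", u[accepted].mean(), "+/-", u[accepted].std())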

  2. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  3. The Creation of Space Vector Models of Buildings From RPAS Photogrammetry Data

    Directory of Open Access Journals (Sweden)

    Trhan Ondrej

    2017-06-01

    The results of Remote Piloted Aircraft System (RPAS) photogrammetry are digital surface models and orthophotos. The main problem of the digital surface models obtained is that buildings are not perpendicular and the shape of roofs is deformed. The task of this paper is to obtain a more accurate digital surface model using building reconstruction. The paper discusses the problem of obtaining and approximating building footprints, reconstructing the final spatial vector digital building model, and modifying the buildings on the digital surface model.

  4. A simple, scalable and low-cost method to generate thermal diagnostics of a domestic building

    International Nuclear Information System (INIS)

    Papafragkou, Anastasios; Ghosh, Siddhartha; James, Patrick A.B.; Rogers, Alex; Bahaj, AbuBakr S.

    2014-01-01

    Highlights: • Our diagnostic method uses a single field measurement from a temperature logger. • Building technical performance and occupant behaviour are addressed simultaneously. • Our algorithm learns a thermal model of a home and diagnoses the heating system. • We propose a novel clustering approach to decouple user behaviour from technical performance. • Our diagnostic confidence is enhanced using a large scale deployment. - Abstract: Traditional approaches to understanding the problem of energy performance in the domestic sector include on-site surveys by energy assessors and the installation of complex home energy monitoring systems. The time and money that need to be invested by the occupants and the form of feedback generated by these approaches often make them unattractive to householders. This paper demonstrates a simple, low-cost method that generates thermal diagnostics for dwellings, measuring only one field dataset: internal temperature over a period of one week. A thermal model, which is essentially a learning algorithm, generates a set of thermal diagnostics about the primary heating system, the occupants’ preferences and the impact of certain interventions, such as lowering the thermostat set-point. A simple clustering approach is also proposed to categorise homes according to their building fabric thermal performance and occupants’ energy efficiency with respect to ventilation. The advantage of this clustering approach is that the occupants receive tailored advice on certain actions that, if taken, will improve the overall thermal performance of a dwelling. Due to the method’s low cost and simplicity it could facilitate government initiatives, such as the ‘Green Deal’ in the UK.
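
    One plausible reading of the learning step, under the assumption of first-order building thermodynamics, is to fit an exponential decay to an overnight free-cooling period and recover the envelope time constant, then cluster homes on such features. The snippet below uses synthetic data and an invented ventilation proxy; it is an illustration, not the authors' algorithm.

        import numpy as np
        from sklearn.cluster import KMeans

        def fit_time_constant(hours, t_in, t_out):
            # Overnight free cooling follows T_in(t) ~ T_out + (T0 - T_out)*exp(-t/tau);
            # a log-linear fit recovers the envelope time constant tau (hours)
            y = np.log(t_in - t_out)
            slope, _ = np.polyfit(hours, y, 1)
            return -1.0 / slope

        hours = np.linspace(0, 8, 33)
        t_out = 2.0
        t_in = t_out + 18.0 * np.exp(-hours / 40.0)   # synthetic well-insulated home
        print("tau [h]:", fit_time_constant(hours, t_in, t_out))

        # Cluster homes by fabric performance (tau) and a ventilation proxy
        features = np.array([[40, 0.3], [12, 0.9], [38, 0.8], [10, 0.2]])  # hypothetical
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
        print(labels)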

  5. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  6. Features of Functioning the Integrated Building Thermal Model

    Directory of Open Access Journals (Sweden)

    Morozov Maxim N.

    2017-01-01

    A model of the building heating system, consisting of the energy source, a distributed automatic control system, the elements of an individual heating unit and the heating system itself, is designed. The Simulink application of the Matlab mathematical package is selected as a platform for the model. The specialized Simscape libraries, in combination with a wide range of Matlab mathematical tools, make it possible to apply the “acausal” modeling concept. Implementing a “physical” representation of the object model improved the accuracy of the models. The principle of operation and the functional features of the thermal model are described. Investigations of the building cooling dynamics were carried out.

  7. Investigation Into Informational Compatibility Of Building Information Modelling And Building Performance Analysis Software Solutions

    OpenAIRE

    Hyun, S.; Marjanovic-Halburd, L.; Raslan, R.

    2015-01-01

    There are significant opportunities for Building Information Modelling (BIM) to address issues related to sustainable and energy efficient building design. While the potential benefits associated with the integration of BIM and BPA (Building Performance Analysis) have been recognised, its specifications and formats remain in their early infancy and often fail to live up to the promise of seamless interoperability at various stages of design process. This paper conducts a case study to investi...

  8. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems become more and more popular because of their potential use for obtaining the point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models with airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results, using different test areas identified in the study area. The proposed approach was tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The results obtained in this research on the study area verified that automatic 3D

  9. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    LiDAR systems become more and more popular because of their potential use for obtaining the point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models with airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results, using different test areas identified in the study area. The proposed approach was tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The results obtained in this research on the study area verified
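
    A toy version of point-based classification with hierarchical rules can convey the idea, though real pipelines such as the one above operate per neighbourhood with far richer geometric tests. The thresholds and the crude planarity proxy below are invented for illustration.

        import numpy as np

        def classify_points(points, ground_z, low=2.0, high=20.0, roof_tol=0.3):
            """Toy hierarchical rules: label points ground / vegetation / building.

            points   : (n, 3) array of x, y, z coordinates
            ground_z : estimated ground elevation per point (e.g. from a DEM)
            """
            height = points[:, 2] - ground_z
            labels = np.full(len(points), "unclassified", dtype=object)
            labels[height < 0.2] = "ground"
            # Rule 1: candidate building points lie within a plausible height band
            cand = (height >= low) & (height <= high)
            # Rule 2: roofs are locally planar; here a crude proxy based on the
            # vertical spread among candidates stands in for real planarity tests
            if cand.any():
                planar = np.abs(height - np.median(height[cand])) < roof_tol
                labels[cand & planar] = "building"
                labels[cand & ~planar] = "vegetation"
            return labels

        pts = np.array([[0, 0, 0.1], [1, 0, 8.0], [2, 0, 8.1], [3, 0, 5.0]])
        print(classify_points(pts, ground_z=np.zeros(4)))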

  10. Progress in D-brane model building

    International Nuclear Information System (INIS)

    Marchesano, F.

    2007-01-01

    The state of the art in D-brane model building is briefly reviewed, focusing on recent achievements in the construction of D=4 N=1 type II string vacua with semi-realistic gauge sectors. Such progress relies on a better understanding of the spectrum of BPS D-branes, the effective field theory obtained from them and the explicit construction of vacua. We first consider D-branes in standard Calabi-Yau compactifications, and then the more involved case of compactifications with fluxes. We discuss how the non-trivial interplay between D-branes and fluxes modifies the previous model-building rules, as well as provides new possibilities to connect string theory to particle physics. (Abstract Copyright [2007], Wiley Periodicals, Inc.)

  11. Scalable robotic biofabrication of tissue spheroids

    International Nuclear Information System (INIS)

    Mehesz, A Nagy; Hajdu, Z; Visconti, R P; Markwald, R R; Mironov, V; Brown, J; Beaver, W; Da Silva, J V L

    2011-01-01

    Development of methods for scalable biofabrication of uniformly sized tissue spheroids is essential for tissue spheroid-based bioprinting of large size tissue and organ constructs. The most recent scalable technique for tissue spheroid fabrication employs a micromolded recessed template prepared in a non-adhesive hydrogel, wherein the cells loaded into the template self-assemble into tissue spheroids due to gravitational force. In this study, we present an improved version of this technique. A new mold was designed to enable generation of 61 microrecessions in each well of a 96-well plate. The microrecessions were seeded with cells using an EpMotion 5070 automated pipetting machine. After 48 h of incubation, tissue spheroids formed at the bottom of each microrecession. To assess the quality of constructs generated using this technology, 600 tissue spheroids made by this method were compared with 600 spheroids generated by the conventional hanging drop method. These analyses showed that tissue spheroids fabricated by the micromolded method are more uniform in diameter. Thus, use of micromolded recessions in a non-adhesive hydrogel, combined with automated cell seeding, is a reliable method for scalable robotic fabrication of uniform-sized tissue spheroids.

  12. Scalable robotic biofabrication of tissue spheroids

    Energy Technology Data Exchange (ETDEWEB)

    Mehesz, A Nagy; Hajdu, Z; Visconti, R P; Markwald, R R; Mironov, V [Advanced Tissue Biofabrication Center, Department of Regenerative Medicine and Cell Biology, Medical University of South Carolina, Charleston, SC (United States); Brown, J [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Beaver, W [York Technical College, Rock Hill, SC (United States); Da Silva, J V L, E-mail: mironovv@musc.edu [Renato Archer Information Technology Center-CTI, Campinas (Brazil)

    2011-06-15

    Development of methods for scalable biofabrication of uniformly sized tissue spheroids is essential for tissue spheroid-based bioprinting of large size tissue and organ constructs. The most recent scalable technique for tissue spheroid fabrication employs a micromolded recessed template prepared in a non-adhesive hydrogel, wherein the cells loaded into the template self-assemble into tissue spheroids due to gravitational force. In this study, we present an improved version of this technique. A new mold was designed to enable generation of 61 microrecessions in each well of a 96-well plate. The microrecessions were seeded with cells using an EpMotion 5070 automated pipetting machine. After 48 h of incubation, tissue spheroids formed at the bottom of each microrecession. To assess the quality of constructs generated using this technology, 600 tissue spheroids made by this method were compared with 600 spheroids generated by the conventional hanging drop method. These analyses showed that tissue spheroids fabricated by the micromolded method are more uniform in diameter. Thus, use of micromolded recessions in a non-adhesive hydrogel, combined with automated cell seeding, is a reliable method for scalable robotic fabrication of uniform-sized tissue spheroids.

  13. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA), as well as more elaborate principles such as wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed on the underlying basic assumptions, such as diffuse fields, high modal overlap, the resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models.
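
    For the simplest SEA energy flow model, the steady-state power balance between two coupled subsystems is a small linear system in the subsystem energies. This is the textbook two-subsystem case, not one of the specific models reviewed in the paper; the loss factors and input power are arbitrary example values.

        import numpy as np

        # Two-subsystem SEA power balance at angular frequency omega:
        #   P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
        #   P2 = omega * ((eta2 + eta21) * E2 - eta12 * E1)
        omega = 2 * np.pi * 1000.0        # 1 kHz band centre
        eta1, eta2 = 0.01, 0.02           # assumed damping loss factors
        eta12, eta21 = 0.005, 0.003       # assumed coupling loss factors
        P = np.array([1.0, 0.0])          # 1 W injected into subsystem 1 only

        A = omega * np.array([[eta1 + eta12, -eta21],
                              [-eta12,       eta2 + eta21]])
        E1, E2 = np.linalg.solve(A, P)    # steady-state subsystem energies
        print(E1, E2, "energy ratio:", E2 / E1)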

  14. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order 10--100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high performance workstation ATM-interface, and (3) a VMEbus ATM-interface. The requirement for traffic shaping in ATM-based event-builders is discussed and an analysis of the performance of several such schemes is presented

  15. Architectures and Applications for Scalable Quantum Information Systems

    Science.gov (United States)

    2007-01-01

    Final Technical Report AFRL-IF-RS-TR-2007-12, January 2007. Grant number: FA8750-01-2-0521.

  16. Extending JPEG-LS for low-complexity scalable video coding

    DEFF Research Database (Denmark)

    Ukhanova, Anna; Sergeev, Anton; Forchhammer, Søren

    2011-01-01

    JPEG-LS, the well-known international standard for lossless and near-lossless image compression, was originally designed for non-scalable applications. In this paper we propose a scalable modification of JPEG-LS and compare it with the leading image and video coding standards JPEG2000 and H.264/SVC...

  17. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  18. Near-Source Modeling Updates: Building Downwash & Near-Road

    Science.gov (United States)

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  19. Sustainability and scalability of a volunteer-based primary care intervention (Health TAPESTRY): a mixed-methods analysis.

    Science.gov (United States)

    Kastner, Monika; Sayal, Radha; Oliver, Doug; Straus, Sharon E; Dolovich, Lisa

    2017-08-01

    Health TAPESTRY team members (53% response rate) completed the NHS sustainability survey. The overall mean sustainability score was 64.6 (range 22.8-96.8). Important opportunities for improving sustainability were better staff involvement and training, clinical leadership engagement, and infrastructure for sustainability. Interviews with 25 participants (response rate 60%) showed that factors influencing the sustainability and scalability of Health TAPESTRY emerged across two dimensions: I) Health TAPESTRY operations (development and implementation activities undertaken by the central team); and II) the Health TAPESTRY intervention (factors specific to the intervention and its elements). Resource capacity appears to be an important factor to consider for Health TAPESTRY operations as it was identified across both sustainability and scalability factors; and perceived lack of interprofessional team and volunteer resource capacity and the need for stakeholder buy-in are important considerations for the Health TAPESTRY intervention. We used these findings to create actionable recommendations to initiate dialogue among Health TAPESTRY team members to improve the intervention. Our study identified sustainability and scalability determinants of the Health TAPESTRY intervention that can be used to optimize its potential for impact. Next steps will involve using findings to inform a guide to facilitate sustainability and scalability of Health TAPESTRY in other jurisdictions considering its adoption. Our findings build on the limited current knowledge of sustainability, and advances KT science related to the sustainability and scalability of KT interventions.

  20. An Extensible Sensing and Control Platform for Building Energy Management

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, Anthony [Carnegie Mellon Univ., Pittsburgh, PA (United States); Berges, Mario [Carnegie Mellon Univ., Pittsburgh, PA (United States); Martin, Christopher [Robert Bosch LLC, Anderson, SC (United States)

    2016-04-03

    The goal of this project is to develop Mortar.io, an open-source BAS platform designed to simplify data collection, archiving, event scheduling and coordination of cross-system interactions. Mortar.io is optimized for (1) robustness to network outages, (2) ease of installation using plug-and-play and (3) scalable support for small to large buildings and campuses.

  1. Traffic and Quality Characterization of the H.264/AVC Scalable Video Coding Extension

    Directory of Open Access Journals (Sweden)

    Geert Van der Auwera

    2008-01-01

    The recent scalable video coding (SVC) extension to the H.264/AVC video coding standard has unprecedented compression efficiency while supporting a wide range of scalability modes, including temporal, spatial, and quality (SNR) scalability, as well as combined spatiotemporal SNR scalability. The traffic characteristics, especially the bit rate variabilities, of the individual layer streams critically affect their network transport. We study the SVC traffic statistics, including the bit rate distortion and bit rate variability distortion, with long CIF resolution video sequences and compare them with the corresponding MPEG-4 Part 2 traffic statistics. We consider (i) temporal scalability with three temporal layers, (ii) spatial scalability with a QCIF base layer and a CIF enhancement layer, as well as (iii) quality scalability modes FGS and MGS. We find that the significant improvement in RD efficiency of SVC is accompanied by substantially higher traffic variabilities as compared to the equivalent MPEG-4 Part 2 streams. We find that separately analyzing the traffic of temporal-scalability-only encodings gives reasonable estimates of the traffic statistics of the temporal layers embedded in combined spatiotemporal encodings and in the base layer of combined FGS-temporal encodings. Overall, we find that SVC achieves significantly higher compression ratios than MPEG-4 Part 2, but produces unprecedented levels of traffic variability, thus presenting new challenges for the network transport of scalable video.

  2. Building Information Modelling in Denmark and Iceland

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Jóhannesson, Elvar Ingi

    2013-01-01

    with BIM is studied. Based on findings from both parts, ideas and recommendations are put forward for the Icelandic building industry about feasible ways of implementing BIM. Findings – Among the results are that the use of BIM is very limited in the Icelandic companies compared to the other Nordic countries. Research limitations/implications – The research is limited to the Nordic countries in Europe, but many recommendations could be relevant to other countries. Practical implications – It is recommended to the Icelandic building authorities to get into cooperation with their Nordic counterparts for making standards and guidelines related to BIM. Public building clients are also encouraged to consider initiating projects based on making simple building models of existing buildings in order to introduce the BIM technology to the industry. Icelandic companies are recommended to start implementing BIM...

  3. Enabling Highly-Scalable Remote Memory Access Programming with MPI-3 One Sided

    Directory of Open Access Journals (Sweden)

    Robert Gerstenberger

    2014-01-01

    Modern interconnects offer remote direct memory access (RDMA) features. Yet, most applications rely on explicit message passing for communication, despite its unwanted overheads. The MPI-3.0 standard defines a programming interface for exploiting RDMA networks directly; however, its scalability and practicability have to be demonstrated in practice. In this work, we develop scalable bufferless protocols that implement the MPI-3.0 specification. Our protocols support scaling to millions of cores with negligible memory consumption while providing highest performance and minimal overheads. To arm programmers, we provide a spectrum of performance models for all critical functions and demonstrate the usability of our library and models with several application studies with up to half a million processes. We show that our design is comparable to, or better than, UPC and Fortran Coarrays in terms of latency, bandwidth and message rate. We also demonstrate application performance improvements with comparable programming complexity.
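
    The paper's protocols are implemented natively, but the MPI-3.0 one-sided interface they build on can be sketched in Python via mpi4py, assuming mpi4py and an MPI runtime are installed: a window exposes local memory, and a Put writes into a remote rank's buffer without the target posting a receive.

        # Run with: mpiexec -n 2 python rma_put.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # Each rank exposes a window of one integer for remote access
        buf = np.zeros(1, dtype='i')
        win = MPI.Win.Create(buf, comm=comm)

        win.Fence()                # open an RMA epoch (collective synchronization)
        if rank == 0:
            src = np.array([42], dtype='i')
            win.Put(src, 1)        # write directly into rank 1's window memory
        win.Fence()                # close the epoch; the Put is now visible

        if rank == 1:
            print("rank 1 received", buf[0])
        win.Free()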

  4. Current State of the Art Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  5. Modeling of Dynamic Responses in Building Insulation

    Directory of Open Access Journals (Sweden)

    Anna Antonyová

    2015-10-01

    In this research a measurement system was developed for monitoring humidity and temperature in the cavity between the wall and the insulating material in the building envelope. This new technology does not disturb the insulating material during testing. The measurement system can also be applied to insulation fixed ten or twenty years earlier and sufficiently reveals the quality of the insulation. A mathematical model is proposed to characterize the dynamic responses in the cavity between the wall and the building insulation as influenced by weather conditions. These dynamic responses are manifested as a delay of both humidity and temperature changes in the cavity when compared with the changes in the ambient surroundings of the building. The process is then modeled through numerical methods and statistical analysis of the experimental data obtained using the new measurement system.
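
    The delayed response described above behaves like a first-order lag. As a minimal sketch (the paper's actual model and coefficients are not reproduced here), a forward-Euler integration of dT_cavity/dt = (T_ambient − T_cavity)/τ shows the damped, delayed tracking of a daily temperature swing.

        import numpy as np

        def cavity_response(t_ambient, tau_hours, dt=1.0, t0=None):
            # First-order lag: dT_cavity/dt = (T_ambient - T_cavity) / tau,
            # integrated with forward Euler at time step dt (hours)
            t_cav = np.empty_like(t_ambient)
            t_cav[0] = t_ambient[0] if t0 is None else t0
            a = dt / tau_hours
            for k in range(1, len(t_ambient)):
                t_cav[k] = t_cav[k - 1] + a * (t_ambient[k - 1] - t_cav[k - 1])
            return t_cav

        hours = np.arange(0, 48, 1.0)
        t_amb = 10 + 8 * np.sin(2 * np.pi * hours / 24)   # daily outdoor swing
        print(cavity_response(t_amb, tau_hours=6.0)[:5])  # lagged, damped trace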

  6. LEGO® bricks as building blocks for centimeter-scale biological environments: the case of plants.

    Science.gov (United States)

    Lind, Kara R; Sizmur, Tom; Benomar, Saida; Miller, Anthony; Cademartiri, Ludovico

    2014-01-01

    LEGO bricks are commercially available interlocking pieces of plastic that are conventionally used as toys. We describe their use to build engineered environments for cm-scale biological systems, in particular plant roots. Specifically, we take advantage of the unique modularity of these building blocks to create inexpensive, transparent, reconfigurable, and highly scalable environments for plant growth in which structural obstacles and chemical gradients can be precisely engineered to mimic soil.

  7. On eliminating synchronous communication in molecular simulations to improve scalability

    Science.gov (United States)

    Straatsma, T. P.; Chavarría-Miranda, Daniel G.

    2013-12-01

    Molecular dynamics simulation, as a complementary tool to experimentation, has become an important methodology for the understanding and design of molecular systems as it provides access to properties that are difficult, impossible or prohibitively expensive to obtain experimentally. Many of the available software packages have been parallelized to take advantage of modern massively concurrent processing resources. The challenge in achieving parallel efficiency is commonly attributed to the fact that molecular dynamics algorithms are communication intensive. This paper illustrates how an appropriately chosen data distribution and asynchronous one-sided communication approach can be used to effectively deal with the data movement within the Global Arrays/ARMCI programming model framework. A new put_notify capability is presented here, allowing the implementation of the molecular dynamics algorithm without any explicit global or local synchronization or global data reduction operations. In addition, this push-data model is shown to very effectively allow hiding data communication behind computation. Rather than data movement or explicit global reductions, the implicit synchronization of the algorithm becomes the primary challenge for scalability. Without any explicit synchronous operations, the scalability of molecular simulations is shown to depend only on the ability to evenly balance computational load.

  8. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models

  9. Flood vulnerability assessment of residential buildings by explicit damage process modelling

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    2015-01-01

    The present paper introduces a vulnerability modelling approach for residential buildings in flood. The modelling approach explicitly considers relevant damage processes, i.e. water infiltration into the building, mechanical failure of components in the building envelope and damage from water...

  10. Scalable, full-colour and controllable chromotropic plasmonic printing

    OpenAIRE

    Xue, Jiancai; Zhou, Zhang-Kai; Wei, Zhiqiang; Su, Rongbin; Lai, Juan; Li, Juntao; Li, Chao; Zhang, Tengwei; Wang, Xue-Hua

    2015-01-01

    Plasmonic colour printing has drawn wide attention as a promising candidate for the next-generation colour-printing technology. However, an efficient approach to realize full colour and scalable fabrication is still lacking, which prevents plasmonic colour printing from practical applications. Here we present a scalable and full-colour plasmonic printing approach by combining conjugate twin-phase modulation with a plasmonic broadband absorber. More importantly, our approach also demonstrates ...

  11. Temporal scalability comparison of the H.264/SVC and distributed video codec

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Belyaev, Evgeny

    2009-01-01

    The problem of scalable multimedia video streaming is a current topic of interest. There exist many methods for scalable video coding. This paper is focused on the scalable extension of H.264/AVC (H.264/SVC) and distributed video coding (DVC). The paper presents an efficiency comparison of SV...

  12. Co-Simulation of Detailed Whole Building with the Power System to Study Smart Grid Applications

    Energy Technology Data Exchange (ETDEWEB)

    Makhmalbaf, Atefe; Fuller, Jason C.; Srivastava, Viraj; Ciraci, Selim; Daily, Jeffrey A.

    2014-12-24

    Modernization of the power system in a way that ensures a sustainable energy system is arguably one of the most pressing concerns of our time. Buildings are important components in the power system. First, they are the main consumers of electricity; second, they do not have constant energy demand. Conventionally, electricity has been difficult to store and must be consumed as it is generated, so maintaining the balance between demand and supply is critical in the power system. However, to reduce the complexity of power models, buildings (i.e., end-use loads) are traditionally modeled and represented as aggregated “dumb” nodes in the power system. This means we lack effective detailed whole-building energy models that can support the requirements and emerging technologies of the smart power grid. To gain greater insight into the relationship between building energy demand and power system performance, it is important to establish a co-simulation framework that supports detailed building energy modeling and simulation within the power system, so that the capabilities promised by the modern power grid can be studied. This paper discusses ongoing work at Pacific Northwest National Laboratory and presents the underlying tools and framework needed to enable co-simulation of buildings, building energy systems and their control in the power system, to study applications such as demand response, grid-based HVAC control, and deployment of buildings for ancillary services. The ultimate goal is to develop an integrated modeling and simulation platform that is flexible, reusable, and scalable. Results of this work will contribute to future building and power system studies, especially those related to the integrated ‘smart grid’. Results are also expected to advance power resiliency and local (micro) scale grid studies where several building and renewable energy systems transact energy directly. This paper also reviews some applications that can be supported and studied using the framework introduced.

  13. Robust Building Energy Load Forecasting Using Physically-Based Kernel Models

    Directory of Open Access Journals (Sweden)

    Anand Krishnan Prakash

    2018-04-01

    Full Text Available Robust and accurate building energy load forecasting is important for helping building managers and utilities to plan, budget, and strategize energy resources in advance. With the recent prevalent adoption of smart meters in buildings, a significant amount of building energy consumption data has become available. Many studies have developed physics-based white-box models and data-driven black-box models to predict building energy consumption; however, they require extensive prior knowledge about the building system, need a large set of training data, or lack robustness to different forecasting scenarios. In this paper, we introduce a new building energy forecasting method based on Gaussian Process Regression (GPR) that incorporates physical insights about load data characteristics to improve accuracy while reducing training requirements. The GPR is a non-parametric regression method that models the data as a joint Gaussian distribution with mean and covariance functions and forecasts using Bayesian updating. We model the covariance function of the GPR to reflect the data patterns in different forecasting horizon scenarios, as prior knowledge. Our method takes advantage of the modeling flexibility and computational efficiency of the GPR while benefiting from the physical insights to further improve the training efficiency and accuracy. We evaluate our method with three field datasets from two university campuses (Carnegie Mellon University and Stanford University) for both short- and long-term load forecasting. The results show that our method performs more accurately, especially when the training dataset is small, compared to other state-of-the-art forecasting models (up to 2.95 times smaller prediction error).
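
    As a rough illustration of encoding load characteristics in the covariance function, a GPR with a periodic kernel for the daily cycle can be set up in a few lines with scikit-learn. This is a minimal sketch on synthetic data, not the authors' implementation; the specific kernel composition and hyperparameters are assumptions.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

    rng = np.random.default_rng(0)
    t = np.arange(0.0, 24 * 14).reshape(-1, 1)          # two weeks of hourly data
    load = (50 + 10 * np.sin(2 * np.pi * t.ravel() / 24)
            + rng.normal(0, 1, t.shape[0]))             # synthetic daily load cycle

    # Encode the physical insight as prior structure: a periodic kernel for the
    # daily cycle, an RBF for slow drift, and a white-noise term for meter noise.
    kernel = (ExpSineSquared(length_scale=1.0, periodicity=24.0)
              + RBF(length_scale=48.0)
              + WhiteKernel(noise_level=1.0))
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, load)

    # Forecast the next day with predictive uncertainty.
    t_future = np.arange(24 * 14, 24 * 15).reshape(-1, 1)
    mean, std = gpr.predict(t_future, return_std=True)
    print(mean[:3], std[:3])
    ```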

  14. Scalable and near-optimal design space exploration for embedded systems

    CERN Document Server

    Kritikakou, Angeliki; Goutis, Costas

    2014-01-01

    This book describes scalable and near-optimal, processor-level design space exploration (DSE) methodologies.  The authors present design methodologies for data storage and processing in real-time, cost-sensitive data-dominated embedded systems.  Readers will be enabled to reduce time-to-market, while satisfying system requirements for performance, area, and energy consumption, thereby minimizing the overall cost of the final design.   • Describes design space exploration (DSE) methodologies for data storage and processing in embedded systems, which achieve near-optimal solutions with scalable exploration time; • Presents a set of principles and the processes which support the development of the proposed scalable and near-optimal methodologies; • Enables readers to apply scalable and near-optimal methodologies to the intra-signal in-place optimization step for both regular and irregular memory accesses.

  15. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market." —Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful." —Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability

  16. Building Information Model: advantages, tools and adoption efficiency

    Science.gov (United States)

    Abakumov, R. G.; Naumov, A. E.

    2018-03-01

    The paper expands definition and essence of Building Information Modeling. It describes content and effects from application of Information Modeling at different stages of a real property item. Analysis of long-term and short-term advantages is given. The authors included an analytical review of Revit software package in comparison with Autodesk with respect to: features, advantages and disadvantages, cost and pay cutoff. A prognostic calculation is given for efficiency of adoption of the Building Information Modeling technology, with examples of its successful adoption in Russia and worldwide.

  17. TLS for generating multi-LOD of 3D building model

    International Nuclear Information System (INIS)

    Akmalia, R; Setan, H; Majid, Z; Suwardhi, D; Chong, A

    2014-01-01

    Terrestrial Laser Scanners (TLS) have been widely used to capture three-dimensional (3D) objects for various applications. Development in 3D models has also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since a building is an important object, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process of the point cloud are explored. TLS will be used to capture all the building details to generate multiple LODs. In previous works, this task usually involves the integration of several sensors. In this research, however, the point cloud from TLS will be processed to generate the LOD3 model. LOD2 and LOD1 will then be generalized from the resulting LOD3 model. The result of this research is a guiding process to generate multi-LOD 3D building models starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model will also be shown

  18. TLS for generating multi-LOD of 3D building model

    Science.gov (United States)

    Akmalia, R.; Setan, H.; Majid, Z.; Suwardhi, D.; Chong, A.

    2014-02-01

    Terrestrial Laser Scanners (TLS) have been widely used to capture three-dimensional (3D) objects for various applications. Development in 3D models has also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since a building is an important object, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process of the point cloud are explored. TLS will be used to capture all the building details to generate multiple LODs. In previous works, this task usually involves the integration of several sensors. In this research, however, the point cloud from TLS will be processed to generate the LOD3 model. LOD2 and LOD1 will then be generalized from the resulting LOD3 model. The result of this research is a guiding process to generate multi-LOD 3D building models starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model will also be shown.

  19. State reduced order models for the modelling of the thermal behavior of buildings

    Energy Technology Data Exchange (ETDEWEB)

    Menezo, Christophe; Bouia, Hassan; Roux, Jean-Jacques; Depecker, Patrick [Institute National de Sciences Appliquees de Lyon, Villeurbanne Cedex, (France). Centre de Thermique de Lyon (CETHIL). Equipe Thermique du Batiment]. E-mail: menezo@insa-cethil-etb.insa-lyon.fr; bouia@insa-cethil-etb.insa-lyon.fr; roux@insa-cethil-etb.insa-lyon.fr; depecker@insa-cethil-etb.insa-lyon.fr

    2000-07-01

    This work is devoted to the field of building physics and concerns the reduction of heat conduction models. The aim is to enlarge the model libraries of heat and mass transfer codes by limiting the considerable dimensions reached by the numerical systems during the modelling of a multizone building. We show that the balanced realization technique, specifically adapted to the coupling of reduced order models with the other thermal phenomena, turns out to be very efficient. (author)
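
    The balanced realization technique referred to above can be sketched generically: compute the controllability and observability Gramians, balance them, and truncate the weakly coupled states. In the SciPy sketch below, the random stable test system and the target order are illustrative assumptions, not the authors' building model.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, r):
        """Reduce dx/dt = Ax + Bu, y = Cx to order r via balanced realization."""
        # Gramians: A P + P A^T = -B B^T and A^T Q + Q A = -C^T C.
        P = solve_continuous_lyapunov(A, -B @ B.T)
        Q = solve_continuous_lyapunov(A.T, -C.T @ C)
        # Balance via an SVD of the product of Cholesky factors.
        Lp, Lq = cholesky(P, lower=True), cholesky(Q, lower=True)
        U, s, Vt = svd(Lq.T @ Lp)
        S = np.diag(s[:r] ** -0.5)
        T = Lp @ Vt[:r].T @ S              # maps reduced state to full state
        Ti = S @ U[:, :r].T @ Lq.T         # left inverse of T
        return Ti @ A @ T, Ti @ B, C @ T

    # Illustrative stable test system (10 states, 2 inputs, 2 outputs).
    rng = np.random.default_rng(0)
    A0 = rng.normal(size=(10, 10))
    A = A0 - (np.linalg.norm(A0, 2) + 1.0) * np.eye(10)   # shift guarantees stability
    B, C = rng.normal(size=(10, 2)), rng.normal(size=(2, 10))
    Ar, Br, Cr = balanced_truncation(A, B, C, r=3)
    ```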

  20. Activity measurement and effective dose modelling of natural radionuclides in building material

    International Nuclear Information System (INIS)

    Maringer, F.J.; Baumgartner, A.; Rechberger, F.; Seidel, C.; Stietka, M.

    2013-01-01

    In this paper the assessment of natural radionuclides' activity concentration in building materials, calibration requirements and related indoor exposure dose models is presented. Particular attention is turned to specific improvements in low-level gamma-ray spectrometry to determine the activity concentration of necessary natural radionuclides in building materials with adequate measurement uncertainties. Different approaches for the modelling of the effective dose indoor due to external radiation resulted from natural radionuclides in building material and results of actual building material assessments are shown. - Highlights: • Dose models for indoor radiation exposure due to natural radionuclides in building materials. • Strategies and methods in radionuclide metrology, activity measurement and dose modelling. • Selection of appropriate parameters in radiation protection standards for building materials. • Scientific-based limitations of indoor exposure due to natural radionuclides in building materials
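
    For orientation, one widely used screening model for the indoor external dose contribution of building materials (possibly distinct from the specific dose models assessed in the paper) is the activity concentration index of the European basic safety standards (Council Directive 2013/59/Euratom), where an index of 1 corresponds to the 1 mSv/y reference level. The example concentrations below are illustrative.

    ```python
    def activity_concentration_index(c_ra226, c_th232, c_k40):
        """EU BSS screening index for building materials (activities in Bq/kg).

        I <= 1 corresponds to the 1 mSv/y reference level for indoor external
        gamma exposure from building materials.
        """
        return c_ra226 / 300.0 + c_th232 / 200.0 + c_k40 / 3000.0

    # Example: a concrete with 40 Bq/kg Ra-226, 30 Bq/kg Th-232, 400 Bq/kg K-40.
    print(round(activity_concentration_index(40, 30, 400), 2))  # ~0.42, below 1
    ```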

  1. BIM, GIS and semantic models of cultural heritage buildings

    Directory of Open Access Journals (Sweden)

    Pavel Tobiáš

    2016-12-01

    Full Text Available Even though there has been a great development of using building information models in the AEC (Architecture/Engineering/Construction sector recently, creation of models of existing buildings is still not very usual. The cultural heritage documentation is still, in most cases, kept in the form of 2D drawings while these drawings mostly contain only geometry without semantics, attributes or definitions of relationships and hierarchies between particular building elements. All these additional information would, however, be very providential for the tasks of cultural heritage preservation, i.e. for the facility management of heritage buildings or for reconstruction planning and it would be suitable to manage all geometric and non-geometric information in a single 3D information model. This paper is based on the existing literature and focuses on the historic building information modelling to provide information about the current state of the art. First, a summary of available software tools is introduced while not only the BIM tools but also the related GIS software is considered. This is followed by a review of existing efforts worldwide and an evaluation of the facts found.

  2. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Staci R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-01

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  3. Study on vertical seismic response model of BWR-type reactor building

    International Nuclear Information System (INIS)

    Konno, T.; Motohashi, S.; Izumi, M.; Iizuka, S.

    1993-01-01

    A study on advanced seismic design for LWRs has been carried out by the Nuclear Power Engineering Corporation (NUPEC), under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan. As part of the study, the construction of an accurate analytical model of reactor buildings for seismic response analysis, one which can reasonably represent the dynamic characteristics of the building, has been investigated. In Japan, vibration models of reactor buildings for horizontal ground motion have been studied and examined through many simulation analyses of forced vibration tests and earthquake observations of actual buildings, and it is now possible to establish a reliable horizontal vibration model on the basis of a multi-lumped mass and spring model. However, vertical vibration models have not been studied as extensively as horizontal models, owing to the scarcity of observed data for vertical motions. In this paper, the vertical seismic response models of a BWR-type reactor building, including the soil-structure interaction effect, are numerically studied by comparing the dynamic characteristics of (1) a three-dimensional finite element model, (2) a multi-stick lumped mass model with a flexible base-mat, (3) a multi-stick lumped mass model with a rigid base-mat and (4) a single-stick lumped mass model. In particular, the BWR-type reactor building has a long-span truss roof, which is considered to be one of the critical members under vertical excitation. The modelling of the roof trusses is also studied
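
    For readers unfamiliar with lumped mass and spring models, the vertical natural frequencies of such a model follow from a small generalized eigenproblem. The sketch below uses illustrative masses and stiffnesses (not values from the paper) for a three-mass roof/floor/base-mat chain on a soil spring.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # A minimal vertical stick model: roof-truss mass over a floor mass over the
    # base mat, with a soil spring at the base. All values are illustrative only.
    M = np.diag([4.0e6, 8.0e6, 1.2e7])                 # masses in kg
    k1, k2, k3 = 6.0e9, 1.2e10, 2.0e10                 # axial/soil stiffnesses, N/m
    K = np.array([[ k1,     -k1,      0.0],
                  [-k1,  k1 + k2,    -k2],
                  [ 0.0,    -k2,  k2 + k3]])

    # Generalized eigenproblem K phi = omega^2 M phi gives the vertical modes.
    w2, modes = eigh(K, M)
    print(np.sqrt(w2) / (2 * np.pi))                   # natural frequencies in Hz
    ```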

  4. Quality Scalability Compression on Single-Loop Solution in HEVC

    Directory of Open Access Journals (Sweden)

    Mengmeng Zhang

    2014-01-01

    Full Text Available This paper proposes a quality scalable extension design for the upcoming high efficiency video coding (HEVC) standard. In the proposed design, the single-loop decoder solution is extended into the proposed scalable scenario. A novel interlayer intra/interprediction is added to reduce the number of bits needed for representation by exploiting the correlation between coding layers. The experimental results indicate that an average Bjøntegaard delta rate decrease of 20.50% can be achieved compared with simulcast encoding. The proposed technique achieved a 47.98% Bjøntegaard delta rate reduction compared with the scalable video coding extension of H.264/AVC. Consequently, significant rate savings confirm that the proposed method achieves better performance.

  5. Transaction-Based Controls for Building-Grid Integration: VOLTTRON™

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A.; Haack, Jereme N.; Hernandez, George; Katipamula, Srinivas; Widergren, Steven E.

    2015-07-01

    The U.S. Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of a “transactional network” concept that supports energy, operational, and financial transactions between building systems (e.g., rooftop units -- RTUs) and the electric power grid, using applications, or 'agents', that reside either on the equipment, on local building controllers, or in the Cloud. The transactional network vision is delivered using a real-time, scalable reference platform called VOLTTRON that supports the needs of the changing energy system. VOLTTRON is an agent execution platform and an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions.

  6. Building Energy Modeling and Control Methods for Optimization and Renewables Integration

    Science.gov (United States)

    Burger, Eric M.

    This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. This model is capable of approximating unmodeled

  7. Use of MCAM in creating 3D neutronics model for ITER building

    International Nuclear Information System (INIS)

    Zeng Qin; Wang Guozhong; Dang Tongqiang; Long Pengcheng; Loughlin, Michael

    2012-01-01

    Highlights: ► We created a 3D neutronics model of the ITER building. ► The model was produced from the engineering CAD model by MCAM software. ► The neutron flux map in the ITER building was calculated. - Abstract: The three dimensional (3D) neutronics reference model of International Thermonuclear Experimental Reactor (ITER) only defines the tokamak machine and extends to the bio-shield. In order to meet further 3D neutronics analysis needs, it is necessary to create a 3D reference model of the ITER building. Monte Carlo Automatic Modeling Program for Radiation Transport Simulation (MCAM) was developed as a computer aided design (CAD) based bi-directional interface program between general CAD systems and Monte Carlo radiation transport simulation codes. With the help of MCAM version 4.8, the 3D neutronics model of ITER building was created based on the engineering CAD model. The calculation of the neutron flux map in ITER building during operation showed the correctness and usability of the model. This model is the first detailed ITER building 3D neutronics model and it will be made available to all international organization collaborators as a reference model.

  8. A system to build distributed multivariate models and manage disparate data sharing policies: implementation in the scalable national network for effectiveness research.

    Science.gov (United States)

    Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila

    2015-11-01

    Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage
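
    The core idea of estimating a multivariate model without moving patient-level data can be sketched with a logistic regression in which each site returns only its gradient contribution. This is a deliberately minimal illustration; SCANNER's actual distributed algorithms and policy machinery are far more elaborate, and the learning rate, iteration count, and synthetic data below are assumptions.

    ```python
    import numpy as np

    def site_gradient(beta, X, y):
        """Per-site gradient of the logistic log-likelihood. Only this aggregate
        leaves the site; the patient-level rows (X, y) never do."""
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        return X.T @ (y - p)

    def distributed_logistic_fit(sites, dim, lr=0.5, iters=500):
        """Coordinator loop: broadcast beta, collect site gradients, update."""
        beta = np.zeros(dim)
        n_total = sum(len(y) for _, y in sites)
        for _ in range(iters):
            grad = sum(site_gradient(beta, X, y) for X, y in sites)
            beta += lr * grad / n_total
        return beta

    # Three sites holding synthetic local data drawn from a shared model.
    rng = np.random.default_rng(0)
    true_beta = np.array([1.0, -2.0, 0.5])
    sites = []
    for _ in range(3):
        X = rng.normal(size=(500, 3))
        y = (rng.random(500) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
        sites.append((X, y))
    print(distributed_logistic_fit(sites, dim=3))   # approaches true_beta
    ```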

  9. SOL: A Library for Scalable Online Learning Algorithms

    OpenAIRE

    Wu, Yue; Hoi, Steven C. H.; Liu, Chenghao; Lu, Jing; Sahoo, Doyen; Yu, Nenghai

    2016-01-01

    SOL is an open-source library for scalable online learning algorithms, and is particularly suitable for learning with high-dimensional data. The library provides a family of regular and sparse online learning algorithms for large-scale binary and multi-class classification tasks with high efficiency, scalability, portability, and extensibility. SOL was implemented in C++, and provided with a collection of easy-to-use command-line tools, python wrappers and library calls for users and develope...

  10. Building a biodiversity content management system for science, education, and outreach

    Directory of Open Access Journals (Sweden)

    C S Parr

    2006-01-01

    Full Text Available We describe the system architecture and data template design for the Animal Diversity Web (http://www.animaldiversity.org), an online natural history resource serving three audiences: (1) the scientific community, (2) educators and learners, and (3) the general public. Our architecture supports highly scalable, flexible resource building by combining relational and object-oriented databases. Content resources are managed separately from the identifiers that relate and display them. Websites targeting different audiences from the same database handle large volumes of traffic. Content contribution and legacy data are robust to changes in data models. XML and OWL versions of our data template set the stage for making ADW data accessible to other systems.

  11. Build IT: Scaling and Sustaining an Afterschool Computer Science Program for Girls

    Science.gov (United States)

    Koch, Melissa; Gorges, Torie; Penuel, William R.

    2012-01-01

    "Co-design"--including youth development staff along with curriculum designers--is the key to developing an effective program that is both scalable and sustainable. This article describes Build IT, a two-year afterschool and summer curriculum designed to help middle school girls develop fluency in information technology (IT), interest in…

  12. Ancestors protocol for scalable key management

    Directory of Open Access Journals (Sweden)

    Dieter Gollmann

    2010-06-01

    Full Text Available Group key management is an important functional building block for any secure multicast architecture and has therefore been extensively studied in the literature. The main proposed protocol is Adaptive Clustering for Scalable Group Key Management (ASGK). According to the ASGK protocol, the multicast group is divided into clusters, where each cluster consists of areas of members. Each cluster uses its own Traffic Encryption Key (TEK). These clusters are updated periodically depending on the dynamism of the members during the secure session. A modified protocol has been proposed, based on ASGK with some modifications, to balance the number of affected members and the encryption/decryption overhead for any number of areas when a member joins or leaves the group. This modified protocol is called the Ancestors protocol. According to the Ancestors protocol, every area receives the dynamism of the members from its parents. The main objective of the modified protocol is to reduce the number of affected members when members join or leave, so that the '1 affects n' overhead is reduced. A comparative study has been done between the ASGK protocol and the modified protocol. According to the comparative results, the modified protocol always outperforms the ASGK protocol.
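
    A minimal sketch of the clustering idea, independent TEKs so that a join or leave rekeys only one cluster, is shown below. It implements neither ASGK nor the Ancestors protocol in full; the key size and data structures are illustrative assumptions.

    ```python
    import os

    class Cluster:
        """Cluster-based group keying: each cluster holds its own TEK, so a
        membership change rekeys only that cluster (bounding '1 affects n')."""
        def __init__(self):
            self.members = set()
            self.tek = os.urandom(16)   # traffic encryption key for this cluster

        def rekey(self):
            self.tek = os.urandom(16)   # only this cluster's members are affected

        def join(self, member):
            self.members.add(member)
            self.rekey()                # backward secrecy for the newcomer

        def leave(self, member):
            self.members.discard(member)
            self.rekey()                # forward secrecy against the leaver

    clusters = [Cluster() for _ in range(4)]
    clusters[0].join("alice")
    clusters[1].join("bob")
    clusters[0].leave("alice")          # clusters[1].tek is untouched
    ```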

  13. Reduced order modeling and parameter identification of a building energy system model through an optimization routine

    International Nuclear Information System (INIS)

    Harish, V.S.K.V.; Kumar, Arun

    2016-01-01

    Highlights: • A BES model based on first principles is developed and solved numerically. • Parameters of the lumped capacitance model are fitted using the proposed optimization routine. • Validations are shown for different types of building construction elements. • Step response excitations for outdoor air temperature and relative humidity are analyzed. - Abstract: Different control techniques, together with intelligent building technology (Building Automation Systems), are used to improve the energy efficiency of buildings. In almost all control projects, it is crucial to have building energy models with high computational efficiency in order to design and tune the controllers and simulate their performance. In this paper, a set of partial differential equations is formulated accounting for energy flow within the building space. These equations are then solved as conventional finite difference equations using the Crank–Nicolson scheme. Such a higher-order model is regarded as the benchmark model. An optimization algorithm has been developed, depicted through a flowchart, which minimizes the sum squared error between the step responses of the numerical and the optimal model. The optimal model of the construction element is an RC-network model with the values of Rs and Cs estimated using the non-linear time-invariant constrained optimization routine. The model is validated by comparing its step responses with those of two other RC-network models whose parameter values are selected based on certain criteria. Validations are shown for different types of building construction elements, viz. low, medium and heavy thermal capacity elements. Simulation results show that the optimal model follows the step responses of the numerical model more closely than the other two models.
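
    The parameter-fitting step described here can be sketched generically: simulate the step response of a low-order RC network and minimize the sum squared error against a benchmark response under bounds. In the SciPy sketch below (bounded Nelder-Mead requires SciPy 1.7 or newer), the 2R2C structure, the benchmark parameters and the bounds are illustrative assumptions, not the paper's values.

    ```python
    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import minimize

    def step_response(params, t, t_out=10.0):
        """Inner-node temperature of a 2R2C wall model after a 10 K outdoor step
        (both nodes start at 0)."""
        r1, r2, c1, c2 = params
        def rhs(T, _):
            t1, t2 = T
            return [((t_out - t1) / r1 + (t2 - t1) / r2) / c1,
                    (t1 - t2) / (r2 * c2)]
        return odeint(rhs, [0.0, 0.0], t)[:, 1]

    t = np.linspace(0.0, 48 * 3600.0, 200)
    benchmark = step_response([0.01, 0.02, 5e5, 8e6], t)   # stand-in for the FD model

    def sse(params):
        return np.sum((step_response(params, t) - benchmark) ** 2)

    bounds = [(1e-3, 1.0), (1e-3, 1.0), (1e4, 1e8), (1e4, 1e8)]
    fit = minimize(sse, x0=[0.05, 0.05, 1e6, 1e7], method="Nelder-Mead",
                   bounds=bounds, options={"maxiter": 4000})
    print(fit.x)   # fitted R1, R2, C1, C2 of the reduced-order model
    ```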

  14. Impact of whole-building hygrothermal modelling on the assessment of indoor climate in a library building

    Energy Technology Data Exchange (ETDEWEB)

    Steeman, M.; Janssens, A. [Ghent University, Department of Architecture and Urban Planning, Jozef Plateaustraat 22, B-9000 Gent (Belgium); De Paepe, M. [Ghent University, Department of Flow, Heat and Combustion Mechanics, Sint-Pietersnieuwstraat 41, B-9000 Gent (Belgium)

    2010-07-15

    This paper focuses on the importance of accurately modelling the hygrothermal interaction between the building and its hygroscopic content for the assessment of the indoor climate. Libraries contain a large amount of stored books which require a stable relative humidity to guarantee their preservation. On the other hand, visitors and staff must be comfortable with the indoor climate. The indoor climate of a new library building is evaluated by means of measurements and simulations. Complaints of the staff are confirmed by measured data during the winter and summer of 2007-2008. For the evaluation of the indoor climate, a building simulation model is used in which the porous books are either described by a HAM model or by a simplified isothermal model. Calculations demonstrate that the HAM model predicts a more stable indoor climate regarding both temperature and relative humidity variations in comparison to the estimations by the simplified model. This is attributed to the ability of the HAM model to account for the effect of temperature variations on moisture storage. Moreover, by applying the HAM model, a good agreement with the measured indoor climate is found. As expected, a larger exposed book surface ameliorates the indoor climate because a more stable indoor relative humidity is obtained. Finally, the building simulation model is used to improve the indoor climate with respect to the preservation of valuable books. Results demonstrate that more stringent interventions on the air handling unit are expected when a simplified approach is used to model the hygroscopic books. (author)

  15. A scalable distributed RRT for motion planning

    KAUST Repository

    Jacobs, Sam Ade

    2013-05-01

    Rapidly-exploring Random Tree (RRT), like other sampling-based motion planning methods, has been very successful in solving motion planning problems. Even so, sampling-based planners cannot solve all problems of interest efficiently, so attention is increasingly turning to parallelizing them. However, one challenge in parallelizing RRT is the global computation and communication overhead of nearest neighbor search, a key operation in RRTs. This is a critical issue as it limits the scalability of previous algorithms. We present two parallel algorithms to address this problem. The first algorithm extends existing work by introducing a parameter that adjusts how much local computation is done before a global update. The second algorithm radially subdivides the configuration space into regions, constructs a portion of the tree in each region in parallel, and connects the subtrees, removing cycles if they exist. By subdividing the space, we increase computation locality enabling a scalable result. We show that our approaches are scalable. We present results demonstrating almost linear scaling to hundreds of processors on a Linux cluster and a Cray XE6 machine. © 2013 IEEE.
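
    The radial-subdivision idea can be illustrated in miniature: partition a disk-shaped planar C-space into angular sectors and grow an independent subtree per sector from a shared root. The sketch below is sequential and obstacle-free for brevity, a toy stand-in rather than the authors' parallel implementation; the per-sector loops are the parts that would run concurrently.

    ```python
    import math
    import random

    def grow_subtree(samples, root, step=0.05):
        """Standard RRT extension: steer from the nearest tree node toward each
        sample by a fixed step (collision checking omitted in this toy version)."""
        tree = [root]
        for q in samples:
            near = min(tree, key=lambda p: (p[0] - q[0])**2 + (p[1] - q[1])**2)
            d = math.hypot(q[0] - near[0], q[1] - near[1]) or 1e-9
            tree.append((near[0] + step * (q[0] - near[0]) / d,
                         near[1] + step * (q[1] - near[1]) / d))
        return tree

    def radial_rrt(n_regions=4, n_samples=200):
        """Radially subdivide the unit disk into sectors and grow one subtree
        per sector from a shared root. Each sector's loop is fully independent,
        which is what a parallel implementation would exploit before connecting
        neighbouring subtrees and removing any cycles."""
        subtrees = []
        for k in range(n_regions):
            lo = 2 * math.pi * k / n_regions
            hi = 2 * math.pi * (k + 1) / n_regions
            samples = [(r * math.cos(a), r * math.sin(a))
                       for r, a in ((random.random(), random.uniform(lo, hi))
                                    for _ in range(n_samples))]
            subtrees.append(grow_subtree(samples, root=(0.0, 0.0)))
        return subtrees

    print([len(t) for t in radial_rrt()])   # four subtrees of 201 nodes each
    ```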

  16. A scalable distributed RRT for motion planning

    KAUST Repository

    Jacobs, Sam Ade; Stradford, Nicholas; Rodriguez, Cesar; Thomas, Shawna; Amato, Nancy M.

    2013-01-01

    Rapidly-exploring Random Tree (RRT), like other sampling-based motion planning methods, has been very successful in solving motion planning problems. Even so, sampling-based planners cannot solve all problems of interest efficiently, so attention is increasingly turning to parallelizing them. However, one challenge in parallelizing RRT is the global computation and communication overhead of nearest neighbor search, a key operation in RRTs. This is a critical issue as it limits the scalability of previous algorithms. We present two parallel algorithms to address this problem. The first algorithm extends existing work by introducing a parameter that adjusts how much local computation is done before a global update. The second algorithm radially subdivides the configuration space into regions, constructs a portion of the tree in each region in parallel, and connects the subtrees, removing cycles if they exist. By subdividing the space, we increase computation locality enabling a scalable result. We show that our approaches are scalable. We present results demonstrating almost linear scaling to hundreds of processors on a Linux cluster and a Cray XE6 machine. © 2013 IEEE.

  17. Aspects of superstring model-building

    International Nuclear Information System (INIS)

    Ellis, J.

    1989-01-01

    Several approaches to model-building with strings are discussed, including Calabi-Yau manifolds and fermionic formulations of strings directly in four dimensions. Ideas about supersymmetry breaking are reviewed. Flipped SU(5)xU(1) is touted as the theory of everything below the Planck scale (perhaps). (author). 64 refs, 7 figs

  18. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schellen, H.L.; Schijndel, van A.W.M.; Blades, N.

    2014-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  19. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schijndel, van A.W.M.; Schellen, H.L.; Blades, N.; Mahdavi, A.; Mertens, B.

    2013-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  20. LEGO® bricks as building blocks for centimeter-scale biological environments: the case of plants.

    Directory of Open Access Journals (Sweden)

    Kara R Lind

    Full Text Available LEGO bricks are commercially available interlocking pieces of plastic that are conventionally used as toys. We describe their use to build engineered environments for cm-scale biological systems, in particular plant roots. Specifically, we take advantage of the unique modularity of these building blocks to create inexpensive, transparent, reconfigurable, and highly scalable environments for plant growth in which structural obstacles and chemical gradients can be precisely engineered to mimic soil.

  1. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definitions of COTS components, the status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and there is a large amount of special equipment and mechanisms involved as a fundamental part of the facility. The detailed design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software which can be used before the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach makes it possible to plan the construction sequence efficiently (4D), a powerful tool for studying and analyzing alternative construction sequences in detail and for coordinating the work of different construction teams. In addition, the engineering, construction and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST Enclosures as application examples.

  2. Integrating Building Information Modeling and Augmented Reality to Improve Investigation of Historical Buildings

    Directory of Open Access Journals (Sweden)

    Francesco Chionna

    2015-12-01

    Full Text Available This paper describes an experimental system to support the investigation of historical buildings using Building Information Modeling (BIM) and Augmented Reality (AR). The system requires the use of off-line software to build the BIM representation and defines a method to integrate diagnostic data into BIM. The system offers access to such information during site investigation using AR glasses supported by marker and marker-less technologies. The main innovation is the possibility to contextualize through AR not only existing BIM properties but also results from non-invasive tools. User evaluations show how the use of the system may enhance the perception of engineers during the investigation process.

  3. CloudTPS: Scalable Transactions for Web Applications in the Cloud

    NARCIS (Netherlands)

    Zhou, W.; Pierre, G.E.O.; Chi, C.-H.

    2010-01-01

    NoSQL Cloud data services provide scalability and high availability properties for web applications but at the same time they sacrifice data consistency. However, many applications cannot afford any data inconsistency. CloudTPS is a scalable transaction manager to allow cloud database services to

  4. Modeling Aggregate Hourly Energy Consumption in a Regional Building Stock

    Directory of Open Access Journals (Sweden)

    Anna Kipping

    2017-12-01

    Full Text Available Sound estimates of future heat and electricity demand with high temporal and spatial resolution are needed for energy system planning, grid design, and evaluating demand-side management options and policies on regional and national levels. In this study, smart meter data on electricity consumption in buildings are combined with cross-sectional building information to model hourly electricity consumption within the household and service sectors on a regional basis in Norway. The same modeling approach is applied to model aggregate hourly district heat consumption in three different consumer groups located in Oslo. A comparison of modeled and metered hourly energy consumption shows that hourly variations and aggregate consumption per county and year are reproduced well by the models. However, for some smaller regions, modeled annual electricity consumption is over- or underestimated by more than 20%. Our results indicate that the presented method is useful for modeling the current and future hourly energy consumption of a regional building stock, but that larger and more detailed training datasets are required to improve the models, and more detailed building stock statistics on regional level are needed to generate useful estimates on aggregate regional energy consumption.

  5. Use of MCAM in creating 3D neutronics model for ITER building

    Energy Technology Data Exchange (ETDEWEB)

    Zeng Qin [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Wang Guozhong, E-mail: mango33@mail.ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Dang Tongqiang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Long Pengcheng [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Loughlin, Michael [ITER Organization, Route de Vinon sur Verdon, 13115 St. Paul-lez-Durance (France)]

    2012-08-15

    Highlights: ► We created a 3D neutronics model of the ITER building. ► The model was produced from the engineering CAD model by MCAM software. ► The neutron flux map in the ITER building was calculated. - Abstract: The three dimensional (3D) neutronics reference model of International Thermonuclear Experimental Reactor (ITER) only defines the tokamak machine and extends to the bio-shield. In order to meet further 3D neutronics analysis needs, it is necessary to create a 3D reference model of the ITER building. Monte Carlo Automatic Modeling Program for Radiation Transport Simulation (MCAM) was developed as a computer aided design (CAD) based bi-directional interface program between general CAD systems and Monte Carlo radiation transport simulation codes. With the help of MCAM version 4.8, the 3D neutronics model of ITER building was created based on the engineering CAD model. The calculation of the neutron flux map in ITER building during operation showed the correctness and usability of the model. This model is the first detailed ITER building 3D neutronics model and it will be made available to all international organization collaborators as a reference model.

  6. Models test on dynamic structure-structure interaction of nuclear power plant buildings

    International Nuclear Information System (INIS)

    Kitada, Y.; Hirotani, T.

    1999-01-01

    A reactor building of an NPP (nuclear power plant) is generally constructed closely adjacent to a turbine building and other buildings, such as the auxiliary building, and in an increasing number of NPPs, multiple plants are being planned and constructed close together on a single site. In these situations, adjacent buildings are considered to influence each other through the soil during earthquakes and to exhibit dynamic behaviour different from that of separate buildings, because buildings in NPPs are generally heavy and massive. The dynamic interaction between buildings through the soil during earthquakes is termed here 'dynamic cross interaction (DCI)'. In order to comprehend DCI appropriately, forced vibration tests and earthquake observations are needed using closely constructed building models. Against this background, the Nuclear Power Engineering Corporation (NUPEC) planned a project to investigate the DCI effect in 1993, following the preceding SSI (soil-structure interaction) investigation project, 'model tests on embedment effect of reactor building'. The project consists of field and laboratory tests. The field test is being carried out using three different building construction conditions: a single reactor building used as a reference for comparison purposes, two identical reactor buildings used to evaluate pure DCI effects, and two different buildings, reactor and turbine building models, used to evaluate DCI effects under actual plant conditions. Forced vibration tests and earthquake observations are planned in the field test. The laboratory test is planned to evaluate basic characteristics of the DCI effects using a simple soil model made of silicone rubber and structure models made of aluminum. In this test, forced vibration tests and shaking table tests are planned. The project was started in April 1994 and will be completed in March 2002. This paper describes an outline and the current status of this project. (orig.)

  7. Energy modelling and capacity building

    International Nuclear Information System (INIS)

    2005-01-01

    The Planning and Economic Studies Section of the IAEA's Department of Nuclear Energy is focusing on building analytical capacity in MS for energy-environmental-economic assessments and for the elaboration of sustainable energy strategies. It offers a variety of analytical models specifically designed for use in developing countries for (i) evaluating alternative energy strategies; (ii) assessing environmental, economic and financial impacts of energy options; (iii) assessing infrastructure needs; (iv) evaluating regional development possibilities and energy trade; (v) assessing the role of nuclear power in addressing priority issues (climate change, energy security, etc.). These models can be used for analysing energy or electricity systems, and for assessing possible implications of different energy, environmental or financial policies that affect the energy sector and energy systems. The models vary in complexity and data requirements, and so can be adapted to the available data, statistics and analytical needs of different countries. These models are constantly updated to reflect changes in the real world and in the concerns that drive energy system choices. They can support thoughtfully informed choices by policy makers over a broader range of circumstances and interests. For example, they can readily reflect the workings of competitive energy and electricity markets, and cover such topics as external costs. The IAEA further offers training in the use of these models and, just as important, in the interpretation and critical evaluation of results. Training of national teams to develop national competence over the full spectrum of models is a high priority. The IAEA maintains a broad spectrum of databanks relevant to energy, economic and environmental analysis in MS, and makes these data available to analysts in MS for use in their own analytical work. The Reference Technology Data Base (RTDB) and the Reference Data Series (RDS-1) are the major vehicles by which we

  8. DEVELOPING PARAMETRIC BUILDING MODELS – THE GANDIS USE CASE

    Directory of Open Access Journals (Sweden)

    W. Thaller

    2012-09-01

    Full Text Available In the course of a project related to green building design, we have created a group of eight parametric building models that can be manipulated interactively with respect to dimensions, number of floors, and a few other parameters. We report on the commonalities and differences between the models and the abstractions that we were able to identify.

  9. Scalable Packet Classification with Hash Tables

    Science.gov (United States)

    Wang, Pi-Chung

    In the last decade, the technique of packet classification has been widely deployed in various network devices, including routers, firewalls and network intrusion detection systems. In this work, we improve the performance of packet classification by using multiple hash tables. The existing hash-based algorithms have superior scalability with respect to the required space; however, their search performance may not be comparable to other algorithms. To improve the search performance, we propose a tuple reordering algorithm to minimize the number of accessed hash tables with the aid of bitmaps. We also use pre-computation to ensure the accuracy of our search procedure. Performance evaluation based on both real and synthetic filter databases shows that our scheme is effective and scalable and the pre-computation cost is moderate.
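
    A bare-bones version of hash-based classification is tuple space search: filters sharing the same prefix-length pair live in one exact-match hash table, and a lookup probes each table with the packet's masked header fields. The sketch below omits the paper's tuple reordering, bitmaps and pre-computation; the two-field rule format and the example rules are simplifying assumptions.

    ```python
    from collections import defaultdict

    # Each filter is (src_prefix, src_len, dst_prefix, dst_len, action); filters
    # with the same (src_len, dst_len) "tuple" share one hash table.
    def build_tuple_space(filters):
        tables = defaultdict(dict)
        for src, sl, dst, dl, action in filters:
            key = (src >> (32 - sl) if sl else 0,
                   dst >> (32 - dl) if dl else 0)
            tables[(sl, dl)][key] = action
        return tables

    def classify(tables, src_ip, dst_ip):
        """Probe every tuple's hash table with the packet's masked fields; a
        real classifier would rank matches by priority and prune tables first."""
        matches = []
        for (sl, dl), table in tables.items():
            key = (src_ip >> (32 - sl) if sl else 0,
                   dst_ip >> (32 - dl) if dl else 0)
            if key in table:
                matches.append(table[key])
        return matches

    rules = [(0x0A000000, 8, 0xC0A80000, 16, "allow"),    # 10/8 -> 192.168/16
             (0x0A010000, 16, 0x00000000, 0, "deny")]     # 10.1/16 -> any
    tables = build_tuple_space(rules)
    print(classify(tables, 0x0A010203, 0xC0A80101))       # ['allow', 'deny']
    ```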

  10. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade; Manavi, Kasra; Burgos, Juan; Denny, Jory; Thomas, Shawna; Amato, Nancy M.

    2012-01-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.

  11. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade

    2012-05-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.
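
    The subdivide-and-merge scheme described in the two records above can be imitated in miniature: build a toy roadmap per region in parallel, then connect nodes across region boundaries so nearest-neighbour work stays local. The sketch below is a simplified stand-in, not the authors' implementation; the strip decomposition, connection radius and node counts are illustrative assumptions.

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def regional_roadmap(args):
        """Build a toy roadmap inside one region: sample nodes and connect pairs
        closer than a radius (a stand-in for a sequential PRM run per region)."""
        (xlo, xhi), seed, n = args
        rng = np.random.default_rng(seed)
        nodes = np.column_stack([rng.uniform(xlo, xhi, n), rng.uniform(0.0, 1.0, n)])
        edges = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if np.linalg.norm(nodes[i] - nodes[j]) < 0.2]
        return nodes, edges

    if __name__ == "__main__":
        # Subdivide the unit square into vertical strips; one roadmap per strip,
        # built in parallel, then merged by connecting nodes across boundaries.
        strips = [((k / 4, (k + 1) / 4), k, 50) for k in range(4)]
        with ProcessPoolExecutor() as pool:
            graphs = list(pool.map(regional_roadmap, strips))
        offsets = np.cumsum([0] + [len(g[0]) for g in graphs[:-1]])
        edges = [(i + off, j + off) for g, off in zip(graphs, offsets)
                 for i, j in g[1]]
        for k in range(len(graphs) - 1):      # cross-boundary connections only
            for i, p in enumerate(graphs[k][0]):
                for j, q in enumerate(graphs[k + 1][0]):
                    if np.linalg.norm(p - q) < 0.2:
                        edges.append((i + offsets[k], j + offsets[k + 1]))
        print(sum(len(g[0]) for g in graphs), "nodes,", len(edges), "edges")
    ```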

  12. Efficient Enhancement for Spatial Scalable Video Coding Transmission

    Directory of Open Access Journals (Sweden)

    Mayada Khairy

    2017-01-01

    Full Text Available Scalable Video Coding (SVC) is an international standard technique for video compression. It is an extension of H.264 Advanced Video Coding (AVC). In encoding video streams with SVC, it is appropriate to employ the macroblock (MB) mode decision because it affords superior coding efficiency. However, the exhaustive mode decision technique usually used for SVC increases the computational complexity, resulting in a longer encoding time (ET). Many other algorithms have been proposed to solve this problem, at the cost of increased transmission time (TT) across the network. To minimize the ET and TT, this paper introduces four efficient algorithms based on spatial scalability. The algorithms utilize the mode-distribution correlation between the base layer (BL) and enhancement layers (ELs) and interpolation between the EL frames. The proposed algorithms fall into two categories. Those of the first category are based on interlayer residual SVC spatial scalability. They employ two methods, namely interlayer interpolation (ILIP) and the interlayer base mode (ILBM) method, and enable ET and TT savings of up to 69.3% and 83.6%, respectively. The algorithms of the second category are based on full-search SVC spatial scalability. They utilize two methods, namely full interpolation (FIP) and the full-base mode (FBM) method, and enable ET and TT savings of up to 55.3% and 76.6%, respectively.
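
    The interlayer residual idea underlying methods like ILIP can be sketched in a few lines: upsample the base-layer reconstruction and let the enhancement layer carry only the residual. The NumPy sketch below uses nearest-neighbour upsampling and random frames as stand-ins; real SVC/HEVC codecs use normative interpolation filters and transform-coded residuals.

    ```python
    import numpy as np

    def upsample2x(frame):
        """Nearest-neighbour 2x spatial upsampling of a base-layer frame (real
        codecs use normative poly-phase interpolation filters instead)."""
        return frame.repeat(2, axis=0).repeat(2, axis=1)

    # Inter-layer prediction: the enhancement layer codes only the residual
    # between its frame and the upsampled base-layer reconstruction.
    rng = np.random.default_rng(0)
    base_layer = rng.integers(0, 256, (72, 128)).astype(np.int16)
    enh_frame = rng.integers(0, 256, (144, 256)).astype(np.int16)

    prediction = upsample2x(base_layer)
    residual = enh_frame - prediction       # this is what would be transform-coded
    reconstructed = prediction + residual   # decoder-side reconstruction
    assert np.array_equal(reconstructed, enh_frame)
    ```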

  13. Building Component Library: An Online Repository to Facilitate Building Energy Model Creation; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, K.; Long, N.; Swindler, A.

    2012-05-01

    This paper describes the Building Component Library (BCL), the U.S. Department of Energy's (DOE) online repository of building components that can be directly used to create energy models. This comprehensive, searchable library consists of components and measures as well as the metadata which describes them. The library is also designed to allow contributors to easily add new components, providing a continuously growing, standardized list of components for users to draw upon.

  14. Four-dimensional strings: Phenomenology and model building

    International Nuclear Information System (INIS)

    Quiros, M.

    1989-01-01

    In these lectures we will review some of the latest developments in string theories leading to the construction of realistic four-dimensional string models. Special attention will be paid to world-sheet and space-time supersymmetry, modular invariance and model building for supersymmetric and (tachyon-free) nonsupersymmetric ten- and four-dimensional models. (orig.)

  15. Modeling hourly consumption of electricity and district heat in non-residential buildings

    International Nuclear Information System (INIS)

    Kipping, A.; Trømborg, E.

    2017-01-01

    Models for hourly consumption of heat and electricity in different consumer groups on a regional level can yield important data for energy system planning and management. In this study hourly meter data, combined with cross-sectional data derived from the Norwegian energy label database, is used to model hourly consumption of both district heat and electrical energy in office buildings and schools which either use direct electric heating (DEH) or non-electric hydronic heating (OHH). The results of the study show that modeled hourly total energy consumption in buildings with DEH and in buildings with OHH (supplied by district heat) exhibits differences, e.g. due to differences in heat distribution and control systems. In a normal year, in office buildings with OHH the main part of total modeled energy consumption is used for electric appliances, while in schools with OHH the main part is used for heating. In buildings with OHH the share of modeled annual heating energy is higher than in buildings with DEH. Although based on small samples, our regression results indicate that the presented method can be used for modeling hourly energy consumption in non-residential buildings, but also that larger samples and additional cross-sectional information could yield improved models and more reliable results. - Highlights: • Schools with district heating (DH) tend to use less night-setback. • DH in office buildings tends to start earlier than direct electric heating (DEH). • In schools with DH the main part of annual energy consumption is used for heating. • In office buildings with DH the main part is used for electric appliances. • Buildings with DH use a larger share of energy for heating than buildings with DEH.
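
    A common regression form for such hourly-consumption models combines hour-of-day effects with a temperature-driven heating term. The sketch below fits one on synthetic data; the feature set, base temperature and coefficients are illustrative assumptions rather than the paper's specification.

    ```python
    import numpy as np

    def design_matrix(hour_of_day, outdoor_temp, t_base=15.0):
        """Hour-of-day dummies plus a heating-degree term, a common shape for
        hourly building-load regressions."""
        hour_dummies = np.eye(24)[hour_of_day]              # one column per hour
        hdd = np.clip(t_base - outdoor_temp, 0.0, None)     # heating demand driver
        return np.column_stack([hour_dummies, hdd])

    rng = np.random.default_rng(1)
    hod = np.tile(np.arange(24), 30)                        # 30 days of hourly data
    temp = 5 + 8 * np.sin(2 * np.pi * np.arange(hod.size) / 24) \
           + rng.normal(0, 2, hod.size)
    load = (20 + 5 * ((hod >= 7) & (hod <= 17))             # occupancy profile
            + 1.5 * np.clip(15 - temp, 0.0, None)           # heating load
            + rng.normal(0, 1, hod.size))

    X = design_matrix(hod, temp)
    coef, *_ = np.linalg.lstsq(X, load, rcond=None)
    print(coef[-1])   # estimated heating sensitivity, close to 1.5
    ```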

  16. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  17. Collaborative data analytics for smart buildings: opportunities and models

    DEFF Research Database (Denmark)

    Lazarova-Molnar, Sanja; Mohamed, Nader

    2018-01-01

    Smart buildings equipped with state-of-the-art sensors and meters are becoming more common. Large quantities of data are being collected by these devices. For a single building to benefit from its own collected data, it will need to wait for a long time to collect sufficient data to build accurate ... of collaborative data analytics for smart buildings, its benefits, as well as presently possible models of carrying it out. Furthermore, we present a framework for collaborative fault detection and diagnosis as a case of collaborative data analytics for smart buildings. We also provide a preliminary analysis of the energy efficiency benefit of such a collaborative framework for smart buildings. The result shows that significant energy savings can be achieved for smart buildings using collaborative data analytics.

  18. A MODEL BUILDING CODE ARTICLE ON FALLOUT SHELTERS WITH RECOMMENDATIONS FOR INCLUSION OF REQUIREMENTS FOR FALLOUT SHELTER CONSTRUCTION IN FOUR NATIONAL MODEL BUILDING CODES.

    Science.gov (United States)

    American Inst. of Architects, Washington, DC.

    A model building code for fallout shelters was drawn up for inclusion in four national model building codes. Discussion is given of fallout shelters with respect to (1) nuclear radiation, (2) national policies, and (3) community planning. Fallout shelter requirements for shielding, space, ventilation, construction, and services such as electrical…

  19. Investigation on Reliability and Scalability of an FBG-Based Hierarchical AOFSN

    Directory of Open Access Journals (Sweden)

    Li-Mei Peng

    2010-03-01

    The reliability and scalability of large-scale FBG-based optical fiber sensor networks (AOFSN) are considered in this paper. The AOFSN consists of a three-level hierarchical sensor network architecture. The first two levels consist of active interrogation and remote nodes (RNs), and the third level, called the sensor subnet (SSN), consists of passive Fiber Bragg Gratings (FBGs) and a few switches. The switch architectures in the RN and various SSNs to improve the reliability and scalability of AOFSN are studied. Two SSNs with a regular topology are proposed to support simple routing and scalability in AOFSN: square-based sensor cells (SSC) and pentagon-based sensor cells (PSC). The reliability and scalability are evaluated in terms of the available sensing coverage in the case of one or multiple link failures.

  20. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Science.gov (United States)

    2010-01-01

    The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2) of...

  1. A model for the sustainable selection of building envelope assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Huedo, Patricia, E-mail: huedo@uji.es [Universitat Jaume I (Spain); Mulet, Elena, E-mail: emulet@uji.es [Universitat Jaume I (Spain); López-Mesa, Belinda, E-mail: belinda@unizar.es [Universidad de Zaragoza (Spain)

    2016-02-15

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  2. A model for the sustainable selection of building envelope assemblies

    International Nuclear Information System (INIS)

    Huedo, Patricia; Mulet, Elena; López-Mesa, Belinda

    2016-01-01

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  3. Automatic generation and simulation of urban building energy models based on city datasets for city-scale building retrofit analysis

    International Nuclear Information System (INIS)

    Chen, Yixing; Hong, Tianzhen; Piette, Mary Ann

    2017-01-01

    Highlights: • Developed methods and used data models to integrate city’s public building records. • Shading from neighborhood buildings strongly influences urban building performance. • A case study demonstrated the workflow, simulation and analysis of building retrofits. • CityBES retrofit analysis feature provides actionable information for decision making. • Discussed significance and challenges of urban building energy modeling. -- Abstract: Buildings in cities consume 30–70% of total primary energy, and improving building energy efficiency is one of the key strategies towards sustainable urbanization. Urban building energy models (UBEM) can support city managers to evaluate and prioritize energy conservation measures (ECMs) for investment and the design of incentive and rebate programs. This paper presents the retrofit analysis feature of City Building Energy Saver (CityBES) to automatically generate and simulate UBEM using EnergyPlus based on cities’ building datasets and user-selected ECMs. CityBES is a new open web-based tool to support city-scale building energy efficiency strategic plans and programs. The technical details of using CityBES for UBEM generation and simulation are introduced, including the workflow, key assumptions, and major databases. Also presented is a case study that analyzes the potential retrofit energy use and energy cost savings of five individual ECMs and two measure packages for 940 office and retail buildings in six city districts in northeast San Francisco, United States. The results show that: (1) all five measures together can save 23–38% of site energy per building; (2) replacing lighting with light-emitting diode lamps and adding air economizers to existing heating, ventilation and air-conditioning (HVAC) systems are most cost-effective with an average payback of 2.0 and 4.3 years, respectively; and (3) it is not economical to upgrade HVAC systems or replace windows in San Francisco due to the city’s mild
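
    As a toy illustration of the measure-screening arithmetic reported above, simple payback is the incremental cost divided by the annual cost savings. All numbers below are made up for illustration, not taken from the CityBES case study.

```python
# Made-up measure data: incremental cost and annual cost savings per floor area.
measures = {
    "LED lighting":       {"cost": 8.0,  "annual_savings": 4.0},   # USD/m2
    "air economizer":     {"cost": 6.0,  "annual_savings": 1.4},
    "window replacement": {"cost": 60.0, "annual_savings": 1.0},
}

for name, m in measures.items():
    payback_years = m["cost"] / m["annual_savings"]
    print(f"{name}: simple payback {payback_years:.1f} years")
```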

  4. Seismic simulation analysis of nuclear reactor building by soil-building interaction model

    International Nuclear Information System (INIS)

    Muto, K.; Kobayashi, T.; Motohashi, S.; Kusano, N.; Mizuno, N.; Sugiyama, N.

    1981-01-01

    Seismic simulation analyses were performed to evaluate soil-structure interaction effects using an analytical approach based on a 'Lattice Model' developed by the authors. The purpose of this paper is to check the adequacy of this procedure for analyzing soil-structure interaction by comparing computed results with recorded ones. The 'Lattice Model' approach employs a lumped-mass interactive model, in which not only the structure but also the underlying and/or surrounding soil are modeled as discretized elements. The analytical model used for this study extends about 310 m in the horizontal direction and about 103 m in depth. The reactor building is modeled as three shearing-bending sticks (outer wall, inner wall and shield wall) and the underlying and surrounding soil are divided into four shearing sticks (the column directly beneath the reactor building and the adjacent, near and distant columns). A corresponding input base motion for the 'Lattice Model' was determined by a deconvolution analysis using a motion recorded at elevation -18.5 m in the free field. The results of this simulation analysis were shown to be in reasonably good agreement with the recorded ones in the form of distributions of ground motions and structural responses, acceleration time histories and related response spectra. These results showed that the 'Lattice Model' approach is an appropriate one for estimating soil-structure interaction effects. (orig./HP)

  5. Scalable Coverage Maintenance for Dense Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jun Lu

    2007-06-01

    Owing to numerous potential applications, wireless sensor networks have been attracting significant research effort recently. The critical challenge that wireless sensor networks often face is to sustain long-term operation on limited battery energy. Coverage maintenance schemes can effectively prolong network lifetime by selecting and employing a subset of sensors in the network to provide sufficient sensing coverage over a target region. We envision future wireless sensor networks composed of a vast number of miniaturized sensors in exceedingly high density. Therefore, the key issue of coverage maintenance for future sensor networks is the scalability to sensor deployment density. In this paper, we propose a novel coverage maintenance scheme, scalable coverage maintenance (SCOM), which is scalable to sensor deployment density in terms of communication overhead (i.e., the number of transmitted and received beacons) and computational complexity (i.e., time and space complexity). In addition, SCOM achieves high energy efficiency and load balancing over different sensors. We have validated our claims through both analysis and simulations.

  6. DISP: Optimizations towards Scalable MPI Startup

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Huansong [Florida State University, Tallahassee; Pophale, Swaroop S [ORNL; Gorentla Venkata, Manjunath [ORNL; Yu, Weikuan [Florida State University, Tallahassee

    2016-01-01

    Despite the popularity of MPI for high performance computing, the startup of MPI programs faces a scalability challenge as both the execution time and memory consumption increase drastically at scale. We have examined this problem using the collective modules of Cheetah and Tuned in Open MPI as representative implementations. Previous improvements for collectives have focused on algorithmic advances and hardware off-load. In this paper, we examine the startup cost of the collective module within a communicator and explore various techniques to improve its efficiency and scalability. Accordingly, we have developed a new scalable startup scheme with three internal techniques, namely Delayed Initialization, Module Sharing and Prediction-based Topology Setup (DISP). Our DISP scheme greatly benefits the collective initialization of the Cheetah module. At the same time, it helps boost the performance of non-collective initialization in the Tuned module. We evaluate the performance of our implementation on Titan supercomputer at ORNL with up to 4096 processes. The results show that our delayed initialization can speed up the startup of Tuned and Cheetah by an average of 32.0% and 29.2%, respectively, our module sharing can reduce the memory consumption of Tuned and Cheetah by up to 24.1% and 83.5%, respectively, and our prediction-based topology setup can speed up the startup of Cheetah by up to 80%.

  7. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since the measurement or data transmission may intermittently fail for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. There exist two main performance measures used in this work: (1) error measures between the actual
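
    The non-imputing strategy can be illustrated with a small sketch of time-delayed phase-space reconstruction that simply drops delay vectors containing missing samples. The embedding dimension and delay below are illustrative choices, and the data are a toy stand-in for a surge series.

```python
import numpy as np

def delay_embed(series, dim=3, tau=2):
    """Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] and drop
    any vector that touches a missing (NaN) sample."""
    n = len(series) - (dim - 1) * tau
    vectors = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    return vectors[~np.isnan(vectors).any(axis=1)]

surge = np.sin(np.linspace(0.0, 20.0, 200))   # toy stand-in for an hourly surge series
surge[50:55] = np.nan                         # simulated gap in the record
states = delay_embed(surge)
print(states.shape)                           # only complete delay vectors remain
```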

  8. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    This paper describes a method for automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that have been demolished or changed since the models were constructed, or identifying wrong models, using the images. The models verified are of CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory. Corresponding variables between building models and oblique images are used for deriving mutual information for individual edges, faces or whole buildings, and combined over all perspective images available for the building. The wireframe model edges are projected to the images and verified using low-level image features – the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images that are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of Microsoft Bing Maps. The results show that nearly all buildings are correctly categorised as existing or demolished. Because we currently concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, especially, can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results for individual edges can be used for improving the 3D building models.
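
    A rough sketch of the underlying information-theoretic test: estimate the mutual information between "pixel lies on a projected model edge" and the observed gradient direction from a joint histogram; high mutual information supports the hypothesis that the modeled building still exists. The data below are synthetic, and the estimator is a generic histogram-based one rather than the paper's exact formulation.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information estimate (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
on_edge = np.repeat([0.0, 1.0], 500)          # 1 = pixel lies on a projected model edge
grad_dir = np.where(on_edge == 1.0,
                    rng.normal(0.0, 0.2, 1000),        # aligned gradients on real edges
                    rng.uniform(-np.pi, np.pi, 1000))  # random gradients elsewhere
print(mutual_information(on_edge, grad_dir))  # clearly above 0 => edge evidence found
```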

  9. Blind Cooperative Routing for Scalable and Energy-Efficient Internet of Things

    KAUST Repository

    Bader, Ahmed; Alouini, Mohamed-Slim

    2016-01-01

    Multihop networking is promoted in this paper for energy-efficient and highly-scalable Internet of Things (IoT). Recognizing concerns related to the scalability of classical multihop routing and medium access techniques, the use of blind cooperation

  10. MODELLING AND SIMULATION MATTERS UPON THE STATIC ANALYSIS OF A BUILDING

    Directory of Open Access Journals (Sweden)

    DUTA Alina

    2017-05-01

    The present paper puts forward a method for the static and stress analysis of a two-level building via finite element analysis in the building construction domain. Prior to this, a strategic issue has to be dealt with, i.e. the construction of a finite element model that best approximates the building structure. The endorsed method replaces the mathematical model, which is more complicated. A central issue that has to be addressed before determining the displacements and stresses is therefore the construction of the finite element model as the best approximation of the building structure.

  11. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  12. Open string model building

    International Nuclear Information System (INIS)

    Ishibashi, Nobuyuki; Onogi, Tetsuya

    1989-01-01

    Consistency conditions of open string theories, which can be a powerful tool in open string model building, are proposed. By making use of these conditions and assuming a simple prescription for the Chan-Paton factors, open string theories in several backgrounds are studied. We show that 1. there exist a large number of consistent bosonic open string theories on Z₂ orbifolds, 2. the SO(32) type I superstring is the unique consistent model among fermionic string theories on ten-dimensional flat Minkowski space, and 3. with our prescription for the Chan-Paton factors, there exist no consistent open superstring theories on (six-dimensional Minkowski space-time) × (Z₂ orbifold). (orig.)

  13. Rate control scheme for consistent video quality in scalable video codec.

    Science.gov (United States)

    Seo, Chan-Won; Han, Jong-Ki; Nguyen, Truong Q

    2011-08-01

    Multimedia data delivered to mobile devices over wireless channels or the Internet are complicated by bandwidth fluctuation and the variety of mobile devices. Scalable video coding has been developed as an extension of H.264/AVC to solve this problem. Since scalable video codec provides various scalabilities to adapt the bitstream for the channel conditions and terminal types, scalable codec is one of the useful codecs for wired or wireless multimedia communication systems, such as IPTV and streaming services. In such scalable multimedia communication systems, video quality fluctuation degrades the visual perception significantly. It is important to efficiently use the target bits in order to maintain a consistent video quality or achieve a small distortion variation throughout the whole video sequence. The scheme proposed in this paper provides a useful function to control video quality in applications supporting scalability, whereas conventional schemes have been proposed to control video quality in the H.264 and MPEG-4 systems. The proposed algorithm decides the quantization parameter of the enhancement layer to maintain a consistent video quality throughout the entire sequence. The video quality of the enhancement layer is controlled based on a closed-form formula which utilizes the residual data and quantization error of the base layer. The simulation results show that the proposed algorithm controls the frame quality of the enhancement layer in a simple operation, where the parameter decision algorithm is applied to each frame.

  14. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  15. Scalable Multi-Platform Distribution of Spatial 3D Contents

    Science.gov (United States)

    Klimke, J.; Hagedorn, B.; Döllner, J.

    2013-09-01

    Virtual 3D city models provide powerful user interfaces for communication of 2D and 3D geoinformation. Providing high quality visualization of massive 3D geoinformation in a scalable, fast, and cost efficient manner is still a challenging task. Especially for mobile and web-based system environments, software and hardware configurations of target systems differ significantly. This makes it hard to provide fast, visually appealing renderings of 3D data throughout a variety of platforms and devices. Current mobile or web-based solutions for 3D visualization usually require raw 3D scene data such as triangle meshes together with textures delivered from server to client, which makes them strongly limited in terms of size and complexity of the models they can handle. In this paper, we introduce a new approach for provisioning of massive, virtual 3D city models on different platforms, namely web browsers, smartphones or tablets, by means of an interactive map assembled from artificial oblique image tiles. The key concept is to synthesize such images of a virtual 3D city model by a 3D rendering service in a preprocessing step. This service encapsulates model handling and 3D rendering techniques for high quality visualization of massive 3D models. By generating image tiles using this service, the 3D rendering process is shifted from the client side, which provides major advantages: (a) the complexity of the 3D city model data is decoupled from data transfer complexity, (b) the implementation of client applications is simplified significantly as 3D rendering is encapsulated on the server side, and (c) 3D city models can be easily deployed for and used by a large number of concurrent users, leading to a high degree of scalability of the overall approach. All core 3D rendering techniques are performed on a dedicated 3D rendering server, and thin-client applications can be compactly implemented for various devices and platforms.

  16. Early experiences building a software quality prediction model

    Science.gov (United States)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  17. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness tools, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components as specified in the original project proposal. The overall project team’s work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contribute to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  18. Towards a Very Low Energy Building Stock: Modeling the U.S. Commercial Building Sector to Support Policy and Innovation Planning

    Energy Technology Data Exchange (ETDEWEB)

    Coffey, Brian; Borgeson, Sam; Selkowitz, Stephen; Apte, Josh; Mathew, Paul; Haves, Philip

    2009-07-01

    This paper describes the origin, structure and continuing development of a model of time varying energy consumption in the US commercial building stock. The model is based on a flexible structure that disaggregates the stock into various categories (e.g. by building type, climate, vintage and life-cycle stage) and assigns attributes to each of these (e.g. floor area and energy use intensity by fuel type and end use), based on historical data and user-defined scenarios for future projections. In addition to supporting the interactive exploration of building stock dynamics, the model has been used to study the likely outcomes of specific policy and innovation scenarios targeting very low future energy consumption in the building stock. Model use has highlighted the scale of the challenge of meeting targets stated by various government and professional bodies, and the importance of considering both new construction and existing buildings.
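
    A highly simplified sketch of the stock-disaggregation idea: total consumption is a sum over categories of floor area times energy use intensity (EUI), and scenarios act on these attributes. The categories and numbers below are placeholders, not the model's actual data.

```python
# (building type, vintage, floor area [million m2], EUI [kWh/m2/yr]) -- placeholders
stock = [
    ("office", "pre-1980",  120.0, 300.0),
    ("office", "post-1980",  80.0, 220.0),
    ("retail", "pre-1980",   60.0, 280.0),
]

def stock_energy_gwh(stock, eui_multiplier=1.0):
    """Aggregate annual energy use under a scenario EUI multiplier."""
    return sum(area * eui * eui_multiplier for _, _, area, eui in stock)

print("baseline:", stock_energy_gwh(stock), "GWh/yr")
print("deep-retrofit scenario:", stock_energy_gwh(stock, 0.4), "GWh/yr")
```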

  19. Thermal Models for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2012-01-01

    The Danish government has set the ambitious goal that the share of the total Danish electricity consumption, covered by wind energy, should be increased to 50% by year 2020. This asks for radical changes in how we utilize and transmit electricity in the future power grid. To fully utilize the high share of renewable power generation, which is in general intermittent and non-controllable, the consumption side has to be much more flexible than today. To achieve such flexibility, methods for moving power consumption in time, within the hourly timescale, have to be developed. One approach currently ... the comfort of residents, proper prediction models for indoor temperature have to be developed. This paper presents a model for prediction of indoor temperature and power consumption from electrical space heating in an office building, using stochastic differential equations. The heat dynamic model is built ...
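
    As a minimal sketch of such a heat-dynamic model (not the paper's actual equations), a first-order stochastic RC model can be Euler-discretised: the indoor temperature relaxes towards ambient through a thermal resistance while the heater injects power, plus a noise term. All parameters are illustrative.

```python
import numpy as np

# Illustrative parameters: thermal resistance R [K/kW], capacity C [kJ/K].
R, C, dt = 5.0, 10000.0, 60.0          # 60 s Euler step
Ta, Q = 0.0, 5.0                       # ambient temperature [C], heater power [kW]
rng = np.random.default_rng(1)

T = 20.0                               # indoor temperature [C]
for _ in range(1000):
    drift = ((Ta - T) / R + Q) / C     # deterministic heat balance [K/s]
    T += drift * dt + 0.005 * np.sqrt(dt) * rng.standard_normal()  # SDE noise term
print(T)                               # relaxes towards Ta + R*Q = 25 C
```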

  20. FITTING OF PARAMETRIC BUILDING MODELS TO OBLIQUE AERIAL IMAGES

    Directory of Open Access Journals (Sweden)

    U. S. Panday

    2012-09-01

    In the literature and in photogrammetric workstations, many approaches and systems to automatically reconstruct buildings from remote sensing data are described and available. Those building models are being used, for instance, in city modeling or in a cadastre context. If a roof overhang is present, the building walls cannot be estimated correctly from nadir-view aerial images or airborne laser scanning (ALS) data. This leads to inconsistent building outlines, which has a negative influence on visual impression, but more seriously also represents a wrong legal boundary in the cadastre. Oblique aerial images, as opposed to nadir-view images, reveal greater detail, enabling different views of an object taken from different directions to be seen. Building walls are directly visible in oblique images, and those images are used for automated roof overhang estimation in this research. A fitting algorithm is employed to find roof parameters of simple buildings. It uses a least squares algorithm to fit projected wire frames to their corresponding edge lines extracted from the images. Self-occlusion is detected based on the intersection of the viewing ray and the planes formed by the building, whereas occlusion from other objects is detected using an ALS point cloud. Overhang and ground height are obtained by sweeping vertical and horizontal planes, respectively. Experimental results are verified with high resolution ortho-images, field survey, and ALS data. A planimetric accuracy of 1 cm mean and 5 cm standard deviation was obtained, while buildings' orientations were accurate to a mean of 0.23° and standard deviation of 0.96° with ortho-images. Overhang parameters were aligned to approximately 10 cm with the field survey. The ground and roof heights were accurate to means of –9 cm and 8 cm with standard deviations of 16 cm and 8 cm with ALS, respectively. The developed approach reconstructs 3D building models well in cases of sufficient texture. More images should be acquired for

  1. A financing model to solve financial barriers for implementing green building projects.

    Science.gov (United States)

    Lee, Sanghyo; Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the effect of greenhouse gas energy reduction from implementing green buildings is gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may have financial barriers that have high initial construction costs and uncertainties about future project value. Under the circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. This study presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem to be feasible as well because of the promising value of the guarantee from CER. To do this, certification of Clean Development Mechanisms (CDMs) for green buildings must be obtained.

  2. Bibliography for the Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  3. Protocol to Manage Heritage-Building Interventions Using Heritage Building Information Modelling (HBIM

    Directory of Open Access Journals (Sweden)

    Isabel Jordan-Palomar

    2018-03-01

    The workflow in historic architecture projects presents problems related to the lack of clarity of processes, dispersion of information and the use of outdated tools. Different heritage organisations have shown interest in innovative methods to resolve those problems and improve cultural tourism for sustainable economic development. Building Information Modelling (BIM) has emerged as a suitable computerised system for improving heritage management. Its application to historic buildings is named Historic BIM (HBIM). The HBIM literature highlights the need for further research in terms of the overall processes of heritage projects, its practical implementation and a need for better cultural documentation. This work uses Design Science Research to develop a protocol to improve the workflow in heritage interdisciplinary projects. Research techniques used include documentary analysis, semi-structured interviews and focus groups. HBIM is proposed as a virtual model that will hold heritage data and will articulate processes. As a result, a simple and visual HBIM protocol was developed and applied in a real case study. The protocol was named BIMlegacy and it is divided into eight phases: building registration, determining intervention options, developing the design for intervention, planning the physical intervention, physical intervention, handover, maintenance and culture dissemination. It contemplates all the stakeholders involved.

  4. Scalable electrophysiology in intact small animals with nanoscale suspended electrode arrays

    Science.gov (United States)

    Gonzales, Daniel L.; Badhiwala, Krishna N.; Vercosa, Daniel G.; Avants, Benjamin W.; Liu, Zheng; Zhong, Weiwei; Robinson, Jacob T.

    2017-07-01

    Electrical measurements from large populations of animals would help reveal fundamental properties of the nervous system and neurological diseases. Small invertebrates are ideal for these large-scale studies; however, patch-clamp electrophysiology in microscopic animals typically requires invasive dissections and is low-throughput. To overcome these limitations, we present nano-SPEARs: suspended electrodes integrated into a scalable microfluidic device. Using this technology, we have made the first extracellular recordings of body-wall muscle electrophysiology inside an intact roundworm, Caenorhabditis elegans. We can also use nano-SPEARs to record from multiple animals in parallel and even from other species, such as Hydra littoralis. Furthermore, we use nano-SPEARs to establish the first electrophysiological phenotypes for C. elegans models for amyotrophic lateral sclerosis and Parkinson's disease, and show a partial rescue of the Parkinson's phenotype through drug treatment. These results demonstrate that nano-SPEARs provide the core technology for microchips that enable scalable, in vivo studies of neurobiology and neurological diseases.

  5. Regulatory odour model development: Survey of modelling tools and datasets with focus on building effects

    DEFF Research Database (Denmark)

    Olesen, H. R.; Løfstrøm, P.; Berkowicz, R.

    A project within the framework of a larger research programme, Action Plan for the Aquatic Environment III (VMP III), aims towards improving an atmospheric dispersion model (OML). The OML model is used for regulatory applications in Denmark, and it is the candidate model to be used also in future ... dispersion models for estimating local concentration levels in general. However, the report focuses on some particular issues, which are relevant for subsequent work on odour due to animal production. An issue of primary concern is the effect that buildings (stables) have on flow and dispersion. The handling of building effects is a complicated problem, and a major part of the report is devoted to the treatment of building effects in dispersion models.

  6. Scalable and Resilient Middleware to Handle Information Exchange during Environment Crisis

    Science.gov (United States)

    Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.

    2012-04-01

    The EU FP7 TRIDEC project focuses on enabling real-time, intelligent information management for collaborative, complex, critical decision processes in earth management. A key challenge is to provide a communication infrastructure that facilitates interoperable environment information services during environmental events and crises such as tsunamis and drilling, during which increasing volumes and dimensionality of disparate information sources, including sensor-based and human-based ones, arise and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging to handle changing clients, such as new and retired automated systems and human information sources coming online or going offline; flexible data filtering; and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to ICT system failures, e.g. failure, degradation and overloads, during environmental events. There are several system middleware choices for TRIDEC based upon a Service-Oriented Architecture (SOA), an Event-Driven Architecture (EDA), Cloud Computing, and an Enterprise Service Bus (ESB). In an SOA, everything is a service (e.g. data access, processing and exchange); clients can request on demand or subscribe to services registered by providers; more often, interaction is synchronous. In an EDA system, events that represent significant changes in state can be processed simply, as streams, or in more complex ways. Cloud computing is a virtualized, interoperable and elastic resource allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has inbuilt resilience against ICT failure. Our middleware proposal is an ESB-based hybrid architecture model: an SOA extension supports more synchronous workflows; EDA assists the ESB in handling more complex event processing; Cloud computing can be used to increase and

  7. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture design that offers an approach to meeting these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus of this paper is to describe the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states, and these states are stored in digital memory to enable the desired neural architectures. STM enables our proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.

  8. Development of surrogate models using artificial neural network for building shell energy labelling

    International Nuclear Information System (INIS)

    Melo, A.P.; Cóstola, D.; Lamberts, R.; Hensen, J.L.M.

    2014-01-01

    Surrogate models are an important part of building energy labelling programs, but these models still present low accuracy, particularly in cooling-dominated climates. The objective of this study was to evaluate the feasibility of using an artificial neural network (ANN) to improve the accuracy of surrogate models for labelling purposes. An ANN was applied to model the building stock of a city in Brazil, based on the results of extensive simulations using the high-resolution building energy simulation program EnergyPlus. Sensitivity and uncertainty analyses were carried out to evaluate the behaviour of the ANN model, and the variations in the best and worst performance for several typologies were analysed in relation to variations in the input parameters and building characteristics. The results obtained indicate that an ANN can represent the interaction between input and output data for a vast and diverse building stock. Sensitivity analysis showed that no single input parameter can be identified as the main factor responsible for the building energy performance. The uncertainty associated with several parameters plays a major role in assessing building energy performance, together with the facade area and the shell-to-floor ratio. The results of this study may have a profound impact as ANNs could be applied in the future to define regulations in many countries, with positive effects on optimizing the energy consumption. - Highlights: • We model several typologies which have variation in input parameters. • We evaluate the accuracy of surrogate models for labelling purposes. • ANN is applied to model the building stock. • Uncertainty in building plays a major role in the building energy performance. • Results show that ANN could help to develop building energy labelling systems
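
    A minimal sketch of the surrogate idea: train an ANN on samples that stand in for EnergyPlus simulation results, then evaluate it on held-out cases. The input parameters, data-generating function and network architecture below are placeholders, not the study's.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-ins for EnergyPlus runs: 4 normalised building parameters
# (e.g. facade area, shell-to-floor ratio, ...) -> annual energy use [kWh/m2].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(2000, 4))
y = (50 + 80 * X[:, 0] + 40 * X[:, 1] ** 2
     - 30 * X[:, 2] * X[:, 3] + rng.normal(0.0, 2.0, 2000))

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X[:1500], y[:1500])                    # train on "simulated" stock
print("held-out R^2:", surrogate.score(X[1500:], y[1500:]))
```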

  9. Structural observability analysis and EKF based parameter estimation of building heating models

    Directory of Open Access Journals (Sweden)

    D.W.U. Perera

    2016-07-01

    Research on enhanced energy-efficient buildings has received much recognition in recent years owing to buildings' high energy consumption. Increasing energy needs can be precisely controlled by employing advanced controllers for building Heating, Ventilation, and Air-Conditioning (HVAC) systems. Advanced controllers require a mathematical building heating model to operate, and these models need to be accurate and computationally efficient. One main concern associated with such models is the accurate estimation of the unknown model parameters. This paper presents the feasibility of implementing a simplified building heating model and the computation of its physical parameters using an off-line approach. Structural observability analysis is conducted using graph-theoretic techniques to analyze the observability of the developed system model. Then the Extended Kalman Filter (EKF) algorithm is utilized for parameter estimation using real measurements of a single-zone building. The simulation-based results confirm that even with a simple model, the EKF follows the state variables accurately. The predicted parameters vary depending on the inputs and disturbances.
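
    The parameter-estimation step can be sketched with an augmented-state EKF: an unknown thermal resistance is appended to the state vector of a one-zone heating model and estimated from noisy temperature measurements. The model structure and all numbers are illustrative assumptions, not the paper's.

```python
import numpy as np

# One-zone model: dT/dt = ((Ta - T)/R + Q)/C, with unknown R appended to the state.
dt, C, Q, Ta, R_true = 600.0, 1e4, 5.0, 0.0, 5.0
rng = np.random.default_rng(0)

def f(x):                              # process model for the augmented state [T, R]
    T, R = x
    return np.array([T + dt * ((Ta - T) / R + Q) / C, R])

def F(x):                              # Jacobian of f
    T, R = x
    return np.array([[1.0 - dt / (R * C), -dt * (Ta - T) / (R**2 * C)],
                     [0.0, 1.0]])

H = np.array([[1.0, 0.0]])             # only temperature is measured
Qn, Rn = np.diag([1e-4, 1e-6]), np.array([[0.05]])
x, P = np.array([15.0, 2.0]), np.diag([1.0, 4.0])   # deliberately poor guess for R

T_plant = 15.0
for _ in range(500):
    T_plant += dt * ((Ta - T_plant) / R_true + Q) / C
    z = T_plant + 0.2 * rng.standard_normal()        # noisy sensor reading
    x, Phi = f(x), F(x)                              # predict (Jacobian at prior x)
    P = Phi @ P @ Phi.T + Qn
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + Rn)    # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)              # measurement update
    P = (np.eye(2) - K @ H) @ P
print("estimated R:", round(x[1], 2), "true R:", R_true)
```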

  10. Using Python to Construct a Scalable Parallel Nonlinear Wave Solver

    KAUST Repository

    Mandli, Kyle

    2011-01-01

    Computational scientists seek to provide efficient, easy-to-use tools and frameworks that enable application scientists within a specific discipline to build and/or apply numerical models with up-to-date computing technologies that can be executed on all available computing systems. Although many tools could be useful for groups beyond a specific application, it is often difficult and time consuming to combine existing software, or to adapt it for a more general purpose. Python enables a high-level approach where a general framework can be supplemented with tools written for different fields and in different languages. This is particularly important when a large number of tools are necessary, as is the case for high performance scientific codes. This motivated our development of PetClaw, a scalable distributed-memory solver for time-dependent nonlinear wave propagation, as a case study of how Python can be used as a high-level framework leveraging a multitude of codes, efficient both in the reuse of code and in programmer productivity. We present scaling results for computations on up to four racks of Shaheen, an IBM BlueGene/P supercomputer at King Abdullah University of Science and Technology. One particularly important issue that PetClaw has faced is the overhead associated with dynamic loading, which leads to catastrophic scaling. We solve this issue with the walla library, which supplants high-cost filesystem calls with MPI operations at a low enough level that developers may avoid any changes to their codes.

  11. JEDDAH HISTORICAL BUILDING INFORMATION MODELING "JHBIM" OLD JEDDAH – SAUDI ARABIA

    Directory of Open Access Journals (Sweden)

    A. Baik

    2013-07-01

    The historic city of Jeddah faces serious issues in the conservation, documentation and recording of its valuable building stock. Terrestrial Laser Scanning and Architectural Photogrammetry have already been used at many heritage sites around the world. The integration of heritage recording and Building Information Modelling (BIM) has been introduced as HBIM and is now a method to document and manage these buildings. In the last decade many traditional surveying methods were used to record the buildings in Old Jeddah. However, these methods take a long time, can sometimes provide unreliable information and often lack completeness. This paper will look at another approach to heritage recording, using Jeddah Historical Building Information Modelling (JHBIM).

  12. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    Yiming Yan

    2017-01-01

    In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods overrely on the completeness of the offline-constructed building models, and completeness is not easily guaranteed since buildings in modern cities can be of a variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings was introduced. There are two key problems with this framework. The first one is how to accurately extract the buildings from the DSM. Most segmentation methods are limited by either terrain factors or the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and then a recently proposed 'occlusions of random textures model' is used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour guided interpolation of building facades. The 3D reconstruction results achieved by airborne-like images and satellites are compared. Experiments show that the segmentation method has good performance, 3D reconstruction is easily performed by our framework, and better visualization results can be obtained with airborne-like images, which can be further replaced by UAV images.

  13. Temporal Scalability of Dynamic Volume Data using Mesh Compensated Wavelet Lifting.

    Science.gov (United States)

    Schnurrer, Wolfgang; Pallast, Niklas; Richter, Thomas; Kaup, Andre

    2017-10-12

    Due to their high resolution, dynamic medical 2D+t and 3D+t volumes from computed tomography (CT) and magnetic resonance tomography (MR) reach a size which makes them very unwieldy for teleradiologic applications. A lossless scalable representation offers the advantage of a down-scaled version which can be used for orientation or previewing, while the remaining information for reconstructing the full resolution is transmitted on demand. The wavelet transform offers the desired scalability. A very high quality of the lowpass sub-band is crucial in order to use it as a down-scaled representation. We propose an approach based on compensated wavelet lifting for obtaining a scalable representation of dynamic CT and MR volumes with very high quality. Mesh compensation is well suited to model the displacement in dynamic volumes, which is mainly given by expansion and contraction of tissue over time. To achieve this, we propose an optimized estimation of the mesh compensation parameters to optimally fit dynamic volumes. Within the lifting structure, the inversion of the motion compensation is crucial in the update step. We propose to take this inversion directly into account during the estimation step, and can thereby improve the quality of the lowpass sub-band by 0.63 dB and 0.43 dB on average for our tested dynamic CT and MR volumes, at the cost of a rate increase of 2.4% and 1.2% on average.
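
    The lifting structure itself can be shown in a few lines. The sketch below applies a plain Haar lifting along the time axis (predict, then update), omitting the mesh compensation that the paper inserts into the predict/update steps; it still demonstrates the perfect-reconstruction property that makes the representation scalable and lossless.

```python
import numpy as np

def haar_lift_time(frames):
    """frames: array of shape (T, ...) with even T. Returns (lowpass, highpass)."""
    even, odd = frames[0::2].astype(float), frames[1::2].astype(float)
    high = odd - even                  # predict step: residual of odd frames
    low = even + high / 2.0            # update step: lowpass = (even + odd) / 2
    return low, high

def haar_unlift_time(low, high):
    even = low - high / 2.0            # invert update
    odd = high + even                  # invert predict
    out = np.empty((2 * even.shape[0],) + even.shape[1:])
    out[0::2], out[1::2] = even, odd
    return out

vol = np.random.default_rng(0).random((8, 16, 16, 16))   # toy 3D+t volume
low, high = haar_lift_time(vol)
print(np.allclose(haar_unlift_time(low, high), vol))      # True: perfect reconstruction
```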

  14. Modelling of settlement induced building damage

    NARCIS (Netherlands)

    Giardina, G.

    2013-01-01

    This thesis focuses on the modelling of settlement-induced damage to masonry buildings. In densely populated areas, the need for new space is producing a rapid increase in underground excavations. Due to the construction of new metro lines, tunnelling activity in urban areas is growing.

  15. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    Model Predictive Control (MPC) of building systems is a promising approach to optimize building energy performance. In contrast to traditional control strategies which are reactive in nature, MPC optimizes the utilization of resources based on the predicted effects. It has been shown that energy ...

  16. A Financing Model to Solve Financial Barriers for Implementing Green Building Projects

    Science.gov (United States)

    Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the effect of greenhouse gas energy reduction from implementing green buildings is gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may have financial barriers that have high initial construction costs and uncertainties about future project value. Under the circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. This study presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem to be feasible as well because of the promising value of the guarantee from CER. To do this, certification of Clean Development Mechanisms (CDMs) for green buildings must be obtained. PMID:24376379

  17. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    International Nuclear Information System (INIS)

    Jang, Minho; Hong, Taehoon; Ji, Changyoon

    2015-01-01

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings

  18. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach

    International Nuclear Information System (INIS)

    Lü, Xiaoshu; Lu, Tao; Kibert, Charles J.; Viljanen, Martti

    2015-01-01

    Highlights: • This paper presents a new modeling method to forecast energy demands. • The model is based on physical–statistical approach to improving forecast accuracy. • A new method is proposed to address the heterogeneity challenge. • Comparison with measurements shows accurate forecasts of the model. • The first physical–statistical/heterogeneous building energy modeling approach is proposed and validated. - Abstract: Energy consumption forecasting is a critical and necessary input to planning and controlling energy usage in the building sector which accounts for 40% of the world’s energy use and the world’s greatest fraction of greenhouse gas emissions. However, due to the diversity and complexity of buildings as well as the random nature of weather conditions, energy consumption and loads are stochastic and difficult to predict. This paper presents a new methodology for energy demand forecasting that addresses the heterogeneity challenges in energy modeling of buildings. The new method is based on a physical–statistical approach designed to account for building heterogeneity to improve forecast accuracy. The physical model provides a theoretical input to characterize the underlying physical mechanism of energy flows. Then stochastic parameters are introduced into the physical model and the statistical time series model is formulated to reflect model uncertainties and individual heterogeneity in buildings. A new method of model generalization based on a convex hull technique is further derived to parameterize the individual-level model parameters for consistent model coefficients while maintaining satisfactory modeling accuracy for heterogeneous buildings. The proposed method and its validation are presented in detail for four different sports buildings with field measurements. The results show that the proposed methodology and model can provide a considerable improvement in forecasting accuracy
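
    To make the physical–statistical idea concrete, here is a minimal single-building sketch (with synthetic data and names of our own choosing, not the paper's model): a steady-state heat-balance term whose heat-loss coefficient UA is estimated physically, plus an AR(1) statistical model of the residuals that absorbs unexplained dynamics. In the paper, such individual-level parameters are further generalized across heterogeneous buildings with a convex hull technique, which this sketch omits:

```python
import numpy as np

# synthetic data: outdoor temperature and measured heating load for one building
rng = np.random.default_rng(0)
t_out = 5.0 + 10.0 * np.sin(np.linspace(0.0, 20.0, 500))
t_in = 21.0                                   # assumed constant setpoint
load = 0.35 * (t_in - t_out) + rng.normal(0.0, 0.3, t_out.size)  # kW

# physical part: estimate the heat-loss coefficient UA from a steady-state balance
dT = t_in - t_out
ua_hat = np.sum(load * dT) / np.sum(dT * dT)  # least squares through the origin
resid = load - ua_hat * dT

# statistical part: an AR(1) model of the residuals absorbs unexplained dynamics
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)

# one-step-ahead forecast = physical prediction + statistical correction
forecast = ua_hat * dT[1:] + phi * resid[:-1]
print(ua_hat, phi, float(np.mean((forecast - load[1:]) ** 2)))
```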

  19. Boxes of Model Building and Visualization.

    Science.gov (United States)

    Turk, Dušan

    2017-01-01

    Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now determine an average single structure better than ever. The tools work better, the demands on the human brain are lowered, and the frontier of intellectual and scientific challenges has moved on. The quest for resolution of new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.

  20. Scalable Integrated Region-Based Image Retrieval Using IRM and Statistical Clustering.

    Science.gov (United States)

    Wang, James Z.; Du, Yanping

    Statistical clustering is critical in designing scalable image retrieval systems. This paper presents a scalable algorithm for indexing and retrieving images based on region segmentation. The method uses statistical clustering on region features and IRM (Integrated Region Matching), a measure developed to evaluate overall similarity between images…

  1. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Karl Grau; Rode, Carsten

    2009-01-01

    cavity such as behind the exterior cladding of a building envelope, i.e. a flow which is parallel to the construction plane. (2) Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The paper presents the models and how they have

  2. Integration of Models of Building Interiors with Cadastral Data

    OpenAIRE

    Gotlib Dariusz; Karabin Marcin

    2017-01-01

    Demand for applications which use models of building interiors is growing and highly diversified. Those models are applied at the stage of designing and construction of a building, in applications which support real estate management, in navigation and marketing systems and, finally, in crisis management and security systems. They are created on the basis of different data: architectural and construction plans, both in the analogue form as well as CAD files, BIM data files, by means of las...

  3. Models for map building and navigation

    International Nuclear Information System (INIS)

    Penna, M.A.; Jian Wu

    1993-01-01

    In this paper the authors present several models for solving map building and navigation problems. These models are motivated by biological processes, and presented in the context of artificial neural networks. Since the nodes, weights, and threshold functions of the models all have physical meanings, they can easily predict network topologies and avoid traditional trial-and-error training. On one hand, this makes their models useful in constructing solutions to engineering problems (problems such as those that occur in robotics, for example). On the other hand, this might also contribute to the ability of their models to explain some biological processes, few of which are completely understood at this time

  4. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches

    Directory of Open Access Journals (Sweden)

    Mojtaba Valinejad Shoubi

    2015-03-01

    Full Text Available A sustainable building is constructed of materials that can decrease environmental impacts, such as energy usage, during the lifecycle of the building. Building Information Modeling (BIM) has been identified as an effective tool for virtual building performance analysis in the design stage. The main aims of this study were to assess various combinations of materials using BIM and identify alternative, sustainable solutions to reduce operational energy consumption. The energy consumed by a double-story bungalow house in Johor, Malaysia, was evaluated using Revit Architecture 2012 and Autodesk Ecotect Analysis, and alternative material configurations were assessed to determine which materials reduced the operational energy use of the building most over its annual life cycle. Finally, some alternative, sustainable designs offering energy savings are suggested.

  5. Things That Squeak and Make You Feel Bad: Building Scalable User Experience Programs for Space Assessment

    Directory of Open Access Journals (Sweden)

    Rebecca Kuglitsch

    2018-04-01

    Full Text Available This article suggests a process for creating a user experience (UX) space assessment program that requires limited resources and minimal prior UX experience. By beginning with small-scale methods, like comment boxes and easel prompts, librarians can overturn false assumptions about user behaviors, ground deeper investigations such as focus groups, and generate momentum. At the same time, these methods should feed into larger efforts to build trust and interest with peers and administration, laying the groundwork for more in-depth space UX assessment and more significant changes. The process and approach we suggest can be scaled for use in both large and small library systems. Developing a user experience space assessment program can seem overwhelming, especially without a dedicated user experience librarian or department, but it does not have to be. In this piece, we explore how to scale and sequence small UX projects, communicate UX practices and results to stakeholders, and build support in order to develop an intentional but still manageable space assessment program. Our approach takes advantage of our institutional context, a large academic library system with several branch locations, which allowed us to pilot projects at different scales. We were able to coordinate across a complex multi-site system, as well as in branch libraries with a staffing model analogous to libraries at smaller institutions. This gives us confidence that our methods can be applied at libraries of different sizes. As subject librarians who served as co-coordinators of a UX team on a voluntary basis, we also confronted the question of how we could attend to user needs while staying on top of our regular workload. Haphazard experimentation is unsatisfying and wasteful, particularly when there is limited time, so we sought to develop a process we could implement that applied approachable, purposeful UX space assessments while building trust and buy-in with colleagues.

  6. Building 235-F Goldsim Fate And Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, G. A.; Phifer, M. A.

    2012-09-14

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate, relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to have basically minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium; however, after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met.

  7. Building 235-F Goldsim Fate And Transport Model

    International Nuclear Information System (INIS)

    Taylor, G. A.; Phifer, M. A.

    2012-01-01

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate, relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to have basically minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium; however, after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met

  8. Scalable fast multipole accelerated vortex methods

    KAUST Repository

    Hu, Qi; Gumerov, Nail A.; Yokota, Rio; Barba, Lorena A.; Duraiswami, Ramani

    2014-01-01

    -node communication and load balance efficiently, with only a small parallel construction overhead. This algorithm can scale to large-sized clusters showing both strong and weak scalability. Careful error and timing trade-off analysis are also performed for the cutoff

  9. Internet of Things building blocks and business models

    CERN Document Server

    Hussain, Fatima

    2017-01-01

    This book describes the building blocks and introductory business models for the Internet of Things (IoT). The author provides an overview of the entire IoT architecture and its constituent layers, followed by a detailed description of each block. Various interconnecting technologies and sensors are discussed in the context of IoT networks. In addition, the concepts of Big Data and Fog Computing are presented and characterized in terms of the data generated by versatile IoT applications. A smart parking system and context-aware services are presented as a hybrid model of cloud and fog. Afterwards, various IoT applications and their respective business models are discussed. Finally, the author summarizes the IoT building blocks, identifies research issues in each, and suggests potential research projects worth pursuing.

  10. Heterotic SO(32) model building in four dimensions

    International Nuclear Information System (INIS)

    Choi, K.S.; Groot Nibbelink, S.; Minnesota Univ., Minneapolis, MN; Trapletti, M.

    2004-10-01

    Four dimensional heterotic SO(32) orbifold models are classified systematically with model building applications in mind. We obtain all Z_3, Z_7 and Z_{2N} models based on vectorial gauge shifts. The resulting gauge groups are reminiscent of those of type-I model building, as they always take the form SO(2n_0) x U(n_1) x ... x U(n_{N-1}) x SO(2n_N). The complete twisted spectrum is determined simultaneously for all orbifold models in a parametric way depending on n_0, ..., n_N, rather than on a model by model basis. This reveals interesting patterns in the twisted states: They are always built out of vectors and anti-symmetric tensors of the U(n) groups, and either vectors or spinors of the SO(2n) groups. Our results may shed additional light on the S-duality between heterotic and type-I strings in four dimensions. As a spin-off we obtain an SO(10) GUT model with four generations from the Z_4 orbifold. (orig.)

  11. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    Science.gov (United States)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase of resolution, remote sensing images carry a higher information load, more noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a building extraction method for high-resolution remote sensing images based on a Markov model. The method introduces Contourlet-domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the building area. Through multi-scale segmentation and extraction of image features, fine extraction from the building area down to the individual building is realized. Experiments show that this method can suppress the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture information, and remove shadow, vegetation and other pseudo-building information; compared with traditional pixel-level image information extraction, it performs better in building extraction precision, accuracy and completeness.
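
    Markov models of this kind encode the prior that neighbouring pixels tend to share a label. As an illustrative sketch only (plain Ising-prior smoothing via iterated conditional modes, not the paper's Contourlet-based method), the following cleans up a per-pixel building-probability map produced by any classifier:

```python
import numpy as np

def icm_building_mask(p_building, beta=1.5, n_iter=5):
    """Smooth a per-pixel building probability map with an Ising prior via ICM."""
    eps = 1e-6
    # unary costs: negative log-likelihood for labels 0 (background), 1 (building)
    unary = np.stack([-np.log(1.0 - p_building + eps),
                      -np.log(p_building + eps)])
    labels = (p_building > 0.5).astype(np.int64)
    for _ in range(n_iter):
        padded = np.pad(labels, 1, mode="edge")
        # number of 4-neighbours currently labelled "building"
        n_building = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:])
        cost0 = unary[0] + beta * n_building        # disagreements if pixel -> 0
        cost1 = unary[1] + beta * (4 - n_building)  # disagreements if pixel -> 1
        labels = (cost1 < cost0).astype(np.int64)
    return labels

# usage with a hypothetical classifier output (values in (0, 1))
p = np.random.default_rng(0).uniform(size=(64, 64))
mask = icm_building_mask(p)
```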

  12. Scalable Resolution Display Walls

    KAUST Repository

    Leigh, Jason; Johnson, Andrew; Renambot, Luc; Peterka, Tom; Jeong, Byungil; Sandin, Daniel J.; Talandis, Jonas; Jagodic, Ratko; Nam, Sungwon; Hur, Hyejung; Sun, Yiwen

    2013-01-01

    This article will describe the progress since 2000 on research and development in 2-D and 3-D scalable resolution display walls that are built from tiling individual lower resolution flat panel displays. The article will describe approaches and trends in display hardware construction, middleware architecture, and user-interaction design. The article will also highlight examples of use cases and the benefits the technology has brought to their respective disciplines. © 1963-2012 IEEE.

  13. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Minho, E-mail: minmin40@hanmail.net [Asset Management Division, Mate Plus Co., Ltd., 9th Fl., Financial News Bldg. 24-5 Yeouido-dong, Yeongdeungpo-gu, Seoul, 150-877 (Korea, Republic of); Hong, Taehoon, E-mail: hong7@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of); Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of)

    2015-01-15

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.

  14. The Guatemala-Penn Partners: An Innovative Inter-Institutional Model for Scientific Capacity-Building, Healthcare Education, and Public Health

    Science.gov (United States)

    Paniagua-Avila, Maria Alejandra; Messenger, Elizabeth; Nelson, Caroline A.; Calgua, Erwin; Barg, Frances K.; Bream, Kent W.; Compher, Charlene; Dean, Anthony J.; Martinez-Siekavizza, Sergio; Puac-Polanco, Victor; Richmond, Therese S.; Roth, Rudolf R.; Branas, Charles C.

    2017-01-01

    Population health outcomes are directly related to robust public health programs, access to basic health services, and a well-trained health-care workforce. Effective health services need to systematically identify solutions, scientifically test these solutions, and share generated knowledge. The World Health Organization (WHO)’s Global Healthcare Workforce Alliance states that the capacity to perform research is an essential factor for well-functioning public health systems. Low- and middle-income countries have greater health-care worker shortages and lower research capacity than higher-income countries. International global health partnerships between higher-income countries and low-middle-income countries aim to directly address such inequalities through capacity building, a process by which human and institutional resources are strengthened and developed, allowing them to perform high-level functions, solve complex problems, and achieve important objectives. The Guatemala–Penn Partners (GPP) is a collaboration among academic centers in Guatemala and the University of Pennsylvania (Penn), in Philadelphia, Pennsylvania that echoes the vision of the WHO’s Global Healthcare Workforce Alliance. This article describes the historical development and present organization of the GPP according to its three guiding principles: university-to-university connections, dual autonomies with locally led capacity building, and mutually beneficial exchanges. It describes the GPP activities within the domains of science, health-care education, and public health, emphasizing implementation factors, such as sustainability and scalability, in relation to the guiding principles. Successes and limitations of this innovative model are also analyzed in the hope that the lessons learned may be applied to similar partnerships across the globe. PMID:28443274

  15. Integration of Models of Building Interiors with Cadastral Data

    Science.gov (United States)

    Gotlib, Dariusz; Karabin, Marcin

    2017-12-01

    Demand for applications which use models of building interiors is growing and highly diversified. Those models are applied at the stage of designing and construction of a building, in applications which support real estate management, in navigation and marketing systems and, finally, in crisis management and security systems. They are created on the basis of different data: architectural and construction plans, both in the analogue form as well as CAD files, BIM data files, by means of laser scanning (TLS) and conventional surveys. In this context, the issue of searching for solutions which would integrate the existing models and eliminate data redundancy is becoming more important. The authors analysed the possible input of cadastral data (the legal extent of premises) at the stage of creating and updating different models of building interiors. The paper focuses on one issue - the way of describing the geometry of premises based on the most popular source data, i.e. architectural and construction plans. However, the described rules may be considered universal and may also be applied in practice during the creation and updating of indoor models based on BIM datasets or laser scanning point clouds.

  16. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data

    Directory of Open Access Journals (Sweden)

    Jaewook Jung

    2017-03-01

    Full Text Available With rapid urbanization, highly accurate and semantically rich virtualization of building assets in 3D becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city-scale from remotely sensed data. However, developing a fully-automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks for 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city-scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of the building boundary in a progressive manner. This study covers a full chain of 3D building modeling from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested over the International

  17. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data.

    Science.gov (United States)

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-03-19

    With rapid urbanization, highly accurate and semantically rich virtualization of building assets in 3D becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city-scale from remotely sensed data. However, developing a fully-automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks for 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city-scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of the building boundary in a progressive manner. This study covers a full chain of 3D building modeling from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested over the International Society for
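
    The element clustering step described above groups building-labelled points by height and plane similarity before any modeling cues are extracted. A minimal sketch of that first step only (our own simplification with a hypothetical input file, using DBSCAN for the height grouping and an SVD plane fit; not the paper's full pipeline):

```python
import numpy as np
from sklearn.cluster import DBSCAN

pts = np.load("building_points.npy")  # hypothetical (N, 3) building-labelled ALS points

# height similarity: group points whose z-values are close
height_labels = DBSCAN(eps=0.3, min_samples=20).fit_predict(pts[:, 2:3])

# plane similarity: within each height group, fit a plane and keep near-planar points
for c in set(height_labels) - {-1}:           # -1 marks DBSCAN noise
    grp = pts[height_labels == c]
    centered = grp - grp.mean(axis=0)
    # plane normal = direction of least variance (last right-singular vector)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    dist = np.abs(centered @ vt[-1])
    roof_plane = grp[dist < 0.1]              # points within 10 cm of the plane
    # ... extract outer boundaries, intersection lines and step lines from here
```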

  18. Models for describing the thermal characteristics of building components

    DEFF Research Database (Denmark)

    Jimenez, M.J.; Madsen, Henrik

    2008-01-01

    This paper presents an overview of models that can be applied for modelling the thermal characteristics of buildings and building components using data from outdoor testing. For the analysis of these tests, dynamic analysis models and methods are required. However, a wide variety of models and methods exists, and the problem of choosing the most appropriate approach for each particular case is a non-trivial and interdisciplinary task. Knowledge of a large family of these approaches may therefore be very useful for selecting a suitable approach for each particular case. The characteristics of each type of model are highlighted. Some available software tools for each of the methods described will be mentioned. A case study also demonstrating the difference between linear and nonlinear models is considered. The choice of approach depends, for example, ...

  19. BUILDING INFORMATION MODELS FOR MONITORING AND SIMULATION DATA IN HERITAGE BUILDINGS

    Directory of Open Access Journals (Sweden)

    D. P. Pocobelli

    2018-05-01

    Full Text Available This paper analyses the use of BIM in heritage buildings, assessing the state of the art and finding paths for further development. Specifically, this work is part of a broader project whose final aim is to support stakeholders through BIM. Given that humidity is one of the major causes of weathering, being able to detect, depict and forecast it is a key task. A BIM model of a heritage building – enhanced with the integration of a weathering forecasting model – will be able to give detailed information on possible degradation patterns and when they will happen. This information can be effectively used to plan both ordinary and extraordinary maintenance. The Jewel Tower in London, our case study, is digitised using combined laser scanning and photogrammetry, and a virtual model is produced. The point cloud derived from combined laser scanning and photogrammetry is traced out with Autodesk Revit, where the main volumetry (gross walls and floors) is created with parametric objects. Surface characterisation of the façade is given through renderings. Specifically, new rendering materials have been created for this purpose, based on rectified photos of the Tower. The model is then integrated with moisture data, organised in spreadsheets and linked to the model via parametric objects representing the points where measurements had previously been taken. The spatial distribution of moisture is then depicted using Dynamo. This simple exercise demonstrates the potential Dynamo has for condition reporting, and future work will concentrate on the creation of a complex forecasting model to be linked through it.

  20. Evaluation of 3D printed anatomically scalable transfemoral prosthetic knee.

    Science.gov (United States)

    Ramakrishnan, Tyagi; Schlafly, Millicent; Reed, Kyle B

    2017-07-01

    This case study compares a transfemoral amputee's gait while using the existing Ossur Total Knee 2000 and our novel 3D printed anatomically scalable transfemoral prosthetic knee. The anatomically scalable transfemoral prosthetic knee is 3D printed out of a carbon-fiber and nylon composite and has a gear-mesh coupling with a hard-stop weight-actuated locking mechanism aided by a cross-linked four-bar spring mechanism. This design can be scaled using anatomical dimensions of a human femur and tibia to provide a unique fit for each user. The transfemoral amputee who was tested is high functioning and walked on the Computer Assisted Rehabilitation Environment (CAREN) at a self-selected pace. The motion capture and force data that were collected showed distinct differences in the gait dynamics. The data were used to compute the Combined Gait Asymmetry Metric (CGAM), whose scores revealed that gait on the Ossur Total Knee was overall more asymmetric than on the anatomically scalable transfemoral prosthetic knee. The anatomically scalable transfemoral prosthetic knee had higher peak knee flexion, which caused a large step time asymmetry. This made walking on the anatomically scalable transfemoral prosthetic knee more strenuous due to the compensatory movements in adapting to the different dynamics. This can be overcome by tuning the cross-linked spring mechanism to better emulate the dynamics of the subject. The subject stated that the knee would be good for daily use and has the potential to be adapted as a running knee.

  1. Scalable optical switches for computing applications

    NARCIS (Netherlands)

    White, I.H.; Aw, E.T.; Williams, K.A.; Wang, Haibo; Wonfor, A.; Penty, R.V.

    2009-01-01

    A scalable photonic interconnection network architecture is proposed whereby a Clos network is populated with broadcast-and-select stages. This enables the efficient exploitation of an emerging class of photonic integrated switch fabric. A low distortion space switch technology based on recently

  2. On the scalability of LISP and advanced overlaid services

    OpenAIRE

    Coras, Florin

    2015-01-01

    In just four decades the Internet has gone from a lab experiment to a worldwide, business critical infrastructure that caters to the communication needs of almost a half of the Earth's population. With these figures on its side, arguing against the Internet's scalability would seem rather unwise. However, the Internet's organic growth is far from finished and, as billions of new devices are expected to be joined in the not so distant future, scalability, or lack thereof, is commonly believed ...

  3. Construction cost prediction model for conventional and sustainable college buildings in North America

    Directory of Open Access Journals (Sweden)

    Othman Subhi Alshamrani

    2017-03-01

    Full Text Available The literature lacks initial cost prediction models for college buildings, especially models comparing the costs of sustainable and conventional buildings. A multi-regression model was developed for conceptual initial cost estimation of conventional and sustainable college buildings in North America. RS Means was used to estimate the national average of construction costs for 2014, which was subsequently utilized to develop the model. The model predicts the initial cost per square foot for two structure types, steel and concrete. The other predictor variables were building area, number of floors and floor height. The model was developed in three major stages: preliminary diagnostics of data quality, model development and validation. The developed model was successfully tested and validated with real-time data.
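
    A minimal sketch of such a multi-regression cost model (with entirely made-up numbers rather than the RS Means data, and the structure type encoded as a dummy variable):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# made-up training data; columns: area (1000 sq ft), floors, floor height (ft),
# structure type (1 = steel, 0 = concrete)
X = np.array([[120, 4, 12.0, 1],
              [ 80, 3, 11.0, 0],
              [200, 6, 12.5, 1],
              [150, 5, 11.5, 0],
              [ 95, 3, 12.0, 1],
              [170, 6, 11.0, 0]], dtype=float)
y = np.array([210.0, 195.0, 232.0, 205.0, 215.0, 212.0])  # initial cost, $/sq ft

model = LinearRegression().fit(X, y)
print(model.predict(np.array([[130, 4, 12.0, 0]])))  # estimate for a new design
```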

  4. Scalable Algorithms for Adaptive Statistical Designs

    Directory of Open Access Journals (Sweden)

    Robert Oehmke

    2000-01-01

    Full Text Available We present a scalable, high-performance solution to multidimensional recurrences that arise in adaptive statistical designs. Adaptive designs are an important class of learning algorithms for a stochastic environment, and we focus on the problem of optimally assigning patients to treatments in clinical trials. While adaptive designs have significant ethical and cost advantages, they are rarely utilized because of the complexity of optimizing and analyzing them. Computational challenges include massive memory requirements, few calculations per memory access, and multiply-nested loops with dynamic indices. We analyze the effects of various parallelization options, and while standard approaches do not work well, with effort an efficient, highly scalable program can be developed. This allows us to solve problems thousands of times more complex than those solved previously, which helps make adaptive designs practical. Further, our work applies to many other problems involving neighbor recurrences, such as generalized string matching.
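
    The multidimensional recurrence at the heart of such adaptive designs can be written down compactly. The sketch below is a serial, memoized toy version with uniform priors (the paper's contribution is making far larger instances scalable and parallel); it computes the Bayes-optimal allocation value for a two-treatment trial of N patients:

```python
from functools import lru_cache

N = 30  # number of patients in the trial

@lru_cache(maxsize=None)
def value(s1, f1, s2, f2):
    """Maximum expected future successes given observed successes/failures.

    Uniform Beta(1, 1) priors; posterior mean of arm i is (s_i + 1)/(s_i + f_i + 2).
    """
    if s1 + f1 + s2 + f2 == N:
        return 0.0
    p1 = (s1 + 1) / (s1 + f1 + 2)
    p2 = (s2 + 1) / (s2 + f2 + 2)
    v1 = p1 * (1 + value(s1 + 1, f1, s2, f2)) + (1 - p1) * value(s1, f1 + 1, s2, f2)
    v2 = p2 * (1 + value(s1, f1, s2 + 1, f2)) + (1 - p2) * value(s1, f1, s2, f2 + 1)
    return max(v1, v2)  # allocate the next patient to the better arm

print(value(0, 0, 0, 0))  # optimal expected number of successes over the trial
```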

  5. Scalable fabrication of perovskite solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zhen; Klein, Talysa R.; Kim, Dong Hoe; Yang, Mengjin; Berry, Joseph J.; van Hest, Maikel F. A. M.; Zhu, Kai

    2018-03-27

    Perovskite materials use earth-abundant elements, have low formation energies for deposition and are compatible with roll-to-roll and other high-volume manufacturing techniques. These features make perovskite solar cells (PSCs) suitable for terawatt-scale energy production with low production costs and low capital expenditure. Demonstrations of performance comparable to that of other thin-film photovoltaics (PVs) and improvements in laboratory-scale cell stability have recently made scale up of this PV technology an intense area of research focus. Here, we review recent progress and challenges in scaling up PSCs and related efforts to enable the terawatt-scale manufacturing and deployment of this PV technology. We discuss common device and module architectures, scalable deposition methods and progress in the scalable deposition of perovskite and charge-transport layers. We also provide an overview of device and module stability, module-level characterization techniques and techno-economic analyses of perovskite PV modules.

  6. The nightly build and test system for LCG AA and LHCb software

    CERN Document Server

    Kruzelecki, K; Degaudenzi, H

    2010-01-01

    The core software stack, both from the LCG Application Area and LHCb, consists of more than 25 C++/Fortran/Python projects built for about 20 different configurations on Linux, Windows and MacOSX. To these projects, one can also add about 70 external software packages (Boost, Python, Qt, CLHEP, ...) which also have to be built for the same configurations. In order to reduce the duration of the development cycle and increase quality assurance, a framework has been developed for the daily (nightly, actually) build and test of the software. Performing the builds and the tests on several configurations and platforms increases the efficiency of the unit and integration tests. Main features: - flexible and fine-grained setup (full or partial builds) through a web interface; - the possibility to build several “slots” with different configurations; - precise and highly granular reports on a web server; - support for CMT projects (but not only) with their cross-dependencies; - scalable client-server architecture for ...

  7. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and a one-year retrofitting period, which cannot be applied to large BEER projects with multiple buildings and multi-year retrofits. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits the essential requirements of real-world projects. The large-scale BEER problem is newly studied with a control approach rather than the optimization approach commonly used before. Optimal control is proposed to design the optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy changes dynamically along the dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and the results indicate that promising energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.
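
    To illustrate the shape of a TBT decision (not the paper's optimal control solution, which we do not reproduce here), a toy sketch that schedules candidate retrofit measures across buildings, technologies and years under an annual budget, greedily ranked by NPV per unit cost:

```python
# candidate measures: (building, technology, capital cost, annual energy-cost saving)
measures = [
    ("B1", "lighting", 40_000, 9_000),
    ("B1", "HVAC", 120_000, 22_000),
    ("B2", "lighting", 35_000, 8_500),
    ("B2", "envelope", 90_000, 14_000),
    ("B3", "HVAC", 110_000, 21_000),
]
budget = [100_000, 100_000, 100_000]  # annual budgets of a three-year programme

def npv(saving, years=10, r=0.05):
    """Net present value of a constant annual saving stream."""
    return sum(saving / (1 + r) ** t for t in range(1, years + 1))

# greedy heuristic: best NPV-per-dollar first, into the earliest affordable year
plan = {y: [] for y in range(len(budget))}
for m in sorted(measures, key=lambda m: npv(m[3]) / m[2], reverse=True):
    for y, remaining in enumerate(budget):
        if m[2] <= remaining:
            plan[y].append(m)
            budget[y] -= m[2]
            break

print(plan)
```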

  8. Overview of the Scalable Coherent Interface, IEEE STD 1596 (SCI)

    International Nuclear Information System (INIS)

    Gustavson, D.B.; James, D.V.; Wiggers, H.A.

    1992-10-01

    The Scalable Coherent Interface standard defines a new generation of interconnection that spans the full range from supercomputer memory 'bus' to campus-wide network. SCI provides bus-like services and a shared-memory software model while using an underlying, packet protocol on many independent communication links. Initially these links are 1 GByte/s (wires) and 1 GBit/s (fiber), but the protocol scales well to future faster or lower-cost technologies. The interconnect may use switches, meshes, and rings. The SCI distributed-shared-memory model is simple and versatile, enabling for the first time a smooth integration of highly parallel multiprocessors, workstations, personal computers, I/O, networking and data acquisition

  9. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid for both 'base case' model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  10. A procedure for Building Product Models

    DEFF Research Database (Denmark)

    Hvam, Lars

    1999-01-01

    The application of product modeling in manufacturing companies raises the important question of how to model product knowledge in a comprehensible and efficient way. An important challenge is to qualify engineers to model and specify IT-systems (product models) to support their specification activities. A basic assumption is that engineers have to take the responsibility for building product models to be used in their domain. To do that they must be able to carry out the modeling task on their own without any need for support from computer science experts. This paper presents a set of simple, easily adaptable concepts and methods from data modeling (object oriented analysis) and domain modeling (product modeling). The concepts are general and can be used for modeling all types of specifications in the different phases of the product life cycle. The modeling techniques presented have been...

  11. Advanced, Integrated Control for Building Operations to Achieve 40% Energy Saving

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Yan; Song, Zhen; Loftness, Vivian; Ji, Kun; Zheng, Sam; Lasternas, Bertrand; Marion, Flore; Yuebin, Yu

    2012-10-15

    We developed and demonstrated a software-based integrated advanced building control platform called the Smart Energy Box (SEB), which can coordinate building subsystem controls, integrate a variety of energy optimization algorithms and provide proactive and collaborative energy management and control for building operations using weather and occupancy information. The integrated control system is a low-cost solution and also features: a scalable, component-based architecture that allows building a solution for different building control system configurations from the needed components; an open architecture with a central data repository for data exchange among runtime components; extensibility to accommodate a variety of communication protocols; optimal building control for central loads, distributed loads and onsite energy resources; and a web server as a loosely coupled way to engage both building operators and building occupants in collaboration for energy conservation. Based on the open platform of SEB, we investigated and evaluated a variety of operation and energy-saving control strategies on the Carnegie Mellon University Intelligent Workplace, which is equipped with alternative cooling/heating/ventilation/lighting methods, including radiant mullions, radiant cooling/heating ceiling panels, cool waves, a dedicated ventilation unit, motorized windows and blinds, and external louvers. Based on the validation results of these control strategies, they were integrated in SEB in a collaborative and dynamic way. This advanced control system was programmed and computer-tested with a model of the Intelligent Workplace's northern section (IWn). The advanced control program was then installed in the IWn control system; the performance was measured and compared with that of the state-of-the-art control system to verify overall energy savings greater than 40%. In addition, advanced human machine interfaces (HMIs) were developed to communicate both with building

  12. SEMI-AUTOMATIC BUILDING MODELS AND FAÇADE TEXTURE MAPPING FROM MOBILE PHONE IMAGES

    Directory of Open Access Journals (Sweden)

    J. Jeong

    2016-06-01

    Full Text Available Research on 3D urban modelling has been actively carried out for a long time. Recently the need for 3D urban modelling research has increased rapidly due to improved geo-web services and popularized smart devices. Nowadays 3D urban models provided by, for example, Google Earth use aerial photos for 3D urban modelling, but there are some limitations: immediate updates for changed building models are difficult, many buildings are without 3D models and textures, and large resources for maintaining and updating are inevitable. To resolve the limitations mentioned above, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images and analyze the result of the modelling against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated by this method were compared with actual measurements of real buildings by comparing the ratios of model edge lengths to measured lengths; the results showed an average length-ratio error of 5.8%. Through this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.
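
    The camera geometry estimation step can be sketched with standard tools. A minimal version (hypothetical image files and assumed intrinsics K, since the phone-calibration details are not given in the abstract) recovers the relative pose between two phone shots of a façade:

```python
import cv2
import numpy as np

# hypothetical input: two phone shots of the same façade
img1 = cv2.imread("facade_1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("facade_2.jpg", cv2.IMREAD_GRAYSCALE)

# feature detection and matching
orb = cv2.ORB_create(4000)
k1, d1 = orb.detectAndCompute(img1, None)
k2, d2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

# assumed intrinsics; in practice these come from EXIF data or calibration
K = np.array([[1500.0, 0.0, 960.0], [0.0, 1500.0, 540.0], [0.0, 0.0, 1.0]])

E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)  # relative camera rotation/translation
```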

  13. FIRST PRISMATIC BUILDING MODEL RECONSTRUCTION FROM TOMOSAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    Y. Sun

    2016-06-01

    Full Text Available This paper demonstrates for the first time the potential of explicitly modelling the individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings via DSM generation and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007) and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height- and polygon-complexity-constrained merging is employed to refine (i.e., to reduce the retrieved number of roof segments). A coarse outline of each roof segment is then reconstructed and later refined using quadtree-based regularization plus a zig-zag line simplification scheme. Finally, a height is associated to each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.
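
    A minimal sketch of the segmentation front end described above (with a hypothetical DSM file, a Gaussian filter standing in for BM3D, and an arbitrary gradient threshold), using the same watershed idea:

```python
import numpy as np
from scipy import ndimage
from skimage.filters import sobel
from skimage.segmentation import watershed

dsm = np.load("dsm.npy")                            # hypothetical 2-D building DSM
smoothed = ndimage.gaussian_filter(dsm, sigma=1.0)  # stand-in for BM3D denoising
gradient = sobel(smoothed)                          # height-jump map

# markers: connected, locally flat roof patches (arbitrary gradient threshold)
markers, _ = ndimage.label(gradient < 0.1)
segments = watershed(gradient, markers)             # oversegmented roof regions
# ... followed by height/polygon-complexity constrained merging of segments
```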

  14. The ORC method. Effective modelling of thermal performance of multilayer building components

    Energy Technology Data Exchange (ETDEWEB)

    Akander, Jan

    2000-02-01

    The ORC Method (Optimised RC-networks) provides a means of modelling one- or multidimensional heat transfer in building components, in this context within building simulation environments. The methodology is shown, primarily applied to heat transfer in multilayer building components. For multilayer building components, the analytical thermal performance is known, given layer thicknesses and material properties. The aim of the ORC Method is to optimise the values of the thermal resistances and heat capacities of an RC-model so that the model performance agrees well with the analytical performance over a wide range of frequencies. The optimisation procedure is carried out in the frequency domain, where the overall deviation between the model and the analytical frequency response, in terms of admittance and dynamic transmittance, is minimised. It is shown that ORCs are effective in terms of accuracy and computational time in comparison to finite difference models when used in building simulations, in this case with IDA/ICE. An ORC configuration of five mass nodes has been found to model building components in Nordic countries well, within the application of thermal comfort and energy requirement simulations. Simple RC-networks, such as the surface heat capacity and the simple R-C configuration, are not appropriate for detailed building simulation. However, these can be used as a basis for defining the effective heat capacity of a building component. An approximate method is suggested for determining the effective heat capacity without the use of complex numbers. This entity can be calculated on the basis of layer thicknesses and material properties with the help of two time constants. The approximate method can give inaccuracies of around 20%. In-situ measurements have been carried out in an experimental building with the purpose of establishing the effective heat capacity of external building components that are subjected to normal thermal conditions. The auxiliary
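
    The core of the method, fitting RC-network parameters so that the network's frequency response matches an analytical target, can be sketched as a least-squares problem. In this toy version the "analytical" admittance is stand-in data generated from a reference parameter set rather than the exact multilayer solution used in the thesis:

```python
import numpy as np
from scipy.optimize import least_squares

omega = np.logspace(-6, -3, 40)  # rad/s, roughly weekly to sub-hourly cycles

def admittance(params, w):
    """Surface admittance of a simple T-network R1-C-R2, far side held at zero."""
    r1, r2, c = params
    zc = 1.0 / (1j * w * c)
    return 1.0 / (r1 + r2 * zc / (r2 + zc))

# stand-in 'analytical' target generated from a reference parameter set;
# in the ORC Method this target comes from the exact multilayer solution
target = admittance([0.06, 0.10, 2.0e5], omega)

def residual(p):
    diff = admittance(p, omega) - target
    return np.concatenate([diff.real, diff.imag])

fit = least_squares(residual, x0=[0.1, 0.1, 1.0e5], bounds=(1e-6, np.inf))
print(fit.x)  # optimised R1, R2 [m^2K/W] and C [J/(m^2K)]
```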

  15. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    Full Text Available In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  16. Non-commutative standard model: model building

    CERN Document Server

    Chaichian, Masud; Presnajder, P

    2003-01-01

    A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U_*(n)) gauge theory we cannot have non-commutative SU(n), and (2) the charges in non-commutative QED are quantized to just 0, +-1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U_*(3) x U_*(2) x U_*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  17. Mental models of a water management system in a green building.

    Science.gov (United States)

    Kalantzis, Anastasia; Thatcher, Andrew; Sheridan, Craig

    2016-11-01

    This intergroup case study compared users' mental models with an expert design model of a water management system in a green building. The system incorporates a constructed wetland component and a rainwater collection pond that together recycle water for re-use in the building and its surroundings. The sample consisted of five building occupants and the cleaner (6 users) and two experts who were involved with the design of the water management system. Users' mental model descriptions and the experts' design model were derived from in-depth interviews combined with self-constructed (and verified) diagrams. Findings from the study suggest that there is considerable variability in the user mental models that could impact the efficient functioning of the water management system. Recommendations for improvements are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. RESRAD-BUILD: A computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material

    International Nuclear Information System (INIS)

    Yu, C.; LePoire, D.J.; Jones, L.G.

    1994-11-01

    The RESRAD-BUILD computer code is a pathway analysis model designed to evaluate the potential radiological dose incurred by an individual who works or lives in a building contaminated with radioactive material. The transport of radioactive material inside the building from one compartment to another is calculated with an indoor air quality model. The air quality model considers the transport of radioactive dust particulates and radon progeny due to air exchange, deposition and resuspension, and radioactive decay and ingrowth. A single run of the RESRAD-BUILD code can model a building with up to three compartments, 10 distinct source geometries, and 10 receptor locations. A shielding material can be specified between each source-receptor pair for external gamma dose calculations. Six exposure pathways are considered in the RESRAD-BUILD code: (1) external exposure directly from the source; (2) external exposure to materials deposited on the floor; (3) external exposure due to air submersion; (4) inhalation of airborne radioactive particulates; (5) inhalation of aerosol indoor radon progeny; and (6) inadvertent ingestion of radioactive material, either directly from the sources or from materials deposited on the surfaces of the building compartments
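
    The indoor air quality model described above is, at its core, a set of coupled first-order balance equations. A minimal sketch for two compartments (with illustrative rate constants of our own choosing, not RESRAD-BUILD's defaults):

```python
from scipy.integrate import solve_ivp

# illustrative rate constants (per hour); not RESRAD-BUILD's actual defaults
lam_out = 0.5    # air exchange with outdoor air
lam_dep = 0.4    # deposition of particulates onto surfaces
lam_dec = 1e-4   # radioactive decay
lam_x = 0.8      # air exchange between the two compartments
source = 5.0     # release rate into compartment 1 (activity per hour)

def rhs(t, c):
    c1, c2 = c
    loss = lam_out + lam_dep + lam_dec + lam_x
    return [source - loss * c1 + lam_x * c2,
            -loss * c2 + lam_x * c1]

sol = solve_ivp(rhs, (0.0, 48.0), [0.0, 0.0])
print(sol.y[:, -1])  # airborne concentrations after 48 hours
```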

  19. Modeling arson - An exercise in qualitative model building

    Science.gov (United States)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (from their Theory of Games and Economic Behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider the class of agents to which the model applies, and accordingly the more confidence can be placed in the derived results. A methodological discussion on modeling human behavior is included.

  20. Scalable nanofabrication of U-shaped nanowire resonators with tunable optical magnetism.

    Science.gov (United States)

    Zhou, Fan; Wang, Chen; Dong, Biqin; Chen, Xiangfan; Zhang, Zhen; Sun, Cheng

    2016-03-21

    Split ring resonators have been studied extensively as a means of reconstituting magnetism, which naturally diminishes at high electromagnetic frequencies. However, the linear scaling of artificial magnetism breaks down at near-infrared frequencies, mainly due to the increasing contribution of self-inductance as the dimensions of the resonators are reduced. Although alternative designs have enabled artificial magnetism at optical frequencies, their sophisticated configurations and fabrication procedures do not lend themselves to easy implementation. Here, we report scalable nanofabrication of U-shaped nanowire resonators (UNWRs) using the high-throughput nanotransfer printing method. By providing ample area for conducting oscillating electric current, UNWRs overcome the saturation of the geometric scaling of artificial magnetism. We experimentally demonstrated coarse and fine tuning of LC resonances over a wide wavelength range, from 748 nm to 1600 nm. The added flexibility in transferring to other substrates makes the UNWR a versatile building block for creating functional metamaterials in three dimensions.

  1. Modeling Manpower and Equipment Productivity in Tall Building Construction Projects

    Science.gov (United States)

    Mudumbai Krishnaswamy, Parthasarathy; Rajiah, Murugasan; Vasan, Ramya

    2017-12-01

    Tall building construction projects involve two critical resources: manpower and equipment. Their usage, however, varies widely due to several factors affecting their productivity. Currently, no systematic study for estimating and increasing their productivity is available; what is prevalent is the use of empirical data, experience from similar projects and assumptions. As tall building projects are set to increase to meet the emerging demands of ever-shrinking urban spaces, it is imperative to develop scientific productivity models for the basic construction activities (concrete, reinforcement, formwork, block work and plastering) as functions of the specific resources used in a mixed environment of manpower and equipment. Data pertaining to 72 tall building projects in India were collected and analyzed. Suitable productivity estimation models were then developed using multiple linear regression analysis and validated using independent field data. It is hoped that the models developed in the study will be useful for quantity surveyors, cost engineers and project managers to estimate the productivity of resources in tall building projects.
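    The abstract names multiple linear regression as the estimation technique. As a rough sketch of that step only (not the authors' model: the predictors, coefficients and data below are invented for illustration):

```python
# Sketch of a multiple-linear-regression productivity model with hypothetical
# predictors; the study's actual variables and field data differ.
import numpy as np

rng = np.random.default_rng(0)
n = 72  # the study analyzed 72 projects; the data here are synthetic

# Hypothetical predictors: crew size, working height (m), equipment hours/day
X = np.column_stack([
    rng.uniform(5, 50, n),      # crew size
    rng.uniform(3, 300, n),     # working height
    rng.uniform(2, 16, n),      # equipment usage
])
y = 10 + 0.3 * X[:, 0] - 0.02 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 1, n)

# Ordinary least squares fit (intercept column + np.linalg.lstsq)
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 3))
```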

  2. Reconstructing building mass models from UAV images

    KAUST Repository

    Li, Minglei

    2015-07-26

    We present an automatic reconstruction pipeline for large scale urban scenes from aerial images captured by a camera mounted on an unmanned aerial vehicle. Using state-of-the-art Structure from Motion and Multi-View Stereo algorithms, we first generate a dense point cloud from the aerial images. Based on the statistical analysis of the footprint grid of the buildings, the point cloud is classified into different categories (i.e., buildings, ground, trees, and others). Roof structures are extracted for each individual building using Markov random field optimization. Then, a contour refinement algorithm based on pivot point detection is utilized to refine the contour of patches. Finally, polygonal mesh models are extracted from the refined contours. Experiments on various scenes as well as comparisons with state-of-the-art reconstruction methods demonstrate the effectiveness and robustness of the proposed method.

  3. Links Related to the Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  4. Weather Correlations to Calculate Infiltration Rates for U. S. Commercial Building Energy Models.

    Science.gov (United States)

    Ng, Lisa C; Quiles, Nelson Ojeda; Dols, W Stuart; Emmerich, Steven J

    2018-01-01

    As building envelope performance improves, a greater percentage of building energy loss will occur through envelope leakage. Although the energy impacts of infiltration on building energy use can be significant, current energy simulation software has limited ability to accurately account for envelope infiltration and the impacts of improved airtightness. This paper extends previous work by the National Institute of Standards and Technology that developed a set of EnergyPlus inputs for modeling infiltration in several commercial reference buildings using Chicago weather. The current work includes cities in seven additional climate zones and uses the updated versions of the prototype commercial building types developed by the Pacific Northwest National Laboratory for the U. S. Department of Energy. Comparisons were made between the predicted infiltration rates using three representations of the commercial building types: PNNL EnergyPlus models, CONTAM models, and EnergyPlus models using the infiltration inputs developed in this paper. The newly developed infiltration inputs in EnergyPlus yielded average annual increases of 3 % and 8 % in the HVAC electrical and gas use, respectively, over the original infiltration inputs in the PNNL EnergyPlus models. When analyzing the benefits of building envelope airtightening, greater HVAC energy savings were predicted using the newly developed infiltration inputs in EnergyPlus compared with using the original infiltration inputs. These results indicate that the effects of infiltration on HVAC energy use can be significant and that infiltration can and should be better accounted for in whole-building energy models.
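    For orientation, EnergyPlus modulates a design infiltration flow with temperature-difference and wind-speed terms (the ZoneInfiltration:DesignFlowRate form). The sketch below evaluates that general form in Python with placeholder coefficients; the NIST-derived inputs discussed above are in the paper, not reproduced here.

```python
# Sketch of the EnergyPlus-style infiltration relation:
#   I = I_design * (A + B*|T_zone - T_out| + C*v_wind + D*v_wind**2)
# Coefficient values below are placeholders, not the NIST-derived inputs.
def infiltration_m3s(i_design, t_zone, t_out, v_wind,
                     A=0.0, B=0.02, C=0.05, D=0.0):
    return i_design * (A + B * abs(t_zone - t_out) + C * v_wind + D * v_wind**2)

# Example: 0.5 m3/s design flow, 21 C zone, 0 C outdoors, 4 m/s wind
print(f"{infiltration_m3s(0.5, 21.0, 0.0, 4.0):.3f} m3/s")
```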

  5. Fast & scalable pattern transfer via block copolymer nanolithography

    DEFF Research Database (Denmark)

    Li, Tao; Wang, Zhongli; Schulte, Lars

    2015-01-01

    A fully scalable and efficient pattern transfer process based on block copolymer (BCP) self-assembly directly on various substrates is demonstrated. PS-rich and PDMS-rich poly(styrene-b-dimethylsiloxane) (PS-b-PDMS) copolymers are used to give monolayer sphere morphology after spin-casting ... on long range lateral order, including fabrication of substrates for catalysis, solar cells, sensors, ultrafiltration membranes and templating of semiconductors or metals.

  6. A scalable new mechanism to store and serve the ATLAS detector description through a REST web API

    CERN Document Server

    Bianchi, Riccardo-Maria; The ATLAS collaboration

    2017-01-01

    Until now, geometry information for the detector description of HEP experiments was only stored in online relational databases integrated in the experiments’ frameworks or described in files with text-based markup languages. In all cases, to build and store the detector description, a full software stack was needed. In this paper we present a new and scalable mechanism to store the geometry data and to serve the detector description data through a REST web-based API. This new approach decouples the geometry information from the experiment’s framework. Moreover, it provides new functionalities to users, who can now search for specific volumes and get partial detector description, or filter geometry data based on custom criteria. We present two approaches to build a REST API to serve geometry data, based on two different technologies used in other fields and communities: the search engine ElasticSearch and the graph database Neo4j. We describe their characteristics and we compare them using real-world usage...
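    The paper's implementations are built on ElasticSearch and Neo4j; purely to illustrate the idea of serving partial detector description over REST, here is a minimal Flask sketch with an in-memory volume store. The framework choice, routes and fields are all invented, not the ATLAS system's API.

```python
# Minimal sketch of a REST endpoint serving detector-geometry records.
# The ATLAS system above uses ElasticSearch/Neo4j backends; here a plain
# dict stands in, and all routes and fields are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

VOLUMES = {  # hypothetical volume records
    "pixel_barrel": {"material": "silicon", "subsystem": "ID", "r_mm": 50},
    "ecal_barrel": {"material": "LAr", "subsystem": "Calo", "r_mm": 1500},
}

@app.route("/volumes/<name>")
def get_volume(name):
    # fetch a single named volume (partial detector description)
    vol = VOLUMES.get(name)
    return (jsonify(vol), 200) if vol else (jsonify(error="not found"), 404)

@app.route("/volumes")
def search_volumes():
    # filter geometry data on custom criteria, e.g. /volumes?subsystem=ID
    sub = request.args.get("subsystem")
    hits = {k: v for k, v in VOLUMES.items()
            if sub is None or v["subsystem"] == sub}
    return jsonify(hits)

if __name__ == "__main__":
    app.run(port=5000)
```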

  7. A scalable new mechanism to store and serve the ATLAS detector description through a REST web API

    CERN Document Server

    Bianchi, Riccardo-Maria; The ATLAS collaboration

    2017-01-01

    Until now, geometry information for the detector description of HEP experiments was only stored in online relational databases integrated into the experiments’ frameworks or described in files with text-based markup languages. In all cases, to build and store the detector description, a full software stack was needed. In this paper, we present a new and scalable mechanism to store the geometry data and to serve the detector description data through a web interface and a REST API. This new approach decouples the geometry information from the experiment’s framework. Moreover, it provides new functionalities to users, who can now search for specific volumes and get partial detector description, or filter geometry data based on custom criteria. We present two approaches to build a REST API to serve geometry data, based on two different technologies used in other fields and communities: the search engine ElasticSearch and the graph database Neo4j. We describe their characteristics and we compare them using rea...

  8. Scalable Motion Estimation Processor Core for Multimedia System-on-Chip Applications

    Science.gov (United States)

    Lai, Yeong-Kang; Hsieh, Tian-En; Chen, Lien-Fei

    2007-04-01

    In this paper, we describe a high-throughput and scalable motion estimation processor architecture for multimedia system-on-chip applications. The number of processing elements (PEs) is scalable according to the variable algorithm parameters and the performance required for different applications. Efficient use of the PE rings and an intelligent memory-interleaving organization increase the efficiency of the architecture. Moreover, efficient on-chip memories and a data management technique effectively decrease the power consumption and memory bandwidth. Techniques for reducing the number of interconnections and external memory accesses are also presented. Our results demonstrate that the proposed scalable PE-ringed architecture is a flexible and high-performance processor core for multimedia system-on-chip applications.
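    Motion estimation of this kind is typically built around block matching. The NumPy sketch below shows a straightforward full search with a sum-of-absolute-differences (SAD) cost, purely to illustrate the computation that a PE array parallelizes; it is not the paper's architecture.

```python
# Full-search block matching with a SAD cost: the core computation that a
# motion-estimation PE array parallelizes. Pure-NumPy illustration only.
import numpy as np

def best_motion_vector(ref, cur, y, x, block=8, search=4):
    """Find the (dy, dx) minimizing SAD for the block at (y, x) in cur."""
    target = cur[y:y + block, x:x + block].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy and 0 <= xx and yy + block <= ref.shape[0] \
                    and xx + block <= ref.shape[1]:
                cand = ref[yy:yy + block, xx:xx + block].astype(np.int32)
                sad = np.abs(target - cand).sum()
                if sad < best_sad:
                    best_sad, best = sad, (dy, dx)
    return best, best_sad

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, -1), axis=(0, 1))  # shift the whole frame
print(best_motion_vector(ref, cur, 16, 16))     # expected vector: (-2, 1)
```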

  9. A Scalable Framework and Prototype for CAS e-Science

    Directory of Open Access Journals (Sweden)

    Yuanchun Zhou

    2007-07-01

    Based on the Small-World model of CAS e-Science and the power law of the Internet, this paper presents a scalable CAS e-Science Grid framework based on virtual regions, called the Virtual Region Grid Framework (VRGF). VRGF takes the virtual region and the layer as its logical management units. In VRGF, the intra-virtual-region mode is pure P2P, while the inter-virtual-region mode is centralized. VRGF is therefore a decentralized framework with some P2P properties. Furthermore, VRGF is able to achieve satisfactory performance in resource organization and location at a small cost, and is well adapted to the complicated and dynamic features of scientific collaborations. We have implemented a demonstration VRGF-based Grid prototype, SDG.

  10. PIConGPU - How to build one of the fastest GPU particle-in-cell codes in the world

    Energy Technology Data Exchange (ETDEWEB)

    Burau, Heiko; Debus, Alexander; Helm, Anton; Huebl, Axel; Kluge, Thomas; Widera, Rene; Bussmann, Michael; Schramm, Ulrich; Cowan, Thomas [HZDR, Dresden (Germany); Juckeland, Guido; Nagel, Wolfgang [TU Dresden (Germany); ZIH, Dresden (Germany); Schmitt, Felix [NVIDIA (United States)

    2013-07-01

    We present the algorithmic building blocks of PIConGPU, one of the fastest implementations of the particle-in-cell algorithm on GPU clusters. PIConGPU is a highly scalable, 3D3V electromagnetic PIC code that is used in laser plasma and astrophysical plasma simulations.
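    For readers unfamiliar with the method, the innermost building blocks of any PIC code are charge deposition onto a grid, a field solve, and a particle push. A deliberately minimal 1D electrostatic NumPy sketch of those steps follows; it is only an illustration, far removed from PIConGPU's 3D3V electromagnetic GPU implementation, and the field solve here is intentionally crude.

```python
# Minimal 1D electrostatic PIC building blocks: deposit charge to a grid,
# solve crudely for the field, push particles. Illustration only; PIConGPU
# implements GPU-optimized 3D3V electromagnetic versions of these steps.
import numpy as np

nx, n_part, dx, dt, L = 64, 1000, 1.0, 0.1, 64.0
rng = np.random.default_rng(0)
x = rng.uniform(0, L, n_part)          # particle positions
v = rng.normal(0.0, 1.0, n_part)       # particle velocities

for _ in range(10):
    # 1) deposit particle charge to the grid (nearest-grid-point weighting)
    cells = (x / dx).astype(int) % nx
    rho = np.bincount(cells, minlength=nx).astype(float) - n_part / nx

    # 2) crude field solve: cumulative integral of the charge density
    E = np.cumsum(rho) * dx
    E -= E.mean()

    # 3) gather the field at the particles and push them (leapfrog)
    v -= E[cells] * dt       # unit charge and mass, sign as for electrons
    x = (x + v * dt) % L     # periodic boundaries
print("mean kinetic energy:", 0.5 * np.mean(v**2))
```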

  11. Review of Development Survey of Phase Change Material Models in Building Applications

    Directory of Open Access Journals (Sweden)

    Hussein J. Akeiber

    2014-01-01

    The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided, together with the classification and applications of PCMs. Secondly, PCM models in buildings are reviewed and discussed with respect to walls, roofs, floors, and cooling systems. Finally, conclusions are presented based on the collected data.

  12. Activity measurement and effective dose modelling of natural radionuclides in building material.

    Science.gov (United States)

    Maringer, F J; Baumgartner, A; Rechberger, F; Seidel, C; Stietka, M

    2013-11-01

    In this paper, the assessment of the activity concentration of natural radionuclides in building materials, the related calibration requirements and the corresponding indoor exposure dose models are presented. Particular attention is paid to specific improvements in low-level gamma-ray spectrometry for determining the activity concentration of the relevant natural radionuclides in building materials with adequate measurement uncertainties. Different approaches to modelling the effective dose indoors due to external radiation resulting from natural radionuclides in building material are discussed, and results of actual building material assessments are shown.
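    One widely used screening quantity in this context is the EU activity concentration index for building materials, I = C_Ra226/300 + C_Th232/200 + C_K40/3000, with activity concentrations in Bq/kg. The sketch below transcribes that index directly, as an example of how measured concentrations feed a simple dose-screening model; the paper's own dose models are more detailed than this single index.

```python
# EU-style activity concentration index for building materials,
#   I = C_Ra / 300 + C_Th / 200 + C_K / 3000   (all in Bq/kg),
# used as a screening tool for indoor external gamma dose. The paper's
# effective-dose models are more detailed than this single index.
def activity_index(c_ra226, c_th232, c_k40):
    return c_ra226 / 300.0 + c_th232 / 200.0 + c_k40 / 3000.0

# Hypothetical concrete sample
i = activity_index(c_ra226=40.0, c_th232=30.0, c_k40=400.0)
print(f"Activity concentration index: {i:.2f}")  # index <= 1 passes screening
```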

  13. Ten questions concerning future buildings beyond zero energy and carbon neutrality

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Na; Phelan, Patrick E.; Gonzalez, Jorge; Harris, Chioke; Henze, Gregor P.; Hutchinson, Robert; Langevin, Jared; Lazarus, Mary Ann; Nelson, Brent; Pyke, Chris; Roth, Kurt; Rouse, David; Sawyer, Karma; Selkowitz, Stephen

    2017-07-01

    Architects, planners, and building scientists have been at the forefront of envisioning a future built environment for centuries. However, fragmented views that emphasize one facet of the built environment, such as energy, environment, or groundbreaking technologies, often do not achieve expected outcomes. Buildings are responsible for approximately one-third of worldwide carbon emissions and account for over 40% of primary energy consumption in the U.S. In addition to achieving the ambitious goal of reducing building greenhouse gas emissions by 75% by 2050, buildings must improve their functionality and performance to meet current and future human, societal, and environmental needs in a changing world. In this article, we introduce a new framework to guide potential evolution of the building stock in the next century, based on greenhouse gas emissions as the common thread to investigate the potential implications of new design paradigms, innovative operational strategies, and disruptive technologies. This framework emphasizes integration of multidisciplinary knowledge, scalability for mainstream buildings, and proactive approaches considering constraints and unknowns. The framework integrates the interrelated aspects of the built environment through a series of quantitative metrics that aim to improve environmental outcomes while optimizing building performance to achieve healthy, adaptive, and productive buildings.

  14. Fine modeling of energy exchanges between buildings and urban atmosphere

    International Nuclear Information System (INIS)

    Daviau-Pellegrin, Noelie

    2016-01-01

    This thesis work concerns the effect of buildings on the urban atmosphere, and more precisely the energy exchanges that take place between these two systems. In order to model more finely the thermal effects of buildings on atmospheric flows in simulations run with the CFD software Code-Saturne, we couple this tool with the building model BuildSysPro. This library runs under Dymola and can generate matrices describing a building's thermal properties that can be used outside this software. To carry out the coupling, we use these matrices in a code that allows the building thermal calculations and the CFD to exchange their results. After a review of the physical phenomena and the existing models, we explain the interactions between the atmosphere and urban elements, especially buildings. The latter affect air flows dynamically, as they act as obstacles, and thermally, through their surface temperatures. First, we analyse the data obtained from the measurement campaign EM2PAU, which we use to validate the coupled model. EM2PAU was carried out in Nantes in 2011 and represents a canyon street with two rows of four containers. Its distinctive feature lies in the simultaneous measurement of the air and wall temperatures as well as the wind speeds, with anemometers located on a 10 m-high mast for the reference wind and at six locations in the canyon. The aim is to study the thermal influence of buildings on air flows. The numerical simulations of the air flows in EM2PAU are then carried out with different methods for calculating or imposing the surface temperature of each of the container walls. The first method consists in imposing their temperatures from the measurements: for each wall, we set the temperature to the surface temperature that was measured during the EM2PAU campaign. The second method involves imposing the outdoor air temperature that was measured at a given time to all the

  15. Conceptual Architecture of Building Energy Management Open Source Software (BEMOSS)

    Energy Technology Data Exchange (ETDEWEB)

    Khamphanchai, Warodom; Saha, Avijit; Rathinavel, Kruthika; Kuzlu, Murat; Pipattanasomporn, Manisa; Rahman, Saifur; Akyol, Bora A.; Haack, Jereme N.

    2014-12-01

    The objective of this paper is to present a conceptual architecture of a Building Energy Management Open Source Software (BEMOSS) platform. The proposed BEMOSS platform is expected to improve sensing and control of equipment in small- and medium-sized buildings, reduce energy consumption and help implement demand response (DR). It aims to offer: scalability, robustness, plug and play, open protocol, interoperability, cost-effectiveness, as well as local and remote monitoring. In this paper, four essential layers of BEMOSS software architecture -- namely User Interface, Application and Data Management, Operating System and Framework, and Connectivity layers -- are presented. A laboratory test bed to demonstrate the functionality of BEMOSS located at the Advanced Research Institute of Virginia Tech is also briefly described.

  16. Integrated Urban System and Energy Consumption Model: Residential Buildings

    Directory of Open Access Journals (Sweden)

    Rocco Papa

    2014-05-01

    This paper describes a segment of research conducted within the project PON 04a2_E Smart Energy Master for the energy governance of the territory, conducted by the Department of Civil, Architectural and Environmental Engineering, University of Naples "Federico II". In particular, this article is part of the study carried out to define the comprehension/interpretation model that correlates buildings, the city's activities and users' behaviour in order to promote energy savings. In detail, this segment of the research defines the residential variables to be used in the model. For this purpose, a knowledge framework at the international level has been established for estimating the energy requirements of residential buildings, together with the identification of a set of parameters whose variation has a significant influence on the energy consumption of residential buildings.

  17. Protein Simulation Data in the Relational Model.

    Science.gov (United States)

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large and multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server.
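    The paper describes a dimensional (star-schema) design targeting SQL Server. As a toy illustration of the idea only, the sketch below creates a small dimension/fact pair with Python's built-in sqlite3; the table and column names are invented and far simpler than the warehouse described above.

```python
# Toy star schema for simulation data: one dimension table (simulation)
# and one fact table (per-frame measurements). The paper targets SQL Server
# with a much richer dimensional design; all names here are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_simulation (
    sim_id   INTEGER PRIMARY KEY,
    protein  TEXT,
    temp_k   REAL
);
CREATE TABLE fact_frame (
    sim_id    INTEGER REFERENCES dim_simulation(sim_id),
    frame_no  INTEGER,
    rmsd_nm   REAL,
    energy_kj REAL
);
""")
con.execute("INSERT INTO dim_simulation VALUES (1, '1UBQ', 298.0)")
con.executemany("INSERT INTO fact_frame VALUES (1, ?, ?, ?)",
                [(i, 0.1 * i, -5000.0 + i) for i in range(5)])

# Typical warehouse-style aggregate: per-simulation mean RMSD
for row in con.execute("""
    SELECT s.protein, AVG(f.rmsd_nm)
    FROM fact_frame f JOIN dim_simulation s USING (sim_id)
    GROUP BY s.protein"""):
    print(row)
```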

  18. ParaText : scalable solutions for processing and searching very large document collections : final LDRD report.

    Energy Technology Data Exchange (ETDEWEB)

    Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.; Shead, Timothy M.

    2010-09-01

    This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved the performance of heterogeneous ensemble models in data classification problems, and showed the advantages of information theoretic methods for user analysis and interpretation in cross-language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.

  19. LHCb: The nightly build and test system for LCG AA and LHCb software

    CERN Multimedia

    Kruzelecki, K; Degaudenzi, H

    2009-01-01

    The core software stack of both the LCG Application Area and LHCb consists of more than 25 C++/Fortran/Python projects built for about 20 different configurations on Linux, Windows and MacOSX. To these projects one can add about 20 external software packages (Boost, Python, Qt, CLHEP, ...) which also have to be built for the same configurations. In order to reduce the development cycle time and improve quality assurance, a framework has been developed for the daily (nightly, actually) build and test of the software. Performing the builds and the tests on several configurations and platforms increases the efficiency of the unit and integration tests. Main features: - flexible and fine grained setup (full or partial builds) through a web interface - possibility to build several "slots" with different configurations - precise and highly granular reports on a web server - support for CMT projects (but not only) with their cross-dependencies - scalable client-server architecture for the co...

  20. Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.

    Science.gov (United States)

    Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele

    2015-01-01

    Technological advances in Multielectrode Arrays (MEAs) used for multisite, parallel electrophysiological recordings lead to an ever-increasing amount of raw data being generated. Arrays with hundreds up to a few thousand electrodes are slowly seeing widespread use, and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings, there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up performance-critical pre-processing steps such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
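    Of the pre-processing steps mentioned, spike detection is canonically formulated as threshold crossing on a (filtered) signal, applied independently per channel, which is what makes it embarrassingly parallel. A per-channel NumPy sketch follows; it is an illustration of the step, not the tool's actual implementation.

```python
# Threshold-based spike detection on one channel: the kind of per-channel,
# embarrassingly parallel step such tools accelerate. Illustration only.
import numpy as np

def detect_spikes(signal, k=5.0):
    """Return sample indices of negative threshold crossings.

    Threshold = -k * robust noise estimate (median absolute deviation)."""
    noise = np.median(np.abs(signal)) / 0.6745  # MAD-based sigma estimate
    below = signal < -k * noise
    # keep only the first sample of each crossing
    return np.flatnonzero(below & ~np.roll(below, 1))

fs = 20000                          # Hz
rng = np.random.default_rng(2)
trace = rng.normal(0, 1, fs)        # 1 s of synthetic noise
trace[[5000, 12000]] -= 15          # two injected "spikes"
print(detect_spikes(trace))         # approximately [5000, 12000]
```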

  1. A Study on Development of a Cost Optimal and Energy Saving Building Model: Focused on Industrial Building

    Directory of Open Access Journals (Sweden)

    Hye Yeon Kim

    2016-03-01

    This study suggests an optimization method for the life cycle cost (LCC) in an economic feasibility analysis when applying energy saving techniques in the early design stage of a building. Literature and previous studies were reviewed to select appropriate optimization and LCC analysis techniques. The energy simulation program (EnergyPlus) and a computational program (MATLAB) were linked to provide an automated optimization process. The results suggest that this process can produce a cost optimization model with which it is possible to minimize the LCC. To aid in understanding the model, a case study of an industrial building was performed to illustrate the operation of the cost optimization model, including energy savings. An energy optimization model is also presented to illustrate the need for the cost optimization model.

  2. Building damage assessment from PolSAR data using texture parameters of statistical model

    Science.gov (United States)

    Li, Linlin; Liu, Xiuguo; Chen, Qihao; Yang, Shuai

    2018-04-01

    Accurate building damage assessment is essential in providing decision support for disaster relief and reconstruction. Polarimetric synthetic aperture radar (PolSAR) has become one of the most effective means of building damage assessment, due to its all-day/all-weather capability and the richer backscatter information it provides on targets. However, intact buildings that are not parallel to the SAR flight pass (termed oriented buildings) and collapsed buildings share similar scattering mechanisms, both of which are dominated by volume scattering. This characteristic often leads to misjudgments between collapsed buildings and oriented buildings in assessments from PolSAR data. Because collapsed buildings and intact buildings (whether oriented or parallel) have different textures, a novel building damage assessment method is proposed in this study to address this problem by introducing texture parameters of statistical models. First, the logarithms of the estimated texture parameters of different statistical models are taken as a new texture feature to describe the collapse of the buildings. Second, the collapsed buildings and intact buildings are distinguished using an appropriate threshold. Then, the building blocks are classified into three levels based on the building block collapse rate. Moreover, this paper also discusses the capability for performing damage assessment using texture parameters from different statistical models or using different estimators. The RADARSAT-2 and ALOS-1 PolSAR images are used to present and analyze the performance of the proposed method. The results show that using the texture parameters avoids the problem of confusing collapsed and oriented buildings and improves the assessment accuracy. The assessments using K/G0-distribution texture parameters estimated from the second moment achieve the highest extraction accuracies. For the RADARSAT-2 and ALOS-1 data, the overall accuracy (OA) for these three types of

  3. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  4. Numerical model for stack gas diffusion in terrain with buildings. Variations in air flow and gas concentration with additional building near stack

    International Nuclear Information System (INIS)

    Sada, Koichi; Michioka, Takenobu; Ichikawa, Yoichi; Komiyama, Sumito; Numata, Kunio

    2009-01-01

    A numerical simulation method for predicting atmospheric flow and stack gas diffusion using a calculation domain of several km around a stack under complex terrain conditions containing buildings has been developed. The turbulence closure technique using a modified k-ε-type model without a hydrostatic approximation was used for flow calculation, and some of the calculation grids near the ground were treated as buildings using a terrain-following coordinate system. Stack gas diffusion was predicted using the Lagrangian particle model, that is, the stack gas was represented by trajectories of released particles. The developed numerical model was applied to virtual terrain and building conditions in this study, prior to its application to real terrain and building conditions. The height of the additional building (H_a), located about 200 m leeward from the stack, was varied (i.e., H_a = 0, 20, 30 and 50 m), and its effects on airflow and the concentration of stack gas at a release height of 75 m were calculated. Furthermore, the effective stack height, which is used in the safety analysis of atmospheric diffusion for nuclear facilities in Japan, was evaluated from the calculated ground-level concentration of stack gas. The cavity region behind the additional building was calculated, and turbulence near the cavity was observed to decrease when the additional building was present. Following these flow variations, tracer gas tended to diffuse rapidly to the ground surface with the additional building at the leeward position of the cavity, and the ground-level stack gas concentration along the plume axis also increased with the height of the additional building. However, the variations in effective stack height with the height of the additional building were relatively small and remained within several metres in this study. (author)
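    In a Lagrangian particle treatment, the plume is represented by many released particles whose positions are advanced by the mean wind plus a random turbulent displacement each step. A minimal 2D random-walk sketch of that idea follows; it is not the paper's model, which couples the particles to CFD wind and turbulence fields over terrain with buildings, and all parameter values here are invented.

```python
# Minimal Lagrangian particle dispersion: advect particles with a mean wind
# and add a random turbulent displacement each step. Gaussian-plume-like
# spreading emerges; the paper instead uses CFD wind/turbulence fields.
import numpy as np

n, steps, dt = 5000, 200, 1.0
u_mean = 3.0            # m/s, mean wind along x (hypothetical)
sigma_turb = 0.5        # m/s, hypothetical turbulent velocity scale
rng = np.random.default_rng(3)

pos = np.zeros((n, 2))  # all particles released at the stack (origin)
for _ in range(steps):
    pos[:, 0] += u_mean * dt + rng.normal(0, sigma_turb, n) * dt
    pos[:, 1] += rng.normal(0, sigma_turb, n) * dt

# crude concentration proxy: particle count in a downwind strip
strip = np.abs(pos[:, 0] - 600) < 10
print("particles near x = 600 m:", strip.sum(),
      " lateral std (m):", pos[strip, 1].std().round(1))
```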

  5. Superlinearly scalable noise robustness of redundant coupled dynamical systems.

    Science.gov (United States)

    Kohar, Vivek; Kia, Behnam; Lindner, John F; Ditto, William L

    2016-03-01

    We illustrate through theory and numerical simulations that redundant coupled dynamical systems can be extremely robust against local noise in comparison to uncoupled dynamical systems evolving in the same noisy environment. Previous studies have shown that the noise robustness of redundant coupled dynamical systems is linearly scalable and deviations due to noise can be minimized by increasing the number of coupled units. Here, we demonstrate that the noise robustness can actually be scaled superlinearly if some conditions are met and very high noise robustness can be realized with very few coupled units. We discuss these conditions and show that this superlinear scalability depends on the nonlinearity of the individual dynamical units. The phenomenon is demonstrated in discrete as well as continuous dynamical systems. This superlinear scalability not only provides us an opportunity to exploit the nonlinearity of physical systems without being bogged down by noise but may also help us in understanding the functional role of coupled redundancy found in many biological systems. Moreover, engineers can exploit superlinear noise suppression by starting a coupled system near (not necessarily at) the appropriate initial condition.
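    The basic construction is easy to reproduce numerically: evolve N copies of the same nonlinear map, each with independent local noise, couple them (here by averaging their states), and compare against the noise-free trajectory. The sketch below follows that recipe with a logistic map in a contracting regime, where the error shrinks roughly as 1/sqrt(N); the superlinear regime described in the paper requires specific nonlinear conditions not reproduced here, and all parameter choices are illustrative.

```python
# N redundant logistic maps with independent noise, coupled by averaging.
# The RMS deviation from the noise-free trajectory shrinks as N grows
# (roughly 1/sqrt(N) in this contracting regime); parameters illustrative.
import numpy as np

def rms_deviation(N, steps=500, r=2.8, eta=1e-2, seed=4):
    rng = np.random.default_rng(seed)
    x = np.full(N, 0.4)
    x_clean = 0.4
    errs = []
    for _ in range(steps):
        x = r * x * (1 - x) + rng.normal(0, eta, N)   # noisy copies
        x = np.full(N, x.mean())                      # couple by averaging
        x_clean = r * x_clean * (1 - x_clean)         # noise-free reference
        errs.append(x[0] - x_clean)
    return float(np.sqrt(np.mean(np.square(errs[100:]))))  # skip transient

for N in (1, 4, 16, 64):
    print(N, rms_deviation(N))
```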

  6. From Point Clouds to Building Information Models: 3D Semi-Automatic Reconstruction of Indoors of Existing Buildings

    Directory of Open Access Journals (Sweden)

    Hélène Macher

    2017-10-01

    The creation of as-built Building Information Models requires the acquisition of the as-is state of existing buildings. Laser scanners are widely used to achieve this goal, since they permit the collection of information about object geometry in the form of point clouds, providing a large amount of accurate data very quickly and with a high level of detail. Unfortunately, the scan-to-BIM (Building Information Model) process currently remains largely a manual process, which is time-consuming and error-prone. In this paper, a semi-automatic approach is presented for the 3D reconstruction of the indoors of existing buildings from point clouds. Several segmentations are performed so that point clouds corresponding to grounds, ceilings and walls are extracted. Based on these point clouds, walls and slabs of buildings are reconstructed and described in the IFC format in order to be integrated into BIM software. The assessment of the approach is carried out on two datasets. The evaluation items are the degree of automation, the transferability of the approach and the geometric quality of the results of the 3D reconstruction. Additionally, quality indexes are introduced to inspect the results in order to be able to detect potential errors of reconstruction.

  7. A scalable quantum computer with ions in an array of microtraps

    Science.gov (United States)

    Cirac; Zoller

    2000-04-06

    Quantum computers require the storage of quantum information in a set of two-level systems (called qubits), the processing of this information using quantum gates and a means of final readout. So far, only a few systems have been identified as potentially viable quantum computer models--accurate quantum control of the coherent evolution is required in order to realize gate operations, while at the same time decoherence must be avoided. Examples include quantum optical systems (such as those utilizing trapped ions or neutral atoms, cavity quantum electrodynamics and nuclear magnetic resonance) and solid state systems (using nuclear spins, quantum dots and Josephson junctions). The most advanced candidates are the quantum optical and nuclear magnetic resonance systems, and we expect that they will allow quantum computing with about ten qubits within the next few years. This is still far from the numbers required for useful applications: for example, the factorization of a 200-digit number requires about 3,500 qubits, rising to 100,000 if error correction is implemented. Scalability of proposed quantum computer architectures to many qubits is thus of central importance. Here we propose a model for an ion trap quantum computer that combines scalability (a feature usually associated with solid state proposals) with the advantages of quantum optical systems (in particular, quantum control and long decoherence times).

  8. Continuity-Aware Scheduling Algorithm for Scalable Video Streaming

    Directory of Open Access Journals (Sweden)

    Atinat Palawan

    2016-05-01

    The consumer demand for retrieving and delivering visual content through consumer electronic devices has increased rapidly in recent years. The quality of video in packet networks is susceptible to certain traffic characteristics: average bandwidth availability, loss, delay and delay variation (jitter). This paper presents a scheduling algorithm that modifies the stream of scalable video to combat jitter. The algorithm provides unequal look-ahead by safeguarding the base layer (without the need for overhead) of the scalable video. The results of the experiments show that our scheduling algorithm reduces the number of frames with a violated deadline and significantly improves the continuity of the video stream without compromising the average Y Peak Signal-to-Noise Ratio (PSNR).

  9. Aggregation Potentials for Buildings - Business Models of Demand Response and Virtual Power Plants

    DEFF Research Database (Denmark)

    Ma, Zheng; Billanes, Joy Dalmacio; Jørgensen, Bo Nørregaard

    2017-01-01

    programs, national regulations and energy market structures strongly influence buildings’ participation in the aggregation market. Under the current Nordic market regulation, business model one is the most feasible one, and business model two faces more challenges due to regulation barriers and limited...... aggregation market with unclear incentives is still a challenge for buildings to participate in the aggregation market. However, few studies have investigated business models for building participation in the aggregation market. Therefore, this paper develops four business models for buildings to participate...

  10. Issues of Application of Machine Learning Models for Virtual and Real-Life Buildings

    Directory of Open Access Journals (Sweden)

    Young Min Kim

    2016-06-01

    The current Building Energy Performance Simulation (BEPS) tools are based on first principles. For the correct use of BEPS tools, simulationists should have an in-depth understanding of building physics, numerical methods, control logics of building systems, etc. However, it takes significant time and effort to develop a first-principles-based simulation model for existing buildings, mainly due to the laborious process of data gathering, uncertain inputs, model calibration, etc. Rather than resorting to an expert's effort, a data-driven approach (the so-called "inverse" approach) has received growing attention for the simulation of existing buildings. This paper reports a cross-comparison of three popular machine learning models (Artificial Neural Network (ANN), Support Vector Machine (SVM), and Gaussian Process (GP)) for predicting a chiller's energy consumption in a virtual and a real-life building. The predictions based on the three models are sufficiently accurate compared with the virtual and real measurements. This paper addresses the following issues for the successful development of machine learning models: reproducibility, selection of inputs, training period, outlying data obtained from the building energy management system (BEMS), and validation of the models. From the results of this comparative study, it was found that SVM has a disadvantage in computation time compared with ANN and GP, and that GP is the most sensitive of the three models to the training period.
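    All three model families compared in the paper are available in common libraries. A compact scikit-learn cross-comparison on synthetic data is sketched below; the features, kernels and data are placeholders standing in for the chiller records used in the study, not the authors' setup.

```python
# Cross-comparison of ANN (MLP), SVM and Gaussian Process regressors in
# scikit-learn on synthetic data standing in for chiller energy records.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, (500, 3))   # e.g. load ratio, outdoor temp, CHW temp
y = 50 * X[:, 0] + 5 * X[:, 1] ** 2 + rng.normal(0, 1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
    "SVM": SVR(C=100.0),
    "GP":  GaussianProcessRegressor(),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(name, mean_absolute_percentage_error(y_te, m.predict(X_te)))
```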

  11. Towards Scalable Strain Gauge-Based Joint Torque Sensors

    Science.gov (United States)

    D’Imperio, Mariapaola; Cannella, Ferdinando; Caldwell, Darwin G.; Cuschieri, Alfred

    2017-01-01

    During recent decades, strain gauge-based joint torque sensors have been commonly used to provide high-fidelity torque measurements in robotics. Although measurement of joint torque/force is often required in engineering research and development, the gluing and wiring of strain gauges used as torque sensors pose difficulties during integration within the restricted space available in small joints. The problem is compounded by the need for a scalable geometric design to measure joint torque. In this communication, we describe a novel design of a strain gauge-based mono-axial torque sensor referred to as the square-cut torque sensor (SCTS), the significant features of which are a high degree of linearity, symmetry, and high scalability in terms of both size and measuring range. Most importantly, the SCTS provides easy access for gluing and wiring of the strain gauges on the sensor surface despite the limited available space. We demonstrated that the SCTS was better in terms of symmetry (clockwise and counterclockwise rotation) and linearity. These capabilities have been shown through finite element modeling (ANSYS), confirmed by data obtained in load testing experiments. The high performance of the SCTS was confirmed by studies involving changes in size, material and/or wing width and thickness. Finally, we demonstrated that the SCTS can be successfully implemented inside the hip joints of the miniaturized hydraulically actuated quadruped robot MiniHyQ. This communication is based on work presented at the 18th International Conference on Climbing and Walking Robots (CLAWAR). PMID:28820446

  12. Thermal models of buildings. Determination of temperatures, heating and cooling loads. Theories, models and computer programs

    Energy Technology Data Exchange (ETDEWEB)

    Kaellblad, K

    1998-05-01

    The need to estimate indoor temperatures, heating or cooling loads and energy requirements for buildings arises at many stages of a building's life cycle, e.g. at the early layout stage, during the design of a building and in planning energy retrofits. Another purpose is to meet the authorities' requirements given in building codes. All these situations require good calculation methods. The main purpose of this report is to present the author's work on problems related to thermal models and calculation methods for the determination of temperatures and heating or cooling loads in buildings. Thus the major part of the report deals with the treatment of solar radiation in glazing systems, the shading of solar and sky radiation, and the computer program JULOTTA used to simulate the thermal behavior of rooms and buildings. Other parts of thermal models of buildings are discussed more briefly and included in order to give an overview of existing problems and available solutions. A brief presentation of how thermal models can be built up is also given, and it is hoped that the report can be useful as an introduction to this part of building physics as well as during the development of calculation methods and computer programs. The report may also serve as a help for the users of energy related programs. Independent of which method or program a user chooses to work with, it is his or her own responsibility to understand the limits of the tool, else wrong conclusions may be drawn from the results. 52 refs, 22 figs, 4 tabs
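    A common minimal form of such a thermal model is the lumped resistance-capacitance (RC) network. The sketch below steps a one-node 1R1C room model forward in time; it is far simpler than JULOTTA but shows the class of calculation involved, and every parameter value in it is invented.

```python
# One-node RC thermal model of a room: C dT/dt = (T_out - T) / R + Q_gain.
# Explicit-Euler sketch of the class of calculation such programs perform;
# parameter values are invented and much simpler than JULOTTA's models.
import numpy as np

R = 0.01        # K/W, overall envelope resistance (hypothetical)
C = 5e6         # J/K, lumped thermal capacitance (hypothetical)
dt = 60.0       # s, time step
T = 20.0        # initial indoor temperature, deg C

for h in np.arange(0, 24, dt / 3600):
    T_out = 5.0 + 5.0 * np.sin(2 * np.pi * h / 24)   # outdoor daily swing
    Q = 1000.0 if 8 <= h <= 18 else 0.0               # daytime gains, W
    T += dt / C * ((T_out - T) / R + Q)               # explicit Euler step
print(f"Indoor temperature after 24 h: {T:.1f} C")
```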

  13. Scalable Partitioning Algorithms for FPGAs With Heterogeneous Resources

    National Research Council Canada - National Science Library

    Selvakkumaran, Navaratnasothie; Ranjan, Abhishek; Raje, Salil; Karypis, George

    2004-01-01

    As FPGA densities increase, partitioning-based FPGA placement approaches are becoming increasingly important as they can be used to provide high-quality and computationally scalable placement solutions...

  14. Management Model for efficient quality control in new buildings

    Directory of Open Access Journals (Sweden)

    C. E. Rodríguez-Jiménez

    2017-09-01

    In Spain, the management of the quality control of each building process is usually set up at different levels of demand. This work seeks to obtain a reference model with which to compare the quality control of the building process of a specific product (a building) and to evaluate its warranty level. For this purpose, specialized sources were consulted and 153 real cases of quality control were carefully revised using a multi-judgment method. Different techniques were applied to obtain an impartial valuation of the input parameters through the Delphi method (a query of 17 experts), whose matrix treatment with the Fuzzy-QFD tool condenses the numerical references through a weighted distribution of the selected functions and their corresponding conditioning factors. The model thus obtained (M153) provides a quality control reference with which to meet quality expectations.

  15. Scalable Atomistic Simulation Algorithms for Materials Research

    Directory of Open Access Journals (Sweden)

    Aiichiro Nakano

    2002-01-01

    A suite of scalable atomistic simulation programs has been developed for materials research based on space-time multiresolution algorithms. The design and analysis of parallel algorithms are presented for molecular dynamics (MD) simulations and quantum-mechanical (QM) calculations based on density functional theory. Performance tests have been carried out on 1,088-processor Cray T3E and 1,280-processor IBM SP3 computers. The linear-scaling algorithms have enabled 6.44-billion-atom MD and 111,000-atom QM calculations on 1,024 SP3 processors with parallel efficiency well over 90%. The production-quality programs also feature wavelet-based computational-space decomposition for adaptive load balancing, space-filling-curve-based adaptive data compression with user-defined error bound for scalable I/O, and octree-based fast visibility culling for immersive and interactive visualization of massive simulation data.

  16. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  17. A Graph-Based Approach for 3D Building Model Reconstruction from Airborne LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    Bin Wu

    2017-01-01

    3D building model reconstruction is of great importance for environmental and urban applications. Airborne light detection and ranging (LiDAR) is a very useful data source for acquiring detailed geometric and topological information of building objects. In this study, we employed a graph-based method based on hierarchical structure analysis of building contours derived from LiDAR data to reconstruct urban building models. The proposed approach first uses a graph theory-based localized contour tree method to represent the topological structure of buildings, then separates the buildings into different parts by analyzing their topological relationships, and finally reconstructs the building model by integrating all the individual models established through the bipartite graph matching process. Our approach provides a more complete topological and geometrical description of building contours than existing approaches. We evaluated the proposed method by applying it to the Lujiazui region in Shanghai, China, a complex and large urban scene with various types of buildings. The results revealed that complex buildings could be reconstructed successfully with a mean modeling error of 0.32 m. Our proposed method offers a promising solution for 3D building model reconstruction from airborne LiDAR point clouds.

  18. Development of Automated Procedures to Generate Reference Building Models for ASHRAE Standard 90.1 and India’s Building Energy Code and Implementation in OpenStudio

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Andrew [National Renewable Energy Lab. (NREL), Golden, CO (United States); Haves, Philip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jegi, Subhash [International Institute of Information Technology, Hyderabad (India); Garg, Vishal [International Institute of Information Technology, Hyderabad (India); Ravache, Baptiste [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-09-14

    This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.

  19. Scalable DeNoise-and-Forward in Bidirectional Relay Networks

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Krigslund, Rasmus; Popovski, Petar

    2010-01-01

    In this paper a scalable relaying scheme is proposed based on an existing concept called DeNoise-and-Forward, DNF. We call it Scalable DNF, S-DNF, and it targets the scenario with multiple communication flows through a single common relay. The idea of the scheme is to combine packets at the relay...... in order to save transmissions. To ensure decodability at the end-nodes, a priori information about the content of the combined packets must be available. This is gathered during the initial transmissions to the relay. The trade-off between decodability and number of necessary transmissions is analysed...

  20. Building models for marketing decisions : Past, present and future

    NARCIS (Netherlands)

    Leeflang, PSH; Wittink, DR

    We review five eras of model building in marketing, with special emphasis on the fourth and the fifth eras, the present and the future. At many firms managers now routinely use model-based results for marketing decisions. Given an increasing number of successful applications, the demand for models

  1. The Dutch sustainable building policy: A model for developing countries?

    Energy Technology Data Exchange (ETDEWEB)

    Melchert, Luciana [Faculty of Architecture and Urbanism, University of Sao Paulo, Rua do Lago, 876, CEP 05508.900, Sao Paulo SP (Brazil)

    2007-02-15

    This article explores the institutionalization of environmental policies in the Dutch building sector and the applicability of the current model to developing countries. First, it analyzes the transition of sustainable building practices in the Netherlands from the 1970s until today, exploring how these were originally embedded in a discourse on 'de-modernization', which attempted to improve the environmental performance of building stocks by means of self-sufficient technologies, whereas nowadays they adopt a framework of 'ecological modernization', with integrative approaches seeking to improve the environmental performance of building stocks through more efficient-rather than self-sufficient-technologies. The study subsequently shows how the current Dutch sustainable building framework has thereby managed to achieve a pragmatic and widely accepted rationale, which can serve to orient the ecological restructuring of building stocks in developing countries. (author)

  2. Modeling Boston: A workflow for the efficient generation and maintenance of urban building energy models from existing geospatial datasets

    International Nuclear Information System (INIS)

    Cerezo Davila, Carlos; Reinhart, Christoph F.; Bemis, Jamie L.

    2016-01-01

    City governments and energy utilities are increasingly focusing on the development of energy efficiency strategies for buildings as a key component in emission reduction plans and energy supply strategies. To support these diverse needs, a new generation of Urban Building Energy Models (UBEM) is currently being developed and validated to estimate citywide hourly energy demands at the building level. However, in order for cities to rely on UBEMs, effective model generation and maintenance workflows are needed based on existing urban data structures. Within this context, the authors collaborated with the Boston Redevelopment Authority to develop a citywide UBEM based on official GIS datasets and a custom building archetype library. Energy models for 83,541 buildings were generated and assigned one of 52 use/age archetypes, within the CAD modelling environment Rhinoceros3D. The buildings were then simulated using the US DOE EnergyPlus simulation program, and results for buildings of the same archetype were crosschecked against data from the US national energy consumption surveys. A district-level intervention combining photovoltaics with demand side management is presented to demonstrate the ability of UBEM to provide actionable information. Lack of widely available archetype templates and metered energy data, were identified as key barriers within existing workflows that may impede cities from effectively applying UBEM to guide energy policy. - Highlights: • Data requirements for Urban Building Energy Models are reviewed. • A workflow for UBEM generation from available GIS datasets is developed. • A citywide demand simulation model for Boston is generated and tested. • Limitations for UBEM in current urban data systems are identified and discussed. • Model application for energy management policy is shown in an urban PV scenario.

  3. A scalable and deformable stylized model of the adult human eye for radiation dose assessment.

    Science.gov (United States)

    El Basha, Daniel; Furuta, Takuya; Iyer, Siva S R; Bolch, Wesley E

    2018-03-23

    With recent changes in the recommended annual limit on eye lens exposures to ionizing radiation, there is considerable interest in predictive computational dosimetry models of the human eye and its various ocular structures including the crystalline lens, ciliary body, cornea, retina, optic nerve, and central retinal artery. Computational eye models to date have been constructed as stylized models, high-resolution voxel models, and polygon mesh models. Their common feature, however, is that they are typically constructed of nominal size and of a roughly spherical shape associated with the emmetropic eye. In this study, we present a geometric eye model that is both scalable (allowing for changes in eye size) and deformable (allowing for changes in eye shape), and that is suitable for use in radiation transport studies of ocular exposures and radiation treatments of eye disease. The model allows continuous and variable changes in eye size (axial lengths from 20 to 26 mm) and eye shape (diopters from -12 to +6). As an explanatory example of its use, five models (emmetropic eyes of small, average, and large size, as well as average size eyes of -12D and +6D) were constructed and subjected to normally incident beams of monoenergetic electrons and photons, with resultant energy-dependent dose coefficients presented for both anterior and posterior eye structures. Electron dose coefficients were found to vary with changes to both eye size and shape for the posterior eye structures, while their values for the eye crystalline lens were found to be sensitive to changes in only eye size. No dependence upon eye size or eye shape was found for photon dose coefficients at energies below 2 MeV. Future applications of the model can include more extensive tabulations of dose coefficients to all ocular structures (not only the lens) as a function of eye size and shape, as well as the assessment of x-ray therapies for ocular disease for patients with non-emmetropic eyes.

  4. A scalable and deformable stylized model of the adult human eye for radiation dose assessment

    Science.gov (United States)

    El Basha, Daniel; Furuta, Takuya; Iyer, Siva S. R.; Bolch, Wesley E.

    2018-05-01

    With recent changes in the recommended annual limit on eye lens exposures to ionizing radiation, there is considerable interest in predictive computational dosimetry models of the human eye and its various ocular structures including the crystalline lens, ciliary body, cornea, retina, optic nerve, and central retinal artery. Computational eye models to date have been constructed as stylized models, high-resolution voxel models, and polygon mesh models. Their common feature, however, is that they are typically constructed of nominal size and of a roughly spherical shape associated with the emmetropic eye. In this study, we present a geometric eye model that is both scalable (allowing for changes in eye size) and deformable (allowing for changes in eye shape), and that is suitable for use in radiation transport studies of ocular exposures and radiation treatments of eye disease. The model allows continuous and variable changes in eye size (axial lengths from 20 to 26 mm) and eye shape (diopters from -12 to +6). As an explanatory example of its use, five models (emmetropic eyes of small, average, and large size, as well as average size eyes of -12D and +6D) were constructed and subjected to normally incident beams of monoenergetic electrons and photons, with resultant energy-dependent dose coefficients presented for both anterior and posterior eye structures. Electron dose coefficients were found to vary with changes to both eye size and shape for the posterior eye structures, while their values for the crystalline lens were found to be sensitive to changes in only eye size. No dependence upon eye size or eye shape was found for photon dose coefficients at energies below 2 MeV. Future applications of the model can include more extensive tabulations of dose coefficients to all ocular structures (not only the lens) as a function of eye size and shape, as well as the assessment of x-ray therapies for ocular disease for patients with non-emmetropic eyes.

  5. INTEGRATING SMARTPHONE IMAGES AND AIRBORNE LIDAR DATA FOR COMPLETE URBAN BUILDING MODELLING

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2016-06-01

    A complete building model reconstruction needs data collected from both air and ground. The former often has sparse coverage on building façades, while the latter usually is unable to observe the building rooftops. Attempting to solve the missing data issues in building reconstruction from a single data source, we describe an approach for complete building reconstruction that integrates airborne LiDAR data and ground smartphone imagery. First, by taking advantage of the GPS and digital compass information embedded in the image metadata of smartphones, we are able to find the airborne LiDAR point clouds for the buildings appearing in the images. In the next step, Structure-from-Motion and dense multi-view stereo algorithms are applied to generate a building point cloud from multiple ground images. The third step extracts building outlines from the LiDAR point cloud and from the ground image point cloud, respectively. An automated correspondence between these two sets of building outlines allows us to achieve a precise registration and combination of the two point clouds, which ultimately results in a complete and full resolution building model. The developed approach overcomes the problem of sparse points on building façades in airborne LiDAR and the deficiency of rooftops in ground images, such that the merits of both datasets are utilized.
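
    The registration step (aligning the outline extracted from the LiDAR cloud with the outline from the image-derived cloud) reduces, once correspondences are fixed, to estimating a rigid transform between matched 2D points. Below is a minimal sketch using the Kabsch procedure; the correspondence search itself, and the paper's actual formulation, are not shown.

    ```python
    # Minimal sketch: given matched 2D outline points from the LiDAR and
    # image-derived point clouds, estimate the rigid transform (rotation +
    # translation) aligning them via the Kabsch procedure.
    import numpy as np

    def rigid_align(src: np.ndarray, dst: np.ndarray):
        """src, dst: (N, 2) matched points. Returns R (2x2), t (2,)."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
        R = Vt.T @ np.diag([1.0, d]) @ U.T
        t = mu_d - R @ mu_s
        return R, t
    ```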

  6. An Approach for On-Board Software Building Blocks Cooperation and Interfaces Definition

    Science.gov (United States)

    Pascucci, Dario; Campolo, Giovanni; Candia, Sante; Lisio, Giovanni

    2010-08-01

    This paper provides an insight into the Avionic SW architecture developed by Thales Alenia Space Italy (TAS-I) to achieve structuring of the OBSW as a set of self-standing and re-usable building blocks. The underlying framework for building block cooperation is described first; it is based on ECSS-E-70 packet forwarding (for service requests to a building block) and standard parameter exchange for data communication. Subsequently, the high flexibility and scalability of the resulting architecture are discussed, reporting as an example an implementation of the Failure Detection, Isolation and Recovery (FDIR) function which exploits the proposed architecture. The presented approach evolves from the avionic SW architecture developed in the scope of the project PRIMA (Multi-Purpose Italian Re-configurable Platform) and has been adopted for the Sentinel-1 Avionic Software (ASW).

  7. Multiscale modelling for better hygrothermal prediction of porous building materials

    Directory of Open Access Journals (Sweden)

    Belarbi Rafik

    2018-01-01

    The aim of this work is to understand the influence of the microstructural geometric parameters of porous building materials on the mechanisms of coupled heat, air and moisture transfer, in order to predict, control and improve the behavior and durability of the building. To this end, a multi-scale approach is implemented. It consists of mastering the dominant physical phenomena and their interactions at the microscopic scale, followed by dual-scale (microscopic-macroscopic) modelling of coupled heat, air and moisture transfer that takes into account the intrinsic properties and microstructural topology of the material, characterized using X-ray tomography combined with 3D image correlation. The hygromorphic behavior under hydric solicitations was also considered. In this context, a model of coupled heat, air and moisture transfer in porous building materials was developed using the periodic homogenization technique. This information was subsequently implemented in a dynamic simulation that models the hygrothermal behaviour of the material at the scale of the building envelope and the indoor air quality of the building. Results reveal that it is essential to consider the local behavior of materials, and also to be able to measure and quantify the evolution of their properties at the macroscopic scale from the youngest age of the material. In addition, comparisons between experimental and numerical temperature and relative humidity profiles in multilayer walls and in building envelopes were undertaken, and good agreement was observed.

  8. Modeling of Heat Transfer in Rooms in the Modelica "Buildings" Library

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nouidui, Thierry Stephane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-11-01

    This paper describes the implementation of the room heat transfer model in the free open-source Modelica "Buildings" library. The model can be used as a single room or to compose a multizone building model. We discuss how the model is decomposed into submodels for the individual heat transfer phenomena. We also discuss the main physical assumptions. The room model can be parameterized to use different modeling assumptions, leading to linear or non-linear differential algebraic systems of equations. We present numerical experiments that show how these assumptions affect computing time and accuracy for selected cases of the ANSI/ASHRAE Standard 140-2007 envelope validation tests.

  9. Enhancements to AERMOD's building downwash algorithms based on wind-tunnel and Embedded-LES modeling

    Science.gov (United States)

    Monbureau, E. M.; Heist, D. K.; Perry, S. G.; Brouwer, L. H.; Foroutan, H.; Tang, W.

    2018-04-01

    Knowing the fate of effluent from an industrial stack is important for assessing its impact on human health. AERMOD is one of several Gaussian plume models containing algorithms to evaluate the effect of buildings on the movement of the effluent from a stack. The goal of this study is to improve AERMOD's ability to accurately model important and complex building downwash scenarios by incorporating knowledge gained from a recently completed series of wind tunnel studies and complementary large eddy simulations of flow and dispersion around simple structures for a variety of building dimensions, stack locations, stack heights, and wind angles. This study presents three modifications to the building downwash algorithm in AERMOD that improve the physical basis and internal consistency of the model, and one modification to AERMOD's building pre-processor to better represent elongated buildings in oblique winds. These modifications are demonstrated to improve the ability of AERMOD to model observed ground-level concentrations in the vicinity of a building for the variety of conditions examined in the wind tunnel and numerical studies.
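
    For orientation only: AERMOD belongs to the Gaussian plume family, whose undisturbed kernel is compact enough to sketch. The function below is the textbook ground-reflected plume equation; none of AERMOD's downwash, plume rise or stability machinery is represented, and all symbols are generic assumptions rather than the model's internals.

    ```python
    # Illustrative Gaussian plume concentration with ground reflection.
    # Building downwash (the subject of the paper above) acts by modifying
    # the dispersion parameters and streamlines; that is not modeled here.
    import math

    def plume_concentration(Q, u, y, z, h, sigma_y, sigma_z):
        """Q: emission rate (g/s); u: wind speed (m/s); h: effective stack
        height (m); sigma_y, sigma_z: dispersion coefficients (m)."""
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                    + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    print(f"{plume_concentration(10, 4, 0, 1.5, 30, 20, 10):.2e} g/m^3")
    ```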

  10. Actual building energy use patterns and their implications for predictive modeling

    International Nuclear Information System (INIS)

    Heidarinejad, Mohammad; Cedeño-Laurent, Jose G.; Wentz, Joshua R.; Rekstad, Nicholas M.; Spengler, John D.; Srebric, Jelena

    2017-01-01

    Highlights: • Developed three building categories based on energy use patterns of campus buildings. • Evaluated implication of temporal energy data granularity on predictive modeling. • Demonstrated importance of monitoring daily chilled water consumption. • Identified interval electricity data as an indicator of building operation schedules. • Demonstrated a calibration process for energy modeling of a campus building. - Abstract: The main goal of this study is to understand the patterns in which commercial buildings consume energy, rather than evaluating building energy use based on aggregate utility bills typically linked to building principal tenant activity or occupancy type. The energy consumption patterns define buildings as externally-load, internally-load, or mixed-load dominated buildings. The Penn State and Harvard campuses serve as case studies for this particular research project. The buildings on these two campuses use steam, chilled water, and electricity as energy commodities and maintain databases of different resolutions, including minute, hourly, daily, and monthly data instances depending on the commodity and available data acquisition system. The results of this study show monthly steam consumption directly correlates to outdoor environmental conditions for 88% of the studied buildings, while chilled water consumption has negligible correlation to the outdoor environmental conditions. Thus, in terms of monthly chilled water consumption, 86% of buildings are internally-load or mixed-load dominated. Chilled water consumption is better suited for daily analyses compared to monthly and hourly analyses. While the influence of building operation schedules affects the analyses at the hourly level, the monthly chilled water consumption values are not good indicators of the building energy consumption patterns. Electricity consumption at the monthly (or seasonal) level can support the building energy simulation tools for the
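
    As a concrete illustration of the correlation check reported above (monthly steam use tracking outdoor conditions), here is a toy computation; the data arrays are made up for the example and the study's analysis is of course richer.

    ```python
    # Toy check: does monthly steam use track outdoor temperature?
    import numpy as np

    monthly_outdoor_temp = np.array([-2, 0, 5, 11, 17, 22, 25, 24, 19, 12, 6, 0])   # degC
    monthly_steam_use    = np.array([95, 90, 70, 45, 20, 8, 5, 6, 15, 40, 65, 88])  # arbitrary units

    r = np.corrcoef(monthly_outdoor_temp, monthly_steam_use)[0, 1]
    print(f"correlation: {r:.2f}")  # strongly negative -> externally-load dominated heating
    ```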

  11. The Front-End Concentrator card for the RD51 Scalable Readout System

    International Nuclear Information System (INIS)

    Toledo, J; Esteve, R; Monzó, J M; Tarazona, A; Muller, H; Martoiu, S

    2011-01-01

    Conventional readout systems exist in many variants since the usual approach is to build readout electronics for one given type of detector. The Scalable Readout System (SRS) developed within the RD51 collaboration relaxes this situation considerably by providing a choice of frontends which are connected over a customizable interface to a common SRS DAQ architecture. This allows sharing development and production costs among a large base of users as well as support from a wide base of developers. The Front-end Concentrator card (FEC), a RD51 common project between CERN and the NEXT Collaboration, is a reconfigurable interface between the SRS online system and a wide range of frontends. This is accomplished by using application-specific adapter cards between the FEC and the frontends. The ensemble (FEC and adapter card are edge mounted) forms a 6U × 220 mm Eurocard combo that fits on a 19'' subchassis. Adapter cards exist already for the first applications and more are in development.

  12. Pseudo-Bond Graph model for the analysis of the thermal behavior of buildings

    Directory of Open Access Journals (Sweden)

    Merabtine Abdelatif

    2013-01-01

    In this work, a simplified graphical modeling tool, which to some extent can be considered halfway between detailed physical and data-driven dynamic models, has been developed. The model is based on the Bond Graph approach, which has the potential to display explicitly the nature of power flow in a building system, such as the storage, processing and dissipation of energy in Heating, Ventilation and Air-Conditioning (HVAC) systems. This paper presents the developed models of two transient heat conduction problems corresponding to the most practical cases in the building envelope, such as heat transfer through vertical walls, roofs and slabs. The validation procedure consists of comparing the results obtained with this model against analytical solutions, and very good agreement was found between measured data and the Bond Graph model simulation. The Bond Graph technique is then used to model the dynamic thermal behavior of a single-zone building structure and compared with a set of experimental data; an evaluation of indoor temperature was carried out in order to check the Bond Graph model.

  13. Comparison of sensorless dimming control based on building modeling and solar power generation

    International Nuclear Information System (INIS)

    Lee, Naeun; Kim, Jonghun; Jang, Cheolyong; Sung, Yoondong; Jeong, Hakgeun

    2015-01-01

    Artificial lighting in office buildings accounts for about 30% of total building energy consumption. Lighting energy is an important target for reducing building energy consumption since artificial lighting typically has a relatively large energy conversion factor. Therefore, previous studies have proposed dimming control using daylight. When dimming control is applied, a method based on building modeling does not need illuminance sensors, so it can be applied to existing buildings that do not have them. However, this method does not accurately reflect real-time weather conditions. On the other hand, solar power generation from a PV (photovoltaic) panel reflects real-time weather conditions, and using the PV panel as the sensor improves the accuracy of dimming control by reflecting disturbances. Therefore, we compared and analyzed two types of sensorless dimming control: those based on building modeling and those based on solar power generation using PV panels. In terms of energy savings, we found that dimming control based on building modeling is more effective than that based on solar power generation by about 6%. However, dimming control based on solar power generation minimizes the inconvenience to occupants and can also react to changes in the solar radiation entering the building caused by, for example, a dirty window. - Highlights: • We conducted sensorless dimming control based on solar power generation. • Dimming controls using building modeling and solar power generation were compared. • Real-time weather conditions can be considered by using solar power generation. • Dimming control using solar power generation minimizes inconvenience to occupants

  14. A meta model-based methodology for an energy savings uncertainty assessment of building retrofitting

    Directory of Open Access Journals (Sweden)

    Caucheteux Antoine

    2016-01-01

    To reduce greenhouse gas emissions, energy retrofitting of the building stock presents significant potential for energy savings. In the design stage, energy savings are usually assessed through Building Energy Simulation (BES). The main difficulty is first to assess the energy efficiency of the existing building, in other words, to calibrate the model. As calibration is an underdetermined problem, there are many solutions for building representation in simulation tools. In this paper, a method is proposed to assess not only energy savings but also their uncertainty. Meta models, built using experimental designs, are used to identify many acceptable calibrations: sets of parameters that provide the most accurate representation of the building are retained to calculate energy savings. The method was applied to an existing office building modeled with the TRNSYS BES tool. The meta model, using 13 parameters, is built with no more than 105 simulations. The evaluation of the meta model on thousands of new simulations gives a normalized mean bias error between the meta model and BES of <4%. Energy savings are assessed for six energy saving concepts, which indicate savings of 2–45% with a standard deviation ranging between 1.3% and 2.5%.
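
    The meta-modelling idea above (fit a fast surrogate to a modest number of BES runs, then screen thousands of candidate calibrations with the surrogate) can be sketched as follows. The 13-parameter design and the TRNSYS runs are stood in for by a toy 2-parameter function; everything here is illustrative.

    ```python
    # Sketch: polynomial response-surface surrogate for an expensive simulator.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(105, 2))                        # 105 "simulations"
    y = 3 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0, 0.01, 105)   # stand-in BES output

    meta = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

    X_new = rng.uniform(0, 1, size=(10_000, 2))           # cheap surrogate screening
    y_hat = meta.predict(X_new)
    acceptable = X_new[np.abs(y_hat - 2.0) < 0.05]        # calibrations matching a "measured" 2.0
    nmbe = (meta.predict(X) - y).mean() / y.mean()        # normalized mean bias error
    print(len(acceptable), f"acceptable calibrations; NMBE {nmbe:+.2%}")
    ```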

  15. Toward Accessing Spatial Structure from Building Information Models

    Science.gov (United States)

    Schultz, C.; Bhatt, M.

    2011-08-01

    Data about building designs and layouts is becoming increasingly available. In the near future, service personnel (such as maintenance staff or emergency rescue workers) arriving at a building site will have immediate real-time access to enormous amounts of data relating to structural properties, utilities, materials, temperature, and so on. The critical problem for users is the taxing and error-prone task of interpreting such a large body of facts in order to extract salient information. This is necessary for comprehending a situation and deciding on a plan of action, and is a particularly serious issue in time-critical and safety-critical activities such as firefighting. Current unifying building models such as the Industry Foundation Classes (IFC), while being comprehensive, do not directly provide data structures that focus on spatial reasoning and spatial modalities that are required for high-level analytical tasks. The aim of the research presented in this paper is to provide computational tools for higher level querying and reasoning that shift the cognitive burden of dealing with enormous amounts of data away from the user. The user can then spend more energy and time in planning and decision making in order to accomplish the tasks at hand. We present an overview of our framework that provides users with an enhanced model of "built-up space". In order to test our approach using realistic design data (in terms of both scale and the nature of the building models) we describe how our system interfaces with IFC, and we conduct timing experiments to determine the practicality of our approach. We discuss general computational approaches for deriving higher-level spatial modalities by focusing on the example of route graphs. Finally, we present a firefighting scenario with alternative route graphs to motivate the application of our framework.
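
    To make the route-graph idea concrete: rooms and connections derived from a building model become nodes and weighted edges, and alternative evacuation routes are simply alternative paths. The sketch below uses networkx as a stand-in for the paper's framework; all node names and weights are hypothetical.

    ```python
    # Hypothetical route graph derived from a building model.
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("room_101", "corridor_A", 3.0), ("room_102", "corridor_A", 2.5),
        ("corridor_A", "stairwell_1", 8.0), ("corridor_A", "stairwell_2", 15.0),
        ("stairwell_1", "exit", 4.0), ("stairwell_2", "exit", 4.0),
    ])

    print(nx.shortest_path(G, "room_101", "exit", weight="weight"))
    # If stairwell_1 is blocked, drop it and re-plan an alternative route:
    G.remove_node("stairwell_1")
    print(nx.shortest_path(G, "room_101", "exit", weight="weight"))
    ```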

  16. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  17. A procedure for building product models

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin

    2001-01-01

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes which are to be supported with product models. The next phase includes an analysis of the product assortment, and the set-up of a so-called product master. Finally, the product model is designed and implemented using object-oriented modelling. The procedure is developed in order to ensure that the product models constructed are fit for the business processes they support, and properly structured and documented, in order to facilitate that the systems can be maintained continually and further developed. The research has been carried out at the Centre for Industrialisation of Engineering, Department of Manufacturing Engineering, Technical University of Denmark.

  18. Extension of the PMV model to non-air-conditioned building in warm climates

    DEFF Research Database (Denmark)

    Fanger, Povl Ole; Toftum, Jørn

    2002-01-01

    The PMV model agrees well with high-quality field studies in buildings with HVAC systems, situated in cold, temperate and warm climates, studied during both summer and winter. In non-air-conditioned buildings in warm climates, occupants may sense the warmth as being less severe than the PMV predicts. The main reason is low expectations, but a metabolic rate that is estimated too high can also contribute to explaining the difference. An extension of the PMV model that includes an expectancy factor is introduced for use in non-air-conditioned buildings in warm climates. The extended PMV model agrees well with quality field studies in non-air-conditioned buildings of three continents.
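
    The extension described above is commonly stated as multiplying the conventional PMV by an expectancy factor e (roughly 0.5 to 1.0, lower where air conditioning is rare), after any correction of the estimated metabolic rate. A minimal sketch, with illustrative values:

    ```python
    # Extended PMV for non-air-conditioned buildings in warm climates:
    # ePMV = e * PMV, with expectancy factor e typically in [0.5, 1.0].
    def extended_pmv(pmv_conventional: float, expectancy: float) -> float:
        if not 0.5 <= expectancy <= 1.0:
            raise ValueError("expectancy factor is typically in [0.5, 1.0]")
        return expectancy * pmv_conventional

    print(extended_pmv(1.8, 0.7))  # 1.26: warmth sensed as less severe
    ```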

  19. Space-Filling Supercapacitor Carpets: Highly scalable fractal architecture for energy storage

    Science.gov (United States)

    Tiliakos, Athanasios; Trefilov, Alexandra M. I.; Tanasǎ, Eugenia; Balan, Adriana; Stamatin, Ioan

    2018-04-01

    Revamping ground-breaking ideas from fractal geometry, we propose an alternative micro-supercapacitor configuration realized by laser-induced graphene (LIG) foams produced via laser pyrolysis of inexpensive commercial polymers. The Space-Filling Supercapacitor Carpet (SFSC) architecture introduces the concept of nested electrodes based on the pre-fractal Peano space-filling curve, arranged in a symmetrical equilateral setup that incorporates multiple parallel capacitor cells sharing common electrodes for maximum efficiency and optimal length-to-area distribution. We elucidate the theoretical foundations of the SFSC architecture, and we introduce innovations (high-resolution vector-mode printing) in the LIG method that allow for the realization of flexible and scalable devices based on low iterations of the Peano algorithm. SFSCs exhibit distributed capacitance properties, leading to capacitance, energy, and power ratings proportional to the number of nested electrodes (up to 4.3 mF, 0.4 μWh, and 0.2 mW for the largest tested model of low iteration using aqueous electrolytes), with competitively high energy and power densities. This can pave the way for full scalability in energy storage, reaching beyond the scale of micro-supercapacitors for incorporation into larger and more demanding applications.
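
    For readers unfamiliar with pre-fractal Peano geometry: the electrode path can be generated as turtle instructions by a standard L-system (F: draw, +/-: turn 90 degrees). The sketch below produces only the curve string; the paper's laser writing and capacitance analysis are outside its scope.

    ```python
    # One standard L-system form of the Peano space-filling curve.
    RULES = {
        "L": "LFRFL-F-RFLFR+F+LFRFL",
        "R": "RFLFR+F+LFRFL-F-RFLFR",
    }

    def peano(iterations: int, axiom: str = "L") -> str:
        s = axiom
        for _ in range(iterations):
            s = "".join(RULES.get(c, c) for c in s)
        return s

    path = peano(2)
    print(path.count("F"), "segments")  # 8, 80, 728, ... (9**n - 1 per iteration n)
    ```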

  20. Public health component in building information modeling

    Science.gov (United States)

    Trufanov, A. I.; Rossodivita, A.; Tikhomirov, A. A.; Berestneva, O. G.; Marukhina, O. V.

    2018-05-01

    The building information modelling (BIM) concept has established itself as an effective and practical approach to planning, designing, constructing, and managing buildings and infrastructure. Analysis of the governance literature has shown that BIM-developed tools do not take fully into account the growing demands of the ecology and health fields. In this connection, it is possible to offer an optimal way of adapting such tools through the necessary consideration of the sanitary and hygienic specifications of materials used in the construction industry. It is proposed to do this through the introduction of assessments that meet the requirements of national sanitary standards. This approach is demonstrated in a case study of the Revit® program.

  1. Business Process Optimization Through Soa And Cloud Integration Using Soa- Ra Model

    Directory of Open Access Journals (Sweden)

    Syed Ejaz Ali Shah

    2015-08-01

    Business process workflow architecture based on agility and flexibility plays an important role in the success of any enterprise. In the new era, most processes are automated and supported by IT services in the form of Service Oriented Architecture (SOA) components. Due to mobility and scalability, as well as high-performance computing and distributed working environments, it is crucial to focus on an architecture which is agile, optimized, cost-effective and easy to implement. In this paper we have conducted a research study on a layer-based BPM, SOA and cloud integrated architecture. The main contribution of the research study is to propose an agile, cost-effective and scalable solution framework based on Architectural Building Blocks (ABBs) following a SOA-RA layered model to integrate BPM, SOA and cloud services.

  2. Differentiation of Human Pluripotent Stem Cells into Functional Endothelial Cells in Scalable Suspension Culture

    Directory of Open Access Journals (Sweden)

    Ruth Olmer

    2018-05-01

    Summary: Endothelial cells (ECs) are involved in a variety of cellular responses. As multifunctional components of vascular structures, endothelial (progenitor) cells have been utilized in cellular therapies and are required as an important cellular component of engineered tissue constructs and in vitro disease models. Although primary ECs from different sources are readily isolated and expanded, cell quantity and quality in terms of functionality and karyotype stability is limited. ECs derived from human induced pluripotent stem cells (hiPSCs) represent an alternative and potentially superior cell source, but traditional culture approaches and 2D differentiation protocols hardly allow for production of large cell numbers. Aiming at the production of ECs, we have developed a robust approach for efficient endothelial differentiation of hiPSCs in scalable suspension culture. The established protocol results in relevant numbers of ECs for regenerative approaches and industrial applications that show in vitro proliferation capacity and a high degree of chromosomal stability. In this article, U. Martin and colleagues show the generation of hiPSC-derived endothelial cells in scalable cultures of up to 100 mL culture volume. The generated ECs show in vitro proliferation capacity and a high degree of chromosomal stability after in vitro expansion. The established protocol allows the generation of hiPSC-derived ECs in relevant numbers for regenerative approaches. Keywords: hiPSC differentiation, endothelial cells, scalable culture

  3. Using scalable vector graphics to evolve art

    NARCIS (Netherlands)

    den Heijer, E.; Eiben, A. E.

    2016-01-01

    In this paper, we describe our investigations of the use of scalable vector graphics as a genotype representation in evolutionary art. We describe the technical aspects of using SVG in evolutionary art, and explain our custom, SVG-specific operators for initialisation, mutation and crossover. We perform

  4. JobCenter: an open source, cross-platform, and distributed job queue management system optimized for scalability and versatility.

    Science.gov (United States)

    Jaschob, Daniel; Riffle, Michael

    2012-07-30

    Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.

  5. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen; Guo, Siyue

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
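
    A minimal sketch of the overtime model as described: a binomial draw for how many of N occupants stay late, and exponential draws for how long each stays. The parameter values below are illustrative assumptions, not the paper's fitted estimates.

    ```python
    # Stochastic overtime occupancy: binomial headcount, exponential durations.
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_overtime(n_occupants=100, p_overtime=0.15, mean_duration_h=1.5):
        n_staying = rng.binomial(n_occupants, p_overtime)        # how many stay
        durations = rng.exponential(mean_duration_h, n_staying)  # hours each
        return n_staying, durations

    n, hours = sample_overtime()
    print(n, "occupants overtime; person-hours:", round(hours.sum(), 1))
    ```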

  6. Economical analyses of build-operate-transfer model in establishing alternative power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yumurtaci, Zehra [Yildiz Technical University, Department of Mechanical Engineering, Y.T.U. Mak. Fak. Mak. Muh. Bolumu, Besiktas, 34349 Istanbul (Turkey)]. E-mail: zyumur@yildiz.edu.tr; Erdem, Hasan Hueseyin [Yildiz Technical University, Department of Mechanical Engineering, Y.T.U. Mak. Fak. Mak. Muh. Bolumu, Besiktas, 34349 Istanbul (Turkey)

    2007-01-15

    The most widely employed method to meet the increasing electricity demand is building new power plants. The most important issue in building new power plants is to secure the necessary funding. Various models are employed, especially in developing countries, in order to overcome this problem and to find a financial source. One of these models is the build-operate-transfer (BOT) model. In this model, the investor raises all the funds for mandatory expenses and provides financing, builds the plant and, after a certain plant operation period, transfers the plant to the national power organization. In this model, the object is to decrease the burden of power plants on the state budget. The most important issue in the BOT model is the dependence of the unit electricity cost on the transfer period. In this study, the model giving the unit electricity cost as a function of the transfer period for plants established according to the BOT model has been discussed. Unit electricity investment cost and unit electricity cost in relation to transfer period have been determined for the plant types. Furthermore, the change in unit electricity cost depending on load factor, which is one of the parameters affecting annual electricity production, has been determined, and the results have been analyzed. This method can be employed for comparing the production costs of different plants that are planned to be established according to the BOT model, or it can be employed to determine the appropriateness of the BOT model.
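
    An illustrative levelized-cost calculation in the spirit of this analysis: the investor must recover capital plus O&M over the transfer period T, so the unit electricity cost falls as T grows. The simple capital-recovery formulation and every number below are assumptions for the example, not the paper's model.

    ```python
    # Unit electricity cost vs. BOT transfer period (toy numbers).
    def unit_cost(capex, om_per_year, rate, years, capacity_mw, load_factor):
        crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)  # capital recovery factor
        annual_cost = capex * crf + om_per_year
        annual_mwh = capacity_mw * 8760 * load_factor
        return annual_cost / annual_mwh

    for T in (10, 15, 20):
        c = unit_cost(1.2e9, 4.0e7, 0.08, T, 600, 0.75)
        print(f"transfer after {T} y: {c:.1f} $/MWh")
    ```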

  7. Economical analyses of build-operate-transfer model in establishing alternative power plants

    International Nuclear Information System (INIS)

    Yumurtaci, Zehra; Erdem, Hasan Hueseyin

    2007-01-01

    The most widely employed method to meet the increasing electricity demand is building new power plants. The most important issue in building new power plants is to secure the necessary funding. Various models are employed, especially in developing countries, in order to overcome this problem and to find a financial source. One of these models is the build-operate-transfer (BOT) model. In this model, the investor raises all the funds for mandatory expenses and provides financing, builds the plant and, after a certain plant operation period, transfers the plant to the national power organization. In this model, the object is to decrease the burden of power plants on the state budget. The most important issue in the BOT model is the dependence of the unit electricity cost on the transfer period. In this study, the model giving the unit electricity cost as a function of the transfer period for plants established according to the BOT model has been discussed. Unit electricity investment cost and unit electricity cost in relation to transfer period have been determined for the plant types. Furthermore, the change in unit electricity cost depending on load factor, which is one of the parameters affecting annual electricity production, has been determined, and the results have been analyzed. This method can be employed for comparing the production costs of different plants that are planned to be established according to the BOT model, or it can be employed to determine the appropriateness of the BOT model.

  8. Six-Tube Freezable Radiator Testing and Model Correlation

    Science.gov (United States)

    Lilibridge, Sean T.; Navarro, Moses

    2012-01-01

    Freezable Radiators offer an attractive solution to the issue of thermal control system scalability. As thermal environments change, a freezable radiator will effectively scale the total heat rejection it is capable of as a function of the thermal environment and flow rate through the radiator. Scalable thermal control systems are a critical technology for spacecraft that will endure missions with widely varying thermal requirements. These changing requirements are a result of the spacecraft's surroundings and because of different thermal loads rejected during different mission phases. However, freezing and thawing (recovering) a freezable radiator is a process that has historically proven very difficult to predict through modeling, resulting in highly inaccurate predictions of recovery time. These predictions are a critical step in gaining the capability to quickly design and produce optimized freezable radiators for a range of mission requirements. This paper builds upon previous efforts made to correlate a Thermal Desktop™ model with empirical testing data from two test articles, with additional model modifications and empirical data from a sub-component radiator for a full scale design. Two working fluids were tested: MultiTherm WB-58 and a 50-50 mixture of DI water and Amsoil ANT.

  9. Building aggregate timber supply models from individual harvest choice

    Science.gov (United States)

    Maksym Polyakov; David N. Wear; Robert Huggett

    2009-01-01

    Timber supply has traditionally been modelled using aggregate data. In this paper, we build aggregate supply models for four roundwood products for the US state of North Carolina from a stand-level harvest choice model applied to detailed forest inventory. The simulated elasticities of pulpwood supply are much lower than reported by previous studies. Cross price...

  10. Vortex Filaments in Grids for Scalable, Fine Smoke Simulation.

    Science.gov (United States)

    Meng, Zhang; Weixin, Si; Yinling, Qian; Hanqiu, Sun; Jing, Qin; Heng, Pheng-Ann

    2015-01-01

    Vortex modeling can produce attractive visual effects of dynamic fluids, which are widely applicable for dynamic media, computer games, special effects, and virtual reality systems. However, it is challenging to effectively simulate intensive and fine detailed fluids such as smoke with fast increasing vortex filaments and smoke particles. The authors propose a novel vortex filaments in grids scheme in which the uniform grids dynamically bridge the vortex filaments and smoke particles for scalable, fine smoke simulation with macroscopic vortex structures. Using the vortex model, their approach supports the trade-off between simulation speed and scale of details. After computing the whole velocity, external control can be easily exerted on the embedded grid to guide the vortex-based smoke motion. The experimental results demonstrate the efficiency of using the proposed scheme for a visually plausible smoke simulation with macroscopic vortex structures.

  11. FINDING CUBOID-BASED BUILDING MODELS IN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2012-07-01

    In this paper, we present an automatic approach for the derivation of 3D building models of level-of-detail 1 (LOD 1) from point clouds obtained from (dense) image matching or, for comparison only, from LIDAR. Our approach makes use of the predominance of vertical structures and orthogonal intersections in architectural scenes. After robustly determining the scene's vertical direction based on the 3D points, we use it as a constraint for a RANSAC-based search for vertical planes in the point cloud. The planes are further analyzed to segment reliable outlines of rectangular surfaces within these planes, which are connected to construct cuboid-based building models. We demonstrate that our approach is robust and effective over a range of real-world input data sets with varying point density, amount of noise, and outliers.
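
    To illustrate the vertical-plane search: once the scene's vertical axis is known, a vertical plane reduces to a 2D line in the horizontal (x, y) projection of the points, which a plain RANSAC line fit can find. This is a generic sketch, not the paper's implementation.

    ```python
    # RANSAC search for a vertical plane as a 2D line in the xy-projection.
    import numpy as np

    def ransac_vertical_plane(xy, n_iter=500, tol=0.05, rng=np.random.default_rng(1)):
        best_inliers = np.zeros(len(xy), dtype=bool)
        for _ in range(n_iter):
            i, j = rng.choice(len(xy), size=2, replace=False)
            d = xy[j] - xy[i]
            n = np.array([-d[1], d[0]])                 # line normal
            norm = np.linalg.norm(n)
            if norm < 1e-9:
                continue
            n /= norm
            dist = np.abs((xy - xy[i]) @ n)             # point-to-line distance
            inliers = dist < tol
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return best_inliers

    # 100 points on a "wall" line plus 40 random outliers:
    xy = np.vstack([np.column_stack([np.linspace(0, 5, 100), np.zeros(100)]),
                    np.random.default_rng(2).uniform(0, 5, (40, 2))])
    print(ransac_vertical_plane(xy).sum(), "inliers on the wall line")
    ```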

  12. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies the four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization. • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation, • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets, • Biomedical is a vast field with select subtopics addressed from scanning methodologies to structural applications to biological applications, • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms. Scientific Visualization will be useful to practitioners of scientific visualization, ...

  13. Scalable quantum memory in the ultrastrong coupling regime.

    Science.gov (United States)

    Kyaw, T H; Felicetti, S; Romero, G; Solano, E; Kwek, L-C

    2015-03-02

    Circuit quantum electrodynamics, consisting of superconducting artificial atoms coupled to on-chip resonators, represents a prime candidate to implement the scalable quantum computing architecture because of the presence of good tunability and controllability. Furthermore, recent advances have pushed the technology towards the ultrastrong coupling regime of light-matter interaction, where the qubit-resonator coupling strength reaches a considerable fraction of the resonator frequency. Here, we propose a qubit-resonator system operating in that regime, as a quantum memory device and study the storage and retrieval of quantum information in and from the Z2 parity-protected quantum memory, within experimentally feasible schemes. We are also convinced that our proposal might pave a way to realize a scalable quantum random-access memory due to its fast storage and readout performances.

  14. A Bit Stream Scalable Speech/Audio Coder Combining Enhanced Regular Pulse Excitation and Parametric Coding

    Directory of Open Access Journals (Sweden)

    Albertus C. den Brinker

    2007-01-01

    This paper introduces a new audio and speech broadband coding technique based on the combination of a pulse excitation coder and a standardized parametric coder, namely, MPEG-4 high-quality parametric coder. After presenting a series of enhancements to regular pulse excitation (RPE) to make it suitable for the modeling of broadband signals, it is shown how pulse and parametric codings complement each other and how they can be merged to yield a layered bit stream scalable coder able to operate at different points in the quality bit rate plane. The performance of the proposed coder is evaluated in a listening test. The major result is that the extra functionality of the bit stream scalability does not come at the price of a reduced performance since the coder is competitive with standardized coders (MP3, AAC, SSC).

  15. A Bit Stream Scalable Speech/Audio Coder Combining Enhanced Regular Pulse Excitation and Parametric Coding

    Science.gov (United States)

    Riera-Palou, Felip; den Brinker, Albertus C.

    2007-12-01

    This paper introduces a new audio and speech broadband coding technique based on the combination of a pulse excitation coder and a standardized parametric coder, namely, MPEG-4 high-quality parametric coder. After presenting a series of enhancements to regular pulse excitation (RPE) to make it suitable for the modeling of broadband signals, it is shown how pulse and parametric codings complement each other and how they can be merged to yield a layered bit stream scalable coder able to operate at different points in the quality bit rate plane. The performance of the proposed coder is evaluated in a listening test. The major result is that the extra functionality of the bit stream scalability does not come at the price of a reduced performance since the coder is competitive with standardized coders (MP3, AAC, SSC).

  16. Change detection on LOD 2 building models with very high resolution spaceborne stereo imagery

    Science.gov (United States)

    Qin, Rongjun

    2014-10-01

    Due to the fast development of the urban environment, the need for efficient maintenance and updating of 3D building models is ever increasing. Change detection is an essential step to spot the changed area for data (map/3D models) updating and urban monitoring. Traditional methods based on 2D images are no longer suitable for change detection at building scale, owing to the increased spectral variability of the building roofs and larger perspective distortion of the very high resolution (VHR) imagery. Change detection in 3D is increasingly being investigated using airborne laser scanning data or matched Digital Surface Models (DSM), but little research has been conducted on change detection for 3D city models with VHR images, a task that is more informative but also more complicated. This is due to the fact that the 3D models are abstracted geometric representations of the urban reality, while the VHR images record everything. In this paper, a novel method is proposed to detect changes directly on LOD (Level of Detail) 2 building models with VHR spaceborne stereo images from a different date, with particular focus on addressing the special characteristics of the 3D models. In the first step, the 3D building models are projected onto a raster grid, encoded with building object, terrain object, and planar faces. The DSM is extracted from the stereo imagery by hierarchical semi-global matching (SGM). In the second step, a multi-channel change indicator is extracted between the 3D models and stereo images, considering the inherent geometric consistency (IGC), height difference, and texture similarity for each planar face. Each channel of the indicator is then clustered with the Self-organizing Map (SOM), with "change", "non-change" and "uncertain change" status labeled through a voting strategy. The "uncertain changes" are then determined with a Markov Random Field (MRF) analysis considering the geometric relationship between faces. In the third step, buildings are

  17. Improved model for solar heating of buildings

    OpenAIRE

    Lie, Bernt

    2015-01-01

    A considerable future increase in the global energy use is expected, and the effects of energy conversion on the climate are already observed. Future energy conversion should thus be based on resources that have negligible climate effects; solar energy is perhaps the most important of such resources. The presented work builds on a previous complete model for solar heating of a house; here the aim is to introduce ventilation heat recovery and improve the hot water storage model. Ventilation he...

  18. Lost opportunities: Modeling commercial building energy code adoption in the United States

    International Nuclear Information System (INIS)

    Nelson, Hal T.

    2012-01-01

    This paper models the adoption of commercial building energy codes in the US between 1977 and 2006. Energy code adoption typically results in an increase in aggregate social welfare by cost effectively reducing energy expenditures. Using a Cox proportional hazards model, I test if relative state funding, a new, objective, multivariate regression-derived measure of government capacity, as well as a vector of control variables commonly used in comparative state research, predict commercial building energy code adoption. The research shows little political influence over historical commercial building energy code adoption in the sample. Colder climates and higher electricity prices also do not predict more frequent code adoptions. I do find evidence of high government capacity states being 60 percent more likely than low capacity states to adopt commercial building energy codes in the following year. Wealthier states are also more likely to adopt commercial codes. Policy recommendations to increase building code adoption include increasing access to low cost capital for the private sector and providing noncompetitive block grants to the states from the federal government. - Highlights: ► Model the adoption of commercial building energy codes from 1977–2006 in the US. ► Little political influence over historical building energy code adoption. ► High capacity states are over 60 percent more likely than low capacity states to adopt codes. ► Wealthier states are more likely to adopt commercial codes. ► Access to capital and technical assistance is critical to increase code adoption.
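
    For readers who want the mechanics: a Cox proportional-hazards fit of the kind described can be sketched with the lifelines library on a made-up state panel. The column names and numbers below are illustrative, not the paper's data or full covariate set.

    ```python
    # Sketch: Cox proportional-hazards model of time-to-code-adoption.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_to_adoption": [3, 12, 7, 20, 5, 15, 9, 20],
        "adopted":           [1, 1, 1, 0, 1, 1, 1, 0],   # 0 = right-censored
        "gov_capacity":      [0.9, 0.2, 0.7, 0.1, 0.8, 0.4, 0.6, 0.3],
        "income_per_capita": [52, 38, 47, 35, 55, 41, 44, 37],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_to_adoption", event_col="adopted")
    cph.print_summary()  # hazard ratios > 1 imply faster adoption
    ```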

  19. Distributed Flexibility Characterization and Resource Allocation Strategies for Multi-zone Commercial Buildings in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Hao, He; Lian, Jianming; Kalsi, Karanjit; Stoustrup, Jakob

    2015-12-15

    The HVAC (Heating, Ventilation, and Air-Conditioning) system of commercial buildings is a complex system with a large number of dynamically interacting components. In particular, the thermal dynamics of each zone are coupled with those of the neighboring zones. In this paper, we study a multi-agent based approach to model and control commercial building HVAC system for providing grid services. In the multi-agent system (MAS), individual zones are modeled as agents that can communicate, interact, and negotiate with one another to achieve a common objective. We first propose a distributed characterization method on the aggregated airflow (and thus fan power) flexibility that the HVAC system can provide to the ancillary service market. Then, we propose a Nash-bargaining based airflow allocation strategy to track a dispatch signal (that is within the offered flexibility limit) while respecting the preference and flexibility of individual zones. Moreover, we devise a distributed algorithm to obtain the Nash bargaining solution via dual decomposition and average consensus. Numerical simulations illustrate that the proposed distributed protocols are much more scalable than the centralized approaches especially when the system becomes larger and more complex.
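
    The average-consensus primitive that such distributed algorithms rest on is simple to sketch: each zone repeatedly mixes its value with its neighbours', and all values converge to the network average. The topology and step size below are illustrative; the paper couples this primitive with dual decomposition, which is not shown.

    ```python
    # Average consensus over a line network of four zones.
    import numpy as np

    neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    x = np.array([10.0, 2.0, 6.0, 4.0])    # local values (e.g., flexibility estimates)
    eps = 0.3                              # step size < 1/max_degree for stability

    for _ in range(200):
        x = x + eps * np.array([sum(x[j] - x[i] for j in neighbours[i])
                                for i in range(len(x))])

    print(x.round(3))  # all entries approach the network average 5.5
    ```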

  20. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth Pr³⁺ ions, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications.

  1. Guided Inquiry and Consensus-Building Used to Construct Cellular Models

    Directory of Open Access Journals (Sweden)

    Joel I. Cohen

    2015-02-01

    Using models helps students learn from a "whole systems" perspective when studying the cell. This paper describes a model that employs guided inquiry and requires consensus building among students for its completion. The model is interactive, meaning that it expands upon a static model which, once completed, cannot be altered, and additionally relates various levels of biological organization (molecular, organelle, and cellular) to define cell and organelle function and interaction. Learning goals are assessed using data summed from final grades and from images of the students' final cell models (plant, bacteria, and yeast) taken from diverse seventh grade classes. Instructional figures showing consensus-building pathways and seating arrangements are discussed. Results suggest that the model leads to a high rate of participation, facilitates guided inquiry, and fosters group and individual exploration by challenging student understanding of the living cell.

  2. SuperLU_DIST: A scalable distributed-memory sparse direct solver for unsymmetric linear systems

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiaoye S.; Demmel, James W.

    2002-03-27

    In this paper, we present the main algorithmic features in the software package SuperLU_DIST, a distributed-memory sparse direct solver for large sets of linear equations. We give in detail our parallelization strategies, with focus on scalability issues, and demonstrate the parallel performance and scalability on current machines. The solver is based on sparse Gaussian elimination, with an innovative static pivoting strategy proposed earlier by the authors. The main advantage of static pivoting over classical partial pivoting is that it permits a priori determination of data structures and communication pattern for sparse Gaussian elimination, which makes it more scalable on distributed memory machines. Based on this a priori knowledge, we designed highly parallel and scalable algorithms for both LU decomposition and triangular solve and we show that they are suitable for large-scale distributed memory machines.
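
    Not the distributed solver itself, but a runnable taste of the same factor-then-solve pattern: SciPy wraps the sequential SuperLU library, so the snippet below mirrors at toy scale what SuperLU_DIST does across distributed memory.

    ```python
    # Sparse LU factorization and solve via SciPy's SuperLU wrapper.
    import numpy as np
    from scipy.sparse import csc_matrix
    from scipy.sparse.linalg import splu

    A = csc_matrix(np.array([[4.0, 1.0, 0.0],
                             [1.0, 3.0, 1.0],
                             [0.0, 1.0, 2.0]]))
    b = np.array([1.0, 2.0, 3.0])

    lu = splu(A)          # sparse LU factorization (SuperLU under the hood)
    x = lu.solve(b)       # triangular solves reuse the factorization
    print(np.allclose(A @ x, b))  # True
    ```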

  3. Scalable printed electronics: an organic decoder addressing ferroelectric non-volatile memory

    Science.gov (United States)

    Ng, Tse Nga; Schwartz, David E.; Lavery, Leah L.; Whiting, Gregory L.; Russo, Beverly; Krusor, Brent; Veres, Janos; Bröms, Per; Herlogsson, Lars; Alam, Naveed; Hagel, Olle; Nilsson, Jakob; Karlsson, Christer

    2012-01-01

    Scalable circuits of organic logic and memory are realized using all-additive printing processes. A 3-bit organic complementary decoder is fabricated and used to read and write non-volatile, rewritable ferroelectric memory. The decoder-memory array is patterned by inkjet and gravure printing on flexible plastics. Simulation models for the organic transistors are developed, enabling circuit designs tolerant of the variations in printed devices. We explain the key design rules in fabrication of complex printed circuits and elucidate the performance requirements of materials and devices for reliable organic digital logic. PMID:22900143

  4. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating a realistic workflow, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysi...

  5. Scalable architecture for a room temperature solid-state quantum information processor.

    Science.gov (United States)

    Yao, N Y; Jiang, L; Gorshkov, A V; Maurer, P C; Giedke, G; Cirac, J I; Lukin, M D

    2012-04-24

    The realization of a scalable quantum information processor has emerged over the past decade as one of the central challenges at the interface of fundamental science and engineering. Here we propose and analyse an architecture for a scalable, solid-state quantum information processor capable of operating at room temperature. Our approach is based on recent experimental advances involving nitrogen-vacancy colour centres in diamond. In particular, we demonstrate that the multiple challenges associated with operation at ambient temperature, individual addressing at the nanoscale, strong qubit coupling, robustness against disorder and low decoherence rates can be simultaneously achieved under realistic, experimentally relevant conditions. The architecture uses a novel approach to quantum information transfer and includes a hierarchy of control at successive length scales. Moreover, it alleviates the stringent constraints currently limiting the realization of scalable quantum processors and will provide fundamental insights into the physics of non-equilibrium many-body quantum systems.

  6. Scalable force directed graph layout algorithms using fast multipole methods

    KAUST Repository

    Yunis, Enas Abdulrahman

    2012-06-01

    We present an extension to ExaFMM, a Fast Multipole Method library, as a generalized approach for fast and scalable execution of the Force-Directed Graph Layout algorithm. The Force-Directed Graph Layout algorithm is a physics-based approach to graph layout that treats the vertices V as repelling charged particles with the edges E connecting them acting as springs. Traditionally, the amount of work required in applying the Force-Directed Graph Layout algorithm is O(|V|² + |E|) using direct calculations and O(|V| log |V| + |E|) using truncation, filtering, and/or multi-level techniques. Correct application of the Fast Multipole Method allows us to maintain a lower complexity of O(|V| + |E|) while regaining most of the precision lost in other techniques. Solving layout problems for truly large graphs with millions of vertices still requires a scalable algorithm and implementation. We have been able to leverage the scalability and architectural adaptability of the ExaFMM library to create a Force-Directed Graph Layout implementation that runs efficiently on distributed multicore and multi-GPU architectures. © 2012 IEEE.
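
    For reference, here is the direct O(|V|² + |E|) form of the force-directed layout on a toy graph; the paper's contribution is evaluating the all-pairs repulsion term with a fast multipole method to reach O(|V| + |E|). This sketch uses plain numpy, not ExaFMM.

    ```python
    # Direct (all-pairs) force-directed layout: repulsion ~ 1/d, unit springs.
    import numpy as np

    def layout(edges, n, iters=200, k=0.05, rng=np.random.default_rng(3)):
        pos = rng.uniform(-1, 1, (n, 2))
        for _ in range(iters):
            delta = pos[:, None, :] - pos[None, :, :]
            dist2 = (delta ** 2).sum(-1) + np.eye(n)        # avoid self-division
            force = (delta / dist2[..., None]).sum(axis=1)  # pairwise repulsion
            for i, j in edges:                              # spring attraction
                d = pos[j] - pos[i]
                force[i] += d
                force[j] -= d
            pos += k * force
        return pos

    print(layout([(0, 1), (1, 2), (2, 0), (2, 3)], 4).round(2))
    ```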

  7. Scalability of voltage-controlled filamentary and nanometallic resistance memory devices.

    Science.gov (United States)

    Lu, Yang; Lee, Jong Ho; Chen, I-Wei

    2017-08-31

    Much effort has been devoted to device and materials engineering to realize nanoscale resistance random access memory (RRAM) for practical applications, but a rational physical basis to be relied on to design scalable devices spanning many length scales is still lacking. In particular, there is no clear criterion for switching control in those RRAM devices in which resistance changes are limited to localized nanoscale filaments that experience concentrated heat, electric current and field. Here, we demonstrate voltage-controlled resistance switching, always at a constant characteristic critical voltage, for macro and nanodevices in both filamentary RRAM and nanometallic RRAM, and the latter switches uniformly and does not require a forming process. As a result, area-scalability can be achieved under a device-area-proportional current compliance for the low resistance state of the filamentary RRAM, and for both the low and high resistance states of the nanometallic RRAM. This finding will help design area-scalable RRAM at the nanoscale. It also establishes an analogy between RRAM and synapses, in which signal transmission is also voltage-controlled.

  8. Energy-Efficient and Comfortable Buildings through Multivariate Integrated Control (ECoMIC)

    Energy Technology Data Exchange (ETDEWEB)

    Birru, Dagnachew [Philips Electronics North America Corporation, Andover, MA (United States); Wen, Yao-Jung [Philips Electronics North America Corporation, Andover, MA (United States); Rubinstein, Francis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Clear, Robert D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-10-28

    This project aims to develop an integrated control solution for enhanced energy efficiency and user comfort in commercial buildings. The developed technology is a zone-based control framework that minimizes energy usage while maintaining occupants’ visual and thermal comfort through control of electric lights, motorized venetian blinds and thermostats. The control framework is designed following a modular, scalable and flexible architecture to facilitate easy integration with existing building management systems. The control framework contains two key algorithms: 1) the lighting load balancing algorithm and 2) the thermostat control algorithm. The lighting load balancing algorithm adopts a model-based closed-loop control approach to determine the optimal electric light and venetian blind settings. It is formulated as an optimization problem with minimizing lighting-related energy consumption as the objective and delivering adequate task light and preventing daylight glare as the constraints. The thermostat control algorithm is based on a well-established thermal comfort model and formulated as a root-finding problem to dynamically determine the optimal thermostat setpoint for both energy savings and improved thermal comfort. To address building-wide scalability, a system architecture was developed for the zone-based control technology. Three levels of services are defined in the architecture: external services, facility-level services and zone-level services. The zone-level service includes the control algorithms described above as well as the corresponding interfaces, profiles, sensors and actuators to realize the zone controller. The facility-level services connect to the zones through a backbone network and handle supervisory-level information and controls, thus facilitating building-wide scalability. The external services provide communication capability to entities outside of the building for grid interaction and remote access. Various aspects of the
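
    The root-finding formulation of the thermostat algorithm can be sketched as follows: find the setpoint at which a thermal-comfort index crosses zero. The comfort_vote function below is a toy stand-in for the full comfort model, and the bisection bounds are assumptions.

```python
# Root-finding sketch for the thermostat setpoint: solve comfort_vote(T) = 0
# by bisection. comfort_vote is a toy stand-in for the real comfort model.
def comfort_vote(temp_c: float) -> float:
    """Negative = occupants too cool, positive = too warm (assumed shape)."""
    neutral = 23.5  # assumed comfort-neutral temperature in deg C
    return 0.3 * (temp_c - neutral)

def find_setpoint(lo: float = 18.0, hi: float = 28.0, tol: float = 0.05) -> float:
    assert comfort_vote(lo) < 0 < comfort_vote(hi), "root must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if comfort_vote(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"setpoint ~ {find_setpoint():.1f} degC")  # ~23.5 for this toy model
```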

  9. Responsive, Flexible and Scalable Broader Impacts (Invited)

    Science.gov (United States)

    Decharon, A.; Companion, C.; Steinman, M.

    2010-12-01

    In many educator professional development workshops, scientists present content in a slideshow-type format and field questions afterwards. Drawbacks of this approach include: the inability to begin the lecture with content that is responsive to audience needs; the lack of flexible access to specific material within the linear presentation; and the difficulty of scaling “Q&A” sessions to broader audiences. Often this type of traditional interaction provides little direct benefit to the scientists. The Centers for Ocean Sciences Education Excellence - Ocean Systems (COSEE-OS) applies the technique of concept mapping, which has demonstrated effectiveness in helping scientists and educators “get on the same page” (deCharon et al., 2009). A key aspect is scientist professional development geared towards improving face-to-face and online communication with non-scientists. COSEE-OS promotes scientist-educator collaboration, tests the application of scientist-educator maps in new contexts through webinars, and is piloting the expansion of maps as long-lived resources for the broader community. Collaboration - COSEE-OS has developed and tested a workshop model bringing scientists and educators together in a peer-oriented process, often clarifying common misconceptions. Scientist-educator teams develop online concept maps that are hyperlinked to “assets” (i.e., images, videos, news) and are responsive to the needs of non-scientist audiences. In workshop evaluations, 91% of educators said that the process of concept mapping helped them think through science topics and 89% said that concept mapping helped build a bridge of communication with scientists (n=53). Application - After developing a concept map, with COSEE-OS staff assistance, scientists are invited to give webinar presentations that include live “Q&A” sessions. The webinars extend the reach of scientist-created concept maps to new contexts, both geographically and topically (e.g., oil spill), with a relatively small

  10. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    Science.gov (United States)

    Xu, Pengpeng

    Hotel buildings are one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions, contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism was introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • to identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • to identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • to develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs: six KPIs were identified from the 11 selected performance indicators. Through the questionnaire survey, 21 Critical Success Factors (CSFs) were also identified out of the 28 success factors. Using the factor analysis technique, the 21 identified CSFs were grouped into six clusters to help explain project success of sustainable BEER. Finally, an AHP/ANP approach was used in this research to develop a model to
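
    For readers unfamiliar with the AHP step used above, the core computation is small: derive priority weights for the factor clusters from a pairwise comparison matrix via its principal eigenvector. The matrix below is a made-up 3x3 illustration, not data from the study.

```python
# AHP priority-vector sketch: weights from a pairwise comparison matrix.
# The judgments in A are hypothetical, chosen only for illustration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue index
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority vector
print("priority weights:", np.round(weights, 3))

# Saaty consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3.
n = len(A)
ci = (eigvals.real[k] - n) / (n - 1)
print("consistency ratio:", round(ci / 0.58, 3))  # < 0.1 is acceptable
```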

  11. Integrated Urban System and Energy Consumption Model: Public and Singular Buildings

    Directory of Open Access Journals (Sweden)

    Rocco Papa

    2014-05-01

    Full Text Available The present paper illustrates the results of the first steps of a study on one aspect investigated as a preliminary step in the definition of the analysis and comprehension model of the relation between city, buildings and user behavior for the reduction of energy consumption, within the research project “Smart Energy Master” for the energy governance of the territory (PON_MIUR n. pos. 04a2_00120 CUP Ricerca: E61H12000130005, at the Department of Civil, Building and Environmental Engineering - University of Naples Federico II, principal investigator prof. Carmela Gargiulo. Specifically, the literature review aimed at determining whether, and to what extent, public and singular buildings are present in the energy consumption estimation models proposed by the scientific community for the city or neighborhood scale. The difficulties in defining the weight of these singular buildings in total energy consumption, and the impossibility of defining mean values that are significant for all subsets and different types as well as for each one, have forced model makers either to ignore them completely or to choose a portion of this specific stock to include.

  12. Reviewing the Role of Stakeholders in Operational Research: Opportunities for Group Model Building

    NARCIS (Netherlands)

    Gooyert, V. de; Rouwette, E.A.J.A.; Kranenburg, H.L. van

    2013-01-01

    Stakeholders have always received much attention in system dynamics, especially in the group model building tradition, which emphasizes the deep involvement of a client group in building a system dynamics model. In organizations, stakeholders are gaining more and more attention by managers who try

  13. Scalability of DL_POLY on High Performance Computing Platform

    Directory of Open Access Journals (Sweden)

    Mabule Samuel Mabakane

    2017-12-01

    Full Text Available This paper presents a case study on the scalability of several versions of the molecular dynamics code DL_POLY, performed on South Africa's Centre for High Performance Computing e1350 IBM Linux cluster, Sun system and Lengau supercomputers. Within this study, different problem sizes were designed and the same chosen systems were employed in order to test the performance of DL_POLY using weak and strong scalability. It was found that the speed-up results for the small systems were better than those for large systems on both the Ethernet and Infiniband networks. However, simulations of large systems in DL_POLY performed well using the Infiniband network on the Lengau cluster as compared to the e1350 and Sun supercomputers.
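
    For reference, the standard definitions behind the two tests mentioned above (general conventions, not specific to this study): strong scaling fixes the total problem size while increasing the process count p, whereas weak scaling grows the problem size in proportion to p.

```latex
% T_p = wall-clock time on p processes.
% Strong scaling (fixed total problem size):
S(p) = \frac{T_1}{T_p}, \qquad E_{\mathrm{strong}}(p) = \frac{T_1}{p \, T_p}
% Weak scaling (fixed problem size per process, total size grows with p):
E_{\mathrm{weak}}(p) = \frac{T_1}{T_p}
```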

  14. Scalable fast multipole accelerated vortex methods

    KAUST Repository

    Hu, Qi

    2014-05-01

    The fast multipole method (FMM) is often used to accelerate the calculation of particle interactions in particle-based methods to simulate incompressible flows. To evaluate the most time-consuming kernels, the Biot-Savart equation and the stretching term of the vorticity equation, we mathematically reformulated them so that only two Laplace scalar potentials are used instead of six; this automatically ensures divergence-free far-field computation. Based on this formulation, we developed a new FMM-based vortex method on heterogeneous architectures, which distributes the work between multicore CPUs and GPUs to best utilize the hardware resources and achieve excellent scalability. The algorithm uses new data structures which can dynamically manage inter-node communication and load balance efficiently, with only a small parallel construction overhead. This algorithm can scale to large-sized clusters, showing both strong and weak scalability. Careful error and timing trade-off analyses are also performed for the cutoff functions induced by the vortex particle method. Our implementation can perform one time step of the velocity+stretching calculation for one billion particles on 32 nodes in 55.9 seconds, which yields 49.12 Tflop/s.
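
    The kernel being accelerated can be made concrete with a direct O(N²) evaluation of the regularized Biot-Savart sum over vortex particles, as sketched below; the smoothing radius eps and the particle data are illustrative assumptions, and the FMM replaces exactly this quadratic loop.

```python
# Direct O(N^2) regularized Biot-Savart evaluation over vortex particles;
# this is the quadratic sum that an FMM evaluation replaces. eps is an
# assumed smoothing radius that also zeroes out the i == j self-term.
import numpy as np

def biot_savart_direct(x: np.ndarray, alpha: np.ndarray,
                       eps: float = 1e-3) -> np.ndarray:
    """x: (N, 3) particle positions; alpha: (N, 3) vector strengths."""
    u = np.zeros_like(x)
    for i in range(len(x)):
        r = x[i] - x                                  # (N, 3) separations
        d3 = (np.sum(r * r, axis=1) + eps ** 2) ** 1.5
        u[i] = np.sum(np.cross(alpha, r) / d3[:, None], axis=0) / (4 * np.pi)
    return u

rng = np.random.default_rng(0)
positions = rng.random((1000, 3))
strengths = rng.standard_normal((1000, 3)) * 1e-3
velocities = biot_savart_direct(positions, strengths)
```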

  15. Scalability Optimization of Seamless Positioning Service

    Directory of Open Access Journals (Sweden)

    Juraj Machaj

    2016-01-01

    Full Text Available Recently, positioning services have been getting more attention, not only within the research community but also from service providers. From the service provider's point of view, a positioning service that can work seamlessly in all environments, for example indoor, dense urban and rural, has huge potential to open new markets. However, such a system must not only provide accurate position estimates but also be scalable and resistant to fake positioning requests. In previous works we proposed a modular system which is able to provide seamless positioning in various environments. The system automatically selects the optimal positioning module based on the available radio signals, and currently consists of three positioning modules: GPS, GSM-based positioning and Wi-Fi-based positioning. In this paper we propose an algorithm that reduces the time needed for position estimation, thus improving the scalability of the modular system and allowing positioning services to be provided to a larger number of users. Such an improvement is extremely important for real-world applications in which large numbers of users require position estimates, since positioning error is affected by the response time of the positioning server.
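
    The module-selection idea can be sketched in a few lines: pick the best positioning module that currently has a usable signal. The module names and the preference order below are assumptions for illustration, not the authors' implementation.

```python
# Sketch of automatic positioning-module selection; the preference order
# (GPS, then Wi-Fi, then GSM) is an illustrative assumption.
PREFERENCE = ("gps", "wifi", "gsm")

def select_module(available_signals):
    """Return the most preferred module with a usable signal."""
    for module in PREFERENCE:
        if module in available_signals:
            return module
    raise RuntimeError("no positioning signal available")

print(select_module({"wifi", "gsm"}))  # indoors, no GPS fix -> 'wifi'
```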

  16. 3D modeling of buildings: outstanding sites

    CERN Document Server

    Héno, Raphaële

    2014-01-01

    Conventional topographic databases, obtained by capture on aerial or spatial images, provide a simplified 3D modeling of our urban environment that answers the needs of numerous applications (development, risk prevention, mobility management, etc.). However, when we have to represent and analyze more complex sites (monuments, civil engineering works, archeological sites, etc.), these models no longer suffice and other acquisition and processing means have to be implemented. This book focuses on the study of surveying methods adapted to "notable buildings". The methods tackled in this book cover las

  17. An HL7-FHIR-based Object Model for a Home-Centered Data Warehouse for Ambient Assisted Living Environments.

    Science.gov (United States)

    Schwartze, Jonas; Jansen, Lars; Schrom, Harald; Wolf, Klaus-Hendrik; Haux, Reinhold; Marschollek, Michael

    2015-01-01

    Current AAL environments focus on assisting a single person with separate technologies. There is no interoperability between sub-domains in home environments, like building energy management or housing industry services. BASIS (Building Automation by a Scalable and Intelligent System) aims to integrate all sensors and actuators into a single, efficient home bus. The first step is to create a semantically enriched data warehouse object model. We chose FHIR and built an object model based mainly on the Observation, Device and Location resources, with minor extensions needed by sub-domains outside AAL. FHIR turned out to be very flexible and complete for other home-related sub-domains. The object model is implemented in a separate software partition storing all structural and procedural data of BASIS.
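
    A minimal sketch of how one home-bus sensor reading might map onto a FHIR-style Observation is shown below; the field values and the extension URL are illustrative assumptions, and real FHIR resources carry more required metadata than shown here.

```python
# Illustrative FHIR-style Observation for a home-bus temperature reading;
# values and the extension URL are assumptions, not the BASIS schema.
room_temperature_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "room air temperature"},                # what was measured
    "effectiveDateTime": "2015-01-01T12:00:00+01:00",        # when
    "device": {"reference": "Device/living-room-sensor-1"},  # which sensor
    "extension": [{   # minor extension for a non-AAL sub-domain
        "url": "http://example.org/basis/subdomain",
        "valueString": "building-energy-management",
    }],
    "valueQuantity": {"value": 21.5, "unit": "degC"},
}

print(room_temperature_observation["valueQuantity"]["value"])
```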

  18. PM2006: a highly scalable urban planning management information system--Case study: Suzhou Urban Planning Bureau

    Science.gov (United States)

    Jing, Changfeng; Liang, Song; Ruan, Yong; Huang, Jie

    2008-10-01

    During the urbanization process, when facing the complex requirements of city development, ever-growing urban data, the rapid development of planning business and increasing planning complexity, a scalable, extensible urban planning management information system is urgently needed. PM2006 is such a system. In response to the status and problems in urban planning, the scalability and extensibility of PM2006 are introduced, which take the form of business-oriented workflow extensibility, scalability of the DLL-based architecture, flexibility on GIS and database platforms, scalability of data updating and maintenance, and so on. It is verified that the PM2006 system has good extensibility and scalability, meeting the requirements of all levels of administrative divisions and adapting to ever-growing changes in urban planning business. At the end of this paper, the application of PM2006 in the Urban Planning Bureau of Suzhou city is described.

  19. Building on Tinto's model of engagement and persistence ...

    African Journals Online (AJOL)

    Building on Tinto's model of engagement and persistence: Experiences from the Umthombo Youth Development Foundation Scholarship Scheme. A Ross. Abstract. Background. Major inequalities in staffing levels at rural and urban hospitals contribute to poorer health outcomes in rural areas. Local and international ...

  20. Simulation Speed Analysis and Improvements of Modelica Models for Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jorissen, Filip; Wetter, Michael; Helsen, Lieve

    2015-09-21

    This paper presents an approach for speeding up Modelica models. Insight is provided into how Modelica models are solved and what determines the tool’s computational speed. Aspects such as algebraic loops, code efficiency and integrator choice are discussed. This is illustrated using simple building simulation examples and Dymola. The generality of the work is in some cases verified using OpenModelica. Using this approach, a medium-sized office building, including the building envelope, heating, ventilation and air conditioning (HVAC) systems and the control strategy, can be simulated at a speed five hundred times faster than real time.