WorldWideScience

Sample records for building scalable models

  1. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    Science.gov (United States)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models at a large scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware such as multi-core CPUs, GPUs or supercomputers potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs built-in capabilities to make full use of the available hardware. Developing such a framework that provides understandable code for domain scientists while remaining runtime efficient poses several challenges for framework developers. For example, optimisations can be performed on individual operations or on the whole model, and tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks and 2) parallelisation of about 50 of these building blocks using

  2. The Concept of Business Model Scalability

    DEFF Research Database (Denmark)

    Lund, Morten; Nielsen, Christian

    2018-01-01

    Purpose: The purpose of the article is to define what scalable business models are. Central to the contemporary understanding of business models is the value proposition towards the customer and the hypotheses generated about delivering value to the customer, which become a good foundation for a long-term profitable business. However, the main message of this article is that while providing a good value proposition may help the firm ‘get by’, the really successful businesses of today are those able to reach the sweet-spot of business model scalability. Design/Methodology/Approach: The article is based on a five-year longitudinal action research project of over 90 companies that participated in the International Center for Innovation project aimed at building 10 global network-based business models. Findings: This article introduces and discusses the term scalability from a company-level perspective…

  3. Building Scalable Knowledge Graphs for Earth Science

    Science.gov (United States)

    Ramachandran, R.; Maskey, M.; Gatlin, P. N.; Zhang, J.; Duan, X.; Bugbee, K.; Christopher, S. A.; Miller, J. J.

    2017-12-01

    Estimates indicate that the world's information will grow by 800% in the next five years. In any given field, a single researcher or a team of researchers cannot keep up with this rate of knowledge expansion without the help of cognitive systems. Cognitive computing, defined as the use of information technology to augment human cognition, can help tackle large systemic problems. Knowledge graphs, one of the foundational components of cognitive systems, link key entities in a specific domain with other entities via relationships. Researchers could mine these graphs to make probabilistic recommendations and to infer new knowledge. At this point, however, there is a dearth of tools to generate scalable knowledge graphs from the existing corpus of scientific literature for Earth science research. Our project is currently developing an end-to-end automated methodology for incrementally constructing knowledge graphs for Earth science. Semantic Entity Recognition (SER) is one of the key steps in this methodology. SER for Earth science uses external resources (including metadata catalogs and controlled vocabularies) as references to guide entity extraction and recognition (i.e., labeling) from unstructured text, in order to build a large training set to seed the subsequent auto-learning component in our algorithm. Results from several SER experiments will be presented, as well as lessons learned.
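
    As a rough illustration of the seeding step described above, the sketch below labels entities in a sentence by exact matching against a small controlled vocabulary to produce training examples; the vocabulary terms, labels, and function names are hypothetical, and this is not the project's actual SER pipeline.

    ```python
    # Minimal sketch of vocabulary-seeded entity labeling for building a training set.
    # Hypothetical vocabulary and labels; not the actual SER implementation.
    import re

    # A toy controlled vocabulary mapping surface forms to semantic entity types.
    VOCABULARY = {
        "sea surface temperature": "PhysicalVariable",
        "MODIS": "Instrument",
        "Aqua": "Platform",
        "precipitation": "PhysicalVariable",
    }

    def seed_labels(sentence, vocabulary=VOCABULARY):
        """Return (start, end, surface_form, label) spans found by exact vocabulary matching."""
        spans = []
        for term, label in vocabulary.items():
            for match in re.finditer(re.escape(term), sentence, flags=re.IGNORECASE):
                spans.append((match.start(), match.end(), match.group(0), label))
        return sorted(spans)

    if __name__ == "__main__":
        text = "MODIS on Aqua retrieves sea surface temperature and precipitation estimates."
        for span in seed_labels(text):
            print(span)
    ```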

  4. Building scalable apps with Redis and Node.js

    CERN Document Server

    Johanan, Joshua

    2014-01-01

    If the phrase scalability sounds alien to you, then this is an ideal book for you. You will not need much Node.js experience as each framework is demonstrated in a way that requires no previous knowledge of the framework. You will be building scalable Node.js applications in no time! Knowledge of JavaScript is required.

  5. From Digital Disruption to Business Model Scalability

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten; Thomsen, Peter Poulsen

    2017-01-01

    This article discusses the terms disruption, digital disruption, business models and business model scalability. It illustrates how managers should be using these terms for the benefit of their business by developing business models capable of achieving exponentially increasing returns to scale as a response to digital disruption. A series of case studies illustrate that besides frequent existing messages in the business literature relating to the importance of creating agile businesses, both in growing and declining economies, as well as hard to copy value propositions or value propositions that take … will seldom lead to business model scalability capable of competing with digital disruption(s).

  6. VPLS: an effective technology for building scalable transparent LAN services

    Science.gov (United States)

    Dong, Ximing; Yu, Shaohua

    2005-02-01

    Virtual Private LAN Service (VPLS) is generating considerable interest with enterprises and service providers as it offers multipoint transparent LAN service (TLS) over MPLS networks. This paper describes an effective technology, VPLS, which links virtual switch instances (VSIs) through MPLS to form an emulated Ethernet switch and build scalable transparent LAN services. It first focuses on the architecture of VPLS, with Ethernet bridging at the edge and MPLS at the core, then elucidates the data forwarding mechanism within a VPLS domain, including learning and aging MAC addresses on a per-LSP basis, flooding of unknown frames, and replication of unknown, multicast, and broadcast frames. The loop-avoidance mechanism, known as split horizon forwarding, is also analyzed. Another important aspect of the VPLS service, its basic operation including autodiscovery and signaling, is discussed as well. From the perspective of efficiency and scalability, the paper compares two important signaling mechanisms, BGP and LDP, which are used to set up a PW between the PEs and bind the PWs to a particular VSI. With the extension of VPLS and the growth of the full mesh of PWs between PE devices (n*(n-1)/2 PWs in all, i.e., O(n^2) growth), a VPLS instance could have a large number of remote PE associations, resulting in an inefficient use of network bandwidth and system resources as the ingress PE has to replicate each frame and append MPLS labels for each remote PE. So the latter part of this paper focuses on the scalability issue: Hierarchical VPLS (H-VPLS). Within the H-VPLS architecture, this paper addresses two ways to cope with a possibly large number of MAC addresses, which make VPLS operate more efficiently.
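
    The n*(n-1)/2 pseudowire growth of the full mesh can be made concrete with a little arithmetic; the sketch below compares it against a simple hub-and-spoke estimate. The hierarchical figures are purely illustrative assumptions, not numbers from the paper.

    ```python
    # Full-mesh pseudowire count for a VPLS with n PE devices, versus a simple
    # illustrative hierarchical split (full mesh among a few hubs, one spoke PW per edge PE).
    def full_mesh_pws(n):
        return n * (n - 1) // 2

    def hierarchical_pws(n, hubs):
        """Illustrative H-VPLS estimate: full mesh among hubs plus one spoke PW per edge PE."""
        edge = n - hubs
        return hubs * (hubs - 1) // 2 + edge

    for n in (10, 50, 200):
        print(n, full_mesh_pws(n), hierarchical_pws(n, hubs=4))
    ```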

  7. Scalability of Sustainable Business Models in Hybrid Organizations

    Directory of Open Access Journals (Sweden)

    Adam Jabłoński

    2016-02-01

    The dynamics of change in modern business create new mechanisms for company management to determine their pursuit and achievement of high performance. Performance maintained over a long period of time becomes a source of ensuring business continuity by companies. An ontological being enabling the adoption of such assumptions is a business model that has the ability to generate results in every possible market situation and, moreover, has the features of permanent adaptability. A feature that describes the adaptability of the business model is its scalability. Being a factor that ensures doing more work, and more efficient work, with an increasing number of components, scalability can be applied to the concept of business models as the company’s ability to maintain similar or higher performance through it. Ensuring the company’s performance in the long term helps to build a so-called sustainable business model, which often balances the objectives of stakeholders and shareholders, and which is created by the implemented principles of value-based management and corporate social responsibility. This perception of business paves the way for building hybrid organizations that integrate business activities with pro-social ones. The combination of an approach typical of hybrid organizations with designing and implementing sustainable business models pursuant to the scalability criterion seems interesting from the cognitive point of view. Today, hybrid organizations are great spaces for building effective and efficient mechanisms for dialogue between business and society. This requires the appropriate business model. The purpose of the paper is to present the conceptualization and operationalization of scalability of sustainable business models that determine the performance of a hybrid organization in the network environment. The paper presents the original concept of applying scalability in sustainable business models with detailed

  8. The Concept of Business Model Scalability

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten

    2015-01-01

    The power of business models lies in their ability to visualize and clarify how firms may configure their value creation processes. Among the key aspects of business model thinking are a focus on what the customer values, how this value is best delivered to the customer and how strategic partners are leveraged in this value creation, delivery and realization exercise. Central to the mainstream understanding of business models is the value proposition towards the customer, and the hypothesis generated is that if the firm delivers to the customer what he/she requires, then there is a good foundation for a long-term profitable business. However, the message conveyed in this article is that while providing a good value proposition may help the firm ‘get by’, the really successful businesses of today are those able to reach the sweet-spot of business model scalability. This article introduces and discusses…

  9. A system to build distributed multivariate models and manage disparate data sharing policies: implementation in the scalable national network for effectiveness research.

    Science.gov (United States)

    Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila

    2015-11-01

    Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage
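
    As a generic illustration of how a multivariate model can be estimated iteratively without moving patient-level records, the sketch below fits a logistic regression by aggregating per-site gradients; it is a minimal federated-estimation example on assumed synthetic data, not SCANNER's actual protocol or interfaces.

    ```python
    # Hedged sketch: iterative federated logistic regression in which each site
    # returns only aggregate gradients, never patient-level rows.
    import numpy as np

    def local_gradient(X, y, beta):
        """Gradient of the logistic log-likelihood contribution of one site."""
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        return X.T @ (y - p)

    def federated_fit(sites, dim, iterations=500, lr=1.0):
        total_n = sum(len(y) for _, y in sites)
        beta = np.zeros(dim)
        for _ in range(iterations):
            # Only aggregate gradients cross the site boundary.
            grad = sum(local_gradient(X, y, beta) for X, y in sites) / total_n
            beta += lr * grad
        return beta

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_beta = np.array([1.0, -2.0, 0.5])
        sites = []
        for _ in range(3):  # three hypothetical hospitals; data stays "local"
            X = rng.normal(size=(500, 3))
            y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
            sites.append((X, y))
        print(federated_fit(sites, dim=3))
    ```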

  10. Scalable Deployment of Advanced Building Energy Management Systems

    Science.gov (United States)

    2013-05-01

    Only fragments of this record are preserved. They describe users being able to build their own visualization screens containing charts and 3D graphics; a lack of functionality for generating comprehensive reports that can be sent; solar gains through the windows subsequently absorbed by interior walls, floors and furniture; air leakage through doors; and sensible air from HVAC. A partial table of model parameters survives: Temperature of Air Entering Condenser, 14 to 35 ºC; Temperature of Chilled Water Leaving Chiller, 5 to 12 ºC; Part Load Ratio, 0.1 to 1.2.

  11. Model building

    International Nuclear Information System (INIS)

    Frampton, Paul H.

    1998-01-01

    In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly seen at HERA

  12. Model building

    International Nuclear Information System (INIS)

    Frampton, P.H.

    1998-01-01

    In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly seen at HERA. copyright 1998 American Institute of Physics

  13. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin

    2017-12-08

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference algorithms for such a model are increasingly limited due to their high time complexity and poor scalability. In this paper, we propose a multi-stage maximum likelihood approach to recover the latent parameters of the stochastic block model, in time linear with respect to the number of edges. We also propose a parallel algorithm based on message passing. Our algorithm can overlap communication and computation, providing speedup without compromising accuracy as the number of processors grows. For example, to process a real-world graph with about 1.3 million nodes and 10 million edges, our algorithm requires about 6 seconds on 64 cores of a contemporary commodity Linux cluster. Experiments demonstrate that the algorithm can produce high quality results on both benchmark and real-world graphs. An example of finding more meaningful communities, in comparison with a popular modularity maximization algorithm, is also illustrated.
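
    For readers unfamiliar with the stochastic block model, the sketch below computes the Bernoulli SBM log-likelihood that a maximum likelihood approach would optimise, on a small synthetic graph with two planted communities; it is a generic illustration, not the multi-stage or message-passing algorithm proposed in the paper.

    ```python
    # Hedged sketch: log-likelihood of an undirected Bernoulli stochastic block model
    # given a community assignment, the quantity a maximum-likelihood approach optimises.
    import numpy as np

    def sbm_log_likelihood(adj, labels, B):
        """adj: symmetric 0/1 matrix, labels: community index per node, B: block probability matrix."""
        n = adj.shape[0]
        ll = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                p = B[labels[i], labels[j]]
                ll += np.log(p) if adj[i, j] else np.log(1.0 - p)
        return ll

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        labels = np.repeat([0, 1], 20)                    # two planted communities of 20 nodes
        B = np.array([[0.4, 0.05], [0.05, 0.4]])          # dense within, sparse between
        P = B[labels][:, labels]                          # edge probability for each node pair
        adj = np.triu(rng.random((40, 40)) < P, k=1).astype(int)
        adj = adj + adj.T                                 # symmetrise
        print("planted labels:", sbm_log_likelihood(adj, labels, B))
        print("shuffled labels:", sbm_log_likelihood(adj, rng.permutation(labels), B))
    ```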

  14. Building a scalable event-level metadata service for ATLAS

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Goosens, L; Viegas, F T A; McGlone, H

    2008-01-01

    The ATLAS TAG Database is a multi-terabyte event-level metadata selection system, intended to allow discovery, selection of and navigation to events of interest to an analysis. The TAG Database encompasses file- and relational-database-resident event-level metadata, distributed across all ATLAS Tiers. A global TAG relational database, hosted in Oracle and containing all ATLAS events, will exist at Tier 0. Implementing a system that is both performant and manageable at this scale is a challenge. A 1 TB relational TAG Database has been deployed at Tier 0 using simulated tag data. The database contains one billion events, each described by two hundred event metadata attributes, and is currently undergoing extensive testing in terms of queries, population and manageability. These 1 TB tests aim to demonstrate and optimise the performance and scalability of an Oracle TAG Database on a global scale. Partitioning and indexing strategies are crucial to well-performing queries and manageability of the database and have implications for database population and distribution, so these are investigated. Physics query patterns are anticipated, but a crucial feature of the system must be to support a broad range of queries across all attributes. Concurrently, event tags from ATLAS Computing System Commissioning distributed simulations are accumulated in an Oracle-hosted database at CERN, providing an event-level selection service valuable for user experience and for gathering information about physics query patterns. In this paper we describe the status of the Global TAG relational database scalability work and highlight areas of future direction

  15. Building Models and Building Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Kaj; Skauge, Jørn

    2008-01-01

    The introductory chapter of the report describes the primary concepts concerning building models and establishes some fundamental conditions for computer-based modelling. In addition, the difference between drawing programs and building modelling programs is described. Important aspects of comp...

  16. A Scalable Heuristic for Viral Marketing Under the Tipping Model

    Science.gov (United States)

    2013-09-01

    Only fragments of this record are preserved: Flixster is a social media website that allows users to share reviews and other information about cinema [35]; it was extracted in Dec. 2010. … FourSquare … the work of Reichman was developed independently; we also note that Reichman performs no experimental evaluation of the algorithm. … other diffusion models, such as the independent cascade model [21] and evolutionary graph theory [25], as well as probabilistic variants of the

  17. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin; Zhang, Zhihua; Wong, Ka-Chun; Zhang, Xiangliang; Keyes, David E.

    2017-01-01

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference

  18. Semantic Models for Scalable Search in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Dennis Pfisterer

    2013-03-01

    The Internet of Things is anticipated to connect billions of embedded devices equipped with sensors to perceive their surroundings. Thereby, the state of the real world will be available online and in real-time and can be combined with other data and services in the Internet to realize novel applications such as Smart Cities, Smart Grids, or Smart Healthcare. This requires an open representation of sensor data and scalable search over data from diverse sources including sensors. In this paper we show how the Semantic Web technologies RDF (an open semantic data format) and SPARQL (a query language for RDF-encoded data) can be used to address those challenges. In particular, we describe how prediction models can be employed for scalable sensor search, how these prediction models can be encoded as RDF, and how the models can be queried by means of SPARQL.
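
    A minimal sketch of the idea, assuming a hypothetical vocabulary: a simple linear prediction model for one sensor is encoded as RDF triples with rdflib and its coefficients are retrieved with a SPARQL query. The namespace, property names, and model form are illustrative assumptions, not the ontology used in the paper.

    ```python
    # Hedged sketch with rdflib: encode a (hypothetical) linear prediction model for a sensor
    # as RDF triples and retrieve its coefficients with SPARQL.
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/iot#")

    g = Graph()
    g.bind("ex", EX)

    # predicted_temperature(t) = slope * t + intercept, attached to one sensor
    g.add((EX.sensor42, RDF.type, EX.TemperatureSensor))
    g.add((EX.sensor42, EX.hasPredictionModel, EX.model42))
    g.add((EX.model42, RDF.type, EX.LinearModel))
    g.add((EX.model42, EX.slope, Literal(0.12)))
    g.add((EX.model42, EX.intercept, Literal(18.5)))

    query = """
    PREFIX ex: <http://example.org/iot#>
    SELECT ?sensor ?slope ?intercept WHERE {
        ?sensor ex:hasPredictionModel ?model .
        ?model ex:slope ?slope ;
               ex:intercept ?intercept .
    }
    """

    for row in g.query(query):
        print(row.sensor, float(row.slope), float(row.intercept))
    ```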

  19. Historical building monitoring using an energy-efficient scalable wireless sensor network architecture.

    Science.gov (United States)

    Capella, Juan V; Perles, Angel; Bonastre, Alberto; Serrano, Juan J

    2011-01-01

    We present a set of novel low power wireless sensor nodes designed for monitoring wooden masterpieces and historical buildings, in order to perform an early detection of pests. Although our previous star-based system configuration has been in operation for more than 13 years, it does not scale well for sensorization of large buildings or when deploying hundreds of nodes. In this paper we demonstrate the feasibility of a cluster-based dynamic-tree hierarchical Wireless Sensor Network (WSN) architecture where realistic assumptions of radio frequency data transmission are applied to cluster construction, and a mix of heterogeneous nodes are used to minimize economic cost of the whole system and maximize power saving of the leaf nodes. Simulation results show that the specialization of a fraction of the nodes by providing better antennas and some energy harvesting techniques can dramatically extend the life of the entire WSN and reduce the cost of the whole system. A demonstration of the proposed architecture with a new routing protocol and applied to termite pest detection has been implemented on a set of new nodes and should last for about 10 years, but it provides better scalability, reliability and deployment properties.

  20. Historical Building Monitoring Using an Energy-Efficient Scalable Wireless Sensor Network Architecture

    Science.gov (United States)

    Capella, Juan V.; Perles, Angel; Bonastre, Alberto; Serrano, Juan J.

    2011-01-01

    We present a set of novel low power wireless sensor nodes designed for monitoring wooden masterpieces and historical buildings, in order to perform an early detection of pests. Although our previous star-based system configuration has been in operation for more than 13 years, it does not scale well for sensorization of large buildings or when deploying hundreds of nodes. In this paper we demonstrate the feasibility of a cluster-based dynamic-tree hierarchical Wireless Sensor Network (WSN) architecture where realistic assumptions of radio frequency data transmission are applied to cluster construction, and a mix of heterogeneous nodes are used to minimize economic cost of the whole system and maximize power saving of the leaf nodes. Simulation results show that the specialization of a fraction of the nodes by providing better antennas and some energy harvesting techniques can dramatically extend the life of the entire WSN and reduce the cost of the whole system. A demonstration of the proposed architecture with a new routing protocol and applied to termite pest detection has been implemented on a set of new nodes and should last for about 10 years, but it provides better scalability, reliability and deployment properties. PMID:22346630

  1. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  2. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

    The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that has only recently begun to receive...

  3. SUSY GUT Model Building

    International Nuclear Information System (INIS)

    Raby, Stuart

    2008-01-01

    In this talk I discuss the evolution of SUSY GUT model building as I see it. Starting with 4 dimensional model building, I then consider orbifold GUTs in 5 dimensions and finally orbifold GUTs embedded into the E8 x E8 heterotic string.

  4. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades-off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
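
    The abstain-when-uncertain behaviour described above can be sketched as a simple thresholded decision rule; the thresholds below are arbitrary placeholders, and the rule is only a schematic stand-in for the paper's derived optimal policy.

    ```python
    # Hedged sketch of a predict/abstain decision rule of the general kind described above:
    # act when the estimated event probability is confidently high or low, abstain otherwise.
    def decide(event_probability, lower=0.2, upper=0.8):
        """Return 'alarm', 'no-alarm', or 'abstain' for one posterior event probability."""
        if event_probability >= upper:
            return "alarm"        # risk of delayed detection outweighs risk of a false alarm
        if event_probability <= lower:
            return "no-alarm"
        return "abstain"          # confidence criteria not met; defer the decision

    if __name__ == "__main__":
        for p in (0.05, 0.5, 0.93):
            print(p, decide(p))
    ```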

  5. Developing a scalable modeling architecture for studying survivability technologies

    Science.gov (United States)

    Mohammad, Syed; Bounker, Paul; Mason, James; Brister, Jason; Shady, Dan; Tucker, David

    2006-05-01

    To facilitate interoperability of models in a scalable environment, and provide a relevant virtual environment in which Survivability technologies can be evaluated, the US Army Research Development and Engineering Command (RDECOM) Modeling Architecture for Technology Research and Experimentation (MATREX) Science and Technology Objective (STO) program has initiated the Survivability Thread which will seek to address some of the many technical and programmatic challenges associated with the effort. In coordination with different Thread customers, such as the Survivability branches of various Army labs, a collaborative group has been formed to define the requirements for the simulation environment that would in turn provide them a value-added tool for assessing models and gauge system-level performance relevant to Future Combat Systems (FCS) and the Survivability requirements of other burgeoning programs. An initial set of customer requirements has been generated in coordination with the RDECOM Survivability IPT lead, through the Survivability Technology Area at RDECOM Tank-automotive Research Development and Engineering Center (TARDEC, Warren, MI). The results of this project are aimed at a culminating experiment and demonstration scheduled for September, 2006, which will include a multitude of components from within RDECOM and provide the framework for future experiments to support Survivability research. This paper details the components with which the MATREX Survivability Thread was created and executed, and provides insight into the capabilities currently demanded by the Survivability faculty within RDECOM.

  6. An extended systematic mapping study about the scalability of i* Models

    Directory of Open Access Journals (Sweden)

    Paulo Lima

    2016-12-01

    i* models have been used for requirements specification in many domains, such as healthcare, telecommunication, and air traffic control. Managing the scalability and the complexity of such models is an important challenge in Requirements Engineering (RE). Scalability is also one of the most intractable issues in the design of visual notations in general: a well-known problem with visual representations is that they do not scale well. This issue has led us to investigate scalability in i* models and their variants by means of a systematic mapping study. This paper is an extended version of a previous paper on the scalability of i*, now including papers indicated by specialists. Moreover, we also discuss the challenges and open issues regarding the scalability of i* models and their variants. A total of 126 papers were analyzed in order to understand how the RE community perceives scalability and which proposals have considered this topic. We found that scalability issues are indeed perceived as relevant and that further work is still required, even though many potential solutions have already been proposed. This study can be a starting point for researchers aiming to further advance the treatment of scalability in i* models.

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  8. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability, to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  9. Detailed Modeling and Evaluation of a Scalable Multilevel Checkpointing System

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronevetsky, Greg [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); de Supinski, Bronis R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-01

    High-performance computing (HPC) systems are growing more powerful by utilizing more components. As the system mean time before failure correspondingly drops, applications must checkpoint frequently to make progress. But, at scale, the cost of checkpointing becomes prohibitive. A solution to this problem is multilevel checkpointing, which employs multiple types of checkpoints in a single run. Moreover, lightweight checkpoints can handle the most common failure modes, while more expensive checkpoints can handle severe failures. We designed a multilevel checkpointing library, the Scalable Checkpoint/Restart (SCR) library, that writes lightweight checkpoints to node-local storage in addition to the parallel file system. We present probabilistic Markov models of SCR's performance. We show that on future large-scale systems, SCR can lead to a gain in machine efficiency of up to 35 percent, and reduce the load on the parallel file system by a factor of two. In addition, we predict that checkpoint scavenging, or only writing checkpoints to the parallel file system on application termination, can reduce the load on the parallel file system by 20 × on today's systems and still maintain high application efficiency.
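
    The trade-off that multilevel checkpointing exploits can be illustrated with the classical Young/Daly approximation for the optimal checkpoint interval, applied per level; the costs and failure rates below are invented, and this back-of-the-envelope sketch is not SCR's probabilistic Markov model.

    ```python
    # Hedged sketch: Young's first-order approximation of the optimal checkpoint interval,
    # applied per checkpoint level. The numbers are made up for illustration only.
    import math

    def optimal_interval(checkpoint_cost_s, mtbf_s):
        """Young's approximation: tau ~ sqrt(2 * C * MTBF)."""
        return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

    levels = {
        # level name: (checkpoint cost in seconds, MTBF of the failures that level absorbs, seconds)
        "node-local (lightweight)": (10.0, 4 * 3600.0),
        "parallel file system (heavy)": (600.0, 24 * 3600.0),
    }

    for name, (cost, mtbf) in levels.items():
        tau = optimal_interval(cost, mtbf)
        overhead = cost / (tau + cost)  # rough fraction of time spent writing checkpoints
        print(f"{name}: checkpoint every ~{tau / 60:.0f} min, ~{overhead:.1%} time checkpointing")
    ```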

  10. Building-related health impacts in European and Chinese cities: a scalable assessment method.

    Science.gov (United States)

    Tuomisto, Jouni T; Niittynen, Marjo; Pärjälä, Erkki; Asikainen, Arja; Perez, Laura; Trüeb, Stephan; Jantunen, Matti; Künzli, Nino; Sabel, Clive E

    2015-12-14

    Public health is often affected by societal decisions that are not primarily about health. Climate change mitigation requires intensive actions to minimise greenhouse gas emissions in the future. Many of these actions take place in cities due to their traffic, buildings, and energy consumption. Active climate mitigation policies will also, aside from their long term global impacts, have short term local impacts, both positive and negative, on public health. Our main objective was to develop a generic open impact model to estimate health impacts of emissions due to heat and power consumption of buildings. In addition, the model should be usable for policy comparisons by non-health experts at city level with city-specific data; it should give guidance on particular climate mitigation questions while at the same time increasing understanding of the related health impacts; and the model should follow the building stock in time, make comparisons between scenarios, propagate uncertainties, and scale to different levels of detail. We tested the functionalities of the model in two case cities, namely Kuopio and Basel. We estimated the health and climate impacts of two actual policies planned or implemented in the cities. The assessed policies were replacement of peat with wood chips in co-generation of district heat and power, and improved energy efficiency of buildings achieved by renovations. Health impacts were not large in the two cities, but clear differences in implementation and predictability between the two tested policies were seen. Renovation policies can improve the energy efficiency of buildings and reduce greenhouse gas emissions significantly, but this requires systematic policy sustained for decades. In contrast, fuel changes in large district heating facilities may have rapid and large impacts on emissions. However, the life cycle impacts of different fuels are somewhat an open question. In conclusion, we were able to develop a practical model for city

  11. Things That Squeak and Make You Feel Bad: Building Scalable User Experience Programs for Space Assessment

    Directory of Open Access Journals (Sweden)

    Rebecca Kuglitsch

    2018-04-01

    This article suggests a process for creating a user experience (UX) space assessment program that requires limited resources and minimal prior UX experience. By beginning with small scale methods, like comment boxes and easel prompts, librarians can overturn false assumptions about user behaviors, ground deeper investigations such as focus groups, and generate momentum. At the same time, these methods should feed into larger efforts to build trust and interest with peers and administration, laying the groundwork for more in-depth space UX assessment and more significant changes. The process and approach we suggest can be scaled for use in both large and small library systems. Developing a user experience space assessment program can seem overwhelming, especially without a dedicated user experience librarian or department, but it does not have to be. In this piece, we explore how to scale and sequence small UX projects, communicate UX practices and results to stakeholders, and build support in order to develop an intentional but still manageable space assessment program. Our approach takes advantage of our institutional context, a large academic library system with several branch locations, allowing us to pilot projects at different scales. We were able to coordinate across a complex multi-site system, as well as in branch libraries with a staffing model analogous to libraries at smaller institutions. This gives us confidence that our methods can be applied at libraries of different sizes. As subject librarians who served as co-coordinators of a UX team on a voluntary basis, we also confronted the question of how we could attend to user needs while staying on top of our regular workload. Haphazard experimentation is unsatisfying and wasteful, particularly when there is limited time, so we sought to develop a process we could implement that applied approachable, purposeful UX space assessments while building trust and buy-in with colleagues

  12. A simple, scalable and low-cost method to generate thermal diagnostics of a domestic building

    International Nuclear Information System (INIS)

    Papafragkou, Anastasios; Ghosh, Siddhartha; James, Patrick A.B.; Rogers, Alex; Bahaj, AbuBakr S.

    2014-01-01

    Highlights: • Our diagnostic method uses a single field measurement from a temperature logger. • Building technical performance and occupant behaviour are addressed simultaneously. • Our algorithm learns a thermal model of a home and diagnoses the heating system. • We propose a novel clustering approach to decouple user behaviour from technical performance. • Our diagnostic confidence is enhanced using a large scale deployment. - Abstract: Traditional approaches to understanding the problem of energy performance in the domestic sector include on-site surveys by energy assessors and the installation of complex home energy monitoring systems. The time and money that need to be invested by the occupants, and the form of feedback generated by these approaches, often make them unattractive to householders. This paper demonstrates a simple, low cost method that generates thermal diagnostics for dwellings, measuring only one field dataset: internal temperature over a period of 1 week. A thermal model, which is essentially a learning algorithm, generates a set of thermal diagnostics about the primary heating system, the occupants’ preferences and the impact of certain interventions, such as lowering the thermostat set-point. A simple clustering approach is also proposed to categorise homes according to their building fabric thermal performance and occupants’ energy efficiency with respect to ventilation. The advantage of this clustering approach is that the occupants receive tailored advice on certain actions that, if taken, will improve the overall thermal performance of a dwelling. Due to the method’s low cost and simplicity it could facilitate government initiatives, such as the ‘Green Deal’ in the UK
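
    One simple thermal model that could, in principle, be learned from a single temperature logger is a first-order cooling model; the sketch below fits its time constant to synthetic overnight data. It is a generic illustration under assumed data, not the diagnostic algorithm developed in the paper.

    ```python
    # Hedged sketch: fit a first-order cooling model T(t) = T_out + (T0 - T_out) * exp(-t / tau)
    # to overnight indoor temperatures, recovering a thermal time constant tau (in hours).
    # A larger tau indicates a better-insulated, slower-cooling dwelling.
    import numpy as np

    def fit_time_constant(hours, indoor_temp, outdoor_temp):
        """Linearise the decay: log(T - T_out) = log(T0 - T_out) - t / tau, then least squares."""
        y = np.log(indoor_temp - outdoor_temp)
        slope, _ = np.polyfit(hours, y, 1)
        return -1.0 / slope

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        t = np.arange(0.0, 8.0, 0.5)                     # 8 hours, one reading every 30 minutes
        true_tau, t_out = 30.0, 5.0                      # hours, degrees C (invented values)
        temps = t_out + (21.0 - t_out) * np.exp(-t / true_tau) + rng.normal(0, 0.05, t.size)
        print(f"estimated time constant: {fit_time_constant(t, temps, t_out):.1f} h")
    ```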

  13. Efficient Delivery of Scalable Video Using a Streaming Class Model

    Directory of Open Access Journals (Sweden)

    Jason J. Quinlan

    2018-03-01

    When we couple the rise in video streaming with the growing number of portable devices (smart phones, tablets, laptops), we see an ever-increasing demand for high-definition video online while on the move. Wireless networks are inherently characterised by restricted shared bandwidth and relatively high error loss rates, thus presenting a challenge for the efficient delivery of high quality video. Additionally, mobile devices can support/demand a range of video resolutions and qualities. This demand for mobile streaming highlights the need for adaptive video streaming schemes that can adjust to available bandwidth and heterogeneity, and can provide graceful changes in video quality, all while respecting viewing satisfaction. In this context, the use of well-known scalable/layered media streaming techniques, commonly known as scalable video coding (SVC), is an attractive solution. SVC encodes a number of video quality levels within a single media stream. This has been shown to be an especially effective and efficient solution, but it fares badly in the presence of datagram losses. While multiple description coding (MDC) can reduce the effects of packet loss on scalable video delivery, the increased delivery cost is counterproductive for constrained networks. This situation is accentuated in cases where only the lower quality level is required. In this paper, we assess these issues and propose a new approach called Streaming Classes (SC) through which we can define a key set of quality levels, each of which can be delivered in a self-contained manner. This facilitates efficient delivery, yielding reduced transmission byte-cost for devices requiring lower quality, relative to MDC and Adaptive Layer Distribution (ALD) (42% and 76% respective reductions for layer 2), while also maintaining high levels of consistent quality. We also illustrate how a selective packetisation technique can further reduce the effects of packet loss on viewable quality by

  14. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model

  15. Open string model building

    International Nuclear Information System (INIS)

    Ishibashi, Nobuyuki; Onogi, Tetsuya

    1989-01-01

    Consistency conditions of open string theories, which can be a powerful tool in open string model building, are proposed. By making use of these conditions and assuming a simple prescription for the Chan-Paton factors, open string theories in several backgrounds are studied. We show that 1. there exist a large number of consistent bosonic open string theories on Z2 orbifolds, 2. SO(32) type I superstring is the unique consistent model among fermionic string theories on the ten-dimensional flat Minkowski space, and 3. with our prescription for the Chan-Paton factors, there exist no consistent open superstring theories on (six-dimensional Minkowski space-time) x (Z2 orbifold). (orig.)

  16. A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Gent, Peter; Lamarque, Jean-Francois; Conley, Andrew; Vertenstein, Mariana; Craig, Anthony

    2013-02-13

    The objective of this award was to build a scalable and extensible Earth System Model that can be used to study climate change science. That objective has been achieved with the public release of the Community Earth System Model, version 1 (CESM1). In particular, the development of the CESM1 atmospheric chemistry component was substantially funded by this award, as was the development of the significantly improved coupler component. The CESM1 allows new climate change science in areas such as future air quality in very large cities, the effects of recovery of the southern hemisphere ozone hole, and effects of runoff from ice melt in the Greenland and Antarctic ice sheets. Results from a whole series of future climate projections using the CESM1 are also freely available via the web from the CMIP5 archive at the Lawrence Livermore National Laboratory. Many research papers using these results have now been published, and will form part of the 5th Assessment Report of the United Nations Intergovernmental Panel on Climate Change, which is to be published late in 2013.

  17. SAME4HPC: A Promising Approach in Building a Scalable and Mobile Environment for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Karthik, Rajasekar [ORNL

    2014-01-01

    In this paper, an architecture for building a Scalable And Mobile Environment for High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exists a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both of these types of devices to provide high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks are some of the key open-source and industry standard practices that have been adopted in this architecture.

  18. A scalable infrastructure model for carbon capture and storage: SimCCS

    International Nuclear Information System (INIS)

    Middleton, Richard S.; Bielicki, Jeffrey M.

    2009-01-01

    In the carbon capture and storage (CCS) process, CO2 sources and geologic reservoirs may be widely spatially dispersed and need to be connected through a dedicated CO2 pipeline network. We introduce a scalable infrastructure model for CCS (simCCS) that generates a fully integrated, cost-minimizing CCS system. SimCCS determines where and how much CO2 to capture and store, and where to build and connect pipelines of different sizes, in order to minimize the combined annualized costs of sequestering a given amount of CO2. SimCCS is able to aggregate CO2 flows between sources and reservoirs into trunk pipelines that take advantage of economies of scale. Pipeline construction costs take into account factors including topography and social impacts. SimCCS can be used to calculate the scale of CCS deployment (local, regional, national). SimCCS' deployment of a realistic, capacitated pipeline network is a major advancement for planning CCS infrastructure. We demonstrate simCCS using a set of 37 CO2 sources and 14 reservoirs for California. The results highlight the importance of systematic planning for CCS infrastructure by examining the sensitivity of CCS infrastructure, as optimized by simCCS, to varying CO2 targets. We finish by identifying critical future research areas for CCS infrastructure
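
    A highly simplified version of the routing sub-problem can be written as a minimum-cost network flow; the sketch below uses networkx with invented sources, reservoirs, capacities, and unit costs, and, unlike simCCS, it ignores economies of scale and discrete pipeline sizing decisions.

    ```python
    # Hedged sketch: routing CO2 from sources to reservoirs as a minimum-cost flow over
    # candidate pipeline corridors, using networkx. All nodes, capacities and unit costs
    # are hypothetical.
    import networkx as nx

    G = nx.DiGraph()
    # Negative demand = supply (Mt CO2/yr captured); positive demand = storage.
    G.add_node("plant_A", demand=-3)
    G.add_node("plant_B", demand=-2)
    G.add_node("junction", demand=0)
    G.add_node("reservoir_1", demand=4)
    G.add_node("reservoir_2", demand=1)

    # Candidate corridors: capacity in Mt/yr, weight as a unit transport cost.
    G.add_edge("plant_A", "junction", capacity=3, weight=4)
    G.add_edge("plant_B", "junction", capacity=2, weight=6)
    G.add_edge("junction", "reservoir_1", capacity=4, weight=2)
    G.add_edge("junction", "reservoir_2", capacity=2, weight=5)

    flow = nx.min_cost_flow(G)
    for src, dests in flow.items():
        for dst, amount in dests.items():
            if amount:
                print(f"{src} -> {dst}: {amount} Mt/yr")
    ```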

  19. Scalable Power-Component Models for Concept Testing

    Science.gov (United States)

    2011-08-17

    Only fragments of this record are preserved: motor speed can be either positive or negative depending upon the propelling or regenerative braking scenario; the simulation provides three … the machine during generation or regenerative braking. To use the model, the user modifies the motor model criteria parameters by double-clicking …

  20. Model building and new particles

    International Nuclear Information System (INIS)

    Frampton, P.H.

    1992-01-01

    After an outline of the Standard Model, indications of new physics beyond it are discussed. The nature of model building is illustrated by three examples which predict, respectively, new particles called the axigluon, sarks and the aspon. (author). 11 refs

  1. Scalable Topic Modeling: Online Learning, Diagnostics, and Recommendation

    Science.gov (United States)

    2017-03-01

    Only fragments of this record are preserved: the problem of estimating the conditional distribution is a well-defined mathematical problem; model checking and application again require the domain … ("Critique, Repeat: Data Analysis with Latent Variable Models," Annual Review of Statistics and Its Application, 1:203-232, 2014; S. Gershman, D. Blei, K…) … central statistical and computational problem. 3. With the results of inference, we use our model to form predictions about the future, explore the data, or

  2. Scalable learning of probabilistic latent models for collaborative filtering

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2015-01-01

    variational Bayes learning and inference algorithm for these types of models. Empirical results show that the proposed algorithm achieves significantly better accuracy results than other straw-men models evaluated on a collection of well-known data sets. We also demonstrate that the algorithm has a highly...

  3. Scalable Telemonitoring Model in Cloud for Health Care Analysis

    Science.gov (United States)

    Sawant, Yogesh; Jayakumar, Naveenkumar, Dr.; Pawar, Sanket Sunil

    2017-08-01

    The telemonitoring model is a health observation model for monitoring patients remotely. It is suitable for patients who wish to avoid the high operating expense of emergency treatment. Telemonitoring provides a path for monitoring with medical devices that generate a complete profile of a patient's health by assembling vital signs as well as additional health information. The model relies on four differential modules capable of generating realistic synthetic electrocardiogram (ECG) signals. It addresses four categories of chronic disease: pulmonary conditions, diabetes, hypertension, and cardiovascular diseases. On the other hand, the results of this application model suggest that, regardless of their nationality, socioeconomic class, or age, patients can be observed within tele-monitoring programs and through the use of these technologies. A patient's health status is reflected in results such as beat-to-beat variation in morphology and timing of the human ECG, including QT dispersion and R-peak amplitude modulation. This model will be used to evaluate biomedical signal processing methods that are employed to extract clinical information from the ECG.

  4. Buildings Lean Maintenance Implementation Model

    Science.gov (United States)

    Abreu, Antonio; Calado, João; Requeijo, José

    2016-11-01

    Nowadays, companies in global markets have to achieve high levels of performance and competitiveness to stay "alive". Within this assumption, building maintenance cannot be done in a casual and improvised way, due to the costs involved. Starting with a discussion of lean management and building maintenance, this paper introduces a model to support the Lean Building Maintenance (LBM) approach. Finally, based on a real case study from a Portuguese company, the benefits, challenges and difficulties are presented and discussed.

  5. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczy´nski, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage the complex distributed environments, such as computational clouds, grids, ad hoc, and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that are incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions and the aggregation and sharing of geographically-distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  6. Model Transport: Towards Scalable Transfer Learning on Manifolds

    DEFF Research Database (Denmark)

    Freifeld, Oren; Hauberg, Søren; Black, Michael J.

    2014-01-01

    We consider the intersection of two research fields: transfer learning and statistics on manifolds. In particular, we consider, for manifold-valued data, transfer learning of tangent-space models such as Gaussian distributions, PCA, regression, or classifiers. Though one would hope to simply use ordinary R^n transfer learning ideas, the manifold structure prevents it. We overcome this by basing our method on inner-product-preserving parallel transport, a well-known tool widely used in other problems of statistics on manifolds in computer vision. At first, this straightforward idea seems to suffer … “commutes” with learning. Consequently, our compact framework, applicable to a large class of manifolds, is not restricted by the size of either the training or test sets. We demonstrate the approach by transferring PCA and logistic-regression models of real-world data involving 3D shapes and image

  7. Scalable and Robust BDDC Preconditioners for Reservoir and Electromagnetics Modeling

    KAUST Repository

    Zampini, S.; Widlund, O.B.; Keyes, David E.

    2015-01-01

    The purpose of the study is to show the effectiveness of recent algorithmic advances in Balancing Domain Decomposition by Constraints (BDDC) preconditioners for the solution of elliptic PDEs with highly heterogeneous coefficients, and discretized by means of the finite element method. Applications to large linear systems generated by div- and curl-conforming finite element discretizations, commonly arising in the contexts of modelling reservoirs and electromagnetics, will be presented.

  8. Scalable and Robust BDDC Preconditioners for Reservoir and Electromagnetics Modeling

    KAUST Repository

    Zampini, S.

    2015-09-13

    The purpose of the study is to show the effectiveness of recent algorithmic advances in Balancing Domain Decomposition by Constraints (BDDC) preconditioners for the solution of elliptic PDEs with highly heterogeneous coefficients, and discretized by means of the finite element method. Applications to large linear systems generated by div- and curl-conforming finite element discretizations, commonly arising in the contexts of modelling reservoirs and electromagnetics, will be presented.

  9. A veracity preserving model for synthesizing scalable electricity load profiles

    OpenAIRE

    Huang, Yunyou; Zhan, Jianfeng; Luo, Chunjie; Wang, Lei; Wang, Nana; Zheng, Daoyi; Fan, Fanda; Ren, Rui

    2018-01-01

    Electricity users are the major players of electric systems, and electricity consumption is growing at an extraordinary rate. Research on electricity consumption behaviors is becoming increasingly important to the design and deployment of electric systems. Unfortunately, electricity load profiles are difficult to acquire. Data synthesis is one of the best approaches to addressing the lack of data, and the key is a model that preserves the real electricity consumption behaviors. In this...

  10. A Scalable Cloud Library Empowering Big Data Management, Diagnosis, and Visualization of Cloud-Resolving Models

    Science.gov (United States)

    Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.

    2015-12-01

    A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25 to 5 km horizontal grid spacings. The main advantage of the CRM is that it can allow explicit interactive processes between microphysics, radiation, turbulence, surface, and aerosols without subgrid cloud fraction, overlapping and convective parameterization. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize/inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate against NASA's field campaign data and L1/L2 satellite data products, due to the large data volume (~10 TB) and the complexity of the CRM's physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capability includes (1) a SCL data model that enables various CRM simulation outputs in NetCDF, including the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) model, to be accessed and processed by Hadoop, (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs, (3) a technique that visualizes Hadoop-resident data with IDL, (4) a technique that subsets Hadoop-resident data, compliant to the SCL data model, with HIVE or Impala via HUE's Web interface, (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system, PVFS2 or CephFS, where high performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA Field Campaigns and Satellite data to a
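
    As a hedged illustration of the kind of NetCDF-to-CSV conversion step listed above, a minimal serial sketch might look as follows; the file name, variable names and dimension layout are hypothetical, and the SCL's actual parallel converter is not reproduced here.

      import csv
      from netCDF4 import Dataset  # requires the netCDF4 package

      # Flatten one 3D variable (time, lat, lon) of a hypothetical CRM output file into CSV rows.
      with Dataset("nu_wrf_output.nc") as nc:           # hypothetical file name
          var = nc.variables["RAINNC"]                  # hypothetical variable name
          times = nc.variables["time"][:]
          lats = nc.variables["lat"][:]
          lons = nc.variables["lon"][:]
          with open("nu_wrf_output.csv", "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["time", "lat", "lon", "RAINNC"])
              for it, t in enumerate(times):
                  data = var[it, :, :]
                  for ilat, lat in enumerate(lats):
                      for ilon, lon in enumerate(lons):
                          writer.writerow([float(t), float(lat), float(lon), float(data[ilat, ilon])])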

  11. Scalable Database Design of End-Game Model with Decoupled Countermeasure and Threat Information

    Science.gov (United States)

    2017-11-01

    ...the Army Modular Active Protection System (MAPS) program to provide end-to-end APS modeling and simulation capabilities. The SSES simulation features... A research project on scalable database design was initiated in support of SSES modularization efforts with respect to 4 major software components... Acronyms: Iron Curtain; KE, kinetic energy; MAPS, Modular Active Protective System; OLE DB, object linking and embedding database; RDB, relational database; RPG.

  12. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad; Elsawy, Hesham; Bader, Ahmed; Alouini, Mohamed-Slim

    2017-01-01

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.

  13. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.

  14. A scalable variational inequality approach for flow through porous media models with pressure-dependent viscosity

    Science.gov (United States)

    Mapakshi, N. K.; Chang, J.; Nakshatrala, K. B.

    2018-04-01

    Mathematical models for flow through porous media typically enjoy the so-called maximum principles, which place bounds on the pressure field. It is highly desirable to preserve these bounds on the pressure field in predictive numerical simulations, that is, one needs to satisfy discrete maximum principles (DMP). Unfortunately, many of the existing formulations for flow through porous media models do not satisfy DMP. This paper presents a robust, scalable numerical formulation based on variational inequalities (VI), to model non-linear flows through heterogeneous, anisotropic porous media without violating DMP. VI is an optimization technique that places bounds on the numerical solutions of partial differential equations. To crystallize the ideas, a modification to Darcy equations by taking into account pressure-dependent viscosity will be discretized using the lowest-order Raviart-Thomas (RT0) and Variational Multi-scale (VMS) finite element formulations. It will be shown that these formulations violate DMP, and, in fact, these violations increase with an increase in anisotropy. It will be shown that the proposed VI-based formulation provides a viable route to enforce DMP. Moreover, it will be shown that the proposed formulation is scalable, and can work with any numerical discretization and weak form. A series of numerical benchmark problems are solved to demonstrate the effects of heterogeneity, anisotropy and non-linearity on DMP violations under the two chosen formulations (RT0 and VMS), and that of non-linearity on solver convergence for the proposed VI-based formulation. Parallel scalability on modern computational platforms will be illustrated through strong-scaling studies, which will prove the efficiency of the proposed formulation in a parallel setting. Algorithmic scalability as the problem size is scaled up will be demonstrated through novel static-scaling studies. The performed static-scaling studies can serve as a guide for users to be able to select
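
    The bound-enforcement idea behind the variational inequality formulation can be sketched on a tiny discrete problem: minimize the usual quadratic energy subject to box constraints on the discrete solution. The matrix, load vector and bounds below are toy values, not the paper's Darcy discretization.

      import numpy as np
      from scipy.optimize import minimize

      # Toy symmetric positive-definite "stiffness" matrix and load vector (illustrative only).
      K = np.array([[ 4.0, -1.0,  0.0],
                    [-1.0,  4.0, -1.0],
                    [ 0.0, -1.0,  4.0]])
      f = np.array([1.0, -3.0, 2.0])

      def energy(u):
          return 0.5 * u @ K @ u - f @ u

      def grad(u):
          return K @ u - f

      # Unconstrained Galerkin-type solution can violate the maximum principle (bounds [0, 1] here).
      u_unconstrained = np.linalg.solve(K, f)

      # Variational-inequality-style solution: same energy, but with box constraints enforced.
      res = minimize(energy, np.zeros(3), jac=grad, method="L-BFGS-B",
                     bounds=[(0.0, 1.0)] * 3)
      print("unconstrained:", u_unconstrained)
      print("bounded (DMP-respecting):", res.x)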

  15. A Scalable Version of the Navy Operational Global Atmospheric Prediction System Spectral Forecast Model

    Directory of Open Access Journals (Sweden)

    Thomas E. Rosmond

    2000-01-01

    Full Text Available The Navy Operational Global Atmospheric Prediction System (NOGAPS) includes a state-of-the-art spectral forecast model similar to models run at several major operational numerical weather prediction (NWP) centers around the world. The model, developed by the Naval Research Laboratory (NRL) in Monterey, California, has run operationally at the Fleet Numerical Meteorological and Oceanographic Center (FNMOC) since 1982, and most recently has been run on a Cray C90 in a multi-tasked configuration. Typically the multi-tasked code runs on 10 to 15 processors with an overall parallel efficiency of about 90%. Operational resolution is T159L30, but other operational and research applications run at significantly lower resolutions. A scalable NOGAPS forecast model has been developed by NRL in anticipation of a FNMOC C90 replacement in about 2001, as well as for current NOGAPS research requirements to run on DOD High-Performance Computing (HPC) scalable systems. The model is designed to run with message passing (MPI). Model design criteria include bit reproducibility for different processor numbers and reasonably efficient performance on fully shared memory, distributed memory, and distributed shared memory systems for a wide range of model resolutions. Results for a wide range of processor numbers, model resolutions, and different vendor architectures are presented. Single node performance has been disappointing on RISC-based systems, at least compared to vector processor performance. This is a common complaint, and will require careful re-examination of traditional numerical weather prediction (NWP) model software design and data organization to fully exploit future scalable architectures.

  16. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics for Scientific Data and Analysis

    Data.gov (United States)

    National Aeronautics and Space Administration — We will construct SciSpark, a scalable system for interactive model evaluation and for the rapid development of climate metrics and analyses. SciSpark directly...

  17. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in Internet technologies, the current generation of web systems is starting to expand into areas traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  18. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    2010-01-01

    For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence...
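
    For context, Loevinger's H mentioned above is one minus the ratio of observed to expected Guttman errors over all item pairs; the short sketch below computes the overall H for simulated Rasch-type binary data. The data-generating settings are invented, and this is not the authors' Monte Carlo test procedure.

      import numpy as np

      def loevinger_H(X):
          """Overall scalability coefficient H for a binary persons-by-items matrix X."""
          n, k = X.shape
          p = X.mean(axis=0)                      # item popularities
          order = np.argsort(-p)                  # easiest item first
          X = X[:, order]
          observed, expected = 0.0, 0.0
          for i in range(k):
              for j in range(i + 1, k):
                  # Guttman error: failing the easier item i while passing the harder item j.
                  observed += np.sum((X[:, i] == 0) & (X[:, j] == 1))
                  expected += n * (1 - X[:, i].mean()) * X[:, j].mean()
          return 1.0 - observed / expected

      rng = np.random.default_rng(0)
      ability = rng.normal(size=200)
      difficulty = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
      # Rasch-type data: P(correct) = logistic(ability - difficulty).
      prob = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
      X = (rng.random(prob.shape) < prob).astype(int)
      print(round(loevinger_H(X), 3))   # positive H indicates a scalable item set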

  19. More scalability, less pain: A simple programming model and its implementation for extreme computing

    International Nuclear Information System (INIS)

    Lusk, E.L.; Pieper, S.C.; Butler, R.M.

    2010-01-01

    This is the story of a simple programming model, its implementation for extreme computing, and a breakthrough in nuclear physics. A critical issue for the future of high-performance computing is the programming model to use on next-generation architectures. Described here is a promising approach: program very large machines by combining a simplified programming model with a scalable library implementation. The presentation takes the form of a case study in nuclear physics. The chosen application addresses fundamental issues in the origins of our Universe, while the library developed to enable this application on the largest computers may have applications beyond this one.
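
    The "simple programming model plus scalable library" idea is essentially a work-pool abstraction: application code only defines independent work units and how to process one, while the library decides where each unit runs. Below is a minimal single-node stand-in using Python's multiprocessing; it only illustrates the division of labour, not the scalable MPI-based library implementation described in the report.

      from multiprocessing import Pool

      # The application code only defines independent work units and a function to process one;
      # the "library" (here, a process pool standing in for a scalable work-distribution layer)
      # decides where and when each unit runs.
      def process_unit(work):
          seed, n = work
          # Placeholder computation for one work unit.
          total = sum((seed * i) % 97 for i in range(n))
          return seed, total

      if __name__ == "__main__":
          work_units = [(seed, 100_000) for seed in range(32)]
          with Pool(processes=4) as pool:
              for seed, total in pool.imap_unordered(process_unit, work_units):
                  print(f"unit {seed}: {total}")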

  20. Genetic algorithms and genetic programming for multiscale modeling: Applications in materials science and chemistry and advances in scalability

    Science.gov (United States)

    Sastry, Kumara Narasimha

    2007-03-01

    building blocks in organic chemistry---indicate that MOGAs produce high-quality semiempirical methods that (1) are stable to small perturbations, (2) yield accurate configuration energies on untested and critical excited states, and (3) yield ab initio quality excited-state dynamics. The proposed method enables simulations of more complex systems to realistic, multi-picosecond timescales, well beyond previous attempts or expectations of human experts, with a 2-3 orders-of-magnitude reduction in computational cost. While the two applications use simple evolutionary operators, in order to tackle more complex systems, their scalability and limitations have to be investigated. The second part of the thesis addresses some of the challenges involved with a successful design of genetic algorithms and genetic programming for multiscale modeling. The first issue addressed is the scalability of genetic programming, where facetwise models are built to assess the population size required by GP to ensure adequate supply of raw building blocks and also to ensure accurate decision-making between competing building blocks. This study also presents a design of competent genetic programming, where traditional fixed recombination operators are replaced by building and sampling probabilistic models of promising candidate programs. The proposed scalable GP, called extended compact GP (eCGP), combines the ideas from extended compact genetic algorithm (eCGA) and probabilistic incremental program evolution (PIPE) and adaptively identifies, propagates and exchanges important subsolutions of a search problem. Results show that eCGP scales cubically with problem size on both GP-easy and GP-hard problems. Finally, facetwise models are developed to explore limitations of scalability of MOGAs, where the scalability of multiobjective algorithms in reliably maintaining Pareto-optimal solutions is addressed. The results show that even when the building blocks are accurately identified, massive multimodality

  1. Darwinian Model Building

    NARCIS (Netherlands)

    Kester, Do; Bontekoe, Romke; MohammadDjafari, A; Bercher, JF; Bessiere, P

    2010-01-01

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment).

  2. Darwinian Model Building

    International Nuclear Information System (INIS)

    Kester, Do; Bontekoe, Romke

    2011-01-01

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment). Variation is achieved by cross-over and mutation operations on the genotype, which consists in the present case of a single chromosome. The genotypes 'live' in the environment of the data. Nested Sampling is used to optimize the free parameters of the models given the data, thus giving rise to the phenotypes. Selection is based on the phenotypes. The evidences which naturally follow from the Nested Sampling algorithm are used in a second level of Nested Sampling to find increasingly better models. The data in this paper originate from the Leiden Cytology and Pathology Laboratory (LCPL), which screens pap smears for cervical cancer. We have data for 1750 women who on average underwent 5 tests each. The data on individual women are treated as a small time series. We will try to estimate the next value of the prime cancer indicator from previous tests of the same woman.

  3. Flavored model building

    International Nuclear Information System (INIS)

    Hagedorn, C.

    2008-01-01

    In this thesis we discuss possibilities to solve the family replication problem and to understand the observed strong hierarchy among the fermion masses and the diverse mixing pattern of quarks and leptons. We show that non-abelian discrete symmetries which act non-trivially in generation space can serve as a profound explanation. We present three low energy models with the permutation symmetry S_4, the dihedral group D_5 and the double-valued group T' as flavor symmetry. The T' model turns out to be very predictive, since it explains tri-bimaximal mixing in the lepton sector and, moreover, leads to two non-trivial relations in the quark sector, √(m_d/m_s) = |V_us| and √(m_d/m_s) = |V_td/V_ts|. The main message of the T' model is the observation that the diverse pattern in the quark and lepton mixings can be well understood if the flavor symmetry is not broken in an arbitrary way, but only to residual (non-trivial) subgroups. Apart from leading to deeper insights into the origin of the fermion mixings, this idea enables us to perform systematic studies of large classes of discrete groups. This we show in our study of dihedral symmetries D_n and D'_n. As a result we find only five distinct (Dirac) mass matrix structures arising from a dihedral group, if we additionally require partial unification of either left-handed or left-handed conjugate fermions and the determinant of the mass matrix to be non-vanishing. Furthermore, we reveal the ability of dihedral groups to predict the Cabibbo angle θ_C, i.e. |V_us(cd)| = cos(3π/7), as well as maximal atmospheric mixing, θ_23 = π/4, and vanishing θ_13 in the lepton sector. (orig.)

  4. Flavored model building

    Energy Technology Data Exchange (ETDEWEB)

    Hagedorn, C.

    2008-01-15

    In this thesis we discuss possibilities to solve the family replication problem and to understand the observed strong hierarchy among the fermion masses and the diverse mixing pattern of quarks and leptons. We show that non-abelian discrete symmetries which act non-trivially in generation space can serve as a profound explanation. We present three low energy models with the permutation symmetry S_4, the dihedral group D_5 and the double-valued group T' as flavor symmetry. The T' model turns out to be very predictive, since it explains tri-bimaximal mixing in the lepton sector and, moreover, leads to two non-trivial relations in the quark sector, √(m_d/m_s) = |V_us| and √(m_d/m_s) = |V_td/V_ts|. The main message of the T' model is the observation that the diverse pattern in the quark and lepton mixings can be well understood if the flavor symmetry is not broken in an arbitrary way, but only to residual (non-trivial) subgroups. Apart from leading to deeper insights into the origin of the fermion mixings, this idea enables us to perform systematic studies of large classes of discrete groups. This we show in our study of dihedral symmetries D_n and D'_n. As a result we find only five distinct (Dirac) mass matrix structures arising from a dihedral group, if we additionally require partial unification of either left-handed or left-handed conjugate fermions and the determinant of the mass matrix to be non-vanishing. Furthermore, we reveal the ability of dihedral groups to predict the Cabibbo angle θ_C, i.e. |V_us(cd)| = cos(3π/7), as well as maximal atmospheric mixing, θ_23 = π/4, and vanishing θ_13 in the lepton sector. (orig.)

  5. A framework for scalable parameter estimation of gene circuit models using structural information

    KAUST Repository

    Kuwahara, Hiroyuki

    2013-06-21

    Motivation: Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Results: Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. The Author 2013.

  6. A framework for scalable parameter estimation of gene circuit models using structural information

    KAUST Repository

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-01-01

    Motivation: Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Results: Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. The Author 2013.

  7. A framework for scalable parameter estimation of gene circuit models using structural information.

    Science.gov (United States)

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
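
    A rough sketch of the decomposition strategy described above: rather than integrating the coupled rate equations jointly, each species' equation is integrated on its own while the other species' trajectory is held fixed from the previous sweep, and the sweeps are repeated until the trajectories stop changing. The two-species model and rate constants below are invented for illustration and are not the gene circuits studied in the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Invented two-species model: dx/dt = a/(1 + y^2) - x,  dy/dt = a/(1 + x^2) - y
      a = 3.0
      t_grid = np.linspace(0.0, 10.0, 200)

      def integrate_single(other_traj, y0):
          """Integrate one species' ODE while interpolating the other species from a fixed trajectory."""
          def rhs(t, s):
              other = np.interp(t, t_grid, other_traj)
              return [a / (1.0 + other**2) - s[0]]
          sol = solve_ivp(rhs, (t_grid[0], t_grid[-1]), [y0], t_eval=t_grid)
          return sol.y[0]

      # Iterative sweeps: refine each trajectory against the other until they stop changing.
      x_traj = np.full_like(t_grid, 1.0)
      y_traj = np.full_like(t_grid, 0.5)
      for sweep in range(20):
          x_new = integrate_single(y_traj, 1.0)
          y_new = integrate_single(x_traj, 0.5)
          change = max(np.max(np.abs(x_new - x_traj)), np.max(np.abs(y_new - y_traj)))
          x_traj, y_traj = x_new, y_new
          if change < 1e-6:
              break
      print(f"converged after {sweep + 1} sweeps; final x(10) = {x_traj[-1]:.4f}")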

  8. Use of modeling to assess the scalability of Ethernet networks for the ATLAS second level trigger

    CERN Document Server

    Korcyl, K; Dobinson, Robert W; Saka, F

    1999-01-01

    The second level trigger of the LHC's ATLAS experiment has to perform real-time analyses on detector data at 10 GBytes/s. A switching network is required to connect more than a thousand read-out buffers to about a thousand processors that execute the trigger algorithm. We are investigating the use of Ethernet technology to build this large switching network. Ethernet is attractive because of the huge installed base, competitive prices, and the recent introduction of the high-performance Gigabit version. Due to the network's size it has to be constructed as a layered structure of smaller units. To assess the scalability of such a structure we evaluated a single switch unit.

  9. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness tools, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhances capabilities and reusability of project partner components as specified in the original project proposal. The overall project team's work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  10. Investigation of the blockchain systems’ scalability features using the agent based modelling

    OpenAIRE

    Šulnius, Aleksas

    2017-01-01

    Investigation of the BlockChain Systems' Scalability Features using the Agent Based Modelling. BlockChain is currently in the spotlight of the entire FinTech industry. This technology is being called revolutionary, ground-breaking, disruptive and even the WEB 3.0. On the other hand, it is widely agreed that the BlockChain is in its early stages of development. In its current state the BlockChain is in a similar position to that of the Internet in the early nineties. In order for this technology to gain m...

  11. Alternatives to quintessence model building

    International Nuclear Information System (INIS)

    Avelino, P.P.; Beca, L.M.G.; Pinto, P.; Carvalho, J.P.M. de; Martins, C.J.A.P.

    2003-01-01

    We discuss the issue of toy model building for the dark energy component of the universe. Specifically, we consider two generic toy models recently proposed as alternatives to quintessence models, respectively known as Cardassian expansion and the Chaplygin gas. We show that the former is entirely equivalent to a class of quintessence models. We determine the observational constraints on the latter, coming from recent supernovae results and from the shape of the matter power spectrum. As expected, these restrict the model to a behavior that closely matches that of a standard cosmological constant Λ

  12. LoRa Scalability: A Simulation Model Based on Interference Measurements

    Directory of Open Access Journals (Sweden)

    Jetmir Haxhibeqiri

    2017-05-01

    Full Text Available LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.

  13. LoRa Scalability: A Simulation Model Based on Interference Measurements.

    Science.gov (United States)

    Haxhibeqiri, Jetmir; Van den Abeele, Floris; Moerman, Ingrid; Hoebeke, Jeroen

    2017-05-23

    LoRa is a long-range, low power, low bit rate and single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways. These gateways act like a transparent bridge towards a common network server. The amount of end devices and their throughput requirements will have an impact on the performance of the LoRaWAN network. This study investigates the scalability in terms of the number of end devices per gateway of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single gateway LoRaWAN network. We show that when the number of nodes increases up to 1000 per gateway, the losses will be up to 32%. In such a case, pure Aloha will have around 90% losses. However, when the duty cycle of the application layer becomes lower than the allowed radio duty cycle of 1%, losses will be even lower. We also show network scalability simulation results for some IoT use cases based on real data.
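
    The pure-Aloha loss figure quoted in these records follows from the classical collision formula: with offered load G (average number of packet transmissions per packet airtime), the probability that a packet is lost is 1 - e^(-2G). The node count, airtime and reporting interval below are illustrative assumptions, not the measured values from the study.

      import math

      def aloha_loss(G):
          """Pure-Aloha packet loss at normalized offered load G (packets per packet airtime)."""
          return 1.0 - math.exp(-2.0 * G)

      def offered_load(num_nodes, packet_time_s, mean_interval_s):
          """G for N nodes, each sending one packet of length packet_time_s every mean_interval_s."""
          return num_nodes * packet_time_s / mean_interval_s

      # Illustrative numbers (not from the paper): 1000 nodes, 1 s airtime, one packet every ~15 min.
      G = offered_load(1000, 1.0, 900.0)
      print(f"G = {G:.2f}, loss = {aloha_loss(G):.1%}")   # close to the ~90% figure cited above
      # Inverting the formula: a 90% loss corresponds to G = -ln(0.1)/2.
      print(f"G for 90% loss: {-math.log(0.1) / 2:.2f}")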

  14. Progress Report 2008: A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Drake, John B [ORNL; Worley, Patrick H [ORNL; Hoffman, Forrest M [ORNL; Jones, Phil [Los Alamos National Laboratory (LANL)

    2009-01-01

    This project employs multi-disciplinary teams to accelerate development of the Community Climate System Model (CCSM), based at the National Center for Atmospheric Research (NCAR). A consortium of eight Department of Energy (DOE) National Laboratories collaborates with NCAR and the NASA Global Modeling and Assimilation Office (GMAO). The laboratories are Argonne (ANL), Brookhaven (BNL), Los Alamos (LANL), Lawrence Berkeley (LBNL), Lawrence Livermore (LLNL), Oak Ridge (ORNL), Pacific Northwest (PNNL) and Sandia (SNL). The work plan focuses on scalability for petascale computation and extensibility to a more comprehensive earth system model. Our stated goal is to support the DOE mission in climate change research by helping ... To determine the range of possible climate changes over the 21st century and beyond through simulations using a more accurate climate system model that includes the full range of human and natural climate feedbacks with increased realism and spatial resolution.

  15. Scalability of the muscular action in a parametric 3D model of the index finger.

    Science.gov (United States)

    Sancho-Bru, Joaquín L; Vergara, Margarita; Rodríguez-Cervantes, Pablo-Jesús; Giurintano, David J; Pérez-González, Antonio

    2008-01-01

    A method for scaling the muscle action is proposed and used to achieve a 3D inverse dynamic model of the human finger with all its components scalable. This method is based on scaling the physiological cross-sectional area (PCSA) in a Hill muscle model. Different anthropometric parameters and maximal grip force data have been measured, and their correlations have been analyzed and used for scaling the PCSA of each muscle. A linear relationship between the normalized PCSA and the product of the length and breadth of the hand has finally been used for scaling, with a slope of 0.01315 cm^-2, with the length and breadth of the hand expressed in centimeters. The parametric muscle model has been included in a parametric finger model previously developed by the authors, and it has been validated by reproducing the results of an experiment in which subjects from different population groups exerted maximal voluntary forces with their index finger in a controlled posture.
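
    The reported linear fit can be applied directly: the normalized PCSA predicted for a subject is 0.01315 cm^-2 times the product of hand length and hand breadth (both in cm). The hand dimensions and reference muscle values below are placeholders, and the final multiplication by a per-muscle reference PCSA is an assumption about how the scale factor would be used, not a detail stated in the abstract.

      def pcsa_scale_factor(hand_length_cm, hand_breadth_cm, slope_per_cm2=0.01315):
          """Normalized PCSA predicted from the hand length x breadth product (slope 0.01315 cm^-2)."""
          return slope_per_cm2 * hand_length_cm * hand_breadth_cm

      # Placeholder hand dimensions: 18 cm length, 8 cm breadth -> dimensionless factor of about 1.89.
      factor = pcsa_scale_factor(18.0, 8.0)
      # Assumption (not stated in the abstract): each muscle's PCSA is a reference PCSA times this factor.
      reference_pcsa_cm2 = {"FDP": 4.1, "FDS": 3.6}   # hypothetical reference values
      print({muscle: round(p * factor, 2) for muscle, p in reference_pcsa_cm2.items()})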

  16. Toward a scalable flexible-order model for 3D nonlinear water waves

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Ducrozet, Guillaume; Bingham, Harry B.

    For marine and coastal applications, current work is directed toward the development of a scalable numerical 3D model for fully nonlinear potential water waves over arbitrary depths. The model is high-order accurate, robust and efficient for large-scale problems, and support will be included...... for flexibility in the description of structures by the use of curvilinear boundary-fitted meshes. The mathematical equations for potential waves in the physical domain are transformed through σ-mapping(s) to a time-invariant boundary-fitted domain, which then becomes a basis for an efficient solution...... strategy on a time-invariant mesh. The 3D numerical model is based on a finite difference method as in the original works [LiFleming1997, BinghamZhang2007]. Full details and other aspects of an improved 3D solution can be found in [EBL08]. The new and improved approach for three...

  17. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolla, Hemanth [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Borghesi, Giulio [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-05-01

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  18. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  19. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria

    2016-01-01

    Scalability is a design principle often valued for the engineering of complex systems. Scalability is the ability of a system to change the current value of one of its specification parameters. Although targeted frameworks are available for the evaluation of scalability for specific digital systems...... re-engineering of 5 independent system modules, from the replacement of a wireless Bluetooth interface, to the revision of the ADC sample-and-hold operation could help increase system bandwidth....

  20. Performance and scalability of finite-difference and finite-element wave-propagation modeling on Intel's Xeon Phi

    NARCIS (Netherlands)

    Zhebel, E.; Minisini, S.; Kononov, A.; Mulder, W.A.

    2013-01-01

    With the rapid developments in parallel compute architectures, algorithms for seismic modeling and imaging need to be reconsidered in terms of parallelization. The aim of this paper is to compare scalability of seismic modeling algorithms: finite differences, continuous mass-lumped finite elements

  1. A conclusive scalable model for the complete actuation response for IPMC transducers

    International Nuclear Information System (INIS)

    McDaid, A J; Aw, K C; Haemmerle, E; Xie, S Q

    2010-01-01

    This paper proposes a conclusive scalable model for the complete actuation response for ionic polymer metal composites (IPMC). This single model is proven to be able to accurately predict the free displacement/velocity and force actuation at varying displacements, with up to 3 V inputs. An accurate dynamic relationship between the force and displacement has been established which can be used to predict the complete actuation response of the IPMC transducer. The model is accurate at large displacements and can also predict the response when interacting with external mechanical systems and loads. This model equips engineers with a useful design tool which enables simple mechanical design, simulation and optimization when integrating IPMC actuators into an application. The response of the IPMC is modelled in three stages: (i) a nonlinear equivalent electrical circuit to predict the current drawn, (ii) an electromechanical coupling term and (iii) a segmented mechanical beam model which includes an electrically induced torque for the polymer. Model parameters are obtained using the dynamic time response and results are presented demonstrating the correspondence between the model and experimental results over a large operating range. This newly developed model is a large step forward, aiding in the progression of IPMCs towards wide acceptance as replacements to traditional actuators
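
    The three-stage structure described above (electrical, coupling, mechanical) can be caricatured in a few lines; the single-RC electrical stage, linear coupling constant and single-segment cantilever below are drastic simplifications assumed for illustration, not the paper's nonlinear circuit or segmented beam model, and all numbers are made up.

      import numpy as np

      # Stage (i): crude equivalent circuit - a series resistance charging a capacitance (single RC).
      R, C = 50.0, 5e-3                 # ohms, farads (illustrative)
      V = 3.0                           # applied step voltage, volts
      t = np.linspace(0.0, 5.0, 500)
      charge = C * V * (1.0 - np.exp(-t / (R * C)))        # accumulated charge over time

      # Stage (ii): electromechanical coupling - induced bending moment proportional to charge.
      k_em = 6e-4                       # N*m per coulomb (illustrative coupling constant)
      moment = k_em * charge

      # Stage (iii): mechanics - tip deflection of a uniform cantilever under an end moment.
      E = 2.0e8                         # Young's modulus, Pa
      I = 6.7e-15                       # second moment of area for a 10 mm x 0.2 mm strip, m^4
      L = 0.03                          # free length, m
      tip_deflection = moment * L**2 / (2.0 * E * I)

      print(f"steady-state tip deflection ~ {tip_deflection[-1] * 1000:.1f} mm")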

  2. Solar energy in buildings solved by building information modeling

    Science.gov (United States)

    Chudikova, B.; Faltejsek, M.

    2018-03-01

    Current building practice leads us to use renewable energy sources for all types of buildings. The use of solar energy is one of the alternatives that can be applied with a good ratio of space, price and resulting benefits. Building Information Modelling is a modern and effective way of dealing with buildings with regard to all aspects of the life cycle. The basis is careful planning and simulation in the pre-investment phase, where it is possible to determine the effective result and influence the lifetime of the building and the cost of its operation. By simulating, analysing and inserting a building model into its future environment, where climate conditions and surrounding buildings play a role, it is possible to predict the usability of solar energy and establish an ideal model. Solar systems also strongly affect the internal layout of buildings. Pre-investment phase analysis, with a view to future aspects, will ensure that the resulting building will be both low-energy and environmentally friendly.

  3. Integration of design applications with building models

    DEFF Research Database (Denmark)

    Eastman, C. M.; Jeng, T. S.; Chowdbury, R.

    1997-01-01

    This paper reviews various issues in the integration of applications with a building model... (Truncated.)

  4. NYU3T: teaching, technology, teamwork: a model for interprofessional education scalability and sustainability.

    Science.gov (United States)

    Djukic, Maja; Fulmer, Terry; Adams, Jennifer G; Lee, Sabrina; Triola, Marc M

    2012-09-01

    Interprofessional education is a critical precursor to effective teamwork and the collaboration of health care professionals in clinical settings. Numerous barriers have been identified that preclude scalable and sustainable interprofessional education (IPE) efforts. This article describes NYU3T: Teaching, Technology, Teamwork, a model that uses novel technologies such as Web-based learning, virtual patients, and high-fidelity simulation to overcome some of the common barriers and drive implementation of evidence-based teamwork curricula. It outlines the program's curricular components, implementation strategy, evaluation methods, and lessons learned from the first year of delivery and describes implications for future large-scale IPE initiatives. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Working towards a scalable model of problem-based learning instruction in undergraduate engineering education

    Science.gov (United States)

    Mantri, Archana

    2014-05-01

    The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing the curriculum around a set of open-ended problems (OEPs). The detailed statistical analysis of the data collected to measure the effects of traditional and PBL instruction for three courses in Electronics and Communication Engineering, namely Analog Electronics, Digital Electronics and Pulse, Digital & Switching Circuits, is presented here. It measures the effects of pedagogy, gender and cognitive styles on the knowledge, skill and attitude of the students. The study was conducted twice with content designed around the same set of OEPs but with two different trained facilitators for all three courses. The repeatability of results for the effects of the independent parameters on the dependent parameters is studied and inferences are drawn.

  6. Irregular Shaped Building Design Optimization with Building Information Modelling

    Directory of Open Access Journals (Sweden)

    Lee Xia Sheng

    2016-01-01

    Full Text Available This research is to recognise the function of Building Information Modelling (BIM) in design optimization for irregular shaped buildings. The study focuses on a conceptual irregular shaped “twisted” building design similar to some existing sculpture-like architectures. Form and function are the two most important aspects of new buildings, which are becoming more sophisticated as parts of equally sophisticated “systems” that we are living in. Nowadays, it is common to have irregular shaped or sculpture-like buildings which are very different when compared to regular buildings. Construction industry stakeholders are facing stiff challenges in many aspects such as buildability, cost effectiveness, delivery time and facility management when dealing with irregular shaped building projects. Building Information Modelling (BIM) is being utilized to enable architects, engineers and constructors to gain improved visualization for irregular shaped buildings; this has a purpose of identifying critical issues before initiating physical construction work. In this study, three variations of design options differing in rotating angle: 30 degrees, 60 degrees and 90 degrees are created to conduct quantifiable comparisons. Discussions are focused on three major aspects including structural planning, usable building space, and structural constructability. This research concludes that Building Information Modelling is instrumental in facilitating design optimization for irregular shaped building. In the process of comparing different design variations, instead of just giving “yes or no” type of response, stakeholders can now easily visualize, evaluate and decide to achieve the right balance based on their own criteria. Therefore, construction project stakeholders are empowered with superior evaluation and decision making capability.

  7. Energy modelling and capacity building

    International Nuclear Information System (INIS)

    2005-01-01

    The Planning and Economic Studies Section of the IAEA's Department of Nuclear Energy is focusing on building analytical capacity in MS for energy-environmental-economic assessments and for the elaboration of sustainable energy strategies. It offers a variety of analytical models specifically designed for use in developing countries for (i) evaluating alternative energy strategies; (ii) assessing environmental, economic and financial impacts of energy options; (iii) assessing infrastructure needs; (iv) evaluating regional development possibilities and energy trade; (v) assessing the role of nuclear power in addressing priority issues (climate change, energy security, etc.). These models can be used for analysing energy or electricity systems, and to assess possible implications of different energy, environmental or financial policies that affect the energy sector and energy systems. The models vary in complexity and data requirements, and so can be adapted to the available data, statistics and analytical needs of different countries. These models are constantly updated to reflect changes in the real world and in the concerns that drive energy system choices. They can provide thoughtfully informed choices for policy makers over a broader range of circumstances and interests. For example, they can readily reflect the workings of competitive energy and electricity markets, and cover such topics as external costs. The IAEA further offers training in the use of these models and, just as important, in the interpretation and critical evaluation of results. Training of national teams to develop national competence over the full spectrum of models is a high priority. The IAEA maintains a broad spectrum of databanks relevant to energy, economic and environmental analysis in MS, and makes these data available to analysts in MS for use in their own analytical work. The Reference Technology Data Base (RTDB) and the Reference Data Series (RDS-1) are the major vehicles by which we

  8. Scalable devices

    KAUST Repository

    Krüger, Jens J.; Hadwiger, Markus

    2014-01-01

    In computer science in general and in particular the field of high performance computing and supercomputing the term scalable plays an important role. It indicates that a piece of hardware, a concept, an algorithm, or an entire system scales

  9. Scalability of Semi-Implicit Time Integrators for Nonhydrostatic Galerkin-based Atmospheric Models on Large Scale Cluster

    Science.gov (United States)

    2011-01-01

    ...present performance statistics to explain the scalability behavior. Keywords: atmospheric models, time integrators, MPI, scalability, performance. ...across inter-element boundaries. Basis functions are constructed as tensor products of Lagrange polynomials, ψ_i(x) = h_α(ξ) ⊗ h_β(η) ⊗ h_γ(ζ), where h_α...
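
    The tensor-product construction quoted above can be made concrete with a short sketch; the reference nodes and polynomial degree below are illustrative choices, not those of the model in the record.

      import numpy as np

      def lagrange_basis_1d(nodes, alpha, x):
          """Evaluate the 1D Lagrange polynomial h_alpha(x) for the given nodes."""
          h = 1.0
          for j, xj in enumerate(nodes):
              if j != alpha:
                  h *= (x - xj) / (nodes[alpha] - xj)
          return h

      def tensor_basis_3d(nodes, alpha, beta, gamma, xi, eta, zeta):
          """psi_{alpha,beta,gamma}(xi, eta, zeta) = h_alpha(xi) * h_beta(eta) * h_gamma(zeta)."""
          return (lagrange_basis_1d(nodes, alpha, xi)
                  * lagrange_basis_1d(nodes, beta, eta)
                  * lagrange_basis_1d(nodes, gamma, zeta))

      # Example: degree-2 basis on equispaced reference nodes in [-1, 1] (illustrative).
      nodes = np.linspace(-1.0, 1.0, 3)
      print(tensor_basis_3d(nodes, 1, 0, 2, 0.0, -1.0, 1.0))  # equals 1 at its own node
      print(tensor_basis_3d(nodes, 1, 0, 2, 0.3, -0.5, 0.2))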

  10. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  11. Virtual building environments (VBE) - Applying information modeling to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  12. Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

    Science.gov (United States)

    Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth

    2017-12-01

    The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GPs modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
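
    To make the covariance structure concrete, here is a naive O(N^3) Gaussian-process log-likelihood with a single damped-oscillator-style term (an exponential times sines and cosines) evaluated with plain NumPy; the paper's contribution is an O(N) algorithm for exactly this kernel family, which this brute-force sketch deliberately does not implement. The kernel parameters and data are made up.

      import numpy as np

      def damped_oscillator_kernel(tau, a, b, c, d):
          """Covariance k(tau) = exp(-c|tau|) * (a cos(d|tau|) + b sin(d|tau|)) - one mixture term."""
          at = np.abs(tau)
          return np.exp(-c * at) * (a * np.cos(d * at) + b * np.sin(d * at))

      def gp_log_likelihood(t, y, yerr, a, b, c, d):
          """Naive O(N^3) GP log-likelihood; the paper's method evaluates this in O(N)."""
          tau = t[:, None] - t[None, :]
          K = damped_oscillator_kernel(tau, a, b, c, d) + np.diag(yerr**2)
          L = np.linalg.cholesky(K)
          alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
          logdet = 2.0 * np.sum(np.log(np.diag(L)))
          return -0.5 * (y @ alpha + logdet + len(y) * np.log(2 * np.pi))

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0, 10, 200))          # unevenly spaced times, as in real light curves
      y = np.sin(2 * np.pi * t / 3.0) + 0.1 * rng.normal(size=t.size)
      yerr = np.full_like(t, 0.1)
      print(gp_log_likelihood(t, y, yerr, a=1.0, b=0.1, c=0.5, d=2 * np.pi / 3.0))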

  13. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Don W.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of approximately 10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with approximately 10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  14. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    Full Text Available The article provides a comprehensive review of the recently accelerated development of information technology within the project market, such as industrial, engineering, procurement and construction. The author's aim is to cover the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has also become one of the major conditions for intensive technology development. This has created a strong impulse towards shortening project durations and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  15. Scalable devices

    KAUST Repository

    Krüger, Jens J.

    2014-01-01

    In computer science in general, and in the field of high performance computing and supercomputing in particular, the term scalable plays an important role. It indicates that a piece of hardware, a concept, an algorithm, or an entire system scales with the size of the problem, i.e., it is not limited to a very specific setting but is applicable to a wide range of problems, from small scenarios to possibly very large settings. In this spirit, there exist a number of fixed areas of research on scalability. There are works on scalable algorithms and scalable architectures, but what are scalable devices? In the context of this chapter, we are interested in a whole range of display devices, ranging from small-scale hardware such as tablet computers, pads, smart-phones etc. up to large tiled display walls. What interests us is not so much the hardware setup but rather the visualization algorithms behind these display systems that scale from your average smart phone up to the largest gigapixel display walls.

  16. Scalable approximate policies for Markov decision process models of hospital elective admissions.

    Science.gov (United States)

    Zhu, George; Lizotte, Dan; Hoey, Jesse

    2014-05-01

    To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that given an initial start state, generate an action on-demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow for the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate solutions that are near-optimal in about 100s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
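
    The sample-based, on-demand planning idea described above can be sketched in a few lines: estimate the value of each action from the current state by simulating random rollouts of an admissions model and pick the best one. The toy state, action set, transition model and rewards below are entirely invented and are not the authors' elective admissions MDP.

    ```python
    # Toy sketch of sample-based (simulation-driven, on-demand) MDP planning in
    # the spirit described above; the admissions model here is entirely invented.
    import random

    CAPACITY = 10          # beds for a single resource (assumption)
    ACTIONS = [0, 1, 2]    # number of elective patients to admit today

    def step(occupied, action, rng):
        """Sample one day: admissions arrive, some patients are discharged."""
        admitted = min(action, CAPACITY - occupied)
        emergencies = rng.randint(0, 2)                      # random emergency load
        occupied = min(CAPACITY, occupied + admitted + emergencies)
        discharged = sum(rng.random() < 0.2 for _ in range(occupied))
        occupied -= discharged
        reward = admitted - 5 * max(0, occupied - CAPACITY + 1)  # penalize overload
        return occupied, reward

    def rollout_value(state, action, rng, depth=20, rollouts=200, gamma=0.95):
        """Estimate Q(state, action) by averaging random rollouts."""
        total = 0.0
        for _ in range(rollouts):
            s, ret, discount, a = state, 0.0, 1.0, action
            for _ in range(depth):
                s, r = step(s, a, rng)
                ret += discount * r
                discount *= gamma
                a = rng.choice(ACTIONS)                      # random rollout policy
            total += ret
        return total / rollouts

    def plan(state, rng):
        """Generate an action on demand for the current state only."""
        return max(ACTIONS, key=lambda a: rollout_value(state, a, rng))

    rng = random.Random(1)
    print(plan(6, rng))
    ```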

  17. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    2010-11-01

    Full Text Available Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion-channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and, as conformational changes, internal dynamics. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.

  18. Scalable Frequent Subgraph Mining

    KAUST Repository

    Abdelhamid, Ehab

    2017-06-19

    A graph is a data structure that contains a set of nodes and a set of edges connecting these nodes. Nodes represent objects while edges model relationships among these objects. Graphs are used in various domains due to their ability to model complex relations among several objects. Given an input graph, the Frequent Subgraph Mining (FSM) task finds all subgraphs with frequencies exceeding a given threshold. FSM is crucial for graph analysis, and it is an essential building block in a variety of applications, such as graph clustering and indexing. FSM is computationally expensive, and its existing solutions are extremely slow. Consequently, these solutions are incapable of mining modern large graphs. This slowness is caused by the underlying approaches of these solutions which require finding and storing an excessive amount of subgraph matches. This dissertation proposes a scalable solution for FSM that avoids the limitations of previous work. This solution is composed of four components. The first component is a single-threaded technique which, for each candidate subgraph, needs to find only a minimal number of matches. The second component is a scalable parallel FSM technique that utilizes a novel two-phase approach. The first phase quickly builds an approximate search space, which is then used by the second phase to optimize and balance the workload of the FSM task. The third component focuses on accelerating frequency evaluation, which is a critical step in FSM. To do so, a machine learning model is employed to predict the type of each graph node, and accordingly, an optimized method is selected to evaluate that node. The fourth component focuses on mining dynamic graphs, such as social networks. To this end, an incremental index is maintained during the dynamic updates. Only this index is processed and updated for the majority of graph updates. Consequently, search space is significantly pruned and efficiency is improved. The empirical evaluation shows that the
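
    As a rough illustration of the frequency-threshold idea underlying FSM, the sketch below mines only single-edge patterns from a small labeled graph and keeps those whose match count reaches a support threshold; a full FSM system such as the one described above grows these seeds into larger subgraphs and uses a more careful support measure. The graph and threshold are invented.

    ```python
    # Minimal sketch of the first level of frequent subgraph mining on a single
    # labeled graph: count single-edge patterns and keep those meeting a support
    # threshold. The graph and threshold below are invented.
    from collections import defaultdict

    node_labels = {0: "A", 1: "B", 2: "A", 3: "B", 4: "A", 5: "C"}
    edges = [(0, 1), (2, 1), (2, 3), (4, 3), (4, 5)]
    min_support = 2

    def single_edge_patterns(node_labels, edges):
        """Group undirected edges by the (sorted) pair of endpoint labels."""
        matches = defaultdict(list)
        for u, v in edges:
            key = tuple(sorted((node_labels[u], node_labels[v])))
            matches[key].append((u, v))
        return matches

    frequent = {p: m for p, m in single_edge_patterns(node_labels, edges).items()
                if len(m) >= min_support}
    for pattern, occurrences in frequent.items():
        print(pattern, occurrences)
    # ('A', 'B') occurs 4 times and survives; ('A', 'C') occurs once and is pruned.
    ```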

  19. Scalable Nonlinear Solvers for Fully Implicit Coupled Nuclear Fuel Modeling. Final Report

    International Nuclear Information System (INIS)

    Cai, Xiao-Chuan; Yang, Chao; Pernice, Michael

    2014-01-01

    The focus of the project is on the development and customization of some highly scalable domain decomposition based preconditioning techniques for the numerical solution of nonlinear, coupled systems of partial differential equations (PDEs) arising from nuclear fuel simulations. These high-order PDEs represent multiple interacting physical fields (for example, heat conduction, oxygen transport, solid deformation), each modeled by a certain type of Cahn-Hilliard and/or Allen-Cahn equations. Most existing approaches involve a careful splitting of the fields and the use of field-by-field iterations to obtain a solution of the coupled problem. Such approaches have many advantages such as ease of implementation since only single field solvers are needed, but also exhibit disadvantages. For example, certain nonlinear interactions between the fields may not be fully captured, and for unsteady problems, stable time integration schemes are difficult to design. In addition, when implemented on large-scale parallel computers, the sequential nature of the field-by-field iterations substantially reduces the parallel efficiency. To overcome the disadvantages, fully coupled approaches have been investigated in order to obtain full physics simulations.
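
    The contrast drawn above between field-by-field iteration and a fully coupled solve can be illustrated on a toy pair of equations: treat both fields as one unknown vector and hand the coupled residual to a single Newton-type solver. The two residual equations below are invented, and scipy's dense solver stands in for the Newton-Krylov plus domain-decomposition preconditioning used at scale.

    ```python
    # Sketch of the "fully coupled" idea: solve both fields' equations in one
    # Newton iteration on the concatenated unknown vector, instead of iterating
    # field by field. The two toy residual equations below are invented.
    import numpy as np
    from scipy import optimize

    def residual(x):
        """Coupled residuals F(u, v) = 0 for two scalar 'fields' u and v."""
        u, v = x
        return [u**3 + v - 2.0,      # field 1 depends on field 2
                u + 2.0 * v - 1.0]   # field 2 depends on field 1

    # scipy's hybrid Newton solver treats the coupled system monolithically;
    # large-scale codes replace this with Newton-Krylov plus preconditioning.
    solution = optimize.root(residual, x0=[1.0, 1.0], method="hybr")
    u, v = solution.x
    print(u, v, residual([u, v]))
    ```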

  20. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

    A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product. [Dutch] In 14 artikelen worden diverse aspecten m.b.t. het Building Information Model (BIM) belicht. De essentie van BIM is het vastleggen van informatie over het bouwproces en het bouwproduct.

  1. Salvus: A scalable software suite for full-waveform modelling & inversion

    Science.gov (United States)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; Fichtner, A.

    2017-12-01

    Full-waveform inversion (FWI), whether at the lab, exploration, or planetary scale, requires the cooperation of five principal components. (1) The geometry of the domain needs to be properly discretized and an initial guess of the model parameters must be projected onto it; (2) Large volumes of recorded waveform data must be collected, organized, and processed; (3) Synthetic waveform data must be efficiently and accurately computed through complex domains; (4) Suitable misfit functions and optimization techniques must be used to relate discrepancies in data space to perturbations in the model; and (5) Some form of workflow management must be employed to schedule and run (1) - (4) in the correct order. Each one of these components can represent a formidable technical challenge which redirects energy from the true task at hand: using FWI to extract new information about some underlying continuum. In this presentation we give an overview of the current status of the Salvus software suite, which was introduced to address the challenges listed above. Specifically, we touch on (1) salvus_mesher, which eases the discretization of complex Earth models into hexahedral meshes; (2) salvus_seismo, which integrates with LASIF and ObsPy to streamline the processing and preparation of seismic data; (3) salvus_wave, a high-performance and scalable spectral-element solver capable of simulating waveforms through general unstructured 2- and 3-D domains, and (4) salvus_opt, an optimization toolbox specifically designed for full-waveform inverse problems. Tying everything together, we also discuss (5) salvus_flow: a workflow package designed to orchestrate and manage the rest of the suite. It is our hope that these developments represent a step towards the automation of large-scale seismic waveform inversion, while also lowering the barrier of entry for new applications. We include several examples of Salvus' use in (extra-) planetary seismology, non-destructive testing, and medical

  2. A scalable and deformable stylized model of the adult human eye for radiation dose assessment.

    Science.gov (United States)

    El Basha, Daniel; Furuta, Takuya; Iyer, Siva S R; Bolch, Wesley E

    2018-03-23

    With recent changes in the recommended annual limit on eye lens exposures to ionizing radiation, there is considerable interest in predictive computational dosimetry models of the human eye and its various ocular structures including the crystalline lens, ciliary body, cornea, retina, optic nerve, and central retinal artery. Computational eye models to date have been constructed as stylized models, high-resolution voxel models, and polygon mesh models. Their common feature, however, is that they are typically constructed of nominal size and of a roughly spherical shape associated with the emmetropic eye. In this study, we present a geometric eye model that is both scalable (allowing for changes in eye size) and deformable (allowing for changes in eye shape), and that is suitable for use in radiation transport studies of ocular exposures and radiation treatments of eye disease. The model allows continuous and variable changes in eye size (axial lengths from 20 to 26 mm) and eye shape (diopters from -12 to +6). As an explanatory example of its use, five models (emmetropic eyes of small, average, and large size, as well as average size eyes of -12D and +6D) were constructed and subjected to normally incident beams of monoenergetic electrons and photons, with resultant energy-dependent dose coefficients presented for both anterior and posterior eye structures. Electron dose coefficients were found to vary with changes to both eye size and shape for the posterior eye structures, while their values for the eye crystalline lens were found to be sensitive to changes in only eye size. No dependence upon eye size or eye shape was found for photon dose coefficients at energies below 2 MeV. Future applications of the model can include more extensive tabulations of dose coefficients to all ocular structures (not only the lens) as a function of eye size and shape, as well as the assessment of x-ray therapies for ocular disease for patients with non-emmetropic eyes. © 2018

  3. A scalable and deformable stylized model of the adult human eye for radiation dose assessment

    Science.gov (United States)

    El Basha, Daniel; Furuta, Takuya; Iyer, Siva S. R.; Bolch, Wesley E.

    2018-05-01

    With recent changes in the recommended annual limit on eye lens exposures to ionizing radiation, there is considerable interest in predictive computational dosimetry models of the human eye and its various ocular structures including the crystalline lens, ciliary body, cornea, retina, optic nerve, and central retinal artery. Computational eye models to date have been constructed as stylized models, high-resolution voxel models, and polygon mesh models. Their common feature, however, is that they are typically constructed of nominal size and of a roughly spherical shape associated with the emmetropic eye. In this study, we present a geometric eye model that is both scalable (allowing for changes in eye size) and deformable (allowing for changes in eye shape), and that is suitable for use in radiation transport studies of ocular exposures and radiation treatments of eye disease. The model allows continuous and variable changes in eye size (axial lengths from 20 to 26 mm) and eye shape (diopters from  ‑12 to  +6). As an explanatory example of its use, five models (emmetropic eyes of small, average, and large size, as well as average size eyes of  ‑12D and  +6D) were constructed and subjected to normally incident beams of monoenergetic electrons and photons, with resultant energy-dependent dose coefficients presented for both anterior and posterior eye structures. Electron dose coefficients were found to vary with changes to both eye size and shape for the posterior eye structures, while their values for the crystalline lens were found to be sensitive to changes in only eye size. No dependence upon eye size or eye shape was found for photon dose coefficients at energies below 2 MeV. Future applications of the model can include more extensive tabulations of dose coefficients to all ocular structures (not only the lens) as a function of eye size and shape, as well as the assessment of x-ray therapies for ocular disease for patients with non

  4. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when
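
    Purely as an illustration of parameterized, trace-free workload generation (not of Aspen or CODES themselves), the sketch below emits alternating computation and communication events for a simple ring-exchange pattern; every parameter is an invented placeholder.

    ```python
    # Illustrative sketch only: a tiny parameterized workload generator that
    # emits alternating computation and communication events, the general
    # flavor of model-driven workload generation described above.
    import random

    def generate_workload(ranks, iterations, flops_per_iter, bytes_per_msg, seed=0):
        """Yield (rank, event_type, payload) tuples for a ring-exchange pattern."""
        rng = random.Random(seed)
        for it in range(iterations):
            for rank in range(ranks):
                compute = flops_per_iter * rng.uniform(0.9, 1.1)   # mild imbalance
                yield (rank, "compute", {"flops": compute, "iteration": it})
                dest = (rank + 1) % ranks
                yield (rank, "send", {"dest": dest, "bytes": bytes_per_msg})

    for event in generate_workload(ranks=4, iterations=2,
                                   flops_per_iter=1e9, bytes_per_msg=65536):
        print(event)
    ```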

  5. Modeling of shear wall buildings

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, A K [North Carolina State Univ., Raleigh (USA). Dept. of Civil Engineering

    1984-05-01

    Many nuclear power plant buildings, for example, the auxiliary building, have reinforced concrete shear walls as the primary lateral load resisting system. Typically, these walls have a low height-to-length ratio, often less than unity. Such walls exhibit a marked shear lag phenomenon which would affect their bending stiffness and the overall stress distribution in the building. The deformation and the stress distribution in walls have been studied using a formulation which is applicable to both short and tall buildings. The behavior of the wall is divided into two parts: the symmetric flange action and the antisymmetric web action. The latter has two parts: the web shear and the web bending. Appropriate stiffness equations have been derived for all three actions. These actions can be synthesized to solve any nonlinear cross-section. Two specific problems, that of lateral and torsional loadings of a rectangular box, have been studied. It is found that in short buildings shear lag plays a very important role. Any beam type formulation which either ignores shear lag or includes it in an idealized form is likely to lead to erroneous results. On the other hand, a rigidity-type approach with some modifications to the standard procedures would yield nearly accurate answers.

  6. Modeling, Fabrication and Characterization of Scalable Electroless Gold Plated Nanostructures for Enhanced Surface Plasmon Resonance

    Science.gov (United States)

    Jang, Gyoung Gug

    The scientific and industrial demand for controllable thin gold (Au) film and Au nanostructures is increasing in many fields including opto-electronics, photovoltaics, MEMS devices, diagnostics, bio-molecular sensors, spectro-/microscopic surfaces and probes. In this study, a novel continuous flow electroless (CF-EL) Au plating method is developed to fabricate uniform Au thin films under ambient conditions. The enhanced local mass transfer rate and continuous deposition resulting from CF-EL plating improved the physical uniformity of deposited Au films and thermally transformed nanoparticles (NPs). Au films and NPs exhibited improved optical photoluminescence (PL) and surface plasmon resonance (SPR), respectively, relative to batch immersion EL (BI-EL) plating. Suggested mass transfer models of Au mole deposition are consistent with the optical features of CF-EL and BI-EL films. The prototype CF-EL plating system was upgraded to an automated scalable CF-EL plating system with real-time transmission UV-vis (T-UV) spectroscopy, which provides the advantages of CF-EL plating, such as more uniform surface morphology, and overcomes the disadvantages of conventional EL plating, such as the lack of a continuous process and a low deposition rate, by offering a continuous process and a controllable deposition rate. Throughout this work, dynamic morphological and chemical transitions during redox-driven self-assembly of Ag and Au film on silica surfaces under kinetic and equilibrium conditions are distinguished by correlating real-time T-UV spectroscopy with X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy (SEM) measurements. The characterization suggests that four previously unrecognized time-dependent physicochemical regimes occur during consecutive EL deposition of silver (Ag) and Au onto tin-sensitized silica surfaces: self-limiting Ag activation; transitory Ag NP formation; transitional Au-Ag alloy formation during galvanic replacement of Ag by Au; and uniform morphology formation under

  7. Non-commutative standard model: model building

    CERN Document Server

    Chaichian, Masud; Presnajder, P

    2003-01-01

    A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U_*(n)) gauge theory we cannot have non-commutative SU(n) and (2) the charges in non-commutative QED are quantized to just 0, ±1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U_*(3) x U_*(2) x U_*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  8. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  9. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  10. Scalable coherent interface

    International Nuclear Information System (INIS)

    Alnaes, K.; Kristiansen, E.H.; Gustavson, D.B.; James, D.V.

    1990-01-01

    The Scalable Coherent Interface (IEEE P1596) is establishing an interface standard for very high performance multiprocessors, supporting a cache-coherent-memory model scalable to systems with up to 64K nodes. This Scalable Coherent Interface (SCI) will supply a peak bandwidth per node of 1 GigaByte/second. The SCI standard should facilitate assembly of processor, memory, I/O and bus bridge cards from multiple vendors into massively parallel systems with throughput far above what is possible today. The SCI standard encompasses two levels of interface, a physical level and a logical level. The physical level specifies electrical, mechanical and thermal characteristics of connectors and cards that meet the standard. The logical level describes the address space, data transfer protocols, cache coherence mechanisms, synchronization primitives and error recovery. In this paper we address logical level issues such as packet formats, packet transmission, transaction handshake, flow control, and cache coherence. 11 refs., 10 figs

  11. A review of building information modelling

    Science.gov (United States)

    Wang, Wen; Han, Rui

    2018-05-01

    Building Information Modelling (BIM) is widely seen as a catalyst for innovation and productivity. It is becoming standard for new construction and is the most significant technology changing how we design, build, use and manage buildings. It is a dominant technological trend in the software industry and although the theoretical groundwork was laid in the previous century, it is a popular topic in academic research. BIM is discussed in this study, whose results can provide better and more comprehensive choices for building owners, designers, and developers in the future.

  12. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.
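
    One very small illustration of the model reduction idea above: represent a toy three-zone thermal network as a linear state-space system and keep only its slowest modes. The RC values and the modal-truncation choice are assumptions for the sketch, not the paper's structured reduction or clustering method.

    ```python
    # Sketch of model reduction on a toy linear thermal network: keep only the
    # slowest eigenmodes of dT/dt = A T + B u. The 3-zone RC values are invented.
    import numpy as np

    C = np.array([5e5, 5e5, 1e6])            # zone heat capacities [J/K]
    K = np.array([[-3.0,  1.0,  1.0],        # inter-zone conductance pattern,
                  [ 1.0, -2.0,  1.0],        # scaled to [W/K] below
                  [ 1.0,  1.0, -2.5]]) * 50.0
    A = K / C[:, None]                       # state matrix of dT/dt = A T + B u
    B = np.diag(1.0 / C)                     # heat inputs to each zone

    eigvals, eigvecs = np.linalg.eig(A)
    order = np.argsort(np.abs(eigvals))      # slowest (dominant) modes first
    keep = order[:2]                         # reduce 3 states -> 2 modes

    V = eigvecs[:, keep]                     # projection basis
    A_red = np.linalg.pinv(V) @ A @ V
    B_red = np.linalg.pinv(V) @ B
    print(A_red.shape, np.round(eigvals[keep], 6))
    ```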

  13. Comparison of Building Energy Modeling Programs: Building Loads

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Dandan [Tsinghua Univ., Beijing (China); Hong, Tianzhen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yan, Da [Tsinghua Univ., Beijing (China); Wang, Chuang [Tsinghua Univ., Beijing (China)

    2012-06-01

    This technical report presented the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort, between Lawrence Berkeley National Laboratory, USA and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). Energy Foundation, an industrial partner of CERC-BEE, was the co-sponsor of this study work. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on in-depth understandings of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China’s building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to

  14. Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.

    Science.gov (United States)

    Maani, Ehsan; Katsaggelos, Aggelos K

    2009-09-01

    The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.
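
    A toy sketch of the prioritization idea above: rank NAL units by estimated distortion reduction per bit, assign quality layers in that order, and extract a substream under a bit budget. The unit sizes and distortion values are invented, and the paper's drift-aware distortion estimation is not reproduced.

    ```python
    # Toy sketch of distortion-aware quality-layer assignment and extraction;
    # all sizes and distortion-reduction values below are invented.
    nal_units = [
        {"id": "B0/L1", "bits": 12000, "dist_reduction": 40.0},
        {"id": "B0/L2", "bits": 18000, "dist_reduction": 25.0},
        {"id": "B1/L1", "bits": 10000, "dist_reduction": 35.0},
        {"id": "B1/L2", "bits": 20000, "dist_reduction": 15.0},
    ]

    # Rank by benefit per bit and map rank to a quality-layer id (0 = most useful).
    ranked = sorted(nal_units, key=lambda u: u["dist_reduction"] / u["bits"],
                    reverse=True)
    for layer, unit in enumerate(ranked):
        unit["quality_layer"] = layer

    def extract(units, bit_budget):
        """Keep units in quality-layer order until the budget is exhausted."""
        kept, used = [], 0
        for u in sorted(units, key=lambda u: u["quality_layer"]):
            if used + u["bits"] <= bit_budget:
                kept.append(u["id"])
                used += u["bits"]
        return kept, used

    print(extract(nal_units, bit_budget=30000))
    ```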

  15. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  16. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    Science.gov (United States)

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  17. Building Information Modelling in Denmark and Iceland

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Jóhannesson, Elvar Ingi

    2013-01-01

    with BIM is studied. Based on findings from both parts, ideas and recommendations are put forward for the Icelandic building industry about feasible ways of implementing BIM. Findings – Among the results are that the use of BIM is very limited in the Icelandic companies compared to the other Nordic countries. Research limitations/implications – The research is limited to the Nordic countries in Europe, but many recommendations could be relevant to other countries. Practical implications – It is recommended to the Icelandic building authorities to get into cooperation with their Nordic counterparts for making standards and guidelines related to BIM. Public building clients are also encouraged to consider initiating projects based on making simple building models of existing buildings in order to introduce the BIM technology to the industry. Icelandic companies are recommended to start implementing BIM...

  18. Economic aspects and models for building codes

    DEFF Research Database (Denmark)

    Bonke, Jens; Pedersen, Dan Ove; Johnsen, Kjeld

    It is the purpose of this bulletin to present an economic model for estimating the consequence of new or changed building codes. The object is to allow comparative analysis in order to improve the basis for decisions in this field. The model is applied in a case study.

  19. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  20. Build Your Own Payment Model.

    Science.gov (United States)

    Berlin, Joey

    2017-07-01

    Physicians participating in MACRA have a unique opportunity to create and submit their own alternative payment models to the government and take command of their own future payments. At least one Texas physician is taking a crack at developing his own model.

  1. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
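
    The modelling step named above can be sketched with a standard logistic regression; the building attributes, synthetic labels and coefficients below are invented placeholders rather than the study's actual covariates or data.

    ```python
    # Minimal sketch: fit a logistic regression predicting fire occurrence from
    # building attributes and map each building to a probability. All features,
    # coefficients and data are invented placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 500
    X = np.column_stack([
        rng.integers(1, 12, n),        # number of storeys
        rng.uniform(50, 5000, n),      # floor area [m2]
        rng.integers(0, 2, n),         # wooden construction flag
    ])
    # Synthetic labels: larger, wooden buildings burn more often in this toy data.
    logit = -4.0 + 0.0004 * X[:, 1] + 1.2 * X[:, 2]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    new_buildings = np.array([[3, 800.0, 0], [2, 2500.0, 1]])
    print(model.predict_proba(new_buildings)[:, 1])   # fire probability per building
    ```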

  2. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems

  3. Component-Based Modelling for Scalable Smart City Systems Interoperability: A Case Study on Integrating Energy Demand Response Systems.

    Science.gov (United States)

    Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan

    2016-10-28

    Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support the design and deployment of the integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system, assuring by design the scalability, interoperability and correctness of component cooperation.

  4. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years, it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolves in a direction close to DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  5. Minimalism in Inflation Model Building

    CERN Document Server

    Dvali, Gia; Dvali, Gia; Riotto, Antonio

    1998-01-01

    In this paper we demand that a successful inflationary scenario should follow from a model entirely motivated by particle physics considerations. We show that such a connection is indeed possible within the framework of concrete supersymmetric Grand Unified Theories where the doublet-triplet splitting problem is naturally solved. The Fayet-Iliopoulos D-term of a gauge $U(1)_{\xi}$ symmetry, which plays a crucial role in the solution of the doublet-triplet splitting problem, simultaneously provides a built-in inflationary slope protected from dangerous supergravity corrections.

  6. Minimalism in inflation model building

    Science.gov (United States)

    Dvali, Gia; Riotto, Antonio

    1998-01-01

    In this paper we demand that a successful inflationary scenario should follow from a model entirely motivated by particle physics considerations. We show that such a connection is indeed possible within the framework of concrete supersymmetric Grand Unified Theories where the doublet-triplet splitting problem is naturally solved. The Fayet-Iliopoulos D-term of a gauge U(1)ξ symmetry, which plays a crucial role in the solution of the doublet-triplet splitting problem, simultaneously provides a built-in inflationary slope protected from dangerous supergravity corrections.

  7. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  8. Model for Refurbishment of Heritage Buildings

    DEFF Research Database (Denmark)

    Rasmussen, Torben Valdbjørn

    2014-01-01

    ...with the requirements for the use of the building. The model focuses on the cooperation and dialogue between authorities and owners, who refurbish heritage buildings. The developed model was used for the refurbishment of the listed complex, Fæstningens Materialgård. Fæstningens Materialgård is a case study where the Heritage Agency, the Danish Working Environment Authority and the owner as a team cooperated in identifying feasible refurbishments. In this case, the focus centered on restoring and identifying potential energy savings and deciding on energy upgrading measures for the listed complex. The refurbished...

  9. Heterotic model building: 16 special manifolds

    International Nuclear Information System (INIS)

    He, Yang-Hui; Lee, Seung-Joo; Lukas, Andre; Sun, Chuang

    2014-01-01

    We study heterotic model building on 16 specific Calabi-Yau manifolds constructed as hypersurfaces in toric four-folds. These 16 manifolds are the only ones among the more than half a billion manifolds in the Kreuzer-Skarke list with a non-trivial first fundamental group. We classify the line bundle models on these manifolds, both for SU(5) and SO(10) GUTs, which lead to consistent supersymmetric string vacua and have three chiral families. A total of about 29000 models is found, most of them corresponding to SO(10) GUTs. These models constitute a starting point for detailed heterotic model building on Calabi-Yau manifolds in the Kreuzer-Skarke list. The data for these models can be downloaded from http://www-thphys.physics.ox.ac.uk/projects/CalabiYau/toricdata/index.html.

  10. Modeling of Dynamic Responses in Building Insulation

    Directory of Open Access Journals (Sweden)

    Anna Antonyová

    2015-10-01

    Full Text Available In this research a measurement system was developed for monitoring humidity and temperature in the cavity between the wall and the insulating material in the building envelope. This new technology does not disturb the insulating material during testing. The measurement system can also be applied to insulation fixed ten or twenty years earlier and sufficiently reveals the quality of the insulation. A mathematical model is proposed to characterize the dynamic responses in the cavity between the wall and the building insulation as influenced by weather conditions. These dynamic responses are manifested as a delay of both humidity and temperature changes in the cavity when compared with the changes in the ambient surrounding of the building. The process is then modeled through numerical methods and statistical analysis of the experimental data obtained using the new system of measurement.
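
    One simple way to express the delayed cavity response described above is a first-order lag driven by the ambient temperature; the sketch below uses that form with an assumed time constant and forcing, and it is not the specific model proposed in the article.

    ```python
    # Hedged illustration: a first-order lag as one simple expression of the
    # delayed cavity response; the time constant and forcing are invented.
    import numpy as np

    dt = 600.0                 # time step [s]
    tau = 6 * 3600.0           # assumed cavity time constant [s]
    hours = np.arange(0, 48, dt / 3600.0)
    ambient_T = 10.0 + 8.0 * np.sin(2 * np.pi * hours / 24.0)   # daily cycle [degC]

    cavity_T = np.empty_like(ambient_T)
    cavity_T[0] = ambient_T[0]
    for k in range(1, len(hours)):
        # dT_cavity/dt = (T_ambient - T_cavity) / tau, explicit Euler step
        cavity_T[k] = cavity_T[k-1] + dt / tau * (ambient_T[k-1] - cavity_T[k-1])

    # Compare peak times over the second day only, after the start-up transient.
    day2 = hours >= 24
    lag_steps = np.argmax(cavity_T[day2]) - np.argmax(ambient_T[day2])
    print(f"peak delay ~ {lag_steps * dt / 3600.0:.1f} h")
    ```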

  11. A procedure for Building Product Models

    DEFF Research Database (Denmark)

    Hvam, Lars

    1999-01-01

    The application of product modeling in manufacturing companies raises the important question of how to model product knowledge in a comprehensible and efficient way. An important challenge is to qualify engineers to model and specify IT-systems (product models) to support their specification activities. A basic assumption is that engineers have to take the responsibility for building product models to be used in their domain. To do that they must be able to carry out the modeling task on their own without any need for support from computer science experts. This paper presents a set of simple, easily adaptable concepts and methods from data modeling (object oriented analysis) and domain modeling (product modeling). The concepts are general and can be used for modeling all types of specifications in the different phases in the product life cycle. The modeling techniques presented have been...

  12. U.S. Department of Energy Commercial Reference Building Models of the National Building Stock

    Energy Technology Data Exchange (ETDEWEB)

    Deru, M.; Field, K.; Studer, D.; Benne, K.; Griffith, B.; Torcellini, P.; Liu, B.; Halverson, M.; Winiarski, D.; Rosenberg, M.; Yazdanian, M.; Huang, J.; Crawley, D.

    2011-02-01

    The U.S. Department of Energy (DOE) Building Technologies Program has set the aggressive goal of producing marketable net-zero energy buildings by 2025. This goal will require collaboration between the DOE laboratories and the building industry. We developed standard or reference energy models for the most common commercial buildings to serve as starting points for energy efficiency research. These models represent fairly realistic buildings and typical construction practices. Fifteen commercial building types and one multifamily residential building were determined by consensus between DOE, the National Renewable Energy Laboratory, Pacific Northwest National Laboratory, and Lawrence Berkeley National Laboratory, and represent approximately two-thirds of the commercial building stock.

  13. Aspects of superstring model-building

    International Nuclear Information System (INIS)

    Ellis, J.

    1989-01-01

    Several approaches to model-building with strings are discussed, including Calabi-Yau manifolds and fermionic formulations of strings directly in four dimensions. Ideas about supersymmetry breaking are reviewed. Flipped SU(5)xU(1) is touted as the theory of everything below the Planck scale (perhaps). (author). 64 refs, 7 figs

  14. Modelling of settlement induced building damage

    NARCIS (Netherlands)

    Giardina, G.

    2013-01-01

    This thesis focuses on the modelling of settlement induced damage to masonry buildings. In densely populated areas, the need for new space is nowadays producing a rapid increment of underground excavations. Due to the construction of new metro lines, tunnelling activity in urban areas is growing.

  15. Models for map building and navigation

    International Nuclear Information System (INIS)

    Penna, M.A.; Jian Wu

    1993-01-01

    In this paper the authors present several models for solving map building and navigation problems. These models are motivated by biological processes, and presented in the context of artificial neural networks. Since the nodes, weights, and threshold functions of the models all have physical meanings, they can easily predict network topologies and avoid traditional trial-and-error training. On one hand, this makes their models useful in constructing solutions to engineering problems (problems such as those that occur in robotics, for example). On the other hand, this might also contribute to the ability of their models to explain some biological processes, few of which are completely understood at this time

  16. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Mustafaraj, Giorgio; Marini, Dashamir; Costa, Andrea; Keane, Marcus

    2014-01-01

    Highlights: • Developing a 3D model relating to building architecture, occupancy and HVAC operation. • Two calibration stages developed, final model providing accurate results. • Using an onsite weather station for generating the weather data file in EnergyPlus. • Predicting thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy saving opportunities related to the heat pump of 20–27% were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building areas. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, the calibration methodology, which consists of two levels, was then applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file for the year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Coefficient of Variation of the Root Mean Squared Error (CV(RMSE)) on an hourly basis for heat pump electricity consumption varied within the following ranges: MBE (hourly) from −5.6% to 7.5% and CV(RMSE) (hourly) from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further possibilities of energy savings supplied by a water-to-water heat pump to the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis.
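
    The two calibration statistics quoted above are straightforward to compute from paired measured and simulated series. The sketch below is an illustration only (the hourly values and variable names are hypothetical, not data from the study) and follows commonly used ASHRAE Guideline 14 style definitions of MBE and CV(RMSE).

```python
import numpy as np

def mbe_percent(measured, simulated):
    """Mean Bias Error, as a percentage of the mean measured value."""
    measured, simulated = np.asarray(measured, float), np.asarray(simulated, float)
    return 100.0 * np.sum(measured - simulated) / (measured.size * measured.mean())

def cv_rmse_percent(measured, simulated):
    """Coefficient of Variation of the RMSE, as a percentage."""
    measured, simulated = np.asarray(measured, float), np.asarray(simulated, float)
    rmse = np.sqrt(np.mean((measured - simulated) ** 2))
    return 100.0 * rmse / measured.mean()

# Hypothetical hourly heat-pump electricity consumption (kWh) over one day.
measured  = [4.1, 3.9, 3.8, 4.0, 4.6, 5.2, 6.1, 6.4, 6.0, 5.5, 5.1, 4.8,
             4.7, 4.6, 4.8, 5.0, 5.6, 6.2, 6.5, 6.1, 5.4, 4.9, 4.4, 4.2]
simulated = [4.3, 4.0, 3.7, 4.1, 4.4, 5.0, 5.8, 6.6, 6.2, 5.3, 5.0, 4.9,
             4.5, 4.7, 4.9, 5.2, 5.4, 6.0, 6.7, 5.9, 5.2, 5.0, 4.3, 4.1]

print(f"MBE      = {mbe_percent(measured, simulated):+.1f} %")
print(f"CV(RMSE) = {cv_rmse_percent(measured, simulated):.1f} %")
```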

  17. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborate principles such as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models.

  18. Improved model for solar heating of buildings

    OpenAIRE

    Lie, Bernt

    2015-01-01

    A considerable future increase in the global energy use is expected, and the effects of energy conversion on the climate are already observed. Future energy conversion should thus be based on resources that have negligible climate effects; solar energy is perhaps the most important of such resources. The presented work builds on a previous complete model for solar heating of a house; here the aim is to introduce ventilation heat recovery and improve on the hot water storage model. Ventilation he...

  19. Reconstructing building mass models from UAV images

    KAUST Repository

    Li, Minglei

    2015-07-26

    We present an automatic reconstruction pipeline for large scale urban scenes from aerial images captured by a camera mounted on an unmanned aerial vehicle. Using state-of-the-art Structure from Motion and Multi-View Stereo algorithms, we first generate a dense point cloud from the aerial images. Based on the statistical analysis of the footprint grid of the buildings, the point cloud is classified into different categories (i.e., buildings, ground, trees, and others). Roof structures are extracted for each individual building using Markov random field optimization. Then, a contour refinement algorithm based on pivot point detection is utilized to refine the contour of patches. Finally, polygonal mesh models are extracted from the refined contours. Experiments on various scenes as well as comparisons with state-of-the-art reconstruction methods demonstrate the effectiveness and robustness of the proposed method.

  20. Thermal Models for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2012-01-01

    The Danish government has set the ambitious goal that the share of the total Danish electricity consumption covered by wind energy should be increased to 50% by year 2020. This asks for radical changes in how we utilize and transmit electricity in the future power grid. To fully utilize the high share of renewable power generation, which is in general intermittent and non-controllable, the consumption side has to be much more flexible than today. To achieve such flexibility, methods for moving power consumption in time, within the hourly timescale, have to be developed. One approach currently... ...the comfort of residents, proper prediction models for indoor temperature have to be developed. This paper presents a model for prediction of indoor temperature and power consumption from electrical space heating in an office building, using stochastic differential equations. The heat dynamic model is built...
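
    A heat-dynamic model of this kind is commonly written as a lumped resistance-capacitance (RC) model driven by a stochastic differential equation. The sketch below is a generic single-state illustration of that idea, discretised with an Euler-Maruyama step and using made-up parameter values; it is not the multi-state model identified in the paper.

```python
import numpy as np

# dT = ((Ta - T)/(R*C) + Ph/C) dt + sigma dW   -- single-state RC heat balance
R, C, sigma = 5.0e-3, 1.0e7, 0.02   # K/W, J/K, K/sqrt(h); illustrative values only
dt_h, hours = 0.1, 24.0             # time step and horizon, in hours

rng = np.random.default_rng(1)
n = int(hours / dt_h)
T = np.empty(n); T[0] = 20.0                                 # indoor temperature, deg C
Ta = 5.0 + 3.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, n))    # outdoor temperature, deg C
Ph = np.where(np.arange(n) % 100 < 50, 2000.0, 0.0)          # space heater on/off, W

for k in range(n - 1):
    drift = (Ta[k] - T[k]) / (R * C) + Ph[k] / C             # deg C per second
    T[k + 1] = T[k] + drift * dt_h * 3600.0 + sigma * np.sqrt(dt_h) * rng.standard_normal()

print(f"Indoor temperature after {hours:.0f} h: {T[-1]:.2f} deg C")
```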

  1. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  2. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  3. Operationalizing the Reciprocal Engagement Model of Genetic Counseling Practice: a Framework for the Scalable Delivery of Genomic Counseling and Testing.

    Science.gov (United States)

    Schmidlen, Tara; Sturm, Amy C; Hovick, Shelly; Scheinfeldt, Laura; Scott Roberts, J; Morr, Lindsey; McElroy, Joseph; Toland, Amanda E; Christman, Michael; O'Daniel, Julianne M; Gordon, Erynn S; Bernhardt, Barbara A; Ormond, Kelly E; Sweet, Kevin

    2018-02-19

    With the advent of widespread genomic testing for diagnostic indications and disease risk assessment, there is increased need to optimize genetic counseling services to support the scalable delivery of precision medicine. Here, we describe how we operationalized the reciprocal engagement model of genetic counseling practice to develop a framework of counseling components and strategies for the delivery of genomic results. This framework was constructed based upon qualitative research with patients receiving genomic counseling following online receipt of potentially actionable complex disease and pharmacogenomics reports. Consultation with a transdisciplinary group of investigators, including practicing genetic counselors, was sought to ensure broad scope and applicability of these strategies for use with any large-scale genomic testing effort. We preserve the provision of pre-test education and informed consent as established in Mendelian/single-gene disease genetic counseling practice. Following receipt of genomic results, patients are afforded the opportunity to tailor the counseling agenda by selecting the specific test results they wish to discuss, specifying questions for discussion, and indicating their preference for counseling modality. The genetic counselor uses these patient preferences to set the genomic counseling session and to personalize result communication and risk reduction recommendations. Tailored visual aids and result summary reports divide areas of risk (genetic variant, family history, lifestyle) for each disease to facilitate discussion of multiple disease risks. Post-counseling, session summary reports are actively routed to both the patient and their physician team to encourage review and follow-up. Given the breadth of genomic information potentially resulting from genomic testing, this framework is put forth as a starting point to meet the need for scalable genetic counseling services in the delivery of precision medicine.

  4. Progress in D-brane model building

    International Nuclear Information System (INIS)

    Marchesano, F.

    2007-01-01

    The state of the art in D-brane model building is briefly reviewed, focusing on recent achievements in the construction of D=4 N=1 type II string vacua with semi-realistic gauge sectors. Such progress relies on a better understanding of the spectrum of BPS D-branes, the effective field theory obtained from them and the explicit construction of vacua. We first consider D-branes in standard Calabi-Yau compactifications, and then the more involved case of compactifications with fluxes. We discuss how the non-trivial interplay between D-branes and fluxes modifies the previous model-building rules, as well as provides new possibilities to connect string theory to particle physics. (Abstract Copyright [2007], Wiley Periodicals, Inc.)

  5. Boxes of Model Building and Visualization.

    Science.gov (United States)

    Turk, Dušan

    2017-01-01

    Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now better than ever determine an average single structure. The tools work better, requirements for engagement of human brain are lowered, and the frontier of intellectual and scientific challenges has moved on. The quest for resolution of new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.

  6. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definition of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and there is a large amount of special equipment and mechanisms involved as a fundamental part of the facility. The detail design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software, which can be used before starting the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design & construction approach allows the construction sequence to be planned efficiently (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational database can be linked to the virtual model (6D), which gives the end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST Enclosures as application examples.

  7. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from observed time series, and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. There are two main performance measures used in this work: (1) error measures between the actual
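
    The non-imputing variant rests on a time-delay embedding that simply discards phase-space vectors containing missing values, so that dynamical neighbours are still found among the complete vectors. A minimal sketch of that idea, with a synthetic signal and arbitrarily chosen delay and embedding dimension (not the settings used in the study), is:

```python
import numpy as np

def delay_embed(series, dim=3, tau=2):
    """Time-delay embedding that drops vectors containing missing (NaN) values."""
    series = np.asarray(series, float)
    n = len(series) - (dim - 1) * tau
    vectors = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    keep = ~np.isnan(vectors).any(axis=1)
    return vectors[keep], np.flatnonzero(keep)   # complete vectors and their start indices

# Synthetic surge-like signal with 10% of the observations removed at random.
rng = np.random.default_rng(0)
t = np.arange(500)
signal = np.sin(0.1 * t) + 0.3 * np.sin(0.037 * t) + 0.05 * rng.standard_normal(t.size)
signal[rng.choice(t.size, size=50, replace=False)] = np.nan

vectors, idx = delay_embed(signal, dim=3, tau=2)
print(f"{len(vectors)} complete phase-space vectors out of {len(t) - 4} possible")
```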

  8. 3D modeling of buildings outstanding sites

    CERN Document Server

    Héno, Raphaële

    2014-01-01

    Conventional topographic databases, obtained by capture on aerial or spatial images, provide a simplified 3D modeling of our urban environment, answering the needs of numerous applications (development, risk prevention, mobility management, etc.). However, when we have to represent and analyze more complex sites (monuments, civil engineering works, archeological sites, etc.), these models no longer suffice and other acquisition and processing means have to be implemented. This book focuses on the study of suitable surveying means for "notable buildings". The methods tackled in this book cover las

  9. A procedure for building product models

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin

    2001-01-01

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes, which are to be supported with product models. The next phase includes an analysis of the product assortment, and the set up of a so-called product master. Finally the product model is designed and implemented using object oriented modelling. The procedure is developed in order to ensure that the product models constructed are fit for the business processes they support, and properly structured and documented, in order to facilitate that the systems can be maintained continually and further developed. The research has been carried out at the Centre for Industrialisation of Engineering, Department of Manufacturing Engineering, Technical...

  10. Public health component in building information modeling

    Science.gov (United States)

    Trufanov, A. I.; Rossodivita, A.; Tikhomirov, A. A.; Berestneva, O. G.; Marukhina, O. V.

    2018-05-01

    A building information modelling (BIM) conception has established itself as an effective and practical approach to plan, design, construct, and manage buildings and infrastructure. Analysis of the governance literature has shown that the BIM-developed tools do not fully take into account the growing demands from the ecology and health fields. In this connection, it is possible to offer an optimal way of adapting such tools to the necessary consideration of the sanitary and hygienic specifications of materials used in the construction industry. It is proposed to do this through the introduction of assessments that meet the requirements of national sanitary standards. This approach was demonstrated in a case study using the Revit® program.

  11. Demand Response Resource Quantification with Detailed Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Elaine; Horsey, Henry; Merket, Noel; Stoll, Brady; Nag, Ambarish

    2017-04-03

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  12. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin

    2016-07-29

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available data such as aerial images and street-view images. First, we propose a parametric model to describe the exterior of a building (with a varying number of parameters) and propose a set of attributes as a building representation with fixed dimensionality. Second, we propose a hierarchical graphical model with hidden variables to encode the relationships between building attributes and learn both the structure and parameters of the model from the database. Third, we propose optimization algorithms to generate three-dimensional models based on building attributes sampled from the graphical model. Finally, we demonstrate our framework by synthesizing new building models and completing partially observed building models from photographs.

  13. Iterative-build OMIT maps: map improvement by iterative model building and refinement without model bias

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Adams, Paul D.; Read, Randy J.; Zwart, Peter H.; Hung, Li-Wei

    2008-01-01

    An OMIT procedure is presented that has the benefits of iterative model building, density modification and refinement yet is essentially unbiased by the atomic model that is built. A procedure for carrying out iterative model building, density modification and refinement is presented in which the density in an OMIT region is essentially unbiased by an atomic model. Density from a set of overlapping OMIT regions can be combined to create a composite ‘iterative-build’ OMIT map that is everywhere unbiased by an atomic model but also everywhere benefiting from the model-based information present elsewhere in the unit cell. The procedure may have applications in the validation of specific features in atomic models as well as in overall model validation. The procedure is demonstrated with a molecular-replacement structure and with an experimentally phased structure, and a variation on the method is demonstrated by removing model bias from a structure from the Protein Data Bank.

  14. Scalable Coupling of Multiscale AEH and PARADYN Analyses for Impact Modeling

    National Research Council Canada - National Science Library

    Valisetty, Rama R; Chung, Peter W; Namburu, Raju R

    2005-01-01

    .... An asymptotic expansion homogenization (AEH)-based microstructural model available for modeling microstructural aspects of modern armor materials is coupled with PARADYN, a parallel explicit Lagrangian finite-element code...

  15. Working group report: Flavor physics and model building

    Indian Academy of Sciences (India)

    © Indian Academy of Sciences. This is the report of the flavor physics and model building working group at ...; those in model building have been primarily devoted to neutrino physics.

  16. Scalability Modeling for Optimal Provisioning of Data Centers in Telenor: A better balance between under- and over-provisioning

    OpenAIRE

    Rygg, Knut Helge

    2012-01-01

    The scalability of an information system describes the relationship between system capacity and system size. This report studies the scalability of Microsoft Lync Server 2010 in order to provide guidelines for provisioning hardware resources. Optimal provisioning is required to reduce both deployment and operational costs, while keeping an acceptable service quality. All Lync servers in the test setup are virtualized using VMware ESXi 5.0 and the system runs on a Cisco Unified Computing System...

  17. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  18. Automatic Generation of 3D Building Models with Multiple Roofs

    Institute of Scientific and Technical Information of China (English)

    Kenichi Sugihara; Yoshitugu Hayashi

    2008-01-01

    Based on building footprints (building polygons) on digital maps, we are proposing the GIS and CG integrated system that automatically generates 3D building models with multiple roofs. Most building polygons' edges meet at right angles (orthogonal polygon). The integrated system partitions orthogonal building polygons into a set of rectangles and places rectangular roofs and box-shaped building bodies on these rectangles. In order to partition an orthogonal polygon, we proposed a useful polygon expression in deciding from which vertex a dividing line is drawn. In this paper, we propose a new scheme for partitioning building polygons and show the process of creating 3D roof models.

  19. Directions for model building from asymptotic safety

    Science.gov (United States)

    Bond, Andrew D.; Hiller, Gudrun; Kowalska, Kamila; Litim, Daniel F.

    2017-08-01

    Building on recent advances in the understanding of gauge-Yukawa theories we explore possibilities to UV-complete the Standard Model in an asymptotically safe manner. Minimal extensions are based on a large flavor sector of additional fermions coupled to a scalar singlet matrix field. We find that asymptotic safety requires fermions in higher representations of SU(3)_C × SU(2)_L. Possible signatures at colliders are worked out and include R-hadron searches, diboson signatures and the evolution of the strong and weak coupling constants.

  20. Modeling the building blocks of biodiversity.

    Directory of Open Access Journals (Sweden)

    Lucas N Joppa

    Full Text Available BACKGROUND: Networks of single interaction types, such as plant-pollinator mutualisms, are biodiversity's "building blocks". Yet, the structure of mutualistic and antagonistic networks differs, leaving no unified modeling framework across biodiversity's component pieces. METHODS/PRINCIPAL FINDINGS: We use a one-dimensional "niche model" to predict antagonistic and mutualistic species interactions, finding that accuracy decreases with the size of the network. We show that properties of the modeled network structure closely approximate empirical properties even where individual interactions are poorly predicted. Further, some aspects of the structure of the niche space were consistently different between network classes. CONCLUSIONS/SIGNIFICANCE: These novel results reveal fundamental differences between the ability to predict ecologically important features of the overall structure of a network and the ability to predict pair-wise species interactions.

  1. HYDROSCAPE: A SCAlable and ParallelizablE Rainfall Runoff Model for Hydrological Applications

    Science.gov (United States)

    Piccolroaz, S.; Di Lazzaro, M.; Zarlenga, A.; Majone, B.; Bellin, A.; Fiori, A.

    2015-12-01

    In this work we present HYDROSCAPE, an innovative streamflow routing method based on the travel time approach and modeled through a fine-scale geomorphological description of hydrological flow paths. The model is designed to be easily coupled with weather forecast or climate models providing the hydrological forcing, while preserving the geomorphological dispersion of the river network, which is kept unchanged independently of the grid size of the rainfall input. This makes HYDROSCAPE particularly suitable for multi-scale applications, ranging from medium size catchments up to the continental scale, and for investigating the effects of extreme rainfall events that require an accurate description of basin response timing. A key feature of the model is its computational efficiency, which allows performing a large number of simulations for sensitivity/uncertainty analyses in a Monte Carlo framework. Further, the model is highly parsimonious, involving the calibration of only three parameters: one defining the residence time of the hillslope response, one for channel velocity, and a multiplicative factor accounting for uncertainties in the identification of the potential maximum soil moisture retention in the SCS-CN method. HYDROSCAPE is designed with a simple and flexible modular structure, which makes it particularly amenable to massive parallelization, customization according to specific user needs and preferences (e.g., the rainfall-runoff model), and continuous development and improvement. Finally, the possibility to specify the desired computational time step and evaluate streamflow at any location in the domain makes HYDROSCAPE an attractive tool for many hydrological applications, and a valuable alternative to more complex and highly parametrized large scale hydrological models. Together with model development and features, we present an application to the Upper Tiber River basin (Italy), providing a practical example of model performance and
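
    For reference, the SCS-CN relation mentioned above ties the effective rainfall depth Q to the potential maximum retention S (in millimetres); the multiplicative calibration factor in HYDROSCAPE acts on the estimate of S. The standard form of the method, valid for P > I_a (otherwise Q = 0), is:

```latex
Q = \frac{(P - I_a)^2}{P - I_a + S}, \qquad I_a = 0.2\,S, \qquad S = \frac{25400}{\mathrm{CN}} - 254 \;\; [\mathrm{mm}]
```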

  2. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model based technology to achieve rich functionality and usability was implemented. (orig.)

  3. Investigating the Role of Biogeochemical Processes in the Northern High Latitudes on Global Climate Feedbacks Using an Efficient Scalable Earth System Model

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Atul K. [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2016-09-14

    The overall objective of this DOE-funded project is to combine scientific and computational challenges in climate modeling by expanding our understanding of the biogeophysical-biogeochemical processes and their interactions in the northern high latitudes (NHLs) using an earth system modeling (ESM) approach, and by adopting an adaptive parallel runtime system in an ESM to achieve efficient and scalable climate simulations through improved load-balancing algorithms.

  4. A scalable community detection algorithm for large graphs using stochastic block models

    KAUST Repository

    Peng, Chengbin

    2017-11-24

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of

  5. A scalable community detection algorithm for large graphs using stochastic block models

    KAUST Repository

    Peng, Chengbin; Zhang, Zhihua; Wong, Ka-Chun; Zhang, Xiangliang; Keyes, David E.

    2017-01-01

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of

  6. Scalability on LHS (Latin Hypercube Sampling) samples for use in uncertainty analysis of large numerical models

    International Nuclear Information System (INIS)

    Baron, Jorge H.; Nunez Mac Leod, J.E.

    2000-01-01

    The present paper deals with the utilization of advanced sampling statistical methods to perform uncertainty and sensitivity analysis on numerical models. Such models may represent physical phenomena, logical structures (such as boolean expressions) or other systems, and various of their intrinsic parameters and/or input variables are usually treated as random variables simultaneously. In the present paper a simple method to scale up Latin Hypercube Sampling (LHS) samples is presented, starting with a small sample and duplicating its size at each step, making it possible to reuse the numerical model results already obtained with the smaller sample. The method does not distort the statistical properties of the random variables and does not add any bias to the samples. The result is that a significant reduction in numerical model running time can be achieved (by re-using the previously run samples), keeping all the advantages of LHS, until an acceptable representation level is achieved in the output variables. (author)
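
    Generating a basic Latin Hypercube Sample is itself straightforward; the paper's contribution is the scheme for doubling such a sample while re-using the model runs already made, which is not reproduced here. A generic LHS generator on the unit hypercube (illustrative only) looks like:

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """Standard LHS: exactly one point per equal-probability stratum in every dimension."""
    rng = np.random.default_rng(seed)
    # Place one jittered point inside each of the n_samples strata ...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    # ... then shuffle the strata independently for each variable.
    for j in range(n_vars):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

sample = latin_hypercube(8, 3, seed=42)
print(sample.round(3))
```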

  7. Sustainability and scalability of university spinouts:a business model perspective

    OpenAIRE

    Ziaee Bigdeli, Ali; Li, Feng; Shi, Xiaohui

    2015-01-01

    Most previous studies of university spinouts (USOs) have focused on what determines their formation from the perspectives of the entrepreneurs or of their parent universities. However, few studies have investigated how these entrepreneurial businesses actually grow and how their business models evolve in the process. This paper examines the evolution of USOs' business models over their different development phases. Using empirical evidence gathered from three comprehensive case studies, we ex...

  8. Fast, Automated, Scalable Generation of Textured 3D Models of Indoor Environments

    Science.gov (United States)

    2014-12-18

    throughs of environments, gaming entertainment, augmented reality, indoor navigation, and energy simulation analysis. These applications rely on the... models are used in virtual reality, gaming, navigation, and simulation applications. State-of-the-art scanning produces accurate point-clouds of... meshes that remove furniture and other temporary objects. We propose a method to texture-map these models from captured camera imagery to produce

  9. Scalable and Accurate SMT-Based Model Checking of Data Flow Systems

    Science.gov (United States)

    2013-10-31

    of variable x is always less than that of variable y) can be represented in this theory. • A theory of inductive datatypes. Modeling software... datatypes can be done directly in this theory. • A theory of arrays. Software that uses arrays can be modeled with constraints in this theory, as can... [figure: SMT engine input interfaces and supported theories - arithmetic (and specialized fragments), arrays, inductive datatypes, bit-vectors, uninterpreted functions]

  10. PATHLOGIC-S: a scalable Boolean framework for modelling cellular signalling.

    Directory of Open Access Journals (Sweden)

    Liam G Fearnley

    Full Text Available Curated databases of signal transduction have grown to describe several thousand reactions, and efficient use of these data requires the development of modelling tools to elucidate and explore system properties. We present PATHLOGIC-S, a Boolean specification for a signalling model, with its associated GPL-licensed implementation using integer programming techniques. The PATHLOGIC-S specification has been designed to function on current desktop workstations, and is capable of providing analyses on some of the largest currently available datasets through use of Boolean modelling techniques to generate predictions of stable and semi-stable network states from data in community file formats. PATHLOGIC-S also addresses major problems associated with the presence and modelling of inhibition in Boolean systems, and reduces logical incoherence due to common inhibitory mechanisms in signalling systems. We apply this approach to signal transduction networks including Reactome and two pathways from the Panther Pathways database, and present the results of computations on each along with a discussion of execution time. A software implementation of the framework and model is freely available under a GPL license.
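
    As a generic illustration of Boolean modelling of signalling with inhibition (a toy synchronous update in which any active inhibitor overrides activation, not the integer-programming formulation actually used by PATHLOGIC-S), consider:

```python
# Toy Boolean signalling network: hypothetical nodes A-D with activator/inhibitor edges.
activators = {"B": ["A"], "C": ["B"], "D": ["B", "C"]}
inhibitors = {"C": ["D"]}

def step(state):
    """Synchronous update: a node turns on if any activator is on and no inhibitor is on."""
    new = dict(state)
    for node in set(activators) | set(inhibitors):
        act = any(state.get(a, False) for a in activators.get(node, []))
        inh = any(state.get(i, False) for i in inhibitors.get(node, []))
        new[node] = act and not inh
    return new

state = {"A": True, "B": False, "C": False, "D": False}
for _ in range(6):                       # iterate towards a stable or cyclic attractor
    state = step(state)
    print(state)
```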

  11. Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    Science.gov (United States)

    Gray-Davies, Tristan; Holmes, Chris C.; Caron, François

    2018-01-01

    We present a novel Bayesian nonparametric regression model for covariates X and continuous response variable Y ∈ ℝ. The model is parametrized in terms of marginal distributions for Y and X and a regression function which tunes the stochastic ordering of the conditional distributions F (y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled for the separate components of the model. This procedure can scale to very large datasets and allows for the use of standard, existing, software from Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied. As an illustration, we show an application of our approach to a US Census dataset, with over 1,300,000 data points and more than 100 covariates. PMID:29623150
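
    For context, the Plackett-Luce model underlying the conditional-rank construction assigns a probability to a complete ranking sigma of n items with positive worth parameters w_i (in the regression setting above these are tied to the covariates). Its standard form is:

```latex
P(\sigma \mid w) \;=\; \prod_{i=1}^{n} \frac{w_{\sigma(i)}}{\sum_{j=i}^{n} w_{\sigma(j)}}
```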

  12. Building a Democratic Model of Science Teaching

    Directory of Open Access Journals (Sweden)

    Suhadi Ibnu

    2016-02-01

    Full Text Available Earlier in the last century, learning in science, as was learning in other disciplines, was developed according to the philosophy of behaviorism. This did not serve the purposes of learning in science properly, as the students were forced to absorb information transferred from the main and the only source of learning, the teacher. Towards the end of the century a significant shift from behaviorism to constructivism philosophy took place. The shift promoted the development of more democratic models of learning in science which provided greater opportunities to the students to act as real scientist, chattering for the building of knowledge and scientific skills. Considering the characteristics of science and the characteristics of the students as active learners, the shift towards democratic models of learning is unavoidable and is merely a matter of time

  13. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2015-10-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  15. A scalable delivery framework and a pricing model for streaming media with advertisements

    Science.gov (United States)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The pricing is determined based on the total ads' viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  16. A Scalable Approach to Modeling Cascading Risk in the MDAP Network

    Science.gov (United States)

    2014-05-01

    • Populate Decision Process Model. • Identify challenges to data acquisition. Legend: ATIE_MOD – Automated Text & Image Extraction Module; IID_MOD... DAES, PE docs, SARs – Topic models built from MDAP hub data seem to be relevant to neighbors. – Challenges: formatting and content inconsistencies

  17. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Full Text Available Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when increasing the number of processes. This paper reports on our work in progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred and more processes can be constructed and analysed.
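
    The payoff of symmetry reduction for fully interchangeable processes can be seen with a quick count: n identical processes with k local states give k^n ordered global states, but only C(n+k-1, k-1) states when processes are indistinguishable and only the multiset of local states matters. A small illustration of that count (not figures from the paper) is:

```python
from math import comb

def ordered_states(n_procs, k_local):
    """Global states when processes are distinguishable."""
    return k_local ** n_procs

def symmetric_states(n_procs, k_local):
    """Global states under full symmetry: multisets of local states."""
    return comb(n_procs + k_local - 1, k_local - 1)

for n in (10, 50, 100):
    full, reduced = ordered_states(n, 4), symmetric_states(n, 4)
    print(f"n={n:3d}: {full:.3e} ordered states vs {reduced:,} under full symmetry")
```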

  18. An experimental investigation for scalability of the seismic response of microconcrete model nuclear power plant structures

    International Nuclear Information System (INIS)

    Bennett, J.G.; Dove, R.C.; Dunwoody, W.E.; Farrar, C.R.

    1987-01-01

    The paper reports the results from tests including reduced stiffnesses found in the prototype and 1/4 scale model, implications of the test results on the validity of past tests, and implications of these results from the 1986 tests on the seismic behavior of actual Seismic Category I Structures and their attached equipment. (orig./HP)

  19. Developmental Impact Analysis of an ICT-Enabled Scalable Healthcare Model in BRICS Economies

    Directory of Open Access Journals (Sweden)

    Dhrubes Biswas

    2012-06-01

    Full Text Available This article highlights the need for initiating a healthcare business model in a grassroots, emerging-nation context. This article’s backdrop is a history of chronic anomalies afflicting the healthcare sector in India and similarly placed BRICS nations. In these countries, a significant percentage of populations remain deprived of basic healthcare facilities and emergency services. Community (primary care) services are being offered by public and private stakeholders as a panacea to the problem. Yet, there is an urgent need for specialized (tertiary care) services at all levels. As a response to this challenge, an all-inclusive health-exchange system (HES) model, which utilizes information communication technology (ICT) to provide solutions in rural India, has been developed. The uniqueness of the model lies in its innovative hub-and-spoke architecture and its emphasis on affordability, accessibility, and availability to the masses. This article describes a developmental impact analysis (DIA) that was used to assess the impact of this model. The article contributes to the knowledge base of readers by making them aware of the healthcare challenges emerging nations are facing and ways to mitigate those challenges using entrepreneurial solutions.

  20. Non parametric, self organizing, scalable modeling of spatiotemporal inputs: the sign language paradigm.

    Science.gov (United States)

    Caridakis, G; Karpouzis, K; Drosopoulos, A; Kollias, S

    2012-12-01

    Modeling and recognizing spatiotemporal, as opposed to static input, is a challenging task since it incorporates input dynamics as part of the problem. The vast majority of existing methods tackle the problem as an extension of the static counterpart, using dynamics, such as input derivatives, at feature level and adopting artificial intelligence and machine learning techniques originally designed for solving problems that do not specifically address the temporal aspect. The proposed approach deals with temporal and spatial aspects of the spatiotemporal domain in a discriminative as well as coupling manner. Self Organizing Maps (SOM) model the spatial aspect of the problem and Markov models its temporal counterpart. Incorporation of adjacency, both in training and classification, enhances the overall architecture with robustness and adaptability. The proposed scheme is validated both theoretically, through an error propagation study, and experimentally, on the recognition of individual signs, performed by different, native Greek Sign Language users. Results illustrate the architecture's superiority when compared to Hidden Markov Model techniques and variations both in terms of classification performance and computational cost. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Deep Potential Molecular Dynamics: A Scalable Model with the Accuracy of Quantum Mechanics

    Science.gov (United States)

    Zhang, Linfeng; Han, Jiequn; Wang, Han; Car, Roberto; E, Weinan

    2018-04-01

    We introduce a scheme for molecular simulations, the deep potential molecular dynamics (DPMD) method, based on a many-body potential and interatomic forces generated by a carefully crafted deep neural network trained with ab initio data. The neural network model preserves all the natural symmetries in the problem. It is first-principles based in the sense that there are no ad hoc components aside from the network model. We show that the proposed scheme provides an efficient and accurate protocol in a variety of systems, including bulk materials and molecules. In all these cases, DPMD gives results that are essentially indistinguishable from the original data, at a cost that scales linearly with system size.

  2. Helicopter model rotor-blade vortex interaction impulsive noise: Scalability and parametric variations

    Science.gov (United States)

    Splettstoesser, W. R.; Schultz, K. J.; Boxwell, D. A.; Schmitz, F. H.

    1984-01-01

    Acoustic data taken in the anechoic Deutsch-Niederlaendischer Windkanal (DNW) have documented the blade vortex interaction (BVI) impulsive noise radiated from a 1/7-scale model main rotor of the AH-1 series helicopter. Averaged model scale data were compared with averaged full scale, in-flight acoustic data under similar nondimensional test conditions. At low advance ratios (mu = 0.164 to 0.194), the data scale remarkably well in level and waveform shape, and also duplicate the directivity pattern of BVI impulsive noise. At moderate advance ratios (mu = 0.224 to 0.270), the scaling deteriorates, suggesting that the model scale rotor is not adequately simulating the full scale BVI noise; presently, no proven explanation of this discrepancy exists. Carefully performed parametric variations over a complete matrix of testing conditions have shown that all of the four governing nondimensional parameters - tip Mach number at hover, advance ratio, local inflow ratio, and thrust coefficient - are highly sensitive to BVI noise radiation.

  3. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  4. Systematic model building with flavor symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Plentinger, Florian

    2009-12-19

    The observation of neutrino masses and lepton mixing has highlighted the incompleteness of the Standard Model of particle physics. In conjunction with this discovery, new questions arise: why are the neutrino masses so small, which form has their mass hierarchy, why is the mixing in the quark and lepton sectors so different or what is the structure of the Higgs sector. In order to address these issues and to predict future experimental results, different approaches are considered. One particularly interesting possibility, are Grand Unified Theories such as SU(5) or SO(10). GUTs are vertical symmetries since they unify the SM particles into multiplets and usually predict new particles which can naturally explain the smallness of the neutrino masses via the seesaw mechanism. On the other hand, also horizontal symmetries, i.e., flavor symmetries, acting on the generation space of the SM particles, are promising. They can serve as an explanation for the quark and lepton mass hierarchies as well as for the different mixings in the quark and lepton sectors. In addition, flavor symmetries are significantly involved in the Higgs sector and predict certain forms of mass matrices. This high predictivity makes GUTs and flavor symmetries interesting for both, theorists and experimentalists. These extensions of the SM can be also combined with theories such as supersymmetry or extra dimensions. In addition, they usually have implications on the observed matter-antimatter asymmetry of the universe or can provide a dark matter candidate. In general, they also predict the lepton flavor violating rare decays μ → eγ, τ → μγ, and τ → eγ which are strongly bounded by experiments but might be observed in the future. In this thesis, we combine all of these approaches, i.e., GUTs, the seesaw mechanism and flavor symmetries. Moreover, our request is to develop and perform a systematic model building approach with flavor symmetries and

  5. Systematic model building with flavor symmetries

    International Nuclear Information System (INIS)

    Plentinger, Florian

    2009-01-01

    The observation of neutrino masses and lepton mixing has highlighted the incompleteness of the Standard Model of particle physics. In conjunction with this discovery, new questions arise: why are the neutrino masses so small, which form has their mass hierarchy, why is the mixing in the quark and lepton sectors so different or what is the structure of the Higgs sector. In order to address these issues and to predict future experimental results, different approaches are considered. One particularly interesting possibility, are Grand Unified Theories such as SU(5) or SO(10). GUTs are vertical symmetries since they unify the SM particles into multiplets and usually predict new particles which can naturally explain the smallness of the neutrino masses via the seesaw mechanism. On the other hand, also horizontal symmetries, i.e., flavor symmetries, acting on the generation space of the SM particles, are promising. They can serve as an explanation for the quark and lepton mass hierarchies as well as for the different mixings in the quark and lepton sectors. In addition, flavor symmetries are significantly involved in the Higgs sector and predict certain forms of mass matrices. This high predictivity makes GUTs and flavor symmetries interesting for both, theorists and experimentalists. These extensions of the SM can be also combined with theories such as supersymmetry or extra dimensions. In addition, they usually have implications on the observed matter-antimatter asymmetry of the universe or can provide a dark matter candidate. In general, they also predict the lepton flavor violating rare decays μ → eγ, τ → μγ, and τ → eγ which are strongly bounded by experiments but might be observed in the future. In this thesis, we combine all of these approaches, i.e., GUTs, the seesaw mechanism and flavor symmetries. Moreover, our request is to develop and perform a systematic model building approach with flavor symmetries and to search for phenomenological

  6. Near-Source Modeling Updates: Building Downwash & Near-Road

    Science.gov (United States)

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  7. Encoding Dissimilarity Data for Statistical Model Building.

    Science.gov (United States)

    Wahba, Grace

    2010-12-01

    We summarize, review and comment upon three papers which discuss the use of discrete, noisy, incomplete, scattered pairwise dissimilarity data in statistical model building. Convex cone optimization codes are used to embed the objects into a Euclidean space which respects the dissimilarity information while controlling the dimension of the space. A "newbie" algorithm is provided for embedding new objects into this space. This allows the dissimilarity information to be incorporated into a Smoothing Spline ANOVA penalized likelihood model, a Support Vector Machine, or any model that will admit Reproducing Kernel Hilbert Space components, for nonparametric regression, supervised learning, or semi-supervised learning. Future work and open questions are discussed. The papers are: F. Lu, S. Keles, S. Wright and G. Wahba 2005. A framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences 102, 12332-1233. G. Corrada Bravo, G. Wahba, K. Lee, B. Klein, R. Klein and S. Iyengar 2009. Examining the relative influence of familial, genetic and environmental covariate information in flexible risk models. Proceedings of the National Academy of Sciences 106, 8128-8133. F. Lu, Y. Lin and G. Wahba. Robust manifold unfolding with kernel regularization. TR 1008, Department of Statistics, University of Wisconsin-Madison.

  8. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    OpenAIRE

    Lee, Jin Kook; Kim, Mi Jeong

    2014-01-01

    This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelli...

  9. S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.

    Science.gov (United States)

    Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.

    2017-12-01

    The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometer Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-disciplinary mechanisms at the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening coverage of provenance to aspects related to distribution, scale-up and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests, fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain-metadata. The traces are recorded in a document-store with index optimisation and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance). These allow different classes of consumers to rapidly explore the provenance data. The system, which contributes to the SKA-Link initiative, within technology and

  10. Flexible building stock modelling with array-programming

    DEFF Research Database (Denmark)

    Brøgger, Morten; Wittchen, Kim Bjarne

    2017-01-01

    Many building stock models employ archetype-buildings in order to capture the essential characteristics of a diverse building stock. However, these models often require multiple archetypes, which make them inflexible. This paper proposes an array-programming based model, which calculates the heat...... tend to overestimate potential energy-savings, if we do not consider these discrepancies. The proposed model makes it possible to compute and visualize potential energy-savings in a flexible and transparent way....
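
    As a rough illustration of the array-programming idea (not the authors' model), the sketch below evaluates an annual transmission heat loss for several archetypes, and for the building stock they represent, in a few vectorized expressions; every coefficient is invented for the example.

    ```python
    import numpy as np

    # One entry per archetype: envelope area [m2] and mean U-value [W/(m2*K)] (assumed)
    envelope_area = np.array([320.0, 450.0, 900.0])
    u_value       = np.array([1.1, 0.6, 0.35])
    degree_hours  = 90_000.0                      # K*h per year, assumed climate

    # Vectorized transmission heat loss for all archetypes at once [kWh/year]
    transmission_loss = envelope_area * u_value * degree_hours / 1000.0

    # Scale to the stock: number of buildings each archetype represents (assumed)
    counts = np.array([12_000, 8_500, 1_200])
    stock_demand = transmission_loss @ counts     # kWh/year for the whole stock
    print(transmission_loss, stock_demand)
    ```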

  11. Building groundwater modeling capacity in Mongolia

    Science.gov (United States)

    Valder, Joshua F.; Carter, Janet M.; Anderson, Mark T.; Davis, Kyle W.; Haynes, Michelle A.; Dorjsuren Dechinlhundev,

    2016-06-16

    Ulaanbaatar, the capital city of Mongolia (fig. 1), is dependent on groundwater for its municipal and industrial water supply. The population of Mongolia is about 3 million people, with about one-half the population residing in or near Ulaanbaatar (World Population Review, 2016). Groundwater is drawn from a network of shallow wells in an alluvial aquifer along the Tuul River. Evidence indicates that current water use may not be sustainable from existing water sources, especially when factoring in the projected water demand from a rapidly growing urban population (Ministry of Environment and Green Development, 2013). In response, the Government of Mongolia Ministry of Environment, Green Development, and Tourism (MEGDT) and the Freshwater Institute, Mongolia, requested technical assistance on groundwater modeling through the U.S. Army Corps of Engineers (USACE) to the U.S. Geological Survey (USGS). Scientists from the USGS and USACE provided two workshops in 2015 to Mongolian hydrology experts on basic principles of groundwater modeling using the USGS groundwater modeling program MODFLOW-2005 (Harbaugh, 2005). The purpose of the workshops was to bring together representatives from the Government of Mongolia, local universities, technical experts, and other key stakeholders to build in-country capacity in hydrogeology and groundwater modeling. A preliminary steady-state groundwater-flow model was developed as part of the workshops to demonstrate groundwater modeling techniques to simulate groundwater conditions in alluvial deposits along the Tuul River in the vicinity of Ulaanbaatar. ModelMuse (Winston, 2009) was used as the graphical user interface for MODFLOW for training purposes during the workshops. Basic and advanced groundwater modeling concepts included in the workshops were groundwater principles; estimating hydraulic properties; developing model grids, data sets, and MODFLOW input files; and viewing and evaluating MODFLOW output files. A key to success was

  12. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The highly automated PHENIX AutoBuild wizard is described. The procedure can be applied equally well to phases derived from isomorphous/anomalous and molecular-replacement methods. The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution

  13. A legacy building model for holistic nursing.

    Science.gov (United States)

    Lange, Bernadette; Zahourek, Rothlyn P; Mariano, Carla

    2014-06-01

    This pilot project was an effort to record the historical roots, development, and legacy of holistic nursing through the visionary spirit of four older American Holistic Nurses Association (AHNA) members. The aim was twofold: (a) to capture the holistic nursing career experiences of elder AHNA members and (b) to begin to create a Legacy Building Model for Holistic Nursing. The narratives will help initiate an ongoing, systematic method for the collection of historical data and serve as a perpetual archive of knowledge and inspiration for present and future holistic nurses. An aesthetic inquiry approach was used to conduct in-depth interviews with four older AHNA members who have made significant contributions to holistic nursing. The narratives provide a rich description of their personal and professional evolution as holistic nurses. The narratives are presented in an aesthetic format of the art forms of snapshot, pastiche, and collage rather than traditional presentations of research findings. A synopsis of the narratives is a dialogue between the three authors and provides insight for how a Legacy Model can guide our future. Considerations for practice, education, and research are discussed based on the words of wisdom from the four older holistic nurses.

  14. Scalable Automated Model Search

    Science.gov (United States)

    2014-05-20

  15. Optimizing Energy Consumption in Building Designs Using Building Information Model (BIM

    Directory of Open Access Journals (Sweden)

    Egwunatum Samuel

    2016-09-01

    Full Text Available Given the ability of a Building Information Model (BIM) to serve as a multi-disciplinary data repository, this paper seeks to explore and exploit the sustainability value of Building Information Modelling/models in delivering buildings that require less energy for their operation, emit less CO2 and at the same time provide a comfortable living environment for their occupants. This objective was achieved by a critical and extensive review of the literature covering: (1) building energy consumption, (2) building energy performance and analysis, and (3) building information modeling and energy assessment. The literature cited in this paper showed that linking an energy analysis tool with a BIM model helped project design teams to predict and create optimized energy consumption. To validate this finding, an in-depth analysis was carried out on a completed BIM integrated construction project using the Arboleda Project in the Dominican Republic. The findings showed that the BIM-based energy analysis helped the design team achieve the world’s first 103% positive energy building. From the research findings, the paper concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results as well as deliver energy-efficient buildings. The study further recommends that the adoption of a level 2 BIM and the integration of BIM in energy optimization analyses should be made compulsory for all projects irrespective of the method of procurement (government-funded or otherwise) or their size.

  16. Modeling arson - An exercise in qualitative model building

    Science.gov (United States)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (in the theory of games and economic behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider is the class of agents to which the model is applicable, and accordingly more confidence is put in the derived results. A methodological discussion on modeling human behavior is included.

  17. Multidisciplinary Energy Assessment of Tertiary Buildings: Automated Geomatic Inspection, Building Information Modeling Reconstruction and Building Performance Simulation

    Directory of Open Access Journals (Sweden)

    Faustino Patiño-Cambeiro

    2017-07-01

    Full Text Available There is an urgent need for energy efficiency in buildings within the European framework, considering its environmental implications, and Europe’s energy dependence. Furthermore, the need for enhancing and increasing productivity in the building industry turns new technologies and building energy performance simulation environments into extremely interesting solutions towards rigorous analysis and decision making in renovation within acceptable risk levels. The present work describes a multidisciplinary approach for the estimation of the energy performance of an educational building. The research involved data acquisition with advanced geomatic tools, the development of an optimized building information model, and energy assessment in Building Performance Simulation (BPS) software. Interoperability issues were observed in the different steps of the process. The inspection and diagnostic phases were conducted in a timely, accurate manner thanks to automated data acquisition and subsequent analysis using Building Information Modeling based tools (BIM-based tools). Energy simulation was performed using Design Builder, and the results obtained were compared with those yielded by the official software tool established by Spanish regulations for energy certification. The discrepancies between the results of both programs have proven that the official software program is conservative in this sense. This may cause the depreciation of the assessed buildings.

  18. Investigation Into Informational Compatibility Of Building Information Modelling And Building Performance Analysis Software Solutions

    OpenAIRE

    Hyun, S.; Marjanovic-Halburd, L.; Raslan, R.

    2015-01-01

    There are significant opportunities for Building Information Modelling (BIM) to address issues related to sustainable and energy efficient building design. While the potential benefits associated with the integration of BIM and BPA (Building Performance Analysis) have been recognised, its specifications and formats remain in their early infancy and often fail to live up to the promise of seamless interoperability at various stages of design process. This paper conducts a case study to investi...

  19. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  20. Iterative model-building, structure refinement, and density modification with the PHENIX AutoBuild Wizard

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D. [Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England]

    2007-04-29

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  1. Analysis of a Residential Building Energy Consumption Demand Model

    Directory of Open Access Journals (Sweden)

    Meng Liu

    2011-03-01

    Full Text Available In order to estimate the energy consumption demand of residential buildings, this paper first discusses the status and shortcomings of current domestic energy consumption models. Then it proposes and develops a residential building energy consumption demand model based on a back propagation (BP) neural network model. After that, taking residential buildings in Chongqing (P.R. China) as an example, 16 energy consumption indicators are introduced as characteristics of the residential buildings in Chongqing. The index system of the BP neural network prediction model is established and the multi-factorial BP neural network prediction model of Chongqing residential building energy consumption is developed using the C# language, based on the SQL Server 2005 platform. The results obtained by applying the model in Chongqing are in good agreement with actual ones. In addition, the model provides corresponding approximate data by taking into account the potential energy structure adjustments and relevant energy policy regulations.
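
    A back-propagation network of the general kind described, with 16 indicators in and one demand figure out, can be sketched with a generic library; the snippet below uses scikit-learn's MLPRegressor on synthetic data and is not the Chongqing index system or its implementation.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 16))                    # 16 consumption indicators (synthetic)
    y = X @ rng.normal(size=16) + rng.normal(scale=0.1, size=500)  # synthetic demand

    scaler = StandardScaler().fit(X)
    model = MLPRegressor(hidden_layer_sizes=(16,), activation="logistic",
                         solver="adam", max_iter=2000, random_state=0)
    model.fit(scaler.transform(X), y)                 # training by back-propagation

    x_new = rng.normal(size=(1, 16))                  # indicators for a new scenario
    print(model.predict(scaler.transform(x_new)))     # predicted energy demand
    ```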

  2. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Karl Grau; Rode, Carsten

    2009-01-01

    cavity such as behind the exterior cladding of a building envelope, i.e. a flow which is parallel to the construction plane. (2) Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the constructionplane. The paper presents the models and how they have...

  3. COMPLEMENTARITY OF HISTORIC BUILDING INFORMATION MODELLING AND GEOGRAPHIC INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    X. Yang

    2016-06-01

    Full Text Available In this paper, we discuss the potential of integrating both semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build the detailed 3D historic model. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and non-architectural information that are necessary for construction and management of buildings. GIS has potential in handling and managing spatial data, especially exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in a BIM and GIS environment, how to build the enriched historic model, and why to construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  4. Statistical models describing the energy signature of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Thavlov, Anders

    2010-01-01

    Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building, i...... or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as UA......-values, time constants of the building, and other parameters related to the heat dynamics. A method for selecting the most appropriate model for a given building is outlined and finally a perspective of the applications is given. Aknowledgements to the Danish Energy Saving Trust and the Interreg IV ``Vind i...

  5. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  6. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin; Wonka, Peter

    2016-01-01

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available

  7. DEVELOPING PARAMETRIC BUILDING MODELS – THE GANDIS USE CASE

    Directory of Open Access Journals (Sweden)

    W. Thaller

    2012-09-01

    Full Text Available In the course of a project related to green building design, we have created a group of eight parametric building models that can be manipulated interactively with respect to dimensions, number of floors, and a few other parameters. We report on the commonalities and differences between the models and the abstractions that we were able to identify.

  8. Building

    OpenAIRE

    Seavy, Ryan

    2014-01-01

    Building for concrete is temporary. The building of wood and steel stands against the concrete to give form and then gives way, leaving a trace of its existence behind. Concrete is not a building material. One does not build with concrete. One builds for concrete. MARCH

  9. Whole-Building Hygrothermal Modeling in IEA Annex 41

    DEFF Research Database (Denmark)

    Rode, Carsten; Woloszyn, Monika

    2007-01-01

    . The IEA Annex 41 project runs from 2004–2007, coming to conclusion just before the Thermal Performance of the Exterior Envelopes of Whole Buildings X conference. The Annex 41 project and its Subtask 1 do not aim to produce one state-of-the-art hygrothermal simulation model for whole buildings, but rather...... the modeling, free scientific contributions have been invited from specific fields that need the most attention in order to better accomplish the integral building simulations. This paper will give an overview of the advances in whole-building hygrothermal simulation that have been accomplished and presented...

  10. Collaborative data analytics for smart buildings: opportunities and models

    DEFF Research Database (Denmark)

    Lazarova-Molnar, Sanja; Mohamed, Nader

    2018-01-01

    of collaborative data analytics for smart buildings, its benefits, as well as presently possible models of carrying it out. Furthermore, we present a framework for collaborative fault detection and diagnosis as a case of collaborative data analytics for smart buildings. We also provide a preliminary analysis...... of the energy efficiency benefit of such collaborative framework for smart buildings. The result shows that significant energy savings can be achieved for smart buildings using collaborative data analytics.......Smart buildings equipped with state-of-the-art sensors and meters are becoming more common. Large quantities of data are being collected by these devices. For a single building to benefit from its own collected data, it will need to wait for a long time to collect sufficient data to build accurate...

  11. Impacts of building information modeling on facility maintenance management

    Energy Technology Data Exchange (ETDEWEB)

    Ahamed, Shafee; Neelamkavil, Joseph; Canas, Roberto [Centre for Computer-assisted Construction Technologies, National Research Council of Canada, London, Ontario (Canada)

    2010-07-01

    Building information modeling (BIM) is a digital representation of the physical and functional properties of a building; it has been used by construction professionals for a long time and stakeholders are now using it in different aspects of the building lifecycle. This paper intends to present how BIM impacts the construction industry and how it can be used for facility maintenance management. The maintenance and operations of buildings are in most cases still managed through the use of drawings and spreadsheets although life cycle costs of a building are significantly higher than initial investment costs; thus, the use of BIM could help in achieving higher efficiency and thus important benefits. This study is part of an ongoing research project, the nD modeling project, which aims at predicting building energy consumption with better accuracy.

  12. ARMAGH OBSERVATORY – HISTORIC BUILDING INFORMATION MODELLING FOR VIRTUAL LEARNING IN BUILDING CONSERVATION

    Directory of Open Access Journals (Sweden)

    M. Murphy

    2017-08-01

    Full Text Available In this paper the recording and design for a Virtual Reality Immersive Model of Armagh Observatory are presented, which will replicate the historic buildings and landscape with distant meridian markers and position of its principal historic instruments within a model of the night sky showing the position of bright stars. The virtual reality model can be used for educational purposes allowing the instruments within the historic building model to be manipulated within 3D space to demonstrate how the position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines and virtual learning platforms and concepts.

  13. Armagh Observatory - Historic Building Information Modelling for Virtual Learning in Building Conservation

    Science.gov (United States)

    Murphy, M.; Chenaux, A.; Keenaghan, G.; GIbson, V..; Butler, J.; Pybusr, C.

    2017-08-01

    In this paper the recording and design for a Virtual Reality Immersive Model of Armagh Observatory are presented, which will replicate the historic buildings and landscape with distant meridian markers and position of its principal historic instruments within a model of the night sky showing the position of bright stars. The virtual reality model can be used for educational purposes allowing the instruments within the historic building model to be manipulated within 3D space to demonstrate how the position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines and virtual learning platforms and concepts.

  14. Modelling the heat dynamics of buildings using stochastic

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae; Madsen, Henrik

    2000-01-01

    This paper describes the continuous time modelling of the heat dynamics of a building. The considered building is a residential like test house divided into two test rooms with a water based central heating. Each test room is divided into thermal zones in order to describe both short and long term...... variations. Besides modelling the heat transfer between thermal zones, attention is put on modelling the heat input from radiators and solar radiation. The applied modelling procedure is based on collected building performance data and statistical methods. The statistical methods are used in parameter...
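
    A minimal stand-in for this kind of grey-box model is a single thermal zone with one resistance and one capacitance, driven by radiator and solar inputs plus a small process noise term. The discretized sketch below is only illustrative: the structure is far simpler than the multi-zone model described, and all parameter values are assumed.

    ```python
    import numpy as np

    # Assumed lumped parameters for a single thermal zone
    C  = 5.0e6       # heat capacity of the zone [J/K]
    R  = 0.01        # thermal resistance to ambient [K/W]
    Aw = 4.0         # effective solar aperture [m2]
    dt = 600.0       # time step [s]

    def step(T_in, T_out, phi_heat, phi_sun, rng):
        """One Euler step of C*dT/dt = (T_out - T_in)/R + phi_heat + Aw*phi_sun, plus noise."""
        dT = ((T_out - T_in) / R + phi_heat + Aw * phi_sun) / C
        return T_in + dt * dT + rng.normal(scale=0.01)   # small process noise [K]

    rng = np.random.default_rng(1)
    T = 20.0
    for _ in range(144):                                 # one day in 10-minute steps
        T = step(T, T_out=5.0, phi_heat=2000.0, phi_sun=100.0, rng=rng)
    print(T)
    ```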

  15. A Heat Dynamic Model for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2015-01-01

    This article presents a heat dynamic model for prediction of the indoor temperature in an office building. The model has been used in several flexible load applications, where the indoor temperature is allowed to vary around a given reference to provide power system services by shifting the heating...... of the building in time. This way the thermal mass of the building can be used to absorb energy from renewable energy source when available and postpone heating in periods with lack of renewable energy generation. The model is used in a model predictive controller to ensure the residential comfort over a given...

  16. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  17. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  18. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches

    Directory of Open Access Journals (Sweden)

    Mojtaba Valinejad Shoubi

    2015-03-01

    Full Text Available A sustainable building is constructed of materials that could decrease environmental impacts, such as energy usage, during the lifecycle of the building. Building Information Modeling (BIM) has been identified as an effective tool for building performance analysis virtually in the design stage. The main aims of this study were to assess various combinations of materials using BIM and identify alternative, sustainable solutions to reduce operational energy consumption. The amount of energy consumed by a double story bungalow house in Johor, Malaysia, and assessments of alternative material configurations to determine the best energy performance were evaluated by using Revit Architecture 2012 and Autodesk Ecotect Analysis software to show which of the materials helped in reducing the operational energy use of the building to the greatest extent throughout its annual life cycle. At the end, some alternative, sustainable designs in terms of energy savings have been suggested.

  19. Modelling inspection policies for building maintenance.

    Science.gov (United States)

    Christer, A H

    1982-08-01

    A method of assessing the potential of an inspection maintenance policy as opposed to an existing breakdown maintenance policy for a building complex is developed. The method is based upon information likely to be available and specific subjective assessments which could be made available. Estimates of the expected number of defects identified at an inspection and the consequential cost saving are presented as functions of the inspection frequency.
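
    The trade-off described above is often formalized with a delay-time argument: defects arise at some rate and only develop into breakdowns after a random delay, so inspecting every T time units catches the slower defects first. The numeric sketch below rests on assumptions of our own (uniform defect arrivals, an exponential delay-time distribution, invented rate and mean values) rather than on the cited paper's data.

    ```python
    import numpy as np

    k    = 0.4   # defect arrival rate per week (assumed)
    mean = 3.0   # mean delay time in weeks before a defect becomes a breakdown (assumed)

    def breakdown_fraction(T, n=10_000):
        """P(defect becomes a breakdown before the next inspection), inspections every T.

        Uses b(T) = (1/T) * integral_0^T F(T - y) dy with F the exponential delay-time cdf.
        """
        y = np.linspace(0.0, T, n)
        F = 1.0 - np.exp(-(T - y) / mean)
        return np.trapz(F, y) / T

    for T in (1.0, 4.0, 13.0):
        b = breakdown_fraction(T)
        print(f"T={T:5.1f} weeks: breakdowns/cycle={k * T * b:5.2f}, "
              f"defects found at inspection={k * T * (1 - b):5.2f}")
    ```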

  20. Integrating Building Information Modeling and Augmented Reality to Improve Investigation of Historical Buildings

    Directory of Open Access Journals (Sweden)

    Francesco Chionna

    2015-12-01

    Full Text Available This paper describes an experimental system to support investigation of historical buildings using Building Information Modeling (BIM) and Augmented Reality (AR). The system requires the use of offline software to build the BIM representation and defines a method to integrate diagnostic data into BIM. The system offers access to such information during site investigation using AR glasses supported by marker and marker-less technologies. The main innovation is the possibility to contextualize through AR not only existing BIM properties but also results from non-invasive tools. User evaluations show how the use of the system may enhance the perception of engineers during the investigation process.

  1. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    Science.gov (United States)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires are risky events that can lead to disaster and massive destruction. The management of and response to building fires have always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) were implemented, and the relationships between the elements were discussed as well. Finally, with the theory and framework of VGE, the technology of a building fire scene system with VGE was designed within the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  2. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schellen, H.L.; Schijndel, van A.W.M.; Blades, N.

    2014-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  3. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schijndel, van A.W.M.; Schellen, H.L.; Blades, N.; Mahdavi, A.; Mertens, B.

    2013-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  4. Building Component Library: An Online Repository to Facilitate Building Energy Model Creation; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, K.; Long, N.; Swindler, A.

    2012-05-01

    This paper describes the Building Component Library (BCL), the U.S. Department of Energy's (DOE) online repository of building components that can be directly used to create energy models. This comprehensive, searchable library consists of components and measures as well as the metadata which describes them. The library is also designed to allow contributors to easily add new components, providing a continuously growing, standardized list of components for users to draw upon.

  5. The Dutch sustainable building policy: A model for developing countries?

    Energy Technology Data Exchange (ETDEWEB)

    Melchert, Luciana [Faculty of Architecture and Urbanism, University of Sao Paulo, Rua do Lago, 876, CEP 05508.900, Sao Paulo SP (Brazil)

    2007-02-15

    This article explores the institutionalization of environmental policies in the Dutch building sector and the applicability of the current model to developing countries. First, it analyzes the transition of sustainable building practices in the Netherlands from the 1970s until today, exploring how these were originally embedded in a discourse on 'de-modernization', which attempted to improve the environmental performance of building stocks by means of self-sufficient technologies, whereas nowadays they adopt a framework of 'ecological modernization', with integrative approaches seeking to improve the environmental performance of building stocks through more efficient (rather than self-sufficient) technologies. The study subsequently shows how the current Dutch sustainable building framework has thereby managed to achieve a pragmatic and widely accepted rationale, which can serve to orient the ecological restructuring of building stocks in developing countries. (author)

  6. Guidelines for Using Building Information Modeling for Energy Analysis of Buildings

    Directory of Open Access Journals (Sweden)

    Thomas Reeves

    2015-12-01

    Full Text Available Building energy modeling (BEM), a subset of building information modeling (BIM), integrates energy analysis into the design, construction, and operation and maintenance of buildings. As there are various existing BEM tools available, there is a need to evaluate the utility of these tools in various phases of the building lifecycle. The goal of this research was to develop guidelines for evaluation and selection of BEM tools to be used in particular building lifecycle phases. The objectives of this research were to: (1) Evaluate existing BEM tools; (2) Illustrate the application of the three BEM tools; (3) Re-evaluate the three BEM tools; and (4) Develop guidelines for evaluation, selection and application of BEM tools in the design, construction and operation/maintenance phases of buildings. Twelve BEM tools were initially evaluated using four criteria: interoperability, usability, available inputs, and available outputs. Each of the top three BEM tools selected based on this initial evaluation was used in a case study to simulate and evaluate energy usage, daylighting performance, and natural ventilation for two academic buildings (LEED-certified and non-LEED-certified). The results of the case study were used to re-evaluate the three BEM tools using the initial criteria with the addition of two new criteria (speed and accuracy), and to develop guidelines for evaluating and selecting BEM tools to analyze building energy performance. The major contribution of this research is the development of these guidelines that can help potential BEM users to identify the most appropriate BEM tool for application in particular building lifecycle phases.
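
    The kind of multi-criteria comparison described can be reduced to a simple weighted-score calculation once the criteria are fixed; the tools, weights and scores below are placeholders rather than the study's results.

    ```python
    import numpy as np

    criteria = ["interoperability", "usability", "inputs", "outputs", "speed", "accuracy"]
    weights  = np.array([0.2, 0.15, 0.15, 0.15, 0.15, 0.2])   # assumed weights, sum to 1

    # Rows: hypothetical tools A, B, C; columns: 1-5 scores per criterion (invented)
    scores = np.array([[4, 3, 4, 4, 3, 4],
                       [3, 4, 3, 3, 4, 3],
                       [5, 2, 4, 3, 2, 5]], dtype=float)

    totals = scores @ weights
    for tool, total in zip(["Tool A", "Tool B", "Tool C"], totals):
        print(f"{tool}: weighted score = {total:.2f}")
    ```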

  7. Building energy modeling for green architecture and intelligent dashboard applications

    Science.gov (United States)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied the passive technique called the roof solar chimney for reducing the cooling load in homes architecturally. Three models of the chimney were created: a zonal building energy model, computational fluid dynamics model, and numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it for building simulation reliably. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high and low efficiency constructions, and three ventilation strategies. The results showed that in four US climates, the roof solar chimney results in significant cooling load energy savings of up to 90%. After developing this new method for the small scale representation of a passive architecture technique in BEM, the study expanded the scope to address a fundamental issue in modeling - the implementation of the uncertainty from and improvement of occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration to the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the

  8. Features of Functioning the Integrated Building Thermal Model

    Directory of Open Access Journals (Sweden)

    Morozov Maxim N.

    2017-01-01

    Full Text Available A model of the building heating system, consisting of an energy source, a distributed automatic control system, elements of an individual heating unit and the heating system itself, is designed. The Simulink application of the Matlab mathematical package is selected as the platform for the model. The specialized Simscape libraries, in combination with a wide range of Matlab mathematical tools, make it possible to apply the “acausal” modeling concept. Implementing the “physical” representation of the object model improved the accuracy of the models. The principle of operation and the functional features of the thermal model are described. Investigations of the building's cooling dynamics were carried out.

  9. Building 3D models with modo 701

    CERN Document Server

    García, Juan Jiménez

    2013-01-01

    The book focuses on creating a sample application, building gradually from chapter to chapter. If you are new to the 3D world, this is the key to getting started with modern software in the modern visualization industry. Only minimal previous knowledge is needed. If you have some previous knowledge about 3D content creation, you will find useful tricks that differentiate the learning experience from that of a typical user manual: this is a practical guide concerning the most common problems and situations and how to solve them.

  10. Artificial intelligence support for scientific model-building

    Science.gov (United States)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  11. A Learning Framework for Control-Oriented Modeling of Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.; Vishnu, Abhinav; Vrabie, Draguna L.

    2018-01-18

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework, that can drive several use cases related to building energy management.
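
    As a rough sketch of the kind of recurrent model the paragraph describes (not the authors' architecture or data), the snippet below maps a short window of exogenous inputs to the next consumption value with a small LSTM; PyTorch is assumed to be available and all dimensions are illustrative.

    ```python
    import torch
    import torch.nn as nn

    class EnergyRNN(nn.Module):
        """Tiny LSTM regressor: a window of exogenous inputs -> next consumption value."""
        def __init__(self, n_features=4, hidden=32):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                  # x: (batch, time, n_features)
            out, _ = self.lstm(x)
            return self.head(out[:, -1, :])    # predict from the last hidden state

    model = EnergyRNN()
    x = torch.randn(8, 24, 4)                  # 8 buildings, 24 hourly steps, 4 inputs
    y = torch.randn(8, 1)                      # observed consumption (synthetic)

    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()                                 # one training step; loop over batches in practice
    print(float(loss))
    ```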

  12. BIM-Enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in an IFC's schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.

  13. Integration of Models of Building Interiors with Cadastral Data

    OpenAIRE

    Gotlib Dariusz; Karabin Marcin

    2017-01-01

    Demand for applications which use models of building interiors is growing and highly diversified. Those models are applied at the design and construction stage of a building, in applications which support real estate management, in navigation and marketing systems and, finally, in crisis management and security systems. They are created on the basis of different data: architectural and construction plans, both in analogue form and as CAD files, BIM data files, by means of las...

  14. 'Semi-realistic'F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models

  15. Numeric Analysis for Relationship-Aware Scalable Streaming Scheme

    Directory of Open Access Journals (Sweden)

    Heung Ki Lee

    2014-01-01

    Full Text Available Frequent packet loss of media data is a critical problem that degrades the quality of streaming services over mobile networks. Packet loss invalidates frames containing lost packets and other related frames at the same time. Indirect loss caused by losing packets decreases the quality of streaming. A scalable streaming service can decrease the amount of dropped multimedia resulting from a single packet loss. Content providers typically divide one large media stream into several layers through a scalable streaming service and then provide each scalable layer to the user depending on the mobile network. Also, a scalable streaming service makes it possible to decode partial multimedia data depending on the relationship between frames and layers. Therefore, a scalable streaming service provides a way to decrease the wasted multimedia data when one packet is lost. However, the hierarchical structure between frames and layers of scalable streams determines the service quality of the scalable streaming service. Even if whole packets of layers are transmitted successfully, they cannot be decoded as a result of the absence of reference frames and layers. Therefore, the complicated relationship between frames and layers in a scalable stream increases the volume of abandoned layers. For providing a high-quality scalable streaming service, we choose a proper relationship between scalable layers as well as the amount of transmitted multimedia data depending on the network situation. We prove that a simple scalable scheme outperforms a complicated scheme in an error-prone network. We suggest an adaptive set-top box (AdaptiveSTB) to lower the dependency between scalable layers in a scalable stream. Also, we provide a numerical model to obtain the indirect loss of multimedia data and apply it to various multimedia streams. Our AdaptiveSTB enhances the quality of a scalable streaming service by removing indirect loss.
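
    The indirect loss discussed above can be made concrete with a toy dependency check: a received layer is only decodable if every layer it references was itself decodable. The layer names, dependency graph and loss pattern below are invented for illustration and are not the paper's stream structure.

    ```python
    # Hypothetical dependency graph: each scalable layer lists the layers it references,
    # given in dependency order (referenced layers first).
    deps = {
        "base":      [],
        "enh1":      ["base"],
        "enh2":      ["enh1"],        # deep chain: losing enh1 also invalidates enh2
        "enh2_flat": ["base"],        # flatter alternative: depends only on the base layer
    }

    def decodable(received, deps):
        """Return the set of layers that can actually be decoded."""
        ok = set()
        for layer in deps:
            if layer in received and all(d in ok for d in deps[layer]):
                ok.add(layer)
        return ok

    received = {"base", "enh2", "enh2_flat"}   # enh1 was lost in transit
    print(decodable(received, deps))           # enh2 is wasted; enh2_flat still decodes
    ```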

  16. Four-dimensional strings: Phenomenology and model building

    International Nuclear Information System (INIS)

    Quiros, M.

    1989-01-01

    In these lectures we will review some of the last developments in string theories leading to the construction of realistic four-dimensional string models. Special attention will be paid to world-sheet and space-time supersymmetry, modular invariance and model building for supersymmetric and (tachyon-free) nonsupersymmetric ten and four-dimensional models. (orig.)

  17. Building models for marketing decisions : Past, present and future

    NARCIS (Netherlands)

    Leeflang, PSH; Wittink, DR

    We review five eras of model building in marketing, with special emphasis on the fourth and the fifth eras, the present and the future. At many firms managers now routinely use model-based results for marketing decisions. Given an increasing number of successful applications, the demand for models

  18. Building aggregate timber supply models from individual harvest choice

    Science.gov (United States)

    Maksym Polyakov; David N. Wear; Robert Huggett

    2009-01-01

    Timber supply has traditionally been modelled using aggregate data. In this paper, we build aggregate supply models for four roundwood products for the US state of North Carolina from a stand-level harvest choice model applied to detailed forest inventory. The simulated elasticities of pulpwood supply are much lower than reported by previous studies. Cross price...

  19. Building a better model of cancer

    Directory of Open Access Journals (Sweden)

    DeGregori James

    2006-10-01

    Full Text Available The 2006 Cold Spring Harbor Laboratory meeting on the Mechanisms and Models of Cancer was held August 16–20. The meeting featured several hundred presentations of many short talks (mostly selected from the abstracts and posters), with the airing of a number of exciting new discoveries. We will focus this meeting review on models of cancer (primarily mouse models), highlighting recent advances in new mouse models that better recapitulate sporadic tumorigenesis, demonstrations of tumor addiction to tumor suppressor inactivation, new insight into senescence as a tumor barrier, improved understanding of the evolutionary paths of cancer development, and environmental/immunological influences on cancer.

  20. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Staci R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-01

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  1. PKI Scalability Issues

    OpenAIRE

    Slagell, Adam J; Bonilla, Rafael

    2004-01-01

    This report surveys different PKI technologies such as PKIX and SPKI and the issues of PKI that affect scalability. Much focus is spent on certificate revocation methodologies and status verification systems such as CRLs, Delta-CRLs, CRS, Certificate Revocation Trees, Windowed Certificate Revocation, OCSP, SCVP and DVCS.

  2. RF building block modeling: optimization and synthesis

    NARCIS (Netherlands)

    Cheng, W.

    2012-01-01

    For circuit designers it is desirable to have relatively simple RF circuit models that do give decent estimation accuracy and provide sufficient understanding of circuits. Chapter 2 in this thesis shows a general weak nonlinearity model that meets these demands. Using a method that is related to

  3. Team learning: building shared mental models

    NARCIS (Netherlands)

    Bossche, van den P.; Gijselaers, W.; Segers, M.; Woltjer, G.B.; Kirschner, P.

    2011-01-01

    To gain insight in the social processes that underlie knowledge sharing in teams, this article questions which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning

  4. Building analytical three-field cosmological models

    Energy Technology Data Exchange (ETDEWEB)

    Santos, J.R.L. [Universidade de Federal de Campina Grande, Unidade Academica de Fisica, Campina Grande, PB (Brazil); Moraes, P.H.R.S. [ITA-Instituto Tecnologico de Aeronautica, Sao Jose dos Campos, SP (Brazil); Ferreira, D.A. [Universidade de Federal de Campina Grande, Unidade Academica de Fisica, Campina Grande, PB (Brazil); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, PB (Brazil); Neta, D.C.V. [Universidade de Federal de Campina Grande, Unidade Academica de Fisica, Campina Grande, PB (Brazil); Universidade Estadual da Paraiba, Departamento de Fisica, Campina Grande, PB (Brazil)

    2018-02-15

    A difficult task to deal with is the analytical treatment of models composed of three real scalar fields, as their equations of motion are in general coupled and hard to integrate. In order to overcome this problem we introduce a methodology to construct three-field models based on the so-called "extension method". The fundamental idea of the procedure is to combine three one-field systems in a non-trivial way, to construct an effective three scalar field model. An interesting scenario where the method can be implemented is with inflationary models, where the Einstein-Hilbert Lagrangian is coupled with the scalar field Lagrangian. We exemplify how a new model constructed from our method can lead to non-trivial behaviors for cosmological parameters. (orig.)

  5. Stochastic approaches to inflation model building

    International Nuclear Information System (INIS)

    Ramirez, Erandy; Liddle, Andrew R.

    2005-01-01

    While inflation gives an appealing explanation of observed cosmological data, there are a wide range of different inflation models, providing differing predictions for the initial perturbations. Typically models are motivated either by fundamental physics considerations or by simplicity. An alternative is to generate large numbers of models via a random generation process, such as the flow equations approach. The flow equations approach is known to predict a definite structure to the observational predictions. In this paper, we first demonstrate a more efficient implementation of the flow equations exploiting an analytic solution found by Liddle (2003). We then consider alternative stochastic methods of generating large numbers of inflation models, with the aim of testing whether the structures generated by the flow equations are robust. We find that while typically there remains some concentration of points in the observable plane under the different methods, there is significant variation in the predictions amongst the methods considered

  6. Building functional networks of spiking model neurons.

    Science.gov (United States)

    Abbott, L F; DePasquale, Brian; Memmesheimer, Raoul-Martin

    2016-03-01

    Most of the networks used by computer scientists and many of those studied by modelers in neuroscience represent unit activities as continuous variables. Neurons, however, communicate primarily through discontinuous spiking. We review methods for transferring our ability to construct interesting networks that perform relevant tasks from the artificial continuous domain to more realistic spiking network models. These methods raise a number of issues that warrant further theoretical and experimental study.
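
    The "spiking model neuron" building block mentioned above can be illustrated with a minimal leaky integrate-and-fire network; the sketch below uses illustrative parameter values and random weights of our own choosing and is not taken from the paper:

        # Minimal sketch of a small network of leaky integrate-and-fire (LIF) neurons.
        # All parameter values are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        n, steps, dt = 10, 1000, 1e-3          # neurons, time steps, step size (s)
        tau, v_rest, v_thresh, v_reset = 0.02, -65.0, -50.0, -65.0
        w = rng.normal(0.0, 2.0, size=(n, n))  # recurrent weights (mV per spike)
        np.fill_diagonal(w, 0.0)
        i_ext = rng.uniform(15.0, 25.0, size=n)  # constant external drive (mV)

        v = np.full(n, v_rest)
        spikes = np.zeros((steps, n), dtype=bool)
        for t in range(steps):
            rec = w @ spikes[t - 1] if t > 0 else np.zeros(n)  # recurrent input
            v += dt / tau * (-(v - v_rest) + i_ext) + rec
            fired = v >= v_thresh
            spikes[t] = fired
            v[fired] = v_reset            # reset after a spike

        print("spike counts per neuron:", spikes.sum(axis=0))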

  7. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  8. Introduction of Building Information Modeling (BIM) Technologies in Construction

    Science.gov (United States)

    Milyutina, M. A.

    2018-05-01

    The issues of introducing building information modeling (BIM) in the construction industry are considered in this work. The advantages of this approach and the prospects for the transition to new design technologies, construction process management, and operation in the near future are stated. The importance of developing pilot projects that should identify the ways and means of verifying the regulatory and technical base, as well as economic indicators, in the transition to Building Information Technologies in construction is noted.

  9. String consistency for unified model building

    International Nuclear Information System (INIS)

    Chaudhuri, S.; Chung, S.W.; Hockney, G.; Lykken, J.

    1995-01-01

    We explore the use of real fermionization as a test case for understanding how specific features of phenomenological interest in the low-energy effective superpotential are realized in exact solutions to heterotic superstring theory. We present pedagogic examples of models which realize SO(10) as a level two current algebra on the world-sheet, and discuss in general how higher level current algebras can be realized in the tensor product of simple constituent conformal field theories. We describe formal developments necessary to compute couplings in models built using real fermionization. This allows us to isolate cases of spin structures where the standard prescription for real fermionization may break down. (orig.)

  10. Building probabilistic graphical models with Python

    CERN Document Server

    Karkera, Kiran R

    2014-01-01

    This is a short, practical guide that allows data scientists to understand the concepts of Graphical models and enables them to try them out using small Python code snippets, without being too mathematically complicated. If you are a data scientist who knows about machine learning and want to enhance your knowledge of graphical models, such as Bayes network, in order to use them to solve real-world problems using Python libraries, this book is for you. This book is intended for those who have some Python and machine learning experience, or are exploring the machine learning field.
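
    As a minimal, library-free illustration of the kind of Bayes network reasoning such a guide covers (a made-up two-node example, not code from the book), a tiny Rain -> WetGrass network can be queried by brute-force enumeration:

        # Minimal sketch of a two-node Bayesian network queried by enumeration.
        # Probability values are invented for illustration.
        p_rain = {True: 0.2, False: 0.8}
        p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                            False: {True: 0.2, False: 0.8}}

        def joint(rain, wet):
            return p_rain[rain] * p_wet_given_rain[rain][wet]

        # P(Rain | WetGrass = True) via Bayes' rule
        evidence = sum(joint(r, True) for r in (True, False))
        posterior = joint(True, True) / evidence
        print(f"P(rain | wet grass) = {posterior:.3f}")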

  11. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    Full Text Available In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  12. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.
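
    One common way to honor exact and inexact data in a single pass (a generic weighted least-squares sketch, not necessarily the author's procedure) is to weight each observation by the inverse of its variance, with near-zero variances standing in for the "exact" points:

        # Minimal sketch: fit a linear trend y = a*x + b where some points are
        # effectively exact (tiny sigma) and others are inexact (larger sigma).
        import numpy as np

        x     = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        y     = np.array([10.0, 11.9, 14.2, 15.8, 18.1])   # illustrative observations
        sigma = np.array([1e-3, 0.5, 1e-3, 0.8, 0.3])       # ~0 sigma = "exact" point
        w     = 1.0 / sigma**2

        A = np.vstack([x, np.ones_like(x)]).T               # design matrix for a*x + b
        sw = np.sqrt(w)                                      # weighted least squares
        coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        print("slope, intercept:", coef)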

  13. Building Information Model: advantages, tools and adoption efficiency

    Science.gov (United States)

    Abakumov, R. G.; Naumov, A. E.

    2018-03-01

    The paper expands on the definition and essence of Building Information Modeling. It describes the content and effects of applying Information Modeling at different stages of a real property item. An analysis of long-term and short-term advantages is given. The authors include an analytical review of the Revit software package in comparison with Autodesk with respect to features, advantages and disadvantages, cost and pay cutoff. A prognostic calculation is given for the efficiency of adopting Building Information Modeling technology, with examples of its successful adoption in Russia and worldwide.

  14. A model for the sustainable selection of building envelope assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Huedo, Patricia, E-mail: huedo@uji.es [Universitat Jaume I (Spain); Mulet, Elena, E-mail: emulet@uji.es [Universitat Jaume I (Spain); López-Mesa, Belinda, E-mail: belinda@unizar.es [Universidad de Zaragoza (Spain)

    2016-02-15

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  15. A model for the sustainable selection of building envelope assemblies

    International Nuclear Information System (INIS)

    Huedo, Patricia; Mulet, Elena; López-Mesa, Belinda

    2016-01-01

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  16. Modelling energy demand in the Norwegian building stock

    Energy Technology Data Exchange (ETDEWEB)

    Sartori, Igor

    2008-07-15

    Energy demand in the building stock in Norway represents about 40% of the final energy consumption, of which 22% goes to the residential sector and 18% to the service sector. In Norway there is a strong dependency on electricity for heating purposes, with electricity covering about 80% of the energy demand in buildings. The building sector can play an important role in the achievement of a more sustainable energy system. The work performed in the articles presented in this thesis investigates various aspects related to the energy demand in the building sector, both in singular cases and in the stock as a whole. The work performed in the first part of this thesis on development and survey of case studies provided background knowledge that was then used in the second part, on modelling the entire stock. In the first part, a literature survey of case studies showed that, in a life cycle perspective, the energy used in the operating phase of buildings is the single most important factor. Design of low-energy buildings is then beneficial and should be pursued, even though it implies a somewhat higher embodied energy. A case study was performed on a school building. First, a methodology using a Monte Carlo method in the calibration process was explored. Then, the calibrated model of the school was used to investigate measures for the achievement of high energy efficiency standard through renovation work. In the second part, a model was developed to study the energy demand in a scenario analysis. The results showed the robustness of policies that included conservation measures against the conflicting effects of the other policies. Adopting conservation measures on a large scale showed the potential to reduce both electricity and total energy demand from present day levels while the building stock keeps growing. The results also highlighted the inertia to change of the building stock, due to low activity levels compared to the stock size. It also became clear that a deeper
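
    The Monte Carlo calibration idea mentioned above can be illustrated with a toy example (an invented stand-in energy model and made-up numbers, not the thesis' actual calibration of the school building): sample candidate parameter sets at random, run the model, and keep the sets that best reproduce the measured consumption.

        # Minimal sketch of Monte Carlo calibration against a measured annual consumption.
        import numpy as np

        rng = np.random.default_rng(0)
        measured_kwh = 120_000.0                      # assumed metered annual consumption

        def toy_energy_model(u_value, infiltration):
            # stand-in for a building simulation: energy rises with both parameters
            return 60_000.0 + 150_000.0 * u_value + 40_000.0 * infiltration

        samples = rng.uniform([0.1, 0.2], [0.6, 1.5], size=(5_000, 2))
        errors = np.array([abs(toy_energy_model(u, inf) - measured_kwh)
                           for u, inf in samples])
        best = samples[np.argsort(errors)[:10]]       # 10 best-fitting parameter sets
        print("calibrated U-value range:",
              best[:, 0].min().round(3), "-", best[:, 0].max().round(3))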

  17. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...
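
    A hedged illustration of the kind of generalised linear mixed model referred to above (a simplified example, not the authors' exact specification) is a logistic model for an occupant action, say opening a window, with fixed effects for indoor and outdoor conditions and a random intercept per occupant:

        \operatorname{logit}\, p_{it} = \beta_0 + \beta_1 T^{\mathrm{in}}_{it} + \beta_2 T^{\mathrm{out}}_{it} + b_i,
        \qquad b_i \sim \mathcal{N}(0, \sigma_b^{2})

    The estimated between-occupant variance \sigma_b^{2} then quantifies the behavioural diversity that the framework aims to capture.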

  18. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  19. Modeling Aggregate Hourly Energy Consumption in a Regional Building Stock

    Directory of Open Access Journals (Sweden)

    Anna Kipping

    2017-12-01

    Full Text Available Sound estimates of future heat and electricity demand with high temporal and spatial resolution are needed for energy system planning, grid design, and evaluating demand-side management options and policies on regional and national levels. In this study, smart meter data on electricity consumption in buildings are combined with cross-sectional building information to model hourly electricity consumption within the household and service sectors on a regional basis in Norway. The same modeling approach is applied to model aggregate hourly district heat consumption in three different consumer groups located in Oslo. A comparison of modeled and metered hourly energy consumption shows that hourly variations and aggregate consumption per county and year are reproduced well by the models. However, for some smaller regions, modeled annual electricity consumption is over- or underestimated by more than 20%. Our results indicate that the presented method is useful for modeling the current and future hourly energy consumption of a regional building stock, but that larger and more detailed training datasets are required to improve the models, and more detailed building stock statistics at the regional level are needed to generate useful estimates on aggregate regional energy consumption.

  20. Building metaphors and extending models of grief.

    Science.gov (United States)

    VandeCreek, L

    1985-01-01

    Persons in grief turn to metaphors as they seek to understand and express their experience. Metaphors illustrated in this article include "grief is a whirlwind," "grief is the Great Depression all over again" and "grief is gray, cloudy and rainy weather." Hospice personnel can enhance their bereavement efforts by identifying and cultivating the expression of personal metaphors from patients and families. Two metaphors have gained wide cultural acceptance and lie behind contemporary scientific explorations of grief. These are "grief is recovery from illness" (Bowlby and Parkes) and "death is the last stage of growth and grief is the adjustment reaction to this growth" (Kubler-Ross). These models have developed linear perspectives of grief but have neglected to study the fluctuating intensity of symptoms. Adopting Worden's four-part typology of grief, the author illustrates how the pie graph can be used to display this important aspect of the grief experience, thus enhancing these models.

  1. Sustainability Product Properties in Building Information Models

    Science.gov (United States)

    2012-09-01

    preferred carpool parking spots, preferred low-emitting/fuel-efficient vehicle parking spots, bike racks and telecommuting as options to promote good...most part, these have not been in a computable form. Fallon then stressed the importance of a common conceptual framework, using the IFC model...organizations would be formed with the help of Mr. Kalin. He stressed the goal of the project was to create templates that would be free to use

  2. Group theory for unified model building

    International Nuclear Information System (INIS)

    Slansky, R.

    1981-01-01

    The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w x U(1)_w x SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E6, SO10, SU6, F4, SO9, SO5, SO8, SO7, SU4, E7, E8, SU8, SO14, SO18, SO22, and for completeness, SU3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
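
    For illustration, a standard branching rule of the kind tabulated in such reviews (a textbook result, not quoted from this report) is the decomposition of the SU(5) matter representations under the SU(3)_c x SU(2)_w x U(1)_Y subgroup:

        \bar{5} \rightarrow (\bar{3},1)_{1/3} \oplus (1,2)_{-1/2},
        \qquad
        10 \rightarrow (3,2)_{1/6} \oplus (\bar{3},1)_{-2/3} \oplus (1,1)_{1}

    which accounts for the d^c, L, Q, u^c and e^c fields of one Standard Model generation.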

  3. Linking Remote Sensing Data and Energy Balance Models for a Scalable Agriculture Insurance System for sub-Saharan Africa

    Science.gov (United States)

    Brown, M. E.; Osgood, D. E.; McCarty, J. L.; Husak, G. J.; Hain, C.; Neigh, C. S. R.

    2014-12-01

    One of the most immediate and obvious impacts of climate change is on the weather-sensitive agriculture sector. Both local and global impacts on production of food will have a negative effect on the ability of humanity to meet its growing food demands. Agriculture has become more risky, particularly for farmers in the most vulnerable and food insecure regions of the world such as East Africa. Smallholders and low-income farmers need better financial tools to reduce the risk to food security while enabling productivity increases to meet the needs of a growing population. This paper will describe a recently funded project that brings together climate science, economics, and remote sensing expertise to focus on providing a scalable and sensor-independent remote sensing based product that can be used in developing regional rainfed agriculture insurance programs around the world. We will focus our efforts in Ethiopia and Kenya in East Africa and in Senegal and Burkina Faso in West Africa, where there are active index insurance pilots that can test the effectiveness of our remote sensing-based approach for use in the agriculture insurance industry. The paper will present the overall program, explain links to the insurance industry, and present comparisons of the four remote sensing datasets used to identify drought: the CHIRPS 30-year rainfall data product, the GIMMS 30-year vegetation data product from AVHRR, the ESA soil moisture ECV-30 year soil moisture data product, and a MODIS Evapotranspiration (ET) 15-year dataset. A summary of next year's plans for this project will be presented at the close of the presentation.

  4. Experimental and analytical studies of a deeply embedded reactor building model considering soil-building interaction. Pt. 1

    International Nuclear Information System (INIS)

    Tanaka, H.; Ohta, T.; Uchiyama, S.

    1979-01-01

    The purpose of this paper is to describe the dynamic characteristics of a deeply embedded reactor building model derived from experimental and analytical studies which consider soil-building interaction behaviour. The model building is made of reinforced concrete. It has two stories above ground level and a basement, resting on a sandy gravel layer at a depth of 3 meters. The backfill around the building was made to ground level. The model building is simplified and reduced to about one-fifteenth (1/15) of the prototype. It has a bearing wall system for the basement and the first story, and a frame system for the second. (orig.)

  5. Building Scalable Knowledge Graphs for Earth Science

    Science.gov (United States)

    Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick; Zhang, Jia; Duan, Xiaoyi; Miller, J. J.; Bugbee, Kaylin; Christopher, Sundar; Freitag, Brian

    2017-01-01

    Knowledge Graphs link key entities in a specific domain with other entities via relationships. From these relationships, researchers can query knowledge graphs for probabilistic recommendations to infer new knowledge. Scientific papers are an untapped resource which knowledge graphs could leverage to accelerate research discovery. Goal: Develop an end-to-end (semi) automated methodology for constructing Knowledge Graphs for Earth Science.
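
    A minimal sketch of the underlying idea, with invented entities and relations (not the project's actual graph, schema or extraction pipeline): store subject-predicate-object triples and answer relationship queries over them.

        # Minimal sketch of a knowledge graph as triples with a simple relationship query.
        # Entity and relation names are illustrative only.
        triples = [
            ("GPM",           "observes",  "precipitation"),
            ("precipitation", "relatedTo", "flooding"),
            ("SMAP",          "observes",  "soil moisture"),
            ("soil moisture", "relatedTo", "drought"),
        ]

        def related(entity, predicate):
            """Return all objects linked to `entity` by `predicate`."""
            return [o for s, p, o in triples if s == entity and p == predicate]

        # e.g. which phenomena are connected to the variables GPM observes?
        for variable in related("GPM", "observes"):
            print(variable, "->", related(variable, "relatedTo"))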

  6. BIM, GIS and semantic models of cultural heritage buildings

    Directory of Open Access Journals (Sweden)

    Pavel Tobiáš

    2016-12-01

    Full Text Available Even though there has been a great development of using building information models in the AEC (Architecture/Engineering/Construction) sector recently, creation of models of existing buildings is still not very usual. The cultural heritage documentation is still, in most cases, kept in the form of 2D drawings while these drawings mostly contain only geometry without semantics, attributes or definitions of relationships and hierarchies between particular building elements. All this additional information would, however, be very beneficial for the tasks of cultural heritage preservation, i.e. for the facility management of heritage buildings or for reconstruction planning, and it would be suitable to manage all geometric and non-geometric information in a single 3D information model. This paper is based on the existing literature and focuses on historic building information modelling to provide information about the current state of the art. First, a summary of available software tools is introduced, considering not only BIM tools but also the related GIS software. This is followed by a review of existing efforts worldwide and an evaluation of the facts found.

  7. Building entity models through observation and learning

    Science.gov (United States)

    Garcia, Richard; Kania, Robert; Fields, MaryAnne; Barnes, Laura

    2011-05-01

    To support the missions and tasks of mixed robotic/human teams, future robotic systems will need to adapt to the dynamic behavior of both teammates and opponents. One of the basic elements of this adaptation is the ability to exploit both long and short-term temporal data. This adaptation allows robotic systems to predict/anticipate, as well as influence, future behavior for both opponents and teammates and will afford the system the ability to adjust its own behavior in order to optimize its ability to achieve the mission goals. This work is a preliminary step in the effort to develop online entity behavior models through a combination of learning techniques and observations. As knowledge is extracted from the system through sensor and temporal feedback, agents within the multi-agent system attempt to develop and exploit a basic movement model of an opponent. For the purpose of this work, extraction and exploitation is performed through the use of a discretized two-dimensional game. The game consists of a predetermined number of sentries attempting to keep an unknown intruder agent from penetrating their territory. The sentries utilize temporal data coupled with past opponent observations to hypothesize the probable locations of the opponent and thus optimize their guarding locations.
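
    One simple way to maintain such an opponent-location estimate (our own illustration, not the paper's model) is a Bayesian update over a discretized grid of possible intruder positions:

        # Minimal sketch: sentries keep a probability grid over intruder cells and update
        # it from a noisy "seen / not seen" observation of one cell using Bayes' rule.
        # Detection probabilities are invented for illustration.
        import numpy as np

        grid = np.full((5, 5), 1.0 / 25)        # uniform prior over a 5x5 playing field
        p_detect = 0.8                          # P(seen | intruder in observed cell)
        p_false  = 0.05                         # P(seen | intruder elsewhere)

        def update(belief, cell, seen):
            likelihood = np.full(belief.shape, p_detect if seen else 1 - p_detect)
            elsewhere = np.ones(belief.shape, dtype=bool)
            elsewhere[cell] = False
            likelihood[elsewhere] = p_false if seen else 1 - p_false
            posterior = belief * likelihood
            return posterior / posterior.sum()

        grid = update(grid, (2, 3), seen=False)  # sentry looked at cell (2,3), saw nothing
        print("most likely intruder cell:",
              np.unravel_index(grid.argmax(), grid.shape))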

  8. Exploitation of Semantic Building Model in Indoor Navigation Systems

    Science.gov (United States)

    Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min

    2009-04-01

    There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open world environments where very little information about the target location is available, but for large scale building environments such as hospitals, governmental offices, etc., the end-user will need more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution that is based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android's concepts to enlighten the realization of results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. The built environment is a central factor in our daily life and a big portion of human life is spent inside buildings. Traditionally the buildings are documented using building maps and plans by utilization of IT tools such as computer-aided design (CAD) applications. Documenting the maps in an electronic way is already pervasive, but CAD drawings do not satisfy the requirements regarding effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with the advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich the traditional building maps and convert them to smart information resources that can be reused in other applications and improve the interpretability with building inhabitants and building visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication.

  9. Scalable Resolution Display Walls

    KAUST Repository

    Leigh, Jason; Johnson, Andrew; Renambot, Luc; Peterka, Tom; Jeong, Byungil; Sandin, Daniel J.; Talandis, Jonas; Jagodic, Ratko; Nam, Sungwon; Hur, Hyejung; Sun, Yiwen

    2013-01-01

    This article will describe the progress since 2000 on research and development in 2-D and 3-D scalable resolution display walls that are built from tiling individual lower resolution flat panel displays. The article will describe approaches and trends in display hardware construction, middleware architecture, and user-interaction design. The article will also highlight examples of use cases and the benefits the technology has brought to their respective disciplines. © 1963-2012 IEEE.

  10. A Unified Building Model for 3D Urban GIS

    Directory of Open Access Journals (Sweden)

    Ihab Hijazi

    2012-07-01

    Full Text Available Several tasks in urban and architectural design are today undertaken in a geospatial context. Building Information Models (BIM) and geospatial technologies offer 3D data models that provide information about buildings and the surrounding environment. The Industry Foundation Classes (IFC) and CityGML are today the two most prominent semantic models for representation of BIM and geospatial models respectively. CityGML has emerged as a standard for modeling city models while IFC has been developed as a reference model for building objects and sites. Current CAD and geospatial software provide tools that allow the conversion of information from one format to the other. These tools are however fairly limited in their capabilities, often resulting in data and information losses in the transformations. This paper describes a new approach for data integration based on a unified building model (UBM) which encapsulates both the CityGML and IFC models, thus avoiding translations between the models and loss of information. To build the UBM, all classes and related concepts were initially collected from both models, overlapping concepts were merged, new objects were created to ensure the capturing of both indoor and outdoor objects, and finally, spatial relationships between the objects were redefined. Unified Modeling Language (UML) notations were used for representing its objects and relationships between them. Two use-case scenarios, both set in a hospital (“evacuation” and “allocating spaces for patient wards”), were developed to validate and test the proposed UBM data model. Based on these two scenarios, four validation queries were defined in order to validate the appropriateness of the proposed unified building model. It has been validated, through the case scenarios and four queries, that the UBM being developed is able to integrate CityGML data as well as IFC data in an apparently seamless way. Constraints and enrichment functions are

  11. Current State of the Art Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  12. Models for describing the thermal characteristics of building components

    DEFF Research Database (Denmark)

    Jimenez, M.J.; Madsen, Henrik

    2008-01-01

    This paper presents an overview of models that can be applied for modelling the thermal characteristics of buildings and building components using data from outdoor testing. For the analysis of such tests, dynamic analysis models and methods are required. However, a wide variety of models and methods exists, and the problem of choosing the most appropriate approach for each particular case is a non-trivial and interdisciplinary task. Knowledge of a large family of these approaches may therefore be very useful for selecting a suitable approach for each particular case. The choice of approach depends... The characteristics of each type of model are highlighted. Some available software tools for each of the methods described are mentioned. A case study demonstrating the difference between linear and nonlinear models is also considered.

  13. Multiscale modelling for better hygrothermal prediction of porous building materials

    Directory of Open Access Journals (Sweden)

    Belarbi Rafik

    2018-01-01

    Full Text Available The aim of this work is to understand the influence of the microstructural geometric parameters of porous building materials on the mechanisms of coupled heat, air and moisture transfer, in order to predict the behaviour of the building and to control and improve its durability. For this, a multi-scale approach is implemented. It consists of mastering the dominant physical phenomena and their interactions at the microscopic scale, followed by a dual-scale (microscopic-macroscopic) modelling of coupled heat, air and moisture transfer that takes into account the intrinsic properties and microstructural topology of the material, using X-ray tomography combined with the correlation of 3D images. The hygromorphic behaviour under hydric solicitations was also considered. In this context, a model of coupled heat, air and moisture transfer in porous building materials was developed using the periodic homogenization technique. This information was subsequently implemented in a dynamic computation that models the hygrothermal behaviour of the material at the scale of the envelope and the indoor air quality of the building. Results reveal that it is essential to consider the local behaviour of materials, but also to be able to measure and quantify the evolution of their properties on a macroscopic scale from the youngest age of the material. In addition, comparisons between experimental and numerical temperature and relative humidity profiles in multilayer walls and in building envelopes were undertaken. Good agreement was observed.

  14. Seismic simulation analysis of nuclear reactor building by soil-building interaction model

    International Nuclear Information System (INIS)

    Muto, K.; Kobayashi, T.; Motohashi, S.; Kusano, N.; Mizuno, N.; Sugiyama, N.

    1981-01-01

    Seismic simulation analyses were performed for evaluating soil-structure interaction effects by an analytical approach using a 'Lattice Model' developed by the authors. The purpose of this paper is to check the adequacy of this procedure for analyzing soil-structure interaction by means of comparing computed results with recorded ones. The 'Lattice Model' approach employs a lumped mass interactive model, in which not only the structure but also the underlying and/or surrounding soil are modeled as discretized elements. The analytical model used for this study extends about 310 m in the horizontal direction and about 103 m in depth. The reactor building is modeled as three shearing-bending sticks (outer wall, inner wall and shield wall) and the underlying and surrounding soil are divided into four shearing sticks (column directly beneath the reactor building, adjacent, near and distant columns). A corresponding input base motion for the 'Lattice Model' was determined by a deconvolution analysis using a recorded motion at elevation -18.5 m in the free-field. The results of this simulation analysis were shown to be in reasonably good agreement with the recorded ones in the forms of the distribution of ground motions and structural responses, acceleration time histories and related response spectra. These results showed that the 'Lattice Model' approach was an appropriate one to estimate the soil-structure interaction effects. (orig./HP)

  15. Modeling of heat and mass transfer in lateritic building envelopes

    International Nuclear Information System (INIS)

    Meukam, Pierre

    2004-10-01

    The aim of the present work is to investigate the behavior of building envelopes made of local lateritic soil bricks subjected to different climatic conditions. The analysis is developed for the prediction of the temperature, relative humidity and water content behavior within the walls. The building envelopes studied in this work consist of lateritic soil bricks with incorporation of natural pozzolan or sawdust in order to obtain small thermal conductivity and low-density materials, and limit the heat transfer between the atmospheric climate and the inside environment. In order to describe coupled heat and moisture transfer in wet porous materials, the coupled equations were solved by the introduction of diffusion coefficients. A numerical model HMtrans, developed for prediction of heat and moisture transfer in multi-layered building components, was used to simulate the temperature, water content and relative humidity profiles within the building envelopes. The results allow the prediction of the duration for which the building walls can be exposed to the local weather conditions. They show that for any of the three climatic conditions considered, relative humidity and water content do not exceed 87% and 5% respectively. There is therefore minimum possibility of water condensation in the materials studied. The durability of building envelopes made of lateritic soil bricks with incorporation of natural pozzolan or sawdust is not strongly affected by the climatic conditions in tropical and equatorial regions. (author)

  16. A general model of confidence building: analysis and implications

    International Nuclear Information System (INIS)

    Kilgour, D.M.

    1998-01-01

    For more than two decades, security approaches in Europe have included confidence building. Many have argued that Confidence-Building Measures (CBMs) played an essential role in the enormous transformations that took place there. Thus, it is hardly surprising that CBMs have been proposed as measures to reduce tensions and transform security relationships elsewhere in the world. The move toward wider application of CBMs has strengthened recently, as conventional military, diplomatic, and humanitarian approaches seem to have failed to address problems associated with peace-building and peace support operations. There is, however, a serious problem. We don't really know why, or even how, CBMs work. Consequently, we have no reliable way to design CBMs that would be appropriate in substance, form, and timing for regions culturally, geographically, and militarily different from Europe. Lacking a solid understanding of confidence building, we are handicapped in our efforts to extend its successes to the domain of peace building and peace support. To paraphrase Macintosh, if we don't know how CBMs succeeded in the past, then we are unlikely to be good at maintaining, improving, or extending them. The specific aim of this project is to step into this gap, using the methods of game theory to clarify some aspects of the underlying logic of confidence building. Formal decision models will be shown to contribute new and valuable insights that will assist in the design of CBMs to contribute to new problems and in new arenas. (author)

  17. Integrated Urban System and Energy Consumption Model: Residential Buildings

    Directory of Open Access Journals (Sweden)

    Rocco Papa

    2014-05-01

    Full Text Available This paper describes a segment of research conducted within the project PON 04a2_E Smart Energy Master for the energetic government of the territory conducted by the Department of Civil, Architectural and Environment Engineering, University of Naples "Federico II". In particular, this article is part of the study carried out for the definition of the comprehension/interpretation model that correlates buildings, city’s activities and users’ behaviour in order to promote energy savings. In detail, this segment of the research aims to define the residential variables to be used in the model. For this purpose, a knowledge framework at the international level has been defined to estimate the energy requirements of residential buildings, together with the identification of a set of parameters whose variation has a significant influence on the energy consumption of residential buildings.

  18. Management Model for efficient quality control in new buildings

    Directory of Open Access Journals (Sweden)

    C. E. Rodríguez-Jiménez

    2017-09-01

    Full Text Available The management of the quality control of each building process is usually set up in Spain from different levels of demand. This work tries to obtain a reference model to compare the quality control of the building process of a specific product (a building) and to be able to evaluate its warranty level. To this end, specialized sources were consulted and 153 real cases of Quality Control were carefully reviewed using a multi-judgment method. Different techniques were applied to obtain an impartial valuation of the input parameters through the Delphi method (a query of 17 experts), whose matrix treatment with the Fuzzy-QFD tool condenses numerical references through a weighted distribution of the selected functions and their corresponding conditioning factors. The model thus obtained (M153) is useful as a quality control reference to meet the expectations of quality.

  19. IMPROVING TRADITIONAL BUILDING REPAIR CONSTRUCTION QUALITY USING HISTORIC BUILDING INFORMATION MODELING CONCEPT

    Directory of Open Access Journals (Sweden)

    T. C. Wu

    2013-07-01

    Full Text Available In addition to the repair construction project following the repair principles contemplated by heritage experts, the construction process should be recorded and measured at any time for monitoring to ensure the quality of repair. The conventional construction record methods mostly depend on the localized shooting of 2D digital images coupled with text and tables for illustration to achieve the purpose of monitoring. Such methods cannot fully and comprehensively record the 3D spatial relationships in the real world. Therefore, the construction records of traditional buildings are very important but cannot function due to technical limitations. This study applied 3D laser scanning technology to establish a 3D point cloud model for the repair construction of historical buildings. It also broke down the detailed components of the 3D point cloud model by using the concept of historic building information modeling, and established the 3D models of various components and their attribute data in the 3DGIS platform database. In the construction process, according to the time of completion of each stage as developed on the construction project, this study conducted the 3D laser scanning and database establishment for each stage, and applied 3DGIS comparison and analysis of spatial and attribute information to analyse the differences between the stages of completion, thereby improving the quality of traditional building repair construction. This method helps to improve the quality of repair construction work on tangible cultural assets of the world. The established 3DGIS platform can be used as a powerful tool for subsequent management and maintenance.

  20. Modeling of HVAC operational faults in building performance simulation

    International Nuclear Information System (INIS)

    Zhang, Rongpeng; Hong, Tianzhen

    2017-01-01

    Highlights: •Discuss significance of capturing operational faults in existing buildings. •Develop a novel feature in EnergyPlus to model operational faults of HVAC systems. •Compare three approaches to faults modeling using EnergyPlus. •A case study demonstrates the use of the fault-modeling feature. •Future developments of new faults are discussed. -- Abstract: Operational faults are common in the heating, ventilating, and air conditioning (HVAC) systems of existing buildings, leading to a decrease in energy efficiency and occupant comfort. Various fault detection and diagnostic methods have been developed to identify and analyze HVAC operational faults at the component or subsystem level. However, current methods lack a holistic approach to predicting the overall impacts of faults at the building level—an approach that adequately addresses the coupling between various operational components, the synchronized effect between simultaneous faults, and the dynamic nature of fault severity. This study introduces the novel development of a fault-modeling feature in EnergyPlus which fills in the knowledge gap left by previous studies. This paper presents the design and implementation of the new feature in EnergyPlus and discusses in detail the fault-modeling challenges faced. The new fault-modeling feature enables EnergyPlus to quantify the impacts of faults on building energy use and occupant comfort, thus supporting the decision making of timely fault corrections. Including actual building operational faults in energy models also improves the accuracy of the baseline model, which is critical in the measurement and verification of retrofit or commissioning projects. As an example, EnergyPlus version 8.6 was used to investigate the impacts of a number of typical operational faults in an office building across several U.S. climate zones. The results demonstrate that the faults have significant impacts on building energy performance as well as on occupant

  1. Semi-Automatic Modelling of Building FAÇADES with Shape Grammars Using Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
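
    A minimal sketch of procedural, rule-based placement in the spirit described above, written in plain Python rather than ArchiCAD's GDL (the parameters and rules are invented for illustration, not taken from the HBIM library):

        # Minimal sketch: place parametric window objects on a rectangular facade
        # according to simple rules and proportions controlled by user parameters.
        def facade(width, height, storeys, bays, window_ratio=0.4):
            """Return (x, y, w, h) placements for windows on the facade."""
            bay_w, storey_h = width / bays, height / storeys
            win_w, win_h = bay_w * window_ratio, storey_h * window_ratio
            placements = []
            for s in range(storeys):
                for b in range(bays):
                    x = b * bay_w + (bay_w - win_w) / 2      # centre window in its bay
                    y = s * storey_h + (storey_h - win_h) / 2
                    placements.append((round(x, 2), round(y, 2),
                                       round(win_w, 2), round(win_h, 2)))
            return placements

        for window in facade(width=12.0, height=9.0, storeys=3, bays=4):
            print(window)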

  2. Innovative model of business process reengineering at machine building enterprises

    Science.gov (United States)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.

  3. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcoming this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  4. Profiles in Leadership: Enhancing Learning through Model and Theory Building.

    Science.gov (United States)

    Mello, Jeffrey A.

    2003-01-01

    A class assignment was designed to present factors affecting leadership dynamics, allow practice in model and theory building, and examine leadership from multicultural perspectives. Students developed a profile of a fictional or real leader and analyzed qualities, motivations, context, and effectiveness in written and oral presentations.…

  5. Building extraction for 3D city modelling using airborne laser ...

    African Journals Online (AJOL)

    Light detection and ranging (LiDAR) technology has become a standard tool for three-dimensional mapping because it offers fast rate of data acquisition with unprecedented level of accuracy. This study presents an approach to accurately extract and model building in three-dimensional space from airborne laser scanning ...

  6. Building and Sustaining Digital Collections: Models for Libraries and Museums.

    Science.gov (United States)

    Council on Library and Information Resources, Washington, DC.

    In February 2001, the Council on Library and Information Resources (CLIR) and the National Initiative for a Networked Cultural Heritage (NINCH) convened a meeting to discuss how museums and libraries are building digital collections and what business models are available to sustain them. A group of museum and library senior executives met with…

  7. Building a 3-D Appearance Model of the Human Face

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Larsen, Rasmus; Lading, Brian

    2003-01-01

    This paper describes a method for building an appearance model from three-dimensional data of human faces. The data consists of 3-D vertices, polygons and a texture map. The method uses a set of nine manually placed landmarks to automatically form a dense correspondence of thousands of points...
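
    Once a dense correspondence is available, a statistical shape model is typically built with PCA over the corresponded coordinates; the sketch below uses placeholder data and illustrates that general step only, not the paper's pipeline:

        # Minimal sketch of a PCA point distribution model over corresponded 3-D vertices.
        import numpy as np

        rng = np.random.default_rng(0)
        n_faces, n_points = 20, 500
        # placeholder data: each row is one face's corresponded vertices, flattened
        shapes = rng.normal(size=(n_faces, n_points * 3))

        mean_shape = shapes.mean(axis=0)
        centered = shapes - mean_shape
        # principal modes of shape variation via SVD
        _, singular_values, modes = np.linalg.svd(centered, full_matrices=False)
        variance = singular_values**2 / (n_faces - 1)
        explained = variance.cumsum() / variance.sum()
        print("modes needed for 95% of variance:",
              int(np.searchsorted(explained, 0.95) + 1))

        # a new face instance = mean shape + mode coefficients times modes
        new_face = mean_shape + 2.0 * np.sqrt(variance[0]) * modes[0]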

  8. Building on Tinto's model of engagement and persistence ...

    African Journals Online (AJOL)

    Building on Tinto's model of engagement and persistence: Experiences from the Umthombo Youth Development Foundation Scholarship Scheme. A Ross. Abstract. Background. Major inequalities in staffing levels at rural and urban hospitals contribute to poorer health outcomes in rural areas. Local and international ...

  9. Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations

    Science.gov (United States)

    Sung, Christopher Teh Boon

    2011-01-01

    Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…

  10. FITTING OF PARAMETRIC BUILDING MODELS TO OBLIQUE AERIAL IMAGES

    Directory of Open Access Journals (Sweden)

    U. S. Panday

    2012-09-01

    Full Text Available In the literature and in photogrammetric workstations many approaches and systems to automatically reconstruct buildings from remote sensing data are described and available. Those building models are being used for instance in city modeling or in a cadastre context. If a roof overhang is present, the building walls cannot be estimated correctly from nadir-view aerial images or airborne laser scanning (ALS) data. This leads to inconsistent building outlines, which has a negative influence on visual impression, but more seriously also represents a wrong legal boundary in the cadaster. Oblique aerial images, as opposed to nadir-view images, reveal greater detail, enabling different views of an object to be seen from different directions. Building walls are visible from oblique images directly, and those images are used for automated roof overhang estimation in this research. A fitting algorithm is employed to find roof parameters of simple buildings. It uses a least squares algorithm to fit projected wire frames to their corresponding edge lines extracted from the images. Self-occlusion is detected based on the intersection of the viewing ray with the planes formed by the building, whereas occlusion from other objects is detected using an ALS point cloud. Overhang and ground height are obtained by sweeping vertical and horizontal planes respectively. Experimental results are verified with high resolution ortho-images, field survey, and ALS data. Planimetric accuracy of 1 cm mean and 5 cm standard deviation was obtained, while buildings' orientations were accurate to a mean of 0.23° and a standard deviation of 0.96° with respect to the ortho-image. Overhang parameters were aligned to approximately 10 cm with the field survey. The ground and roof heights were accurate to means of -9 cm and 8 cm, with standard deviations of 16 cm and 8 cm, with respect to ALS respectively. The developed approach reconstructs 3D building models well in cases of sufficient texture. More images should be acquired for
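
    The core fitting step can be illustrated with a generic non-linear least-squares sketch (synthetic data and a simplified roof profile of our own, not the authors' wire-frame implementation):

        # Minimal sketch: adjust the parameters of a simple gable-roof profile so that its
        # predicted edge matches observed edge points, via non-linear least squares.
        import numpy as np
        from scipy.optimize import least_squares

        def roof_z(params, x):
            ridge_x, ridge_z, slope = params
            return ridge_z - slope * np.abs(x - ridge_x)

        # synthetic "extracted edge points" (metres), with a little noise
        rng = np.random.default_rng(1)
        x_obs = np.linspace(-5.0, 5.0, 21)
        z_obs = roof_z((0.5, 8.0, 0.6), x_obs) + rng.normal(0, 0.05, x_obs.size)

        fit = least_squares(lambda p: roof_z(p, x_obs) - z_obs, x0=[0.0, 7.0, 0.5])
        print("ridge x, ridge z, slope:", np.round(fit.x, 3))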

  11. Declarative and Scalable Selection for Map Visualizations

    DEFF Research Database (Denmark)

    Kefaloukos, Pimin Konstantin Balic

    and is itself a source and cause of prolific data creation. This calls for scalable map processing techniques that can handle the data volume and which play well with the predominant data models on the Web. (4) Maps are now consumed around the clock by a global audience. While historical maps were singleuser......-defined constraints as well as custom objectives. The purpose of the language is to derive a target multi-scale database from a source database according to holistic specifications. (b) The Glossy SQL compiler allows Glossy SQL to be scalably executed in a spatial analytics system, such as a spatial relational......, there are indications that the method is scalable for databases that contain millions of records, especially if the target language of the compiler is substituted by a cluster-ready variant of SQL. While several realistic use cases for maps have been implemented in CVL, additional non-geographic data visualization uses...

  12. Enhancing Scalability of Sparse Direct Methods

    International Nuclear Information System (INIS)

    Li, Xiaoye S.; Demmel, James; Grigori, Laura; Gu, Ming; Xia, Jianlin; Jardin, Steve; Sovinec, Carl; Lee, Lie-Quan

    2007-01-01

    TOPS is providing high-performance, scalable sparse direct solvers, which have had significant impacts on the SciDAC applications, including fusion simulation (CEMM), accelerator modeling (COMPASS), as well as many other mission-critical applications in DOE and elsewhere. Our recent developments have been focusing on new techniques to overcome scalability bottleneck of direct methods, in both time and memory. These include parallelizing symbolic analysis phase and developing linear-complexity sparse factorization methods. The new techniques will make sparse direct methods more widely usable in large 3D simulations on highly-parallel petascale computers
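    As a point of reference only, the sketch below shows what a (sequential, single-node) sparse direct solve looks like with SciPy's SuperLU interface; it is not the distributed-memory TOPS solvers described in the record, and the test matrix is a generic 1-D Poisson stand-in.

    ```python
    # Minimal illustration of a sparse direct (LU) solve with SciPy's SuperLU,
    # not the distributed-memory solvers described in the record.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 1000
    # 1-D Poisson matrix as a stand-in for a PDE discretization
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    A = sp.diags([off, main, off], [-1, 0, 1], format="csc")
    b = np.ones(n)

    # Symbolic and numeric factorization happen inside splu(); the factor object
    # can be reused for many right-hand sides, which is where direct methods pay off.
    lu = spla.splu(A)
    x = lu.solve(b)

    print("residual:", np.linalg.norm(A @ x - b))
    ```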

  13. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  14. Early experiences building a software quality prediction model

    Science.gov (United States)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  15. Modeling the Temperature Effect of Orientations in Residential Buildings

    Directory of Open Access Journals (Sweden)

    Sabahat Arif

    2012-07-01

    Full Text Available Indoor thermal comfort in a building has been an important issue for environmental sustainability. It is an accepted fact that building design and planning consume a lot of energy in the modern architecture of the 20th and 21st centuries. An appropriate orientation of a building can provide thermally comfortable indoor temperatures which otherwise require extra energy to condition these spaces through all the seasons. This experimental study investigates the potential effect of this passive solar design strategy on indoor temperatures, and a simple model is presented for predicting indoor temperatures from the ambient temperatures.
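    The record does not give the model's functional form; as a rough illustration, a first-order (lumped thermal capacitance) lag driven by ambient temperature is a common minimal choice for this kind of prediction. The time constant, gain and temperature profile below are assumptions, not values from the study.

    ```python
    # Hypothetical first-order (lumped RC) indoor temperature model; the time
    # constant and solar gain term are illustrative assumptions.
    import numpy as np

    def simulate_indoor(t_ambient, dt_h=1.0, tau_h=24.0, solar_gain=0.0, t0=22.0):
        """Discrete first-order lag: dT_in/dt = (T_amb - T_in)/tau + q."""
        t_in = np.empty_like(t_ambient, dtype=float)
        t_prev = t0
        for i, t_amb in enumerate(t_ambient):
            t_prev = t_prev + dt_h * ((t_amb - t_prev) / tau_h + solar_gain)
            t_in[i] = t_prev
        return t_in

    # Example: a sinusoidal daily ambient temperature profile over two days
    hours = np.arange(48)
    t_amb = 28.0 + 6.0 * np.sin(2 * np.pi * hours / 24.0)
    print(simulate_indoor(t_amb)[:5])
    ```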

  16. Protocol to Manage Heritage-Building Interventions Using Heritage Building Information Modelling (HBIM

    Directory of Open Access Journals (Sweden)

    Isabel Jordan-Palomar

    2018-03-01

    Full Text Available The workflow in historic architecture projects presents problems related to the lack of clarity of processes, dispersion of information and the use of outdated tools. Different heritage organisations have shown interest in innovative methods to resolve those problems and improve cultural tourism for sustainable economic development. Building Information Modelling (BIM) has emerged as a suitable computerised system for improving heritage management. Its application to historic buildings is named Historic BIM (HBIM). HBIM literature highlights the need for further research in terms of the overall processes of heritage projects, its practical implementation and a need for better cultural documentation. This work uses Design Science Research to develop a protocol to improve the workflow in heritage interdisciplinary projects. Research techniques used include documentary analysis, semi-structured interviews and focus groups. HBIM is proposed as a virtual model that will hold heritage data and will articulate processes. As a result, a simple and visual HBIM protocol was developed and applied in a real case study. The protocol was named BIMlegacy and it is divided into eight phases: building registration, determine intervention options, develop design for intervention, planning the physical intervention, physical intervention, handover, maintenance and culture dissemination. It contemplates all the stakeholders involved.

  17. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

    The report is part of a research assignment carried out by students in the 5 ECTS course “Project Byggeri – [entitled as: Building Information Modeling (BIM) – Modeling & Analysis]”, during the 3rd semester of the master's degree in Civil and Architectural Engineering, Department of Engineering, Aarhus...... University. This includes seven papers describing BIM for Sustainability, concentrating specifically on individual topics regarding Indoor Environment Performance Analysis....

  18. Product Modelling for Building Design: Annotated Bibliography (2nd Edition)

    DEFF Research Database (Denmark)

    Galle, Per

    1999-01-01

    This bibliography concerns research publications from 1976 to 1994-5, on product modelling in computer aided architectural design and computer aided engineering design of buildings and their surroundings. For each item of literature, full bibliographic information is given whenever available...... of literature is offered on machine interpretation of drawings, which may be relevant in the context of information exchange among different product models. Although the bibliography is fairly comprehensive as far as it goes, no completeness of coverage is claimed....

  19. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  20. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  1. FIRST PRISMATIC BUILDING MODEL RECONSTRUCTION FROM TOMOSAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    Y. Sun

    2016-06-01

    Full Text Available This paper demonstrates for the first time the potential of explicitly modelling the individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings via DSM generation and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007) and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height and polygon complexity constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. The coarse outline of each roof segment is then reconstructed and later refined using quadtree-based regularization plus a zig-zag line simplification scheme. Finally, height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.
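    A minimal sketch of the middle of this pipeline (smooth the DSM, compute a height-jump gradient, over-segment it with a watershed) is shown below using scikit-image. A Gaussian filter stands in for BM3D, the DSM is synthetic, and the merging, regularization and height-association steps are omitted.

    ```python
    # Sketch of the DSM over-segmentation step (smooth -> gradient -> watershed);
    # a Gaussian filter replaces the BM3D denoising used in the record.
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.filters import sobel
    from skimage.segmentation import watershed

    # Synthetic DSM: flat ground with two "roof" blocks of different heights
    dsm = np.zeros((200, 200))
    dsm[40:100, 50:150] = 12.0   # roof segment A
    dsm[100:160, 50:150] = 18.0  # roof segment B
    dsm += 0.2 * np.random.randn(*dsm.shape)

    smoothed = gaussian_filter(dsm, sigma=2.0)   # denoising stand-in
    gradient = sobel(smoothed)                   # height-jump map
    labels = watershed(gradient)                 # over-segmentation into regions

    print("number of raw regions:", labels.max())
    # A real pipeline would now merge regions under height/polygon-complexity
    # constraints and regularize each outline, as described in the record.
    ```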

  2. Modeling Manpower and Equipment Productivity in Tall Building Construction Projects

    Science.gov (United States)

    Mudumbai Krishnaswamy, Parthasarathy; Rajiah, Murugasan; Vasan, Ramya

    2017-12-01

    Tall building construction projects involve two critical resources: manpower and equipment. Their usage, however, varies widely due to several factors affecting their productivity. Currently, no systematic study for estimating and increasing their productivity is available. What is prevalent is the use of empirical data, experience of similar projects and assumptions. As tall building projects are here to stay and increase, to meet the emerging demands in ever shrinking urban spaces, it is imperative to develop scientific productivity models for the basic construction activities (concrete, reinforcement, formwork, block work and plastering) as functions of the specific resources deployed in a mixed environment of manpower and equipment usage. Data pertaining to 72 tall building projects in India were collected and analyzed. Then, suitable productivity estimation models were developed using multiple linear regression analysis and validated using independent field data. It is hoped that the models developed in the study will be useful for quantity surveyors, cost engineers and project managers to estimate productivity of resources in tall building projects.
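    The abstract does not list the predictor variables, so the sketch below only illustrates the general multiple linear regression form on made-up data; the feature names (crew size, pump capacity, floor level) and the synthetic response are hypothetical, not taken from the study.

    ```python
    # Hypothetical multiple linear regression of concreting productivity;
    # predictors and data are illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 72
    df = pd.DataFrame({
        "crew_size":     rng.integers(6, 30, n),
        "pump_capacity": rng.uniform(20, 90, n),   # m3/h
        "floor_level":   rng.integers(1, 60, n),
    })
    # Synthetic response: productivity in m3 per day
    df["productivity"] = (
        2.0 * df["crew_size"] + 1.5 * df["pump_capacity"]
        - 0.4 * df["floor_level"] + rng.normal(0, 5, n)
    )

    X = sm.add_constant(df[["crew_size", "pump_capacity", "floor_level"]])
    model = sm.OLS(df["productivity"], X).fit()
    print(model.summary())
    ```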

  3. METHODS OF SELECTING THE EFFECTIVE MODELS OF BUILDINGS REPROFILING PROJECTS

    Directory of Open Access Journals (Sweden)

    Александр Иванович МЕНЕЙЛЮК

    2016-02-01

    Full Text Available The article highlights the important task of project management in the reprofiling of buildings. It is expedient to pay attention to selecting effective engineering solutions that reduce duration and cost in construction industry project management. This article presents a methodology for the selection of efficient organizational and technical solutions for building reprofiling and reconstruction projects. The method is based on a compilation of project variants in the program Microsoft Project and experimental statistical analysis using the program COMPEX. The introduction of this technique in the reprofiling of buildings allows efficient project models to be chosen, depending on the given constraints. Also, this technique can be used for various construction projects.

  4. Scalable Open Source Smart Grid Simulator (SGSim)

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Jacobsen, Rune Hylsberg; Stefanni, Francesco

    2017-01-01

    This paper presents an open source smart grid simulator (SGSim). The simulator is based on open source SystemC Network Simulation Library (SCNSL) and aims to model scalable smart grid applications. SGSim has been tested under different smart grid scenarios that contain hundreds of thousands of households...

  5. Cooperative Scalable Moving Continuous Query Processing

    DEFF Research Database (Denmark)

    Li, Xiaohui; Karras, Panagiotis; Jensen, Christian S.

    2012-01-01

    of the global view and handle the majority of the workload. Meanwhile, moving clients, having basic memory and computation resources, handle small portions of the workload. This model is further enhanced by dynamic region allocation and grid size adjustment mechanisms that reduce the communication...... and computation cost for both servers and clients. An experimental study demonstrates that our approaches offer better scalability than competitors...

  6. Fine modeling of energy exchanges between buildings and urban atmosphere

    International Nuclear Information System (INIS)

    Daviau-Pellegrin, Noelie

    2016-01-01

    This thesis work is about the effect of buildings on the urban atmosphere and, more precisely, the energy exchanges that take place between these two systems. In order to model more finely the thermal effects of buildings on the atmospheric flows in simulations run under the CFD software Code-Saturne, we couple this tool with the building model BuildSysPro. This library is run under Dymola and can generate matrices describing the building thermal properties that can be used outside this software. In order to carry out the coupling, we use these matrices in a code that allows the building thermal calculations and the CFD to exchange their results. After a review of the physical phenomena and the existing models, we explain the interactions between the atmosphere and the urban elements, especially buildings. The latter can impact the air flows dynamically, as they act as obstacles, and thermally, through their surface temperatures. First, we analyse the data obtained from the measurement campaign EM2PAU, which we use in order to validate the coupled model. EM2PAU was carried out in Nantes in 2011 and represents a canyon street with two rows of four containers. Its distinctive feature lies in the simultaneous measurements of the air and wall temperatures as well as the wind speeds, with anemometers located on a 10 m-high mast for the reference wind and at six locations in the canyon. This aims at studying the thermal influence of buildings on the air flows. Then the numerical simulations of the air flows in EM2PAU are carried out with different methods that allow us to calculate or impose the surface temperature used for each of the container walls. The first method consists of imposing their temperatures from the measurements. For each wall, we set the temperature to the surface temperature that was measured during the EM2PAU campaign. The second method involves imposing the outdoor air temperature that was measured at a given time to all the

  7. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Science.gov (United States)

    2010-01-01

    The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2) of...

  8. Internet of Things building blocks and business models

    CERN Document Server

    Hussain, Fatima

    2017-01-01

    This book describes the building blocks and introductory business models for the Internet of Things (IoT). The author provides an overview of the entire IoT architecture and its constituent layers, followed by a detailed description of each block. Various interconnecting technologies and sensors are discussed in the context of IoT networks. In addition, the concepts of Big Data and Fog Computing are presented and characterized according to the data generated by versatile IoT applications. A smart parking system and context-aware services are presented as a hybrid model of cloud and Fog. Afterwards, various IoT applications and their respective business models are discussed. Finally, the author summarizes the IoT building blocks, identifies research issues in each, and suggests potential research projects worthy of pursuing.

  9. Bibliography for the Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  10. Links Related to the Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  11. Integration of Models of Building Interiors with Cadastral Data

    Science.gov (United States)

    Gotlib, Dariusz; Karabin, Marcin

    2017-12-01

    Demands for applications which use models of building interiors are growing and highly diversified. Those models are applied at the stage of designing and construction of a building, in applications which support real estate management, in navigation and marketing systems and, finally, in crisis management and security systems. They are created on the basis of different data: architectural and construction plans, both in analogue form and as CAD files, BIM data files, by means of laser scanning (TLS) and conventional surveys. In this context the issue of searching for solutions which would integrate the existing models and lead to the elimination of data redundancy is becoming more important. The authors analysed the possible input of cadastral data (the legal extent of premises) at the stage of creating and updating different models of building interiors. The paper focuses on one issue - the way of describing the geometry of premises based on the most popular source data, i.e. architectural and construction plans. However, the described rules may be considered universal and may also be applied in practice during the process of creating and updating indoor models based on BIM datasets or laser scanning clouds.

  12. Active buildings: modelling physical activity and movement in office buildings. An observational study protocol.

    Science.gov (United States)

    Smith, Lee; Ucci, Marcella; Marmot, Alexi; Spinney, Richard; Laskowski, Marek; Sawyer, Alexia; Konstantatou, Marina; Hamer, Mark; Ambler, Gareth; Wardle, Jane; Fisher, Abigail

    2013-11-12

    Health benefits of regular participation in physical activity are well documented but population levels are low. Office layout, and in particular the number and location of office building destinations (eg, print and meeting rooms), may influence both walking time and characteristics of sitting time. No research to date has focused on the role that the layout of the indoor office environment plays in facilitating or inhibiting step counts and characteristics of sitting time. The primary aim of this study was to investigate associations between office layout and physical activity, as well as sitting time using objective measures. Active buildings is a unique collaboration between public health, built environment and computer science researchers. The study involves objective monitoring complemented by a larger questionnaire arm. UK office buildings will be selected based on a variety of features, including office floor area and number of occupants. Questionnaires will include items on standard demographics, well-being, physical activity behaviour and putative socioecological correlates of workplace physical activity. Based on survey responses, approximately 30 participants will be recruited from each building into the objective monitoring arm. Participants will wear accelerometers (to monitor physical activity and sitting inside and outside the office) and a novel tracking device will be placed in the office (to record participant location) for five consecutive days. Data will be analysed using regression analyses, as well as novel agent-based modelling techniques. The results of this study will be disseminated through peer-reviewed publications and scientific presentations. Ethical approval was obtained through the University College London Research Ethics Committee (Reference number 4400/001).

  13. Getting a Cohesive Answer from a Common Start: Scalable Multidisciplinary Analysis through Transformation of a Systems Model

    Science.gov (United States)

    Cole, Bjorn; Chung, Seung

    2012-01-01

    One of the challenges of systems engineering is in working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade between time and cost for analysis quality and quantity. The quality often correlates with greater run time in multidisciplinary models and the quantity is associated with the number of alternatives that can be analyzed. The trade-off is due to the resource intensive process of creating a cohesive multidisciplinary systems model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than hand-written translation scripts between multi-disciplinary models and their analyses. The key is to work from a core systems model defined in a MOF-based language such as SysML and in leveraging the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems. The QVT standard was designed to transform SysML models into other models, including those leveraged by engineering analyses. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, symbolic analysis (supported by Wolfram Mathematica) is coordinated by data objects transformed from the systems model, enabling extremely flexible and powerful design exploration and analytical investigations of expected system performance.

  14. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  15. Map algebra and model algebra for integrated model building

    NARCIS (Netherlands)

    Schmitz, O.; Karssenberg, D.J.; Jong, K. de; Kok, J.-L. de; Jong, S.M. de

    2013-01-01

    Computer models are important tools for the assessment of environmental systems. A seamless workflow of construction and coupling of model components is essential for environmental scientists. However, currently available software packages are often tailored either to the construction of model

  16. A scalable multi-resolution spatio-temporal model for brain activation and connectivity in fMRI data

    KAUST Repository

    Castruccio, Stefano

    2018-01-23

    Functional Magnetic Resonance Imaging (fMRI) is a primary modality for studying brain activity. Modeling spatial dependence of imaging data at different spatial scales is one of the main challenges of contemporary neuroimaging, and it could allow for accurate testing for significance in neural activity. The high dimensionality of this type of data (on the order of hundreds of thousands of voxels) poses serious modeling challenges and considerable computational constraints. For the sake of feasibility, standard models typically reduce dimensionality by modeling covariance among regions of interest (ROIs)—coarser or larger spatial units—rather than among voxels. However, ignoring spatial dependence at different scales could drastically reduce our ability to detect activation patterns in the brain and hence produce misleading results. We introduce a multi-resolution spatio-temporal model and a computationally efficient methodology to estimate cognitive control related activation and whole-brain connectivity. The proposed model allows for testing voxel-specific activation while accounting for non-stationary local spatial dependence within anatomically defined ROIs, as well as regional dependence (between-ROIs). The model is used in a motor-task fMRI study to investigate brain activation and connectivity patterns aimed at identifying associations between these patterns and regaining motor functionality following a stroke.

  17. Getting a Cohesive Answer from a Common Start: Scalable Multidisciplinary Analysis through Transformation of a System Model

    Science.gov (United States)

    Cole, Bjorn; Chung, Seung H.

    2012-01-01

    One of the challenges of systems engineering is in working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade off time and cost against analysis quality and quantity. The quality is associated with the fidelity of the multidisciplinary models and the quantity is associated with the design space that can be analyzed. The trade-off is due to the resource-intensive process of creating a cohesive multidisciplinary system model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than handwritten translation scripts or codes for multidisciplinary models and their analyses. The key is to work from a core system model defined in a MOF-based language such as SysML and in leveraging the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems and analyses. The QVT standard was designed to transform SysML models. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, a symbolic mathematical framework (supported by Wolfram Mathematica) is coordinated by data objects transformed from the system model, enabling extremely flexible and powerful tradespace exploration and analytical investigations of expected system performance.

  18. Building 235-F Goldsim Fate And Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, G. A.; Phifer, M. A.

    2012-09-14

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to basically have minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium, however after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met.

  19. Building 235-F Goldsim Fate And Transport Model

    International Nuclear Information System (INIS)

    Taylor, G. A.; Phifer, M. A.

    2012-01-01

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D and D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to basically have minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium, however after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met

  20. Air Dispersion Modeling for Building 3026C/D Demolition

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL]; Sjoreen, Andrea L [ORNL]; Eckerman, Keith F [ORNL]

    2010-06-01

    This report presents estimates of dispersion coefficients and effective dose for potential air dispersion scenarios of uncontrolled releases from Oak Ridge National Laboratory (ORNL) buildings 3026C, 3026D, and 3140 prior to or during the demolition of the 3026 Complex. The Environmental Protection Agency (EPA) AERMOD system [1-6] was used to compute these estimates. AERMOD stands for AERMIC Model, where AERMIC is the American Meteorological Society-EPA Regulatory Model Improvement Committee. Five source locations (three in building 3026D and one each in building 3026C and the filter house 3140) and associated source characteristics were determined with the customer. In addition, the area of study was determined and building footprints and intake locations of air-handling systems were obtained. Beyond the air intakes, receptor sites were defined consisting of ground-level locations on four polar grids (50 m, 100 m, 200 m, and 500 m) and two intersecting lines of points (50 m separation) corresponding to sidewalks along Central Avenue and Fifth Street. Three years of meteorological data (2006-2008) were used, each year consisting of three datasets: 1) National Weather Service data; 2) upper air data for the Knoxville-Oak Ridge area; and 3) local weather data from Tower C (10 m, 30 m and 100 m) on the ORNL reservation. Annual average air concentrations, highest 1 h average and highest 3 h average air concentrations were computed using AERMOD for the five source locations for the three years of meteorological data. The highest 1 h average air concentrations were converted to dispersion coefficients to characterize the atmospheric dispersion, as the customer was interested in the most significant response and the highest 1 h average data reflect the best time-averaged values available from the AERMOD code. Results are presented in tabular and graphical form. The results for dose were obtained using radionuclide activities for each of the buildings provided by the customer [7].
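    For orientation only, the sketch below evaluates a basic steady-state Gaussian plume with ground reflection, a far simpler relative of what AERMOD computes; all emission, wind and dispersion parameters are illustrative assumptions, not values from the report.

    ```python
    # Basic Gaussian plume estimate (not AERMOD): shows how a ground-level
    # concentration scales with emission rate and dispersion parameters.
    import numpy as np

    def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
        """Steady-state Gaussian plume with ground reflection.

        q: emission rate (g/s), u: wind speed (m/s), h: effective release height (m),
        sigma_y, sigma_z: dispersion parameters (m) at the receptor's downwind distance.
        Returns the concentration in g/m^3 at crosswind offset y and height z.
        """
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                    + np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))
        return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Illustrative numbers only: 1 g/s release, 2 m/s wind, receptor on the plume axis
    c = plume_concentration(q=1.0, u=2.0, y=0.0, z=1.5, h=5.0, sigma_y=8.0, sigma_z=4.0)
    print(f"concentration ~ {c:.2e} g/m^3")
    # Dividing the concentration by the release rate gives a dispersion coefficient
    # (s/m^3) analogous to the values derived from the AERMOD runs in the report.
    ```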

  1. BUILDING DETECTION USING AERIAL IMAGES AND DIGITAL SURFACE MODELS

    Directory of Open Access Journals (Sweden)

    J. Mu

    2017-05-01

    Full Text Available In this paper a method for building detection in aerial images based on variational inference of logistic regression is proposed. It consists of three steps. In order to characterize the appearance of buildings in aerial images, an effective bag-of-words (BoW) method is applied for feature extraction in the first step. In the second step, a logistic regression classifier is learned from these local features. The logistic regression can be trained using different methods; in this paper we adopt a fully Bayesian treatment for learning the classifier, which has a number of advantages over other learning methods. Due to the presence of a hyperprior in the probabilistic model of logistic regression, approximate inference methods have to be applied for prediction. In order to speed up the inference, a variational inference method based on mean field approximation is applied instead of stochastic approximations such as Markov chain Monte Carlo. After the prediction, a probabilistic map is obtained. In the third step, a fully connected conditional random field model is formulated and the probabilistic map is used as the data term in the model. A mean field inference is utilized in order to obtain a binary building mask. A benchmark data set consisting of aerial images and digital surface models (DSM) released by ISPRS for 2D semantic labeling is used for performance evaluation. The results demonstrate the effectiveness of the proposed method.
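    A much-simplified sketch of the first two steps is given below: k-means visual words as the BoW encoding and a scikit-learn point-estimate logistic regression in place of the fully Bayesian variational treatment described in the record. The descriptors and labels are synthetic, and the CRF refinement step is omitted.

    ```python
    # Simplified bag-of-words + logistic regression classifier for image patches;
    # a point-estimate classifier replaces the variational Bayesian one of the paper.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Pretend local descriptors (e.g. 64-D) extracted from training imagery
    descriptors = rng.normal(size=(5000, 64))
    codebook = KMeans(n_clusters=32, n_init=10, random_state=0).fit(descriptors)

    def bow_histogram(patch_descriptors):
        """Encode a patch as a normalized histogram of visual-word assignments."""
        words = codebook.predict(patch_descriptors)
        hist = np.bincount(words, minlength=32).astype(float)
        return hist / max(hist.sum(), 1.0)

    # Synthetic training patches: 200 "building", 200 "non-building"
    X = np.array([bow_histogram(rng.normal(loc=l, size=(100, 64)))
                  for l in ([0.3] * 200 + [0.0] * 200)])
    y = np.array([1] * 200 + [0] * 200)

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    # In the record, the per-pixel probabilities produced this way feed the data
    # term of a fully connected CRF before the final building mask is extracted.
    ```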

  2. A scalable multi-resolution spatio-temporal model for brain activation and connectivity in fMRI data

    KAUST Repository

    Castruccio, Stefano; Ombao, Hernando; Genton, Marc G.

    2018-01-01

    Functional Magnetic Resonance Imaging (fMRI) is a primary modality for studying brain activity. Modeling spatial dependence of imaging data at different spatial scales is one of the main challenges of contemporary neuroimaging, and it could allow

  3. Heterotic SO(32) model building in four dimensions

    International Nuclear Information System (INIS)

    Choi, K.S.; Groot Nibbelink, S.; Minnesota Univ., Minneapolis, MN; Trapletti, M.

    2004-10-01

    Four dimensional heterotic SO(32) orbifold models are classified systematically with model building applications in mind. We obtain all Z_3, Z_7 and Z_{2N} models based on vectorial gauge shifts. The resulting gauge groups are reminiscent of those of type-I model building, as they always take the form SO(2n_0) x U(n_1) x ... x U(n_{N-1}) x SO(2n_N). The complete twisted spectrum is determined simultaneously for all orbifold models in a parametric way depending on n_0, ..., n_N, rather than on a model-by-model basis. This reveals interesting patterns in the twisted states: they are always built out of vectors and anti-symmetric tensors of the U(n) groups, and either vectors or spinors of the SO(2n) groups. Our results may shed additional light on the S-duality between heterotic and type-I strings in four dimensions. As a spin-off we obtain an SO(10) GUT model with four generations from the Z_4 orbifold. (orig.)

  4. Nonlinear Economic Model Predictive Control Strategy for Active Smart Buildings

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.

    2016-01-01

    Nowadays, the development of advanced and innovative intelligent control techniques for energy management in buildings is a key issue within the smart grid topic. A nonlinear economic model predictive control (EMPC) scheme, based on a branch-and-bound tree search used as the optimization algorithm ... The controller is shown to be very reliable, keeping the comfort levels in the two considered seasons and shifting the load away from peak hours in order to achieve the desired flexible electricity consumption.

  5. FINDING CUBOID-BASED BUILDING MODELS IN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2012-07-01

    Full Text Available In this paper, we present an automatic approach for the derivation of 3D building models of level-of-detail 1 (LOD 1) from point clouds obtained from (dense) image matching or, for comparison only, from LIDAR. Our approach makes use of the predominance of vertical structures and orthogonal intersections in architectural scenes. After robustly determining the scene's vertical direction based on the 3D points, we use it as a constraint for a RANSAC-based search for vertical planes in the point cloud. The planes are further analyzed to segment reliable outlines of rectangular surfaces within these planes, which are connected to construct cuboid-based building models. We demonstrate that our approach is robust and effective over a range of real-world input data sets with varying point density, amount of noise, and outliers.
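    A minimal sketch of the vertical-plane search is shown below: once the vertical direction is aligned with the z-axis, a wall-like plane projects to a 2-D line, so RANSAC can sample point pairs in the xy-plane. The point cloud, thresholds and iteration count are illustrative, not those of the paper.

    ```python
    # Minimal RANSAC search for one vertical plane in a point cloud, assuming the
    # scene's vertical direction has already been aligned with the z-axis.
    import numpy as np

    def ransac_vertical_plane(points, n_iter=500, threshold=0.05, rng=None):
        """Fit a plane n_x*x + n_y*y + d = 0 (horizontal normal, i.e. wall-like)."""
        if rng is None:
            rng = np.random.default_rng(0)
        best_inliers = np.zeros(len(points), dtype=bool)
        best_model = None
        xy = points[:, :2]
        for _ in range(n_iter):
            i, j = rng.choice(len(points), size=2, replace=False)
            direction = xy[j] - xy[i]
            norm = np.linalg.norm(direction)
            if norm < 1e-9:
                continue
            normal = np.array([-direction[1], direction[0]]) / norm  # horizontal normal
            d = -normal @ xy[i]
            dist = np.abs(xy @ normal + d)
            inliers = dist < threshold
            if inliers.sum() > best_inliers.sum():
                best_inliers, best_model = inliers, (normal, d)
        return best_model, best_inliers

    # Synthetic cloud: a noisy wall at x = 2 plus uniform background clutter
    rng = np.random.default_rng(2)
    wall = np.column_stack([2 + 0.01 * rng.normal(size=400),
                            rng.uniform(0, 5, 400), rng.uniform(0, 3, 400)])
    clutter = rng.uniform(0, 5, size=(400, 3))
    model, inliers = ransac_vertical_plane(np.vstack([wall, clutter]))
    print("inliers found:", inliers.sum())
    ```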

  6. State reduced order models for the modelling of the thermal behavior of buildings

    Energy Technology Data Exchange (ETDEWEB)

    Menezo, Christophe; Bouia, Hassan; Roux, Jean-Jacques; Depecker, Patrick [Institute National de Sciences Appliquees de Lyon, Villeurbanne Cedex, (France). Centre de Thermique de Lyon (CETHIL). Equipe Thermique du Batiment]. E-mail: menezo@insa-cethil-etb.insa-lyon.fr; bouia@insa-cethil-etb.insa-lyon.fr; roux@insa-cethil-etb.insa-lyon.fr; depecker@insa-cethil-etb.insa-lyon.fr

    2000-07-01

    This work is devoted to the field of building physics and related to the reduction of heat conduction models. The aim is to enlarge the model libraries of heat and mass transfer codes through limiting the considerable dimensions reached by the numerical systems during the modelling process of a multizone building. We show that the balanced realization technique, specifically adapted to the coupling of reduced order models with the other thermal phenomena, turns out to be very efficient. (author)
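    As an illustration of the balanced-realization idea (not the authors' implementation), the sketch below reduces a small stable state-space model with the square-root balanced truncation algorithm, using SciPy's Lyapunov solver; the toy 4-state system merely stands in for a discretized wall conduction model.

    ```python
    # Square-root balanced truncation of a stable state-space system; the toy
    # 4-state model is a stand-in for a wall conduction model.
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, r):
        """Return a reduced (Ar, Br, Cr) of order r plus the Hankel singular values."""
        Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
        Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
        Lc = cholesky(Wc, lower=True)
        Lo = cholesky(Wo, lower=True)
        U, s, Vt = svd(Lo.T @ Lc)                        # Hankel singular values in s
        S_inv_sqrt = np.diag(1.0 / np.sqrt(s))
        T = Lc @ Vt.T @ S_inv_sqrt                       # balancing transformation
        Tinv = S_inv_sqrt @ U.T @ Lo.T
        Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
        return Ab[:r, :r], Bb[:r, :], Cb[:, :r], s

    # Toy 4-state system (stable, weakly coupled modes) reduced to 2 states
    A = np.diag([-1.0, -2.0, -5.0, -10.0]) + 0.1 * np.ones((4, 4))
    B = np.array([[1.0], [0.5], [0.2], [0.1]])
    C = np.array([[1.0, 0.0, 0.0, 0.0]])
    Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
    print("Hankel singular values:", np.round(hsv, 4))
    ```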

  7. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  8. BUILDING NEW BUSINESS MODELS FOR SUSTAINABLE GROWTH AND DEVELOPMENT

    OpenAIRE

    Taco C. R. van Someren; Shuhua van Someren-Wang

    2011-01-01

    Considered are issues of the methodology, methods and ideology of strategic innovation. The use of this approach's tools is offered as a mechanism to develop and build business models for sustainable socio-economic growth and development of different regions. The connection between key problems of sustainable development and the management policy of different economic entities is studied. The consultancy company Ynnovate’s experience in addressing these issues in the EU and China i...

  9. Building and Verifying a Predictive Model of Interruption Resumption

    Science.gov (United States)

    2012-03-01

    the gardener to remember those plants (and whether they need to be removed), and so will not commit resources to remember that information. The overall... camera), the storyteller needed help much less often. This result suggests that when there is no one to help them remember the last thing they said... INVITED PAPER: Building and Verifying a Predictive Model of Interruption Resumption. Help from a robot, to allow a human storyteller to continue

  10. BUILDING INFORMATION MODELS FOR MONITORING AND SIMULATION DATA IN HERITAGE BUILDINGS

    Directory of Open Access Journals (Sweden)

    D. P. Pocobelli

    2018-05-01

    Full Text Available This paper analyses the use of BIM in heritage buildings, assessing the state of the art and finding paths for further development. Specifically, this work is part of a broader project whose final aim is to support stakeholders through BIM. Given that humidity is one of the major causes of weathering, being able to detect, depict and forecast it is a key task. A BIM model of a heritage building – enhanced with the integration of a weathering forecasting model – will be able to give detailed information on possible degradation patterns, and when they will happen. This information can be effectively used to plan both ordinary and extraordinary maintenance. The Jewel Tower in London, our case study, is digitised using combined laser scanning and photogrammetry, and a virtual model is produced. The point cloud derived from combined laser scanning & photogrammetry is traced out with Autodesk Revit, where the main volumetry (gross walls and floors) is created with parametric objects. Surface characterisation of the façade is given through renderings. Specifically, new rendering materials have been created for this purpose, based on rectified photos of the Tower. The model is then integrated with moisture data, organised in spreadsheets and linked to it via parametric objects representing the points where measurements had been previously taken. The spatial distribution of moisture is then depicted using Dynamo. This simple exercise demonstrates the potential Dynamo has for condition reporting, and future work will concentrate on the creation of a complex forecasting model to be linked through it.

  11. Model building strategy for logistic regression: purposeful selection.

    Science.gov (United States)

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. This article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness-of-fit (GOF); in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
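    The article works in R; the sketch below shows the same likelihood ratio test idea in Python with statsmodels on synthetic data, comparing a full model against one with a candidate covariate removed. Variable names and effect sizes are made up for illustration.

    ```python
    # Likelihood ratio test for dropping one covariate from a logistic model,
    # on synthetic data; variable names and effects are illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 500
    df = pd.DataFrame({
        "age": rng.normal(60, 10, n),
        "lactate": rng.normal(2.0, 0.8, n),
        "sex": rng.integers(0, 2, n),
    })
    logit_p = -8 + 0.08 * df["age"] + 0.9 * df["lactate"]          # 'sex' has no true effect
    df["death"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    full = smf.logit("death ~ age + lactate + sex", data=df).fit(disp=False)
    reduced = smf.logit("death ~ age + lactate", data=df).fit(disp=False)

    # LR statistic: 2 * (log-likelihood difference), chi-square with 1 df here
    lr = 2 * (full.llf - reduced.llf)
    p_value = stats.chi2.sf(lr, df=1)
    print(f"LR = {lr:.3f}, p = {p_value:.3f}")  # large p -> dropping 'sex' is acceptable
    ```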

  12. Building and testing models with extended Higgs sectors

    Science.gov (United States)

    Ivanov, Igor P.

    2017-07-01

    Models with non-minimal Higgs sectors represent a mainstream direction in theoretical exploration of physics opportunities beyond the Standard Model. Extended scalar sectors help alleviate difficulties of the Standard Model and lead to a rich spectrum of characteristic collider signatures and astroparticle consequences. In this review, we introduce the reader to the world of extended Higgs sectors. Not pretending to exhaustively cover the entire body of literature, we walk through a selection of the most popular examples: the two- and multi-Higgs-doublet models, as well as singlet and triplet extensions. We will show how one typically builds models with extended Higgs sectors, describe the main goals and the challenges which arise on the way, and mention some methods to overcome them. We will also describe how such models can be tested, what are the key observables one focuses on, and illustrate the general strategy with a subjective selection of results.

  13. Scalable photoreactor for hydrogen production

    KAUST Repository

    Takanabe, Kazuhiro; Shinagawa, Tatsuya

    2017-01-01

    Provided herein are scalable photoreactors that can include a membrane-free water-splitting electrolyzer and systems that can include a plurality of membrane-free water-splitting electrolyzers. Also provided herein are methods of using the scalable photoreactors provided herein.

  14. Scalable photoreactor for hydrogen production

    KAUST Repository

    Takanabe, Kazuhiro

    2017-04-06

    Provided herein are scalable photoreactors that can include a membrane-free water-splitting electrolyzer and systems that can include a plurality of membrane-free water-splitting electrolyzers. Also provided herein are methods of using the scalable photoreactors provided herein.

  15. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  16. Designing a Scalable Fault Tolerance Model for High Performance Computational Chemistry: A Case Study with Coupled Cluster Perturbative Triples.

    Science.gov (United States)

    van Dam, Hubertus J J; Vishnu, Abhinav; de Jong, Wibe A

    2011-01-11

    In the past couple of decades, the massive computational power provided by the most modern supercomputers has resulted in simulation of higher-order computational chemistry methods, previously considered intractable. As the system sizes continue to increase, the computational chemistry domain continues to escalate this trend using parallel computing with programming models such as Message Passing Interface (MPI) and Partitioned Global Address Space (PGAS) programming models such as Global Arrays. The ever increasing scale of these supercomputers comes at a cost of reduced Mean Time Between Failures (MTBF), currently on the order of days and projected to be on the order of hours for upcoming extreme scale systems. While traditional disk-based checkpointing methods are ubiquitous for storing intermediate solutions, they suffer from the high overhead of writing and recovering from checkpoints. In practice, checkpointing itself often brings the system down. Clearly, methods beyond checkpointing are imperative to handling the aggravating issue of reducing MTBF. In this paper, we address this challenge by designing and implementing an efficient fault tolerant version of the Coupled Cluster (CC) method with NWChem, using in-memory data redundancy. We present the challenges associated with our design, including an efficient data storage model, maintenance of at least one consistent data copy, and the recovery process. Our performance evaluation without faults shows that the current design exhibits a small overhead. In the presence of a simulated fault, the proposed design incurs negligible overhead in comparison to the state of the art implementation without faults.

  17. Building Modelling Methodologies for Virtual District Heating and Cooling Networks

    Energy Technology Data Exchange (ETDEWEB)

    Saurav, Kumar; Choudhury, Anamitra R.; Chandan, Vikas; Lingman, Peter; Linder, Nicklas

    2017-10-26

    District heating and cooling systems (DHC) are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivates the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as such networks have several components interacting with each other. In this paper we present two building modelling methodologies for modelling the consumer buildings. These models will be further integrated with the network model and the control system layer to create a virtual test bed for the entire DHC system. The model is validated using data collected from a real-life DHC system located at Lulea, a city on the coast of northern Sweden. The test bed will then be used for simulating various test cases such as peak energy reduction and overall demand reduction.

  18. Toward Accessing Spatial Structure from Building Information Models

    Science.gov (United States)

    Schultz, C.; Bhatt, M.

    2011-08-01

    Data about building designs and layouts is becoming increasingly available. In the near future, service personnel (such as maintenance staff or emergency rescue workers) arriving at a building site will have immediate real-time access to enormous amounts of data relating to structural properties, utilities, materials, temperature, and so on. The critical problem for users is the taxing and error-prone task of interpreting such a large body of facts in order to extract salient information. This is necessary for comprehending a situation and deciding on a plan of action, and is a particularly serious issue in time-critical and safety-critical activities such as firefighting. Current unifying building models such as the Industry Foundation Classes (IFC), while being comprehensive, do not directly provide data structures that focus on spatial reasoning and the spatial modalities that are required for high-level analytical tasks. The aim of the research presented in this paper is to provide computational tools for higher-level querying and reasoning that shift the cognitive burden of dealing with enormous amounts of data away from the user. The user can then spend more energy and time on planning and decision making in order to accomplish the tasks at hand. We present an overview of our framework that provides users with an enhanced model of "built-up space". In order to test our approach using realistic design data (in terms of both scale and the nature of the building models) we describe how our system interfaces with IFC, and we conduct timing experiments to determine the practicality of our approach. We discuss general computational approaches for deriving higher-level spatial modalities by focusing on the example of route graphs. Finally, we present a firefighting scenario with alternative route graphs to motivate the application of our framework.
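    A toy illustration of a route graph, the spatial modality highlighted above, is sketched below with networkx; the rooms, connections and lengths are hypothetical and are not derived from IFC data.

    ```python
    # Toy route graph over rooms and doors (names are hypothetical), illustrating
    # the kind of higher-level spatial modality derived from building data.
    import networkx as nx

    G = nx.Graph()
    # Nodes are navigable spaces; edges are door/corridor connections with lengths (m)
    G.add_edge("entrance", "corridor_1", length=8.0)
    G.add_edge("corridor_1", "office_101", length=3.0)
    G.add_edge("corridor_1", "stairwell_A", length=12.0)
    G.add_edge("stairwell_A", "corridor_2", length=6.0)
    G.add_edge("corridor_2", "plant_room", length=4.0)

    route = nx.shortest_path(G, "entrance", "plant_room", weight="length")
    dist = nx.shortest_path_length(G, "entrance", "plant_room", weight="length")
    print(" -> ".join(route), f"({dist} m)")

    # An alternative route graph (e.g. one excluding a smoke-filled stairwell) can
    # be obtained by removing the affected nodes before re-running the query.
    ```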

  19. Model building in the free-fermionic formulation of superstrings

    International Nuclear Information System (INIS)

    Dreiner, H.K.

    1989-01-01

    In this thesis the author presents results in the free fermionic formulation of string theory in four space-time dimensions as presented by I. Antoniadis and C. Bachas. First he discusses how to build N = 1 space-time supersymmetric models. He also uses the low-energy requirements of N = 1 space-time supersymmetry as well as chiral space-time fermions to show that the spectrum does not contain any massless scalar fields which transform under the adjoint representation of the gauge group. He also discusses the consequences of these results for model building efforts. In Chapter 1 and 2 he introduces the concepts of string theory as well as the notation which he will be using throughout the following chapters. In Chapter 3 he reviews the free fermionic formulation of string theory as presented by [AB] including the rules for model building. He first classifies all the possible single boundary conditions for the free fermionic fields in the theory and then classifies the cases for which two or more distinct boundary conditions are compatible. In Chapter 4 he uses the rules from Chapter 3 to construct several toy models, which show what possible gauge groups can arise in the theory and how they can be constructed. In Chapter 5 he uses the classification of the boundary conditions for the fermionic fields to classify all the models with N = 4 spacetime supersymmetry. He then discusses the different possibilities to obtain models with N = 2, 1, and 0 spacetime supersymmetry. He shows that the requirement of N = 1 spacetime supersymmetry severely restricts the allowed constructions of the world-sheet supercharge. In Chapter 6 he proves, using the requirement of N = 1 space-time supersymmetry, that the spectrum does not contain any massless scalar fields transforming as the adjoint representation of the gauge group

  20. Building information modeling in the architectural design phases

    DEFF Research Database (Denmark)

    Hermund, Anders

    2009-01-01

    The overall economic benefits of Building Information Modeling are generally comprehensible, but are there other problems with the implementation of BIM as a formalized system in a field that ultimately is dependent on a creative input? Is optimization and economic benefit really contributing...... with an architectural quality? In Denmark the implementation of the digital working methods related to BIM was introduced by government law in 2007. Will the important role of the architect as designer change in accordance with these new methods, and does the idea of one big integrated model represent a paradox...... in relation to designing? The BIM mindset requires changes on many levels....

  1. Modeling of electromigration salt removal methods in building materials

    DEFF Research Database (Denmark)

    Johannesson, Björn; Ottosen, Lisbeth M.

    2008-01-01

    for salt attack of various kinds, is one potential method to preserve old building envelopes. By establishing a model for ionic multi-species diffusion, which also accounts for externally applied electrical fields, it is proposed that an important complement to the experimental tests and that verification...... with its ionic mobility properties. It is, further, assumed that Gauss’s law can be used to calculate the internal electrical field induced by the diffusion itself. In this manner the externally applied electrical field can be modeled, simply, by assigning proper boundary conditions for the equation...

  2. Integration of Models of Building Interiors with Cadastral Data

    Directory of Open Access Journals (Sweden)

    Gotlib Dariusz

    2017-12-01

    Full Text Available Demand for applications which use models of building interiors is growing and highly diversified. Those models are applied at the design and construction stage of a building, in applications which support real estate management, in navigation and marketing systems and, finally, in crisis management and security systems. They are created on the basis of different data: architectural and construction plans, both in analogue form and as CAD files, BIM data files, laser scanning (TLS) and conventional surveys. In this context, the search for solutions which would integrate the existing models and eliminate data redundancy is becoming more important. The authors analysed the possible input of cadastral data (the legal extent of premises) at the stage of creating and updating different models of building interiors. The paper focuses on one issue - the way of describing the geometry of premises based on the most popular source data, i.e. architectural and construction plans. However, the described rules may be considered universal and may also be applied in practice during the creation and updating of indoor models based on BIM datasets or laser scanning point clouds.

  3. Metadata and their impact on processes in Building Information Modeling

    Directory of Open Access Journals (Sweden)

    Vladimir Nyvlt

    2014-04-01

    Full Text Available Building Information Modeling (BIM) itself holds huge potential to increase the effectiveness of every project across its whole life cycle, from the initial investment plan through design and construction activities to long-term usage, property maintenance and, finally, demolition. Knowledge Management, or better said Knowledge Sharing, covers two sets of tools, managerial and technological. Managers' needs are the real expectations and desires of final users in terms of how they could benefit from managing long-term projects covering the whole life cycle, in terms of saving investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users. The paper considers how to use this technology for data and metadata collection, storage and sharing, and which processes these new technologies may support. We also touch on a proposal for optimized processes for better and smoother support of knowledge sharing within the project time-scale, covering its whole life cycle.

  4. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    Model Predictive Control (MPC) of building systems is a promising approach to optimize building energy performance. In contrast to traditional control strategies which are reactive in nature, MPC optimizes the utilization of resources based on the predicted effects. It has been shown that energy ...

  5. Building crop models within different crop modelling frameworks

    NARCIS (Netherlands)

    Adam, M.Y.O.; Corbeels, M.; Leffelaar, P.A.; Keulen, van H.; Wery, J.; Ewert, F.

    2012-01-01

    Modular frameworks for crop modelling have evolved through simultaneous progress in crop science and software development but differences among these frameworks exist which are not well understood, resulting in potential misuse for crop modelling. In this paper we review differences and similarities

  6. Scalable Frequent Subgraph Mining

    KAUST Repository

    Abdelhamid, Ehab

    2017-01-01

    Given an input graph, the Frequent Subgraph Mining (FSM) task finds all subgraphs with frequencies exceeding a given threshold. FSM is crucial for graph analysis, and it is an essential building block in a variety

  7. Intratracheal Bleomycin Aerosolization: The Best Route of Administration for a Scalable and Homogeneous Pulmonary Fibrosis Rat Model?

    Directory of Open Access Journals (Sweden)

    Alexandre Robbe

    2015-01-01

    Full Text Available Idiopathic pulmonary fibrosis (IPF) is a chronic disease with a poor prognosis and is characterized by the accumulation of fibrotic tissue in lungs resulting from a dysfunction in the healing process. In humans, the pathological process is patchy and temporally heterogeneous and the exact mechanisms remain poorly understood. Different animal models were thus developed. Among these, intratracheal administration of bleomycin (BLM) is one of the most frequently used methods to induce lung fibrosis in rodents. In the present study, we first characterized histologically the time-course of lung alteration in rats submitted to BLM instillation. Heterogeneous damage was observed among lungs, consisting of an inflammatory phase at early time-points. It was followed by a transition to a fibrotic state characterized by an increased myofibroblast number and collagen accumulation. We then compared instillation and aerosolization routes of BLM administration. The fibrotic process was studied in each pulmonary lobe using a modified Ashcroft scale. The two quantification methods were confronted and the interobserver variability evaluated. Both methods induced fibrosis development as demonstrated by a similar progression of the highest modified Ashcroft score. However, we highlighted that aerosolization allows a more homogeneous distribution of lesions among lungs, with persistence of higher-grade damage over time.

  8. A scalable satellite-based crop yield mapper: Integrating satellites and crop models for field-scale estimation in India

    Science.gov (United States)

    Jain, M.; Singh, B.; Srivastava, A.; Lobell, D. B.

    2015-12-01

    Food security will be challenged over the upcoming decades due to increased food demand, natural resource degradation, and climate change. In order to identify potential solutions to increase food security in the face of these changes, tools that can rapidly and accurately assess farm productivity are needed. With this aim, we have developed generalizable methods to map crop yields at the field scale using a combination of satellite imagery and crop models, and implement this approach within Google Earth Engine. We use these methods to examine wheat yield trends in Northern India, which provides over 15% of the global wheat supply and where over 80% of farmers rely on wheat as a staple food source. In addition, we identify the extent to which farmers are shifting sow date in response to heat stress, and how well shifting sow date reduces the negative impacts of heat stress on yield. To identify local-level decision-making, we map wheat sow date and yield at a high spatial resolution (30 m) using Landsat satellite imagery from 1980 to the present. This unique dataset allows us to examine sow date decisions at the field scale over 30 years, and by relating these decisions to weather experienced over the same time period, we can identify how farmers learn and adapt cropping decisions based on weather through time.

  9. Modelling energy demand in the buildings sector within the EU

    Energy Technology Data Exchange (ETDEWEB)

    O Broin, Eoin

    2012-11-01

    In the on-going effort within the EU to tackle greenhouse gas emissions and secure future energy supplies, the buildings sector is often referred to as offering a large potential for energy savings. The aim of this thesis is to produce scenarios that highlight the parameters that affect the energy demands and thus potentials for savings of the building sector. Top-down and bottom-up approaches to modelling energy demand in EU buildings are applied in this thesis. The top-down approach uses econometrics to establish the historical contribution of various parameters to energy demands for space and water heating in the residential sectors of four EU countries. The bottom-up approach models the explicit impact of trends in energy efficiency improvement on total energy demand in the EU buildings stock. The two approaches are implemented independently, i.e., the results from the top-down studies do not feed into those from the bottom-up studies or vice versa. The explanatory variables used in the top-down approach are: energy prices; heating degree days, as a proxy for outdoor climate; a linear time trend, as a proxy for technology development; and the lag of energy demand, as a proxy for inertia in the system. In this case, inertia refers to the time it takes to replace space and water heating systems in reaction to price changes. The analysis gives long-term price elasticities of demand as follows: for France, -0.17; for Italy, -0.35; for Sweden, -0.27; and for the UK, -0.35. These results reveal that the price elasticity of demand for space and water heating is inelastic in each of these cases. Nonetheless, scenarios created for the period up to 2050 using these elasticities and an annual price increase of 3 % show that demand can be reduced by more than 1 % per year in France and Sweden and by less than 1 % per year in Italy and the UK. In the bottom-up modelling, varying rates for conversion efficiencies, heating standards for new buildings, end-use efficiency, and
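    The record does not reproduce the exact functional form of the top-down model. As a hedged illustration only, a log-linear partial-adjustment specification consistent with the variables listed above (price P, heating degree days HDD, a linear time trend t, and lagged demand) could be written as

      \ln E_t = \alpha + \beta \ln P_t + \gamma\,\mathrm{HDD}_t + \delta\, t + \lambda \ln E_{t-1} + \varepsilon_t

    where beta is the short-run price elasticity and beta/(1 - lambda) the long-run elasticity; for example, a long-run elasticity of -0.35 combined with an assumed lag coefficient of 0.5 would correspond to a short-run coefficient of about -0.175. The symbols and the illustrative lag value are assumptions, not results taken from the thesis.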

  10. CFD Modeling of Airflow in a Livestock Building

    DEFF Research Database (Denmark)

    Rong, Li; Elhadidi, B.; Khalifa, H. E.

    2010-01-01

    In this paper, a 2D simulation for a typical livestock building is performed to assess the ammonia emission removal rate to the atmosphere. Two geometry models are used and compared in order to represent the slatted floor. In the first model the floor is modeled as a slatted floor and in the second...... the accuracy of the porous jump assumption by comparing the velocity, and ammonia concentration in a 2D simulation, heated solid bodies are added to represent the livestock in the following simulations. The results of simulations with heat source also indicate that modeling the slatted floor with slats...... is necessary. Furthermore, the combination of low inlet velocity and heated objects causes the flow to be buoyancy dominated and unsteady. This unsteadiness can be common in similar buoyancy induced flows for high Rayleigh number flow. The paper concludes with tradeoffs suggested for simulation of livestock...

  11. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol

    2011-01-01

    of a computer-aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures...... and phase behavior; 3. unit operations; and 4. process synthesis and design. The methods and tools in each level include: For the first level, a lipid‐database of collected experimental data from the open literature, confidential data from industry and generated data from validated predictive property...... of these unit operations with respect to performance parameters such as minimum total cost, product yield improvement, operability etc., and process intensification for the retrofit of existing biofuel plants. In the fourth level the information and models developed are used as building blocks...

  12. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
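    The record describes the protocol in terms of prediction quality rather than specific statistics. As a sketch of the kind of accuracy metrics commonly used for whole-building baseline models (NMBE and CV(RMSE); these particular formulas are an assumption here, not necessarily the ones prescribed by the LBNL protocol), one could compute:

      import numpy as np

      def baseline_prediction_metrics(measured, predicted, n_parameters=0):
          """Generic whole-building baseline accuracy metrics: NMBE and CV(RMSE).

          Illustrative formulas only, not necessarily those of the LBNL testing protocol.
          """
          measured = np.asarray(measured, dtype=float)
          predicted = np.asarray(predicted, dtype=float)
          residuals = measured - predicted
          dof = max(measured.size - n_parameters, 1)            # degrees of freedom
          nmbe = residuals.sum() / (dof * measured.mean())      # normalized mean bias error
          cv_rmse = np.sqrt((residuals ** 2).sum() / dof) / measured.mean()
          return {"NMBE [%]": 100 * nmbe, "CV(RMSE) [%]": 100 * cv_rmse}

      # Example: 12 months of metered vs. predicted energy use (hypothetical numbers)
      metered   = [310, 295, 280, 240, 210, 230, 260, 270, 250, 245, 285, 305]
      predicted = [300, 300, 275, 245, 215, 225, 255, 275, 255, 240, 280, 310]
      print(baseline_prediction_metrics(metered, predicted, n_parameters=3))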

  13. Theory Building- Towards an understanding of business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2009-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start to think more radically when considering how to innovate their business model. However, the development and innovation of business...... models is a complex venture and has not been widely researched yet. The objective of this paper is therefore 1) to build a [descriptive] theoretical understanding, based on Christensen's (2005) three-step procedure, of business models and their innovation and, as a result of that, 2) to strengthen...... researchers' and practitioners' perspectives as to how the process of business model innovation can be realized. By using various researchers' perspectives and assumptions, we identify relevant inconsistencies, which consequently lead us to propose possible supplementary solutions. We conclude our paper...

  14. THE EFFECT OF BUILDING FAÇADE MODEL ON LIGHT DISTRIBUTION (CASE STUDY: MENARA PHINISI BUILDING OF UNM)

    Directory of Open Access Journals (Sweden)

    Nurul Jamala

    2017-12-01

    Full Text Available Global warming influences the temperature of the earth's surface, which in turn has an impact on energy consumption, especially in buildings. Utilization of daylight is one of the factors that needs to be considered in order to minimize the energy consumed for artificial lighting. This study analyzed the distribution of light in the Menara Phinisi building of Makassar State University, using a quantitative research method to describe simulation data from the Autodesk Ecotect program. The research objective was to determine the effect of the building facade model on the illumination values inside the building. The results show that the decrease in light distribution between the building with and without the facade is 3.16%, or 236 lux, and that the horizontal and diagonal facade models differ by 2.5%. The design analysis of the building serves as a guide for analyzing the influence of the building facade model so that energy-efficient buildings can be created.

  15. Automatic generation and simulation of urban building energy models based on city datasets for city-scale building retrofit analysis

    International Nuclear Information System (INIS)

    Chen, Yixing; Hong, Tianzhen; Piette, Mary Ann

    2017-01-01

    Highlights: •Developed methods and used data models to integrate city’s public building records. •Shading from neighborhood buildings strongly influences urban building performance. •A case study demonstrated the workflow, simulation and analysis of building retrofits. •CityBES retrofit analysis feature provides actionable information for decision making. •Discussed significance and challenges of urban building energy modeling. -- Abstract: Buildings in cities consume 30–70% of total primary energy, and improving building energy efficiency is one of the key strategies towards sustainable urbanization. Urban building energy models (UBEM) can support city managers to evaluate and prioritize energy conservation measures (ECMs) for investment and the design of incentive and rebate programs. This paper presents the retrofit analysis feature of City Building Energy Saver (CityBES) to automatically generate and simulate UBEM using EnergyPlus based on cities’ building datasets and user-selected ECMs. CityBES is a new open web-based tool to support city-scale building energy efficiency strategic plans and programs. The technical details of using CityBES for UBEM generation and simulation are introduced, including the workflow, key assumptions, and major databases. Also presented is a case study that analyzes the potential retrofit energy use and energy cost savings of five individual ECMs and two measure packages for 940 office and retail buildings in six city districts in northeast San Francisco, United States. The results show that: (1) all five measures together can save 23–38% of site energy per building; (2) replacing lighting with light-emitting diode lamps and adding air economizers to existing heating, ventilation and air-conditioning (HVAC) systems are most cost-effective with an average payback of 2.0 and 4.3 years, respectively; and (3) it is not economical to upgrade HVAC systems or replace windows in San Francisco due to the city’s mild

  16. Modeling, simulation, and fabrication of a fully integrated, acid-stable, scalable solar-driven water-splitting system.

    Science.gov (United States)

    Walczak, Karl; Chen, Yikai; Karp, Christoph; Beeman, Jeffrey W; Shaner, Matthew; Spurgeon, Joshua; Sharp, Ian D; Amashukeli, Xenia; West, William; Jin, Jian; Lewis, Nathan S; Xiang, Chengxiang

    2015-02-01

    A fully integrated solar-driven water-splitting system comprised of WO3/FTO/p+n-Si as the photoanode, Pt/TiO2/Ti/n+p-Si as the photocathode, and Nafion as the membrane separator, was simulated, assembled, operated in 1.0 M HClO4, and evaluated for performance and safety characteristics under dual side illumination. A multi-physics model that accounted for the performance of the photoabsorbers and electrocatalysts, ion transport in the solution electrolyte, and gaseous product crossover was first used to define the optimal geometric design space for the system. The photoelectrodes and the membrane separators were then interconnected in a louvered design system configuration, for which the light-absorbing area and the solution-transport pathways were simultaneously optimized. The performance of the photocathode and the photoanode were separately evaluated in a traditional three-electrode photoelectrochemical cell configuration. The photocathode and photoanode were then assembled back-to-back in a tandem configuration to provide sufficient photovoltage to sustain solar-driven unassisted water-splitting. The current-voltage characteristics of the photoelectrodes showed that the low photocurrent density of the photoanode limited the overall solar-to-hydrogen (STH) conversion efficiency due to the large band gap of WO3. A hydrogen-production rate of 0.17 mL hr-1 and a STH conversion efficiency of 0.24% was observed in a full cell configuration for >20 h with minimal product crossover in the fully operational, intrinsically safe, solar-driven water-splitting system. The solar-to-hydrogen conversion efficiency, ηSTH, calculated using the multiphysics numerical simulation was in excellent agreement with the experimental behavior of the system. The value of ηSTH was entirely limited by the performance of the photoelectrochemical assemblies employed in this study. The louvered design provides a robust platform for implementation of various types of
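    For context, the solar-to-hydrogen efficiency quoted above is conventionally defined from the operating photocurrent density, the 1.23 V thermodynamic water-splitting potential, the Faradaic efficiency for hydrogen evolution, and the incident solar power. This generic textbook definition (a simplification, not the full multi-physics model used in the paper) is

      \eta_{\mathrm{STH}} = \frac{J_{\mathrm{op}} \times 1.23\,\mathrm{V} \times \eta_F}{P_{\mathrm{in}}}

    where J_op is the operating current density, η_F the Faradaic efficiency for hydrogen evolution, and P_in the incident illumination power density (typically 100 mW cm-2 for 1-sun conditions).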

  17. Reduced order modeling and parameter identification of a building energy system model through an optimization routine

    International Nuclear Information System (INIS)

    Harish, V.S.K.V.; Kumar, Arun

    2016-01-01

    Highlights: • A BES model based on 1st principles is developed and solved numerically. • Parameters of the lumped capacitance model are fitted using the proposed optimization routine. • Validations are shown for different types of building construction elements. • Step response excitations for outdoor air temperature and relative humidity are analyzed. - Abstract: Different control techniques together with intelligent building technology (Building Automation Systems) are used to improve the energy efficiency of buildings. In almost all control projects, it is crucial to have building energy models with high computational efficiency in order to design and tune the controllers and simulate their performance. In this paper, a set of partial differential equations is formulated accounting for energy flow within the building space. These equations are then solved as conventional finite difference equations using the Crank–Nicolson scheme. Such a higher-order model is regarded as the benchmark model. An optimization algorithm has been developed, depicted through a flowchart, which minimizes the sum squared error between the step responses of the numerical and the optimal model. The optimal model of the construction element is nothing but an RC-network model with the values of Rs and Cs estimated using the non-linear time invariant constrained optimization routine. The model is validated by comparing its step responses with two other RC-network models whose parameter values are selected based on certain criteria. Validations are shown for different types of building construction elements, viz. low, medium and heavy thermal capacity elements. Simulation results show that the optimal model closely follows the step responses of the numerical model as compared to the responses of the other two models.
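    As a minimal sketch of the kind of fitting described above (estimating R and C values by minimizing the sum squared error between step responses), the following fragment fits a one-node RC model to a synthetic reference response. The numbers and the one-node structure are assumptions for illustration; the paper's routine works with multi-node RC networks of real construction elements.

      import numpy as np
      from scipy.optimize import minimize

      # Reference step response of a wall element. In the paper this would come
      # from the finite-difference benchmark model; here it is synthetic data.
      t = np.linspace(0.0, 48 * 3600.0, 200)            # 48 hours, in seconds
      reference = 1.0 - np.exp(-t / (6.0 * 3600.0))     # hypothetical benchmark response

      def rc_step_response(params, t):
          """Normalized step response of a one-node RC model (time constant R*C)."""
          R, C = params
          return 1.0 - np.exp(-t / (R * C))

      def sse(params):
          """Sum squared error between the RC model and the benchmark response."""
          return np.sum((rc_step_response(params, t) - reference) ** 2)

      # Constrained fit: R and C must remain positive.
      result = minimize(sse, x0=[0.01, 1.0e6], bounds=[(1e-6, None), (1e-6, None)])
      R_opt, C_opt = result.x
      print(f"fitted R = {R_opt:.3g} K/W, C = {C_opt:.3g} J/K, tau = {R_opt * C_opt / 3600:.2f} h")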

  18. Myria: Scalable Analytics as a Service

    Science.gov (United States)

    Howe, B.; Halperin, D.; Whitaker, A.

    2014-12-01

    At the UW eScience Institute, we're working to empower non-experts, especially in the sciences, to write and use data-parallel algorithms. To this end, we are building Myria, a web-based platform for scalable analytics and data-parallel programming. Myria's internal model of computation is the relational algebra extended with iteration, such that every program is inherently data-parallel, just as every query in a database is inherently data-parallel. But unlike databases, iteration is a first-class concept, allowing us to express machine learning tasks, graph traversal tasks, and more. Programs can be expressed in a number of languages and can be executed on a number of execution environments, but we emphasize a particular language called MyriaL that supports both imperative and declarative styles and a particular execution engine called MyriaX that uses an in-memory column-oriented representation and asynchronous iteration. We deliver Myria over the web as a service, providing an editor, performance analysis tools, and catalog browsing features in a single environment. We find that this web-based "delivery vector" is critical in reaching non-experts: they are insulated from the irrelevant technical work associated with installation, configuration, and resource management. The MyriaX backend, one of several execution runtimes we support, is a main-memory, column-oriented, RDBMS-on-the-worker system that supports cyclic data flows as a first-class citizen and has been shown to outperform competitive systems on 100-machine cluster sizes. I will describe the Myria system, give a demo, and present some new results in large-scale oceanographic microbiology.

  19. Scalable Nanomanufacturing—A Review

    Directory of Open Access Journals (Sweden)

    Khershed Cooper

    2017-01-01

    Full Text Available This article describes the field of scalable nanomanufacturing, its importance and need, its research activities and achievements. The National Science Foundation is taking a leading role in fostering basic research in scalable nanomanufacturing (SNM. From this effort several novel nanomanufacturing approaches have been proposed, studied and demonstrated, including scalable nanopatterning. This paper will discuss SNM research areas in materials, processes and applications, scale-up methods with project examples, and manufacturing challenges that need to be addressed to move nanotechnology discoveries closer to the marketplace.

  20. Scalable Nonlinear Compact Schemes

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Debojyoti [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil M. [Univ. of Chicago, IL (United States); Brown, Jed [Univ. of Colorado, Boulder, CO (United States)

    2014-04-01

    In this work, we focus on compact schemes resulting in tridiagonal systems of equations, specifically the fifth-order CRWENO scheme. We propose a scalable implementation of the nonlinear compact schemes by implementing a parallel tridiagonal solver based on the partitioning/substructuring approach. We use an iterative solver for the reduced system of equations; however, we solve this system to machine zero accuracy to ensure that no parallelization errors are introduced. It is possible to achieve machine-zero convergence with few iterations because of the diagonal dominance of the system. The number of iterations is specified a priori instead of a norm-based exit criterion, and collective communications are avoided. The overall algorithm thus involves only point-to-point communication between neighboring processors. Our implementation of the tridiagonal solver differs from and avoids the drawbacks of past efforts in the following ways: it introduces no parallelization-related approximations (multiprocessor solutions are exactly identical to uniprocessor ones), it involves minimal communication, the mathematical complexity is similar to that of the Thomas algorithm on a single processor, and it does not require any communication and computation scheduling.
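    For background, the serial kernel that the parallel solver above builds on is the Thomas algorithm for tridiagonal systems. A plain single-processor version (a generic sketch, not the report's partitioned/substructured implementation) looks like this:

      import numpy as np

      def thomas_solve(a, b, c, d):
          """Solve a tridiagonal system with sub-diagonal a, diagonal b,
          super-diagonal c and right-hand side d (all length n; a[0] and c[-1]
          are unused). Serial Thomas algorithm, O(n) operations."""
          n = len(d)
          cp = np.empty(n); dp = np.empty(n)
          cp[0] = c[0] / b[0]
          dp[0] = d[0] / b[0]
          for i in range(1, n):                      # forward elimination
              m = b[i] - a[i] * cp[i - 1]
              cp[i] = c[i] / m
              dp[i] = (d[i] - a[i] * dp[i - 1]) / m
          x = np.empty(n)
          x[-1] = dp[-1]
          for i in range(n - 2, -1, -1):             # back substitution
              x[i] = dp[i] - cp[i] * x[i + 1]
          return x

      # Quick check against a dense solve on a small diagonally dominant system
      n = 6
      a = np.r_[0.0, np.random.rand(n - 1)]
      c = np.r_[np.random.rand(n - 1), 0.0]
      b = 2.0 + np.random.rand(n)                    # ensures diagonal dominance
      d = np.random.rand(n)
      A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
      assert np.allclose(thomas_solve(a, b, c, d), np.linalg.solve(A, d))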

  1. Modelling piezoelectric energy harvesting potential in an educational building

    International Nuclear Information System (INIS)

    Li, Xiaofeng; Strezov, Vladimir

    2014-01-01

    Highlights: • Energy harvesting potential of commercialized piezoelectric tiles is analyzed. • The parameters which will affect the energy harvesting efficiency are determined. • The potential could cover 0.5% of the total energy usage of the library building. • A simplified evaluation indicator is proposed to test the considered paving area. - Abstract: In this paper, potential application of a commercial piezoelectric energy harvester in a central hub building at Macquarie University in Sydney, Australia is examined and discussed. Optimization of the piezoelectric tile deployment is presented according to the frequency of pedestrian mobility and a model is developed where 3.1% of the total floor area with the highest pedestrian mobility is paved with piezoelectric tiles. The modelling results indicate that the total annual energy harvesting potential for the proposed optimized tile pavement model is estimated at 1.1 MW h/year. This potential energy generation may be further increased to 9.9 MW h/year with a possible improvement in piezoelectric energy conversion efficiency integrated into the system. This energy harvesting potential would be sufficient to meet close to 0.5% of the annual energy needs of the building. The study confirms that locating high traffic areas is critical for optimization of the energy harvesting efficiency, as well as the orientation of the tile pavement significantly affects the total amount of the harvested energy. A Density Flow evaluation is recommended in this study to qualitatively evaluate the piezoelectric power harvesting potential of the considered area based on the number of pedestrian crossings per unit time
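    A back-of-the-envelope version of such an estimate can be written in a few lines; the per-step energy, footfall counts and tile numbers below are hypothetical placeholders, not values taken from the study:

      # Hypothetical inputs, illustrative only and not values from the study.
      energy_per_step_J = 2.0          # assumed harvested energy per footstep (J)
      steps_per_tile_per_day = 5000    # assumed footfalls on a high-traffic tile
      tiles = 400                      # assumed number of tiles in the paved area
      days_per_year = 365

      annual_energy_J = energy_per_step_J * steps_per_tile_per_day * tiles * days_per_year
      annual_energy_MWh = annual_energy_J / 3.6e9   # 1 MWh = 3.6e9 J

      print(f"Estimated harvest: {annual_energy_MWh:.2f} MWh/year")
      # With these assumed numbers: 2 * 5000 * 400 * 365 = 1.46e9 J, i.e. about
      # 0.41 MWh/year, the same order of magnitude as the 1.1 MWh/year reported above.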

  2. BUILDING NEW BUSINESS MODELS FOR SUSTAINABLE GROWTH AND DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Taco C. R. van Someren

    2011-06-01

    Full Text Available The paper considers issues of methodology and methods, as well as the ideology, of strategic innovation. The tools of this approach are offered as mechanisms to develop and build business models for sustainable socio-economic growth and development of different regions. The connection between key problems of sustainable development and the management policy of different economic entities is studied. The experience of the consultancy company Ynnovate in addressing these issues in the EU and China is shown, and it is proposed to use this experience and these tools in exploring areas of cross-border economic cooperation between the territories of the Russian Far East and China.

  3. Scalable and balanced dynamic hybrid data assimilation

    Science.gov (United States)

    Kauranne, Tuomo; Amour, Idrissa; Gunia, Martin; Kallio, Kari; Lepistö, Ahti; Koponen, Sampsa

    2017-04-01

    Scalability of complex weather forecasting suites is dependent on the technical tools available for implementing highly parallel computational kernels, but to an equally large extent also on the dependence patterns between various components of the suite, such as observation processing, data assimilation and the forecast model. Scalability is a particular challenge for 4D variational assimilation methods that necessarily couple the forecast model into the assimilation process and subject this combination to an inherently serial quasi-Newton minimization process. Ensemble based assimilation methods are naturally more parallel, but large models force ensemble sizes to be small and that results in poor assimilation accuracy, somewhat akin to shooting with a shotgun in a million-dimensional space. The Variational Ensemble Kalman Filter (VEnKF) is an ensemble method that can attain the accuracy of 4D variational data assimilation with a small ensemble size. It achieves this by processing a Gaussian approximation of the current error covariance distribution, instead of a set of ensemble members, analogously to the Extended Kalman Filter EKF. Ensemble members are re-sampled every time a new set of observations is processed from a new approximation of that Gaussian distribution which makes VEnKF a dynamic assimilation method. After this a smoothing step is applied that turns VEnKF into a dynamic Variational Ensemble Kalman Smoother VEnKS. In this smoothing step, the same process is iterated with frequent re-sampling of the ensemble but now using past iterations as surrogate observations until the end result is a smooth and balanced model trajectory. In principle, VEnKF could suffer from similar scalability issues as 4D-Var. However, this can be avoided by isolating the forecast model completely from the minimization process by implementing the latter as a wrapper code whose only link to the model is calling for many parallel and totally independent model runs, all of them
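    As generic background (a textbook sketch rather than the full VEnKF/VEnKS algorithm described above), the Gaussian update that such filters iterate is the standard Kalman analysis step

      \mathbf{x}^{a} = \mathbf{x}^{f} + \mathbf{K}\left(\mathbf{y} - \mathbf{H}\mathbf{x}^{f}\right), \qquad \mathbf{K} = \mathbf{P}^{f}\mathbf{H}^{\mathsf{T}}\left(\mathbf{H}\mathbf{P}^{f}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right)^{-1}

    where x^f and P^f are the forecast mean and covariance (estimated from the ensemble), H is the observation operator, R the observation error covariance, and the ensemble is re-sampled from the resulting analysis distribution before the next cycle.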

  4. Building America Case Study: Accelerating the Delivery of Home-Performance Upgrades Using a Synergistic Business Model, Minneapolis, Minnesota

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    Achieving Building America energy savings goals (40 percent by 2030) will require many existing homes to install energy upgrades. Engaging large numbers of homeowners in building science-guided upgrades during a single remodeling event has been difficult for a number of reasons. Performance upgrades in existing homes tend to occur over multiple years and usually result from component failures (furnace failure) and weather damage (ice dams, roofing, siding). This research attempted to: A) Understand the homeowner's motivations regarding investing in building-science-based performance upgrades. B) Determine a rapidly scalable approach to engage large numbers of homeowners directly through existing customer networks. C) Access a business model that will manage all aspects of the contractor-homeowner-performance professional interface to ensure good upgrade decisions over time. The solution results from a synergistic approach utilizing networks of suppliers merging with networks of homeowner customers. Companies in the $400 to $800 billion home services industry have proven direct marketing and sales proficiencies that have led to the development of vast customer networks. Companies such as pest control, lawn care, and security have nurtured these networks by successfully addressing the ongoing needs of homes. This long-term access to customers and trust established with consistent delivery has also provided opportunities for home service providers to grow by successfully introducing new products and services like attic insulation and air sealing. The most important component for success is a business model that will facilitate and manage the process. The team analyzes a group that developed a working model.

  5. Building CMU Sphinx language model for the Holy Quran

    Directory of Open Access Journals (Sweden)

    Mohamed Yassine El Amrani

    2016-11-01

    Full Text Available This paper investigates the use of a simplified set of Arabic phonemes in an Arabic speech recognition system applied to the Holy Quran. CMU Sphinx 4 was used to train and evaluate a language model for the Hafs narration of the Holy Quran. The language model was built using a simplified list of Arabic phonemes instead of the commonly used Romanized set in order to simplify the process of generating the language model. The experiments resulted in a very low Word Error Rate (WER) of 1.5%, obtained with a very small set of audio files when all the audio data was used for both the training and the testing phases. However, when using 90% and 80% of the data for training, the WER obtained was 50.0% and 55.7%, respectively.
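    For reference, the word error rate quoted above is a normalized word-level edit distance between the recognized and reference transcripts. A minimal implementation, independent of CMU Sphinx itself and using a hypothetical English example rather than Quranic text, is:

      def word_error_rate(reference: str, hypothesis: str) -> float:
          """WER = (substitutions + deletions + insertions) / reference length,
          computed with a standard Levenshtein dynamic program over words."""
          ref, hyp = reference.split(), hypothesis.split()
          d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
          for i in range(len(ref) + 1):
              d[i][0] = i
          for j in range(len(hyp) + 1):
              d[0][j] = j
          for i in range(1, len(ref) + 1):
              for j in range(1, len(hyp) + 1):
                  cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                  d[i][j] = min(d[i - 1][j] + 1,        # deletion
                                d[i][j - 1] + 1,        # insertion
                                d[i - 1][j - 1] + cost) # substitution
          return d[-1][-1] / max(len(ref), 1)

      # Hypothetical example: one substitution in five words gives 20% WER
      print(word_error_rate("the quick brown fox jumps", "the quick brown dog jumps"))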

  6. Fractional and multivariable calculus model building and optimization problems

    CERN Document Server

    Mathai, A M

    2017-01-01

    This textbook presents a rigorous approach to multivariable calculus in the context of model building and optimization problems. This comprehensive overview is based on lectures given at five SERC Schools from 2008 to 2012 and covers a broad range of topics that will enable readers to understand and create deterministic and nondeterministic models. Researchers, advanced undergraduate, and graduate students in mathematics, statistics, physics, engineering, and biological sciences will find this book to be a valuable resource for finding appropriate models to describe real-life situations. The first chapter begins with an introduction to fractional calculus moving on to discuss fractional integrals, fractional derivatives, fractional differential equations and their solutions. Multivariable calculus is covered in the second chapter and introduces the fundamentals of multivariable calculus (multivariable functions, limits and continuity, differentiability, directional derivatives and expansions of multivariable ...

  7. Dispersion model for airborne radioactive particulates inside a process building

    International Nuclear Information System (INIS)

    Perkins, W.C.; Stoddard, D.H.

    1984-02-01

    An empirical model, predicting the spread of airborne radioactive particles after they are released inside a building, has been developed. The basis for this model is a composite of data for dispersion of airborne activity recorded during 12 case incidents. These incidents occurred at the Savannah River Plant (SRP) during approximately 90 plant-years of experience with the chemical and metallurgical processing of purified neptunium and plutonium. The model illustrates that the multiple-air-zone concept, used in the designs of many nuclear facilities, can be an efficient safety feature to limit the spread of airborne activity from a release. This study also provides some insight into an apparently anomalous behavior of airborne particulates, namely, their migration against the prevailing flow of ventilation air. 2 references, 12 figures, 4 tables

  8. Airflow and radon transport modeling in four large buildings

    International Nuclear Information System (INIS)

    Fan, J.B.; Persily, A.K.

    1995-01-01

    Computer simulations of multizone airflow and contaminant transport were performed in four large buildings using the program CONTAM88. This paper describes the physical characteristics of the buildings and their idealizations as multizone building airflow systems. These buildings include a twelve-story multifamily residential building, a five-story mechanically ventilated office building with an atrium, a seven-story mechanically ventilated office building with an underground parking garage, and a one-story school building. The air change rates and interzonal airflows of these buildings are predicted for a range of wind speeds, indoor-outdoor temperature differences, and percentages of outdoor air intake in the supply air. Simulations of radon transport were also performed in the buildings to investigate the effects of indoor-outdoor temperature difference and wind speed on indoor radon concentrations.

  9. Design issues for numerical libraries on scalable multicore architectures

    International Nuclear Information System (INIS)

    Heroux, M A

    2008-01-01

    Future generations of scalable computers will rely on multicore nodes for a significant portion of overall system performance. At present, most applications and libraries cannot exploit multiple cores beyond running additional MPI processes per node. In this paper we discuss important multicore architecture issues, programming models, algorithm requirements and software design related to effective use of scalable multicore computers. In particular, we focus on important issues for library research and development, making recommendations for how to effectively develop libraries for future scalable computer systems.

  10. Computational scalability of large size image dissemination

    Science.gov (United States)

    Kooper, Rob; Bajcsy, Peter

    2011-01-01

    We have investigated the computational scalability of image pyramid building needed for dissemination of very large image data. The sources of large images include high resolution microscopes and telescopes, remote sensing and airborne imaging, and high resolution scanners. The term 'large' is understood from a user perspective, meaning either larger than the display size or larger than the memory/disk available to hold the image data. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150MB or about 5000x8000 pixels with the total number to be around 200,000) and the UIUC library scanning project for historical maps from the 17th and 18th century (smaller number but larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.
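    A minimal illustration of the pyramid-building step benchmarked above (written with Pillow and omitting the tiling that a real Deep Zoom/Seadragon pyramid adds; the file name is a hypothetical placeholder) repeatedly halves the image until it reaches a display-sized level:

      from PIL import Image

      def build_pyramid(path, min_size=256):
          """Return a list of progressively half-resolution copies of the image.
          A real Deep Zoom / Seadragon pyramid would additionally cut each level
          into fixed-size tiles; that step is omitted here for brevity."""
          levels = [Image.open(path)]
          while min(levels[-1].size) > min_size:
              w, h = levels[-1].size
              levels.append(levels[-1].resize((max(w // 2, 1), max(h // 2, 1))))
          return levels

      # Hypothetical usage on a large map scan:
      # for i, level in enumerate(build_pyramid("map_scan.jpg")):
      #     level.save(f"level_{i}.jpg", quality=90)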

  11. Big data integration: scalability and sustainability

    KAUST Repository

    Zhang, Zhang

    2016-01-26

    Integration of various types of omics data is critically indispensable for addressing most important and complex biological questions. In the era of big data, however, data integration becomes increasingly tedious, time-consuming and expensive, posing a significant obstacle to fully exploiting the wealth of big biological data. Here we propose a scalable and sustainable architecture that integrates big omics data through community-contributed modules. Community modules are contributed and maintained by different committed groups and each module corresponds to a specific data type, deals with data collection, processing and visualization, and delivers data on-demand via web services. Based on this community-based architecture, we build Information Commons for Rice (IC4R; http://ic4r.org), a rice knowledgebase that integrates a variety of rice omics data from multiple community modules, including genome-wide expression profiles derived entirely from RNA-Seq data, resequencing-based genomic variations obtained from re-sequencing data of thousands of rice varieties, plant homologous genes covering multiple diverse plant species, post-translational modifications, rice-related literature, and community annotations. Taken together, such architecture achieves integration of different types of data from multiple community-contributed modules and accordingly features scalable, sustainable and collaborative integration of big data as well as low costs for database update and maintenance, thus helpful for building IC4R into a comprehensive knowledgebase covering all aspects of rice data and beneficial for both basic and translational research.

  12. Application of 6D Building Information Model (6D BIM) for Business-storage Building in Slovenia

    Science.gov (United States)

    Pučko, Zoran; Vincek, Dražen; Štrukelj, Andrej; Šuman, Nataša

    2017-10-01

    The aim of this paper is to present an application of 6D building information modelling (6D BIM) to a real business-storage building in Slovenia. First, the features of building maintenance in general are described according to the current Slovenian legislation, and the general principle of BIM is given. After that, the step-by-step activities for building the 6D BIM are presented, from the element list for maintenance, the determination of element lifetimes and service measures, and cost and time analysis, through to 6D BIM modelling. The presented 6D BIM model is designed in a unique way: the cost analysis is performed as a 5D BIM model with data linked to the BIM construction project management software (Vico Office) and integrated with the 3D BIM model, whereas the time analysis, as a 4D BIM model, is carried out with non-linked data in Excel (without a connection to the 3D BIM model). The paper is intended to serve as a guide for building owners preparing a 6D BIM and to provide insight into the relevant dynamic information about the intervals and costs of maintenance works over the whole building lifecycle.

  13. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    Science.gov (United States)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not only limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with an optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions of existing buildings directly translate into energy cost saving in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and

  14. The theoretical modelling of aerosol behaviour within containment buildings

    International Nuclear Information System (INIS)

    Dunbar, I.H.

    1988-01-01

    The modelling of the deposition of aerosol particles within the containment building plays an important part in determining the effectiveness of the building in reducing releases of activity following accidents. This paper describes attempts to ensure the accuracy of computer codes which model aerosol behaviour, with special reference to the code AEROSIM-M. Code intercomparisons have been used to test the reliability of the coding and the accuracy of the numerical methods. Those codes which assume that the particle size distribution is always lognormal give significantly different results from those which do not make this assumption but instead discretise the range of particle sizes. When the same physical assumptions are made, the predictions of different discrete codes are in reasonable agreement. In comparisons between an earlier version of AEROSIM and sodium fire experiments, the code achieved good agreement on the overall time-scale of deposition. An extensive set of tests of AEROSIM-M against experiments relevant to LWR conditions is underway. (author)

  15. Building predictive models of soil particle-size distribution

    Directory of Open Access Journals (Sweden)

    Alessandro Samuel-Rosa

    2013-04-01

    Full Text Available Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land-surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance. This performance is similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, the PM performance can reach 70%. Largest uncertainties were observed in geologically more complex areas. Therefore, significant improvements in the predictions can only be achieved if accurate geological data is made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the particle-size distribution (psd) of soils in regions of complex geology.
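    The kind of terrain-attribute regression described above can be sketched as follows. The column names, coefficients and data are synthetic placeholders rather than the study's dataset; the snippet only illustrates the multiple-linear-regression workflow:

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      # Hypothetical table of samples: terrain attributes -> clay content (%)
      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "elevation": rng.uniform(100, 500, 200),
          "slope": rng.uniform(0, 30, 200),
          "catchment_area": rng.uniform(1e2, 1e5, 200),
          "convergence_index": rng.uniform(-50, 50, 200),
          "topographic_wetness_index": rng.uniform(2, 15, 200),
      })
      # Synthetic response, constructed only so the example runs end to end
      df["clay_pct"] = (30 - 0.02 * df["elevation"]
                        + 0.5 * df["topographic_wetness_index"]
                        + rng.normal(0, 3, 200))

      X = df.drop(columns="clay_pct")
      y = df["clay_pct"]
      model = LinearRegression().fit(X, y)
      r2_cv = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
      print(f"cross-validated R^2 on synthetic data: {r2_cv:.2f}")  # illustrative only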

  16. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  17. Updating of a dynamic finite element model from the Hualien scale model reactor building

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Lebailly, P.

    1996-08-01

    The forces occurring at the soil-structure interface of a building generally have a large influence on the way the building reacts to an earthquake. One can be tempted to characterise these forces more accurately by updating a model of the structure. However, this procedure requires an updating method suitable for dissipative models, since significant damping can be observed at the soil-structure interface of buildings. Such a method is presented here. It is based on the minimization of a mechanical energy built from the difference between Eigen data calculated by the model and Eigen data issued from experimental tests on the real structure. An experimental validation of this method is then proposed on a model of the HUALIEN scale-model reactor building. This scale-model, built on the HUALIEN site of TAIWAN, is devoted to the study of soil-structure interaction. The updating concerned the soil impedances, modelled by a layer of springs and viscous dampers attached to the building foundation. A good agreement was found between the Eigen modes and dynamic responses calculated by the updated model and the corresponding experimental data. (authors). 12 refs., 3 figs., 4 tabs

  18. 5D Building Information Modelling – A Practicability Review

    Directory of Open Access Journals (Sweden)

    Lee Xia Sheng

    2016-01-01

    Full Text Available Quality, time and cost are the three most important elements in any construction project. Building information that comes timely and accurately in multiple dimensions will facilitate a refined decision making process which can improve the construction quality, time and cost. 5 dimensional Building Information Modelling or 5D BIM is an emerging trend in the construction industry that integrates all the major information starting from the initial design to the final construction stage. After that, the integrated information is arranged and communicated through Virtual Design and Construction (VDC). This research is to gauge the practicability of 5D BIM with an action research type pilot study by the means of hands-on modelling of a conceptual bungalow design based on one of the most popular BIM tools. A bungalow is selected as a study subject to simulate the major stages of 5D BIM digital workflow. The whole process starts with developing drawings (2D) into a digital model (3D), and is followed by the incorporation of time (4D) and cost (5D). Observations are focused on the major factors that will affect the practicability of 5D BIM, including the modelling effort, inter-operability, information output and limitations. This research concludes that 5D BIM certainly has high level practicability which further differentiates BIM from Computer Aided Design (CAD). The integration of information not only enhanced the efficiency and accuracy of process in all stages, but also enabled decision makers to have a sophisticated interpretation of information which is almost impossible with the conventional 2D CAD workflow. Although it is possible to incorporate more than 5 dimensions of information, it is foreseeable that excessive information may escalate the complexity unfavourably for BIM implementation. 5D BIM has achieved a significant level of practicability; further research should be conducted to streamline implementation. Once 5D BIM is matured and widely

  19. From neurons to nests: nest-building behaviour as a model in behavioural and comparative neuroscience.

    Science.gov (United States)

    Hall, Zachary J; Meddle, Simone L; Healy, Susan D

    Despite centuries of observing the nest building of most extant bird species, we know surprisingly little about how birds build nests and, specifically, how the avian brain controls nest building. Here, we argue that nest building in birds may be a useful model behaviour in which to study how the brain controls behaviour. Specifically, we argue that nest building as a behavioural model provides a unique opportunity to study not only the mechanisms through which the brain controls behaviour within individuals of a single species but also how evolution may have shaped the brain to produce interspecific variation in nest-building behaviour. In this review, we outline the questions in both behavioural and comparative neuroscience that nest building could be used to address, summarize recent findings regarding the neurobiology of nest building in lab-reared zebra finches and across species building different nest structures, and suggest some future directions for the neurobiology of nest building.

  20. Comparison of different approaches of modelling in a masonry building

    Science.gov (United States)

    Saba, M.; Meloni, D.

    2017-12-01

    The objective of the present work is to model a simple masonry building using two different modelling methods in order to assess their validity in evaluating static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri by S.T.A. Data S.r.l. and Sismicad12 by Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements Method (FME), which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results at a greater computational burden. Remarkable differences in the static stresses between the two approaches have been found for such a simple structure, and a comparison and analysis of the reasons is proposed.

  1. Modeling impact damper in building frames using GAP element

    Directory of Open Access Journals (Sweden)

    Seyed Mehdi Zahrai

    2017-05-01

    Full Text Available The main mechanism by which impact dampers control vibration is the disruption of the structural oscillation amplitude through small forces induced by auxiliary masses, which reduces strong vibrations. So far, modeling of the impact damper has been conducted solely through MATLAB software, whose functionality is naturally limited for research and development purposes compared to common programs such as SAP2000 and ETABS. In this paper, a single-degree-of-freedom (SDOF) system is first modeled under harmonic loading with a maximum amplitude of 0.4g in the SAP2000 program, and the results are compared with a numerical model. In this way the proposed model is validated, and the SDOF system equipped with an impact damper is then investigated under the Kobe and Northridge earthquake records using the SAP2000 model. Based on the obtained results, the structures considered in this study, when equipped with an impact damper, show better seismic performance under the Kobe and Northridge earthquakes, with maximum displacements reduced by 6% and 33%, respectively. Finally, impact dampers are modeled in a 4-story building structure with concentric bracing, leading to a 12% reduction in story drifts.
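    A stripped-down illustration of the mechanism is given below: an SDOF structure carrying a small auxiliary mass that impacts stops on either side of a clearance gap, integrated by explicit time stepping. All parameter values are hypothetical, and this is a conceptual sketch rather than the SAP2000 GAP-element model used in the paper.

      import numpy as np

      # Hypothetical parameters, not taken from the paper
      m, k, c = 1.0e4, 4.0e5, 2.0e3        # structure mass (kg), stiffness (N/m), damping (N s/m)
      md, gap, kc = 0.02 * m, 0.01, 1.0e7  # damper mass (kg), clearance (m), contact stiffness (N/m)

      dt, T = 1.0e-4, 20.0
      n = int(T / dt)
      t = np.arange(n) * dt
      ag = 0.4 * 9.81 * np.sin(2 * np.pi * 2.0 * t)   # 0.4 g harmonic base acceleration

      x = v = xd = vd = 0.0       # structure and damper displacement/velocity relative to ground
      peak = 0.0
      for i in range(n):
          rel = xd - x
          # contact springs on either side engage only when the clearance is exceeded
          f_contact = kc * (rel - gap) if rel > gap else (kc * (rel + gap) if rel < -gap else 0.0)
          a  = (-c * v - k * x + f_contact - m * ag[i]) / m     # structure acceleration
          ad = (-f_contact - md * ag[i]) / md                   # auxiliary-mass acceleration
          v += a * dt;  x += v * dt
          vd += ad * dt; xd += vd * dt
          peak = max(peak, abs(x))

      print(f"peak structural displacement ~ {peak * 1000:.1f} mm")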

  2. The Creation of Space Vector Models of Buildings From RPAS Photogrammetry Data

    Directory of Open Access Journals (Sweden)

    Trhan Ondrej

    2017-06-01

    Full Text Available The results of Remote Piloted Aircraft System (RPAS) photogrammetry are digital surface models and orthophotos. The main problem with the digital surface models obtained is that building walls are not perpendicular and the shape of roofs is deformed. The task of this paper is to obtain a more accurate digital surface model using building reconstructions. The paper discusses the problem of obtaining and approximating building footprints, reconstructing the final spatial vector digital building model, and modifying the buildings on the digital surface model.

  3. Regulatory odour model development: Survey of modelling tools and datasets with focus on building effects

    DEFF Research Database (Denmark)

    Olesen, H. R.; Løfstrøm, P.; Berkowicz, R.

    A project within the framework of a larger research programme, Action Plan for the Aquatic Environment III (VMP III), aims at improving an atmospheric dispersion model (OML). The OML model is used for regulatory applications in Denmark, and it is the candidate model to be used also in future… The report reviews dispersion models for estimating local concentration levels in general. However, the report focuses on some particular issues, which are relevant for subsequent work on odour due to animal production. An issue of primary concern is the effect that buildings (stables) have on flow and dispersion. The handling of building effects is a complicated problem, and a major part of the report is devoted to the treatment of building effects in dispersion models.

  4. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

    Full Text Available A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs with the aim of producing more realistic simulation models and, thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied.
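
    As a point of reference for the RWP model mentioned above, the sketch below generates random-waypoint traces for a few nodes in a rectangular world. The world size, node count, speed range, pause time and duration are arbitrary illustrative values, not the scenario parameters used in the paper.

```python
# Minimal random waypoint (RWP) mobility sketch: each node repeatedly picks a
# random destination and speed, moves there in straight-line steps, then pauses.
import random

def random_waypoint(node_id, world=(1000.0, 1000.0), speed=(1.0, 10.0),
                    pause=5.0, duration=300.0, dt=1.0, seed=0):
    rng = random.Random(seed + node_id)
    x, y = rng.uniform(0, world[0]), rng.uniform(0, world[1])
    t, trace = 0.0, [(0.0, x, y)]
    while t < duration:
        dest = (rng.uniform(0, world[0]), rng.uniform(0, world[1]))
        v = rng.uniform(*speed)
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        steps = max(1, int(dist / (v * dt)))
        for i in range(1, steps + 1):
            t += dt
            trace.append((t, x + dx * i / steps, y + dy * i / steps))
            if t >= duration:
                return trace
        x, y = dest
        t += pause                      # pause at the waypoint
        trace.append((t, x, y))
    return trace

if __name__ == "__main__":
    for node in range(3):
        print(f"node {node}: {len(random_waypoint(node))} trace points")
```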

  5. Impact of whole-building hygrothermal modelling on the assessment of indoor climate in a library building

    Energy Technology Data Exchange (ETDEWEB)

    Steeman, M.; Janssens, A. [Ghent University, Department of Architecture and Urban Planning, Jozef Plateaustraat 22, B-9000 Gent (Belgium); De Paepe, M. [Ghent University, Department of Flow, Heat and Combustion Mechanics, Sint-Pietersnieuwstraat 41, B-9000 Gent (Belgium)

    2010-07-15

    This paper focuses on the importance of accurately modelling the hygrothermal interaction between the building and its hygroscopic content for the assessment of the indoor climate. Libraries contain a large amount of stored books which require a stable relative humidity to guarantee their preservation. On the other hand, visitors and staff must be comfortable with the indoor climate. The indoor climate of a new library building is evaluated by means of measurements and simulations. Complaints of the staff are confirmed by measured data during the winter and summer of 2007-2008. For the evaluation of the indoor climate, a building simulation model is used in which the porous books are either described by a HAM model or by a simplified isothermal model. Calculations demonstrate that the HAM model predicts a more stable indoor climate regarding both temperature and relative humidity variations in comparison to the estimations by the simplified model. This is attributed to the ability of the HAM model to account for the effect of temperature variations on moisture storage. Moreover, by applying the HAM model, a good agreement with the measured indoor climate is found. As expected, a larger exposed book surface ameliorates the indoor climate because a more stable indoor relative humidity is obtained. Finally, the building simulation model is used to improve the indoor climate with respect to the preservation of valuable books. Results demonstrate that more stringent interventions on the air handling unit are expected when a simplified approach is used to model the hygroscopic books. (author)

  6. A Model Building Code Article on Fallout Shelters with Recommendations for Inclusion of Requirements for Fallout Shelter Construction in Four National Model Building Codes.

    Science.gov (United States)

    American Inst. of Architects, Washington, DC.

    A model building code for fallout shelters was drawn up for inclusion in four national model building codes. Discussion is given of fallout shelters with respect to (1) nuclear radiation, (2) national policies, and (3) community planning. Fallout shelter requirements for shielding, space, ventilation, construction, and services such as electrical…

  7. Dispersion model for airborne particulates inside a building

    International Nuclear Information System (INIS)

    Perkins, W.C.; Stoddard, D.H.

    1985-01-01

    An empirical model has been developed for the spread of airborne radioactive particles after they are released inside a building. The model has been useful in performing safety analyses of actinide materials facilities at the Savannah River Plant (SRP). These facilities employ the multiple-air-zone concept; that is, ventilation air flows from rooms or areas of least radioactive material hazard, through zones of increasing hazard, to a treatment system. A composite of the data for dispersion of airborne activity during 12 actual case incidents at SRP forms the basis for this model. These incidents occurred during approximately 90 plant-years of experience at SRP with the chemical and metallurgical processing of purified neptunium and plutonium after their recovery from irradiated uranium. The model gives ratios of the airborne activity concentrations in rooms and corridors near the site of the release. The multiple-air-zone concept has been applied to many designs of nuclear facilities as a safety feature to limit the spread of airborne activity from a release. The model illustrates the limitations of this concept: it predicts an apparently anomalous behavior of airborne particulates; namely, a small migration against the flow of the ventilation air
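
    The multiple-air-zone behaviour described above can be illustrated, in a very simplified way, with a well-mixed box model in which ventilation air flows through a chain of zones and a small back-mixing flow mimics migration against the ventilation direction. This is not the empirical SRP model; the zone volumes, flow rates and release size are invented placeholders.

```python
# Illustrative well-mixed multi-zone model: three zones in series, ventilation
# flow Q from zone 0 -> 1 -> 2 -> exhaust, with a small back-mixing flow q that
# mimics migration against the ventilation flow. Numbers are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

V = np.array([100.0, 80.0, 60.0])   # zone volumes, m3
Q = 0.5                              # forward ventilation flow, m3/s
q = 0.05                             # back-mixing flow, m3/s

def dcdt(t, c):
    c0, c1, c2 = c
    return [(-Q * c0 + q * (c1 - c0)) / V[0],
            (Q * c0 - Q * c1 + q * (c2 - c1) - q * (c1 - c0)) / V[1],
            (Q * c1 - Q * c2 - q * (c2 - c1)) / V[2]]

# Release 1e6 arbitrary activity units into zone 1 at t = 0 (initial concentration).
c0 = np.array([0.0, 1e6 / V[1], 0.0])
sol = solve_ivp(dcdt, (0.0, 600.0), c0, t_eval=[60, 300, 600])
for t, c in zip(sol.t, sol.y.T):
    print(f"t={t:5.0f}s  concentration ratios relative to release zone:",
          np.round(c / c[1], 3))
```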

  8. COMPUTER MODELING OF STRUCTURAL - CONCENTRATION CHARACTERISTICS OF BUILDING COMPOSITE MATERIALS

    Directory of Open Access Journals (Sweden)

    I. I. Zaripova

    2015-09-01

    Full Text Available The article presents computer modelling of the structural and concentration characteristics of a building composite material on the basis of packing theory. The main provisions of the algorithm are described, on the basis of which it was possible to obtain a packing with a significantly larger number of packed elements, making it more representative than existing modelling analogues. The modelling of related (connected) areas is described, whose presence determines the possibility of a percolation process, which in turn makes it possible to study and control individual properties of the building composite material. Concrete is considered as an example of such a composite material, which does not exclude the possibility of using the algorithms and modelling results in similar studies of matrix-type composites (a matrix of one material with particles of another substance distributed in a certain way throughout the volume). Based on the modelling results, parts and construction elements for various purposes can be manufactured with improved technical characteristics (by controlling the concentration composition of the substance).
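
    A much-simplified stand-in for the packing-based approach is a random sequential addition of non-overlapping spheres in a unit cell, from which a volume fraction (one simple concentration characteristic) can be estimated. The radius, target count and attempt limit below are arbitrary, and the sketch does not reproduce the representativeness or percolation analysis of the article.

```python
# Random sequential addition (RSA) of equal, non-overlapping spheres in a unit
# cube, as a toy stand-in for a packing-based structure model. Parameters are
# illustrative only.
import random
import math

def rsa_packing(radius=0.05, max_spheres=500, max_attempts=20000, seed=1):
    rng = random.Random(seed)
    centers = []
    attempts = 0
    while len(centers) < max_spheres and attempts < max_attempts:
        attempts += 1
        p = tuple(rng.uniform(radius, 1.0 - radius) for _ in range(3))
        if all((p[0]-c[0])**2 + (p[1]-c[1])**2 + (p[2]-c[2])**2 >= (2*radius)**2
               for c in centers):
            centers.append(p)
    return centers

if __name__ == "__main__":
    r = 0.05
    centers = rsa_packing(radius=r)
    fraction = len(centers) * (4.0 / 3.0) * math.pi * r**3  # unit-cube volume = 1
    print(f"packed {len(centers)} spheres, volume fraction = {fraction:.3f}")
```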

  9. Climate change and high-resolution whole-building numerical modelling

    NARCIS (Netherlands)

    Blocken, B.J.E.; Briggen, P.M.; Schellen, H.L.; Hensen, J.L.M.

    2010-01-01

    This paper briefly discusses the need of high-resolution whole-building numerical modelling in the context of climate change. High-resolution whole-building numerical modelling can be used for detailed analysis of the potential consequences of climate change on buildings and to evaluate remedial

  10. Flood vulnerability assessment of residential buildings by explicit damage process modelling

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    2015-01-01

    The present paper introduces a vulnerability modelling approach for residential buildings in flood. The modelling approach explicitly considers relevant damage processes, i.e. water infiltration into the building, mechanical failure of components in the building envelope and damage from water...

  11. Modeling gamma radiation dose in dwellings due to building materials.

    Science.gov (United States)

    de Jong, Peter; van Dijk, Willem

    2008-01-01

    A model is presented that calculates the absorbed dose rate in air of gamma radiation emitted by building materials in a rectangular body construction. The basis for these calculations is a fixed set of specific absorbed dose rates (the dose rate per Bq kg⁻¹ of ²³⁸U, ²³²Th, and ⁴⁰K), as determined for a standard geometry with the dimensions 4 × 5 × 2.8 m³. Using the computer codes Marmer and MicroShield, correction factors are assessed that quantify the influence of several room- and material-related parameters on the specific absorbed dose rates. The investigated parameters are the position in the construction; the thickness, density, and dimensions of the construction parts; the contribution from the outer leaf; the presence of doors and windows; the attenuation by internal partition walls; the contribution from building materials present in adjacent rooms; and the effect of non-equilibrium due to ²²²Rn exhalation. To verify the precision, the proposed method is applied to three Dutch reference dwellings, i.e., a row house, a semi-detached house, and a gallery apartment. The average difference with MCNP calculations is found to be 4%.
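
    The core of the calculation described above can be sketched as a sum of specific absorbed dose rates multiplied by activity concentrations and by room- and material-related correction factors. The numerical values below are placeholders, not the coefficients derived with Marmer or MicroShield.

```python
# Illustrative absorbed dose rate calculation for the standard-room approach:
# dose_rate = sum over nuclides of (specific dose rate * activity concentration)
#             * product of correction factors.
# All numbers are placeholders, not values from the cited study.

specific_dose_rate = {   # nGy/h per Bq/kg for the standard 4 x 5 x 2.8 m room (assumed)
    "U-238": 0.90,
    "Th-232": 1.10,
    "K-40": 0.08,
}

activity = {             # Bq/kg measured in the building material (assumed)
    "U-238": 40.0,
    "Th-232": 30.0,
    "K-40": 400.0,
}

corrections = {          # multiplicative correction factors (assumed)
    "wall_thickness": 0.95,
    "density": 1.05,
    "doors_windows": 0.92,
}

def absorbed_dose_rate(specific, activity, corrections):
    base = sum(specific[n] * activity[n] for n in specific)
    factor = 1.0
    for f in corrections.values():
        factor *= f
    return base * factor

print(f"absorbed dose rate in air: {absorbed_dose_rate(specific_dose_rate, activity, corrections):.1f} nGy/h")
```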

  12. Transformation of Malaysian Construction Industry with Building Information Modelling (BIM)

    Directory of Open Access Journals (Sweden)

    Latiffi Aryani Ahmad

    2016-01-01

    Full Text Available Building Information Modelling (BIM) is a revolution in technology and process that has transformed the way buildings are planned, designed, analysed, constructed and managed. This revolution in technology and process can increase the quality of construction projects. Knowledge of BIM has been expanding in many countries, including Malaysia. Since its inception, the use of BIM has broadened widely for different purposes. The aim of this paper is to investigate BIM implementation and uses in Malaysian construction projects. The methodology adopted for this paper comprises a literature review and semi-structured interviews with construction players who have experience of, and have been involved in, projects using BIM. The purpose of the literature review is to illustrate previous research on the subject matter. Meanwhile, the purpose of the interviews is to explore the involvement of construction players, their years of experience in projects using BIM, and BIM uses in construction projects. The findings revealed that BIM has been implemented in Malaysia since 2007 by various construction players, namely the client, architect, C&S engineer, M&E engineer, QS, contractor, facilities manager and BIM consultant. The findings also revealed that BIM is used for project visualisation, improving project design, detecting design clashes, quantity take-off, and operation and maintenance. Further work will focus on the current practices of construction players in projects using BIM.

  13. NASA 3D Models: Vehicle Assembly Building (VAB)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Vehicle Assembly Building (VAB) is one of the largest buildings in the world. It was originally built for assembly of Apollo/Saturn vehicles and was later...

  14. ALADDIN - enhancing applicability and scalability

    International Nuclear Information System (INIS)

    Roverso, Davide

    2001-02-01

    The ALADDIN project aims at the study and development of flexible, accurate, and reliable techniques and principles for computerised event classification and fault diagnosis for complex machinery and industrial processes. The main focus of the project is on advanced numerical techniques, such as wavelets, and empirical modelling with neural networks. This document reports on recent important advancements, which significantly widen the practical applicability of the developed principles, both in terms of flexibility of use, and in terms of scalability to large problem domains. In particular, two novel techniques are here described. The first, which we call Wavelet On- Line Pre-processing (WOLP), is aimed at extracting, on-line, relevant dynamic features from the process data streams. This technique allows a system a greater flexibility in detecting and processing transients at a range of different time scales. The second technique, which we call Autonomous Recursive Task Decomposition (ARTD), is aimed at tackling the problem of constructing a classifier able to discriminate among a large number of different event/fault classes, which is often the case when the application domain is a complex industrial process. ARTD also allows for incremental application development (i.e. the incremental addition of new classes to an existing classifier, without the need of retraining the entire system), and for simplified application maintenance. The description of these novel techniques is complemented by reports of quantitative experiments that show in practice the extent of these improvements. (Author)

  15. Modeling and Control of AHUs in Building HVAC Systems

    OpenAIRE

    Liang, Wei

    2014-01-01

    Heating, ventilation and air conditioning (HVAC) is a mechanical system that provides thermal comfort and acceptable indoor air quality, often installed in large-scale buildings. The HVAC system takes a dominant portion of overall building energy consumption and accounted for 50% of the energy used in U.S. commercial and residential buildings in 2012. The performance and energy savings of building HVAC systems can be significantly improved by the implementation of better and smarter control…

  16. Motivation Factors for Adopting Building Information Modeling (BIM) in Iraq

    Directory of Open Access Journals (Sweden)

    W. A. Hatem

    2018-04-01

    Full Text Available Building information modeling (BIM) is an integrated and comprehensive system encompassing everything related to a construction project and its stages. It represents a unified database for all project data, through which project documents are available to all stakeholders. This paper evaluates the factors driving the adoption of BIM in Iraqi construction projects in different ministries and adopts a quantitative approach to collect data, using a questionnaire survey specially prepared for this purpose and distributed to experts in the ministries of the Iraqi construction sector. The returned data were subjected to proper statistical analysis. Results showed that the strongest motivations for BIM adoption are to include it in educational curricula, to raise awareness through courses and workshops, and to contract international experts with experience in the BIM field.

  17. Communicate and collaborate by using building information modeling

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    Building Information Modeling (BIM) represents a new approach within the Architecture, Engineering, and Construction (AEC) industry, one that encourages collaboration and engagement of all stakeholders on a project. This study discusses the potential of adopting BIM as a communication and collaboration platform. The discussion is based on: (1) a review of the latest BIM literature, (2) a qualitative survey of professionals within the industry, and (3) mapping of available BIM standards. This study presents the potential benefits, risks, and the overarching challenges of adopting BIM, and makes recommendations for its use, particularly as a tool for collaboration. Specifically, this study focuses on the issue of implementing standardized BIM guidelines across national borders (in this study Denmark and Sweden), and discusses the challenge of developing a common standard applicable and acceptable at both…

  18. Building Information Modeling for Managing Design and Construction

    DEFF Research Database (Denmark)

    Berard, Ole Bengt

    Contractors planning and executing construction work encounter many kinds of problems with design information, such as uncoordinated drawings and specifications, missing relevant information, and late delivery of design information. Research has shown that missing design information and unintended… outcome of construction work. Even though contractors regularly encounter design information problems, these issues are accepted as a condition of doing business, and better design information has yet to be defined. Building information modeling has the inherent promise of improving the quality of design information for work tasks. * Amount of Information – the number of documents and files, and other media, should be appropriate for the scope. The criteria were identified by empirical studies and theory on information quality in the architectural, engineering and construction (AEC) industry and other fields…

  19. Building a sustainable Academic Health Department: the South Carolina model.

    Science.gov (United States)

    Smith, Lillian Upton; Waddell, Lisa; Kyle, Joseph; Hand, Gregory A

    2014-01-01

    Given the limited resources available to public health, it is critical that university programs complement the development needs of agencies. Unfortunately, academic and practice public health entities have long been challenged in building sustainable collaborations that support practice-based research, teaching, and service. The academic health department concept offers a promising solution. In South Carolina, the partners started their academic health department program with a small grant that expanded into a dynamic infrastructure that supports innovative professional exchange and development programs. This article provides a background and describes the key elements of the South Carolina model: joint leadership, a multicomponent memorandum of agreement, and a shared professional development mission. The combination of these elements allows the partners to leverage resources and deftly respond to challenges and opportunities, ultimately fostering the sustainability of the collaboration.

  20. Physical and JIT Model Based Hybrid Modeling Approach for Building Thermal Load Prediction

    Science.gov (United States)

    Iino, Yutaka; Murai, Masahiko; Murayama, Dai; Motoyama, Ichiro

    Energy conservation in buildings is one of the key issues from an environmental point of view, as it is in the industrial, transportation and residential sectors. About half of the total energy consumption in a building is used by HVAC (Heating, Ventilating and Air Conditioning) systems. In order to realize energy conservation of HVAC systems, a thermal load prediction model for the building is required. This paper proposes a hybrid modeling approach combining a physical and a Just-in-Time (JIT) model for building thermal load prediction. The proposed method has the following features and benefits: (1) it is applicable to cases in which past operation data for training the load prediction model are scarce; (2) it has a self-checking function, which continuously supervises whether the data-driven load prediction and the physics-based one are consistent, so it can detect when something is wrong in the load prediction procedure; (3) it has the ability to adjust the load prediction in real time against sudden changes of model parameters and environmental conditions. The proposed method is evaluated with real operation data of an existing building, and the improvement in load prediction performance is illustrated.
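
    The hybrid idea, a physics-based prediction corrected by a just-in-time (locally weighted, data-driven) residual estimate with a consistency check between the two, can be sketched as follows. The physical model, historical data and tolerance are invented placeholders and the sketch is far simpler than the method evaluated in the paper.

```python
# Sketch of a physical + just-in-time (JIT) hybrid load predictor.
# - physical_load(): simplified physics-based estimate (placeholder formula).
# - jit_correction(): k-nearest-neighbour estimate of the residual, built
#   "just in time" from past operating data closest to the query conditions.
# - hybrid_predict() also flags cases where the two predictions diverge strongly.
import numpy as np

def physical_load(outdoor_temp, occupancy, ua=500.0, setpoint=22.0, gain_per_person=0.1):
    # UA * dT plus internal gains, in kW (illustrative only)
    return max(0.0, ua * (setpoint - outdoor_temp) / 1000.0) + gain_per_person * occupancy

def jit_correction(query, X_hist, residual_hist, k=5):
    # Residual = measured load - physical prediction, estimated from the k
    # historical points nearest to the query (outdoor_temp, occupancy).
    d = np.linalg.norm(X_hist - query, axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(residual_hist[idx]))

def hybrid_predict(query, X_hist, residual_hist, check_tol=0.3):
    phys = physical_load(*query)
    corr = jit_correction(np.asarray(query), X_hist, residual_hist)
    pred = phys + corr
    consistent = abs(corr) <= check_tol * max(phys, 1e-6)  # simple self-check
    return pred, phys, consistent

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_hist = np.column_stack([rng.uniform(-5, 15, 200), rng.integers(0, 50, 200)])
    measured = np.array([physical_load(t, o) * 1.1 + rng.normal(0, 0.5) for t, o in X_hist])
    residual_hist = measured - np.array([physical_load(t, o) for t, o in X_hist])
    pred, phys, ok = hybrid_predict((2.0, 30), X_hist, residual_hist)
    print(f"physical={phys:.2f} kW, hybrid={pred:.2f} kW, consistent={ok}")
```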

  1. Multi-criteria decision model for retrofitting existing buildings

    Directory of Open Access Journals (Sweden)

    M. D. Bostenaru Dan

    2004-01-01

    Full Text Available Decision is an element of the risk management process. This paper investigates how science can help in decision making and implementation for retrofitting buildings in earthquake-prone urban areas. In such interventions, actors from various spheres are involved. Their interests range from minimising the intervention for maximal preservation to increasing it for seismic safety. Research was conducted on how to facilitate collaboration between these actors. Particular attention was given to the role of time in actors' preferences. For this reason, both the processual and the personal dimensions of risk management, the latter seen as a task, were considered on the decision level. A systematic approach was employed to determine the functional structure of a participative decision model. Three layers on which actors involved in this multi-criteria decision problem interact were identified: town, building and element. So-called 'retrofit elements' are characteristic bearers in the architectural survey, engineering simulations and cost estimation, and define the realms perceived by the inhabitants. In this way they represent an interaction basis for the interest groups considered in a deeper study. Such means of orienting actors' interaction were designed on other levels of intervention as well. Finally, an 'experiment' for the implementation of the decision model is presented: a strategic plan for an urban intervention towards reducing earthquake hazard impact through retrofitting. A systematic approach thus proves to be a very good communication basis among the participants in the seismic risk management process. Nevertheless, it can only be applied in later phases (decision, implementation, control), since it serves to verify and improve the solution and not to develop the concept. The 'retrofit elements' are a typical example of the detailing degree reached in the retrofit design plans in these phases.

  2. Multi-criteria decision model for retrofitting existing buildings

    Science.gov (United States)

    Bostenaru Dan, M. D.

    2004-08-01

    Decision is an element of the risk management process. This paper investigates how science can help in decision making and implementation for retrofitting buildings in earthquake-prone urban areas. In such interventions, actors from various spheres are involved. Their interests range from minimising the intervention for maximal preservation to increasing it for seismic safety. Research was conducted on how to facilitate collaboration between these actors. Particular attention was given to the role of time in actors' preferences. For this reason, both the processual and the personal dimensions of risk management, the latter seen as a task, were considered on the decision level. A systematic approach was employed to determine the functional structure of a participative decision model. Three layers on which actors involved in this multi-criteria decision problem interact were identified: town, building and element. So-called 'retrofit elements' are characteristic bearers in the architectural survey, engineering simulations and cost estimation, and define the realms perceived by the inhabitants. In this way they represent an interaction basis for the interest groups considered in a deeper study. Such means of orienting actors' interaction were designed on other levels of intervention as well. Finally, an 'experiment' for the implementation of the decision model is presented: a strategic plan for an urban intervention towards reducing earthquake hazard impact through retrofitting. A systematic approach thus proves to be a very good communication basis among the participants in the seismic risk management process. Nevertheless, it can only be applied in later phases (decision, implementation, control), since it serves to verify and improve the solution and not to develop the concept. The 'retrofit elements' are a typical example of the detailing degree reached in the retrofit design plans in these phases.

  3. Thermal models of buildings. Determination of temperatures, heating and cooling loads. Theories, models and computer programs

    Energy Technology Data Exchange (ETDEWEB)

    Kaellblad, K

    1998-05-01

    The need to estimate indoor temperatures, heating or cooling loads and energy requirements for buildings arises in many stages of a building's life cycle, e.g. at the early layout stage, during the design of the building and for energy retrofitting planning. Another purpose is to meet the requirements given by the authorities in building codes. All these situations require good calculation methods. The main purpose of this report is to present the author's work on problems related to thermal models and calculation methods for the determination of temperatures and heating or cooling loads in buildings. Thus the major part of the report deals with the treatment of solar radiation in glazing systems, shading of solar and sky radiation, and the computer program JULOTTA used to simulate the thermal behaviour of rooms and buildings. Other parts of thermal models of buildings are discussed more briefly and included in order to give an overview of existing problems and available solutions. A brief presentation of how thermal models can be built up is also given, and it is hoped that the report can be useful as an introduction to this part of building physics as well as during the development of calculation methods and computer programs. The report may also serve as a help for the users of energy-related programs. Independent of which method or program a user chooses to work with, it is his or her own responsibility to understand the limits of the tool, otherwise wrong conclusions may be drawn from the results. 52 refs, 22 figs, 4 tabs
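
    The kind of thermal building model discussed in the report can be illustrated, at its simplest, by a single-zone resistance-capacitance (1R1C) model stepped forward in time. The resistance, capacitance, gain profiles and heater law below are invented for the example and are not taken from the report or from JULOTTA.

```python
# Minimal single-zone 1R1C thermal model: one lumped capacitance C behind one
# resistance R to outdoors, with solar/internal gains and a proportional heater.
# Parameter values are illustrative only.
import math

R = 0.005      # K/W, envelope resistance
C = 2.0e7      # J/K, lumped thermal capacitance
dt = 600.0     # s, time step
setpoint = 21.0

def simulate(hours=48):
    T = 18.0                     # initial indoor temperature, degC
    heating_energy = 0.0
    for step in range(int(hours * 3600 / dt)):
        t_h = step * dt / 3600.0
        T_out = 5.0 + 5.0 * math.sin(2 * math.pi * (t_h - 9) / 24)          # outdoor temp, degC
        solar = max(0.0, 800.0 * math.sin(math.pi * (t_h % 24 - 6) / 12))   # gains, W
        q_heat = max(0.0, (setpoint - T) * 2000.0)                          # proportional heater, W
        T += ((T_out - T) / R + solar + q_heat) * dt / C
        heating_energy += q_heat * dt
    return T, heating_energy / 3.6e6   # final temp, heating in kWh

if __name__ == "__main__":
    T_end, kwh = simulate()
    print(f"indoor temperature after 48 h: {T_end:.1f} degC, heating energy: {kwh:.1f} kWh")
```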

  4. Development of Automated Procedures to Generate Reference Building Models for ASHRAE Standard 90.1 and India’s Building Energy Code and Implementation in OpenStudio

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Andrew [National Renewable Energy Lab. (NREL), Golden, CO (United States); Haves, Philip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jegi, Subhash [International Institute of Information Technology, Hyderabad (India); Garg, Vishal [International Institute of Information Technology, Hyderabad (India); Ravache, Baptiste [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-09-14

    This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.

  5. Green Template for Life Cycle Assessment of Buildings Based on Building Information Modeling: Focus on Embodied Environmental Impact

    Directory of Open Access Journals (Sweden)

    Sungwoo Lee

    2015-12-01

    Full Text Available The increased popularity of building information modeling (BIM) for application in the construction of eco-friendly green buildings has given rise to techniques for evaluating green buildings constructed using BIM features. Existing BIM-based green building evaluation techniques mostly rely on externally provided evaluation tools, which pose problems associated with interoperability, including a lack of data compatibility and the amount of time required for format conversion. To overcome these problems, this study sets out to develop a template (the “green template”) for evaluating the embodied environmental impact using a BIM design tool, as part of BIM-based building life-cycle assessment (LCA) technology development. Firstly, the BIM level of detail (LOD) was determined for evaluating the embodied environmental impact, and a database of the impact factors of the embodied environmental impact of the major building materials was constructed, thereby adopting an LCA-based approach. Libraries of major building elements were developed using the established databases and the compiled evaluation table of the embodied environmental impact of the building materials. Finally, the green template was developed as an embodied environmental impact evaluation tool, and a case study was performed to test its applicability. The results of the green-template-based embodied environmental impact evaluation of a test building were validated against those of its actual quantity takeoff (2D takeoff), and its reliability was confirmed by an effective error rate of ≤5%. This study aims to develop a system for assessing the impact of the substances discharged from the concrete production process on six environmental impact categories, i.e., global warming (GWP), acidification (AP), eutrophication (EP), abiotic depletion (ADP), ozone depletion (ODP), and photochemical oxidant creation (POCP), using the life cycle assessment (LCA) method. To achieve this, we proposed an LCA method…
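
    The underlying computation of the green template, multiplying material quantities taken off the BIM model by impact factors for each environmental category, can be sketched as below. The materials, quantities and impact factors are invented placeholders, not values from the developed database.

```python
# Sketch of an embodied environmental impact evaluation: for each building
# material, quantity (from the BIM takeoff) x impact factor, summed per
# impact category. Categories follow the six named in the abstract
# (GWP, AP, EP, ADP, ODP, POCP); all numbers are placeholders.
CATEGORIES = ["GWP", "AP", "EP", "ADP", "ODP", "POCP"]

impact_factors = {   # per-unit impact factors (assumed, per m3 or tonne)
    "ready-mix concrete": {"GWP": 300.0, "AP": 0.5, "EP": 0.06, "ADP": 1.2, "ODP": 1e-5, "POCP": 0.04},
    "reinforcing steel":  {"GWP": 1900.0, "AP": 5.1, "EP": 0.4,  "ADP": 9.0, "ODP": 6e-5, "POCP": 0.8},
}

takeoff = {           # quantities from the BIM model (assumed)
    "ready-mix concrete": 120.0,   # m3
    "reinforcing steel":  14.0,    # tonnes
}

def embodied_impact(takeoff, impact_factors):
    totals = {c: 0.0 for c in CATEGORIES}
    for material, qty in takeoff.items():
        for c in CATEGORIES:
            totals[c] += qty * impact_factors[material][c]
    return totals

for category, value in embodied_impact(takeoff, impact_factors).items():
    print(f"{category:5s}: {value:12.3f}")
```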

  6. Responsive, Flexible and Scalable Broader Impacts (Invited)

    Science.gov (United States)

    Decharon, A.; Companion, C.; Steinman, M.

    2010-12-01

    In many educator professional development workshops, scientists present content in a slideshow-type format and field questions afterwards. Drawbacks of this approach include: inability to begin the lecture with content that is responsive to audience needs; lack of flexible access to specific material within the linear presentation; and “Q&A” sessions are not easily scalable to broader audiences. Often this type of traditional interaction provides little direct benefit to the scientists. The Centers for Ocean Sciences Education Excellence - Ocean Systems (COSEE-OS) applies the technique of concept mapping with demonstrated effectiveness in helping scientists and educators “get on the same page” (deCharon et al., 2009). A key aspect is scientist professional development geared towards improving face-to-face and online communication with non-scientists. COSEE-OS promotes scientist-educator collaboration, tests the application of scientist-educator maps in new contexts through webinars, and is piloting the expansion of maps as long-lived resources for the broader community. Collaboration - COSEE-OS has developed and tested a workshop model bringing scientists and educators together in a peer-oriented process, often clarifying common misconceptions. Scientist-educator teams develop online concept maps that are hyperlinked to “assets” (i.e., images, videos, news) and are responsive to the needs of non-scientist audiences. In workshop evaluations, 91% of educators said that the process of concept mapping helped them think through science topics and 89% said that concept mapping helped build a bridge of communication with scientists (n=53). Application - After developing a concept map, with COSEE-OS staff assistance, scientists are invited to give webinar presentations that include live “Q&A” sessions. The webinars extend the reach of scientist-created concept maps to new contexts, both geographically and topically (e.g., oil spill), with a relatively small

  7. A fuzzy-based model to implement the global safety buildings index assessment for agri-food buildings

    Directory of Open Access Journals (Sweden)

    Francesco Barreca

    2014-06-01

    Full Text Available The latest EU policies focus on the issue of food safety with a view to ensuring adequate and standard quality levels for the food produced and/or consumed within the EC. To that purpose, the environment where agricultural products are manufactured and processed plays a crucial role in achieving food hygiene. As a consequence, it is of the utmost importance to adopt proper building solutions which meet health and hygiene requirements as well as to use suitable tools to measure the levels achieved. Similarly, it is necessary to verify and evaluate the level of workers’ safety and welfare in their working environment. Workers’ safety has not only an ethical and social value but also an economic implication, since possible accidents or environmental stressors are the major causes of the lower efficiency and productivity of workers. Therefore, it is fundamental to design suitable models of analysis that allow assessing buildings as a whole, taking into account both health and hygiene safety as well as workers’ safety and welfare. Hence, this paper proposes an assessment model that, based on an established study protocol and on the application of a fuzzy logic procedure, allows assessing the global safety level of an agri-food building by means of a global safety buildings index. The model here presented is original since it uses fuzzy logic to evaluate the performances of both the technical and environmental systems of an agri-food building in terms of health and hygiene safety of the manufacturing process as well as of workers’ health and safety. The result of the assessment is expressed through a triangular fuzzy membership function which allows carrying out comparative analyses of different buildings. A specific procedure was developed to apply the model to a case study which tested its operational simplicity and the validity of its results. The proposed model allows obtaining a synthetic and global value of the building performance of…
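
    The triangular fuzzy representation mentioned above can be sketched as follows: each performance aspect is scored as a triangular fuzzy number and the aspects are combined by a weighted sum, which again yields a triangular number. The aspect names, scores and weights are invented for illustration and do not reproduce the study protocol.

```python
# Sketch of a fuzzy global safety index: each aspect is a triangular fuzzy
# number (low, mode, high) on a 0-10 scale; the global index is the weighted
# sum of the aspect scores. Names, scores and weights are placeholders.
def weighted_fuzzy_sum(scores, weights):
    total_w = sum(weights.values())
    low = sum(weights[k] * scores[k][0] for k in scores) / total_w
    mode = sum(weights[k] * scores[k][1] for k in scores) / total_w
    high = sum(weights[k] * scores[k][2] for k in scores) / total_w
    return (low, mode, high)

def membership(x, tri):
    """Triangular membership function for the resulting fuzzy index."""
    a, b, c = tri
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

scores = {   # (low, mode, high) expert scores per aspect (assumed)
    "hygiene of surfaces": (6.0, 7.0, 8.0),
    "ventilation":         (4.0, 5.5, 7.0),
    "worker safety":       (5.0, 6.0, 7.5),
}
weights = {"hygiene of surfaces": 0.4, "ventilation": 0.3, "worker safety": 0.3}

index = weighted_fuzzy_sum(scores, weights)
print("global safety buildings index (triangular):", tuple(round(v, 2) for v in index))
print("membership of score 6.0:", round(membership(6.0, index), 2))
```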

  8. Scalable algorithms for contact problems

    CERN Document Server

    Dostál, Zdeněk; Sadowská, Marie; Vondrák, Vít

    2016-01-01

    This book presents a comprehensive and self-contained treatment of the authors’ newly developed scalable algorithms for the solutions of multibody contact problems of linear elasticity. The brand-new feature of these algorithms is theoretically supported numerical scalability and parallel scalability demonstrated on problems discretized by billions of degrees of freedom. The theory supports solving multibody frictionless contact problems, contact problems with possibly orthotropic Tresca’s friction, and transient contact problems. It covers BEM discretization, jumping coefficients, floating bodies, mortar non-penetration conditions, etc. The exposition is divided into four parts, the first of which reviews appropriate facets of linear algebra, optimization, and analysis. The most important algorithms and optimality results are presented in the third part of the volume. The presentation is complete, including continuous formulation, discretization, decomposition, optimality results, and numerical experiments…

  9. Building alternate protein structures using the elastic network model.

    Science.gov (United States)

    Yang, Qingyi; Sharp, Kim A

    2009-02-15

    We describe a method for efficiently generating ensembles of alternate, all-atom protein structures that (a) differ significantly from the starting structure, (b) have good stereochemistry (bonded geometry), and (c) have good steric properties (absence of atomic overlap). The method uses reconstruction from a series of backbone framework structures that are obtained from a modified elastic network model (ENM) by perturbation along low-frequency normal modes. To ensure good-quality backbone frameworks, the single-force-parameter ENM is modified by introducing two more force parameters to characterize the interaction between consecutive alpha carbons and those within the same secondary-structure domain. The relative stiffness of the three parameters is parameterized to reproduce B-factors while maintaining good bonded geometry. After parameterization, violations of experimental Cα–Cα distances and Cα–Cα–Cα pseudo-angles along the backbone are reduced to less than 1%. Simultaneously, the average B-factor correlation coefficient improves to R = 0.77. Two applications illustrate the potential of the approach. (1) 102,051 protein backbones spanning a conformational space of 15 Å root-mean-square deviation were generated from 148 nonredundant proteins in the PDB database, and all-atom models with minimal bonded and nonbonded violations were produced from this ensemble of backbone structures using the SCWRL side-chain building program. (2) Improved backbone templates for homology modeling: fifteen query sequences were each modeled on two targets. For each of the 30 target frameworks, dozens of improved templates could be produced. In all cases, improved full-atom homology models resulted, of which 50% could be identified blind using the D-Fire statistical potential. (c) 2008 Wiley-Liss, Inc.
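
    The low-frequency normal modes used to perturb the backbone can be obtained from an elastic network model built on the Cα coordinates. The sketch below uses a single force constant and a synthetic helix-like coordinate set rather than the three-parameter model and real PDB structures used in the paper.

```python
# Minimal anisotropic elastic network model (ENM): build the Hessian from
# Calpha coordinates with a distance cutoff and one force constant, then take
# low-frequency normal modes and displace the structure along the slowest one.
import numpy as np

def enm_hessian(coords, cutoff=10.0, gamma=1.0):
    n = len(coords)
    H = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            block = -gamma * np.outer(d, d) / r2
            H[3*i:3*i+3, 3*j:3*j+3] += block
            H[3*j:3*j+3, 3*i:3*i+3] += block
            H[3*i:3*i+3, 3*i:3*i+3] -= block
            H[3*j:3*j+3, 3*j:3*j+3] -= block
    return H

def low_frequency_modes(coords, n_modes=3, **kw):
    w, v = np.linalg.eigh(enm_hessian(coords, **kw))
    return w[6:6 + n_modes], v[:, 6:6 + n_modes]   # skip 6 rigid-body modes

if __name__ == "__main__":
    t = np.arange(40) * 100.0 * np.pi / 180.0       # synthetic helix of 40 "residues"
    coords = np.column_stack([2.3 * np.cos(t), 2.3 * np.sin(t), 1.5 * np.arange(40)])
    freqs, modes = low_frequency_modes(coords)
    perturbed = coords + 2.0 * modes[:, 0].reshape(-1, 3)  # displace along slowest mode
    print("lowest non-trivial eigenvalues:", np.round(freqs, 4))
    print("max Calpha displacement along mode 1:",
          round(float(np.abs(perturbed - coords).max()), 3), "(arbitrary units)")
```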

  10. Building damage assessment from PolSAR data using texture parameters of statistical model

    Science.gov (United States)

    Li, Linlin; Liu, Xiuguo; Chen, Qihao; Yang, Shuai

    2018-04-01

    Accurate building damage assessment is essential in providing decision support for disaster relief and reconstruction. Polarimetric synthetic aperture radar (PolSAR) has become one of the most effective means of building damage assessment, due to its all-day/all-weather ability and richer backscatter information of targets. However, intact buildings that are not parallel to the SAR flight pass (termed oriented buildings) and collapsed buildings share similar scattering mechanisms, both of which are dominated by volume scattering. This characteristic always leads to misjudgments between assessments of collapsed buildings and oriented buildings from PolSAR data. Because the collapsed buildings and the intact buildings (whether oriented or parallel buildings) have different textures, a novel building damage assessment method is proposed in this study to address this problem by introducing texture parameters of statistical models. First, the logarithms of the estimated texture parameters of different statistical models are taken as a new texture feature to describe the collapse of the buildings. Second, the collapsed buildings and intact buildings are distinguished using an appropriate threshold. Then, the building blocks are classified into three levels based on the building block collapse rate. Moreover, this paper also discusses the capability for performing damage assessment using texture parameters from different statistical models or using different estimators. The RADARSAT-2 and ALOS-1 PolSAR images are used to present and analyze the performance of the proposed method. The results show that using the texture parameters avoids the problem of confusing collapsed and oriented buildings and improves the assessment accuracy. The results assessed by using the K/G0 distribution texture parameters estimated based on the second moment obtain the highest extraction accuracies. For the RADARSAT-2 and ALOS-1 data, the overall accuracy (OA) for these three types of

  11. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that are demolished or changed since the models were constructed or identifying wrong models using the images. The models verified are of CityGML LOD2 or higher since their edges are expected to coincide with actual building edges. The verification approach is based on information theory. Corresponding variables between building models and oblique images are used for deriving mutual information for individual edges, faces or whole buildings, and combined for all perspective images available for the building. The wireframe model edges are projected to images and verified using low level image features – the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images that are available for most cities of Europe and may be publically viewed in the so called Birds Eye view of the Microsoft Bing Maps. Results are that nearly all buildings are correctly categorised as existing or demolished. Because we now concentrate only on roofs we also used the method to test and compare results from nadir images. This comparison made clear that especially height errors in models can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results per individual edges can be used for improving the 3D building models.
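
    The verification principle, measuring mutual information between projected model edges and image gradient directions, can be sketched with discrete variables as below. The synthetic 'image' and edge masks are invented; a real implementation would project CityGML edges into the oblique images instead.

```python
# Sketch of edge verification by mutual information: X = whether a pixel lies
# on a projected model edge, Y = quantised image gradient direction. High
# MI(X; Y) suggests the modelled edge coincides with a real image edge.
# The image and edge masks below are synthetic placeholders.
import numpy as np

def mutual_information(x, y, bins_y=8):
    """MI (bits) between a binary mask x and a discrete variable y (1-D arrays)."""
    joint, _, _ = np.histogram2d(x, y, bins=[2, bins_y])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
h, w = 64, 64
image = rng.normal(0, 0.1, (h, w))
image[:, 32:] += 1.0                       # a vertical intensity step = real edge
gy, gx = np.gradient(image)
direction = np.digitize(np.arctan2(gy, gx), np.linspace(-np.pi, np.pi, 9)) - 1

edge_mask_good = np.zeros((h, w), dtype=int); edge_mask_good[:, 31:33] = 1  # matches the step
edge_mask_bad = np.zeros((h, w), dtype=int); edge_mask_bad[16:18, :] = 1    # no real edge there

print("MI, edge where model matches image :",
      round(mutual_information(edge_mask_good.ravel(), direction.ravel()), 4))
print("MI, edge where model does not match:",
      round(mutual_information(edge_mask_bad.ravel(), direction.ravel()), 4))
```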

  12. Highly scalable Ab initio genomic motif identification

    KAUST Repository

    Marchand, Benoit; Bajic, Vladimir B.; Kaushik, Dinesh

    2011-01-01

    We present results of scaling an ab initio motif family identification system, Dragon Motif Finder (DMF), to 65,536 processor cores of IBM Blue Gene/P. DMF seeks groups of mutually similar polynucleotide patterns within a set of genomic sequences and builds various motif families from them. Such information is of relevance to many problems in life sciences. Prior attempts to scale such ab initio motif-finding algorithms achieved limited success. We solve the scalability issues using a combination of mixed-mode MPI-OpenMP parallel programming, master-slave work assignment, multi-level workload distribution, multi-level MPI collectives, and serial optimizations. While the scalability of our algorithm was excellent (94% parallel efficiency on 65,536 cores relative to 256 cores on a modest-size problem), the final speedup with respect to the original serial code exceeded 250,000 when serial optimizations are included. This enabled us to carry out many large-scale ab initio motif-finding simulations in a few hours while the original serial code would have needed decades of execution time. Copyright 2011 ACM.

  13. Scalable cloud without dedicated storage

    Science.gov (United States)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without the separate dedicated storage. The dedicated storage is replaced by the distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of the open source components like OpenStack, CEPH, etc.

  14. Modeling urban building energy use: A review of modeling approaches and procedures

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wenliang; Zhou, Yuyu; Cetin, Kristen; Eom, Jiyong; Wang, Yu; Chen, Gang; Zhang, Xuesong

    2017-12-01

    With rapid urbanization and economic development, the world has been experiencing an unprecedented increase in energy consumption and greenhouse gas (GHG) emissions. While reducing energy consumption and GHG emissions is a common interest shared by major developed and developing countries, actions to enable these global reductions are generally implemented at the city scale. This is because baseline information from individual cities plays an important role in identifying economical options for improving building energy efficiency and reducing GHG emissions. Numerous approaches have been proposed for modeling urban building energy use in the past decades. This paper aims to provide an up-to-date review of the broad categories of energy models for urban buildings and describes the basic workflow of physics-based, bottom-up models and their applications in simulating urban-scale building energy use. Because there are significant differences across models with varied potential for application, strengths and weaknesses of the reviewed models are also presented. This is followed by a discussion of challenging issues associated with model preparation and calibration.

  15. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    Science.gov (United States)

    Xu, Pengpeng

    Hotel building is one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism has been introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs. Six KPIs were identified from the 11 selected performance indicators. Through a questionnaire survey, out of the 28 success factors, 21 Critical Success Factors (CSFs) were also identified. Using the factor analysis technique, the 21 identified CSFs in this study were grouped into six clusters to help explain project success of sustainable BEER. Finally, the AHP/ANP approach was used in this research to develop a model to…

  16. Activity measurement and effective dose modelling of natural radionuclides in building material

    International Nuclear Information System (INIS)

    Maringer, F.J.; Baumgartner, A.; Rechberger, F.; Seidel, C.; Stietka, M.

    2013-01-01

    In this paper the assessment of natural radionuclides' activity concentration in building materials, calibration requirements and related indoor exposure dose models is presented. Particular attention is turned to specific improvements in low-level gamma-ray spectrometry to determine the activity concentration of necessary natural radionuclides in building materials with adequate measurement uncertainties. Different approaches for the modelling of the effective dose indoor due to external radiation resulted from natural radionuclides in building material and results of actual building material assessments are shown. - Highlights: • Dose models for indoor radiation exposure due to natural radionuclides in building materials. • Strategies and methods in radionuclide metrology, activity measurement and dose modelling. • Selection of appropriate parameters in radiation protection standards for building materials. • Scientific-based limitations of indoor exposure due to natural radionuclides in building materials

  17. Scalable Simulation of Electromagnetic Hybrid Codes

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.; Fujimoto, Richard; Karimabadi, Dr. Homa

    2006-01-01

    New discrete-event formulations of physics simulation models are emerging that can outperform models based on traditional time-stepped techniques. Detailed simulation of the Earth's magnetosphere, for example, requires execution of sub-models that are at widely differing timescales. In contrast to time-stepped simulation which requires tightly coupled updates to entire system state at regular time intervals, the new discrete event simulation (DES) approaches help evolve the states of sub-models on relatively independent timescales. However, parallel execution of DES-based models raises challenges with respect to their scalability and performance. One of the key challenges is to improve the computation granularity to offset synchronization and communication overheads within and across processors. Our previous work was limited in scalability and runtime performance due to the parallelization challenges. Here we report on optimizations we performed on DES-based plasma simulation models to improve parallel performance. The net result is the capability to simulate hybrid particle-in-cell (PIC) models with over 2 billion ion particles using 512 processors on supercomputing platforms

  18. Towards a Scalable, Biomimetic, Antibacterial Coating

    Science.gov (United States)

    Dickson, Mary Nora

    Corneal afflictions are the second leading cause of blindness worldwide. When a corneal transplant is unavailable or contraindicated, an artificial cornea device is the only chance to save sight. Bacterial or fungal biofilm build-up on artificial cornea devices can lead to serious complications, including the need for systemic antibiotic treatment and even explantation. As a result, much emphasis has been placed on anti-adhesion chemical coatings and antibiotic-leaching coatings. These methods are not long-lasting, and microorganisms can eventually circumvent these measures. Thus, I have developed a surface topographical antimicrobial coating. Various surface structures, including rough surfaces, superhydrophobic surfaces, and the natural surfaces of insects' wings and sharks' skin, are promising anti-biofilm candidates; however, none meets the criteria necessary for implementation on the surface of an artificial cornea device. In this thesis I: 1) developed scalable fabrication protocols for a library of biomimetic nanostructure polymer surfaces; 2) assessed the potential of these poly(methyl methacrylate) nanopillars to kill or prevent formation of biofilm by E. coli bacteria and species of Pseudomonas and Staphylococcus bacteria, and improved upon a proposed mechanism for the rupture of Gram-negative bacterial cell walls; 3) developed a scalable, commercially viable method for producing antibacterial nanopillars on a curved PMMA artificial cornea device; and 4) developed scalable fabrication protocols for implantation of antibacterial nanopatterned surfaces on the surfaces of thermoplastic polyurethane materials, commonly used in catheter tubings. This project constitutes a first step towards fabrication of the first entirely PMMA artificial cornea device. The major finding of this work is that by precisely controlling the topography of a polymer surface at the nano-scale, we can kill adherent bacteria and prevent biofilm formation of certain pathogenic bacteria.

  19. A Study on Development of a Cost Optimal and Energy Saving Building Model: Focused on Industrial Building

    Directory of Open Access Journals (Sweden)

    Hye Yeon Kim

    2016-03-01

    Full Text Available This study suggests an optimization method for the life cycle cost (LCC) in an economic feasibility analysis when applying energy saving techniques in the early design stage of a building. Literature and previous studies were reviewed to select appropriate optimization and LCC analysis techniques. The energy simulation tool (EnergyPlus) and a computational program (MATLAB) were linked to provide an automated optimization process. From the results, it is suggested that this process can outline a cost optimization model with which it is possible to minimize the LCC. To aid understanding of the model, a case study on an industrial building was performed to illustrate the operation of the cost optimization model, including energy savings. An energy optimization model is also presented to illustrate the need for the cost optimization model.
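
    The optimization idea, minimizing life cycle cost as initial investment plus discounted annual energy cost over combinations of energy-saving measures, can be sketched as follows. The measures, costs, savings, rates and baseline energy use are invented placeholders, and the real model couples the search to EnergyPlus simulations rather than to a lookup table.

```python
# Sketch of LCC minimisation over combinations of energy-saving measures:
# LCC = initial cost + sum over years of (annual energy cost / (1+r)^year).
# Energy effects of measures are a simple lookup here; in the study they come
# from EnergyPlus runs. All numbers are illustrative placeholders.
from itertools import combinations

BASE_ENERGY = 500_000.0     # kWh/year, baseline industrial building (assumed)
PRICE = 0.12                # $/kWh (assumed)
RATE = 0.05                 # discount rate (assumed)
YEARS = 20

measures = {                # name: (initial cost $, fractional energy saving)
    "roof insulation":   (40_000.0, 0.08),
    "LED lighting":      (25_000.0, 0.06),
    "high-perf glazing": (60_000.0, 0.07),
    "heat recovery":     (55_000.0, 0.10),
}

def lcc(selected):
    initial = sum(measures[m][0] for m in selected)
    remaining = 1.0 - min(0.9, sum(measures[m][1] for m in selected))  # cap combined effect
    annual_cost = BASE_ENERGY * remaining * PRICE
    discounted = sum(annual_cost / (1.0 + RATE) ** y for y in range(1, YEARS + 1))
    return initial + discounted

best = min((lcc(c), c) for k in range(len(measures) + 1)
           for c in combinations(measures, k))
print(f"minimum LCC = {best[0]:,.0f} $ with measures: {best[1] or ('none',)}")
```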

  20. From Point Clouds to Building Information Models: 3D Semi-Automatic Reconstruction of Indoors of Existing Buildings

    Directory of Open Access Journals (Sweden)

    Hélène Macher

    2017-10-01

    Full Text Available The creation of as-built Building Information Models requires the acquisition of the as-is state of existing buildings. Laser scanners are widely used to achieve this goal, since they permit the collection of information about object geometry in the form of point clouds and provide a large amount of accurate data in a very fast way and with a high level of detail. Unfortunately, the scan-to-BIM (Building Information Model) process currently remains largely a manual process which is time consuming and error-prone. In this paper, a semi-automatic approach is presented for the 3D reconstruction of indoors of existing buildings from point clouds. Several segmentations are performed so that point clouds corresponding to grounds, ceilings and walls are extracted. Based on these point clouds, walls and slabs of buildings are reconstructed and described in the IFC format in order to be integrated into BIM software. The approach is assessed on two datasets. The evaluation items are the degree of automation, the transferability of the approach and the geometric quality of the results of the 3D reconstruction. Additionally, quality indexes are introduced to inspect the results in order to be able to detect potential errors of reconstruction.
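
    The segmentation step can be illustrated by a basic RANSAC plane fit applied repeatedly to a point cloud, labelling each dominant plane as horizontal (ground/ceiling) or vertical (wall) from its normal. The synthetic points and thresholds below are invented, and the published workflow involves considerably more processing before the IFC export.

```python
# Minimal RANSAC plane segmentation sketch: repeatedly fit the dominant plane,
# remove its inliers, and label the plane from its normal direction.
# Point cloud and thresholds are synthetic placeholders.
import numpy as np

def ransac_plane(points, n_iter=300, tol=0.02, rng=None):
    rng = rng or np.random.default_rng(0)
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        if np.linalg.norm(normal) < 1e-9:
            continue
        normal = normal / np.linalg.norm(normal)
        dist = np.abs((points - p1) @ normal)
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, p1)
    return best_model, best_inliers

def label(normal):
    return "ground/ceiling" if abs(normal[2]) > 0.9 else "wall" if abs(normal[2]) < 0.1 else "other"

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    floor = np.column_stack([rng.uniform(0, 5, 2000), rng.uniform(0, 4, 2000), rng.normal(0, 0.01, 2000)])
    wall = np.column_stack([rng.normal(0, 0.01, 2000), rng.uniform(0, 4, 2000), rng.uniform(0, 2.7, 2000)])
    remaining = np.vstack([floor, wall])
    for k in range(2):
        (normal, _), inliers = ransac_plane(remaining, rng=rng)
        print(f"plane {k}: {inliers.sum()} points, normal={np.round(normal, 2)}, type={label(normal)}")
        remaining = remaining[~inliers]
```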

  1. Adoption of Building Information Modelling in project planning risk management

    Science.gov (United States)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    Efficient and effective risk management requires a systematic and proper methodology, in addition to knowledge and experience. However, if risk management is not discussed from the start of the project, this task becomes notably complicated and no longer efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify traditional risk management practices and their functions, to determine the most useful functions of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. In order to obtain data, a quantitative approach is adopted in this research. Based on the data analysis, the lack of compliance with project requirements and the failure to recognise risks and develop responses to opportunities are the risks that occur when traditional risk management is implemented. When BIM is used in project planning, the tracking of cost control and cash flow affects whether the project cycle is completed on time. 5D cost estimation and cash flow modelling benefit risk management by planning, controlling and managing budget and cost reasonably. Two factors benefited most from BIM-based technology: formwork planning with an integrated fall-protection plan, and design-for-safety model checking. By adopting risk management, potential risks linked with a project can be identified and responded to, reducing them to an acceptable extent. This means recognising potential risks and avoiding threats by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects and benefits construction players in various aspects. It is important to understand the application of BIM-based risk management, as it can serve as a lesson learnt for others implementing BIM and increase the quality of the project.

  2. Greening Existing Buildings in Contemporary Iraqi Urban Reality/ Virtual Model

    Directory of Open Access Journals (Sweden)

    Saba Jabar Neama Al-Khafaji

    2015-11-01

    Full Text Available The approach of greening existing buildings is an urgent necessity, because the greening operation provides speed and optimal efficiency in environmental performance, as well as keeping up with the global green architecture revolution. Greening existing buildings in Iraq is therefore important for the move towards renewable energies, because the economic conditions, crises and wars the country went through kept it away from global developments on this issue. The research problem is insufficient knowledge about the importance and the mechanism of greening existing buildings, including its environmental and economic dimensions, through the rationalisation of energy consumption and the preservation of the environment. The research objective is to clarify the importance of greening existing buildings environmentally and economically, providing a virtual experiment for greening the presidency building of Baghdad University through an advanced computer program. The main conclusion is that the thermal loads required for cooling in summer and heating in winter are reduced, as computed with the DesignBuilder program, after greening operations are applied to the building envelope, which confirms the effectiveness of greening in raising the energy performance efficiency of the building. Hence the importance of applying the greening of existing buildings in Iraq, to bring Iraqi architecture back to its proper environmental and local track.

  3. Modelling the life-cycle of sustainable, living buildings

    NARCIS (Netherlands)

    Van Nederveen, S.; Gielingh, W.

    2009-01-01

    Credit-reductions by banks, as a consequence of the global monetary crisis, will hit the construction industry for many years to come. There are however still financing opportunities for building projects that are perceived as less risky. Buildings that are not only sustainable, but also flexible

  4. Design of model based LQG control for integrated building systems

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.; Paassen, van A.H.C.

    2006-01-01

    The automation of the operation of integrated building systems requires the use of modern control techniques to enhance the quality of building indoor environments. This paper describes the theoretical basis and practical application of an optimal dynamic regulator using model-based Linear Quadratic
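
    An LQG regulator combines a linear-quadratic state-feedback gain with a Kalman-filter state estimate. As a minimal sketch of the regulator part only, assuming a crude first-order zone-temperature model rather than the plant model used in the paper, the optimal gain can be obtained from the algebraic Riccati equation:

        import numpy as np
        from scipy.linalg import solve_continuous_are

        # Assumed first-order plant dT/dt = a*T + b*u (u = heating power); not the paper's model.
        a, b = -0.05, 0.01
        A = np.array([[a]])
        B = np.array([[b]])
        Q = np.array([[1.0]])    # assumed weight on temperature deviation
        R = np.array([[0.1]])    # assumed weight on control effort

        P = solve_continuous_are(A, B, Q, R)   # solve the continuous algebraic Riccati equation
        K = np.linalg.inv(R) @ B.T @ P         # optimal state-feedback gain, u = -K x
        print("LQ gain:", K)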

  5. Intelligent Controls for Net-Zero Energy Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Haorong; Cho, Yong; Peng, Dongming

    2011-10-30

    The goal of this project is to develop and demonstrate enabling technologies that can empower homeowners to convert their homes into net-zero energy buildings in a cost-effective manner. The project objectives and expected outcomes are as follows: • To develop rapid and scalable building information collection and modeling technologies that can obtain and process “as-built” building information in an automated or semiautomated manner. • To identify low-cost measurements and develop low-cost virtual sensors that can monitor building operations in a plug-n-play and low-cost manner. • To integrate and demonstrate low-cost building information modeling (BIM) technologies. • To develop decision support tools which can empower building owners to perform energy auditing and retrofit analysis. • To develop and demonstrate low-cost automated diagnostics and optimal control technologies which can improve building energy efficiency in a continual manner.

  6. Scalable shared-memory multiprocessing

    CERN Document Server

    Lenoski, Daniel E

    1995-01-01

    Dr. Lenoski and Dr. Weber have experience with leading-edge research and practical issues involved in implementing large-scale parallel systems. They were key contributors to the architecture and design of the DASH multiprocessor. Currently, they are involved with commercializing scalable shared-memory technology.

  7. Models test on dynamic structure-structure interaction of nuclear power plant buildings

    International Nuclear Information System (INIS)

    Kitada, Y.; Hirotani, T.

    1999-01-01

    A reactor building of an NPP (nuclear power plant) is generally constructed closely adjacent to a turbine building and other buildings such as the auxiliary building, and in an increasing number of NPPs, multiple plants are planned and constructed close together on a single site. In these situations, adjacent buildings are considered to influence each other through the soil during earthquakes and to exhibit dynamic behaviour different from that of separate buildings, because NPP buildings are generally heavy and massive. The dynamic interaction between buildings through the soil during an earthquake is termed here 'dynamic cross interaction (DCI)'. In order to comprehend DCI appropriately, forced vibration tests and earthquake observations are needed using closely constructed building models. Against this background, the Nuclear Power Engineering Corporation (NUPEC) planned a project to investigate the DCI effect in 1993, after the preceding SSI (soil-structure interaction) investigation project, 'model tests on embedment effect of reactor building'. The project consists of field and laboratory tests. The field test is being carried out using three different building construction conditions: a single reactor building used as a reference for comparison, two identical reactor buildings used to evaluate pure DCI effects, and two different buildings (reactor and turbine building models) used to evaluate DCI effects under actual plant conditions. Forced vibration tests and earthquake observations are planned in the field test. The laboratory test is planned to evaluate basic characteristics of the DCI effects using a simple soil model made of silicone rubber and structure models made of aluminum. In this test, forced vibration tests and shaking table tests are planned. The project was started in April 1994 and will be completed in March 2002. This paper describes an outline and summarises the current status of this project. (orig.)

  8. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data

    Directory of Open Access Journals (Sweden)

    Jaewook Jung

    2017-03-01

    Full Text Available With rapid urbanization, highly accurate and semantically rich virtualization of building assets in 3D becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully-automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of building boundaries in a progressive manner. This study covers a full chain of 3D building modeling from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested over the International

  9. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data.

    Science.gov (United States)

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-03-19

    With rapid urbanization, highly accurate and semantically rich virtualization of building assets in 3D becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully-automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of building boundaries in a progressive manner. This study covers a full chain of 3D building modeling from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested over the International Society for

  10. IBM SPSS modeler essentials effective techniques for building powerful data mining and predictive analytics solutions

    CERN Document Server

    McCormick, Keith; Wei, Bowen

    2017-01-01

    IBM SPSS Modeler allows quick, efficient predictive analytics and insight building from your data, and is a popularly used data mining tool. This book will guide you through the data mining process, and presents relevant statistical methods which are used to build predictive models and conduct other analytic tasks using IBM SPSS Modeler. From ...

  11. Implementation of building information modeling in Malaysian construction industry

    Science.gov (United States)

    Memon, Aftab Hameed; Rahman, Ismail Abdul; Harman, Nur Melly Edora

    2014-10-01

    This study assessed the implementation level of Building Information Modeling (BIM) in the construction industry of Malaysia. It also investigated several computer software packages facilitating BIM and the challenges affecting its implementation. Data collection for this study was carried out using a questionnaire survey among construction practitioners. Of 150 questionnaire sets distributed to consultant, contractor and client organizations, 95 completed forms were received and analyzed statistically. The analysis findings indicated that the level of implementation of BIM in the construction industry of Malaysia is very low. The average index method, employed to assess the effectiveness of various BIM software packages, highlighted Bentley Construction, AutoCAD and ArchiCAD as the three most popular and effective packages. The major challenges to BIM implementation are the enhanced collaboration it requires, the additional work it adds for designers, and interoperability. To improve the level of BIM implementation in the Malaysian industry, it is recommended that a flexible BIM training program for all practitioners be created.

  12. Energy policy for integrating the building environmental performance model of an air conditioned building in a subtropical climate

    International Nuclear Information System (INIS)

    Mui, K.W.

    2006-01-01

    For an air conditioned building, the major electricity consumption is by the heating, ventilation and air conditioning (HVAC) system. As energy saving strategies may conflict with the criteria of indoor air quality and thermal comfort, a concept of the building environmental performance model (BEPM) has been developed to optimize energy consumption in HVAC systems without any deterioration of indoor air quality and thermal comfort. The BEPM is divided into two main modules: the adaptive comfort temperature (ACT) module and the new demand control ventilation (nDCV) module. This study aims to enhance the conventional operation of the air-side systems by incorporating temperature reset with adaptive comfort temperature control and the new demand control ventilation system in high-rise buildings in Hong Kong. A new example weather year (1991) was established as a reference to compute the energy use of HVAC systems in buildings, in order to obtain more representative data for predicting annual energy consumption. A survey of 165 Hong Kong office buildings was conducted, providing valuable information on the existing HVAC design values in different grades of private commercial buildings in Hong Kong. It was found that the actual measured values of indoor temperature were lower than the design ones. Furthermore, with the new example weather year and the integration of the BEPM into Grade A private office buildings in Hong Kong, the total energy saving of the air conditioning systems was calculated (i.e. a saving of HK$122 million in electrical consumption per year) while the thermal comfort of the occupants was also maintained.
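
    The abstract does not give the ACT module's formula. One widely used adaptive comfort relation, the ASHRAE 55 adaptive model for occupant-controlled naturally conditioned spaces, links the neutral indoor temperature to the prevailing mean outdoor temperature; the sketch below shows how a cooling setpoint could be reset from it, with the acceptability offset as an assumed configuration choice:

        def adaptive_comfort_setpoint(mean_outdoor_temp_c, offset_c=2.5):
            """Return the adaptive comfort temperature and an upper cooling setpoint.

            Uses T_comf = 0.31 * T_out + 17.8 (deg C), the ASHRAE 55 adaptive relation,
            valid for prevailing mean outdoor temperatures of roughly 10-33.5 deg C.
            offset_c (2.5 K for 90 % acceptability) is an assumed configuration choice.
            """
            t_comf = 0.31 * mean_outdoor_temp_c + 17.8
            return t_comf, t_comf + offset_c

        print(adaptive_comfort_setpoint(28.0))   # -> (26.48, 28.98)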

  13. Energy Savings Modeling and Inspection Guidelines for Commercial Building Federal Tax Deductions for Buildings in 2016 and Later

    Energy Technology Data Exchange (ETDEWEB)

    Deru, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Field-Macumber, Kristin [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This document provides guidance for modeling and inspecting energy-efficient property in commercial buildings for certification of the energy and power cost savings related to Section 179D of the Internal Revenue Code (IRC) enacted in Section 1331 of the 2005 Energy Policy Act (EPAct) of 2005, noted in Internal Revenue Service (IRS) Notices 2006-52 (IRS 2006), 2008-40 (IRS 2008) and 2012-26 (IRS 2012), and updated by the Protecting Americans from Tax Hikes (PATH) Act of 2015. Specifically, Section 179D provides federal tax deductions for energy-efficient property related to a commercial building's envelope; interior lighting; heating, ventilating, and air conditioning (HVAC); and service hot water (SHW) systems. This document applies to buildings placed in service on or after January 1, 2016.

  14. Economical and scalable synthesis of 6-amino-2-cyanobenzothiazole

    Directory of Open Access Journals (Sweden)

    Jacob R. Hauser

    2016-09-01

    Full Text Available 2-Cyanobenzothiazoles (CBTs) are useful building blocks for: (1) luciferin derivatives for bioluminescent imaging; and (2) handles for bioorthogonal ligations. A particularly versatile CBT is 6-amino-2-cyanobenzothiazole (ACBT), which has an amine handle for straight-forward derivatisation. Here we present an economical and scalable synthesis of ACBT based on a cyanation catalysed by 1,4-diazabicyclo[2.2.2]octane (DABCO), and discuss its advantages for scale-up over previously reported routes.

  15. Building Software Tools for Combat Modeling and Analysis

    National Research Council Canada - National Science Library

    Yuanxin, Chen

    2004-01-01

    ... (Meta-Language for Combat Simulations) and its associated parser and C++ code generator were designed to reduce the amount of time and developmental efforts needed to build sophisticated real world combat simulations. A C++...

  16. MODELLING AND SIMULATION MATTERS UPON THE STATIC ANALYSIS OF A BUILDING

    Directory of Open Access Journals (Sweden)

    DUTA Alina

    2017-05-01

    Full Text Available The present paper puts forward a method for the static and stress analysis of a two-level building via finite element analysis for the building construction domain. Prior to this, a strategic issue has to be dealt with: the construction of a finite element model that best approximates the building structure. The endorsed method replaces the more complicated mathematical model. A central issue that has to be dealt with before determining the displacements and the stress analysis is therefore the construction of the finite element model as the best approximation of the building structure.

  17. Empirical Validation of Building Simulation Software : Modeling of Double Facades

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    The work described in this report is the result of a collaborative effort of members of the International Energy Agency (IEA), Task 34/43: Testing and validation of building energy simulation tools experts group.

  18. Think 500, not 50! A scalable approach to student success in STEM.

    Science.gov (United States)

    LaCourse, William R; Sutphin, Kathy Lee; Ott, Laura E; Maton, Kenneth I; McDermott, Patrice; Bieberich, Charles; Farabaugh, Philip; Rous, Philip

    2017-01-01

    UMBC, a diverse public research university, "builds" upon its reputation in producing highly capable undergraduate scholars to create a comprehensive new model, STEM BUILD at UMBC. This program is designed to help more students develop the skills, experience and motivation to excel in science, technology, engineering, and mathematics (STEM). This article provides an in-depth description of STEM BUILD at UMBC and provides the context of this initiative within UMBC's vision and mission. The STEM BUILD model targets promising STEM students who enter as freshmen or transfer students and do not qualify for significant university or other scholarship support. Of primary importance to this initiative are capacity, scalability, and institutional sustainability, as we distill the advantages and opportunities of UMBC's successful scholars programs and expand their application to more students. The general approach is to infuse the mentoring and training process into the fabric of the undergraduate experience while fostering community, scientific identity, and resilience. At the heart of STEM BUILD at UMBC is the development of BUILD Group Research (BGR), a sequence of experiences designed to overcome the challenges that undergraduates without programmatic support often encounter (e.g., limited internship opportunities, mentorships, and research positions for which top STEM students are favored). BUILD Training Program (BTP) Trainees serve as pioneers in this initiative, which is potentially a national model for universities as they address the call to retain and graduate more students in STEM disciplines - especially those from underrepresented groups. As such, BTP is a research study using random assignment trial methodology that focuses on the scalability and eventual incorporation of successful measures into the traditional format of the academy. Critical measures to transform institutional culture include establishing an extensive STEM Living and Learning Community to

  19. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Minho, E-mail: minmin40@hanmail.net [Asset Management Division, Mate Plus Co., Ltd., 9th Fl., Financial News Bldg. 24-5 Yeouido-dong, Yeongdeungpo-gu, Seoul, 150-877 (Korea, Republic of); Hong, Taehoon, E-mail: hong7@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of); Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of)

    2015-01-15

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.

  20. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    International Nuclear Information System (INIS)

    Jang, Minho; Hong, Taehoon; Ji, Changyoon

    2015-01-01

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings

  1. 3D Modeling of Interior Building Environments and Objects from Noisy Sensor Suites

    Science.gov (United States)

    2015-05-14

    ...model [10, 23–25]. Models with flat regions or sharp corners where the curvature approaches zero or infinity can become degenerate or have poor quality. Since building models are composed almost entirely of such areas, these techniques are not appropriate. Models of building interiors are rich with

  2. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    Yiming Yan

    2017-01-01

    Full Text Available In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), which is used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, the limitation of these methods is an over-reliance on the completeness of the offline-constructed building models, and this completeness is not easily guaranteed, since in modern cities buildings can be of a variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings was introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and then a recently proposed 'occlusions of random textures' model is used to enhance the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved by airborne-like images and satellites are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily performed by our framework, and that better visualization results can be obtained with airborne-like images, which can be further replaced by UAV images.
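
    The paper's own segmentation combines a level-set method with the 'occlusions of random textures' model. As a much cruder stand-in that only illustrates the idea of pulling building candidates out of a DSM, a normalised-DSM threshold with simple morphological cleanup might look like this (all thresholds are assumptions):

        import numpy as np
        from scipy import ndimage

        def building_mask(dsm, dtm, height_thresh=2.5, min_pixels=50):
            """Rough building-candidate mask from a DSM; not the paper's level-set method.

            dsm, dtm:      2-D arrays of surface and terrain elevations on the same grid.
            height_thresh: minimum object height above ground in metres (assumed value).
            min_pixels:    minimum connected-component size kept (assumed value).
            """
            ndsm = dsm - dtm                          # normalised DSM: height above ground
            mask = ndsm > height_thresh               # tall objects: buildings, trees, ...
            mask = ndimage.binary_opening(mask)       # remove isolated noisy pixels
            labels, n = ndimage.label(mask)           # connected components
            sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            keep_labels = np.nonzero(sizes >= min_pixels)[0] + 1
            return np.isin(labels, keep_labels)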

  3. Investigating the impact of different thermal comfort models for zero energy buildings in hot climates

    NARCIS (Netherlands)

    Attia, S.G.; Hensen, J.L.M.

    2014-01-01

    The selection of a thermal comfort model has a major impact on energy consumption of Net Zero Energy Buildings (NZEBs) in hot climates. The objective of this paper is to compare the influence of using different comfort models for zero energy buildings in hot climates. The paper compares the impact

  4. Reviewing the Role of Stakeholders in Operational Research: Opportunities for Group Model Building

    NARCIS (Netherlands)

    Gooyert, V. de; Rouwette, E.A.J.A.; Kranenburg, H.L. van

    2013-01-01

    Stakeholders have always received much attention in system dynamics, especially in the group model building tradition, which emphasizes the deep involvement of a client group in building a system dynamics model. In organizations, stakeholders are gaining more and more attention by managers who try

  5. Prediction model for sound transmission from machinery in buildings: feasible approaches and problems to be solved

    NARCIS (Netherlands)

    Gerretsen, E.

    2000-01-01

    Prediction models for the airborne and impact sound transmission in buildings have recently been established (EN 12354- 1&2:1999). However, these models do not cover technical installations and machinery as a source of sound in buildings. Yet these can cause unacceptable sound levels and it is

  6. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    extend the co-array model to other languages in a small experimental version of Co-array Python. Another collaborative project defined a Fortran 95 interface to ARMCI to encourage Fortran programmers to use the one-sided communication model in anticipation of their conversion to the co-array model later. A collaborative project with the Earth Sciences community at NASA Goddard and GFDL experimented with the co-array model within computational kernels related to their climate models, first using CafLib and then extending the co-array model to use design patterns. Future work will build on the design-pattern idea with a redesign of CafLib as a true object-oriented library using Fortran 2003 and as a parallel numerical library using Fortran 2008.

  7. INTEGRATING SMARTPHONE IMAGES AND AIRBORNE LIDAR DATA FOR COMPLETE URBAN BUILDING MODELLING

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2016-06-01

    Full Text Available A complete building model reconstruction needs data collected from both the air and the ground. The former often has sparse coverage on building façades, while the latter is usually unable to observe the building rooftops. Attempting to solve the missing-data issues in building reconstruction from a single data source, we describe an approach for complete building reconstruction that integrates airborne LiDAR data and ground smartphone imagery. First, by taking advantage of the GPS and digital compass information embedded in the image metadata of smartphones, we are able to find the airborne LiDAR point clouds for the buildings in the images. In the next step, Structure-from-Motion and dense multi-view stereo algorithms are applied to generate a building point cloud from multiple ground images. The third step extracts building outlines from the LiDAR point cloud and from the ground image point cloud, respectively. An automated correspondence between these two sets of building outlines allows us to achieve a precise registration and combination of the two point clouds, which ultimately results in a complete and full-resolution building model. The developed approach overcomes the problem of sparse points on building façades in airborne LiDAR and the deficiency of rooftops in ground images, such that the merits of both datasets are utilized.
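
    The first step, locating the airborne LiDAR data that covers a photographed building from the photo's GPS position, can be sketched as a simple bounding-box lookup. The tile names, bounds and example position below are made-up illustration values, not data from the paper:

        def find_lidar_tile(photo_easting, photo_northing, tiles):
            """Return the name of the LiDAR tile whose bounding box contains the photo position.

            tiles: dict mapping tile name -> (min_e, min_n, max_e, max_n) in map coordinates.
            """
            for name, (min_e, min_n, max_e, max_n) in tiles.items():
                if min_e <= photo_easting <= max_e and min_n <= photo_northing <= max_n:
                    return name
            return None

        tiles = {"tile_A": (500000, 4180000, 500500, 4180500),
                 "tile_B": (500500, 4180000, 501000, 4180500)}
        print(find_lidar_tile(500720, 4180210, tiles))   # -> "tile_B"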

  8. Actual building energy use patterns and their implications for predictive modeling

    International Nuclear Information System (INIS)

    Heidarinejad, Mohammad; Cedeño-Laurent, Jose G.; Wentz, Joshua R.; Rekstad, Nicholas M.; Spengler, John D.; Srebric, Jelena

    2017-01-01

    Highlights: • Developed three building categories based on energy use patterns of campus buildings. • Evaluated implication of temporal energy data granularity on predictive modeling. • Demonstrated importance of monitoring daily chilled water consumption. • Identified interval electricity data as an indicator of building operation schedules. • Demonstrated a calibration process for energy modeling of a campus building. - Abstract: The main goal of this study is to understand the patterns in which commercial buildings consume energy, rather than evaluating building energy use based on aggregate utility bills typically linked to the building's principal tenant activity or occupancy type. The energy consumption patterns define buildings as externally-load, internally-load, or mixed-load dominated. The Penn State and Harvard campuses serve as case studies for this research project. The buildings on these two campuses use steam, chilled water, and electricity as energy commodities and maintain databases of different resolutions, including minute, hourly, daily, and monthly data instances depending on the commodity and the available data acquisition system. The results of this study show that monthly steam consumption correlates directly with outdoor environmental conditions for 88% of the studied buildings, while chilled water consumption has negligible correlation with the outdoor environmental conditions. Thus, in terms of monthly chilled water consumption, 86% of buildings are internally-load or mixed-load dominated. Chilled water consumption is better suited to daily analyses than to monthly and hourly analyses. While the influence of building operation schedules affects the analyses at the hourly level, the monthly chilled water consumptions are not good indicators of the building energy consumption patterns. Electricity consumption at the monthly (or seasonal) level can support the building energy simulation tools for the
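
    The kind of check the abstract describes, correlating monthly steam consumption with outdoor conditions, reduces to a simple correlation coefficient. The numbers below are fabricated example data (monthly heating degree days and steam use), not measurements from the Penn State or Harvard campuses:

        import numpy as np

        hdd   = np.array([650, 540, 420, 250, 120, 30, 10, 15, 90, 280, 450, 600])   # assumed HDD
        steam = np.array([820, 700, 560, 340, 180, 60, 40, 50, 140, 380, 590, 760])  # assumed MWh

        r = np.corrcoef(hdd, steam)[0, 1]
        print(f"Correlation between monthly steam use and heating degree days: {r:.2f}")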

  9. Development of surrogate models using artificial neural network for building shell energy labelling

    International Nuclear Information System (INIS)

    Melo, A.P.; Cóstola, D.; Lamberts, R.; Hensen, J.L.M.

    2014-01-01

    Surrogate models are an important part of building energy labelling programs, but these models still present low accuracy, particularly in cooling-dominated climates. The objective of this study was to evaluate the feasibility of using an artificial neural network (ANN) to improve the accuracy of surrogate models for labelling purposes. An ANN was applied to model the building stock of a city in Brazil, based on the results of extensive simulations using the high-resolution building energy simulation program EnergyPlus. Sensitivity and uncertainty analyses were carried out to evaluate the behaviour of the ANN model, and the variations in the best and worst performance for several typologies were analysed in relation to variations in the input parameters and building characteristics. The results obtained indicate that an ANN can represent the interaction between input and output data for a vast and diverse building stock. Sensitivity analysis showed that no single input parameter can be identified as the main factor responsible for the building energy performance. The uncertainty associated with several parameters plays a major role in assessing building energy performance, together with the facade area and the shell-to-floor ratio. The results of this study may have a profound impact as ANNs could be applied in the future to define regulations in many countries, with positive effects on optimizing the energy consumption. - Highlights: • We model several typologies which have variation in input parameters. • We evaluate the accuracy of surrogate models for labelling purposes. • ANN is applied to model the building stock. • Uncertainty in building plays a major role in the building energy performance. • Results show that ANN could help to develop building energy labelling systems
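
    As a toy illustration of the surrogate-modelling idea, the sketch below trains a small multilayer perceptron to reproduce a fabricated stand-in for simulation results. The two inputs echo parameters named in the abstract (facade area and shell-to-floor ratio), but the data, target function and network size are assumptions, not the paper's model:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.uniform([50, 0.2], [500, 2.0], size=(1000, 2))            # facade area, shell-to-floor ratio
        y = 80 + 0.1 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 2, 1000)    # pseudo energy-use intensity

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
        surrogate.fit(X_train, y_train)
        print("R^2 on held-out cases:", round(surrogate.score(X_test, y_test), 3))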

  10. Modeling, Estimation and Control of Indoor Climate in Livestock Buildings

    DEFF Research Database (Denmark)

    Wu, Zhuang

    The main objective of this research is to design an efficient control system for the indoor climate of a large-scale partition-less livestock building, in order to maintain a healthy, comfortable and economically energy-consuming indoor environment for the agricultural animals and farmers. With necessary assumptions and simplifications, the dominant air flow distributions are investigated and the phenomenon of horizontal variations is well depicted. The designed control system consists of an outer feedback closed-loop dynamic controller and an inner feed-forward redundancy optimization; it is aimed at large-scale livestock buildings and could be considered as an alternative solution to the currently used decentralized PID controller.

  11. Modeling energy flexibility of low energy buildings utilizing thermal mass

    DEFF Research Database (Denmark)

    Foteinaki, Kyriaki; Heller, Alfred; Rode, Carsten

    2016-01-01

    In the future energy system a considerable increase in the penetration of renewable energy is expected, challenging the stability of the system, as both production and consumption will have fluctuating patterns. Hence, the concept of energy flexibility will be necessary in order for the consumption to match the production patterns, shifting demand from on-peak hours to off-peak hours. Buildings could act as flexibility suppliers to the energy system, through load-shifting potential, provided that the large thermal mass of the building stock could be utilized for energy storage. In the present study the load-shifting potential of an apartment of a low energy building in Copenhagen is assessed, utilizing the heat storage capacity of the thermal mass when the heating system is switched off for relieving the energy system. It is shown that when using a 4-hour preheating period before switching off

  12. International survey on current occupant modelling approaches in building performance simulation

    NARCIS (Netherlands)

    O'Brien, W.; Gaetani, I.; Gilani, S.; Carlucci, S.; Hoes, P.; Hensen, J.L.M.

    2017-01-01

    It is not evident that practitioners have kept pace with latest research developments in building occupant behaviour modelling; nor are the attitudes of practitioners regarding occupant behaviour modelling well understood. In order to guide research and development efforts, researchers,

  13. Review of Development Survey of Phase Change Material Models in Building Applications

    Directory of Open Access Journals (Sweden)

    Hussein J. Akeiber

    2014-01-01

    Full Text Available The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data.

  14. Activity measurement and effective dose modelling of natural radionuclides in building material.

    Science.gov (United States)

    Maringer, F J; Baumgartner, A; Rechberger, F; Seidel, C; Stietka, M

    2013-11-01

    In this paper, the assessment of natural radionuclides' activity concentration in building materials, the related calibration requirements and the corresponding indoor exposure dose models are presented. Particular attention is paid to specific improvements in low-level gamma-ray spectrometry needed to determine the activity concentration of the relevant natural radionuclides in building materials with adequate measurement uncertainties. Different approaches to modelling the effective indoor dose due to external radiation resulting from natural radionuclides in building material, and results of actual building material assessments, are shown.

  15. Moisture buffering and its consequence in whole building hygrothermal modeling

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2008-01-01

    Moisture absorption and desorption of materials in contact with indoor air of buildings can be used as a passive, i.e., nonmechanical, way to moderate the variation of indoor humidity. This phenomenon, which is recognized as,moisture buffering', could potentially be used as an attractive feature...... for ventilation if indoor humidity is a parameter for controlling ventilation rate, 2. it is possible to improve the perceived acceptability of indoor air, as judged by the temperature and humidity of the air, by using moisture buffering to control the indoor humidity. The results of the whole building...

  16. Modeling hourly consumption of electricity and district heat in non-residential buildings

    International Nuclear Information System (INIS)

    Kipping, A.; Trømborg, E.

    2017-01-01

    Models for hourly consumption of heat and electricity in different consumer groups on a regional level can yield important data for energy system planning and management. In this study hourly meter data, combined with cross-sectional data derived from the Norwegian energy label database, is used to model hourly consumption of both district heat and electrical energy in office buildings and schools which either use direct electric heating (DEH) or non-electric hydronic heating (OHH). The results of the study show that modeled hourly total energy consumption in buildings with DEH and in buildings with OHH (supplied by district heat) exhibits differences, e.g. due to differences in heat distribution and control systems. In a normal year, in office buildings with OHH the main part of total modeled energy consumption is used for electric appliances, while in schools with OHH the main part is used for heating. In buildings with OHH the share of modeled annual heating energy is higher than in buildings with DEH. Although based on small samples our regression results indicate that the presented method can be used for modeling hourly energy consumption in non-residential buildings, but also that larger samples and additional cross-sectional information could yield improved models and more reliable results. - Highlights: • Schools with district heating (DH) tend to use less night-setback. • DH in office buildings tends to start earlier than direct electric heating (DEH). • In schools with DH the main part of annual energy consumption is used for heating. • In office buildings with DH the main part is used for electric appliances. • Buildings with DH use a larger share of energy for heating than buildings with DEH.
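
    The regression approach described above, explaining hourly consumption with outdoor temperature and time-of-day information, can be sketched as an ordinary least-squares fit with hour-of-day indicator variables. The data below are synthetic stand-ins, not the Norwegian meter data used in the paper:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        hours = np.tile(np.arange(24), 30)                                   # 30 days of hourly records
        temps = 5 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
        load  = 120 - 3 * temps + 20 * ((hours >= 7) & (hours <= 17)) + rng.normal(0, 5, hours.size)

        hour_dummies = (hours[:, None] == np.arange(24)).astype(float)       # hour-of-day indicators
        X = np.column_stack([temps, hour_dummies])
        model = LinearRegression().fit(X, load)
        print("R^2 of the hourly consumption model:", round(model.score(X, load), 3))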

  17. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    The thesis has two aims. The first is to investigate (a) BIM as a platform for Architecture, Engineering, Construction, and Facility Management (AEC/FM) communication, and (b) BPS as a platform for early-stage building performance prediction. The second is to develop (a) relevant AEC/FM communication support instruments, and (b) standardized BIM and BPS execution guidelines and information exchange methodologies. Thesis studies showed that BIM approaches have the potential to improve AEC/FM communication and collaboration. BIM is by its nature multidisciplinary, bringing AEC/FM project participants together and creating constant communication. However, BIM adoption can lead to technical challenges, for example, getting BIM-compatible tools to communicate properly. Furthermore, BIM adoption requires organizational change, that is, changes in AEC/FM work practices and interpersonal dynamics. Consequently, to ensure that the adoption of BIM is successful, it is recommended that common IT regulations

  18. Object-oriented integrated approach for the design of scalable ECG systems.

    Science.gov (United States)

    Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija

    2009-01-01

    The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) Systems. The purpose of this methodology is to preserve real-world structure and relations with the aim to minimize the information loss during the process of modeling, especially for Real-Time (RT) systems. We report on a case study of the design that uses the integration of OO and RT methods and the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The gained experience based on the strongly defined semantics of the object model is discussed and related problems are analyzed.

  19. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    Science.gov (United States)

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting the calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is carried out with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil, Antofagasta and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  20. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation

    Directory of Open Access Journals (Sweden)

    M. Palme

    2017-10-01

    Full Text Available This data article presents files supporting the calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article “From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect” (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South-American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is carried out with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (excel) explaining how to group different urban samples into representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC in Lima, Guayaquil, Antofagasta and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).

  1. Dynamic analysis of reactor containment building using axisymmetric finite element model

    International Nuclear Information System (INIS)

    Thakkar, S.K.; Dubey, R.N.

    1989-01-01

    The structural safety of a nuclear reactor building during an earthquake is of great importance in view of the possibility of radiation hazards. The rational evaluation of forces and displacements in various portions of the structure and foundation during strong ground motion is most important for the safe performance and economic design of the reactor building. The accuracy of the results of dynamic analysis naturally depends on the type of mathematical model employed. Three types of mathematical models are employed for the dynamic analysis of reactor buildings: the beam model, the axisymmetric finite element model, and the three-dimensional model. In this paper emphasis is laid on the axisymmetric model. This model of the containment building is considered a refinement over the conventional beam model of the structure. A nuclear reactor building on a rocky foundation is considered herein; the foundation-structure interaction is relatively small under this condition. The objective of the paper is to highlight the significance of modelling the non-axisymmetric portions of the building, such as the reactor internals, by an equivalent axisymmetric body, for the structural response of the building

  2. Determination of the Thermal Insulation for the Model Building Approach and the Global Effects in Turkey

    Directory of Open Access Journals (Sweden)

    Cenk Onan

    2014-08-01

    Full Text Available One of the most important factors to be considered in the design of energy-efficient buildings is the thickness of the insulation to be applied to the building. In this study the existing building stock in Turkey has been investigated with respect to parameters such as height and area, and a model building has been created covering all of these buildings. The reduction in fuel emissions of the combustion system was calculated for the case of insulation applied to this model building. The heat loss of the existing building stock, the exhaust emissions and the contribution to the country's economy are also determined with the model building methodology. The results show that the optimum insulation thicknesses vary between 3.21 and 7.12 cm, the energy savings vary between 9.23 US$/m2 and 43.95 US$/m2, and the payback periods vary between 1 and 8.8 years depending on the region. As a result of the study, when the optimum insulation thickness is applied in the model building, the total energy savings for the country are calculated to be 41.7 billion US$. Total CO2 emissions for the country are also calculated to be 57.2 billion kg CO2 per year after insulation.
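
    The abstract does not spell out the savings calculation. A common simplified route, sketched below with entirely assumed numbers for degree days, U-values and prices, estimates annual transmission losses from heating degree days and derives a simple payback period for the added insulation; it is not necessarily the study's exact method:

        def annual_heating_energy_kwh_per_m2(u_value, hdd):
            """Annual transmission loss per m2 of wall: U [W/m2K] * HDD [K*day] * 24 h / 1000."""
            return u_value * hdd * 24 / 1000

        hdd = 2500                      # assumed heating degree days for one climate region
        u_before, u_after = 1.5, 0.4    # assumed wall U-values without / with insulation [W/m2K]
        fuel_price = 0.06               # assumed delivered-heat price [US$/kWh]
        insulation_cost = 12.0          # assumed installed insulation cost [US$/m2]

        saving_kwh = (annual_heating_energy_kwh_per_m2(u_before, hdd)
                      - annual_heating_energy_kwh_per_m2(u_after, hdd))
        saving_usd = saving_kwh * fuel_price
        print(f"Annual saving: {saving_usd:.2f} US$/m2, payback: {insulation_cost / saving_usd:.1f} years")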

  3. Comparison of sensorless dimming control based on building modeling and solar power generation

    International Nuclear Information System (INIS)

    Lee, Naeun; Kim, Jonghun; Jang, Cheolyong; Sung, Yoondong; Jeong, Hakgeun

    2015-01-01

    Artificial lighting in office buildings accounts for about 30% of total building energy consumption. Lighting energy is important for reducing building energy consumption, since artificial lighting typically has a relatively large energy conversion factor. Therefore, previous studies have proposed dimming control using daylight. When dimming control is applied, a method based on building modeling does not need illuminance sensors. Thus, it can be applied to existing buildings that do not have illuminance sensors. However, this method does not accurately reflect real-time weather conditions. On the other hand, solar power generation from a PV (photovoltaic) panel reflects real-time weather conditions, and using the PV panel as the sensor improves the accuracy of dimming control by reflecting disturbances. Therefore, we compared and analyzed two types of sensorless dimming control: those based on building modeling and those based on solar power generation using PV panels. In terms of energy savings, we found that dimming control based on building modeling is more effective than that based on solar power generation by about 6%. However, dimming control based on solar power generation minimizes the inconvenience to occupants and can also react to changes in the solar radiation entering the building caused by, for example, a dirty window. - Highlights: • We conducted sensorless dimming control based on solar power generation. • Dimming controls using building modeling and solar power generation were compared. • Real-time weather conditions can be considered by using solar power generation. • Dimming control using solar power generation minimizes inconvenience to occupants

  4. Weather Correlations to Calculate Infiltration Rates for U. S. Commercial Building Energy Models.

    Science.gov (United States)

    Ng, Lisa C; Quiles, Nelson Ojeda; Dols, W Stuart; Emmerich, Steven J

    2018-01-01

    As building envelope performance improves, a greater percentage of building energy loss will occur through envelope leakage. Although the energy impacts of infiltration on building energy use can be significant, current energy simulation software has limited ability to accurately account for envelope infiltration and the impacts of improved airtightness. This paper extends previous work by the National Institute of Standards and Technology that developed a set of EnergyPlus inputs for modeling infiltration in several commercial reference buildings using Chicago weather. The current work includes cities in seven additional climate zones and uses the updated versions of the prototype commercial building types developed by the Pacific Northwest National Laboratory for the U. S. Department of Energy. Comparisons were made between the predicted infiltration rates using three representations of the commercial building types: PNNL EnergyPlus models, CONTAM models, and EnergyPlus models using the infiltration inputs developed in this paper. The newly developed infiltration inputs in EnergyPlus yielded average annual increases of 3 % and 8 % in HVAC electricity and gas use, respectively, over the original infiltration inputs in the PNNL EnergyPlus models. When analyzing the benefits of building envelope airtightening, greater HVAC energy savings were predicted using the newly developed infiltration inputs in EnergyPlus compared with using the original infiltration inputs. These results indicate that the effects of infiltration on HVAC energy use can be significant and that infiltration can and should be better accounted for in whole-building energy models.
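
    The infiltration inputs discussed above feed the EnergyPlus ZoneInfiltration:DesignFlowRate object, which scales a design flow rate by temperature-difference and wind-speed terms. The sketch below evaluates that general equation with placeholder coefficients; they are not the NIST-derived values developed in the paper:

        def infiltration_m3_per_s(i_design, t_zone, t_out, wind_speed,
                                  a=0.0, b=0.03, c=0.0, d=0.002, schedule=1.0):
            """EnergyPlus-style design-flow-rate infiltration.

            Infiltration = I_design * schedule * (A + B*|T_zone - T_out| + C*v + D*v^2).
            The coefficients a-d are placeholders, not the values derived in the paper.
            """
            dt = abs(t_zone - t_out)
            return i_design * schedule * (a + b * dt + c * wind_speed + d * wind_speed ** 2)

        # Example: 0.05 m3/s design flow, 21 C zone, -5 C outdoors, 4 m/s wind.
        print(infiltration_m3_per_s(0.05, 21.0, -5.0, 4.0))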

  5. Study on vertical seismic response model of BWR-type reactor building

    International Nuclear Information System (INIS)

    Konno, T.; Motohashi, S.; Izumi, M.; Iizuka, S.

    1993-01-01

    A study on advanced seismic design for LWRs has been carried out by the Nuclear Power Engineering Corporation (NUPEC), under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan. As part of the study, the construction of an accurate analytical model of reactor buildings for seismic response analysis, which can reasonably represent the dynamic characteristics of the building, has been investigated. In Japan, vibration models of reactor buildings for horizontal ground motion have been studied and examined through many simulation analyses of forced vibration tests and earthquake observations of actual buildings, and it is now possible to establish a reliable horizontal vibration model on the basis of a multi-lumped-mass and spring model. However, vertical vibration models have not been studied as much as horizontal models, due to the smaller amount of observed data for vertical motions. In this paper, the vertical seismic response models of a BWR-type reactor building, including the soil-structure interaction effect, are numerically studied by comparing the dynamic characteristics of (1) a three-dimensional finite element model, (2) a multi-stick lumped-mass model with a flexible base-mat, (3) a multi-stick lumped-mass model with a rigid base-mat and (4) a single-stick lumped-mass model. In particular, the BWR-type reactor building has a long-span truss roof, which is considered to be one of the critical members for vertical excitation. The modelling of the roof trusses is also studied

  6. Use of MCAM in creating 3D neutronics model for ITER building

    International Nuclear Information System (INIS)

    Zeng Qin; Wang Guozhong; Dang Tongqiang; Long Pengcheng; Loughlin, Michael

    2012-01-01

    Highlights: ► We created a 3D neutronics model of the ITER building. ► The model was produced from the engineering CAD model by MCAM software. ► The neutron flux map in the ITER building was calculated. - Abstract: The three dimensional (3D) neutronics reference model of International Thermonuclear Experimental Reactor (ITER) only defines the tokamak machine and extends to the bio-shield. In order to meet further 3D neutronics analysis needs, it is necessary to create a 3D reference model of the ITER building. Monte Carlo Automatic Modeling Program for Radiation Transport Simulation (MCAM) was developed as a computer aided design (CAD) based bi-directional interface program between general CAD systems and Monte Carlo radiation transport simulation codes. With the help of MCAM version 4.8, the 3D neutronics model of ITER building was created based on the engineering CAD model. The calculation of the neutron flux map in ITER building during operation showed the correctness and usability of the model. This model is the first detailed ITER building 3D neutronics model and it will be made available to all international organization collaborators as a reference model.

  7. Use of MCAM in creating 3D neutronics model for ITER building

    Energy Technology Data Exchange (ETDEWEB)

    Zeng Qin [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Wang Guozhong, E-mail: mango33@mail.ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Dang Tongqiang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Long Pengcheng [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui 230031 (China); School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027 (China); Loughlin, Michael [ITER Organization, Route de Vinon sur Verdon, 13115 St. Paul-lez-Durance (France)

    2012-08-15

    Highlights: ► We created a 3D neutronics model of the ITER building. ► The model was produced from the engineering CAD model by MCAM software. ► The neutron flux map in the ITER building was calculated. - Abstract: The three dimensional (3D) neutronics reference model of International Thermonuclear Experimental Reactor (ITER) only defines the tokamak machine and extends to the bio-shield. In order to meet further 3D neutronics analysis needs, it is necessary to create a 3D reference model of the ITER building. Monte Carlo Automatic Modeling Program for Radiation Transport Simulation (MCAM) was developed as a computer aided design (CAD) based bi-directional interface program between general CAD systems and Monte Carlo radiation transport simulation codes. With the help of MCAM version 4.8, the 3D neutronics model of ITER building was created based on the engineering CAD model. The calculation of the neutron flux map in ITER building during operation showed the correctness and usability of the model. This model is the first detailed ITER building 3D neutronics model and it will be made available to all international organization collaborators as a reference model.

  8. Construction cost prediction model for conventional and sustainable college buildings in North America

    Directory of Open Access Journals (Sweden)

    Othman Subhi Alshamrani

    2017-03-01

    Full Text Available The literature lacks initial cost prediction models for college buildings, especially models comparing the costs of sustainable and conventional buildings. A multi-regression model was developed for conceptual initial cost estimation of conventional and sustainable college buildings in North America. RS Means was used to estimate the national average of construction costs for 2014, which was subsequently utilized to develop the model. The model predicts the initial cost per square foot for two structure types, steel and concrete. The other predictor variables are building area, number of floors and floor height. The model was developed in three major stages: preliminary diagnostics on data quality, model development and validation. The developed model was successfully tested and validated with real-time data.
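
    A minimal sketch of a multi-regression cost model with the predictor variables named in the abstract (structure type, building area, number of floors, floor height). The data are synthetic placeholders, not RS Means figures, and the coefficients are invented for illustration only.

      # Hedged sketch: multiple linear regression for cost per ft2 on synthetic data.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n = 200
      area = rng.uniform(20_000, 200_000, n)          # gross floor area [ft2]
      floors = rng.integers(1, 10, n)
      floor_height = rng.uniform(10, 16, n)           # [ft]
      is_steel = rng.integers(0, 2, n)                # 1 = steel frame, 0 = concrete

      # Synthetic "true" cost per ft2, used only to generate example data.
      cost_per_ft2 = (180 + 0.0001 * area + 3.0 * floors
                      + 2.5 * floor_height + 12.0 * is_steel
                      + rng.normal(0, 8, n))

      X = np.column_stack([area, floors, floor_height, is_steel])
      model = LinearRegression().fit(X, cost_per_ft2)
      print("R^2 on training data:", round(model.score(X, cost_per_ft2), 3))
      print("predicted $/ft2:", round(float(model.predict([[80_000, 4, 12.0, 1]])[0]), 1))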

  9. Exposure Modeling for Polychlorinated Biphenyls in School Buildings

    Science.gov (United States)

    There is limited research on characterizing exposures from PCB sources for occupants of school buildings. PCB measurement results from six schools were used to estimate potential exposure distributions for four age groups (4-5, 6-10, 11-14, 14-18 year-olds) using the Stochastic...

  10. Semantically rich 3D building and cadastral models for valuation

    NARCIS (Netherlands)

    Isikdag, U.; Horhammer, M.; Zlatanova, S.; Kathmann, R.; Van Oosterom, P.J.M.

    2014-01-01

    Valuation of real estate/properties is in many countries/cities the basis for fair taxation. The value depends on many aspects, including the physical real-world aspects (geometries, material of the object as built) and legal/virtual aspects (rights, restrictions, responsibilities, zoning/development

  11. Damping in building structures during earthquakes: test data and modeling

    International Nuclear Information System (INIS)

    Coats, D.W. Jr.

    1982-01-01

    A review and evaluation of the state-of-the-art of damping in building structures during earthquakes is presented. The primary emphasis is on the following areas: 1) the evaluation of commonly used mathematical techniques for incorporating damping effects in both simple and complex systems; 2) a compilation and interpretation of damping test data; and 3) an evaluation of structure testing methods, building instrumentation practices, and an investigation of rigid-body rotation effects on damping values from test data. A literature review provided the basis for evaluating mathematical techniques used to incorporate earthquake-induced damping effects in simple and complex systems. A discussion on the effectiveness of damping, as a function of excitation type, is also included. Test data, from a wide range of sources, has been compiled and interpreted for buildings, nuclear power plant structures, piping, equipment, and isolated structural elements. Test methods used to determine damping and frequency parameters are discussed. In particular, the advantages and disadvantages associated with the normal mode and transfer function approaches are evaluated. Additionally, the effect of rigid-body rotations on damping values deduced from strong-motion building response records is investigated. A discussion of identification techniques typically used to determine building parameters (frequency and damping) from strong motion records is included. Finally, an analytical demonstration problem is presented to quantify the potential error in predicting fixed-base structural frequency and damping values from strong motion records, when rigid-body rotations are not properly accounted for.
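
    One standard way damping values are deduced from test data is the logarithmic decrement of a free-decay record; the sketch below applies it to a synthetic signal. This is a generic illustration, not a reproduction of the report's data reduction.

      # Damping ratio from the logarithmic decrement of a (synthetic) free-decay signal.
      import numpy as np

      fn, zeta_true = 2.0, 0.05                    # natural frequency [Hz], damping ratio
      t = np.linspace(0.0, 10.0, 2000)
      wd = 2 * np.pi * fn * np.sqrt(1 - zeta_true ** 2)
      x = np.exp(-zeta_true * 2 * np.pi * fn * t) * np.cos(wd * t)

      # Indices of successive positive peaks of the decay.
      peaks = [i for i in range(1, len(x) - 1)
               if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
      m = 5                                        # cycles between the two peaks used
      delta = np.log(x[peaks[0]] / x[peaks[m]]) / m          # logarithmic decrement
      zeta_est = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)
      print(f"estimated damping ratio: {zeta_est:.3f} (true value {zeta_true})")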

  12. Procedure for identifying models for the heat dynamics of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik

    This report describes a new method for obtaining detailed information about the heat dynamics of a building using frequent readings of the heat consumption. Such a procedure is considered to be of utmost importance as a key procedure for using readings from smart meters, which is expected...
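
    The report's own procedure is based on grey-box (stochastic state-space) models identified from frequent meter readings; as a much simpler stand-in, the sketch below shows an energy-signature regression that recovers an overall heat-loss coefficient (UA) from synthetic hourly heat-meter and temperature data. All values are assumed.

      # Energy-signature regression: heat use vs. indoor-outdoor temperature difference.
      import numpy as np

      rng = np.random.default_rng(1)
      hours = 24 * 60
      t_out = 5 + 5 * np.sin(np.linspace(0, 12 * np.pi, hours)) + rng.normal(0, 1, hours)
      t_in = np.full(hours, 21.0)

      ua_true, gains = 250.0, 800.0                # W/K and W (assumed "true" building)
      heat = ua_true * (t_in - t_out) - gains + rng.normal(0, 150, hours)

      A = np.column_stack([t_in - t_out, np.ones(hours)])
      (ua_est, offset), *_ = np.linalg.lstsq(A, heat, rcond=None)
      print(f"estimated UA = {ua_est:.0f} W/K (true {ua_true:.0f}), "
            f"estimated constant gains = {-offset:.0f} W")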

  13. Modelling surface pressure fluctuation on medium-rise buildings

    NARCIS (Netherlands)

    Snæbjörnsson, J.T.; Geurts, C.P.W.

    2006-01-01

    This paper describes the results of two experiments into the fluctuating characteristics of wind-induced pressures on buildings in a built-up environment. The experiments have been carried out independently in Iceland and The Netherlands and can be considered to represent two separate cases of

  14. application of christer's inspection model for building maintenance

    African Journals Online (AJOL)

    system. Information was collected by means of forms and 100 were completed for trades including painter, plumber ... the threat poorly maintained buildings pose to public safety [2] ... by operating an inspection system.

  15. Aggregation Potentials for Buildings - Business Models of Demand Response and Virtual Power Plants

    DEFF Research Database (Denmark)

    Ma, Zheng; Billanes, Joy Dalmacio; Jørgensen, Bo Nørregaard

    2017-01-01

    ... aggregation market with unclear incentives is still a challenge for buildings to participate in the aggregation market. However, few studies have investigated business models for building participation in the aggregation market. Therefore, this paper develops four business models for buildings to participate ... programs, national regulations and energy market structures strongly influence buildings’ participation in the aggregation market. Under the current Nordic market regulation, business model one is the most feasible one, and business model two faces more challenges due to regulation barriers and limited ...

  16. The research of contamination regularities of historical buildings and architectural monuments by methods of computer modeling

    Directory of Open Access Journals (Sweden)

    Kuzmichev Andrey A.

    2017-01-01

    Full Text Available Due to rapid urbanization and industrial development, the external appearance of buildings and architectural monuments in the urban environment requires special attention from the standpoint of visual ecology. Dust deposition from polluted atmospheric air is one of the key causes of degradation of building facades. Modern computer modeling methods make it possible to evaluate the impact of polluted atmospheric air on external building facades in order to preserve them.

  17. BARRIERS AND CHALLENGES OF BUILDING INFORMATION MODELLING IMPLEMENTATION IN JORDANIAN CONSTRUCTION INDUSTRY

    OpenAIRE

    Mohammed A.KA. AL-Btoush*, Ahmad Tarmizi Haron

    2017-01-01

    Construction companies are faced with the need to innovatively integrate the construction process and address project development challenges. One way of doing that is the integration of building information modelling (BIM) in the building design and development cycles. However, due to the lack of clear understanding and the absence of a holistic implementation guideline, many companies are unable to fully achieve BIM potentials or implement BIM in their project and building lifecycle. BIM imp...

  18. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach

    International Nuclear Information System (INIS)

    Lü, Xiaoshu; Lu, Tao; Kibert, Charles J.; Viljanen, Martti

    2015-01-01

    Highlights: • This paper presents a new modeling method to forecast energy demands. • The model is based on physical–statistical approach to improving forecast accuracy. • A new method is proposed to address the heterogeneity challenge. • Comparison with measurements shows accurate forecasts of the model. • The first physical–statistical/heterogeneous building energy modeling approach is proposed and validated. - Abstract: Energy consumption forecasting is a critical and necessary input to planning and controlling energy usage in the building sector which accounts for 40% of the world’s energy use and the world’s greatest fraction of greenhouse gas emissions. However, due to the diversity and complexity of buildings as well as the random nature of weather conditions, energy consumption and loads are stochastic and difficult to predict. This paper presents a new methodology for energy demand forecasting that addresses the heterogeneity challenges in energy modeling of buildings. The new method is based on a physical–statistical approach designed to account for building heterogeneity to improve forecast accuracy. The physical model provides a theoretical input to characterize the underlying physical mechanism of energy flows. Then stochastic parameters are introduced into the physical model and the statistical time series model is formulated to reflect model uncertainties and individual heterogeneity in buildings. A new method of model generalization based on a convex hull technique is further derived to parameterize the individual-level model parameters for consistent model coefficients while maintaining satisfactory modeling accuracy for heterogeneous buildings. The proposed method and its validation are presented in detail for four different sports buildings with field measurements. The results show that the proposed methodology and model can provide a considerable improvement in forecasting accuracy
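
    The general physical-statistical idea can be illustrated with a simple degree-day baseline (the physical part) plus an autoregressive residual (the statistical part). The sketch below is an assumed, minimal stand-in; the paper's actual formulation, convex-hull generalization and parameters are more elaborate.

      # Degree-day "physical" baseline plus an AR(1) "statistical" residual.
      import numpy as np

      rng = np.random.default_rng(2)
      days = 365
      t_out = 10 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)

      ua, t_base = 300.0, 17.0                       # W/K, degC (assumed physical parameters)
      physical = ua * np.clip(t_base - t_out, 0, None) * 24 / 1000   # kWh/day

      phi, sigma = 0.7, 15.0                         # AR(1) residual for occupant/behaviour effects
      resid = np.zeros(days)
      for k in range(1, days):
          resid[k] = phi * resid[k - 1] + rng.normal(0, sigma)
      measured = physical + resid

      # One-step-ahead forecast = physical prediction + AR(1) carry-over of the last error.
      last_err = measured[-1] - physical[-1]
      forecast_next = physical[-1] + phi * last_err   # assuming similar weather tomorrow
      print(f"one-step-ahead forecast: {forecast_next:.0f} kWh/day")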

  19. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  20. Modeling and optimization of energy generation and storage systems for thermal conditioning of buildings targeting conceptual building design

    Energy Technology Data Exchange (ETDEWEB)

    Grahovac, Milica

    2012-11-29

    The thermal conditioning systems are responsible for almost half of the energy consumption by commercial buildings. In many European countries and in the USA, buildings account for around 40% of primary energy consumption and it is therefore vital to explore further ways to reduce the HVAC (Heating, Ventilation and Air Conditioning) system energy consumption. This thesis investigates the relationship between the energy generation and storage systems for thermal conditioning of buildings (shorter: primary HVAC systems) and the conceptual building design. Certain building design decisions irreversibly influence a building's energy performance and, conversely, many generation and storage components impose restrictions on building design and, by their nature, cannot be introduced at a later design stage. The objective is, firstly, to develop a method to quantify this influence, in terms of primary HVAC system dimensions, its cost, emissions and energy consumption and, secondly, to enable the use of the developed method by architects during the conceptual design. In order to account for the non-stationary effects of the intermittent renewable energy sources (RES), thermal storage and for the component part load efficiencies, a time domain system simulation is required. An abstract system simulation method is proposed based on seven pre-configured primary HVAC system models, including components such as boilers, chillers and cooling towers, thermal storage, solar thermal collectors, and photovoltaic modules. A control strategy is developed for each of the models and their annual quasi-stationary simulation is performed. The performance profiles obtained are then used to calculate the energy consumption, carbon emissions and costs. The annuity method has been employed to calculate the cost. Optimization is used to automatically size the HVAC systems, based on their simulation performance. Its purpose is to identify the system component dimensions that provide
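
    The annuity method mentioned above converts an investment into an equivalent annual cost via the annuity factor a = q^n (q - 1) / (q^n - 1), with q = 1 + interest rate and n the lifetime in years. A minimal sketch, with assumed interest rate, lifetime and cost figures:

      def annuity_factor(interest_rate, lifetime_years):
          q = 1.0 + interest_rate
          return (q ** lifetime_years * (q - 1.0)) / (q ** lifetime_years - 1.0)

      def annual_cost(investment, interest_rate, lifetime_years, running_cost_per_year):
          """Equivalent annual cost of an investment plus its yearly running cost."""
          return investment * annuity_factor(interest_rate, lifetime_years) + running_cost_per_year

      if __name__ == "__main__":
          # Example: one primary HVAC system option (all figures are placeholders).
          print(f"annuity factor: {annuity_factor(0.05, 20):.4f}")
          print(f"equivalent annual cost: {annual_cost(250_000, 0.05, 20, 12_000):,.0f} per year")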

  1. Modeling and optimization of energy generation and storage systems for thermal conditioning of buildings targeting conceptual building design

    Energy Technology Data Exchange (ETDEWEB)

    Grahovac, Milica

    2012-11-29

    The thermal conditioning systems are responsible for almost half of the energy consumption by commercial buildings. In many European countries and in the USA, buildings account for around 40% of primary energy consumption and it is therefore vital to explore further ways to reduce the HVAC (Heating, Ventilation and Air Conditioning) system energy consumption. This thesis investigates the relationship between the energy generation and storage systems for thermal conditioning of buildings (shorter: primary HVAC systems) and the conceptual building design. Certain building design decisions irreversibly influence a building's energy performance and, conversely, many generation and storage components impose restrictions on building design and, by their nature, cannot be introduced at a later design stage. The objective is, firstly, to develop a method to quantify this influence, in terms of primary HVAC system dimensions, its cost, emissions and energy consumption and, secondly, to enable the use of the developed method by architects during the conceptual design. In order to account for the non-stationary effects of the intermittent renewable energy sources (RES), thermal storage and for the component part load efficiencies, a time domain system simulation is required. An abstract system simulation method is proposed based on seven pre-configured primary HVAC system models, including components such as boilers, chillers and cooling towers, thermal storage, solar thermal collectors, and photovoltaic modules. A control strategy is developed for each of the models and their annual quasi-stationary simulation is performed. The performance profiles obtained are then used to calculate the energy consumption, carbon emissions and costs. The annuity method has been employed to calculate the cost. Optimization is used to automatically size the HVAC systems, based on their simulation performance. Its purpose is to identify the system component dimensions that provide minimal

  2. Modeling volatile organic compounds sorption on dry building materials using double-exponential model

    International Nuclear Information System (INIS)

    Deng, Baoqing; Ge, Di; Li, Jiajia; Guo, Yuan; Kim, Chang Nyung

    2013-01-01

    A double-exponential surface sink model for VOCs sorption on building materials is presented. Here, the diffusion of VOCs in the material is neglected and the material is viewed as a surface sink. The VOCs concentration in the air adjacent to the material surface is introduced and assumed to always maintain equilibrium with the material-phase concentration. It is assumed that the sorption can be described by mass transfer between the room air and the air adjacent to the material surface. The mass transfer coefficient is evaluated from the empirical correlation, and the equilibrium constant can be obtained by linear fitting to the experimental data. The present model is validated through experiments in small and large test chambers. The predicted results accord well with the experimental data in both the adsorption stage and desorption stage. The model avoids the ambiguity of model constants found in other surface sink models and is easy to scale up
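
    A minimal numerical sketch of a surface-sink mass balance in the spirit described above: the room air exchanges VOC mass with a thin air layer adjacent to the material, which is assumed to stay in equilibrium with the material phase. The parameter values are arbitrary placeholders, not the fitted constants from the paper.

      # Room air exchanges VOC mass with a near-surface air layer assumed to be in
      # equilibrium with the material phase (equilibrium constant K, units of m).
      V, A, Q = 30.0, 10.0, 15.0        # room volume [m3], sink area [m2], airflow [m3/h]
      hm, K = 1.2, 50.0                 # mass-transfer coefficient [m/h], equilibrium constant [m]
      C_in = 0.0                        # supply-air concentration [mg/m3]

      dt, hours = 0.01, 48.0
      C, Cs = 1.0, 0.0                  # room and near-surface concentrations [mg/m3]
      for k in range(int(hours / dt)):
          t = k * dt
          source = 5.0 if t < 24.0 else 0.0        # mg/h emission, switched off after 24 h
          flux = hm * A * (C - Cs)                 # adsorption (+) or desorption (-) [mg/h]
          C += dt * (source + Q * (C_in - C) - flux) / V
          Cs += dt * flux / (K * A)                # material layer: K*A*dCs/dt = flux
      print(f"room concentration after {hours:.0f} h: {C:.3f} mg/m3")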

  3. Algorithmic psychometrics and the scalable subject.

    Science.gov (United States)

    Stark, Luke

    2018-04-01

    Recent public controversies, ranging from the 2014 Facebook 'emotional contagion' study to psychographic data profiling by Cambridge Analytica in the 2016 American presidential election, Brexit referendum and elsewhere, signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern. The entangled history of these two fields grounds the application of applied psychological techniques to digital technologies, and an investment in applying calculability to human subjectivity. Today, a quantifiable psychological subject position has been translated, via 'big data' sets and algorithmic analysis, into a model subject amenable to classification through digital media platforms. I term this position the 'scalable subject', arguing it has been shaped and made legible by algorithmic psychometrics - a broad set of affordances in digital platforms shaped by psychology and the behavioral sciences. In describing the contours of this 'scalable subject', this paper highlights the urgent need for renewed attention from STS scholars on the psy sciences, and on a computational politics attentive to psychology, emotional expression, and sociality via digital media.

  4. Integrated model for characterization of spatiotemporal building energy consumption patterns in neighborhoods and city districts

    International Nuclear Information System (INIS)

    Fonseca, Jimeno A.; Schlueter, Arno

    2015-01-01

    Highlights: • A model to describe spatiotemporal building energy demand patterns was developed. • The model integrates existing methods in urban and energy planning domains. • The model is useful to analyze energy efficiency strategies in neighborhoods. • Applicability in educational, urban and energy planning practices was found. - Abstract: We introduce an integrated model for characterization of spatiotemporal building energy consumption patterns in neighborhoods and city districts. The model addresses the need for a comprehensive method to identify present and potential states of building energy consumption in the context of urban transformation. The focus lies on determining the spatiotemporal variability of energy services in both standing and future buildings in the residential, commercial and industrial sectors. This detailed characterization facilitates the assessment of potential energy efficiency measures at the neighborhood and city district scales. In a novel approach we integrated existing methods in urban and energy planning domains such as spatial analysis, dynamic building energy modeling and energy mapping to provide a comprehensive, multi-scale and multi-dimensional model of analysis. The model is part of a geographic information system (GIS), which serves as a platform for the allocation and future dissemination of spatiotemporal data. The model is validated against measured data and a peer model for a city district in Switzerland. In this context, we present practical applications in the analysis of energy efficiency measures in buildings and urban zoning. We furthermore discuss potential applications in educational, urban and energy planning practices

  5. A financing model to solve financial barriers for implementing green building projects.

    Science.gov (United States)

    Lee, Sanghyo; Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the greenhouse gas and energy reductions from implementing green buildings are gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may face financial barriers, with high initial construction costs and uncertainties about future project value. Under these circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. The study also presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem to be feasible as well because of the promising value of the guarantee from CER. To do this, certification under the Clean Development Mechanism (CDM) for green buildings must be obtained.

  6. A control-oriented model for combined building climate comfort and aquifer thermal energy storage system

    NARCIS (Netherlands)

    Rostampour Samarin, Vahab; Bloemendal, J.M.; Jaxa-Rozen, M.; Keviczky, T.

    2016-01-01

    This paper presents a control-oriented model for combined building climate comfort and aquifer thermal energy storage (ATES) system. In particular, we first provide a description of building operational systems together with control framework variables. We then focus on the derivation of an

  7. A Financing Model to Solve Financial Barriers for Implementing Green Building Projects

    Science.gov (United States)

    Lee, Baekrae; Kim, Juhyung; Kim, Jaejun

    2013-01-01

    Along with the growing interest in greenhouse gas reduction, the greenhouse gas and energy reductions from implementing green buildings are gaining attention. The government of the Republic of Korea has set green growth as its paradigm for national development, and there is a growing interest in energy saving for green buildings. However, green buildings may face financial barriers, with high initial construction costs and uncertainties about future project value. Under these circumstances, governmental support to attract private funding is necessary to implement green building projects. The objective of this study is to suggest a financing model for facilitating green building projects with a governmental guarantee based on Certified Emission Reduction (CER). In this model, the government provides a guarantee for the increased costs of a green building project in return for CER. The study also presents the validation of the model as well as its feasibility for implementing green building projects. In addition, the suggested model assumed governmental guarantees for the increased cost, but private guarantees seem to be feasible as well because of the promising value of the guarantee from CER. To do this, certification under the Clean Development Mechanism (CDM) for green buildings must be obtained. PMID:24376379

  8. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen; Guo, Siyue

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
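
    The stochastic structure described above (a binomial draw for how many occupants stay late, exponential draws for how long) can be sampled directly to produce overtime occupancy schedules. The parameter values below are assumptions for illustration, not the values fitted to the measured data.

      # Binomial draw for how many occupants work overtime, exponential draws for how long.
      import numpy as np

      rng = np.random.default_rng(3)

      def sample_overtime_evening(n_occupants=100, p_overtime=0.2, mean_duration_h=1.5,
                                  end_of_workday_h=18.0, step_h=0.25):
          """Return times and the fractional occupancy profile for one evening."""
          n_over = rng.binomial(n_occupants, p_overtime)        # how many stay late
          durations = rng.exponential(mean_duration_h, n_over)  # how long each one stays
          times = np.arange(end_of_workday_h, 24.0, step_h)
          frac = np.array([(durations > (t - end_of_workday_h)).sum() / n_occupants
                           for t in times])
          return times, frac

      times, occ = sample_overtime_evening()
      for t, f in zip(times[:8], occ[:8]):
          print(f"{t:05.2f} h  occupancy fraction {f:.2f}")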

  9. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  10. Developing Scalable Information Security Systems

    Directory of Open Access Journals (Sweden)

    Valery Konstantinovich Ablekov

    2013-06-01

    Full Text Available Existing physical security systems have a wide range of shortcomings, including high cost, a large number of vulnerabilities, and problems with modification and support. This paper addresses the problem of developing systems without these drawbacks. The paper presents the architecture of an information security system that operates over the TCP/IP network protocol, including the ability to connect different types of devices and to integrate with existing security systems. The main advantage is a significant increase in system reliability and scalability, both vertical and horizontal, with minimal cost in both financial and time resources.

  11. An Empirical Validation of Building Simulation Software for Modelling of Double-Skin Facade (DSF)

    DEFF Research Database (Denmark)

    Larsen, Olena Kalyanova; Heiselberg, Per; Felsmann, Clemens

    2009-01-01

    Double-skin facade (DSF) buildings are being built as an attractive, innovative and energy efficient solution. Nowadays, several design tools are used for assessment of thermal and energy performance of DSF buildings. Existing design tools are well-suited for performance assessment of conventional buildings, but their accuracy might be limited in cases with DSFs because of the complexity of the heat and mass transfer processes within the DSF. To address this problem, an empirical validation of building models with DSF, performed with various building simulation tools (ESP-r, IDA ICE 3.0, VA114, TRNSYS-TUD and BSim), was carried out in the framework of IEA SHC Task 34/ECBCS Annex 43 "Testing and Validation of Building Energy Simulation Tools". The experimental data for the validation was gathered in a full-scale outdoor test facility. The empirical data sets comprise the key-functioning modes

  12. Reducing failures rate within the project documentation using Building Information Modelling, especially Level of Development

    Directory of Open Access Journals (Sweden)

    Prušková Kristýna

    2018-01-01

    Full Text Available The paper's focus is on the differences between traditional modelling in 2D software and modelling within BIM technology. The research uncovers failures connected to the traditional way of designing and constructing project documentation, and reveals mismatches within the project documentation. A solution based on Building Information Modelling technology is outlined. As a reference, the experience of designing a specific building is used, with the project documentation constructed in both ways: through traditional modelling and through BIM technology, especially using Level of Development. The output of this paper points to the benefits of using advanced technology in building design, namely Building Information Modelling and especially Level of Development, which leads to a reduced failure rate within the project documentation.

  13. Building Energy Modeling and Control Methods for Optimization and Renewables Integration

    Science.gov (United States)

    Burger, Eric M.

    This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. This model is capable of approximating unmodeled
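
    As an illustration of the recursive parameter estimation mentioned above, the sketch below runs recursive least squares with a forgetting factor on a one-node discrete thermal model. The model form and all parameter values are assumptions, not the dissertation's actual TCL formulation.

      # Recursive least squares with forgetting on a one-node discrete zone model:
      #   T[k+1] = a*T[k] + b*Tout[k] + c*u[k]   (b = 1 - a in this synthetic example)
      import numpy as np

      rng = np.random.default_rng(4)
      a_true, b_true, c_true = 0.90, 0.10, 0.05
      theta = np.zeros(3)              # online estimate of [a, b, c]
      P = np.eye(3) * 1e3              # covariance of the estimate
      lam = 0.99                       # forgetting factor
      T = 21.0

      for k in range(500):
          T_out = 5.0 + 3.0 * np.sin(k / 20.0)        # outdoor temperature [degC]
          u = rng.uniform(0.0, 10.0)                  # HVAC thermal power [kW], excitation
          T_next = a_true * T + b_true * T_out + c_true * u + rng.normal(0, 0.05)

          phi = np.array([T, T_out, u])               # regressor vector
          err = T_next - phi @ theta                  # one-step prediction error
          gain = P @ phi / (lam + phi @ P @ phi)      # RLS gain
          theta = theta + gain * err
          P = (P - np.outer(gain, phi) @ P) / lam
          T = T_next

      print("estimated [a, b, c]:", np.round(theta, 3))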

  14. Scalable, full-colour and controllable chromotropic plasmonic printing

    Science.gov (United States)

    Xue, Jiancai; Zhou, Zhang-Kai; Wei, Zhiqiang; Su, Rongbin; Lai, Juan; Li, Juntao; Li, Chao; Zhang, Tengwei; Wang, Xue-Hua

    2015-01-01

    Plasmonic colour printing has drawn wide attention as a promising candidate for next-generation colour-printing technology. However, an efficient approach to realize full colour and scalable fabrication is still lacking, which prevents plasmonic colour printing from reaching practical applications. Here we present a scalable and full-colour plasmonic printing approach by combining conjugate twin-phase modulation with a plasmonic broadband absorber. More importantly, our approach also demonstrates controllable chromotropic capability, that is, the ability to perform reversible colour transformations. This chromotropic capability affords enormous potential for building functionalized prints for anticounterfeiting, special labels, and high-density data encryption storage. With such excellent performance in functional colour applications, this colour-printing approach could pave the way for plasmonic colour printing in real-world commercial utilization. PMID:26567803

  15. Scalable power selection method for wireless mesh networks

    CSIR Research Space (South Africa)

    Olwal, TO

    2009-01-01

    Full Text Available This paper addresses the problem of a scalable dynamic power control (SDPC) for wireless mesh networks (WMNs) based on IEEE 802.11 standards. An SDPC model that accounts for architectural complexities witnessed in multiple radios and hops...

  16. Scalability and efficiency of genetic algorithms for geometrical applications

    NARCIS (Netherlands)

    Dijk, van S.F.; Thierens, D.; Berg, de M.; Schoenauer, M.

    2000-01-01

    We study the scalability and efficiency of a GA that we developed earlier to solve the practical cartographic problem of labeling a map with point features. We argue that the special characteristics of our GA make it fit in well with theoretical models predicting the optimal population size

  17. A Massively Scalable Architecture for Instant Messaging & Presence

    NARCIS (Netherlands)

    Schippers, Jorrit; Remke, Anne Katharina Ingrid; Punt, Henk; Wegdam, M.; Haverkort, Boudewijn R.H.M.; Thomas, N.; Bradley, J.; Knottenbelt, W.; Dingle, N.; Harder, U.

    2010-01-01

    This paper analyzes the scalability of Instant Messaging & Presence (IM&P) architectures. We take a queueing-based modelling and analysis approach to find the bottlenecks of the current IM&P architecture at the Dutch social network Hyves, as well as of alternative architectures. We use the
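
    A generic illustration (not the paper's actual Hyves model) of the kind of queueing calculation used to locate bottlenecks: an M/M/c approximation of one IM&P server tier, with the arrival rate, service rate and number of workers chosen arbitrarily.

      # M/M/c approximation of one server tier: probability of waiting (Erlang C)
      # and mean response time.
      from math import factorial

      def erlang_c(arrival_rate, service_rate, servers):
          """Probability that an arriving request has to wait."""
          a = arrival_rate / service_rate              # offered load
          rho = a / servers                            # utilization, must be < 1
          summed = sum(a ** k / factorial(k) for k in range(servers))
          top = a ** servers / (factorial(servers) * (1 - rho))
          return top / (summed + top)

      def mean_response_time(arrival_rate, service_rate, servers):
          pw = erlang_c(arrival_rate, service_rate, servers)
          wq = pw / (servers * service_rate - arrival_rate)    # mean time spent waiting
          return wq + 1.0 / service_rate

      if __name__ == "__main__":
          # e.g. 900 presence updates/s, 10 ms mean service time, 12 worker threads (assumed)
          rt = mean_response_time(arrival_rate=900.0, service_rate=100.0, servers=12)
          print(f"mean response time: {1000 * rt:.1f} ms")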

  18. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks or guidelines, are described in a public health or healthcare setting or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, and for categorizing theoretical foundations according to which theory, model and/or framework was used and whether the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners

  19. Evaluation of the Effective Moisture Penetration Depth Model for Estimating Moisture Buffering in Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Woods, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Winkler, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-01-01

    This study examines the effective moisture penetration depth (EMPD) model and its suitability for building simulations. The EMPD model is a compromise between the simple, inaccurate effective capacitance approach and the complex, yet accurate, finite-difference approach. Two formulations of the EMPD model were examined, including the model used in the EnergyPlus building simulation software. An error we uncovered in the EMPD model was fixed with the release of EnergyPlus version 7.2; the EMPD model in earlier versions of EnergyPlus should not be used.
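
    A much-simplified sketch of the moisture-buffering idea behind EMPD: the zone air exchanges moisture with a thin effective material layer whose storage capacity lumps density, penetration depth and sorption-curve slope into a single assumed coefficient. This is not the EnergyPlus formulation; all values are placeholders.

      # Zone moisture balance coupled to a single "effective" buffer layer.
      V, A, rho = 50.0, 40.0, 1.2       # zone volume [m3], buffer area [m2], air density [kg/m3]
      q_vent = 0.5 * V                   # ventilation airflow at 0.5 ACH [m3/h]
      h_m = 10.8                         # moisture transfer coefficient [m/h] (assumed)
      c_buf = 0.5                        # buffer capacity [kg water / (m2 * unit humidity ratio)]

      w_out, w, w_s = 0.004, 0.008, 0.008   # humidity ratios [kg/kg]: outdoor, zone, buffer layer
      dt, hours = 0.01, 24.0
      profile = []
      for k in range(int(hours / dt)):
          t = k * dt
          gen = 0.6 if 7.0 <= t < 9.0 else 0.05      # kg/h, morning moisture generation peak
          flux = rho * h_m * A * (w - w_s)           # kg/h of moisture into the buffer layer
          w += dt * (gen + rho * q_vent * (w_out - w) - flux) / (rho * V)
          w_s += dt * flux / (c_buf * A)
          profile.append(w)
      print(f"zone humidity ratio over the day: min {min(profile):.4f}, max {max(profile):.4f} kg/kg")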

  20. Modeling of Heat Transfer in Rooms in the Modelica "Buildings" Library

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nouidui, Thierry Stephane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-11-01

    This paper describes the implementation of the room heat transfer model in the free, open-source Modelica "Buildings" library. The model can be used as a single room or to compose a multizone building model. We discuss how the model is decomposed into submodels for the individual heat transfer phenomena. We also discuss the main physical assumptions. The room model can be parameterized to use different modeling assumptions, leading to linear or non-linear differential algebraic systems of equations. We present numerical experiments that show how these assumptions affect computing time and accuracy for selected cases of the ANSI/ASHRAE Standard 140-2007 envelope validation tests.
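
    A small numerical illustration of the linear-versus-non-linear trade-off mentioned above: surface radiative exchange evaluated with the full T^4 law and with a linearized radiative coefficient. Emissivity and temperatures are placeholders.

      # Surface radiative exchange: full T^4 law vs. a linearized radiative coefficient.
      SIGMA = 5.670e-8        # Stefan-Boltzmann constant [W/(m2 K4)]
      EPS = 0.9               # surface emissivity (assumed)

      def q_nonlinear(t_hot_c, t_cold_c):
          th, tc = t_hot_c + 273.15, t_cold_c + 273.15
          return EPS * SIGMA * (th ** 4 - tc ** 4)             # W/m2

      def q_linearized(t_hot_c, t_cold_c, t_ref_c=20.0):
          h_rad = 4.0 * EPS * SIGMA * (t_ref_c + 273.15) ** 3  # linearized coefficient [W/(m2 K)]
          return h_rad * (t_hot_c - t_cold_c)

      for dT in (2.0, 10.0, 30.0):
          qn, ql = q_nonlinear(20.0 + dT, 20.0), q_linearized(20.0 + dT, 20.0)
          print(f"dT = {dT:4.1f} K   nonlinear {qn:6.1f} W/m2   linearized {ql:6.1f} W/m2")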