WorldWideScience

Sample records for building scalable models

  1. Building scalable apps with Redis and Node.js

    CERN Document Server

    Johanan, Joshua

    2014-01-01

    If the phrase scalability sounds alien to you, then this is an ideal book for you. You will not need much Node.js experience as each framework is demonstrated in a way that requires no previous knowledge of the framework. You will be building scalable Node.js applications in no time! Knowledge of JavaScript is required.

  2. Scalability of Sustainable Business Models in Hybrid Organizations

    Directory of Open Access Journals (Sweden)

    Adam Jabłoński

    2016-02-01

The dynamics of change in modern business create new mechanisms for company management to determine their pursuit and achievement of high performance. Performance maintained over a long period of time becomes a source of business continuity for companies. A business model that can generate results in every possible market situation, and that has the feature of permanent adaptability, is the ontological construct that enables the adoption of such assumptions. The feature that describes the adaptability of a business model is its scalability. Being a factor that ensures more, and more efficient, work with an increasing number of components, scalability can be applied to the concept of business models as the company’s ability to maintain similar or higher performance through it. Ensuring the company’s performance in the long term helps to build a so-called sustainable business model, one that often balances the objectives of stakeholders and shareholders and that is created by the implemented principles of value-based management and corporate social responsibility. This perception of business paves the way for building hybrid organizations that integrate business activities with pro-social ones. Combining an approach typical of hybrid organizations with the design and implementation of sustainable business models according to the scalability criterion seems interesting from a cognitive point of view. Today, hybrid organizations are great spaces for building effective and efficient mechanisms for dialogue between business and society. This requires an appropriate business model. The purpose of the paper is to present the conceptualization and operationalization of the scalability of sustainable business models that determine the performance of a hybrid organization in the network environment. The paper presents an original concept for applying scalability in sustainable business models with detailed

  3. A Scalability Model for ECS's Data Server

    Science.gov (United States)

    Menasce, Daniel A.; Singhal, Mukesh

    1998-01-01

This report presents, in four chapters, a model for the scalability analysis of the Data Server subsystem of the Earth Observing System Data and Information System (EOSDIS) Core System (ECS). The model analyzes whether the planned architecture of the Data Server will support an increase in the workload with the possible upgrade and/or addition of processors, storage subsystems, and networks. The report includes a summary of the architecture of ECS's Data Server as well as a high-level description of the Ingest and Retrieval operations as they relate to ECS's Data Server. This description forms the basis for the development of the scalability model of the Data Server and the methodology used to solve it.
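The kind of "will more processors keep response times acceptable as workload grows?" question the report studies can be illustrated with a textbook M/M/m queueing sketch. This is a generic analytic model, not the report's actual ECS model; the workload numbers are hypothetical.

```python
import math

def erlang_c(servers, arrival_rate, service_rate):
    """Probability an arriving request must queue in an M/M/m system (Erlang C)."""
    a = arrival_rate / service_rate          # offered load
    rho = a / servers                        # per-server utilization, must be < 1
    assert rho < 1.0, "system is unstable at this workload"
    tail = (a ** servers / math.factorial(servers)) / (1.0 - rho)
    head = sum(a ** k / math.factorial(k) for k in range(servers))
    return tail / (head + tail)

def mean_response_time(servers, arrival_rate, service_rate):
    """Mean time a request spends queued plus in service."""
    pw = erlang_c(servers, arrival_rate, service_rate)
    return pw / (servers * service_rate - arrival_rate) + 1.0 / service_rate

# Hypothetical workload: 3 requests/s arriving, each server completes 1 request/s.
# Adding processors shrinks mean response time toward the bare service time 1/mu.
for m in (4, 6, 8):
    print(m, round(mean_response_time(m, 3.0, 1.0), 3))
```

A scalability analysis then asks at what workload growth the curve blows up for a given upgrade plan.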

  4. The Concept of Business Model Scalability

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten

    2015-01-01

The power of business models lies in their ability to visualize and clarify how firms may configure their value creation processes. Among the key aspects of business model thinking are a focus on what the customer values, how this value is best delivered to the customer and how strategic partners are leveraged in this value creation, delivery and realization exercise. Central to the mainstream understanding of business models is the value proposition towards the customer, and the hypothesis generated is that if the firm delivers to the customer what he/she requires, then there is a good... The paper discusses the term scalability from a company-level perspective and illustrates how managers should be using this term for the benefit of their business by focusing on business models capable of achieving exponentially increasing returns to scale.

  5. Model Building

    OpenAIRE

    Frampton, Paul H.

    1997-01-01

    In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly...

  6. Scalable Text Mining with Sparse Generative Models

    OpenAIRE

    Puurula, Antti

    2016-01-01

    The information age has brought a deluge of data. Much of this is in text form, insurmountable in scope for humans and incomprehensible in structure for computers. Text mining is an expanding field of research that seeks to utilize the information contained in vast document collections. General data mining methods based on machine learning face challenges with the scale of text data, posing a need for scalable text mining methods. This thesis proposes a solution to scalable text mining: gener...

  7. Building a Distributed Infrastructure for Scalable Triple Stores

    Institute of Scientific and Technical Information of China (English)

    Jing Zhou; Wendy Hall; David De Roure

    2009-01-01

    Built specifically for the Semantic Web, triple stores are required to accommodate a large number of RDF triples and remain primarily centralized. As triple stores grow and evolve with time, there is a demanding need for scalable techniques to remove resource and performance bottlenecks in such systems. To this end, we propose a fully decentralized peer-to-peer architecture for large scale triple stores in which triples are maintained by individual stakeholders, and a semantics-directed search protocol, mediated by topology reorganization, for locating triples of interest. We test our design through simulations and the results show anticipated improvements over existing techniques for distributed triple stores. In addition to engineering future large scale triple stores, our work will in particular benefit the federation of stand-alone triple stores of today to achieve desired scalability.

  8. Building a scalable event-level metadata service for ATLAS

    International Nuclear Information System (INIS)

The ATLAS TAG Database is a multi-terabyte event-level metadata selection system, intended to allow discovery, selection of and navigation to events of interest to an analysis. The TAG Database encompasses file- and relational-database-resident event-level metadata, distributed across all ATLAS Tiers. A global TAG relational database containing all ATLAS events, implemented in Oracle, will exist at Tier 0. Implementing a system that is both performant and manageable at this scale is a challenge. A 1 TB relational TAG Database has been deployed at Tier 0 using simulated tag data. The database contains one billion events, each described by two hundred event metadata attributes, and is currently undergoing extensive testing in terms of queries, population and manageability. These 1 TB tests aim to demonstrate and optimise the performance and scalability of an Oracle TAG Database on a global scale. Partitioning and indexing strategies are crucial to well-performing queries and manageability of the database and have implications for database population and distribution, so these are investigated. Physics query patterns are anticipated, but a crucial feature of the system must be to support a broad range of queries across all attributes. Concurrently, event tags from ATLAS Computing System Commissioning distributed simulations are accumulated in an Oracle-hosted database at CERN, providing an event-level selection service valuable for user experience and for gathering information about physics query patterns. In this paper we describe the status of the Global TAG relational database scalability work and highlight areas of future direction

  9. Scalable learning of probabilistic latent models for collaborative filtering

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2015-01-01

    Collaborative filtering has emerged as a popular way of making user recommendations, but with the increasing sizes of the underlying databases scalability is becoming a crucial issue. In this paper we focus on a recently proposed probabilistic collaborative filtering model that explicitly represe...... favorable behavior in relation to cold-start situations....

  10. Semantic Models for Scalable Search in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Dennis Pfisterer

    2013-03-01

The Internet of Things is anticipated to connect billions of embedded devices equipped with sensors to perceive their surroundings. Thereby, the state of the real world will be available online and in real time, and can be combined with other data and services in the Internet to realize novel applications such as Smart Cities, Smart Grids, or Smart Healthcare. This requires an open representation of sensor data and scalable search over data from diverse sources, including sensors. In this paper we show how the Semantic Web technologies RDF (an open semantic data format) and SPARQL (a query language for RDF-encoded data) can be used to address those challenges. In particular, we describe how prediction models can be employed for scalable sensor search, how these prediction models can be encoded as RDF, and how the models can be queried by means of SPARQL.
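The core idea of prediction-model-based sensor search can be sketched in plain Python. The paper encodes such models in RDF and queries them with SPARQL; the class names, tolerances and readings below are illustrative assumptions, not the paper's design:

```python
# Sketch: use cheap per-sensor prediction models to prune the set of sensors
# that must actually be contacted when searching for a current value.

class SensorPrediction:
    """A trivial prediction model: expected value plus an uncertainty band."""
    def __init__(self, sensor_id, expected, tolerance):
        self.sensor_id = sensor_id
        self.expected = expected
        self.tolerance = tolerance

    def could_match(self, target):
        # Local test that avoids contacting the sensor at all.
        return abs(target - self.expected) <= self.tolerance

def search(models, target, read_sensor, eps=0.5):
    """Contact only sensors whose prediction model admits the target value."""
    candidates = (m for m in models if m.could_match(target))
    return [m.sensor_id for m in candidates
            if abs(read_sensor(m.sensor_id) - target) <= eps]

# Usage: find sensors currently reading ~21 degrees without polling all of them.
models = [SensorPrediction("s1", 21.0, 2.0),
          SensorPrediction("s2", 35.0, 1.0),
          SensorPrediction("s3", 20.0, 3.0)]
live = {"s1": 21.2, "s2": 35.4, "s3": 24.0}
print(search(models, 21.0, live.get))
```

The scalability win is that only the candidate set ("s1" and "s3" here) is ever polled; "s2" is excluded by its model alone.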

  11. A Scalable Prescriptive Parallel Debugging Model

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Quarfot Nielsen, Niklas; Lee, Gregory L.;

    2015-01-01

Debugging is a critical step in the development of any parallel program. However, the traditional interactive debugging model, where users manually step through code and inspect their application, does not scale well even for current supercomputers due to its centralized nature. While lightweight...

  12. Optimal, scalable forward models for computing gravity anomalies

    International Nuclear Information System (INIS)

We describe three approaches for computing a gravity signal from a density anomaly. These include the classical summation method and two methods which solve the Poisson equation for the gravitational potential using either Finite Elements (FE) coupled with a multilevel preconditioner, or the Fast Multipole Method (FMM). The methods utilising the PDE formulation described here differ from previously published approaches in the gravity modelling literature in that they are optimal, implying that the memory and computational time required both scale linearly with respect to the number of unknowns in the potential field. All of the implementations presented here are developed to run on massively parallel, distributed memory computers. Through numerical experiments, we compare the methods on the basis of their discretisation error, CPU time and parallel scalability. We demonstrate the parallel scalability of all these techniques by running forward models with up to 10^8 voxels on thousands of cores. (author)
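The classical summation method mentioned first is simple to sketch: the vertical gravity at an observation point is the direct sum of point-mass contributions from every density voxel. The sign convention (z positive downward) is an assumption here, not taken from the paper; the brute-force cost of this sum is exactly what motivates the optimal PDE-based alternatives.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gz_summation(voxels, obs):
    """Classical summation: vertical gravity at `obs` from density voxels.

    voxels: iterable of (x, y, z, density, volume), z positive downward.
    Cost is O(n_voxels) per observation point, so n_obs * n_voxels overall.
    """
    gz = 0.0
    ox, oy, oz = obs
    for x, y, z, rho, vol in voxels:
        dx, dy, dz = x - ox, y - oy, z - oz
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        gz += G * rho * vol * dz / r ** 3
    return gz

# One dense voxel (rho = 2700 kg/m^3, volume 1000 m^3) buried 100 m directly
# below the observation point behaves like a point mass: gz = G * m / d^2.
print(gz_summation([(0.0, 0.0, 100.0, 2700.0, 1000.0)], (0.0, 0.0, 0.0)))
```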

  13. Building Models and Building Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Kaj Asbjørn; Skauge, Jørn

...theoretical basis for the chapters that have a more theoretical content. The following appendices B-D contain further characteristics of the two modelling CAD programs, ArchiCAD and Architectural Desktop, together with a comparison of the two tools. The remaining two appendices describe the particular issues concerning the modelling of the two "Sorthøjparken" models, and the resulting models are presented and evaluated. The complete report is published on the project's website: www.iprod.aau.dk/bygit/Web3B/ under Technical Reports....

  14. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the necessary understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These building blocks provide common performance tool functionality and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open-source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet flexible enough to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of

  15. MR-Tree - A Scalable MapReduce Algorithm for Building Decision Trees

    Directory of Open Access Journals (Sweden)

    Vasile PURDILĂ

    2014-03-01

Learning decision trees over very large amounts of data is not practical on single-node computers due to the huge amount of computation required by this process. Apache Hadoop is a large-scale distributed computing platform that runs on commodity hardware clusters and can be used successfully for data mining tasks on very large datasets. This work presents a parallel decision tree learning algorithm expressed in the MapReduce programming model that runs on the Apache Hadoop platform and has very good scalability with dataset size.
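The map/reduce decomposition behind such an algorithm can be sketched in plain Python. This is an illustrative toy, not the paper's MR-Tree: it assumes categorical features, emits per-(feature, value, label) counts from each data partition in the map phase, aggregates them in the reduce phase, and lets a driver score candidate splits by weighted Gini impurity.

```python
from collections import Counter, defaultdict
from itertools import chain

def map_phase(partition):
    """Emit ((feature index, feature value, label), 1) for every attribute."""
    for features, label in partition:
        for i, v in enumerate(features):
            yield (i, v, label), 1

def reduce_phase(pairs):
    """Sum the counts per key, as a Hadoop reducer would."""
    counts = defaultdict(int)
    for key, c in pairs:
        counts[key] += c
    return counts

def gini(counter):
    n = sum(counter.values())
    return 1.0 - sum((c / n) ** 2 for c in counter.values())

def best_split(counts):
    """Pick the categorical feature with the lowest weighted Gini impurity."""
    by_feature = defaultdict(lambda: defaultdict(Counter))
    for (i, v, label), c in counts.items():
        by_feature[i][v][label] += c
    best, best_score = None, float("inf")
    for i, by_value in by_feature.items():
        total = sum(sum(cnt.values()) for cnt in by_value.values())
        score = sum(sum(cnt.values()) / total * gini(cnt)
                    for cnt in by_value.values())
        if score < best_score:
            best, best_score = i, score
    return best, best_score

# Two "partitions" of a toy dataset where feature 0 perfectly separates classes.
p1 = [(("a", "x"), 0), (("b", "x"), 1)]
p2 = [(("a", "y"), 0), (("b", "y"), 1)]
counts = reduce_phase(chain(map_phase(p1), map_phase(p2)))
print(best_split(counts))  # feature 0 gives a pure split
```

Only the small count table, never the raw records, crosses the network, which is what makes the approach scale with dataset size.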

  16. ANALYZING AVIATION SAFETY REPORTS: FROM TOPIC MODELING TO SCALABLE MULTI-LABEL CLASSIFICATION

    Data.gov (United States)

National Aeronautics and Space Administration — Analyzing Aviation Safety Reports: From Topic Modeling to Scalable Multi-Label Classification. Amrudin Agovic, Hanhuai Shan, and Arindam Banerjee. Abstract: The...

  17. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  18. A Framework for Modelling and Analysis of Software Systems Scalability

    OpenAIRE

    Duboc, L.; Rosenblum, D. S.; Wicks, T.

    2006-01-01

Scalability is a widely-used term in scientific papers, technical magazines and software descriptions. Its use in the most varied contexts contributes to a general confusion about what the term really means. This lack of consensus is a potential source of problems, as assumptions are made in the face of a scalability claim. A clearer and widely-accepted understanding of scalability is required to restore the usefulness of the term. This research investigates commonly found definitions of scala...

  19. Optimal, scalable forward models for computing gravity anomalies

    CERN Document Server

    May, Dave A

    2011-01-01

    We describe three approaches for computing a gravity signal from a density anomaly. The first approach consists of the classical "summation" technique, whilst the remaining two methods solve the Poisson problem for the gravitational potential using either a Finite Element (FE) discretization employing a multilevel preconditioner, or a Green's function evaluated with the Fast Multipole Method (FMM). The methods utilizing the PDE formulation described here differ from previously published approaches used in gravity modeling in that they are optimal, implying that both the memory and computational time required scale linearly with respect to the number of unknowns in the potential field. Additionally, all of the implementations presented here are developed such that the computations can be performed in a massively parallel, distributed memory computing environment. Through numerical experiments, we compare the methods on the basis of their discretization error, CPU time and parallel scalability. We demonstrate t...

  20. Developing a scalable modeling architecture for studying survivability technologies

    Science.gov (United States)

    Mohammad, Syed; Bounker, Paul; Mason, James; Brister, Jason; Shady, Dan; Tucker, David

    2006-05-01

    To facilitate interoperability of models in a scalable environment, and provide a relevant virtual environment in which Survivability technologies can be evaluated, the US Army Research Development and Engineering Command (RDECOM) Modeling Architecture for Technology Research and Experimentation (MATREX) Science and Technology Objective (STO) program has initiated the Survivability Thread which will seek to address some of the many technical and programmatic challenges associated with the effort. In coordination with different Thread customers, such as the Survivability branches of various Army labs, a collaborative group has been formed to define the requirements for the simulation environment that would in turn provide them a value-added tool for assessing models and gauge system-level performance relevant to Future Combat Systems (FCS) and the Survivability requirements of other burgeoning programs. An initial set of customer requirements has been generated in coordination with the RDECOM Survivability IPT lead, through the Survivability Technology Area at RDECOM Tank-automotive Research Development and Engineering Center (TARDEC, Warren, MI). The results of this project are aimed at a culminating experiment and demonstration scheduled for September, 2006, which will include a multitude of components from within RDECOM and provide the framework for future experiments to support Survivability research. This paper details the components with which the MATREX Survivability Thread was created and executed, and provides insight into the capabilities currently demanded by the Survivability faculty within RDECOM.

  1. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing.
geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  2. A framework for building scalable web applications for high-resolution cluster-based display walls

    OpenAIRE

    Tang, Jason

    2015-01-01

    As technology advances, researchers in the natural sciences collect ever-increasing amounts of data. While computer science research often focuses on effective ways to perform computations on large data sets, the visualization of large data sets can be just as important for achieving new insights. Just as cluster computing enables scalable computation on large data sets, so can cluster-based display walls enable scalable visualization of large data sets. At the same time, visualization ...

  3. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  4. Semantic Models for Scalable Search in the Internet of Things

    OpenAIRE

    Dennis Pfisterer; Kay Römer; Richard Mietz; Sven Groppe

    2013-01-01

    The Internet of Things is anticipated to connect billions of embedded devices equipped with sensors to perceive their surroundings. Thereby, the state of the real world will be available online and in real-time and can be combined with other data and services in the Internet to realize novel applications such as Smart Cities, Smart Grids, or Smart Healthcare. This requires an open representation of sensor data and scalable search over data from diverse sources including sensors. In this paper...

  5. Building a Model Astrolabe

    CERN Document Server

    Ford, Dominic

    2012-01-01

This paper presents a hands-on introduction to the medieval astrolabe, based around a working model which can be constructed from photocopies of the supplied figures. As well as describing how to assemble the model, I also provide a brief explanation of how each of its various parts might be used. The printed version of this paper includes only the parts needed to build a single model prepared for use at latitudes around 52°N, but an accompanying electronic file archive includes equivalent images which can be used to build models prepared for use at any other latitude. The vector graphics scripts used to generate the models are also available for download, allowing customised astrolabes to be made.

  6. A Scalable Blocked Gibbs Sampling Algorithm For Gaussian And Poisson Regression Models

    OpenAIRE

    Johnson, Nicholas A.; Kuehnel, Frank O.; Amini, Ali Nasiri

    2016-01-01

Markov Chain Monte Carlo (MCMC) methods are a popular technique in Bayesian statistical modeling. They have long been used to obtain samples from posterior distributions, but recent research has focused on the scalability of these techniques for large problems. We do not develop new sampling methods but instead describe a blocked Gibbs sampler which is sufficiently scalable to accommodate many interesting problems. The sampler we describe applies to a restricted subset of the Generalized Linea...
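The blocked-Gibbs idea can be illustrated on a deliberately tiny instance: Bayesian regression y = b·x + noise, alternating between the coefficient block (a Gaussian conditional) and the noise variance block (an inverse-gamma conditional). This toy sketch and all its prior parameters are assumptions for illustration, not the paper's production sampler.

```python
import math
import random

def gibbs_linear(xs, ys, iters=2000, burn=500, tau2=100.0, a0=2.0, b0=1.0):
    """Toy two-block Gibbs sampler for y = b*x + Gaussian noise.

    Prior: b ~ N(0, tau2), sigma^2 ~ InvGamma(a0, b0).
    Returns the posterior mean of b from the post-burn-in draws.
    """
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    n = len(xs)
    b, sigma2 = 0.0, 1.0
    draws = []
    for it in range(iters):
        # Block 1: b | sigma^2, data  ~  Normal(mean, 1/precision)
        prec = sxx / sigma2 + 1.0 / tau2
        mean = (sxy / sigma2) / prec
        b = random.gauss(mean, math.sqrt(1.0 / prec))
        # Block 2: sigma^2 | b, data  ~  InvGamma(a0 + n/2, b0 + SSR/2),
        # sampled as the reciprocal of a Gamma draw.
        ssr = sum((y - b * x) ** 2 for x, y in zip(xs, ys))
        g = random.gammavariate(a0 + n / 2.0, 1.0 / (b0 + ssr / 2.0))
        sigma2 = 1.0 / g
        if it >= burn:
            draws.append(b)
    return sum(draws) / len(draws)

# Synthetic data with true slope 2 and noise standard deviation 0.5.
random.seed(0)
xs = [i / 10.0 for i in range(1, 51)]
ys = [2.0 * x + random.gauss(0.0, 0.5) for x in xs]
print(gibbs_linear(xs, ys))  # posterior mean of b, close to 2
```

Sampling whole blocks from their exact conditionals, rather than one scalar at a time, is what makes the chain mix well enough to scale.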

  7. Exploratory Model Building

    OpenAIRE

    Bhatnagar, Raj

    2013-01-01

    Some instances of creative thinking require an agent to build and test hypothetical theories. Such a reasoner needs to explore the space of not only those situations that have occurred in the past, but also those that are rationally conceivable. In this paper we present a formalism for exploring the space of conceivable situation-models for those domains in which the knowledge is primarily probabilistic in nature. The formalism seeks to construct consistent, minimal, and desirable situation-d...

  8. RF CMOS modeling: a scalable model of RF-MOSFET with different numbers of fingers

    Energy Technology Data Exchange (ETDEWEB)

    Yu Yuning; Sun Lingling; Liu Jun, E-mail: yuyuning126@126.com [Key Laboratory of RF Circuits and Systems of Ministry of Education, Hangzhou Dianzi University, Hangzhou 310018 (China)

    2010-11-15

A novel scalable model for multi-finger RF MOSFET modeling is presented. All the parasitic components, including gate resistance, substrate resistance and wiring capacitance, are directly determined from the layout. The model is further verified using a standard 0.13 µm RF CMOS process with nMOSFETs of different numbers of gate fingers, with the per-finger gate width fixed at 2.5 µm and the gate length at 0.13 µm. Excellent agreement between measured and simulated S-parameters from 100 MHz to 20 GHz demonstrates the validity of this model.
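Why finger count matters for gate resistance can be seen from the standard first-order distributed-gate formula. This is the textbook approximation, not the paper's layout-extracted model, and the sheet-resistance value below is an assumed example:

```python
def gate_resistance(r_sheet, w_total, n_fingers, l_gate, alpha=1.0 / 3.0):
    """First-order distributed gate resistance of a multi-finger MOSFET.

    Each finger of width w_total/n_fingers contributes
    alpha * r_sheet * w_finger / l_gate, and the n_fingers gate stripes are
    wired in parallel, so Rg scales as 1/n_fingers^2. alpha = 1/3 assumes
    the gate is contacted on one side only.
    """
    w_finger = w_total / n_fingers
    r_finger = alpha * r_sheet * w_finger / l_gate
    return r_finger / n_fingers

# Total width 10 um, Lg = 0.13 um, assumed poly sheet resistance 5 ohm/sq:
for n in (1, 2, 4, 8):
    print(n, gate_resistance(5.0, 10e-6, n, 0.13e-6))
```

The quadratic 1/n² improvement is the reason RF layouts split a wide device into many short fingers, and why a scalable model must track the finger count explicitly.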

  9. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model
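The grid-independent conceptual model is easy to sketch: the subsurface is described as a function of physical coordinates, and any grid, coarse or fine, merely samples it. The layered-aquifer values below are a hypothetical illustration, not the LLNL site model:

```python
def conductivity(x, z):
    """Hypothetical layered conceptual model (m/day): a permeable aquifer
    above 20 m depth over near-impermeable bedrock. The model is a function
    of physical coordinates, not of any particular grid."""
    return 10.0 if z < 20.0 else 0.01

def discretize(nx, nz, lx=100.0, lz=40.0):
    """Sample the same conceptual model onto a grid of any resolution,
    evaluating at cell centers."""
    dx, dz = lx / nx, lz / nz
    return [[conductivity((i + 0.5) * dx, (j + 0.5) * dz)
             for i in range(nx)] for j in range(nz)]

coarse = discretize(10, 4)     # workstation-sized grid
fine = discretize(100, 40)     # the same conceptual model, 100x more cells
print(len(coarse) * len(coarse[0]), len(fine) * len(fine[0]))
```

Refining the grid never requires touching the conceptual model, which is the separation of concerns the abstract advocates.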

  10. A simple, scalable and low-cost method to generate thermal diagnostics of a domestic building

    International Nuclear Information System (INIS)

Highlights: • Our diagnostic method uses a single field measurement from a temperature logger. • Building technical performance and occupant behaviour are addressed simultaneously. • Our algorithm learns a thermal model of a home and diagnoses the heating system. • We propose a novel clustering approach to decouple user behaviour from technical performance. • Our diagnostic confidence is enhanced using a large scale deployment. - Abstract: Traditional approaches to understanding the problem of energy performance in the domestic sector include on-site surveys by energy assessors and the installation of complex home energy monitoring systems. The time and money that need to be invested by the occupants, and the form of feedback generated by these approaches, often make them unattractive to householders. This paper demonstrates a simple, low-cost method that generates thermal diagnostics for dwellings, measuring only one field dataset: internal temperature over a period of 1 week. A thermal model, which is essentially a learning algorithm, generates a set of thermal diagnostics about the primary heating system, the occupants’ preferences and the impact of certain interventions, such as lowering the thermostat set-point. A simple clustering approach is also proposed to categorise homes according to their building fabric thermal performance and the occupants’ energy efficiency with respect to ventilation. The advantage of this clustering approach is that the occupants receive tailored advice on certain actions that, if taken, will improve the overall thermal performance of the dwelling. Due to the method’s low cost and simplicity it could facilitate government initiatives such as the ‘Green Deal’ in the UK
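A minimal version of such a learned thermal model can be sketched from a single overnight temperature log: fit the time constant of the exponential decay toward the outdoor temperature, then cluster homes on it. The fitting form, the threshold, and the labels are all illustrative assumptions, not the paper's algorithm:

```python
import math

def estimate_time_constant(temps, t_out, dt_hours=1.0):
    """Fit tau (hours) of T(t) = t_out + (T0 - t_out) * exp(-t / tau)
    by log-linear regression on hourly indoor readings. Assumes the heating
    is off and every reading stays above the outdoor temperature."""
    xs = [i * dt_hours for i in range(len(temps))]
    ys = [math.log(t - t_out) for t in temps]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope

def classify(tau_hours, threshold=20.0):
    """Crude two-cluster diagnostic on fabric performance (threshold assumed)."""
    return "well insulated" if tau_hours > threshold else "poorly insulated"

# Synthetic overnight log: 20 C indoors cooling toward 5 C outdoors, tau = 30 h.
temps = [5.0 + 15.0 * math.exp(-t / 30.0) for t in range(8)]
print(round(estimate_time_constant(temps, 5.0), 1), classify(30.0))
```

Because the only input is a cheap temperature logger, the same computation can be repeated across thousands of dwellings, which is where the method's scalability comes from.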

  11. Scalability of the Muscular Action in a Parametric 3D Model of the Index Finger

    OpenAIRE

    Sancho Brú, Joaquín Luís; Vergara Monedero, Margarita; Rodríguez Cervantes, Pablo Jesús; Giurintano, David J.; Pérez González, Antonio

    2008-01-01

    A method for scaling the muscle action is proposed and used to achieve a 3D inverse dynamic model of the human finger with all its components scalable. This method is based on scaling the PCSA (physiological cross-sectional area) in a Hill muscle model. Different anthropometric parameters and maximal grip force data have been measured and their correlations have been analysed and used for scaling the PCSA of each muscle. A linear relationship between the normalised PCSA and the pr...

  12. Scalable audio separation with light kernel additive modelling

    OpenAIRE

    Liutkus, Antoine; Fitzgerald, Derry; Rafii, Zafar

    2015-01-01

    Recently, Kernel Additive Modelling (KAM) was proposed as a unified framework to achieve multichannel audio source separation. Its main feature is to use kernel models for locally describing the spectrograms of the sources. Such kernels can capture source features such as repetitivity, stability over time and/or frequency, self-similarity, etc. KAM notably subsumes many popular and effective methods from the state of the art, including REPET and harmonic/percussive separation with median filt...
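
One of the kernel ideas KAM subsumes, harmonic/percussive separation with median filters, can be sketched on a toy magnitude spectrogram. This is a minimal stdlib-only illustration, not the KAM implementation:

```python
def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def median_filter_rows(spec, k):
    """Median over a length-k window along each row (rows = frequency bins)."""
    half = k // 2
    return [[median(row[max(0, j - half):j + half + 1]) for j in range(len(row))]
            for row in spec]

def median_filter_cols(spec, k):
    cols = [list(c) for c in zip(*spec)]
    return [list(r) for r in zip(*median_filter_rows(cols, k))]

def hp_separate(spec, k=3):
    """Soft-mask separation: harmonic energy is smooth along time,
    percussive energy is smooth along frequency."""
    H = median_filter_rows(spec, k)   # enhances horizontal (harmonic) ridges
    P = median_filter_cols(spec, k)   # enhances vertical (percussive) ridges
    eps = 1e-12
    return [[s * h / (h + p + eps) for s, h, p in zip(sr, hr, pr)]
            for sr, hr, pr in zip(spec, H, P)]

# Toy magnitude spectrogram: one sustained tone (row 1) + one click (column 2)
spec = [[1.0 if i == 1 or j == 2 else 0.1 for j in range(5)] for i in range(4)]
harm = hp_separate(spec)
print(harm[1][0] > harm[0][2])  # → True: the tone survives, the click is masked
```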

  13. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage complex distributed environments, such as computational clouds, grids, ad hoc and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system in various forms of information that are incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, their sequencing and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  14. Modeling as Theory-Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    The purpose of this contribution is to make the idea of modeling as theory-building operational. We conceive the modeling process as a theory-building process, thereby opening up a new perspective on the methodology of modeling in the social sciences. By reconceptualizing the notion of modeling, we hope to convey the advantages of more conceptual thinking in management. Practitioners could gain effectiveness in dealing with external complexity if they would espouse the modeling task as a disc...

  15. Model Transport: Towards Scalable Transfer Learning on Manifolds

    DEFF Research Database (Denmark)

    Freifeld, Oren; Hauberg, Søren; Black, Michael J.

    “commutes” with learning. Consequently, our compact framework, applicable to a large class of manifolds, is not restricted by the size of either the training or test sets. We demonstrate the approach by transferring PCA and logistic-regression models of real-world data involving 3D shapes and image...

  16. A Scalable Cloud Library Empowering Big Data Management, Diagnosis, and Visualization of Cloud-Resolving Models

    Science.gov (United States)

    Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.

    2015-12-01

    A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25~5 km horizontal grid spacings. The main advantage of the CRM is that it can allow explicit interactive processes between microphysics, radiation, turbulence, surface, and aerosols without subgrid cloud fraction, overlapping and convective parameterization. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize/inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate against NASA's field campaign data and L1/L2 satellite data products, due to the large data volume (~10 TB) and the complexity of the CRM's physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capability includes: (1) an SCL data model that enables various CRM simulation outputs in NetCDF, including those of the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) models, to be accessed and processed by Hadoop; (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs; (3) a technique that visualizes Hadoop-resident data with IDL; (4) a technique that subsets Hadoop-resident data, compliant with the SCL data model, with HIVE or Impala via HUE's Web interface; (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system, PVFS2 or CephFS, where high-performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA Field Campaigns and Satellite data to a

  17. Scalable and Robust BDDC Preconditioners for Reservoir and Electromagnetics Modeling

    KAUST Repository

    Zampini, S.

    2015-09-13

    The purpose of the study is to show the effectiveness of recent algorithmic advances in Balancing Domain Decomposition by Constraints (BDDC) preconditioners for the solution of elliptic PDEs with highly heterogeneous coefficients, and discretized by means of the finite element method. Applications to large linear systems generated by div- and curl- conforming finite elements discretizations commonly arising in the contexts of modelling reservoirs and electromagnetics will be presented.

  18. Scalability of Sustainable Business Models in Hybrid Organizations

    OpenAIRE

    Adam Jabłoński

    2016-01-01

    The dynamics of change in modern business create new mechanisms for company management to determine their pursuit and the achievement of their high performance. This performance maintained over a long period of time becomes a source of ensuring business continuity by companies. An ontological being enabling the adoption of such assumptions is such a business model that has the ability to generate results in every possible market situation and, moreover, it has the features of permanent adapta...

  19. Model Transport: Towards Scalable Transfer Learning on Manifolds

    OpenAIRE

    Freifeld, Oren; Hauberg, Søren; Black, Michael J.

    2014-01-01

    We consider the intersection of two research fields: transfer learning and statistics on manifolds. In particular, we consider, for manifold-valued data, transfer learning of tangent-space models such as Gaussian distributions, PCA, regression, or classifiers. Though one would hope to simply use ordinary R^n transfer-learning ideas, the manifold structure prevents it. We overcome this by basing our method on inner-product-preserving parallel transport, a well-known tool widely used in other p...

  20. Lightweight and Scalable Intrusion Trace Classification Using Interelement Dependency Models Suitable for Wireless Sensor Network Environment

    OpenAIRE

    Dae-Ki Kang

    2013-01-01

    We present a lightweight and scalable method for classifying network and program traces to detect system intrusion attempts. By employing interelement dependency models to overcome the independence violation problem inherent in the Naive Bayes learners, our method yields intrusion detectors with better accuracy. For efficient and lightweight counting of -gram features without losing accuracy, we use a -truncated generalized suffix tree ( -TGST) for storing -gram features. The -TGST storage me...
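
The flavor of the approach, per-class n-gram statistics over traces with smoothed likelihood scoring, can be sketched as follows; this hash-table sketch ignores the truncated-suffix-tree storage and the interelement dependency models that give the paper its accuracy and memory gains (the traces below are invented):

```python
import math
from collections import Counter

def ngrams(seq, n):
    """All contiguous n-grams of a trace."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def train(traces, n=2):
    """Per-class n-gram frequency tables over system-call traces."""
    model = {}
    for label, trace in traces:
        model.setdefault(label, Counter()).update(ngrams(trace, n))
    return model

def classify(trace, model, n=2, alpha=1.0):
    """Score each class by the add-alpha-smoothed log-likelihood of the
    trace's n-grams and return the best-scoring class."""
    best, best_score = None, float("-inf")
    for label, counts in model.items():
        total = sum(counts.values())
        vocab = len(counts) + 1
        score = sum(math.log((counts[g] + alpha) / (total + alpha * vocab))
                    for g in ngrams(trace, n))
        if score > best_score:
            best, best_score = label, score
    return best

data = [("normal", ["open", "read", "close"] * 4),
        ("attack", ["open", "exec", "exec", "write"] * 4)]
model = train(data)
print(classify(["open", "read", "close", "open", "read"], model))  # → normal
```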

  1. Bayesian forecasting and scalable multivariate volatility analysis using simultaneous graphical dynamic models

    OpenAIRE

    Gruber, Lutz F.; West, Mike

    2016-01-01

    The recently introduced class of simultaneous graphical dynamic linear models (SGDLMs) defines an ability to scale on-line Bayesian analysis and forecasting to higher-dimensional time series. This paper advances the methodology of SGDLMs, developing and embedding a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. The advances include developments in Bayesian computation for scalability, and a case study in exploring the resu...

  2. How to develop scalable business model? A study on the scalability of business model in Finnish ICT & software industry

    OpenAIRE

    Nguyen, H.

    2014-01-01

    The revolution of Information Communication Technology (ICT) and globalization has made the business model concept more popular as a means of supporting firms in achieving competitive advantage in a dynamic business environment. Start-ups are not restricted by their size or their novelty, but are able to be agile by efficiently and effectively exploiting business opportunities through business model innovation. Given these points, the study wants to find an optimal combination and fit between t...

  3. Building Mental Models by Dissecting Physical Models

    Science.gov (United States)

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to…

  4. Center for Programming Models for Scalable Parallel Computing: Future Programming Models

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Guang, R.

    2008-07-24

    The mission of the pmodel center project is to develop software technology to support scalable parallel programming models for terascale systems. The goal of the specific UD subproject, in this context, is to develop an efficient and robust methodology and tools for HPC programming. More specifically, the focus is on developing new programming models which help programmers port their applications onto parallel high-performance computing systems. During the course of the research in the past 5 years, the landscape of microprocessor chip architecture has witnessed a fundamental change: multi-core/many-core chip architectures appear to be becoming the mainstream technology and will have a major impact on future generations of parallel machines. The programming model for shared-address-space machines is becoming critical to such multi-core architectures. Our research highlight is the in-depth study of proposed fine-grain parallelism/multithreading support on such future-generation multi-core architectures. Our research has demonstrated the significant impact such a fine-grain multithreading model can have on the productivity of parallel programming models and their efficient implementation.

  5. A Scalable Version of the Navy Operational Global Atmospheric Prediction System Spectral Forecast Model

    Directory of Open Access Journals (Sweden)

    Thomas E. Rosmond

    2000-01-01

    The Navy Operational Global Atmospheric Prediction System (NOGAPS) includes a state-of-the-art spectral forecast model similar to models run at several major operational numerical weather prediction (NWP) centers around the world. The model, developed by the Naval Research Laboratory (NRL) in Monterey, California, has run operationally at the Fleet Numerical Meteorological and Oceanographic Center (FNMOC) since 1982, and most recently has been run on a Cray C90 in a multi-tasked configuration. Typically the multi-tasked code runs on 10 to 15 processors with an overall parallel efficiency of about 90%. Operational resolution is T159L30, but other operational and research applications run at significantly lower resolutions. A scalable NOGAPS forecast model has been developed by NRL in anticipation of a FNMOC C90 replacement in about 2001, as well as for current NOGAPS research requirements to run on DOD High-Performance Computing (HPC) scalable systems. The model is designed to run with message passing (MPI). Model design criteria include bit reproducibility for different processor numbers and reasonably efficient performance on fully shared memory, distributed memory, and distributed shared memory systems for a wide range of model resolutions. Results for a wide range of processor numbers, model resolutions, and different vendor architectures are presented. Single-node performance has been disappointing on RISC-based systems, at least compared to vector processor performance. This is a common complaint, and will require careful re-examination of traditional NWP model software design and data organization to fully exploit future scalable architectures.
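
The bit-reproducibility criterion mentioned above is usually met by fixing the reduction order independently of the processor count. A minimal sketch of the idea (not NOGAPS code):

```python
def tree_sum(xs):
    """Sum in a fixed pairwise (binary-tree) order determined only by len(xs)."""
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    return tree_sum(xs[:mid]) + tree_sum(xs[mid:])

def reproducible_sum(values, block=8):
    """Reduce fixed-size blocks, then tree-sum the partials. Because the block
    size is a constant of the algorithm rather than a function of the number
    of processors, every processor count reassociates the sum identically and
    the result is bitwise reproducible."""
    partials = [tree_sum(values[i:i + block]) for i in range(0, len(values), block)]
    return tree_sum(partials)

def naive_parallel_sum(values, nproc):
    """Sum over nproc-dependent chunks: the rounding, and hence the low-order
    bits of the result, can change with the processor count."""
    n = -(-len(values) // nproc)  # ceil division
    return sum(sum(values[i:i + n]) for i in range(0, len(values), n))

vals = [0.1 * (i % 7) for i in range(1000)]
print(reproducible_sum(vals) == reproducible_sum(vals))        # → True
print(naive_parallel_sum(vals, 3), naive_parallel_sum(vals, 7))
```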

  6. SCALABLE PERCEPTUAL AUDIO REPRESENTATION WITH AN ADAPTIVE THREE TIME-SCALE SINUSOIDAL SIGNAL MODEL

    Institute of Scientific and Technical Information of China (English)

    Al-Moussawy Raed; Yin Junxun; Song Shaopeng

    2004-01-01

    This work is concerned with the development and optimization of a signal model for scalable perceptual audio coding at low bit rates. A complementary two-part signal model consisting of Sines plus Noise (SN) is described. The paper presents essentially a fundamental enhancement to the sinusoidal modeling component. The enhancement involves an audio signal scheme based on carrying out overlap-add sinusoidal modeling at three successive time scales: large, medium, and small. The sinusoidal modeling is done in an analysis-by-synthesis overlap-add manner across the three scales by using psychoacoustically weighted matching pursuits. The sinusoidal modeling residual at the first scale is passed to the smaller scales to allow for the modeling of various signal features at appropriate resolutions. This approach greatly helps to correct the pre-echo inherent in the sinusoidal model. This improves the perceptual audio quality over our previous work on sinusoidal modeling while using the same number of sinusoids. The most obvious application for the SN model is in scalable, high-fidelity audio coding and signal modification.
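
The analysis-by-synthesis matching-pursuit step can be sketched at a single time scale: repeatedly pick the dominant spectral peak, synthesise that sinusoid, and subtract it from the residual. This stdlib-only toy omits the psychoacoustic weighting and the three-scale overlap-add structure:

```python
import cmath, math

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def extract_sinusoids(x, n_atoms):
    """Greedy analysis-by-synthesis: at each step pick the DFT bin with the
    most energy, synthesise that sinusoid, subtract it, and repeat on the
    residual (a one-scale, unweighted cousin of matching pursuit)."""
    N = len(x)
    residual = list(x)
    atoms = []
    for _ in range(n_atoms):
        X = dft(residual)
        k = max(range(1, N // 2), key=lambda i: abs(X[i]))
        amp, phase = 2 * abs(X[k]) / N, cmath.phase(X[k])
        atoms.append((k, amp, phase))
        for n in range(N):
            residual[n] -= amp * math.cos(2 * math.pi * k * n / N + phase)
    return atoms, residual

# Two exact-bin cosines: the pursuit should recover bins 5 and 12 exactly
N = 64
x = [3.0 * math.cos(2 * math.pi * 5 * n / N) +
     1.0 * math.cos(2 * math.pi * 12 * n / N) for n in range(N)]
atoms, residual = extract_sinusoids(x, 2)
print(sorted(k for k, _, _ in atoms))  # → [5, 12]
```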

  7. Genetic algorithms and genetic programming for multiscale modeling: Applications in materials science and chemistry and advances in scalability

    Science.gov (United States)

    Sastry, Kumara Narasimha

    2007-03-01

    building blocks in organic chemistry---indicate that MOGAs produce high-quality semiempirical methods that (1) are stable to small perturbations, (2) yield accurate configuration energies on untested and critical excited states, and (3) yield ab initio quality excited-state dynamics. The proposed method enables simulations of more complex systems to realistic, multi-picosecond timescales, well beyond previous attempts or the expectations of human experts, with a 2-3 orders-of-magnitude reduction in computational cost. While the two applications use simple evolutionary operators, in order to tackle more complex systems, their scalability and limitations have to be investigated. The second part of the thesis addresses some of the challenges involved in a successful design of genetic algorithms and genetic programming for multiscale modeling. The first issue addressed is the scalability of genetic programming, where facetwise models are built to assess the population size required by GP to ensure an adequate supply of raw building blocks and also to ensure accurate decision-making between competing building blocks. This study also presents a design of competent genetic programming, where traditional fixed recombination operators are replaced by building and sampling probabilistic models of promising candidate programs. The proposed scalable GP, called extended compact GP (eCGP), combines ideas from the extended compact genetic algorithm (eCGA) and probabilistic incremental program evolution (PIPE) and adaptively identifies, propagates and exchanges important subsolutions of a search problem. Results show that eCGP scales cubically with problem size on both GP-easy and GP-hard problems. Finally, facetwise models are developed to explore the limitations of the scalability of MOGAs, where the scalability of multiobjective algorithms in reliably maintaining Pareto-optimal solutions is addressed. The results show that even when the building blocks are accurately identified, massive multimodality

  8. Toward a scalable flexible-order model for 3D nonlinear water waves

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Ducrozet, Guillaume; Bingham, Harry B.;

    For marine and coastal applications, current work is directed toward the development of a scalable numerical 3D model for fully nonlinear potential water waves over arbitrary depths. The model is high-order accurate, robust and efficient for large-scale problems, and support will be included for...... flexibility in the description of structures by the use of curvilinear boundary-fitted meshes. The mathematical equations for potential waves in the physical domain are transformed through $\sigma$-mapping(s) to a time-invariant boundary-fitted domain, which then becomes a basis for an efficient solution...

  9. A framework for scalable parameter estimation of gene circuit models using structural information

    KAUST Repository

    Kuwahara, Hiroyuki

    2013-06-21

    Motivation: Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Results: Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. © The Author 2013.
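
The decompose-and-refine idea, integrating each rate equation separately against a frozen trajectory of the other species and then iterating, can be sketched on a toy two-gene mutual-repression circuit (parameter values invented; forward Euler stands in for the paper's integrator):

```python
# Toy two-gene circuit: each protein represses the other (Hill repression).
#   dx/dt = a / (1 + y**h) - d * x
#   dy/dt = a / (1 + x**h) - d * y
a, d, h = 4.0, 1.0, 2

def integrate_decoupled(x0, y0, dt, steps, sweeps=3):
    """Integrate each equation separately against a frozen trajectory of the
    other species, then repeat the sweep so the two trajectories converge:
    a toy version of the decompose-and-refine idea in the paper."""
    xs = [x0] * (steps + 1)
    ys = [y0] * (steps + 1)
    for _ in range(sweeps):
        for t in range(steps):  # x-equation alone, y frozen
            xs[t + 1] = xs[t] + dt * (a / (1 + ys[t] ** h) - d * xs[t])
        for t in range(steps):  # y-equation alone, x frozen
            ys[t + 1] = ys[t] + dt * (a / (1 + xs[t] ** h) - d * ys[t])
    return xs, ys

xs, ys = integrate_decoupled(x0=1.0, y0=0.0, dt=0.01, steps=1000)
print(round(xs[-1], 2), round(ys[-1], 2))  # x settles high, y is repressed low
```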

  10. NYU3T: teaching, technology, teamwork: a model for interprofessional education scalability and sustainability.

    Science.gov (United States)

    Djukic, Maja; Fulmer, Terry; Adams, Jennifer G; Lee, Sabrina; Triola, Marc M

    2012-09-01

    Interprofessional education is a critical precursor to effective teamwork and the collaboration of health care professionals in clinical settings. Numerous barriers have been identified that preclude scalable and sustainable interprofessional education (IPE) efforts. This article describes NYU3T: Teaching, Technology, Teamwork, a model that uses novel technologies such as Web-based learning, virtual patients, and high-fidelity simulation to overcome some of the common barriers and drive implementation of evidence-based teamwork curricula. It outlines the program's curricular components, implementation strategy, evaluation methods, and lessons learned from the first year of delivery and describes implications for future large-scale IPE initiatives. PMID:22920424

  11. Project Final Report: Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James

    2014-01-06

    In this project we created a community tool infrastructure for program development tools targeting petascale-class machines and beyond. This includes tools for performance analysis, debugging, and correctness, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool-building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allow others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. To form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of the community tool infrastructure provides an enhanced version of Open|SpeedShop which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components as specified in the original project proposal. The project team’s work over the funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  12. Darwinian Model Building

    International Nuclear Information System (INIS)

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment). Variation is achieved by cross-over and mutation operations on the genotype, which consists in the present case of a single chromosome. The genotypes 'live' in the environment of the data. Nested Sampling is used to optimize the free parameters of the models given the data, thus giving rise to the phenotypes. Selection is based on the phenotypes. The evidences which naturally follow from the Nested Sampling Algorithm are used in a second level of Nested Sampling to find increasingly better models. The data in this paper originate from the Leiden Cytology and Pathology Laboratory (LCPL), which screens pap smears for cervical cancer. We have data for 1750 women who on average underwent 5 tests each. The data on individual women are treated as a small time series. We will try to estimate the next value of the prime cancer indicator from previous tests of the same woman.

  13. Darwinian Model Building

    Science.gov (United States)

    Kester, Do; Bontekoe, Romke

    2011-03-01

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment). Variation is achieved by cross-over and mutation operations on the genotype which consists in the present case of a single chromosome. The genotypes `live' in the environment of the data. Nested Sampling is used to optimize the free parameters of the models given the data, thus giving rise to the phenotypes. Selection is based on the phenotypes. The evidences which naturally follow from the Nested Sampling Algorithm are used in a second level of Nested Sampling to find increasingly better models. The data in this paper originate from the Leiden Cytology and Pathology Laboratory (LCPL), which screens pap smears for cervical cancer. We have data for 1750 women who on average underwent 5 tests each. The data on individual women are treated as a small time series. We will try to estimate the next value of the prime cancer indicator from previous tests of the same woman.
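
The variation-selection loop described in both records can be sketched with a toy genetic algorithm in which a trivial fitness function stands in for the Nested Sampling evidence:

```python
import random

random.seed(1)
GENOME_LEN = 20

def fitness(genotype):
    """Stand-in for the model evidence from Nested Sampling: here simply
    the number of 1-bits (the classic OneMax toy problem)."""
    return sum(genotype)

def crossover(a, b):
    """Single-point cross-over on the single-chromosome genotype."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(g, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in g]

def evolve(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # selection on the phenotype
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children               # variation by cross-over + mutation
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # near-optimal after 60 generations
```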

  14. Progress Report 2008: A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Drake, John B [ORNL; Worley, Patrick H [ORNL; Hoffman, Forrest M [ORNL; Jones, Phil [Los Alamos National Laboratory (LANL)

    2009-01-01

    This project employs multi-disciplinary teams to accelerate development of the Community Climate System Model (CCSM), based at the National Center for Atmospheric Research (NCAR). A consortium of eight Department of Energy (DOE) National Laboratories collaborates with NCAR and the NASA Global Modeling and Assimilation Office (GMAO). The laboratories are Argonne (ANL), Brookhaven (BNL), Los Alamos (LANL), Lawrence Berkeley (LBNL), Lawrence Livermore (LLNL), Oak Ridge (ORNL), Pacific Northwest (PNNL) and Sandia (SNL). The work plan focuses on scalability for petascale computation and extensibility to a more comprehensive earth system model. Our stated goal is to support the DOE mission in climate change research by helping "... to determine the range of possible climate changes over the 21st century and beyond through simulations using a more accurate climate system model that includes the full range of human and natural climate feedbacks with increased realism and spatial resolution."

  15. Flavored model building

    International Nuclear Information System (INIS)

    In this thesis we discuss possibilities to solve the family replication problem and to understand the observed strong hierarchy among the fermion masses and the diverse mixing pattern of quarks and leptons. We show that non-abelian discrete symmetries which act non-trivially in generation space can serve as a profound explanation. We present three low energy models with the permutation symmetry S_4, the dihedral group D_5 and the double-valued group T' as flavor symmetry. The T' model turns out to be very predictive, since it explains tri-bimaximal mixing in the lepton sector and, moreover, leads to two non-trivial relations in the quark sector, √(m_d/m_s) = |V_us| and √(m_d/m_s) = |V_td/V_ts|. The main message of the T' model is the observation that the diverse pattern in the quark and lepton mixings can be well understood if the flavor symmetry is not broken in an arbitrary way, but only to residual (non-trivial) subgroups. Apart from leading to deeper insights into the origin of the fermion mixings, this idea enables us to perform systematic studies of large classes of discrete groups. This we show in our study of the dihedral symmetries D_n and D'_n. As a result we find only five distinct (Dirac) mass matrix structures arising from a dihedral group, if we additionally require partial unification of either left-handed or left-handed conjugate fermions and the determinant of the mass matrix to be non-vanishing. Furthermore, we reveal the ability of dihedral groups to predict the Cabibbo angle θ_C, i.e. |V_us(cd)| = cos(3π/7), as well as maximal atmospheric mixing, θ_23 = π/4, and vanishing θ_13 in the lepton sector. (orig.)

  16. Model building in noncommutative geometry

    International Nuclear Information System (INIS)

    Noncommutative geometry (NCG) based on spectral triples allows one to unify classical Yang-Mills-Higgs (YMH) theories and General Relativity in a single geometrical framework. The relevant spectral triples contain a finite part which encodes the particle content of the YMH models and is subject to strong geometrical restrictions. These restrictions permit a classification of certain (irreducible) spectral triples and lead to a prominent position of the Standard Model (SM) as a "minimal" finite spectral triple. I will give a short introduction to the basic ideas of NCG and present a "bottom-up" approach to model building in the framework of NCG. This noncommutative model building kit has led to phenomenologically interesting models beyond the SM. These models extend the fermionic and the gauge sector of the SM as well as the scalar sector.

  17. Flavored model building

    Energy Technology Data Exchange (ETDEWEB)

    Hagedorn, C.

    2008-01-15

    In this thesis we discuss possibilities to solve the family replication problem and to understand the observed strong hierarchy among the fermion masses and the diverse mixing pattern of quarks and leptons. We show that non-abelian discrete symmetries which act non-trivially in generation space can serve as a profound explanation. We present three low energy models with the permutation symmetry S_4, the dihedral group D_5 and the double-valued group T' as flavor symmetry. The T' model turns out to be very predictive, since it explains tri-bimaximal mixing in the lepton sector and, moreover, leads to two non-trivial relations in the quark sector, √(m_d/m_s) = |V_us| and √(m_d/m_s) = |V_td/V_ts|. The main message of the T' model is the observation that the diverse pattern in the quark and lepton mixings can be well understood if the flavor symmetry is not broken in an arbitrary way, but only to residual (non-trivial) subgroups. Apart from leading to deeper insights into the origin of the fermion mixings, this idea enables us to perform systematic studies of large classes of discrete groups. This we show in our study of the dihedral symmetries D_n and D'_n. As a result we find only five distinct (Dirac) mass matrix structures arising from a dihedral group, if we additionally require partial unification of either left-handed or left-handed conjugate fermions and the determinant of the mass matrix to be non-vanishing. Furthermore, we reveal the ability of dihedral groups to predict the Cabibbo angle θ_C, i.e. |V_us(cd)| = cos(3π/7), as well as maximal atmospheric mixing, θ_23 = π/4, and vanishing θ_13 in the lepton sector. (orig.)

  18. Scalable Reduced-order Models for Fine-resolution Hydrologic Simulations

    Science.gov (United States)

    Liu, Y.; Pau, G. S. H.

    2014-12-01

    Fine-resolution descriptions of hydrologic variables are desirable for an improved investigation of regional-scale and watershed-scale phenomena. For example, fine-resolution soil moisture allows biogeochemical processes to be modeled at the desired mechanistic scales. However, direct deterministic simulations of fine-resolution land surface variables present many challenges, a prominent one of which is the high computational cost. To address this challenge, we propose the use of reduced-order modeling techniques, such as Gaussian process regression and polynomial chaos expansion, to directly emulate fine-resolution models. Dimension reduction techniques, such as proper orthogonal decomposition method, are further used to improve the efficiency of the resulting reduced order model (ROM). We also develop procedures to efficiently quantify the uncertainties in the ROM solutions. Although ROM, by definition, is computationally efficient, the construction of ROM can be computationally expensive and memory-intensive since we need to use many high-resolution solutions to train the ROM. In addition, high-dimensional regression models can have non-negligible computational demands. To address these computational challenges, we have developed a new parallel and scalable software framework for developing emulators for fine-resolution models. The framework allows ROM to be efficiently constructed from fine-resolution solutions and deployed on high-performance computing platforms. The framework utilizes some existing high-performance computing libraries such as PETSc (Portable, Extensible Toolkit for Scientific Computation), SLEPc (Scalable Library for Eigenvalue Problem Computation) and Elemental. We will demonstrate the accuracy of the ROMs we developed for two fine-resolution surface-subsurface models and the performance of our software framework.
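
The emulator idea, train a cheap statistical surrogate on a few expensive fine-resolution solutions and then predict elsewhere, can be sketched in one dimension. The kernel, lengthscale and "expensive model" below are illustrative stand-ins; the actual framework uses Gaussian process regression and polynomial chaos at scale on PETSc/SLEPc/Elemental:

```python
import math

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (stdlib only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel with an assumed lengthscale."""
    return math.exp(-((a - b) ** 2) / (2 * ell ** 2))

def train_emulator(xs, ys, noise=1e-8):
    """Solve K w = y once, offline, from the expensive training runs."""
    K = [[rbf(p, q) + (noise if i == j else 0.0) for j, q in enumerate(xs)]
         for i, p in enumerate(xs)]
    return solve(K, ys)

def predict(x, xs, weights):
    """Cheap online prediction: a weighted sum of kernel evaluations."""
    return sum(w * rbf(x, p) for w, p in zip(weights, xs))

# "Fine-resolution model": an expensive function we evaluate at a few points
f = lambda x: math.sin(2 * x)
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
w = train_emulator(xs, [f(x) for x in xs])
print(round(predict(1.25, xs, w), 2), round(f(1.25), 2))
```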

  19. Multipoint videoconferencing with scalable video coding

    Institute of Scientific and Technical Information of China (English)

    ELEFTHERIADIS Alexandros; CIVANLAR M. Reha; SHAPIRO Ofer

    2006-01-01

    We describe a system for multipoint videoconferencing that offers extremely low end-to-end delay, low cost and complexity, and high scalability, alongside standard features associated with high-end solutions such as rate matching and personal video layout. The system accommodates heterogeneous receivers and networks based on the Internet Protocol and relies on scalable video coding to provide a coded representation of a source video signal at multiple temporal and spatial resolutions as well as quality levels. These are represented by distinct bitstream components which are created at each end-user encoder. Depending on the specific conferencing environment, some or all of these components are transmitted to a Scalable Video Conferencing Server (SVCS). The SVCS redirects these components to one or more recipients depending on, e.g., the available network conditions and user preferences. The scalable aspect of the video coding technique allows the system to adapt to different network conditions, and also accommodates different end-user requirements (e.g., a user may elect to view another user at a high or low spatial resolution). Performance results concerning flexibility, video quality and delay of the system are presented using the Joint Scalable Video Model (JSVM) of the forthcoming SVC (H.264 Annex G) standard, demonstrating that scalable coding outperforms existing state-of-the-art systems and offers the right platform for building next-generation multipoint videoconferencing systems.
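The layer-forwarding decision the SVCS makes per recipient can be illustrated with a toy selection routine. The component names, bitrates, and dependency structure below are invented for illustration and are not taken from the SVC standard:

```python
# Each scalable-stream component: (layer_id, depends_on, bitrate_kbps).
# Names and numbers are illustrative, not from H.264 Annex G.
COMPONENTS = [
    ("base",      None,       200),  # base layer: low resolution, always required
    ("temporal+", "base",     150),  # higher frame rate
    ("spatial+",  "base",     400),  # higher spatial resolution
    ("quality+",  "spatial+", 250),  # quality refinement of the spatial layer
]

def select_layers(bandwidth_kbps):
    """Greedy SVCS-style forwarding: send an enhancement layer only if its
    dependency is already selected and the bandwidth budget still allows it."""
    chosen, budget = [], bandwidth_kbps
    for name, dep, rate in COMPONENTS:
        if dep is not None and dep not in chosen:
            continue
        if rate <= budget:
            chosen.append(name)
            budget -= rate
    return chosen

print(select_layers(300))   # low-bandwidth receiver
print(select_layers(1200))  # high-bandwidth receiver
```

Because the encoder emits distinct bitstream components, the server never transcodes; it simply drops components per recipient, which is why end-to-end delay stays low.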

  20. Alternatives to quintessence model building

    International Nuclear Information System (INIS)

    We discuss the issue of toy model building for the dark energy component of the universe. Specifically, we consider two generic toy models recently proposed as alternatives to quintessence models, respectively known as Cardassian expansion and the Chaplygin gas. We show that the former is entirely equivalent to a class of quintessence models. We determine the observational constraints on the latter, coming from recent supernovae results and from the shape of the matter power spectrum. As expected, these restrict the model to a behavior that closely matches that of a standard cosmological constant Λ

  1. Scalable parallel methods for monolithic coupling in fluid-structure interaction with application to blood flow modeling

    International Nuclear Information System (INIS)

    We introduce and study numerically a scalable parallel finite element solver for the simulation of blood flow in compliant arteries. The incompressible Navier-Stokes equations are used to model the fluid and coupled to an incompressible linear elastic model for the blood vessel walls. Our method features an unstructured dynamic mesh capable of modeling complicated geometries, an arbitrary Lagrangian-Eulerian framework that allows for large displacements of the moving fluid domain, monolithic coupling between the fluid and structure equations, and fully implicit time discretization. Simulations based on blood vessel geometries derived from patient-specific clinical data are performed on large supercomputers using scalable Newton-Krylov algorithms preconditioned with an overlapping restricted additive Schwarz method that preconditions the entire fluid-structure system together. The algorithm is shown to be robust and scalable for a variety of physical parameters, scaling to hundreds of processors and millions of unknowns.

  2. Modelling and stability analysis of emergent behavior of scalable swarm system

    Institute of Scientific and Technical Information of China (English)

    CHEN Shi-ming; FANG Hua-jing

    2006-01-01

    In this paper we propose a two-layer emergent model for a scalable swarm system. The first layer describes the individual flocking behavior toward the local goal position (the center of the minimal circumcircle determined by the neighbors in the positive visual set of an individual), resulting from the individual's motion toward one or two farthest neighbors in its positive visual set; the second layer describes the emergent aggregating swarm behavior resulting from the individual's motion toward its local goal position. The scale of the swarm is not limited because only local individual information is used for modelling in the two-layer topology. We study the stability properties of the emergent swarm behavior based on Lyapunov stability theory. Simulations showed that the swarm system can converge to goal regions while maintaining cohesiveness.
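A minimal sketch of the two-layer idea, with the local goal simplified to the centroid of visible neighbors rather than the minimal-circumcircle center used in the paper; the visual radius, gain, and population size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps, gain = 20, 200, 0.1
pos = rng.uniform(-5, 5, size=(n, 2))   # random initial swarm positions

def local_goal(i, pos, radius=6.0):
    """Layer 1 (simplified): the local goal is the centroid of neighbors
    within the visual radius (the paper uses the minimal-circumcircle center)."""
    d = np.linalg.norm(pos - pos[i], axis=1)
    mask = (d < radius) & (d > 0)
    return pos[mask].mean(axis=0) if mask.any() else pos[i]

spread0 = pos.std()
for _ in range(steps):
    # Layer 2: every individual moves a fraction of the way toward its local goal.
    goals = np.array([local_goal(i, pos) for i in range(n)])
    pos += gain * (goals - pos)

print(f"spread before: {spread0:.3f}, after: {pos.std():.3f}")
```

Only local neighborhood information enters each update, which is what keeps the scheme scalable: no agent ever needs the global swarm state.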

  3. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    CERN Document Server

    Sane, Nimish; Harris, Andrew I; Bhattacharyya, Shuvra S

    2012-01-01

    There is a growing trend toward using high-level tools for the design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the nee...

  4. A scalable theoretical mean-field model for the electron component of an ultracold neutral plasma

    CERN Document Server

    Guthrie, John

    2015-01-01

    The electron component of an ultracold neutral plasma (UCP) is modeled based on a scalable method using a self-consistently determined mean-field approximation. Representative sampling of discrete electrons within the UCP is used to project the electron spatial distribution onto an expansion of orthogonal basis functions. A collision operator acting on the sample electrons is employed in order to drive the distribution toward thermal equilibrium. These equilibrium distributions can be determined for non-zero electron temperatures, even in the presence of spherical-symmetry-breaking applied electric fields. This is useful for predicting key macroscopic UCP parameters, such as the depth of the electrons' confining potential. Dynamics such as electron oscillations in UCPs with non-uniform density distributions can also be treated by this model.

  5. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    Energy Technology Data Exchange (ETDEWEB)

    Cleary, A J; Smith, S G; Vassilevska, T K; Jefferson, D R

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  6. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.
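The kind of searchable, indexable input store this record describes can be sketched with an in-memory SQLite table. The schema, run identifiers, and field names below are hypothetical, not those of the actual application:

```python
import sqlite3

# Minimal sketch of a searchable store for building-simulation inputs
# (illustrative schema; the actual application's data model is not specified here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE inputs (run_id TEXT, key TEXT, value TEXT)")
db.execute("CREATE INDEX idx_key ON inputs(key, value)")  # makes lookups indexable

runs = [
    ("run-001", "weather_file", "chicago.epw"),
    ("run-001", "wall_r_value", "19"),
    ("run-002", "weather_file", "phoenix.epw"),
    ("run-002", "wall_r_value", "30"),
]
db.executemany("INSERT INTO inputs VALUES (?, ?, ?)", runs)

# Long after the simulations finish, recover which run used a given input,
# e.g. for code compliance or commissioning audits.
rows = db.execute(
    "SELECT run_id FROM inputs WHERE key = ? AND value = ?",
    ("weather_file", "phoenix.epw")).fetchall()
print(rows)
```

The key/value layout keeps the store flexible: new simulation parameters need no schema change, only new rows, which is one way to meet the "flexible and scalable" requirement the abstract states.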

  7. Performance and scalability of finite-difference and finite-element wave-propagation modeling on Intel's Xeon Phi

    NARCIS (Netherlands)

    Zhebel, E.; Minisini, S.; Kononov, A.; Mulder, W.A.

    2013-01-01

    With the rapid developments in parallel compute architectures, algorithms for seismic modeling and imaging need to be reconsidered in terms of parallelization. The aim of this paper is to compare scalability of seismic modeling algorithms: finite differences, continuous mass-lumped finite elements a

  8. Working towards a scalable model of problem-based learning instruction in undergraduate engineering education

    Science.gov (United States)

    Mantri, Archana

    2014-05-01

    The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing the curriculum around a set of open-ended problems (OEPs). A detailed statistical analysis is presented of the data collected to measure the effects of traditional and PBL instruction for three courses in Electronics and Communication Engineering, namely Analog Electronics, Digital Electronics and Pulse, Digital & Switching Circuits. It measures the effects of pedagogy, gender and cognitive styles on the knowledge, skill and attitude of the students. The study was conducted twice, with content designed around the same set of OEPs but with two different trained facilitators for all three courses. The repeatability of the results for the effects of the independent parameters on the dependent parameters is studied and inferences are drawn.

  9. File format for storage of scalable video

    Institute of Scientific and Technical Information of China (English)

    BAI Gang; SUN Xiao-yan; WU Feng; YIN Bao-cai; LI Shi-peng

    2006-01-01

    A file format for storage of scalable video is proposed in this paper. A generic model is presented to enable a codec-independent description of a scalable video stream. The relationships, especially the dependencies, among sub-streams in a scalable video stream are specified sufficiently and effectively in the proposed model. Complying with the presented scalable video stream model, the file format for scalable video is proposed based on the ISO Base Media File Format, and is simple and flexible enough to address the demands of scalable video applications as well as non-scalable ones.

  10. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    OpenAIRE

    WoonSeong Jeong; Jong Bum Kim; Clayton, Mark J.; Haberl, Jeff S.; Wei Yan

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented, declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The proces...

  11. Irregular Shaped Building Design Optimization with Building Information Modelling

    Directory of Open Access Journals (Sweden)

    Lee Xia Sheng

    2016-01-01

    Full Text Available This research is to recognise the function of Building Information Modelling (BIM in design optimization for irregular shaped buildings. The study focuses on a conceptual irregular shaped “twisted” building design similar to some existing sculpture-like architectures. Form and function are the two most important aspects of new buildings, which are becoming more sophisticated as parts of equally sophisticated “systems” that we are living in. Nowadays, it is common to have irregular shaped or sculpture-like buildings which are very different when compared to regular buildings. Construction industry stakeholders are facing stiff challenges in many aspects such as buildability, cost effectiveness, delivery time and facility management when dealing with irregular shaped building projects. Building Information Modelling (BIM is being utilized to enable architects, engineers and constructors to gain improved visualization for irregular shaped buildings; this has a purpose of identifying critical issues before initiating physical construction work. In this study, three variations of design options differing in rotating angle: 30 degrees, 60 degrees and 90 degrees are created to conduct quantifiable comparisons. Discussions are focused on three major aspects including structural planning, usable building space, and structural constructability. This research concludes that Building Information Modelling is instrumental in facilitating design optimization for irregular shaped building. In the process of comparing different design variations, instead of just giving “yes or no” type of response, stakeholders can now easily visualize, evaluate and decide to achieve the right balance based on their own criteria. Therefore, construction project stakeholders are empowered with superior evaluation and decision making capability.

  12. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Don W.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  13. Energy modelling and capacity building

    International Nuclear Information System (INIS)

    The Planning and Economic Studies Section of the IAEA's Department of Nuclear Energy is focusing on building analytical capacity in MS for energy-environmental-economic assessments and for the elaboration of sustainable energy strategies. It offers a variety of analytical models specifically designed for use in developing countries for (i) evaluating alternative energy strategies; (ii) assessing environmental, economic and financial impacts of energy options; (iii) assessing infrastructure needs; (iv) evaluating regional development possibilities and energy trade; (v) assessing the role of nuclear power in addressing priority issues (climate change, energy security, etc.). These models can be used for analysing energy or electricity systems, and to assess possible implications of different energy, environmental or financial policies that affect the energy sector and energy systems. The models vary in complexity and data requirements, and so can be adapted to the available data, statistics and analytical needs of different countries. These models are constantly updated to reflect changes in the real world and in the concerns that drive energy system choices. They can provide thoughtfully informed choices for policy makers over a broader range of circumstances and interests. For example, they can readily reflect the workings of competitive energy and electricity markets, and cover such topics as external costs. The IAEA further offers training in the use of these models and, just as important, in the interpretation and critical evaluation of results. Training of national teams to develop national competence over the full spectrum of models is a high priority. The IAEA maintains a broad spectrum of databanks relevant to energy, economic and environmental analysis in MS, and makes these data available to analysts in MS for use in their own analytical work.
The Reference Technology Data Base (RTDB) and the Reference Data Series (RDS-1) are the major vehicles by which we

  14. Scalable devices

    KAUST Repository

    Krüger, Jens J.

    2014-01-01

    In computer science in general, and in the field of high performance computing and supercomputing in particular, the term scalable plays an important role. It indicates that a piece of hardware, a concept, an algorithm, or an entire system scales with the size of the problem, i.e., it can not only be used in a very specific setting but is applicable to a wide range of problems, from small scenarios to possibly very large settings. In this spirit, there exist a number of fixed areas of research on scalability. There are works on scalable algorithms and scalable architectures, but what are scalable devices? In the context of this chapter, we are interested in a whole range of display devices, ranging from small-scale hardware such as tablet computers, pads, smart-phones etc. up to large tiled display walls. What interests us is not so much the hardware setup but rather the visualization algorithms behind these display systems that scale from your average smart phone up to the largest gigapixel display walls.

  15. Towards a large-scale scalable adaptive heart model using shallow tree meshes

    Science.gov (United States)

    Krause, Dorian; Dickopf, Thomas; Potse, Mark; Krause, Rolf

    2015-10-01

    Electrophysiological heart models are sophisticated computational tools that place high demands on the computing hardware due to the high spatial resolution required to capture the steep depolarization front. To address this challenge, we present a novel scheme for resolving the depolarization front accurately using adaptivity in space. Our adaptive scheme is based on locally structured meshes. These tensor meshes in space are organized in a parallel forest of trees, which allows us to resolve complicated geometries and to realize high variations in the local mesh sizes with a minimal memory footprint in the adaptive scheme. We discuss both a non-conforming mortar element approximation and a conforming finite element space, and present an efficient technique for the assembly of the respective stiffness matrices using matrix representations of the inclusion operators into the product space on the so-called shallow tree meshes. We analyzed the parallel performance and scalability for a two-dimensional ventricle slice as well as for a full large-scale heart model. Our results demonstrate that the method has good performance and high accuracy.

  16. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  17. Scalability and Efficiency of Earth System Models in the Multi-Core-Age

    Science.gov (United States)

    Biercamp, J.; Adamidis, P.; Jahns, T.; Rosenhauer, M.

    2009-04-01

    Climate and Earth system modeling today is performed with intricate systems of coupled models. The complexity and spatial resolution of these models are limited by computing resources. Necessary and envisaged improvements will require an increase of the computing power available to climate models by several orders of magnitude. Until very recently the speed of a single CPU (central processing unit) doubled roughly every two years. This has come to an end for technical and physical reasons. The new rule of thumb says that the number of computational cores will double every two years. Climate modelers will have to learn how to use very large numbers of cores in parallel. Modern supercomputers already use thousands of cores. If, with a global atmosphere model, we wanted to achieve 1000 forecast days per day at a horizontal resolution of 1 km, we would need to run it on more than 10,000,000 processing units in parallel. Today nobody knows how to program such an application, how to handle the enormous data streams produced by it, or how to pay the power bill of such a machine. In this talk we will discuss our strategies to scale earth system models to high numbers of processor cores. We will mainly focus on two projects. "ScalES" (Scalable Earth System Models) is a BMBF-funded project led by DKRZ which started in January 2009. In this project we will identify bottlenecks which inhibit efficient scaling of typical climate models and will implement prototype solutions in the COSMOS coupled Earth system model. In particular, the project will address parallel I/O, load balancing, efficient parallel coupling of component models, and efficient use of state-of-the-art computer architectures. "PeAKliM" (Petaflop-Architectures in Climate und Meteorology) is a joint initiative of climate researchers and mathematicians which aims to tackle the question of what kind of architecture is best suited for climate and weather models, and to investigate already now the algorithms for future

  18. Scalable Nonlinear Solvers for Fully Implicit Coupled Nuclear Fuel Modeling. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Xiao-Chuan [Univ. of Colorado, Boulder, CO (United States). Dept. of Computer Science; Keyes, David [Columbia Univ., New York, NY (United States); Yang, Chao [Univ. of Colorado, Boulder, CO (United States). Dept. of Computer Science; Zheng, Xiang [Columbia Univ., New York, NY (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-29

    The focus of the project is the development and customization of highly scalable domain-decomposition-based preconditioning techniques for the numerical solution of nonlinear, coupled systems of partial differential equations (PDEs) arising from nuclear fuel simulations. These high-order PDEs represent multiple interacting physical fields (for example, heat conduction, oxygen transport, solid deformation), each of which is modeled by a certain type of Cahn-Hilliard and/or Allen-Cahn equation. Most existing approaches involve a careful splitting of the fields and the use of field-by-field iterations to obtain a solution of the coupled problem. Such approaches have many advantages, such as ease of implementation since only single-field solvers are needed, but also exhibit disadvantages. For example, certain nonlinear interactions between the fields may not be fully captured, and for unsteady problems, stable time integration schemes are difficult to design. In addition, when implemented on large-scale parallel computers, the sequential nature of the field-by-field iterations substantially reduces the parallel efficiency. To overcome these disadvantages, fully coupled approaches have been investigated in order to obtain full-physics simulations.

  19. Virtual building environments (VBE) - Applying information modeling to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staff can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software tools operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It also reports on the VBE Initiative and the benefits from a couple of early VBE projects.

  20. Effect of Mobility Models on Reinforcement Learning Based Routing Algorithm Applied for Scalable AD HOC Network Environment

    Directory of Open Access Journals (Sweden)

    Shrirang.Ambaji.Kulkarn

    2010-11-01

    Full Text Available Mobile Ad Hoc Networks face the greatest challenge for better performance in terms of mobility characterization. The mobility of nodes and their underlying mobility models have a profound effect on the performance of routing protocols, which are central to the design of ad hoc networks. Most of the traditional routing algorithms proposed for ad hoc networks do not scale well when the traffic variation increases drastically. To model a solution to this problem we consider a reinforcement learning based routing algorithm for ad hoc networks known as SAMPLE. Most of the scalability investigations of ad hoc network performance have not considered the group mobility of nodes. In this paper we model a realistic group vehicular mobility model and analyze the robustness of a reinforcement learning based routing algorithm under scalable conditions.
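The core idea of reinforcement-learning-based routing, of which SAMPLE is one instance, can be sketched as Q-learning of per-hop delivery costs on a toy topology. The network, link costs, and learning parameters below are illustrative assumptions, not SAMPLE's actual collaborative reinforcement learning scheme:

```python
import random

# Toy 4-node network: node 0 must route packets to node 3.
# Path 0 -> 1 -> 3 costs 2.0; path 0 -> 2 -> 3 costs 5.0.
LINKS = {0: [1, 2], 1: [3], 2: [3], 3: []}
COST = {(0, 1): 1.0, (0, 2): 4.0, (1, 3): 1.0, (2, 3): 1.0}
DEST = 3

Q = {(n, m): 0.0 for n in LINKS for m in LINKS[n]}  # estimated cost-to-go per hop
alpha, eps = 0.5, 0.2                               # learning rate, exploration rate
random.seed(1)

for _ in range(500):                                # simulated packet deliveries
    node = 0
    while node != DEST:
        nbrs = LINKS[node]
        # epsilon-greedy next-hop choice: mostly the cheapest estimate.
        nxt = random.choice(nbrs) if random.random() < eps \
            else min(nbrs, key=lambda m: Q[(node, m)])
        future = 0.0 if nxt == DEST else min(Q[(nxt, m)] for m in LINKS[nxt])
        # Q-learning update toward (hop cost + best estimated remaining cost).
        Q[(node, nxt)] += alpha * (COST[(node, nxt)] + future - Q[(node, nxt)])
        node = nxt

best = min(LINKS[0], key=lambda m: Q[(0, m)])
print(f"learned first hop from node 0: {best}")   # expect 1 (the cheaper path)
```

Because the per-hop estimates are updated from experienced traffic, such a scheme adapts as mobility changes link costs, which is exactly the property the mobility-model study stresses.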

  1. Scalable Generalization of Hydraulic Conductivity in Quaternary Strata for Use in a Regional Groundwater Model

    Science.gov (United States)

    Jatnieks, J.; Popovs, K.; Klints, I.; Timuhins, A.; Kalvans, A.; Delina, A.; Saks, T.

    2012-04-01

    The cover of Quaternary sediments, especially in formerly glaciated territories, is usually the most complex part of the sedimentary sequence. In regional hydro-geological models it is often treated as a single layer with uniform or calibrated properties (Valner 2003). However, the properties and structure of Quaternary sediments control groundwater recharge: they can either direct the groundwater flow horizontally towards discharge in topographic lows or vertically, recharging groundwater in the bedrock. This work aims to present calibration results and to detail our experience in integrating a scalable generalization of hydraulic conductivity for Quaternary strata into the regional groundwater modelling system for the Baltic artesian basin - MOSYS V1. We also present a method for solving boundary transitions between spatial clusters of lithologically similar structure. In this study the main unit of generalization is the spatial cluster. Clusters are obtained from distance calculations combining the Normalized Compression Distance (NCD) metric, calculated by the CompLearn parameter-free machine learning toolkit, with normalized Euclidean distance measures for the coordinates of the borehole log data. A hierarchical clustering solution is used to obtain a cluster membership identifier for each borehole. Using boreholes as generator points for a Voronoi tessellation and dissolving the resulting polygons according to their cluster membership attribute allows us to obtain spatial regions representing a certain degree of similarity in lithological structure. This degree of similarity and the spatial heterogeneity of the cluster polygons can be varied by different flattenings of the hierarchical cluster model into a variable number of clusters. This provides a scalable generalization solution which can be adapted according to model calibration performance.
Using the dissimilarity matrix of the NCD metric, a borehole most similar to all the others from the lithological structure
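The NCD metric at the heart of the clustering step is easy to approximate with a general-purpose compressor. The sketch below uses zlib rather than CompLearn's compressors, and the lithology strings are invented examples, not real borehole logs:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a parameter-free similarity measure
    (as popularized by the CompLearn toolkit), approximated here with zlib."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Illustrative borehole logs written as depth-ordered lithology codes.
log_a = b"sand sand sand gravel gravel clay clay till till till till"
log_b = b"sand sand gravel gravel clay clay clay till till till till"
log_c = b"peat peat limestone limestone dolomite dolomite marl marl chalk"

print(f"A vs B: {ncd(log_a, log_b):.3f}")   # similar sequences -> smaller distance
print(f"A vs C: {ncd(log_a, log_c):.3f}")   # dissimilar sequences -> larger distance
```

A pairwise NCD matrix over all borehole logs, blended with normalized Euclidean distance on coordinates as the record describes, is what feeds the hierarchical clustering.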

  2. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    Full Text Available This article provides a comprehensive review of the recently accelerated development of information technology within the project market, covering the industrial, engineering, procurement and construction sectors. The author's aim is to cover the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has also become one of the major conditions for intensive technology development. All of this has created a great impulse toward shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  3. Model for Refurbishment of Heritage Buildings

    OpenAIRE

    Rasmussen, Torben Valdbjørn

    2014-01-01

    A model intended for the selection of feasible refurbishment measures for heritage buildings was developed. The model showed how to choose, evaluate and implement measures that create synergy between the interests in preserving heritage values and creating cost efficient refurbishment that complies with the requirements for the use of the building. The model focuses on the cooperation and dialogue between authorities and owners, who refurbish heritage buildings. The developed model was used f...

  4. Scalability Test of multiscale fluid-platelet model for three top supercomputers

    Science.gov (United States)

    Zhang, Peng; Zhang, Na; Gao, Chao; Zhang, Li; Gao, Yuxiang; Deng, Yuefan; Bluestein, Danny

    2016-07-01

    We have tested the scalability of three supercomputers: the Tianhe-2, Stampede and CS-Storm with multiscale fluid-platelet simulations, in which a highly-resolved and efficient numerical model for nanoscale biophysics of platelets in microscale viscous biofluids is considered. Three experiments involving varying problem sizes were performed: Exp-S: 680,718-particle single-platelet; Exp-M: 2,722,872-particle 4-platelet; and Exp-L: 10,891,488-particle 16-platelet. Our implementations of multiple time-stepping (MTS) algorithm improved the performance of single time-stepping (STS) in all experiments. Using MTS, our model achieved the following simulation rates: 12.5, 25.0, 35.5 μs/day for Exp-S and 9.09, 6.25, 14.29 μs/day for Exp-M on Tianhe-2, CS-Storm 16-K80 and Stampede K20. The best rate for Exp-L was 6.25 μs/day for Stampede. Utilizing current advanced HPC resources, the simulation rates achieved by our algorithms bring within reach performing complex multiscale simulations for solving vexing problems at the interface of biology and engineering, such as thrombosis in blood flow which combines millisecond-scale hematology with microscale blood flow at resolutions of micro-to-nanoscale cellular components of platelets. This study of testing the performance characteristics of supercomputers with advanced computational algorithms that offer optimal trade-off to achieve enhanced computational performance serves to demonstrate that such simulations are feasible with currently available HPC resources.
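The multiple time-stepping (MTS) approach that outperformed single time-stepping in these experiments can be illustrated with a RESPA-style integrator on a toy oscillator: the cheap, stiff "fast" force is integrated with small inner steps while the expensive "slow" force is applied only at outer step boundaries. The forces, step sizes, and split below are illustrative, not the paper's platelet model:

```python
# Illustrative RESPA-style multiple time-stepping on a 1-D particle whose
# potential splits into a stiff cheap part and a soft expensive part.
def fast_force(x):   # e.g. stiff nanoscale interactions, evaluated every inner step
    return -100.0 * x

def slow_force(x):   # e.g. microscale fluid coupling, evaluated once per outer step
    return -1.0 * x

def mts_step(x, v, dt, k):
    """One outer step with k inner sub-steps (velocity-Verlet on the fast force,
    half-kicks of the slow force at the outer boundaries)."""
    v += 0.5 * dt * slow_force(x)
    inner = dt / k
    for _ in range(k):
        v += 0.5 * inner * fast_force(x)
        x += inner * v
        v += 0.5 * inner * fast_force(x)
    v += 0.5 * dt * slow_force(x)
    return x, v

x, v = 1.0, 0.0
e0 = 0.5 * v**2 + 0.5 * 101.0 * x**2      # total energy of the combined oscillator
for _ in range(1000):
    x, v = mts_step(x, v, dt=0.01, k=8)
e1 = 0.5 * v**2 + 0.5 * 101.0 * x**2
print(f"relative energy drift: {abs(e1 - e0) / e0:.2e}")
```

The payoff is that the expensive force is evaluated k times less often than the fast one while the integration stays stable, which is the mechanism behind the improved microseconds-per-day rates reported above.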

  5. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    Science.gov (United States)

    Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e. using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dim. array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems.
The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our
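The partition-then-compute pattern described in the abstract can be sketched in plain Python. SciSpark itself does this with Spark RDDs across cluster nodes; here a thread pool and toy in-memory records stand in for the cluster and the time-sorted variable arrays.

```python
# Pure-Python sketch of "partition by time, each worker pulls its bundle,
# then reduce" (SciSpark uses Spark RDDs; pool, bundles and records are
# illustrative stand-ins).
from concurrent.futures import ThreadPoolExecutor

def partition_by_time(records, n_parts):
    """Split a time-sorted list of (time, value) pairs into contiguous bundles."""
    size = -(-len(records) // n_parts)          # ceiling division
    return [records[i:i + size] for i in range(0, len(records), size)]

def bundle_mean(bundle):
    return sum(v for _, v in bundle) / len(bundle)

records = [(t, float(t % 4)) for t in range(16)]  # toy "observations"
bundles = partition_by_time(records, 4)

with ThreadPoolExecutor(max_workers=4) as pool:   # each worker takes a bundle
    partial_means = list(pool.map(bundle_mean, bundles))

overall = sum(partial_means) / len(partial_means) # reduce step
print(overall)  # 1.5
```

The in-memory emphasis matters because each stage hands its partial results to the next without the disk round trip the abstract identifies as the bottleneck.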

  6. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

    A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  7. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    Science.gov (United States)

    Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) Decrease the time to compute comparison statistics and plots from minutes to seconds; (2) Allow for interactive exploration of time-series properties over seasons and years; (3) Decrease the time for satellite data ingestion into RCMES to hours; (4) Allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) Move RCMES into a near real time decision-making platform.
We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDF

  8. Modeling, Fabrication and Characterization of Scalable Electroless Gold Plated Nanostructures for Enhanced Surface Plasmon Resonance

    Science.gov (United States)

    Jang, Gyoung Gug

    The scientific and industrial demand for controllable thin gold (Au) film and Au nanostructures is increasing in many fields including opto-electronics, photovoltaics, MEMS devices, diagnostics, bio-molecular sensors, spectro-/microscopic surfaces and probes. In this study, a novel continuous flow electroless (CF-EL) Au plating method is developed to fabricate uniform Au thin films under ambient conditions. The enhanced local mass transfer rate and continuous deposition resulting from CF-EL plating improved the physical uniformity of deposited Au films and thermally transformed nanoparticles (NPs). Au films and NPs exhibited improved optical photoluminescence (PL) and surface plasmon resonance (SPR), respectively, relative to batch immersion EL (BI-EL) plating. The suggested mass transfer models of Au mole deposition are consistent with the optical features of CF-EL and BI-EL films. The prototype CF-EL plating system was upgraded to an automated scalable CF-EL plating system with real-time transmission UV-vis (T-UV) spectroscopy, which retains the advantages of CF-EL plating, such as more uniform surface morphology, and overcomes the disadvantages of conventional EL plating, namely the lack of a continuous process and a low deposition rate, by providing a continuous process with a controllable deposition rate. Throughout this work, dynamic morphological and chemical transitions during redox-driven self-assembly of Ag and Au films on silica surfaces under kinetic and equilibrium conditions are distinguished by correlating real-time T-UV spectroscopy with X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy (SEM) measurements.
The characterization suggests that four previously unrecognized time-dependent physicochemical regimes occur during consecutive EL deposition of silver (Ag) and Au onto tin-sensitized silica surfaces: self-limiting Ag activation; transitory Ag NP formation; transitional Au-Ag alloy formation during galvanic replacement of Ag by Au; and uniform morphology formation under

  9. Detailed Modeling, Design, and Evaluation of a Scalable Multi-level Checkpointing System

    Energy Technology Data Exchange (ETDEWEB)

    Moody, A T; Bronevetsky, G; Mohror, K M; de Supinski, B R

    2010-04-09

    High-performance computing (HPC) systems are growing more powerful by utilizing more hardware components. As the system mean-time-before-failure correspondingly drops, applications must checkpoint more frequently to make progress. However, as system memory sizes grow faster than the bandwidth to the parallel file system, the cost of checkpointing begins to dominate application run times. A potential solution to this problem is to use multi-level checkpointing, which employs multiple types of checkpoints with different costs and different levels of resiliency in a single run. The goal is to design light-weight checkpoints to handle the most common failure modes and rely on more expensive checkpoints for less common, but more severe failures. While this approach is theoretically promising, it has not been fully evaluated in a large-scale, production system context. To this end we have designed a system, called the Scalable Checkpoint/Restart (SCR) library, that writes checkpoints to storage on the compute nodes utilizing RAM, Flash, or disk, in addition to the parallel file system. We present the performance and reliability properties of SCR as well as a probabilistic Markov model that predicts its performance on current and future systems. We show that multi-level checkpointing improves efficiency on existing large-scale systems and that this benefit increases as the system size grows. In particular, we developed low-cost checkpoint schemes that are 100x-1000x faster than the parallel file system and effective against 85% of our system failures. This leads to a gain in machine efficiency of up to 35%, and it reduces the load on the parallel file system by a factor of two on current and future systems.
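The cost trade-off behind multi-level checkpointing can be made concrete with the classic Young first-order approximation for the optimal checkpoint interval (the paper itself uses a more detailed Markov model; the formula and all numbers below are a standard textbook illustration, not taken from SCR).

```python
# Young's approximation: optimal checkpoint interval tau ~ sqrt(2 * C * MTBF),
# where C is the checkpoint cost and MTBF the mean time between failures.
# Values are illustrative, not measurements from the SCR paper.
import math

def young_interval(checkpoint_cost, mtbf):
    return math.sqrt(2.0 * checkpoint_cost * mtbf)

# Cheap node-local checkpoints (RAM/Flash) guard against frequent, mild
# failures; expensive parallel-file-system checkpoints guard against rare,
# severe ones, so each level gets its own interval.
tau_local = young_interval(checkpoint_cost=5.0, mtbf=3600.0)        # seconds
tau_pfs = young_interval(checkpoint_cost=500.0, mtbf=3600.0 * 24)

print(round(tau_local), round(tau_pfs))  # 190 9295
```

The two intervals show why a multi-level scheme wins: the light level can checkpoint every few minutes at low overhead, while the heavy level fires only every few hours.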

  10. An Enhanced Secure and Scalable Model for Enterprise Applications using Automated Monitoring

    Directory of Open Access Journals (Sweden)

    A. Kannammal

    2006-01-01

    E-Business must be highly secured and scalable to provide efficient services to millions of clients on the web. This paper proposes a new approach based on shared objects to improve security and mobile agents to improve scalability. The e-business uses shared objects and mobile agents to update the clients automatically with new information. The agent that resides in the database server is informed about the new information by triggering a function. The agent then updates the shared object, which is accessed by another agent that sends the information to the clients. This approach improves security, as clients are not aware of the location of the central database, and makes the e-business more scalable by deploying mobile agents. The shared object is designed in such a way that it synchronizes the data transfer between agents. The proposed approach is implemented in a testing environment and its performance is analyzed. The analysis has shown that the proposed approach improves the security of business data and the scalability of database servers while providing synchronized data transfer.
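The synchronized hand-off between the database-side agent and the client-facing agent can be sketched with a condition-variable-protected shared object. Names, structure and the two-agent split below are illustrative, not the paper's implementation.

```python
# Minimal sketch of a shared object that synchronizes the hand-off between an
# "updater" agent (database side) and a "notifier" agent (client side).
import threading

class SharedObject:
    def __init__(self):
        self._cond = threading.Condition()
        self._items = []
        self._closed = False

    def publish(self, item):               # called by the database-side agent
        with self._cond:
            self._items.append(item)
            self._cond.notify()

    def close(self):
        with self._cond:
            self._closed = True
            self._cond.notify_all()

    def take(self):                        # called by the client-facing agent
        with self._cond:
            while not self._items and not self._closed:
                self._cond.wait()
            return self._items.pop(0) if self._items else None

shared = SharedObject()
delivered = []

def notifier():
    while (item := shared.take()) is not None:
        delivered.append(item)             # stands in for "send to clients"

t = threading.Thread(target=notifier)
t.start()
for update in ("price-change", "new-stock"):
    shared.publish(update)
shared.close()
t.join()
print(delivered)  # ['price-change', 'new-stock']
```

The condition variable is what gives the "synchronized data transfer" property: the notifier blocks until an update exists, and no update is lost or delivered twice.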

  11. Thermal Models for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2012-01-01

    One approach being pursued is to use the heat capacity of the thermal mass in buildings to temporarily store excess power production by increasing the electrical heating; likewise, the electrical heating can be postponed in periods with a lack of production. To exploit the potential in thermal storage and to ensure the comfort of residents, proper prediction models for indoor temperature have to be developed. This paper presents a model for prediction of indoor temperature and power consumption from electrical space heating in an office building, using stochastic differential equations. The heat dynamic model is built from measurements in an actual office building using a maximum likelihood technique.
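Models of this grey-box kind reduce, in their simplest deterministic form, to a first-order "RC" heat balance: one thermal capacity exchanging heat with ambient air through a resistance, driven by the electrical heating. The discretization and all parameter values below are illustrative, not the paper's estimated model.

```python
# Discrete-time sketch of a first-order heat-dynamic (RC) building model:
#   C * dT/dt = (T_ambient - T) / R + P_heating
# Parameters are invented for illustration, not estimated from data.

def simulate_indoor_temp(t0, t_ambient, power, dt, R, C, steps):
    temps = [t0]
    for k in range(steps):
        t = temps[-1]
        dT = dt * ((t_ambient - t) / (R * C) + power[k] / C)
        temps.append(t + dT)
    return temps

# One hour at 60 s steps, 2 kW heating, 0 degrees C outside.
temps = simulate_indoor_temp(t0=20.0, t_ambient=0.0,
                             power=[2000.0] * 60, dt=60.0,
                             R=0.01, C=1.0e7, steps=60)
print(round(temps[-1], 2))  # 20.0: 2 kW exactly balances the loss at 20 C here
```

Raising or postponing `power` in this loop is exactly the storage lever the abstract describes: the thermal mass `C` lets the indoor temperature drift slowly while heating follows the power system's needs.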

  12. Scalable coherent interface

    International Nuclear Information System (INIS)

    The Scalable Coherent Interface (IEEE P1596) is establishing an interface standard for very high performance multiprocessors, supporting a cache-coherent-memory model scalable to systems with up to 64K nodes. This Scalable Coherent Interface (SCI) will supply a peak bandwidth per node of 1 GigaByte/second. The SCI standard should facilitate assembly of processor, memory, I/O and bus bridge cards from multiple vendors into massively parallel systems with throughput far above what is possible today. The SCI standard encompasses two levels of interface, a physical level and a logical level. The physical level specifies electrical, mechanical and thermal characteristics of connectors and cards that meet the standard. The logical level describes the address space, data transfer protocols, cache coherence mechanisms, synchronization primitives and error recovery. In this paper we address logical level issues such as packet formats, packet transmission, transaction handshake, flow control, and cache coherence. 11 refs., 10 figs

  13. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...

  14. DOE Commercial Building Benchmark Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Torcelini, P.; Deru, M.; Griffith, B.; Benne, K.; Halverson, M.; Winiarski, D.; Crawley, D. B.

    2008-07-01

    To provide a consistent baseline of comparison and save time conducting such simulations, the U.S. Department of Energy (DOE) has developed a set of standard benchmark building models. This paper will provide an executive summary overview of these benchmark buildings, and how they can save building analysts valuable time. Fully documented and implemented to use with the EnergyPlus energy simulation program, the benchmark models are publicly available and new versions will be created to maintain compatibility with new releases of EnergyPlus. The benchmark buildings will form the basis for research on specific building technologies, energy code development, appliance standards, and measurement of progress toward DOE energy goals. Having a common starting point allows us to better share and compare research results and move forward to make more energy efficient buildings.

  15. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM and Building Energy Modeling (BEM that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1 the BIM-based Modelica models generated from Revit2Modelica and (2 BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1 enables BIM models to be translated into ModelicaBEM models, (2 enables system interface development based on the MVD for thermal simulation, and (3 facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  16. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954
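The object-mapping step at the heart of an MVD can be sketched as a lookup from BIM component types to target simulation classes, emitting instantiations. The class names and record layout below are placeholders for illustration, not the actual MVD or the LBNL Modelica Buildings library classes.

```python
# Toy sketch of MVD-style object mapping: BIM component types map to
# Modelica-style classes and are emitted as instantiations. Class names
# are invented placeholders.

MVD_MAP = {
    "Wall":   "ThermalConductor",
    "Window": "Glazing",
    "Space":  "MixedAirRoom",
}

def translate(bim_objects):
    lines = []
    for obj in bim_objects:
        cls = MVD_MAP.get(obj["type"])
        if cls is None:
            continue                        # unmapped BIM types are skipped
        lines.append(f'{cls} {obj["name"]};')
    return "\n".join(lines)

model = translate([
    {"type": "Wall",   "name": "wall_south"},
    {"type": "Window", "name": "win_1"},
    {"type": "Door",   "name": "door_1"},   # no mapping, dropped
])
print(model)
# ThermalConductor wall_south;
# Glazing win_1;
```

The point of the intermediate class package is exactly this decoupling: the BIM side and the BEM side each only need to agree with the mapping table, not with each other.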

  17. Non-commutative standard model: model building

    International Nuclear Information System (INIS)

    A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U*(n)) gauge theory we cannot have non-commutative SU(n) and (2) the charges in non-commutative QED are quantized to just 0,±1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U*(3) x U*(2) x U*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  18. An Algorithm to Translate Building Topology in Building Information Modeling into Object-Oriented Physical Modeling-Based Building Energy Modeling

    OpenAIRE

    WoonSeong Jeong; JeongWook Son

    2016-01-01

    This paper presents an algorithm to translate building topology in an object-oriented architectural building model (Building Information Modeling, BIM) into an object-oriented physical-based energy performance simulation by using an object-oriented programming approach. Our algorithm demonstrates efficient mapping of building components in a BIM model into space boundary conditions in an object-oriented physical modeling (OOPM)-based building energy model, and the translation of building topo...

  19. A scalable model for network situational awareness based on Endsley's situation model

    Institute of Scientific and Technical Information of China (English)

    Hu Wei; Li Jianhua; Chen Xiuzhen; Jiang Xinghao; Zuo Min

    2007-01-01

    The paper introduces Endsley's situation model into network security to describe the network security situation, and improves Endsley's data processing to suit network alerts. The proposed model contains information on incident frequency, incident time and incident space. The HoneyNet dataset is selected to evaluate the proposed model. The paper proposes three definitions to depict and simplify the whole situation extraction in detail, and a fusion component to reduce the influence of alert redundancy on the total security situation. The less complex extraction makes the situation analysis more efficient, and the fine-grained model gives the analysis better extensibility. Finally, the situational variation curves are simulated, and the evaluation results prove the situation model applicable and efficient.
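The fusion step, collapsing redundant alerts and scoring the situation from incident frequency, time and space, can be sketched as follows. The tuple layout, weights and log damping are invented for illustration; the paper's three definitions differ in detail.

```python
# Illustrative alert-fusion sketch: duplicate alerts collapse into counts,
# and the situation score damps redundancy with a log term (all weights and
# the scoring form are invented, not the paper's definitions).
import math
from collections import Counter

def fuse(alerts):
    """alerts: list of (incident_type, host, hour); duplicates collapse."""
    return Counter(alerts)

def situation_score(fused, w_freq=1.0, w_space=2.0):
    hosts = {host for (_, host, _) in fused}
    damped = sum(1 + math.log10(c) for c in fused.values())  # damp redundancy
    return w_freq * damped + w_space * len(hosts)

# 50 identical scan alerts should not dominate one distinct exploit alert.
alerts = [("scan", "10.0.0.5", 13)] * 50 + [("exploit", "10.0.0.7", 14)]
fused = fuse(alerts)
print(len(fused), round(situation_score(fused), 2))  # 2 7.7
```

Without the damping, the 50 redundant scan alerts would contribute 50x the weight of the single exploit, which is precisely the redundancy effect the fusion component is meant to suppress.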

  20. Building Information Modelling in Denmark and Iceland

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Jóhannesson, Elvar Ingi

    2013-01-01

    Purpose – The purpose of this paper is to explore the implementation of building information modelling (BIM) in the Nordic countries of Europe, with particular focus on the Danish building industry, with the aim of making use of its experience for the Icelandic building industry. Design/methodology/approach – The research is based on two separate analyses. In the first part, the deployment of information and communication technology (ICT) in the Icelandic building industry is investigated and compared with the other Nordic countries. In the second part, the experience in Denmark from implementing and working with BIM is studied. Based on findings from both parts, ideas and recommendations are put forward for the Icelandic building industry about feasible ways of implementing BIM. Findings – Among the results are that the use of BIM is very limited in the Icelandic companies compared to the other...

  1. Modeling energy efficiency of bioclimatic buildings

    Energy Technology Data Exchange (ETDEWEB)

    Tzikopoulos, A.F.; Karatza, M.C.; Paravantis, J.A. [Piraeus Univ. (Greece). Dept. of Technology Education and Digital Systems

    2005-05-01

    The application of bioclimatic principles is a critical factor in reducing energy consumption and CO{sub 2} emissions of the building sector. This paper develops a regression model of energy efficiency as a function of environmental conditions, building characteristics and passive solar technologies. A sample of 77 bioclimatic buildings (including 45 houses) was collected, covering Greece, other Mediterranean areas and the rest of Europe. Energy efficiency varied from 19.6% to 100%, with an average of about 68%. Environmental conditions included latitude, altitude, ambient temperature, degree days and sun hours; building characteristics consisted of building area and volume. Passive solar technologies included (among others) solar water heaters, shading, natural ventilation, greenhouses and thermal storage walls. Degree days and a dummy variable indicating location in the Mediterranean area were the strongest predictors of energy efficiency, while taller and leaner buildings tended to be more energy efficient. Surprisingly, many passive technologies did not appear to make a difference to energy efficiency, while thermal storage walls in fact seemed to decrease energy efficiency. The model developed may be of use to architects, engineers and policy makers. Suggestions for further research include obtaining more building information, investigating the effect of passive solar technologies and gathering information on the usage of buildings. (Author)
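A single-predictor version of such a regression, efficiency against degree days, reduces to ordinary least squares. The data points below are synthetic, made up purely to exercise the fit, not taken from the sample of 77 buildings.

```python
# Single-predictor OLS sketch: energy efficiency (%) vs. heating degree days.
# The five data points are synthetic and perfectly linear by construction.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx          # (slope, intercept)

degree_days = [1000, 1500, 2000, 2500, 3000]
efficiency = [80.0, 75.0, 70.0, 65.0, 60.0]   # lies on eff = 90 - 0.01 * dd

slope, intercept = ols(degree_days, efficiency)
print(slope, intercept)  # -0.01 90.0
```

The negative slope mirrors the paper's finding that degree days are a strong (inverse) predictor; the full model adds the remaining environmental and building variables as further regressors.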

  2. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years, it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolved in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  3. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    Science.gov (United States)

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  4. Building dynamic spatial environmental models

    OpenAIRE

    Karssenberg, D.J.

    2003-01-01

    An environmental model is a representation or imitation of complex natural phenomena that can be discerned by human cognitive processes. This thesis deals with the type of environmental models referred to as dynamic spatial environmental models. The word ‘spatial’ refers to the geographic domain which they represent, which is the two- or three-dimensional space, while ‘dynamic’ refers to models simulating changes through time using rules of cause and effect, represented in mathematical equati...

  5. Building dynamic spatial environmental models

    NARCIS (Netherlands)

    Karssenberg, D.J.

    2003-01-01

    An environmental model is a representation or imitation of complex natural phenomena that can be discerned by human cognitive processes. This thesis deals with the type of environmental models referred to as dynamic spatial environmental models. The word ‘spatial’ refers to the geographic domain whi

  6. Economic aspects and models for building codes

    DEFF Research Database (Denmark)

    Bonke, Jens; Pedersen, Dan Ove; Johnsen, Kjeld

    It is the purpose of this bulletin to present an economic model for estimating the consequence of new or changed building codes. The object is to allow comparative analysis in order to improve the basis for decisions in this field. The model is applied in a case study.

  7. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
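A minimal version of such a logistic-regression probability model can be fitted by plain gradient descent. The single attribute (a scaled building size), the labels and all coefficients below are synthetic, for illustration only; the study's model uses real incident data and multiple building parameters.

```python
# Tiny logistic-regression sketch: probability of fire from one scaled
# building attribute, fitted by gradient descent on synthetic data.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        gb = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Synthetic labels: larger buildings marked as having had a fire more often.
sizes = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
fires = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(sizes, fires)

p_small = sigmoid(w * 0.15 + b)
p_large = sigmoid(w * 0.95 + b)
print(p_small < 0.5 < p_large)  # True
```

The fitted probabilities are exactly what gets rasterized into the probability maps: evaluate `sigmoid(w*x + b)` per building, then color the map by the result.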

  8. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  9. Model for Refurbishment of Heritage Buildings

    DEFF Research Database (Denmark)

    Rasmussen, Torben Valdbjørn

    2014-01-01

    A model intended for the selection of feasible refurbishment measures for heritage buildings was developed. The model showed how to choose, evaluate and implement measures that create synergy between the interests in preserving heritage values and creating cost-efficient refurbishment that complies with the requirements for the use of the building. The model focuses on the cooperation and dialogue between authorities and owners, who refurbish heritage buildings. The developed model was used for the refurbishment of the listed complex, Fæstningens Materialgård. Fæstningens Materialgård is a case study where the Heritage Agency, the Danish Working Environment Authority and the owner as a team cooperated in identifying feasible refurbishments. In this case, the focus centered on restoring and identifying potential energy savings and deciding on energy upgrading measures for the listed complex...

  10. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy-based principles are commonly used in many fields of acoustics, especially in building acoustics. These include simple energy flow models, the framework of statistical energy analysis (SEA), as well as more elaborate principles such as wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy-based prediction models are discussed and critically reviewed. Special attention is placed on...
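The energy-flow principle underlying SEA can be illustrated with the textbook two-subsystem power balance: injected power equals internal dissipation plus net coupling flow, solved for the subsystem energies. This is the generic formulation, not the EN 12354 one, and the loss factors are illustrative values.

```python
# Two-subsystem SEA power balance (textbook form, illustrative loss factors):
#   P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
#   P2 = omega * ((eta2 + eta21) * E2 - eta12 * E1)
# solved as a 2x2 linear system for the energies E1, E2.

def sea_two_subsystems(P1, P2, omega, eta1, eta2, eta12, eta21):
    a, b = omega * (eta1 + eta12), -omega * eta21
    c, d = -omega * eta12, omega * (eta2 + eta21)
    det = a * d - b * c
    E1 = (P1 * d - b * P2) / det
    E2 = (a * P2 - c * P1) / det
    return E1, E2

# Power injected into subsystem 1 only (e.g. the directly excited wall).
E1, E2 = sea_two_subsystems(P1=1.0, P2=0.0, omega=1000.0,
                            eta1=0.01, eta2=0.01, eta12=0.001, eta21=0.001)
print(E1 > E2 > 0)  # True: energy flows into the undriven subsystem
```

Larger SEA models are the same construction with an N x N loss-factor matrix, which is what makes the approach robust and simple at the expense of spatial detail.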

  11. Modeling UHF Radio Propagation in Buildings.

    Science.gov (United States)

    Honcharenko, Walter

    The potential implementation of wireless Radio Local Area Networks and Personal Communication Services inside buildings requires a thorough understanding of signal propagation within buildings. This work describes a study leading to a theoretical understanding of wave propagation phenomena inside buildings. Covered first is propagation in the clear space between the floor and ceiling, which is modeled using Kirchhoff-Huygens diffraction theory. This, along with ray tracing techniques, is used to develop a model to predict signal coverage inside buildings. Simulations were conducted on a hotel building, two office buildings, and a university building, to which measurements of CW signals were compared, with good agreement. Propagation to other floors was studied to determine the signal strength as a function of the number of floors separating transmitter and receiver. Diffraction paths and through-the-floor paths which carry significant power to the receivers were examined. Comparisons were made to measurements in a hotel building and an office building, in which agreement was excellent. As originally developed for Cellular Mobile Radio (CMR) systems, the sector average is obtained from the spatial average of the received signal as the mobile traverses a path of 20 or so wavelengths. This approach has also been applied indoors with the assumption that a unique average could be obtained by moving either end of the radio link. However, unlike in the CMR environment, inside buildings both ends of the radio link are in a rich multipath environment. It is shown both theoretically and experimentally that moving both ends of the link is required to achieve a unique average. Accurate modeling of the short pulse response of a signal within a building will provide insight for determining the hardware necessary for high speed data transmission and recovery, and a model for determining the impulse response is developed in detail. Lastly, the propagation characteristics of
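    The sector average discussed above is a spatial average of the received signal, conventionally taken in the linear power domain before converting back to decibels. A minimal sketch (the function name and mW units are illustrative assumptions, not from the original work):

```python
import math

def sector_average_db(power_samples_mw):
    """Sector average of received power: average the samples in the
    linear (mW) domain, then express the result in dBm."""
    mean_mw = sum(power_samples_mw) / len(power_samples_mw)
    return 10.0 * math.log10(mean_mw)
```

Averaging in the linear domain matters: averaging dB values directly would weight deep multipath fades disproportionately.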

  12. A procedure for Building Product Models

    DEFF Research Database (Denmark)

    Hvam, Lars

    1999-01-01

    activities. A basic assumption is that engineers have to take the responsibility for building product models to be used in their domain. To do that they must be able to carry out the modeling task on their own without any need for support from computer science experts. This paper presents a set of simple...

  13. Scalable Efficient Composite Event Detection

    OpenAIRE

    Jayaram, K. R.; Eugster, Patrick

    2010-01-01

    Composite event detection (CED) is the task of identifying combinations of events which are meaningful with respect to program-defined patterns. Recent research in event-based programming has focused on language design (in different paradigms), leading to a wealth of prototype programming models and languages. However, implementing CED in an efficient and scalable manner remains an under-addressed problem. In fact, the lack of scalable algorithms is the main roadbloc...

  14. The Ptolemy project: a scalable model for delivering health information in Africa

    OpenAIRE

    Beveridge, Massey; Howard, Andrew; Burton, Kirsteen; Holder, Warren

    2003-01-01

    How is Africa to build up the medical research it needs? Doctors in African research communities are starved of access to the journals and texts their colleagues in more developed countries regard as fundamental to good practice and research. Isolation, burden of practice, and resource limitations make education and research difficult, but the rapid spread of access to the internet reduces these obstacles and provides an increasingly attractive means to disseminate information and build partn...

  15. A procedure for Building Product Models

    DEFF Research Database (Denmark)

    Hvam, Lars

    1999-01-01

    , easily adaptable concepts and methods from data modeling (object oriented analysis) and domain modeling (product modeling). The concepts are general and can be used for modeling all types of specifications in the different phases in the product life cycle. The modeling techniques presented have been...... activities. A basic assumption is that engineers have to take the responsibility for building product models to be used in their domain. To do that they must be able to carry out the modeling task on their own without any need for support from computer science experts. This paper presents a set of simple...

  16. Minimalism in Inflation Model Building

    OpenAIRE

    Dvali, Gia; Riotto, Antonio

    1997-01-01

    In this paper we demand that a successful inflationary scenario should follow from a model entirely motivated by particle physics considerations. We show that such a connection is indeed possible within the framework of concrete supersymmetric Grand Unified Theories where the doublet-triplet splitting problem is naturally solved. The Fayet-Iliopoulos D-term of a gauge $U(1)_{\xi}$ symmetry, which plays a crucial role in the solution of the doublet-triplet splitting problem, simultaneously pr...

  17. Minimalism in inflation model building

    Science.gov (United States)

    Dvali, Gia; Riotto, Antonio

    1998-01-01

    In this paper we demand that a successful inflationary scenario should follow from a model entirely motivated by particle physics considerations. We show that such a connection is indeed possible within the framework of concrete supersymmetric Grand Unified Theories where the doublet-triplet splitting problem is naturally solved. The Fayet-Iliopoulos D-term of a gauge U(1)ξ symmetry, which plays a crucial role in the solution of the doublet-triplet splitting problem, simultaneously provides a built-in inflationary slope protected from dangerous supergravity corrections.

  18. U.S. Department of Energy Commercial Reference Building Models of the National Building Stock

    Energy Technology Data Exchange (ETDEWEB)

    Deru, M.; Field, K.; Studer, D.; Benne, K.; Griffith, B.; Torcellini, P.; Liu, B.; Halverson, M.; Winiarski, D.; Rosenberg, M.; Yazdanian, M.; Huang, J.; Crawley, D.

    2011-02-01

    The U.S. Department of Energy (DOE) Building Technologies Program has set the aggressive goal of producing marketable net-zero energy buildings by 2025. This goal will require collaboration between the DOE laboratories and the building industry. We developed standard or reference energy models for the most common commercial buildings to serve as starting points for energy efficiency research. These models represent fairly realistic buildings and typical construction practices. Fifteen commercial building types and one multifamily residential building were determined by consensus between DOE, the National Renewable Energy Laboratory, Pacific Northwest National Laboratory, and Lawrence Berkeley National Laboratory, and represent approximately two-thirds of the commercial building stock.

  19. Alternatives to Quintessence Model-building

    CERN Document Server

    Pina-Avelino, P; De Carvalho, J P M; Martins, C J; Pinto, P

    2003-01-01

    We discuss the issue of toy model building for the dark energy component of the universe. Specifically, we consider two generic toy models recently proposed as alternatives to quintessence models, known as Cardassian expansion and the Chaplygin gas. We show that the former is entirely equivalent to a class of quintessence models. We determine the observational constraints on the latter, coming from recent supernovae results and from the shape of the matter power spectrum. As expected, these restrict the model to a behaviour that closely matches that of a standard cosmological constant $\Lambda$.

  20. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Highlights: • Developing a 3D model relating to building architecture, occupancy and HVAC operation. • Two calibration stages developed, final model providing accurate results. • Using an onsite weather station for generating the weather data file in EnergyPlus. • Predicting thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy saving opportunities related to the heat pump of 20–27% were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building areas. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, the calibration methodology, which consists of two levels, was then applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file for the year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Cumulative Variation of Root Mean Squared Error (CV(RMSE)) on hourly based analysis for heat pump electricity consumption varied within the following ranges: (MBE)hourly from −5.6% to 7.5% and CV(RMSE)hourly from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further possibilities of energy savings supplied by a water-to-water heat pump to the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis
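    The MBE and CV(RMSE) calibration metrics cited above have standard definitions (as in ASHRAE Guideline 14); a minimal sketch, noting that the sign convention for MBE varies between authors:

```python
import math

def mbe_percent(measured, simulated):
    """Mean Bias Error (%): here positive when the model under-predicts."""
    diff = sum(m - s for m, s in zip(measured, simulated))
    return 100.0 * diff / sum(measured)

def cvrmse_percent(measured, simulated):
    """Coefficient of Variation of the RMSE (%), normalized by the
    mean of the measured data."""
    n = len(measured)
    rmse = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / (sum(measured) / n)
```

A model is typically judged calibrated on hourly data when |MBE| and CV(RMSE) fall within guideline tolerances, which is how ranges like those quoted above are interpreted.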

  1. A Scalable Software Architecture Booting and Configuring Nodes in the Whitney Commodity Computing Testbed

    Science.gov (United States)

    Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The Whitney project is integrating commodity off-the-shelf PC hardware and software technology to build a parallel supercomputer with hundreds to thousands of nodes. To build such a system, one must have a scalable software model, and the installation and maintenance of the system software must be completely automated. We describe the design of an architecture for booting, installing, and configuring nodes in such a system with particular consideration given to scalability and ease of maintenance. This system has been implemented on a 40-node prototype of Whitney and is to be used on the 500 processor Whitney system to be built in 1998.

  2. Reconstructing building mass models from UAV images

    KAUST Repository

    Li, Minglei

    2015-07-26

    We present an automatic reconstruction pipeline for large scale urban scenes from aerial images captured by a camera mounted on an unmanned aerial vehicle. Using state-of-the-art Structure from Motion and Multi-View Stereo algorithms, we first generate a dense point cloud from the aerial images. Based on the statistical analysis of the footprint grid of the buildings, the point cloud is classified into different categories (i.e., buildings, ground, trees, and others). Roof structures are extracted for each individual building using Markov random field optimization. Then, a contour refinement algorithm based on pivot point detection is utilized to refine the contour of patches. Finally, polygonal mesh models are extracted from the refined contours. Experiments on various scenes as well as comparisons with state-of-the-art reconstruction methods demonstrate the effectiveness and robustness of the proposed method.

  3. Modelling of risk in the building projects

    OpenAIRE

    Dariusz Skorupka

    2006-01-01

    The paper is concerned with the process of risk modelling in building projects. Using a model of a real object is one of the features of the present research work. In some cases, that method is necessary to carry out some forms of experiments. A model is a copy of reality. Modelling enables automation of the various processes and research on an unlimited set of objects. Moreover, formal depiction of reality creates conditions for carrying out broad studies of a given problem and reduces the c...

  4. Protein Models Comparator: Scalable Bioinformatics Computing on the Google App Engine Platform

    OpenAIRE

    Widera, Paweł; Krasnogor, Natalio

    2011-01-01

    The comparison of computer generated protein structural models is an important element of protein structure prediction. It has many uses including model quality evaluation, selection of the final models from a large set of candidates or optimisation of parameters of energy functions used in template-free modelling and refinement. Although many protein comparison methods are available online on numerous web servers, they are not well suited for large scale model comparison: (1) they operate wi...

  5. Building Simulation Modelers are we big-data ready?

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, Jibonananda [ORNL; New, Joshua Ryan [ORNL

    2014-01-01

    Recent advances in computing and sensor technologies have pushed the amount of data we collect or generate to limits previously unheard of. Sub-minute resolution data from dozens of channels is becoming increasingly common and is expected to increase with the prevalence of non-intrusive load monitoring. Experts are running larger building simulation experiments and are faced with an increasingly complex data set to analyze and derive meaningful insight. This paper focuses on the data management challenges that building modeling experts may face in data collected from a large array of sensors, or generated from running a large number of building energy/performance simulations. The paper highlights the technical difficulties that were encountered and overcome in order to run 3.5 million EnergyPlus simulations on supercomputers, generating over 200 TB of simulation output. This extreme case involved development of technologies and insights that will be beneficial to modelers in the immediate future. The paper discusses different database technologies (including relational databases, columnar storage, and schema-less Hadoop) in order to contrast the advantages and disadvantages of employing each for storage of EnergyPlus output. Scalability, analysis requirements, and the adaptability of these database technologies are discussed. Additionally, unique attributes of EnergyPlus output are highlighted which make data entry non-trivial for multiple simulations. Practical experience regarding cost-effective strategies for big-data storage is provided. The paper also discusses network performance issues when transferring large amounts of data across a network to different computing devices. Practical issues involving lag, bandwidth, and methods for synchronizing or transferring logical portions of the data are presented. A cornerstone of big-data is its use for analytics; data is useless unless information can be meaningfully derived from it. In addition to technical

  6. Computer modelling of tornado effects on buildings

    International Nuclear Information System (INIS)

    An attempt is made to model the tornado-structure interaction. The tornado is represented as a Rankine combined vortex. The computations are done on a rectangular grid system. The governing equations are approximated using a control volume procedure. The pressure equations are solved by an efficient preconditioned conjugate gradient procedure. The computed tornado forces are compared with those of straight boundary layer (SBL) wind. The tornado forces on the roof of the building are more than five times those of the SBL flow
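    The Rankine combined vortex used to represent the tornado has a simple tangential velocity profile: solid-body rotation inside the core radius and conserved circulation (free vortex) outside. A minimal sketch, with illustrative parameter names:

```python
def rankine_tangential_velocity(r, v_max, core_radius):
    """Tangential wind speed of a Rankine combined vortex at radius r.

    Inside the core the flow rotates as a solid body (v proportional to r);
    outside, circulation is conserved (v proportional to 1/r), so the peak
    speed v_max occurs exactly at the core radius."""
    if r <= core_radius:
        return v_max * r / core_radius
    return v_max * core_radius / r
```

This piecewise profile is what makes the pressure and force fields on a structure depend so strongly on where the building sits relative to the vortex core.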

  7. Lean construction, building information modelling and sustainability

    OpenAIRE

    Koskela, Lauri; Owen, Bob; Dave, Bhargav

    2010-01-01

    This paper investigates the mutual relations of three current drivers of construction: lean construction, building information modelling and sustainability. These drivers are based on infrequently occurring changes, only incidentally simultaneous, in their respective domains. It is contended that the drivers are mutually supportive and thus synergistic. They are aligned in the sense that all require, promote or enable collaboration. It is argued that these three drivers should ...

  8. Building Information Modelling Incorporating Technology Based Assessment

    OpenAIRE

    Murphy, Maurice; Scott, Lloyd

    2011-01-01

    Building Information Modelling (BIM) is currently being developed as a virtual learning tool for construction and surveying students in the Dublin Institute of Technology. This advanced technology is also used to develop a technology based assessment practice for enhancing the learning environment of construction and surveying students. A theoretical design framework is presented in this paper, which combines advanced technology and assessment theory to create a virtual learning environment. ...

  9. Scripted Building Energy Modeling and Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.

    2012-08-01

    Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.

  10. Building Performance Models from Expert Knowledge

    OpenAIRE

    Abernethy, Margaret A.; Horne, Malcolm; Lillis, Anne M.; Malina, Mary A.; Selto, Frank H.

    2003-01-01

    Improving management control of knowledge-based organizations motivates building performance management models (PMM) of causally related key success factors (KSF). This study elicits knowledge maps of KSF from field experts. These knowledge maps are layered to create the foundation of the organization’s PMM. The study elicits causal knowledge from experts who through their experience, training, etc. have encoded relational or causal knowledge about complex systems; that is, t...

  11. Precast RC Industrial Building design supported by Building Information Model (BIM)

    OpenAIRE

    Mirkac, Tadej

    2010-01-01

    A precast RC industrial building with a typical skeletal structure is modelled, analyzed and documented. The focus of the diploma was the preparation of a building information model which serves as a support in design. We paid special attention to structural analysis, earthquake design, detailing and project documentation. The industrial building is planned as an extension of a bigger facility, so we decided to use the building information model for 3D visualization simulation of phases in construction detai...

  12. An Algorithm to Translate Building Topology in Building Information Modeling into Object-Oriented Physical Modeling-Based Building Energy Modeling

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2016-01-01

    Full Text Available This paper presents an algorithm to translate building topology in an object-oriented architectural building model (Building Information Modeling, BIM) into an object-oriented physical-based energy performance simulation by using an object-oriented programming approach. Our algorithm demonstrates efficient mapping of building components in a BIM model into space boundary conditions in an object-oriented physical modeling (OOPM)-based building energy model, and the translation of building topology into space boundary conditions to create an OOPM model. The implemented command, TranslatingBuildingTopology, using an object-oriented programming approach, enables graphical representation of the building topology of BIM models and the automatic generation of space boundaries information for OOPM models. The algorithm and its implementation allow coherent object-mapping from BIM to OOPM and facilitate the definition of space boundaries information during model translation for building thermal simulation. In order to demonstrate our algorithm and its implementation, we conducted experiments with three test cases using the BESTEST 600 model. Our experiments show that our algorithm and its implementation enable building topology information to be automatically translated into space boundary information, and facilitate the reuse of BIM data in building thermal simulations without additional export or import processes.

  13. Building Information Modelling for Smart Built Environments

    Directory of Open Access Journals (Sweden)

    Jianchao Zhang

    2015-01-01

    Full Text Available Building information modelling (BIM) provides architectural 3D visualization and a standardized way to share and exchange building information. Recently, there has been an increasing interest in using BIM, not only for design and construction, but also the post-construction management of the built facility. With the emergence of smart built environment (SBE) technology, which embeds most spaces with smart objects to enhance the building’s efficiency, security and comfort of its occupants, there is a need to understand and address the challenges BIM faces in the design, construction and management of future smart buildings. In this paper, we investigate how BIM can contribute to the development of SBE. Since BIM is designed to host information of the building throughout its life cycle, our investigation has covered phases from architecture design to facility management. Firstly, we extend BIM for the design phase to provide material/device profiling and the information exchange interface for various smart objects. Next, we propose a three-layer verification framework to assist BIM users in identifying possible defects in their SBE design. For the post-construction phase, we have designed a facility management tool to provide advanced energy management of smart grid-connected SBEs, where smart objects, as well as distributed energy resources (DERs) are deployed.

  14. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definition of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application for astronomical/scientific facilities. In these facilities steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex and there is a large amount of special equipment and mechanisms involved as a fundamental part of the facility. The detail design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software which can be used before starting the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design & construction approach allows the construction sequence to be planned efficiently (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM with the E-ELT and ATST Enclosures as application examples.

  15. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from observed time series, and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, data completeness cannot always be guaranteed, since measurement or data transmission may intermittently fail for various reasons. We propose two main solutions dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland as the entrance of Rotterdam Port. The hourly surge time series is available for the duration 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized.
There exist two main performance measures used in this work: (1) error measures between the actual
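    The time-delayed phase-space reconstruction described above can be sketched as a straightforward delay embedding; the embedding dimension and delay are assumed to be chosen separately (e.g., by standard false-nearest-neighbour or mutual-information criteria):

```python
def delay_embed(series, dim, tau):
    """Reconstruct a time-delayed phase space from a scalar time series.

    Each state vector is (x[t], x[t+tau], ..., x[t+(dim-1)*tau]);
    trajectories in this space are what the local models use to
    find dynamical neighbors."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim)) for t in range(n)]
```

With missing values, the non-imputing approach described above would simply drop any state vector containing a gap, leaving non-continuous trajectories.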

  16. 3D modeling of buildings: outstanding sites

    CERN Document Server

    Héno, Raphaële

    2014-01-01

    Conventional topographic databases, obtained by capture on aerial or spatial images provide a simplified 3D modeling of our urban environment, answering the needs of numerous applications (development, risk prevention, mobility management, etc.). However, when we have to represent and analyze more complex sites (monuments, civil engineering works, archeological sites, etc.), these models no longer suffice and other acquisition and processing means have to be implemented. This book focuses on the study of adapted lifting means for "notable buildings". The methods tackled in this book cover las

  17. A procedure for building product models

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin; Hansen, Benjamin Loer

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes, which are to be supported...... with product models. The next phase includes an analysis of the product assortment, and the set up of a so-called product master. Finally the product model is designed and implemented using object oriented modelling. The procedure is developed in order to ensure that the product models constructed are...... fit for the business processes they support, and properly structured and documented, in order to facilitate that the systems can be maintained continually and further developed. The research has been carried out at the Centre for Industrialisation of Engineering, Department of Manufacturing...

  18. HYDROSCAPE: A SCAlable and ParallelizablE Rainfall Runoff Model for Hydrological Applications

    Science.gov (United States)

    Piccolroaz, S.; Di Lazzaro, M.; Zarlenga, A.; Majone, B.; Bellin, A.; Fiori, A.

    2015-12-01

    In this work we present HYDROSCAPE, an innovative streamflow routing method based on the travel time approach, and modeled through a fine-scale geomorphological description of hydrological flow paths. The model is designed to be easily coupled with weather forecast or climate models providing the hydrological forcing, while preserving the geomorphological dispersion of the river network, which is kept unchanged independently of the grid size of the rainfall input. This makes HYDROSCAPE particularly suitable for multi-scale applications, ranging from medium size catchments up to the continental scale, and for investigating the effects of extreme rainfall events that require an accurate description of basin response timing. A key feature of the model is its computational efficiency, which allows performing a large number of simulations for sensitivity/uncertainty analyses in a Monte Carlo framework. Further, the model is highly parsimonious, involving the calibration of only three parameters: one defining the residence time of the hillslope response, one for channel velocity, and a multiplicative factor accounting for uncertainties in the identification of the potential maximum soil moisture retention in the SCS-CN method. HYDROSCAPE is designed with a simple and flexible modular structure, which makes it well suited to massive parallelization, customization according to specific user needs and preferences (e.g., rainfall-runoff model), and continuous development and improvement. Finally, the possibility to specify the desired computational time step and evaluate streamflow at any location in the domain makes HYDROSCAPE an attractive tool for many hydrological applications, and a valuable alternative to more complex and highly parametrized large scale hydrological models. Together with model development and features, we present an application to the Upper Tiber River basin (Italy), providing a practical example of model performance and
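    The SCS-CN step mentioned above follows the standard Curve Number runoff formula; a minimal sketch in metric units, using the conventional initial abstraction Ia = 0.2 S (the abstract notes that the maximum retention S is treated as uncertain and scaled by a calibrated factor):

```python
def scs_cn_runoff_mm(rainfall_mm, curve_number):
    """Direct runoff depth (mm) via the SCS Curve Number method.

    S is the potential maximum retention (mm); the initial abstraction
    Ia = 0.2*S is the conventional value, though some studies recalibrate it.
    No runoff is produced until rainfall exceeds Ia."""
    s = 25400.0 / curve_number - 254.0
    ia = 0.2 * s
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)
```

With CN = 100 (impervious surface) the formula returns all rainfall as runoff, while low CN values absorb small events entirely, which is why the retention parameter dominates event response.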

  19. Scalability on LHS (Latin Hypercube Sampling) samples for use in uncertainty analysis of large numerical models

    International Nuclear Information System (INIS)

    The present paper deals with the utilization of advanced sampling statistical methods to perform uncertainty and sensitivity analysis on numerical models. Such models may represent physical phenomena, logical structures (such as boolean expressions) or other systems, and several of their intrinsic parameters and/or input variables are usually treated as random variables simultaneously. In the present paper a simple method to scale up Latin Hypercube Sampling (LHS) samples is presented, starting with a small sample and duplicating its size at each step, making it possible to reuse the numerical model results already obtained with the smaller sample. The method does not distort the statistical properties of the random variables and does not add any bias to the samples. The result is that a significant reduction in numerical model running time can be achieved (by re-using the previously run samples), while keeping all the advantages of LHS, until an acceptable representation level is achieved in the output variables. (author)
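    The paper's scale-up scheme itself is not reproduced here, but the stratification property such a scheme must preserve is easy to illustrate: a Latin Hypercube sample places exactly one point in each of n equal strata per dimension. A minimal sketch:

```python
import random

def latin_hypercube(n, dims, rng=None):
    """One LHS sample: n points in [0,1)^dims, with exactly one point
    per stratum per dimension."""
    rng = rng or random.Random()
    cols = []
    for _ in range(dims):
        # one uniform draw inside each of the n equal strata, then shuffled
        # so strata are paired randomly across dimensions
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))
```

Doubling the sample size while keeping earlier runs, as the paper proposes, amounts to refining each stratum into two and placing the new points so this one-point-per-stratum property still holds at the finer resolution.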

  20. A model and framework for reliable build systems

    CERN Document Server

    Coetzee, Derrick; Necula, George

    2012-01-01

    Reliable and fast builds are essential for rapid turnaround during development and testing. Popular existing build systems rely on correct manual specification of build dependencies, which can lead to invalid build outputs and nondeterminism. We outline the challenges of developing reliable build systems and explore the design space for their implementation, with a focus on non-distributed, incremental, parallel build systems. We define a general model for resources accessed by build tasks and show its correspondence to the implementation technique of minimum information libraries, APIs that return no information that the application doesn't plan to use. We also summarize preliminary experimental results from several prototype build managers.
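    The minimum-information idea above can be illustrated with a content-hash rebuild check: a task records a fingerprint of exactly the inputs it read, and re-runs only when that fingerprint changes. This sketch is illustrative, not the paper's implementation:

```python
import hashlib

def fingerprint(inputs):
    """Stable digest of the inputs a build task actually read,
    mapping file names to their contents."""
    h = hashlib.sha256()
    for name in sorted(inputs):  # sort for a deterministic digest
        h.update(name.encode())
        h.update(b"\0")
        h.update(inputs[name].encode())
        h.update(b"\0")
    return h.hexdigest()

def needs_rebuild(task_inputs, last_fingerprint):
    """A task is re-run only if the content it depends on changed."""
    return fingerprint(task_inputs) != last_fingerprint
```

Restricting the fingerprint to information the task actually uses is what makes rebuilds both sound (no stale outputs) and minimal (no spurious re-runs).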

  1. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  2. A Scalable Model for the Performance Evaluation of ROADMs with Generic Switching Capabilities

    Directory of Open Access Journals (Sweden)

    Athanasios S Tsokanos

    2010-10-01

    Full Text Available In order to evaluate the performance of Reconfigurable Optical Add/Drop Multiplexers (ROADMs) consisting of a single large switch in circuit-switched Wavelength-Division Multiplexing (WDM) networks, a theoretical Queuing Network Model (QNM) is developed, which consists of two M/M/c/c loss systems, each of which is analyzed in isolation. An overall analytical blocking probability of a ROADM is obtained. This model can also be used for the performance optimization of ROADMs with a single switch capable of switching all or a subset of the wavelengths in use. It is demonstrated how the proposed model can be used for the performance evaluation of a ROADM for different numbers of wavelengths inside the switch, under various traffic intensity conditions, producing an exact blocking probability solution. The accuracy of the analytical results is validated by simulation.
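
    The blocking probability of each M/M/c/c loss system in such a model is given by the classical Erlang B formula, computed stably by a standard recursion. The combination of the two isolated stages below is our illustrative assumption (independence of the stages), not necessarily the paper's exact expression:

    ```python
    def erlang_b(c, a):
        """Erlang B blocking probability of an M/M/c/c loss system with
        c servers (e.g. wavelengths) and offered load a in Erlangs.
        Stable recursion: B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1))."""
        b = 1.0
        for k in range(1, c + 1):
            b = a * b / (k + a * b)
        return b

    def roadm_blocking(c1, a1, c2, a2):
        """Illustrative overall blocking for two loss stages analyzed in
        isolation: a connection succeeds only if admitted by both."""
        return 1.0 - (1.0 - erlang_b(c1, a1)) * (1.0 - erlang_b(c2, a2))
    ```

    For example, one server with one Erlang of offered load blocks half the traffic, and adding servers drives blocking down monotonically.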

  3. Automatic Generation of 3D Building Models with Multiple Roofs

    Institute of Scientific and Technical Information of China (English)

    Kenichi Sugihara; Yoshitugu Hayashi

    2008-01-01

    Based on building footprints (building polygons) on digital maps, we propose a GIS and CG integrated system that automatically generates 3D building models with multiple roofs. Most edges of building polygons meet at right angles (orthogonal polygons). The integrated system partitions orthogonal building polygons into a set of rectangles and places rectangular roofs and box-shaped building bodies on these rectangles. In order to partition an orthogonal polygon, we earlier proposed a polygon expression that is useful for deciding from which vertex a dividing line is drawn. In this paper, we propose a new scheme for partitioning building polygons and show the process of creating 3D roof models.

  4. Developmental Impact Analysis of an ICT-Enabled Scalable Healthcare Model in BRICS Economies

    Directory of Open Access Journals (Sweden)

    Dhrubes Biswas

    2012-06-01

    Full Text Available This article highlights the need for initiating a healthcare business model in a grassroots, emerging-nation context. This article’s backdrop is a history of chronic anomalies afflicting the healthcare sector in India and similarly placed BRICS nations. In these countries, a significant percentage of the population remains deprived of basic healthcare facilities and emergency services. Community (primary) care services are being offered by public and private stakeholders as a panacea to the problem. Yet, there is an urgent need for specialized (tertiary) care services at all levels. As a response to this challenge, an all-inclusive health-exchange system (HES) model, which utilizes information communication technology (ICT) to provide solutions in rural India, has been developed. The uniqueness of the model lies in its innovative hub-and-spoke architecture and its emphasis on affordability, accessibility, and availability to the masses. This article describes a developmental impact analysis (DIA) that was used to assess the impact of this model. The article contributes to the knowledge base of readers by making them aware of the healthcare challenges emerging nations are facing and ways to mitigate those challenges using entrepreneurial solutions.

  5. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Full Text Available Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when the number of processes is increased. This paper reports on our work in progress to tackle the state-explosion problem for low-level OS code, caused by the exponential blow-up of the model size as the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.

  6. A Scalable Security Model for Enabling Dynamic Virtual Private Execution Infrastructures on the Internet

    OpenAIRE

    Vicat-Blanc Primet, Pascale; Gelas, Jean-Patrick; Mornard, Olivier; Koslovski, Guilherme; Roca, Vincent; Giraud, Lionel; Montagnat, Johan; Truong Huu, Tram

    2009-01-01

    With the expansion and the convergence of computing and communication, the dynamic provisioning of customized processing and networking infrastructures as well as resource virtualization are appealing concepts and technologies. Therefore, new models and tools are needed to allow users to create, trust and exploit such on-demand virtual infrastructures within wide area distributed environments. This paper proposes to combine network and system virtualization with cryptographic identifi...

  7. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    The paper proposes scalability coefficients that summarize the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model, with p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence and unequal item discrimination, are discussed. The methods are illustrated and motivated using a simulation study and a real data example.
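
    The basic quantity behind such coefficients is easy to state. For a dichotomous response pattern whose items are ordered from easiest (most popular) to hardest, a Guttman error is a pair where the easier item is failed while the harder item is passed. A minimal sketch (our illustration of the standard definition, not the paper's exact coefficient):

    ```python
    def guttman_errors(pattern):
        """Count Guttman errors in a 0/1 response pattern whose items
        are ordered from easiest to hardest: pairs (i, j), i < j, with
        the easier item i failed (0) and the harder item j passed (1)."""
        errors = 0
        for i in range(len(pattern)):
            for j in range(i + 1, len(pattern)):
                if pattern[i] == 0 and pattern[j] == 1:
                    errors += 1
        return errors
    ```

    A perfect Guttman pattern (all successes before all failures) has zero errors; the fully reversed pattern maximizes the count.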

  8. Iterative build OMIT maps: Map improvement by iterative model-building and refinement without model bias

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D. (Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England)

    2008-02-12

    A procedure for carrying out iterative model-building, density modification and refinement is presented in which the density in an OMIT region is essentially unbiased by an atomic model. Density from a set of overlapping OMIT regions can be combined to create a composite 'Iterative-Build' OMIT map that is everywhere unbiased by an atomic model but also everywhere benefiting from the model-based information present elsewhere in the unit cell. The procedure may have applications in the validation of specific features in atomic models as well as in overall model validation. The procedure is demonstrated with a molecular replacement structure and with an experimentally-phased structure, and a variation on the method is demonstrated by removing model bias from a structure from the Protein Data Bank.

  9. Helicopter model rotor-blade vortex interaction impulsive noise: Scalability and parametric variations

    Science.gov (United States)

    Splettstoesser, W. R.; Schultz, K. J.; Boxwell, D. A.; Schmitz, F. H.

    1984-01-01

    Acoustic data taken in the anechoic Deutsch-Niederlaendischer Windkanal (DNW) have documented the blade vortex interaction (BVI) impulsive noise radiated from a 1/7-scale model main rotor of the AH-1 series helicopter. Averaged model-scale data were compared with averaged full-scale, in-flight acoustic data under similar nondimensional test conditions. At low advance ratios (mu = 0.164 to 0.194), the data scale remarkably well in level and waveform shape, and also duplicate the directivity pattern of BVI impulsive noise. At moderate advance ratios (mu = 0.224 to 0.270), the scaling deteriorates, suggesting that the model-scale rotor is not adequately simulating the full-scale BVI noise; presently, no proven explanation of this discrepancy exists. Carefully performed parametric variations over a complete matrix of test conditions have shown that BVI noise radiation is highly sensitive to all four governing nondimensional parameters: hover tip Mach number, advance ratio, local inflow ratio, and thrust coefficient.

  10. Development of Residential Prototype Building Models and Analysis System for Large-Scale Energy Efficiency Studies Using EnergyPlus

    Energy Technology Data Exchange (ETDEWEB)

    Mendon, Vrushali V.; Taylor, Zachary T.

    2014-09-10

    Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is allowing for flexibility to address variability in house features such as geometry, configuration, HVAC systems, etc. Researchers solved this problem in a novel way by creating a simulation structure capable of creating fully functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent the majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.
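
    The prototype matrix described above (2 building types × 4 foundation types × 4 heating system types = 32 models) is straightforward to enumerate from a template system. The category labels below are illustrative guesses, not the report's exact taxonomy:

    ```python
    from itertools import product

    # Illustrative labels only; the report's exact categories may differ.
    building_types = ["single-family", "multifamily"]
    foundations = ["slab", "crawlspace", "heated basement", "unheated basement"]
    heating_systems = ["gas furnace", "electric resistance",
                       "heat pump", "oil furnace"]

    # Cross product of the three axes: 2 * 4 * 4 = 32 prototype models.
    prototypes = [
        {"building": b, "foundation": f, "heating": h}
        for b, f, h in product(building_types, foundations, heating_systems)
    ]
    ```

    Each prototype dictionary would then parameterize one EnergyPlus template instantiation in a batch run.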

  11. Procedural modeling historical buildings for serious games

    Directory of Open Access Journals (Sweden)

    Gonzalo Besuievsky

    2013-11-01

    Full Text Available In this paper we target the goal of obtaining detailed historical virtual buildings, such as a castle or a city's old town, through a methodology that facilitates their reconstruction. The approach quickly yields an approximate model that is flexible enough to be explored, analyzed and, if necessary, modified. This is crucial for serious game development pipelines, whose objective is focused not only on accuracy and realism, but also on transmitting a sense of immersion to the player.

  12. Building phenomenological models of complex biological processes

    Science.gov (United States)

    Daniels, Bryan; Nemenman, Ilya

    2009-11-01

    A central goal of any modeling effort is to make predictions regarding experimental conditions that have not yet been observed. Overly simple models will not be able to fit the original data well, but overly complex models are likely to overfit the data and thus produce bad predictions. Modern quantitative biology modeling efforts often err on the complexity side of this balance, using myriads of microscopic biochemical reaction processes with a priori unknown kinetic parameters to model relatively simple biological phenomena. In this work, we show how Bayesian model selection (which is mathematically similar to the low-temperature expansion in statistical physics) can be used to build coarse-grained, phenomenological models of complex dynamical biological processes, which have better predictive power than microscopically correct, but poorly constrained, mechanistic molecular models. We illustrate this on the example of a multiply-modifiable protein molecule, which is a simplified description of multiple biological systems, such as immune receptors and the RNA polymerase complex. Our approach is similar in spirit to the phenomenological Landau expansion for the free energy in the theory of critical phenomena.
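
    The simplicity/complexity trade-off behind such model selection can be demonstrated with the Bayesian Information Criterion (an asymptotic approximation to Bayesian model selection). The two-condition Bernoulli setting below is our own toy example, not the protein model from the paper:

    ```python
    import math

    def bernoulli_loglik(successes, trials):
        """Maximized Bernoulli log-likelihood at p-hat = successes/trials."""
        p = successes / trials
        if p in (0.0, 1.0):
            return 0.0
        return successes * math.log(p) + (trials - successes) * math.log(1.0 - p)

    def bic(log_lik, n_params, n_obs):
        """Bayesian Information Criterion; lower is better."""
        return n_params * math.log(n_obs) - 2.0 * log_lik

    def select_model(k_a, n_a, k_b, n_b):
        """Compare a pooled one-rate model against a two-rate model for
        success counts in two experimental conditions."""
        pooled = bic(bernoulli_loglik(k_a + k_b, n_a + n_b), 1, n_a + n_b)
        separate = bic(bernoulli_loglik(k_a, n_a) + bernoulli_loglik(k_b, n_b),
                       2, n_a + n_b)
        return "pooled" if pooled <= separate else "separate"
    ```

    When the two conditions behave alike, the extra parameter is not worth its BIC penalty and the simpler pooled model wins; a genuine difference between conditions flips the decision.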

  13. Service Virtualization Using a Non-von Neumann Parallel, Distributed, and Scalable Computing Model

    Directory of Open Access Journals (Sweden)

    Rao Mikkilineni

    2012-01-01

    Full Text Available This paper describes a prototype implementing a high degree of transaction resilience in distributed software systems using a non-von Neumann computing model that exploits parallelism in computing nodes. The prototype incorporates fault, configuration, accounting, performance, and security (FCAPS) management using a signaling network overlay and allows the dynamic control of a set of distributed computing elements in a network. Each node is a computing entity endowed with self-management and signaling capabilities to collaborate with similar nodes in a network. The separation of parallel computing and management channels allows the end-to-end transaction management of computing tasks (provided by the autonomous distributed computing elements) to be implemented as network-level FCAPS management. While the new computing model is operating-system agnostic, a Linux, Apache, MySQL, PHP/Perl/Python (LAMP)-based services architecture is implemented in a prototype to demonstrate end-to-end transaction management with auto-scaling, self-repair, dynamic performance management and distributed transaction security assurance. The implementation is made possible by a non-von Neumann middleware library providing Linux process management through multi-threaded parallel execution of self-management and signaling abstractions. We did not use hypervisors, virtual machines, or layers of complex virtualization management systems in implementing this prototype.

  14. How Promotions Work: Scan Pro-Based Evolutionary Model Building

    OpenAIRE

    Leeflang, Peter S.H.; Heerde, Harald J. van; Dick Wittink

    2002-01-01

    We provide a rationale for evolutionary model building. The basic idea is that, to enhance user acceptance, it is important to begin with a relatively simple model. Simplicity is desired so that managers understand the model. As a manager uses the model and builds up experience with this decision aid, she will realize its shortcomings. The model will then be expanded, leading to increased complexity. Evolutionary model building also stimulates the generalization of marketing know...

  15. Modelling seasonality in Australian building approvals

    Directory of Open Access Journals (Sweden)

    Harry M Karamujic

    2012-02-01

    Full Text Available The paper examines the impact of seasonal influences on Australian housing approvals, represented by State of Victoria building approvals for new houses (BANHs). The prime objective of BANHs is to provide timely estimates of future residential building work. Due to the relevance of the residential property sector to the property sector as a whole, BANHs are viewed by economic analysts and commentators as a leading indicator of property sector investment and, as such, of the general level of economic activity and employment. The generic objective of the study is to enhance the practice of modelling housing variables. In particular, the study seeks to cast some additional light on modelling the seasonal behaviour of BANHs by: (i) establishing the presence, or otherwise, of seasonality in Victorian BANHs; (ii) if present, ascertaining whether it is deterministic or stochastic; (iii) determining the out-of-sample forecasting capabilities of the considered modelling specifications; and (iv) speculating on a possible interpretation of the results. To do so the study utilises the structural time series model of Harvey (1989). The modelling results confirm that the specification allowing for a stochastic trend and deterministic seasonality performs best in terms of diagnostic tests and goodness-of-fit measures. This is corroborated by the analysis of the out-of-sample forecasting capabilities of the considered specifications, which showed that the models with a deterministic seasonal specification exhibit superior forecasting capabilities. The paper also demonstrates that if a time series is characterized by either a stochastic trend or seasonality, the conventional modelling approach is bound to be mis-specified, i.e., it would not be able to identify statistically significant seasonality in the series. According to the selected modelling specification, factors corresponding to June, April, December and November are found to be significant at the five per cent level.

  16. Building a Democratic Model of Science Teaching

    Directory of Open Access Journals (Sweden)

    Suhadi Ibnu

    2016-02-01

    Full Text Available Earlier in the last century, learning in science, as in other disciplines, was developed according to the philosophy of behaviorism. This did not serve the purposes of learning in science properly, as students were forced to absorb information transferred from the main and only source of learning, the teacher. Towards the end of the century a significant shift from behaviorist to constructivist philosophy took place. The shift promoted the development of more democratic models of learning in science, which provided greater opportunities for students to act as real scientists, working towards the building of knowledge and scientific skills. Considering the characteristics of science and of students as active learners, the shift towards democratic models of learning is unavoidable and is merely a matter of time.

  17. Building information modelling (BIM: now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with focus on its core concepts, applications in the project life cycle and benefits for project stakeholders with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  18. Anatomically accurate high resolution modeling of human whole heart electromechanics: A strongly scalable algebraic multigrid solver method for nonlinear deformation

    Science.gov (United States)

    Augustin, Christoph M.; Neic, Aurel; Liebmann, Manfred; Prassl, Anton J.; Niederer, Steven A.; Haase, Gundolf; Plank, Gernot

    2016-01-01

    Electromechanical (EM) models of the heart have been used successfully to study fundamental mechanisms underlying a heart beat in health and disease. However, in all modeling studies reported so far, numerous simplifications were made in terms of representing biophysical details of cellular function and its heterogeneity, gross anatomy and tissue microstructure, as well as the bidirectional coupling between electrophysiology (EP) and tissue distension. One limiting factor is the employed spatial discretization methods, which are not sufficiently flexible to accommodate complex geometries or resolve heterogeneities; even more important, however, is the limited efficiency of the prevailing solver techniques, which are not sufficiently scalable to deal with the resulting increase in degrees of freedom (DOF) when modeling cardiac electromechanics at high spatio-temporal resolution. This study reports on the development of a novel methodology for solving the nonlinear equations of finite elasticity using human whole-organ models of cardiac electromechanics, discretized at a high para-cellular resolution. Three patient-specific, anatomically accurate, whole-heart EM models were reconstructed from magnetic resonance (MR) scans at resolutions of 220 μm, 440 μm and 880 μm, yielding meshes of approximately 184.6, 24.4 and 3.7 million tetrahedral elements and 95.9, 13.2 and 2.1 million displacement DOF, respectively. The same mesh was used for discretizing the governing equations of both electrophysiology (EP) and nonlinear elasticity. A novel algebraic multigrid (AMG) preconditioner for an iterative Krylov solver was developed to deal with the resulting computational load. The AMG preconditioner was designed under the primary objective of achieving favorable strong scaling characteristics for both setup and solution runtimes, as this is key for exploiting current high-performance computing hardware. Benchmark results using the 220 μm, 440 μm and 880 μm meshes demonstrate

  19. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  20. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to push the information organization process of the building industry, promote sustainable architectural design and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer-aided architectural design, which contains not only geometric data, but also a great amount of engineering data throughout the lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling, which is based on intelligent parametric modeling technology, will certainly replace traditional computer-aided architectural design and become the new driving force to push forward China's building industry in this information age.

  1. Systematic model building with flavor symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Plentinger, Florian

    2009-12-19

    The observation of neutrino masses and lepton mixing has highlighted the incompleteness of the Standard Model of particle physics. In conjunction with this discovery, new questions arise: why are the neutrino masses so small, what form does their mass hierarchy take, why is the mixing in the quark and lepton sectors so different, and what is the structure of the Higgs sector? In order to address these issues and to predict future experimental results, different approaches are considered. One particularly interesting possibility is Grand Unified Theories (GUTs) such as SU(5) or SO(10). GUTs are vertical symmetries, since they unify the SM particles into multiplets and usually predict new particles, which can naturally explain the smallness of the neutrino masses via the seesaw mechanism. On the other hand, horizontal symmetries, i.e., flavor symmetries acting on the generation space of the SM particles, are also promising. They can serve as an explanation for the quark and lepton mass hierarchies as well as for the different mixings in the quark and lepton sectors. In addition, flavor symmetries are significantly involved in the Higgs sector and predict certain forms of mass matrices. This high predictivity makes GUTs and flavor symmetries interesting for both theorists and experimentalists. These extensions of the SM can also be combined with theories such as supersymmetry or extra dimensions. In addition, they usually have implications for the observed matter-antimatter asymmetry of the universe or can provide a dark matter candidate. In general, they also predict the lepton-flavor-violating rare decays μ → eγ, τ → μγ and τ → eγ, which are strongly bounded by experiments but might be observed in the future. In this thesis, we combine all of these approaches, i.e., GUTs, the seesaw mechanism and flavor symmetries. Moreover, our aim is to develop and perform a systematic model building approach with flavor symmetries and

  2. Scalable Resource Discovery Architecture for Large Scale MANETs

    Directory of Open Access Journals (Sweden)

    Saad Al-Ahmadi

    2014-02-01

    Full Text Available The study conducted a primary investigation into using the Gray cube structure, clustering and Distributed Hash Tables (DHTs) to build an efficient virtual network backbone for Resource Discovery (RD) tasks in large-scale Mobile Ad hoc NETworks (MANETs). A MANET is an autonomous system of mobile nodes characterized by wireless links. One of the major challenges in MANETs is the RD protocol, responsible for advertising and searching network services. We propose an efficient and scalable RD architecture to meet the challenging requirements of a reliable, scalable and power-efficient RD protocol suitable for MANETs with potentially thousands of wireless mobile devices. Our RD is based on a virtual network backbone created by dividing the network into several non-overlapping localities using multi-hop clustering. In every locality we build a Gray cube with locally adapted dimension. All the Gray cubes are connected through gateways and access points to form a virtual backbone used as a substrate for DHT operations to distribute, register and locate network resources efficiently. The Gray cube is characterized by a low network diameter, low average distance and strong connectivity. We evaluated the proposed RD's performance and compared it to some of the well-known RD schemes in the literature, based on modeling and simulation. The results show the superiority of the proposed RD in terms of delay, load balancing, overload avoidance, scalability and fault-tolerance.
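
    The structural properties claimed for the Gray cube (low diameter, low average distance, strong connectivity) follow from its hypercube topology with Gray-code labeling. A small sketch using the standard binary-reflected Gray code (our illustration, not necessarily the paper's exact construction):

    ```python
    from collections import deque

    def gray(i):
        """Binary-reflected Gray code of index i: consecutive codes
        differ in exactly one bit."""
        return i ^ (i >> 1)

    def neighbors(node, dim):
        """In a dim-dimensional hypercube, the neighbors of a node are
        the labels differing from it in exactly one bit."""
        return [node ^ (1 << b) for b in range(dim)]

    def diameter(dim):
        """Network diameter via BFS from node 0 (equals dim for a
        hypercube of 2**dim nodes)."""
        dist = {0: 0}
        queue = deque([0])
        while queue:
            u = queue.popleft()
            for v in neighbors(u, dim):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return max(dist.values())
    ```

    A cube of dimension d thus reaches 2^d nodes with diameter only d, which is the scalability argument for using it as an RD backbone.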

  3. A multi-layered software architecture model for building software solutions in an urbanized information system

    Directory of Open Access Journals (Sweden)

    Sana Guetat

    2013-01-01

    Full Text Available The concept of Information Systems urbanization has been proposed since the late 1990s in order to help organizations build agile information systems. Nevertheless, despite the advantages of this concept, it remains too descriptive and presents many weaknesses. In particular, there is a lack of useful architecture models dedicated to defining software solutions compliant with information systems urbanization principles and rules. Moreover, well-known software architecture models do not provide sufficient resources to address the requirements and constraints of urbanized information systems. In this paper, we draw on the “information city” framework to propose a model of software architecture, called the 5+1 Software Architecture Model, which is compliant with information systems urbanization principles and helps organizations build urbanized software solutions. This framework improves on well-established software architecture models and allows the integration of new architectural paradigms. Furthermore, the proposed model contributes to the implementation of information systems urbanization in several ways. On the one hand, it devotes a specific layer to application integration and software reuse. On the other hand, it contributes to information system agility and scalability through its conformity to the separation-of-concerns principle.

  4. The Concept Model of Sustainable Buildings Refurbishment

    OpenAIRE

    Mickaitytė Aistė; Kaklauskas Artūras; Tupėnaitė Laura; Zavadskas Edmundas

    2008-01-01

    With sustainable development principles reaching many spheres of human activity, public building refurbishment is no exception. Building refurbishment offers excellent opportunities to reduce energy consumption in buildings and encourages the implementation of other sustainable refurbishment principles: citizens' healthcare, environmental protection, rational use of resources, dissemination of information about sustainable refurbishment, and awareness among stakeholder groups. During th...

  5. Real-Time, Scalable, Content-based Twitter users recommendation

    OpenAIRE

    Subercaze, Julien; Gravier, Christophe; Laforest, Frederique

    2015-01-01

    Real-time recommendation of Twitter users based on the content of their profiles is a very challenging task. Traditional IR methods such as TF-IDF fail to handle large datasets efficiently. In this paper we present a scalable approach that allows real-time recommendation of users based on their tweets. Our model builds a graph of terms, driven by the fact that users sharing similar interests will share similar terms. We show how this model can be encoded as a compact binary footprint, that al...
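
    The compact binary footprint idea can be sketched with a Bloom-filter-style bit vector over a user's terms, compared in constant time with bitwise operations. The hash choice and footprint size below are our assumptions, not the paper's encoding:

    ```python
    from zlib import crc32

    def footprint(terms, bits=64):
        """Hash a set of terms into a compact binary footprint: one bit
        set per (hashed) term, Bloom-filter style."""
        fp = 0
        for t in terms:
            fp |= 1 << (crc32(t.encode()) % bits)
        return fp

    def similarity(a, b):
        """Jaccard similarity estimated directly on the bit vectors."""
        inter = bin(a & b).count("1")
        union = bin(a | b).count("1")
        return inter / union if union else 0.0
    ```

    Because footprints are plain integers, millions of pairwise comparisons per second are possible, which is what makes real-time recommendation feasible at scale.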

  6. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    International Nuclear Information System (INIS)

    The highly automated PHENIX AutoBuild wizard is described. The procedure can be applied equally well to phases derived from isomorphous/anomalous and molecular-replacement methods. The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution

  7. Model Predictive Control for the Operation of Building Cooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Yudong; Borrelli, Francesco; Hencey, Brandon; Coffey, Brian; Bengea, Sorin; Haves, Philip

    2010-06-29

    A model-based predictive controller (MPC) is designed for optimal thermal energy storage in building cooling systems. We focus on buildings equipped with a water tank used for actively storing cold water produced by a series of chillers. Typically the chillers are operated at night to recharge the storage tank in order to meet the building demands on the following day. In this paper, we build on our previous work, improve the building load model, and present experimental results. The experiments show that MPC can achieve a reduction in the central plant electricity cost and an improvement in its efficiency.
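
    The night-charging strategy described above can be sketched as a toy optimization. All prices, demands and equipment sizes below are invented, and exhaustive search over on/off plans stands in for a real MPC solver; the point is only the structure of the trade-off: store cheap night-time cooling in the tank to cover expensive daytime demand.

```python
from itertools import product

price  = [0.05, 0.05, 0.20, 0.20]   # $/kWh: night cheap, day expensive
demand = [0.0,  0.0,  3.0,  3.0]    # kWh of cooling needed per period
CHILLER = 4.0                        # kWh of cooling produced when on
CAP = 6.0                            # tank capacity, kWh

best_cost, best_plan = float("inf"), None
for plan in product([0, 1], repeat=len(price)):  # all on/off schedules
    tank, cost, feasible = 0.0, 0.0, True
    for on, p, d in zip(plan, price, demand):
        tank = min(CAP, tank + on * CHILLER) - d  # charge, then serve load
        cost += on * CHILLER * p
        if tank < 0:        # demand not met -> schedule infeasible
            feasible = False
            break
    if feasible and cost < best_cost:
        best_cost, best_plan = cost, plan

# The cheapest feasible plan runs the chiller only at night: (1, 1, 0, 0).
```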

  8. A legacy building model for holistic nursing.

    Science.gov (United States)

    Lange, Bernadette; Zahourek, Rothlyn P; Mariano, Carla

    2014-06-01

    This pilot project was an effort to record the historical roots, development, and legacy of holistic nursing through the visionary spirit of four older American Holistic Nurses Association (AHNA) members. The aim was twofold: (a) to capture the holistic nursing career experiences of elder AHNA members and (b) to begin to create a Legacy Building Model for Holistic Nursing. The narratives will help initiate an ongoing, systematic method for the collection of historical data and serve as a perpetual archive of knowledge and inspiration for present and future holistic nurses. An aesthetic inquiry approach was used to conduct in-depth interviews with four older AHNA members who have made significant contributions to holistic nursing. The narratives provide a rich description of their personal and professional evolution as holistic nurses. The narratives are presented in an aesthetic format of the art forms of snapshot, pastiche, and collage rather than traditional presentations of research findings. A synopsis of the narratives is a dialogue between the three authors and provides insight for how a Legacy Model can guide our future. Considerations for practice, education, and research are discussed based on the words of wisdom from the four older holistic nurses. PMID:24080342

  9. Modelling of the heating system for a building

    International Nuclear Information System (INIS)

    District-heating systems supplying heat energy to buildings consume substantial resources, so the ability to analyse the behaviour of the building as part of the system is very important. The dynamic modelling of such a system may be simplified by using modelling software, such as MatLab. A model of the heat flows in the building, the heating system and the domestic water-heating system with heat-energy controllers has been developed. The model is based on the differential equations of the heat flows between the elements of the building
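
    The kind of heat-flow equation such a model is built from can be illustrated with a single-zone lumped-parameter sketch. All parameter values are assumptions chosen for illustration, and explicit Euler integration stands in for a MatLab/Simulink solver.

```python
# One thermal zone: heat capacity C, loss to outdoors through UA,
# radiator with simple on/off thermostat control.
C  = 2.0e7     # zone heat capacity, J/K (assumed)
UA = 200.0     # heat-loss coefficient, W/K (assumed)
Q  = 6000.0    # radiator power when on, W (assumed)
T_out, T_set = -5.0, 21.0   # outdoor temperature and set point, deg C

T, dt = 15.0, 60.0          # initial indoor temperature, 1-minute steps
for _ in range(24 * 60):    # simulate one day
    heating = Q if T < T_set else 0.0        # on/off controller
    dTdt = (heating - UA * (T - T_out)) / C  # energy balance of the zone
    T += dTdt * dt

# Over the day the zone warms from 15 degC toward the 21 degC set point.
```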

  10. Scalable Content Management System

    Directory of Open Access Journals (Sweden)

    Sandeep Krishna S, Jayant Dani

    2013-10-01

    Full Text Available Immense growth in the volume of content every day demands a more scalable system to handle and overcome difficulties in the capture, storage, transformation, search, sharing and visualization of data, where the data can be structured or unstructured data of any type. A system that manages the growing content and overcomes these issues and complexities using appropriate technologies would have advantages in measurable qualities like flexibility, interoperability, customizability, security, auditability, quality, community support, licensing options and cost. So architecting a Content Management System in terms of enterprise needs, with a scalable solution to manage the huge data growth, necessitates a Scalable Content Management System.

  11. Modeling thermally active building components using space mapping

    DEFF Research Database (Denmark)

    Pedersen, Frank; Weitzmann, Peter; Svendsen, Svend

    In order to efficiently implement thermally active building components in new buildings, it is necessary to evaluate the thermal interaction between them and other building components. Applying parameter investigation or numerical optimization methods to a differential-algebraic (DAE) model of a building provides a systematic way of estimating efficient building designs. However, using detailed numerical calculations of the components in the building is a time consuming process, which may become prohibitive if the DAE model is to be used for parameter variation or optimization. Unfortunately, simplified models of the components do not always provide useful solutions, since they are not always able to reproduce the correct thermal behavior. The space mapping technique transforms a simplified, but computationally inexpensive model, in order to align it with a detailed model or measurements. This...
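
    The space-mapping step described above can be shown on a toy pair of models: a cheap "coarse" model is aligned to an expensive "fine" model by extracting a parameter shift, after which optimization runs on the corrected coarse model. Both model functions and all numbers are invented for illustration.

```python
def fine(x):     # detailed model: pretend each evaluation is expensive
    return (x - 3.1) ** 2 + 0.5

def coarse(x):   # simplified model: cheap but systematically shifted
    return (x - 2.0) ** 2 + 0.5

# Parameter extraction: find shift s so coarse(x + s) matches fine(x)
# at a few sample points (brute-force search keeps the sketch simple).
samples = [0.0, 1.0, 2.0, 4.0]
p = min((sum((coarse(x + s) - fine(x)) ** 2 for x in samples), s)
        for s in [i / 100 for i in range(-300, 301)])[1]

# Optimize the mapped coarse model instead of the costly fine model.
x_opt = min((coarse(x + p), x) for x in [i / 100 for i in range(601)])[1]
assert abs(x_opt - 3.1) < 0.05   # recovers the fine model's optimum
```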

  12. A View on Future Building System Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  13. Building Information Modelling design ecologies - a new model?

    OpenAIRE

    Jones, Derek; Dewberry, Emma

    2012-01-01

    This paper considers the barriers to BIM adoption and demonstrates they are symptoms of existing problems in the Architecture, Engineering, Construction, and Operations (AECO) industry. When current external pressures are considered, a varied and complex set of problems emerge that require a significant paradigm change if they are to be resolved sustainably. It is argued that Building Information Modelling (BIM) does not represent a paradigm change on its own and the concept of the design eco...

  14. Integrating Building Information Modeling and Augmented Reality to Improve Investigation of Historical Buildings

    OpenAIRE

    Francesco Chionna; Francesco Argese; Vito Palmieri; Italo Spada; Lucio Colizzi

    2015-01-01

    This paper describes an experimental system to support investigation of historical buildings using Building Information Modeling (BIM) and Augmented Reality (AR). The system requires the use of an off-line software to build the BIM representation and defines a method to integrate diagnostic data into BIM. The system offers access to such information during site investigation using AR glasses supported by marker and marker-less technologies. The main innovation is the possibility to contextual...

  15. Grassmann Averages for Scalable Robust PCA

    DEFF Research Database (Denmark)

    Hauberg, Søren; Feragen, Aasa; Black, Michael J.

    2014-01-01

    As the collection of large datasets becomes increasingly automated, the occurrence of outliers will increase—“big data” implies “big outliers”. While principal component analysis (PCA) is often used to reduce the size of data, and scalable solutions exist, it is well-known that outliers can arbit......, making it scalable to “big noisy data.” We demonstrate TGA for background modeling, video restoration, and shadow removal. We show scalability by performing robust PCA on the entire Star Wars IV movie....

  16. Iterative model-building, structure refinement, and density modification with the PHENIX AutoBuild Wizard

    Energy Technology Data Exchange (ETDEWEB)

    Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England; Terwilliger, Thomas; Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D.

    2007-04-29

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  17. Complementarity of Historic Building Information Modelling and Geographic Information Systems

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.

    2016-06-01

    In this paper, we discuss the potential of integrating semantically rich models from both Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build a detailed 3D historic model. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and non-architectural information, that are necessary for construction and management of buildings. GIS has potential in handling and managing spatial data, especially exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in the BIM and GIS environment, how to build the enriched historic model, and why construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  18. Vibration Response of Multi Storey Building Using Finite Element Modelling

    Science.gov (United States)

    Chik, T. N. T.; Zakaria, M. F.; Remali, M. A.; Yusoff, N. A.

    2016-07-01

    Interaction between the building, the type of foundation and the geotechnical parameters of the ground may have a significant effect on the building. In general, stiffer foundations result in higher natural frequencies of the building-soil system, and higher input frequencies are often associated with other ground conditions. Ground-borne vibrations transmitted to buildings are usually noticeable and can be felt; they may affect the building, and the situation becomes worse if the vibration level is not controlled. The UTHM building is prone to ground-borne vibration due to its close distance from the main road and the construction activities adjacent to the building. This paper investigates the natural frequency and vibration modes of a multi-storey office building with and without the foundation system, and compares both cases. The finite element modelling (FEM) package LUSAS is used to perform the vibration analysis of the building. The building is modelled based on the original plan, with the foundation system included in the structural model. The FEM results indicate that the structure modelled with a rigid base has a higher natural frequency than the structure with the foundation system. This may be due to soil-structure interaction and also the damping of the system, which is related to the amount of energy dissipated through the foundation soil. Thus, this paper suggests that modelling with soil is necessary to capture the soil's influence on the vibration response of the structure.
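
    The modal analysis performed with LUSAS can be illustrated at hand scale: a two-storey shear building idealized as a 2-DOF mass-spring chain, whose natural frequencies follow from a quadratic characteristic equation. Floor mass and storey stiffness are assumed values, not the UTHM building's.

```python
import math

m = 2.0e5   # floor mass, kg (assumed)
k = 1.0e8   # storey lateral stiffness, N/m (assumed)

# Undamped free vibration: eigenvalues w^2 of M^-1 K for the chain
# [[2k, -k], [-k, k]] / m satisfy w^4 - 3(k/m) w^2 + (k/m)^2 = 0.
a, b, c = 1.0, -3.0 * k / m, (k / m) ** 2
w2_low  = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
w2_high = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

f1 = math.sqrt(w2_low)  / (2 * math.pi)   # fundamental frequency, Hz
f2 = math.sqrt(w2_high) / (2 * math.pi)   # second mode, Hz
```

    Softening the base (adding foundation flexibility) lowers both frequencies, which is the trend the FEM comparison above reports.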

  19. Scalable computations in penetration mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Kimsey, K.D.; Schraml, S.J. [Army Research Lab., Aberdeen Proving Ground, MD (United States). Weapons and Materials Research Directorate; Hertel, E.S. [Sandia National Labs., Albuquerque, NM (United States)

    1998-01-01

    This paper presents an overview of an explicit message passing paradigm for an Eulerian finite volume method for modeling solid dynamics problems involving shock wave propagation, multiple materials, and large deformations. Three-dimensional simulations of high-velocity impact were conducted on the IBM SP2, the SGI Power challenge Array, and the SGI Origin 2000. The scalability of the message-passing code on distributed-memory and symmetric multiprocessor architectures is presented and compared to the ideal linear performance.
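
    The comparison against "ideal linear performance" mentioned above is commonly framed with Amdahl's law. The serial fraction below is an arbitrary assumption, not a measured property of the code discussed.

```python
SERIAL = 0.05   # assumed fraction of work that cannot be parallelized

def speedup(p):
    """Amdahl's-law speedup on p processors."""
    return 1.0 / (SERIAL + (1.0 - SERIAL) / p)

for p in (1, 4, 16, 64):
    print(f"{p:3d} processors: speedup {speedup(p):5.2f} (ideal {p})")
```

    Even a 5% serial fraction caps the 64-processor speedup near 15x, which is why measured scalability curves bend away from the ideal line as processors are added.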

  20. Statistical models describing the energy signature of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Thavlov, Anders

    2010-01-01

    Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building, i.e. the heat dynamics of the building, have been developed. The models can be used to obtain rather detailed knowledge of the energy performance of the building and to optimize the control of the energy consumption for heating, which will be vital in conditions with increasing fluctuation of the energy supply or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as...
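
    The simplest energy-signature model of the kind described above is a straight line: heat consumption as a linear function of outdoor temperature, fitted by ordinary least squares. The data points below are invented for illustration.

```python
temps = [-5.0, 0.0, 5.0, 10.0, 15.0]   # outdoor temperature, deg C
heat  = [ 9.8, 8.1, 5.9,  4.2,  2.1]   # heating power, kW (invented)

n = len(temps)
mx, my = sum(temps) / n, sum(heat) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(temps, heat))
         / sum((x - mx) ** 2 for x in temps))
intercept = my - slope * mx

# The negative slope estimates the overall heat-loss coefficient (UA),
# while the intercept mixes internal gains and the balance temperature.
assert slope < 0
```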

  1. Environmental sustainability modeling with exergy methodology for building life cycle

    Institute of Scientific and Technical Information of China (English)

    刘猛; 姚润明

    2009-01-01

    As an important human activity, the building industry has created comfortable space for living and work, and at the same time brought considerable pollution and huge consumption of energy and resources. Since the 1990s, when the first building environmental assessment model, BREEAM, was released in the UK, a number of assessment models have been formulated, analytical and practical in methodology respectively. This paper aims to introduce a generic model of exergy assessment of the environmental impact of the building life cycle, taking previous models into consideration and focusing on the natural environment as well as the building life cycle; three environmental impacts will be analyzed, namely energy embodied exergy, resource chemical exergy and abatement exergy, for energy consumption, resource consumption and pollutant discharge respectively. The model of exergy assessment of the environmental impact of the building life cycle thus formulated contains two sub-models, one from the aspect of building energy utilization and the other from building materials use. Combining theories by ecologists such as Odum, building environmental sustainability modeling with exergy methodology is put forward with the index of exergy footprint of building environmental impacts.

  2. On the area and energy scalability of wireless network-on-chip: a model-based benchmarked design space exploration

    OpenAIRE

    Abadal Cavallé, Sergi; Iannazzo Soteras, Mario Enrique; Nemirovsky, Mario; Cabellos Aparicio, Alberto; Lee, Heekwan; Alarcón Cot, Eduardo José

    2014-01-01

    Networks-on-Chip (NoCs) are emerging as the way to interconnect the processing cores and the memory within a chip multiprocessor. As recent years have seen a significant increase in the number of cores per chip, it is crucial to guarantee the scalability of NoCs in order to avoid communication to become the next performance bottleneck in multicore processors. Among other alternatives, the concept of Wireless Network-on- Chip (WNoC) has been proposed, wherein on-chip anten...

  3. Working group report: Flavor physics and model building

    Indian Academy of Sciences (India)

    M K Parida; Nita Sinha; B Adhikary; B Allanach; A Alok; K S Babu; B Brahmachari; D Choudhury; E J Chun; P K Das; A Ghosal; D Hitlin; W S Hou; S Kumar; H N Li; E Ma; S K Majee; G Majumdar; B Mishra; G Mohanty; S Nandi; H Pas; M K Parida; S D Rindani; J P Saha; N Sahu; Y Sakai; S Sen; C Sharma; C D Sharma; S Shalgar; N N Singh; S Uma Sankar; N Sinha; R Sinha; F Simonetto; R Srikanth; R Vaidya

    2006-11-01

    This is the report of the flavor physics and model building working group at WHEPP-9. While activities in flavor physics have been mainly focused on B-physics, those in model building have been primarily devoted to neutrino physics. We present a summary of the working group discussions carried out during the workshop in the above fields, and also briefly review the progress made in some projects subsequently

  4. Grassmann Averages for Scalable Robust PCA

    DEFF Research Database (Denmark)

    Hauberg, Søren; Feragen, Aasa; Black, Michael J.

    vectors (subspaces) or elements of vectors; we focus on the latter and use a trimmed average. The resulting Trimmed Grassmann Average (TGA) is particularly appropriate for computer vision because it is robust to pixel outliers. The algorithm has low computational complexity and minimal memory requirements, making it scalable to “big noisy data.” We demonstrate TGA for background modeling, video restoration, and shadow removal. We show scalability by performing robust PCA on the entire Star Wars IV movie.
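
    The element-wise trimmed average that makes TGA robust to pixel outliers can be sketched in a few lines: for each pixel position, drop the extreme values across observations before averaging. This pure-Python stand-in ignores the subspace (Grassmann) machinery and the scalability engineering; it only shows why trimming defeats gross outliers.

```python
def trimmed_average(rows, trim=1):
    """Column-wise mean after dropping `trim` extremes from each end."""
    out = []
    for col in zip(*rows):
        kept = sorted(col)[trim:len(col) - trim]
        out.append(sum(kept) / len(kept))
    return out

# Five observations of the same 3-pixel signal, one grossly corrupted.
frames = [
    [1.0, 2.0, 3.0],
    [1.1, 2.1, 2.9],
    [0.9, 1.9, 3.1],
    [1.0, 2.0, 3.0],
    [9.0, 2.0, 3.0],   # outlier in pixel 0
]
robust = trimmed_average(frames)
assert abs(robust[0] - 1.0) < 0.1   # the 9.0 outlier was trimmed away
```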

  5. ACCURACY ASSESSMENT OF BUILDING MODELS CREATED FROM LASER SCANNING DATA

    OpenAIRE

    Borkowski, A; Jóźków, G.

    2012-01-01

    Recently, a growing interest can be observed in 3D building and city models created from laser scanning data. These models are used in many areas of interest. In this work, the accuracy assessment of 3D building models created from airborne and terrestrial laser scanning data was carried out. TLS data for modelling were acquired with an average point spacing of about 0.02 m. In order to model building elements invisible from the ground, such as roofs, LIDAR data were used with a density of about ...

  6. Building Cyberinfrastructure to Support a Real-time National Flood Model

    Science.gov (United States)

    Salas, F. R.; Maidment, D. R.; Tolle, K.; Navarro, C.; David, C. H.; Corby, R.

    2014-12-01

    The National Weather Service (NWS) is divided into 13 regional forecast centers across the country where the Sacramento Soil Moisture Accounting (SAC-SMA) model is run on average over a 10 day period, 5 days in the past and 5 days in the future. Model inputs and outputs such as precipitation and surface runoff are spatially aggregated over approximately 6,600 forecast basins with an average area of 1,200 square kilometers. In contrast, the NHDPlus dataset, which represents the geospatial fabric of the country, defines over 3 million catchments with an average area of 3 square kilometers. Downscaling the NWS land surface model outputs to the NHDPlus catchment scale in real-time requires the development of cyberinfrastructure to manage, share, compute and visualize large quantities of hydrologic data; streamflow computations through time for over 3 million river reaches. Between September 2014 and May 2015, the National Flood Interoperability Experiment (NFIE), coordinated through the Integrated Water Resource Science and Services (IWRSS) partners, will focus on building a national flood model for the country. This experiment will work to seamlessly integrate data and model services available on local and cloud servers (e.g. Azure) through disparate data sources operating at various spatial and temporal scales. As such, this paper will present a scalable information model that leverages the Routing Application for Parallel Computation of Discharge (RAPID) model to produce real-time flow estimates for approximately 67,000 NHDPlus river reaches in the NWS West Gulf River Forecast Center region.
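
    Downscaling to NHDPlus catchments ultimately means routing local runoff down a river network. The sketch below is a toy stand-in for RAPID's matrix-based routing: given each reach's local runoff and its downstream neighbor, accumulate flow in topological order. Reach IDs and runoff values are invented.

```python
from collections import defaultdict

downstream = {   # reach -> reach it drains into (None marks the outlet)
    "A": "C", "B": "C", "C": "E", "D": "E", "E": None,
}
runoff = {"A": 1.0, "B": 2.0, "C": 0.5, "D": 1.5, "E": 0.2}  # m^3/s

flow = defaultdict(float)
# Visit reaches upstream-to-downstream (a topological order).
for reach in ["A", "B", "D", "C", "E"]:
    flow[reach] += runoff[reach]          # add the reach's own runoff
    if downstream[reach]:
        flow[downstream[reach]] += flow[reach]  # pass it downstream

assert abs(flow["E"] - 5.2) < 1e-9   # the outlet sees all the runoff
```

    At national scale the same accumulation runs over roughly 3 million reaches, which is why a parallel solver such as RAPID is needed for real-time use.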

  7. Modelling the heat dynamics of buildings using stochastic

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae; Madsen, Henrik

    2000-01-01

    This paper describes the continuous time modelling of the heat dynamics of a building. The considered building is a residential like test house divided into two test rooms with a water based central heating. Each test room is divided into thermal zones in order to describe both short and long term...... variations. Besides modelling the heat transfer between thermal zones, attention is put on modelling the heat input from radiators and solar radiation. The applied modelling procedure is based on collected building performance data and statistical methods. The statistical methods are used in parameter...

  8. A Heat Dynamic Model for Intelligent Heating of Buildings

    DEFF Research Database (Denmark)

    Thavlov, Anders; Bindner, Henrik W.

    2015-01-01

    This article presents a heat dynamic model for prediction of the indoor temperature in an office building. The model has been used in several flexible load applications, where the indoor temperature is allowed to vary around a given reference to provide power system services by shifting the heating of the building in time. This way the thermal mass of the building can be used to absorb energy from renewable energy sources when available and postpone heating in periods with lack of renewable energy generation. The model is used in a model predictive controller to ensure the residential comfort...

  9. Building

    OpenAIRE

    Seavy, Ryan

    2014-01-01

    Building for concrete is temporary. The building of wood and steel stands against the concrete to give form and then gives way, leaving a trace of its existence behind. Concrete is not a building material. One does not build with concrete. One builds for concrete.

  10. Energy Savings Modeling of Standard Commercial Building Re-tuning Measures: Large Office Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Nicholas; Katipamula, Srinivas; Wang, Weimin; Huang, Yunzhi; Liu, Guopeng

    2012-06-01

    Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS's capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced lifetimes of the equipment. This report investigates the energy savings potential of several common HVAC system retuning measures on a typical large office building prototype model, using the Department of Energy's building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) to static pressure, supply air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated - each designed to conform to limitations to implementation of certain individual measures that might exist in typical buildings. All of these measures and combinations were simulated in 16 cities representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand-side of the building (air-systems and schedules). Many of the demand-side individual measures were capable of reducing annual HVAC system energy consumption by over 20% in most cities that were modeled. Supply side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy

  11. Learning by building: A visual modelling language for psychology students

    OpenAIRE

    Mulholland, Paul; Watt, Stuart

    2000-01-01

    Cognitive modelling involves building computational models of psychological theories in order to learn more about them, and is a major research area allied to psychology and artificial intelligence. The main problem is that few psychology students have previous programming experience. The course lecturer can avoid the problem by presenting the area only in general terms. This leaves the process of building and testing models, which is central to the methodology, an unknown. Alternatively, stu...

  12. Modelling the Austrian sector heat consumption of buildings

    International Nuclear Information System (INIS)

    The supply of thermal energy for the heating of buildings has been simulated using the computer code MESSAGE developed for this purpose. Climatic conditions, construction details, modern heating technologies, water-heater facilities and user behaviour have been taken into consideration. Various improvements have been proposed. The result is an improved computer model simulating the heat consumption of buildings in Austria. (Suda)

  13. Whole-Building Hygrothermal Modeling in IEA Annex 41

    DEFF Research Database (Denmark)

    Rode, Carsten; Woloszyn, Monika

    2007-01-01

    Annex 41 of the International Energy Agency’s (IEA) Energy Conservation in Buildings and Community Systems program (ECBCS) is a cooperative project on “Whole-Building Heat, Air, and Moisture Response” (MOIST-ENG). Subtask 1 of that project set out to advance development in modeling the integral he...

  14. Numeric Analysis for Relationship-Aware Scalable Streaming Scheme

    Directory of Open Access Journals (Sweden)

    Heung Ki Lee

    2014-01-01

    Full Text Available Frequent packet loss of media data is a critical problem that degrades the quality of streaming services over mobile networks. Packet loss invalidates frames containing lost packets and other related frames at the same time. Indirect loss caused by losing packets decreases the quality of streaming. A scalable streaming service can decrease the amount of dropped multimedia resulting from a single packet loss. Content providers typically divide one large media stream into several layers through a scalable streaming service and then provide each scalable layer to the user depending on the mobile network. Also, a scalable streaming service makes it possible to decode partial multimedia data depending on the relationship between frames and layers. Therefore, a scalable streaming service provides a way to decrease the wasted multimedia data when one packet is lost. However, the hierarchical structure between frames and layers of scalable streams determines the service quality of the scalable streaming service. Even if whole packets of layers are transmitted successfully, they cannot be decoded as a result of the absence of reference frames and layers. Therefore, the complicated relationship between frames and layers in a scalable stream increases the volume of abandoned layers. For providing a high-quality scalable streaming service, we choose a proper relationship between scalable layers as well as the amount of transmitted multimedia data depending on the network situation. We prove that a simple scalable scheme outperforms a complicated scheme in an error-prone network. We suggest an adaptive set-top box (AdaptiveSTB) to lower the dependency between scalable layers in a scalable stream. Also, we provide a numerical model to obtain the indirect loss of multimedia data and apply it to various multimedia streams. Our AdaptiveSTB enhances the quality of a scalable streaming service by removing indirect loss.
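
    The "indirect loss" analyzed above can be captured in a few lines: a layer is decodable only if its own packets arrived and every layer it references is itself decodable. The layer names and the linear dependency chain below are illustrative, not a specific codec's hierarchy.

```python
deps = {                    # layer -> layers it depends on
    "base": [],
    "enh1": ["base"],
    "enh2": ["enh1"],
    "enh3": ["enh2"],
}
ORDER = ["base", "enh1", "enh2", "enh3"]   # dependency order

def decodable(received):
    """Return the layers that can actually be decoded."""
    ok = set()
    for layer in ORDER:
        if layer in received and all(d in ok for d in deps[layer]):
            ok.add(layer)
    return ok

# Losing only enh1 indirectly invalidates enh2 and enh3 as well:
assert decodable({"base", "enh2", "enh3"}) == {"base"}
```

    A flatter hierarchy (e.g. every enhancement layer depending only on the base) would have kept enh2 and enh3 usable after the same loss, which is the article's argument for simpler layer relationships on error-prone networks.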

  15. Integrating Building Information Modeling and Augmented Reality to Improve Investigation of Historical Buildings

    Directory of Open Access Journals (Sweden)

    Francesco Chionna

    2015-12-01

    Full Text Available This paper describes an experimental system to support investigation of historical buildings using Building Information Modeling (BIM) and Augmented Reality (AR). The system requires the use of off-line software to build the BIM representation and defines a method to integrate diagnostic data into BIM. The system offers access to such information during site investigation using AR glasses supported by marker and marker-less technologies. The main innovation is the possibility to contextualize through AR not only existing BIM properties but also results from non-invasive tools. User evaluations show how the use of the system may enhance the perception of engineers during the investigation process.

  16. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The Historic Building Information Model (HBIM) forms the basis for both structural and conservation analysis to measure the impact of war damage which still affects the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM was created of the damaged section of the building and is presented as two separate workflows in this paper. The first is the model created from historic data, the second a procedural and segmented model developed from the laser scan survey of the war damaged drum and dome. From both models structural damage and decay simulations will be developed for documentation and conservation analysis.

  17. Building Component Library: An Online Repository to Facilitate Building Energy Model Creation; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, K.; Long, N.; Swindler, A.

    2012-05-01

    This paper describes the Building Component Library (BCL), the U.S. Department of Energy's (DOE) online repository of building components that can be directly used to create energy models. This comprehensive, searchable library consists of components and measures as well as the metadata which describes them. The library is also designed to allow contributors to easily add new components, providing a continuously growing, standardized list of components for users to draw upon.

  18. Building energy modeling for green architecture and intelligent dashboard applications

    Science.gov (United States)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied the passive technique called the roof solar chimney for reducing the cooling load in homes architecturally. Three models of the chimney were created: a zonal building energy model, computational fluid dynamics model, and numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it for building simulation reliably. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high and low efficiency constructions, and three ventilation strategies. The results showed that in four US climates, the roof solar chimney results in significant cooling load energy savings of up to 90%. After developing this new method for the small scale representation of a passive architecture technique in BEM, the study expanded the scope to address a fundamental issue in modeling - the implementation of the uncertainty from and improvement of occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration to the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the

  19. Jeddah Historical Building Information Modelling "JHBIM" - Object Library

    Science.gov (United States)

    Baik, A.; Alitany, A.; Boehm, J.; Robson, S.

    2014-05-01

    The theory of Building Information Modelling "BIM" has been applied in several heritage sites worldwide for conserving, documenting and managing historical buildings, and for creating full engineering drawings and information. However, one of the most serious issues facing experts who wish to use Historical Building Information Modelling "HBIM" is modelling the complicated architectural elements of these historical buildings. Many of these outstanding architectural elements were designed and created on site to fit their exact location. Experts in Old Jeddah face the same issue in applying the BIM method to the city's historical buildings. The Saudi Arabian city has a long history, containing a large number of historic houses and buildings built since the 16th century. Furthermore, building the BIM model of a historical building in Old Jeddah always takes a long time, owing to the uniqueness of the Hijazi architectural elements and the absence of a library of such elements. This paper will focus on building the Hijazi architectural elements library based on laser scanner and image survey data. This solution will reduce the time needed to complete the HBIM model and offer an in-depth and rich digital architectural elements library to be used in any heritage project in the Al-Balad district, Jeddah City.

  20. Guidelines for Using Building Information Modeling for Energy Analysis of Buildings

    Directory of Open Access Journals (Sweden)

    Thomas Reeves

    2015-12-01

    Full Text Available Building energy modeling (BEM), a subset of building information modeling (BIM), integrates energy analysis into the design, construction, and operation and maintenance of buildings. As various BEM tools exist, there is a need to evaluate their utility in the different phases of the building lifecycle. The goal of this research was to develop guidelines for the evaluation and selection of BEM tools to be used in particular building lifecycle phases. The objectives of this research were to: (1) evaluate existing BEM tools; (2) illustrate the application of the three BEM tools; (3) re-evaluate the three BEM tools; and (4) develop guidelines for the evaluation, selection and application of BEM tools in the design, construction and operation/maintenance phases of buildings. Twelve BEM tools were initially evaluated using four criteria: interoperability, usability, available inputs, and available outputs. Each of the top three BEM tools selected from this initial evaluation was used in a case study to simulate and evaluate energy usage, daylighting performance, and natural ventilation for two academic buildings (one LEED-certified and one not). The results of the case study were used to re-evaluate the three BEM tools using the initial criteria plus two new criteria (speed and accuracy), and to develop guidelines for evaluating and selecting BEM tools to analyze building energy performance. The major contribution of this research is the development of these guidelines, which can help potential BEM users to identify the most appropriate BEM tool for particular building lifecycle phases.

  1. Artificial intelligence support for scientific model-building

    Science.gov (United States)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  2. Rapid Texture Mapping from Image Sequences for Building Geometry Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zuxun; WU Jun; ZHANG Jianqing

    2003-01-01

    An effective approach, mapping the texture for building models based on digital photogrammetric theory, is proposed. The easily acquired image sequences from a digital video camera on a helicopter are used as the texture resource, and the correspondence between a space edge in the building geometry model and its line feature in the image sequences is determined semi-automatically. The experimental results in the production of three-dimensional data for car navigation show an attractive future in both efficiency and effect.

  3. Research on the simulation framework in Building Information Modeling

    OpenAIRE

    Liang, Nan; Xu, Hongqing; Yu, Qiong

    2012-01-01

    In the last ten years, Building Information Modeling (BIM) has been proposed and applied in the architecture industry. For their high efficiency and visualization, BIM and related technologies are welcomed by architects, engineers, builders and owners, and thus modeling technologies for design have been widely researched. However, little attention is given to simulation, even though simulation is an important part of building design, maybe because it is seen as somewhat less related to the ...

  4. Building predictive models of soil particle-size distribution

    OpenAIRE

    Alessandro Samuel-Rosa; Ricardo Simão Diniz Dalmolin; Pablo Miguel

    2013-01-01

    Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land-surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographi...
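The regression approach the abstract describes can be sketched in a few lines. This is a hedged illustration only: the attribute names come from the abstract, but the values and coefficients below are synthetic, not the study's data.

```python
import numpy as np

# Synthetic stand-in for the study's setup: predict a particle-size
# fraction (e.g. % clay) from terrain attributes by multiple linear
# regression. All numbers are invented for illustration.
rng = np.random.default_rng(42)
n = 339  # sample count mentioned in the abstract
elevation = rng.uniform(100, 500, n)
slope = rng.uniform(0, 30, n)
catchment_area = rng.uniform(1, 50, n)

# Fabricated response: clay % as a linear mix of attributes plus noise
clay = 10 + 0.02 * elevation - 0.3 * slope + 0.1 * catchment_area \
       + rng.normal(0, 1, n)

# Design matrix with an intercept column; ordinary least squares fit
X = np.column_stack([np.ones(n), elevation, slope, catchment_area])
coef, _, rank, _ = np.linalg.lstsq(X, clay, rcond=None)
predicted = X @ coef
r2 = 1 - np.sum((clay - predicted) ** 2) / np.sum((clay - clay.mean()) ** 2)
print(coef, r2)
```

With real soil data the fit quality (R²) rather than the coefficients themselves is usually the first thing inspected, since terrain attributes are often strongly correlated with one another.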

  5. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of their wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders the practice of designing out waste in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. PMID:27292581

  6. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in the IFC schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFC schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.

  7. Building 3D models with modo 701

    CERN Document Server

    García, Juan Jiménez

    2013-01-01

    The book will focus on creating a sample application, building gradually from chapter to chapter. If you are new to the 3D world, this is the key to getting started with modern software in the modern visualization industry. Only minimal previous knowledge is needed. If you have some previous knowledge of 3D content creation, you will find useful tricks that differentiate this book from a typical user manual: it is a practical guide to the most common problems and situations and how to solve them.

  8. Development of hazard-compatible building fragility and vulnerability models

    Science.gov (United States)

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
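Fragility functions of the kind discussed above are commonly parameterised as a lognormal CDF of spectral acceleration. The sketch below uses that common parameterisation, not necessarily the authors' exact functional form, and the median and dispersion values are invented.

```python
import math

def fragility(sa, theta, beta):
    """Probability of reaching or exceeding a damage state given spectral
    acceleration `sa` (g), for a lognormal fragility with median `theta` (g)
    and log-standard-deviation `beta`. Values used below are illustrative."""
    return 0.5 * (1.0 + math.erf(math.log(sa / theta) / (beta * math.sqrt(2.0))))

# At the median spectral acceleration the exceedance probability is 50%
p = fragility(0.4, theta=0.4, beta=0.6)
print(round(p, 2))  # → 0.5
```

Conditioning on spectral acceleration, as the abstract notes, is what makes such curves directly composable with conventional seismic hazard curves in a probabilistic loss assessment.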

  9. Modelling of Building Interiors with Mobile Phone Sensor Data

    Directory of Open Access Journals (Sweden)

    Julian Rosser

    2015-06-01

    Full Text Available Creating as-built plans of building interiors is a challenging task. In this paper we present a semi-automatic modelling system for creating residential building interior plans and their integration with existing map data to produce building models. Taking a set of imprecise measurements made with an interactive mobile phone room mapping application, the system performs spatial adjustments in accordance with soft and hard constraints imposed on the building plan geometry. The approach uses an optimisation model that exploits a high accuracy building outline, such as can be found in topographic map data, and the building topology to improve the quality of interior measurements and generate a standardised output. We test our system on building plans of five residential homes. Our evaluation shows that the approach enables construction of accurate interior plans from imprecise measurements. The experiments report an average accuracy of 0.24 m, close to the 0.20 m recommended by the CityGML LoD4 specification.
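The core adjustment idea (imprecise interior measurements corrected against an accurate building outline acting as a hard constraint) can be reduced to a toy example. This is a deliberately simplified sketch with invented numbers, not the paper's optimisation model:

```python
import numpy as np

# Three measured room widths should sum to a building outline width that
# is known accurately from topographic map data. Distribute the
# misclosure equally across the measurements (an equal-weight
# least-squares adjustment with one hard constraint).
measured = np.array([3.1, 4.25, 2.9])  # imprecise interior measurements (m)
outline_width = 10.0                   # accurate outline width from map data (m)

misclosure = outline_width - measured.sum()
adjusted = measured + misclosure / measured.size  # equal correction per room
print(adjusted, adjusted.sum())  # adjusted widths now sum exactly to 10.0
```

The full system in the paper additionally handles soft constraints such as wall orthogonality and room topology, which turn this into a larger constrained optimisation rather than a one-line correction.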

  10. An Occupant Behavior Model for Building Energy Efficiency and Safety

    Science.gov (United States)

    Pan, L. L.; Chen, T.; Jia, Q. S.; Yuan, R. X.; Wang, H. T.; Ding, R.

    2010-05-01

    An occupant behavior model is suggested to improve building energy efficiency and safety. This paper provides a generic outline of the model, which includes occupancy behavior abstraction, model framework and primary structure, input and output, and computer simulation results, as well as a summary and outlook. Using information technology, it is now possible to collect a large amount of occupancy information. Yet this provides only partial and historical information, so it is important to develop a model that gives a full view of the building under study as well as predictions. We used the infrared monitoring system set at the front door of the Low Energy Demo Building (LEDB) at Tsinghua University in China to provide the time variation of the total number of occupants in the building. This information is used as input data for the model, while an RFID system on the 1st floor provides the time variation of the occupants' localization in each region. The collected data are used to validate the model. The simulation results show that the presented model provides a feasible framework to simulate occupants' behavior and predict the time variation of the number of occupants in the building. Further development and application of the model are also discussed.

  11. Building models for marketing decisions : past, present and future

    NARCIS (Netherlands)

    Leeflang, P.S.H.; Wittink, Dick R.

    2000-01-01

    We review five eras of model building in marketing, with special emphasis on the fourth and the fifth eras, the present and the future. At many firms managers now routinely use model-based results for marketing decisions. Given an increasing number of successful applications, the demand for models t

  13. Building Test Cases through Model Driven Engineering

    Science.gov (United States)

    Sousa, Helaine; Lopes, Denivaldo; Abdelouahab, Zair; Hammoudi, Slimane; Claro, Daniela Barreiro

    Recently, Model Driven Engineering (MDE) has been proposed to face the complexity in the development, maintenance and evolution of large and distributed software systems. Model Driven Architecture (MDA) is an example of MDE. In this context, model transformations enable large-scale reuse of software systems through the transformation of a Platform Independent Model into a Platform Specific Model. Although source code can be generated from models, defects can be injected during the modeling or transformation process. In order to deliver software systems without defects that cause errors and failures, the source code must be submitted to testing. In this paper, we present an approach that takes care of testing across the whole software life cycle, i.e. it starts at the modeling level and finishes with the testing of the source code of software systems. We provide an example to illustrate our approach.

  14. Building Simple Hidden Markov Models. Classroom Notes

    Science.gov (United States)

    Ching, Wai-Ki; Ng, Michael K.

    2004-01-01

    Hidden Markov models (HMMs) are widely used in bioinformatics, speech recognition and many other areas. This note presents HMMs via the framework of classical Markov chain models. A simple example is given to illustrate the model. An estimation method for the transition probabilities of the hidden states is also discussed.
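One elementary estimator in this setting, when a state sequence is observed, is to count transitions and normalise each row. This is a minimal sketch of that idea (not necessarily the note's exact estimation method, which concerns hidden states), with an invented toy sequence:

```python
import numpy as np

def estimate_transitions(states, n_states):
    """Estimate a Markov transition matrix from an observed state
    sequence by counting transitions and normalising each row to a
    probability distribution."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

seq = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]  # toy observed state sequence
P = estimate_transitions(seq, 2)
print(P)  # row i gives P(next state | current state i)
```

For genuinely hidden states, this counting step becomes the expectation step inside an iterative procedure (e.g. Baum-Welch), since the transitions cannot be observed directly.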

  15. Building fire zone model with symbolic mathematics

    Institute of Scientific and Technical Information of China (English)

    武红梅; 郜冶; 周允基

    2009-01-01

    To make fire modelling accessible to the fire engineer through symbolic mathematics, the key equations of a zone model were demonstrated. There were thirteen variables with nine constraints, so only four ordinary differential equations (ODEs) needed to be solved. A typical fire model with a two-room structure was studied. Accordingly, the source terms included in the ODEs were simplified and modelled, and the fourth-order Runge-Kutta method was used to solve the ODEs with symbolic mathematics. A zone model could then be used with symbolic mathematics. It is proposed that symbolic mathematics can be used by the fire engineer.
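The numerical core mentioned above, a classical fourth-order Runge-Kutta step for a system of ODEs, can be sketched generically. The two-equation system below is a toy stand-in (a harmonic oscillator), not the zone model's actual source terms:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y),
    where y is a list of state variables."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Toy system: dy0/dt = y1, dy1/dt = -y0, with y(0) = [1, 0]
f = lambda t, y: [y[1], -y[0]]
y, t, h = [1.0, 0.0], 0.0, 0.01
for _ in range(100):  # integrate to t = 1
    y = rk4_step(f, t, y, h)
    t += h
print(y)  # ≈ [cos(1), -sin(1)]
```

In a real zone model the state vector would hold the four remaining free variables (e.g. layer temperatures and interface heights), with the nine constraints eliminating the rest algebraically.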

  16. CFD Modeling of Airflow in a Livestock Building

    DEFF Research Database (Denmark)

    Rong, Li; Elhadidi, B.; Khalifa, H. E.;

    2010-01-01

    In this paper, a 2D simulation for a typical livestock building is performed to assess the ammonia emission removal rate to the atmosphere. Two geometry models are used and compared in order to represent the slatted floor. In the first model the floor is modeled as a slatted floor and in the second...... exploring the accuracy of the porous jump assumption by comparing the velocity, and ammonia concentration in a 2D simulation, heated solid bodies are added to represent the livestock in the following simulations. The results of simulations with heat source also indicate that modeling the slatted floor with...... livestock buildings....

  17. Linking Remote Sensing Data and Energy Balance Models for a Scalable Agriculture Insurance System for sub-Saharan Africa

    Science.gov (United States)

    Brown, M. E.; Osgood, D. E.; McCarty, J. L.; Husak, G. J.; Hain, C.; Neigh, C. S. R.

    2014-12-01

    One of the most immediate and obvious impacts of climate change is on the weather-sensitive agriculture sector. Both local and global impacts on production of food will have a negative effect on the ability of humanity to meet its growing food demands. Agriculture has become more risky, particularly for farmers in the most vulnerable and food insecure regions of the world such as East Africa. Smallholders and low-income farmers need better financial tools to reduce the risk to food security while enabling productivity increases to meet the needs of a growing population. This paper will describe a recently funded project that brings together climate science, economics, and remote sensing expertise to focus on providing a scalable and sensor-independent remote sensing based product that can be used in developing regional rainfed agriculture insurance programs around the world. We will focus our efforts in Ethiopia and Kenya in East Africa and in Senegal and Burkina Faso in West Africa, where there are active index insurance pilots that can test the effectiveness of our remote sensing-based approach for use in the agriculture insurance industry. The paper will present the overall program, explain links to the insurance industry, and present comparisons of the four remote sensing datasets used to identify drought: the CHIRPS 30-year rainfall data product, the GIMMS 30-year vegetation data product from AVHRR, the ESA ECV 30-year soil moisture data product, and a MODIS evapotranspiration (ET) 15-year dataset. A summary of next year's plans for this project will be presented at the close of the presentation.

  18. QOS OF WEB SERVICE: SURVEY ON PERFORMANCE AND SCALABILITY

    OpenAIRE

    Ch Ram Mohan Reddy; R. V Raghavendra Rao; D Evangelin Geetha; T. V. Suresh Kumar; K Rajani Kanth

    2013-01-01

    In today’s scenario, most organizations provide their services through the web, which makes web services an important research area. In addition, when designing and building web services, it is necessary to concentrate early on the quality of the web services. Performance is an important quality attribute to be considered during the design of web services. The expected performance can be achieved by proper scheduling of resources and scalability of the system. Scalability i...

  19. Duct thermal performance models for large commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Wray, Craig P.

    2003-10-01

    Despite the potential for significant energy savings by reducing duct leakage or other thermal losses from duct systems in large commercial buildings, California Title 24 has no provisions to credit energy-efficient duct systems in these buildings. A substantial reason is the lack of readily available simulation tools to demonstrate the energy-saving benefits associated with efficient duct systems in large commercial buildings. The overall goal of the Efficient Distribution Systems (EDS) project within the PIER High Performance Commercial Building Systems Program is to bridge the gaps in current duct thermal performance modeling capabilities, and to expand our understanding of duct thermal performance in California large commercial buildings. As steps toward this goal, our strategy in the EDS project involves two parts: (1) developing a whole-building energy simulation approach for analyzing duct thermal performance in large commercial buildings, and (2) using the tool to identify the energy impacts of duct leakage in California large commercial buildings, in support of future recommendations to address duct performance in the Title 24 Energy Efficiency Standards for Nonresidential Buildings. The specific technical objectives for the EDS project were to: (1) Identify a near-term whole-building energy simulation approach that can be used in the impacts analysis task of this project (see Objective 3), with little or no modification. A secondary objective is to recommend how to proceed with long-term development of an improved compliance tool for Title 24 that addresses duct thermal performance. (2) Develop an Alternative Calculation Method (ACM) change proposal to include a new metric for thermal distribution system efficiency in the reporting requirements for the 2005 Title 24 Standards. The metric will facilitate future comparisons of different system types using a common "yardstick". (3) Using the selected near-term simulation approach

  20. Fitting of Parametric Building Models to Oblique Aerial Images

    Science.gov (United States)

    Panday, U. S.; Gerke, M.

    2011-09-01

    In literature and in photogrammetric workstations many approaches and systems to automatically reconstruct buildings from remote sensing data are described and available. Those building models are being used for instance in city modeling or in a cadastre context. If a roof overhang is present, the building walls cannot be estimated correctly from nadir-view aerial images or airborne laser scanning (ALS) data. This leads to inconsistent building outlines, which has a negative influence on visual impression, but more seriously also represents a wrong legal boundary in the cadastre. Oblique aerial images, as opposed to nadir-view images, reveal greater detail, enabling different views of an object taken from different directions. Building walls are visible in oblique images directly, and those images are used for automated roof overhang estimation in this research. A fitting algorithm is employed to find the roof parameters of simple buildings. It uses a least squares algorithm to fit projected wire frames to their corresponding edge lines extracted from the images. Self-occlusion is detected based on the intersection of the viewing ray and the planes formed by the building, whereas occlusion from other objects is detected using an ALS point cloud. Overhang and ground height are obtained by sweeping vertical and horizontal planes respectively. Experimental results are verified with high resolution ortho-images, field survey, and ALS data. Planimetric accuracy of 1 cm mean and 5 cm standard deviation was obtained, while building orientation was accurate to a mean of 0.23° and standard deviation of 0.96° against the ortho-image. Overhang parameters were aligned to approximately 10 cm with the field survey. The ground and roof heights were accurate to means of -9 cm and 8 cm, with standard deviations of 16 cm and 8 cm, against ALS respectively. The developed approach reconstructs 3D building models well in cases of sufficient texture. More images should be acquired for completeness of
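The least-squares fitting step can be reduced to its simplest form: estimating a line from noisy points, analogous to fitting a projected wire-frame edge to edge pixels extracted from an image. This is a heavily simplified sketch with synthetic data, not the paper's full parametric-model fit:

```python
import numpy as np

# Synthetic "edge pixels": points on a line y = 0.5 x + 2 plus noise,
# standing in for an edge extracted from an oblique aerial image.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y_true = 0.5 * x + 2.0
y_obs = y_true + rng.normal(0, 0.05, x.size)

# Least-squares estimate of slope and intercept
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y_obs, rcond=None)
print(slope, intercept)  # ≈ 0.5, 2.0
```

In the actual system the unknowns are building parameters (overhang length, ground height, roof shape), and the residuals are distances between projected model edges and image edge lines, but the normal-equations machinery is the same.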

  1. Team learning: building shared mental models

    OpenAIRE

    Van den Bossche, Piet; Gijselaers, Wim; Segers, Mien; Woltjer, Geert; Kirschner, Paul A.

    2011-01-01

    To gain insight in the social processes that underlie knowledge sharing in teams, this article questions which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning behaviors and team effectiveness. Analyses were performed on student-teams engaged in a business simulation game. The measurement of shared mental models was based on cognitive mapping techniques. ...

  2. Building a better model of cancer

    Directory of Open Access Journals (Sweden)

    DeGregori James

    2006-10-01

    Full Text Available Abstract The 2006 Cold Spring Harbor Laboratory meeting on the Mechanisms and Models of Cancer was held August 16–20. The meeting featured several hundred presentations, many of them short talks (mostly selected from the abstracts) and posters, with the airing of a number of exciting new discoveries. We will focus this meeting review on models of cancer (primarily mouse models), highlighting recent advances in new mouse models that better recapitulate sporadic tumorigenesis, demonstrations of tumor addiction to tumor suppressor inactivation, new insight into senescence as a tumor barrier, improved understanding of the evolutionary paths of cancer development, and environmental/immunological influences on cancer.

  3. Scalable filter banks

    Science.gov (United States)

    Hur, Youngmi; Okoudjou, Kasso A.

    2015-08-01

    A finite frame is said to be scalable if its vectors can be rescaled so that the resulting set of vectors is a tight frame. The theory of scalable frames has been extended to the setting of Laplacian pyramids, which are based on (rectangular) paraunitary matrices whose column vectors are Laurent polynomial vectors. This is equivalent to scaling the polyphase matrices of the associated filter banks. Consequently, tight wavelet frames can be constructed by appropriately scaling the columns of these paraunitary matrices by diagonal matrices whose diagonal entries are squared magnitudes of Laurent polynomials. In this paper we present examples of tight wavelet frames constructed in this manner and discuss some of their properties in comparison to the (non-tight) wavelet frames they arise from.
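The defining tightness condition is easy to check numerically: a finite frame is tight exactly when its frame operator S = Σ fᵢfᵢᵀ is a multiple of the identity. The sketch below verifies this for the standard three-vector "Mercedes-Benz" frame in R², a textbook example rather than anything from the paper:

```python
import numpy as np

# Three unit vectors at 120° to each other form a tight frame in R^2
# with frame bound 3/2, i.e. S = 1.5 * I.
angles = [np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3]
F = np.array([[np.cos(a), np.sin(a)] for a in angles])  # rows = frame vectors

S = F.T @ F  # frame operator (sum of outer products of the rows)
print(S)     # ≈ 1.5 * identity
```

Scalability asks the converse question: given a non-tight frame, do there exist nonnegative weights on the vectors making S a multiple of the identity? The paper's Laurent-polynomial setting replaces those scalar weights with squared magnitudes of Laurent polynomials.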

  4. Modelling, design, and optimization of net-zero energy buildings

    CERN Document Server

    Athienitis, Andreas

    2015-01-01

    Building energy design is currently going through a period of major changes. One key factor of this is the adoption of net-zero energy as a long term goal for new buildings in most developed countries. To achieve this goal a lot of research is needed to accumulate knowledge and to utilize it in practical applications. In this book, accomplished international experts present advanced modeling techniques as well as in-depth case studies in order to aid designers in optimally using simulation tools for net-zero energy building design. The strategies and technologies discussed in this book are, ho

  5. Scalable Content Management System

    OpenAIRE

    Sandeep Krishna S, Jayant Dani

    2013-01-01

    Immense growth in the volume of content every day demands a more scalable system to handle and overcome difficulties in the capture, storage, transformation, search, sharing and visualization of data, where the data can be structured or unstructured data of any type. A system to manage the growing content and overcome the issues and complexity faced, using appropriate technologies, would offer advantages in measurable qualities like flexibility, interoperability, customizabi...

  6. Scalable Resolution Display Walls

    KAUST Repository

    Leigh, Jason

    2013-01-01

    This article will describe the progress since 2000 on research and development in 2-D and 3-D scalable resolution display walls that are built from tiling individual lower resolution flat panel displays. The article will describe approaches and trends in display hardware construction, middleware architecture, and user-interaction design. The article will also highlight examples of use cases and the benefits the technology has brought to their respective disciplines. © 1963-2012 IEEE.

  7. Team learning: building shared mental models

    NARCIS (Netherlands)

    Bossche, van den P.; Gijselaers, W.; Segers, M.; Woltjer, G.B.; Kirschner, P.

    2011-01-01

    To gain insight in the social processes that underlie knowledge sharing in teams, this article questions which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning

  8. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
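
The linking idea in this abstract — a unique metadata field tying each Basic Event to its data source and its manipulations — can be sketched in a few lines. All names, rates and stressing factors below are hypothetical illustrations, not values from the paper:

```python
# Minimal sketch of a spreadsheet-style database: each Basic Event carries
# a metadata key ("source") pointing at its data source, and the database
# applies the appropriate manipulation (here, a stressing factor).

data_sources = {
    "SRC-001": {"description": "Valve failure-rate handbook", "rate": 1.0e-6},
    "SRC-002": {"description": "Pump qualification test data", "rate": 5.0e-7},
}

basic_events = [
    {"id": "BE-VALVE-1", "source": "SRC-001", "stress_factor": 2.0},
    {"id": "BE-PUMP-1",  "source": "SRC-002", "stress_factor": 1.0},
]

def effective_rate(event):
    """Trace a Basic Event back to its data source and apply its manipulation."""
    src = data_sources[event["source"]]
    return src["rate"] * event["stress_factor"]

for ev in basic_events:
    print(ev["id"], "->", ev["source"], effective_rate(ev))
```

Because every Basic Event names its source explicitly, the traceability the abstract emphasizes falls out of the data layout rather than relying on manual bookkeeping.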

  9. Team Learning: Building Shared Mental Models

    Science.gov (United States)

    Van den Bossche, Piet; Gijselaers, Wim; Segers, Mien; Woltjer, Geert; Kirschner, Paul

    2011-01-01

    To gain insight in the social processes that underlie knowledge sharing in teams, this article questions which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning behaviors and team effectiveness. Analyses were…

  10. Building Water Models, A Different Approach

    CERN Document Server

    Izadi, Saeed; Onufriev, Alexey V

    2014-01-01

    Simplified, classical models of water are an integral part of atomistic molecular simulations, especially in biology and chemistry where hydration effects are critical. Yet, despite several decades of effort, these models are still far from perfect. Presented here is an alternative approach to constructing point charge water models - currently, the most commonly used type. In contrast to the conventional approach, we do not impose any geometry constraints on the model other than symmetry. Instead, we optimize the distribution of point charges to best describe the "electrostatics" of the water molecule, which is key to many unusual properties of liquid water. The search for the optimal charge distribution is performed in 2D parameter space of key lowest multipole moments of the model, to find best fit to a small set of bulk water properties at room temperature. A virtually exhaustive search is enabled via analytical equations that relate the charge distribution to the multipole moments. The resulting "optimal"...
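
For reference, the "lowest multipole moments" on which the search space is built are the textbook quantities for a neutral set of point charges $q_i$ at positions $\mathbf{r}_i$ (the paper's specific parameterization is not reproduced here):

```latex
\sum_i q_i = 0 \quad \text{(neutrality)}, \qquad
\boldsymbol{\mu} = \sum_i q_i\, \mathbf{r}_i \quad \text{(dipole)}, \qquad
Q_{\alpha\beta} = \frac{1}{2}\sum_i q_i
  \left( 3\, r_{i\alpha}\, r_{i\beta} - r_i^{2}\, \delta_{\alpha\beta} \right)
  \quad \text{(quadrupole)}.
```

Fixing a pair of such low-order moments gives a 2-D parameter space like the one the abstract describes, with the charge positions and magnitudes then recoverable analytically.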

  11. Conjugate modelling of convective drying phenomena in porous building materials

    International Nuclear Information System (INIS)

    Moisture storage and the associated heat and moisture transport in buildings have a large impact on the building envelope durability, the energy consumption in buildings and the indoor climate. Nowadays, HAM (Heat, Air and Moisture transport) models are widely used to simulate and predict the effect of these transport phenomena in detail. Recently, these HAM models have been coupled to CFD (Computational Fluid Dynamics) to study the moisture exchange between air and porous materials on a local scale (microclimates). The objective of this research is to develop such a model to study drying phenomena. In this paper the emphasis lies on the modelling of convective drying of porous building materials. An important aspect of correctly modelling convective drying is the way the air boundary is implemented. A short literature review reveals that different modelling approaches can be used. This paper gives a short overview of the state of the art in conjugate heat and mass transport modelling for convective drying. In this review, shortcomings of currently applied modelling approaches are highlighted. Finally, the newly developed model is used to simulate the convective drying of a sample of ceramic brick. These simulations were then compared with measurements from the literature. A good agreement was found.

  12. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid both for "base case" model building and for stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  13. Modelling energy demand in the Norwegian building stock

    Energy Technology Data Exchange (ETDEWEB)

    Sartori, Igor

    2008-07-15

    Energy demand in the building stock in Norway represents about 40% of the final energy consumption, of which 22% goes to the residential sector and 18% to the service sector. In Norway there is a strong dependency on electricity for heating purposes, with electricity covering about 80% of the energy demand in buildings. The building sector can play an important role in the achievement of a more sustainable energy system. The work performed in the articles presented in this thesis investigates various aspects related to the energy demand in the building sector, both in singular cases and in the stock as a whole. The work performed in the first part of this thesis on development and survey of case studies provided background knowledge that was then used in the second part, on modelling the entire stock. In the first part, a literature survey of case studies showed that, in a life cycle perspective, the energy used in the operating phase of buildings is the single most important factor. Design of low-energy buildings is then beneficial and should be pursued, even though it implies a somewhat higher embodied energy. A case study was performed on a school building. First, a methodology using a Monte Carlo method in the calibration process was explored. Then, the calibrated model of the school was used to investigate measures for the achievement of high energy efficiency standard through renovation work. In the second part, a model was developed to study the energy demand in a scenario analysis. The results showed the robustness of policies that included conservation measures against the conflicting effects of the other policies. Adopting conservation measures on a large scale showed the potential to reduce both electricity and total energy demand from present day levels while the building stock keeps growing. The results also highlighted the inertia to change of the building stock, due to low activity levels compared to the stock size. It also became clear that a deeper

  14. Experimental and analytical studies of a deeply embedded reactor building model considering soil-building interaction. Pt. 1

    International Nuclear Information System (INIS)

    The purpose of this paper is to describe the dynamic characteristics of a deeply embedded reactor building model derived from experimental and analytical studies which consider soil-building interaction behaviour. The model building is made of reinforced concrete. It has two stories above ground level and a basement, resting on a sandy gravel layer at a depth of 3 meters. The backfill around the building was placed up to ground level. The model building is simplified and reduced to about one-fifteenth (1/15) of the prototype. It has a bearing-wall system for the basement and first story, and a frame system for the second story. (orig.)

  15. Building probabilistic graphical models with Python

    CERN Document Server

    Karkera, Kiran R

    2014-01-01

    This is a short, practical guide that allows data scientists to understand the concepts of Graphical models and enables them to try them out using small Python code snippets, without being too mathematically complicated. If you are a data scientist who knows about machine learning and want to enhance your knowledge of graphical models, such as Bayes network, in order to use them to solve real-world problems using Python libraries, this book is for you. This book is intended for those who have some Python and machine learning experience, or are exploring the machine learning field.

  16. BUILDING A SUSTAINABLE REGION ECONOMIC DEVELOPMENT MODEL

    Directory of Open Access Journals (Sweden)

    Pshunetlev A. A.

    2014-09-01

    Full Text Available The article contains basic assumptions of the regional sustainable economic development model, which can be used to gain new knowledge about economic processes, contribute to the stability of regional development, and serve as an educational tool in the study of relevant disciplines.

  17. Building Location Models for Visual Place Recognition

    OpenAIRE

    Stumm, Elena; Mei, Christopher; Lacroix, Simon

    2015-01-01

    This paper deals with the task of appearance-based mapping and place recognition. Previously, the scope of a location generally varied between either using discrete poses or loosely defined sequences of poses, facing problems related to perceptual aliasing and path invariance respectively. Here, we present a unified framework for defining, modelling and recognizing places in a way which is directly related to the underlying structure of features in the environment. A covisibility map of the e...

  18. On the geometry of cosmological model building

    OpenAIRE

    Scholz, Erhard

    2005-01-01

    This article analyzes the present anomalies of cosmology from the point of view of integrable Weyl geometry. It uses P.A.M. Dirac's proposal for a weak extension of general relativity, with some small adaptations. Simple models with interesting geometrical and physical properties, not belonging to the Friedmann-Lemaître class, are studied in this frame. Those with positive spatial curvature (Einstein-Weyl universes) go well together with observed mass density $\Omega_m$, CMB, supernovae ...

  19. Impact of the U.S. National Building Information Model Standard (NBIMS) on Building Energy Performance Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2007-08-01

    The U.S. National Institute of Building Sciences (NIBS) started the development of the National Building Information Model Standard (NBIMS). Its goal is to define standard sets of data required to describe any given building in necessary detail so that any given AECO industry discipline application can find needed data at any point in the building lifecycle. This will include all data that are used in or are pertinent to building energy performance simulation and analysis. This paper describes the background that led to the development of NBIMS, its goals and development methodology, its Part 1 (Version 1.0), and its probable impact on building energy performance simulation and analysis.

  20. A Unified Building Model for 3D Urban GIS

    Directory of Open Access Journals (Sweden)

    Ihab Hijazi

    2012-07-01

    Full Text Available Several tasks in urban and architectural design are today undertaken in a geospatial context. Building Information Models (BIM) and geospatial technologies offer 3D data models that provide information about buildings and the surrounding environment. The Industry Foundation Classes (IFC) and CityGML are today the two most prominent semantic models for representation of BIM and geospatial models respectively. CityGML has emerged as a standard for modeling city models while IFC has been developed as a reference model for building objects and sites. Current CAD and geospatial software provide tools that allow the conversion of information from one format to the other. These tools are however fairly limited in their capabilities, often resulting in data and information losses in the transformations. This paper describes a new approach for data integration based on a unified building model (UBM) which encapsulates both the CityGML and IFC models, thus avoiding translations between the models and loss of information. To build the UBM, all classes and related concepts were initially collected from both models, overlapping concepts were merged, new objects were created to ensure the capturing of both indoor and outdoor objects, and finally, spatial relationships between the objects were redefined. Unified Modeling Language (UML) notations were used for representing its objects and relationships between them. Two use-case scenarios, both set in a hospital (“evacuation” and “allocating spaces for patient wards”), were developed to validate and test the proposed UBM data model. Based on these two scenarios, four validation queries were defined in order to validate the appropriateness of the proposed unified building model. It has been validated, through the case scenarios and four queries, that the UBM being developed is able to integrate CityGML data as well as IFC data in an apparently seamless way. Constraints and enrichment functions are

  1. Models for describing the thermal characteristics of building components

    DEFF Research Database (Denmark)

    Jimenez, M.J.; Madsen, Henrik

    2008-01-01

    example. For the analysis of these tests, dynamic analysis models and methods are required. However, a wide variety of models and methods exists, and the problem of choosing the most appropriate approach for each particular case is a non-trivial and interdisciplinary task. Knowledge of a large family of … these approaches may therefore be very useful for selecting a suitable approach for each particular case. This paper presents an overview of models that can be applied for modelling the thermal characteristics of buildings and building components using data from outdoor testing. The choice of approach … mathematically demonstrated. The characteristics of each type of model are highlighted. Some available software tools for each of the methods described will be mentioned. A case study also demonstrating the difference between linear and nonlinear models is considered.

  2. Building entity models through observation and learning

    Science.gov (United States)

    Garcia, Richard; Kania, Robert; Fields, MaryAnne; Barnes, Laura

    2011-05-01

    To support the missions and tasks of mixed robotic/human teams, future robotic systems will need to adapt to the dynamic behavior of both teammates and opponents. One of the basic elements of this adaptation is the ability to exploit both long and short-term temporal data. This adaptation allows robotic systems to predict/anticipate, as well as influence, future behavior for both opponents and teammates and will afford the system the ability to adjust its own behavior in order to optimize achievement of the mission goals. This work is a preliminary step in the effort to develop online entity behavior models through a combination of learning techniques and observations. As knowledge is extracted from the system through sensor and temporal feedback, agents within the multi-agent system attempt to develop and exploit a basic movement model of an opponent. For the purpose of this work, extraction and exploitation are performed through the use of a discretized two-dimensional game. The game consists of a predetermined number of sentries attempting to keep an unknown intruder agent from penetrating their territory. The sentries utilize temporal data coupled with past opponent observations to hypothesize the probable locations of the opponent and thus optimize their guarding locations.
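
The core of the sentry strategy — turning past opponent observations into probable locations on a discretized grid — can be sketched as a simple frequency model. Grid size, observations and the smoothing choice below are illustrative assumptions, not the paper's actual algorithm:

```python
from collections import Counter

# Toy sketch: sentries accumulate past sightings of an intruder on a
# discretized 2-D grid and derive a probability map of likely locations.

GRID_ROWS, GRID_COLS = 5, 5

# Past observed intruder positions (row, col) -- illustrative data only.
observations = [(0, 1), (0, 1), (1, 1), (4, 4)]
counts = Counter(observations)

# Laplace smoothing: every cell gets a pseudo-count of 1 so that
# never-visited cells keep a nonzero probability.
total = len(observations) + GRID_ROWS * GRID_COLS
prob = {
    (r, c): (counts[(r, c)] + 1) / total
    for r in range(GRID_ROWS)
    for c in range(GRID_COLS)
}

# A sentry positions itself at the currently most probable cell.
guard_cell = max(prob, key=prob.get)
```

A real system would fold in the temporal feedback the abstract mentions (e.g. decaying old sightings), but the map-then-guard loop is the same shape.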

  3. Building a 3-D Appearance Model of the Human Face

    OpenAIRE

    Sjöstrand, Karl; Larsen, Rasmus; Lading, Brian

    2003-01-01

    This paper describes a method for building an appearance model from three-dimensional data of human faces. The data consists of 3-D vertices, polygons and a texture map. The method uses a set of nine manually placed landmarks to automatically form a dense correspondence of thousands of points. This makes sure the model is able to capture the subtle details of a face. The model can be used for face segmentation and fully automated face registration.

  4. Building a 3-D Appearance Model of the Human Face

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Larsen, Rasmus; Lading, Brian

    2003-01-01

    This paper describes a method for building an appearance model from three-dimensional data of human faces. The data consists of 3-D vertices, polygons and a texture map. The method uses a set of nine manually placed landmarks to automatically form a dense correspondence of thousands of points. This makes sure the model is able to capture the subtle details of a face. The model can be used for face segmentation and fully automated face registration.

  5. Building aggregate timber supply models from individual harvest choice

    OpenAIRE

    Polyakov, Maksym; Wear, David N.; Huggett, Robert

    2009-01-01

    Timber supply has traditionally been modelled using aggregate data. In this paper, we build aggregate supply models for four roundwood products for the US state of North Carolina from a stand-level harvest choice model applied to detailed forest inventory. The simulated elasticities of pulpwood supply are much lower than reported by previous studies. Cross price elasticities indicate a dominant influence of sawtimber markets on pulpwood supply. This approach allows predicting the supply conse...

  6. Green Template for Life Cycle Assessment of Buildings Based on Building Information Modeling: Focus on Embodied Environmental Impact

    OpenAIRE

    Sungwoo Lee; Sungho Tae; Seungjun Roh; Taehyung Kim

    2015-01-01

    The increased popularity of building information modeling (BIM) for application in the construction of eco-friendly green buildings has given rise to techniques for evaluating green buildings constructed using BIM features. Existing BIM-based green building evaluation techniques mostly rely on externally provided evaluation tools, which pose problems associated with interoperability, including a lack of data compatibility and the amount of time required for format conversion. To overcome thes...

  7. Building a generalized distributed system model

    Science.gov (United States)

    Mukkamala, R.

    1992-01-01

    The key elements in the second year (1991-92) of our project are: (1) implementation of the distributed system prototype; (2) successful passing of the candidacy examination and a PhD proposal acceptance by the funded student; (3) design of storage efficient schemes for replicated distributed systems; and (4) modeling of gracefully degrading reliable computing systems. In the third year of the project (1992-93), we propose to: (1) complete the testing of the prototype; (2) enhance the functionality of the modules by enabling the experimentation with more complex protocols; (3) use the prototype to verify the theoretically predicted performance of locking protocols, etc.; and (4) work on issues related to real-time distributed systems. This should result in efficient protocols for these systems.

  8. Model-building codes for membrane proteins.

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, David Noyes; Hunt, Thomas W.; Brown, W. Michael; Schoeniger, Joseph S. (Sandia National Laboratories, Livermore, CA); Slepoy, Alexander; Sale, Kenneth L. (Sandia National Laboratories, Livermore, CA); Young, Malin M. (Sandia National Laboratories, Livermore, CA); Faulon, Jean-Loup Michel; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA)

    2005-01-01

    We have developed a novel approach to modeling the transmembrane spanning helical bundles of integral membrane proteins using only a sparse set of distance constraints, such as those derived from MS3-D, dipolar-EPR and FRET experiments. Algorithms have been written for searching the conformational space of membrane protein folds matching the set of distance constraints, which provides initial structures for local conformational searches. Local conformation search is achieved by optimizing these candidates against a custom penalty function that incorporates both measures derived from statistical analysis of solved membrane protein structures and distance constraints obtained from experiments. This results in refined helical bundles to which the interhelical loops and amino acid side-chains are added. Using a set of only 27 distance constraints extracted from the literature, our methods successfully recover the structure of dark-adapted rhodopsin to within 3.2 Å of the crystal structure.
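
The distance-constraint part of such a penalty function is simple to sketch. The coordinates, labels, cutoff and weight below are hypothetical illustrations; the paper's actual scoring function also includes statistical terms not shown here:

```python
import math

# Sketch of a constraint-violation penalty: candidate structures are scored
# by how badly they violate sparse experimental upper-bound distances.

def violation_penalty(coords, constraints, weight=1.0):
    """Sum of squared violations of upper-bound distance constraints.

    coords      : dict mapping residue label -> (x, y, z)
    constraints : list of (label_a, label_b, max_distance) triples
    """
    penalty = 0.0
    for a, b, dmax in constraints:
        d = math.dist(coords[a], coords[b])
        if d > dmax:  # only violations are penalized
            penalty += weight * (d - dmax) ** 2
    return penalty

# Illustrative data: one cross-link implying the two residues lie within 8 Å.
coords = {"A65": (0.0, 0.0, 0.0), "B120": (0.0, 0.0, 10.0)}
constraints = [("A65", "B120", 8.0)]
print(violation_penalty(coords, constraints))  # (10 - 8)^2 = 4.0
```

An optimizer then perturbs helix placements to drive this penalty (plus the statistical terms) toward zero.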

  9. Modeling of heat and mass transfer in lateritic building envelopes

    International Nuclear Information System (INIS)

    The aim of the present work is to investigate the behavior of building envelopes made of local lateritic soil bricks subjected to different climatic conditions. The analysis is developed for the prediction of the temperature, relative humidity and water content behavior within the walls. The building envelopes studied in this work consist of lateritic soil bricks with incorporation of natural pozzolan or sawdust in order to obtain small thermal conductivity and low-density materials, and limit the heat transfer between the atmospheric climate and the inside environment. In order to describe coupled heat and moisture transfer in wet porous materials, the coupled equations were solved by the introduction of diffusion coefficients. A numerical model, HMtrans, developed for prediction of heat and moisture transfer in multi-layered building components, was used to simulate the temperature, water content and relative humidity profiles within the building envelopes. The results allow the prediction of how long the building walls can remain exposed to the local weather conditions. They show that for any of the three climatic conditions considered, relative humidity and water content do not exceed 87% and 5% respectively. There is therefore minimum possibility of water condensation in the materials studied. The durability of building envelopes made of lateritic soil bricks with incorporation of natural pozzolan or sawdust is not strongly affected by the climatic conditions in tropical and equatorial regions. (author)

  10. A general model of confidence building: analysis and implications

    International Nuclear Information System (INIS)

    For more than two decades, security approaches in Europe have included confidence building. Many have argued that Confidence-Building Measures (CBMs) played an essential role in the enormous transformations that took place there. Thus, it is hardly surprising that CBMs have been proposed as measures to reduce tensions and transform security relationships elsewhere in the world. The move toward wider application of CBMs has strengthened recently, as conventional military, diplomatic, and humanitarian approaches seem to have failed to address problems associated with peace-building and peace support operations. There is, however, a serious problem. We don't really know why, or even how, CBMs work. Consequently, we have no reliable way to design CBMs that would be appropriate in substance, form, and timing for regions culturally, geographically, and militarily different from Europe. Lacking a solid understanding of confidence building, we are handicapped in our efforts to extend its successes to the domain of peace building and peace support. To paraphrase Macintosh, if we don't know how CBMs succeeded in the past, then we are unlikely to be good at maintaining, improving, or extending them. The specific aim of this project is to step into this gap, using the methods of game theory to clarify some aspects of the underlying logic of confidence building. Formal decision models will be shown to contribute new and valuable insights that will assist in the design of CBMs to contribute to new problems and in new arenas. (author)

  11. SIMPLIFIED BUILDING MODELS EXTRACTION FROM ULTRA-LIGHT UAV IMAGERY

    Directory of Open Access Journals (Sweden)

    O. Küng

    2012-09-01

    Full Text Available Generating detailed simplified building models such as the ones present on Google Earth is often a difficult and lengthy manual task, requiring advanced CAD software and a combination of ground imagery, LIDAR data and blueprints. Nowadays, UAVs such as the AscTec Falcon 8 have reached the maturity to offer an affordable, fast and easy way to capture large amounts of oblique images covering all parts of a building. In this paper we present a state-of-the-art photogrammetry and visual reconstruction pipeline provided by Pix4D applied to medium resolution imagery acquired by such UAVs. The key element of simplified building models extraction is the seamless integration of the outputs of such a pipeline for a final manual refinement step in order to minimize the amount of manual work.

  12. Integrated Urban System and Energy Consumption Model: Residential Buildings

    Directory of Open Access Journals (Sweden)

    Rocco Papa

    2014-05-01

    Full Text Available This paper describes a segment of research conducted within the project PON 04a2_E Smart Energy Master for the energetic government of the territory conducted by the Department of Civil, Architectural and Environment Engineering, University of Naples "Federico II". In particular, this article is part of the study carried out for the definition of the comprehension/interpretation model that correlates buildings, city’s activities and users’ behaviour in order to promote energy savings. In detail, this segment of the research aims to define the residential variables to be used in the model. For this purpose a knowledge framework at the international level has been defined, to estimate the energy requirements of residential buildings, along with the identification of a set of parameters whose variation has a significant influence on the energy consumption of residential buildings.

  13. Simplified Building Models Extraction from Ultra-Light Uav Imagery

    Science.gov (United States)

    Küng, O.; Strecha, C.; Fua, P.; Gurdan, D.; Achtelik, M.; Doth, K.-M.; Stumpf, J.

    2011-09-01

    Generating detailed simplified building models such as the ones present on Google Earth is often a difficult and lengthy manual task, requiring advanced CAD software and a combination of ground imagery, LIDAR data and blueprints. Nowadays, UAVs such as the AscTec Falcon 8 have reached the maturity to offer an affordable, fast and easy way to capture large amounts of oblique images covering all parts of a building. In this paper we present a state-of-the-art photogrammetry and visual reconstruction pipeline provided by Pix4D applied to medium resolution imagery acquired by such UAVs. The key element of simplified building models extraction is the seamless integration of the outputs of such a pipeline for a final manual refinement step in order to minimize the amount of manual work.

  14. Annual review of scalable computing

    CERN Document Server

    Kwong, Yuen Chung

    2004-01-01

    This volume presents original articles, reviewing various aspects of scalable computing. Parallel computation with optically interconnected systems makes its first appearance, and further work on distributed Java is also reported. Optimizing data grids and group communication are studied in two analytical chapters. The comprehensive treatment of these topics adds further to the current literature. Contents: An OMIS-based Approach to Monitoring Distributed Java Applications; Scalable and Self-Optimizing Data Grids; The Immediate Dependency Relation; Highly Scalable Parallel Matrix Computing wit

  15. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  16. Functional Model of Higher Educational Institution Library Building up

    OpenAIRE

    Murat K. Baimul'din; Asemgul' S. Smagulova; Gulnur B. Abildaeva; Zagira B. Saimanova

    2013-01-01

    The article presents a technology for integrated processing of data related to academic and library-bibliographic activities, used for the development of specialty work programs and the estimation of book sufficiency in the integrated research and information system of a higher educational institution. A model of higher educational institution library building up, based on monitoring of book sufficiency in the educational process and of literature demand, is introduced.

  17. Building information modeling (BIM) approach to the GMT Project

    Science.gov (United States)

    Teran, Jose; Sheehan, Michael; Neff, Daniel H.; Adriaanse, David; Grigel, Eric; Farahani, Arash

    2014-07-01

    The Giant Magellan Telescope (GMT), one of several next-generation Extremely Large Telescopes (ELTs), is a 25.4-meter-diameter altitude-over-azimuth design set to be built at the summit of Cerro Las Campanas at the Las Campanas Observatory in Chile. The paper describes the use of Building Information Modeling (BIM) for the GMT project.

  18. Getting Started and Working with Building Information Modeling

    Science.gov (United States)

    Smith, Dana K.

    2009-01-01

    This article will assume that one has heard of Building Information Modeling or BIM but has not developed a strategy as to how to get the most out of it. The National BIM Standard (NBIMS) has defined BIM as a digital representation of physical and functional characteristics of a facility. As such, it serves as a shared knowledge resource for…

  19. Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations

    Science.gov (United States)

    Sung, Christopher Teh Boon

    2011-01-01

    Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…

  20. Analogue Behavioral Modeling of Switched-Current Building Block Circuits

    Institute of Scientific and Technical Information of China (English)

    ZENG Xuan; WANG Wei; SHI Jianlei; TANG Pushan; D.ZHOU

    2001-01-01

    This paper proposes a behavioral modeling technique for the second-generation switched-current building block circuits. The proposed models are capable of capturing the non-ideal behavior of switched-current circuits, which includes the charge injection effects and device mismatch effects. As a result, system performance degradations due to the building block imperfections can be detected at the early design stage by fast behavioral simulations. To evaluate the accuracy of the proposed models, we developed a time-domain behavioral simulator. Experimental results have shown that, compared with SPICE, the behavioral modeling error is less than 2.15%, while the behavioral simulation is faster by four orders of magnitude in the time domain.

  1. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  2. On a computational model of building thermal dynamic response

    Science.gov (United States)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive computational tools, supporting their design and optimization of energy consumption. This paper demonstrates the possibility of consideration of such seemingly contradictory requirements, using the simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
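    The electric-circuit analogy mentioned above can be illustrated with a minimal single-zone sketch (all values invented for illustration; a real building model would use a network of such thermal resistances and capacitances, and the method of lines yields the same kind of ODE in time):

```python
# Single-zone thermal model from the electric-circuit analogy:
# interior temperature T behaves like a capacitor voltage,
#   C dT/dt = (T_ext - T)/R + Q
# R, C, Q and the temperatures below are invented values.
R = 0.005    # K/W, envelope thermal resistance
C = 5.0e7    # J/K, effective thermal capacitance
T_ext = 0.0  # degC, exterior temperature
Q = 4000.0   # W, heating power
T = 5.0      # degC, initial interior temperature

dt = 60.0  # s, explicit time step
for _ in range(24 * 60):  # simulate one day
    dTdt = ((T_ext - T) / R + Q) / C
    T += dt * dTdt

print(round(T, 2))
```

    The solution relaxes toward the steady state `T_ext + R * Q` (here 20 degC) with time constant `R * C`, exactly as a charging RC circuit does.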

  3. Decision model for facade contractor selection – EDA center building

    OpenAIRE

    Marinič, Jani

    2010-01-01

    In my thesis I presented a decision model for facade contractor selection for the business part of the EDA center in Nova Gorica. The EDA center is a commercial and residential building that will be constructed in the city center. The garage, commercial, business and residential parts of the building will bring to the city new areas of conceptual quality that will provide additional comfort for residents and visitors. It's easy to come to the right decision in a simple and fast way to solve a problem wi...

  4. Modeling, Estimation and Control of Indoor Climate in Livestock Buildings

    DEFF Research Database (Denmark)

    Wu, Zhuang

    The main objective of this research is to design an efficient control system for the indoor climate of a large-scale partition-less livestock building, in order to maintain a healthy, comfortable and energy-economical indoor environment for the agricultural animals and farmers. In this...... resilience of the control system to disturbances beyond its bandwidth, increases the manipulators' utilization efficiency, and reduces energy consumption by solving a constrained convex optimization. Through comparative analysis of simulation results, the proposed modeling and control technique is proved to be...... scale livestock buildings, and could be considered as an alternative solution to the currently used decentralized PID controller....

  5. Building social business models: lessons from the Grameen experience

    OpenAIRE

    Moingeon, Bertrand; Yunus, Muhammad; Lehmann-Ortega, Laurence

    2009-01-01

    The social business idea borrows some concepts from the capitalist economy, and therefore the implementation of social businesses can likewise borrow some concepts from conventional business literature. As an illustration, the notion of business model, which is currently attracting much attention from researchers, can be revisited so as to enable the building of social businesses. Social business models are needed alongside conventional ones. After defining what a social business is, the auth...

  6. Using Model Driven Engineering technologies for building authoring applications

    OpenAIRE

    Beaudoux, Olivier; Blouin, Arnaud; Jézéquel, Jean-Marc

    2010-01-01

    Building authoring applications is a tedious and complex task that requires a high programming effort. Document technologies, especially XML based ones, can help in reducing such an effort by providing common bases for manipulating documents. Still, the overall task consists mainly of writing the application's source code. Model Driven Engineering (MDE) focuses on generating the source code from an exhaustive model of the application. In this paper, we illustrate that MDE technologies can be ...

  7. Querying a regulatory model for compliant building design audit

    OpenAIRE

    Dimyadi, Johannes; Pauwels, Pieter; Spearpoint, Michael; Clifton, Charles; Amor, Robert

    2015-01-01

    The ingredients for an effective automated audit of a building design include a BIM model containing the design information, an electronic regulatory knowledge model, and a practical method of processing these computerised representations. There have been numerous approaches to computer-aided compliance audit in the AEC/FM domain over the last four decades, but none has yet evolved into a practical solution. One reason is that they have all been isolated attempts that lack any form of standar...

  8. Building DNN Acoustic Models for Large Vocabulary Speech Recognition

    OpenAIRE

    Maas, Andrew L.; Qi, Peng; Xie, Ziang; Hannun, Awni Y.; Lengerich, Christopher T.; Jurafsky, Daniel; Ng, Andrew Y.

    2014-01-01

    Deep neural networks (DNNs) are now a central component of nearly all state-of-the-art speech recognition systems. Building neural network acoustic models requires several design decisions including network architecture, size, and training loss function. This paper offers an empirical investigation on which aspects of DNN acoustic model design are most important for speech recognition system performance. We report DNN classifier performance and final speech recognizer word error rates, and co...

  9. First Prismatic Building Model Reconstruction from Tomosar Point Clouds

    Science.gov (United States)

    Sun, Y.; Shahzad, M.; Zhu, X.

    2016-06-01

    This paper demonstrates for the first time the potential of explicitly modelling the individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings via DSM generation and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007) and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height and polygon complexity constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. A coarse outline of each roof segment is then reconstructed and later refined using a quadtree based regularization plus zig-zag line simplification scheme. Finally, height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.
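    One early step of this pipeline, turning the smoothed DSM into a gradient map of height jumps, can be sketched on a toy grid (plain Python; the grid and threshold are invented, and the real workflow operates on TomoSAR-derived DSMs with BM3D denoising and watershed segmentation):

```python
# Toy 2-D DSM: a flat ground plane with one 9 m building block.
dsm = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 9.0, 9.0, 0.0],
    [0.0, 9.0, 9.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
jump = 3.0  # metres: height difference treated as a building edge

# Collect pixel pairs whose height difference exceeds the jump
# threshold -- a crude gradient map marking building outlines.
edges = set()
rows, cols = len(dsm), len(dsm[0])
for i in range(rows):
    for j in range(cols):
        for di, dj in ((0, 1), (1, 0)):  # right and down neighbours
            ni, nj = i + di, j + dj
            if ni < rows and nj < cols and abs(dsm[i][j] - dsm[ni][nj]) > jump:
                edges.add(((i, j), (ni, nj)))

print(len(edges))  # edge pixel pairs surrounding the block
```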

  10. SCALABLE GRID RESOURCE DISCOVERY THROUGH DISTRIBUTED SEARCH

    Directory of Open Access Journals (Sweden)

    Fouad Butt

    2011-10-01

    This paper proposes a simple and scalable web-based model for grid resource discovery for the Internet. The resource discovery model contains the metadata and resource finder web services. The information of resource finder web services is kept in repositories that are distributed in the application layer of the Internet. The resource finder web services are discovered by sending queries to the repositories in a similar way to the DNS protocol. The underlying technology for implementation of the two architectures of this model is introduced. These architectures are: Direct and Centralized Web-Based Grid Resource Discovery. The resource discovery time is computed after simulating each of these models in GridSim. By performing scalability tests, we found that when increasing the load on the grid with more users and resources, the cost of our model in comparison to the grid resource discovery time is marginal.
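    The DNS-like query mechanism can be sketched as follows (hypothetical repository contents and service URLs, not the authors' implementation):

```python
# Each repository maps resource types to the addresses of resource-
# finder web services; a query that misses falls back to a parent
# repository, like an iterative DNS lookup climbing toward the root.
repositories = {
    "root": {"compute": "http://finder.compute.example/ws"},
    "campus": {"storage": "http://finder.storage.example/ws"},
}
parents = {"campus": "root"}  # child -> parent, mirroring DNS delegation

def discover(resource_type, start="campus"):
    repo = start
    while repo is not None:
        service = repositories[repo].get(resource_type)
        if service is not None:
            return service
        repo = parents.get(repo)  # climb toward the root
    return None

print(discover("storage"))  # found in the local repository
print(discover("compute"))  # resolved via the root repository
```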

  11. ISIS++Reference Guide (Iterative Scalable Implicit Solver in C++) Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Alan B. Williams; Benjamin A. Allan; Kyran D. Mish; Robert L. Clay

    1999-04-01

    ISIS++ (Iterative Scalable Implicit Solver in C++) Version 1.1 is a portable, object-oriented framework for solving sparse linear systems of equations. It includes a collection of Krylov solution methods and preconditioners, as well as both uni-processor (serial) and multi-processor (scalable) matrix and vector classes. Though it was developed to solve systems of equations originating from large-scale, 3-D, finite element analyses, it has application in many other fields. This document supersedes the ISIS++ V1.0 Reference Guide, defines the V1.1 interface specification, and includes the necessary instructions for building and running ISIS++ V1.1 on Unix platforms. The interface is presented in annotated header format, along with background on design and implementation considerations. A finite difference modeling example problem is included to demonstrate the overall setup and use.

  12. Steam release into buildings: the modelling of steam condensation

    International Nuclear Information System (INIS)

    A lumped-parameter code, HOTSTM, has been developed to model the thermal transient resulting from a steam jet discharging into a large ventilated building, typical of CEGB power plant installations. The code is designed to deal with high pressure steam jets whose typical dimension is small compared with that of the building, so that there is good mixing within the building. An important factor in limiting the temperature rise of the air/steam mixture is heat transfer to the internal surfaces of the building, the temperatures of which are significantly affected by condensation. A detailed description is given of the method used in HOTSTM to calculate condensation and evaporation on internal building surfaces. The method exploits the well-known analogy between heat and mass transfer, together with correction factors derived from a simplified analytical solution. The validity of the approximations inherent in the method is estimated to affect predictions of bulk gas temperature by only a few degrees. A complete listing of the code, as used for calculating thermal transients in Bradwell Turbine House, is appended to the Report. This is intended as a record of the calculational route used in that work, not as a user's guide to the program. (author)
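    The heat/mass-transfer analogy at the core of the condensation calculation can be illustrated with the Lewis-number form of the analogy (illustrative values only; HOTSTM's actual correlations and correction factors are not reproduced here):

```python
# Estimate a condensation mass flux on a cool wall from a known
# convective heat transfer coefficient via the heat/mass analogy.
# All numerical values are invented for illustration.
h = 10.0      # W/m^2 K, convective heat transfer coefficient
rho = 1.0     # kg/m^3, mixture density
cp = 1100.0   # J/kg K, mixture specific heat
Le = 0.85     # Lewis number for steam in air

# Analogy: mass transfer coefficient from the heat transfer coefficient.
h_m = h / (rho * cp * Le ** (2.0 / 3.0))   # m/s

rho_v_bulk = 0.10  # kg/m^3, vapour density in the bulk gas
rho_v_wall = 0.03  # kg/m^3, saturation vapour density at wall temperature

m_flux = h_m * (rho_v_bulk - rho_v_wall)   # kg/m^2 s, condensation rate
print(m_flux)
```

    A positive flux means net condensation on the wall; reversing the vapour-density difference would model evaporation with the same coefficient.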

  13. Modeling and Simulation of Scalable Cloud Computing Environments and the CloudSim Toolkit: Challenges and Opportunities

    CERN Document Server

    Buyya, Rajkumar; Calheiros, Rodrigo N

    2009-01-01

    Cloud computing aims to power the next generation data centers and enables application service providers to lease data center capabilities for deploying applications depending on user QoS (Quality of Service) requirements. Cloud applications have different composition, configuration, and deployment requirements. Quantifying the performance of resource allocation policies and application scheduling algorithms at finer details in Cloud computing environments for different application and service models under varying load, energy performance (power consumption, heat dissipation), and system size is a challenging problem to tackle. To simplify this process, in this paper we propose CloudSim: an extensible simulation toolkit that enables modelling and simulation of Cloud computing environments. The CloudSim toolkit supports modelling and creation of one or more virtual machines (VMs) on a simulated node of a Data Center, jobs, and their mapping to suitable VMs. It also allows simulation of multiple Data Centers to...

  14. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  15. Building enterprise reuse program--A model-based approach

    Institute of Scientific and Technical Information of China (English)

    梅宏; 杨芙清

    2002-01-01

    Reuse is viewed as a realistically effective approach to solving the software crisis. For an organization that wants to build a reuse program, technical and non-technical issues must be considered in parallel. In this paper, a model-based approach to building a systematic reuse program is presented. Component-based reuse is currently a dominant approach to software reuse. In this approach, building the right reusable component model is the first important step. In order to achieve systematic reuse, a set of component models should be built from different perspectives. Each of these models will give a specific view of the components so as to satisfy different needs of different persons involved in the enterprise reuse program. There already exist some component models for reuse from technical perspectives. But less attention is paid to the reusable components from a non-technical view, especially from the view of process and management. In our approach, a reusable component model--the FLP model for reusable components--is introduced. This model describes components from three dimensions (Form, Level, and Presentation) and views components and their relationships from the perspective of process and management. It determines the sphere of reusable components, the time points of reusing components in the development process, and the needed means to present components in terms of the abstraction level, logic granularity and presentation media. Being the basis on which the management and technical decisions are made, our model will be used as the kernel model to initialize and normalize a systematic enterprise reuse program.

  16. Designing a Scalable Fault Tolerance Model for High Performance Computational Chemistry: A Case Study with Coupled Cluster Perturbative Triples.

    Science.gov (United States)

    van Dam, Hubertus J J; Vishnu, Abhinav; de Jong, Wibe A

    2011-01-11

    In the past couple of decades, the massive computational power provided by the most modern supercomputers has resulted in simulation of higher-order computational chemistry methods, previously considered intractable. As the system sizes continue to increase, the computational chemistry domain continues to escalate this trend using parallel computing with programming models such as Message Passing Interface (MPI) and Partitioned Global Address Space (PGAS) programming models such as Global Arrays. The ever increasing scale of these supercomputers comes at a cost of reduced Mean Time Between Failures (MTBF), currently on the order of days and projected to be on the order of hours for upcoming extreme scale systems. While traditional disk-based checkpointing methods are ubiquitous for storing intermediate solutions, they suffer from the high overhead of writing and recovering from checkpoints. In practice, checkpointing itself often brings the system down. Clearly, methods beyond checkpointing are imperative to handle the worsening problem of decreasing MTBF. In this paper, we address this challenge by designing and implementing an efficient fault-tolerant version of the Coupled Cluster (CC) method with NWChem, using in-memory data redundancy. We present the challenges associated with our design, including an efficient data storage model, maintenance of at least one consistent data copy, and the recovery process. Our performance evaluation without faults shows that the current design exhibits a small overhead. In the presence of a simulated fault, the proposed design incurs negligible overhead in comparison to the state-of-the-art implementation without faults. PMID:26606219
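    The in-memory redundancy idea can be sketched as a buddy-copy store (a toy illustration, not the NWChem/Global Arrays implementation):

```python
# Every chunk of intermediate data is stored on its owner process and
# on one buddy process, so a single failure leaves one consistent copy.
class RedundantStore:
    def __init__(self, n_procs):
        self.n = n_procs
        self.mem = [dict() for _ in range(n_procs)]  # per-process memory
        self.alive = [True] * n_procs

    def put(self, key, value, owner):
        buddy = (owner + 1) % self.n   # simple buddy scheme
        self.mem[owner][key] = value
        self.mem[buddy][key] = value

    def fail(self, proc):
        self.alive[proc] = False
        self.mem[proc].clear()         # that process's memory is lost

    def get(self, key):
        # Recovery: read from any surviving copy.
        for p in range(self.n):
            if self.alive[p] and key in self.mem[p]:
                return self.mem[p][key]
        raise KeyError(key)

store = RedundantStore(4)
store.put("T2_block_17", [1.0, 2.0], owner=2)  # hypothetical CC amplitude block
store.fail(2)                                  # simulated fault
print(store.get("T2_block_17"))                # recovered from the buddy
```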

  17. Scalable and Practical Nonblocking Switching Networks

    Institute of Scientific and Technical Information of China (English)

    Si-Qing Zheng; Ashwin Gumaste

    2006-01-01

    Large-scale strictly nonblocking (SNB) and wide-sense nonblocking (WSNB) networks may be infeasible due to their high cost. In contrast, rearrangeable nonblocking (RNB) networks are more scalable because of their much lower cost. However, RNB networks are not suitable for circuit switching. In this paper, the concept of virtual nonblockingness is introduced. It is shown that a virtual nonblocking (VNB) network functions like an SNB or WSNB network, but it is constructed with the cost of an RNB network. The results indicate that for large-scale circuit switching applications, it is only necessary to build VNB networks.

  18. Modeling, Simulation, and Fabrication of a Fully Integrated, Acid-stable, Scalable Solar-Driven Water-Splitting System

    OpenAIRE

    Walczak, Karl; Chen, Yikai; Karp, Christoph; Beeman, Jeffrey W.; Shaner, Matthew; Spurgeon, Joshua; Sharp, Ian D.; Amashukeli, Xenia; West, William; Jin, Jian; Lewis, Nathan S.; Xiang, Chengxiang

    2015-01-01

    A fully integrated solar-driven water-splitting system comprised of WO3/FTO/p^(+)n Si as the photoanode, Pt/TiO_2/Ti/n^(+)p Si as the photocathode, and Nafion as the membrane separator, was simulated, assembled, operated in 1.0 M HClO_4, and evaluated for performance and safety characteristics under dual side illumination. A multi-physics model that accounted for the performance of the photoabsorbers and electrocatalysts, ion transport in the solution electrolyte, and gaseous product crossove...

  19. Building 235-F Goldsim Fate And Transport Model

    International Nuclear Information System (INIS)

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate, relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to basically have minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium; however, after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met.

  20. Building 235-F Goldsim Fate And Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, G. A.; Phifer, M. A.

    2012-09-14

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate, relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to basically have minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium; however, after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met.

  1. Air Dispersion Modeling for Building 3026C/D Demolition

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Sjoreen, Andrea L [ORNL; Eckerman, Keith F [ORNL

    2010-06-01

    This report presents estimates of dispersion coefficients and effective dose for potential air dispersion scenarios of uncontrolled releases from Oak Ridge National Laboratory (ORNL) buildings 3026C, 3026D, and 3140 prior to or during the demolition of the 3026 Complex. The Environmental Protection Agency (EPA) AERMOD system was used to compute these estimates. AERMOD stands for AERMIC Model, where AERMIC is the American Meteorological Society-EPA Regulatory Model Improvement Committee. Five source locations (three in building 3026D and one each in building 3026C and the filter house 3140) and associated source characteristics were determined with the customer. In addition, the area of study was determined, and building footprints and intake locations of air-handling systems were obtained. In addition to the air intakes, receptor sites were defined consisting of ground-level locations on four polar grids (50 m, 100 m, 200 m, and 500 m) and two intersecting lines of points (50 m separation), corresponding to sidewalks along Central Avenue and Fifth Street. Three years of meteorological data (2006-2008) were used, each consisting of three datasets: 1) National Weather Service data; 2) upper air data for the Knoxville-Oak Ridge area; and 3) local weather data from Tower C (10 m, 30 m and 100 m) on the ORNL reservation. Annual average air concentration, highest 1 h average and highest 3 h average air concentrations were computed using AERMOD for the five source locations for the three years of meteorological data. The highest 1 h average air concentrations were converted to dispersion coefficients to characterize the atmospheric dispersion, as the customer was interested in the most significant response and the highest 1 h average data reflects the best time-averaged values available from the AERMOD code. Results are presented in tabular and graphical form. The results for dose were obtained using radionuclide activities for each of the buildings, provided by the customer.
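    The conversion from a highest 1 h average concentration to a dispersion coefficient, and from concentration to an inhalation dose, can be sketched as follows (all numbers hypothetical; the report's actual release rates and dose coefficients are not reproduced):

```python
# Hypothetical post-processing of AERMOD output.
Q = 1.0e6     # Bq/s, assumed release rate used in the dispersion run
C_1h = 3.0e2  # Bq/m^3, highest 1-h average concentration at a receptor

# Dispersion coefficient: concentration normalised by release rate.
chi_over_Q = C_1h / Q        # s/m^3

# Simple inhalation dose estimate for one hour at the receptor.
breathing_rate = 3.3e-4      # m^3/s, light-activity adult (assumed)
dcf = 2.0e-8                 # Sv/Bq, hypothetical inhalation dose coefficient
exposure = 3600.0            # s

dose = C_1h * breathing_rate * dcf * exposure   # Sv
print(chi_over_Q, dose)
```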

  2. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine;

    Model Predictive Control (MPC) of building systems is a promising approach to optimize building energy performance. In contrast to traditional control strategies which are reactive in nature, MPC optimizes the utilization of resources based on the predicted effects. It has been shown that energy...... implemented only in few buildings. The following difficulties hinder the widespread usage of MPC: (1) significant model development time, (2) limited portability of models, (3) model computational demand. In the present study a new model development framework for an MPC system based on a Genetic Algorithm (GA......) optimization is proposed. The framework is intended to allow easy model adaptation for new buildings and fast simulations to meet the strict performance requirements of the GA optimization approach. This is achieved by the introduction of the generic zone model concept and the implementation of the Functional...

  3. Variable cluster analysis method for building neural network model

    Institute of Scientific and Technical Information of China (English)

    王海东; 刘元东

    2004-01-01

    To address the requirement that, when building a neural network model of a complicated system, the input variables be reduced as much as possible while still fully explaining the output variables, a variable selection method based on cluster analysis was investigated. A similarity coefficient describing the mutual relation of variables was defined. The methods of highest contribution rate, part replacing whole, and variable replacement are put forward and derived from information theory. Neural network software based on cluster analysis, which provides several methods for defining the variable similarity coefficient, clustering system variables and evaluating variable clusters, was developed and applied to build a neural network forecast model of cement clinker quality. The results show that the network scale, training time and prediction accuracy are all satisfactory. The practical application demonstrates that this method of selecting variables for a neural network is feasible and effective.
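    The variable-clustering idea can be sketched with plain Pearson correlation standing in for the paper's similarity coefficient (the data, threshold and function names below are invented):

```python
# Greedy correlation-based variable clustering for input selection.
def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def cluster_variables(data, threshold=0.9):
    """Group variables whose |correlation| with a group's first member
    exceeds the threshold; one representative per group then feeds
    the neural network instead of all redundant inputs."""
    groups = []
    for name in data:
        for g in groups:
            if abs(corr(data[name], data[g[0]])) >= threshold:
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

data = {
    "x1": [1.0, 2.0, 3.0, 4.0],
    "x2": [2.1, 4.0, 6.2, 8.1],   # nearly proportional to x1
    "x3": [4.0, 1.0, 3.5, 0.5],   # unrelated
}
print(cluster_variables(data))
```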

  4. CAPACITY FACTOR BASED COST MODELS FOR BUILDINGS OF VARIOUS FUNCTIONS

    OpenAIRE

    Andreas Wibowo; Wahyu Wuryanti

    2007-01-01

    The desired accuracy level of an estimate heavily relies on the availability of data and information at the time of preparing the estimate. However, an estimate often must be made when data and information are not complete. At earlier stages of project implementation at which data and information are minimal, a client is often required to prepare a cost estimate. This paper discusses the capacity factor-based cost models for buildings with total areas serving as the proxy of capacity. A total...

  5. Semantic Building Information Model and Multimedia for Facility Management

    OpenAIRE

    Nicolle, Christophe; Cruz, Christophe

    2011-01-01

    In the field of civil engineering, the proliferation of stakeholders and the heterogeneity of modeling tools detract from the quality of the design process, construction and building maintenance. In this paper, we present a Web-based platform that lets geographically dispersed project participants--from facility managers and architects to electricians and plumbers--directly use and exchange project documents in a centralized virtual environment using a simple Web browser. ...

  6. A model of backdraft phenomenon in building fires

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to further investigate the physical mechanism of the backdraft phenomenon in building fires, a simplified mathematical model is established based on an energy balance equation, its catastrophe mechanism is analyzed using catastrophe theory, and the relationship between system control variables and fire conditions is studied. It is indicated that the backdraft phenomenon is a typical catastrophe behavior and has the common characteristics of a catastrophe.

  7. Building predictive models for feature selection in genomic mining

    OpenAIRE

    Figini, Silvia; Giudici, Paolo

    2006-01-01

    Building predictive models for genomic mining requires feature selection, as an essential preliminary step to reduce the large number of variables available. Feature selection is the process of selecting a subset of features that is the most essential for the intended tasks, such as classification, clustering or regression analysis. In gene expression microarray data, being able to select a few genes not only makes data analysis efficient but also helps their biological interpretation. Microarray d...

  8. Is Frequent Pattern Mining useful in building predictive models?

    OpenAIRE

    Karunaratne, Thashmee

    2011-01-01

    Recent studies of pattern mining have given more attention to discovering patterns that are interesting, significant or discriminative than to those that are simply frequent. Does this imply that frequent patterns are not useful anymore? In this paper we carry out a survey of frequent pattern mining and, through an empirical study, show how far frequent pattern mining is useful in building predictive models.
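
    A minimal sketch of how frequent patterns can feed a predictive model: mine frequent itemsets level-wise (Apriori-style), then encode each transaction as a binary vector of pattern occurrences. All thresholds and names here are illustrative, not taken from the paper.

    ```python
    from itertools import combinations

    def frequent_itemsets(transactions, min_support):
        """Level-wise search: count candidates, keep those at or above
        min_support, and grow the survivors by one item per round."""
        n = len(transactions)
        current = [frozenset([i]) for i in sorted({i for t in transactions for i in t})]
        freq = {}
        while current:
            support = {c: sum(1 for t in transactions if c <= t) / n for c in current}
            survivors = {c: s for c, s in support.items() if s >= min_support}
            freq.update(survivors)
            current = list({a | b for a, b in combinations(survivors, 2)
                            if len(a | b) == len(a) + 1})
        return freq

    def to_features(transactions, patterns):
        """Encode each transaction as a 0/1 vector of pattern occurrences,
        ready to be handed to any standard classifier."""
        return [[1 if p <= t else 0 for p in patterns] for t in transactions]
    ```

    The resulting 0/1 matrix is what "using frequent patterns in predictive models" usually amounts to in practice: patterns become features, and the learner decides which are discriminative.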

  9. Interaction of Lean and Building Information Modeling in Construction

    OpenAIRE

    Sacks, Rafael; Koskela, Lauri; Dave, Bhargav A.; Owen, Robert

    2010-01-01

    Lean construction and Building Information Modeling are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which it might be improved by application of either of these paradigms independently. Using a matrix that juxtaposes BIM...

  10. Load Modelling of Buildings in Mixed Energy Distribution Systems

    OpenAIRE

    Pedersen, Linda

    2007-01-01

    The main topic of this thesis has been the development of a new method for load modelling of buildings in mixed energy distribution systems. The method estimates design load profiles, yearly load profiles, load duration profiles and annual expected energy demand for a specified planning area, all divided into heat and electricity purposes. The heat load demand includes end-uses such as space heating, ventilation heating and hot tap water, while electricity load demand includes end-uses such a...

  11. Quantized Nonlinear Model Predictive Control for a Building

    Czech Academy of Sciences Publication Activity Database

    Pčolka, M.; Žáčeková, E.; Robinett, R.; Čelikovský, Sergej; Šebek, M.

    Sydney: IEEE, 2015, s. 347-352. ISBN 978-1-4799-7787-1. ISSN 1085-1992. [IEEE Conference on Control Applications 2015 (CCA 2015). Sydney (AU), 21.09.2015-23.09.2015] R&D Projects: GA ČR GA13-20433S Institutional support: RVO:67985556 Keywords : nonlinear model predictive control * building climate control Subject RIV: BC - Control Systems Theory

  12. Myria: Scalable Analytics as a Service

    Science.gov (United States)

    Howe, B.; Halperin, D.; Whitaker, A.

    2014-12-01

    At the UW eScience Institute, we're working to empower non-experts, especially in the sciences, to write and use data-parallel algorithms. To this end, we are building Myria, a web-based platform for scalable analytics and data-parallel programming. Myria's internal model of computation is the relational algebra extended with iteration, such that every program is inherently data-parallel, just as every query in a database is inherently data-parallel. But unlike databases, iteration is a first-class concept, allowing us to express machine learning tasks, graph traversal tasks, and more. Programs can be expressed in a number of languages and can be executed on a number of execution environments, but we emphasize a particular language called MyriaL that supports both imperative and declarative styles and a particular execution engine called MyriaX that uses an in-memory column-oriented representation and asynchronous iteration. We deliver Myria over the web as a service, providing an editor, performance analysis tools, and catalog browsing features in a single environment. We find that this web-based "delivery vector" is critical in reaching non-experts: they are insulated from the irrelevant technical work associated with installation, configuration, and resource management. The MyriaX backend, one of several execution runtimes we support, is a main-memory, column-oriented, RDBMS-on-the-worker system that supports cyclic data flows as a first-class citizen and has been shown to outperform competitive systems on 100-machine cluster sizes. I will describe the Myria system, give a demo, and present some new results in large-scale oceanographic microbiology.
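
    Iteration as a first-class relational concept, as the abstract describes, can be illustrated with a semi-naive fixpoint computation of graph reachability. This is a plain-Python sketch of the evaluation strategy, not MyriaL.

    ```python
    def transitive_closure(edges):
        """Semi-naive fixpoint: each round joins only the newly derived
        tuples (delta) with the base edge relation, stopping when no new
        tuples appear."""
        reach = set(edges)
        delta = set(edges)
        while delta:
            # join delta(x, y) with edges(y, z) to derive reach(x, z)
            derived = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
            delta = derived - reach
            reach |= delta
        return reach
    ```

    Because each iteration is itself a relational join, the whole loop is data-parallel "for free": exactly the property the abstract attributes to programs in Myria's extended relational algebra.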

  13. Model code for energy conservation in new building construction

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    In response to the recognized lack of existing consensus standards directed to the conservation of energy in building design and operation, the preparation and publication of such a standard was accomplished with the issuance of ASHRAE Standard 90-75 ''Energy Conservation in New Building Design,'' by the American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc., in 1975. This standard addressed itself to recommended practices for energy conservation, using both depletable and non-depletable sources. A model code for energy conservation in building construction has been developed, setting forth the minimum regulations found necessary to mandate such conservation. The code addresses itself to the administration, design criteria, systems elements, controls, service water heating and electrical distribution and use, both for depletable and non-depletable energy sources. The technical provisions of the document are based on ASHRAE 90-75 and it is intended for use by state and local building officials in the implementation of a statewide energy conservation program.

  14. Simulation and Big Data Challenges in Tuning Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, Jibonananda [ORNL; New, Joshua Ryan [ORNL

    2013-01-01

    EnergyPlus is the flagship building energy simulation software used to model whole-building energy consumption for residential and commercial establishments. A typical input to the program often has hundreds, sometimes thousands, of parameters which are typically tweaked by a buildings expert to "get it right." This process can sometimes take months. Autotune is an ongoing research effort employing machine learning techniques to automate the tuning of the input parameters for an EnergyPlus input description of a building. Even with automation, the computational challenge of running the tuning simulation ensemble is daunting and requires the use of supercomputers to make it tractable in time. In this proposal, we describe the scope of the problem, the technical challenges faced and overcome, the machine learning techniques developed and employed, and the software infrastructure developed or in development for taking the EnergyPlus engine, which was primarily designed to run on desktops, and scaling it to run on shared-memory supercomputers (Nautilus) and distributed-memory supercomputers (Frost and Titan). The parametric simulations produce data on the order of tens to a couple of hundred terabytes. We describe the approaches employed to streamline and reduce bottlenecks in the workflow for this data, which is subsequently being made available for the tuning effort as well as made available publicly for open science.

  15. A stochastic model for scheduling energy flexibility in buildings

    International Nuclear Information System (INIS)

    Due to technological developments and political goals, the electricity system is undergoing significant changes, and a more active demand side is needed. In this paper, we propose a new model to support the scheduling process for energy flexibility in buildings. We have selected an integrated energy carrier approach based on the energy hub concept, which captures multiple energy carriers, converters and storages to increase the flexibility potential. Furthermore, we propose a general classification of load units according to their flexibility properties. Finally, we define price structures that include both time-varying prices and peak power fees. We demonstrate the properties of the model in a case study based on a Norwegian university college building. The study shows that the model is able to reduce costs by reducing peak loads and utilizing price differences between periods and energy carriers. We illustrate and discuss the properties of two different approaches to deal with uncertain parameters: Rolling horizon deterministic planning and rolling horizon stochastic planning, the latter includes explicit modeling of the uncertain parameters. Although in our limited case, the stochastic model does not outperform the deterministic model, our findings indicate that several factors influence this conclusion. We recommend an in-depth analysis in each specific case. - Highlights: • We propose a new model for the scheduling of energy flexibility in buildings. • We cover multiple energy carriers and include converter, storage and load units. • We classify load units according to their flexibility properties. • Our price structure covers different price regimes including peak fees. • We perform a case study and discuss two approaches to handle uncertain parameters
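
    A much-simplified sketch of rolling-horizon deterministic planning for shiftable load: a greedy stand-in for the paper's optimization model, in which the prices, the horizon length and the one-unit-per-hour limit are all assumptions made here for illustration.

    ```python
    def rolling_horizon_schedule(prices, flexible_kwh, horizon=4):
        """Roll an hour-by-hour window over the price forecast; commit one
        unit of shiftable load whenever the current hour is cheapest
        within the look-ahead window, then move the window forward."""
        schedule = [0.0] * len(prices)
        remaining = flexible_kwh
        for t in range(len(prices)):
            if remaining <= 0:
                break
            window = prices[t:t + horizon]
            # commit load now only if waiting within the window cannot beat it
            if prices[t] == min(window):
                schedule[t] = min(remaining, 1.0)
                remaining -= schedule[t]
        return schedule
    ```

    The real model replaces this greedy step with an optimization over multiple energy carriers, converters and storages, and the stochastic variant re-solves it each period with uncertain parameters modeled explicitly.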

  16. Validation of a Simplified Building Cooling Load Model Using a Complex Computer Simulation Model

    OpenAIRE

    Stewart, Morgan Eugene

    2001-01-01

    Building energy simulation has become a useful tool for predicting cooling, heating and electrical loads for facilities. Simulation models have been validated throughout the years by comparing simulation results to actual measured values. The simulations have become more accurate as approaches were changed to be more comprehensive in their ability to model building features. These simulation models tend to require considerable experience in determining input parameters and large amounts of...

  17. An Evolving Model for Capacity Building with Earth Observation Imagery

    Science.gov (United States)

    Sylak-Glassman, E. J.

    2015-12-01

    For the first forty years of Earth observation satellite imagery, all imagery was collected by civilian or military governmental satellites. Over this timeframe, countries without observation satellite capabilities had very limited access to Earth observation data or imagery. In response to the limited access to Earth observation systems, capacity building efforts were focused on satellite manufacturing. Wood and Weigel (2012) describe the evolution of satellite programs in developing countries with a technology ladder. A country moves up the ladder as they move from producing satellites with training services to building satellites locally. While the ladder model may be appropriate if the goal is to develop autonomous satellite manufacturing capability, in the realm of Earth observation, the goal is generally to derive societal benefit from the use of Earth observation-derived information. In this case, the model for developing Earth observation capacity is more appropriately described by a hub-and-spoke model in which the use of Earth observation imagery is the "hub," and the "spokes" describe the various paths to achieving that imagery: the building of a satellite (either independently or with assistance), the purchase of a satellite, participation in a constellation of satellites, and the use of freely available or purchased satellite imagery. We discuss the different capacity-building activities that are conducted in each of these pathways, such as the "Know-How Transfer and Training" program developed by Surrey Satellite Technology Ltd. , Earth observation imagery training courses run by SERVIR in developing countries, and the use of national or regional remote sensing centers (such as those in Morocco, Malaysia, and Kenya) to disseminate imagery and training. 
In addition, we explore the factors that determine through which "spoke" a country arrives at the ability to use Earth observation imagery, and discuss best practices for achieving the capability to use

  18. Building information modeling in the architectural design phases

    DEFF Research Database (Denmark)

    Hermund, Anders

    The overall economic benefits of Building Information Modeling are generally comprehensible, but are there other problems with the implementation of BIM as a formulized system in a field that ultimately is dependent on a creative input? Is optimization and economic benefit really contributing...... with an architectural quality? In Denmark the implementation of the digital working methods related to BIM has been introduced by government law in 2007. Will the important role of the architect as designer change in accordance with these new methods, and does the idea of one big integrated model...... represent a paradox in relation to designing? The BIM mindset requires changes on many levels....

  19. User-friendly graph editing for procedural modeling of buildings.

    Science.gov (United States)

    Patow, Gustavo

    2012-01-01

    A proposed rule-based editing metaphor intuitively lets artists create buildings without changing their workflow. It's based on the realization that the rule base represents a directed acyclic graph and on a shift in the development paradigm from product-based to rule-based representations. Users can visually add or edit rules, connect them to control the workflow, and easily create commands that expand the artist's toolbox (for example, Boolean operations or local controlling operators). This approach opens new possibilities, from model verification to model editing through graph rewriting. PMID:24804948

  20. The Proposal of Model for Building Cooperation Management in Company

    Directory of Open Access Journals (Sweden)

    Josef Vodák

    2015-12-01

    Full Text Available The goal of the article is to use detailed literature analysis and the findings of empirical research to propose a model for building cooperation management in a company. The article brings a valuable tool to company managers in the form of a complex and detailed model for achieving successful implementation of cooperation management in a company. The article thus provides a tool for company managers for managing their cooperation projects and activities. Use of this tool is meant to help minimize the occurrence of conflict situations and to support the smooth progress of cooperation activities.

  1. Building a House Prices Forecasting Model in Hong Kong

    Directory of Open Access Journals (Sweden)

    Xin Janet

    2012-11-01

    Full Text Available This paper builds a house prices forecasting model for private residential houses in Hong Kong, based on general macroeconomic indicators, housing-related data and demographic factors for the period of 1980 to 2001. A reduced-form economic model has been derived from a multiple regression analysis, where three sets and eight models were derived for analysis and comparison. It is found that household income, land supply, population and movements in the Hang Seng Index play an important role in explaining house price movements in Hong Kong. In addition, political events, as identified, cannot be ignored. However, the results of the models are unstable. It is suggested that OLS may not be the best method for a house prices model in the Hong Kong situation. Alternative methods are suggested.
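
    The reduced-form multiple regression underlying such a model can be sketched with an ordinary least-squares fit. The function names and toy data below are illustrative only; the paper's actual regressors include household income, land supply, population and movements in the Hang Seng Index.

    ```python
    import numpy as np

    def fit_ols(X, y):
        """Ordinary least squares with an intercept: solve min ||A b - y||
        where A is X with a prepended column of ones (lstsq is numerically
        stabler than forming (X'X)^-1 X'y directly)."""
        A = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return beta

    def predict(beta, X):
        """Apply the fitted coefficients to new regressor values."""
        A = np.column_stack([np.ones(len(X)), X])
        return A @ beta
    ```

    The instability the authors report would show up here as coefficients that swing widely when the sample period or regressor set changes, which is why they suggest alternatives to plain OLS.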

  2. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    Science.gov (United States)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not only limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with an optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions of existing buildings directly translate into energy cost saving in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and

  3. An Iterative Algorithm to Build Chinese Language Models

    CERN Document Server

    Luo, X; Luo, Xiaoqiang; Roukos, Salim

    1996-01-01

    We present an iterative procedure to build a Chinese language model (LM). We segment Chinese text into words based on a word-based Chinese language model. However, the construction of a Chinese LM itself requires word boundaries. To get out of the chicken-and-egg problem, we propose an iterative procedure that alternates two operations: segmenting text into words and building an LM. Starting with an initial segmented corpus and an LM based upon it, we use a Viterbi-like algorithm to segment another set of data. Then, we build an LM based on the second set and use the resulting LM to re-segment the first corpus. The alternating procedure provides a self-organized way for the segmenter to detect unseen words automatically and correct segmentation errors. Our preliminary experiment shows that the alternating procedure not only improves the accuracy of our segmentation, but also discovers unseen words surprisingly well. The resulting word-based LM has a perplexity of 188 for a general Chinese corpus.
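
    The Viterbi-like segmentation step can be sketched as a dynamic program over word boundaries, and the alternation as a loop that re-estimates word probabilities from each segmentation. The unigram scoring and the unseen-word penalty below are assumptions for illustration, not the paper's exact model.

    ```python
    import math
    from collections import Counter

    def segment(text, logp, max_len=4):
        """Viterbi-style search for the best word sequence:
        best[i] = max over j of best[j] + logp(text[j:i])."""
        n = len(text)
        best = [0.0] + [-math.inf] * n
        back = [0] * (n + 1)
        for i in range(1, n + 1):
            for j in range(max(0, i - max_len), i):
                score = best[j] + logp(text[j:i])
                if score > best[i]:
                    best[i], back[i] = score, j
        words, i = [], n
        while i > 0:
            words.append(text[back[i]:i])
            i = back[i]
        return words[::-1]

    def alternate(corpus, logp, rounds=2, unseen=-10.0):
        """The self-organizing loop: segment with the current model, then
        re-estimate unigram log-probabilities from the word counts."""
        for _ in range(rounds):
            counts = Counter(segment(corpus, logp))
            total = sum(counts.values())
            logp = lambda w, c=counts, t=total: (
                math.log(c[w] / t) if w in c else unseen * len(w))
        return logp
    ```

    In the paper's setting the two corpora swap roles each round, which is what lets the segmenter discover unseen words: a string that repeatedly segments well as a unit accumulates probability mass as a word.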

  4. Metadata and their impact on processes in Building Information Modeling

    Directory of Open Access Journals (Sweden)

    Vladimir Nyvlt

    2014-04-01

    Full Text Available Building Information Modeling (BIM) holds huge potential to increase the effectiveness of every project across its whole life cycle: from the initial investment plan through design and construction to long-term use, property maintenance and, finally, demolition. Knowledge Management, or better Knowledge Sharing, covers two sets of tools, managerial and technological. Managers' needs reflect the real expectations and desires of final users: how they could benefit from managing long-term projects covering the whole life cycle, in terms of saving investment money and other resources. The technology employed can help BIM processes support and deliver these benefits to users: how to use this technology for data and metadata collection, storage and sharing, and which processes these new technologies may support. We touch on a proposal for optimized processes that smoothly support knowledge sharing within the project time-scale and across its whole life cycle.

  5. Computational scalability of large size image dissemination

    Science.gov (United States)

    Kooper, Rob; Bajcsy, Peter

    2011-01-01

    We have investigated the computational scalability of image pyramid building needed for the dissemination of very large image data. The sources of large images include high-resolution microscopes and telescopes, remote sensing and airborne imaging, and high-resolution scanners. The term 'large' is understood from a user perspective, meaning either larger than a display size or larger than a memory/disk that can hold the image data. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150MB, or about 5000x8000 pixels, with around 200,000 scans in total) and the UIUC library scanning project for historical maps from the 17th and 18th century (a smaller number of larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.
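
    Image pyramid building of the kind benchmarked here can be sketched as repeated 2x2 mean-pooling until the coarsest level fits in a single tile. The tile size and pooling rule are assumptions; Seadragon's actual tiling differs in detail.

    ```python
    import numpy as np

    def build_pyramid(image, tile=256):
        """Halve resolution by 2x2 mean-pooling until the coarsest level
        fits within one tile; odd dimensions are cropped by one pixel."""
        levels = [image]
        while max(levels[-1].shape[:2]) > tile:
            img = levels[-1]
            h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
            img = img[:h, :w]
            # group pixels into 2x2 blocks and average each block
            pooled = img.reshape(h // 2, 2, w // 2, 2, *img.shape[2:]).mean(axis=(1, 3))
            levels.append(pooled)
        return levels
    ```

    Each level holds a quarter of the pixels of the one before it, so the whole pyramid costs only about one third more storage than the original image, which is why deep-zoom viewers can afford to keep every level.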

  6. Big data integration: scalability and sustainability

    KAUST Repository

    Zhang, Zhang

    2016-01-26

    Integration of various types of omics data is critically indispensable for addressing most important and complex biological questions. In the era of big data, however, data integration becomes increasingly tedious, time-consuming and expensive, posing a significant obstacle to fully exploit the wealth of big biological data. Here we propose a scalable and sustainable architecture that integrates big omics data through community-contributed modules. Community modules are contributed and maintained by different committed groups and each module corresponds to a specific data type, deals with data collection, processing and visualization, and delivers data on-demand via web services. Based on this community-based architecture, we build Information Commons for Rice (IC4R; http://ic4r.org), a rice knowledgebase that integrates a variety of rice omics data from multiple community modules, including genome-wide expression profiles derived entirely from RNA-Seq data, resequencing-based genomic variations obtained from re-sequencing data of thousands of rice varieties, plant homologous genes covering multiple diverse plant species, post-translational modifications, rice-related literatures, and community annotations. Taken together, such architecture achieves integration of different types of data from multiple community-contributed modules and accordingly features scalable, sustainable and collaborative integration of big data as well as low costs for database update and maintenance, thus helpful for building IC4R into a comprehensive knowledgebase covering all aspects of rice data and beneficial for both basic and translational researches.

  7. Scalable Gravity Offload System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is a scalable gravity off-load system that enables controlled integrated testing of Surface System elements such as rovers, habitats, and...

  8. Scalable Gravity Offload System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A scalable gravity offload device simulates reduced gravity for the testing of various surface system elements such as mobile robots, excavators, habitats, and...

  9. From neurons to nests: nest-building behaviour as a model in behavioural and comparative neuroscience

    OpenAIRE

    Hall, Zachary J.; Meddle, Simone L.; Healy, Susan D.

    2016-01-01

    Despite centuries of observing the nest building of most extant bird species, we know surprisingly little about how birds build nests and, specifically, how the avian brain controls nest building. Here, we argue that nest building in birds may be a useful model behaviour in which to study how the brain controls behaviour. Specifically, we argue that nest building as a behavioural model provides a unique opportunity to study not only the mechanisms through which the brain controls behaviour wi...

  10. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
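
    Prediction-quality testing of a baseline model commonly relies on bias and scatter metrics such as NMBE and CV(RMSE); the sketch below shows those two conventional metrics, not a quotation of the report's specific protocol.

    ```python
    import math

    def baseline_metrics(measured, predicted):
        """Normalized mean bias error (NMBE) and coefficient of variation
        of the RMSE (CV(RMSE)), both expressed as percentages of the mean
        measured value over the test period."""
        n = len(measured)
        mean = sum(measured) / n
        bias = sum(p - m for m, p in zip(measured, predicted)) / n
        rmse = math.sqrt(sum((p - m) ** 2 for m, p in zip(measured, predicted)) / n)
        return 100.0 * bias / mean, 100.0 * rmse / mean
    ```

    Because both metrics depend only on measured and predicted series, a vendor can report them without revealing anything about the model internals, which is the report's stated design goal.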

  11. Implications of imaginary chemical potential for model building of QCD

    CERN Document Server

    Kashiwa, Kouji

    2016-01-01

    Properties of QCD at finite imaginary chemical potential are revisited to utilize for the model building of QCD in low energy regimes. For example, the electric holonomy which is closely related to the Polyakov-loop drastically affects thermodynamic quantities beside the Roberge-Weiss transition line. To incorporate several properties at finite imaginary chemical potential, it is important to introduce the holonomy effects to the coupling constant of effective models. This extension is possible by considering the entanglement vertex. We show justifications of the entanglement vertex based on the derivation of the effective four-fermi interaction in the Nambu--Jona-Lasinio model and present its general form with the local approximation. To discuss how to remove model ambiguities in the entanglement vertex, we calculate the chiral condensate with different $\\mathbb{Z}_3$ sectors and the dual quark condensate.

  12. Theory Building- Towards an understanding of business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2009-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start to think more radically, when considering to innovate their business model. However, the development and innovation of business...... models is a complex venture and has not been widely researched yet. The objective of this paper is therefore 1) to build a [descriptive] theoretical understanding, based on Christensen's (2005) three-step procedure, to business models and their innovation and, as a result of that, 2) to strengthen...... researchers' and practitioners' perspectives as to how the process of business model innovation can be realized. By using various researchers' perspectives and assumptions, we identify relevant inconsistencies, which consequently lead us to propose possible supplementary solutions. We conclude our paper by...

  13. Building America Case Study: Accelerating the Delivery of Home-Performance Upgrades Using a Synergistic Business Model, Minneapolis, Minnesota

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    Achieving Building America energy savings goals (40 percent by 2030) will require many existing homes to install energy upgrades. Engaging large numbers of homeowners in building science-guided upgrades during a single remodeling event has been difficult for a number of reasons. Performance upgrades in existing homes tend to occur over multiple years and usually result from component failures (furnace failure) and weather damage (ice dams, roofing, siding). This research attempted to: A) understand homeowners' motivations regarding investing in building science-based performance upgrades; B) determine a rapidly scalable approach to engage large numbers of homeowners directly through existing customer networks; and C) access a business model that will manage all aspects of the contractor-homeowner-performance professional interface to ensure good upgrade decisions over time. The solution results from a synergistic approach that merges networks of suppliers with networks of homeowner customers. Companies in the $400 to $800 billion home services industry have proven direct marketing and sales proficiencies that have led to the development of vast customer networks. Companies such as pest control, lawn care, and security firms have nurtured these networks by successfully addressing the ongoing needs of homes. This long-term access to customers, and the trust established through consistent delivery, has also provided opportunities for home service providers to grow by successfully introducing new products and services like attic insulation and air sealing. The most important component for success is a business model that will facilitate and manage the process. The team analyzes a group that developed such a working model.

  14. Building and Linking a Microsimulation Model to a CGE Model : the South African Microsimulation Model

    OpenAIRE

    Nicolas Hérault

    2005-01-01

    This paper describes the project of building a micro-macro model for South Africa. The aim is to deal with the links between globalisation and poverty or inequality, explaining the effects of trade liberalisation on poverty and inequality. The main issue of interest is the effect of international trade on households (especially their income); some changes may contribute to reduce poverty while other changes could work against the poor. The approach presented in this paper relies on combining ...

  15. A SWOT analysis on the implementation of Building Information Models within the geospatial environment

    NARCIS (Netherlands)

    Isikdag, U.; Zlatanova, S.

    2009-01-01

    Building Information Models as product models and Building Information Modelling as a process which supports information management throughout the lifecycle of a building are becoming more widely used in the Architecture/Engineering/Construction (AEC) industry. In order to facilitate various urban m

  16. Occupants' satisfaction toward building environmental quality: structural equation modeling approach.

    Science.gov (United States)

    Kamaruzzaman, Syahrul Nizam; Egbu, C O; Zawawi, Emma Marinie Ahmad; Karim, Saipol Bari Abd; Woon, Chen Jia

    2015-05-01

    It is accepted that occupants who are more satisfied with their workplace's building internal environment are more productive. The main objective of the study was to measure the occupants' level of satisfaction and the perceived importance of the design or refurbishment on office conditions. The study also attempted to determine the factors affecting the occupants' satisfaction with their building or office conditions. Post-occupancy evaluations were conducted using a structured questionnaire developed by the Built Environment Research Group at the University of Manchester, UK. Our questionnaires incorporate 22 factors relating to the internal environment and rate these in terms of "user satisfaction" and "degree of importance." The questions were modified to reflect the specific setting of the study and take into consideration the local conditions and climate in Malaysia. The overall mean satisfaction of the occupants toward their office environment was 5.35. The results were measured by a single item of overall liking of office conditions in general. Occupants were more satisfied with their state of health in the workplace, but they were extremely dissatisfied with the distance away from a window. The factor analysis divided the variables into three groups, namely intrusion, air quality, and office appearance. Structural equation modeling (SEM) was then used to determine which factor had the most significant influence on occupants' satisfaction: appearance. The findings from the study suggest that continuous improvement in aspects of the building's appearance needs to be supported with effective and comprehensive maintenance to sustain the occupants' satisfaction. PMID:25864077

  17. Modelling piezoelectric energy harvesting potential in an educational building

    International Nuclear Information System (INIS)

    Highlights: • Energy harvesting potential of commercialized piezoelectric tiles is analyzed. • The parameters which affect the energy harvesting efficiency are determined. • The potential could cover 0.5% of the total energy usage of the library building. • A simplified evaluation indicator is proposed to test the considered paving area. - Abstract: In this paper, the potential application of a commercial piezoelectric energy harvester in a central hub building at Macquarie University in Sydney, Australia is examined and discussed. Optimization of the piezoelectric tile deployment is presented according to the frequency of pedestrian mobility, and a model is developed in which 3.1% of the total floor area with the highest pedestrian mobility is paved with piezoelectric tiles. The modelling results indicate that the total annual energy harvesting potential for the proposed optimized tile pavement model is estimated at 1.1 MWh/year. This potential energy generation may be further increased to 9.9 MWh/year with a possible improvement in the piezoelectric energy conversion efficiency integrated into the system. This energy harvesting potential would be sufficient to meet close to 0.5% of the annual energy needs of the building. The study confirms that locating high-traffic areas is critical for optimizing energy harvesting efficiency, and that the orientation of the tile pavement significantly affects the total amount of harvested energy. A Density Flow evaluation is recommended in this study to qualitatively evaluate the piezoelectric power harvesting potential of the considered area based on the number of pedestrian crossings per unit time
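
    The headline figures in this record can be cross-checked with simple arithmetic. The sketch below uses only the numbers quoted above; the building's total annual consumption is inferred from them, not stated directly in the abstract.

```python
# Back-of-the-envelope check of the piezoelectric harvesting figures quoted
# in the record. Inputs are the abstract's own numbers; the building's total
# consumption is derived, not stated directly.

harvested_mwh = 1.1          # MWh/year from the optimized tile pavement
coverage_fraction = 0.005    # "close to 0.5%" of the building's annual needs

# Implied annual energy use of the building
building_use_mwh = harvested_mwh / coverage_fraction
print(f"Implied building consumption: {building_use_mwh:.0f} MWh/year")

# Improved-efficiency scenario (9.9 MWh/year) as a share of the same total
improved_fraction = 9.9 / building_use_mwh
print(f"Improved scenario covers: {improved_fraction:.1%} of annual needs")
```

    The implied consumption of roughly 220 MWh/year is consistent with the improved scenario covering about 4.5% of the building's needs.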

  18. Communicate and collaborate by using building information modeling

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    Building Information Modeling (BIM) represents a new approach within the Architecture, Engineering, and Construction (AEC) industry, one that encourages collaboration and engagement of all stakeholders on a project. This study discusses the potential of adopting BIM as a communication and collaboration platform. The discussion is based on: (1) a review of the latest BIM literature, (2) a qualitative survey of professionals within the industry, and (3) mapping of available BIM standards. This study presents the potential benefits, risks, and the overarching challenges of adopting BIM, and makes...

  19. Modelling consensus building in Delphi practices for participated transport planning

    CERN Document Server

    Pira, Michela Le; Ignaccolo, Matteo; Pluchino, Alessandro

    2015-01-01

    In this study a consensus building process based on a combination of the Analytic Hierarchy Process (AHP) and the Delphi method is presented and applied to the decision-making process about alternative policy measures to promote cycling mobility. An agent-based model is used here to reproduce the same process of convergence of opinions, with the aim of understanding the role of network topology, stakeholder influence and other sensitive variables in the emergence of consensus. It can be a useful tool for decision-makers to guide them in planning effective participation processes.
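
    As a toy illustration of how opinion convergence of this kind can be simulated, the sketch below runs DeGroot-style repeated averaging on a random directed network. It is not the authors' AHP/Delphi model; the network size, density, and update rule are all invented for illustration.

```python
import random

# Minimal DeGroot-style opinion averaging on a random directed network --
# a toy illustration of consensus emergence, not the record's actual model.

def consensus(n=20, rounds=50, p=0.3, seed=1):
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n)]
    # random out-neighbour lists; each agent also keeps its own opinion
    nbrs = [[j for j in range(n) if j != i and rng.random() < p]
            for i in range(n)]
    for _ in range(rounds):
        opinions = [
            sum([opinions[i]] + [opinions[j] for j in nbrs[i]])
            / (1 + len(nbrs[i]))
            for i in range(n)
        ]
    return opinions

ops = consensus()
print(max(ops) - min(ops))  # the opinion spread shrinks toward 0
```

    On a connected network the spread of opinions contracts geometrically, which is the mechanism the agent-based model exploits to study how topology and influence shape the speed of consensus.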

  20. Toward Building a New Seismic Hazard Model for Mainland China

    Science.gov (United States)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data using the methodology recommended by Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonics. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
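
    The tapered Gutenberg-Richter (TGR) relationship used above combines a power law in seismic moment with an exponential taper at a corner magnitude. The sketch below assumes the standard TGR survivor function and the Hanks-Kanamori magnitude-moment conversion; the parameter values are illustrative, not those of the study.

```python
import math

# Sketch of the tapered Gutenberg-Richter (TGR) survivor function used to
# model zone seismicity rates. Parameter values are illustrative only.

def moment(mw):
    """Seismic moment (N*m) from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.05)

def tgr_survivor(mw, mw_t=5.0, beta=0.65, mw_corner=8.0):
    """Fraction of events with magnitude >= mw, for mw >= threshold mw_t."""
    m, m_t, m_c = moment(mw), moment(mw_t), moment(mw_corner)
    return (m_t / m) ** beta * math.exp((m_t - m) / m_c)

# Relative rates: the exponential taper suppresses events approaching the
# corner magnitude, unlike a pure Gutenberg-Richter power law.
for mw in (5.0, 6.0, 7.0, 8.0):
    print(mw, tgr_survivor(mw))
```

    Fitting the a- and b-values to the historical catalog and constraining the corner magnitude with the geodetic moment rate, as the record describes, amounts to choosing these parameters per source zone.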

  1. ALADDIN - enhancing applicability and scalability

    International Nuclear Information System (INIS)

    The ALADDIN project aims at the study and development of flexible, accurate, and reliable techniques and principles for computerised event classification and fault diagnosis for complex machinery and industrial processes. The main focus of the project is on advanced numerical techniques, such as wavelets, and empirical modelling with neural networks. This document reports on recent important advancements, which significantly widen the practical applicability of the developed principles, both in terms of flexibility of use, and in terms of scalability to large problem domains. In particular, two novel techniques are described here. The first, which we call Wavelet On-Line Pre-processing (WOLP), is aimed at extracting, on-line, relevant dynamic features from the process data streams. This technique allows a system greater flexibility in detecting and processing transients over a range of different time scales. The second technique, which we call Autonomous Recursive Task Decomposition (ARTD), is aimed at tackling the problem of constructing a classifier able to discriminate among a large number of different event/fault classes, which is often the case when the application domain is a complex industrial process. ARTD also allows for incremental application development (i.e. the incremental addition of new classes to an existing classifier, without the need to retrain the entire system), and for simplified application maintenance. The description of these novel techniques is complemented by reports of quantitative experiments that show in practice the extent of these improvements. (Author)

  2. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    Science.gov (United States)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not only limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with an optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions of existing buildings directly translate into energy cost saving in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and

  3. INTEGRATING SMARTPHONE IMAGES AND AIRBORNE LIDAR DATA FOR COMPLETE URBAN BUILDING MODELLING

    OpenAIRE

    Zhang, Shenman; Shan, Jie; Zhang, Zhichao; Yan, Jixing; Hou, Yaolin

    2016-01-01

    A complete building model reconstruction needs data collected from both air and ground. The former often has sparse coverage on building façades, while the latter usually is unable to observe the building rooftops. Attempting to solve the missing data issues in building reconstruction from single data source, we describe an approach for complete building reconstruction that integrates airborne LiDAR data and ground smartphone imagery. First, by taking advantages of GPS and digital compass inf...

  4. Building predictive models of soil particle-size distribution

    Directory of Open Access Journals (Sweden)

    Alessandro Samuel-Rosa

    2013-04-01

    Full Text Available Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land-surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance. This performance is similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, the PM performance can reach 70 %. The largest uncertainties were observed in geologically more complex areas. Therefore, significant improvements in the predictions can only be achieved if accurate geological data is made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the particle-size distribution (psd) of soils in regions of complex geology.
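
    A multiple linear regression of a size fraction on terrain attributes, as described above, can be sketched in a few lines. The data below are synthetic (the response was generated from known coefficients so the fit recovers them exactly), and the predictor set is a subset of the attributes listed in the record.

```python
# Toy multiple linear regression of a soil size fraction (e.g. % clay) on
# terrain attributes. The sample data are synthetic: y was generated as
# clay% = 2 + 0.05*elevation + 0.8*slope, so OLS should recover these values.

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Least-squares coefficients (intercept first) via the normal equations."""
    Z = [[1.0] + row for row in X]
    k = len(Z[0])
    XtX = [[sum(z[a] * z[b] for z in Z) for b in range(k)] for a in range(k)]
    Xty = [sum(z[a] * yi for z, yi in zip(Z, y)) for a in range(k)]
    return solve(XtX, Xty)

# predictors: [elevation (m), slope (degrees)]; response: clay fraction (%)
X = [[420, 5], [480, 12], [390, 3], [510, 15], [450, 8]]
y = [27.0, 35.6, 23.9, 39.5, 30.9]
coef = ols(X, y)
print(coef)  # ~[2.0, 0.05, 0.8]
```

    With real soil data the fit is of course not exact; the "more than half of the data variance" explained in the record corresponds to an R2 above 0.5.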

  5. Models in theory building: the case of early string theory

    International Nuclear Information System (INIS)

    The history of the origins and first steps of string theory, from Veneziano's formulation of his famous scattering amplitude in 1968 to the 'first string revolution' in 1984, provides rich material for discussing traditional issues in the philosophy of science. This paper focusses on the initial phase of this history, that is the making of early string theory out of the 'dual theory of strong interactions' motivated by the aim of finding a viable theory of hadrons in the framework of the so-called S-matrix theory of the Sixties: from the first two models proposed (the Dual Resonance Model and the Shapiro-Virasoro Model) to all the subsequent endeavours to extend and complete the theory, including its string interpretation. As is the aim of this paper to show, by representing an exemplary illustration of the building of a scientific theory out of tentative and partial models this is a particularly fruitful case study for the current philosophical discussion on how to characterize a scientific model, a scientific theory, and the relation between models and theories.

  6. Responsive, Flexible and Scalable Broader Impacts (Invited)

    Science.gov (United States)

    Decharon, A.; Companion, C.; Steinman, M.

    2010-12-01

    In many educator professional development workshops, scientists present content in a slideshow-type format and field questions afterwards. Drawbacks of this approach include: inability to begin the lecture with content that is responsive to audience needs; lack of flexible access to specific material within the linear presentation; and “Q&A” sessions are not easily scalable to broader audiences. Often this type of traditional interaction provides little direct benefit to the scientists. The Centers for Ocean Sciences Education Excellence - Ocean Systems (COSEE-OS) applies the technique of concept mapping with demonstrated effectiveness in helping scientists and educators “get on the same page” (deCharon et al., 2009). A key aspect is scientist professional development geared towards improving face-to-face and online communication with non-scientists. COSEE-OS promotes scientist-educator collaboration, tests the application of scientist-educator maps in new contexts through webinars, and is piloting the expansion of maps as long-lived resources for the broader community. Collaboration - COSEE-OS has developed and tested a workshop model bringing scientists and educators together in a peer-oriented process, often clarifying common misconceptions. Scientist-educator teams develop online concept maps that are hyperlinked to “assets” (i.e., images, videos, news) and are responsive to the needs of non-scientist audiences. In workshop evaluations, 91% of educators said that the process of concept mapping helped them think through science topics and 89% said that concept mapping helped build a bridge of communication with scientists (n=53). Application - After developing a concept map, with COSEE-OS staff assistance, scientists are invited to give webinar presentations that include live “Q&A” sessions. The webinars extend the reach of scientist-created concept maps to new contexts, both geographically and topically (e.g., oil spill), with a relatively small

  7. Dispersion model for airborne particulates inside a building

    International Nuclear Information System (INIS)

    An empirical model has been developed for the spread of airborne radioactive particles after they are released inside a building. The model has been useful in performing safety analyses of actinide materials facilities at the Savannah River Plant (SRP). These facilities employ the multiple-air-zone concept; that is, ventilation air flows from rooms or areas of least radioactive material hazard, through zones of increasing hazard, to a treatment system. A composite of the data for dispersion of airborne activity during 12 actual case incidents at SRP forms the basis for this model. These incidents occurred during approximately 90 plant-years of experience at SRP with the chemical and metallurgical processing of purified neptunium and plutonium after their recovery from irradiated uranium. The model gives ratios of the airborne activity concentrations in rooms and corridors near the site of the release. The multiple-air-zone concept has been applied to many designs of nuclear facilities as a safety feature to limit the spread of airborne activity from a release. The model illustrates the limitations of this concept: it predicts an apparently anomalous behavior of airborne particulates; namely, a small migration against the flow of the ventilation air

  8. BUILDING ROBUST APPEARANCE MODELS USING ON-LINE FEATURE SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    PORTER, REID B. [Los Alamos National Laboratory; LOVELAND, ROHAN [Los Alamos National Laboratory; ROSTEN, ED [Los Alamos National Laboratory

    2007-01-29

    In many tracking applications, adapting the target appearance model over time can improve performance. This approach is most popular in high frame rate video applications where latent variables, related to the objects appearance (e.g., orientation and pose), vary slowly from one frame to the next. In these cases the appearance model and the tracking system are tightly integrated, and latent variables are often included as part of the tracking system's dynamic model. In this paper we describe our efforts to track cars in low frame rate data (1 frame/second) acquired from a highly unstable airborne platform. Due to the low frame rate, and poor image quality, the appearance of a particular vehicle varies greatly from one frame to the next. This leads us to a different problem: how can we build the best appearance model from all instances of a vehicle we have seen so far. The best appearance model should maximize the future performance of the tracking system, and maximize the chances of reacquiring the vehicle once it leaves the field of view. We propose an online feature selection approach to this problem and investigate the performance and computational trade-offs with a real-world dataset.

  9. Highly scalable Ab initio genomic motif identification

    KAUST Repository

    Marchand, Benoît

    2011-01-01

    We present results of scaling an ab initio motif family identification system, Dragon Motif Finder (DMF), to 65,536 processor cores of IBM Blue Gene/P. DMF seeks groups of mutually similar polynucleotide patterns within a set of genomic sequences and builds various motif families from them. Such information is of relevance to many problems in life sciences. Prior attempts to scale such ab initio motif-finding algorithms achieved limited success. We solve the scalability issues using a combination of mixed-mode MPI-OpenMP parallel programming, master-slave work assignment, multi-level workload distribution, multi-level MPI collectives, and serial optimizations. While the scalability of our algorithm was excellent (94% parallel efficiency on 65,536 cores relative to 256 cores on a modest-size problem), the final speedup with respect to the original serial code exceeded 250,000 when serial optimizations are included. This enabled us to carry out many large-scale ab initio motif-finding simulations in a few hours while the original serial code would have needed decades of execution time. Copyright 2011 ACM.
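
    The quoted 94% parallel efficiency is a strong-scaling figure measured relative to a 256-core baseline rather than a serial run. A minimal sketch of the calculation follows; the timings are invented to reproduce the reported efficiency, not taken from the paper.

```python
# Relative parallel efficiency for a strong-scaling experiment: the speedup
# actually achieved when scaling from p_base to p_big cores, divided by the
# ideal speedup p_big / p_base. Timings below are illustrative only.

def relative_efficiency(t_base, p_base, t_big, p_big):
    speedup = t_base / t_big
    ideal = p_big / p_base
    return speedup / ideal

eff = relative_efficiency(t_base=2400.0, p_base=256, t_big=9.97, p_big=65536)
print(round(eff, 3))  # ~0.94, i.e. 94% efficiency on 256x more cores
```

    Measuring against a 256-core baseline is standard when the problem is too large for a single-core timing; the separate 250,000x speedup figure folds in serial optimizations as well as parallelism.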

  10. Model Building Strategies for Predicting Multiple Landslide Events

    Science.gov (United States)

    Lombardo, L.; Cama, M.; Märker, M.; Parisi, L.; Rotigliano, E.

    2013-12-01

    A model-building strategy is tested to assess susceptibility to landslides driven by extreme climatic events. Extreme climatic inputs such as storms are typically very local phenomena in Mediterranean areas, so that, with the exception of recently stricken areas, the landslide inventories required to train any stochastic model are usually unavailable. A solution is proposed here, consisting of training a susceptibility model in a source catchment, implemented using the binary logistic regression technique, and exporting its predicting function (the regression coefficients of the selected predictors) to a target catchment to predict its landslide distribution. To test the method we exploit the disaster that occurred in the Messina area (southern Italy) on the 1st of October 2009 where, following a 250 mm/8 hour storm, approximately 2000 debris flow/debris avalanche landslides were triggered in an area of 21 km2, killing thirty-seven people, injuring more than one hundred, and causing 0.5M euro worth of structural damage. The debris flow and debris avalanche phenomena involved the thin weathered mantle of the Variscan low- to high-grade metamorphic rocks that outcrop on the eastern slopes of the Peloritan Belt. Two 10 km2 stream catchments, which are located inside the storm core area, were exploited: susceptibility models trained in the Briga catchment were tested when exported to predict the landslide distribution in the Giampilieri catchment. The prediction performance (based on goodness of fit, prediction skill, accuracy and precision assessment) of the exported model was then compared with that of a model prepared in the Giampilieri catchment exploiting its own landslide inventory. The results demonstrate that the landslide scenario observed in the Giampilieri catchment can be predicted with the same high performance without knowing its landslide distribution: in fact we obtained only a very small decrease in predictive performance when
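
    The core of the export strategy is: fit in the source catchment, freeze the coefficients, and score the target catchment without any local training. The sketch below uses synthetic data and a plain gradient-descent logistic regression; it is not the Messina inventories or the study's predictor set.

```python
import math, random

# Sketch of the export strategy: fit a logistic-regression susceptibility
# model in a "source" catchment, then score "target" cells with the frozen
# coefficients. Data are synthetic, not the Briga/Giampilieri inventories.

def sigmoid(z):
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_logistic(X, y, lr=0.5, epochs=1000):
    """Plain per-sample gradient descent; returns [intercept, w1, w2, ...]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(a * b for a, b in zip(w[1:], xi)))
            err = yi - p
            w[0] += lr * err
            for j, a in enumerate(xi):
                w[j + 1] += lr * err * a
    return w

rng = random.Random(0)
# source catchment: landslide presence driven by slope and wetness index
X_src = [[rng.random(), rng.random()] for _ in range(200)]
y_src = [1 if 2 * s + t > 1.5 else 0 for s, t in X_src]
w = fit_logistic(X_src, y_src)

# target catchment: apply the exported coefficients, no local training
X_tgt = [[0.9, 0.8], [0.1, 0.2]]
scores = [sigmoid(w[0] + sum(a * b for a, b in zip(w[1:], x))) for x in X_tgt]
print(scores)  # high susceptibility for the steep/wet cell, low for the other
```

    The study's comparison then amounts to checking that target-catchment scores from the exported coefficients rank cells almost as well as a model fitted on the target's own inventory.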

  11. Modelling the affordance in the field of green building

    OpenAIRE

    Bona, Audrey

    2016-01-01

    The energy performance of sustainable buildings is significantly lower than expected, and the impact of user behaviour therefore becomes a crucial element. Different solutions are implemented to achieve the predicted performance; these range from information, i.e. user guides, to influencing user behaviour through building automation, which reduces the users' control. The possibility of designing more efficient buildings without altering the relationship between the user and the building, and wit...

  12. BIM (Building Information Modeling) and TCO (Total Cost of Ownership)

    Science.gov (United States)

    Christensen, Douglas K.

    2009-01-01

    There are some words in the building industry that seem to be clear and understandable to say, yet they need some help in understanding the depth of the meaning. When the term maintenance is talked about there seems to be some agreement that it does not mean building a new building. Maintenance as a term covers many areas and if not clarified…

  13. Application of Mathematical Model of Evacuation for Large Stadium Building

    Directory of Open Access Journals (Sweden)

    Bing Zhang

    2013-02-01

    Full Text Available The statistics of sports arena accidents show that the main causes of crowd stampedes are blocked exits and poor surrounding transportation. In the process of evacuation, the most common problem is that large numbers of people become stranded, and these stranded crowds are the main carrier leading to a crowd stampede. Using large amounts of data and reasonable evaluations of staff and transportation instruments, we propose an inflow model for the crowded state, a principle of maximum flow for channel design, an optimal model of vehicle parking, and an evacuation model for subways and buses, according to the stages of evacuation in stadiums. We analyze their usage area, marginal conditions, and real data. Finally, we obtain some valuable results: curves of density and flow, evacuation times, a formula for channel design, an optimal parking design, and formulas for the evacuation times of subways and buses. These results fit the real data from various references. With the help of the models and results, we obtain the total evacuation time, simulate the process, and reproduce parts of real evacuation situations. According to these results, the evacuation of 100,000 people can be finished in about 45 min. On this basis, we propose some optimal plans for the stadium and its surroundings.
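
    The order of magnitude of the 45-minute figure can be reproduced with a simple capacity estimate in the spirit of the maximum-flow principle for channel design. The specific-flow value below is a typical handbook figure for doorways, assumed here rather than taken from the record.

```python
# Capacity-style estimate behind "maximum flow" channel design:
# evacuation time ~ occupants / (specific flow * total exit width).
# The 1.33 persons/m/s specific flow is an assumed handbook value.

def evacuation_time_s(occupants, total_exit_width_m, specific_flow=1.33):
    flow = specific_flow * total_exit_width_m   # persons per second
    return occupants / flow

t = evacuation_time_s(occupants=100_000, total_exit_width_m=28.0)
print(t / 60)  # roughly 45 minutes for a 100,000-person stadium
```

    Real designs must also account for walking distances, stair capacity, and the stranded-crowd effects the abstract highlights, so this is a lower bound rather than a prediction.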

  14. Optimization of Enzymatic Biochemical Logic for Noise Reduction and Scalability: How Many Biocomputing Gates Can be Interconnected in a Circuit?

    CERN Document Server

    Privman, V; Solenov, D; Pita, M; Katz, E

    2008-01-01

    We report an experimental evaluation of the "input-output surface" for a biochemical AND gate. The obtained data are modeled within the rate-equation approach, with the aim of mapping out the gate function and casting it in the language of logic variables appropriate for the analysis of Boolean logic for scalability. In order to minimize "analog" noise, we consider a theoretical approach for determining an optimal set of process parameters that minimizes "analog" noise amplification under gate concatenation. We establish that under optimized conditions, the presently studied biochemical gates can be concatenated for up to order 10 processing steps. Beyond that, new paradigms for avoiding noise build-up will have to be developed. We offer a general discussion of the ideas and possible future challenges for both experimental and theoretical research toward scalable biochemical computing.
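
    The order-10 concatenation limit can be pictured with a toy noise-growth model: if each gate amplifies the analog spread of its inputs by a constant factor, noise grows geometrically with circuit depth. The gain and initial-noise values below are invented for illustration, not fitted to the experiments.

```python
# Toy model of analog-noise growth under gate concatenation: each gate
# multiplies the input spread sigma by a gain factor g, so after n gates
# the spread is sigma * g**n. Illustrative numbers only.

def noise_after(n_gates, sigma0=0.02, gain=1.3):
    return sigma0 * gain ** n_gates

for n in (1, 5, 10, 15):
    print(n, round(noise_after(n), 3))
# around n = 10 the spread becomes a sizable fraction of the 0-to-1 logic
# range, and by n = 15 it exceeds it -- hence the need for new paradigms
```

    Parameter optimization in the record corresponds to pushing the per-gate gain as close to 1 as the chemistry allows, which directly extends the usable circuit depth.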

  15. Building a sustainable Academic Health Department: the South Carolina model.

    Science.gov (United States)

    Smith, Lillian Upton; Waddell, Lisa; Kyle, Joseph; Hand, Gregory A

    2014-01-01

    Given the limited resources available to public health, it is critical that university programs complement the development needs of agencies. Unfortunately, academic and practice public health entities have long been challenged in building sustainable collaborations that support practice-based research, teaching, and service. The academic health department concept offers a promising solution. In South Carolina, the partners started their academic health department program with a small grant that expanded into a dynamic infrastructure that supports innovative professional exchange and development programs. This article provides a background and describes the key elements of the South Carolina model: joint leadership, a multicomponent memorandum of agreement, and a shared professional development mission. The combination of these elements allows the partners to leverage resources and deftly respond to challenges and opportunities, ultimately fostering the sustainability of the collaboration. PMID:24667204

  16. Building Information Modelling for Cultural Heritage: A review

    Science.gov (United States)

    Logothetis, S.; Delinasiou, A.; Stylianidis, E.

    2015-08-01

    We discuss the evolution and state of the art of the use of Building Information Modelling (BIM) in the field of cultural heritage documentation. BIM is a prominent topic involving many different aspects, including principles, technology, and even privacy rights for cultural heritage objects. In recent years, modern documentation needs have revealed the potential of BIM. Many architects, archaeologists, conservationists, and engineers regard BIM as a disruptive force, changing the way professionals can document and manage a cultural heritage structure. In recent years there have been many developments in the BIM field, and the resulting technology and methods have challenged the cultural heritage community's documentation framework. In this review article, following a brief historical background of BIM, we review the recent developments, focusing on the cultural heritage documentation perspective.

  17. Building Information Modeling for Managing Design and Construction

    DEFF Research Database (Denmark)

    Berard, Ole Bengt

    , consequently, the information flow is unique. Therefore, the present study suggests a method for identifying information requirements collaboratively between the design and the construction team. The method is based on pull scheduling for design from lean construction. Furthermore, the study suggests the......Contractors planning and executing construction work encounter many kinds of problems with design information, such as uncoordinated drawings and specifications, missing relevant information, and late delivery of design information. Research has shown that missing design information and unintended...... outcome of construction work. Even though contractors regularly encounter design information problems, these issues are accepted as a condition of doing business and better design information has yet to be defined. Building information modeling has the inherent promise of improving the quality of design...

  18. Multi-criteria decision model for retrofitting existing buildings

    Directory of Open Access Journals (Sweden)

    M. D. Bostenaru Dan

    2004-01-01

    Full Text Available Decision making is an element of the risk management process. In this paper we investigate how science can help in decision making and implementation for retrofitting buildings in earthquake-prone urban areas. Such interventions involve actors from various spheres. Their interests range from minimising the intervention for maximal preservation to increasing it for seismic safety. Research was conducted to see how to facilitate collaboration between these actors. Particular attention was given to the role of time in the actors' preferences. For this reason, at the decision level, both the processual and the personal dimensions of risk management, the latter seen as a task, were considered. A systematic approach was employed to determine the functional structure of a participative decision model. Three layers on which the actors implied in this multi-criteria decision problem interact were identified: town, building, and element. So-called 'retrofit elements' are characteristic bearers in the architectural survey, engineering simulations and cost estimation, and define the realms perceived by the inhabitants. In this way they represent an interaction basis for the interest groups considered in a deeper study. Such means of orientation for the actors' interaction were designed on other levels of intervention as well. Finally, an 'experiment' for the implementation of the decision model is presented: a strategic plan for an urban intervention towards reducing earthquake hazard impact through retrofitting. A systematic approach thus proves to be a very good communication basis among the participants in the seismic risk management process. Nevertheless, it can only be applied in the later phases (decision, implementation, control), since it serves to verify and improve a solution rather than to develop the concept. The 'retrofit elements' are a typical example of the degree of detail reached in the retrofit design plans in these phases.

  19. Scalable Quantum Computing with "Enhancement" Quantum Dots

    CERN Document Server

    Lyanda-Geller, Y B; Yang, M J

    2005-01-01

We propose a novel scheme for the solid state realization of a quantum computer based on single spin "enhancement mode" quantum dots as building blocks. In enhancement quantum dots, just one electron can be brought into an initially empty dot, in contrast to depletion mode dots, which are based on expelling electrons from multi-electron dots by gates. Quantum computer architectures based on depletion dots are confronted by several challenges that make scalability difficult. These challenges can be successfully met by the approach based on enhancement mode, capable of producing a square array of dots with versatile functionalities. These functionalities allow transportation of qubits, including teleportation, and error correction based on straightforward one- and two-qubit operations. We describe physical properties and demonstrate experimental characteristics of enhancement quantum dots and single-electron transistors based on InAs/GaSb composite quantum wells. We discuss the materials aspects of quantum dot quantum compu...

  20. Dispersion model for airborne particulates inside a building

    International Nuclear Information System (INIS)

    An empirical model has been developed for the spread of airborne radioactive particles after they are released inside a building. The model has been useful in performing safety analyses of actinide materials facilities at the Savannah River Plant (SRP), operated for the US Department of Energy by the Du Pont Company. These facilities employ the multiple-air-zone concept; that is, ventilation air flows from rooms or areas of least radioactive material hazard, through zones of increasing hazard, to a treatment system. A composite of the data for dispersion of airborne activity during 12 actual case incidents at SRP forms the basis for this model. These incidents occurred during approximately 90 plant-years of experience at SRP with the chemical and metallurgical processing of purified neptunium and plutonium after their recovery from irradiated uranium. The model gives ratios of the airborne activity concentrations in rooms and corridors near the site of the release. All data are normalized to the data from the air sampler nearest the release point. The model can be applied in predicting airborne activity concentrations from particulate releases elsewhere, if the facility in question has similar features of floor plan, air velocity, and air flow direction. The multiple-air-zone concept has been applied to many designs of nuclear facilities as a safety feature to limit the spread of airborne activity from a release. The model illustrates the limitations of this concept: it predicts an apparently anomalous behavior of airborne particulates; namely, a small migration against the flow of the ventilation air. The following phenomena are suggested as possible mechanisms for this migration: eddy currents in the air flow; leaks of ventilation air between zones; open doors; movement of personnel during an incident; inadequate flow of ventilation air; and thermal gradients. 2 references, 12 figures, 4 tables
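The ratio-based structure of such a model can be sketched in a few lines: readings from several air samplers are normalized to the sampler nearest the release point, giving the concentration ratios the model works with. The function name and the sample values below are hypothetical illustrations, not data from the SRP incidents.

```python
def normalized_concentrations(readings, nearest):
    """Normalize air-sampler readings to the sampler nearest the release point.

    readings: dict mapping location -> measured airborne concentration
    nearest:  key of the sampler closest to the release
    Returns a dict of dimensionless ratios (nearest sampler -> 1.0).
    """
    ref = readings[nearest]
    if ref <= 0:
        raise ValueError("reference sampler reading must be positive")
    return {loc: c / ref for loc, c in readings.items()}


# hypothetical readings (arbitrary activity-concentration units)
readings = {"room_A": 120.0, "corridor": 18.0, "room_B": 6.0}
ratios = normalized_concentrations(readings, nearest="room_A")
```

With all data expressed as ratios, incidents with very different absolute releases become directly comparable, which is how a composite of 12 separate incidents can form one model.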

  1. Green Template for Life Cycle Assessment of Buildings Based on Building Information Modeling: Focus on Embodied Environmental Impact

    Directory of Open Access Journals (Sweden)

    Sungwoo Lee

    2015-12-01

Full Text Available The increased popularity of building information modeling (BIM) for application in the construction of eco-friendly green buildings has given rise to techniques for evaluating green buildings constructed using BIM features. Existing BIM-based green building evaluation techniques mostly rely on externally provided evaluation tools, which pose problems associated with interoperability, including a lack of data compatibility and the amount of time required for format conversion. To overcome these problems, this study sets out to develop a template (the "green template") for evaluating the embodied environmental impact of using a BIM design tool, as part of BIM-based building life-cycle assessment (LCA) technology development. Firstly, the BIM level of detail (LOD) was determined for evaluating the embodied environmental impact, and a database of the embodied environmental impact factors of the major building materials was constructed, thereby adopting an LCA-based approach. Libraries of major building elements were then developed using the established database and a compiled evaluation table of the embodied environmental impact of the building materials. Finally, the green template was developed as an embodied environmental impact evaluation tool, and a case study was performed to test its applicability. The results of the green template-based embodied environmental impact evaluation of a test building were validated against those of its actual quantity takeoff (2D takeoff), and its reliability was confirmed by an effective error rate of ≤5%. This study aims to develop a system for assessing the impact of the substances discharged from the concrete production process on six environmental impact categories, i.e., global warming (GWP), acidification (AP), eutrophication (EP), abiotic depletion (ADP), ozone depletion (ODP), and photochemical oxidant creation (POCP), using the life cycle assessment (LCA) method. To achieve this, we proposed an LCA method

  2. Status and Perceptions of the Application of Building Information Modeling for Improved Building Projects Delivery in Nigeria

    Directory of Open Access Journals (Sweden)

    S.C Ugochukwu

    2015-11-01

Full Text Available Building Information Modeling (BIM) is a new and innovative approach to building design, construction, and management. It is a cutting-edge, state-of-the-art technology that is not only transforming but improving the building delivery/production process in developed countries of the world. Sadly, Nigeria is yet to adopt this revolutionary technology in her construction industry. This study thus sought to evaluate the present status of the application of BIM in building projects in Nigeria, with a view to underscoring its importance in improving the present state of building delivery in the country. This was effected by means of a field survey of building professionals, based on a structured questionnaire, in which their perceptions were analyzed in order to elicit their level of awareness of BIM application, determine their extent of participation in BIM projects, and identify and rank the most suitable procurement method that encourages BIM application, the barriers to the application of BIM, and the benefits of BIM application to building delivery in Nigeria. Findings revealed that knowledge of BIM application among professionals is very poor (33%), participation/use of BIM in projects is non-existent, the collaborative method of procurement best supports BIM application, lack of awareness remains the major barrier to BIM application, and simultaneous access to the project database by stakeholders is the highest-ranked benefit of BIM application. The study concludes that Nigeria still has a long way to go in understanding, embracing and applying BIM to improve the traditional and stagnant state of her building delivery process. Hence, all hands should be on deck; the government, professional bodies, construction organizations and the academia should ensure that BIM becomes a priority with respect to legislation, training, research and use in the Nigerian building industry.

  3. Scalable shared-memory multiprocessing

    CERN Document Server

    Lenoski, Daniel E

    1995-01-01

    Dr. Lenoski and Dr. Weber have experience with leading-edge research and practical issues involved in implementing large-scale parallel systems. They were key contributors to the architecture and design of the DASH multiprocessor. Currently, they are involved with commercializing scalable shared-memory technology.

  4. Annual review of scalable computing

    CERN Document Server

    Kwong, Yuen Chung

    2000-01-01

    Continuing the Series on Scalable Computing launched in 1999, this volume presents five articles reviewing significant current developments in the field. The topics include the collaborative activities support system, parallel languages, Internet Java, the multithreaded dataflow machine, and task allocation algorithms.

  5. Scalability study of solid xenon

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, J.; Cease, H.; Jaskierny, W. F.; Markley, D.; Pahlka, R. B.; Balakishiyeva, D.; Saab, T.; Filipenko, M.

    2015-04-01

We report a demonstration of the scalability of optically transparent xenon in the solid phase for use as a particle detector above the kilogram scale. We employed a cryostat cooled by liquid nitrogen combined with a xenon purification and chiller system. A modified Bridgman technique reproduces large-scale optically transparent solid xenon.

  6. Intelligent Controls for Net-Zero Energy Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Haorong; Cho, Yong; Peng, Dongming

    2011-10-30

    The goal of this project is to develop and demonstrate enabling technologies that can empower homeowners to convert their homes into net-zero energy buildings in a cost-effective manner. The project objectives and expected outcomes are as follows: • To develop rapid and scalable building information collection and modeling technologies that can obtain and process “as-built” building information in an automated or semiautomated manner. • To identify low-cost measurements and develop low-cost virtual sensors that can monitor building operations in a plug-n-play and low-cost manner. • To integrate and demonstrate low-cost building information modeling (BIM) technologies. • To develop decision support tools which can empower building owners to perform energy auditing and retrofit analysis. • To develop and demonstrate low-cost automated diagnostics and optimal control technologies which can improve building energy efficiency in a continual manner.

  7. Activity measurement and effective dose modelling of natural radionuclides in building material

    International Nuclear Information System (INIS)

    In this paper the assessment of natural radionuclides' activity concentration in building materials, calibration requirements and related indoor exposure dose models is presented. Particular attention is turned to specific improvements in low-level gamma-ray spectrometry to determine the activity concentration of necessary natural radionuclides in building materials with adequate measurement uncertainties. Different approaches for the modelling of the effective dose indoor due to external radiation resulted from natural radionuclides in building material and results of actual building material assessments are shown. - Highlights: • Dose models for indoor radiation exposure due to natural radionuclides in building materials. • Strategies and methods in radionuclide metrology, activity measurement and dose modelling. • Selection of appropriate parameters in radiation protection standards for building materials. • Scientific-based limitations of indoor exposure due to natural radionuclides in building materials
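The abstract does not reproduce the dose models themselves; as one widely used screening example, the EU activity concentration index for building materials (Council Directive 2013/59/Euratom, Annex VIII) combines the Ra-226, Th-232 and K-40 activity concentrations. A minimal sketch:

```python
def activity_concentration_index(c_ra226, c_th232, c_k40):
    """EU screening index I for building materials (2013/59/Euratom, Annex VIII).

    Inputs are activity concentrations in Bq/kg; I <= 1 corresponds to the
    reference-level screening criterion for the material.
    """
    return c_ra226 / 300.0 + c_th232 / 200.0 + c_k40 / 3000.0


# illustrative (hypothetical) concentrations for a concrete sample, Bq/kg
index = activity_concentration_index(40.0, 30.0, 400.0)
```

The index is only a conservative screening tool; the dose models discussed in the paper refine this with room geometry, material thickness and density.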

  8. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    Science.gov (United States)

    Xu, Pengpeng

Hotel building is one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism has been introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs. Six KPIs were identified from the 11 selected performance indicators. Through a questionnaire survey, out of the 28 success factors, 21 Critical Success Factors (CSFs) were also identified. Using the factor analysis technique, the 21 identified CSFs in this study were grouped into six clusters to help explain project success of sustainable BEER. Finally, an AHP/ANP approach was used in this research to develop a model to

  9. Scalable Recommendation from Web Usage Mining using Method of Moments

    OpenAIRE

    Dasgupta, Sayantan

    2015-01-01

    With the advent of mass-available Internet, twenty-first century observed a steady growth in web based commercial services and technology companies. Most of them are based on web applications that receive huge amount of user traffics, and generate massive amount of web usage data containing user-item interactions. We attempt to build a recommendation algorithm based on such web usage data. It is essential that recommendation algorithms for such applications are highly scalable in nature. Exis...

  10. Application of MCAM 4.8 in creating neutronics model for ITER building

    International Nuclear Information System (INIS)

    The neutronics reference model of the international thermonuclear experimental reactor ITER only defines the tokamak machine and extends to the bio-shield. In order to meet future neutronics analysis needs, it is necessary to create a reference model of the ITER building beyond the bio-shield. With the help of Monte Carlo automatic modeling program MCAM and based on the engineering CAD model, the neutronics simulation model of ITER building complex was created. This model is the first neutronics model of ITER building complex and will be distributed to the world as a reference model. (authors)

  11. Particle Image Velocimetry Measurement of Unsteady Turbulent Flow around Regularly Arranged High-Rise Building Models

    OpenAIRE

    Sato, Tsuyoshi; Hagishima, Aya; Ikegaya, Naoki; Tanimoto, Jun

    2013-01-01

Recent studies comparing turbulent statistics have shown that turbulent flow properties around high-rise building models differ from those around low-rise building models. Although it is important to understand the flow characteristics within and above high-rise building arrays in the study of the urban environment, they are still not fully understood and remain under investigation. For this reason, we performed a wind tunnel experiment using Particle Image Velocimetry (PIV) to investigate and identify the turbulent flow properties an...

  12. Subjective comparison of temporal and quality scalability

    DEFF Research Database (Denmark)

    Korhonen, Jari; Reiter, Ulrich; You, Junyong

    2011-01-01

    reduced either by downscaling the frame rate (temporal scalability) or the image quality (quality scalability). However, the user preferences between different scalability types are not well known in different scenarios. In this paper, we present a methodology for subjective comparison between temporal...

  13. Sustainable building – From role model projects to industrial transformation

    OpenAIRE

    Meistad, Torill Randi

    2015-01-01

    Background and purpose Improving energy efficiency and sustainability is a challenge for the construction industry. The United Nation’s environment programme (UNEP) and the EU’s Energy Performance of Buildings Directive require a change in building practices. The challenge is how to facilitate the transformation. The purpose of this PhD research is to increase the understanding of how the Norwegian construction industry is transforming towards sustainable building. Four issu...

  14. Modelling energy demand in the Norwegian building stock

    OpenAIRE

    Sartori, Igor

    2008-01-01

    Energy demand in the building stock in Norway represents about 40% of the final energy consumption, of which 22% goes to the residential sector and 18% to the service sector. In Norway there is a strong dependency on electricity for heating purposes, with electricity covering about 80% of the energy demand in buildings. The building sector can play an important role in the achievement of a more sustainable energy system. The work performed in the articles presented in this thesis investigates...

  15. Life cycle sustainability assessment modeling of building construction

    OpenAIRE

    Dong, Yahong; 董雅紅

    2014-01-01

    Building industry is one of the most influential economic sectors, which accounts for 10% of the gross domestic product (GDP) globally and 40% of the world energy consumption. To achieve the goal of sustainable development, it is necessary to understand the sustainability performance of building construction in the environmental, the economic and the social aspects. This study quantitatively evaluates impacts of building construction in the three aspects by using the recently developed life c...

  16. Models test on dynamic structure-structure interaction of nuclear power plant buildings

    International Nuclear Information System (INIS)

A reactor building of an NPP (nuclear power plant) is generally constructed closely adjacent to a turbine building and other buildings such as the auxiliary building, and in increasing numbers of NPPs, multiple plants are being planned and constructed closely on a single site. In these situations, adjacent buildings are considered to influence each other through the soil during earthquakes and to exhibit dynamic behaviour different from that of separate buildings, because buildings in NPPs are generally heavy and massive. The dynamic interaction between buildings through the soil during an earthquake is termed here 'dynamic cross interaction (DCI)'. In order to comprehend DCI appropriately, forced vibration tests and earthquake observation are needed using closely constructed building models. Against this background, the Nuclear Power Engineering Corporation (NUPEC) planned a project in 1993 to investigate the DCI effect, following the preceding SSI (soil-structure interaction) investigation project, 'model tests on embedment effect of reactor building'. The project consists of field and laboratory tests. The field test is being carried out using three different building construction conditions: a single reactor building used as a reference for comparison purposes, two identical reactor buildings used to evaluate pure DCI effects, and two different buildings, reactor and turbine building models, to evaluate DCI effects under actual plant conditions. Forced vibration tests and earthquake observations are planned in the field test. The laboratory test is planned to evaluate basic characteristics of the DCI effects using a simple soil model made of silicone rubber and structure models made of aluminium. In this test, forced vibration tests and shaking table tests are planned. The project was started in April 1994 and will be completed in March 2002. This paper describes an outline and a summary of the current status of this project. (orig.)

  17. A comprehensive framework of building model reconstruction from airborne LiDAR data

    Science.gov (United States)

    Xiao, Y.; Wang, C.; Xi, X. H.; Zhang, W. M.

    2014-03-01

    This paper presents a comprehensive framework of reconstructing 3D building models from airborne LiDAR data, which involves building extraction, roof segmentation and model generation. Firstly, building points are extracted from LiDAR point clouds by removing walls, trees, ground and noises. Walls and trees are identified by the normal and multi-return features respectively and then ground and noise are detected by the region growing algorithm which aims at extracting smooth surfaces. Then the connected component analysis is performed to extract building points. Secondly, once the building points are acquired, building roofs are separated by the region growing algorithm which employs the normal vector and curvature of points to detect planar clusters. Finally, by combining regular building outlines obtained from building points and roof intersections acquired from the roof segmentation results, 3D building models with high accuracy are derived. Experimental results demonstrate that the proposed method is able to correctly obtain building points and reconstruct 3D building models with high accuracy.
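The connected component analysis step that groups candidate building points can be illustrated with a toy 4-connected labelling over an occupancy grid, a 2-D simplification of the 3-D point cloud case. The grid below is hypothetical and stands in for the candidate points left after walls, trees, ground and noise have been removed.

```python
from collections import deque


def connected_components(grid):
    """Label 4-connected components of occupied cells in a 0/1 grid.

    Returns (component_count, labels) where labels[r][c] is 0 for empty
    cells and 1..count for occupied cells, one label per building blob.
    """
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and labels[r][c] == 0:
                count += 1
                labels[r][c] = count
                q = deque([(r, c)])
                while q:                      # breadth-first flood fill
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return count, labels


# hypothetical occupancy grid: 1 = candidate building cell
grid = [
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
]
count, labels = connected_components(grid)  # two separate blobs here
```

On real LiDAR data the same idea is applied in 3-D with a distance threshold instead of grid adjacency, but the flood-fill structure is identical.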

  18. VOC sink behaviour on building materials--model evaluation

    Science.gov (United States)

    The event of 11 September 2001 underscored the need to study the vulnerability of buildings to weapons of mass destruction (WMD), including chemical, biological, physical, and radiological agents. Should these agents be released inside a building, they would interact with interio...

  19. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    extend the co-array model to other languages in a small experimental version of Co-array Python. Another collaborative project defined a Fortran 95 interface to ARMCI to encourage Fortran programmers to use the one-sided communication model in anticipation of their conversion to the co-array model later. A collaborative project with the Earth Sciences community at NASA Goddard and GFDL experimented with the co-array model within computational kernels related to their climate models, first using CafLib and then extending the co-array model to use design patterns. Future work will build on the design-pattern idea with a redesign of CafLib as a true object-oriented library using Fortran 2003 and as a parallel numerical library using Fortran 2008.

  20. TH*: Scalable Distributed Trie Hashing

    Directory of Open Access Journals (Sweden)

    Aridj Mohamed

    2010-11-01

Full Text Available In today's world of computers, dealing with huge amounts of data is not unusual. The need to distribute this data in order to increase its availability and the performance of accessing it is more urgent than ever. For these reasons it is necessary to develop scalable distributed data structures. In this paper we propose TH*, a distributed variant of the Trie Hashing data structure. First we propose THsw, a new version of TH without the Nil node in the digital tree (trie); this version is then adapted to the multicomputer environment. The simulation results reveal that TH* is scalable in the sense that it grows gracefully, one bucket at a time, to a large number of servers; TH* also offers good storage space utilization and high query efficiency, especially for ordered operations.
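TH*'s defining property, growing gracefully one bucket at a time, is shared by other scalable distributed data structures. Since the abstract does not give TH*'s exact splitting rules, the sketch below illustrates the one-bucket-at-a-time growth using classical linear hashing instead of trie hashing; it is a single-process toy, not the distributed TH* algorithm.

```python
class LinearHashTable:
    """Toy linear-hashing file: splits exactly one bucket per overflow,
    so the structure grows gracefully as records arrive."""

    def __init__(self, capacity=2):
        self.capacity = capacity  # records per bucket before a split
        self.level = 0            # current doubling level
        self.split = 0            # index of the next bucket to split
        self.buckets = [[]]       # start with a single bucket

    def _addr(self, key):
        # standard linear-hashing address function
        a = hash(key) % (2 ** self.level)
        if a < self.split:        # already-split buckets use the finer hash
            a = hash(key) % (2 ** (self.level + 1))
        return a

    def insert(self, key):
        b = self._addr(key)
        self.buckets[b].append(key)
        if len(self.buckets[b]) > self.capacity:
            self._split_one()

    def _split_one(self):
        # split exactly one bucket; its records are rehashed between the
        # old bucket and the single new bucket appended at the end
        old = self.buckets[self.split]
        self.buckets.append([])
        self.split += 1
        if self.split == 2 ** self.level:
            self.level += 1
            self.split = 0
        keys, old[:] = old[:], []
        for k in keys:
            self.buckets[self._addr(k)].append(k)


table = LinearHashTable(capacity=2)
for k in range(20):
    table.insert(k)
```

TH* replaces the modular hash with trie-based key-to-bucket mapping (which preserves key order, hence the efficient ordered operations), but the incremental growth pattern is the same.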

  1. Implementation of building information modeling in Malaysian construction industry

    Science.gov (United States)

    Memon, Aftab Hameed; Rahman, Ismail Abdul; Harman, Nur Melly Edora

    2014-10-01

This study has assessed the implementation level of Building Information Modeling (BIM) in the construction industry of Malaysia. It also investigated several computer software packages facilitating BIM and the challenges affecting its implementation. Data collection was carried out through a questionnaire survey among construction practitioners: 95 completed questionnaires, out of 150 distributed to consultant, contractor and client organizations, were received and analyzed statistically. The analysis indicated that the level of implementation of BIM in the construction industry of Malaysia is very low. The average index method, employed to assess the effectiveness of various BIM software packages, highlighted Bentley Construction, AutoCAD and ArchiCAD as the three most popular and effective packages. The major challenges to BIM implementation are the need for enhanced collaboration, the added work for designers, and interoperability issues. To improve the level of BIM implementation in the Malaysian industry, it is recommended that a flexible BIM training program for all practitioners be created.
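The average index method mentioned above is a weighted mean over Likert-scale frequency counts: each rating level gets a weight, and the index is the weighted sum divided by the number of responses. A minimal sketch; the 1-5 weights and the tallies below are assumptions for illustration, not the study's data.

```python
def average_index(responses, weights=(1, 2, 3, 4, 5)):
    """Average index = sum(w_i * n_i) / sum(n_i).

    responses: frequency tally per rating level, lowest to highest,
    e.g. (n_strongly_disagree, ..., n_strongly_agree) on a 5-point scale.
    """
    total = sum(responses)
    if total == 0:
        raise ValueError("no responses")
    return sum(w * n for w, n in zip(weights, responses)) / total


# hypothetical tally for one software package across 95 respondents
ai = average_index((2, 5, 20, 40, 28))
```

Ranking items by their average index is what produces ordered lists like the software packages, barriers and benefits reported in the study.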

  2. Uncertainty modelling of critical column buckling for reinforced concrete buildings

    Indian Academy of Sciences (India)

    Kasim A Korkmaz; Fuat Demir; Hamide Tekeli

    2011-04-01

Buckling is a critical issue for structural stability in structural design. In most buckling analyses, the applied loads and the structural and material properties are treated as certain. In reality, however, these parameters are uncertain; a prognostic solution is therefore necessary and uncertainties have to be considered. Fuzzy logic algorithms can be a solution for generating more dependable results. This study investigates material uncertainties in column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. A fuzzy logic algorithm was employed, with lower and upper bounds of the elastic modulus defined to take material uncertainties into account. The results show that uncertainties play an important role in stability analyses and should be considered in design. The proposed approach is applicable to both future numerical and experimental research. According to the results, the calculated buckling loads remain within the lower and upper bounds, while the load values differ for the same concrete strength when different code formulas are used.
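The idea of carrying lower and upper bounds of the elastic modulus through a buckling calculation can be illustrated with the classical Euler formula P_cr = π²EI/(kL)². The section, length and modulus bounds below are hypothetical, and the fuzzy-membership machinery of the study is omitted; this only shows how an interval on E maps to an interval on the critical load.

```python
import math


def euler_buckling_load(e_modulus, inertia, length, k=1.0):
    """Euler critical load P_cr = pi^2 * E * I / (k * L)^2.

    e_modulus in Pa, inertia in m^4, length in m; k is the effective
    length factor (1.0 for a pin-ended column).
    """
    return math.pi ** 2 * e_modulus * inertia / (k * length) ** 2


# hypothetical lower/upper bounds on concrete elastic modulus, Pa
E_low, E_high = 25e9, 32e9
I = 0.3 * 0.3 ** 3 / 12   # m^4, 300 mm square gross section
L = 3.0                   # m storey height

# the load interval induced by the modulus interval
bounds = (euler_buckling_load(E_low, I, L), euler_buckling_load(E_high, I, L))
```

Because P_cr is linear in E, the load bounds scale exactly with the modulus bounds; a fuzzy analysis refines this by attaching membership degrees to intermediate values.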

  3. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    International Nuclear Information System (INIS)

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. Particularly, human toxicity potential (13 kg C6H6 eq.) calculated by the previous model was much lower (1965 kg C6H6 eq.) than what was calculated by the developed model. The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings
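At its core, an embodied-impact assessment of this kind multiplies each inventory quantity by a characterization factor per impact category and sums the results over materials and phases. A minimal sketch; the item names, quantities and factors below are illustrative placeholders, not the study's inventory data or Korean characterization factors.

```python
def embodied_impact(quantities, factors):
    """Aggregate embodied impacts per category.

    quantities: {item: amount}                    e.g. bill of quantities
    factors:    {item: {category: impact/unit}}   characterization factors
    Returns {category: total impact}.
    """
    totals = {}
    for item, qty in quantities.items():
        for category, f in factors[item].items():
            totals[category] = totals.get(category, 0.0) + qty * f
    return totals


# hypothetical inventory for a small building
quantities = {"concrete_m3": 100.0, "steel_kg": 5000.0}
factors = {
    "concrete_m3": {"GWP_kgCO2eq": 350.0, "AP_kgSO2eq": 0.6},
    "steel_kg":    {"GWP_kgCO2eq": 1.8,   "AP_kgSO2eq": 0.005},
}
totals = embodied_impact(quantities, factors)
```

The "hybrid" part of the model extends this process-based sum with input-output data so that indirect (upstream) impacts missing from the process inventory are also captured.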

  4. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Minho, E-mail: minmin40@hanmail.net [Asset Management Division, Mate Plus Co., Ltd., 9th Fl., Financial News Bldg. 24-5 Yeouido-dong, Yeongdeungpo-gu, Seoul, 150-877 (Korea, Republic of); Hong, Taehoon, E-mail: hong7@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of); Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of)

    2015-01-15

The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. Particularly, human toxicity potential (13 kg C6H6 eq.) calculated by the previous model was much lower (1965 kg C6H6 eq.) than what was calculated by the developed model. The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.

  5. TENSOR-BASED QUALITY PREDICTION FOR BUILDING MODEL RECONSTRUCTION FROM LIDAR DATA AND TOPOGRAPHIC MAP

    OpenAIRE

    Lin, B C; You, R. J.

    2012-01-01

    A quality prediction method is proposed to evaluate the quality of the automatic reconstruction of building models. In this study, LiDAR data and topographic maps are integrated for building model reconstruction. Hence, data registration is a critical step for data fusion. To improve the efficiency of the data fusion, a robust least squares method is applied to register boundary points extracted from LiDAR data and building outlines obtained from topographic maps. After registration,...

  6. The Model Transformation-based Tool Building Techniques and Their Implementation

    OpenAIRE

    Edgars Rencis

    2012-01-01

In the Doctoral Thesis „The Model Transformation-based Tool Building Techniques and Their Implementation”, the area of model transformation- and metamodel-based domain-specific tool building is examined, with the main attention paid to the problem of making the development and usage of such tools easier. The tool building platform GRAF is examined since it has been partly developed by the author. This platform is supplemented with several services alleviating both the development...

  7. A mass transfer model for predicting emission of the volatile organic compounds in wet building materials

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tao; JIA Li

    2008-01-01

    A new mass transfer model is developed to predict the emission of volatile organic compounds (VOCs) from fresh wet building materials. The dry section of wet materials during the process of VOC emission from wet building materials is considered in this new model, differing from the mass transfer-based models in the literature. The mechanism by which the saturated vapor pressure on the surface of wet building materials affects the process of VOC emission is discussed. The concentration of total volatile organic compounds (TVOC) in the building materials gradually decreases as the emission of VOCs begins, and the vapor pressure of VOCs on the surface of wet building materials decreases in the case of newly wet building materials. To keep the partial pressure of VOCs on the surface of wet building materials at the saturated vapor pressure, the gas-wet layer interface is lowered, and a dry layer of non-volatile gases forms in the material. Compared with the results obtained by the VB model, the CFD model and the experiment data, the results obtained by the present model agree well with those of the CFD model and the experiment data. The present model is more accurate in predicting the emission of VOCs from wet building materials than the VB model.
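    As a loose illustration of the mass-transfer viewpoint (not the paper's VB or CFD model, and without the moving dry-layer front), a one-dimensional diffusion sketch with an emitting surface can be written as:

```python
import numpy as np

# Illustrative 1-D diffusion of VOC concentration in a material slab using
# explicit finite differences. All parameters are hypothetical, and the
# moving dry-layer front of the paper's model is not reproduced here: the
# surface node simply loses VOC to the room air.
def diffuse(c, D, dx, dt, steps):
    """Advance the concentration profile c by `steps` explicit time steps."""
    c = c.copy()
    for _ in range(steps):
        lap = np.zeros_like(c)
        lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx ** 2
        c += dt * D * lap
        c[-1] = 0.0      # emitting surface: VOC swept away by room air
        c[0] = c[1]      # impermeable substrate: zero-flux boundary
    return c

c0 = np.full(50, 1.0)    # uniform initial concentration (normalised)
# Explicit-scheme stability requires D*dt/dx^2 <= 0.5; here it is 0.1.
c = diffuse(c0, D=1e-9, dx=1e-4, dt=1.0, steps=500)
```

    The emitted mass is the difference between the initial and current totals; a real model would couple the surface boundary condition to room-air concentration and vapor pressure.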

  8. The effect of simplifying the building description on the numerical modeling of its thermal performance

    Energy Technology Data Exchange (ETDEWEB)

    Stetiu, C.

    1993-07-01

    A thermal building simulation program is a numerical model that calculates the response of the building envelope to weather and human activity, simulates dynamic heating and cooling loads and heating and cooling distribution systems, and models building equipment operation. The scope of the research is to supply the users of such programs with information about the dangers and benefits of simplifying the input to their models. The Introduction describes the advantages of modeling the heat transfer mechanisms in a building. The programs that perform this type of modeling have, however, limitations. The user is therefore often put in the situation of simplifying the floor plans of the building under study without being able to check the effects that this approximation introduces in the results of the simulation. Chapter 1 is a description of methods. It also introduces the floor plans for the office building under study and the "reasonable" floor plan simplifications. Chapter 2 presents DOE-2, the thermal building simulation program used in the sensitivity study. The evaluation of the accuracy of the DOE-2 program itself is also presented. Chapter 3 contains the sensitivity study. The complicated nature of the process of interpreting the temperature profile inside a space leads to the necessity of defining different building modes. The study compares the results from the model of the detailed building description with the results from the models of the same building having simplified floor plans. The conclusion is reached that a study of the effects of simplifying the floor plans of a building is important mainly for defining the cases in which this approximation is acceptable. Different results are obtained for different air conditioning/load regimes of the building. 9 refs., 24 figs.

  9. Semantic Bim and GIS Modelling for Energy-Efficient Buildings Integrated in a Healthcare District

    Science.gov (United States)

    Sebastian, R.; Böhms, H. M.; Bonsma, P.; van den Helm, P. W.

    2013-09-01

    The subject of energy-efficient buildings (EeB) is among the most urgent research priorities in the European Union (EU). In order to achieve the broadest impact, innovative approaches to EeB need to resolve challenges at the neighbourhood level, instead of only focusing on improvements of individual buildings. For this purpose, the design phase of new building projects as well as building retrofitting projects is the crucial moment for integrating multi-scale EeB solutions. In the EeB design process, clients, architects, technical designers, contractors, and end-users altogether need new methods and tools for designing energy-efficient buildings integrated in their neighbourhoods. Since the scope of designing covers multiple dimensions, the new design methodology relies on the inter-operability between Building Information Modelling (BIM) and Geospatial Information Systems (GIS). Design for EeB optimisation needs to pay attention to the inter-connections between the architectural systems and the MEP/HVAC systems, as well as to the relation of Product Lifecycle Modelling (PLM), Building Management Systems (BMS), BIM and GIS. This paper is descriptive and presents an actual EU FP7 large-scale collaborative research project titled STREAMER. The research on the inter-operability between BIM and GIS for the holistic design of energy-efficient buildings at the neighbourhood scale is supported by real case studies of mixed-use healthcare districts. The new design methodology encompasses all scales and all lifecycle phases of the built environment, as well as the whole lifecycle of the information models, which comprises: the Building Information Model (BIM), Building Assembly Model (BAM), Building Energy Model (BEM), and Building Operation Optimisation Model (BOOM).

  10. Mathematical and Statistical Models and Methods for Describing the Thermal Characteristics of Buildings

    DEFF Research Database (Denmark)

    Madsen, Henrik; Bacher, Peder; Andersen, Philip Hvidthøft Delff

    2010-01-01

    temperature are needed or beneficial. The suite of models described consists of nonlinear stochastic models, linear stochas- tic models, transfer function models, frequency response function models, impulse response models and regression models. The final choice of model depends on the purpose of the modelling...... methods for time series modelling or system identification. Applying these methods the following can be achieved: Characterization of the energy performance of buildings (including energy labelling), identification of how to improve the thermal performance of the building, and improved control of the energy...

  11. Modeling zero energy building: technical and economical optimization

    OpenAIRE

    Ferrara, Maria; Virgone, Joseph; Fabrizio, Enrico; Kuznik, Frédéric; Filippi, Marco

    2013-01-01

    International audience This study was born in the context of new challenges imposed by the recast of Energy Performance of Buildings. The aim of this work is to provide a useful method to deal with a huge number of simulations corresponding to a large number of single-family house configurations in order to optimize a constructive solution from both technical and economical point of view. The method combines the use of TRNSYS, building energy simulation program, with GenOpt, Generic Optimi...

  12. Modelling Zero Energy Buildings: Parametric study for the technical optimization

    OpenAIRE

    Ferrara, Maria; Fabrizio, Enrico; Filippi, Marco

    2014-01-01

    This study was born in the context of new challenges imposed by the recast of the EU Energy Performance of Buildings Directive. The aim of this work is to develop strategies to identify and investigate the relationship between decisional variables within a nZEB design concept, providing a useful method to deal with a huge number of simulations corresponding to a large number of building configurations in order to find one optimized constructive solution. The method combines the use of the TRN...

  13. Risk Classification Model for Design and Build Projects

    OpenAIRE

    O. E. Ogunsanmi; O. A. Salako; O. M. Ajayi

    2011-01-01

    The purpose of this paper is to investigate if the various risk sources in Design and Build projects can be classified into three risk groups of cost, time and quality using the discriminant analysis technique. Literature search was undertaken to review issues of risk sources, classification of the identified risks into a risk structure, management of risks and effects of risks all on Design and Build projects as well as concepts of discriminant analysis as a statistical technique. This liter...

  14. Empirical Validation of Building Simulation Software : Modeling of Double Facades

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    The work described in this report is the result of a collaborative effort of members of the International Energy Agency (IEA), Task 34/43: Testing and validation of building energy simulation tools experts group.

  15. A generic component model for building systems software

    OpenAIRE

    Coulson, Geoffrey; Blair, Gordon; Grace, Paul; Taiani, Francois; Joolia, Ackbar; Lee, Kevin; Ueyama, Jo; Sivaharan, Thirunavukkarasu

    2008-01-01

    Component-based software structuring principles are now commonplace at the application level; but componentization is far less established when it comes to building low-level systems software. Although there have been pioneering efforts in applying componentization to systems-building, these efforts have tended to target specific application domains (e.g., embedded systems, operating systems, communications systems, programmable networking environments, or middleware platforms). They also ten...

  16. Integrating Smartphone Images and Airborne LIDAR Data for Complete Urban Building Modelling

    Science.gov (United States)

    Zhang, Shenman; Shan, Jie; Zhang, Zhichao; Yan, Jixing; Hou, Yaolin

    2016-06-01

    A complete building model reconstruction needs data collected from both air and ground. The former often has sparse coverage on building façades, while the latter usually is unable to observe the building rooftops. Attempting to solve the missing data issues in building reconstruction from single data source, we describe an approach for complete building reconstruction that integrates airborne LiDAR data and ground smartphone imagery. First, by taking advantages of GPS and digital compass information embedded in the image metadata of smartphones, we are able to find airborne LiDAR point clouds for the corresponding buildings in the images. In the next step, Structure-from-Motion and dense multi-view stereo algorithms are applied to generate building point cloud from multiple ground images. The third step extracts building outlines respectively from the LiDAR point cloud and the ground image point cloud. An automated correspondence between these two sets of building outlines allows us to achieve a precise registration and combination of the two point clouds, which ultimately results in a complete and full resolution building model. The developed approach overcomes the problem of sparse points on building façades in airborne LiDAR and the deficiency of rooftops in ground images such that the merits of both datasets are utilized.
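    The outline-registration step described above can be sketched with the standard Kabsch/Procrustes solution for a rigid 2-D transform; the square outline, the known rotation and the assumption of already-established point correspondences are all contrived for illustration, and the paper's automated matching step is not shown.

```python
import numpy as np

# Rigid 2-D registration of two corresponding building outlines via the
# Kabsch/Procrustes solution (least-squares rotation + translation).
def rigid_register(src, dst):
    """Return R (2x2) and t (2,) such that src @ R.T + t ~ dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)       # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # unit-square outline
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([2.0, 3.0])    # rotated and shifted copy
R, t = rigid_register(src, dst)
```

    With noiseless correspondences the true rotation and translation are recovered exactly; with real LiDAR and image outlines a robust variant (e.g. iterating with outlier down-weighting) would be needed.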

  17. Flood vulnerability assessment of residential buildings by explicit damage process modelling

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    2015-01-01

    masonry building. Results are presented in terms of a parameter study for several building parameters and hazard characteristics, as well as, in terms of a comparison with damage data and literature vulnerability models. The parameter study indicates that hazard characteristics and building...... characteristics impact damage ratios as expected. Furthermore, the results are comparable to vulnerability models in literature. Strengths and shortcomings of the model are discussed. The modelling approach is considered as a step towards the establishment of vulnerability models that can serve as a basis...

  18. Development of surrogate models using artificial neural network for building shell energy labelling

    International Nuclear Information System (INIS)

    Surrogate models are an important part of building energy labelling programs, but these models still present low accuracy, particularly in cooling-dominated climates. The objective of this study was to evaluate the feasibility of using an artificial neural network (ANN) to improve the accuracy of surrogate models for labelling purposes. An ANN was applied to model the building stock of a city in Brazil, based on the results of extensive simulations using the high-resolution building energy simulation program EnergyPlus. Sensitivity and uncertainty analyses were carried out to evaluate the behaviour of the ANN model, and the variations in the best and worst performance for several typologies were analysed in relation to variations in the input parameters and building characteristics. The results obtained indicate that an ANN can represent the interaction between input and output data for a vast and diverse building stock. Sensitivity analysis showed that no single input parameter can be identified as the main factor responsible for the building energy performance. The uncertainty associated with several parameters plays a major role in assessing building energy performance, together with the facade area and the shell-to-floor ratio. The results of this study may have a profound impact as ANNs could be applied in the future to define regulations in many countries, with positive effects on optimizing energy consumption. - Highlights: • We model several typologies which have variation in input parameters. • We evaluate the accuracy of surrogate models for labelling purposes. • ANN is applied to model the building stock. • Uncertainty in building plays a major role in the building energy performance. • Results show that ANN could help to develop building energy labelling systems.
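    A minimal illustration of the surrogate idea: a tiny one-hidden-layer network trained by plain gradient descent on a stand-in analytic "simulator". The two inputs (think normalised facade area and shell-to-floor ratio), the target function and the network size are all invented for the sketch; the study itself trained on EnergyPlus results.

```python
import numpy as np

# One-hidden-layer network fitted by full-batch gradient descent as a
# surrogate for an "expensive" simulator (here a cheap analytic stand-in).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))        # sampled design parameters
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2          # stand-in "simulated" energy use

W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    err = (H @ W2 + b2).ravel() - y             # prediction error
    gW2 = H.T @ err[:, None] / len(X)
    gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1.0 - H ** 2) # backprop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y) ** 2)
```

    Once trained, evaluating the network is effectively free, which is what makes ANN surrogates attractive for stock-wide labelling where running the full simulator per building is impractical.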

  19. MATCHING LSI FOR SCALABLE INFORMATION RETRIEVAL

    Directory of Open Access Journals (Sweden)

    Rajagopal Palsonkennedy

    2012-01-01

    Full Text Available Latent Semantic Indexing (LSI) is one of the popular techniques in the information retrieval field. Unlike traditional information retrieval techniques, LSI is not based on simple keyword matching; it uses statistical and algebraic computations. Based on Singular Value Decomposition (SVD), the higher-dimensional matrix is converted to a lower-dimensional approximate matrix, from which noise can be filtered out. The issues of synonymy and polysemy that affect traditional techniques can also be overcome by examining the terms associated with the documents. It is notable, however, that LSI suffers from a scalability issue due to the computational complexity of SVD. This study presents a distributed LSI algorithm, MR-LSI, which solves the scalability issue using the Hadoop framework, based on the MapReduce distributed computing model. It also addresses the overhead of the involved clustering step by using the k-means algorithm. The evaluations indicate that MR-LSI achieves noteworthy improvements over the other schemes in processing large collections of documents. One significant characteristic of Hadoop is that it supports various computing environments, so the issue of unbalanced load among nodes is highlighted. Hence, a load balancing algorithm based on a genetic algorithm for balancing load in a static environment is proposed. The results show that it can improve the performance of a cluster under different load levels.
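    The core of LSI, independent of the MapReduce distribution, is a truncated SVD of the term-document matrix followed by similarity ranking in the latent space; the toy matrix, term labels and query-folding choice below are illustrative assumptions.

```python
import numpy as np

# Toy LSI sketch: truncate the SVD of a term-document matrix, then rank
# documents by cosine similarity to a query folded into the latent space.
# The MapReduce distribution described in the abstract is omitted.
A = np.array([[1, 0, 1, 0],   # term "ship"   across 4 documents
              [1, 1, 0, 0],   # term "boat"
              [0, 1, 0, 1],   # term "ocean"
              [0, 0, 1, 1]],  # term "voyage"
             dtype=float)
k = 2                                      # latent dimensionality
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k]
docs = (np.diag(sk) @ Vtk).T               # documents in latent space

def rank_documents(q_terms):
    q = Uk.T @ q_terms                     # fold query into latent space
    sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q))
    return np.argsort(-sims)               # best-matching documents first

ranking = rank_documents(np.array([1.0, 0.0, 0.0, 0.0]))  # query: "ship"
```

    The truncation to k dimensions is what smooths over synonymy: documents sharing no literal terms can still land close together in the latent space.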

  20. Scalability of the plasma physics code GEM

    CERN Document Server

    Scott, Bruce D; Hoenen, Olivier; Karmakar, Anumap; Fazendeiro, Luis

    2013-01-01

    We discuss a detailed weak scaling analysis of GEM, a 3D MPI-parallelised gyrofluid code used in theoretical plasma physics at the Max Planck Institute of Plasma Physics, IPP at Garching b. München, Germany. Within a PRACE Preparatory Access Project various versions of the code have been analysed on the HPC systems SuperMUC at LRZ and JUQUEEN at Jülich Supercomputing Centre (JSC) to improve the parallel scalability of the application. The diagnostic tool Scalasca has been used to filter out suboptimal routines. The code uses the electromagnetic gyrofluid model, which is a superset of magnetohydrodynamic and drift-Alfvén microturbulence and also includes several relevant kinetic processes. GEM can be used with different geometries depending on the targeted use case, and has been proven to show good scalability when the computational domain is distributed amongst two dimensions. Such a distribution allows grids with sufficient size to describe small scale tokamak devices. In order to enable simulation of v...

  1. Experimental and Numerical Analysis of Wind Driven Natural Ventilation in a Building Scale Model

    DEFF Research Database (Denmark)

    Heiselberg, Per Kvols; True, Jan Per Jensen; Sandberg, Mats;

    2004-01-01

    Airflow through openings in a cross ventilated building scale model was investigated in a wind tunnel and by numerical predictions. Predictions for a wind direction perpendicular to the building showed an airflow pattern consisting of streamlines entering the room, that originated from approximat...

  2. Review of Development Survey of Phase Change Material Models in Building Applications

    Directory of Open Access Journals (Sweden)

    Hussein J. Akeiber

    2014-01-01

    Full Text Available The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data.

  3. A New Model for Building Digital Science Education Collections

    Science.gov (United States)

    Niepold, F.; McCaffrey, M.; Morrill, C.; Ganse, J.; Weston, T.

    2005-12-01

    The Polar Regions play an integral role in how our Earth system operates. However, the Polar Regions are marginally studied in the K-12 classroom in the United States. The International Polar Year's (IPY) coordinated campaign of polar observations, research, and analysis, which will be multidisciplinary in scope and international in participation, offers a powerful opportunity for the K-12 classroom. The IPY's scientific objective to better understand the key roles of the Polar Regions in global processes will give students a window into the poles and this unique region's role in the Earth system. IPY will produce careful, useful scientific information that will advance our understanding of the Polar Regions and their connections to the rest of the globe. The IPY is an opportunity to inspire the next generation of very young Earth system scientists. The IPY's draft education & outreach position paper asks a key question that must guide future educational projects: "Why are the polar regions and polar research important to all people on earth?" In an effort to coordinate educational activities and collaborate with international projects, United States national agencies, and other educational initiatives, the purpose of this session is to explore potential partnerships, while primarily recommending a model for educational product development and review. During such a large international science endeavor, numerous educational activities and opportunities are developed, but these educational programs can suffer from too many unconnected options being available to teachers and students. Additionally, activities are often incompatible with each other, making classroom implementation unnecessarily complex and prohibitively time consuming for teachers. An educational activity collection technique newly developed for DLESE offers an effective model for IPY product gap analysis and development.
The Climate Change Collection developed as a pilot project for the Digital Library

  4. Comparison of sensorless dimming control based on building modeling and solar power generation

    International Nuclear Information System (INIS)

    Artificial lighting in office buildings accounts for about 30% of the total building energy consumption. Lighting energy is important to reduce building energy consumption since artificial lighting typically has a relatively large energy conversion factor. Therefore, previous studies have proposed dimming control using daylight. When dimming control is applied, a method based on building modeling does not need illuminance sensors. Thus, it can be applied to existing buildings that do not have illuminance sensors. However, this method does not accurately reflect real-time weather conditions. On the other hand, solar power generation from a PV (photovoltaic) panel reflects real-time weather conditions. The PV panel as the sensor improves the accuracy of dimming control by reflecting disturbances. Therefore, we compared and analyzed two types of sensorless dimming control: those based on building modeling and those based on solar power generation using PV panels. In terms of energy savings, we found that dimming control based on building modeling is more effective than that based on solar power generation by about 6%. However, dimming control based on solar power generation minimizes the inconvenience to occupants and can also react to changes in solar radiation entering the building caused by a dirty window. - Highlights: • We conducted sensorless dimming control based on solar power generation. • Dimming controls using building modeling and solar power generation were compared. • The real-time weather conditions can be considered by using solar power generation. • Dimming control using solar power generation minimizes inconvenience to occupants.
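    A hypothetical sketch of the PV-as-sensor idea: treat normalised PV output as a proxy for available daylight and dim artificial lighting so it tops indoor illuminance up to a setpoint. All names and numbers are invented; the paper's actual control law is not specified here.

```python
# PV-as-sensor dimming sketch. The linear PV-to-daylight mapping and all
# parameter values are assumptions made for illustration only.
def dimming_level(pv_output_w, pv_rated_w, daylight_lux_at_rated, setpoint_lux):
    """Fraction of full artificial lighting required (0 = off, 1 = full)."""
    daylight = daylight_lux_at_rated * pv_output_w / pv_rated_w
    deficit = max(setpoint_lux - daylight, 0.0)
    return min(deficit / setpoint_lux, 1.0)

level = dimming_level(400.0, 1000.0, 800.0, 500.0)  # a partly cloudy moment
```

    Because PV output drops with cloud cover or soiling, the dimming level rises automatically under the same conditions that reduce daylight indoors, which is the disturbance-tracking advantage the abstract describes.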

  5. Multi-scale modelling to improve climate data for building energy models

    OpenAIRE

    Mauree, Dasaraden; Kämpf, Jérôme Henri; Scartezzini, Jean-Louis

    2015-01-01

    The recent AR5 report from the Intergovernmental Panel on Climate Change has again stressed the need for mitigation and adaptation measures to tackle issues related to climate change. Tackling future urban planning and energy efficiency in the building sector is crucial, as buildings account for almost 40% of energy use in developed countries. A one-dimensional canopy interface module (CIM) was recently developed to improve the surface representation in meteorological models and to enhance boun...

  6. Use of MCAM in creating 3D neutronics model for ITER building

    International Nuclear Information System (INIS)

    Highlights: ► We created a 3D neutronics model of the ITER building. ► The model was produced from the engineering CAD model by MCAM software. ► The neutron flux map in the ITER building was calculated. - Abstract: The three dimensional (3D) neutronics reference model of International Thermonuclear Experimental Reactor (ITER) only defines the tokamak machine and extends to the bio-shield. In order to meet further 3D neutronics analysis needs, it is necessary to create a 3D reference model of the ITER building. Monte Carlo Automatic Modeling Program for Radiation Transport Simulation (MCAM) was developed as a computer aided design (CAD) based bi-directional interface program between general CAD systems and Monte Carlo radiation transport simulation codes. With the help of MCAM version 4.8, the 3D neutronics model of ITER building was created based on the engineering CAD model. The calculation of the neutron flux map in ITER building during operation showed the correctness and usability of the model. This model is the first detailed ITER building 3D neutronics model and it will be made available to all international organization collaborators as a reference model.

  7. On the Impact of Building Attenuation Models in VANET Simulations of Urban Scenarios

    Directory of Open Access Journals (Sweden)

    Luis Urquiza-Aguiar

    2015-01-01

    Full Text Available Buildings are important elements of cities for VANETs, since these obstacles may attenuate communications between vehicles. Consequently, the impact of buildings has to be considered as part of the attenuation model in VANET simulations of urban scenarios. However, the more elaborate the model, the more information needs to be processed during the simulation, which implies longer processing times. This complexity in simulations is not always worth it, because simplified channel models occasionally offer very accurate results. We compare three approaches to modelling the impact of buildings in the channel model of simulated VANETs in two urban scenarios. The simulation results for our evaluation scenarios of a traffic-efficiency application indicate that modeling the influence of buildings in urban areas as the total absence of communication between vehicles gives similar results to modeling such influence in a more realistic fashion, and could be considered a conservative bound on the performance metrics.

  8. Regulatory odour model development: Survey of modelling tools and datasets with focus on building effects

    DEFF Research Database (Denmark)

    Olesen, H. R.; Løfstrøm, P.; Berkowicz, R.;

    relation to odour problems due to animal farming. However, the model needs certain improvements and validation in order to be fully suited for that purpose. The report represents a survey of existing literature, models and data sets. It includes a brief overview of the state-of-the-art of atmospheric...... dispersion models for estimating local concentration levels in general. However, the report focuses on some particular issues, which are relevant for subsequent work on odour due to animal production. An issue of primary concern is the effect that buildings (stables) have on flow and dispersion. The handling......A project within the framework of a larger research programme, Action Plan for the Aquatic Environment III (VMP III) aims towards improving an atmospheric dispersion model (OML). The OML model is used for regulatory applications in Denmark, and it is the candidate model to be used also in future in...

  9. Modelling the heat dynamics of a building using stochastic differential equations

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae; Madsen, Henrik; Hansen, Lars Henrik

    2000-01-01

    This paper describes the continuous time modelling of the heat dynamics of a building. The considered building is a residential like test house divided into two test rooms with a water based central heating. Each test room is divided into thermal zones in order to describe both short and long term...... variations. Besides modelling the heat transfer between thermal zones, attention is put on modelling the heat input from radiators and solar radiation. The applied modelling procedure is based on collected building performance data and statistical methods. The statistical methods are used in parameter...
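    A single-zone stochastic RC model of the kind described can be simulated with the Euler-Maruyama scheme; the parameter values below are illustrative, not estimates from the test house, and the real grey-box models would include multiple zones and radiator/solar inputs identified from data.

```python
import numpy as np

# Euler-Maruyama simulation of a one-zone stochastic RC model,
#   dT = ((T_a - T)/(R*C) + Phi/C) dt + sigma dW,
# where T is room temperature, T_a ambient temperature, Phi radiator power.
rng = np.random.default_rng(1)
R, C, sigma = 0.05, 2000.0, 0.05   # K/W, J/K, K/sqrt(s): invented values
dt, n = 10.0, 360                  # 10 s steps over one hour
T_a, Phi = 0.0, 500.0              # ambient temperature (C), radiator power (W)
T = np.empty(n + 1)
T[0] = 20.0
for i in range(n):
    drift = (T_a - T[i]) / (R * C) + Phi / C
    T[i + 1] = T[i] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
# Setting the drift to zero gives the steady state T_a + R*Phi = 25 C,
# around which the simulated trajectory fluctuates.
```

    In the grey-box setting the point is the reverse direction: given measured T, estimate R, C and sigma by maximum likelihood, which is what tools built on such SDE formulations do.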

  10. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  11. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach

    International Nuclear Information System (INIS)

    Highlights: • This paper presents a new modeling method to forecast energy demands. • The model is based on physical–statistical approach to improving forecast accuracy. • A new method is proposed to address the heterogeneity challenge. • Comparison with measurements shows accurate forecasts of the model. • The first physical–statistical/heterogeneous building energy modeling approach is proposed and validated. - Abstract: Energy consumption forecasting is a critical and necessary input to planning and controlling energy usage in the building sector which accounts for 40% of the world’s energy use and the world’s greatest fraction of greenhouse gas emissions. However, due to the diversity and complexity of buildings as well as the random nature of weather conditions, energy consumption and loads are stochastic and difficult to predict. This paper presents a new methodology for energy demand forecasting that addresses the heterogeneity challenges in energy modeling of buildings. The new method is based on a physical–statistical approach designed to account for building heterogeneity to improve forecast accuracy. The physical model provides a theoretical input to characterize the underlying physical mechanism of energy flows. Then stochastic parameters are introduced into the physical model and the statistical time series model is formulated to reflect model uncertainties and individual heterogeneity in buildings. A new method of model generalization based on a convex hull technique is further derived to parameterize the individual-level model parameters for consistent model coefficients while maintaining satisfactory modeling accuracy for heterogeneous buildings. The proposed method and its validation are presented in detail for four different sports buildings with field measurements. The results show that the proposed methodology and model can provide a considerable improvement in forecasting accuracy
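    The physical-statistical split can be sketched as a physical baseline plus a statistical model of its residuals; below, a steady-state UA baseline and an AR(1) residual model, with synthetic data and invented coefficients rather than the paper's sports-building measurements.

```python
import numpy as np

# Physical-statistical sketch: heat load = UA * (T_set - T_out) baseline,
# plus an AR(1) process standing in for occupancy/weather model error.
rng = np.random.default_rng(2)
T_out = 5 + 5 * np.sin(np.linspace(0, 4 * np.pi, 200))   # synthetic outdoor temp
T_set, UA = 21.0, 300.0                                  # invented setpoint, W/K
physical = UA * (T_set - T_out)                          # physical baseline (W)
resid_true = np.zeros(200)
for t in range(1, 200):                                  # synthetic AR(1) residual
    resid_true[t] = 0.8 * resid_true[t - 1] + rng.normal(0, 50)
load = physical + resid_true                             # "measured" load

resid = load - physical                                  # observed minus physical
phi = (resid[:-1] @ resid[1:]) / (resid[:-1] @ resid[:-1])  # AR(1) least-squares fit
# One-step-ahead forecast; reusing the last baseline value stands in for
# the next step's physical forecast in this sketch.
forecast_next = physical[-1] + phi * resid[-1]
```

    The statistical layer only has to explain what the physics misses, which is why the hybrid typically forecasts better than either component alone.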

  12. Sizing Thermally Activated Building Systems (TABS): A Brief Literature Review and Model Evaluation

    OpenAIRE

    Basu, Chandrayee; Schiavon, Stefano; Bauman, Fred

    2012-01-01

    While Thermally Activated Building Systems (TABS) is a recognized low-energy HVAC candidate system for net-zero-energy buildings, sizing of these systems is complex due to their slow thermal response. In this paper, seven design and control models have been reviewed and characterized systematically with an aim to investigate their applicability in various design scenarios and at different design stages. The design scenarios include variable space heat gain, different building thermal mass and...

  13. Scalable Density-Based Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan;

    2011-01-01

    For knowledge discovery in high dimensional databases, subspace clustering detects clusters in arbitrary subspace projections. Scalability is a crucial issue, as the number of possible projections is exponential in the number of dimensions. We propose a scalable density-based subspace clustering...... synthetic databases show that steering is efficient and scalable, with high quality results. For future work, our steering paradigm for density-based subspace clustering opens research potential for speeding up other subspace clustering approaches as well....

  14. 3D building modeling,organization and application in digital city system

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The real world is a three-dimensional (3D) space, requiring that 3D geospatial information applications be developed in alignment with the observer’s visual and perceptive habits. In particular, 3D building model data are required in a wide range of areas such as urban planning, environmental protection, real estate management and emergency response. At the same time, the development of Web service technologies allows the possibility of widely distributed 3D geospatial data on the web. A 3D city building model with its related information is an important part of the construction of a digital city system, and has become a staple resource on the web nowadays. In view of the hierarchical representation of a 3D building model, an abstraction of a 3D building model based on structure details is studied, and a novel representation approach named the 3D transparent building hierarchical model is presented in this paper. This approach fully uses both the existing 3D modeling technologies and CAD construction mapping data. Through the spatial relationship description, structural components inside a building can be represented and integrated as hierarchical models in a unified 3D space. In addition, based on the characteristics of 3D building model data, a service-oriented architecture and Web service technologies for 3D city building models are discussed. The aim of the approach is that 3D city building models can be used as a kind of data resource service on the web, and can also exist independently in various different web applications.

  15. Using Models to Provide Predicted Ranges for Building-Human Interfaces: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Scheib, J.; Pless, S.; Schott, M.

    2013-09-01

    Most building energy consumption dashboards provide only a snapshot of building performance, whereas some provide more detailed historic data with which to compare current usage. This paper will discuss the Building Agent(tm) platform, which has been developed and deployed in a campus setting at the National Renewable Energy Laboratory as part of an effort to maintain the aggressive energy performance achieved in newly constructed office buildings and laboratories. The Building Agent(tm) provides aggregated and coherent access to building data, including electric energy, thermal energy, temperatures, humidity, lighting levels, and occupant feedback, which are displayed in various manners for visitors, building occupants, facility managers, and researchers. This paper focuses on the development of visualizations for facility managers, or an energy performance assurance role, where metered data are used to generate models that provide live predicted ranges of building performance by end use. These predicted ranges provide simple, visual context for displayed performance data without requiring users to also assess historical information or trends. Several energy modelling techniques were explored, including static lookup-based performance targets, reduced-order models derived from historical data using main effect variables such as solar radiance for lighting performance, and integrated energy models using a whole-building energy simulation program.

  16. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    , Engineering, Construction, and Facility Management (AEC/ FM) communication, and (b) BPS as a platform for early-stage building performance prediction. The second is to develop (a) relevant AEC/FM communication support instruments, and (b) standardized BIM and BPS execution guidelines and information exchange...... and standardized exchange formats, and in-depth preparation and training of AEC/FM project participants are given a high priority. It is essential that this preparation and training are supported by common BIM standards and execution guidelines. Thesis studies also showed that BPS approaches have the potential...... is created by selecting the specific IDM Packages required for the specific AEC/FM project. In this approach, the IDM Project Plan can help communicate the overall scope of the AEC/FM project, processes to be carried out, organizational interactions, and required information exchanges. In this thesis...

  17. Procedure for identifying models for the heat dynamics of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik

    This report describes a new method for obtaining detailed information about the heat dynamics of a building using frequent readings of the heat consumption. Such a procedure is considered to be of utmost importance as a key procedure for using readings from smart meters, which is expected...

  18. Modeling Best Practice through Online Learning: Building Relationships

    Science.gov (United States)

    Cerniglia, Ellen G.

    2011-01-01

    Students may fear that they will feel unsupported and isolated when engaged in online learning. They don't know how they will be able to build relationships with their teacher and classmates solely based on written words, without facial expressions, tone of voice, and other nonverbal communication cues. Traditionally, online learning required…

  19. Modelling surface pressure fluctuation on medium-rise buildings

    NARCIS (Netherlands)

    Snæbjörnsson, J.T.; Geurts, C.P.W.

    2006-01-01

    This paper describes the results of two experiments into the fluctuating characteristics of wind-induced pressures on buildings in a built-up environment. The experiments have been carried out independently in Iceland and The Netherlands and can be considered to represent two separate cases of buildi...

  20. Highly Scalable Matching Pursuit Signal Decomposition Algorithm

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research, we propose a variant of the classical Matching Pursuit Decomposition (MPD) algorithm with significantly improved scalability and computational...

  1. Integrated model for characterization of spatiotemporal building energy consumption patterns in neighborhoods and city districts

    International Nuclear Information System (INIS)

    Highlights: • A model to describe spatiotemporal building energy demand patterns was developed. • The model integrates existing methods in urban and energy planning domains. • The model is useful to analyze energy efficiency strategies in neighborhoods. • Applicability in educational, urban and energy planning practices was found. - Abstract: We introduce an integrated model for characterization of spatiotemporal building energy consumption patterns in neighborhoods and city districts. The model addresses the need for a comprehensive method to identify present and potential states of building energy consumption in the context of urban transformation. The focus lies on determining the spatiotemporal variability of energy services in both standing and future buildings in the residential, commercial and industrial sectors. This detailed characterization facilitates the assessment of potential energy efficiency measures at the neighborhood and city district scales. In a novel approach we integrated existing methods in urban and energy planning domains such as spatial analysis, dynamic building energy modeling and energy mapping to provide a comprehensive, multi-scale and multi-dimensional model of analysis. The model is part of a geographic information system (GIS), which serves as a platform for the allocation and future dissemination of spatiotemporal data. The model is validated against measured data and a peer model for a city district in Switzerland. In this context, we present practical applications in the analysis of energy efficiency measures in buildings and urban zoning. We furthermore discuss potential applications in educational, urban and energy planning practices

  2. Agent-Based Evacuation Model Incorporating Fire Scene and Building Geometry

    Institute of Scientific and Technical Information of China (English)

    TANG Fangqin; REN Aizhu

    2008-01-01

    A comprehensive description of the key factors affecting evacuations at fire scenes is necessary for accurate simulations. An agent-based simulation model which incorporates the fire scene and the building geometry is developed using a fire dynamics simulator (FDS) based on computational fluid dynamics and geographic information system (GIS) data to model the occupant response. The building entities are generated for FDS simulation while the spatial analysis on GIS data represents the occupants' knowledge of the building. The influence of the fire is based on a hazard assessment of the combustion products. The agent behavior and decisions are affected by environmental features and the fire field. A case study demonstrates that the evacuation model effectively simulates the coexistence and interactions of the major factors, including occupants, building geometry, and fire disaster, during the evacuation. The results can be used for the assessment of building designs regarding fire safety.

  3. Modeling and optimization of energy generation and storage systems for thermal conditioning of buildings targeting conceptual building design

    Energy Technology Data Exchange (ETDEWEB)

    Grahovac, Milica

    2012-11-29

    Thermal conditioning systems are responsible for almost half of the energy consumption of commercial buildings. In many European countries and in the USA, buildings account for around 40% of primary energy consumption, and it is therefore vital to explore further ways to reduce HVAC (Heating, Ventilation and Air Conditioning) system energy consumption. This thesis investigates the relationship between the energy generation and storage systems for thermal conditioning of buildings (in short: primary HVAC systems) and the conceptual building design. Certain building design decisions irreversibly influence a building's energy performance and, conversely, many generation and storage components impose restrictions on building design and, by their nature, cannot be introduced at a later design stage. The objective is, firstly, to develop a method to quantify this influence, in terms of primary HVAC system dimensions, cost, emissions and energy consumption and, secondly, to enable the use of the developed method by architects during the conceptual design. In order to account for the non-stationary effects of intermittent renewable energy sources (RES), thermal storage and component part-load efficiencies, a time-domain system simulation is required. An abstract system simulation method is proposed based on seven pre-configured primary HVAC system models, including components such as boilers, chillers and cooling towers, thermal storage, solar thermal collectors, and photovoltaic modules. A control strategy is developed for each of the models and their annual quasi-stationary simulation is performed. The performance profiles obtained are then used to calculate the energy consumption, carbon emissions and costs. The annuity method has been employed to calculate the cost. Optimization is used to automatically size the HVAC systems, based on their simulation performance. Its purpose is to identify the system component dimensions that provide

  4. Modeling and Validation of Electrical Load Profiling in Residential Buildings in Singapore

    OpenAIRE

    Chuan, Luo; Ukil, Abhisek

    2015-01-01

    The demand of electricity keeps increasing in this modern society and the behavior of customers vary greatly from time to time, city to city, type to type, etc. Generally, buildings are classified into residential, commercial and industrial. This study is aimed to distinguish the types of residential buildings in Singapore and establish a mathematical model to represent and model the load profile of each type. Modeling household energy consumption is the first step in exploring the possible d...

  5. On the impact of building attenuation models in urban VANET simulations

    OpenAIRE

    Luis Urquiza-Aguiar; Carolina Tripp-Barba; José Estrada-Jiménez; Mónica Aguilar Igartua

    2015-01-01

    Buildings are important elements of cities for VANETs, since these obstacles may attenuate communications between vehicles. Consequently, the impact of buildings has to be considered as part of the attenuation model in VANET simulations of urban scenarios. However, the more elaborated the model, the more information needs to be processed during the simulation, which implies longer processing times. This complexity in simulations is not always worth it, because simplified channel models occasi...

  6. Adapting to Students' Social and Health Needs: Suggested Framework for Building Inclusive Models of Practice

    Science.gov (United States)

    Schwitzer, Alan M.

    2009-01-01

    Objective: This article builds on earlier discussions about college health research. The author suggests a 5-step framework that research practitioners can use to build models of practice that accurately address the needs of diverse campus populations. Methods: The author provides 3 illustrations, drawn from published research examining college…

  7. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen; Guo, Siyue

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
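
    The two distributions described above can be sketched as a simple schedule generator. This is a hedged illustration of the modeling idea, not the authors' implementation; the parameter values (occupant count, overtime probability, mean duration) are invented for the example.

```python
import random

def overtime_schedule(n_occupants, p_overtime, mean_duration_h, seed=None):
    """Draw one evening's overtime occupancy.

    Per the model described above: the number of occupants working
    overtime follows a binomial distribution, and each stayer's
    overtime duration follows an exponential distribution.
    """
    rng = random.Random(seed)
    # Binomial draw: how many of the n_occupants stay late tonight.
    stayers = sum(rng.random() < p_overtime for _ in range(n_occupants))
    # Exponential draw: how long each stayer remains, in hours.
    return [rng.expovariate(1.0 / mean_duration_h) for _ in range(stayers)]

# Example with invented parameters: 50 occupants, a 30% chance each
# works overtime, and a 1.5 h mean overtime duration.
durations = overtime_schedule(50, 0.3, 1.5, seed=42)
print(len(durations), "occupants working overtime")
```

Repeated draws of such schedules can then be fed to an energy model as stochastic occupancy inputs, as the study does for its second office building.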

  8. Seismic resistant analysis of coupled model of reactor coolant system and reactor building

    International Nuclear Information System (INIS)

    The reactor coolant system (RCS) and the reactor building are physically coupled with each other. The SRP (Revision 2), issued by the USNRC, points out in Section 3.7.2 that the RCS, though considered a subsystem, is usually analyzed using a model coupled with the building. Against this background, this paper selects a PC-NPP as the study object, and seismic resistant analysis is performed with a coupled model of the building and the RCS using the response spectrum method and the time history method. Finally, the results are compared with those of the uncoupled RCS model. In the analysis, the building is simulated with a cantilever beam model of the shear wall combination. In the uncoupled model, each support of the equipment is modeled using an elastic beam element with the actual support stiffness, connected to a rigid cantilever (single-point input) and to an elastic cantilever (multi-point input). The seismic load of the coupled model is input from the bottom of the building. The comparison shows that the effect of interaction between the RCS and the building cannot be ignored, and that the uncoupled model is inappropriate for seismic resistant analysis in actual seismic design. Through this research, we have mastered the coupled-model seismic analysis technique and enhanced our seismic analysis capability for NPPs. (authors)

  9. Stability and scalability of piezoelectric flag

    Science.gov (United States)

    Wang, Xiaolin; Alben, Silas; Li, Chenyang; Young, Yin Lu

    2015-11-01

    Piezoelectric material (PZT) has drawn enormous attention in the past decades due to its ability to convert mechanical deformation energy into electrical potential energy, and vice versa, and has been applied to energy harvesting and vibration control. In this work, we consider the effect of PZT on the stability of a flexible flag using an inviscid vortex-sheet model. We find that the critical flutter speed is increased due to the extra damping effect of the PZT, and can also be altered by tuning the output inductance-resistance circuit. Optimal resistance and inductance are found to either maximize or minimize the flutter speed. The former application is useful for vibration control, while the latter is important for energy harvesting. We also discuss the scalability of the above system for actual applications in air and water.

  10. Evidence-Based Model Calibration for Efficient Building Energy Services

    OpenAIRE

    Bertagnolio, Stéphane

    2012-01-01

    Energy services play a growing role in the control of energy consumption and the improvement of energy efficiency in non-residential buildings. Most of the energy use analyses involved in the energy efficiency service process require on-field measurements and energy use analysis. Today, while detailed on-field measurements and energy counting stay generally expensive and time-consuming, energy simulations are increasingly cheaper due to the continuous improvement of computer speed. This work ...

  11. Creating a Conceptual Model for Building Responsible Brands

    OpenAIRE

    Kujala, Johanna; Penttilä, Katriina; Tuominen, Pekka

    2011-01-01

    Despite the importance of brands in mediating corporate social responsibility, there has been relatively little research on how responsible brands are developed from the internal perspective of the company. Some research has been conducted from the external perspective, such as the link between ethical issues and consumer purchase behaviour, but there has been relatively little focus on brand-building itself. The present study addresses this gap in the ...

  12. Applicability of the building information model for seismic analysis

    OpenAIRE

    Logonder, Tine

    2009-01-01

    The complexity of structural projects is increasing, and consequently more and more experts are involved in the design and construction process. These facts lead to the need to enhance the interoperability between software tools used in the design and construction of structures. For that reason the IFC (Industry Foundation Classes) standard has been developed, which aims to standardize the representation of building data. In the thesis the methods...

  13. Quantum Information Processing using Scalable Techniques

    Science.gov (United States)

    Hanneke, D.; Bowler, R.; Jost, J. D.; Home, J. P.; Lin, Y.; Tan, T.-R.; Leibfried, D.; Wineland, D. J.

    2011-05-01

    We report progress towards improving our previous demonstrations that combined all the fundamental building blocks required for scalable quantum information processing using trapped atomic ions. Included elements are long-lived qubits; a laser-induced universal gate set; state initialization and readout; and information transport, including co-trapping a second ion species to reinitialize motion without qubit decoherence. Recent efforts have focused on reducing experimental overhead and increasing gate fidelity. Most of the experimental duty cycle was previously used for transport, separation, and recombination of ion chains as well as re-cooling of motional excitation. We have addressed these issues by developing and implementing an arbitrary waveform generator with an update rate far above the ions' motional frequencies. To reduce gate errors, we actively stabilize the position of several UV (313 nm) laser beams. We have also switched the two-qubit entangling gate to one that acts directly on 9Be+ hyperfine qubit states whose energy separation is magnetic-fluctuation insensitive. This work is supported by DARPA, NSA, ONR, IARPA, Sandia, and the NIST Quantum Information Program.

  14. Automatic urban building boundary extraction from high resolution aerial images using an innovative model of active contours

    Science.gov (United States)

    Ahmadi, Salman; Zoej, M. J. Valadan; Ebadi, Hamid; Moghaddam, Hamid Abrishami; Mohammadzadeh, Ali

    2010-06-01

    The main objective of this research is to present a new method for building boundary detection and extraction based on the active contour model. Classical models of this type have several shortcomings: they require extensive initialization, they are sensitive to noise, and adjustment issues often become problematic with complex images. In this research a new model of active contours has been proposed that is optimized for automatic building extraction. This new active contour model, in comparison to the classical ones, can detect and extract building boundaries more accurately, and is capable of avoiding detection of the boundaries of features in the neighborhood of buildings such as streets and trees. Finally, the detected building boundaries are generalized to obtain a regular shape. Tests with our proposed model demonstrate excellent accuracy in terms of building boundary extraction. However, due to the radiometric similarity between building roofs and the image background, our system fails to recognize a few buildings.

  15. Scalable Medical Image Understanding by Fusing Cross-Modal Object Recognition with Formal Domain Semantics

    Science.gov (United States)

    Möller, Manuel; Sintek, Michael; Buitelaar, Paul; Mukherjee, Saikat; Zhou, Xiang Sean; Freund, Jörg

    Recent advances in medical imaging technology have dramatically increased the amount of clinical image data. In contrast, techniques for efficiently exploiting the rich semantic information in medical images have evolved much slower. Despite the research outcomes in image understanding, current image databases are still indexed by manually assigned subjective keywords instead of the semantics of the images. Indeed, most current content-based image search applications index image features that do not generalize well and use inflexible queries. This slow progress is due to the lack of scalable and generic information representation systems which can abstract over the high dimensional nature of medical images as well as semantically model the results of object recognition techniques. We propose a system combining medical imaging information with ontological formalized semantic knowledge that provides a basis for building universal knowledge repositories and gives clinicians fully cross-lingual and cross-modal access to biomedical information.

  16. Evaluation of the Effective Moisture Penetration Depth Model for Estimating Moisture Buffering in Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Woods, J.; Winkler, J.; Christensen, D.

    2013-01-01

    This study examines the effective moisture penetration depth (EMPD) model and its suitability for building simulations. The EMPD model is a compromise between the simple but inaccurate effective capacitance approach and the complex yet accurate finite-difference approach. Two formulations of the EMPD model were examined, including the model used in the EnergyPlus building simulation software. An error we uncovered in the EMPD model was fixed with the release of EnergyPlus version 7.2, and the EMPD model in earlier versions of EnergyPlus should not be used.
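
    The penetration depth at the heart of the EMPD idea has a standard closed form: for a periodic moisture excitation of period t_p in a material with moisture diffusivity D_w, the thickness of the actively buffering surface layer is d = sqrt(D_w · t_p / π). A minimal sketch of that estimate; the material value used below is an assumed, gypsum-like figure, not taken from the report:

```python
import math

def empd_depth(moisture_diffusivity, cycle_period_s):
    """Effective moisture penetration depth in meters.

    Standard EMPD estimate of the depth of material participating in
    moisture exchange under a periodic excitation:
    d = sqrt(D_w * t_p / pi).
    """
    return math.sqrt(moisture_diffusivity * cycle_period_s / math.pi)

# Illustrative (assumed) inputs: a gypsum-like moisture diffusivity of
# ~1e-9 m^2/s and a 24 h diurnal humidity cycle.
d = empd_depth(1e-9, 24 * 3600)
print(f"penetration depth ~ {d * 1000:.2f} mm")
```

A slower cycle (e.g. weekly) engages a proportionally deeper layer, which is why some EMPD formulations carry separate surface and deep buffering layers.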

  17. Lawrence A. Boland, Model Building in Economics. Its Purposes and Limitations

    OpenAIRE

    Boumans, Marcel

    2015-01-01

    Model Building in Economics is a nice and informative exercise in “small-m methodology.” Its intention is to provide a “Forest perspective” on model building, and not so much a discussion of individual “Trees.” If the Trees are meant to be individual models, such as game-theoretic models, DSGE models and simple econometrics models, the Forest and Trees metaphor is an apt description of the book, but if the Forest means methodology it is less so. Methodology is not presented from a Forest pers...

  18. Semi-Automatic Building Models and FAÇADE Texture Mapping from Mobile Phone Images

    Science.gov (United States)

    Jeong, J.; Kim, T.

    2016-06-01

    Research on 3D urban modelling has been actively carried out for a long time. Recently the need for 3D urban modelling research has increased rapidly due to improved geo-web services and the popularity of smart devices. Nowadays 3D urban models provided by, for example, Google Earth use aerial photos for 3D urban modelling, but there are some limitations: immediate updates for changed building models are difficult, many buildings lack a 3D model and texture, and large resources for maintenance and updating are inevitable. To resolve the limitations mentioned above, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images and analyze the modelling results against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated by this method were compared with actual measurements of real buildings by comparing the ratios of model edge lengths to measured edge lengths. The results showed a 5.8% average error in the length ratio. Through this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.

  19. Building sustainable ecosystem-oriented architectures

    CERN Document Server

    Bassil, Youssef

    2012-01-01

    Currently, organizations are transforming their business processes into e-services and service-oriented architectures to improve coordination across sales, marketing, and partner channels, to build flexible and scalable systems, and to reduce integration-related maintenance and development costs. However, this new paradigm is still fragile and lacks many features crucial for building sustainable and progressive computing infrastructures able to rapidly respond and adapt to the always-changing market and environmental business. This paper proposes a novel framework for building sustainable Ecosystem- Oriented Architectures (EOA) using e-service models. The backbone of this framework is an ecosystem layer comprising several computing units whose aim is to deliver universal interoperability, transparent communication, automated management, self-integration, self-adaptation, and security to all the interconnected services, components, and devices in the ecosystem. Overall, the proposed model seeks to deliver a co...

  20. Spectral analysis of pressures measured on two high-rise building models in side-by-side arrangement

    NARCIS (Netherlands)

    Bronkhorst, A.J.; Geurts, C.P.W.; Bentum, C.A. van; Blocken, B.

    2013-01-01

    Pressure measurements on a square plan form high-rise building model and two square high-rise building models in side-by-side arrangement were analysed using the Fast Fourier Transform (FFT) to define peak frequencies resulting from interference. For the isolated building model, a reduced frequency

  1. Issues of Application of Machine Learning Models for Virtual and Real-Life Buildings

    Directory of Open Access Journals (Sweden)

    Young Min Kim

    2016-06-01

    Full Text Available The current Building Energy Performance Simulation (BEPS) tools are based on first principles. For the correct use of BEPS tools, simulationists should have an in-depth understanding of building physics, numerical methods, control logics of building systems, etc. However, it takes significant time and effort to develop a first principles-based simulation model for existing buildings—mainly due to the laborious process of data gathering, uncertain inputs, model calibration, etc. Rather than resorting to an expert’s effort, a data-driven approach (the so-called “inverse” approach) has received growing attention for the simulation of existing buildings. This paper reports a cross-comparison of three popular machine learning models (Artificial Neural Network (ANN), Support Vector Machine (SVM), and Gaussian Process (GP)) for predicting a chiller’s energy consumption in a virtual and a real-life building. The predictions based on the three models are sufficiently accurate compared to the virtual and real measurements. This paper addresses the following issues for the successful development of machine learning models: reproducibility, selection of inputs, training period, outlying data obtained from the building energy management system (BEMS), and validation of the models. From the results of this comparative study, it was found that SVM has a disadvantage in computation time compared to ANN and GP. GP is the most sensitive to the training period among the three models.

  2. Model of mechanism of providing of strategic firmness of machine-building enterprise

    Directory of Open Access Journals (Sweden)

    I.V. Movchan

    2011-03-01

    Full Text Available The article considers theoretical aspects of strategic firmness and develops an algorithmic model of the mechanism for providing the strategic firmness of a machine-building enterprise.

  3. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.
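
    Step two of that pipeline, identifying the surface pairs that define walls, can be sketched with a simple geometric test. The surface representation (centroid plus unit normal) and the 0.5 m thickness threshold are assumptions made for illustration, not the paper's actual data model:

```python
from itertools import combinations

def pair_wall_surfaces(surfaces, max_thickness=0.5):
    """Pair opposing, nearby surfaces as the two faces of a wall.

    Each surface is a (centroid, unit_normal) tuple. Two surfaces are
    paired when their normals are nearly opposite and their centroids
    lie within a plausible wall thickness of each other. This is a
    deliberately crude sketch of the idea.
    """
    pairs = []
    for (i, a), (j, b) in combinations(enumerate(surfaces), 2):
        ca, na = a
        cb, nb = b
        # Opposing normals: dot product close to -1.
        if sum(u * v for u, v in zip(na, nb)) > -0.99:
            continue
        # Separation along the normal, and overall centroid distance.
        d = abs(sum((p - q) * u for p, q, u in zip(ca, cb, na)))
        gap = sum((p - q) ** 2 for p, q in zip(ca, cb)) ** 0.5
        if d <= max_thickness and gap <= max_thickness:
            pairs.append((i, j))
    return pairs

# Two faces of a 0.3 m thick wall, plus an unrelated floor surface.
surfaces = [
    ((0.0, 0.0, 1.5), (1.0, 0.0, 0.0)),
    ((0.3, 0.0, 1.5), (-1.0, 0.0, 0.0)),
    ((5.0, 5.0, 0.0), (0.0, 0.0, 1.0)),
]
print(pair_wall_surfaces(surfaces))  # -> [(0, 1)]
```

A production version would also check that the two faces overlap in plan, which the centroid-distance test only approximates.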

  4. Fire modeling for Building 221-T - T Plant Canyon Deck and Railroad Tunnel

    International Nuclear Information System (INIS)

    This report was prepared by Hughes Associates, Inc. to document the results of fire models for building 221-T Canyon Deck and Railroad Tunnel. Backup data is contained in document No. WHC-SD-CP-ANAL-010, Rev. 0

  5. Microscopy of a scalable superatom

    CERN Document Server

    Zeiher, Johannes; Hild, Sebastian; Macrì, Tommaso; Bloch, Immanuel; Gross, Christian

    2015-01-01

    Strong interactions can amplify quantum effects such that they become important on macroscopic scales. Controlling these coherently on a single-particle level is essential for the tailored preparation of strongly correlated quantum systems and opens up new prospects for quantum technologies. Rydberg atoms offer such strong interactions, which lead to extreme nonlinearities in laser-coupled atomic ensembles. As a result, multiple excitation of a micrometer-sized cloud can be blocked while the light-matter coupling becomes collectively enhanced. The resulting two-level system, often called a "superatom", is a valuable resource for quantum information, providing a collective qubit. Here we report on the preparation of superatoms scalable over two orders of magnitude, utilizing the large interaction strength provided by Rydberg atoms combined with precise control of an ensemble of ultracold atoms in an optical lattice. The latter is achieved with sub-shot-noise precision by local manipulation of a two-dimensional Mott ins...

  6. Scalable Performance Measurement and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, T

    2009-10-27

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
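    As a toy illustration of the wavelet approach (not the Libra toolset itself), a one-level Haar transform can compress a time-varying load signal by discarding small detail coefficients while preserving a load-imbalance spike:

    ```python
    # Minimal illustration of wavelet-based compression of a load signal:
    # one level of the Haar transform, then thresholding of the details.
    # Signal values and threshold are invented for illustration.

    def haar_forward(signal):
        """One Haar level: pairwise averages (approx) and half-differences (detail)."""
        approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        return approx, detail

    def haar_inverse(approx, detail):
        signal = []
        for a, d in zip(approx, detail):
            signal.extend((a + d, a - d))
        return signal

    def compress(signal, threshold=0.1):
        """Zero out detail coefficients below the threshold."""
        approx, detail = haar_forward(signal)
        kept = [d if abs(d) >= threshold else 0.0 for d in detail]
        return approx, kept

    load = [10.0, 10.2, 10.1, 9.9, 30.0, 10.0, 10.1, 10.3]  # one spike at sample 4
    approx, detail = compress(load, threshold=1.0)
    reconstructed = haar_inverse(approx, detail)
    ```

    Here only one of four detail coefficients survives the threshold, yet the spike at sample 4 is reconstructed exactly; the dissertation applies multi-scale versions of this idea to systemwide, time-varying load-balance data.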

  7. Scalable Performance Measurement and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, Todd [Univ. of North Carolina, Chapel Hill, NC (United States)

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.

  8. Assessment of Retrofitting Measures for a Large Historic Research Facility Using a Building Energy Simulation Model

    OpenAIRE

    Young Tae Chae; Lee, Young M.; David Longinott

    2016-01-01

    A calibrated building simulation model was developed to assess the energy performance of a large historic research building. The complexity of space functions and operational conditions, together with the limited availability of energy meters, makes it hard to understand the end-use energy consumption in detail and to identify appropriate retrofitting options for reducing energy consumption and greenhouse gas (GHG) emissions. An energy simulation model was developed to study the energy usage patterns not o...

  9. Nengo: a Python tool for building large-scale functional brain models

    OpenAIRE

    Bekolay, Trevor; Bergstra, James; Hunsberger, Eric; DeWolf, Travis; Terrence C Stewart; Rasmussen, Daniel; Choo, Xuan; Voelker, Aaron Russell; Eliasmith, Chris

    2014-01-01

    Neuroscience currently lacks a comprehensive theory of how cognitive processes can be implemented in a biological substrate. The Neural Engineering Framework (NEF) proposes one such theory, but has not yet gathered significant empirical support, partly due to the technical challenge of building and simulating large-scale models with the NEF. Nengo is a software tool that can be used to build and simulate large-scale models based on the NEF; currently, it is the primary resource for both teach...

  10. Formulation of Japanese consensus-building model for HLW geological disposal site determination. 1. Introduction

    International Nuclear Information System (INIS)

    To establish a sustainable community in Japan, the formation of a Japanese consensus-building model for HLW geological disposal site determination is one of the key issues. In our project, we have reviewed the past history of HLW geological disposal site determination and propose a next-generation Japanese consensus-building model, which is based on discussion not only with government and specialists but also with citizens. (author)

  11. AUTOMATIC TOPOLOGY DERIVATION FROM IFC BUILDING MODEL FOR IN-DOOR INTELLIGENT NAVIGATION

    Directory of Open Access Journals (Sweden)

    S. J. Tang

    2015-05-01

    Full Text Available To achieve accurate navigation within a building environment, it is critical to explore a feasible way of building the connectivity relationships among 3D geographical features, the so-called in-building topology network. Traditional topology construction approaches for indoor space are based on 2D maps or purely geometric models, which suffer from insufficient information. In particular, intelligent navigation for different applications depends mainly on the precise geometry and semantics of the navigation network. The problems caused by existing topology construction approaches can be avoided by employing an IFC building model, which contains detailed semantic and geometric information. In this paper, we present a method that combines a straight medial axis transformation algorithm (S-MAT) with an IFC building model to reconstruct an indoor geometric topology network. The derived topology is aimed at facilitating decision making for different in-building navigation tasks. We describe a multi-step derivation process, including semantic cleaning, walkable feature extraction, multi-storey 2D mapping and S-MAT implementation, to automatically generate topology information from existing indoor building model data given in IFC.
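    The navigation use of such a derived network can be sketched as follows, assuming the walkable spaces and their connecting doors have already been extracted (the paper's S-MAT geometry processing is not reproduced here); the space names are invented:

    ```python
    # Illustrative sketch: build a space-connectivity graph from door
    # connections and route through it with breadth-first search.
    from collections import deque

    def build_topology(doors):
        """doors: iterable of (space_a, space_b) connections."""
        graph = {}
        for a, b in doors:
            graph.setdefault(a, set()).add(b)
            graph.setdefault(b, set()).add(a)
        return graph

    def route(graph, start, goal):
        """Shortest path (fewest spaces traversed) via breadth-first search."""
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in sorted(graph.get(path[-1], ())):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    g = build_topology([("Lobby", "Corridor"), ("Corridor", "Room101"),
                        ("Corridor", "Stairs"), ("Stairs", "Room201")])
    path = route(g, "Lobby", "Room201")
    ```

    In the paper's setting, the graph edges would additionally carry the geometry and semantics (door widths, storey transitions) that different navigation applications need.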

  12. Simulation Speed Analysis and Improvements of Modelica Models for Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jorissen, Filip; Wetter, Michael; Helsen, Lieve

    2015-09-21

    This paper presents an approach for speeding up Modelica models. Insight is provided into how Modelica models are solved and what determines the tool’s computational speed. Aspects such as algebraic loops, code efficiency and integrator choice are discussed. This is illustrated using simple building simulation examples and Dymola. The generality of the work is in some cases verified using OpenModelica. Using this approach, a medium sized office building including building envelope, heating ventilation and air conditioning (HVAC) systems and control strategy can be simulated at a speed five hundred times faster than real time.

  13. Modeling and simulation of the energy use in an occupied residential building in cold climate

    International Nuclear Information System (INIS)

    Highlights: ► An overview of the energy characteristics based on illustrations in graphical figures. ► Figures to support the identification and validation of energy refurbishment measures. ► Emphasizing energy efficiency measures in the early stage of building design. -- Abstract: In order to reduce the energy use in the building sector there is a demand for tools that can identify significant building energy performance parameters. The work introduced in this paper presents a methodology, based on a simulation module and graphical figures, for interactive investigation of building energy performance. The building energy use simulation program, called TEKLA, follows EN 832 with an improved procedure for calculating the heat loss through the floor and the solar heat gain. The graphical figures are simple and illustrate the savings from retrofit measures and climate conditions. The accuracy of the TEKLA simulation was investigated on a typical single-family building in Sweden over a period of space-heating demand in relatively cold and mild climates. The model was found applicable for relative investigations. Further, the methodology was applied to a typical single-family reference building. Climate data from three locations in Sweden were collected and a set of relevant measures was studied. The investigated examples illustrate how decisions in the early stages of the building design process can have decisive importance for the final building energy performance.
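    The steady-state balance that EN 832-style monthly methods rest on can be sketched as losses minus utilised gains. The coefficients below are illustrative assumptions, not TEKLA's actual procedure (which additionally refines the floor-loss and solar-gain terms):

    ```python
    # Hedged sketch of an EN 832-style monthly heating balance, in kWh.
    # All inputs below are invented for illustration.

    def monthly_heating_demand(ua, vent_flow, t_in, t_out, hours,
                               solar_gain, internal_gain, utilisation=0.95):
        """Transmission + ventilation losses minus utilised free gains."""
        rho_cp = 0.34                      # Wh/(m3 K), heat capacity of air
        h_loss = ua + rho_cp * vent_flow   # total heat-loss coefficient, W/K
        losses = h_loss * (t_in - t_out) * hours / 1000.0  # kWh over the month
        gains = (solar_gain + internal_gain) * utilisation  # utilised kWh
        return max(losses - gains, 0.0)

    # A January-like month for a small house: UA = 150 W/K, 100 m3/h ventilation
    q = monthly_heating_demand(ua=150.0, vent_flow=100.0, t_in=21.0,
                               t_out=-2.0, hours=744, solar_gain=120.0,
                               internal_gain=300.0)
    ```

    Repeating this balance per month under different climate files is what lets such a tool compare retrofit measures and locations quickly in early design.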

  14. Modeling of two-storey precast school building using Ruaumoko 2D program

    Energy Technology Data Exchange (ETDEWEB)

    Hamid, N. H.; Tarmizi, L. H.; Ghani, K. D. [Faculty of Civil Engineering, Universiti Teknologi MARA, 40450 Shah Alam, Selangor (Malaysia)

    2015-05-15

    The long-distance earthquake loading from Sumatra and Java Island caused slight damage to precast and reinforced concrete buildings in West Malaysia, such as cracks on wall panels, columns and beams. Consequently, the safety of existing precast concrete buildings needs to be analyzed, because these buildings were designed using BS 8110, which did not include seismic loading. Thus, this paper emphasizes the seismic performance and dynamic behavior of a precast school building constructed in Malaysia under three selected past earthquake excitations: El Centro 1940 North-South, El Centro East-West components, and San Fernando 1971, using the RUAUMOKO 2D program. The program is applied to a prototype precast school model with dynamic non-linear time history analysis. From the results, it can be concluded that the two-storey precast school building experienced severe damage and partial collapse, especially at the beam-column joints, under the San Fernando and El Centro North-South earthquakes, as it exceeds the allowable inter-storey drift and displacement specified in Eurocode 8. The San Fernando earthquake produced massive destruction to the precast building under viscous damping ξ = 5%; the building generated a maximum building displacement of 435 mm, a maximum building drift of 0.68% and a maximum bending moment of 8458 kNm.

  15. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-01

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.
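    The hybrid idea can be sketched minimally: compare metered energy against a building-model prediction and flag hours whose residual is statistically unusual. The thresholding rule and data below are illustrative assumptions, not the paper's algorithms:

    ```python
    # Hedged sketch of residual-based fault detection. The model prediction
    # and measured series are invented; hour 12 carries an injected fault.
    import statistics

    def detect_faults(measured, predicted, z_threshold=3.0):
        """Return indices whose model residual deviates > z_threshold sigmas."""
        residuals = [m - p for m, p in zip(measured, predicted)]
        mu = statistics.mean(residuals)
        sigma = statistics.stdev(residuals)
        return [i for i, r in enumerate(residuals)
                if abs(r - mu) > z_threshold * sigma]

    predicted = [50.0] * 24                                  # modeled hourly kWh
    measured = [50.0 + (1.0 if i % 2 == 0 else -1.0) for i in range(24)]
    measured[12] = 80.0                                      # faulty hour: excess use
    faults = detect_faults(measured, predicted)
    ```

    In the paper the prediction comes from a calibrated building energy model and the flagged residual patterns feed statistical learning tools for diagnosis; only the detection half is sketched here.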

  16. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-26

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  17. The model of intellectual support of decision-making in building structures condition management

    Directory of Open Access Journals (Sweden)

    Velichkin V.Z.

    2012-05-01

    Full Text Available Popular methods of decision making in managing the condition of building structures do not fully consider the peculiarities of their present-day operation. These approaches do not take into account the kinds of uncertainty occurring at the building design stage and during monitoring. This leads to a decrease in the efficiency of a building's targeted use and an increase in the costs of the controlling organization. The suggested approach improves decision-making support systems by integrating expert knowledge and experience with instrumental and visual building structure inspection results. The purpose of the paper is effective decision making aimed at decreasing the level of uncertainty in detecting operational impacts on building structures, so as to provide the required durability. This purpose is achieved by applying elements of artificial intelligence (fuzzy sets) in the joint analysis of retrospective, current and expert information on the building structure's state. The authors suggest selecting controlling actions on the building structure's state with the help of fuzzy conclusions obtained using the designed algorithms and calculation procedures. The applicability of the approach is demonstrated by a calculated example, and a grounded variant of a decision on the intellectual control of a building structure's state (a damaged building wall) is presented. On the basis of these results, conclusions are drawn on the field of application and conditions of the designed algorithms and model.
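    The fuzzy-set element can be illustrated minimally: an instrumental reading (here a hypothetical crack width) is mapped to membership degrees in linguistic damage classes that support a decision. The membership functions and class names are invented for illustration, not taken from the paper:

    ```python
    # Minimal fuzzy-set sketch: triangular memberships over a crack-width
    # reading, followed by a max-membership decision.

    def triangular(x, left, peak, right):
        """Triangular membership function on [left, right] peaking at `peak`."""
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    def damage_memberships(crack_mm):
        return {
            "minor":    triangular(crack_mm, -0.1, 0.0, 0.4),
            "moderate": triangular(crack_mm, 0.2, 0.5, 0.8),
            "severe":   triangular(crack_mm, 0.6, 1.0, 10.0),
        }

    m = damage_memberships(0.45)       # a 0.45 mm crack
    decision = max(m, key=m.get)       # class with the highest membership
    ```

    A full fuzzy decision system would combine several such graded inputs through inference rules before defuzzifying into a controlling action; the sketch shows only the fuzzification and ranking step.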

  18. Modeling of two-storey precast school building using Ruaumoko 2D program

    International Nuclear Information System (INIS)

    The long-distance earthquake loading from Sumatra and Java Island caused slight damage to precast and reinforced concrete buildings in West Malaysia, such as cracks on wall panels, columns and beams. Consequently, the safety of existing precast concrete buildings needs to be analyzed, because these buildings were designed using BS 8110, which did not include seismic loading. Thus, this paper emphasizes the seismic performance and dynamic behavior of a precast school building constructed in Malaysia under three selected past earthquake excitations: El Centro 1940 North-South, El Centro East-West components, and San Fernando 1971, using the RUAUMOKO 2D program. The program is applied to a prototype precast school model with dynamic non-linear time history analysis. From the results, it can be concluded that the two-storey precast school building experienced severe damage and partial collapse, especially at the beam-column joints, under the San Fernando and El Centro North-South earthquakes, as it exceeds the allowable inter-storey drift and displacement specified in Eurocode 8. The San Fernando earthquake produced massive destruction to the precast building under viscous damping ξ = 5%; the building generated a maximum building displacement of 435 mm, a maximum building drift of 0.68% and a maximum bending moment of 8458 kNm

  19. Modelling and Analysis of Heat Pumps for Zero Emission Buildings

    OpenAIRE

    Småland, Leif

    2013-01-01

    The work of this Master's thesis is a continuation of a project work, which defined the qualitative and quantitative parameters needed to make a simulation tool for early-stage decision making with regard to the energy supply strategy for non-residential Zero Emission Buildings (ZEB). The work is based on the assumption that heat pump (HP) technology will be one of the core technologies for the energy supply strategy in the ZEB concept. The proposed simulation tool should be able to find the bes...

  20. Moisture buffering and its consequence in whole building hygrothermal modeling

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2008-01-01

    Moisture absorption and desorption of materials in contact with the indoor air of buildings can be used as a passive, i.e., non-mechanical, way to moderate the variation of indoor humidity. This phenomenon, which is recognized as 'moisture buffering', could potentially be used as an attractive feature of...... for ventilation if indoor humidity is a parameter for controlling ventilation rate, 2. it is possible to improve the perceived acceptability of indoor air, as judged by the temperature and humidity of the air, by using moisture buffering to control the indoor humidity. The results of the whole...

  1. Towards The Long-Term Preservation of Building Information Models

    DEFF Research Database (Denmark)

    Beetz, Jacob; Dietze, Stefan; Berndt, René;

    2013-01-01

    Long-term preservation of information about artifacts of the built environment is crucial to provide the ability to retrofit legacy buildings, to preserve cultural heritage, to ensure security precautions, to enable knowledge-reuse of design and engineering solutions and to guarantee the legal......, no existing approach is able to provide a secure and efficient long-term preservation solution covering the broad spectrum of 3D architectural data, while at the same time taking into account the demands of institutional collectors like architecture libraries and archives as well as those of the private...

  2. Scalable Video Coding with Interlayer Signal Decorrelation Techniques

    Directory of Open Access Journals (Sweden)

    Yang Wenxian

    2007-01-01

    Full Text Available Scalability is one of the essential requirements in the compression of visual data for present-day multimedia communications and storage. The basic building block for providing the spatial scalability in the scalable video coding (SVC standard is the well-known Laplacian pyramid (LP. An LP achieves the multiscale representation of the video as a base-layer signal at lower resolution together with several enhancement-layer signals at successive higher resolutions. In this paper, we propose to improve the coding performance of the enhancement layers through efficient interlayer decorrelation techniques. We first show that, with nonbiorthogonal upsampling and downsampling filters, the base layer and the enhancement layers are correlated. We investigate two structures to reduce this correlation. The first structure updates the base-layer signal by subtracting from it the low-frequency component of the enhancement layer signal. The second structure modifies the prediction in order that the low-frequency component in the new enhancement layer is diminished. The second structure is integrated in the JSVM 4.0 codec with suitable modifications in the prediction modes. Experimental results with some standard test sequences demonstrate coding gains up to 1 dB for I pictures and up to 0.7 dB for both I and P pictures.
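    The Laplacian pyramid structure described above can be illustrated with a one-dimensional toy (not the JSVM codec): a base layer at half resolution plus an enhancement layer that restores full resolution. The averaging/replication filters are illustrative assumptions:

    ```python
    # Toy 1-D Laplacian pyramid: base layer = downsampled signal,
    # enhancement layer = residual after upsampling the base layer.

    def downsample(x):
        return [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]

    def upsample(x):
        out = []
        for v in x:
            out.extend((v, v))   # nearest-neighbour replication
        return out

    def encode(signal):
        base = downsample(signal)
        enhancement = [s - u for s, u in zip(signal, upsample(base))]
        return base, enhancement

    def decode(base, enhancement):
        return [u + e for u, e in zip(upsample(base), enhancement)]

    signal = [4.0, 6.0, 8.0, 2.0, 5.0, 7.0]
    base, enh = encode(signal)
    assert decode(base, enh) == signal   # LP reconstruction is exact
    ```

    With non-biorthogonal filters like these, the enhancement layer still carries low-frequency energy correlated with the base layer; the paper's two structures subtract or diminish exactly that component to improve enhancement-layer coding.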

  3. A Micro-Macro Model for South Africa: Building and Linking a Microsimulation Model to a CGE Model

    OpenAIRE

    Nicolas Hérault

    2005-01-01

    This paper describes a newly-built micro-macro model for South Africa. A computable general equilibrium (CGE) model and a microsimulation (MS) model are combined in a sequential approach in order to build an effective tool to assess the effects of various macroeconomic policies and shocks on South African households. The CGE model is used to simulate the macro-changes in the structure of the economy after the policy change or the macro-shock. In a second step, these changes are passed on to t...

  4. Towards a Very Low Energy Building Stock: Modeling the U.S. Commercial Building Sector to Support Policy and Innovation Planning

    Energy Technology Data Exchange (ETDEWEB)

    Coffey, Brian; Borgeson, Sam; Selkowitz, Stephen; Apte, Josh; Mathew, Paul; Haves, Philip

    2009-07-01

    This paper describes the origin, structure and continuing development of a model of time varying energy consumption in the US commercial building stock. The model is based on a flexible structure that disaggregates the stock into various categories (e.g. by building type, climate, vintage and life-cycle stage) and assigns attributes to each of these (e.g. floor area and energy use intensity by fuel type and end use), based on historical data and user-defined scenarios for future projections. In addition to supporting the interactive exploration of building stock dynamics, the model has been used to study the likely outcomes of specific policy and innovation scenarios targeting very low future energy consumption in the building stock. Model use has highlighted the scale of the challenge of meeting targets stated by various government and professional bodies, and the importance of considering both new construction and existing buildings.
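    The disaggregation idea above (floor area and energy use intensity assigned per stock category, with user-defined scenarios) can be illustrated with a minimal sketch; the categories and numbers below are invented, not the model's data:

    ```python
    # Hedged sketch of stock disaggregation: total consumption as a sum over
    # categories of floor area times energy use intensity (EUI).

    STOCK = [
        # (building type, vintage, floor area in million m2, EUI in kWh/m2/yr)
        ("office", "pre-1980", 120.0, 250.0),
        ("office", "post-2000", 80.0, 150.0),
        ("retail", "pre-1980", 60.0, 300.0),
    ]

    def annual_consumption_twh(stock):
        """million m2 x kWh/m2 = GWh; divide by 1000 for TWh."""
        return sum(area * eui for _, _, area, eui in stock) / 1e3

    def apply_retrofit(stock, building_type, eui_reduction):
        """Scenario: cut the EUI of one building type by a fraction."""
        return [(bt, v, a, eui * (1 - eui_reduction) if bt == building_type else eui)
                for bt, v, a, eui in stock]

    baseline = annual_consumption_twh(STOCK)
    scenario = annual_consumption_twh(apply_retrofit(STOCK, "office", 0.3))
    ```

    The actual model tracks many more attributes (fuel type, end use, life-cycle stage) and their evolution over time; the sketch only shows the area-times-EUI aggregation such scenario comparisons rest on.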

  5. Toward a scalable and collaborative network monitoring overlay

    OpenAIRE

    Castro, Vasco; Carvalho, Paulo; Lima, Solange

    2011-01-01

    This paper presents ongoing work toward the definition of a new network monitoring model which resorts to a cooperative interaction among measurement entities to monitor the quality of network services. Exploring (i) the definition of representative measurement points to form a network monitoring overlay; (ii) the removal of measurement redundancy through composition of metrics; and (iii) a simple active measurement methodology, the proposed model aims to contribute to a scalable, robust a...

  6. How to Build a Course in Mathematical–Biological Modeling: Content and Processes for Knowledge and Skill

    OpenAIRE

    Hoskinson, Anne-Marie

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical–biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance fro...

  7. Extension of the PMV model to non-air-conditioned building in warm climates

    DEFF Research Database (Denmark)

    Fanger, Povl Ole; Toftum, Jørn

    2002-01-01

    The PMV model agrees well with high-quality field studies in buildings with HVAC systems, situated in cold, temperate and warm climates, studied during both summer and winter. In non-air-conditioned buildings in warm climates, occupants may sense the warmth as being less severe than the PMV...... predicts. The main reason is low expectations, but a metabolic rate that is estimated too high can also contribute to explaining the difference. An extension of the PMV model that includes an expectancy factor is introduced for use in non-air-conditioned buildings in warm climates. The extended PMV model...... agrees well with quality field studies in non-air-conditioned buildings of three continents....
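    The extension described above scales the conventional PMV by an expectancy factor; a minimal sketch, assuming the factor's typical 0.5-1.0 range for non-air-conditioned buildings (computing the conventional PMV itself, per ISO 7730, is outside this sketch):

    ```python
    # Sketch of the extended PMV: conventional PMV of the warm environment
    # multiplied by an expectancy factor e. Values below are illustrative.

    def extended_pmv(conventional_pmv, expectancy):
        """Scale the PMV of a warm environment by the expectancy factor."""
        if not 0.5 <= expectancy <= 1.0:
            raise ValueError("expectancy factor is typically between 0.5 and 1.0")
        return conventional_pmv * expectancy

    # A computed vote of +1.6 (warm) felt as about +0.8 (slightly warm)
    # where occupants have low expectations of air conditioning:
    adjusted = extended_pmv(1.6, 0.5)
    ```

    The expectancy factor captures the lower expectations of occupants in warm climates; a too-high estimated metabolic rate is the other correction the record mentions.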

  8. Lost opportunities: Modeling commercial building energy code adoption in the United States

    International Nuclear Information System (INIS)

    This paper models the adoption of commercial building energy codes in the US between 1977 and 2006. Energy code adoption typically results in an increase in aggregate social welfare by cost effectively reducing energy expenditures. Using a Cox proportional hazards model, I test if relative state funding, a new, objective, multivariate regression-derived measure of government capacity, as well as a vector of control variables commonly used in comparative state research, predict commercial building energy code adoption. The research shows little political influence over historical commercial building energy code adoption in the sample. Colder climates and higher electricity prices also do not predict more frequent code adoptions. I do find evidence of high government capacity states being 60 percent more likely than low capacity states to adopt commercial building energy codes in the following year. Wealthier states are also more likely to adopt commercial codes. Policy recommendations to increase building code adoption include increasing access to low cost capital for the private sector and providing noncompetitive block grants to the states from the federal government. - Highlights: ► Model the adoption of commercial building energy codes from 1977–2006 in the US. ► Little political influence over historical building energy code adoption. ► High capacity states are over 60 percent more likely than low capacity states to adopt codes. ► Wealthier states are more likely to adopt commercial codes. ► Access to capital and technical assistance is critical to increase code adoption.

  9. Building Mathematical Models of Simple Harmonic and Damped Motion.

    Science.gov (United States)

    Edwards, Thomas

    1995-01-01

    By developing a sequence of mathematical models of harmonic motion, shows that mathematical models are not right or wrong, but instead are better or poorer representations of the problem situation. (MKR)
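    The modeling sequence the note describes can be illustrated with a short sketch contrasting the simpler and better models; the parameters are invented for illustration:

    ```python
    # Two models of oscillatory motion: simple harmonic, then a refinement
    # with an exponentially decaying envelope (damping).
    import math

    def simple_harmonic(t, amplitude, omega, phase=0.0):
        return amplitude * math.cos(omega * t + phase)

    def damped_harmonic(t, amplitude, omega, gamma, phase=0.0):
        """The better model: the envelope decays as exp(-gamma * t)."""
        return amplitude * math.exp(-gamma * t) * math.cos(omega * t + phase)

    # At t = 0 the models agree; later the damped model predicts smaller swings.
    x0_simple = simple_harmonic(0.0, amplitude=1.0, omega=2.0)
    x0_damped = damped_harmonic(0.0, amplitude=1.0, omega=2.0, gamma=0.1)
    x_later = damped_harmonic(math.pi, amplitude=1.0, omega=2.0, gamma=0.1)
    ```

    Neither model is "right": the undamped version is the poorer representation of a real pendulum, and the damped one is the better, which is exactly the point the note makes about models as representations.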

  10. A sensitivity model for energy consumption in buildings. Part 1: Effect of exterior environment

    Science.gov (United States)

    Lansing, F. L.

    1981-01-01

    A simple analytical model is developed for the simulation of seasonal heating and cooling loads of any class of buildings to complement available computerized techniques which make hourly, daily, and monthly calculations. An expression for the annual energy utilization index, which is a common measure of rating buildings having the same functional utilization, is derived to include about 30 parameters for both building interior and exterior environments. The sensitivity of a general class building to either controlled or uncontrolled weather parameters is examined. A hypothetical office type building, located at the Goldstone Space Communication Complex, Goldstone, California, is selected as an example for the numerical sensitivity evaluations. Several expressions of variations in local outside air temperature, pressure, solar radiation, and wind velocity are presented.

  11. Building generic anatomical models using virtual model cutting and iterative registration

    Directory of Open Access Journals (Sweden)

    Hallgrímsson Benedikt

    2010-02-01

    retrieve a sub-region from it at their ease. Java-based implementation allows our method to be used on various visualization systems including personal computers, workstations, computers equipped with stereo displays, and even virtual reality rooms such as the CAVE Automated Virtual Environment. The technique allows biologists to build generic 3D models of their interest quickly and accurately.

  12. HYPERSTATIC STRUCTURE MAPPING MODEL BUILDING AND OPTIMIZING DESIGN

    Institute of Scientific and Technical Information of China (English)

    XU Gening; GAO Youshan; ZHANG Xueliang; YANG Ruigang

    2007-01-01

    A plane model of a hyperstatic structure, built by structural mechanics, is studied. A space model that precisely reflects the real stresses of the structure is built with commercial finite element method (FEM) analysis software. A mapping model of the complex structural system is then set up, combining the convenient calculation of the plane model with the comprehensive information of the space model. The plane model and the space model are calculated under the same working conditions; the plane model's member inner forces are taken as input data and the space model's member inner forces as output data, so that training specimens are built from input and output data. Characteristics and relationships are extracted by training on these specimens, employing the nonlinear mapping capability of an artificial neural network. A mapping model capable of interpolation and extrapolation is obtained, laying the foundation for optimum design. The steel structure of a high-layer parking system (SSHLPS) is calculated as an instance. A three-layer back-propagation (BP) net with one hidden layer, nine input nodes and eight output nodes is constructed for a five-layer SSHLPS. The three-layer structure optimization result obtained through mapping-model interpolation is contrasted with a full re-analysis, as is a seven-layer structure obtained through mapping-model extrapolation. An SSHLPS of any number of layers from 1 to 8 can be calculated with good accuracy. The amount of calculation can also be reduced when the method is applied to the same topological structure, with reduced distortion and assured precision.
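    The mapping model's network shape follows the abstract (nine input nodes, eight output nodes, one hidden layer); everything else below, weights, hidden size, learning rate, and training data, is invented for illustration:

    ```python
    # Illustrative three-layer BP net (9 inputs -> hidden -> 8 outputs)
    # trained by gradient descent on squared error, as a sketch of the
    # plane-model-to-space-model mapping. Data are invented.
    import math
    import random

    random.seed(0)
    N_IN, N_HID, N_OUT = 9, 12, 8

    def init(rows, cols):
        return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]

    W1, W2 = init(N_HID, N_IN), init(N_OUT, N_HID)

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def forward(x):
        hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
        output = [sum(w * h for w, h in zip(row, hidden)) for row in W2]
        return hidden, output

    def train_step(x, target, lr=0.05):
        """One back-propagation update on squared error; returns the loss."""
        hidden, output = forward(x)
        err = [o - t for o, t in zip(output, target)]
        loss = sum(e * e for e in err) / 2
        # Hidden deltas use the pre-update output weights
        delta_h = [sum(err[k] * W2[k][j] for k in range(N_OUT))
                   * hidden[j] * (1 - hidden[j]) for j in range(N_HID)]
        for k in range(N_OUT):            # output layer (linear units)
            for j in range(N_HID):
                W2[k][j] -= lr * err[k] * hidden[j]
        for j in range(N_HID):            # hidden layer (sigmoid units)
            for i in range(N_IN):
                W1[j][i] -= lr * delta_h[j] * x[i]
        return loss

    x = [0.1 * i for i in range(N_IN)]    # plane-model inner forces (input)
    t = [0.5] * N_OUT                     # space-model inner forces (target)
    losses = [train_step(x, t) for _ in range(50)]
    ```

    In the paper's scheme, many such input/output specimen pairs from matched plane and space analyses train the net, which then interpolates or extrapolates the space-model response for structures of other layer counts.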

  13. Building Higher-Order Markov Chain Models with EXCEL

    Science.gov (United States)

    Ching, Wai-Ki; Fung, Eric S.; Ng, Michael K.

    2004-01-01

    Categorical data sequences occur in many applications such as forecasting, data mining and bioinformatics. In this note, we present higher-order Markov chain models for modelling categorical data sequences with an efficient algorithm for solving the model parameters. The algorithm can be implemented easily in a Microsoft EXCEL worksheet. We give a…
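The higher-order model described here can be illustrated with a minimal sketch: the next-state distribution is a weighted mixture of lag-specific transition matrices estimated from the sequence. The weights below are fixed by hand; the paper solves for them (e.g. with the Excel Solver via linear programming), and the example sequence is invented.

```python
# Minimal higher-order Markov chain in the spirit of the Ching-Fung-Ng model:
# mix lag-1..k transition rows with weights to predict the next state.
from collections import Counter

def transition_matrix(seq, states, lag):
    """Row-stochastic frequency matrix for transitions over a given lag."""
    counts = {s: Counter() for s in states}
    for a, b in zip(seq, seq[lag:]):
        counts[a][b] += 1
    P = {}
    for s in states:
        total = sum(counts[s].values())
        P[s] = {t: (counts[s][t] / total) if total else 1.0 / len(states)
                for t in states}
    return P

def predict(seq, states, matrices, weights):
    """Weighted mixture of the lag-specific transition rows for the next state."""
    mix = {t: 0.0 for t in states}
    for lag, (P, w) in enumerate(zip(matrices, weights), start=1):
        row = P[seq[-lag]]
        for t in states:
            mix[t] += w * row[t]
    return mix

states = ["up", "flat", "down"]
seq = ["up", "flat"] * 25                 # a perfectly alternating toy sequence
mats = [transition_matrix(seq, states, lag) for lag in (1, 2)]
dist = predict(seq, states, mats, weights=[0.6, 0.4])
```

On this alternating sequence both lags agree that "up" follows, so the mixture puts all its mass there; on real data the lags disagree and the fitted weights arbitrate.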

  14. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol;

    of a computer aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures...

  15. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol;

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling...

  16. Building a multilevel modeling network for lipid processing systems

    DEFF Research Database (Denmark)

    Mustaffa, Azizul Azri; Díaz Tovar, Carlos Axel; Hukkerikar, Amol;

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling...

  17. Building an Online Wisdom Community: A Transformational Design Model

    Science.gov (United States)

    Gunawardena, Charlotte N.; Jennings, Barbara; Ortegano-Layne, Ludmila C.; Frechette, Casey; Carabajal, Kayleigh; Lindemann, Ken; Mummert, Julia

    2004-01-01

    This paper discusses the development of a new instructional design model based on socioconstructivist learning theories and distance education principles for the design of online wisdom communities and the efficacy of the model drawing on evaluation results from its implementation in Fall 2002. The model, Final Outcome Centered Around Learner…

  18. Change detection on LOD 2 building models with very high resolution spaceborne stereo imagery

    Science.gov (United States)

    Qin, Rongjun

    2014-10-01

    Due to the fast development of the urban environment, the need for efficient maintenance and updating of 3D building models is ever increasing. Change detection is an essential step for spotting changed areas for data (map/3D model) updating and urban monitoring. Traditional methods based on 2D images are no longer suitable for change detection at the building scale, owing to the increased spectral variability of building roofs and the larger perspective distortion of very high resolution (VHR) imagery. Change detection in 3D is increasingly being investigated using airborne laser scanning data or matched Digital Surface Models (DSM), but little work has addressed change detection on 3D city models with VHR images, which is more informative but also more complicated. This is because 3D models are abstracted geometric representations of the urban reality, while VHR images record everything. In this paper, a novel method is proposed to detect changes directly on LOD (Level of Detail) 2 building models with VHR spaceborne stereo images from a different date, with particular focus on addressing the special characteristics of the 3D models. In the first step, the 3D building models are projected onto a raster grid, encoded with building objects, terrain objects, and planar faces. The DSM is extracted from the stereo imagery by hierarchical semi-global matching (SGM). In the second step, a multi-channel change indicator is extracted between the 3D models and stereo images, considering the inherent geometric consistency (IGC), height difference, and texture similarity for each planar face. Each channel of the indicator is then clustered with the Self-Organizing Map (SOM), with "change", "non-change" and "uncertain change" statuses labeled through a voting strategy. The "uncertain changes" are then resolved with a Markov Random Field (MRF) analysis considering the geometric relationship between faces.
In the third step, buildings are
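The voting step of this pipeline is simple enough to sketch: each indicator channel (geometric consistency, height difference, texture similarity) yields a binary per-face label after clustering; unanimous votes decide "change" / "non-change", and split votes are deferred as "uncertain change" for the MRF stage. The face names and label vectors below are invented for illustration.

```python
# Sketch of the abstract's voting strategy over per-channel change labels.
def vote(channel_labels):
    """channel_labels: list of 0 (no change) / 1 (change), one per channel."""
    s = sum(channel_labels)
    if s == len(channel_labels):
        return "change"            # all channels agree: changed
    if s == 0:
        return "non-change"        # all channels agree: unchanged
    return "uncertain change"      # disagreement: defer to the MRF analysis

faces = {
    "roof_A": [1, 1, 1],
    "roof_B": [0, 0, 0],
    "wall_C": [1, 0, 1],
}
labels = {face: vote(v) for face, v in faces.items()}
```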

  19. Software Security and the "Building Security in Maturity" Model

    CERN Document Server

    CERN. Geneva

    2011-01-01

    Using the framework described in my book "Software Security: Building Security In" I will discuss and describe the state of the practice in software security. This talk is peppered with real data from the field, based on my work with several large companies as a Cigital consultant. As a discipline, software security has made great progress over the last decade. Of the sixty large-scale software security initiatives we are aware of, thirty-two---all household names---are currently included in the BSIMM study. Those companies among the thirty-two who graciously agreed to be identified include: Adobe, Aon, Bank of America, Capital One, The Depository Trust & Clearing Corporation (DTCC), EMC, Google, Intel, Intuit, McKesson, Microsoft, Nokia, QUALCOMM, Sallie Mae, Standard Life, SWIFT, Symantec, Telecom Italia, Thomson Reuters, VMware, and Wells Fargo. The BSIMM was created by observing and analyzing real-world data from thirty-two leading software security initiatives. The BSIMM can...

  20. Building gas markets. US versus EU, market versus market model

    International Nuclear Information System (INIS)

    The liberalization process of the gas sector has shown that introducing competition into gas industries separates services into at least two groups: commodities with relatively low transaction costs, hence suitable for short-term market coordination, and network services, which concentrate most of the specificities related to the physical flows. How to coordinate such network services, however, is still under debate. In the USA, specific services are coordinated through long-term contracts, whereas the EU regulatory framework socializes the costs of the network services. In this paper, we develop a general analysis of the major consequences of this fundamental regulatory choice, and we build on that analysis to explain the differences among the current proposals for designing the coming European Internal Market.

  1. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    Science.gov (United States)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  2. Albany/FELIX: a parallel, scalable and robust, finite element, first-order Stokes approximation ice sheet solver built for advanced analysis

    Directory of Open Access Journals (Sweden)

    I. Kalashnikova

    2014-11-01

    Full Text Available This paper describes a new parallel, scalable and robust finite-element based solver for the first-order Stokes momentum balance equations for ice flow. The solver, known as Albany/FELIX, is constructed using the component-based approach to building application codes, in which mature, modular libraries developed as a part of the Trilinos project are combined using abstract interfaces and Template-Based Generic Programming, resulting in a final code with access to dozens of algorithmic and advanced analysis capabilities. Following an overview of the relevant partial differential equations and boundary conditions, the numerical methods chosen to discretize the ice flow equations are described, along with their implementation. The results of several verification studies of the model accuracy are presented using: (1) new test cases derived using the method of manufactured solutions, and (2) canonical ice sheet modeling benchmarks. Model accuracy and convergence with respect to mesh resolution are then studied on problems involving a realistic Greenland ice sheet geometry discretized using structured and unstructured meshes. Also explored as a part of this study is the effect of vertical mesh resolution on the solution accuracy and solver performance. The robustness and scalability of our solver on these problems are demonstrated. Lastly, we show that good scalability can be achieved by preconditioning the iterative linear solver using a new algebraic multilevel preconditioner, constructed based on the idea of semi-coarsening.
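The "method of manufactured solutions" verification mentioned here can be shown on a toy problem instead of the Stokes equations: pick u(x) = sin(pi x), derive the forcing f = pi^2 sin(pi x) for -u'' = f, solve numerically, and confirm the error shrinks at the scheme's design order. The 1D Poisson problem below is a stand-in, not the paper's test case.

```python
# Manufactured-solutions verification of a second-order finite-difference solver.
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with the manufactured u = sin(pi x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)           # forcing derived from the chosen u
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2  # standard 3-point Laplacian
    u = np.linalg.solve(A, f)
    return float(np.max(np.abs(u - np.sin(np.pi * x))))  # max error vs exact u

e1, e2 = solve_poisson(20), solve_poisson(40)
order = float(np.log2(e1 / e2))   # should approach 2 for a second-order scheme
```

The observed order confirms the discretization is implemented correctly, exactly the kind of evidence the paper gathers for its Stokes solver.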

  3. Modeling carbon dioxide emissions reductions for three commercial reference buildings in Salt Lake City

    Science.gov (United States)

    Lucich, Stephen M.

    In the United States, the buildings sector is responsible for approximately 40% of the national carbon dioxide (CO2) emissions. CO2 is created during the generation of heat and electricity, and has been linked to climate change, acid rain, a variety of health threats, surface water depletion, and the destruction of natural habitats. Building energy modeling is a powerful educational tool that building owners, architects, engineers, city planners, and policy makers can use to make informed decisions. The aim of this thesis is to simulate the reduction in CO2 emissions that may be achieved for three commercial buildings located in Salt Lake City, UT. The following two questions were used to guide this process: 1. How much can a building's annual CO2 emissions be reduced through a specific energy efficiency upgrade or policy? 2. How much can a building's annual CO2 emissions be reduced through the addition of a photovoltaic (PV) array? How large should the array be? Building energy simulations were performed with the Department of Energy's EnergyPlus software, commercial reference building models, and TMY3 weather data. The chosen models were a medium office building, a primary school, and a supermarket. Baseline energy consumption data were simulated for each model in order to identify changes that would have a meaningful impact. Modifications to the buildings' construction and operation were considered before a PV array was incorporated. These modifications include (1) an improved building envelope, (2) reduced lighting intensity, and (3) modified HVAC temperature set points. The PV array sizing was optimized using a demand matching approach based on the method of least squares. The array's tilt angle was optimized using the golden section search algorithm. Combined, energy efficiency upgrades and the PV array reduced building CO2 emissions by 58.6, 54.0, and 52.2% for the medium office, primary school, and supermarket, respectively.
However, for these models, it was
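The golden-section search used for the tilt-angle optimization is a standard routine and can be sketched directly. The annual-yield curve below is a made-up stand-in for the EnergyPlus/TMY3 simulation output; only the search procedure itself is the point.

```python
# Golden-section search for the maximiser of a unimodal function,
# here standing in for PV tilt-angle optimisation.
import math

def golden_section_max(f, lo, hi, tol=1e-5):
    """Locate the maximiser of a unimodal f on [lo, hi] to within tol."""
    inv_phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) > f(d):              # maximum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                        # maximum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Hypothetical smooth yield curve peaking near a ~35 degree tilt.
yield_curve = lambda tilt: 1000.0 - (tilt - 35.0) ** 2
best_tilt = golden_section_max(yield_curve, 0.0, 90.0)
```

This sketch re-evaluates f at both probe points every iteration for clarity; caching one evaluation per step halves the number of simulation calls, which matters when each call is a full building-energy run.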

  4. TLS for generating multi-LOD of 3D building model

    International Nuclear Information System (INIS)

    The popularity of Terrestrial Laser Scanners (TLS) for capturing three-dimensional (3D) objects has led to their wide use in various applications. Development in 3D modelling has also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since buildings are important objects, CityGML defines a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process for the point cloud are explored. TLS is used to capture all the building details to generate multiple LODs. In previous works this task usually involves the integration of several sensors; in this research, however, the point cloud from TLS is processed to generate the LOD3 model, from which LOD2 and LOD1 are then generalized. The result of this research is a guiding process for generating the multi-LOD 3D building model starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model is also shown

  5. Using Python to Construct a Scalable Parallel Nonlinear Wave Solver

    KAUST Repository

    Mandli, Kyle

    2011-01-01

    Computational scientists seek to provide efficient, easy-to-use tools and frameworks that enable application scientists within a specific discipline to build and/or apply numerical models with up-to-date computing technologies that can be executed on all available computing systems. Although many tools could be useful for groups beyond a specific application, it is often difficult and time consuming to combine existing software, or to adapt it for a more general purpose. Python enables a high-level approach where a general framework can be supplemented with tools written for different fields and in different languages. This is particularly important when a large number of tools are necessary, as is the case for high performance scientific codes. This motivated our development of PetClaw, a scalable distributed-memory solver for time-dependent nonlinear wave propagation, as a case study of how Python can be used as a high-level framework leveraging a multitude of codes, efficient both in the reuse of code and in programmer productivity. We present scaling results for computations on up to four racks of Shaheen, an IBM BlueGene/P supercomputer at King Abdullah University of Science and Technology. One particularly important issue that PetClaw has faced is the overhead associated with dynamic loading, which leads to catastrophic scaling. We use the walla library to solve this issue; it supplants high-cost filesystem calls with MPI operations at a low enough level that developers may avoid any changes to their codes.

  6. Mental models of a water management system in a green building.

    Science.gov (United States)

    Kalantzis, Anastasia; Thatcher, Andrew; Sheridan, Craig

    2016-11-01

    This intergroup case study compared users' mental models with an expert design model of a water management system in a green building. The system incorporates a constructed wetland component and a rainwater collection pond that together recycle water for re-use in the building and its surroundings. The sample consisted of five building occupants and the cleaner (6 users) and two experts who were involved with the design of the water management system. Users' mental model descriptions and the experts' design model were derived from in-depth interviews combined with self-constructed (and verified) diagrams. Findings from the study suggest that there is considerable variability in the user mental models that could impact the efficient functioning of the water management system. Recommendations for improvements are discussed. PMID:27126802

  7. Using Data Mining Techniques to Build a Classification Model for Predicting Employees Performance

    Directory of Open Access Journals (Sweden)

    Qasem A. Al-Radaideh

    2012-02-01

    Full Text Available Human capital is of high concern for company management, whose main interest is in hiring highly qualified personnel who are expected to perform well. Recently, there has been growing interest in the data mining area, where the objective is the discovery of knowledge that is correct and of high benefit to users. In this paper, data mining techniques were utilized to build a classification model to predict the performance of employees. To build the classification model, the CRISP-DM data mining methodology was adopted. A decision tree was the main data mining tool used to build the classification model, from which several classification rules were generated. To validate the generated model, several experiments were conducted using real data collected from several companies. The model is intended to be used for predicting new applicants' performance.
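The decision-tree approach described here can be sketched with a tiny ID3-style tree grown over categorical employee attributes, from which classification rules fall out of the branches. The attributes and records below are invented for illustration, not the companies' real data.

```python
# Toy ID3-style decision tree (entropy / information gain) for a
# hypothetical employee-performance classification task.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_attr(rows, attrs):
    """Attribute with the highest information gain on the 'perf' label."""
    labels = [r["perf"] for r in rows]
    def gain(a):
        rem = 0.0
        for v in {r[a] for r in rows}:
            sub = [r["perf"] for r in rows if r[a] == v]
            rem += len(sub) / len(rows) * entropy(sub)
        return entropy(labels) - rem
    return max(attrs, key=gain)

def id3(rows, attrs):
    labels = [r["perf"] for r in rows]
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]      # leaf: majority label
    a = best_attr(rows, attrs)
    rest = [x for x in attrs if x != a]
    return {a: {v: id3([r for r in rows if r[a] == v], rest)
                for v in {r[a] for r in rows}}}

def classify(tree, row):
    while isinstance(tree, dict):
        a = next(iter(tree))
        tree = tree[a][row[a]]
    return tree

rows = [
    {"degree": "MSc", "experience": "high", "perf": "good"},
    {"degree": "MSc", "experience": "low",  "perf": "good"},
    {"degree": "BSc", "experience": "high", "perf": "good"},
    {"degree": "BSc", "experience": "low",  "perf": "poor"},
]
tree = id3(rows, ["degree", "experience"])
```

Each root-to-leaf path of `tree` reads as a rule, e.g. "degree = BSc and experience = low implies poor", which is the kind of rule set the paper reports generating.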

  8. Model of Next Generation Energy-Efficient Design Software for Buildings

    Institute of Scientific and Technical Information of China (English)

    MA Zhiliang; ZHAO Yili

    2008-01-01

    Energy-efficient design for buildings (EEDB) is a vital step towards building energy saving. In order to greatly improve EEDB, next-generation EEDB software that makes use of the latest technologies needs to be developed. This paper mainly focuses on establishing the model of the next-generation EEDB software. Based on an investigation of the literature and interviews with designers, the requirements on the next-generation EEDB software were identified, with emphasis on lifecycle assessment of both energy consumption and environmental impacts, 3D graphics support, and building information modeling (BIM) support. The workflow for using the next-generation EEDB software was then established. Finally, based on the workflow, the framework model for the software was proposed, and the partial models and the corresponding functions were systematically analyzed. The model lays a solid foundation for developing the next-generation EEDB software.

  9. Bootstrap data methodology for sequential hybrid model building

    Science.gov (United States)

    Volponi, Allan J. (Inventor); Brotherton, Thomas (Inventor)

    2007-01-01

    A method for modeling engine operation comprising the steps of: 1. collecting a first plurality of sensory data, 2. partitioning a flight envelope into a plurality of sub-regions, 3. assigning the first plurality of sensory data into the plurality of sub-regions, 4. generating an empirical model of at least one of the plurality of sub-regions, 5. generating a statistical summary model for at least one of the plurality of sub-regions, 6. collecting an additional plurality of sensory data, 7. partitioning the second plurality of sensory data into the plurality of sub-regions, 8. generating a plurality of pseudo-data using the empirical model, and 9. concatenating the plurality of pseudo-data and the additional plurality of sensory data to generate an updated empirical model and an updated statistical summary model for at least one of the plurality of sub-regions.
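The bootstrap idea in these steps can be sketched in a few lines: fit an empirical model to the data already collected in one flight-envelope sub-region, generate pseudo-data from it, and concatenate the pseudo-data with newly collected data before refitting, so earlier information survives even if the raw data is discarded. The linear sensor model and noise level below are invented for illustration.

```python
# Sketch of the patent's bootstrap-data update for one sub-region.
import random

random.seed(1)

def fit_line(pts):
    """Ordinary least-squares slope/intercept for [(x, y), ...]."""
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

# Step 1: empirical model from the first batch of sensory data (true law y = 2x + 1).
batch1 = [(x, 2 * x + 1 + random.gauss(0, 0.05)) for x in range(10)]
model1 = fit_line(batch1)

# Step 2: pseudo-data generated from the empirical model (raw batch1 may be dropped).
pseudo = [(x, model1[0] * x + model1[1]) for x in range(10)]

# Step 3: concatenate pseudo-data with the new batch and refit the sub-region model.
batch2 = [(x, 2 * x + 1 + random.gauss(0, 0.05)) for x in range(10, 20)]
model2 = fit_line(pseudo + batch2)
```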

  10. Recent developments in string model-building and cosmology

    OpenAIRE

    Cicoli, Michele

    2016-01-01

    In this talk I discuss recent developments in moduli stabilisation, SUSY breaking and chiral D-brane models together with several interesting features of cosmological models built in the framework of type IIB string compactifications. I show that a non-trivial pre-inflationary dynamics can give rise to a power loss at large angular scales for which there have been mounting observational hints from both WMAP and Planck. I then describe different stringy embeddings of inflationary models which ...

  11. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative, one that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems can be attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation consists of developing plug-ins for a BIM tool capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers are provided with a new way to visualize and identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers are able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project is used to illustrate the workability of the proposed methodology.

  12. The ORC method. Effective modelling of thermal performance of multilayer building components

    Energy Technology Data Exchange (ETDEWEB)

    Akander, Jan

    2000-02-01

    The ORC Method (Optimised RC-networks) provides a means of modelling one- or multidimensional heat transfer in building components, in this context within building simulation environments. The methodology is shown, primarily applied to heat transfer in multilayer building components. For multilayer building components, the analytical thermal performance is known, given layer thicknesses and material properties. The aim of the ORC Method is to optimise the values of the thermal resistances and heat capacities of an RC-model so that the model's performance agrees well with the analytical performance over a wide range of frequencies. The optimisation is carried out in the frequency domain, where the overall deviation between model and analytical frequency response, in terms of admittance and dynamic transmittance, is minimised. It is shown that ORCs are effective in terms of accuracy and computational time compared to finite difference models when used in building simulations, in this case with IDA/ICE. An ORC configuration of five mass nodes has been found to model building components in the Nordic countries well, for thermal comfort and energy requirement simulations. Simple RC-networks, such as the surface heat capacity and the simple R-C configuration, are not appropriate for detailed building simulation. However, these can be used as a basis for defining the effective heat capacity of a building component. An approximate method is suggested for determining the effective heat capacity without the use of complex numbers. This quantity can be calculated from layer thicknesses and material properties with the help of two time constants. The approximate method can give inaccuracies of up to 20%. In-situ measurements have been carried out in an experimental building with the purpose of establishing the effective heat capacity of external building components that are subjected to normal thermal conditions. The auxiliary

  13. Design build process flow visualization model plant PLTN PWR type

    International Nuclear Information System (INIS)

    A scale-down model of a PWR-type nuclear power plant with process flow visualization has been designed and constructed. The scale-down model includes the primary and secondary cooling systems and the transmission line in a three-dimensional layout at 1:33.33 scale. Construction of the scale model was carried out in five steps: literature study, field survey, scale drawing design, construction, and testing. The result is a scale-down model integrated with a monitoring system using LabVIEW and an interlock system using a PLC. The test results show that the process flow operates as required by the design specification. (author)

  14. Automatic Generation of 3D Building Models for Sustainable Development

    OpenAIRE

    Sugihara, Kenichi

    2015-01-01

    3D city models are important in urban planning for sustainable development. Urban planners draw maps for efficient land use and a compact city. 3D city models based on these maps are quite effective for understanding what the image of a sustainable city would be if an alternative plan were realized. However, enormous time and labour must be expended to create these 3D models using 3D modelling software such as 3ds Max or SketchUp. In order to automate the laborious steps, a GIS and CG inte...

  15. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    Science.gov (United States)

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and elucidate what kind of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices to participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.
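The Bayesian model averaging (BMA) step used to merge the stakeholders' models can be illustrated minimally: each expert model receives a posterior weight proportional to its prior times the marginal likelihood of the observed data, and the meta-model's prediction is the weighted average. The Bernoulli "egg survival" models and the observation below are invented for illustration, not the paper's herring data.

```python
# Minimal Bayesian model averaging over three hypothetical stakeholder models.
from math import comb

def bma(priors, likelihoods, predictions):
    """Posterior weights (prior x likelihood, normalised) and the averaged prediction."""
    post = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(post)
    weights = [w / z for w in post]
    return weights, sum(w * pred for w, pred in zip(weights, predictions))

# Each stakeholder asserts a probability that an egg survives; we observe
# 7 survivals out of 10 eggs and score each model by its binomial likelihood.
preds = [0.3, 0.5, 0.7]
lik = [comb(10, 7) * p**7 * (1 - p)**3 for p in preds]
weights, meta_pred = bma([1 / 3] * 3, lik, preds)
```

The model closest to the data (p = 0.7) dominates the mixture, but the others are not discarded, which is exactly what makes BMA attractive for reconciling conflicting expert views.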

  16. Economical analyses of build-operate-transfer model in establishing alternative power plants

    International Nuclear Information System (INIS)

    The most widely employed method to meet the increasing electricity demand is building new power plants. The most important issue in building new power plants is to find financial funds. Various models are employed, especially in developing countries, in order to overcome this problem and to find a financial source. One of these models is the build-operate-transfer (BOT) model. In this model, the investor raises all the funds for mandatory expenses and provides financing, builds the plant and, after a certain plant operation period, transfers the plant to the national power organization. In this model, the object is to decrease the burden of power plants on the state budget. The most important issue in the BOT model is the dependence of the unit electricity cost on the transfer period. In this study, the model giving the unit electricity cost depending on the transfer of the plants established according to the BOT model, has been discussed. Unit electricity investment cost and unit electricity cost in relation to transfer period for plant types have been determined. Furthermore, unit electricity cost change depending on load factor, which is one of the parameters affecting annual electricity production, has been determined, and the results have been analyzed. This method can be employed for comparing the production costs of different plants that are planned to be established according to the BOT model, or it can be employed to determine the appropriateness of the BOT model
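The cost relation the paper analyses can be sketched as arithmetic: under build-operate-transfer, the investor must recover the capital cost within the transfer period, so the unit electricity cost falls as the transfer period or the load factor grows. All plant figures below are illustrative assumptions, not values from the paper.

```python
# Sketch of BOT unit electricity cost as a function of transfer period and load factor.
def unit_cost(capex, fixed_om, fuel_cost, capacity_mw, transfer_years, load_factor):
    """USD per kWh needed to recover capex over the transfer period.

    capex: total investment (USD); fixed_om: annual fixed O&M (USD/yr);
    fuel_cost: variable cost (USD/kWh); straight-line capital recovery assumed.
    """
    annual_kwh = capacity_mw * 1000 * 8760 * load_factor
    annual_capital = capex / transfer_years
    return (annual_capital + fixed_om) / annual_kwh + fuel_cost

# Same hypothetical 600 MW plant, 10- vs 20-year transfer period.
c10 = unit_cost(capex=1.2e9, fixed_om=3.0e7, fuel_cost=0.02,
                capacity_mw=600, transfer_years=10, load_factor=0.8)
c20 = unit_cost(capex=1.2e9, fixed_om=3.0e7, fuel_cost=0.02,
                capacity_mw=600, transfer_years=20, load_factor=0.8)
```

Doubling the transfer period halves the annual capital charge, so `c20 < c10`; lowering the load factor spreads the same fixed costs over fewer kWh and pushes the unit cost back up.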

  17. Energy Savings Modelling of Re-tuning Energy Conservation Measures in Large Office Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Nicholas; Katipamula, Srinivas; Wang, Weimin; Huang, Yunzhi; Liu, Guopeng

    2014-10-20

    Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS’s capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced lifetimes of the equipment. This paper investigates the energy savings potential of several common HVAC system re-tuning measures on a typical large office building, using the Department of Energy’s building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) to static pressure, supply-air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated – each designed to conform to limitations to implementation of certain individual measures that might exist in typical buildings. All the individual measures and combinations were simulated in 16 climate locations representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand-side of the building (air-systems and schedules). Many of the demand-side individual measures were capable of reducing annual total HVAC system energy consumption by over 20% in most cities that were modeled. Supply side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy
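One of the dynamic resets listed above can be sketched as code: a supply-air temperature (SAT) reset that warms the SAT set point as the outdoor air gets colder, instead of holding a static value. The reset limits below are typical illustrative numbers, not the paper's.

```python
# Sketch of a linear supply-air temperature reset, one of the re-tuning measures.
def sat_reset(outdoor_temp_c, t_min=12.8, t_max=18.0, oa_low=10.0, oa_high=21.0):
    """SAT set point (deg C): t_max at/below oa_low, t_min at/above oa_high,
    linearly interpolated in between."""
    if outdoor_temp_c <= oa_low:
        return t_max
    if outdoor_temp_c >= oa_high:
        return t_min
    frac = (outdoor_temp_c - oa_low) / (oa_high - oa_low)
    return t_max - frac * (t_max - t_min)
```

A warmer SAT on cold days reduces simultaneous heating and cooling at the zone reheat coils, which is why this demand-side measure shows up among the biggest savers in the simulations.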

  18. Computational Fluid Dynamics Coupled with Thermal Impact Model for Building Design

    Directory of Open Access Journals (Sweden)

    Sue Ellen Haupt

    2010-10-01

    Full Text Available Thermal effects impact the flow around and within structures. This computational study assesses features that affect the heating and buoyancy, and thus the resulting flow both internal and external to a building. Considerations include the importance of time of day, building materials, sky cover, etc. on the local thermal heating of a passive solar building. Such impacts are assessed using full thermal coupling between a building energy simulation model and a computational fluid dynamics model, including the effects of thermal radiation, conduction, and convection, to analyze the impact of all natural heating, cooling, and flow mechanisms for both the interior and exterior. Unique features such as Trombe walls add to the heat transfer mechanisms. Analysis is made for three separate seasonal conditions.

  19. An Empirical Validation of Building Simulation Software for Modelling of Double-Skin Facade (DSF)

    DEFF Research Database (Denmark)

    Larsen, Olena Kalyanova; Heiselberg, Per; Felsmann, Clemens;

    2009-01-01

    buildings, but their accuracy might be limited in cases with DSFs because of the complexity of the heat and mass transfer processes within the DSF. To address this problem, an empirical validation of building models with DSF, performed with various building simulation tools (ESP-r, IDA ICE 3.0, VA114......, TRNSYS-TUD and BSim) was carried out in the framework of IEA SHC Task 34 /ECBCS Annex 43 "Testing and Validation of Building Energy Simulation Tools". The experimental data for the validation was gathered in a full-scale outdoor test facility. The empirical data sets comprise the key-functioning modes of...... DSF: 1. Thermal buffer mode (closed DSF cavity) and 2. External air curtain mode (naturally ventilated DSF cavity with the top and bottom openings open to outdoors). By carrying out the empirical tests, it was concluded that all models experience difficulties in predictions during the peak solar loads...

  20. An endpoint damage oriented model for life cycle environmental impact assessment of buildings in China

    Institute of Scientific and Technical Information of China (English)

    GU LiJing; LIN BoRong; GU DaoJin; ZHU YingXin

    2008-01-01

    The midpoint impact assessment methodology and the weighting methods currently used by most building life cycle assessment (LCA) researchers in China still have shortcomings. To give the evaluation results better temporal and spatial applicability, this paper adopts the endpoint impact assessment methodology. Based on the endpoint damage oriented concept, four endpoints (resource exhaustion, energy exhaustion, human health damage and ecosystem damage) were selected according to the situation in China and the specific characteristics of the building industry. Subsequently, the formula for calculating each endpoint, the background values for normalization and the weighting factors were defined. Following that, an endpoint damage oriented model for evaluating the life cycle environmental impact of buildings in China was established. This model produces a single integrated indicator of environmental impact, and consequently provides a reference for directing sustainable building design.
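As a sketch of how such an integrated indicator is typically assembled (normalize each endpoint score by a background value, weight it, and sum), with all endpoint scores, background values and weights invented for illustration:

```python
# Hypothetical endpoint scores, normalization backgrounds and weights;
# the actual values in the paper's China-specific model differ.
endpoints = {
    "resource exhaustion": 4.2e3,
    "energy exhaustion":   1.8e4,
    "human health damage": 6.0e-4,
    "ecosystem damage":    2.5e-2,
}
background = {  # background values used for normalization (invented)
    "resource exhaustion": 1.1e4,
    "energy exhaustion":   5.0e4,
    "human health damage": 1.5e-3,
    "ecosystem damage":    9.0e-2,
}
weights = {  # weighting factors (invented)
    "resource exhaustion": 0.25,
    "energy exhaustion":   0.30,
    "human health damage": 0.25,
    "ecosystem damage":    0.20,
}

# Integrated indicator: weighted sum of normalized endpoint damages.
indicator = sum(weights[k] * endpoints[k] / background[k] for k in endpoints)
print(f"integrated environmental impact indicator: {indicator:.3f}")
```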

  1. A model of QoS of the Internet based on a scalable core structure

    Institute of Scientific and Technical Information of China (English)

    袁晓斌; 杨福猛; 赵兴

    2001-01-01

    With the rapid development of the Internet, a series of architectures that guarantee quality of service (QoS) have been proposed. This paper discusses integrated services, differentiated services and other models, which are the main architectures for guaranteeing QoS, and compares and analyzes the characteristics of each. Considering the influence of per-flow state in the network on quality of service, the paper proposes a QoS model that provides flexibility, scalability, robustness and a better service guarantee level. The model combines the virtues of integrated services and differentiated services, and an algorithm for implementing the model is also presented.
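Differentiated services rely on per-packet metering and marking at the network edge rather than per-flow state in the core; a minimal token-bucket meter of the kind used for such marking might look like the following sketch (the paper's own algorithm is not reproduced here):

```python
# A minimal token-bucket meter of the kind used at a DiffServ edge to mark
# packets as in-profile or out-of-profile. Illustrative only; the paper's
# scalable-core model is not specified at this level of detail.

class TokenBucket:
    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps / 8.0      # token fill rate in bytes per second
        self.capacity = burst_bytes     # maximum burst size in bytes
        self.tokens = burst_bytes       # bucket starts full
        self.last = 0.0                 # time of the previous packet

    def conforms(self, t: float, size_bytes: int) -> bool:
        """True if a packet of size_bytes arriving at time t is in-profile."""
        self.tokens = min(self.capacity,
                          self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= size_bytes:
            self.tokens -= size_bytes
            return True
        return False

tb = TokenBucket(rate_bps=8000, burst_bytes=1500)   # 1 kB/s, 1500 B burst
marks = [tb.conforms(t, 1000) for t in (0.0, 0.1, 2.0)]
print(marks)
```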

  2. Using open building data in the development of exposure data sets for catastrophe risk modelling

    Science.gov (United States)

    Figueiredo, R.; Martina, M.

    2016-02-01

    One of the necessary components for catastrophe risk modelling is information on the buildings at risk, such as their spatial location, geometry, height, occupancy type and other characteristics. This is commonly referred to as the exposure model or data set. When modelling large areas, developing exposure data sets with the relevant information about every individual building is not practicable. Thus, census data at coarse spatial resolutions are often used as the starting point for the creation of such data sets, after which disaggregation to finer resolutions is carried out using different methods based on proxies such as the population distribution. While these methods can produce acceptable results, they cannot be considered ideal. Nowadays, the availability of open data is increasing, and it is possible to obtain information about buildings for some regions. Although this type of information is usually limited and therefore insufficient on its own to generate an exposure data set, it can still be very useful in its elaboration. In this paper, we focus on how open building data can be used to develop a gridded exposure model by disaggregating existing census data at coarser resolutions. Furthermore, we analyse how the choice of spatial resolution affects the accuracy and precision of the model, and compare the different models in terms of the residential building area affected by a flood event.
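The proxy-based disaggregation step can be sketched as a simple proportional allocation, here weighting by open building-footprint area per grid cell; all figures and cell names below are made up for illustration:

```python
# Sketch of disaggregating a census total of residential building area to
# grid cells, weighting by open building-footprint area per cell. The
# numbers and cell IDs are invented, not taken from the paper.

census_total_m2 = 50_000.0    # building area reported for one census unit
footprint_m2 = {"cell_a": 1200.0, "cell_b": 300.0, "cell_c": 500.0}

# Each cell receives a share proportional to its mapped footprint area.
total_fp = sum(footprint_m2.values())
exposure = {cell: census_total_m2 * fp / total_fp
            for cell, fp in footprint_m2.items()}
print(exposure)
```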

  3. Modelling Emission from Building Materials with Computational Fluid Dynamics

    DEFF Research Database (Denmark)

    Topp, Claus; Nielsen, Peter V.; Heiselberg, Per

    This paper presents a numerical model that by means of computational fluid dynamics (CFD) is capable of dealing with both pollutant transport across the boundary layer and internal diffusion in the source without prior knowledge of which is the limiting process. The model provides the concentration...

  4. Building an Integrative Model for Managing Exploratory Innovation

    DEFF Research Database (Denmark)

    Zarmeen, Parisha; Turri, Vanessa Gina; Sanchez, Ron

    2014-01-01

    Purpose: In this paper we develop an integrated model identifying the key factors involved in managing exploratory innovation processes while also maintaining current business models and processes. Methodology/approach: We first characterize the problem of innovation as consisting of “the four ce...

  5. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    OpenAIRE

    Farzad Jalaei; Ahmad Jrade

    2014-01-01

    Quantifying the environmental impacts and simulating the energy consumption of building’s components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategi...

  6. Using Data Mining Techniques to Build a Classification Model for Predicting Employees Performance

    OpenAIRE

    Qasem A. Al-Radaideh; Eman Al Nagi

    2012-01-01

    Human capital is of high concern for company management, whose main interest lies in hiring highly qualified personnel who are expected to perform well. Recently, there has been growing interest in the data mining area, where the objective is the discovery of knowledge that is correct and of high benefit to users. In this paper, data mining techniques were used to build a classification model to predict the performance of employees. To build the classification m...
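As an illustration of the kind of classification model such techniques produce, here is a minimal "OneR"-style rule learner, a classic data-mining baseline, not the authors' actual model, trained on fabricated records:

```python
# A minimal "OneR"-style classifier built from attribute/value class counts.
# The training records and attribute names are fabricated for illustration;
# the paper's actual model and features are not reproduced here.
from collections import Counter, defaultdict

train = [  # (experience_band, training_band) -> performance label
    (("senior", "many"), "high"),
    (("senior", "few"), "high"),
    (("junior", "many"), "high"),
    (("junior", "few"), "low"),
    (("junior", "few"), "low"),
]

def one_r(records, attr_index):
    """Map each value of one attribute to its majority class."""
    by_value = defaultdict(Counter)
    for features, label in records:
        by_value[features[attr_index]][label] += 1
    return {v: c.most_common(1)[0][0] for v, c in by_value.items()}

rule = one_r(train, attr_index=0)   # single-attribute rule on experience_band
print(rule)
```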

  7. Benefits of Building Information Modelling in the Project Lifecycle: Construction Projects in Asia

    OpenAIRE

    Jian Li; Ying Wang; Xiangyu Wang; Hanbin Luo; Shih-Chung Kang; Jun Wang; Jun Guo; Yi Jiao

    2014-01-01

    Building Information Modelling (BIM) is a process involving the creation and management of object data with properties, unique identities and relationships. In the Architecture, Engineering and Construction (AEC) industry, BIM is widely adopted across the building lifecycle because of the high degree of information integration it enables. Four-dimensional (4D) computer-aided design (CAD) has been used for many years to improve the construction planning process. BIM is adopted throughout buildin...

  8. Analysis Framework for the Interaction Between Lean Construction and Building Information Modelling

    OpenAIRE

    Sacks, Rafael; Dave, Bhargav; Koskela, Lauri; Owen, Robert

    2009-01-01

    Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this persp...

  9. Modeling Data Center Building Blocks for Energy-efficiency and Thermal Simulations

    OpenAIRE

    Vor Dem Berge, Micha; Da Costa, Georges; Jarus, Mateusz; Oleksiak, Ariel; Piatek, Wojciech; Volk, Eugen

    2013-01-01

    In this paper we present a concept and specification of Data Center Efficiency Building Blocks (DEBBs), which represent hardware components of a data center complemented by descriptions of their energy efficiency. Proposed building blocks contain hardware and thermodynamic models that can be applied to simulate a data center and to evaluate its energy efficiency. DEBBs are available in an open repository being built by the CoolEmAll project. In the paper we illustrate the concept by an exampl...

  10. A HISTORICAL TIMBER FRAME MODEL FOR DIAGNOSIS AND DOCUMENTATION BEFORE BUILDING RESTORATION

    OpenAIRE

    Koehl, M.; Viale, A.; Reeb, S.

    2013-01-01

    The aim of the project described in this paper was to define a four-level timber frame survey model of a historical building: the so-called "Andlau's Seigniory" in Alsace, France. This historical building (domain) was built in the late XVIth century and is now being renovated to become a heritage interpretation centre. The measurement methods used combine total station measurements, photogrammetry and 3D terrestrial laser scanning. Different modelling workflows ...

  11. An innovative training model for eco-building technologies in retrofitting.

    OpenAIRE

    Scartezzini, Jean-Louis; CECCHERINI Nelli, Lucia; Sala, Marco

    2015-01-01

    The innovative training model for eco-building technologies in retrofitting projects (funded by the EU Commission under the IEE programme in the REE_TROFIT project, http://www.reetrofit.eu/content.php) aims to help solve the shortage of local qualified and accredited retrofitting experts, as foreseen in the EPBD and its recast, and as indicated by various European countries in an assessment by the EC, for increasing the energy performance of the existing building stock. The...

  12. AN IMPROVED SNAKE MODEL FOR REFINEMENT OF LIDAR-DERIVED BUILDING ROOF CONTOURS USING AERIAL IMAGES

    OpenAIRE

    Chen, Qi; Wang, Shugen; Liu, Xiuguo

    2016-01-01

    Building roof contours are considered very important geometric data and have been widely applied in many fields, including but not limited to urban planning, land investigation, change detection and military reconnaissance. Currently, demand for building contours at a finer scale (especially in urban areas) has been raised in a growing number of studies, such as urban environment quality assessment, urban sprawl monitoring and urban air pollution modelling. LiDAR is known as an effect...

  13. On the integration of Building Information Modelling in undergraduate civil engineering programmes in the United Kingdom.

    OpenAIRE

    Bataw, Anas

    2016-01-01

    The management of data, information and knowledge through the project life cycle of buildings and civil infrastructure projects is becoming increasingly complex. In an attempt to drive efficiencies and address this complexity, the United Kingdom (UK) Government has mandated that Building Information Modelling (BIM) methods must be adopted in all public sector construction projects from 2016. Emerging from the US Department of Defence, BIM is an approach to the co-ordination of design and prod...

  14. Identification of Torsionally Coupled Shear Buildings Models Using a Vector Parameterization

    OpenAIRE

    Concha, Antonio; Alvarez-Icaza, Luis

    2016-01-01

    A methodology to estimate the shear model of seismically excited, torsionally coupled buildings using acceleration measurements of the ground and floors is presented. A vector parameterization that considers Rayleigh damping for the building is introduced that allows identifying the stiffness/mass and damping/mass ratios of the structure, as well as their eccentricities and radii of gyration. This parameterization has the advantage that its number of parameters is smaller than that obtained w...

  15. Image-Based Airborne LiDAR Point Cloud Encoding for 3d Building Model Retrieval

    Science.gov (United States)

    Chen, Yi-Chen; Lin, Chao-Hung

    2016-06-01

    With the development of Web 2.0 and cyber city modeling, an increasing number of 3D models have become available on web-based model-sharing platforms, serving applications such as navigation, urban planning, and virtual reality. Based on the concept of data reuse, a 3D model retrieval system is proposed to retrieve building models similar to a user-specified query; the basic idea behind this system is to reuse existing 3D building models instead of reconstructing them from point clouds. To retrieve models efficiently, the models in the database are generally encoded compactly using a shape descriptor. However, most geometric descriptors in related works apply to polygonal models. In this study, the input query of the model retrieval system is a point cloud acquired by a Light Detection and Ranging (LiDAR) system, because of its efficient scene scanning and spatial information collection. Using point clouds with sparse, noisy, and incomplete sampling as input queries is more difficult than using 3D models. Because the building roof is more informative than other parts of an airborne LiDAR point cloud, an image-based approach is proposed to encode both the point clouds from input queries and the 3D models in the database. The main goal of the encoding is that database models and input point clouds are encoded consistently. Firstly, top-view depth images of buildings are generated to represent the geometric surface of a building roof. Secondly, geometric features are extracted from the depth images based on the height, edges and planes of the building. Finally, descriptors are built from spatial histograms and used in the 3D model retrieval system. For retrieval, models are matched against the query through their encoding coefficients. In experiments, a database of about 900,000 3D models collected from the Internet is used to evaluate retrieval. The results of the proposed method show a clear superiority
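A toy version of the roof-encoding idea, using a height histogram as the descriptor and an L1 distance for matching, might look like the sketch below; the paper's actual feature set (height, edge and plane features from depth images) is richer than this:

```python
# Sketch of encoding a roof point cloud as a normalized height histogram,
# one simple image-like descriptor, and matching by L1 distance. The toy
# point sets and bin settings are invented for illustration.

def height_histogram(points, bins=4, zmin=0.0, zmax=12.0):
    """Histogram of point heights, normalized to sum to 1."""
    hist = [0] * bins
    for _, _, z in points:
        k = min(bins - 1, int((z - zmin) / (zmax - zmin) * bins))
        hist[k] += 1
    n = len(points)
    return [h / n for h in hist]

def l1_distance(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

query = [(0, 0, 3.0), (1, 0, 3.2), (0, 1, 9.0), (1, 1, 9.1)]  # toy LiDAR roof
model = [(0, 0, 2.8), (1, 0, 3.1), (0, 1, 8.9), (1, 1, 9.3)]  # toy DB model
print(l1_distance(height_histogram(query), height_histogram(model)))
```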

  16. A model calibration framework for simultaneous multi-level building energy simulation

    International Nuclear Information System (INIS)

    Highlights: • Introduce a framework for multiple levels of building energy simulation calibration. • Improve the performance reliability of a calibrated model for different ECMs. • Achieve high simulation accuracies at building level, ECM level and zone level. • Create a classification schema to classify input parameters for calibration. • Use evidence and statistical learning to build energy model and reduce discrepancy. - Abstract: Energy simulation, the virtual representation and reproduction of energy processes for an entire building or a specific space, could assist building professionals with identifying relatively optimal energy conservation measures (ECMs). A review of current work revealed that methods for achieving simultaneous high accuracies in different levels of simulations, such as building level and zone level, have not been systematically explored, especially when there are several zones and multiple HVAC units in a building. Therefore, the objective of this paper is to introduce and validate a novel framework that can calibrate a model with high accuracies at multiple levels. In order to evaluate the performance of the calibration framework, we simulated HVAC-related energy consumption at the building level, at the ECM level and at the zone level. The simulation results were compared with the measured HVAC-related energy consumption. Our findings showed that MBE and CV (RMSE) were below 8.5% and 13.5%, respectively, for all three levels of energy simulation, demonstrating that the proposed framework could accurately simulate the building energy process at multiple levels. In addition, in order to estimate the potential energy efficiency improvements when different ECMs are implemented, the model has to be robust to the changes resulting from the building being operated under different control strategies. Mixed energy ground truths from two ECMs were used to calibrate the energy model. The results demonstrated that the model performed
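The quoted accuracy metrics, MBE and CV(RMSE), have standard definitions (e.g. in ASHRAE Guideline 14); a sketch with invented measured/simulated series:

```python
# Mean bias error (MBE) and coefficient of variation of the RMSE, the two
# calibration-accuracy metrics quoted in the abstract, computed as in
# ASHRAE Guideline 14. The measured/simulated series are invented.
import math

def mbe_pct(measured, simulated):
    """Signed bias of the simulation as a percentage of total measured use."""
    return 100.0 * sum(m - s for m, s in zip(measured, simulated)) / sum(measured)

def cv_rmse_pct(measured, simulated):
    """RMSE normalized by the mean measured value, in percent."""
    n = len(measured)
    rmse = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / (sum(measured) / n)

measured  = [100.0, 120.0, 110.0, 90.0]   # e.g. monthly HVAC kWh (invented)
simulated = [ 95.0, 125.0, 108.0, 93.0]

print(f"MBE = {mbe_pct(measured, simulated):.2f}%, "
      f"CV(RMSE) = {cv_rmse_pct(measured, simulated):.2f}%")
```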

  17. Order selection of thermal models by frequency analysis of measurements for building energy efficiency estimation

    International Nuclear Information System (INIS)

    Highlights: • The partial differential heat equation is introduced in matrix representation. • The link between different representations of thermal models is presented. • Measurable variation of the output is considered for model order reduction. • Model order reduction can optimize building energy performance characterization. - Abstract: Experimental identification of the dynamic models of heat transfer in walls is needed for optimal control and for characterizing building energy performance. These models use the heat equation in the time domain, which can be put in matrix form and then, through a state-space representation, transformed into a transfer function, which is of infinite order. However, the model acts as a low-pass filter and needs to respond only to the frequency spectrum present in the measured inputs. The order of the transfer function can therefore be determined from the frequency spectrum of the measured inputs and the accuracy of the sensors. The main idea is that, of two models of different orders, the lower-order one can be used in building parameter identification when the difference between their outputs is negligible or smaller than the output measurement error. A homogeneous light wall is used as an example for a detailed study, and examples of homogeneous building elements with very high and very low time constants are given. The first-order model is compared with a very high order model (hundreds of states), which can be considered almost continuous in space
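The order-selection criterion can be illustrated with a first-order low-pass H(s) = 1/(τs + 1): an input frequency component only matters if its attenuated amplitude at the output exceeds the sensor accuracy. The time constant, input amplitudes and sensor accuracy below are hypothetical values, not taken from the paper:

```python
# Sketch of the order-selection idea: an input component is relevant only if
# its attenuated output amplitude exceeds the sensor accuracy. The wall is
# modelled as a first-order low-pass H(s) = 1/(tau*s + 1); all numbers are
# hypothetical.
import math

tau = 7200.0           # wall time constant in seconds (hypothetical)
sensor_accuracy = 0.1  # smallest resolvable output change, K (hypothetical)

def gain(f_hz: float) -> float:
    """Amplitude gain of the first-order low-pass at frequency f_hz."""
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * f_hz * tau) ** 2)

# Input spectrum: a daily cycle and a fast 10-minute fluctuation, both 5 K.
for f, amp in [(1 / 86400.0, 5.0), (1 / 600.0, 5.0)]:
    out = amp * gain(f)
    print(f"f = {f:.2e} Hz -> output {out:.3f} K "
          f"({'above' if out > sensor_accuracy else 'below'} sensor accuracy)")
```

The daily component passes nearly unattenuated, while the fast fluctuation falls below the measurement error, so a model that ignores it loses nothing measurable.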

  18. Markovian Building Blocks for Individual-Based Modelling

    DEFF Research Database (Denmark)

    Nilsson, Lars Anders Fredrik

    2007-01-01

    The present thesis consists of a summary report, four research articles, one technical report and one manuscript. The subject of the thesis is individual-based stochastic models. The summary report is composed of three parts and a brief history of some basic models in population biology. This history is included in order to provide a reader with no previous exposure to models in population biology with sufficient background to understand some of the biological models mentioned in the thesis. The first part of the rest of the summary is a description of the dramatic changes in the degree of aggregation of sprat or herring in the Baltic during the day, with special focus on the dispersion of the fish from schools at dusk. The next part is a brief introduction to Markovian arrival processes, a type of stochastic process with potential applications as sub-models in population...
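A minimal example of a Markovian arrival process is a two-state Markov-modulated Poisson process, sketched below with invented rates (e.g. a "schooled" high-rate state and a "dispersed" low-rate state); this is an illustration of the process class, not a model from the thesis:

```python
# A minimal Markov-modulated Poisson process (a simple Markovian arrival
# process): arrivals are Poisson with a rate set by a two-state environment.
# Rates and the switching rate are invented for illustration.
import random

random.seed(1)

def mmpp_arrivals(t_end, rates=(5.0, 0.5), switch_rate=0.1):
    """Return arrival times in (0, t_end) for a two-state MMPP."""
    t, state, arrivals = 0.0, 0, []
    while t < t_end:
        # Competing exponential clocks; resampling both each step is valid
        # by memorylessness of the exponential distribution.
        t_switch = random.expovariate(switch_rate)    # time to state change
        t_arr = random.expovariate(rates[state])      # time to next arrival
        if t + min(t_switch, t_arr) >= t_end:
            break
        if t_arr < t_switch:
            t += t_arr
            arrivals.append(t)
        else:
            t += t_switch
            state = 1 - state
    return arrivals

arrivals = mmpp_arrivals(100.0)
print(len(arrivals), "arrivals in 100 time units")
```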

  19. Building a Verilog model for Boundary-Scan architecture

    International Nuclear Information System (INIS)

    This report presents a short introduction to the Boundary-Scan technique, as well as a model written in Verilog HDL that implements the Boundary-Scan architecture. The Boundary-Scan architecture provides a non-contact method of accessing chip pins during testing. The Verilog model is simple but expresses the spirit of the standard very well, and as it is compatible with the IEEE 1149.1 standard, it could be implemented in the real world. 11 refs., 11 figs., 2 tabs
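The Verilog model itself is not reproduced in this record; as a stand-in, the Python sketch below mimics the Shift-DR behaviour of a four-cell boundary-scan register, with bits entering at TDI and captured pin values emerging at TDO one clock apart:

```python
# Illustrative Shift-DR behaviour of a 4-cell boundary-scan register chain
# (IEEE 1149.1 style): each clock, every cell takes its neighbour's value,
# TDI feeds the first cell and TDO reads the last. Not the report's Verilog.

def shift_dr(cells, tdi_bits):
    """Shift tdi_bits through the scan cells; return (new_cells, tdo_bits)."""
    cells = list(cells)
    tdo = []
    for bit in tdi_bits:
        tdo.append(cells[-1])          # TDO sees the last cell's old value
        cells = [bit] + cells[:-1]     # shift the chain by one position
    return cells, tdo

captured = [1, 0, 1, 1]                # values captured from the chip pins
cells, tdo = shift_dr(captured, [0, 0, 0, 0])
print("shifted out:", tdo)             # captured pin states, last cell first
```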

  20. 3-D hydrodynamic modelling of flood impacts on a building and indoor flooding processes

    Science.gov (United States)

    Gems, Bernhard; Mazzorana, Bruno; Hofer, Thomas; Sturm, Michael; Gabl, Roman; Aufleger, Markus

    2016-06-01

    Given the current challenges in flood risk management and in the vulnerability assessment of buildings exposed to flood hazards, this study presents three-dimensional numerical modelling of torrential floods and their interaction with buildings. In a case study application, the FLOW-3D software is applied to the lower reach of the Rio Vallarsa torrent in the village of Laives (Italy). A single-family house on the flood plain is considered in detail and exposed to a 300-year flood hydrograph. Different building representation scenarios are analysed, including an entirely impervious building envelope and the assumption of fully permeable doors, light shafts and windows. The modelling results give insight into the flooding process of the building's interior and the hydrodynamic forces impacting the exterior and interior walls, and they quantify the effect that flooding of a building has on the flow field on the surrounding flood plain. The presented study contributes to the development of a comprehensive physics-based vulnerability assessment framework. For pure water floods, it demonstrates the possibilities and limits of advanced numerical modelling techniques in flood risk management and, thereby, in the planning of local structural protection measures.
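For a rough sense of the wall loads such a model quantifies, the hydrostatic component of the force per unit wall width is F = 0.5·ρ·g·h²; the 2 m flow depth below is a hypothetical value, not one from the case study:

```python
# Order-of-magnitude check on the wall load: the hydrostatic share of the
# force on a vertical wall is F = 0.5 * rho * g * h^2 per metre of width.
# The 2 m flow depth is hypothetical, not taken from the case study.
rho = 1000.0   # water density, kg/m^3
g = 9.81       # gravitational acceleration, m/s^2
h = 2.0        # flow depth against the wall, m (hypothetical)

force_per_m = 0.5 * rho * g * h * h    # N per metre of wall width
print(f"{force_per_m / 1000.0:.1f} kN per metre of wall width")
```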